Cache Issues II
Finally, caches have a mapping strategy, which tells you where to write a given word into the cache and when to overwrite it with another data value fetched from main memory.
Direct-mapped caches hash each word of main memory to a unique location in the cache.
- e.g. if the cache has size N bytes, then one could hash memory location m to m mod N
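A minimal sketch of that mapping in C, assuming a 1 KB cache and 32-bit addresses (both illustrative choices, not part of the notes above):

    #include <stdio.h>
    #include <stdint.h>

    #define CACHE_SIZE 1024u                 /* N bytes; illustrative, power of two */

    /* Direct mapping: memory location m always lands in slot m mod N. */
    static uint32_t direct_index(uint32_t m)
    {
        return m % CACHE_SIZE;               /* same as m & (CACHE_SIZE - 1) */
    }

    int main(void)
    {
        uint32_t a = 0x00001234u;
        uint32_t b = a + CACHE_SIZE;         /* exactly N bytes apart -> same slot */
        printf("slot(a)=%u slot(b)=%u\n",
               (unsigned)direct_index(a), (unsigned)direct_index(b));
        return 0;
    }

Any two addresses exactly N bytes apart collide in the same slot, which is the conflict problem the associative schemes below address.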
Fully associative caches let any memory word occupy any cache location; on a miss they remove the word that has gone unreferenced for the longest time and store the new value in that spot.
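A rough sketch of that least-recently-used replacement rule, using a toy 4-entry fully associative cache (sizes and names are illustrative, not a real hardware design):

    #include <stdio.h>
    #include <stdint.h>

    #define WAYS 4                            /* toy cache: any address may sit in any slot */

    struct slot { uint32_t addr; int valid; unsigned last_used; };

    static struct slot cache[WAYS];
    static unsigned clock_tick;

    /* Look up addr; on a miss, evict the slot unreferenced for the longest time. */
    static int cache_access(uint32_t addr)
    {
        int victim = 0;
        for (int i = 0; i < WAYS; i++) {
            if (cache[i].valid && cache[i].addr == addr) {
                cache[i].last_used = ++clock_tick;
                return 1;                     /* hit */
            }
            if (!cache[i].valid || cache[i].last_used < cache[victim].last_used)
                victim = i;                   /* track least-recently-used slot */
        }
        cache[victim] = (struct slot){ addr, 1, ++clock_tick };
        return 0;                             /* miss: LRU slot replaced */
    }

    int main(void)
    {
        uint32_t trace[] = { 1, 2, 3, 4, 1, 5, 2 };
        for (int i = 0; i < 7; i++)
            printf("addr %u -> %s\n", (unsigned)trace[i],
                   cache_access(trace[i]) ? "hit" : "miss");
        return 0;
    }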
Set-associative caches combine these ideas. They have 2, 4, or more locations (ways) for each hash value and replace the oldest reference within that group.
- This avoids problems when two data values must both be in the cache but hash to the same value
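Combining the two ideas, a set-associative lookup first hashes to a small set and then applies the LRU rule only within that set. A rough sketch with 8 sets of 2 ways each (illustrative sizes):

    #include <stdio.h>
    #include <stdint.h>

    #define SETS 8                            /* number of hash values; illustrative */
    #define WAYS 2                            /* 2-way set associative */

    struct line { uint32_t addr; int valid; unsigned last_used; };

    static struct line cache[SETS][WAYS];
    static unsigned tick;

    /* Hash to a set, then search within it; on a miss, replace that set's LRU way. */
    static int lookup(uint32_t addr)
    {
        struct line *set = cache[addr % SETS];
        int victim = 0;
        for (int w = 0; w < WAYS; w++) {
            if (set[w].valid && set[w].addr == addr) {
                set[w].last_used = ++tick;
                return 1;                     /* hit */
            }
            if (!set[w].valid || set[w].last_used < set[victim].last_used)
                victim = w;                   /* LRU way of this set only */
        }
        set[victim] = (struct line){ addr, 1, ++tick };
        return 0;                             /* miss */
    }

    int main(void)
    {
        /* Addresses 0, 8, 16 all hash to set 0. With 2 ways, 0 and 8 can coexist,
           unlike in a direct-mapped cache where each arrival would evict the other. */
        uint32_t trace[] = { 0, 8, 0, 8, 16, 0 };
        for (int i = 0; i < 6; i++)
            printf("addr %u -> %s\n", (unsigned)trace[i],
                   lookup(trace[i]) ? "hit" : "miss");
        return 0;
    }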