Programs and data are stored in the same memory. Ideally, memory acts as a single fast device that holds instructions and operands (RAM). Realistically, memory is a combination of storage systems with very different properties. Common types of "memory" are:
- Static Random Access Memory (SRAM)
- Dynamic Random Access Memory (DRAM)
- Non-Volatile Random Access Memory (NVRAM, e.g. the flash in an SSD)
- Hard Disk Drive (HDD)
Type | Size | Access Latency | Time Scale |
---|---|---|---|
Registers | 8 - 32 words | 0-1 cycles | ns |
On-board SRAM | 32 - 256 KB | 1-3 cycles | ns |
Off-board SRAM | 256 KB - 16 MB | ~10 cycles | ns |
DRAM | 128 MB - 64 GB | ~100 cycles | ns |
SSD | \(\leqslant\) 1 TB | ~10k cycles | \(\mu\)s |
HDD | \(\leqslant\) 4 TB | ~10 million cycles | ms |
We have to compromise between small, fast, and expensive versus large, slow, and cheap: use the fast storage as much as possible and fall back on the slow. Compiler optimization can also be viewed as a form of caching, since keeping values in registers reduces memory references in the code; registers are extremely fast but there are very few of them.
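As a rough sketch of "registers as a cache" (the function names and the aliasing scenario are illustrative, not from the notes), keeping a running sum in a local variable lets the compiler hold it in a register instead of going to memory every iteration:

```c
#include <stddef.h>

/* Accumulates directly through the pointer. If the compiler cannot
 * prove that `total` does not alias `a`, every iteration must re-read
 * and re-write *total in memory. */
void sum_into(const long *a, size_t n, long *total)
{
    for (size_t i = 0; i < n; i++)
        *total += a[i];
}

/* Same result, but the running sum lives in a local variable that the
 * compiler can keep in a register: one load at the start, one store at
 * the end, and only the array itself is read in the loop. */
void sum_into_reg(const long *a, size_t n, long *total)
{
    long sum = *total;          /* load once  */
    for (size_t i = 0; i < n; i++)
        sum += a[i];
    *total = sum;               /* store once */
}
```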
One option is to manage the cache directly from the code. That is a bad idea, though: the code becomes locked into a particular cache implementation, and the cache would somehow have to be shared between processes. Caching is a hardware-level concern: it is handled transparently by the cache hardware, with the memory management unit (MMU) handling the related job of address translation. It is still useful to know how caching works, so we can write code that cooperates with it.
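A classic example of cooperating with the cache (a sketch, assuming a square matrix that is larger than the cache): traverse a 2-D array in the same order it is laid out in memory, so consecutive accesses fall on the same cache line.

```c
#include <stddef.h>

#define N 1024

/* C stores m[N][N] row-major, so m[i][j] and m[i][j+1] are adjacent in
 * memory. Walking j in the inner loop uses every element of a cache
 * line before moving on: mostly hits. */
long sum_row_major(long m[N][N])
{
    long sum = 0;
    for (size_t i = 0; i < N; i++)
        for (size_t j = 0; j < N; j++)
            sum += m[i][j];
    return sum;
}

/* Swapping the loops strides N * sizeof(long) bytes between accesses,
 * so once the matrix exceeds the cache nearly every access misses.
 * Same result, much slower on real hardware. */
long sum_col_major(long m[N][N])
{
    long sum = 0;
    for (size_t j = 0; j < N; j++)
        for (size_t i = 0; i < N; i++)
            sum += m[i][j];
    return sum;
}
```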
Where to store cached data? How do we map address \(k\) to a cache slot?
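One possible answer, sketched for a direct-mapped cache with assumed parameters (64-byte blocks, 128 slots; neither number is from the notes): split the address into an offset within the block, an index that picks the slot, and a tag that identifies which address the slot currently holds.

```c
#include <stdint.h>
#include <stdio.h>

/* Assumed cache geometry: 64-byte blocks, 128 slots (direct-mapped). */
#define BLOCK_SIZE 64u
#define NUM_SLOTS  128u

/* slot = (k / BLOCK_SIZE) % NUM_SLOTS  -- which slot the block maps to */
static uint32_t cache_slot(uint32_t k) { return (k / BLOCK_SIZE) % NUM_SLOTS; }

/* tag = k / (BLOCK_SIZE * NUM_SLOTS)   -- stored with the data so the
 * cache can tell which of the many addresses mapping here it holds   */
static uint32_t cache_tag(uint32_t k)  { return k / (BLOCK_SIZE * NUM_SLOTS); }

int main(void)
{
    uint32_t k = 0x12345678u;
    printf("address 0x%08x -> slot %u, tag 0x%x\n",
           (unsigned)k, (unsigned)cache_slot(k), (unsigned)cache_tag(k));
    return 0;
}
```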