Memory
INTRODUCTION
A CPU cache is a cache used by the central processing unit (CPU) of a computer to reduce the average time to access memory. The cache is a smaller, faster memory which stores copies of the data from frequently used main memory locations.
When the processor needs to read from or write to a location in main memory, it first checks whether a copy of that data is in the cache. If so, the processor immediately reads from or writes to the cache, which is much faster than reading from or writing to main memory.
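The check-the-cache-first behaviour described above can be sketched in a few lines of Python. This is an illustrative model, not real hardware: the names `Cache` and `main_memory` are made up for this example.

```python
# Sketch of the "check the cache before main memory" pattern.
# main_memory and Cache are illustrative names, not a real API.

main_memory = {addr: addr * 10 for addr in range(1024)}  # pretend main memory

class Cache:
    def __init__(self):
        self.lines = {}              # address -> cached copy of the data
        self.hits = self.misses = 0

    def read(self, addr):
        if addr in self.lines:       # cache hit: fast path
            self.hits += 1
            return self.lines[addr]
        self.misses += 1             # cache miss: go to slower main memory
        data = main_memory[addr]
        self.lines[addr] = data      # keep a copy for the next access
        return data

cache = Cache()
cache.read(42)   # miss: data fetched from main memory and copied in
cache.read(42)   # hit: served directly from the cache
```

The second read never touches main memory, which is exactly why repeated accesses to the same location become much faster.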
Most modern desktop and server CPUs have at least three independent caches: an instruction cache to speed up executable instruction fetch, a data cache to speed up data fetch and store, and a translation lookaside buffer (TLB) used to speed up virtual-to-physical address translation for both executable instructions and data. The data cache is usually organized as a hierarchy of cache levels (L1, L2, etc.; see Multi-level caches).
TYPES OF CACHE MEMORY
There are generally two types of cache memory:
1. MEMORY CACHE
2. DISK CACHE
1. MEMORY CACHE : A memory cache, sometimes called a cache store or RAM cache, is a portion of memory made of high-speed static RAM (SRAM) instead of the slower dynamic RAM (DRAM) used for main memory. Memory caching is effective because most programs access the same data or instructions over and over. By keeping as much of this information as possible in SRAM, the computer avoids accessing the slower DRAM.
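The claim that "most programs access the same data over and over" (temporal locality) is what makes a small SRAM cache pay off. The sketch below, with made-up names, models a tiny direct-mapped cache and counts how many accesses a repeated loop actually sends to DRAM.

```python
# Illustrative model of why memory caching works: a small direct-mapped
# cache absorbs repeated accesses to the same working set.
# CACHE_LINES and access() are invented for this sketch.

CACHE_LINES = 8
cache_tags = [None] * CACHE_LINES   # which address each cache line holds
hits = misses = 0

def access(addr):
    global hits, misses
    line = addr % CACHE_LINES        # direct-mapped: address selects its line
    if cache_tags[line] == addr:
        hits += 1                    # data already in fast SRAM
    else:
        misses += 1                  # fetch from slow DRAM, fill the line
        cache_tags[line] = addr

# A loop touching the same 4 addresses 100 times: only the first pass
# misses; every later access is served from the cache.
for _ in range(100):
    for addr in (0, 1, 2, 3):
        access(addr)

print(hits, misses)   # 396 hits, 4 misses out of 400 accesses
```

Of the 400 accesses, only the first 4 reach DRAM; the other 396 are satisfied from SRAM, which is the effect the paragraph above describes.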
2. DISK CACHE : Disk caching works under the same principle as memory caching, but instead of using high-speed SRAM, a disk cache uses conventional main memory. The most recently accessed data from the disk is stored in a memory buffer. When a program needs to access data from the disk, it first checks the disk cache to see whether the data is already there, avoiding a much slower disk access.
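The disk-cache idea above can be sketched with an ordinary in-memory structure standing in for the buffer. This is a hypothetical model: `fake_disk`, `read_block`, and the LRU eviction policy are assumptions for illustration, not how any particular operating system implements it.

```python
# Illustrative disk cache: recently read "disk blocks" are kept in a
# main-memory buffer (an OrderedDict used as a small LRU cache).
# fake_disk and read_block are made-up names for this sketch.

from collections import OrderedDict

fake_disk = {n: f"block-{n}" for n in range(100)}  # stand-in for a disk
buffer = OrderedDict()                             # main-memory disk cache
BUFFER_SIZE = 4                                    # buffer holds 4 blocks
disk_reads = 0

def read_block(n):
    global disk_reads
    if n in buffer:                  # served from main memory: fast
        buffer.move_to_end(n)        # mark block as most recently used
        return buffer[n]
    disk_reads += 1                  # real (slow) disk access
    data = fake_disk[n]
    buffer[n] = data
    if len(buffer) > BUFFER_SIZE:    # evict the least recently used block
        buffer.popitem(last=False)
    return data

read_block(7)   # goes to the disk
read_block(7)   # served from the memory buffer, no disk access
```

After the first read, block 7 lives in main memory, so the second read costs no disk access at all.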