Write-back


In data processing systems (computers), a cache memory or memory cache (sometimes also called a CPU cache) is a fast and relatively small memory, invisible to software and managed entirely by hardware, that stores the most recently used (MRU) data of the main memory (MM), also called working memory.

The function of the cache memory is to speed up access to MM data (increasing performance) and, most importantly in multiprocessor systems with shared memory, to reduce traffic on the system bus and the MM, which is one of the major bottlenecks of these systems.
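The effect can be seen in a minimal sketch of a direct-mapped cache lookup (the geometry below, 64-byte lines and 256 sets, is hypothetical, and C is used only for illustration): a hit is served by the fast cache, and only a miss generates traffic toward the MM and the system bus.

#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical geometry for the sketch: 64-byte lines, 256 sets (16 KiB). */
#define LINE_SIZE 64
#define NUM_SETS  256

typedef struct {
    bool     valid;   /* does this line hold usable data?        */
    uint32_t tag;     /* upper address bits identifying the line */
} CacheLine;

static CacheLine cache[NUM_SETS];
static unsigned hits, misses;

/* Returns true on a hit; on a miss the line is (re)filled, which is the
 * only case that would generate traffic toward the main memory (MM).   */
bool cache_access(uint32_t addr)
{
    uint32_t index = (addr / LINE_SIZE) % NUM_SETS;
    uint32_t tag   = addr / (LINE_SIZE * NUM_SETS);

    if (cache[index].valid && cache[index].tag == tag) {
        hits++;                      /* served by the fast SRAM array */
        return true;
    }
    misses++;                        /* fetch the line from MM        */
    cache[index].valid = true;
    cache[index].tag   = tag;
    return false;
}

int main(void)
{
    /* Re-accessing recently used addresses turns MM accesses into hits. */
    uint32_t trace[] = { 0x1000, 0x1004, 0x1040, 0x1000, 0x1004, 0x2000 };
    for (unsigned i = 0; i < sizeof trace / sizeof trace[0]; i++)
        cache_access(trace[i]);
    printf("hits: %u  misses: %u\n", hits, misses);  /* prints: hits: 3  misses: 3 */
    return 0;
}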

Cache memory is built from fast SRAM (static random-access memory) cells and is connected directly to the processor(s), as opposed to the slower DRAM (dynamic random-access memory) used for the MM.

The term "cache" is derived from the French (pronounced /ˈkæʃ/cash/ (deprecated template)) and means "hidden".

The term has many meanings depending on the context. Examples of hardware caches, which are physical memories, include the disk cache, the TLB (translation lookaside buffer, a page-table cache), the branch prediction cache, the branch history table, the branch address cache, and the trace cache. Other caches are handled by software and store temporary data in reserved MM space: the disk cache (page cache), system cache, application cache, database cache, web cache, DNS cache, browser cache, router cache, and so on. Some of these are actually only "buffers", that is, non-associative memories accessed sequentially (as streams of data), in contrast to the random, associative memory-address-to-cache lookup of a classic cache, as sketched below.
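The distinction can be illustrated with a rough sketch (the sizes and names are hypothetical, chosen only to contrast the two access patterns): a buffer hands data back in arrival order, while a cache entry is found by the MM address it mirrors.

#include <stdio.h>
#include <stdint.h>
#include <stdbool.h>

/* Buffer: non-associative, sequential (FIFO) access; data is consumed
 * in the order it arrived, not looked up by address.                  */
#define BUF_SLOTS 8
static uint32_t buf[BUF_SLOTS];
static unsigned buf_head, buf_tail;

void     buffer_put(uint32_t word) { buf[buf_tail++ % BUF_SLOTS] = word; }
uint32_t buffer_get(void)          { return buf[buf_head++ % BUF_SLOTS]; }

/* Classic cache: associative access; every entry is probed for a
 * matching MM address (a fully associative search, for simplicity). */
#define CACHE_SLOTS 8
static struct { bool valid; uint32_t addr, data; } entry[CACHE_SLOTS];

bool cache_lookup(uint32_t addr, uint32_t *data)
{
    for (unsigned i = 0; i < CACHE_SLOTS; i++)
        if (entry[i].valid && entry[i].addr == addr) {
            *data = entry[i].data;     /* hit: found by address, not position */
            return true;
        }
    return false;                      /* miss: the caller goes to the MM */
}

int main(void)
{
    buffer_put(111);
    buffer_put(222);
    uint32_t first = buffer_get(), second = buffer_get();
    printf("buffer returns %u then %u (arrival order)\n",
           (unsigned)first, (unsigned)second);

    entry[3].valid = true;             /* position inside the cache is irrelevant */
    entry[3].addr  = 0x2000;
    entry[3].data  = 333;
    uint32_t v;
    if (cache_lookup(0x2000, &v))
        printf("cache finds %u by its address 0x2000\n", (unsigned)v);
    return 0;
}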

