Cache, or cache memory, is a high-speed data storage layer that stores a subset of data so that future requests for that data are served faster than by accessing the data's primary storage. It lets you efficiently reuse previously retrieved data by serving it from the cache instead of the primary storage location, which makes access faster.
What exactly is Caching?
Suppose you are visiting your Facebook profile, where you can see your posts, friends, profile picture, and other data. Whenever you open Facebook, it fetches this data from the data server and loads it onto your page. Since this data stays the same (if you don't make any changes), fetching it from the database every time is needlessly slow. To reduce this time, we add a layer of memory between the database and your system, called the cache, which stores this data. When data is requested for the first time, it is loaded from the database and also stored in the cache. When you make the same request a second time, the data comes from the cache instead of the database, saving time and making the system more efficient.
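The flow above can be sketched in a few lines of Python. This is a minimal illustration, not Facebook's actual implementation: a plain dict stands in for the cache, and the hypothetical `fetch_from_database` function simulates the slow primary store.

```python
import time

# Illustrative in-memory cache; a dict stands in for a real cache layer.
cache = {}

def fetch_from_database(user_id):
    """Simulate a slow database lookup (hypothetical stand-in)."""
    time.sleep(0.1)  # stand-in for network/disk latency
    return {"user_id": user_id, "name": f"user-{user_id}"}

def get_profile(user_id):
    """Return the profile from the cache if present, else load and cache it."""
    if user_id in cache:                  # data already cached: serve from memory
        return cache[user_id]
    data = fetch_from_database(user_id)   # first request: go to the database
    cache[user_id] = data                 # store it for future requests
    return data

first = get_profile(42)   # slow: fetched from the "database"
second = get_profile(42)  # fast: served from the cache
```

The second call skips the slow fetch entirely, which is exactly the time saving the paragraph above describes.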
A cache typically lives in the system's RAM, which makes reads from it much faster than normal API calls and database requests.
Cache memory is based on a principle called Locality of Reference: the tendency of a computer program to access the same set of memory locations repeatedly over a short period of time.
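Locality of reference is why caching recently computed results pays off. Python ships a ready-made example of this idea in `functools.lru_cache`, which memoizes a function's results; a recursive Fibonacci function is a classic case where the same inputs recur constantly.

```python
from functools import lru_cache

@lru_cache(maxsize=128)  # keep up to 128 recently used results in memory
def fib(n):
    """Naive recursive Fibonacci; repeated subcalls exhibit locality."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

result = fib(30)         # repeated subproblems are served from the cache
info = fib.cache_info()  # the decorator tracks hits and misses automatically
```

Without the cache this call recomputes the same subproblems over a million times; with it, each `fib(k)` is computed once and every repeat access is a cache hit.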
Why Cache Memory?
- Cache memory is faster than the main memory (RAM), which helps access data more efficiently.
- Cache memory is located very close to the CPU, thus reducing the data retrieval time.
- Cache memory holds the data and instructions temporarily which are likely to be used by the CPU again, hence reducing the frequency of accessing the slower main memory.
Benefits of Cache Memory
- It is faster than the main memory and closer to the CPU, reducing data retrieval time.
- Cache memory allows the CPU to operate more efficiently, spending more time executing instructions rather than waiting for memory responses.
- Cache memory helps reduce memory latency. Memory latency refers to the time taken for the processor to retrieve data from memory.
- Cache memory lowers bus traffic. Accessing data from main memory involves transferring it over the system bus. The bus is a shared resource, and excessive traffic can slow data transfers. By using cache memory, the CPU accesses main memory less often, resulting in less bus traffic.
- Cache memory helps improve system scalability: by reducing memory latency, it keeps performance high as the load on the system grows.
How Cache Memory works
Whenever a request is made, the CPU first searches for the data in the cache memory and, if it is available there, fetches it from the cache. If the data is not found in the cache, the CPU searches for it in the (slower) primary memory and loads it into the cache. This ensures that frequently requested data stays available in the cache, reducing the number of times the main memory has to be accessed.
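This lookup-then-fall-back flow can be sketched as a toy read-through cache in Python. The `Cache` class and its counters are illustrative, not a real library; a dict plays the role of the slower primary store, and the counters set up the hit/miss accounting discussed next.

```python
class Cache:
    """Toy read-through cache that counts hits and misses."""

    def __init__(self, backing_store):
        self.store = backing_store  # the slower primary storage
        self.data = {}              # the fast in-memory cache
        self.hits = 0
        self.misses = 0

    def get(self, key):
        if key in self.data:        # found in the cache: a hit
            self.hits += 1
            return self.data[key]
        self.misses += 1            # not found: a miss
        value = self.store[key]     # fall back to primary storage
        self.data[key] = value      # keep it for future requests
        return value

c = Cache({"a": 1, "b": 2})
c.get("a")  # miss: loaded from the backing store, then cached
c.get("a")  # hit: served straight from the cache
c.get("b")  # miss: first request for "b"
```

After these three requests, `c.hits` is 1 and `c.misses` is 2, which is exactly the bookkeeping the next section formalizes.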
Cache Hit/Cache Miss
Cache Hit means the number of times the requested data is found in the cache memory.
Cache Miss means the number of times the requested data is not found in the cache memory.
For example, if a total of 10 requests were made and the data was found in the cache memory 6 times, then the data was not found 4 times.
Thus, there are 6 cache hits and 4 cache misses.
The performance of the cache memory is measured by the ratio of cache hits to the number of requests made. This ratio is known as the Hit Ratio.
Hit Ratio = Number of cache hits / number of requests made
Similarly, the Miss Ratio is the ratio of cache misses to the number of requests made.
Miss Ratio = Number of the cache miss / number of requests made
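Applying the two formulas to the worked example above (6 hits and 4 misses out of 10 requests) is a one-liner each:

```python
hits, misses = 6, 4            # numbers from the example above
requests = hits + misses       # 10 total requests

hit_ratio = hits / requests    # 6 / 10 = 0.6
miss_ratio = misses / requests # 4 / 10 = 0.4
```

Since every request is either a hit or a miss, the two ratios always sum to 1.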
Types of Cache Memory
- Level 1 Cache or L1: This is the first level of cache memory, located inside the processor. It is very small, typically ranging from 2 KB to 64 KB.
- Level 2 Cache or L2: This is the second level of cache memory, which may be inside or outside the CPU. Depending on the architecture, it can be shared between two cores. It is relatively larger, typically ranging from 256 KB to 512 KB.
- Level 3 Cache or L3: This is the third level of cache memory, located outside the CPU and shared by all of its cores. It is used to improve the performance of the L1 and L2 caches and typically ranges from 1 MB to 8 MB.
Cache vs RAM
- Cache memory is smaller, faster, and more expensive per byte; RAM is larger, slower, and cheaper.
- Cache sits on or very close to the CPU; RAM sits on separate modules connected over the memory bus.
- Cache is managed automatically by the hardware; RAM is managed by the operating system and programs.
Conclusion
In conclusion, cache memory plays an important role in enhancing the speed and efficiency of a system. By storing frequently accessed data and instructions, cache memory minimizes the time the CPU spends waiting on main memory, thus reducing latency and improving overall CPU performance.