How does your CPU Cache work?
Author: Stephen Orgill
What is the CPU Cache?
The cache on your CPU has become a very important part of today's computing. The cache is a small block of very high-speed, very expensive memory that is used to speed up the memory retrieval process. Because it is so expensive, CPUs come with a relatively small amount of cache compared with the main system memory. Budget CPUs have even less cache; this is one of the main ways the top processor manufacturers take cost out of their budget chips.
How does the CPU Cache work?
Without cache memory, every time the CPU requested data it would send a request to the main memory, and the data would then be sent back across the memory bus to the CPU. This is a slow process in computing terms. The idea of the cache is that this extremely fast memory stores any data that is frequently accessed and, if possible, the data around it, to give the CPU the quickest possible response time. It's based on playing the percentages: if a certain piece of data has been requested five times before, it's likely that this specific piece of data will be required again, and so it is stored in the cache memory.
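To make the idea concrete, here is a rough sketch in Python. It is only an illustration of the principle: real caches live in hardware and work on blocks of memory, not dictionaries, and the delay here is just a stand-in for the trip across the memory bus.

```python
import time

main_memory = {addr: addr * 2 for addr in range(1000)}  # pretend data store
cache = {}                                               # the small, fast cache

def read(addr):
    if addr in cache:            # fast path: the data is already cached
        return cache[addr]
    time.sleep(0.001)            # simulate the slow trip across the memory bus
    value = main_memory[addr]
    cache[addr] = value          # keep it around for the next request
    return value

read(42)   # slow: the first access goes to main memory
read(42)   # fast: the second access is served from the cache
```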
Let's take a library as an example of how caching works. Imagine a large library with only one librarian (the standard single-CPU setup). The first person comes into the library and asks for Lord of the Rings. The librarian goes off, follows the path to the bookshelves (the memory bus), retrieves the book and gives it to the person. The book is returned to the library once it's finished with. Without a cache, the book goes straight back to the shelf, so when the next person arrives and asks for Lord of the Rings, the same process happens and takes the same amount of time.
If this library had a cache system, then once the book was returned it would be put on a shelf at the librarian's desk. When the second person comes in and asks for Lord of the Rings, the librarian only has to reach down to the shelf to retrieve it, which significantly reduces the time it takes to get the book. Back in computing terms it's the same idea: data in the cache is retrieved much more quickly. The computer uses its logic to determine which data is the most frequently accessed and keeps those books on the shelf, so to speak.
That is a one-level cache system, which is used in most hard drives and other components. CPUs, however, use a two-level cache system. The principles are the same: the level 1 cache is the fastest and smallest memory, and the level 2 cache is larger and slightly slower, but still smaller and faster than the main memory. Going back to the library, when Lord of the Rings is returned this time it will be stored on the shelf. Then the library gets busy, lots of other books are returned, and the shelf soon fills up. Lord of the Rings hasn't been taken out for a while, so it gets taken off the shelf and put into a bookcase behind the desk. The bookcase is still closer than the rest of the library and still quick to get to. Now when the next person comes in asking for Lord of the Rings, the librarian will first look on the shelf and see that the book isn't there, then check the bookcase. CPUs do the same: they check the L1 cache first and then check the L2 cache for the data they require.
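The shelf-and-bookcase idea can be sketched in code too. The capacities and the "evict the oldest entry" policy below are made-up assumptions purely for illustration; real CPUs use fixed-size hardware caches with their own replacement policies.

```python
from collections import OrderedDict

L1, L2 = OrderedDict(), OrderedDict()   # ordered so we can evict the oldest entry
L1_SIZE, L2_SIZE = 4, 16                # tiny, invented capacities

def read(addr, main_memory):
    if addr in L1:                       # check the fastest level first (the shelf)
        return L1[addr]
    if addr in L2:                       # then the larger, slower level (the bookcase)
        value = L2.pop(addr)
    else:
        value = main_memory[addr]        # finally, the slow main memory (the library)
    L1[addr] = value                     # promote the data into L1
    if len(L1) > L1_SIZE:                # L1 full: demote the oldest entry to L2
        old_addr, old_val = L1.popitem(last=False)
        L2[old_addr] = old_val
        if len(L2) > L2_SIZE:            # L2 full: drop the oldest entry entirely
            L2.popitem(last=False)
    return value
```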
Is more Cache always better?
The answer is mostly yes, but certainly not always. The main drawback of having a lot of cache memory is that the CPU always checks the cache before the main system memory. Look at our library again: suppose 20 different people come in, all after different books that haven't been taken out in quite a while, and the library has been busy, so the shelf and the bookcase are both full. Each time a person asks for a book, the librarian checks the shelf, then checks the bookcase, before realising the book must be in the main library and trotting off to fetch it. If this library had no cache system it would actually be quicker in this case, because the librarian would go straight to the book in the main library instead of checking the shelf and the bookcase first.
Since cache-free systems win only in circumstances like this, for most applications CPUs are definitely better off with a decent amount of cache. Applications such as MPEG encoders, however, are not good cache users, because they work through a constant stream of completely different data, as the sketch below illustrates.
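Here is a rough illustration of why a streaming workload defeats the cache: every address is new, so every lookup is a miss and the cache check is pure overhead. The loop and sizes are invented for the example.

```python
cache = {}
hits = misses = 0

for addr in range(10_000):        # a stream of data that never repeats
    if addr in cache:
        hits += 1
    else:
        misses += 1
        cache[addr] = addr        # cached, but it will never be asked for again

print(hits / (hits + misses))     # the hit rate is 0.0 for this access pattern
```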
Does cache only store frequently accessed data?
If the cache memory has space, it will also store data that is close to the frequently accessed data. Back in our library: if the first person of the day comes in and takes out Lord of the Rings, an intelligent librarian might also place Lord of the Rings Part II on the shelf. When that person brings back the first book, there is a good chance they will ask for Part II, and this will happen more often than not, so it was well worth the librarian fetching the second volume in case it was required.
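A toy version of that prefetching idea might look like the following. The "fetch the next address too" rule is an assumption made for the example; real CPUs prefetch whole cache lines and adjacent blocks rather than single values.

```python
cache = {}

def read(addr, main_memory):
    if addr in cache:
        return cache[addr]
    cache[addr] = main_memory[addr]                # fetch the requested data
    neighbour = addr + 1
    if neighbour in main_memory:
        cache[neighbour] = main_memory[neighbour]  # prefetch the data next to it
    return cache[addr]
```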
Cache Hit and Cache Miss
Cache hit and cache miss are simply terms for how well the CPU's cache is doing its job. When the CPU looks in its cache for data, it will either find it or it won't. If the CPU finds what it's after, that's called a cache hit. If it has to go to main memory to find it, that's called a cache miss. The percentage of hits out of all cache requests is called the hit rate, and you want it to be as high as possible for the best performance.
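In other words, the hit rate is just hits divided by total lookups. The numbers below are made up for the sake of the arithmetic.

```python
hits, misses = 950, 50
hit_rate = hits / (hits + misses)
print(f"hit rate: {hit_rate:.0%}")   # 95% - the higher, the better the cache is working
```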
Article Source: http://www.articlealley.com/article_34618_10.html
About the Author: Stephen Orgill
Editor - www.pantherproducts.co.uk
Computer related articles and reviews
http://www.pantherproducts.co.uk