The cache system can be located between the CPU and the MMU (i.e., a logical cache), or between the MMU and the system random access memory (i.e., a physical cache). What factors determine the optimum location of cache memory?
What will be an ideal response?
If the cache is on the CPU side of the MMU, it caches logical addresses (i.e., the addresses generated by the
processor). The advantage of this is speed: the cache does not have to wait for the MMU to translate the address
before performing a lookup. However, when a context switch (task switch) takes place and the MMU page tables
are reloaded, the mapping between logical and physical memory changes and the logical cache must be flushed.
This takes time.
Another problem with a logical cache is that more than one logical address can map to the same physical address
(shared data or code). If two virtual addresses that map to the same physical location are accessed, the cache holds
a separate entry for each of them even though they refer to the same data. It is then possible for the data to be
updated through one cached logical address, leaving the entry for the other one stale.
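A minimal sketch of this aliasing (synonym) problem follows; again, the names and addresses are purely illustrative assumptions. Two virtual addresses map to the same physical frame, the virtually keyed cache holds a separate line for each, and a write through one alias leaves the other line stale.

# A toy illustration (hypothetical names) of the aliasing problem: two
# virtual addresses map to the same physical frame, a virtually keyed
# cache stores them as separate lines, and a write through one alias
# leaves the other line stale.

PAGE_TABLE = {0x1000: 0x9000, 0x7000: 0x9000}    # both map to frame 0x9000

physical_memory = {0x9000: "old value"}
logical_cache = {}                               # virtual address -> data

def cached_read(vaddr):
    if vaddr not in logical_cache:               # miss: translate and fill
        logical_cache[vaddr] = physical_memory[PAGE_TABLE[vaddr]]
    return logical_cache[vaddr]

def cached_write(vaddr, data):
    logical_cache[vaddr] = data                  # update this alias's line only
    physical_memory[PAGE_TABLE[vaddr]] = data    # write through to memory

cached_read(0x1000)                              # both aliases now cached
cached_read(0x7000)
cached_write(0x1000, "new value")                # update via one alias only

print(cached_read(0x1000))   # "new value"
print(cached_read(0x7000))   # "old value"  <- stale copy of the same location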
If the cache is on the memory side of the MMU, the addresses cached are physical addresses. This incurs a delay
on every access because address translation must complete before the lookup can begin. However, because physical
addresses are cached, there is no aliasing problem, a context (task) switch does not invalidate the contents, and the
cache need not be flushed. The optimum location is therefore determined by the trade-off between raw lookup
speed on one hand and the cost of flushing on context switches and of handling aliases on the other.
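A corresponding sketch of a physical cache (hypothetical names and addresses once more) shows both sides of the trade-off: every access pays for a translation before the lookup, but aliases collapse onto a single line and a context switch requires no flush.

# A toy model (hypothetical names) of a physical cache: every access pays
# for an MMU translation before the lookup, but lines are keyed by physical
# address, so aliases collapse onto one line and nothing needs flushing
# when the page tables are reloaded on a context switch.

task_a_pages = {0x1000: 0x9000, 0x7000: 0x9000}  # two aliases of frame 0x9000
task_b_pages = {0x1000: 0x4000}                  # same vaddr, different frame

physical_memory = {0x9000: "task A's data", 0x4000: "task B's data"}
physical_cache = {}                              # physical address -> data

def cached_read(page_table, vaddr):
    paddr = page_table[vaddr]                    # translation before lookup
    if paddr not in physical_cache:              # (extra latency on every access)
        physical_cache[paddr] = physical_memory[paddr]
    return physical_cache[paddr]

# Aliases share a single cache line, so the stale-data problem disappears.
assert cached_read(task_a_pages, 0x1000) is cached_read(task_a_pages, 0x7000)

# Context switch to task B: no flush needed; the new translation simply
# selects a different physical line.
assert cached_read(task_b_pages, 0x1000) == "task B's data"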