Way Memoization to Reduce Fetch Energy in Instruction Caches
Albert Ma, Michael Zhang, and Krste Asanović

Summary: Way Memoization to Reduce Fetch Energy in Instruction Caches
Albert Ma, Michael Zhang, and Krste Asanović
MIT Laboratory for Computer Science, Cambridge, MA 02139
{ama|rzhang|krste}@lcs.mit.edu
Abstract
Instruction caches consume a large fraction of the total power in modern low-power microprocessors. In particular, set-associative caches, which are preferred because of lower miss rates, require greater access energy on hits than direct-mapped caches; this is because of the need to locate instructions in one of several ways. Way prediction has been proposed to reduce power dissipation in conventional set-associative caches; however, its application to CAM-tagged caches, which are commonly used in low-power designs, is problematic and has not been quantitatively examined. We propose way memoization as an alternative to way prediction. As in way prediction schemes, way memoization stores way information (links) within the instruction cache, but in addition maintains a valid bit per link that when set guarantees that the way link is valid. In contrast,
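The summary describes the core idea: each cache line keeps a link naming the way that holds the next line to fetch, plus a valid bit; when the bit is set, the cache can read that way's data directly and skip the energy-hungry tag search. The following is a minimal illustrative sketch of that behavior, not the authors' implementation — the class, replacement policy, and the `tag_probes` energy proxy are assumptions made for exposition.

```python
# Sketch of way memoization in a set-associative instruction cache.
# All names and policies here are illustrative assumptions.

class WayMemoCache:
    def __init__(self, num_sets=4, ways=2):
        self.num_sets, self.ways = num_sets, ways
        self.tags = [[None] * ways for _ in range(num_sets)]
        # Per-line sequential link: (valid, way) naming the way that holds
        # the next cache line in program order.
        self.links = [[(False, 0)] * ways for _ in range(num_sets)]
        self.tag_probes = 0  # energy proxy: count of full tag searches

    def _index(self, line_addr):
        return line_addr % self.num_sets

    def _lookup(self, line_addr):
        """Expensive path: probe every way's tag; fill on miss."""
        s, tag = self._index(line_addr), line_addr // self.num_sets
        self.tag_probes += 1
        for w in range(self.ways):
            if self.tags[s][w] == tag:
                return w
        w = line_addr % self.ways  # trivial replacement policy for the sketch
        self.tags[s][w] = tag
        self.links[s][w] = (False, 0)  # newly filled line: link unknown
        # NOTE: a real design must also invalidate links that point *to* a
        # replaced line, so a set valid bit can guarantee correctness.
        return w

    def fetch_seq(self, line_addr):
        """Fetch line_addr, then locate its sequential successor."""
        s, w = self._index(line_addr), self._lookup(line_addr)
        valid, link_way = self.links[s][w]
        nxt = line_addr + 1
        if valid:
            # Memoized link is guaranteed valid: no tag search needed.
            return link_way
        # Slow path: full lookup, then memoize the way for next time.
        lw = self._lookup(nxt)
        self.links[s][w] = (True, lw)
        return lw
```

Replaying the same fetch sequence a second time exercises the memoized link: the first pass costs two full tag searches (the line and its successor), while the second costs only one, since the successor's way is read straight from the link.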

  

Source: Asanović, Krste - Computer Science and Artificial Intelligence Laboratory & Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology (MIT)
Massachusetts Institute of Technology (MIT), Computer Science and Artificial Intelligence Laboratory, SCALE Group

 

Collections: Computer Technologies and Information Sciences