Title: Energy-Efficient Cache Architecture Towards Extreme-Scale Computing Systems
Ph.D. Candidate: Abdulrahman Alshegaifi
Major Advisor: Dr. Chun-Hsi Huang
Associate Advisors: Dr. Reda A. Ammar, Dr. Sanguthevar Rajasekaran
Date/Time: Friday, December 7, 2018, 10:00 AM
Location: H. Babbidge Library 1947 Conference Room
Abstract:
As we move to the era of exascale systems, where a thousand cores can be integrated on a single die, energy efficiency is among the most considerable impediments. Future exascale systems, capable of executing a thousand times as many operations per second as current petascale systems, are constrained by a power budget of 20 MW; a representative current supercomputer typically consumes 17.8 MW. Achieving exaflop performance with nearly the same power as today's supercomputers is a major research challenge and will force a radical change at all levels of the computing stack, including circuits, hardware architectures, software, and applications. One key contributor to processor energy consumption is the cache. Caches are among the main components in processors, and they play an important role in minimizing the speed gap between the CPU and main memory. In this work, we investigate an energy-efficient cache architecture towards extreme-scale computing systems. Specifically, we investigate an L1 data cache design that can save energy without sacrificing performance. Since caches are based on the principle of locality, we start by studying the data locality of two memory regions, i.e., stack and non-stack. Accordingly, we propose a high-performance non-unified data cache architecture. We evaluate the performance of the proposed non-unified data cache design in comparison to that of a conventional unified data cache. Subsequently, we investigate the energy savings of the non-unified cache architecture. Finally, we study the effect of replacement policies on each individual cache in the non-unified design in order to further optimize the overall performance of the proposed cache architecture.
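The core idea of the non-unified design, splitting the L1 data cache into a stack cache and a non-stack cache and routing each access by address region, can be illustrated with a minimal simulation sketch. This is not the dissertation's implementation; the stack boundary address, cache sizes, and direct-mapped organization below are illustrative assumptions only.

```python
class DirectMappedCache:
    """Tiny direct-mapped cache model that counts hits and misses."""

    def __init__(self, num_lines, line_size=64):
        self.num_lines = num_lines
        self.line_size = line_size
        self.tags = [None] * num_lines  # one tag slot per cache line
        self.hits = 0
        self.misses = 0

    def access(self, addr):
        line = addr // self.line_size      # cache-line number
        idx = line % self.num_lines        # index into the cache
        tag = line // self.num_lines       # remaining high bits
        if self.tags[idx] == tag:
            self.hits += 1
        else:
            self.tags[idx] = tag           # fill on miss
            self.misses += 1


# Assumed lower bound of the stack region (hypothetical, for illustration;
# a real design would take this from the runtime's memory layout).
STACK_BASE = 0x7FF0_0000_0000


class NonUnifiedL1D:
    """Route each memory access to a stack cache or a non-stack cache.

    Stack accesses tend to exhibit strong locality, so the stack cache
    can be kept small (and hence lower-energy) without hurting hit rate.
    """

    def __init__(self):
        self.stack_cache = DirectMappedCache(num_lines=64)       # small
        self.nonstack_cache = DirectMappedCache(num_lines=256)   # larger

    def access(self, addr):
        if addr >= STACK_BASE:
            self.stack_cache.access(addr)
        else:
            self.nonstack_cache.access(addr)
```

Driving this model with an address trace from a benchmark would let one compare hit rates of the split design against a single unified cache of equal total capacity, which is the kind of comparison the abstract describes.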