Long Short-Term Memory Neural Network-Based Power Forecasting of Multi-Core Processors

Mark Sagi1,a, Martin Rapp2,a, Heba Khdr2,b, Yizhe Zhang1,b, Nael Fasfous1,c, Nguyen Anh Vu Doan1,d, Thomas Wild1,e, Jörg Henkel2,c and Andreas Herkersdorf1,f
1 Chair of Integrated Systems (LIS), Technical University of Munich (TUM), Germany
a mark.sagi@tum.de
b yizhe.zhang@tum.de
c nael.fasfous@tum.de
d anhvu.doan@tum.de
e thomas.wild@tum.de
f herkersdorf@tum.de
2 Chair for Embedded Systems (CES), Karlsruhe Institute of Technology (KIT), Germany
a martin.rapp@kit.edu
b heba.khdr@kit.edu
c henkel@kit.edu

ABSTRACT


We propose a novel technique to forecast the power consumption of processor cores at run-time. Power consumption varies strongly across different running applications and within their execution phases. Accurately forecasting future power changes is highly relevant for proactive power/thermal management. While forecasting power is straightforward for known or periodic workloads, the challenge of forecasting it for general, unknown workloads at different voltage/frequency (v/f) levels remains unsolved. Our technique is based on a long short-term memory (LSTM) recurrent neural network (RNN) that forecasts the average power consumption over both the next 1 ms and the next 10 ms period. The run-time inputs to the LSTM RNN are current and past power information as well as performance counter readings. The LSTM RNN enables this forecasting through its ability to preserve the history of power and performance counters. It needs to be trained only once at design-time and adapts at run-time to different system behavior through its internal memory. We demonstrate that our approach accurately forecasts power for unseen applications at different v/f-levels. The experimental results show that the forecasts of our LSTM RNN achieve a 43% lower worst-case error for the 1 ms forecasts and a 38% lower worst-case error for the 10 ms forecasts, compared to the state of the art.
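
To make the forecasting setup concrete, the following is a minimal sketch, not the authors' implementation, of an LSTM regressor that maps a window of per-period features (past and current power plus performance counter readings) to the average power of the next period. The use of PyTorch, the feature count, the hidden size, and the window length are all illustrative assumptions.

    import torch
    import torch.nn as nn

    class PowerForecastLSTM(nn.Module):
        """Forecasts the next-period average power from past power and counter readings."""
        def __init__(self, n_features=5, hidden_size=32):
            super().__init__()
            # batch_first=True: inputs are shaped (batch, time, features)
            self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, 1)  # scalar power forecast

        def forward(self, x, state=None):
            # x: (batch, time, n_features); 'state' carries the LSTM memory
            # across calls, which is how history is preserved at run-time.
            out, state = self.lstm(x, state)
            # Regress the forecast from the last time step's hidden state.
            return self.head(out[:, -1, :]), state

    # Usage sketch: 16 traces, 10 time steps, 5 features (normalized power + counters).
    model = PowerForecastLSTM()
    x = torch.randn(16, 10, 5)
    forecast, state = model(x)  # forecast of next-period average power
    loss = nn.functional.mse_loss(forecast, torch.randn(16, 1))  # design-time training loss

Passing the returned state forward on subsequent calls mirrors the point made above: the internal memory is what lets a network trained once at design-time adapt to changing system behavior at run-time.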

Keywords: Power Forecasting, Recurrent Neural Network, Multi-/Many-Core.


