DeepPM: Transformer-based Power and Performance Prediction for Energy-Aware Software

Jun S. Shim1,a, Bogyeong Han1,b, Yeseong Kim2 and Jihong Kim1,c
1Department of Computer Science and Engineering, Seoul National University
ajsshim@davinci.snu.ac.kr
bbkhan@davinci.snu.ac.kr
cjihong@davinci.snu.ac.kr
2Department of Information and Communication Engineering, DGIST
yeseongkim@dgist.ac.kr

ABSTRACT


Many system-level management and optimization techniques need accurate estimates of power consumption and performance. Earlier research has proposed many high-level/source-level estimation models, particularly for basic blocks. However, most of them still need to execute the target software at least once on a fine-grained simulator or real hardware to extract the required features. This paper proposes a performance/power prediction framework, called Deep Power Meter (DeepPM), which accurately estimates both using only the compiled binary. Inspired by deep learning techniques in natural language processing, we convert program instructions into vector representations and predict the average power and performance of basic blocks with a transformer model. In addition, unlike existing works based on a Long Short-Term Memory (LSTM) model structure, which work well only for basic blocks with a small number of instructions, DeepPM provides highly accurate results for long basic blocks, which take up the majority of the execution time in actual application runs. In our evaluation conducted with the SPEC2006 benchmark suite, we show that DeepPM can provide accurate predictions of performance and power consumption with 10.2% and 12.3% error, respectively. DeepPM also outperforms the LSTM-based model, reducing prediction error by up to 67.2% for performance and 34.9% for power.
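The pipeline the abstract describes, treating binary instructions like words and feeding their vector representations through attention to regress a scalar power or latency value, can be illustrated with a toy, dependency-free sketch. Everything below (the example basic block, the token vocabulary, the embedding width, the random weights) is hypothetical and untrained; it only shows the shape of the computation, not DeepPM's actual model or tokenization.

```python
import math
import random

random.seed(0)

# Hypothetical basic block: each instruction is split into opcode and
# operand-kind tokens, mirroring the idea of treating binary
# instructions like words in a sentence.
block = [
    "mov reg mem",
    "add reg imm",
    "cmp reg reg",
    "jne label",
]

# Build a toy vocabulary over whitespace-split tokens.
tokens = [t for insn in block for t in insn.split()]
vocab = {tok: i for i, tok in enumerate(dict.fromkeys(tokens))}

D = 8  # embedding width (toy size; a real model would be much wider)
embed = {tok: [random.gauss(0, 1) for _ in range(D)] for tok in vocab}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

# One head of scaled dot-product self-attention over the token
# embeddings (queries = keys = values, i.e. no learned projections --
# enough to show the mechanism, not a trained transformer layer).
X = [embed[t] for t in tokens]
scale = math.sqrt(D)
attended = []
for q in X:
    weights = softmax([dot(q, k) / scale for k in X])
    attended.append([sum(w * v[j] for w, v in zip(weights, X))
                     for j in range(D)])

# Mean-pool the sequence and apply a (random) linear head to get a
# single scalar, standing in for the predicted power or cycle count.
pooled = [sum(row[j] for row in attended) / len(attended) for j in range(D)]
w_out = [random.gauss(0, 1) for _ in range(D)]
prediction = dot(pooled, w_out)
print(f"toy prediction for the block: {prediction:.3f}")
```

In a real setting the embeddings, attention projections, and output head would be trained end-to-end against measured power/performance labels, and the attention would be stacked into multiple transformer layers; the point here is only that the whole block, long or short, is attended to at once rather than consumed step by step as in an LSTM.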

Keywords: Power and Performance Modeling, System Resource Prediction, Transformer.



Full Text (PDF)