NPU-Accelerated Imitation Learning for Thermal- and QoS-Aware Optimization of Heterogeneous Multi-Cores
Martin Rapp, Nikita Krohmer, Heba Khdr and Jörg Henkel
Chair for Embedded Systems, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
martin.rapp@kit.edu
nikita-krohmer@web.de
heba.khdr@kit.edu
henkel@kit.edu
ABSTRACT
Task migration and dynamic voltage and frequency scaling (DVFS) are indispensable means for the thermal optimization of a heterogeneous clustered multi-core processor under user-defined quality of service (QoS) targets. However, selecting the core to execute each application and the voltage/frequency (V/f) levels of each cluster is a complex problem because 1) the diverse characteristics and QoS targets of applications require different optimizations, and 2) V/f levels are often shared between cores on a cluster, which requires a global optimization considering all running applications. State-of-the-art techniques for power or temperature minimization either rely on measurements that are often not available (such as power) or fail to consider all the dimensions of the problem (e.g., by using simplified analytical models). Imitation learning (IL) makes the optimality of an oracle policy available at low run-time overhead by training a model from oracle demonstrations. We are the first to employ IL for temperature minimization under QoS targets. We tackle the complexity by using a neural network (NN) model and accelerate the NN inference using a neural processing unit (NPU). While such NN accelerators are becoming increasingly widespread on end devices, they have so far only been used to accelerate user applications. In contrast, we use an accelerator on a real platform to accelerate NN-based resource management. Our evaluation on a HiKey970 board with an Arm big.LITTLE CPU and an NPU shows significant temperature reductions at negligible overhead while satisfying QoS targets.
Keywords: Imitation Learning, Neural Networks, Neural Processing Unit, Temperature Minimization, Quality of Service.