HMD-Hardener: Adversarially Robust and Efficient Hardware-Assisted Runtime Malware Detection

Abhijitt Dhavlle1,a, Sanket Shukla1,b, Setareh Rafatirad2,a, Houman Homayoun2,b and Sai Manoj Pudukotai Dinakarrao1,c
1George Mason University, Fairfax, VA, USA
aadhavlle@gmu.edu
bsshukla4@gmu.edu
cspudukot@gmu.edu
2University of California Davis, Davis, CA, USA
asrafatirad@ucdavis.edu
bhhomayoun@ucdavis.edu

ABSTRACT
To overcome the performance overheads incurred by traditional software-based malware detection techniques, machine learning (ML) based Hardware-assisted Malware Detection (HMD) has emerged as a promising approach to detecting malicious applications. HMD relies primarily on low-level microarchitectural events captured through Hardware Performance Counters (HPCs). This work proposes an adversarial attack on HMD systems that tampers with their security by introducing perturbations into the performance counter traces via an adversarial sample generator application. To craft the attack, we first deploy an adversarial sample predictor that predicts, for a given application, the adversarial HPC pattern required to make the ML classifier deployed in the HMD misclassify it. Since the attacker has no direct access to manipulate the HPCs generated at runtime, we then devise, based on the predictor’s output, an adversarial sample generator that is wrapped around the victim application and produces HPC patterns close to the predictor’s estimated trace. With the proposed attack, malware detection accuracy drops from 82% to 18.1%. To render the HMD robust to such attacks, we further propose adversarially training the detector, demonstrating that hardening restores resilience: post-hardening detection accuracy rises to 81.2%.
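The hardening step described above, retraining the detector on adversarially perturbed HPC traces, can be sketched on synthetic data. Everything here is an illustrative assumption, not the paper's implementation: the logistic-regression detector stands in for the HMD's ML classifier, the synthetic Gaussian features stand in for HPC traces, and the sign-based perturbation stands in for the adversarial sample generator.

```python
# Illustrative sketch only: a stand-in detector, synthetic "HPC" features,
# and a simple sign-based evasion; none of this is the paper's actual system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, d = 500, 8  # samples per class, number of (hypothetical) HPC features

benign = rng.normal(0.0, 1.0, (n, d))   # synthetic benign HPC traces
malware = rng.normal(2.0, 1.0, (n, d))  # synthetic malware HPC traces
X = np.vstack([benign, malware])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = benign, 1 = malware

# Baseline HMD classifier.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Evasion: nudge each malware trace against the classifier's weight signs,
# mimicking an adversarial generator that shifts HPC patterns toward benign.
eps = 1.5
X_adv = malware - eps * np.sign(clf.coef_[0])
acc_attack = clf.predict(X_adv).mean()  # fraction of evasive samples still caught

# Adversarial training: augment the training set with the perturbed malware,
# correctly labeled, and refit the detector.
X_aug = np.vstack([X, X_adv])
y_aug = np.concatenate([y, np.ones(n)])
robust = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)

# Re-mount the same attack against the hardened model.
X_adv2 = malware - eps * np.sign(robust.coef_[0])
acc_robust = robust.predict(X_adv2).mean()

print(f"detection on adversarial samples: baseline {acc_attack:.2f}, "
      f"hardened {acc_robust:.2f}")
```

On this toy data the hardened model recovers a substantially higher detection rate on perturbed samples, mirroring the accuracy recovery the abstract reports, though the concrete numbers depend entirely on the synthetic setup.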


