Lookup Table Allocation for Approximate Computing with Memory under Quality Constraints
Ye Tian, Qian Zhang, Ting Wang and Qiang Xu
Department of Computer Science & Engineering, The Chinese University of Hong Kong
Shenzhen Research Institute, The Chinese University of Hong Kong
tianye@cse.cuhk.edu.hk
qzhang@cse.cuhk.edu.hk
twang@cse.cuhk.edu.hk
qxu@cse.cuhk.edu.hk
ABSTRACT
Computation kernels in emerging recognition, mining, and synthesis (RMS) applications are inherently error-resilient, so approximate computing can be applied to improve their energy efficiency by trading off computational effort against output quality. One promising approximate computing technique is approximate computing with memory, which stores a subset of function responses in a lookup table (LUT) and avoids redundant computation when similar input patterns are encountered. Constrained by the limited memory space, most existing solutions simply store values for frequently appearing input patterns, without considering output quality and/or the intrinsic characteristics of the target kernel. In this paper, we propose a novel LUT allocation technique for approximate computing with memory that dramatically improves the LUT hit rate and hence achieves significant energy savings under given quality constraints. We also show how to apply the proposed LUT allocation solution to multiple computation kernels. Experimental results demonstrate the efficacy of our proposed methodology.
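To make the core idea concrete, the sketch below illustrates approximate computing with memory in its simplest form: a small LUT holds one precomputed response per quantized input pattern, so similar inputs hit the same entry and skip the exact computation. This is a minimal illustration, not the paper's allocation technique; the function `make_approx`, the uniform quantization scheme, and all parameter names are assumptions chosen for clarity.

```python
import math

def make_approx(fn, bits=6, lo=0.0, hi=1.0):
    """Minimal sketch of approximate computing with memory:
    precompute a small LUT over quantized inputs; similar inputs
    map to the same entry, avoiding the exact (expensive) call."""
    n = 1 << bits                      # LUT size is bounded by the memory budget
    step = (hi - lo) / n
    # Store one representative response per quantized input pattern
    # (here: the bin midpoint; a real allocator would pick entries
    # to maximize hit rate under the quality constraint).
    lut = [fn(lo + (i + 0.5) * step) for i in range(n)]

    def approx(x):
        if lo <= x < hi:               # hit: reuse the stored response
            return lut[int((x - lo) / step)]
        return fn(x)                   # miss: fall back to exact computation
    return approx

approx_sin = make_approx(math.sin, bits=6)
```

With uniform bins the output error is bounded by the bin half-width times the kernel's local slope, which is the quality/energy trade-off the abstract refers to: more LUT entries mean higher quality but more memory.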