ABC: Abstract prediction Before Concreteness

Jung-Eun Kim1,a, Richard Bradford2,b, Man-Ki Yoon1,c and Zhong Shao1,d

1Computer Science, Yale University, New Haven, CT, USA
2Commercial Avionics Engineering, Collins Aerospace, Cedar Rapids, IA, USA
ajung-eun.kim@yale.edu
brichard.bradford@collins.com
cman-ki.yoon@yale.edu
dzhong.shao@yale.edu

ABSTRACT

Learning techniques are advancing the utility and capability of modern embedded systems. However, incorporating learning modules into embedded systems is challenging because computing resources are scarce. For such resource-constrained environments, we have developed a framework that learns abstract information early and refines it into more concrete predictions as time allows. The intermediate results can be used to prepare early decisions or actions as needed. To apply this framework to a classification task, the dataset's labels are organized into an abstraction hierarchy; the framework then predicts intermediate labels from the most abstract level down to the most concrete. Our proposed method outperforms existing approaches and reference baselines in terms of accuracy. We demonstrate our framework with different architectures on the benchmark datasets CIFAR-10, CIFAR-100, and GTSRB, and we also measure prediction times on GPU-equipped embedded computing platforms.
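The abstract-to-concrete prediction described above can be illustrated with a minimal sketch. This is not the paper's implementation: the two-level grouping of CIFAR-10 labels into `animal`/`vehicle`, the function names, and the `budget_levels` parameter are all assumptions introduced for illustration. The idea is that fine-grained class scores are aggregated up the hierarchy, so an abstract label is available early, and a more concrete label is emitted only if the time budget allows.

```python
import math

# Hypothetical two-level abstraction hierarchy over the CIFAR-10 labels
# (the grouping is assumed here, not taken from the paper).
HIERARCHY = {
    "animal":  ["bird", "cat", "deer", "dog", "frog", "horse"],
    "vehicle": ["airplane", "automobile", "ship", "truck"],
}
FINE_LABELS = [label for children in HIERARCHY.values() for label in children]

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def predict_abstract_first(fine_scores, budget_levels=2):
    """Return predictions from most abstract to most concrete,
    stopping after `budget_levels` levels (anytime behavior)."""
    probs = dict(zip(FINE_LABELS, softmax(fine_scores)))
    predictions = []
    # Level 1 (abstract): pick the coarse class whose children have the
    # largest total probability mass.
    coarse = max(HIERARCHY, key=lambda c: sum(probs[l] for l in HIERARCHY[c]))
    predictions.append(coarse)
    if budget_levels >= 2:
        # Level 2 (concrete): refine within the chosen abstract class.
        fine = max(HIERARCHY[coarse], key=lambda l: probs[l])
        predictions.append(fine)
    return predictions

# Example: scores ordered as FINE_LABELS, with "cat" strongest.
scores = [0.0, 3.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
print(predict_abstract_first(scores, budget_levels=1))  # -> ['animal']
print(predict_abstract_first(scores))                   # -> ['animal', 'cat']
```

Under this scheme the abstract label (`animal`) is available after the first level, so a downstream controller could act on it early even if the concrete refinement never completes.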

Keywords: Adaptive Concreteness, Resource-Constrained System, Cyber-Physical System, Adaptive Neural Network.
