Discrete Samplers for Approximate Inference in Probabilistic Machine Learning

Shirui Zhao1,a, Nimish Shah1,b, Wannes Meert2 and Marian Verhelst1,c
1MICAS-ESAT, KU Leuven
ashirui.zhao@kuleuven.be
bnimish.shah@kuleuven.be
cmarian.verhelst@kuleuven.be
2DTAI, KU Leuven
wannes.meert@kuleuven.be

ABSTRACT


Probabilistic models (PMs) and probabilistic inference offer advantages when dealing with small datasets or uncertainty in the observed data, and make it possible to integrate expert knowledge and build interpretable models. The main challenge of using PMs in practice is that their inference is very compute-intensive. Custom hardware architectures for both exact and approximate inference of PMs have therefore been proposed in the state of the art (SotA). The throughput, energy and area efficiency of approximate PM inference accelerators are strongly dominated by the sampler blocks required to sample from arbitrary discrete distributions. This paper proposes and studies novel discrete sampler architectures towards efficient and flexible hardware implementations for PM accelerators. Both cumulative distribution table (CDT) and Knuth-Yao (KY) based sampling algorithms are assessed, and different sampler hardware architectures are implemented for each. Innovation is brought in terms of a reconfigurable CDT sampling architecture with a flexible range, and a reconfigurable Knuth-Yao sampling architecture that supports both a flexible range and dynamic precision. All architectures are benchmarked on real-world Bayesian networks, demonstrating up to 13× energy efficiency and 11× area efficiency improvements of the optimized reconfigurable Knuth-Yao sampler over the traditional linear CDT-based samplers used in the PM SotA.
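The two sampling algorithms the abstract names can be sketched in software as follows. This is a minimal, illustrative Python sketch only: the paper's contribution is reconfigurable fixed-point hardware, and all function names here are our own. CDT sampling is inverse-transform sampling via a cumulative table; Knuth-Yao walks a discrete distribution generating (DDG) tree, consuming one random bit per level.

```python
import bisect


def build_cdt(probs):
    """Cumulative distribution table (CDT) for a discrete distribution."""
    cdf, acc = [], 0.0
    for p in probs:
        acc += p
        cdf.append(acc)
    cdf[-1] = 1.0  # absorb floating-point rounding in the last entry
    return cdf


def cdt_sample(cdf, u):
    """Inverse-transform sampling: index of the first CDT entry exceeding u."""
    return bisect.bisect_right(cdf, u)


def to_bit_matrix(probs, k):
    """k-bit fractional binary expansion of each probability (assumes the
    probabilities are exactly representable in k bits and sum to 1)."""
    ints = [round(p * (1 << k)) for p in probs]
    return [[(n >> (k - 1 - j)) & 1 for j in range(k)] for n in ints]


def knuth_yao_sample(pmat, rand_bit):
    """Knuth-Yao DDG-tree walk: consume one random bit per tree level and
    return a symbol index once the walk reaches a terminal node."""
    d = 0  # distance from the right edge of the current DDG-tree level
    for col in range(len(pmat[0])):
        d = 2 * d + rand_bit()
        for row in range(len(pmat) - 1, -1, -1):
            d -= pmat[row][col]
            if d == -1:
                return row
    raise ValueError("probabilities do not sum to 1 at this precision")
```

For a toy distribution such as [0.5, 0.25, 0.25], the Knuth-Yao walk consumes a number of random bits close to the distribution's entropy, whereas a CDT lookup needs one full-precision uniform sample per draw; this difference in random-bit consumption and table storage underlies the efficiency comparison made in the paper.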

Keywords: Probabilistic Models, Approximate Inference, Discrete Sampling, CDT Algorithm, Knuth-Yao Algorithm.
