Accelerating Quantum Approximate Optimization Algorithm using Machine Learning

Mahabubul Alam (a), Abdullah Ash-Saki (b) and Swaroop Ghosh (c)

Department of Electrical Engineering, Pennsylvania State University, University Park, USA
(a) mxa890@psu.edu
(b) ash.saki@psu.edu
(c) szg212@psu.edu

ABSTRACT

We propose a machine learning-based approach to accelerate the implementation of the quantum approximate optimization algorithm (QAOA), a promising quantum-classical hybrid algorithm for demonstrating so-called quantum supremacy. In QAOA, a parametric quantum circuit and a classical optimizer iterate in a closed loop to solve hard combinatorial optimization problems. The performance of QAOA improves with an increasing number of stages (depth) in the quantum circuit. However, each added stage introduces two new parameters for the classical optimizer, increasing the number of optimization-loop iterations. We note a correlation between the parameters of lower-depth and higher-depth QAOA implementations and exploit it by developing a machine learning model that predicts gate parameters close to their optimal values. As a result, the optimization loop converges in fewer iterations. We choose the graph MaxCut problem as a prototype to solve using QAOA. We perform a feature extraction routine over 100 different QAOA instances and develop a training data-set with 13,860 optimal parameters. We present our analysis for 4 flavors of regression models and 4 flavors of classical optimizers. Finally, we show that the proposed approach can curtail the number of optimization iterations by 44.9% on average (up to 65.7%) in an analysis performed with 264 flavors of graphs.
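The warm-start idea described in the abstract can be illustrated with a minimal sketch: a regression model is trained on parameters from lower-depth QAOA instances and its prediction is used to initialize the classical optimizer for a new instance. This is only an illustration of the pipeline, not the paper's actual implementation; the feature vectors, the RandomForestRegressor model, the COBYLA optimizer, and the stand-in objective function below are all assumptions chosen for a self-contained example.

    # Sketch of an ML warm start for the QAOA parameter optimization loop.
    # Assumes scikit-learn and SciPy; the QAOA expectation routine is a placeholder.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Placeholder training set: features derived from lower-depth QAOA instances
    # (hypothetical here), paired with the corresponding higher-depth optimal
    # gate parameters (gamma/beta) that the model should learn to predict.
    X_train = rng.uniform(0.0, np.pi, size=(500, 4))   # hypothetical feature vectors
    y_train = rng.uniform(0.0, np.pi, size=(500, 4))   # hypothetical optimal parameters

    # Random forests handle vector-valued targets natively, so one model can
    # predict all gate parameters of the deeper circuit at once.
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    def qaoa_expectation(params):
        """Stand-in for the MaxCut expectation value of the parametric QAOA
        circuit; a real implementation would simulate or execute the circuit."""
        return float(np.sum(np.sin(params) ** 2))

    # Predict near-optimal parameters for a new instance and hand them to the
    # classical optimizer as the initial guess, so the loop needs fewer
    # iterations than it would from a random starting point.
    x_new = rng.uniform(0.0, np.pi, size=(1, 4))
    theta0 = model.predict(x_new)[0]
    result = minimize(qaoa_expectation, theta0, method="COBYLA",
                      options={"maxiter": 200})
    print("objective evaluations:", result.nfev, "final value:", result.fun)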


