Parasitic-Aware Analog Circuit Sizing with Graph Neural Networks and Bayesian Optimization

Mingjie Liu (1,a), Walker J. Turner (2), George F. Kokai (2), Brucek Khailany (2), David Z. Pan (1), and Haoxing Ren (2,a)
(1) ECE Department, The University of Texas at Austin
(a) jay_liu@utexas.edu
(2) NVIDIA
(a) haoxingr@nvidia.com

ABSTRACT


Layout parasitics significantly impact the performance of analog integrated circuits, leading to discrepancies between schematic and post-layout performance and requiring several design iterations to achieve convergence. Prior work has accounted for parasitic effects during the initial design phase but relies on automated layout generation to estimate parasitics. In this work, we leverage recent developments in parasitic prediction with graph neural networks to eliminate the need for in-the-loop layout generation. We propose an improved surrogate performance model that uses parasitic graph embeddings from the pre-trained parasitic prediction network. We further leverage dropout as an efficient estimate of uncertainty for Bayesian optimization to automate transistor sizing. Experimental results demonstrate that the proposed surrogate model achieves a 20% better R^2 prediction score and improves optimization convergence by 3.7x and 2.1x compared with conventional Gaussian process regression and neural-network-based Bayesian linear regression, respectively. Furthermore, including parasitic prediction in the optimization loop guarantees satisfaction of all design constraints, whereas schematic-only optimization fails numerous constraints when verified with parasitic estimates.
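To illustrate the dropout-based uncertainty idea mentioned above, the following is a minimal PyTorch sketch (not the authors' code) of Monte-Carlo dropout supplying a predictive mean and standard deviation for a neural surrogate, which then feed a standard expected-improvement acquisition function. The names SurrogateNet, mc_dropout_predict, and expected_improvement, as well as the input dimensions, are hypothetical.

# Minimal sketch: MC dropout as an uncertainty estimate for a surrogate
# model driving Bayesian optimization. Assumes PyTorch; all names and
# shapes are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class SurrogateNet(nn.Module):
    """Predicts a circuit performance metric from sizing parameters
    (optionally concatenated with pre-trained parasitic graph embeddings)."""
    def __init__(self, in_dim, hidden=128, p_drop=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

def mc_dropout_predict(model, x, n_samples=32):
    """Keep dropout active at inference time and average stochastic
    forward passes to obtain a predictive mean and standard deviation."""
    model.train()  # enables dropout; this sketch has no batch-norm layers
    with torch.no_grad():
        preds = torch.stack([model(x) for _ in range(n_samples)], dim=0)
    return preds.mean(dim=0), preds.std(dim=0)

def expected_improvement(mean, std, best_so_far, eps=1e-9):
    """Standard EI acquisition (maximization), computed from the
    MC-dropout mean/std in place of a Gaussian-process posterior."""
    normal = torch.distributions.Normal(0.0, 1.0)
    z = (mean - best_so_far) / (std + eps)
    return (mean - best_so_far) * normal.cdf(z) + std * torch.exp(normal.log_prob(z))

# Usage: score a batch of candidate sizings and pick the most promising one.
model = SurrogateNet(in_dim=16)
candidates = torch.rand(256, 16)  # hypothetical normalized sizing vectors
mu, sigma = mc_dropout_predict(model, candidates)
ei = expected_improvement(mu, sigma, best_so_far=mu.max())
best_candidate = candidates[ei.argmax()]

Compared with a Gaussian process, this approach needs only repeated forward passes through an already-trained network, which is why dropout is attractive as a cheap uncertainty proxy inside the optimization loop.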


