Towards Provably Efficient Quantum Algorithms for Nonlinear Dynamics and Large-scale Machine Learning Models

Jin-peng Liu, UC Berkeley
February 1, 2023, 4:10–5:00 PM in 939 Evans and via https://berkeley.zoom.us/j/98667278310

Nonlinear dynamics play a prominent role in many domains and are notoriously difficult to solve. Whereas previous quantum algorithms for general nonlinear equations have been severely limited by the linearity of quantum mechanics, we give the first efficient quantum algorithm for nonlinear differential equations with sufficiently strong dissipation. This is an exponential improvement over the best previous quantum algorithms, whose complexity grows exponentially in the evolution time. We also establish a lower bound showing that nonlinear differential equations with sufficiently weak dissipation have worst-case complexity exponential in time, giving an almost tight classification of the quantum complexity of simulating nonlinear dynamics. Furthermore, we design end-to-end quantum machine learning algorithms that combine efficient quantum (stochastic) gradient descent with sparse state preparation and sparse state tomography. We benchmark instances of training sparse ResNet with up to 103 million parameters, and identify that the dissipative and sparse regime in the early phase of fine-tuning could receive a quantum enhancement. Our work shows that fault-tolerant quantum algorithms could potentially contribute to the scalability and sustainability of most state-of-the-art, large-scale machine learning models.