The Kavli Institute for Theoretical Physics (KITP) organises wonderful few-week-long workshops throughout the year, and I recently attended one on machine learning in many-body physics. The general idea of these workshops is that you get an office to work in, there are only one or two talks a day, and the participants get to interact in an informal but super stimulating atmosphere. Also, you get an incredible ocean view! If this sounds good, you should check out the KITP webpage to see if there is a workshop that interests you (you can also propose a new one).
In this post I would like to round up a few things that I found most interesting and useful.
- Efficient Quantum Tomography: Juan Carrasquilla and collaborators propose using generative models to parametrise the measurement statistics of informationally complete POVMs: Learning Quantum States with Generative Models (a toy sketch of this idea follows the list below)
- Unsupervised Learning of Quantum States is Hard: Ying-Jer Kao and collaborators show how to use reinforcement learning to generate spin-ice states: Generation of topologically constrained states through deep reinforcement learning
- Generative models tutorial: Michael Albergo gave a great overview and intuitive explanation of the generative ML models out there: An Overview of Recent Generative Models
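To make the first bullet a bit more concrete, here is a minimal toy sketch of the general idea: train an autoregressive model on strings of single-qubit POVM outcomes, so that the learned distribution P(a_1, ..., a_N) stands in for the state. This is my own illustration, not the authors' code; the architecture, system size, and the random "measurement" data are all hypothetical placeholders.

```python
# Toy sketch (not the authors' implementation): an autoregressive model over
# POVM outcome strings, trained by maximum likelihood on measurement samples.
import torch
import torch.nn as nn

N_QUBITS = 4        # hypothetical system size
N_OUTCOMES = 4      # e.g. a 4-outcome (tetrahedral) single-qubit POVM

class AutoregressivePOVM(nn.Module):
    """GRU modelling P(a_i | a_1, ..., a_{i-1}) over POVM outcomes."""
    def __init__(self, n_outcomes=N_OUTCOMES, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(n_outcomes + 1, hidden)  # +1 for a start token
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_outcomes)

    def log_prob(self, outcomes):
        # outcomes: (batch, N_QUBITS) integer tensor of measured POVM outcomes
        start = torch.full((outcomes.shape[0], 1), N_OUTCOMES, dtype=torch.long)
        inputs = torch.cat([start, outcomes[:, :-1]], dim=1)
        hidden_states, _ = self.gru(self.embed(inputs))
        logits = self.head(hidden_states)                   # (batch, N_QUBITS, N_OUTCOMES)
        logp = torch.log_softmax(logits, dim=-1)
        return logp.gather(-1, outcomes.unsqueeze(-1)).squeeze(-1).sum(-1)

# Toy training loop on fake data: random outcomes stand in for real measurements.
model = AutoregressivePOVM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
data = torch.randint(0, N_OUTCOMES, (256, N_QUBITS))
for step in range(100):
    loss = -model.log_prob(data).mean()                     # negative log-likelihood
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once trained on real measurement data, such a model can be sampled to estimate expectation values of observables directly from the reconstructed outcome statistics.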
You can find the full list of talks here. I also recommend watching the talks by Aleksander Kubica, Marin Bukov, and Evert van Nieuwenburg; they all do very cool stuff.
Overall, it seems to me that while there was no single mind-blowing result that changed how we think about physics, there has been a tremendous amount of progress. The one place where ML methods seem to provide the most advantage is the processing and analysis of experimental data, but the many-body state ansatzes, error-correction improvements, and phase-transition analyses definitely offer very cool insights.