IAIFI Public Colloquia
We are pleased to announce the lineup for our Fall 2021 Colloquium series featuring leaders at the intersection of AI and Physics.
All times are Boston time. Please sign up for our mailing list to receive updates on future events.
You can also watch our colloquia live on YouTube.
- Surya Ganguli, Associate Professor, Applied Physics, Stanford University
- Friday, September 17, 2:00-3:00pm
- “Understanding computation using physics and exploiting physics for computation”
- YouTube Recording
- Talk Slides
- Abstract: We are witnessing an exciting interplay between physics, computation, and neurobiology that spans multiple directions. In one direction, we can use the power of complex systems analysis, developed in theoretical physics and applied mathematics, to elucidate design principles governing how neural networks, both biological and artificial, learn and function. In another direction, we can exploit novel physics to instantiate new kinds of quantum neuromorphic computers using spins and photons. We will give several vignettes in both directions, including: (1) determining the best optimization problem to solve in order to perform regression in high dimensions; (2) deriving the detailed structure of the primate retina by analyzing optimal convolutional auto-encoders of natural movies; (3) describing and analyzing a quantum associative memory instantiated in a multimode cavity QED system; (4) understanding the geometry and dynamics of high dimensional optimization in the classical limit of a dissipative many-body quantum optimizer composed of interacting photons.
- References: Y. Bahri, J. Kadmon, J. Pennington, S. Schoenholz, J. Sohl-Dickstein, and S. Ganguli, Statistical mechanics of deep learning, Annual Review of Condensed Matter Physics, 2020; M. Advani and S. Ganguli, Statistical mechanics of optimal convex inference in high dimensions, Physical Review X, 2016; M. Advani and S. Ganguli, An equivalence between high dimensional Bayes optimal inference and M-estimation, NeurIPS, 2016; S. Deny, J. Lindsey, S. Ganguli, and S. Ocko, The emergence of multiple retinal cell types through efficient coding of natural movies, NeurIPS, 2018; Y. Yamamoto, T. Leleu, S. Ganguli, and H. Mabuchi, Coherent Ising machines: quantum optics and neural network perspectives, Applied Physics Letters, 2020; B. P. Marsh, Y. Guo, R. M. Kroeze, S. Gopalakrishnan, S. Ganguli, J. Keeling, and B. L. Lev, Enhancing associative memory recall and storage capacity using confocal cavity QED, Physical Review X, 2020.
- Ben Wandelt, Director, Lagrange Institute
- Friday, October 1, 2:00-3:00pm
- “Learning the Universe”
- Abstract: Realizing the advances in cosmological knowledge we desire in the coming decade will require a new way for cosmological theory, simulation, and inference to interplay. As cosmologists, we wish to learn about the origin, composition, evolution, and fate of the cosmos from all accessible sources of astronomical data, such as the cosmic microwave background, galaxy surveys, or electromagnetic and gravitational wave transients. Traditionally, the field has progressed by designing, modeling, and measuring intuitive summaries of the data, such as 2-point correlations. This traditional approach has a number of risks and limitations: how do we know if we computed the most informative statistics? Did we omit any summaries that would have provided additional information or broken parameter degeneracies? Are current approximations to the likelihood and physical modeling sufficient? I will discuss simulation-based, full-physics modeling approaches to cosmology that are powered by new ways of designing and running simulations of cosmological observables and of confronting models with data. Innovative machine-learning methods play an important role in making this possible. The goal is to use current and next-generation data to reconstruct the cosmological initial conditions and to constrain cosmological physics much more completely than has been feasible in the past. I will discuss the current status and challenges of this new approach.
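- Illustration: the likelihood-free idea this abstract alludes to can be sketched with a toy rejection-ABC example: run the simulator at many parameter draws and keep those whose simulated summary statistic matches the observed one. The Gaussian simulator, prior, tolerance, and sample sizes below are illustrative assumptions, not from the talk.

```python
import numpy as np

# Toy simulation-based inference via rejection ABC: no likelihood is ever
# written down; we only compare simulated summaries to the observed one.
rng = np.random.default_rng(1)

def simulator(mu, n=400):
    """Forward model: n noisy observations with unknown mean mu."""
    return rng.normal(mu, 1.0, size=n)

mu_true = 0.7
s_obs = simulator(mu_true).mean()          # summary statistic of the "data"

# Sample from the prior, simulate, and keep parameters whose simulated
# summary lands within the tolerance of the observed summary.
prior_draws = rng.uniform(-3.0, 3.0, size=20000)
kept = [mu for mu in prior_draws
        if abs(simulator(mu).mean() - s_obs) < 0.05]

posterior = np.array(kept)
print(posterior.mean())   # concentrates near mu_true
```

In practice the tolerance, summary statistic, and number of simulations trade off accuracy against cost, which is one motivation for the learned, neural approaches to simulation-based inference the abstract points toward.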
- Rose Yu, Assistant Professor, Computer Science and Engineering, UC San Diego
- Friday, October 15, 2:00-3:00pm
- “Physics-Guided AI for Learning Spatiotemporal Dynamics”
- Abstract: Applications such as public health, transportation, and climate science often require learning complex dynamics from large-scale spatiotemporal data. While deep learning has shown tremendous success in these domains, it remains a grand challenge to incorporate physical principles in a systematic manner into the design, training, and inference of such models. In this talk, we will demonstrate how to integrate physics into AI models and algorithms in a principled way, achieving both prediction accuracy and physical consistency. We will showcase the application of these methods to problems such as forecasting COVID-19, traffic modeling, and accelerating turbulence simulations.
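- Illustration: one simple way to combine a data-fit objective with a physical constraint, in the spirit of this abstract, is penalized least squares; the free-fall model, penalty weight, and all numbers below are hypothetical choices, not the speaker's method.

```python
import numpy as np

# Fit a quadratic trajectory model y = a + b t + c t^2 to noisy free-fall
# data while softly enforcing the known physics c = -g/2. The combined
# objective is (data mean-squared error) + lam * (c + g/2)^2, which is
# quadratic and therefore solvable in closed form via its normal equations.
rng = np.random.default_rng(0)
g = 9.81
t = np.linspace(0.0, 2.0, 50)
y_obs = 10.0 + 2.0 * t - 0.5 * g * t**2 + rng.normal(0.0, 0.05, t.shape)

X = np.stack([np.ones_like(t), t, t**2], axis=1)
lam = 10.0                       # weight of the physics penalty
e2 = np.array([0.0, 0.0, 1.0])   # selects the quadratic coefficient c

# Normal equations of the penalized least-squares problem.
A = X.T @ X / len(t) + lam * np.outer(e2, e2)
rhs = X.T @ y_obs / len(t) + lam * (-0.5 * g) * e2
theta = np.linalg.solve(A, rhs)

print(theta)  # [a, b, c], with c pulled toward -g/2 ≈ -4.905
```

With deep networks the same pattern appears as an extra physics-residual term in the training loss rather than a closed-form solve, but the trade-off controlled by the penalty weight is the same.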
- Sven Krippendorf, Senior Researcher, Mathematical Physics and String Theory, Ludwig-Maximilians Universität
- Friday, October 29, 2:00-3:00pm
- “Theoretical Physicists’ Biases Meet Machine Learning”
- Abstract: Many recent successes in machine learning (ML) resemble the success story in theoretical particle physics of utilising symmetries as an organising principle. I discuss an introductory example where this procedure, applied in ML, leads to new insights into PDEs in mathematical physics, more precisely for the study of Calabi-Yau metrics. I then discuss methods for identifying symmetries of a system without requiring prior knowledge of such symmetries, including how to find a Lax pair/connection associated with integrable systems. If time permits, I discuss how latent representations in neural networks can offer a close resemblance to variables used in dual descriptions established analytically in physical systems.
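- Illustration: a minimal numerical version of "identifying symmetries of a system", assuming a toy function and hand-picked candidate transformations (the talk's methods learn these rather than enumerate them).

```python
import numpy as np

# Given samples of a function f, test candidate transformations and flag
# those that leave f invariant. Here f(x, y) = x^2 + y^2 is secretly
# rotation-invariant; the candidates are a rotation and an axis scaling.
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 2))
f = lambda p: (p ** 2).sum(axis=1)

def invariance_defect(transform):
    """Mean |f(T x) - f(x)| over the sample; ~0 means T is a symmetry."""
    return np.abs(f(pts @ transform.T) - f(pts)).mean()

theta = 0.8
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
scale = np.diag([1.5, 1.0])

print(invariance_defect(rot))    # ≈ 0: rotation is a symmetry of f
print(invariance_defect(scale))  # > 0: scaling is not
```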
- Yasaman Bahri, Research Scientist, Google Research, Brain Team
- Friday, November 12, 2:00-3:00pm
- “Understanding deep learning”
- Abstract: Deep neural networks are a rich class of function approximators that are now ubiquitous in many domains and enable new frontiers in physics and other sciences, but their function, limitations, and governing principles are not yet well-understood. I will overview a few results from a research program seeking to understand deep learning by proceeding scientifically. These investigations draw ideas, tools, and inspiration from theoretical physics, with close guidance from computational experiments, and integrate together perspectives from computer science and statistics. I will discuss some past highlights from the study of overparameterized neural networks — such as exact connections to Gaussian processes and linear models, as well as information propagation in deep networks — and then focus on emerging questions surrounding the role of scale (so-called “scaling laws”) and predictability.
- Kazuhiro Terao, Staff Scientist, SLAC National Accelerator Laboratory, Stanford University
- Friday, December 10, 2:00-3:00pm
- Details to come
IAIFI Internal Seminars
These talks are only open to IAIFI members and affiliates.
- Year 2 State of the IAIFI Town Hall
- Friday, September 10, 2:00-3:00pm
- Fabian Ruehle, Assistant Professor, Northeastern University
- Friday, September 24, 2:00-3:00pm
- “Learning metrics in extra dimensions”
- Abstract: String theory is a very promising candidate for a fundamental theory of our universe. An interesting prediction of string theory is that spacetime is ten-dimensional. Since we only observe four spacetime dimensions, the extra six dimensions must be small and compact, thus evading detection. These extra six-dimensional spaces, known as Calabi-Yau spaces, are very special and elusive. They are equipped with a special metric needed to make string theory consistent. This special property is expressed in terms of a (notoriously hard) type of partial differential equation. While we know, thanks to the heroic work of Calabi and Yau, that this PDE has a unique solution and hence that the metric exists, we neither know what it looks like nor how to construct it explicitly. However, the metric is an important quantity that enters many physical observables, e.g. particle masses. Thinking of the metric as a function that satisfies three constraints that enter the Calabi-Yau theorem, we can parameterize the metric as a neural network and formulate the problem as multiple continuous optimization tasks. The neural network is trained (akin to self-supervision) by sampling points from the Calabi-Yau space and imposing the constraints entering the theorem as customized loss functions.
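- Illustration: the recipe in this abstract (parameterize an unknown function, sample points from the domain, impose the defining equation as a loss) can be sketched on a toy 1D problem; everything below is a deliberately simple stand-in, not the actual Calabi-Yau setup.

```python
import numpy as np

# Toy version of the sample-and-penalize recipe: find u on [0, 1] solving
# u'' + pi^2 u = 0 with the normalization u(1/2) = 1 (exact answer:
# u = sin(pi x)). The "model" is a two-coefficient ansatz
# u = a sin(pi x) + b sin(2 pi x), trained by minimizing the mean squared
# PDE residual at randomly sampled collocation points plus the
# normalization penalty (a - 1)^2, in place of a real neural network.
rng = np.random.default_rng(0)
a, b = 0.5, 0.5
lr = 1e-3

for step in range(5000):
    x = rng.uniform(0.0, 1.0, size=128)       # sample collocation points
    s2 = np.sin(2 * np.pi * x)
    # PDE residual of the ansatz: u'' + pi^2 u = -3 pi^2 b sin(2 pi x)
    res = -3 * np.pi**2 * b * s2
    grad_b = np.mean(2 * res * (-3 * np.pi**2 * s2))
    grad_a = 2 * (a - 1.0)                    # from (u(1/2) - 1)^2
    a -= lr * grad_a
    b -= lr * grad_b

print(a, b)   # approaches the exact solution a = 1, b = 0
```

In the actual setting the ansatz is a neural network, the residual comes from the Monge-Ampère-type equation, and gradients are taken by automatic differentiation, but the training loop has this same shape.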
- Di Luo, IAIFI Fellow
- Friday, October 8, 2:00-3:00pm
- “Machine Learning for Quantum Many-body Physics”
- Abstract: The study of quantum many-body physics plays a crucial role across condensed matter physics, high energy physics, and quantum information science. Due to the exponentially growing nature of Hilbert space, challenges arise for exact classical simulation of the high dimensional wave function, which is the core object in quantum many-body physics. A natural question is whether machine learning, which is powerful for processing high dimensional probability distributions, can provide new methods for studying quantum many-body physics. In contrast to a standard high dimensional probability distribution, the wave function further exhibits complex phase structure and rich symmetries besides high dimensionality. This opens up a series of interesting questions for high dimensional optimization, sampling, and representation imposed by quantum many-body physics. In this talk, I will discuss recent advances in the field and present (1) neural network representations for quantum states with Fermionic anti-symmetry and gauge symmetries; (2) neural network simulations for ground state and real time dynamics in condensed matter physics, high energy physics, and quantum information science; (3) quantum control protocol discovery with machine learning.
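- Illustration: a tiny variational sketch in the spirit of the neural-network wavefunctions mentioned above: an RBM-style ansatz for a 2-spin transverse-field Ising model, with the energy minimized by gradient descent. The Hilbert space (dimension 4) is small enough to evaluate everything exactly; the specific ansatz, Hamiltonian, and optimizer are illustrative assumptions, not the talk's methods.

```python
import numpy as np

# H = -Z1 Z2 - h (X1 + X2); variational state: one-hidden-unit RBM-style
# amplitudes psi(s) = exp(a . s) * 2 cosh(b + w . s) over spin configs s.
h = 1.0
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I = np.eye(2)
H = -np.kron(Z, Z) - h * (np.kron(X, I) + np.kron(I, X))

spins = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)

def psi(theta):
    a, b, w = theta[:2], theta[2], theta[3:]
    return np.exp(spins @ a) * 2.0 * np.cosh(b + spins @ w)

def energy(theta):
    v = psi(theta)
    return (v @ H @ v) / (v @ v)    # Rayleigh quotient <psi|H|psi>/<psi|psi>

theta = np.array([0.1, -0.1, 0.05, 0.2, 0.1])
eps, lr = 1e-5, 0.02
for _ in range(5000):
    # Central finite-difference gradient (autodiff in real implementations).
    grad = np.array([
        (energy(theta + eps * e) - energy(theta - eps * e)) / (2 * eps)
        for e in np.eye(len(theta))
    ])
    theta -= lr * grad

e0 = np.linalg.eigvalsh(H)[0]       # exact ground-state energy
print(energy(theta), e0)            # variational energy approaches e0
```

For realistic system sizes the energy cannot be summed exactly; it is instead estimated by Monte Carlo sampling of configurations, which is where the sampling questions raised in the abstract enter.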
- Cengiz Pehlevan, Assistant Professor, Applied Mathematics, Harvard University (SEAS)
- Friday, October 22, 2:00-3:00pm
- Details to come
- Bryan Ostdiek, Postdoctoral Fellow, Theoretical Particle Physics, Harvard University
- Friday, November 5, 2:00-3:00pm
- Details to come
- Tess Smidt, Assistant Professor, EECS, MIT
- Friday, November 19, 2:00-3:00pm
- Details to come
- Harini Suresh
- Friday, December 10, 2:00-3:00pm
- “Ethics in AI”
Upcoming workshops involving members of the IAIFI community:
- Gaps & Frontier in AI4Science Workshop, NeurIPS 2021, December 13, 2021
- Interplay of Fundamental Physics and Machine Learning, Aspen Center for Physics, May 29-June 19, 2022
- Machine Learning at GGI - Galileo Galilei Institute, August 22-September 30, 2022
Related Public Events
Other organizations that hold public events relevant to the IAIFI community: