Scientific machine learning
I have been increasingly interested in ML/AI, particularly in integrating mathematical and physical structure into neural networks; my work therefore centers on structure-preserving ML. Scientific machine learning (SciML) offers a promising approach to problems where traditional numerical algorithms fall short. For instance, when designed properly, SciML can provide an alternative way to study dynamical systems.
Fast neural Poincaré maps for toroidal magnetic fields
Poincaré maps for toroidal magnetic fields are routinely employed to study gross confinement properties in devices built to contain hot plasmas. In most practical applications, evaluating a Poincaré map requires numerical integration of a magnetic field line, a process that can be slow and that cannot be easily accelerated using parallel computations. We propose a novel neural network architecture, the HénonNet,1 and show that it is capable of accurately learning realistic Poincaré maps from observations of a conventional field-line-following algorithm. After training, such learned Poincaré maps evaluate much faster than the field-line integration method. Moreover, the HénonNet architecture exactly reproduces the primary physics constraint imposed on field-line Poincaré maps: flux preservation. This structure-preserving property is a consequence of each layer in a HénonNet being a symplectic map. We demonstrate empirically that a HénonNet can learn to mimic the confinement properties of a large magnetic island by using coiled hyperbolic invariant manifolds to produce a sticky chaotic region at the desired island location. This suggests a novel approach to designing magnetic fields with good confinement properties that may be more flexible than ensuring confinement using KAM tori.
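The flux-preservation guarantee comes from building each layer out of Hénon-like maps of the form (x, y) ↦ (y + η, −x + f(y)), whose Jacobian determinant equals 1 for any choice of f, so the map is area-preserving (symplectic in 2D) by construction. Below is a minimal sketch of this idea, not the paper's implementation: the toy nonlinearity `f` (standing in for a trainable MLP) and the finite-difference Jacobian check are illustrative assumptions.

```python
import numpy as np

def henon_layer(x, y, f, eta):
    """One Hénon-like map: (x, y) -> (y + eta, -x + f(y)).

    Its Jacobian is [[0, 1], [-1, f'(y)]], with determinant 1
    for ANY f, so the map preserves area exactly. In a HénonNet,
    f would be a small trainable network; here it is a fixed toy
    function for illustration only.
    """
    return y + eta, -x + f(y)

# Hypothetical stand-in for the learned scalar function f.
f = lambda y: 0.5 * y**2 - 0.3 * np.tanh(y)

def jacobian_det(x, y, f, eta, h=1e-6):
    """Check area preservation numerically at a point (x, y)."""
    X0, Y0 = henon_layer(x, y, f, eta)
    Xx, Yx = henon_layer(x + h, y, f, eta)
    Xy, Yy = henon_layer(x, y + h, f, eta)
    J = np.array([[(Xx - X0) / h, (Xy - X0) / h],
                  [(Yx - Y0) / h, (Yy - Y0) / h]])
    return np.linalg.det(J)

# A composition of such maps is still symplectic, since
# compositions of symplectic maps are symplectic.
print(jacobian_det(0.7, -0.4, f, eta=0.1))  # ≈ 1 up to finite-difference error
```

This is why the learned Poincaré map is flux-preserving regardless of how training goes: the constraint is encoded in the architecture, not in a penalty term.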
Other structure-preserving SciML work
Recent representative papers include physics-assisted learning,2 dynamics learning for multiscale and stiff ODEs3 4 and for large-scale PDEs,5 and learning flow maps for multiscale Hamiltonians6 and their applications.7 Please refer to the references below for more details. I plan to write more on this topic when I have time.
- J. W. Burby, Q. Tang, and R. Maulik. Fast neural Poincaré maps for toroidal magnetic fields, Plasma Physics and Controlled Fusion, 63:024001, 2020. ↩
- X. Xie, Q. Tang, and X.-Z. Tang. Latent space dynamics learning for stiff collisional-radiative models, Machine Learning: Science and Technology, 5:045070, 2024. ↩
- D. Serino, A. Alvarez Loya, J. W. Burby, I. G. Kevrekidis, and Q. Tang. Fast-slow neural networks for learning singularly perturbed dynamical systems, Journal of Computational Physics, 537:114090, 2025. ↩
- A. Alvarez Loya, D. Serino, J. W. Burby, and Q. Tang. Structure-preserving neural ordinary differential equations for stiff systems, submitted, 16 pages, 2025. ↩
- A. J. Linot, J. W. Burby, Q. Tang, P. Balaprakash, M. D. Graham, and R. Maulik. Stabilized neural ordinary differential equations for long-time forecasting of dynamical systems, Journal of Computational Physics, 474:111838, 2023. ↩
- V. Duruisseaux, J. W. Burby, and Q. Tang. Approximation of nearly-periodic symplectic maps via structure-preserving neural networks, Scientific Reports, 13:8351, 2023. ↩
- E. G. Drimalas, F. Fraschetti, C.-K. Huang, and Q. Tang. Symplectic neural network and its applications to charged particle dynamics in electromagnetic fields, submitted, 15 pages, 2025. ↩