Applied and Computational Mathematics Seminar
Department of Mathematics and Statistics
Spring 2025 Schedule
Parker 328, Friday 2:00 pm - 3:00 pm (CST)
For any questions or requests, please contact Phuong Hoang (tzh0059@auburn.edu)
Speaker | Institution | Date
Yimin Zhong | Auburn Univ | Feb 14
Wei Zhu | Georgia Tech | Feb 21
Qi Tang | Georgia Tech | Mar 21
Yimin Zhong | Auburn Univ | Mar 28
Molei Tao | Georgia Tech | Apr 4
James Scott | Columbia University | Apr 11
Catalin Trenchea | Univ of Pittsburgh | Apr 18
Daniel Massatt | Louisiana State Univ | Apr 25
Zhongqiang Zhang | Worcester Polytechnic Institute | May 2
Yimin Zhong
Date and time: Feb 14 at 2:00 pm (Parker 328)
Title: Fast solvers for radiative transfer and beyond
Abstract: Despite the tremendous developments in recent years, constructing efficient numerical solution methods for the radiative transfer equation (RTE) remains challenging in scientific computing. In this talk, I will present a simple yet fast computational algorithm for solving the RTE in isotropic media in steady-state and time-dependent settings. The algorithm we developed has two steps. In the first step, we solve a volume integral equation for the angularly averaged solution using iterative schemes such as the GMRES method; the computation in this step is accelerated with a variant of the fast multipole method (FMM). In the second step, we solve a scattering-free transport equation to recover the angular dependence of the solution. The algorithm does not require the underlying medium to be homogeneous. We present numerical simulations under various scenarios to demonstrate the performance of the proposed algorithm for both homogeneous and heterogeneous media. I will then extend the formulation to the time domain and to anisotropic scattering media, and analyze the possibility of applying the fast algorithm there.
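To make the two-step structure concrete, here is a minimal NumPy/SciPy sketch of the steady-state, homogeneous-medium case in 2D. A dense kernel matrix stands in for the FMM acceleration, and the grid size, coefficients, and source are illustrative choices, not the speaker's setup.

```python
# Step 1: solve (I - sigma_s K) ubar = K f with GMRES, where K has kernel
# exp(-sigma_t |x-y|) / (2 pi |x-y|); step 2: recover u(x, v) by a
# scattering-free sweep along characteristics.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres
from scipy.interpolate import RegularGridInterpolator

n, sigma_t, sigma_s = 40, 2.0, 1.5            # grid size, total/scattering coefficients
h = 1.0 / n
xs = (np.arange(n) + 0.5) * h
X, Y = np.meshgrid(xs, xs, indexing="ij")
pts = np.column_stack([X.ravel(), Y.ravel()])

# Dense kernel matrix (a toy stand-in for the FMM-accelerated matvec).
r = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(r, 1.0)                       # placeholder, fixed below
K = np.exp(-sigma_t * r) / (2 * np.pi * r) * h**2
rc = h / np.sqrt(np.pi)                        # equal-area disk radius for the self cell
np.fill_diagonal(K, (1.0 - np.exp(-sigma_t * rc)) / sigma_t)

f = np.exp(-100 * ((pts[:, 0] - 0.5) ** 2 + (pts[:, 1] - 0.5) ** 2))  # isotropic source

# Step 1: volume integral equation for the angular average ubar.
A = LinearOperator((n * n, n * n), matvec=lambda u: u - sigma_s * (K @ u))
ubar, info = gmres(A, K @ f)
assert info == 0

# Step 2: scattering-free transport sweep for one direction v,
# integrating the known source sigma_s * ubar + f back along -v.
S = RegularGridInterpolator((xs, xs), (sigma_s * ubar + f).reshape(n, n),
                            bounds_error=False, fill_value=0.0)
v = np.array([np.cos(0.3), np.sin(0.3)])
ds = h / 2
s = np.arange(1, int(2.0 / ds)) * ds           # march until outside the unit square
u_v = sum(np.exp(-sigma_t * si) * S(pts - si * v) * ds for si in s)
print(ubar.reshape(n, n).max(), u_v.reshape(n, n).max())
```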
Date and time: Mar 28 at 2:00 pm (Parker 328)
Title: Numerical Understanding of Neural Networks
Abstract: In this talk, I will discuss a couple of recent works on neural networks. The motivation is to see whether neural networks are suitable for general scientific computing. Our study of shallow neural networks demonstrates, from several perspectives, that shallow neural networks are in general low-pass filters. Based on this observation, we propose to construct deep neural networks as compositions of shallow networks, which perform better than vanilla fully connected neural networks with a comparable number of parameters.
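A minimal PyTorch sketch of the composition idea, for orientation only: deep networks are built by stacking one-hidden-layer blocks, then fit to a high-frequency target, the regime where a single shallow network acts as a low-pass filter. The widths, depth, activation, and target are illustrative choices, not the speaker's construction.

```python
import torch
import torch.nn as nn

class ShallowBlock(nn.Module):
    """One-hidden-layer network x -> W2 sigma(W1 x + b1) + b2."""
    def __init__(self, dim, width):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, width), nn.Tanh(), nn.Linear(width, dim))
    def forward(self, x):
        return self.net(x)

class ComposedNet(nn.Module):
    """Deep network obtained by composing shallow blocks, plus a readout."""
    def __init__(self, dim=1, width=64, depth=4):
        super().__init__()
        self.blocks = nn.Sequential(*[ShallowBlock(dim, width) for _ in range(depth)])
        self.readout = nn.Linear(dim, 1)
    def forward(self, x):
        return self.readout(self.blocks(x))

# Fit a high-frequency target function on [-1, 1].
x = torch.linspace(-1, 1, 512).unsqueeze(1)
y = torch.sin(20 * torch.pi * x)
model = ComposedNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = ((model(x) - y) ** 2).mean()
    loss.backward()
    opt.step()
print(f"final MSE: {loss.item():.4f}")
```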
Wei Zhu
Date and time: Feb 21 at 2:00 pm (Parker 328)
Title: Symmetry-Preserving Machine Learning: Theory and Applications
Abstract: Symmetry underlies many machine learning and scientific computing tasks, from computer vision to physical system modeling. Models designed to respect symmetry often perform better, but several questions remain. How can we measure and maintain approximate symmetry when real-world symmetries are imperfect? How much training data can symmetry-based models save? And in non-convex optimization, do these models truly converge to better solutions? In this talk, I'll share my work on these challenges, revealing that the answers are sometimes surprising. The approach draws on applied probability, harmonic analysis, differential geometry, and optimization, but no specialized background is required.
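One of the questions above, measuring and maintaining approximate symmetry, can be illustrated in a few lines of NumPy: compute a model's invariance gap under the C4 group of 90-degree image rotations, and enforce exact invariance by group averaging. The stand-in "model" is hypothetical and for illustration only.

```python
import numpy as np

def model(x):
    # hypothetical feature map: inner product with a fixed random filter
    w = np.random.default_rng(0).standard_normal(x.shape)
    return float((w * x).mean())

def c4_invariance_gap(f, x):
    """max_g |f(g.x) - f(x)| over the four 90-degree rotations g."""
    outputs = [f(np.rot90(x, k)) for k in range(4)]
    return max(abs(o - outputs[0]) for o in outputs)

def symmetrize(f):
    """Group averaging over C4 makes f exactly invariant."""
    return lambda x: float(np.mean([f(np.rot90(x, k)) for k in range(4)]))

x = np.random.default_rng(1).standard_normal((32, 32))
print("raw model gap:       ", c4_invariance_gap(model, x))
print("symmetrized model gap:", c4_invariance_gap(symmetrize(model), x))
```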
Qi Tang
Date and time: Mar 21 at 2:00 pm (Parker 328)
Title: Structure-preserving machine learning for learning dynamical systems
Abstract: I will present our recent work on structure-preserving machine learning (ML) for dynamical systems. First, I introduce a structure-preserving neural ODE framework that accurately captures chaotic dynamics in dissipative systems. Inspired by the inertial manifold theorem, our model learns the ODE's right-hand side by combining a linear and a nonlinear term, enabling long-term stability on the attractor for the Kuramoto-Sivashinsky equation. This framework is further enhanced with exponential integrators. Next, I discuss ML for singularly perturbed systems, leveraging the Fenichel normal form to simplify fast dynamics near slow manifolds. A fast-slow neural network is proposed that enforces the existence of a trainable, attractive invariant slow manifold as a hard constraint.
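A minimal PyTorch sketch of the "linear + nonlinear" right-hand-side split mentioned above: du/dt = A u + N_theta(u), with A parametrized to be dissipative (negative definite) so the linear part damps the dynamics. The parametrization A = -L L^T - gamma I and the RK4 integrator (rather than an exponential integrator) are illustrative choices, not the speaker's exact construction.

```python
import torch
import torch.nn as nn

class LinearPlusNonlinearRHS(nn.Module):
    def __init__(self, dim, width=64, gamma=0.1):
        super().__init__()
        self.L = nn.Parameter(torch.randn(dim, dim) / dim**0.5)
        self.gamma = gamma
        self.nonlinear = nn.Sequential(
            nn.Linear(dim, width), nn.Tanh(), nn.Linear(width, dim))
    def forward(self, u):
        # dissipative linear part plus a learned nonlinear correction
        A = -self.L @ self.L.T - self.gamma * torch.eye(self.L.shape[0])
        return u @ A.T + self.nonlinear(u)

def rk4_step(f, u, dt):
    """One classical Runge-Kutta step of the learned ODE."""
    k1 = f(u)
    k2 = f(u + 0.5 * dt * k1)
    k3 = f(u + 0.5 * dt * k2)
    k4 = f(u + dt * k3)
    return u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

rhs = LinearPlusNonlinearRHS(dim=8)
u = torch.randn(16, 8)              # batch of states
with torch.no_grad():
    for _ in range(100):            # roll the model forward in time
        u = rk4_step(rhs, u, dt=0.01)
print(u.norm(dim=1).mean())
```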
Molei Tao
Date and time: Apr 4 at 2:00 pm (Parker 328)
Title: Optimization, Sampling, and Generative Modeling in Non-Euclidean Spaces
Abstract: Machine learning in non-Euclidean spaces has been rapidly attracting attention in recent years, and this talk will give some examples of progress on its mathematical and algorithmic foundations. A sequence of developments that eventually leads to the generative modeling of data on Lie groups will be reported. Such a problem occurs, for example, in the Gen-AI design of molecules.
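For a feel of what "optimization in non-Euclidean spaces" involves, here is a minimal NumPy sketch of one building block: Riemannian gradient descent on the Lie group SO(3). Each update projects the Euclidean gradient onto the Lie algebra (its skew-symmetric part) and retracts back to the group with the matrix exponential. The objective, aligning a rotation to a target, is illustrative only and unrelated to the talk's specific results.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
R_target = Q * np.sign(np.linalg.det(Q))       # a target rotation in SO(3)

def loss_and_eucl_grad(R):
    # f(R) = ||R - R_target||_F^2 / 2, Euclidean gradient R - R_target
    return 0.5 * np.sum((R - R_target) ** 2), R - R_target

R = np.eye(3)
lr = 0.5
for step in range(100):
    f, G = loss_and_eucl_grad(R)
    xi = R.T @ G
    xi = 0.5 * (xi - xi.T)                      # project onto so(3): skew-symmetric part
    R = R @ expm(-lr * xi)                      # retract to SO(3) via the exponential map
print("final loss:", loss_and_eucl_grad(R)[0])
```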
Catalin Trenchea
Date and time: Apr 18 at 2:00 pm (Parker 328)
Title: TBA
Abstract: TBA
Daniel Massatt
Date and time: Apr 25 at 2:00 pm (Parker 328)
Title: TBA
Abstract: TBA
Zhongqiang Zhang
Date and time: May 2 at 2:00 pm (Parker 328)
Title: TBA
Abstract: TBA