Mathematics Colloquium, Zhongqiang Zhang (Worcester Polytechnic Inst.)

This event is in the past.

When:
March 20, 2024
2:30 p.m. to 3:30 p.m.
Where:
Faculty/Administration #1146
656 W. Kirby
Detroit, MI 48202
Event category: Seminar
In-person

Title: On some scaling issues in training physics-informed neural networks

Speaker: Zhongqiang Zhang

Associate Professor

Department of Mathematical Sciences

Worcester Polytechnic Institute

https://www.wpi.edu/~zzhang7/

Abstract

Training in physics-informed machine learning is often carried out with first-order methods such as stochastic gradient descent. Such training usually learns solution components with low frequencies and small gradients more readily. For solutions with high frequencies and large gradients, we consider two classes of problems. The first class is high-dimensional Fokker-Planck equations, whose solutions take small values that are nonetheless non-negligible in some regions. We use tensor neural networks and show how to handle solutions that are small in scale but have large gradients. The second class is low-dimensional partial differential equations with small parameters, whose solutions exhibit sharp features such as boundary layers. We discuss a two-scale neural network method and introduce a streamlined approach to the large-gradient issues induced by small parameters.
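
The training setup described in the abstract can be made concrete with a generic physics-informed neural network (PINN) example. The sketch below is illustrative only and is not the speaker's tensor-neural-network or two-scale method: it trains a small fully connected network with plain SGD to solve the toy problem -u''(x) = pi^2 sin(pi x) on (0, 1) with zero boundary values (exact solution u(x) = sin(pi x)), minimizing the squared PDE residual at random collocation points plus a boundary penalty. All names and hyperparameters here are assumptions chosen for the illustration.

# A minimal, generic PINN sketch (illustrative only; not the speaker's
# tensor-neural-network or two-scale method). Trains a small network to
# solve -u''(x) = pi^2 sin(pi*x) on (0, 1) with u(0) = u(1) = 0, whose
# exact solution is u(x) = sin(pi*x), using plain SGD on the residual loss.
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.SGD(net.parameters(), lr=1e-3)

def pde_residual(x):
    # Residual of -u'' - pi^2 sin(pi x) at the collocation points x,
    # with derivatives computed by automatic differentiation.
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return -d2u - torch.pi**2 * torch.sin(torch.pi * x)

xb = torch.tensor([[0.0], [1.0]])  # boundary points

for step in range(5000):
    x = torch.rand(128, 1)  # random interior collocation points
    # PDE residual loss plus a penalty enforcing u(0) = u(1) = 0.
    loss = pde_residual(x).pow(2).mean() + net(xb).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 1000 == 0:
        print(f"step {step}: loss = {loss.item():.3e}")

With first-order training of this kind, low-frequency solution components are typically fit first; for a target with high frequencies, large gradients, or boundary layers, the same loop converges far more slowly, which is the scaling behavior the talk addresses.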

About the Speaker: Zhongqiang Zhang is an Associate Professor of Mathematics at Worcester Polytechnic Institute. His research interests include numerical methods for stochastic and integral differential equations, computational probability, and mathematics for machine learning. Before joining Worcester Polytechnic Institute in 2014, he received Ph.D. degrees in mathematics from Shanghai University in 2011 and in applied mathematics from Brown University in 2014. He co-authored a book with George Karniadakis on numerical methods for stochastic partial differential equations with white noise.


Contact

Tao Huang
taohuang@wayne.edu

Cost

Free