Structure preserving deep learning architectures for convergent and stable data-driven modeling

CWI is hosting bi-monthly seminars on the application of Machine Learning and Uncertainty Quantification in scientific computing. The next seminar, "Structure preserving deep learning architectures for convergent and stable data-driven modeling", will take place on Thursday 29 April.

The seminar will be delivered by Nathaniel Trask; his abstract follows.

The unique approximation properties of deep architectures have attracted attention in recent years as a foundation for data-driven modeling in scientific machine learning (SciML) applications. The “black-box” nature of DNNs, however, means they require large amounts of data and generalize poorly in traditional engineering settings where the available data is relatively scarce, and it is generally difficult to provide a priori guarantees about the accuracy and stability of extracted models. Trask and his collaborators adopt the perspective that tools from the mimetic discretization of PDEs may be adapted to SciML settings, developing architectures and fast optimizers tailored to the specific needs of SciML. In particular, they focus on realizing convergence competitive with the finite element method (FEM), preserving the topological structure fundamental to conservation and multiphysics, and providing stability guarantees. In this talk, Trask will introduce some motivating applications at Sandia, spanning shock magnetohydrodynamics and semiconductor physics, before providing an overview of the mathematics underpinning these efforts.
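As a rough illustration of what “preserving topological structure fundamental to conservation” can look like in practice, below is a minimal sketch (not taken from the talk; the layer name, the MLP parameterization, and the 1D periodic setup are all illustrative assumptions) of a network layer that predicts the time derivative of a density as the discrete divergence of learned face fluxes, so that total mass is conserved exactly for any network weights:

```python
import torch
import torch.nn as nn

class ConservativeUpdate(nn.Module):
    """Illustrative sketch: predicts du/dt for a density u on a 1D
    periodic grid. An MLP outputs one flux per cell face; the update
    is the discrete divergence -(F[i] - F[i-1]) / dx, whose telescoping
    sum vanishes, so total mass sum(u) * dx is conserved by construction."""

    def __init__(self, n_cells, hidden=64, dx=1.0):
        super().__init__()
        self.dx = dx
        self.flux_net = nn.Sequential(
            nn.Linear(n_cells, hidden), nn.Tanh(),
            nn.Linear(hidden, n_cells),  # one flux per cell face i+1/2
        )

    def forward(self, u):
        F = self.flux_net(u)                             # learned face fluxes
        div = (F - torch.roll(F, 1, dims=-1)) / self.dx  # discrete divergence
        return -div                                      # sums to zero exactly

u = torch.rand(1, 32)
dudt = ConservativeUpdate(n_cells=32)(u)
print(dudt.sum().item())  # ~0 up to float round-off: mass is conserved
```

Because conservation here is built into the architecture rather than learned from data, it holds for untrained as well as trained weights, which is the kind of a priori structural guarantee the abstract refers to.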

If you would like to attend this talk, please get in touch with Wouter Edeling from the SC group at CWI.

Please visit the Machine Learning and UQ in Scientific Computing seminar webpage for more information on upcoming seminars.