Faculty Members
Lead PI Bui-Thanh is a computational applied mathematician and software developer with applications to MHD, fluid dynamics, and infectious disease outbreaks. He is currently a co-director of the Center for Scientific Machine Learning at the Oden Institute at UT Austin. He has developed a variety of SciML approaches, including: a unified universal approximation theory for neural networks; model-constrained autoencoder methods for forward and inverse problems; model-constrained variational autoencoder methods for Bayesian inference; model-constrained deep learning methods for inverse problems and for the solution of dynamical systems; principled architecture design of deep neural networks; causal recurrent neural networks to predict sea-wave height from microseismic data; ROM-ML methods combining reduced-order modeling and machine learning for fast and accurate solutions of PDEs and inverse problems; discrete Deep Ritz and PINN deep learning methods for the complex nonlinear hyperelastic behavior of ventricular myocardium; and the integration of deep learning into finite element-based simulation of the aortic heart valve. Bui-Thanh also has extensive expertise in PDE-constrained inverse problems, Bayesian inverse problems, uncertainty quantification (UQ), reduced-order modeling (ROM) and surrogate modeling, and scalable high-order finite element methods.
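To fix ideas, the model-constrained methods above can be summarized schematically (in our notation here, not a verbatim statement of any one published formulation): a network $\Psi_\theta$ mapping observations to parameters is trained with a loss that couples the data misfit to the governing forward model $\mathcal{G}$,
\begin{equation*}
\min_{\theta}\;\frac{1}{N}\sum_{i=1}^{N}\big\|u_i-\Psi_\theta(y_i)\big\|^2
\;+\;\frac{\alpha}{N}\sum_{i=1}^{N}\big\|y_i-\mathcal{G}\big(\Psi_\theta(y_i)\big)\big\|^2,
\end{equation*}
where $(u_i,y_i)$ are parameter-observation training pairs and $\alpha>0$ weights the model-constraint term; setting $\alpha=0$ recovers a purely data-driven network, while $\alpha>0$ forces the learned inverse map to respect the underlying PDE model.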
Clint Dawson has over 30 years of experience in numerical method development and the application of HPC technologies to flow problems. Since the mid-1990s he has worked extensively in predictive simulation of storm surges and flooding from hurricanes and tropical storms. He has worked on the development of algorithms for the solution of shallow water systems and the application of these algorithms to high-fidelity predictions. This work has led to over 175 publications in computational science and engineering, coastal engineering, hydrology, and other journals. He is a lead developer of the Advanced Circulation (ADCIRC) model, which has been used in the study of many high-impact storms along the Gulf and eastern coasts of the U.S. More recently, given the large uncertainties involved in storm surge modeling, he has focused heavily on uncertainty quantification, parameter estimation, and data assimilation for storm surge modeling. The work proposed herein by Dawson will build on two recent NSF projects: NSF 1854986, funded through the NSF PREEVENTS program ("A dynamic unified framework for hurricane storm surge analysis and prediction spanning across the coastal floodplain and ocean"), and NSF 1818847 ("Construction and analysis of numerical methods for stochastic inverse problems with applications to coastal hydrodynamics"). The former project was concerned with improved algorithms for coastal flooding processes, including coupled wave-surge-geomorphology and surge coupled with inland flooding due to rainfall runoff. We have investigated LSTM architectures for flood predictions in watersheds, with impressive results compared to physics-based codes; however, these methods did not account for underlying physics such as mass conservation. We have also investigated improved stability-preserving discontinuous Petrov-Galerkin type methods for shallow water flows.
These codes have been developed using FEniCS to expedite algorithm development and testing. The latter NSF project focused on the development of new methods for the data-consistent solution of stochastic inverse problems, with applications to quantifying the uncertainties inherent in storm surge models. A follow-on project, NSF 2208461 ("Advancing the data-to-distribution pipeline for scalable data-consistent inversion to quantify uncertainties in coastal hazards"), was recently awarded and will be leveraged here. We note that this work aligns with NSF's current emphasis on research in climate sustainability and the impacts of climate change.
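For reference, and in our notation, data-consistent inversion updates an initial density $\pi_{\mathrm{init}}$ on parameters $\lambda$ so that the push-forward of the update through the quantity-of-interest map $Q$ matches an observed density $\pi_{\mathrm{obs}}$:
\begin{equation*}
\pi_{\mathrm{up}}(\lambda)\;=\;\pi_{\mathrm{init}}(\lambda)\,
\frac{\pi_{\mathrm{obs}}\big(Q(\lambda)\big)}{\pi_{Q}\big(Q(\lambda)\big)},
\end{equation*}
where $\pi_{Q}$ denotes the push-forward of $\pi_{\mathrm{init}}$ through $Q$. Evaluating $\pi_{Q}$ requires many forward model solves, which motivates the scalable data-to-distribution pipeline developed in these projects.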
Michael Sacks is Director of the Oden Institute's James T. Willerson Center for Cardiovascular Modeling and Simulation. Professor Sacks has over 30 years of experience in cardiovascular modeling and simulation, particularly in developing patient-specific, simulation-based approaches for the understanding and treatment of heart and heart valve diseases. His research is based on multi-scale modeling, quantification, and simulation of the biophysical behavior of the constituent cells and tissues, and their translation to the organ level in health, disease, and treatment. Given the functional complexities of soft tissues and organs, computational simulations are critical both for understanding them and as a rational basis for developing therapies and replacements. In-silico implementation is typically done using the finite element (FE) method, which remains impractically slow for translational clinical time frames. Dr. Sacks has utilized neural networks (NNs) for soft tissue and organ simulations, using a physics-based surrogate model to learn the displacement field solution directly, without the need for raw training data or FE simulation datasets. He has demonstrated that this approach, termed the neural network finite element (NNFE) method, yields trained models in excellent agreement with the corresponding “ground truth” FE solutions over the entire physiological deformation range. More importantly, the NNFE approach provided significantly decreased computational time for a range of finite element mesh sizes. More recently, he has shown how a NURBS-based approach can be directly integrated into the NNFE framework as a means to handle real organ geometries. While these and related approaches are in their early stages, they offer a way to perform complex organ-level simulations in clinically relevant time frames without compromising accuracy.
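Schematically (again in our notation), the NNFE idea is to parameterize the displacement field by a network $u_\theta$ that satisfies the essential boundary conditions by construction, and to train it by minimizing the total potential energy of the hyperelastic body rather than by fitting precomputed FE data:
\begin{equation*}
\min_{\theta}\;\Pi(u_\theta)\;=\;\min_{\theta}\left[\int_{\Omega} W\big(\mathbf{F}(u_\theta)\big)\,d\Omega\;-\;\int_{\Gamma_N}\mathbf{t}\cdot u_\theta\,d\Gamma\right],
\end{equation*}
where $W$ is the strain-energy density, $\mathbf{F}$ the deformation gradient, and $\mathbf{t}$ the applied traction. Minimizers of $\Pi$ are weak solutions of the equilibrium equations, which is why no FE simulation datasets are required for training.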
Ofodike Ezekoye has expertise in reacting heat and mass transfer modeling and applies these principles to fire characterization and modeling. Because of the multiscale processes required to characterize fire phenomena, Ezekoye has worked on fire problems ranging from the fundamentals of combustion reactions and material degradation to large-scale characterization of fire at building and community scales. At small scales, the degradation mechanisms for condensed-phase material systems are typically inferred using inverse methods based upon thermogravimetric, evolved-gas, and other global measurement data. Models for decomposition and gas generation are parameterized based upon measurements and data for the material systems. At building and room scales, nonlinear partial differential equations for mass, momentum, species, and energy conservation, coupled with radiative transfer models, describe the evolution of the fire and the transport of smoke and hot products of combustion. In many modern high-rise buildings, extensive sensor networks measure fire data. With knowledge of the evolution of fire and fire products, building fire protection systems can guide emergency response and direct safe occupant egress. Ezekoye and coworkers work on developing fast, low-order models for smoke transport in building systems. SciML models are critical to connecting building sensor data to real-time prediction of fire and smoke spread. Finally, fire forensic analysis is an important class of problems that connects small-scale degradation physics to large-scale fire damage and consequences. Fire forensics is the inversion process that seeks to identify the underlying evolution of a fire after fire spread has occurred. The modeling framework relies on appropriately parameterized fire damage (i.e., material degradation) models and fire evolution models. Ezekoye and coworkers have developed Bayesian methods for hypothesis testing that benefit from SciML tools.
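Schematically, such Bayesian hypothesis tests weigh candidate fire scenarios $H_1$ and $H_2$ (e.g., alternative fire origins) against observed damage data $D$ through the Bayes factor,
\begin{equation*}
B_{12}\;=\;\frac{P(D\mid H_1)}{P(D\mid H_2)},
\qquad
\frac{P(H_1\mid D)}{P(H_2\mid D)}\;=\;B_{12}\,\frac{P(H_1)}{P(H_2)},
\end{equation*}
where each evidence term $P(D\mid H_i)$ requires many forward fire-evolution simulations; this is precisely where fast SciML surrogates for fire and smoke spread enter.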
Baylor Co-PI Rob Kirby has extensive experience at the interface of computational mathematics and computer science. Having held faculty positions in both computer science and mathematics, he has been a leader in high-level, automated finite elements for two decades. His work on FIAT set the stage for the FEniCS project by providing a tool to generate finite element bases for a wide range of theoretically robust elements that are otherwise difficult to implement. He also collaborated with Anders Logg on early form compilers for generating efficient low-level finite element code, and this work provided the starting point for the development of the Unified Form Language (UFL) used by Firedrake, FEniCS, and Dune. He has developed a robust and powerful interface between Firedrake and PETSc that allows users to deploy highly scalable, problem-specific solvers. This has been instrumental in enabling patch-based smoothers for incompressible flow problems and in making H(div) and H(curl) discretizations accessible through the same interface. More recently, his work on Irksome implements a wide range of effective Runge-Kutta discretizations, providing UFL abstractions for time-dependent problems. These high-impact contributions lay the groundwork for our proposed work in scientific machine learning, as Firedrake provides a high-level, differentiable toolkit for generating powerful solvers for our application suite. Prof.~Kirby will lead the interactions with the Firedrake team and work closely with Prof.~Bui on the TorchFire API.
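To illustrate the level of abstraction UFL provides (a generic textbook example, not code from our application suite): the weak form of the Poisson problem, find $u\in V$ such that
\begin{equation*}
\int_{\Omega}\nabla u\cdot\nabla v\,dx\;=\;\int_{\Omega} f\,v\,dx\qquad\text{for all } v\in V,
\end{equation*}
is written in UFL almost verbatim as \verb|inner(grad(u), grad(v))*dx == f*v*dx|, from which Firedrake generates and solves the discrete system. It is this symbolic representation of the problem that makes the toolkit amenable to automated differentiation.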
David Ham is a computational mathematician at Imperial College London. He has led the Firedrake project since its inception over a decade ago. He is also a founding developer of Firedrake’s automated adjoint capabilities, for which he was awarded the 2015 Wilkinson Prize for numerical software. He has contributed to the automation of the solution of forward and adjoint finite element problems at every level of the software stack, from symbolic reasoning, such as extending FEniCS to work on manifolds and automating shape derivatives, through to the generation of highly optimised low-level code. Of key relevance to this proposal, he won the best paper award at the Differentiable Programming Workshop of NeurIPS 2021 for his work embedding neural networks in Firedrake: the precise converse of the problem proposed here. He currently holds funding from United Kingdom Research and Innovation (UKRI) to extend Firedrake to support exascale simulations, and to expand the classes of PDE problems that Firedrake supports to include techniques such as mesh adaptivity.
John Shadid is a Distinguished Member of the Technical Staff in the Computational Mathematics Department at Sandia National Laboratories and holds an appointment as a National Lab Professor in the Mathematics and Statistics Department at the University of New Mexico. Dr. Shadid is also the Associate Director of the Tokamak Disruption Simulation (TDS) DOE SciDAC Center, a multi-lab, multi-university team whose overall DOE Office of Fusion Energy PI is Xianzhu Tang of Los Alamos National Laboratory. He has over 30 years of expertise in the development of applied mathematics and numerical methods, computational algorithms, and high-performance, large-scale parallel computational science software for the solution of highly nonlinear, coupled, multiple-time-scale PDE systems. In 2018 he was elected a SIAM Fellow, and in 2019 he received the United States Association for Computational Mechanics (USACM) Thomas J.R. Hughes Medal for Computational Fluid Mechanics. He was co-PI for the Aztec software library, which received a 1997 R\&D100 award and was one of the very first scalable parallel iterative solver libraries. He has also been lead PI for several large-scale projects developing robust, scalable, implicit finite element simulation capabilities for reacting flow, magnetohydrodynamics (MHD), and multifluid electromagnetic (EM) plasma systems in support of DOE Office of Science applications. The research team he leads has developed a scalable parallel multiphysics plasma modeling capability, Drekar. Drekar implements a consistent hierarchy of plasma models of increasing fidelity, spanning visco-resistive MHD, extended MHD, generalized Ohm's law (GOL) MHD, and full multifluid EM plasma models; it is used in the TDS center to support the understanding of disruptions and the evaluation of disruption mitigation methods in magnetic confinement fusion (MCF) systems.
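For concreteness, the lowest-fidelity member of such a hierarchy, visco-resistive MHD, takes the familiar form (one common formulation; normalization and closure terms vary by reference):
\begin{align*}
\rho\left(\partial_t\mathbf{u}+\mathbf{u}\cdot\nabla\mathbf{u}\right) &= -\nabla p+\mathbf{J}\times\mathbf{B}+\nabla\cdot\boldsymbol{\tau},\\
\partial_t\mathbf{B} &= \nabla\times\left(\mathbf{u}\times\mathbf{B}-\eta\,\mathbf{J}\right),
\qquad \mu_0\mathbf{J}=\nabla\times\mathbf{B},\qquad \nabla\cdot\mathbf{B}=0,
\end{align*}
with viscous stress $\boldsymbol{\tau}$ and resistivity $\eta$; the extended-MHD and generalized Ohm's law models in the hierarchy augment the Ohm's law with effects such as Hall and electron-pressure terms.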
John Shadid will collaborate with Tan on the MHD/plasma applications and corresponding SciML methods.
Leticia is a statistician working on statistical computing and on the modelling and inference of complex systems. In 2009 she obtained a Waterloo-Fields postdoctoral fellowship, in 2013 she was distinguished with elected membership of the International Statistical Institute, and since 2014 she has been a member of the Mexican System of Researchers. Her work on epidemic models includes uncertainty quantification and prediction for deterministic and stochastic models on networks.
She will collaborate with Tan on infectious disease outbreaks and corresponding SciML methods.