Karen Willcox
University of Texas at Austin
Research topics: Algorithm, Mathematical optimization, Engineering, Aerodynamics, Inverse problem, Optimization problem, Nonlinear system, Computational fluid dynamics, Monte Carlo method, Systems engineering, Multidisciplinary approach, Inference, Applied mathematics, Uncertainty quantification, Mathematics, Engineering design process, Computer science, Control theory, Partial differential equation, Reduction (complexity)
280 Publications | 48 H-index | 13.3k Citations
Publications (250), newest first
Parisa Khodabakhshi (University of Texas at Austin), Karen Willcox, and Max D. Gunzburger
Abstract: Nonlocal models feature a finite length scale, referred to as the horizon, such that points separated by a distance smaller than the horizon interact with each other. Such models have proven to be useful in a variety of settings. However, due to the reduced sparsity of their discretizations, they are generally more computationally expensive than their local differential-equation counterparts. We introduce a multifidelity Monte Carlo method that combines the high-fidelity nonlocal ...
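The abstract above describes a multifidelity Monte Carlo estimator that pairs an expensive nonlocal model with cheaper local surrogates. The sketch below illustrates only the generic two-fidelity control-variate idea; the model functions, sample counts, and weight choice are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def mfmc_estimate(high_fidelity, low_fidelity, sample_inputs, n_high, n_low):
    """Two-fidelity control-variate estimator of E[high_fidelity(Z)].

    high_fidelity / low_fidelity stand in for the nonlocal and local models;
    sample_inputs(n) draws n random inputs. All names are hypothetical.
    """
    z_high = sample_inputs(n_high)                      # few expensive evaluations
    z_low = sample_inputs(n_low)                        # many cheap evaluations

    y_hi = np.array([high_fidelity(z) for z in z_high])
    y_lo_paired = np.array([low_fidelity(z) for z in z_high])
    y_lo_many = np.array([low_fidelity(z) for z in z_low])

    # Control-variate weight from the paired samples (illustrative choice).
    alpha = np.cov(y_hi, y_lo_paired)[0, 1] / np.var(y_lo_paired, ddof=1)

    # Correct the small high-fidelity average with the cheap large-sample mean.
    return y_hi.mean() + alpha * (y_lo_many.mean() - y_lo_paired.mean())
```

The variance reduction comes from the correlation between the two models: the cheap model absorbs most of the sampling error, so far fewer expensive solves are needed for a given accuracy.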
Shane A. McQuarrie (University of Texas at Austin), Parisa Khodabakhshi (University of Texas at Austin), and Karen Willcox (University of Texas at Austin)
This work formulates a new approach to reduced modeling of parameterized, time-dependent partial differential equations (PDEs). The method employs Operator Inference, a scientific machine learning framework combining data-driven learning and physics-based modeling. The parametric structure of the governing equations is embedded directly into the reduced-order model, and parameterized reduced-order operators are learned via a data-driven linear regression problem. The result is a reduced-order mo...
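Operator Inference, as summarized above, learns reduced-order operators from projected snapshot data by solving a linear regression problem. The following sketch shows that idea for a non-parametric, linear-quadratic model form; the basis, model form, regularization, and variable names are assumptions for illustration rather than the paper's parametric formulation.

```python
import numpy as np

def operator_inference(X, Xdot, V, reg=1e-8):
    """Fit reduced operators A, H so that d/dt (V^T x) ≈ A xr + H (xr ⊗ xr).

    X    : n x k matrix of state snapshots
    Xdot : n x k matrix of time derivatives of the snapshots
    V    : n x r orthonormal basis (e.g., POD modes)
    """
    Xr = V.T @ X                          # reduced states, r x k
    Xr_dot = V.T @ Xdot                   # reduced derivatives, r x k
    r, k = Xr.shape

    # Data matrix with linear and quadratic features of the reduced state.
    quad = np.einsum('ik,jk->ijk', Xr, Xr).reshape(r * r, k)
    D = np.vstack([Xr, quad]).T           # k x (r + r^2)

    # Regularized least squares for the stacked operators [A  H].
    O = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ Xr_dot.T).T
    return O[:, :r], O[:, r:]             # A (r x r), H (r x r^2)
```

Because the regression is linear in the unknown operators, the learning step reduces to a least-squares solve even though the resulting reduced model is nonlinear in the state.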
Abstract: In many contexts, it is of interest to assess the impact of selected parameters on the failure probability of a physical system. To this end, one can perform conditional reliability analysis, in which the probability of failure becomes a function of these parameters. Computing conditional reliability requires recomputing failure probabilities for a sample sequence of the parameters, which strongly increases the already high computational cost of conventional reliability analysis. We all...
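The abstract above frames conditional reliability analysis as recomputing a failure probability for each value of the conditioning parameters. A brute-force Monte Carlo version of that computation looks like the sketch below; the limit-state function, input sampler, and sample count are assumed names, and the loop illustrates exactly the cost the paper seeks to reduce.

```python
import numpy as np

def conditional_failure_probability(limit_state, sample_inputs,
                                    parameter_values, n_samples=100_000):
    """Estimate P[g(x, p) <= 0] for each conditioning parameter value p.

    limit_state(x, p) : limit-state function g; failure when g <= 0
    sample_inputs(n)  : draws n realizations of the uncertain inputs x
    """
    probabilities = []
    for p in parameter_values:
        x = sample_inputs(n_samples)              # fresh samples per parameter value
        failures = limit_state(x, p) <= 0
        probabilities.append(np.mean(failures))   # one full reliability analysis per p
    return np.array(probabilities)
```

Each parameter value triggers a complete Monte Carlo reliability analysis, which is why reusing information across the parameter sweep matters.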
Luwen Huang, Karen Willcox, and one further coauthor
Omar Ghattas (University of Texas at Austin) and Karen Willcox (University of Texas at Austin)
This article addresses the inference of physics models from data, from the perspectives of inverse problems and model reduction. These fields develop formulations that integrate data into physics-based models while exploiting the fact that many mathematical models of natural and engineered systems exhibit an intrinsically low-dimensional solution manifold. In inverse problems, we seek to infer uncertain components of the inputs from observations of the outputs, while in model reduction we seek l...
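Both perspectives in the article above exploit an intrinsically low-dimensional solution manifold. A common concrete instance is a proper orthogonal decomposition (POD) basis computed from snapshots together with a Galerkin projection of the full-order operator; the sketch below shows only that instance, with assumed names, and is not a summary of the article's formulations.

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Return the dominant left singular vectors capturing `energy` of the snapshot energy.

    snapshots : n x k matrix whose columns are solution states
    """
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return U[:, :r]                               # n x r reduced basis

def galerkin_reduce(A, V):
    """Project a full-order operator A (n x n) onto the reduced basis V (n x r)."""
    return V.T @ A @ V                            # r x r reduced operator
```

The same low-rank structure is what makes large-scale inverse problems tractable: the data typically inform only a few directions in parameter space, so inference can be concentrated there.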
Steven A. Niederer (King's College London), Mark Girolami, and coauthors
Michael G. Kapteyn (Massachusetts Institute of Technology), Karen Willcox (University of Texas at Austin), and one further coauthor
A unifying mathematical formulation is needed to move from one-off digital twins built through custom implementations to robust digital twin implementations at scale. This work proposes a probabilistic graphical model as a formal mathematical representation of a digital twin and its associated physical asset. We create an abstraction of the asset–twin system as a set of coupled dynamical systems, evolving over time through their respective state spaces and interacting via observed data and contr...
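The probabilistic-graphical-model formulation above couples a physical asset and its digital twin through observed data and control decisions. The following sketch shows one elementary realization of that loop, with a discrete belief over asset states updated by Bayes' rule and used to pick a control action; the state space, likelihood, and cost function are illustrative assumptions rather than the paper's model.

```python
import numpy as np

def update_belief(belief, observation, likelihood):
    """Bayesian update of the twin's belief over discrete asset states.

    belief     : length-S vector of state probabilities
    likelihood : likelihood(s, observation) = p(observation | state s)
    """
    posterior = belief * np.array([likelihood(s, observation) for s in range(len(belief))])
    return posterior / posterior.sum()

def choose_control(belief, controls, expected_cost):
    """Pick the control with minimum expected cost under the current belief."""
    costs = [sum(belief[s] * expected_cost(s, u) for s in range(len(belief)))
             for u in controls]
    return controls[int(np.argmin(costs))]
```

Iterating these two steps over time, with the asset generating the observations, is the asset-twin feedback loop the graphical model formalizes.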
Anirban Chaudhuri (Massachusetts Institute of Technology), Alexandre Noll Marques (Massachusetts Institute of Technology), and Karen Willcox (University of Texas at Austin)
This paper develops mfEGRA, a multifidelity active learning method that uses data-driven, adaptively refined surrogates to locate the failure boundary in reliability analysis. It addresses the prohibitive cost of Monte Carlo reliability analysis with expensive-to-evaluate high-fidelity models by exploiting cheaper-to-evaluate approximations of the high-fidelity model. The method builds on the efficient global reliability analysis (EGRA) method, which is a surrogate-based metho...
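EGRA-style methods refine a surrogate of the limit-state function only where it matters for locating the failure boundary. The sketch below shows a single-fidelity simplification of that adaptive loop using a Gaussian-process surrogate and a standard "closest-to-the-boundary, most-uncertain" learning criterion; it does not reproduce mfEGRA's multifidelity acquisition, and the limit-state function, candidate pool, and budgets are assumed for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def active_learn_failure_boundary(limit_state, candidates, n_init=10, n_iter=30, seed=0):
    """Adaptively refine a GP surrogate of g near the failure boundary g = 0.

    limit_state(x) : scalar limit-state function (failure when g <= 0)
    candidates     : m x d array of candidate sample locations
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(candidates), size=n_init, replace=False)
    X = candidates[idx]
    y = np.array([limit_state(x) for x in X])
    gp = GaussianProcessRegressor(normalize_y=True)

    for _ in range(n_iter):
        gp.fit(X, y)
        mu, sigma = gp.predict(candidates, return_std=True)
        # Learning criterion: pick the point whose prediction is closest to the
        # boundary relative to its uncertainty (small |mu| / sigma).
        score = np.abs(mu) / (sigma + 1e-12)
        x_new = candidates[int(np.argmin(score))]
        X = np.vstack([X, x_new])
        y = np.append(y, limit_state(x_new))
    return gp
```

Once the surrogate is accurate near g = 0, the failure probability can be estimated by cheap Monte Carlo sampling of the surrogate instead of the high-fidelity model.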
Mengwu Guo, Shane A. McQuarrie (University of Texas at Austin), and Karen Willcox (University of Texas at Austin)