Bayesian calibration
Alias: nond_bayes_calibration
Argument(s): none
| Required/Optional | Description of Group | Dakota Keyword | Dakota Keyword Description |
| --- | --- | --- | --- |
| Required (Choose One) | Group 1 | queso | Markov Chain Monte Carlo algorithms from the QUESO package |
| | | gpmsa | (Experimental) Gaussian Process Models for Simulation Analysis (GPMSA) Markov Chain Monte Carlo algorithm with Gaussian Process Surrogate |
| | | dream | DREAM (DiffeRential Evolution Adaptive Metropolis) |
| Optional | | standardized_space | Perform the MCMC process in a standardized probability space |
| Optional | | likelihood_scale | Scale the log-likelihood function |
| Optional | | calibrate_sigma | Calibrate the experimental error term(s) in the likelihood function |
| Optional | | samples | Number of samples for sampling-based methods |
| Optional | | seed | Seed of the random number generator |
| Optional | | model_pointer | Identifier for model block to be used by a method |
Bayesian calibration methods take prior information on parameter values (in the form of prior distributions) and observational data (e.g., from experiments) and produce posterior distributions on the parameter values. When the computational simulation is then executed with samples from the posterior parameter distributions, the results are consistent with ("agree with") the experimental data. Calibrating the parameters of a computational simulation model requires a likelihood function that specifies the likelihood of a particular observation given the model and its associated parameterization; Dakota currently assumes a Gaussian likelihood function. The algorithms that produce the posterior distributions on model parameters are Markov Chain Monte Carlo (MCMC) sampling algorithms. MCMC methods require many samples, often tens of thousands, so emulators of the computational simulation are often used in model calibration. For more details on the algorithms underlying these methods, see the Dakota User's Manual.
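As a sketch of this relationship (using generic notation rather than keywords from this page, and assuming independent Gaussian observation errors), the posterior combines the prior with the Gaussian likelihood as

\[
\pi_{\mathrm{post}}(\boldsymbol{\theta} \mid \mathbf{d}) \;\propto\; \pi_{\mathrm{prior}}(\boldsymbol{\theta})\,\mathcal{L}(\boldsymbol{\theta};\mathbf{d}),
\qquad
\mathcal{L}(\boldsymbol{\theta};\mathbf{d}) \;\propto\; \exp\!\left(-\frac{1}{2}\sum_{i=1}^{n}\frac{\bigl(d_i - q_i(\boldsymbol{\theta})\bigr)^2}{\sigma_i^2}\right),
\]

where \(q_i(\boldsymbol{\theta})\) is the simulation response corresponding to observation \(d_i\) and \(\sigma_i\) is the experimental error standard deviation (the error term that calibrate_sigma allows to be calibrated along with the model parameters).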
Dakota has three Bayesian calibration methods: QUESO, DREAM, and GPMSA, specified with bayes_calibration queso, bayes_calibration dream, or bayes_calibration gpmsa, respectively. The QUESO method uses components from the QUESO library (Quantification of Uncertainty for Estimation, Simulation, and Optimization) developed at The University of Texas at Austin; Dakota uses its DRAM (Delayed Rejection Adaptive Metropolis) algorithm, and variants, for the MCMC sampling. DREAM (DiffeRential Evolution Adaptive Metropolis) runs multiple chains simultaneously for global exploration and automatically tunes the proposal covariance during the process via self-adaptive randomized subspace sampling [86]. GPMSA (Gaussian Process Models for Simulation Analysis) is an approach developed at Los Alamos National Laboratory. It constructs Gaussian process models to emulate the expensive computational simulation and has extensive features for calibration, such as the capability to include a model discrepancy term and the capability to model functional data such as time series data.
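For illustration, a minimal method block using the keywords on this page might look like the following sketch. The sample count, seed, and the model identifier 'SIM_MODEL' are placeholders, and the referenced model block is assumed to be defined elsewhere in the input file.

```
method
  bayes_calibration queso
    samples = 1000           # placeholder MCMC chain length
    seed = 348               # placeholder random number seed
    standardized_space       # run the chain in standardized probability space
    calibrate_sigma          # also calibrate the experimental error term(s)
    model_pointer = 'SIM_MODEL'   # assumed id_model of a model block defined elsewhere
```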
The Bayesian capabilities are under active development. At this stage, the QUESO methods in Dakota are the most advanced and robust, followed by DREAM and then GPMSA, which is currently in prototype form. Note that as of Dakota 6.2, field responses and associated field data may be used with QUESO and DREAM: the user can specify field simulation data and field experiment data, and Dakota will interpolate and provide the proper residuals to the Bayesian calibration.