The efficient_global method for optimization and least squares now supports concurrent refinement (adding multiple points).
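A concurrent refinement study might be requested with an input fragment along the following lines. This is a sketch only: the `batch_size` keyword and its placement under `efficient_global` are assumptions about the new specification, and should be checked against the Reference Manual.

```
# Hypothetical sketch of batch (concurrent) refinement in EGO.
# The batch_size keyword name is an assumption, not confirmed syntax.
method
  efficient_global
    seed = 12345
    batch_size = 4    # points proposed per refinement cycle (assumed keyword)

variables
  uniform_uncertain = 2
    lower_bounds  -2.0 -2.0
    upper_bounds   2.0  2.0
```

With a batch size of 4, each refinement cycle would propose four new truth-model evaluations, which can then run concurrently through Dakota's parallel evaluation machinery.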
(Experimental) The MIT Uncertainty Quantification (MUQ) MUQ2 library (Parno, Davis, Marzouk, et al.) enhances Dakota's Bayesian inference capability with new Markov Chain Monte Carlo (MCMC) sampling methods. MCMC samplers available in Dakota (under method > bayes_calibration > muq) include Metropolis-Hastings and Adaptive Metropolis. Future work will activate MUQ's more advanced samplers, including surrogate-based and derivative-enhanced sampling, as well as delayed rejection schemes.
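A MUQ-based calibration might be specified as below. The sampler keywords `metropolis_hastings` and `adaptive_metropolis` come from the sampler names listed above; the surrounding keywords (`chain_samples`, `proposal_covariance`) follow the pattern of Dakota's existing `bayes_calibration` specifications and should be verified against the Reference Manual for the `muq` variant.

```
# Sketch of a MUQ-driven Bayesian calibration; keyword details
# other than the sampler names are assumed, not confirmed syntax.
method
  bayes_calibration muq
    adaptive_metropolis       # or: metropolis_hastings
    chain_samples = 2000      # assumed keyword, by analogy with QUESO specs
    seed = 34785
```

As with the QUESO-based capability, the sampler draws a Markov chain over the calibration parameters whose stationary distribution is the Bayesian posterior.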
(Experimental) Dakota 6.12 extends functional tensor train (FTT) surrogate models from the C3 library (Gorodetsky, University of Michigan) to support building FTT approximations across a sequence of model fidelities (multifidelity FTT) or model resolutions (multilevel FTT).
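A multilevel FTT study might look like the following sketch. The method name `multilevel_function_train` and the `pilot_samples` keyword are assumptions patterned on Dakota's other multilevel/multifidelity method specifications (e.g., multilevel PCE); consult the Reference Manual for the released spelling.

```
# Hypothetical multilevel FTT sketch; keyword names are assumed.
method
  multilevel_function_train
    pilot_samples = 20        # initial samples per level (assumed keyword)
    max_rank = 8
  model_pointer = 'HIERARCH'  # points to a hierarchy of model resolutions

model
  id_model = 'HIERARCH'
  surrogate ensemble
    ordered_model_fidelities = 'LF' 'HF'
```

The intent is that inexpensive low-resolution evaluations carry most of the sampling burden, with the high-resolution model correcting the approximation across levels.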
The Dakota GUI has gained many significant features over the last year. It now lets you seamlessly browse Dakota's HDF5 database output files and generate sophisticated graphical plots from HDF5 data with just a few clicks. HDF5 browsing and plot generation can also be driven within the Dakota GUI by Next-Gen Workflow, a powerful tool for constructing node-based workflows for complex tasks.
The C3 library (POC: Prof. A. Gorodetsky, University of Michigan) provides a new capability for low rank approximation by functional tensor train decomposition. C3 has been integrated in Dakota, alongside other stochastic expansion methods for UQ based on projection, regression, or interpolation, enabling an array of approaches for exploiting special structure (e.g., sparsity, low rank, low intrinsic dimension) in an input-output map.
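A single-fidelity FTT expansion might be requested as follows. The `function_train` method keyword and the rank/order controls shown are patterned on the C3 integration as documented in the Reference Manual, but the exact keyword set here is an assumption and worth verifying.

```
# Sketch of a function train (low-rank) UQ study; some keyword
# names are assumed rather than confirmed.
method
  function_train
    start_order = 2       # initial polynomial order per core
    start_rank  = 2       # initial tensor-train rank
    max_rank    = 10
    adapt_rank            # let C3 adapt ranks to the map's structure
    collocation_points = 200
```

When the input-output map is genuinely low rank, the number of coefficients grows linearly rather than exponentially in the input dimension, which is the structure the decomposition exploits.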
Evaluation data (variables and responses) may now be output to disk in HDF5 format. HDF5 support has been added to all of our downloads. See the Dakota HDF5 Output section of the Reference Manual for full details.
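Enabling the HDF5 results database is done in the environment block; a minimal fragment, assuming the `results_output` specification described in the Reference Manual, looks like:

```
# Request HDF5 output of results and evaluation data.
environment
  results_output
    hdf5
    results_output_file = 'dakota_results'   # writes dakota_results.h5
```

The resulting `.h5` file can then be browsed in the Dakota GUI or read with any standard HDF5 tool; see the Dakota HDF5 Output section of the Reference Manual for the database layout.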
Capabilities for multilevel polynomial chaos expansion (ML PCE) and multilevel stochastic collocation (ML SC) have been expanded and hardened to improve their efficiency, completeness, and accuracy.