New form-based editors for Dakota interface blocks and hybrid method blocks.
Limited support for visualization of Dakota uncertainty variables (normal, lognormal, Weibull)
Pre-processing markup is now supported in the Dakota text editor, which also provides a new mechanism for assigning multiple NGW-based analysis drivers at runtime.
Dark theme support for Dakota text editor
QOI
New column-based QOI extractors for extracting fields from tabular/CSV-based data
Warning: The qoiExtractor node in Next-Gen Workflow has received backwards-incompatible changes. You must delete your qoiExtractor nodes and reconfigure them upon switching to 6.13
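Conceptually, a column-based QOI extractor pulls a named field out of tabular data. The following is a minimal Python sketch of that idea only, not the actual implementation of the Next-Gen Workflow node:

```python
import csv
import io

def extract_column(tabular_text, column):
    """Return all values of one named column from CSV-formatted text."""
    reader = csv.DictReader(io.StringIO(tabular_text))
    return [row[column] for row in reader]

# Hypothetical tabular output from an analysis driver:
data = "step,temperature\n1,300.5\n2,310.2\n"
print(extract_column(data, "temperature"))  # ['300.5', '310.2']
```

The NGW extractor performs the same role graphically, routing the extracted values to downstream workflow nodes.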
Misc.
"dprepro" node added to Next-Gen Workflow
General enhancements for the New Dakota Study wizard
Dakota 6.12
Released: May 15, 2020
Release Highlights:
The efficient_global method for optimization and least squares now supports concurrent refinement (adding multiple points).
(Experimental) The MIT Uncertainty Quantification (MUQ) MUQ2 library (Parno, Davis, Marzouk, et al.) enhances Dakota's Bayesian inference capability with new Markov Chain Monte Carlo (MCMC) sampling methods. MCMC samplers available in Dakota (under method > bayes_calibration > muq) include Metropolis-Hastings and Adaptive Metropolis. Future work will activate MUQ's more advanced samplers, including surrogate-based and derivative-enhanced sampling, as well as delayed rejection schemes.
(Experimental) Dakota 6.12 extends functional tensor train (FTT) surrogate models from the C3 library (Gorodetsky, University of Michigan) to support building FTT approximations across a sequence of model fidelities (multifidelity FTT) or model resolutions (multilevel FTT).
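As an illustrative sketch of the MUQ capability above, a MUQ-based Metropolis-Hastings calibration might be requested with an input block along these lines (keyword spellings are assumed here and should be checked against the 6.12 Reference Manual):

```
method
  bayes_calibration muq
    metropolis_hastings
    chain_samples 1000
    seed 2460
```

Substituting the adaptive_metropolis keyword would select MUQ's Adaptive Metropolis sampler instead.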
Dakota 6.11
Released: November 15, 2019
Release Highlights:
The Dakota GUI has gained many significant features over the last year. It now allows you to seamlessly browse Dakota's HDF5 database output files and to generate sophisticated graphical plots from HDF5 data with just a few clicks. HDF5 browsing and plot generation can also be driven within the Dakota GUI by Next-Gen Workflow, a powerful tool for constructing node-based workflows for complex tasks.
The C3 library (POC: Prof. A. Gorodetsky, University of Michigan) provides a new capability for low rank approximation by functional tensor train decomposition. C3 has been integrated in Dakota, alongside other stochastic expansion methods for UQ based on projection, regression, or interpolation, enabling an array of approaches for exploiting special structure (e.g., sparsity, low rank, low intrinsic dimension) in an input-output map.
More Training Videos Available
New videos for Sensitivity Analysis, Optimization, Calibration, Uncertainty Quantification, Surrogate Models, and Parallelism have been added!
Training Videos Available
The Dakota team is excited to announce the release of introductory Dakota training via streaming video.
Dakota 6.10
Released: May 15, 2019
Release Highlights:
Evaluation data (variables and responses) may now be output to disk in HDF5 format. HDF5 support has been added to all of our downloads. See the Dakota HDF5 Output section of the Reference Manual for full details.
Capabilities for multilevel polynomial chaos expansion (ML PCE) and stochastic collocation (ML SC) have been expanded and hardened to improve their efficiency, completeness, and accuracy.
Dakota 6.9
Released: November 15, 2018
Release Highlights:
Dakota can now output method results to HDF5
Dakota's graphical user interface (GUI) was updated with several new features including the Dakota Study Wizard.
Bayesian calibration capabilities received several enhancements, including model evidence calculation with Monte Carlo sampling and 2nd-order local Laplace approximation
Dakota 6.8
Released: May 15, 2018
Release Highlights:
dprepro was completely re-written and has many new features, including the ability to execute arbitrary Python scripting in templates
Dakota's graphical user interface (GUI) was updated with many new features and bugfixes
Dakota now includes a suite of gradient-based optimization algorithms from the SNL-developed Rapid Optimization Library (ROL).
Bayesian calibration capabilities received several enhancements, including improved concurrency in evaluating optimal experimental designs
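The rewritten dprepro noted above can embed Python directly in templates. The sketch below assumes the documented default delimiters ({ } for inline expressions, % for a single-line Python statement, {% %} for blocks) and hypothetical Dakota variables x1 and x2 supplied at runtime:

```
% import math
{%
# Arbitrary Python: derive a quantity from Dakota-supplied variables
radius = math.sqrt(x1**2 + x2**2)
%}
radius = {radius}
scaled = {radius * 2.5}
```

When Dakota runs dprepro on this template, the { } expressions are replaced by their evaluated values in the generated input file.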
Dakota 6.7
Released: November 15, 2017
Release Highlights:
Graphical user interface improvements, including support for several additional visualization types, and more...
Substantial improvements to multilevel and multifidelity methods, including greater flexibility in model hierarchies, and more...
Dakota now requires a C++11-compliant compiler, together with CMake 3.1 or newer (3.6 or later recommended).
Dakota in SIAM News
The following Dakota article appeared in SIAM News in April 2017: