About Dakota

These capabilities may be used on their own or as components within advanced strategies such as hybrid optimization, surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty.

Why Dakota?

Computational methods developed in structural mechanics, heat transfer, fluid mechanics, shock physics, and many other fields of engineering can be an enormous aid to understanding the complex physical systems they simulate. Analysts often want to use these simulations as virtual prototypes to obtain an acceptable or optimized design for a particular system. This effort seeks to enhance the utility of these computational methods by enabling their use as design tools, so that simulations may be used not just for single-point predictions, but also for automated determination of system performance improvements throughout the product life cycle.

This allows analysts to address the fundamental engineering questions of foremost importance to our programs, such as:

  • "What is the best design?"
  • "How safe is it?"
  • "How much confidence do I have in my answer?"

System performance objectives can be formulated to:

  • minimize weight, cost, or defects;
  • limit a critical temperature, stress, or vibration response;
  • maximize performance, reliability, throughput, reconfigurability, agility, or design robustness.

A systematic, rapid method of determining these optimal solutions leads to better designs and improved system performance, and reduces dependence on physical prototypes and testing, which in turn shortens the design cycle and lowers development costs.

The Dakota Software Toolkit

Toward these ends, a general-purpose software toolkit is under continuing development for the integration of commercial and in-house simulation capabilities with broad classes of systems analysis tools. Written in C++, the Dakota toolkit is intended as a flexible, extensible interface between simulation codes and a variety of iterative systems analysis methods, including optimization, uncertainty quantification, nonlinear least squares methods, and sensitivity/variance analysis.
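The iterative loop such a toolkit manages can be illustrated with a minimal, hypothetical sketch: a derivative-free pattern search repeatedly evaluating a black-box "simulation" (here a cheap analytic stand-in for an expensive code). The function and optimizer names below are illustrative only and are not Dakota's API; they simply show how an iterative method and a simulation interact only through parameter-in, response-out evaluations.

```python
def simulate(x):
    # Hypothetical stand-in for an expensive simulation code:
    # takes a list of design variables, returns a scalar response.
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

def pattern_search(f, x0, step=1.0, tol=1e-6, max_evals=10000):
    """Simple derivative-free compass search: probe each coordinate
    direction, accept improvements, halve the step when stuck."""
    x = list(x0)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                ft = f(trial)
                evals += 1
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
        if not improved:
            step *= 0.5  # refine the search around the current best point
    return x, fx

best_x, best_f = pattern_search(simulate, [0.0, 0.0])
```

The optimizer never looks inside `simulate`; it only submits parameter sets and reads back responses. That loose coupling is what lets the same iterative methods drive very different simulation codes.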