Dakota Reference Manual
Version 6.2
Large-Scale Engineering Optimization and Uncertainty Analysis
Morris One-at-a-Time
Alias: none
Argument(s): none
| Required/Optional | Description of Group | Dakota Keyword | Dakota Keyword Description |
|---|---|---|---|
| Optional | | partitions | Number of partitions of each variable |
| Optional | | samples | Number of samples for sampling-based methods |
| Optional | | seed | Seed of the random number generator |
| Optional | | model_pointer | Identifier for model block to be used by a method |
The Morris One-At-A-Time (MOAT) method, originally proposed by Morris [62], is a screening method, designed to explore a computational model to distinguish between input variables that have negligible, linear and additive, or nonlinear or interaction effects on the output. The computer experiments performed consist of individually randomized designs which vary one input factor at a time to create a sample of its elementary effects.
The number of samples (samples) must be a positive integer multiple of (number of continuous design variables + 1) and will be automatically adjusted if misspecified.
The number of partitions (partitions) applies to each variable being studied and must be odd (the number of MOAT levels per variable is partitions + 1). This will also be adjusted at runtime as necessary.
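The adjustment rules above can be sketched in Python. This is an illustrative sketch only, not Dakota's implementation; in particular, whether Dakota rounds a misspecified sample count up or down is not stated here, so this sketch rounds up:

```python
def adjust_moat_settings(samples, partitions, num_continuous_vars):
    """Illustrative MOAT input adjustment (not Dakota source code).

    `samples` is forced to a positive multiple of (num_continuous_vars + 1),
    rounding up (an assumption), and `partitions` is forced to be odd.
    """
    block = num_continuous_vars + 1   # one MOAT replicate costs k + 1 runs
    if samples < block:
        samples = block
    elif samples % block != 0:
        samples = (samples // block + 1) * block
    if partitions % 2 == 0:
        partitions += 1               # partitions must be odd
    return samples, partitions

# 10 continuous variables: samples must be a multiple of 11
print(adjust_moat_settings(100, 4, 10))  # → (110, 5)
```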
For information on practical use of this method, see [73].
With MOAT, each dimension of a $k$-dimensional input space is uniformly partitioned into $p$ levels, creating a grid of $N = p^k$ points $\mathbf{x} \in \mathbb{R}^k$ at which evaluations of the model $y(\mathbf{x})$ might take place. An elementary effect corresponding to input $i$ is computed by a forward difference

$$d_i(\mathbf{x}) = \frac{y(\mathbf{x} + \Delta \mathbf{e}_i) - y(\mathbf{x})}{\Delta},$$

where $\mathbf{e}_i$ is the $i$th coordinate vector, and the step $\Delta$ is typically taken to be large (this is not intended to be a local derivative approximation). In the present implementation of MOAT, for an input variable scaled to $[0, 1]$, $\Delta = \frac{p}{2(p-1)}$, so the step used to find elementary effects is slightly larger than half the input range.
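The forward difference above can be sketched in Python. The model `y` and the point `x` below are toy examples, and the step formula Δ = p/(2(p−1)) for inputs scaled to [0, 1] follows the conventional Morris choice, where p is the number of levels:

```python
import numpy as np

def elementary_effect(y, x, i, p):
    """Forward-difference elementary effect of input i (toy sketch).

    Assumes inputs are scaled to [0, 1] and uses the Morris step
    Delta = p / (2 * (p - 1)), where p is the number of levels.
    """
    delta = p / (2.0 * (p - 1))
    e_i = np.zeros_like(x)
    e_i[i] = 1.0                      # i-th coordinate vector
    return (y(x + delta * e_i) - y(x)) / delta

# Toy model: linear in x[0], quadratic in x[1]
y = lambda x: 3.0 * x[0] + x[1] ** 2
x = np.array([0.25, 0.25])
print(elementary_effect(y, x, 0, p=4))  # ≈ 3.0 for the linear input
```

Because the step is large, the elementary effect is a coarse, global measure of sensitivity rather than a local derivative.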
The distribution of elementary effects $d_i$ over the input space characterizes the effect of input $i$ on the output of interest. After generating $r$ samples from this distribution, their mean,

$$\mu_i = \frac{1}{r} \sum_{j=1}^{r} d_i^{(j)},$$

modified mean (using absolute values),

$$\mu_i^* = \frac{1}{r} \sum_{j=1}^{r} \left| d_i^{(j)} \right|,$$

and standard deviation,

$$\sigma_i = \sqrt{\frac{1}{r} \sum_{j=1}^{r} \left( d_i^{(j)} - \mu_i \right)^2},$$

are computed for each input $i$. The mean and modified mean give an indication of the overall effect of an input on the output. The standard deviation indicates nonlinear effects or interactions, since it measures how much the elementary effects vary throughout the input space.
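These three screening measures for a single input can be sketched as follows, given a sample of its elementary effects. This is an illustrative sketch only; the population (1/r) form of the standard deviation is assumed here:

```python
import numpy as np

def moat_measures(effects):
    """Mean, modified mean, and standard deviation of a sample of
    elementary effects for a single input (illustrative sketch)."""
    d = np.asarray(effects, dtype=float)
    mu = d.mean()                            # mean: overall (signed) effect
    mu_star = np.abs(d).mean()               # modified mean: absolute values
    sigma = np.sqrt(((d - mu) ** 2).mean())  # population (1/r) std deviation
    return float(mu), float(mu_star), float(sigma)

# Effects that cancel in sign: the mean is 0, but the modified mean and
# standard deviation still flag the input as active (nonlinear/interaction).
print(moat_measures([2.0, -2.0, 2.0, -2.0]))  # → (0.0, 2.0, 2.0)
```

The example shows why the modified mean is useful: effects of opposite sign can cancel in the plain mean, hiding an influential input.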