While a wealth of experience in method development exists at present for
certain aspects of the uncertainty quantification problem, a cohesive
software toolkit utilizing massively parallel computing resources does
not. Several functional requirements are imposed on such a toolkit. One
is the ability to coordinate with, and utilize output from, a wide
selection of deterministic analysis codes from a variety of disciplines;
another is the ability to accommodate uncertainties characterized in a
non-probabilistic fashion (e.g., fuzzy sets, imprecise probability).
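As a purely illustrative sketch of the non-probabilistic case (not DAKOTA
code; the model and bounds below are hypothetical), an interval-valued
uncertainty can be propagated by bounding the response of a deterministic
model over the input interval rather than sampling a distribution:

    # Illustrative interval-propagation sketch (all names hypothetical;
    # not DAKOTA code). A non-probabilistic uncertainty is carried as a
    # [lower, upper] interval and propagated by bounding the output of a
    # deterministic model over that interval.

    def response(x):
        """Hypothetical deterministic analysis model."""
        return 3.0 * x ** 2 + 2.0 * x

    def propagate_interval(model, lo, hi, n_grid=101):
        """Bound the model output over [lo, hi] via a coarse grid.

        Endpoint evaluation would suffice for a monotonic model; the
        grid guards against non-monotonicity in this simple sketch.
        """
        xs = [lo + (hi - lo) * i / (n_grid - 1) for i in range(n_grid)]
        ys = [model(x) for x in xs]
        return min(ys), max(ys)

    if __name__ == "__main__":
        out_lo, out_hi = propagate_interval(response, 0.9, 1.1)
        print(f"output interval: [{out_lo:.4f}, {out_hi:.4f}]")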
The DAKOTA (Design Analysis Kit for OpTimizAtion) framework developed at
Sandia National Laboratories has been modified to incorporate a variety
of uncertainty quantification (UQ) capabilities, including analytical
(AMV/AMV+, FORM/SORM) and sampling-based (Monte Carlo, Latin Hypercube
Sampling) methodologies. Extended capabilities, including importance
sampling and stochastic finite element techniques, will be incorporated
in the near future. These UQ enhancements leverage the investment in
massively parallel computing already made in DAKOTA.
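To make the sampling-based methodologies concrete, the following is a
minimal Latin Hypercube Sampling sketch (the generic textbook
construction, not the DAKOTA implementation; the response function is
hypothetical). Each input dimension is split into n equal-probability
strata, one point is drawn per stratum, and the strata are permuted
independently across dimensions:

    # Minimal Latin Hypercube Sampling sketch (generic construction;
    # not the DAKOTA implementation).
    import random

    def latin_hypercube(n_samples, n_dims, seed=0):
        rng = random.Random(seed)
        samples = [[0.0] * n_dims for _ in range(n_samples)]
        for d in range(n_dims):
            strata = list(range(n_samples))
            rng.shuffle(strata)  # independent permutation per dimension
            for i in range(n_samples):
                # one uniform draw within the assigned stratum
                samples[i][d] = (strata[i] + rng.random()) / n_samples
        return samples

    def response(x):
        """Hypothetical deterministic model of two uncertain inputs."""
        return x[0] ** 2 + 3.0 * x[1]

    if __name__ == "__main__":
        pts = latin_hypercube(100, 2)
        vals = [response(p) for p in pts]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / (len(vals) - 1)
        print(f"sample mean = {mean:.4f}, sample variance = {var:.4f}")

Relative to plain Monte Carlo, the stratification typically reduces the
variance of such moment estimates for a fixed number of model
evaluations.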
The presentation will discuss the software design of the toolkit,
detailing some of its distinguishing features, including multilevel
parallelism, surrogate-based studies, mixed-integer capabilities, and
optimization under uncertainty (OUU). Finally, results from applying the
toolkit to a large-scale engineering analysis problem will be presented
to demonstrate its capabilities.
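As a closing illustration of the OUU pattern (a generic nested-loop
sketch under assumed model names, not the toolkit's interface), an outer
loop searches over a design variable while an inner sampling loop
estimates the expected response:

    # Generic optimization-under-uncertainty sketch (hypothetical model
    # and names; not the DAKOTA interface). Outer loop: design search.
    # Inner loop: Monte Carlo estimate of mean performance.
    import random

    def performance(design, uncertain):
        """Hypothetical model: quadratic cost plus an uncertain term."""
        return (design - 2.0) ** 2 + 0.5 * uncertain

    def mean_performance(design, n_samples=200, seed=1):
        rng = random.Random(seed)  # common random numbers across designs
        draws = [performance(design, rng.gauss(0.0, 1.0))
                 for _ in range(n_samples)]
        return sum(draws) / n_samples

    if __name__ == "__main__":
        # coarse grid search standing in for the outer optimizer
        candidates = [0.1 * i for i in range(41)]
        best = min(candidates, key=mean_performance)
        print(f"best design ~= {best:.2f}")

Reusing the same random seed for every candidate design (common random
numbers) keeps the inner-loop noise from masking differences between
designs in this simple sketch.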