Presented at the Caltech Verification and Validation Workshop, December 9-11, 1998, Michael Ortiz, organizer.


A Framework for Assessing Uncertainties in Simulation Predictions

K.M. Hanson
Los Alamos National Laboratory

Abstract

A framework for a full analysis of the uncertainties in simulation predictions, which arise from model parameters derived from uncertain measurements, is presented. The uncertainty analysis is placed in the context of a graphical representation of the interaction of simulations with experimental measurements. It is assumed that the simulation code correctly predicts a physical situation given the correct model parameters, together with the exact initial and boundary conditions. The uncertainties in the model parameters derived from a single experiment are determined using Bayes' law, which specifies the posterior probability distribution of the inferred parameters as proportional to the likelihood times the prior probability of the parameters. The likelihood of the experimental measurements predicted by the simulation for a given set of parameters is based on the probabilistic model that describes the measurement noise. In the absence of previous information, the prior for the model parameters is often assumed to be independent of the parameters. However, physically motivated constraints can be imposed through the prior, e.g., nonnegativity constraints on masses or heat capacities. If the initial and boundary conditions are uncertain, they should be included as simulation parameters in the analysis, with a prior assigned to them that represents their uncertainties. Characterization of the posterior by means of Markov chain Monte Carlo (MCMC), or some other means, yields the uncertainties in the model parameters. One must be careful to marginalize over the parameters that are not of interest, e.g., the initial and boundary conditions. It is essential to include correlations when characterizing the uncertainties because they are crucial for drawing valid inferences.
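To make the single-experiment analysis concrete, the following minimal Python sketch (not the code used in this work) characterizes the posterior p(theta | d), proportional to p(d | theta) p(theta), with a random-walk Metropolis sampler. The exponential-decay "simulation", the measurement data, the noise level, and all parameter values are hypothetical stand-ins chosen only for illustration; the Gaussian likelihood and the nonnegativity constraint in the prior follow the description above.

import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, t):
    """Toy forward model standing in for the simulation code:
    exponential decay with amplitude theta[0] and rate theta[1]."""
    return theta[0] * np.exp(-theta[1] * t)

# Hypothetical measurements from a single experiment
t = np.linspace(0.0, 2.0, 20)
theta_true = np.array([1.0, 0.7])
sigma = 0.05                               # assumed Gaussian noise level
d = simulate(theta_true, t) + sigma * rng.normal(size=t.size)

def log_posterior(theta):
    """log p(theta | d) = log likelihood + log prior, up to a constant.
    The likelihood follows from the Gaussian noise model; the prior is
    flat apart from nonnegativity constraints on both parameters."""
    if np.any(theta < 0.0):                # physically motivated constraint
        return -np.inf
    residual = d - simulate(theta, t)
    return -0.5 * np.sum((residual / sigma) ** 2)

# Random-walk Metropolis: one simulation run per proposed parameter set
n_steps, step = 20000, 0.02
samples = np.empty((n_steps, 2))
theta = np.array([0.8, 0.5])               # starting guess
logp = log_posterior(theta)
for i in range(n_steps):
    proposal = theta + step * rng.normal(size=2)
    logp_new = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_new - logp:
        theta, logp = proposal, logp_new
    samples[i] = theta

# The samples characterize the posterior, including the correlations
# emphasized above; e.g., the posterior covariance of the parameters:
print(np.cov(samples[n_steps // 2:].T))    # discard burn-in half

In a real analysis the toy forward model would be replaced by a run of the full simulation code, which is why each MCMC step is costly and why summarizing posteriors, as discussed next, is attractive.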

This approach may be extended to analyze the numerous experiments that contribute to our knowledge of the model parameters. Bayes' law allows information to be accumulated: the prior on the model parameters in the analysis of each experiment is based on the posterior that summarizes all previous analyses. Equivalently, one may conceive of a grand analysis of all available experiments, in which the total likelihood is the product of the likelihoods of the individual experiments. Of course, in this approach each experiment requires its own simulation to predict its own set of measurements. The overall analysis may be facilitated by summarizing the posterior from each individual experiment, e.g., by a covariance matrix appropriate to a Gaussian approximation, and using that summary in place of recalculating the full simulation. A graphical display of the analysis scheme permits one to follow and analyze the logic of this grand analysis. To perform the overall analysis correctly, it is important to keep straight the dependencies of the final results on the individual experiments, which may be determined from the graphical diagram. In the end, the uncertainty distribution of a prediction for a new situation is obtained using the Monte Carlo approach of drawing model parameter sets from their overall uncertainty distribution and running the simulation for each set.
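The final Monte Carlo propagation step can be sketched in the same spirit. Here the grand analysis is assumed to have been summarized by a Gaussian approximation to the posterior, with a hypothetical mean vector and covariance matrix, and a toy function stands in for running the full simulation for the new situation.

import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Gaussian summary of the grand analysis: posterior mean
# and covariance of the model parameters (off-diagonal terms carry the
# correlations that must not be dropped).
theta_mean = np.array([1.0, 0.7])
theta_cov = np.array([[4.0e-4, -1.5e-4],
                      [-1.5e-4,  2.5e-4]])

def predict(theta):
    """Stand-in for running the full simulation for the new situation;
    here, the toy decay model evaluated at a new time."""
    return theta[0] * np.exp(-theta[1] * 3.0)

# Draw parameter sets from their overall uncertainty distribution and
# run the simulation once per draw; the spread of the outputs is the
# uncertainty distribution of the prediction.
draws = rng.multivariate_normal(theta_mean, theta_cov, size=1000)
predictions = np.array([predict(th) for th in draws])
print(predictions.mean(), predictions.std())

Each draw costs one simulation run, so in practice the number of draws is set by the affordable number of runs; the correlated covariance above is what distinguishes this from naively perturbing each parameter independently.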

Keywords: uncertainty estimation, error analysis, verification and validation, probabilistic network, model checking, adjoint differentiation

Viewgraphs for this talk (pdf, 118 KB)

E-mail: kmh@hansonhub.com
WWW: http://home.lanl.gov/kmh/