Presented at Uncertainty Quantification Workshop, March 18, 1999, Luis Salazar, org.

Bayesian Techniques for Quantifying Uncertainties

Kenneth M. Hanson
DX-3, Los Alamos National Laboratory


A framework is presented for a full analysis of the uncertainties in simulation predictions, which arise from model parameters derived from uncertain measurements. This framework may be useful for validating the hydrocodes to be used to assess the performance and safety of our weapons systems. An implementation based on this approach would serve as: 1) a database of experiments, showing links between analyses, 2) a logically consistent means to make inferences about models based on all available information, and 3) a natural way to understand which parameter adjustments, made to match fully integrated experiments, are allowable on the basis of previous experiments. The resulting quantified uncertainties could be used to supplement the expert opinions presently employed in the Enhanced Reliability Methodology.

First, consider the analysis of an individual experiment. The uncertainty analysis is placed in the context of a graphical representation of the interaction of simulations with experimental measurements. It is assumed that the simulation code correctly predicts a physical situation given the correct model parameters, together with the exact initial and boundary conditions. The uncertainties in the model parameters derived from a single experiment are determined using Bayes' law, which specifies the posterior probability distribution of the inferred parameters as proportional to the likelihood times the prior probability of the parameters. The likelihood of the experimental measurements, predicted by the simulation for a given set of parameters, is based on the probabilistic model that characterizes the measurement noise. In the absence of previous information, the prior for the model parameters is often assumed to be independent of the parameters. If the initial and boundary conditions are uncertain, they are to be included as simulation parameters in this analysis, with a prior assigned to them that represents their uncertainties. Characterization of the posterior by means of Markov Chain Monte Carlo (MCMC), or some other means, yields the uncertainties in the model parameters. One must marginalize over the nuisance parameters, e.g., the initial and boundary conditions. It is essential to include correlations in the uncertainties because these are crucial for drawing valid inferences.
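The single-experiment analysis above can be sketched in a few lines of code. The example below is a minimal toy, not a hydrocode: the "simulation" is a linear model whose slope a is the physical parameter of interest and whose offset b stands in for an uncertain initial condition (a nuisance parameter). The noise level, the prior on b, and all numerical values are illustrative assumptions. A random-walk Metropolis sampler characterizes the posterior; marginalization over b then amounts to examining the samples of a alone, and the sample covariance captures the correlation between the two parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "simulation": predicts measurements from a model parameter a
# and an uncertain initial condition b (a nuisance parameter).
def simulate(a, b, x):
    return a * x + b

# Synthetic experiment with known Gaussian measurement noise sigma.
x = np.linspace(0.0, 1.0, 20)
sigma = 0.1
a_true, b_true = 2.0, 0.5
y = simulate(a_true, b_true, x) + rng.normal(0.0, sigma, x.size)

# Log-posterior = log-likelihood + log-prior: flat prior on a,
# Gaussian prior on b expressing uncertainty in the initial condition.
def log_post(theta):
    a, b = theta
    resid = y - simulate(a, b, x)
    log_like = -0.5 * np.sum((resid / sigma) ** 2)
    log_prior_b = -0.5 * ((b - 0.5) / 0.2) ** 2
    return log_like + log_prior_b

# Random-walk Metropolis MCMC: propose a step, accept with
# probability min(1, posterior ratio).
def metropolis(log_p, theta0, step, n_samples):
    theta = np.array(theta0, float)
    lp = log_p(theta)
    chain = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        prop = theta + rng.normal(0.0, step, theta.size)
        lp_prop = log_p(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

chain = metropolis(log_post, [1.0, 0.0], 0.05, 20000)[5000:]  # drop burn-in

# Marginalizing over the nuisance parameter b is automatic with MCMC:
# simply look at the samples of a alone.  The sample covariance
# retains the (anti)correlation between a and b, which is essential
# for valid downstream inferences.
mean = chain.mean(axis=0)
cov = np.cov(chain.T)
```

The key practical point is the last two lines: the posterior samples provide both marginal uncertainties and the full parameter covariance, rather than independent error bars.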

The above approach may be extended to analyze numerous experiments that contribute to our knowledge of the model parameters. Bayes' law allows accumulation of information by basing the prior on the model parameters in the analysis of each experiment on the posterior that summarizes previous analyses. The overall analysis may be facilitated by summarizing the posterior from each individual experiment, e.g., in a Gaussian approximation in terms of the estimated parameters and their covariance matrix. A graphical display of the probabilistic analysis scheme permits one to follow and analyze the logic of the analysis. Performing the correct overall analysis requires keeping track of the dependencies of the final results on individual experiments, which may be determined from the graphical diagram. In the end, the uncertainty distribution in a prediction for a new situation is obtained using the Monte Carlo approach of drawing model parameters from their overall uncertainty distribution and running the simulation for each set of parameters.
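The two operations in this paragraph, using a posterior as the prior for the next experiment and propagating parameter uncertainty into a prediction, both have simple forms when each posterior is summarized by a Gaussian approximation. The sketch below illustrates them with invented numbers; the parameter estimates, covariance matrix, and stand-in "simulation" are assumptions for the example, not values from the talk. In the Gaussian case the sequential update is closed-form (precision matrices add, means combine by precision weighting), and the predictive uncertainty is obtained by drawing parameter sets from the overall uncertainty distribution and running the simulation for each draw.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gaussian summary of the posterior from previous experiments:
# estimated parameters theta_hat and covariance C (illustrative values).
theta_hat = np.array([2.0, 0.5])
C = np.array([[0.010, -0.004],
              [-0.004, 0.003]])

# Sequential Bayesian updating in the Gaussian approximation:
# the posterior (mu0, C0) from earlier analyses serves as the prior,
# and (mu1, C1) summarizes the new experiment.  Precisions (inverse
# covariances) add; means combine by precision weighting.
def gaussian_update(mu0, C0, mu1, C1):
    P0, P1 = np.linalg.inv(C0), np.linalg.inv(C1)
    C_post = np.linalg.inv(P0 + P1)
    mu_post = C_post @ (P0 @ mu0 + P1 @ mu1)
    return mu_post, C_post

# Monte Carlo prediction: draw parameter sets from the overall
# uncertainty distribution and run the simulation for each set.
def simulate(theta, x_new):
    a, b = theta
    return a * x_new + b        # stand-in for a full simulation code

draws = rng.multivariate_normal(theta_hat, C, size=5000)
preds = np.array([simulate(th, 2.0) for th in draws])
pred_mean, pred_std = preds.mean(), preds.std()
```

Because the draws come from the joint distribution, the anticorrelation between the two parameters is propagated into the prediction; sampling each parameter independently from its marginal would overstate the predictive uncertainty here.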

Keywords: uncertainty estimation, error analysis, verification and validation, probabilistic network, model checking, adjoint differentiation

Viewgraphs for this talk (pdf, 130 KB)