
Dream
Our Center's dream is to have modeling and simulation be a ubiquitous part of how Sandia performs its mission. Our dream is to have optimization be a ubiquitous part of modeling and simulation. Thus our R&D activities span the spectrum from stand-alone design optimization and assessment tools to intrusive capabilities that tightly couple with simulations, together with frameworks for computing sensitivities and derivatives. Our dream is that whenever Sandians need to run a simulation or modeling code more than once, say with varying parameters, they will want to use our tools to direct their study.

Mission
Our mission is to lead in the fields of large-scale optimization and uncertainty estimation. We research general methodologies and specific algorithms, and we develop software. We enable the application of these tools to the important problems facing the Nation.

Vision
Our vision is to revolutionize how Sandia and its DOE partners make critical decisions, by integrating our tools with design, analysis, and assessment processes. Our goal is to have our methods and software adopted throughout the broader community.

Optimization and Model Insight Research Directions at Sandia National Laboratories, 2002 INFORMS Chicago Chapter CUSTOM Workshop, Managing Risk in an Uncertain World. This plenary talk gives an overview of Sandia's application challenges leading to our department's unique research focus.



People

SNL 1411 Roster (Sandia only)


Projects

A key strength of the department is our many effective collaborations with experts from other parts of Sandia, DOE labs, universities and industry. Major activities of the department include the following:

Verification and Validation
We research key principles and methodologies for verification and validation (V&V). We apply these principles and methodologies to computational science and engineering software of importance to DOE, in particular software developed under the ASC program. This allows us to understand how well these simulations model reality, and to understand when we can have confidence in a computational answer. Our V&V research critically benefits from our department's fundamental scientific strengths in optimization and uncertainty estimation. We are actively researching the method of manufactured solutions for large-scale PDE-based simulation codes.
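
As an illustration of the manufactured-solutions idea, the short Python sketch below chooses an exact solution for a one-dimensional Poisson problem, derives the matching forcing term symbolically, and confirms that a simple centered-difference solver exhibits the expected second-order convergence. The problem, discretization, and grid sizes are illustrative assumptions, not taken from any ASC code.

# Method of manufactured solutions for -u'' = f on [0,1], u(0)=u(1)=0.
# The exact solution is chosen up front, the forcing term is derived
# symbolically, and the observed convergence rate of a finite-difference
# solver is checked against the expected second-order rate.
import numpy as np
import sympy as sp

x = sp.symbols("x")
u_exact = sp.sin(sp.pi * x)              # manufactured solution
f_sym = -sp.diff(u_exact, x, 2)          # forcing term implied by the PDE
u_fun = sp.lambdify(x, u_exact, "numpy")
f_fun = sp.lambdify(x, f_sym, "numpy")

def solve_poisson(n):
    """Second-order centered-difference solve of -u'' = f with n interior points."""
    h = 1.0 / (n + 1)
    xs = np.linspace(h, 1.0 - h, n)
    A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return xs, np.linalg.solve(A, f_fun(xs))

errors = []
for n in (16, 32, 64):
    xs, u_h = solve_poisson(n)
    errors.append(np.max(np.abs(u_h - u_fun(xs))))

rates = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
print("observed convergence rates:", rates)   # should approach 2.0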

Large-scale Optimization Software
We provide DAKOTA, a freely available optimization framework. DAKOTA provides algorithms for design optimization, uncertainty quantification, parameter estimation, design of experiments, and sensitivity analysis. These algorithms include both Sandia-developed and externally developed libraries. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed-integer nonlinear programming, or optimization under uncertainty. In addition, DAKOTA provides parallel computing services and hooks for invoking user-developed interfaces to simulations. DAKOTA supports hardware ranging from desktop machines to massively parallel ASC supercomputers.

The initial design violated allowable response levels by a factor of 2, whereas the design optimized by DAKOTA removed all violations and still met strict weight targets.
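
The sketch below illustrates, in Python, the kind of black-box analysis driver that a framework such as DAKOTA can invoke: parameters are read from a file, a stand-in simulation is run, and response values are written back for the optimizer. The file layouts, variable names, and response model here are simplified placeholders, not DAKOTA's actual parameter and results syntax; see the DAKOTA documentation for the real conventions.

# Sketch of a file-based analysis driver: read parameters, run a stand-in
# simulation, write responses.  The formats below are assumed placeholders.
import sys

def read_parameters(path):
    """Assume a simplified 'value  descriptor' layout, one variable per line."""
    params = {}
    with open(path) as fh:
        for line in fh:
            fields = line.split()
            if len(fields) == 2:
                value, name = fields
                params[name] = float(value)
    return params

def run_simulation(params):
    """Stand-in for an expensive simulation: a simple beam-like response."""
    w, t = params["width"], params["thickness"]
    return {"mass": w * t, "stress": 1.0 / (w * t**2)}

def write_results(path, responses):
    with open(path, "w") as fh:
        for name, value in responses.items():
            fh.write(f"{value:.12e}  {name}\n")

if __name__ == "__main__":
    params_file, results_file = sys.argv[1], sys.argv[2]
    write_results(results_file, run_simulation(read_parameters(params_file)))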

We are currently using optimization for designing microsystems. We participate in MESA-TOP product teams, helping to mature specific microsystem products and to discover effective design methodologies.

Surrogate-Based Optimization Algorithms
We research and develop novel optimization algorithms that address the difficult nature of many engineering and scientific applications. Typically, computational models of real-world systems are expensive to evaluate, have inaccurate or nonexistent gradients, and have multiple local optima. We approximate these poorly-behaved models with well-behaved surrogate functions. Surrogate-based optimization algorithms employ multifidelity models or smooth approximate models to produce optimal solutions. Often, these problems would otherwise be intractable. We deliver this software in several packages including DAKOTA and SURFPACK++.
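
The Python sketch below illustrates the basic surrogate-based loop on a one-dimensional "expensive" function: a smooth polynomial surrogate is fit to a handful of true evaluations, the surrogate is minimized cheaply, and the surrogate optimum is verified against the true model and added to the training data. The model, surrogate form, and iteration count are illustrative assumptions, not algorithms taken from DAKOTA or SURFPACK++.

# Surrogate-based optimization sketch: fit a cheap smooth surrogate to a
# few expensive evaluations, optimize the surrogate, verify, and repeat.
import numpy as np
from scipy.optimize import minimize_scalar

def expensive_model(x):
    """Stand-in for a costly simulation: smooth trend plus wiggles."""
    return (x - 0.3) ** 2 + 0.05 * np.sin(25.0 * x)

xs = list(np.linspace(0.0, 1.0, 5))          # initial design points
ys = [expensive_model(x) for x in xs]

for iteration in range(4):
    coeffs = np.polyfit(xs, ys, deg=3)       # smooth surrogate of the data
    surrogate = np.poly1d(coeffs)
    result = minimize_scalar(surrogate, bounds=(0.0, 1.0), method="bounded")
    x_new = float(result.x)
    y_new = expensive_model(x_new)           # verify against the true model
    xs.append(x_new)
    ys.append(y_new)
    print(f"iter {iteration}: surrogate minimum x={x_new:.4f}, true f={y_new:.4f}")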

Uncertainty Estimation and Optimization Under Uncertainty
Uncertainty encompasses both variability and lack of knowledge. Variability can be quantified using classical probability theory and subsequently propagated through calculations. Lack of knowledge includes factors such as the fidelity and completeness of information, and is much more difficult to incorporate into computational tools.

We develop software that quantifies the variability in predictions of complex system response. This prediction uncertainty can arise from physical uncertainties, for example uncertain material properties, which appear as parameters in the computational model. Uncertainty in computed system response can also result from the omission of pertinent physics or from an under-resolved deterministic model. We provide a suite of design tools that couple optimization and uncertainty software. This robust tool suite can identify worst-case scenarios and design systems that meet performance requirements despite uncertain service conditions and manufacturing variability.
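
As a simple illustration of forward propagation of variability, the Python sketch below samples two uncertain inputs, pushes them through a stand-in response model, and estimates the probability that a performance requirement is violated. The distributions, response function, and threshold are assumed values chosen only for the example, not data from any Sandia application.

# Monte Carlo propagation of input variability through a stand-in model,
# with an estimate of the probability of violating a requirement.
import numpy as np

rng = np.random.default_rng(seed=0)
n_samples = 100_000

# Uncertain inputs: capacity and demand (assumed normal distributions).
strength = rng.normal(loc=40.0, scale=3.0, size=n_samples)
load = rng.normal(loc=30.0, scale=4.0, size=n_samples)

# System response: margin between capacity and demand.
margin = strength - load

prob_failure = np.mean(margin < 0.0)
print(f"mean margin   = {margin.mean():.2f}")
print(f"std of margin = {margin.std():.2f}")
print(f"P(margin < 0) = {prob_failure:.4f}")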

We research lack of knowledge questions, such as how missing or over-simplified physics affects model validation. We also research how new information can be used to enrich the models. For example, we may initially have insufficient information to characterize a probability distribution, but new data may become available through additional computations or experiments. In this case we would like to modify the probability model, e.g. through Bayesian updating.
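
The Python sketch below illustrates Bayesian updating in its simplest conjugate form: a vague normal prior on an uncertain model parameter is sharpened as new measurements arrive. The prior, noise level, and data are hypothetical and serve only to show how the probability model tightens with new information.

# Conjugate Bayesian updating of a normal mean with known noise variance.
import numpy as np

prior_mean, prior_var = 0.0, 10.0 ** 2     # weakly informative prior
noise_var = 2.0 ** 2                       # assumed measurement noise variance

new_data = np.array([4.8, 5.3, 5.1])       # hypothetical new observations

mean, var = prior_mean, prior_var
for y in new_data:
    # Conjugate update: precisions add, means combine precision-weighted.
    post_var = 1.0 / (1.0 / var + 1.0 / noise_var)
    post_mean = post_var * (mean / var + y / noise_var)
    mean, var = post_mean, post_var
    print(f"after y={y:.1f}: mean={mean:.3f}, std={np.sqrt(var):.3f}")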

In homeland security applications, uncertainty in information is common, encompassing both variability and, especially, lack of knowledge. We wrote a whitepaper describing the important role that explicit treatment of uncertainty could play in making effective homeland security decisions.

Sampling and Statistical Understanding
We provide a variety of methods that allow a user to run a collection of computer simulations to assess the sensitivity of model outputs with respect to model inputs. Common categories include parameter studies, sampling methods, and design of experiments. In parameter studies, one steps some input parameters through a range while keeping the other input parameters fixed and evaluates how the output varies. In sampling methods, one generates samples from an input-space distribution and calculates the output response at the sampled input values. Specific sampling methods available within DAKOTA include Monte Carlo, Latin Hypercube, and (coming soon) quasi-Monte Carlo. In design of experiments, the output is evaluated at a set of input "design" points chosen to sample the space in a representative way. Specific design-of-experiments methods available within DAKOTA include Box-Behnken, Central Composite, and Factorial designs.

Sensitivity metrics are a mathematical way of expressing the dependence of outputs on inputs. A variety of sensitivity metrics are available within DAKOTA, such as simple and partial correlation coefficients and rank correlations. Our current research focuses on methods to generate sensitivity metrics with a minimal number of runs, and on optimal estimation of parameters in computer models using Bayesian analysis techniques.


Samples within a two-dimensional design space. Stars and dots represent two possible ways to sample the space using orthogonal arrays.
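
The Python sketch below walks through this workflow on a toy problem: it draws a small Latin hypercube sample over two inputs, evaluates a simple response at each point, and reports simple and rank correlation coefficients. The model, input ranges, and sample size are illustrative assumptions rather than a DAKOTA study.

# Latin hypercube sampling plus simple (Pearson) and rank (Spearman)
# correlation coefficients as sensitivity indicators for a toy model.
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One point per equal-probability bin in each dimension, shuffled per dimension."""
    sample = np.empty((n_samples, n_dims))
    for d in range(n_dims):
        bins = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
        sample[:, d] = rng.permutation(bins)
    return sample

def rank_correlation(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ranks = lambda v: np.argsort(np.argsort(v))
    return np.corrcoef(ranks(a), ranks(b))[0, 1]

rng = np.random.default_rng(seed=1)
u = latin_hypercube(50, 2, rng)

# Map unit-hypercube samples onto assumed physical input ranges.
temperature = 300.0 + 100.0 * u[:, 0]        # 300-400 K
pressure = 1.0 + 4.0 * u[:, 1]               # 1-5 atm

# Toy response: strongly driven by temperature, weakly by pressure.
response = 0.02 * temperature ** 1.5 + 0.5 * pressure

for name, inputs in (("temperature", temperature), ("pressure", pressure)):
    simple = np.corrcoef(inputs, response)[0, 1]
    rank = rank_correlation(inputs, response)
    print(f"{name:12s}: simple corr = {simple:+.3f}, rank corr = {rank:+.3f}")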

Large Scale PDE Constrained Optimization
We are investigating simultaneous analysis and design methods (SAND) to handle problems involving large design spaces and complex PDE constrained simulations. This investment has paid off for a number of applications. One benefit is that this approach can reduce running time by several orders of magnitude, making certain problems tractable. This broad effort involves ongoing R&D on sensitivities, rSQP algorithms, linear algebra interfaces and frameworks. Many codes would benefit from incorporating SAND methods; the first simulation codes we targeted are for compressible fluid flow and electrical simulations.
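
The Python sketch below conveys the all-at-once idea on a tiny source-inversion problem: the discretized state equation is imposed as an equality constraint, and a general-purpose optimizer searches over state and design variables simultaneously instead of solving the PDE to completion at every design iterate. The problem size, regularization, and choice of SLSQP are illustrative assumptions, not the rSQP machinery described above.

# Simultaneous analysis and design (all-at-once) sketch: optimize over
# [state, design] with the discretized PDE as an equality constraint.
import numpy as np
from scipy.optimize import minimize

n = 15                                   # interior grid points
h = 1.0 / (n + 1)
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2          # discrete -d^2/dx^2

xs = np.linspace(h, 1.0 - h, n)
u_target = np.sin(np.pi * xs)            # state profile we want to reach
alpha = 1.0e-6                           # regularization on the source design

def split(z):
    return z[:n], z[n:]                  # z = [state u, design d]

def objective(z):
    u, d = split(z)
    return 0.5 * np.sum((u - u_target) ** 2) + 0.5 * alpha * np.sum(d ** 2)

def state_equation(z):
    u, d = split(z)
    return A @ u - d                     # residual of the discretized PDE

z0 = np.zeros(2 * n)
result = minimize(objective, z0, method="SLSQP",
                  constraints=[{"type": "eq", "fun": state_equation}],
                  options={"maxiter": 500})

u_opt, d_opt = split(result.x)
print("converged:", result.success,
      " state mismatch:", np.linalg.norm(u_opt - u_target))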

Future work includes real-time optimization for chemical inversion problems: using PDE-based inversion to identify the origin of a contaminant from chemical sensor data, and calculating how best to redirect air or water flow to mitigate the effects of an attack.

Mesh Optimization
We are researching the general theory of mesh quality. We have developed a mathematical treatment of metrics that is more directly related to simulation convergence and error than prior approaches. It also unifies and generalizes some metrics being used by mesh generation practitioners. These metrics are deployed through the Verdict library and the Mesquite mesh optimization toolkit. Mesquite is largely funded through the Terascale Simulation Tools and Technology (TSTT) Center, part of the DOE Office of Science. Mesquite includes algorithms to optimize the placement of mesh nodes according to these metrics. We are also exploring the connection between mesh optimization and shape optimization.
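
As a small illustration of metric-driven mesh optimization, the Python sketch below adjusts the position of a single free node in a triangle fan to improve a standard shape-quality metric (scaled area over sum of squared edge lengths, which equals 1 for an equilateral triangle). The mesh, metric choice, and optimizer are illustrative assumptions, not the Verdict or Mesquite implementations.

# Node-movement mesh optimization sketch: improve a triangle shape metric
# by repositioning one free interior node of a small fan mesh.
import numpy as np
from scipy.optimize import minimize

corners = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
triangles = [(0, 1), (1, 2), (2, 3), (3, 0)]     # each triangle = (corner i, corner j, free node)

def triangle_quality(a, b, c):
    """Shape metric in (0, 1]; larger is better, 1 is equilateral."""
    area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0]))
    edge_sq = np.sum((b - a) ** 2) + np.sum((c - b) ** 2) + np.sum((a - c) ** 2)
    return 4.0 * np.sqrt(3.0) * area / edge_sq

def objective(p):
    """Penalize poor (small) qualities by summing their reciprocals."""
    return sum(1.0 / max(triangle_quality(corners[i], corners[j], p), 1e-12)
               for i, j in triangles)

p0 = np.array([0.9, 0.1])                        # badly placed interior node
result = minimize(objective, p0, method="Nelder-Mead")
print("optimized node position:", result.x)      # should move toward (0.5, 0.5)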



Products

Downloads



Publications


Links

Collaborations and related technical groups

External

Internal

Related program groups



Web Site Contact: Rosa M. Zalesak • Technical Contact: Scott A. Mitchell
Updated: July 26, 2006