Wednesday, December 9, 2009

Verification, Validation, and Uncertainty Quantification

Notes on Chapter 8: Verification, Validation, and Uncertainty Quantification by George Em Karniadakis [1]. Karniadakis provides the motivation for the topic right off:

In time-dependent systems, uncertainty increases with time, hence rendering simulation results based on deterministic models erroneous. In engineering systems, uncertainties are present at the component, subsystem, and complete system levels; therefore, they are coupled and are governed by disparate spatial and temporal scales or correlations.

His definitions are based on those published by the Defense Modeling and Simulation Office (DMSO) and subsequently adopted by the AIAA and others.

Verification is the process of determining that a model implementation accurately represents the developer’s conceptual description of the model and the solution to the model. Hence, by verification we ensure that the algorithms have been implemented correctly and that the numerical solution approaches the exact solution of the particular mathematical model, typically a partial differential equation (PDE). The exact solution is rarely known for real systems, so “fabricated” solutions for simpler systems are typically employed in the verification process. Validation, on the other hand, is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. Hence, validation determines how accurate the results of a mathematical model are when compared to the physical phenomenon simulated, so it involves comparison of simulation results with experimental data. In other words, verification asks “Are the equations solved correctly?” whereas validation asks “Are the right equations solved?” Or as stated in Roache (1998) [2], “verification deals with mathematics; validation deals with physics.”
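
These “fabricated” solutions are more commonly known as manufactured solutions. Here’s a minimal sketch of code verification by the method of manufactured solutions in Python, assuming a 1-D Poisson problem of my own choosing (not an example from the chapter): pick an exact solution, manufacture the forcing term that makes it satisfy the equation, and check that the observed order of convergence matches the formal order of the discretization.

```python
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1) with u(0) = u(1) = 0, second-order differences."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)         # interior grid points
    f = np.pi**2 * np.sin(np.pi * x)       # forcing manufactured from u = sin(pi x)
    # Standard tridiagonal operator for -u'' with Dirichlet boundaries
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return x, np.linalg.solve(A, f)

errors = []
for n in (15, 31, 63, 127):                # h halves each refinement
    x, u = solve_poisson(n)
    errors.append(np.abs(u - np.sin(np.pi * x)).max())

# Observed order should approach the scheme's formal order of 2
print(np.log2(np.array(errors[:-1]) / np.array(errors[1:])))
```

If the observed order comes out below the formal order and stays there under refinement, something in the implementation (or the boundary treatment) is wrong; detecting exactly that is the point of the exercise.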

He addresses the perennial problem with validation succinctly:

Validation is not always feasible (e.g., in astronomy or in certain nanotechnology applications), and it is, in general, very costly because it requires data from many carefully conducted experiments.

Getting decision makers to pay for this experimentation or testing is especially problematic when they were initially sold on using modeling and simulation as a way to avoid testing.

After this the chapter goes into an unnecessary digression on inductive reasoning. An unfortunate common thread that I’ve noticed in many of the V&V reports I’ve read is that they seem to think Karl Popper had the last word on scientific induction! I think the V&V community would profit greatly by studying Jaynes’s theoretically sound pragmatism. They would quickly recognize that the ‘problems’ they perceive in scientific induction are little more than misunderstandings of probability theory as logic.

The chapter gets back on track with the discussion of types of error in simulations:

Uncertainty quantification in simulating physical systems is a much more complex subject; it includes the aforementioned numerical uncertainty, but often its main component is due to physical uncertainty. Numerical uncertainty includes, in addition to spatial and temporal discretization errors, errors in solvers (e.g., incomplete iterations, loss of orthogonality), geometric discretization (e.g., linear segments), artificial boundary conditions (e.g., infinite domains), and others. Physical uncertainty includes errors due to imprecise or unknown material properties (e.g., viscosity, permeability, modulus of elasticity, etc.), boundary and initial conditions, random geometric roughness, equations of state, constitutive laws, statistical potentials, and others. Numerical uncertainty is very important, and many scientific journals have established standard guidelines for how to document this type of uncertainty, especially in computational engineering (AIAA 1998 [3]).
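
Those journal guidelines mostly come down to documenting a systematic grid refinement study. Here’s a hedged sketch of Richardson-extrapolation-based error estimation using Roache’s grid convergence index (GCI); the three solution values are made up for illustration, standing in for some real functional like a drag coefficient.

```python
import math

def gci(f_coarse, f_fine, r, p, Fs=1.25):
    """Roache's grid convergence index for the fine-grid result.

    r is the refinement ratio, p the observed order of accuracy, and
    Fs a safety factor (1.25 is typical for a three-grid study)."""
    return Fs * abs((f_coarse - f_fine) / f_fine) / (r**p - 1.0)

# Hypothetical solution functional on three systematically refined grids:
# coarse, medium, fine
f3, f2, f1 = 0.971, 0.988, 0.993
r = 2.0

# Observed order of accuracy estimated from the three solutions
p = math.log(abs(f3 - f2) / abs(f2 - f1)) / math.log(r)
print(f"observed order p = {p:.2f}")
print(f"GCI (fine grid)  = {gci(f2, f1, r, p):.2e}")
```

The GCI is then reported as a relative error band on the fine-grid result, which is exactly the sort of documented numerical uncertainty the AIAA guide asks for.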

The examples given for the effects of ‘uncertainty propagation’ are interesting. The first is a direct numerical simulation (DNS) of turbulent flow over a circular cylinder. In this resolved simulation, the high wavenumbers (smallest scales) are accurately captured, but there is disagreement at the low wavenumbers (largest scales). This somewhat counter-intuitive result occurs because the small scales are insensitive to experimental uncertainties in the boundary and initial conditions, but the large scales of motion are not.

The section on methods for modeling uncertain inputs is sparse on details. Passing mention is made of Monte Carlo and quasi-Monte Carlo methods, sensitivity-based methods, and Bayesian methods.
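
Plain Monte Carlo at least deserves a sketch, since it’s the baseline everything else gets compared against: sample the uncertain inputs, push each sample through the deterministic model, and read statistics off the output ensemble. The cantilever-deflection model and input distributions below are my own illustrative assumptions, not anything from the chapter.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Uncertain inputs (all values hypothetical): modulus of elasticity and tip load
E = rng.normal(200e9, 10e9, N)    # Pa
P = rng.normal(1.0e3, 50.0, N)    # N
L, I = 1.0, 1.0e-6                # fixed beam length (m) and second moment (m^4)

# Push every sample through the deterministic model: delta = P L^3 / (3 E I)
delta = P * L**3 / (3.0 * E * I)

print(f"mean deflection: {delta.mean():.4e} m")
print(f"std deviation:   {delta.std():.4e} m")
print(f"95% interval:    {np.percentile(delta, [2.5, 97.5])} m")
```

The obvious drawback, and the motivation for quasi-Monte Carlo, sensitivity, and polynomial chaos methods, is the slow convergence: each model evaluation might be an expensive simulation, and the sampling error only shrinks like one over the square root of N.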

The section on ‘Certification / Accreditation’ is interesting. Karniadakis recommends designing experiments for validation based on the specific use or application rather than on a particular code. This point deserves some emphasis. Decision makers often voice a desire for a repository of validated codes that they can access to support their various and sundry efforts. This is an unrealistic desire. A code cannot be validated as such; only a particular use of a code can be validated. In most decisions that engineering simulation supports, the use is novel (research and new product development), therefore the validated model will be developed concurrently with (in the case of product development) or as a result of (in the case of research) the broader effort in question.

The suggested hierarchical validation framework is similar to the ‘test-driven development’ methodologies in software engineering and the ‘knowledge-driven product development’ championed in the GAO’s reports on government acquisition efforts: small component (unit) tests, followed by subsystem integration tests, and then full complex-system tests. When the details of ‘model validation’ are understood, it is clear that rather than replacing testing, simulation truly serves to organize test designs and optimize test efforts.
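
To make the analogy concrete: a component-level validation check can literally be written as a unit test that compares the component model against bench-test data within a stated tolerance, then rolled up into integration- and system-level suites. Everything in this sketch (the model, the data, the tolerance) is hypothetical.

```python
def spring_force(k, x):
    """Component model under test: a linear spring (hypothetical)."""
    return k * x

def test_component_against_experiment():
    # Measured force-displacement pairs from a (made-up) bench test
    measured = [(0.01, 5.1), (0.02, 9.8), (0.03, 15.2)]
    k = 500.0                                           # calibrated stiffness, N/m
    for x, f_exp in measured:
        assert abs(spring_force(k, x) - f_exp) <= 0.5   # validation tolerance, N

if __name__ == "__main__":
    test_component_against_experiment()   # or run the suite under pytest
```

Running the whole hierarchy of such checks every time the model changes is exactly the discipline test-driven development imposes on software.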

The conclusions are explicit (emphasis mine):

The NSF SBES report (Oden et al. 2006 [4]) stresses the need for new developments in V&V and UQ in order to increase the reliability and utility of the simulation methods at a profound level in the future. A report on European computational science (ESF 2007 [5]) concludes that “without validation, computational data are not credible, and hence, are useless.” The aforementioned National Research Council report (2008) on integrated computational materials engineering (ICME) states that, “Sensitivity studies, understanding of real world uncertainties and experimental validation are key to gaining acceptance for and value from ICME tools that are less than 100 percent accurate.” A clear recommendation was reached by a recent study on Applied Mathematics by the U.S. Department of Energy (Brown 2008 [6]) to “significantly advance the theory and tools for quantifying the effects of uncertainty and numerical simulation error on predictions using complex models and when fitting complex models to observations.”

References

[1] WTEC Panel Report on International Assessment of Research and Development in Simulation-Based Engineering and Science, 2009, http://www.wtec.org/sbes/SBES-GlobalFinalReport.pdf

[2] Roache, P.J. 1998. Verification and validation in computational science and engineering. Albuquerque, NM: Hermosa Publishers.

[3] AIAA. 1998. Guide for the verification and validation of computational fluid dynamics simulations. AIAA-G-077-1998. Reston, VA: AIAA.

[4] Oden, J.T., T. Belytschko, T.J.R. Hughes, C. Johnson, D. Keyes, A. Laub, L. Petzold, D. Srolovitz, and S. Yip. 2006. Revolutionizing engineering science through simulation: A report of the National Science Foundation blue ribbon panel on simulation-based engineering science. Arlington: National Science Foundation. Available online.

[5] European Computational Science Forum of the European Science Foundation (ESF). 2007. The Forward Look Initiative. European computational science: The Lincei Initiative: From computers to scientific excellence. Information available online.

[6] Brown, D.L. (chair). 2008. Applied mathematics at the U.S. Department of Energy: Past, present and a view to the future. May, 2008.


Concepts of Model Verification and Validation has a glossary that defines most of the relevant terms.

19 comments:

  1. In an age of spreading pseudoscience and anti-rationalism, it behooves those of us who believe in the good of science and engineering to be above reproach whenever possible. Public confidence is further eroded with every error we make. Although many of society’s problems can be solved with a simple change of values, major issues such as radioactive waste disposal and environmental modeling require technological solutions that necessarily involve computational physics. As Robert Laughlin noted in this magazine, “there is a serious danger of this power [of simulations] being misused, either by accident or through deliberate deception.” Our intellectual and moral traditions will be served well by conscientious attention to verification of codes, verification of calculations, and validation, including the attention given to building new codes or modifying existing codes with specific features that enable these activities.
    --P.J. Roache, "Building PDE Codes to be Verifiable and Validatable", Computing in Science and Engineering, 2004

  2. Description of the Sandia Validation Metrics Project
    ABSTRACT: This report describes the underlying principles and goals of the Sandia ASCI Verification and Validation Program Validation Metrics Project. It also gives a technical description of two case studies, one in structural dynamics and the other in thermomechanics, that serve to focus the technical work of the project in Fiscal Year 2001.

  3. The reference list linked to the AIAA standard on VV&UQ; here's the ASME's similar standard ($45), and an overview is available as well (free).

  4. Interesting-sounding postdoc position at NREL:
    The successful candidate will perform research on the use of modern statistical techniques and participate in work with NREL scientists to analyze and control uncertainty and sensitivity in large geospatial and temporal data sets, laboratory data, mathematical models and simulations related to renewable energy and energy efficiency research and deployment.
    [...]
    A strong background in Uncertainty Quantification and Sensitivity Analysis. Experience in Validation and Verification and applying statistical analysis to scientific, engineering, energy resource, and/or atmospheric science data sets.

  5. NIST is putting on a conference in Boulder on Uncertainty Quantification, 1-4 Aug 2011:
    Purpose:
    Computing has become an indispensable component of modern science and engineering research. As has been repeatedly observed and documented, processing speed measured in floating point operations per second has experienced exponential growth for several decades. These hardware efficiencies have been accompanied by innovations in mathematical algorithms, numerical software, and programming tools. The result is that, by any measure, the modern computer is many orders of magnitude more powerful than its early predecessors, capable of simulating physical problems of unprecedented complexity.

    Given the success of scientific computation as a research tool, it is natural that scientists, engineers, and policy makers strive to harness this immense potential by using computational models for critical decision-making. Increasingly, computers are being used to supplement experiments, to prototype engineering systems, or to predict the safety and reliability of high-consequence systems. Such use inevitably leads one to question "How good are these simulations? Would you bet your life on them?" Unfortunately, most computational scientists today are ill equipped to address such important questions with the same scientific rigor that is routine in experimental science.

  6. "If error is corrected whenever it is recognized, the path of error is the path of truth." —Hans Reichenbach
    From the banner on the PECOS site.

    They are also hiring:
    ...areas including Computational Engineering and Sciences, Turbulence Modeling and Optimal Experimental Design.

  7. Interesting new journal: International Journal for Uncertainty Quantification

    It might turn out to be pretty good; the associate editor is Dongbin Xiu, who has some pretty good recent papers on using polynomial chaos expansions to do UQ.

  8. The googlebot found this funding summary document for some of the DOE's nuclear weapons stockpile stewardship stuff, this paragraph on the 'predictive capabilities framework' is interesting:

    The Predictive Capability Framework (PCF) is an integrated roadmap that reflects the responsive scientific capabilities needed to deliver a predictive capability to the nuclear security enterprise. Participants of the PCF include Defense Science, ASC, Engineering, DSW Research & Development, and Inertial Confinement Fusion Ignition and High Yield Campaign. The PCF identifies a list of long-term integrated goals and links the progress in the predictive capabilities to the progress in the five enabling capabilities, four of which (theory/model capabilities, code/algorithm capabilities, computational facilities, and Quantification of Margins and Uncertainties (QMU) and Verification & Validation (V&V) capabilities) are developed by the ASC program. With the pending completion of major new experimental facilities and entry into peta-scale high performance computing, the PCF represents a new phase of science-based stockpile stewardship – one better aligned to the challenges of an aging and changing stockpile.

    Even though they can't do full-up testing anymore, experimental work is still a huge part of their understanding of the useful predictive capabilities of their (huge) simulations.

  9. Regulatory Models and the Environment: Practice, Pitfalls, and Prospects
    Modeling is a difficult enterprise even outside of the potentially adversarial regulatory environment. The demands grow when the regulatory requirements for accountability, transparency, public accessibility, and technical rigor are added to the challenges. Moreover, models cannot be validated (declared true) but instead should be evaluated with regard to their suitability as tools to address a specific question. The committee concluded that these characteristics make evaluation of a regulatory model more complex than simply comparing measurement data with model results. Evaluation also must balance the need for a model to be accurate with the need for a model to be reproducible, transparent, and useful for the regulatory decision at hand. Meeting these needs requires model evaluation to be applied over the “life cycle” of a regulatory model with an approach that includes different forms of peer review, uncertainty analysis, and extrapolation methods than for non-regulatory models.


    Their choice of terminology is unfortunate; validated generally doesn't mean declared true, it means understanding the degree to which a model is a suitable representation of the real world.

  10. I mentioned in a previous comment that D. Xiu is the associate editor of a new Uncertainty Quantification journal; he has written a nice (freely accessible) review article on Fast Numerical Methods for Stochastic Computations, which is a good intro to the state of the art.

  11. From NA digest:
    Postdoc Position in V&V/UQ at Sandia National Laboratories

    The Computer Science Research Institute (CSRI) at Sandia National Laboratories in Albuquerque, NM, is seeking outstanding applicants for a postdoctoral position in any of the broad areas of Verification and Validation (V&V), Uncertainty Quantification (UQ), data assimilation, optimization, and inverse problems. The CSRI brings university faculty, students, and others together to pursue collaborative research in a highly multidisciplinary, team-based environment. The CSRI is well known for producing the DAKOTA and Trilinos multi-function software packages. For more information, see http://www.cs.sandia.gov/CSRI/, http://www.cs.sandia.gov/dakota/, and http://trilinos.sandia.gov/.

    To apply, go to http://www.sandia.gov, click on Employment, click on Career Opportunities, click on Search Job Postings, enter the job opening number (64709) in the Keywords field of the Basic Job Search, press Search, and scroll down to see the posting.

  12. An unfortunate common thread that I’ve noticed in many of the V&V reports I’ve read is that they seem to think Karl Popper had the last word on scientific induction!

    Well, he didn't (hat tip Briggs).
