Scientific discourse about Uncertainty Quantification, Verification and Validation has flourished over the past decade. This fervor has been driven to a large extent by the demand, from various constituencies, that computational models finally deliver on their promise of reproducing, if not predicting, physical reality. The value of such an achievement would be enormous, ranging from superior product design to disaster mitigation and the management of complex systems such as financial markets and SmartGrids. A number of challenges are easily spotted on the path to delivering this computational surrogate. First, reality itself is elusive and is not always described with commensurate topologies; this problem is somewhat mitigated by experimental techniques that can simultaneously measure physical phenomena at multiple scales. Second, the mathematical model, which may include algorithmic approximations for calibration and forward computations, introduces additional assumptions that often manifest themselves as modeling and parametric uncertainties. Third, even when these models are accurate and well resolved numerically, uncertainties are introduced at the manufacturing stage, when a physical device is constructed to realize the modeled design, allowing for various tolerances and imperfections.
Physical reality, the mathematical model, its implementation into software components, and the as-built device describe an equivalence class of objects that are expected to result in similar decisions. These can be viewed as multiple personalities of the same entity, and V&V can be viewed as the art and science of analyzing this divergence while enabling its successful resolution.
In this talk, I will describe our efforts as the psycho-analysis (and even psycho-therapy) of models, an effort that relies on the astute packaging of information in accordance with the axioms of probability theory and on a function-analytic approach to treating randomness.
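The abstract does not name a specific representation, but one common instance of such a function-analytic treatment of randomness is a spectral expansion of a random quantity in an orthogonal polynomial basis of underlying random variables; the following is offered only as an assumed illustration of that idea:
\[
u(\boldsymbol{x},\omega) \;\approx\; \sum_{k=0}^{P} u_k(\boldsymbol{x})\,\Psi_k\bigl(\boldsymbol{\xi}(\omega)\bigr),
\qquad
u_k(\boldsymbol{x}) \;=\; \frac{\mathbb{E}\!\left[u\,\Psi_k\right]}{\mathbb{E}\!\left[\Psi_k^{2}\right]},
\]
where $\boldsymbol{\xi}$ is a vector of basic random variables with prescribed densities and $\{\Psi_k\}$ are polynomials orthogonal with respect to those densities, so that randomness is carried by a fixed functional basis and the deterministic coefficients $u_k$ package the available information.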