Verification and validation (V&V) of process modeling and simulation is gaining importance in various application areas. These include complex mechatronic and biomechanical tasks with especially strict requirements on numerical accuracy and performance. However, engineers often lack precise knowledge of the process and its input data. This lack of knowledge, together with the inherent inexactness of measurement, makes such general V&V-cycle tasks as designing a formal model and defining the relevant parameters and their ranges difficult to complete.
To assess how reliable a system is, V&V analysts have to deal with uncertainty. There are two types of uncertainty: aleatory and epistemic. Aleatory (irreducible) uncertainty refers to variability similar to that arising in games of chance; it cannot be reduced by further empirical study. Epistemic (reducible) uncertainty refers to the incertitude resulting from lack of knowledge, for example, the absence of evidence about the probability distribution of a parameter. In this situation, standard methods for modeling measurement uncertainty with probability distributions cannot be applied. Here, interval methods provide a possible solution strategy.
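The interval strategy can be illustrated with a minimal sketch: a quantity about which nothing is known except its bounds is represented by an interval, and arithmetic on intervals yields guaranteed enclosures of the result. The class below is purely illustrative; production interval libraries additionally use outward (directed) rounding so that the enclosures remain valid in floating-point arithmetic, which is omitted here.

```python
# Minimal illustrative interval type: an epistemically uncertain quantity
# is represented only by its lower and upper bound. (No outward rounding;
# real implementations round endpoints outward to keep enclosures valid.)
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # The sum of two intervals is the interval of all possible sums.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product range is spanned by the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# A parameter known only to lie in [1.9, 2.1], a disturbance in [-0.5, 0.5]:
x = Interval(1.9, 2.1)
y = Interval(-0.5, 0.5)
print(x + y)  # guaranteed enclosure of the sum
print(x * y)  # guaranteed enclosure of the product
```

Every value the true result can take is contained in the computed interval; no distributional assumption about the parameters is needed.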
Another option, mostly discussed in the context of risk analysis, is to use interval-valued probabilities and imprecisely specified probability distributions. The probability of an event can be specified as an interval; probability bounds analysis propagates constraints on a distribution function through mathematical operations. In a more general setting, the theory of imprecise probabilities is a powerful conceptual framework in which uncertainty is represented by closed, convex sets of probability distributions. Bayesian sensitivity analysis and Dempster-Shafer theory are further options.
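A concrete instance of interval-valued probabilities are the classical Fréchet bounds: when nothing is known about the dependence between two events, the probability of their conjunction or disjunction can still be bounded. The sketch below propagates interval probabilities through these bounds; the function names are ours, not from any particular library.

```python
# Fréchet bounds for combining two events with unknown dependence.
# Inputs are interval probabilities given as (lo, hi) pairs.

def and_bounds(pa, pb):
    """Bounds on P(A and B) when the dependence of A and B is unknown."""
    return (max(0.0, pa[0] + pb[0] - 1.0), min(pa[1], pb[1]))

def or_bounds(pa, pb):
    """Bounds on P(A or B) when the dependence of A and B is unknown."""
    return (max(pa[0], pb[0]), min(1.0, pa[1] + pb[1]))

# P(A) in [0.2, 0.4], P(B) in [0.5, 0.7]:
print(and_bounds((0.2, 0.4), (0.5, 0.7)))  # (0.0, 0.4)
print(or_bounds((0.2, 0.4), (0.5, 0.7)))   # (0.5, 1.0)
```

These bounds are guaranteed for every possible joint distribution consistent with the marginals, which is exactly the conservative spirit of probability bounds analysis.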
A standard option in uncertainty management is Monte Carlo simulation, a universal, data-intensive method that requires random number generators, distributions, dependencies, and a mathematical model (but no closed analytic solution) to provide accurate results. Compared to interval methods, it yields less conservative bounds, which, however, might fail to contain the exact solution. As an implementation of convolution in probability theory, Monte Carlo methods are complementary to interval approaches.
Additionally, they play an important role in probability bounds analysis, Dempster-Shafer theory, and further approaches combining probabilistic and interval uncertainties.
The goal of this seminar is to promote and accelerate the integration of reliable numerical algorithms and statistics of imprecise data into the standard procedures for assessing and propagating uncertainty. The main contributions of this seminar were
- Expressing, evaluating, and propagating measurement uncertainties; designing efficient algorithms to compute various parameters, such as the mean, the median and other percentiles, the variance, the interquantile range, moments, and confidence limits; and summarizing the computability of such statistics from imprecise data.
- New uncertainty-supporting dependability methods for early design stages. These include the propagation of uncertainty through dependability models, the acquisition of data from similar components for analyses, and the integration of uncertain reliability and safety predictions into an optimization framework.
- Modeling and processing applications from the areas of robust geometrical design, financial simulation and optimization, robotics, mechatronics, reliability and structural safety, bioinformatics and climate science with uncertain input parameters and imprecise data.
- Discussing software for probabilistic risk and safety assessment that works with real numbers, intervals, fuzzy numbers, probability distributions, and interval bounds on probability distributions; such software combines probability theory with interval analysis and makes the newest techniques, such as the interval Monte Carlo method, probability bounds analysis, and fuzzy arithmetic, available.
- Promoting a new standard for interval arithmetic as explained in the IEEE P1788 draft: ``This standard specifies basic interval arithmetic operations selecting and following one of the commonly used mathematical interval models and at least one floating-point type defined by the IEEE-754/2008 standard. Exception conditions are defined and standard handling of these conditions are specified. Consistency with the model is tempered with practical considerations based on input from representatives of vendors and owners of existing systems''.
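The first contribution above, computing statistics from imprecise data, can be sketched for its simplest case: when each measurement is known only to lie in an interval, the sample mean itself becomes an interval. Because the mean is monotone in every data point, its bounds are attained when all values sit at their lower (respectively upper) endpoints. The function below is a minimal illustration of ours; tight bounds on other statistics, such as the variance, are considerably harder to compute in general.

```python
# Interval-valued sample mean: each measurement is a (lo, hi) pair, and
# the mean ranges over [mean of lower endpoints, mean of upper endpoints]
# by monotonicity of the arithmetic mean in each argument.

def interval_mean(data):
    n = len(data)
    return (sum(lo for lo, _ in data) / n,
            sum(hi for _, hi in data) / n)

# Three imprecise measurements of the same quantity:
measurements = [(1.0, 1.2), (0.9, 1.1), (1.1, 1.4)]
print(interval_mean(measurements))  # approximately (1.0, 1.233)
```

This is the kind of building block from which the seminar's algorithms for percentiles, moments, and confidence limits over imprecise data are developed.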
The seminar was attended by 33 participants from 8 countries who gave 34 talks. To stimulate debate and cross-fertilization of new ideas we scheduled a mixture of tutorials, contributed talks, a meeting of the IEEE P1788 working group, and software demonstrations. The seminar started with a series of talks aimed at providing a suitable level of introduction to the main areas of discussion and providing a leveling ground for all participants.
The format of the seminar was then a series of contributed presentations on the variety of the seminar topics mentioned above. A lively discussion on the current state of interval standardization was initiated by the talk on the hot topic of decorated intervals on Tuesday afternoon and continued during the meeting of the IEEE P1788 working group on Thursday afternoon. A session on software tools, held on Wednesday, was followed by software demonstrations on Thursday evening. There was much time for extensive discussions in between the talks, in the evenings, and during the excursion on Wednesday afternoon. The seminar generally had a very open and constructive atmosphere. As a result of the seminar, a special issue will be published in a leading journal that will not only contain papers presented at the seminar, but also provide a roadmap for future directions of uncertainty modeling.
- Götz Alefeld (KIT - Karlsruher Institut für Technologie, DE)
- Ekaterina Auer (Universität Duisburg-Essen, DE)
- Michael Beer (University of Liverpool, GB)
- Neli Dimitrova (Bulgarian Academy of Sciences, BG)
- Martin Fuchs (Cerfacs - Toulouse, FR)
- Jürgen Garloff (HTWG Konstanz, DE) [dblp]
- Milan Hladík (Charles University - Prague, CZ)
- Cliff Joslyn (Pacific Northwest National Lab. - Seattle, US)
- Ralph Baker Kearfott (Univ. of Louisiana - Lafayette, US)
- Stefan Kiel (Universität Duisburg-Essen, DE)
- Olga M. Kosheleva (University of Texas - El Paso, US)
- Walter Krämer (Bergische Universität Wuppertal, DE)
- Vladik Kreinovich (University of Texas - El Paso, US) [dblp]
- Jean-Luc Lamotte (UPMC - Paris, FR)
- Vincent Lefèvre (ENS - Lyon, FR) [dblp]
- Philipp Limbourg (Universität Duisburg-Essen, DE)
- Weldon A. Lodwick (University of Colorado - Denver, US)
- Wolfram Luther (Universität Duisburg-Essen, DE) [dblp]
- Olivier Mullier (Supélec - Gif-sur-Yvette, FR)
- Arnold Neumaier (Universität Wien, AT) [dblp]
- Evgenija D. Popova (Bulgarian Academy of Sciences, BG)
- John D. Pryce (Cardiff University, GB) [dblp]
- Andreas Rauh (Universität Rostock, DE)
- Gabor Rebner (Universität Duisburg-Essen, DE)
- Nathalie Revol (ENS - Lyon, FR) [dblp]
- Heinrich Rommelfanger (Goethe-Universität Frankfurt am Main, DE)
- Michel Rueher (University of Nice, FR) [dblp]
- Siegfried M. Rump (TU Hamburg-Harburg, DE) [dblp]
- Peihui Shao (University of Montréal, CA)
- Neil Stewart (University of Montréal, CA)
- Yan Wang (Georgia Institute of Technology, US)
- Jürgen Wolff von Gudenberg (Universität Würzburg, DE)
- Michael Zimmer (Bergische Universität Wuppertal, DE)
- Verification and validation
- Modeling and simulation
- Uncertainty modeling
- Propagation of uncertainty through, and validation of, computational models using interval arithmetic
- Imprecise probabilities
- Sensitivity analysis
- Applications to structures, bioinformatics, and finance