September 11–16, 2011, Dagstuhl Seminar 11371

Uncertainty modeling and analysis with intervals: Foundations, tools, applications


Isaac E. Elishakoff (Florida Atlantic University – Boca Raton, US)
Vladik Kreinovich (University of Texas – El Paso, US)
Wolfram Luther (Universität Duisburg-Essen, DE)
Evgenija D. Popova (Bulgarian Academy of Sciences, BG)



Verification and validation (V&V) assessment of process modelling and simulation is becoming increasingly important in many application areas, including complex mechatronic and biomechanical tasks with especially strict requirements on numerical accuracy and performance. However, engineers often lack precise knowledge of the process and its input data. This lack of knowledge, together with the inherent inexactness of measurement, makes such general V&V-cycle tasks as designing a formal model and defining the relevant parameters and their ranges difficult to complete.

To assess how reliable a system is, V&V analysts have to deal with uncertainty. There are two types of uncertainty: aleatory and epistemic. Aleatory uncertainty refers to variability similar to that arising in games of chance. It cannot be reduced by further empirical study. Epistemic (reducible) uncertainty refers to the incertitude resulting from lack of knowledge. An example is the absence of evidence about the probability distribution of a parameter. In this situation, standard methods for modeling measurement uncertainty by using probability distributions cannot be applied. Here, interval methods provide a possible solution strategy.
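The interval strategy mentioned above can be illustrated with a minimal sketch. The `Interval` class below is a hypothetical illustration, not any particular library's API: an epistemic parameter known only to lie in `[lo, hi]` is propagated through arithmetic so that the result is guaranteed to enclose every possible outcome.

```python
# Minimal interval-arithmetic sketch (illustrative; names are invented,
# not from any seminar tool or library).

class Interval:
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # Sums of interval values range over the sum of the endpoints.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product's extremes lie among the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Example: a coefficient known only to within +/-10% and a load in [2, 3].
k = Interval(0.9, 1.1)
f = Interval(2.0, 3.0)
print(k * f + k)   # encloses every value k*f + k can take
```

The guarantee comes at a price: when a variable occurs more than once in an expression, the enclosure is generally wider than the exact range (the well-known dependency problem of interval arithmetic).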

Another option, mostly discussed in the context of risk analysis, is to use interval-valued probabilities and imprecisely specified probability distributions. The probability of an event can be specified as an interval; probability bounds analysis propagates constraints on a distribution function through mathematical operations. In a more general setting, the theory of imprecise probabilities is a powerful conceptual framework in which uncertainty is represented by closed, convex sets of probability distributions. Further options include Bayesian sensitivity analysis and Dempster-Shafer theory.
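A small, self-contained example of interval-valued probabilities: when only P(A) and P(B) are known as intervals and nothing is known about their dependence, the classical Fréchet inequalities bound P(A and B) by max(0, P(A)+P(B)-1) from below and min(P(A), P(B)) from above. The function name below is invented for illustration.

```python
# Sketch: combining interval-valued probabilities via the Frechet bounds,
# which assume nothing about the dependence between the two events.

def and_bounds(a, b):
    """a, b: (lo, hi) interval probabilities; returns bounds on P(A and B)."""
    a_lo, a_hi = a
    b_lo, b_hi = b
    lo = max(0.0, a_lo + b_lo - 1.0)   # extreme negative dependence
    hi = min(a_hi, b_hi)               # extreme positive dependence
    return (lo, hi)

# Two events whose probabilities are each only known to lie in [0.6, 0.8];
# mathematically the joint probability is bounded by [0.2, 0.8].
print(and_bounds((0.6, 0.8), (0.6, 0.8)))
```

If independence (or some other dependence model) can be justified, the bounds tighten accordingly; the Fréchet bounds are the worst case over all dependencies.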

A standard option in uncertainty management is Monte Carlo simulation. This is a universal data-intensive method that needs random number generators, distributions, dependencies, and a mathematical model (but not a closed analytic solution) to provide accurate results. Compared to interval methods, it yields less conservative bounds, which, however, might fail to contain the exact solution. As an implementation of convolution in probability theory, Monte Carlo methods are complementary to interval approaches.

Additionally, they play an important role in probability bounds analysis, Dempster-Shafer theory, and further approaches combining probabilistic and interval uncertainties.
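The trade-off between the two approaches can be seen on a toy function. The sketch below (an illustration constructed for this text, with f(x) = x(1-x) chosen arbitrarily) contrasts a naive interval enclosure, which is guaranteed but conservative, with Monte Carlo sampling, which is tight but carries no enclosure guarantee.

```python
# Compare a naive interval enclosure with Monte Carlo sampling for
# f(x) = x * (1 - x), where x is only known to lie in [0.2, 0.8].
# The true range of f on this interval is [0.16, 0.25].
import random

def f(x):
    return x * (1 - x)

a, b = 0.2, 0.8

# Naive interval evaluation treats the two occurrences of x independently:
# [0.2, 0.8] * [0.2, 0.8] = [0.04, 0.64] -- guaranteed, but wide.
u, v = 1 - b, 1 - a
prods = (a * u, a * v, b * u, b * v)
iv = (min(prods), max(prods))

# Monte Carlo sampling tracks the true range [0.16, 0.25] closely,
# but never guarantees that the exact extremes are enclosed.
random.seed(0)
samples = [f(random.uniform(a, b)) for _ in range(10_000)]
mc = (min(samples), max(samples))

print("interval enclosure:", iv)
print("Monte Carlo range: ", mc)
```

This is the complementarity noted above: the interval result always contains the exact range, while the sampled range is sharper but may undershoot it.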

The goal of this seminar was to promote and accelerate the integration of reliable numerical algorithms and statistics of imprecise data into the standard procedures for assessing and propagating uncertainty. The main contributions of the seminar were:

  • Expressing, evaluating and propagating measurement uncertainties; designing efficient algorithms to compute various statistical parameters, such as the mean, the median and other percentiles, the variance, the interquantile range, moments and confidence limits; summarizing the computability of such statistics from imprecise data.
  • New uncertainty-supporting dependability methods for early design stages. These include the propagation of uncertainty through dependability models, the acquisition of data from similar components for analyses, and the integration of uncertain reliability and safety predictions into an optimization framework.
  • Modeling and processing applications from the areas of robust geometrical design, financial simulation and optimization, robotics, mechatronics, reliability and structural safety, bioinformatics and climate science with uncertain input parameters and imprecise data.
  • Discussing software for probabilistic risk and safety assessments working with real numbers, intervals, fuzzy numbers, probability distributions, and interval bounds on probability distributions that combines probability theory and interval analysis and makes the newest techniques such as interval Monte Carlo method, probability bounds analysis and fuzzy arithmetic available.
  • Promoting the new standard for interval arithmetic described in the IEEE P1788 draft: ``This standard specifies basic interval arithmetic operations selecting and following one of the commonly used mathematical interval models and at least one floating-point type defined by the IEEE-754/2008 standard. Exception conditions are defined and standard handling of these conditions are specified. Consistency with the model is tempered with practical considerations based on input from representatives of vendors and owners of existing systems''.
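The first bullet point above concerns computing statistics from interval-valued data. As a hypothetical sketch of what that means (the function names are invented, and the example is not from the seminar materials): the bounds on the mean follow directly from the endpoint sums, while the sample variance, being a convex function of the data vector, attains its maximum over the box of possible data vectors at a vertex, so enumerating the 2^n endpoint combinations is exact for small n. The general problem is computationally hard, which is why efficient algorithms for such statistics are a research topic; the lower variance bound needs a different algorithm and is omitted here.

```python
# Sketch: bounds on the mean, and the upper bound on the (population)
# variance, for interval-valued data [lo_i, hi_i].  Vertex enumeration
# below is exponential in n and meant only to illustrate the problem.
from itertools import product

def mean_bounds(intervals):
    n = len(intervals)
    return (sum(lo for lo, _ in intervals) / n,
            sum(hi for _, hi in intervals) / n)

def variance_upper(intervals):
    best = 0.0
    for xs in product(*intervals):          # all 2^n endpoint combinations
        m = sum(xs) / len(xs)
        v = sum((x - m) ** 2 for x in xs) / len(xs)
        best = max(best, v)                 # convexity: max is at a vertex
    return best

# Three measurements, each known only to within +/-10%:
data = [(0.9, 1.1), (1.8, 2.2), (2.7, 3.3)]
print(mean_bounds(data))
print(variance_upper(data))
```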

The seminar was attended by 33 participants from 8 countries who gave 34 talks. To stimulate debate and cross-fertilization of new ideas we scheduled a mixture of tutorials, contributed talks, a meeting of the IEEE P1788 working group, and software demonstrations. The seminar started with a series of talks that introduced the main areas of discussion and provided common ground for all participants.

The seminar then continued with contributed presentations on the topics mentioned above. A lively discussion on the current state of interval standardization was initiated by the talk on the hot topic of decorated intervals on Tuesday afternoon and continued during the meeting of the IEEE P1788 working group on Thursday afternoon. A session on software tools, held on Wednesday, was followed by software demonstrations on Thursday evening. There was ample time for extensive discussions between the talks, in the evenings, and during the excursion on Wednesday afternoon. The seminar generally had a very open and constructive atmosphere. As a result of the seminar, a special issue of a leading journal will be published that not only contains papers presented at the seminar but also provides a roadmap for future directions in uncertainty modeling.


  • Verification And Validation
  • Modeling And Simulation


  • Uncertainty modeling
  • Propagation of uncertainty through, and validation of, computational models using interval arithmetic
  • Imprecise probabilities
  • Sensitivity analysis
  • Applications to structures
  • Mechatronics
  • Bioinformatics and finance


Each Dagstuhl Seminar and Dagstuhl Perspectives Workshop is documented in the series Dagstuhl Reports. The seminar organizers, in cooperation with the collector, prepare a report that includes contributions from the participants' talks together with a summary of the seminar.



Dagstuhl's Impact

Please inform us if a publication has appeared as a result of your seminar. Such publications are listed in the category Dagstuhl's Impact and are presented on a special shelf on the ground floor of the library.


Furthermore, a comprehensive peer-reviewed collection of research papers can be published in the series Dagstuhl Follow-Ups.