http://www.dagstuhl.de/06351

August 27 – September 1, 2006, Dagstuhl Seminar 06351

Methods for Modelling Software Systems (MMOSS)

Organizers

Ed Brinksma (University of Twente, NL)
David Harel (Weizmann Institute – Rehovot, IL)
Perdita Stevens (University of Edinburgh, GB)
Roel J. Wieringa (University of Twente, NL)


Information about this Dagstuhl Seminar is provided by the

Dagstuhl Service Team

Documents

Dagstuhl Seminar Proceedings DROPS
List of Participants

Summary

A proper engineering approach to the construction of complex software systems requires models for their design and verification. Such models must represent both the software system and its (intended) environment. Modelling the environment is needed to define both the events in the environment to which the system must respond, and the effects that the system must have on its environment. A typical environment may consist of other software, hardware, and physical and social systems. The environment of a manufacturing control system, for example, consists of manufacturing information systems, the physical plant, and the work procedures to be followed by human operators, all of which affect and complicate the task of modelling at some point in the design and verification processes.
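
As an illustration of this point, consider the following minimal Python sketch. It is not taken from the seminar material: the conveyor controller, the plant model and all labels are invented for illustration. It composes a hypothetical controller with an equally hypothetical model of the physical plant it controls, showing that the system model can only be analysed relative to explicit assumptions about its environment.

    # Hypothetical example: system and environment as small labelled transition
    # systems, each given as (initial state, {(state, label): next state}).
    CONTROLLER = ("idle", {
        ("idle", "item_detected"): "running",
        ("running", "item_removed"): "idle",
        ("running", "jam"): "halted",
        ("halted", "reset"): "idle",
    })

    PLANT = ("empty", {  # environment model: the physical conveyor
        ("empty", "item_detected"): "loaded",
        ("loaded", "item_removed"): "empty",
        ("loaded", "jam"): "jammed",
        ("jammed", "reset"): "empty",
    })

    def compose(m1, m2):
        """Synchronous product: a label fires only if both models allow it."""
        (i1, t1), (i2, t2) = m1, m2
        labels = {label for (_, label) in t1} | {label for (_, label) in t2}
        reachable, frontier = {(i1, i2)}, [(i1, i2)]
        while frontier:
            s1, s2 = frontier.pop()
            for label in labels:
                if (s1, label) in t1 and (s2, label) in t2:
                    nxt = (t1[(s1, label)], t2[(s2, label)])
                    if nxt not in reachable:
                        reachable.add(nxt)
                        frontier.append(nxt)
        return reachable

    states = compose(CONTROLLER, PLANT)
    # A check that only makes sense because the environment is modelled explicitly:
    # the controller is never 'running' while the plant model says the belt is empty.
    assert not any(c == "running" and p == "empty" for (c, p) in states)
    print(len(states), "reachable combined states")

The plant model plays the role of the environment described above: without it, properties such as the final assertion could not even be stated, let alone checked.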

It is clear that the quality of a design or verification process is directly affected by the quality of the models that are being used. They should meet certain quality criteria, such as correctness, understandability and maintainability. An important question that we want to address is what the relevant quality criteria for design and verification modelling are. Are there guidelines on how to achieve them? And how can we validate them?

Models invariably introduce abstractions, and the modeller has, in principle, an obligation to show that he or she has introduced the "right" abstractions. This justification is essentially informal for a number of reasons:

  • Complexity: many systems are too complex to be represented without substantial recourse to abstraction. In practical cases, formal proofs of the adequacy of such abstractions are infeasible, because any such proof would rely on the availability of a model of the "full" system, which does not exist.
  • Physical reality: this is a non-formal domain by definition, and the quality of a model with respect to its physical environment must be validated by experimental or informal means.
  • Social aspects: the user environment of a software system is also informal in nature, but must be taken into account to ensure that the system will respond properly.

It is clear that most abstractions introduced by design and verification models of real systems cannot be justified formally, and must instead rely, at least in part, on informal argumentation. The quality of any analysis based on such models therefore depends crucially on the quality of these informal arguments.

To build models of a certain quality, we need guidelines on how to construct them well. Too often, however, such guidelines are nonexistent or simply ignored in practice; in verification modelling, for example, "model hacking" often precedes "model checking". And the less we understand about the way a model is built, the harder it is to validate its quality. In design modelling, designers like to proceed as if they were starting from scratch, ignoring models that have already been developed by others. How can we improve this practice, so that we know how to build models that can be validated?

Although the problems of modelling in design and verification have many similarities, there are differences too. On the one hand, design models represent (an aspect of) the intended structure of the software-to-be. They are prescriptions to be used by implementors and must include the details an implementor needs. Usability of the model by the implementor is therefore an important quality criterion for these models; completeness is another. For verification models, on the other hand, abstraction is a crucial technique: it preserves just as much information about the system as is needed to prove correctness, and yields models that can be verified efficiently. Each property to be verified may require a different model. Still, all models, whether for design or verification, must bear some structural correspondence (isomorphy) to the modelled system, so that from the fact that the model has a property we can conclude that the system has this property too. This requires a good understanding of the relationship between the model and the modelled system.
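
The following minimal Python sketch, again invented for illustration (a hypothetical bounded buffer, not a method discussed at the seminar), makes the verification-model side of this contrast concrete. A buffer with 10,001 concrete states is abstracted to two abstract states; because every concrete transition has an abstract counterpart, a safety property checked on the tiny abstract model also holds for the concrete system.

    # Hypothetical bounded buffer, used only to illustrate abstraction for
    # verification: the concrete model tracks the exact item count.
    CAPACITY = 10_000

    def concrete_transitions(count):
        """Concrete model: transitions enabled when the buffer holds `count` items."""
        if count < CAPACITY:
            yield "put", count + 1
        if count > 0:
            yield "get", count - 1

    def alpha(count):
        """Abstraction function: forget the count, keep only emptiness."""
        return "empty" if count == 0 else "non-empty"

    # Existential abstraction: an abstract transition exists whenever some concrete
    # state mapped to its source has that transition. (The concrete states are
    # enumerated here only to make the over-approximation explicit; in practice
    # the abstract model would be constructed directly.)
    abstract = {}
    for count in range(CAPACITY + 1):
        for label, nxt in concrete_transitions(count):
            abstract.setdefault((alpha(count), label), set()).add(alpha(nxt))

    # Safety property, checked on 2 abstract states instead of 10,001 concrete ones:
    # no "get" is ever enabled in the abstract "empty" state.
    assert ("empty", "get") not in abstract
    print(abstract)

The abstraction keeps only what the property needs. A design model of the same buffer would instead have to fix the capacity, the storage layout and the interface offered to implementors, which is exactly the difference in purpose discussed above.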

The questions that we invited participants in the seminar to address were:

Model quality:

  • What are quality criteria for models? How can they be quantified and checked?
  • What is the relationship between models and systems in design and in verification?
  • What makes an abstraction reasonable?

Modelling method:

  • What are the sources and principles for the construction of good models? What is the relation between design and verification models?
  • How can the structure of a model be coupled to the structure of the system? What criteria should be used for the structuring of models?
  • How can we bridge between informal knowledge and formal representation?
  • How can we use domain knowledge, and especially engineering documentation, to build correct models?

Effectiveness:

  • Can we build libraries of problem frames in the domain of embedded software, or in subdomains?

Maintainability:

  • Can we build models in such a way that changes in the system (versions) can be easily mapped to versions of models?
  • How do changes in the verification property translate into changes in the verification model?

Classification

  • Modelling / Simulation
  • Software Engineering
  • Semantics / Formal Methods
  • Verification / Logic

Keywords

  • Modelling Methods
  • Design Models
  • Verification Models
  • Problem-solution co-refinement

Book Exhibition

Books by the participants

Book exhibition on the ground floor of the library

(only during the week of the seminar).

Dokumentation

In der Reihe Dagstuhl Reports werden alle Dagstuhl-Seminare und Dagstuhl-Perspektiven-Workshops dokumentiert. Die Organisatoren stellen zusammen mit dem Collector des Seminars einen Bericht zusammen, der die Beiträge der Autoren zusammenfasst und um eine Zusammenfassung ergänzt.

 

Download Übersichtsflyer (PDF).

Publications

In addition, it remains possible to publish a comprehensive collection of peer-reviewed papers in the Dagstuhl Follow-Ups series.

Dagstuhl's Impact

Please inform us when a publication arises from your seminar. Such publications are listed separately in the category Dagstuhl's Impact and presented on the ground floor of the library.