Dagstuhl Perspectives Workshop 10482
Formal Methods – Just a Euro-Science?
(Nov 30 – Dec 03, 2010)
- Andrzej Tarlecki (University of Warsaw, PL)
- Moshe Y. Vardi (Rice University - Houston, US)
- Reinhard Wilhelm (Universität des Saarlandes, DE)
- Annette Beyer (for administrative matters)
Formal methods are employed during the system-development process to improve the quality of the system, to increase the efficiency of the development process, or to derive guarantees about qualities of the system. The term “formal methods” has traditionally been used for a number of different approaches, including modelling and specification languages, as well as methods and tools to derive properties of systems. Because of the vagueness of the term “formal methods”, it may be desirable to replace it by “modelling, analysis, and verification.”
A good overview of industrial projects concentrating on the early phases of specification and design is given in a recent survey article: Jim Woodcock, Peter Gorm Larsen, Juan Bicarregui, John S. Fitzgerald: Formal methods: Practice and experience. ACM Computing Surveys 41(4), (2009).
The Dagstuhl Perspectives Workshop, held in December 2010, concentrated mostly on methods for system analysis and verification. These are employed in the design phase as well as in later phases of system development. Model checking, equivalence checking, and abstract interpretation, all developed in academia, are the most impressive success stories.
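As a toy illustration of how such methods derive properties of programs, the sketch below (the example loop and all names are illustrative, not taken from the workshop) performs a rudimentary interval abstract interpretation of a small counting loop, computing a sound bound on a variable without enumerating any concrete execution.

```python
# Toy interval abstract interpretation (illustrative sketch only):
# abstract values are intervals (lo, hi). We analyse the loop
#   x = 0
#   while x < 10: x = x + 2
# and derive a sound over-approximation of x at loop exit.

def join(a, b):
    """Least upper bound of two intervals."""
    return (min(a[0], b[0]), max(a[1], b[1]))

def widen(old, new):
    """Widening: push unstable bounds to infinity so the fixed-point
    iteration terminates on any loop (at the cost of precision)."""
    lo = old[0] if new[0] >= old[0] else float("-inf")
    hi = old[1] if new[1] <= old[1] else float("inf")
    return (lo, hi)

def analyse_loop():
    x = (0, 0)                                # abstract state after x = 0
    while True:
        guard = (x[0], min(x[1], 9))          # refine by the guard x < 10
        body = (guard[0] + 2, guard[1] + 2)   # abstract effect of x = x + 2
        new = join(x, body)
        if new == x:
            break                             # fixed point reached
        x = widen(x, new)
    # On exit the guard is false, so x >= 10; the upper bound was lost
    # to widening, but the result is still a sound over-approximation.
    return (max(x[0], 10), x[1])

print(analyse_loop())  # (10, inf): the analysis proves x >= 10 at exit
```

The widening step is what makes such analyses terminate on arbitrary loops; fully fledged tools recover much of the precision it sacrifices (for example, by a subsequent narrowing pass).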
After a very long gestation period, formal methods for the derivation of program properties have finally gained some measure of industrial acceptance. There are, however, remarkable differences in the degree of this acceptance. There is a clear correlation between the criticality of systems and the costs of failure, on one hand, and the degree to which formal methods are employed in their development, on the other hand. Hardware manufacturers and producers of safety-critical embedded systems in the transportation industry are examples of areas where applications of analysis and verification methods are perhaps most visible. A semiconductor design gone wrong is just too costly for any cost argument against the use of formal design and verification tools to be acceptable. Threats of liability costs are strong arguments for the use of formal methods in the development of safety-critical embedded systems. Different application areas often entail different approaches to the use of formal methods. Safety-critical systems call for the use of sound methods to dogmatically ensure correctness. General-purpose software with strong time-to-market pressures encourages a more pragmatic attitude, with emphasis on bug-chasing methods and tools.
Industrial domains with certification requirements have introduced tools based on formal methods into their development processes. However, most current certification regulations are still process-based; they regulate the development process and do not state the required properties of the result. Critics describe this as “Clean pipes, dirty water.” The trend to use formal methods will become stronger when certification standards move from process-based assurance to product-based assurance. These new standards will specify the guarantees to be given about system properties. Several current standards for transportation systems highly recommend abstract interpretation and model checking for systems at the highest criticality level. “Highly recommend” actually means “required”. The loophole is the “state-of-practice” argument: the developer can be exempted from using a highly recommended method by arguing that it is not yet the “state of practice”.
Several participants of the workshop emphasised the important role of champions of a formal method. A champion, enthusiastic about the potential of and competent in the use of a verification method, is often needed to introduce the method and the associated tool into the development process. Often, once the champion leaves, the degree of adoption declines dramatically.
The expectations towards analysis and verification methods have always been very high, often due to unrealistic promises. These unrealistic promises have mostly been the result of ignoring the differentiation of roles. Three distinct roles are connected to a formal method: the researcher develops the theoretical foundations of the method; the tool developer implements the method; and the users apply the tool in an industrial setting. The different analysis and verification methods impose very different requirements on their users, which has implications for their acceptance in industry. Researchers and tool developers often develop their methods and tools for their own use. Subsequently, they use these tools with a high degree of expertise. The experience of such expert users is quite different from that of industrial users, who do not have such a degree of expertise. Thus, reports by expert users are often quite rosy and create unrealistic expectations. The expectations towards analysis and verification methods are astonishing in the light of the known undecidability or intractability of the problems they are expected to solve; the methods and tools are expected to be at the same time fully automatic, effective and efficient, and easy to use. Disappointment is unavoidable. Nevertheless, the border between what can currently be done and what is still out of reach is permanently moving, with significant progress accomplished over the last 30 years.
One challenge for further advances is a higher degree of automation: the different methods require different degrees of user interaction and of user qualification. Currently, with few exceptions, such as Microsoft Research’s Boogie platform, there is little integration among different tools. Nevertheless, advances can be expected in the coming years from tool integration, starting with information exchange between tools and common exchange formats. Specifically, there is a high potential for improvement from a synergetic integration of model-based design tools with analysis and verification tools.
Scalability of the methods and tools is still considered a problem. The exploitation of large-scale parallelism may increase the size of verifiable systems. A clear identification of application areas for the various methods, rather than the search for universal methods (which is doomed to fail), will help avoid user disappointment.
The embedded-system industry has already realised that badly structured systems written in obscure programming style cannot be effectively maintained. Similarly, it cannot be expected that verification methods would cope with such systems. Systems should be designed for verifiability.
While formal methods have often been dismissed by many as “Euro-science”, a rather abstract line of research with little chance of industrial adoption, decades of research, in both basic theory and tool development, have started to bear fruit, attracting an increasing level of industrial interest. This interest is often accompanied by unrealistic expectations but, at the same time, provides an opportunity and a challenge to researchers working in this area, as more basic research and good tool engineering are needed to solve the challenges outlined above.
- Krzysztof Apt (CWI - Amsterdam, NL)
- Thomas Ball (Microsoft Corporation - Redmond, US) [dblp]
- Dines Bjørner (Holte, DK)
- Patrick Cousot (ENS - Paris, FR) [dblp]
- Cindy Eisner (IBM - Haifa, IL) [dblp]
- Javier Esparza (TU München, DE) [dblp]
- Steffen Görzig (Daimler AG - Böblingen, DE) [dblp]
- Yuri Gurevich (Microsoft Corporation - Redmond, US) [dblp]
- Marc Herbstritt (Schloss Dagstuhl, DE) [dblp]
- Manuel Hermenegildo (IMDEA Software - Madrid, ES) [dblp]
- Bengt Jonsson (Uppsala University, SE) [dblp]
- Joseph Roland Kiniry (IT University of Copenhagen, DK) [dblp]
- Jörg Kreiker (SMA Solar Technology, DE)
- Wei Li (Beihang University, CN)
- Wolfgang J. Paul (Universität des Saarlandes, DE) [dblp]
- Erik Poll (Radboud University Nijmegen, NL) [dblp]
- Sriram K. Rajamani (Microsoft Research India - Bangalore, IN) [dblp]
- Jean-François Raskin (Free University of Brussels, BE) [dblp]
- John Rushby (SRI - Menlo Park, US) [dblp]
- Donald Sannella (University of Edinburgh, GB)
- Wei Sun (Beihang University, CN)
- Andrzej Tarlecki (University of Warsaw, PL)
- Wolfgang Thomas (RWTH Aachen, DE) [dblp]
- Moshe Y. Vardi (Rice University - Houston, US) [dblp]
- Reinhard Wilhelm (Universität des Saarlandes, DE) [dblp]
- James C. P. Woodcock (University of York, GB) [dblp]
- Lenore D. Zuck (NSF - Arlington, US) [dblp]