04.09.16 - 09.09.16, Seminar 16362

Robustness in Cyber-Physical Systems

This seminar description was published on our web pages before the seminar and was used in the invitation to the seminar.


Electronically controlled systems have become pervasive in modern society and are increasingly used to control safety-critical applications, such as medical devices and transportation systems. At the same time, these systems are growing in complexity at an alarming rate, making it difficult to produce designs that provide guaranteed properties in the face of various forms of uncertainty. Cyber-physical systems (CPS) research is a multi-disciplinary field aimed at providing a rigorous framework for designing and analyzing such systems.

Engineering robustness into systems under development has always been at the heart of good engineering practice: robustness against manufacturing tolerances and variations in the purity of construction materials in mechanical engineering, against variations in reactant concentrations in chemical engineering, against parameter variations in the plant model in control engineering, against quantization and measurement noise in signal processing, against faults in computer architecture, against attacks in security engineering, and against unexpected inputs or results in programming. In the CPS context, all of these engineering disciplines meet, as the digital networking and embedded control involved in CPS bring many kinds of physical processes into the sphere of human and computer control.

This convergence of disciplines has proven extremely fruitful in the past, inspiring profound research on hybrid and distributed control, transferring notions and methods for safety verification from computer science to control theory, transferring proof methods for stability from control theory to computer science, and shedding light on the complex interplay of control objectives and security threats, to name just a few of the interdisciplinary breakthroughs achieved over the past two decades. Unfortunately, a joint, interdisciplinary approach to robustness remains elusive. While most researchers in the field of CPS concede that it would be ideal to unify notions across disciplinary borders to reflect the close functional dependencies between heterogeneous components, the current state of affairs is fragmentary coverage by discipline-specific notions.
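One of the quantitative robustness notions transferred between control theory and computer science is the real-valued (rather than Boolean) satisfaction margin of a safety specification: a signal does not merely pass or fail a requirement, it satisfies it with a measurable amount of slack, so that small perturbations cannot silently flip a comfortably-satisfied property into a violation. The following minimal sketch illustrates the idea for a discrete-time signal and the specification "the signal stays below a limit at all times"; the signal values, limit, and function name are illustrative assumptions, not from the seminar material.

```python
def robustness_margin(signal, limit):
    """Worst-case margin by which `signal` stays below `limit`.

    A positive result means the safety property holds with that much slack;
    a negative result means the property is violated by that amount.
    """
    return min(limit - x for x in signal)

# Nominal behavior: safe, with a margin of 0.3 below the limit of 1.0.
nominal = [0.1, 0.4, 0.7, 0.5]
print(round(robustness_margin(nominal, 1.0), 2))   # 0.3

# The same signal under an (assumed) measurement disturbance of 0.35:
# the margin turns negative, i.e. the safety property is violated.
noisy = [x + 0.35 for x in nominal]
print(round(robustness_margin(noisy, 1.0), 2))     # -0.05
```

A design whose nominal margin exceeds the expected disturbance magnitude is robustly safe in this quantitative sense, which is exactly the kind of cross-disciplinary statement (a control-style margin attached to a computer-science-style specification) that the seminar's unification questions target.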

This Dagstuhl Seminar will bring together researchers from academia and industry working in hybrid control systems, mechatronics, formal methods, and real-time embedded systems. Participants will identify and discuss newly available techniques for robust design and analysis that could be applied to open problems in CPS, and will pinpoint open issues and research questions that require collaboration between the communities.

Some central questions that will be examined include:

  1. What is the rationale behind the plethora of existing notions of robustness and how are they related (if at all)?
  2. What measures have to be taken in a particular design domain (e.g., embedded software design) so that notions of robustness central to other domains that are functionally impacted (e.g., feedback control) are respected?
  3. What forms of correctness guarantees are provided by the different notions of robustness and would there be potential for unification or synergy?
  4. What design measures have been established by different disciplines for achieving robustness by construction, and how can they be lifted to other disciplines?
  5. Where do current notions of robustness or current techniques of system design fall short, and can these shortcomings be alleviated by adopting ideas from related disciplines?

The overarching objective of such research is to establish trusted engineering approaches for producing CPS designs that sustain their correctness and performance guarantees even when operated in a well-defined vicinity of their nominal operating regimes, and that can be trusted to degrade gracefully even when some of the underlying modeling and analysis assumptions turn out to be false.