- Annette Beyer (for administrative matters)
The rapid expansion of global connectivity, distributed applications, and digital services over open networks and across organizational domains requires secure IT systems that adhere to well-defined policies. Cryptography and technical IT security mechanisms support the establishment of secure channels and authorized access. However, many of today's IT applications demand sophisticated security and privacy mechanisms, in both software and hardware, that go beyond secure channels and authorization and extend to truly secure liaisons: enterprises and manufacturers outsource their computations, data storage, and production to potentially untrusted parties over which they have limited control. Medical records are transmitted through and processed by various IT systems such as handhelds, PCs, and hospital servers. Biometric data are carried by individuals on their ID cards or electronic passports. Counterfeit pharmaceuticals or automotive and avionic spare parts are packaged in some countries and distributed illegally to worldwide destinations.
IT system security, however, rests not only on strong cryptographic primitives and protocols but also on technological support for the secure implementation of the corresponding algorithms. In particular, this concerns security functionality provided by the underlying hardware, commonly deployed in the form of cryptographic hardware. The study of how to model, design, evaluate, and deploy such cryptographic hardware was the focus of our seminar.
The recent trend of deploying security functionality in hardware typically assumes trust in the various parties involved in the design and manufacturing of the hardware. The life cycle of cryptographic hardware begins with the IC design step, which results in IC blueprints being shipped for production to (typically overseas) low-cost manufacturing facilities. This trend is driven by economic and strategic reasons as well as by globalization. Although this model has many advantages, it also has a serious disadvantage: it becomes much easier for attackers to compromise hardware devices commonly used in critical infrastructure, including commercial, health, and defense applications.
As a result, many ICs and components today are overbuilt (over-produced in an unauthorized manner). This, in turn, allows such devices and components to enter the market through gray channels and erode the revenues of legitimate Intellectual Property (IP) owners. In addition, there is a high risk that the functionality on the chip is deliberately modified or supplemented with a hidden trapdoor circuit, i.e., a hardware Trojan. For instance, keys that were never supposed to leave a security chip might be leaked (e.g., via padding), the tamper or leakage protection circuits of a chip may be disabled or weakened, a True Random Number Generator (TRNG) may be biased, or the IC might contain a kill switch that makes it stop functioning under certain conditions. Even in the non-malicious case, overseas manufacturers may try to cut costs by omitting or weakening security measures present in the original design. Any one of these manufacturing attacks or malpractices can have serious consequences for a security application, enable industrial espionage and privacy violations, and ultimately even threaten national security.
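To give a feel for why a biased TRNG is so damaging, the following is a minimal Python sketch (a software simulation, not actual hardware, with illustrative bias and stream-length parameters chosen here) that models an honest bit source against a trojaned one and estimates the resulting loss of key entropy:

```python
import math
import random

def bit_entropy(p_one: float) -> float:
    """Shannon entropy (in bits) of one output bit with P(1) = p_one."""
    if p_one in (0.0, 1.0):
        return 0.0
    p_zero = 1.0 - p_one
    return -(p_one * math.log2(p_one) + p_zero * math.log2(p_zero))

def estimate_bias(bits) -> float:
    """Empirical probability of a 1 in a bit stream."""
    return sum(bits) / len(bits)

rng = random.Random(42)  # fixed seed so the sketch is reproducible

# Honest TRNG model: each output bit is 1 with probability 0.5.
honest = [1 if rng.random() < 0.5 else 0 for _ in range(100_000)]
# Trojaned TRNG model: a subtle hardware bias pushes P(1) to 0.9.
biased = [1 if rng.random() < 0.9 else 0 for _ in range(100_000)]

for name, stream in (("honest", honest), ("biased", biased)):
    p = estimate_bias(stream)
    h = bit_entropy(p)
    # A 128-bit key drawn from this source carries roughly 128 * h bits
    # of entropy, so the attacker's search space shrinks accordingly.
    print(f"{name}: P(1) = {p:.3f}, entropy = {h:.3f} bits/bit, "
          f"effective 128-bit key strength = {128 * h:.0f} bits")
```

With P(1) = 0.9, the per-bit entropy drops below 0.5, so a nominally 128-bit key retains only on the order of 60 bits of effective strength, and the device passes casual functional testing while its keys are drastically weakened.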
Current methods for assuring the trustworthiness of cryptographic hardware rely heavily on the skills of an evaluator. The lack of standardized methodologies and tools requires that the evaluator correctly identify and manually evaluate each risk area. The evaluator must be aware of and execute all known attacks while also formulating and exercising new forms of attack. The evaluator knows only what was found, not what remains to be found. More resources are spent to obtain higher levels of assurance, with the ultimate measure of assurance being what happens once the product is in production or has been deployed. Advances in commercially viable approaches to assuring the security of hardware are critical. From defining systematic approaches for assurance to identifying tools that automate and continuously improve assurance levels, significant new research is required. Moreover, commercial hardware engineering practice lags well behind software engineering when it comes to establishing a set of best practices that yields high-quality security products. Existing methods developed for high-assurance hardware, typically for use by governments, either break down when confronted with the size of modern designs (e.g., microprocessors) or are unacceptable from an economic perspective. Thus, a systematic approach with a solid scientific basis is required to ensure that hardware, as the security anchor (or root of trust) for computing, will deliver the necessary security guarantees.
We found the seminar fruitful in the sense that several modeling issues were raised, which we expect will lead the community to a better understanding of the security issues and requirements of forgery-resilient hardware. In addition, the participation of both theoretical computer scientists and more implementation-oriented researchers allowed for a better mutual understanding: which models are realistic, what needs to be formalized to be able to prove the security of an implementation, and what emerging applications of security hardware exist.
Moreover, it appears that the formal modeling of hardware primitives and the subsequent deployment of such hardware will remain hot topics for the next few years. We therefore plan further workshops to encourage continued interdisciplinary interaction.
- Frederik Armknecht (Ruhr-Universität Bochum, DE) [dblp]
- Endre Bangerter (Bern University of Applied Sciences, CH)
- Lejla Batina (KU Leuven, BE) [dblp]
- Loïc Duflot (SGDN/DCSSI - Paris, FR)
- Marc Fischlin (TU Darmstadt, DE) [dblp]
- Jorge Guajardo Merchan (Philips Research Europe - Eindhoven, NL) [dblp]
- Shay Gueron (University of Haifa and Intel Corp., IL) [dblp]
- Helena Handschuh (San Francisco, US) [dblp]
- Stefan Katzenbeisser (TU Darmstadt, DE) [dblp]
- Darko Kirovski (Microsoft Corporation - Redmond, US)
- Markus Kuhn (University of Cambridge, GB)
- Miroslaw Kutylowski (Wroclaw University of Technology, PL)
- Hans Löhr (Ruhr-Universität Bochum, DE)
- Mark Manulis (TU Darmstadt, DE)
- Bart Preneel (KU Leuven, BE) [dblp]
- Jean-Jacques Quisquater (University of Louvain, BE) [dblp]
- Ahmad-Reza Sadeghi (Ruhr-Universität Bochum, DE) [dblp]
- Rei Safavi-Naini (University of Calgary, CA) [dblp]
- Patrick Schaumont (Virginia Polytechnic Institute - Blacksburg, US) [dblp]
- Thomas Schneider (Ruhr-Universität Bochum, DE)
- Jean-Pierre Seifert (TU Berlin, DE) [dblp]
- Adi Shamir (Weizmann Institute - Rehovot, IL) [dblp]
- Boris Skoric (TU Eindhoven, NL) [dblp]
- Francois-Xavier Standaert (University of Louvain, BE) [dblp]
- G. Edward Suh (Cornell University, US) [dblp]
- Berk Sunar (Worcester Polytechnic Institute, US)
- Philippe Teuwen (NXP Semiconductors - Leuven, BE)
- Gene Tsudik (University of California - Irvine, US) [dblp]
- Pim Tuyls (Intrinsic-ID - Mol, BE) [dblp]
- Markus Ullmann (BSI - Bonn, DE)
- Ingrid Verbauwhede (KU Leuven, BE) [dblp]
- Christian Wachsmann (Ruhr-Universität Bochum, DE) [dblp]
- Physically Unclonable Functions
- Hardware Trojans
- Trapdoor Detection
- Device Counterfeiting
- Trusted Computing