Research Meeting 08253

Security Hardware in Theory and Practice – A Marriage of Convenience

(Jun 18 – Jun 20, 2008)


Permalink
Please use the following short url to reference this page: https://www.dagstuhl.de/08253

Call for Participation

Many of today's and future IT applications demand sophisticated security in both software and hardware. Today, security in communication systems is based not only on strong cryptographic primitives and protocols but also on technological support for the secure implementation of the corresponding algorithms, such as tamper-proof hardware, true random number generators (TRNGs), access control at the hardware level, strong dependencies between several parts of the hardware, as well as mechanisms to securely bind the software and hardware of a computing platform. While special-purpose cryptographic co-processors and hardware dongles are relatively well known, recent developments target widespread commercial use by introducing microprocessors and security chips which support the required security functionalities of the trusted platform specification. Prominent examples include Intel's Trusted Execution Technology, AMD's Presidio, and the TPM (Trusted Platform Module) proposed by the Trusted Computing Group (TCG). Other recent promising developments concern Physical Unclonable Functions (PUFs), where inherent hardware characteristics can be exploited for authentication, key storage, or binding other (software) components to the underlying hardware without the need for additional non-volatile secure memory.
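To illustrate the PUF idea mentioned above, the following minimal sketch shows a conventional challenge-response flow: during enrollment a verifier records challenge-response pairs (CRPs) from a device's PUF in a trusted setting, and later authenticates the device by replaying an unused challenge and accepting a noisy response within a Hamming-distance threshold. The measure_puf function, the device interface, and the threshold value are hypothetical placeholders for illustration only, not part of any system described in this call.

    import os
    import random

    HAMMING_THRESHOLD = 10  # hypothetical noise tolerance (bits out of a 128-bit response)

    def measure_puf(device, challenge: bytes) -> int:
        """Placeholder: query the device's PUF with a challenge, get a 128-bit response."""
        return device.respond(challenge)

    def enroll(device, num_crps: int = 1000) -> dict:
        """Verifier-side enrollment: collect challenge-response pairs in a trusted setting."""
        crps = {}
        for _ in range(num_crps):
            challenge = os.urandom(16)
            crps[challenge] = measure_puf(device, challenge)
        return crps

    def authenticate(device, crps: dict) -> bool:
        """In the field: replay a stored challenge and compare the noisy response."""
        challenge, expected = random.choice(list(crps.items()))
        del crps[challenge]  # each CRP is used only once to prevent replay
        observed = measure_puf(device, challenge)
        distance = bin(expected ^ observed).count("1")
        return distance <= HAMMING_THRESHOLD

The point of the sketch is that no secret key is stored in non-volatile memory: the device's response is re-derived from its physical characteristics on every authentication, which is exactly the property the paragraph above highlights.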

The above-mentioned aspects, however, assume trust in the involved parties, such as the designers and manufacturers of security hardware. This assumption may not hold in such a broad sense, at least not for IC manufacturers. Assuring the correctness and soundness of security hardware even in the presence of untrusted manufacturers is a challenging and hard-to-tackle problem, and one of great interest to industry as well as governments. The life-cycle of security hardware begins with the IC design step, which results in the IC masks being shipped for production to the manufacturer's facilities. Notice, however, that many manufacturers outsource this step as they aim to reduce IC production costs. Thus, there is little assurance that the functionality on the chip is not (deliberately) modified or supplemented with a hidden trapdoor circuit. For instance, keys which were never supposed to leave a security chip (such as a TPM) might be leaked via padding, the tamper or leakage protection circuits may be disabled or weakened, the TRNG may be biased, etc. Any single one of these manufacturing attacks will have serious consequences for any top-level security application. Although there are legal means for the hardware manufacturer to provide certain guarantees to their contractors or outsourcing party, it remains an open problem to provide strong, cost-effective and easily deployable technological means for (i) detecting Trojan circuits and trapdoors in ICs manufactured in untrusted foundries, and (ii) monitoring and auditing the hardware after delivery by the manufacturer, e.g., to deter illegal redistribution through illegitimate overproduction and resale.

In addition to security hardware, there have been software-based proposals that aim to provide certain security guarantees even if an adversary has complete control of the software and the executing platform. These approaches, termed whitebox security, refer to a secure runtime environment, i.e., to the execution of software in a remote untrusted environment without relying on a (hardware) Trusted Computing Base. In this context, it is also assumed that the attacker has full control of and access to the software and can observe the code, data, system calls and memory usage. The attractiveness of this approach comes from applications in which solutions based on a Trusted Computing Base (TCB) are inappropriate, unattainable or too expensive to enable secure execution. Another related aspect of adversarial environments is cryptovirology, i.e., the application of cryptography to malicious software.
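One concrete instance of the post-delivery auditing mentioned above is checking a shipped chip's TRNG for bias. The sketch below applies the standard monobit (frequency) test idea: count the ones in a sample of raw TRNG output and flag the device if the count deviates too far from one half. The read_trng_bits function and the sample size are hypothetical placeholders; a real audit would run a full statistical suite (e.g., the NIST SP 800-22 tests) rather than this single check.

    import math

    def read_trng_bits(device, n_bits: int) -> list:
        """Placeholder: pull n_bits of raw output from the device's TRNG."""
        return device.raw_bits(n_bits)

    def monobit_test(bits: list, alpha: float = 0.01) -> bool:
        """Monobit frequency test: an unbiased TRNG should emit roughly 50% ones.

        Returns True if the sample looks unbiased at significance level alpha.
        """
        n = len(bits)
        # Sum of +/-1 values; for an unbiased source this is approximately N(0, n).
        s = sum(1 if b else -1 for b in bits)
        s_obs = abs(s) / math.sqrt(n)
        p_value = math.erfc(s_obs / math.sqrt(2))
        return p_value >= alpha

    def audit_trng(device, n_bits: int = 1_000_000) -> bool:
        """Flag a delivered chip whose raw TRNG output fails the bias check."""
        return monobit_test(read_trng_bits(device, n_bits))

A deliberately biased TRNG, one of the manufacturing attacks listed above, would fail such a test with high probability once enough output is sampled; subtler backdoors (e.g., a correctly distributed but predictable generator) would not, which is why TRNG auditing remains a research topic.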

In this workshop, we aim to discuss and investigate problems, challenges, and recent scientific and technological developments in security hardware and related technologies, as well as specific aspects of software security (whitebox security, cryptovirology) that are of great interest from a system-security perspective and can be deployed to solve various real-world security problems, in particular those mentioned above.

The main topics of this workshop include (but are not limited to):

  • Physical Unclonable Functions (PUFs)
  • Secure cryptographic coprocessors including Trusted Platform Modules (TPM)
  • TRNGs (true random number generators): improving their performance
  • Trojan ICs
  • Trapdoor ICs
  • Malicious Cryptography
  • IC Fingerprinting
  • Secure program execution
  • Whitebox Security
  • Foundations (Physically Observable Cryptography, PUF Modeling, etc.)

Steering and Organization Committee:

  • Jorge Guajardo (Philips Research, Eindhoven)
  • Stefan Katzenbeisser (Technical University Darmstadt)
  • Klaus Kursawe (Philips Research, Eindhoven)
  • David Naccache (Ecole Normale Supérieure)
  • Christof Paar (Ruhr-University Bochum)
  • Bart Preneel (Katholieke Universiteit Leuven)
  • Ahmad-Reza Sadeghi (Ruhr-University Bochum)
  • Jamshid Shokrollahi (Ruhr-University Bochum)
  • Berk Sunar (Worcester Polytechnic Institute)
  • Pim Tuyls (Philips Research, Eindhoven)
  • Moti Yung (Google and Columbia University)