Dagstuhl-Seminar 11051

Sparse Representations and Efficient Sensing of Data

(January 30 – February 4, 2011)


Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/11051


Summary

Modeling of data is the crucial step in enabling its processing. This modeling can take many forms: it can be done at a low level, relating the data samples to each other directly, or at higher levels that search for structures and constellations. The task of modeling data is so fundamental that it underlies most of the major achievements in signal and image processing, and the same holds for more general data sources. Indeed, the field of machine learning, which addresses this general problem, also recognizes the importance of such modeling. Among these models, one stands out as quite simple yet very important: a model based on a sparse description of the data. The core idea is to describe the data as a sparse linear combination of core elements, referred to as atoms. This model has attracted huge interest in the past decade, with many mathematicians, computer scientists, engineers, and scientists from various disciplines working on its different facets and building a set of tools that leads all the way from pure mathematical concepts to practical methods used in other computational sciences and applications. Using this model, researchers have shown in recent years that a wide battery of computational research disciplines and applications directly benefit from it, leading to state-of-the-art results. Various reconstruction problems, data compression, sampling and sensing, signal separation, data cleaning and denoising, adaptive numerical schemes, and more all rely on sparse representations to succeed.
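The core idea above — a signal as a sparse linear combination of dictionary atoms — can be illustrated with a minimal sketch. The following is not taken from the seminar; it is a toy example, with a hand-built dictionary chosen purely for illustration, showing how a greedy pursuit algorithm (Orthogonal Matching Pursuit) recovers the few active atoms from the observed signal alone.

```python
import numpy as np

# Toy overcomplete dictionary (an assumption for illustration):
# the 8 canonical basis vectors plus 2 extra normalized atoms.
I = np.eye(8)
u = np.ones(8) / np.sqrt(8)
v = np.array([1, -1, 1, -1, 1, -1, 1, -1]) / np.sqrt(8)
D = np.column_stack([I, u, v])           # shape (8, 10), 10 atoms in R^8

# Sparse ground truth: only atoms 2 and 7 are active.
x_true = np.zeros(10)
x_true[2], x_true[7] = 1.5, -0.8
y = D @ x_true                           # observed signal

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily select k atoms."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Least-squares fit of y on the selected atoms.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

x_hat = omp(D, y, k=2)
print(np.flatnonzero(np.abs(x_hat) > 1e-10))  # [2 7]
```

In this toy setting the two active atoms are recovered exactly; in general, recovery guarantees depend on the coherence of the dictionary and the sparsity level.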

The goals of the seminar can be summarized as follows:

  • Establish communication between the different research focus areas
  • Open new areas of application
  • Chart the future direction of the field
  • Introduce young scientists to the field

To reach these goals, the organizers identified the most relevant fields of research in advance:

  • Sampling and Compressed Sensing
  • Frames, Adaptivity and Stability
  • Algorithms and Applications

The seminar was mainly centered on these topics, and the talks and discussion groups were clustered accordingly. During the seminar it turned out that "generalized sensing", "data modeling", and the corresponding "algorithms" are currently the most important topics; indeed, most of the proposed talks were concerned with these three issues. This finding was also manifested in the discussion groups. For a detailed description of the outcome of the discussions, we refer to the section on discussion groups.

The course of the seminar gave the impression that sparsity, with all its facets, is definitely one of the most important techniques in applied mathematics and computer science. The associated sampling issues are also of great importance. We saw many different viewpoints, ranging from classical linear and nonlinear sampling to compressive sensing. In particular, new results on generalized sampling show how to design effective sampling strategies for recovering sparse signals. The impact of these techniques became clear as they extend the classical finite-dimensional theory of compressive sensing to infinite-dimensional data models. Moreover, it was fascinating to see how sampling and sparsity concepts are by now influencing many different fields of application, ranging from image processing, compression, and resolution to adaptive numerical schemes and the treatment of operator equations and inverse problems. It seems that the duality between sparse sampling and sparse recovery is a common fundamental structure behind many different applications, although the mathematical technicalities remain quite challenging. As algorithmic issues were also discussed intensively, it became clear that ℓ1-optimization is now essentially competitive in speed with classical linear methods such as conjugate gradients.
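The ℓ1-optimization mentioned above is typically posed as the lasso problem, min 0.5·||Ax − y||² + λ·||x||₁. As a concrete (hypothetical, not seminar-specific) sketch, the classical Iterative Shrinkage-Thresholding Algorithm (ISTA) solves it by alternating a gradient step with a cheap soft-thresholding step; the matrix, data, and λ below are made up for illustration.

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    """ISTA for the lasso: min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of grad
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft(x - step * (A.T @ (A @ x - y)), lam * step)
    return x

# Hypothetical toy problem: 2 measurements, 3 unknowns.
A = np.array([[1.0, 0.0, 0.7],
              [0.0, 1.0, 0.7]])
y = np.array([1.0, 0.0])
x_hat = ista(A, y, lam=0.1)
obj = 0.5 * np.sum((A @ x_hat - y) ** 2) + 0.1 * np.sum(np.abs(x_hat))
```

Each iteration costs only two matrix-vector products, which is why such schemes can compete in speed with classical linear solvers like conjugate gradients while additionally promoting sparsity.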

Summarizing our findings during the seminar, we believe that the research agenda can now focus more on the actual bottlenecks: problem and signal modeling, the design of sampling and recovery methods adapted to specific problems, and algorithmic improvements, including performance bounds and guarantees.


Participants
  • Ronny Bergmann (Universität zu Lübeck, DE)
  • Martin Ehler (National Institutes of Health - Bethesda, US) [dblp]
  • Michael Elad (Technion - Haifa, IL) [dblp]
  • Hans Georg Feichtinger (Universität Wien, AT)
  • Mario Figueiredo (Technical University - Lisboa, PT)
  • Massimo Fornasier (TU München, DE)
  • Ulrich Friedrich (Universität Marburg, DE)
  • Onur G. Guleryuz (DoCoMo USA Labs - Palo Alto, US)
  • Anders Hansen (University of Cambridge, GB) [dblp]
  • Evelyn Herrholz (Hochschule Neubrandenburg, DE)
  • Olga Holtz (TU Berlin, DE)
  • Stefan Kunis (Universität Osnabrück, DE) [dblp]
  • Gitta Kutyniok (TU Berlin, DE) [dblp]
  • Wang-Q Lim (Universität Osnabrück, DE)
  • Dirk Lorenz (TU Braunschweig, DE)
  • Peter Maaß (Universität Bremen, DE)
  • Sangnam Nam (INRIA Rennes - Bretagne Atlantique, FR)
  • Peter Oswald (Jacobs Universität - Bremen, DE)
  • Ali Pezeshki (Colorado State University, US)
  • Götz E. Pfander (Jacobs Universität - Bremen, DE)
  • Gerlind Plonka-Hoch (Universität Göttingen, DE) [dblp]
  • Daniel Potts (TU Chemnitz, DE) [dblp]
  • Thorsten Raasch (Institut für Molekulare Biologie GmbH - Mainz, DE)
  • Stefan Schiffler (Universität Bremen, DE)
  • Jean-Luc Starck (CEA - Gif sur Yvette, FR) [dblp]
  • Gabriele Steidl (TU Kaiserslautern, DE)
  • Rob Stevenson (University of Amsterdam, NL)
  • Gerd Teschke (Hochschule Neubrandenburg, DE)
  • Bruno Torresani (University of Marseille, FR)
  • Michael Unser (EPFL - Lausanne, CH) [dblp]
  • Martin Vetterli (EPFL - Lausanne, CH)
  • Przemyslaw Wojtaszczyk (University of Warsaw, PL)
  • Xiaosheng Zhuang (Universität Osnabrück, DE)

Related Seminars
  • Dagstuhl Seminar 08492: Structured Decompositions and Efficient Algorithms (2008-11-30 - 2008-12-05)

Classification
  • databases / information retrieval
  • data structures / algorithms / complexity
  • efficient sensing of signals
  • sparse signal representation / reconstruction

Keywords
  • Efficient signal sensing schemes
  • sparse signal representations
  • efficient signal reconstruction algorithms
  • impact of the methods in neighboring research fields and applications