https://www.dagstuhl.de/11051

### January 30 – February 4, 2011, Dagstuhl Seminar 11051

# Sparse Representations and Efficient Sensing of Data

## Organizers

Stephan Dahlke (Universität Marburg, DE)

Michael Elad (Technion – Haifa, IL)

Yonina C. Eldar (Technion – Haifa, IL)

Gitta Kutyniok (TU Berlin, DE)

Gerd Teschke (Hochschule Neubrandenburg, DE)

## Documents

Dagstuhl Report, Volume 1, Issue 1

List of Participants

Shared Documents

## Summary

Modeling of data is the crucial step in enabling any processing of it. This modeling can take many forms: it can be done at a low level that ties the data samples together directly, or at higher levels that search for structures and constellations. The task of modeling data is so fundamental that it underlies most of the major achievements in signal and image processing, and this holds for the processing of more general data sources as well. Indeed, the field of machine learning, which addresses this general problem, also recognizes the importance of such modeling.

In this realm of models, one stands out as quite simple yet very important: a model based on a sparse description of the data. The core idea is to represent the data as a sparse linear combination of core elements, referred to as atoms. This model has attracted huge interest in the past decade, with many mathematicians, computer scientists, engineers, and scientists from various disciplines working on its different facets and building a set of tools that lead all the way from pure mathematical concepts to practical tools used in other computational sciences and applications. Using this model, researchers have shown in recent years that a wide range of computational research disciplines and applications benefit directly from it, leading to state-of-the-art results. Various reconstruction problems, data compression, sampling and sensing, separation of signals, cleaning and denoising of data, adaptive numerical schemes, and more all rely on sparse representations to succeed in their tasks.
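The sparse synthesis model described above can be sketched in a few lines of NumPy. The dictionary, dimensions, and sparsity level below are invented purely for illustration; the point is only that the signal `y` is built from very few atoms (columns of `D`):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the report): a 64-sample signal,
# a dictionary of 256 atoms, and only 3 active atoms.
n_samples, n_atoms, sparsity = 64, 256, 3

# Dictionary D: each column is an "atom", normalized to unit length.
D = rng.standard_normal((n_samples, n_atoms))
D /= np.linalg.norm(D, axis=0)

# Sparse coefficient vector x: zero everywhere except on a small support.
x = np.zeros(n_atoms)
support = rng.choice(n_atoms, size=sparsity, replace=False)
x[support] = rng.standard_normal(sparsity)

# The data is a sparse linear combination of atoms: y = D x.
y = D @ x
```

Every signal generated this way is exactly reproduced by its few active atoms, which is the structure that sparse recovery algorithms exploit.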

The goals of the seminar can be summarized as follows:

- Establish communication between the different research foci
- Open new areas of application
- Chart the future direction of the field
- Introduce young scientists to the field

The scientific program was organized around three topical areas:

- Sampling and Compressed Sensing
- Frames, Adaptivity and Stability
- Algorithms and Applications

The seminar was mainly centered around these topics, and the talks and discussion groups were clustered accordingly. During the seminar it turned out that "generalized sensing", "data modeling", and the corresponding "algorithms" are currently the most important topics; indeed, most of the proposed talks were concerned with these three issues. This finding was also reflected in the discussion groups. For a detailed description of the outcome of the discussions, we refer to the section on discussion groups.

The course of the seminar gave the impression that sparsity, in all its facets, is definitely one of the most important techniques in applied mathematics and computer science. Also of great importance are the associated sampling issues. We have seen many different viewpoints, ranging from classical linear and nonlinear sampling to compressive sensing. In particular, new results on generalized sampling show how to design effective sampling strategies for recovering sparse signals. The impact of these techniques became clear as they allow an extension of the classical finite-dimensional theory of compressive sensing to infinite-dimensional data models. Moreover, it was fascinating to see how sampling and sparsity concepts are by now influencing many different fields of application, ranging from image processing/compression/resolution to adaptive numerical schemes and the treatment of operator equations and inverse problems. It seems that the duality between sparse sampling and sparse recovery is a common fundamental structure behind many different applications, although the mathematical technicalities remain quite challenging. As algorithmic issues were also discussed quite intensively, it became clear that ℓ1-optimization is now essentially competitive in speed with classical linear methods such as conjugate gradients.
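As a small illustration of the ℓ1-optimization mentioned above, the sketch below recovers a sparse vector from a reduced number of random measurements using ISTA (iterative soft-thresholding), one of the simplest ℓ1 solvers. The problem sizes, sensing matrix, and parameters are invented for illustration and are not taken from the seminar:

```python
import numpy as np

rng = np.random.default_rng(0)

# Compressed-sensing toy problem: m = 80 random measurements of an
# n = 200-dimensional signal with only k = 5 nonzero entries.
n, m, k = 200, 80, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)      # random sensing matrix
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], size=k) * (1.0 + rng.random(k))
y = A @ x_true                                    # the measurements

# ISTA for the lasso problem  min_x 0.5*||A x - y||^2 + lam*||x||_1
lam = 0.01
L = np.linalg.norm(A, 2) ** 2                     # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(3000):
    grad = A.T @ (A @ x - y)                      # gradient of the quadratic term
    z = x - grad / L                              # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding
```

Despite having far fewer measurements than unknowns, the ℓ1 penalty drives the iterate back to the sparse ground truth (up to a small shrinkage bias controlled by `lam`), which is the phenomenon at the heart of compressive sensing.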

Summarizing our findings from the seminar, we believe that the research agenda can be focused more sharply on the actual bottlenecks: problem and signal modeling, the design of sampling and recovery methods adapted to specific problems, and algorithmic improvements, including performance bounds and guarantees.

## Related Dagstuhl Seminar

- 08492: "Structured Decompositions and Efficient Algorithms" (2008)

## Classification

- Databases / Information Retrieval
- Data Structures / Algorithms / Complexity
- Efficient Sensing Of Signals
- Sparse Signal Representation / Reconstruction

## Keywords

- Efficient signal sensing schemes
- Sparse signal representations
- Efficient signal reconstruction algorithms
- Impact of the methods in neighboring research fields and applications