March 6–11, 2022, Dagstuhl Seminar 22101

Tensor Computations: Applications and Optimization


Paolo Bientinesi (University of Umeå, SE)
David Ham (Imperial College London, GB)
Furong Huang (University of Maryland – College Park, US)
Paul H. J. Kelly (Imperial College London, GB)
P. (Saday) Sadayappan (University of Utah – Salt Lake City, US)

Information about this Dagstuhl Seminar is provided by the Dagstuhl Service Team.




Tensors are higher-dimensional analogs of matrices and represent a key data abstraction for many applications in computational science and data science. Widely used shared infrastructure exists for linear algebra; for tensor computations, in contrast, there is no consensus on standard building blocks. This Dagstuhl Seminar aims to bring together users and performance optimization specialists to build such foundations.
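As a small illustration (not part of the seminar text), one candidate "building block" is the tensor contraction, the higher-order analog of matrix multiplication. A minimal NumPy sketch, using `einsum` index notation:

```python
import numpy as np

# Contract a 3rd-order tensor with a matrix over one shared index:
# C[i, j, l] = sum_k A[i, j, k] * B[k, l]
A = np.arange(24.0).reshape(2, 3, 4)   # shape (2, 3, 4)
B = np.ones((4, 5))                    # shape (4, 5)

C = np.einsum('ijk,kl->ijl', A, B)
print(C.shape)  # (2, 3, 5)

# The same contraction via reshape + matmul, the kind of BLAS-backed
# mapping a tensor library might perform internally
C2 = (A.reshape(6, 4) @ B).reshape(2, 3, 5)
assert np.allclose(C, C2)
```

The equivalence between the index-notation form and the reshape-plus-GEMM form is exactly the kind of mapping that high-performance tensor libraries automate.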

Tensor computations are important in a wide range of application domains, including, among others:

  • Physics – notably in quantum information theory and tensor network models in quantum many-body systems
  • Chemistry – notably in electronic structure calculations, for example using coupled-cluster methods
  • Mechanics – notably in numerical methods for solution of partial differential equations
  • Machine learning – notably as a language for deep learning and as a framework for multidimensional data analysis

The development of a common language for tensor contractions, tensor networks, tensor decompositions, and the associated numerical methods is yielding deep insight and cross-fertilization. Furthermore, several concurrent efforts have targeted the development of libraries, frameworks, and domain-specific compilers to support the rising demand for high-performance tensor computations.
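For instance, one of the decompositions referred to above, the rank-R CP (CANDECOMP/PARAFAC) decomposition, represents a 3rd-order tensor as a sum of R rank-one terms. A minimal NumPy sketch (illustrative only; the factor matrices here are random placeholders, not seminar material):

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3

# Factor matrices of a rank-R CP model: T[i,j,k] = sum_r U[i,r] V[j,r] W[k,r]
U = rng.standard_normal((I, R))
V = rng.standard_normal((J, R))
W = rng.standard_normal((K, R))

# Reconstruct the full tensor from its factors with a single einsum
T = np.einsum('ir,jr,kr->ijk', U, V, W)
print(T.shape)  # (4, 5, 6)

# Check one entry against the explicit sum of R rank-one terms
i, j, k = 1, 2, 3
assert np.isclose(T[i, j, k],
                  sum(U[i, r] * V[j, r] * W[k, r] for r in range(R)))
```

The same einsum expression appears, with domain-specific vocabulary, in quantum chemistry, tensor network, and machine learning codes alike, which is the shared foundation the seminar text points to.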

This seminar aims to realize the potential of an emerging recognition of the common foundations that underpin tensor computations across these very diverse domains. There is a huge opportunity for coordination among the various communities: the development of high-performance libraries and code generation frameworks for tensor computations can be shaped by improved interaction with the research communities that develop applications using tensors as their key data abstraction.

This seminar builds on Seminar 20111 (March 2020) of the same name, which operated on a reduced scale due to the coronavirus pandemic. It will bring together researchers whose focus is the application of tensor computations, and researchers developing software infrastructure for efficient tensor computation primitives, including experts in high-performance computing, high-performance machine learning, compiler optimization, and in numerical methods across the spectrum of application areas.

A very fruitful exchange of ideas is anticipated, with discussions on the variety of needs and use cases for tensor computations and on the challenges and opportunities in developing high-performance software to satisfy those needs.

Motivation text license
  Creative Commons BY 4.0
  Paolo Bientinesi, David Ham, Furong Huang, Paul H. J. Kelly, and P. (Saday) Sadayappan

Related Dagstuhl Seminar

  • 20111: Tensor Computations: Applications and Optimization (March 2020)

Classification

  • Computational Engineering / Finance / and Science
  • Machine Learning
  • Mathematical Software


Keywords

  • Compilers
  • Computational science
  • Linear algebra
  • Machine learning
  • Numerical methods


All Dagstuhl Seminars and Dagstuhl Perspectives Workshops are documented in the Dagstuhl Reports series. Together with the seminar's collector, the organizers compile a report that summarizes the authors' contributions and adds an overall summary.



Dagstuhl's Impact

Please let us know if a publication arises from your seminar. Such publications are listed separately under Dagstuhl's Impact and displayed on the ground floor of the library.


There also remains the possibility of publishing a comprehensive collection of peer-reviewed papers in the Dagstuhl Follow-Ups series.