March 6–11, 2022, Dagstuhl Seminar 22101

Tensor Computations: Applications and Optimization


Paolo Bientinesi (University of Umeå, SE)
David Ham (Imperial College London, GB)
Furong Huang (University of Maryland – College Park, US)
Paul H. J. Kelly (Imperial College London, GB)
P. (Saday) Sadayappan (University of Utah – Salt Lake City, US)





Tensors are higher-dimensional analogs of matrices and represent a key data abstraction for many applications in computational science and data science. Widely used shared infrastructure exists for linear algebra; in contrast, for tensor computations there is no consensus on standard building blocks. This Dagstuhl Seminar aims to bring together users and performance-optimization specialists to build such foundations.

Tensor computations are important in a wide range of application domains, including, among others:

  • Physics – notably in quantum information theory and tensor network models in quantum many-body systems
  • Chemistry – notably in electronic structure calculations, for example using coupled-cluster methods
  • Mechanics – notably in numerical methods for solution of partial differential equations
  • Machine learning – notably as a language for deep learning and as a framework for multidimensional data analysis

The development of a common language for tensor contractions, tensor networks, tensor decompositions, and the associated numerical methods is yielding deep insight and cross-fertilization. Furthermore, several concurrent efforts target the development of libraries, frameworks, and domain-specific compilers to support the rising demand for high-performance tensor computations.
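As a concrete illustration (not part of the seminar materials), the pairwise tensor contraction — the basic building block of tensor networks — can be sketched with NumPy's `einsum`; the shapes and index labels here are arbitrary examples:

```python
import numpy as np

# A rank-3 tensor A[i, j, k] contracted with a rank-3 tensor B[k, l, m]
# over the shared index k yields a rank-4 tensor C[i, j, l, m].
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3, 4))
B = rng.standard_normal((4, 5, 6))

# einsum expresses the contraction directly in index notation.
C = np.einsum("ijk,klm->ijlm", A, B)
print(C.shape)  # (2, 3, 5, 6)

# The same contraction as a matrix multiplication over flattened modes —
# the reduction high-performance libraries commonly use internally.
C2 = (A.reshape(6, 4) @ B.reshape(4, 30)).reshape(2, 3, 5, 6)
assert np.allclose(C, C2)
```

The equivalence with a flattened matrix product is one reason a shared vocabulary matters: the same contraction can be phrased in index notation, as a network diagram, or as reshaped BLAS calls.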

This seminar aims to realize the potential of an emerging recognition of the common foundations that underpin tensor computations across these very diverse domains. There is a huge opportunity for coordination among the various communities: the development of high-performance libraries and code-generation frameworks for tensor computations can be shaped by improved interaction with the research communities that develop applications using tensors as their key data abstraction.

This seminar builds on Seminar 20111 (March 2020) of the same name, which operated on a reduced scale due to the coronavirus pandemic. It will bring together researchers whose focus is the application of tensor computations and researchers developing software infrastructure for efficient tensor computation primitives, including experts in high-performance computing, high-performance machine learning, compiler optimization, and numerical methods across the spectrum of application areas.

A very fruitful exchange of ideas is anticipated, with discussions on the variety of needs and use cases for tensor computations and the challenges and opportunities in developing high-performance software to satisfy those needs.

Motivation text license
  Creative Commons BY 4.0
  Paolo Bientinesi, David Ham, Furong Huang, Paul H. J. Kelly, and P. (Saday) Sadayappan

Related Dagstuhl Seminar

  • 20111: Tensor Computations: Applications and Optimization (March 2020)

Classification

  • Computational Engineering / Finance / and Science
  • Machine Learning
  • Mathematical Software

Keywords

  • Compilers
  • Computational science
  • Linear algebra
  • Machine learning
  • Numerical methods

Each Dagstuhl Seminar and Dagstuhl Perspectives Workshop is documented in the series Dagstuhl Reports. The seminar organizers, in cooperation with the collector, prepare a report that includes contributions from the participants' talks together with a summary of the seminar.



Dagstuhl's Impact

Please inform us when a publication has appeared as a result of your seminar. Such publications are listed in the category Dagstuhl's Impact and are presented on a special shelf on the ground floor of the library.


Furthermore, a comprehensive peer-reviewed collection of research papers can be published in the series Dagstuhl Follow-Ups.