This event gathered leading researchers working on applied and theoretical aspects of the Boolean satisfiability (SAT) problem, in areas such as SAT solving, proof complexity, and computational complexity theory more broadly. It also included representatives from neighbouring areas such as satisfiability modulo theories (SMT) solving, maximum satisfiability (MaxSAT) solving, pseudo-Boolean optimization, constraint programming, and mixed integer linear programming (MIP) on the applied side, and from other areas of computational complexity theory such as exact exponential-time algorithms and parameterized complexity on the theoretical side. This was meant to create an environment conducive to the exchange of ideas and techniques between different fields of research. Among the goals of the workshop were to develop a better scientific understanding of efficient real-world computation in general, and of the starkly different perspectives of theory and practice on NP-hard problems in particular; to explore new approaches for SAT and other challenging combinatorial problems with the potential to go beyond the current state of the art; and to stimulate technology transfer between SAT and related areas.
This workshop is part of a highly successful series that started at the Banff International Research Station (BIRS) in Canada in 2014, with follow-up editions held at Schloss Dagstuhl in 2015 (Seminar 15171), at the Fields Institute in Toronto, Canada in 2016, and at the BIRS-affiliated Casa Matemática Oaxaca in Oaxaca, Mexico in 2018. After this fifth edition at Schloss Dagstuhl in October 2022, a sixth edition, "Satisfiability: Theory, Practice, and Beyond", was organized at the Simons Institute for the Theory of Computing at UC Berkeley in April 2023 as part of an eponymous two-month scientific program.
Topic of the Workshop
The point of departure for this workshop was one of the most significant problems in all of mathematics and computer science: deciding the satisfiability of logic formulas. This problem is of immense importance both theoretically and practically. On the one hand, it is believed to be intractable in general, and deciding whether this is so is one of the famous million-dollar Clay Millennium Prize Problems (the P vs. NP problem). On the other hand, so-called SAT solvers are today routinely and successfully used to solve large-scale real-world instances in a wide range of application areas (such as hardware and software verification, electronic design automation, artificial intelligence research, cryptography, bioinformatics, and operations research, to name just a few examples).
During the last two decades there have been dramatic and surprising developments in SAT solving technology that have improved real-world performance by many orders of magnitude. However, while modern solvers can often handle formulas with millions of variables, there are also tiny formulas with just a few hundred variables that cause even the very best solvers to stumble. The fundamental question of when SAT solvers perform well or badly, and what underlying properties of the formulas influence performance, remains very poorly understood. Other practical SAT solving issues, such as how to optimize memory management and how to exploit parallelization on modern multicore architectures, are even less well studied and understood from a theoretical point of view.
Perhaps even more surprisingly, the best SAT solvers today are still based on relatively simple methods from the early 1960s (though the introduction of so-called conflict-driven learning in the 1990s was a very important addition), searching for proofs in the so-called resolution proof system. Although other mathematical methods of reasoning are known that are much stronger than resolution in theory, in particular methods based on algebra (Gröbner bases) and geometry (cutting planes), attempts to harness the power of such methods have mostly failed to deliver significant improvements in practical performance for SAT solving. And while resolution is a fairly well-understood proof system, even very basic questions about these stronger algebraic and geometric methods remain wide open.
This is an interesting contrast to developments in neighbouring areas such as constraint programming and mixed integer programming. There, much more powerful methods of reasoning are successfully used to guide the search, but compared to SAT solving, attempts to employ conflict-driven learning have had much less of an impact. Also, while for SAT solvers it is at least possible to understand some aspects of their performance (by analysing proof systems such as resolution), a corresponding theoretical framework for constraint programming and mixed integer linear programming seems to be mostly missing.
In this workshop, we gathered leading researchers working on SAT and other challenging combinatorial optimization problems in order to stimulate an increased exchange of ideas between theoreticians and practitioners. As discussed above, previous editions of this workshop series at the Banff International Research Station, Schloss Dagstuhl, the Fields Institute, and Casa Matemática Oaxaca have had a major impact on the communities involved and have helped to create bridges and stimulate the emergence of a joint research agenda. We are happy to report that the October 2022 workshop at Schloss Dagstuhl fully delivered on the expectation of serving as a valuable next step on this journey. During recent years we have already seen how computational complexity theory can shed light on the power and limitations of current and possible future techniques for SAT and other optimization problems, and how problems encountered on the applied side can spawn interesting new areas of theoretical research. We see great potential for continued interdisciplinary research at the border between theory and practice in this area, and believe that more vigorous interaction between practitioners and theoreticians could have major long-term impact in both academia and industry.
Goals of the Workshop
A strong case can be made for the importance of increased exchange between the two fields of SAT solving on the one hand and proof complexity (and more broadly computational complexity) on the other. Given how many questions would seem to be of mutual interest, it is striking that the level of interaction had been so low until our workshop series started with the first two meetings in Banff in 2014 and Dagstuhl in 2015. Below, we outline some of the concrete questions that served as motivation for organizing our second Dagstuhl workshop in this series in 2022, and for broadening the scope from Boolean satisfiability to combinatorial solving and optimization in general. We want to stress that this list is far from exhaustive, and we believe that one important outcome of the seminar was also to uncover other questions at the intersection of theoretical and applied research, and of different research areas within combinatorial solving and optimization.
What Makes Formulas Hard or Easy in Practice for Modern SAT Solvers?
The best SAT solvers known today are based on the DPLL procedure, augmented with optimizations such as conflict-driven clause learning (CDCL) and restart strategies. The propositional proof system underlying such algorithms, resolution, is arguably the most well-studied system in all of proof complexity.
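For readers from neighbouring fields, the single inference rule of this proof system can be stated explicitly; the following is standard textbook material, not specific to this seminar:

```latex
% The resolution rule: from two clauses with clashing literals x and
% \lnot x, derive their "resolvent" (C and D are disjunctions of literals):
\[
  \frac{C \lor x \qquad\qquad D \lor \lnot x}{C \lor D}
\]
% The conflict analysis of a CDCL solver can be viewed as a sequence of
% such resolution steps, which is why the clauses it learns are always
% resolution consequences of the input formula.
```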
Given the progress during the last decade on solving large-scale instances, it is natural to ask what lies behind the spectacular success of CDCL solvers on these instances. And given that there are still very small formulas that resist even the most powerful CDCL solvers, a complementary question is whether one can determine in advance if a particular formula is hard or tractable. Somewhat unexpectedly, very little turns out to be known about these questions.
In view of the fundamental nature of the SAT problem, and in view of the wide applicability of modern SAT solvers, this seems like a clear example of a question of great practical importance where the theoretical field of proof complexity could potentially provide useful insights. In particular, one can ask whether one could find theoretical complexity measures for formulas that would capture their practical hardness in some nice and clean way. Besides greatly advancing our theoretical understanding, answering such a question could also have applied impact in the longer term by clarifying the limitations, and potential for further improvements, of modern SAT solvers.
Can Proof Complexity Shed Light on Crucial SAT Solving Issues?
Understanding the hardness of proving formulas in practice is not the only problem for which applied researchers would welcome contributions from theoretical computer scientists. Some other practical questions that would benefit from a deeper theoretical understanding follow below.
- Firstly, we would like to study the question of memory management. One major concern for clause learning algorithms is to determine how many clauses to keep in memory. Also, once the algorithm runs out of the memory currently available, one needs to determine which clauses to throw away. These questions can have huge implications for performance, but are poorly understood.
- In addition to clause learning, the concept of restarts is known to have a decisive impact on the performance of modern CDCL solvers. It would be nice to understand theoretically why this is so. The reason why clause learning greatly increases efficiency is clear: without it, the solver will only generate so-called tree-like proofs, and tree-like resolution is known to be exponentially weaker than general resolution. However, there is still ample room for improving our understanding of the role of restarts and of what makes a good restart strategy.
- Given that modern computers are multi-core architectures, a highly topical question is whether this (rather coarse-grained) parallelization can be used to speed up SAT solving. While there have been some highly successful attempts at parallelizing SAT for solving theoretical problems in, e.g., extremal combinatorics, the speed-ups obtained for more applied problems are rather modest or sometimes non-existent. This is a barrier to further adoption of SAT technology already today, and will become a more substantial problem as thousands of cores and cloud computing become the dominant computing platforms in the future. A theoretical understanding of whether and how SAT can be parallelized will be essential for developing new parallelization strategies to adapt SAT to this new computing paradigm.
We believe that progress on any of these questions has the potential of influencing the further development of both theoretical and applied research, and to stimulate a further cross-pollination between these two areas.
Can We Build SAT Solvers Based on Stronger Proof Systems than Resolution?
Although the performance of modern CDCL SAT solvers is impressive, it is nevertheless astonishing, not to say disappointing, that the state-of-the-art solvers are still based on resolution. This method lies close to the bottom in the hierarchy of propositional proof systems, and there are many other proof systems based on different forms of mathematical reasoning that are known to be strictly stronger. Some of these appear to be natural candidates on which to build stronger SAT solvers than those using CDCL.
In particular, proof systems such as polynomial calculus (based on algebraic reasoning) and cutting planes (based on geometry) are known to be exponentially more powerful than resolution. While there has been some work on building SAT solvers on top of these proof systems, progress has been fairly limited. We believe it would be fruitful to discuss what the barriers are that stop us from building stronger algebraic or geometric SAT solvers, and what the potential is for future improvements. An important part of this work would seem to be gaining a deeper theoretical understanding of the power and limitations of these proof methods. Here there are a number of fairly long-standing open theoretical questions. At the same time, proof complexity has made substantial progress in just the last couple of years, giving hope that the time is ripe for decisive breakthroughs in these areas.
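To give a flavour of why cutting planes is stronger than resolution, here is a standard textbook sketch (included for readers from neighbouring fields, not specific to this seminar) of its derivation rules and of how they simulate a resolution step:

```latex
% Cutting planes reasons with linear inequalities over 0/1 variables;
% a clause x \lor y is encoded as the inequality x + y \ge 1.  Two of
% the derivation rules are addition and division with rounding:
\[
  \frac{\sum_i a_i x_i \ge A \qquad \sum_i b_i x_i \ge B}
       {\sum_i (a_i + b_i)\, x_i \ge A + B}
  \qquad\qquad
  \frac{\sum_i c\, a_i x_i \ge A}
       {\sum_i a_i x_i \ge \lceil A / c \rceil}
\]
% Example: from x + y \ge 1 (the clause x \lor y) and x + (1 - y) \ge 1
% (the clause x \lor \lnot y), addition gives 2x \ge 1, and division by 2
% with rounding up yields x \ge 1, mirroring the resolution step.  The
% rounding in the division rule is what gives cutting planes its extra
% power, e.g., short proofs of pigeonhole formulas that require
% exponential-size resolution proofs.
```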
Can Technology Be Transferred Between Different Combinatorial Optimization Paradigms?
Continuing the discussion of stronger methods of reasoning, it is natural to ask whether techniques from, e.g., constraint programming (CP) and mixed integer linear programming (MIP) could be imported into SAT solving, and vice versa. At a high level, the main loop of combinatorial solvers in all of these paradigms consists of two phases:
- During the search phase, the solver makes decisions (guesses) about variable assignments and propagates “obvious” consequences until it either finds a solution or discovers a violated constraint (a conflict).
- In case of a conflict, during the backtracking phase the solver analyses what went wrong and reverses some decision(s) to remove the violation, after which it switches to search again.
For CP and MIP solvers, significant effort is spent during the search phase on making intelligent decisions and deriving (sometimes not so obvious) consequences. In comparison, the backtracking phase is not so sophisticated. For SAT and pseudo-Boolean (PB) solvers it is exactly the other way around. The decisions during the search phase are made quite naively, and since the constraints are relatively simple, propagations cannot be too strong and are fairly easy to detect. Once a conflict is reached, however, an elaborate conflict analysis algorithm combines the constraints involved in the conflict to learn a new, globally valid constraint that is added to the input formula.
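The two-phase loop just described can be sketched in code. The following is a deliberately naive Python illustration, not the implementation of any particular solver: unit propagation plays the role of the search-phase inference, and the conflict analysis simply learns the negation of the current decisions (the "decision clause") rather than the stronger first-UIP clauses that real CDCL solvers derive.

```python
def propagate(clauses, trail):
    """Search phase: repeatedly assign literals forced by unit clauses.
    Literals are non-zero integers, with -v the negation of v.
    Returns a conflicting clause, or None if no conflict arises."""
    assigned = {lit for lit, _ in trail}
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(lit in assigned for lit in clause):
                continue                        # clause already satisfied
            free = [lit for lit in clause if -lit not in assigned]
            if not free:
                return clause                   # all literals falsified: conflict
            if len(free) == 1:
                trail.append((free[0], False))  # forced assignment, not a decision
                assigned.add(free[0])
                changed = True
    return None


def solve(clauses, num_vars):
    """Conflict-driven loop; returns a satisfying set of literals or None."""
    clauses = [list(c) for c in clauses]
    trail = []                                  # chronological (literal, is_decision)
    while True:
        conflict = propagate(clauses, trail)
        if conflict is not None:
            # Backtracking phase: naive conflict analysis that learns the
            # negation of all current decisions, then undoes the last one.
            decisions = [lit for lit, is_dec in trail if is_dec]
            if not decisions:
                return None                     # conflict without decisions: UNSAT
            clauses.append([-lit for lit in decisions])
            last = max(i for i, (_, is_dec) in enumerate(trail) if is_dec)
            del trail[last:]                    # remove last decision and after
        else:
            assigned_vars = {abs(lit) for lit, _ in trail}
            free_vars = [v for v in range(1, num_vars + 1)
                         if v not in assigned_vars]
            if not free_vars:
                return {lit for lit, _ in trail}  # total assignment: SAT
            trail.append((free_vars[0], True))    # naive decision heuristic
```

After backtracking, the freshly learned clause becomes unit and immediately propagates the opposite value of the undone decision, which is what drives the search forward even with this simplistic analysis.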
Could it be possible to make more efficient use of conflict analysis in MIP and CP solving? CDCL-style analysis has already been tried with some success, but since this approach boils down to reasoning with clauses, it is provably exponentially weaker than what pseudo-Boolean techniques can offer. In the other direction, a quite tempting proposition is to integrate into SAT and PB solvers the vastly stronger propagation used in CP and MIP. Another change of perspective would be to turn core-guided MaxSAT solving into a general pseudo-Boolean optimization technique, using PB conflict analysis to extract better cores, and letting these cores serve as a heuristic for introducing new variables in the spirit of extended resolution.
Organization of the Workshop
The scientific program of the seminar consisted of 29 presentations. Among these were 14 50-minute surveys of different core topics of the workshop. These talks occupied most of the morning schedule Monday through Thursday, and were intended to ensure that the diverse audience would have a common background for the more technical talks reporting on recent research projects. The list of survey talks and speakers was as follows:
- SAT: interactions between theory and practice (Olaf Beyersdorff)
- SAT and computational complexity theory (Ryan Williams)
- Proof complexity and SAT solving (Jakob Nordström)
- Efficient proof search in proof complexity, a.k.a. automatability (Susanna de Rezende)
- Satisfiability modulo theories (SMT) solving (Nikolaj Bjørner)
- Quantified Boolean formula (QBF) solving and proof complexity (Meena Mahajan)
- Constraint programming (Ciaran McCreesh)
- Mixed integer linear programming (Ambros Gleixner)
- Algebraic methods for circuit verification (Daniela Kaufmann)
- First-order theorem proving (Laura Kovács and Martin Suda)
- Automated planning (Malte Helmert)
- Formally verified combinatorial solvers (Mathias Fleury)
- Certifying solvers with proof logging (Armin Biere)
- Formally verified proof checking for certifying solvers (Yong Kiam Tan)
The rest of the talks were 25-minute presentations on recent research of the participants. The time after lunch each day was left for self-organized collaborations and discussions, and there was no schedule on Wednesday afternoon.
Based on polling of participants before the seminar week, it was decided to have an open problem session on Thursday afternoon. The poll also asked whether a panel discussion should be organized, but the support for this idea was weaker, and several participants emphasized that the workshop program should not be too dense and that the evenings should be left free of any program. Therefore, the organizers decided not to have a panel discussion. As a nice contribution, some of the participants of the workshop organized a music night on the last evening of the workshop.
The Boolean satisfiability (SAT) problem plays a fascinating dual role in computer science. By the theory of NP-completeness, this problem captures thousands of important applications in different fields, and a rich mathematical theory has been developed showing that all these problems are likely to be infeasible to solve in the worst case. But real-world problems are typically not worst-case, and in recent decades exceedingly efficient algorithms based on so-called conflict-driven clause learning (CDCL) have turned SAT solvers into highly practical tools for solving large-scale real-world problems in a wide range of application areas. Analogous developments have taken place for problems beyond NP such as SAT-based optimization (MaxSAT), pseudo-Boolean optimization, satisfiability modulo theories (SMT) solving, quantified Boolean formula (QBF) solving, constraint programming, and mixed integer programming, where the conflict-driven paradigm has sometimes been added to other powerful techniques.
The current state of the art in combinatorial solving presents a host of exciting challenges at the borderline between theory and practice. Can we gain a deeper scientific understanding of the techniques and heuristics used in modern combinatorial solvers and why they are so successful? Can we develop tools for rigorous analysis of the potential and limitations of these algorithms? Can computational complexity theory be extended to shed light on real-world settings that go beyond worst case? Can more powerful methods of reasoning developed in theoretical research be harnessed to yield improvements in practical performance? And can state-of-the-art combinatorial solvers be enhanced to not only solve problems, but also provide verifiable proofs of correctness for the solutions they produce?
In this Dagstuhl Seminar, we aim to gather leading applied and theoretical researchers working on SAT and combinatorial optimization more broadly in order to stimulate an exchange of ideas and techniques. We see great opportunities for fruitful interplay between theory and practice in these areas, as well as for technology transfer between different paradigms in combinatorial optimization, and believe that a more vigorous interaction has potential for major long-term impact in computer science, as well for applications in industry.
- Jeremias Berg (University of Helsinki, FI)
- Olaf Beyersdorff (Friedrich-Schiller-Universität Jena, DE) [dblp]
- Armin Biere (Universität Freiburg, DE) [dblp]
- Nikolaj S. Bjørner (Microsoft - Redmond, US) [dblp]
- Bart Bogaerts (VU - Brussels, BE)
- Benjamin Böhm (Friedrich-Schiller-Universität Jena, DE)
- Jonas Conneryd (Lund University, SE)
- Susanna de Rezende (Lund University, SE) [dblp]
- Katalin Fazekas (TU Wien, AT)
- Mathias Fleury (Universität Freiburg, DE) [dblp]
- Vijay Ganesh (University of Waterloo, CA) [dblp]
- Gioni Mexi (Zuse Institute Berlin, DE)
- Ambros M. Gleixner (HTW - Berlin, DE) [dblp]
- Malte Helmert (Universität Basel, CH) [dblp]
- Marijn J. H. Heule (Carnegie Mellon University - Pittsburgh, US) [dblp]
- Mikoláš Janota (Czech Technical University - Prague, CZ) [dblp]
- Matti Järvisalo (University of Helsinki, FI) [dblp]
- Daniela Kaufmann (TU Wien, AT) [dblp]
- Antonina Kolokolova (Memorial University of Newfoundland, CA) [dblp]
- Laura Kovács (TU Wien, AT) [dblp]
- Chunxiao (Ian) Li (University of Waterloo, CA)
- Meena Mahajan (The Institute of Mathematical Sciences - Chennai, IN) [dblp]
- Ciaran McCreesh (University of Glasgow, GB)
- Kuldeep S. Meel (National University of Singapore, SG) [dblp]
- Jakob Nordström (University of Copenhagen, DK & Lund University, SE) [dblp]
- Andy Oertel (Lund University, SE)
- Albert Oliveras (UPC Barcelona Tech, ES) [dblp]
- Pavel Pudlák (The Czech Academy of Sciences - Prague, CZ) [dblp]
- Torsten Schaub (Universität Potsdam, DE) [dblp]
- Andre Schidler (TU Wien, AT) [dblp]
- Laurent Simon (University of Bordeaux, FR) [dblp]
- Friedrich Slivovsky (TU Wien, AT) [dblp]
- Martin Suda (Czech Technical University - Prague, CZ) [dblp]
- Stefan Szeider (TU Wien, AT) [dblp]
- Yong Kiam Tan (Infocomm Research - Singapore, SG)
- Dieter Vandesande (VU - Brussels, BE)
- Marc Vinyals (Memorial University of Newfoundland, CA) [dblp]
- Ryan Williams (MIT - Cambridge, US) [dblp]
- Emre Yolcu (Carnegie Mellon University - Pittsburgh, US)
- Dagstuhl Seminar 15171: Theory and Practice of SAT Solving (2015-04-19 - 2015-04-24) (Details)
- Computational Complexity
- Data Structures and Algorithms
- Logic in Computer Science
- Boolean satisfiability
- SAT solving
- computational complexity
- proof complexity
- combinatorial optimization