LIPIcs, Volume 167

5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)




Event

FSCD 2020, June 29 to July 6, 2020, Paris, France (Virtual Conference)

Editor

Zena M. Ariola
  • University of Oregon, Eugene, Oregon, USA

Publication Details

  • published at: 2020-06-28
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-155-9
  • DBLP: db/conf/rta/fscd2020

Documents
Document
Complete Volume
LIPIcs, Volume 167, FSCD 2020, Complete Volume

Authors: Zena M. Ariola


Abstract
LIPIcs, Volume 167, FSCD 2020, Complete Volume

Cite as

5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 1-732, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@Proceedings{ariola:LIPIcs.FSCD.2020,
  title =	{{LIPIcs, Volume 167, FSCD 2020, Complete Volume}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{1--732},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020},
  URN =		{urn:nbn:de:0030-drops-123219},
  doi =		{10.4230/LIPIcs.FSCD.2020},
  annote =	{Keywords: LIPIcs, Volume 167, FSCD 2020, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Zena M. Ariola


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 0:i-0:xx, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{ariola:LIPIcs.FSCD.2020.0,
  author =	{Ariola, Zena M.},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{0:i--0:xx},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.0},
  URN =		{urn:nbn:de:0030-drops-123228},
  doi =		{10.4230/LIPIcs.FSCD.2020.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
Solvability in a Probabilistic Setting (Invited Talk)

Authors: Simona Ronchi Della Rocca, Ugo Dal Lago, and Claudia Faggian


Abstract
The notion of solvability, crucial in the λ-calculus, is conservatively extended to a probabilistic setting, and a complete characterization of it is given. The employed technical tool is a type assignment system, based on non-idempotent intersection types, whose typable terms turn out to be precisely the terms which are solvable with nonnull probability. We also supply an operational characterization of solvable terms, through the notion of head normal form, and a denotational model of Λ_⊕, itself induced by the type system, which equates all the unsolvable terms.
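
A small example may help fix intuitions (our illustration, not drawn from the paper). Write I = λx.x and Ω = (λx.xx)(λx.xx), and let M = I ⊕ Ω be the term that chooses between them with a fair coin. Then M reaches a head normal form with probability 1/2 and diverges otherwise, so M is solvable with nonnull probability and hence typable in such a type system, whereas Ω is solvable with probability 0 and is equated with the other unsolvable terms in the model.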

Cite as

Simona Ronchi Della Rocca, Ugo Dal Lago, and Claudia Faggian. Solvability in a Probabilistic Setting (Invited Talk). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 1:1-1:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{ronchidellarocca_et_al:LIPIcs.FSCD.2020.1,
  author =	{Ronchi Della Rocca, Simona and Dal Lago, Ugo and Faggian, Claudia},
  title =	{{Solvability in a Probabilistic Setting}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{1:1--1:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.1},
  URN =		{urn:nbn:de:0030-drops-123237},
  doi =		{10.4230/LIPIcs.FSCD.2020.1},
  annote =	{Keywords: Probabilistic Computation, Lambda Calculus, Solvability, Intersection Types}
}
Document
Invited Talk
A Modal Analysis of Metaprogramming, Revisited (Invited Talk)

Authors: Brigitte Pientka


Abstract
Metaprogramming is the art of writing programs that produce or manipulate other programs. This opens the possibility to eliminate boilerplate code and exploit domain-specific knowledge to build high-performance programs. Unfortunately, designing language extensions to support type-safe multi-staged metaprogramming remains very challenging. In this talk, we outline a modal type-theoretic foundation for multi-staged metaprogramming which supports the generation and the analysis of polymorphic code. It has two main ingredients: first, we exploit contextual modal types to describe open code together with the context in which it is meaningful; second, we model code as a higher-order abstract syntax (HOAS) tree within a context. These two ideas provide the appropriate abstractions for both generating and pattern matching on open code without committing to a concrete representation of variable binding and contexts. Our work is a first step towards building a general type-theoretic foundation for multi-staged metaprogramming which on the one hand enforces strong type guarantees and on the other hand makes it easy to generate and manipulate code. This will allow us to exploit the full potential of metaprogramming without sacrificing reliability of and trust in the code we are producing and running.
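
The central notation can be illustrated in one line (our rendering of contextual modal types, not verbatim from the talk): a contextual type [x : int ⊢ int] classifies open code such as x + 1, which is meaningful only in a context supplying x, while [⊢ int] classifies closed code such as (λx. x + 1) 41. Because such code is an HOAS tree paired with its context, a metaprogram can generate it, splice it wherever the context can be provided, and pattern match on it, all without committing to a first-order representation of binders.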

Cite as

Brigitte Pientka. A Modal Analysis of Metaprogramming, Revisited (Invited Talk). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 2:1-2:3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{pientka:LIPIcs.FSCD.2020.2,
  author =	{Pientka, Brigitte},
  title =	{{A Modal Analysis of Metaprogramming, Revisited}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{2:1--2:3},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.2},
  URN =		{urn:nbn:de:0030-drops-123242},
  doi =		{10.4230/LIPIcs.FSCD.2020.2},
  annote =	{Keywords: Type systems, Metaprogramming, Modal Type System}
}
Document
Invited Talk
Quotients in Dependent Type Theory (Invited Talk)

Authors: Andrew M. Pitts


Abstract
Constructs that involve taking a quotient are commonplace in mathematics. Here I will consider how they are treated within dependent type theory. The notion of quotient type has its origins in the Nuprl theorem-proving system [R. L. Constable et al., 1986] for extensional type theory. Later Hofmann formulated a version for intensional type theory in his thesis [M. Hofmann, 1995]. This depends on having a pre-existing notion of intensional identity type. Hofmann used Martin-Löf’s notion, the indexed family inductively generated from proofs of reflexivity [B. Nordström et al., 1990]. The recent homotopical view of identity in terms of path types [The Univalent Foundations Program, 2013] gives a more liberal perspective and has brought with it the notion of higher inductive type (HIT) [P. L. Lumsdaine and M. Shulman, 2019], subsuming both inductive and quotient types. Inductively defined indexed families of types (in all their various forms) are perhaps the most useful concept that dependent type theory has contributed to the practice of computer assistance for formalizing mathematical proofs. However, it is often the case that a particular application of such types needs not only to inductively generate a collection of objects, but also to make identifications between the objects. In classical mathematics one can first generate and then identify, using the Axiom of Choice to lift infinitary constructions to the quotient. HITs can allow one to avoid such non-constructive uses of choice by intertwining generation and identification. Perhaps more important than the constructive/non-constructive issue is that simultaneously declaring how to generate and how to identify can be a very natural way of defining some construct from the user’s point of view. This is why HITs promise to be so useful, once we have robust and convenient mechanisms in theorem-proving systems for defining HITs and defining functions out of HITs. Although some HITs have been axiomatized in various systems, at the moment the only system I know of with built-in support for defining quite general forms of HIT and using them is the implementation of cubical type theory [C. Cohen et al., 2018] within recent versions of the Agda system [A. Vezzosi et al., 2019]. The higher-dimensional aspect of identity in cubical type theory is fascinating; nevertheless, the simpler one-dimensional version of identity, in which one has uniqueness of identity proofs (UIP), is adequate for many applications. Although some regard UIP as a bug of early versions of Agda with its original form of dependent pattern matching [T. Coquand, 1992], it is by choice a feature of the Lean prover [J. Avigad et al., 2020]. Altenkirch and Kaposi [T. Altenkirch and A. Kaposi, 2016] have termed the one-dimensional version of HITs quotient inductive types (QITs) and they promise to be very useful even in the setting of type theory with UIP. In this talk I survey some of these developments, including a recent reduction of QITs to quotients [M. P. Fiore et al., 2020], and the prospects for better support in theorem-provers for quotient constructions.
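
Since the talk contrasts the full higher-dimensional setting of cubical Agda with the simpler one-dimensional one, a minimal sketch of a built-in quotient may be useful; the following Lean 4 fragment is our illustration (names and example ours, not from the talk).

-- Natural numbers up to parity, via Lean's built-in quotients.
abbrev ParityRel (a b : Nat) : Prop := a % 2 = b % 2

-- Quot.sound identifies related elements in the quotient.
example : Quot.mk ParityRel 1 = Quot.mk ParityRel 3 := Quot.sound rfl

-- Quot.lift lets a function descend to the quotient once it is
-- shown to respect the relation.
def isOdd : Quot ParityRel → Bool :=
  Quot.lift (fun n => n % 2 == 1) (fun _ _ h => congrArg (· == 1) h)

A QIT goes further: its constructors and its identifications are declared simultaneously, rather than the quotient being taken after the fact.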

Cite as

Andrew M. Pitts. Quotients in Dependent Type Theory (Invited Talk). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 3:1-3:2, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{pitts:LIPIcs.FSCD.2020.3,
  author =	{Pitts, Andrew M.},
  title =	{{Quotients in Dependent Type Theory}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{3:1--3:2},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.3},
  URN =		{urn:nbn:de:0030-drops-123256},
  doi =		{10.4230/LIPIcs.FSCD.2020.3},
  annote =	{Keywords: dependent type theory, quotient, quotient inductive type, theorem-proving systems}
}
Document
Invited Talk
Certifying the Weighted Path Order (Invited Talk)

Authors: René Thiemann, Jonas Schöpf, Christian Sternagel, and Akihisa Yamada


Abstract
The weighted path order (WPO) unifies and extends several termination proving techniques that are known in term rewriting. Consequently, the first tool implementing WPO could prove termination of rewrite systems for which all previous tools failed. However, we should not blindly trust such results, since there might be problems with the implementation or the paper proof of WPO. In this work, we increase the reliability of these automatically generated proofs. To this end, we first formally prove the properties of WPO in Isabelle/HOL, and then develop a verified algorithm to certify termination proofs that are generated by tools using WPO. We also include support for max-polynomial interpretations, an important ingredient in WPO. Here we establish a connection to an existing verified SMT solver. Moreover, we extend the termination tools NaTT and TTT2, so that they can now generate certifiable WPO proofs.

Cite as

René Thiemann, Jonas Schöpf, Christian Sternagel, and Akihisa Yamada. Certifying the Weighted Path Order (Invited Talk). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 4:1-4:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{thiemann_et_al:LIPIcs.FSCD.2020.4,
  author =	{Thiemann, Ren\'{e} and Sch\"{o}pf, Jonas and Sternagel, Christian and Yamada, Akihisa},
  title =	{{Certifying the Weighted Path Order}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{4:1--4:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.4},
  URN =		{urn:nbn:de:0030-drops-123263},
  doi =		{10.4230/LIPIcs.FSCD.2020.4},
  annote =	{Keywords: certification, Isabelle/HOL, reduction order, termination analysis}
}
Document
Efficient Full Higher-Order Unification

Authors: Petar Vukmirović, Alexander Bentkamp, and Visa Nummelin


Abstract
We developed a procedure to enumerate complete sets of higher-order unifiers based on work by Jensen and Pietrzykowski. Our procedure removes many redundant unifiers by carefully restricting the search space and tightly integrating decision procedures for fragments that admit a finite complete set of unifiers. We identify a new such fragment and describe a procedure for computing its unifiers. Our unification procedure is implemented in the Zipperposition theorem prover. Experimental evaluation shows a clear advantage over Jensen and Pietrzykowski’s procedure.
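
A classic flex-rigid example shows why complete sets of unifiers matter (textbook material, our choice of example): the problem F a ≐ g a a, with F a free function variable, has the four incomparable unifiers F ↦ λx. g x x, F ↦ λx. g x a, F ↦ λx. g a x, and F ↦ λx. g a a, while F (f a) ≐ f (F a) admits the infinite complete set F ↦ λx. fⁿ x for every n ≥ 0. A practical procedure must enumerate such sets lazily while pruning the redundant unifiers the search would otherwise generate.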

Cite as

Petar Vukmirović, Alexander Bentkamp, and Visa Nummelin. Efficient Full Higher-Order Unification. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 5:1-5:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{vukmirovic_et_al:LIPIcs.FSCD.2020.5,
  author =	{Vukmirovi\'{c}, Petar and Bentkamp, Alexander and Nummelin, Visa},
  title =	{{Efficient Full Higher-Order Unification}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{5:1--5:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.5},
  URN =		{urn:nbn:de:0030-drops-123271},
  doi =		{10.4230/LIPIcs.FSCD.2020.5},
  annote =	{Keywords: unification, higher-order logic, theorem proving, term rewriting, indexing data structures}
}
Document
Comprehension and Quotient Structures in the Language of 2-Categories

Authors: Paul-André Melliès and Nicolas Rolland


Abstract
Lawvere observed in his celebrated work on hyperdoctrines that the set-theoretic schema of comprehension can be elegantly expressed in the functorial language of categorical logic, as a comprehension structure on the functor p : E → B defining the hyperdoctrine. In this paper, we formulate and study a strictly ordered hierarchy of three notions of comprehension structure on a given functor p : E → B, which we call (i) comprehension structure, (ii) comprehension structure with section, and (iii) comprehension structure with image. Our approach is 2-categorical and we thus formulate the three levels of comprehension structure on a general morphism p : 𝐄 → 𝐁 in a 2-category K. This conceptual point of view on comprehension structures enables us to revisit the work by Fumex, Ghani and Johann on the duality between comprehension structures and quotient structures on a given functor p : E → B. In particular, we show how to lift the comprehension and quotient structures on a functor p : E → B to the categories of algebras or coalgebras associated to functors F_E : E → E and F_B : B → B of interest, in order to interpret reasoning by induction and coinduction in the traditional language of categorical logic, formulated in an appropriate 2-categorical way.
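
The basic set-theoretic instance may serve as orientation (standard material, our wording): for the functor p : E → B that sends a pair (A, P) of a set A and a predicate P on A to the underlying set A, comprehension assigns to each predicate the subset inclusion {a ∈ A | P(a)} ↪ A, and it arises as an adjoint to the functor equipping each set with the always-true predicate. The hierarchy above axiomatizes increasingly rich versions of this situation in a general 2-category.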

Cite as

Paul-André Melliès and Nicolas Rolland. Comprehension and Quotient Structures in the Language of 2-Categories. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 6:1-6:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{mellies_et_al:LIPIcs.FSCD.2020.6,
  author =	{Melli\`{e}s, Paul-Andr\'{e} and Rolland, Nicolas},
  title =	{{Comprehension and Quotient Structures in the Language of 2-Categories}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{6:1--6:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.6},
  URN =		{urn:nbn:de:0030-drops-123282},
  doi =		{10.4230/LIPIcs.FSCD.2020.6},
  annote =	{Keywords: Comprehension structures, quotient structures, comprehension structures with section, comprehension structures with image, 2-categories, formal adjunctions, path objects, categorical logic, inductive reasoning on algebras, coinductive reasoning on coalgebras}
}
Document
A Complete Normal-Form Bisimilarity for Algebraic Effects and Handlers

Authors: Dariusz Biernacki, Sergueï Lenglet, and Piotr Polesiuk


Abstract
We present a complete coinductive syntactic theory for an untyped calculus of algebraic operations and handlers, a relatively recent concept that augments a programming language with unprecedented flexibility to define, combine and interpret computational effects. Our theory takes the form of a normal-form bisimilarity whose soundness w.r.t. contextual equivalence hinges on using so-called context variables to test evaluation contexts comprising normal forms other than values. The theory is formulated in purely syntactic elementary terms and its completeness demonstrates the discriminating power of handlers. It crucially takes advantage of the clean separation of effect-handling code from the effect-raising construct, a distinctive feature of algebraic effects that is not present in other closely related control structures such as delimited-control operators.

Cite as

Dariusz Biernacki, Sergueï Lenglet, and Piotr Polesiuk. A Complete Normal-Form Bisimilarity for Algebraic Effects and Handlers. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 7:1-7:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{biernacki_et_al:LIPIcs.FSCD.2020.7,
  author =	{Biernacki, Dariusz and Lenglet, Sergue\"{i} and Polesiuk, Piotr},
  title =	{{A Complete Normal-Form Bisimilarity for Algebraic Effects and Handlers}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{7:1--7:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.7},
  URN =		{urn:nbn:de:0030-drops-123295},
  doi =		{10.4230/LIPIcs.FSCD.2020.7},
  annote =	{Keywords: algebraic effect, handler, behavioral equivalence, bisimilarity}
}
Document
Pomsets with Boxes: Protection, Separation, and Locality in Concurrent Kleene Algebra

Authors: Paul Brunet and David Pym


Abstract
Concurrent Kleene Algebra is an elegant tool for equational reasoning about concurrent programs. An important feature of concurrent programs that is missing from CKA is the ability to restrict legal interleavings. To remedy this we extend the standard model of CKA, namely pomsets, with a new feature, called boxes, which can specify that part of the system is protected from outside interference. We study the algebraic properties of this new model. Another drawback of CKA is that the language used for expressing properties of programs is the same as that which is used to express programs themselves. This is often too restrictive for practical purposes. We provide a logic, "pomset logic", that is an assertion language for specifying such properties, and which is interpreted on pomsets with boxes. In contrast with other approaches, this logic is not state-based, but rather characterizes the runtime behaviour of a program. We develop the basic metatheory for the relationship between pomset logic and CKA, including frame rules to support local reasoning, and illustrate this relationship with simple examples.
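
Roughly, a box restricts which interleavings are legal (our toy example): in a ∥ (b; c), the event a may be observed before b, between b and c, or after c; boxing the sequential part as a ∥ [b; c] protects it, so a can no longer occur strictly between b and c. Pomset logic then provides assertions about such protected and separated behaviours, with frame rules supporting local reasoning about them.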

Cite as

Paul Brunet and David Pym. Pomsets with Boxes: Protection, Separation, and Locality in Concurrent Kleene Algebra. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 8:1-8:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{brunet_et_al:LIPIcs.FSCD.2020.8,
  author =	{Brunet, Paul and Pym, David},
  title =	{{Pomsets with Boxes: Protection, Separation, and Locality in Concurrent Kleene Algebra}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{8:1--8:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.8},
  URN =		{urn:nbn:de:0030-drops-123308},
  doi =		{10.4230/LIPIcs.FSCD.2020.8},
  annote =	{Keywords: Concurrent Kleene Algebra, Pomsets, Atomicity, Semantics, Separation, Local reasoning, Bunched logic, Frame rules}
}
Document
Undecidability of Semi-Unification on a Napkin

Authors: Andrej Dudenhefner


Abstract
Semi-unification (unification combined with matching) has been proven undecidable by Kfoury, Tiuryn, and Urzyczyn in the 1990s. The original argument reduces Turing machine immortality via Turing machine boundedness to semi-unification. The latter part is technically most challenging, involving several intermediate models of computation. This work presents a novel, simpler reduction from Turing machine boundedness to semi-unification. In contrast to the original argument, we directly translate boundedness to solutions of semi-unification and vice versa. In addition, the reduction is mechanized in the Coq proof assistant, relying on a mechanization-friendly stack machine model that corresponds to space-bounded Turing machines. Taking advantage of the simpler proof, the mechanization is comparatively short and fully constructive.
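
Semi-unification is quickly stated (standard definition, our examples): an instance is a finite set of inequalities s ≤ t between first-order terms, and a solution is a substitution σ together with, for each inequality, a residual substitution ρ such that ρ(σ(s)) = σ(t). For example, x ≤ f(x) is solvable, with σ the identity and ρ = {x ↦ f(x)}, whereas f(x) ≤ x is not, since ρ(σ(f(x))) = f(ρ(σ(x))) is always strictly larger than σ(x). The reduction encodes boundedness of a machine’s computations into the solvability of such constraints.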

Cite as

Andrej Dudenhefner. Undecidability of Semi-Unification on a Napkin. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 9:1-9:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{dudenhefner:LIPIcs.FSCD.2020.9,
  author =	{Dudenhefner, Andrej},
  title =	{{Undecidability of Semi-Unification on a Napkin}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{9:1--9:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.9},
  URN =		{urn:nbn:de:0030-drops-123311},
  doi =		{10.4230/LIPIcs.FSCD.2020.9},
  annote =	{Keywords: undecidability, semi-unification, mechanization}
}
Document
Conditional Bisimilarity for Reactive Systems

Authors: Mathias Hülsbusch, Barbara König, Sebastian Küpper, and Lara Stoltenow


Abstract
Reactive systems à la Leifer and Milner, an abstract categorical framework for rewriting, provide a suitable framework for deriving bisimulation congruences. This is done by synthesizing interactions with the environment in order to obtain a compositional semantics. We enrich the notion of reactive systems by conditions on two levels: first, as in earlier work, we consider rules enriched with application conditions and second, we investigate the notion of conditional bisimilarity. Conditional bisimilarity allows us to say that two system states are bisimilar provided that the environment satisfies a given condition. We present several equivalent definitions of conditional bisimilarity, including one that is useful for concrete proofs and that employs an up-to-context technique, and we compare with related behavioural equivalences. We instantiate reactive systems in order to obtain DPO graph rewriting and consider a case study in this setting.

Cite as

Mathias Hülsbusch, Barbara König, Sebastian Küpper, and Lara Stoltenow. Conditional Bisimilarity for Reactive Systems. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 10:1-10:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{hulsbusch_et_al:LIPIcs.FSCD.2020.10,
  author =	{H\"{u}lsbusch, Mathias and K\"{o}nig, Barbara and K\"{u}pper, Sebastian and Stoltenow, Lara},
  title =	{{Conditional Bisimilarity for Reactive Systems}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{10:1--10:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.10},
  URN =		{urn:nbn:de:0030-drops-123322},
  doi =		{10.4230/LIPIcs.FSCD.2020.10},
  annote =	{Keywords: conditional bisimilarity, reactive systems, up-to context, graph transformation}
}
Document
A Fast Decision Procedure For Uniqueness of Normal Forms w.r.t. Conversion of Shallow Term Rewriting Systems

Authors: Masaomi Yamaguchi and Takahito Aoto


Abstract
Uniqueness of normal forms w.r.t. conversion (UNC) of term rewriting systems (TRSs) guarantees that there are no distinct convertible normal forms. It was recently shown that the UNC property is decidable for shallow TRSs (Radcliffe et al., 2010). The existing procedure mainly consists of testing whether a counterexample exists in a finite set of candidates; however, it suffers from a bottleneck: the sheer number of such candidates. In this paper, we propose a new procedure that checks a smaller number of candidates and enumerates them more efficiently. We prove the correctness of the proposed procedure and analyze its complexity. Furthermore, both procedures have been implemented, and experiments confirm that the proposed procedure runs much faster than the existing one.
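
A two-rule system illustrates the property (our example): in R = {a → b, a → c}, the distinct normal forms b and c are convertible via b ← a → c, so R does not enjoy UNC, while removing either rule restores it. The candidate counterexamples mentioned above are exactly such pairs of convertible distinct normal forms, and the speed-up comes from enumerating fewer candidates, more efficiently.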

Cite as

Masaomi Yamaguchi and Takahito Aoto. A Fast Decision Procedure For Uniqueness of Normal Forms w.r.t. Conversion of Shallow Term Rewriting Systems. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 11:1-11:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{yamaguchi_et_al:LIPIcs.FSCD.2020.11,
  author =	{Yamaguchi, Masaomi and Aoto, Takahito},
  title =	{{A Fast Decision Procedure For Uniqueness of Normal Forms w.r.t. Conversion of Shallow Term Rewriting Systems}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{11:1--11:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.11},
  URN =		{urn:nbn:de:0030-drops-123338},
  doi =		{10.4230/LIPIcs.FSCD.2020.11},
  annote =	{Keywords: uniqueness of normal forms w.r.t. conversion, shallow term rewriting systems, decision procedure}
}
Document
Modules over Monads and Operational Semantics

Authors: André Hirschowitz, Tom Hirschowitz, and Ambroise Lafont


Abstract
This paper is a contribution to the search for efficient and high-level mathematical tools to specify and reason about (abstract) programming languages or calculi. Generalising the reduction monads of Ahrens et al., we introduce transition monads, thus covering new applications such as λ̄μ-calculus, π-calculus, Positive GSOS specifications, differential λ-calculus, and the big-step, simply-typed, call-by-value λ-calculus. Finally, we design a suitable notion of signature for transition monads.

Cite as

André Hirschowitz, Tom Hirschowitz, and Ambroise Lafont. Modules over Monads and Operational Semantics. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 12:1-12:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{hirschowitz_et_al:LIPIcs.FSCD.2020.12,
  author =	{Hirschowitz, Andr\'{e} and Hirschowitz, Tom and Lafont, Ambroise},
  title =	{{Modules over Monads and Operational Semantics}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{12:1--12:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.12},
  URN =		{urn:nbn:de:0030-drops-123341},
  doi =		{10.4230/LIPIcs.FSCD.2020.12},
  annote =	{Keywords: Operational semantics, Category theory}
}
Document
Type Safety of Rewrite Rules in Dependent Types

Authors: Frédéric Blanqui


Abstract
The expressiveness of dependent type theory can be extended by identifying types modulo some additional computation rules. But, for preserving the decidability of type-checking or the logical consistency of the system, one must make sure that those user-defined rewriting rules preserve typing. In this paper, we give a new method to check that property using Knuth-Bendix completion.
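
The property at stake is subject reduction for the user-supplied rules, which a standard example makes concrete (ours, not from the paper): with plus : nat → nat → nat and length-indexed vectors Vec A n, the rule plus 0 n → n preserves typing, so a term of type Vec A (plus 0 n) can safely be reused at type Vec A n; by contrast, a rule such as plus 0 n → true maps a well-typed term to an ill-typed one and destroys both decidability of type-checking and consistency. The check must therefore establish that every typed instance of a left-hand side gives its right-hand side the same type.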

Cite as

Frédéric Blanqui. Type Safety of Rewrite Rules in Dependent Types. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 13:1-13:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{blanqui:LIPIcs.FSCD.2020.13,
  author =	{Blanqui, Fr\'{e}d\'{e}ric},
  title =	{{Type Safety of Rewrite Rules in Dependent Types}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{13:1--13:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.13},
  URN =		{urn:nbn:de:0030-drops-123353},
  doi =		{10.4230/LIPIcs.FSCD.2020.13},
  annote =	{Keywords: subject-reduction, rewriting, dependent types}
}
Document
Refining Constructive Hybrid Games

Authors: Rose Bohrer and André Platzer


Abstract
We extend the constructive differential game logic (CdGL) of hybrid games with a refinement connective that relates two hybrid games. In addition to CdGL’s ability to prove the existence of winning strategies for specific postconditions of hybrid games, game refinements relate two games to one another. That makes it possible to prove that any winning strategy for any postcondition of one game carries over to a winning strategy for the other. Since CdGL is constructive, a computable winning strategy can be extracted from a proof that a player wins a game. A folk theorem says that any such winning strategy for a hybrid game gives rise to a corresponding hybrid system satisfying the same property. We make this precise using CdGL’s game refinements and prove correct the construction of hybrid systems from winning strategies of hybrid games.

Cite as

Rose Bohrer and André Platzer. Refining Constructive Hybrid Games. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 14:1-14:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bohrer_et_al:LIPIcs.FSCD.2020.14,
  author =	{Bohrer, Rose and Platzer, Andr\'{e}},
  title =	{{Refining Constructive Hybrid Games}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{14:1--14:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.14},
  URN =		{urn:nbn:de:0030-drops-123369},
  doi =		{10.4230/LIPIcs.FSCD.2020.14},
  annote =	{Keywords: Hybrid Games, Constructive Logic, Refinement, Game Logic}
}
Document
Data-Flow Analyses as Effects and Graded Monads

Authors: Andrej Ivašković, Alan Mycroft, and Dominic Orchard


Abstract
In static analysis, two frameworks have been studied extensively: monotone data-flow analysis and type-and-effect systems. Whilst both are seen as general analysis frameworks, their relationship has remained unclear. Here we show that monotone data-flow analyses can be encoded as effect systems in a uniform way, via algebras of transfer functions. This helps to answer questions about the most appropriate structure for general effect algebras, especially with regard to capturing control flow precisely. Via the perspective of capturing data-flow analyses, we show that the recent suggestion of using effect quantales is not general enough, as it excludes non-distributive analyses, e.g., constant propagation. By rephrasing the McCarthy transformation, we then model monotone data-flow effects via graded monads. This provides a model of data-flow analyses that can be used to reason about analysis correctness at the semantic level, and to embed data-flow analyses into type systems.
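
The constant-propagation counterexample can be made concrete (standard example, our numbers): take abstract states a = {x ↦ 1, y ↦ 2} and b = {x ↦ 2, y ↦ 1}, and let f be the transfer function of the assignment z := x + y. Then f(a) ⊔ f(b) maps z to the constant 3, while f(a ⊔ b) = f({x ↦ ⊤, y ↦ ⊤}) maps z to ⊤. The analysis is thus monotone, f(a ⊔ b) ⊒ f(a) ⊔ f(b), but not distributive, so any effect algebra that builds in distributivity cannot capture it.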

Cite as

Andrej Ivašković, Alan Mycroft, and Dominic Orchard. Data-Flow Analyses as Effects and Graded Monads. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 15:1-15:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{ivaskovic_et_al:LIPIcs.FSCD.2020.15,
  author =	{Iva\v{s}kovi\'{c}, Andrej and Mycroft, Alan and Orchard, Dominic},
  title =	{{Data-Flow Analyses as Effects and Graded Monads}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{15:1--15:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.15},
  URN =		{urn:nbn:de:0030-drops-123376},
  doi =		{10.4230/LIPIcs.FSCD.2020.15},
  annote =	{Keywords: data-flow analysis, effect systems, graded monads, correctness}
}
Document
A Profunctorial Scott Semantics

Authors: Zeinab Galal


Abstract
In this paper, we study the bicategory of profunctors with the free finite coproduct pseudo-comonad and show that it constitutes a model of linear logic that generalizes the Scott model. We formalize the connection between the two models as a change of base for enriched categories which induces a pseudo-functor that preserves all the linear logic structure. We prove that morphisms in the co-Kleisli bicategory correspond to strongly finitary functors (functors preserving sifted colimits) between presheaf categories. We further show that this model provides solutions to recursive type equations, which yields 2-dimensional models of the pure lambda calculus, and we also exhibit a fixed point operator on terms.

Cite as

Zeinab Galal. A Profunctorial Scott Semantics. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 16:1-16:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{galal:LIPIcs.FSCD.2020.16,
  author =	{Galal, Zeinab},
  title =	{{A Profunctorial Scott Semantics}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{16:1--16:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.16},
  URN =		{urn:nbn:de:0030-drops-123387},
  doi =		{10.4230/LIPIcs.FSCD.2020.16},
  annote =	{Keywords: Linear Logic, Scott Semantics, Profunctors}
}
Document
String Diagrams for Optics

Authors: Guillaume Boisseau


Abstract
Optics are a data representation for compositional data access, with lenses as a popular special case. Hedges has presented a diagrammatic calculus for lenses, but in a way that does not generalize to other classes of optic. We present a calculus that works for all optics, not just lenses; this is done by embedding optics into their presheaf category, which naturally features string diagrams. We apply our calculus to the common case of lenses, extend it to effectful lenses, and explore how the laws of optics manifest in this setting.
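
Lenses, the special case treated at length in the paper, have a concrete presentation as get/update pairs. The following Haskell lines sketch that standard presentation (our illustration, not the paper's diagrammatic one):

data Lens s a = Lens
  { view   :: s -> a        -- read the focus out of the source
  , update :: (a, s) -> s   -- write a new focus back into the source
  }

-- Sequential composition, which a string-diagram calculus renders as
-- nesting one diagram inside the other.
compose :: Lens s a -> Lens a b -> Lens s b
compose outer inner = Lens
  { view   = view inner . view outer
  , update = \(b, s) -> update outer (update inner (b, view outer s), s)
  }

The lens laws (get-put, put-get, put-put) are then equations between such composites, which a diagrammatic calculus can state and manipulate graphically.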

Cite as

Guillaume Boisseau. String Diagrams for Optics. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 17:1-17:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{boisseau:LIPIcs.FSCD.2020.17,
  author =	{Boisseau, Guillaume},
  title =	{{String Diagrams for Optics}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{17:1--17:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.17},
  URN =		{urn:nbn:de:0030-drops-123399},
  doi =		{10.4230/LIPIcs.FSCD.2020.17},
  annote =	{Keywords: Optic, string diagram, lens, category theory, Yoneda lemma}
}
Document
A Reflection on Continuation-Composing Style

Authors: Dariusz Biernacki, Mateusz Pyzik, and Filip Sieczkowski


Abstract
We present a study of the continuation-composing style (CCS) that describes the image of the CPS translation of Danvy and Filinski’s shift and reset delimited-control operators. In CCS, continuations are composable rather than abortive as in the traditional CPS, and, therefore, the structure of terms is considerably more complex. We show that the CPS translation from Moggi’s computational lambda calculus extended with shift and reset has a right inverse and that the two translations form a reflection, i.e., a Galois connection in which the target is isomorphic to a subset of the source (the orders are given by the reduction relations). Furthermore, we use this result to show that Plotkin’s call-by-value lambda calculus extended with shift and reset is isomorphic to the image of the CPS translation. This result, in particular, provides a first direct-style transformation for delimited continuations that is an inverse of the CPS transformation up to syntactic identity.
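
A one-line computation recalls how the operators behave (standard example): in ⟨1 + 𝒮k. k (k 2)⟩, with ⟨−⟩ for reset and 𝒮 for shift, the captured delimited continuation is k = λv. ⟨1 + v⟩, and it is composable rather than abortive: k (k 2) evaluates to ⟨1 + ⟨1 + 2⟩⟩, i.e. to 4. This composition of continuations is precisely the extra structure that terms in the image of the CPS translation exhibit, and that CCS makes explicit.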

Cite as

Dariusz Biernacki, Mateusz Pyzik, and Filip Sieczkowski. A Reflection on Continuation-Composing Style. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 18:1-18:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{biernacki_et_al:LIPIcs.FSCD.2020.18,
  author =	{Biernacki, Dariusz and Pyzik, Mateusz and Sieczkowski, Filip},
  title =	{{A Reflection on Continuation-Composing Style}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{18:1--18:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.18},
  URN =		{urn:nbn:de:0030-drops-123402},
  doi =		{10.4230/LIPIcs.FSCD.2020.18},
  annote =	{Keywords: delimited control, continuation-passing style, reflection, call-by-value lambda calculus, computational lambda calculus}
}
Document
A Probabilistic Higher-Order Fixpoint Logic

Authors: Yo Mitani, Naoki Kobayashi, and Takeshi Tsukada


Abstract
We introduce PHFL, a probabilistic extension of higher-order fixpoint logic, which can also be regarded as a higher-order extension of probabilistic temporal logics such as PCTL and the μ^p-calculus. We show that PHFL is strictly more expressive than the μ^p-calculus, and that the PHFL model-checking problem for finite Markov chains is undecidable even for the μ-only, order-1 fragment of PHFL. Furthermore, the full PHFL is far more expressive: we give a translation from Lubarsky’s μ-arithmetic to PHFL, which implies that PHFL model checking is Π¹₁-hard and Σ¹₁-hard. As a positive result, we characterize a decidable fragment of the PHFL model-checking problems using a novel type system.

Cite as

Yo Mitani, Naoki Kobayashi, and Takeshi Tsukada. A Probabilistic Higher-Order Fixpoint Logic. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 19:1-19:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{mitani_et_al:LIPIcs.FSCD.2020.19,
  author =	{Mitani, Yo and Kobayashi, Naoki and Tsukada, Takeshi},
  title =	{{A Probabilistic Higher-Order Fixpoint Logic}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{19:1--19:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.19},
  URN =		{urn:nbn:de:0030-drops-123413},
  doi =		{10.4230/LIPIcs.FSCD.2020.19},
  annote =	{Keywords: Probabilistic logics, higher-order fixpoint logic, model checking}
}
Document
Adaptive Non-Linear Pattern Matching Automata

Authors: Rick Erkens and Maurice Laveaux


Abstract
Efficient pattern matching is fundamental for practical term rewrite engines. By preprocessing the given patterns into a finite deterministic automaton, the matching patterns can be decided in a single traversal of the relevant parts of the input term. Most automaton-based techniques are restricted to linear patterns, where each variable occurs at most once, and require an additional post-processing step to check so-called variable consistency. However, we show that interleaving the variable-consistency and pattern-matching phases can reduce the number of steps required to find all matches. We therefore take the adaptive pattern matching automata introduced by Sekar et al. and extend them with consistency checks. We prove that the resulting deterministic pattern matching automaton is correct, and show that its evaluation depth can be shorter than that of two-phase approaches.
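
The consistency issue for non-linear patterns is easy to see: the pattern f(x, x) matches f(s, t) only if s and t are equal. The following naive Haskell matcher (our illustration; the paper instead compiles patterns into an automaton) shows the interleaving of binding and consistency checking:

import Control.Monad (foldM)

data Term = Var String | App String [Term]
  deriving (Eq, Show)

-- Match a possibly non-linear pattern against a subject term, threading
-- the substitution so repeated variables are checked as soon as met.
match :: [(String, Term)] -> Term -> Term -> Maybe [(String, Term)]
match sub (Var v) t =
  case lookup v sub of
    Nothing -> Just ((v, t) : sub)   -- first occurrence: bind v
    Just t'
      | t' == t   -> Just sub        -- repeated occurrence: consistent
      | otherwise -> Nothing         -- inconsistent: fail immediately
match sub (App f ps) (App g ts)
  | f == g && length ps == length ts =
      foldM (\s (p, t) -> match s p t) sub (zip ps ts)
match _ _ _ = Nothing

For instance, matching the pattern f(x, x) with the empty substitution succeeds on f(a, a) but fails on f(a, b) as soon as the second argument is reached, with no separate consistency phase.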

Cite as

Rick Erkens and Maurice Laveaux. Adaptive Non-Linear Pattern Matching Automata. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 20:1-20:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{erkens_et_al:LIPIcs.FSCD.2020.20,
  author =	{Erkens, Rick and Laveaux, Maurice},
  title =	{{Adaptive Non-Linear Pattern Matching Automata}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{20:1--20:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.20},
  URN =		{urn:nbn:de:0030-drops-123427},
  doi =		{10.4230/LIPIcs.FSCD.2020.20},
  annote =	{Keywords: Pattern matching, Term indexing, Tree automata}
}
Document
On Average-Case Hardness of Higher-Order Model Checking

Authors: Yoshiki Nakamura, Kazuyuki Asada, Naoki Kobayashi, Ryoma Sin'ya, and Takeshi Tsukada


Abstract
We study a mixture between the average-case and worst-case complexities of higher-order model checking, the problem of deciding whether the tree generated by a given λY-term (or equivalently, a higher-order recursion scheme) satisfies the property expressed by a given tree automaton. Higher-order model checking has recently been studied extensively in the context of higher-order program verification. Although the worst-case complexity of the problem is k-EXPTIME complete for order-k terms, various higher-order model checkers have been developed that run efficiently for typical inputs, and program verification tools have been constructed on top of them. One may, therefore, hope that higher-order model checking can be solved efficiently in the average case, despite the worst-case complexity. We provide a negative result, by showing that, under certain assumptions, for almost every term, the higher-order model checking problem specialized for the term is k-EXPTIME hard with respect to the size of automata. The proof is based on a novel intersection type system that characterizes terms that do not contain any useless subterms.

Cite as

Yoshiki Nakamura, Kazuyuki Asada, Naoki Kobayashi, Ryoma Sin'ya, and Takeshi Tsukada. On Average-Case Hardness of Higher-Order Model Checking. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 21:1-21:23, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{nakamura_et_al:LIPIcs.FSCD.2020.21,
  author =	{Nakamura, Yoshiki and Asada, Kazuyuki and Kobayashi, Naoki and Sin'ya, Ryoma and Tsukada, Takeshi},
  title =	{{On Average-Case Hardness of Higher-Order Model Checking}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{21:1--21:23},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.21},
  URN =		{urn:nbn:de:0030-drops-123439},
  doi =		{10.4230/LIPIcs.FSCD.2020.21},
  annote =	{Keywords: Higher-order model checking, average-case complexity, intersection type system}
}
Document
Size-Preserving Translations from Order-(n+1) Word Grammars to Order-n Tree Grammars

Authors: Kazuyuki Asada and Naoki Kobayashi


Abstract
Higher-order grammars have recently been studied actively in the context of automated verification of higher-order programs. Asada and Kobayashi have previously shown that, for any order-(n+1) word grammar, there exists an order-n grammar whose frontier language coincides with the language generated by the word grammar. Their translation, however, blows up the size of the grammar, which inhibited complexity-preserving reductions from decision problems on word grammars to those on tree grammars. In this paper, we present a new translation from order-(n+1) word grammars to order-n tree grammars that is size-preserving in the sense that the size of the output tree grammar is polynomial in the size of the input word grammar. The new translation and its correctness proof are arguably much simpler than the previous translation and proof.
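
The case n = 0 is classical and gives the right intuition (our remark): the frontier of a tree is the word spelled by its leaves, and the frontier languages of the regular tree languages (order-0 tree grammars) are exactly the context-free languages (order-1 word grammars). The paper pushes this correspondence up the hierarchy while keeping the output polynomial in the input, so that complexity results transfer between decision problems on word and tree grammars.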

Cite as

Kazuyuki Asada and Naoki Kobayashi. Size-Preserving Translations from Order-(n+1) Word Grammars to Order-n Tree Grammars. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 22:1-22:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{asada_et_al:LIPIcs.FSCD.2020.22,
  author =	{Asada, Kazuyuki and Kobayashi, Naoki},
  title =	{{Size-Preserving Translations from Order-(n+1) Word Grammars to Order-n Tree Grammars}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{22:1--22:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.22},
  URN =		{urn:nbn:de:0030-drops-123440},
  doi =		{10.4230/LIPIcs.FSCD.2020.22},
  annote =	{Keywords: higher-order grammar, word language, tree language, complexity}
}
Document
A Syntax for Mutual Inductive Families

Authors: Ambrus Kaposi and Jakob von Raumer


Abstract
Inductive families of types are a feature of most languages based on dependent types. They are usually described either by syntactic schemes or by encodings of strictly positive functors such as combinator languages or containers. The former approaches are informal and give only external signatures, the latter approaches suffer from encoding overheads and do not directly represent mutual types. In this paper we propose a direct method for describing signatures for mutual inductive families using a domain-specific type theory. A signature is a context (roughly speaking, a list of types) in this small type theory. Algebras, displayed algebras and sections are defined by models of this type theory: the standard model, the logical predicate and a logical relation interpretation, respectively. We reduce the existence of initial algebras for these signatures to the existence of the syntax of our domain-specific type theory. As this theory is very simple, its normal syntax can be encoded using indexed W-types. To the best of our knowledge, this is the first formalisation of the folklore fact that mutual inductive types can be reduced to indexed W-types. The contents of this paper were formalised in the proof assistant Agda.
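
A canonical mutual inductive family is the tree/forest pair, shown here in Haskell for brevity (our example; in the paper such a signature is a context in the domain-specific type theory):

-- Two types declared by mutual recursion.
data Tree   = Node Int Forest deriving Show
data Forest = Leaf | Cons Tree Forest deriving Show

-- A pair of functions defined by mutual recursion over the family.
sizeTree :: Tree -> Int
sizeTree (Node _ f) = 1 + sizeForest f

sizeForest :: Forest -> Int
sizeForest Leaf       = 0
sizeForest (Cons t f) = sizeTree t + sizeForest f

Encodings via strictly positive functors must simulate such mutual blocks indirectly, which is exactly the overhead the paper's direct signatures avoid.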

Cite as

Ambrus Kaposi and Jakob von Raumer. A Syntax for Mutual Inductive Families. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 23:1-23:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{kaposi_et_al:LIPIcs.FSCD.2020.23,
  author =	{Kaposi, Ambrus and von Raumer, Jakob},
  title =	{{A Syntax for Mutual Inductive Families}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{23:1--23:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.23},
  URN =		{urn:nbn:de:0030-drops-123451},
  doi =		{10.4230/LIPIcs.FSCD.2020.23},
  annote =	{Keywords: type theory, inductive types, mutual inductive types, W-types, Agda}
}
Document
Towards Constructive Hybrid Semantics

Authors: Tim Lukas Diezel and Sergey Goncharov


Abstract
With hybrid systems becoming ever more pervasive, the underlying semantic challenges emerge in their entirety. The need for principled semantic foundations has been recognized previously in the case of discrete computation and discrete data, with subsequent implementations in programming languages and proof assistants. Hybrid systems, contrastingly, do not directly fit into the classical semantic paradigms due to the presence of quite specific "non-programmable" features, such as Zeno behaviour and an indispensable reliance on a notion of continuous time. Here, we analyze the phenomenon of hybrid semantics from a constructive viewpoint. In doing so, we propose a monad-based semantics, generic over a given ordered monoid representing the time domain, hence abstracting from the monoid of constructive reals. We implement our construction as a higher inductive-inductive type in the recent cubical extension of the Agda proof assistant, making significant use of state-of-the-art advances in homotopy type theory. We show that classically, i.e. under the axiom of choice, our construction admits a characterization in terms of directed sequence completion.
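
For intuition, here is a drastically simplified Haskell sketch of a monad parametrized by a monoid of durations (essentially the writer monad; the paper's higher inductive-inductive construction additionally handles continuous trajectories, Zeno behaviour, and iteration, none of which appear here).

-- Simplified sketch (mine, not the paper's construction): computations
-- that accumulate elapsed time from an arbitrary duration monoid t.
newtype Timed t a = Timed { runTimed :: (t, a) }

instance Functor (Timed t) where
  fmap f (Timed (d, a)) = Timed (d, f a)

instance Monoid t => Applicative (Timed t) where
  pure a = Timed (mempty, a)
  Timed (d1, f) <*> Timed (d2, a) = Timed (d1 <> d2, f a)

instance Monoid t => Monad (Timed t) where
  Timed (d1, a) >>= k = let Timed (d2, b) = k a in Timed (d1 <> d2, b)

-- Block for a given duration, producing no interesting value.
wait :: t -> Timed t ()
wait d = Timed (d, ())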

Cite as

Tim Lukas Diezel and Sergey Goncharov. Towards Constructive Hybrid Semantics. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 24:1-24:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{diezel_et_al:LIPIcs.FSCD.2020.24,
  author =	{Diezel, Tim Lukas and Goncharov, Sergey},
  title =	{{Towards Constructive Hybrid Semantics}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{24:1--24:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.24},
  URN =		{urn:nbn:de:0030-drops-123466},
  doi =		{10.4230/LIPIcs.FSCD.2020.24},
  annote =	{Keywords: Hybrid semantics, Elgot iteration, Homotopy type theory, Agda}
}
Document
A Gentzen-Style Monadic Translation of Gödel’s System T

Authors: Chuangjie Xu


Abstract
We introduce a syntactic translation of Gödel’s System 𝖳 parametrized by a weak notion of monad, and prove a corresponding fundamental theorem of logical relations. Our translation structurally corresponds to Gentzen’s negative translation of classical logic. By instantiating the monad and the logical relation, we reveal the well-known properties and structures of 𝖳-definable functionals including majorizability, continuity and bar recursion. Our development has been formalized in the Agda proof assistant.
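
To convey the flavour in a shallow embedding (a sketch with our own names, not the paper's syntactic translation, which targets a weaker notion of monad): after a monadic translation, the recursor of System 𝖳 operates on computations rather than values.

-- Sketch (mine): a monadic recursor for System T naturals, generic in m.
-- Assumes the scrutinee evaluates to a non-negative machine integer.
recT :: Monad m => m a -> (m Int -> m a -> m a) -> m Int -> m a
recT z s n = n >>= go
  where
    go 0 = z
    go k = s (pure (k - 1)) (go (k - 1))

-- Example: addition of two (possibly effectful) numbers.
addT :: Monad m => m Int -> m Int -> m Int
addT m n = recT m (\_ r -> fmap (+ 1) r) n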

Cite as

Chuangjie Xu. A Gentzen-Style Monadic Translation of Gödel’s System T. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 25:1-25:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{xu:LIPIcs.FSCD.2020.25,
  author =	{Xu, Chuangjie},
  title =	{{A Gentzen-Style Monadic Translation of G\"{o}del’s System T}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{25:1--25:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.25},
  URN =		{urn:nbn:de:0030-drops-123472},
  doi =		{10.4230/LIPIcs.FSCD.2020.25},
  annote =	{Keywords: monadic translation, G\"{o}del’s System T, logical relation, negative translation, majorizability, continuity, bar recursion, Agda}
}
Document
Unital Anti-Unification: Type and Algorithms

Authors: David M. Cerna and Temur Kutsia


Abstract
Unital equational theories are defined by axioms that assert the existence of the unit element for some function symbols. We study anti-unification (AU) in unital theories and address the problems of establishing generalization type and designing anti-unification algorithms. First, we prove that when the term signature contains at least two unital functions, anti-unification is of the nullary type, by exhibiting an AU problem that has no minimal complete set of generalizations. Next, we consider two special cases, the linear variant and the fragment with only one unital symbol, and design AU algorithms for them. The algorithms are terminating, sound, and complete, and return tree grammars from which the set of generalizations can be constructed. Anti-unification for both special cases is finitary. Further, the algorithm for the one-unital fragment is extended to the unrestricted case. It terminates and returns a tree grammar which produces an infinite set of generalizations. Finally, we discuss how the nullary type of unital anti-unification might affect the anti-unification problem in some combined theories, and list some open questions.
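
For readers new to anti-unification, here is a Haskell sketch of the classical syntactic case (Plotkin/Reynolds-style least general generalization, with illustrative names); the paper's unital algorithms refine this considerably and return tree grammars rather than single terms.

import qualified Data.Map as M

data Term = Var String | Fun String [Term] deriving (Eq, Ord, Show)

-- Classical syntactic least general generalization (sketch, mine).
-- A table maps disagreement pairs to generalization variables, so that
-- lgg of f(a,a) and f(b,b) is f(X0,X0) rather than f(X0,X1).
lgg :: Term -> Term -> Term
lgg s0 t0 = fst (go s0 t0 M.empty)
  where
    go (Fun f ss) (Fun g ts) tbl
      | f == g && length ss == length ts =
          let step (args, m) (s, t) = let (a, m') = go s t m
                                      in (args ++ [a], m')
              (as, tbl') = foldl step ([], tbl) (zip ss ts)
          in (Fun f as, tbl')
    go s t tbl = case M.lookup (s, t) tbl of
      Just x  -> (Var x, tbl)
      Nothing -> let x = "X" ++ show (M.size tbl)
                 in (Var x, M.insert (s, t) x tbl)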

Cite as

David M. Cerna and Temur Kutsia. Unital Anti-Unification: Type and Algorithms. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 26:1-26:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{cerna_et_al:LIPIcs.FSCD.2020.26,
  author =	{Cerna, David M. and Kutsia, Temur},
  title =	{{Unital Anti-Unification: Type and Algorithms}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{26:1--26:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.26},
  URN =		{urn:nbn:de:0030-drops-123482},
  doi =		{10.4230/LIPIcs.FSCD.2020.26},
  annote =	{Keywords: Anti-unification, tree grammars, unital theories, collapse theories}
}
Document
Symbolic Execution Game Semantics

Authors: Yu-Yang Lin and Nikos Tzevelekos


Abstract
We present a framework for symbolically executing and model checking higher-order programs with external (open) methods. We focus on the client-library paradigm, and in particular we aim to check libraries with respect to any definable client. We combine traditional symbolic execution techniques with operational game semantics to build a symbolic execution semantics that captures arbitrary external behaviour. We prove the symbolic semantics to be sound and complete. Imposing bounds on the depth of recursion and callbacks then yields a bounded checking technique. We provide an implementation of our technique in the 𝕂 framework and showcase its performance on a custom benchmark based on higher-order coding errors such as reentrancy bugs.
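
A toy illustration of the symbolic-execution ingredient (ours, with invented names; it omits the game-semantic treatment of external higher-order code that is the paper's contribution): evaluation forks at branches on symbolic values and accumulates path conditions.

-- Toy sketch (mine): first-order symbolic execution with path conditions.
data SVal = SInt Int | SSym String | SAdd SVal SVal deriving Show
data Cond = IsZero SVal | NonZero SVal deriving Show

data Prog
  = Lit Int
  | Input                  -- an unknown, symbolic input
  | Add Prog Prog
  | IfZero Prog Prog Prog  -- branch on whether the first argument is zero

-- Each result pairs a path condition with a symbolic result value.
exec :: Prog -> [([Cond], SVal)]
exec p = go p []
  where
    go (Lit n)   pc = [(pc, SInt n)]
    go Input     pc = [(pc, SSym "x")]
    go (Add a b) pc =
      [ (pc2, SAdd v1 v2) | (pc1, v1) <- go a pc, (pc2, v2) <- go b pc1 ]
    go (IfZero c t e) pc =
      [ r | (pc1, v) <- go c pc
          , r <- go t (IsZero v : pc1) ++ go e (NonZero v : pc1) ]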

Cite as

Yu-Yang Lin and Nikos Tzevelekos. Symbolic Execution Game Semantics. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 27:1-27:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{lin_et_al:LIPIcs.FSCD.2020.27,
  author =	{Lin, Yu-Yang and Tzevelekos, Nikos},
  title =	{{Symbolic Execution Game Semantics}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{27:1--27:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.27},
  URN =		{urn:nbn:de:0030-drops-123493},
  doi =		{10.4230/LIPIcs.FSCD.2020.27},
  annote =	{Keywords: game semantics, symbolic execution, higher-order open programs}
}
Document
Strongly Normalizing Higher-Order Relational Queries

Authors: Wilmer Ricciotti and James Cheney


Abstract
Language-integrated query is a powerful programming construct allowing database queries and ordinary program code to interoperate seamlessly and safely. Language-integrated query techniques rely on classical results about monadic comprehension calculi, including the conservativity theorem for nested relational calculus. Conservativity implies that query expressions can freely use nesting and unnesting, yet as long as the query result type is a flat relation, these capabilities do not lead to an increase in expressiveness over flat relational queries. Wong showed how such queries can be translated to SQL via a constructive rewriting algorithm, and Cooper and others advocated higher-order nested relational calculi as a basis for language-integrated queries in functional languages such as Links and F#. However, there is no published proof of the central strong normalization property for higher-order nested relational queries: a previous proof attempt does not deal correctly with rewrite rules that duplicate subterms. This paper fills the gap in the literature, explaining the difficulty with the previous attempt, and showing how to extend the ⊤⊤-lifting approach of Lindley and Stark to accommodate duplicating rewrites. We also sketch how to extend the proof to a recently introduced calculus for heterogeneous queries mixing set and multiset semantics.
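
To see why duplicating rewrites are the crux, consider the standard rule that pushes a comprehension through a union. In the list model of comprehensions (a Haskell sketch, ours) both sides agree semantically, but as a left-to-right rewrite the rule copies the comprehension body:

-- Sketch (mine): a duplicating rewrite from comprehension normalization,
--   for (x <- m1 ∪ m2) n  ~>  (for (x <- m1) n) ∪ (for (x <- m2) n)
-- The body n is duplicated, which is what naive reducibility arguments
-- fail to handle and the ⊤⊤-lifting extension accommodates.
lhs, rhs :: [a] -> [a] -> (a -> [b]) -> [b]
lhs m1 m2 n = concatMap n (m1 ++ m2)
rhs m1 m2 n = concatMap n m1 ++ concatMap n m2
-- lhs == rhs by distributivity of concatMap over (++).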

Cite as

Wilmer Ricciotti and James Cheney. Strongly Normalizing Higher-Order Relational Queries. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 28:1-28:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{ricciotti_et_al:LIPIcs.FSCD.2020.28,
  author =	{Ricciotti, Wilmer and Cheney, James},
  title =	{{Strongly Normalizing Higher-Order Relational Queries}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{28:1--28:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.28},
  URN =		{urn:nbn:de:0030-drops-123506},
  doi =		{10.4230/LIPIcs.FSCD.2020.28},
  annote =	{Keywords: Strong normalization, ⊤⊤-lifting, Nested relational calculus, Language-integrated query}
}
Document
Semi-Axiomatic Sequent Calculus

Authors: Henry DeYoung, Frank Pfenning, and Klaas Pruiksma


Abstract
We present the semi-axiomatic sequent calculus (SAX) that blends features of Gentzen’s sequent calculus with an axiomatic formulation of intuitionistic logic. We develop and prove a suitable analogue to cut elimination and then show that a natural computational interpretation of SAX provides a simple form of shared memory concurrency.
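
For orientation, recall the shape of Gentzen's cut rule (a standard presentation; SAX's own rules differ in detail, replacing certain left and right rules with axioms):

\[
  \frac{\Gamma \vdash A \qquad \Delta, A \vdash C}{\Gamma, \Delta \vdash C}\ \mathsf{cut}
\]

Roughly, the computational reading in SAX interprets a cut as allocating a shared memory cell for A, with the first premise acting as its writer and the second as its reader; this is the simple form of shared memory concurrency mentioned above.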

Cite as

Henry DeYoung, Frank Pfenning, and Klaas Pruiksma. Semi-Axiomatic Sequent Calculus. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 29:1-29:22, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{deyoung_et_al:LIPIcs.FSCD.2020.29,
  author =	{DeYoung, Henry and Pfenning, Frank and Pruiksma, Klaas},
  title =	{{Semi-Axiomatic Sequent Calculus}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{29:1--29:22},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.29},
  URN =		{urn:nbn:de:0030-drops-123515},
  doi =		{10.4230/LIPIcs.FSCD.2020.29},
  annote =	{Keywords: Sequent calculus, Curry-Howard isomorphism, shared memory concurrency}
}
Document
Constraint Solving over Multiple Similarity Relations

Authors: Besik Dundua, Temur Kutsia, Mircea Marin, and Cleopatra Pau


Abstract
Similarity relations are reflexive, symmetric, and transitive fuzzy relations. They help to make approximate inferences, replacing the notion of equality. Similarity-based unification has been quite intensively investigated, as a core computational method for approximate reasoning and declarative programming. In this paper we consider solving constraints over several similarity relations, instead of a single one. Multiple similarities pose challenges to constraint solving, since we can no longer rely on transitivity. Existing methods for unification with fuzzy proximity relations (reflexive, symmetric, non-transitive relations) do not provide a solution that adequately reflects the particularities of dealing with multiple similarities. To address this problem, we develop a constraint solving algorithm for multiple similarity relations, prove its termination, soundness, and completeness properties, and discuss applications.
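
As background, the standard single-similarity setting extends a fuzzy relation on function symbols to terms via the minimum t-norm (a Haskell sketch with our own names; the paper's harder multi-similarity setting cannot simply take such a pointwise minimum across distinct relations):

-- Background sketch (mine): similarity on symbols extended to terms
-- with the minimum t-norm, as in classical similarity-based unification.
data Term = Var String | Fun String [Term] deriving (Eq, Show)

type Sim = String -> String -> Double  -- symbol similarity, in [0,1]

termSim :: Sim -> Term -> Term -> Double
termSim base (Fun f ss) (Fun g ts)
  | length ss == length ts =
      minimum (base f g : zipWith (termSim base) ss ts)
termSim _ (Var x) (Var y) | x == y = 1
termSim _ _ _ = 0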

Cite as

Besik Dundua, Temur Kutsia, Mircea Marin, and Cleopatra Pau. Constraint Solving over Multiple Similarity Relations. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 30:1-30:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{dundua_et_al:LIPIcs.FSCD.2020.30,
  author =	{Dundua, Besik and Kutsia, Temur and Marin, Mircea and Pau, Cleopatra},
  title =	{{Constraint Solving over Multiple Similarity Relations}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{30:1--30:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.30},
  URN =		{urn:nbn:de:0030-drops-123522},
  doi =		{10.4230/LIPIcs.FSCD.2020.30},
  annote =	{Keywords: Fuzzy relations, similarity, constraint solving}
}
Document
Encoding Agda Programs Using Rewriting

Authors: Guillaume Genestier


Abstract
We present an encoding of two common features, universe polymorphism and eta-convertibility, in an extension with rewriting of the Edinburgh Logical Framework (LF) [Harper et al., 1993]. This encoding is at the root of the translator between Agda and Dedukti developed by the author.

Cite as

Guillaume Genestier. Encoding Agda Programs Using Rewriting. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 31:1-31:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{genestier:LIPIcs.FSCD.2020.31,
  author =	{Genestier, Guillaume},
  title =	{{Encoding Agda Programs Using Rewriting}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{31:1--31:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.31},
  URN =		{urn:nbn:de:0030-drops-123530},
  doi =		{10.4230/LIPIcs.FSCD.2020.31},
  annote =	{Keywords: Logical Frameworks, Rewriting, Universe Polymorphism, Eta Conversion}
}
Document
The Difference λ-Calculus: A Language for Difference Categories

Authors: Mario Alvarez-Picallo and C.-H. Luke Ong


Abstract
Cartesian difference categories are a recent generalisation of Cartesian differential categories which introduce a notion of "infinitesimal" arrows satisfying an analogue of the Kock-Lawvere axiom, with the axioms of a Cartesian differential category being satisfied only "up to an infinitesimal perturbation". In this work, we construct a simply-typed calculus in the spirit of the differential λ-calculus equipped with syntactic "infinitesimals" and show how its models correspond to difference λ-categories, a family of Cartesian difference categories equipped with suitably well-behaved exponentials.
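
The prototypical model is worth keeping in mind (standard for change actions; presentation ours): over the integers the "derivative" is the finite difference, and the analogue of the Kock-Lawvere equation holds exactly rather than only up to higher-order terms.

-- Finite-difference model of change actions (sketch, mine): deriv f x dx
-- is the change in f, so f (x + dx) == f x + deriv f x dx on the nose.
deriv :: (Int -> Int) -> Int -> Int -> Int
deriv f x dx = f (x + dx) - f x

-- E.g. deriv (\x -> x * x) x dx = 2*x*dx + dx*dx; the dx*dx term is the
-- "infinitesimal perturbation" that difference categories keep track of.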

Cite as

Mario Alvarez-Picallo and C.-H. Luke Ong. The Difference λ-Calculus: A Language for Difference Categories. In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 32:1-32:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{alvarezpicallo_et_al:LIPIcs.FSCD.2020.32,
  author =	{Alvarez-Picallo, Mario and Ong, C.-H. Luke},
  title =	{{The Difference \lambda-Calculus: A Language for Difference Categories}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{32:1--32:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.32},
  URN =		{urn:nbn:de:0030-drops-123549},
  doi =		{10.4230/LIPIcs.FSCD.2020.32},
  annote =	{Keywords: Cartesian difference categories, Cartesian differential categories, Change actions, Differential lambda-calculus, Difference lambda-calculus}
}
Document
System Description
Rast: Resource-Aware Session Types with Arithmetic Refinements (System Description)

Authors: Ankush Das and Frank Pfenning


Abstract
Traditional session types prescribe bidirectional communication protocols for concurrent computations, where well-typed programs are guaranteed to adhere to the protocols. Recent work has extended session types with refinements from linear arithmetic, capturing intrinsic properties of processes and data. These refinements then play a central role in describing sequential and parallel complexity bounds on session-typed programs. The Rast language and system provide an open-source implementation of session-typed concurrent programs extended with arithmetic refinements as well as ergometric and temporal types to capture work and span of program execution. Type checking relies on Cooper’s algorithm for quantifier elimination in Presburger arithmetic with a few significant optimizations, and a heuristic extension to nonlinear constraints. Rast furthermore includes a reconstruction engine so that most program constructs pertaining to the layers of refinements and resources are inserted automatically. We provide a variety of examples to demonstrate the expressivity of the language.
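
As a loose illustration of arithmetic indexing only (a Haskell GADT sketch of our own; Rast's session types and concrete syntax belong to a different, message-passing formalism): an index in the type can track a size invariant that the type checker discharges statically.

{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Analogy only (mine, not Rast): a queue whose type records its length,
-- so popping an empty queue is ruled out at type-checking time.
data Nat = Z | S Nat

data Queue (n :: Nat) a where
  Empty :: Queue 'Z a
  Push  :: a -> Queue n a -> Queue ('S n) a

pop :: Queue ('S n) a -> (a, Queue n a)
pop (Push x q) = (x, q)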

Cite as

Ankush Das and Frank Pfenning. Rast: Resource-Aware Session Types with Arithmetic Refinements (System Description). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 33:1-33:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{das_et_al:LIPIcs.FSCD.2020.33,
  author =	{Das, Ankush and Pfenning, Frank},
  title =	{{Rast: Resource-Aware Session Types with Arithmetic Refinements}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{33:1--33:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.33},
  URN =		{urn:nbn:de:0030-drops-123558},
  doi =		{10.4230/LIPIcs.FSCD.2020.33},
  annote =	{Keywords: Session Types, Resource Analysis, Refinement Types}
}
Document
System Description
Hierarchy Builder: Algebraic hierarchies Made Easy in Coq with Elpi (System Description)

Authors: Cyril Cohen, Kazuhiko Sakaguchi, and Enrico Tassi


Abstract
It is nowadays customary to organize libraries of machine checked proofs around hierarchies of algebraic structures. One influential example is the Mathematical Components library, on top of which the long and intricate proof of the Odd Order Theorem could be fully formalized. Still, building algebraic hierarchies in a proof assistant such as Coq requires a lot of manual labor and often deep expertise in the internals of the prover. Moreover, according to our experience, making a hierarchy evolve without causing breakage in client code is equally tricky: even a simple refactoring such as splitting a structure into two simpler ones is hard to get right. In this paper we describe HB, a high level language to build hierarchies of algebraic structures and to make these hierarchies evolve without breaking user code. The key concepts are those of factory, builder, and abbreviation, which let the hierarchy developer describe an actual interface for their library. Behind that interface the developer can provide appropriate code to ensure backward compatibility. We implement the HB language in the hierarchy-builder addon for the Coq system using the Elpi extension language.
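
A first-class-dictionary analogy in Haskell may help convey what a factory is (ours; HB itself is implemented in Coq/Elpi and generates genuine structures and instances): a factory rebuilds the new, finer-grained structures from a legacy interface, so clients written against the old interface keep compiling after the refactoring.

-- Analogy only (mine, not HB's Coq implementation). The library once
-- exposed a single monoid interface:
data MonoidSig a = MonoidSig { unit :: a, op :: a -> a -> a }

-- After refactoring, the hierarchy has two smaller structures:
newtype SemigroupSig a = SemigroupSig { sop :: a -> a -> a }
newtype UnitSig a = UnitSig { upoint :: a }

-- A "factory": build the new structures from the legacy interface.
fromMonoidSig :: MonoidSig a -> (SemigroupSig a, UnitSig a)
fromMonoidSig (MonoidSig u o) = (SemigroupSig o, UnitSig u)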

Cite as

Cyril Cohen, Kazuhiko Sakaguchi, and Enrico Tassi. Hierarchy Builder: Algebraic hierarchies Made Easy in Coq with Elpi (System Description). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 34:1-34:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{cohen_et_al:LIPIcs.FSCD.2020.34,
  author =	{Cohen, Cyril and Sakaguchi, Kazuhiko and Tassi, Enrico},
  title =	{{Hierarchy Builder: Algebraic hierarchies Made Easy in Coq with Elpi}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{34:1--34:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.34},
  URN =		{urn:nbn:de:0030-drops-123562},
  doi =		{10.4230/LIPIcs.FSCD.2020.34},
  annote =	{Keywords: Algebraic Hierarchy, Packed Classes, Coq, Elpi, Metaprogramming, \lambdaProlog}
}
Document
System Description
The New Rewriting Engine of Dedukti (System Description)

Authors: Gabriel Hondet and Frédéric Blanqui


Abstract
Dedukti is a type-checker for the λΠ-calculus modulo rewriting, an extension of Edinburgh’s logical framework LF where functions and type symbols can be defined by rewrite rules. It therefore contains an engine for rewriting LF terms and types according to the rewrite rules given by the user. A key component of this engine is the matching algorithm to find which rules can be fired. In this paper, we describe the class of rewrite rules supported by Dedukti and the new implementation of the matching algorithm. Dedukti supports non-linear rewrite rules on terms with binders using higher-order pattern-matching as in Combinatory Reduction Systems (CRS). The new matching algorithm extends the technique of decision trees introduced by Luc Maranget in the OCaml compiler to this more general context.
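
For contrast with decision trees, here is the naive one-rule-at-a-time approach to matching, in a first-order Haskell sketch (ours): a decision tree instead inspects each subject position at most once for the whole rule set.

-- Naive first-order matching (sketch, mine). Dedukti additionally supports
-- non-linear rules (repeated pattern variables must bind equal terms,
-- checked on the returned substitution) and higher-order patterns.
data Term = Var String | Fun String [Term] deriving (Eq, Show)

match :: Term -> Term -> Maybe [(String, Term)]  -- pattern, then subject
match (Var x) t = Just [(x, t)]
match (Fun f ps) (Fun g ts)
  | f == g && length ps == length ts =
      concat <$> sequence (zipWith match ps ts)
match _ _ = Nothing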

Cite as

Gabriel Hondet and Frédéric Blanqui. The New Rewriting Engine of Dedukti (System Description). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 35:1-35:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{hondet_et_al:LIPIcs.FSCD.2020.35,
  author =	{Hondet, Gabriel and Blanqui, Fr\'{e}d\'{e}ric},
  title =	{{The New Rewriting Engine of Dedukti}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{35:1--35:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.35},
  URN =		{urn:nbn:de:0030-drops-123577},
  doi =		{10.4230/LIPIcs.FSCD.2020.35},
  annote =	{Keywords: rewriting, higher-order pattern-matching, decision trees}
}
Document
System Description
WANDA - a Higher Order Termination Tool (System Description)

Authors: Cynthia Kop


Abstract
Wanda is a fully automatic termination analysis tool for higher-order term rewriting. In this paper, we discuss the methodology used in Wanda. Most pertinently, this includes a higher-order dependency pair framework and a variation of the higher-order recursive path ordering, as well as some non-termination analysis techniques and delegation to a first-order tool. Additionally, we discuss Wanda’s internal rewriting formalism and how to use Wanda in practice on systems in two different formalisms. We also present experimental results that consider both formalisms.
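
As a first-order reference point for the path-ordering component (a sketch, ours; Wanda uses a higher-order variant within a dependency pair framework): the lexicographic path ordering orients rules using a precedence on function symbols.

-- First-order lexicographic path ordering (sketch, mine).
data Term = Var String | Fun String [Term] deriving (Eq, Show)

vars :: Term -> [String]
vars (Var x)    = [x]
vars (Fun _ ts) = concatMap vars ts

-- lpo prec s t checks s > t, given a precedence prec on symbols.
lpo :: (String -> String -> Ordering) -> Term -> Term -> Bool
lpo _ (Var _) _ = False                 -- a variable dominates nothing
lpo prec s@(Fun f ss) t = case t of
  Var x -> x `elem` vars s              -- s > x iff x occurs in s
  Fun g ts
    | any (\si -> si == t || lpo prec si t) ss -> True  -- subterm case
    | prec f g == GT -> all (lpo prec s) ts
    | prec f g == EQ -> lexGt prec ss ts && all (lpo prec s) ts
    | otherwise      -> False

-- Lexicographic extension of the ordering to argument lists.
lexGt :: (String -> String -> Ordering) -> [Term] -> [Term] -> Bool
lexGt prec (a:as) (b:bs)
  | a == b    = lexGt prec as bs
  | otherwise = lpo prec a b
lexGt _ _ _ = False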

Cite as

Cynthia Kop. WANDA - a Higher Order Termination Tool (System Description). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 36:1-36:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{kop:LIPIcs.FSCD.2020.36,
  author =	{Kop, Cynthia},
  title =	{{WANDA - a Higher Order Termination Tool}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{36:1--36:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.36},
  URN =		{urn:nbn:de:0030-drops-123587},
  doi =		{10.4230/LIPIcs.FSCD.2020.36},
  annote =	{Keywords: higher-order term rewriting, termination, automatic analysis, dependency pair framework, higher-order recursive path ordering}
}
Document
System Description
A Type Checker for a Logical Framework with Union and Intersection Types (System Description)

Authors: Claude Stolze and Luigi Liquori


Abstract
We present the syntax, semantics, typing, subtyping, unification, refinement, and REPL of BULL, a prototype theorem prover based on the Δ-Framework, i.e. a fully-typed Logical Framework à la Edinburgh LF decorated with union and intersection types, as described in previous papers by the authors. BULL also implements a subtyping algorithm for the Type Theory Ξ of Barbanera-Dezani-de'Liguoro. BULL has a command-line interface where the user can declare axioms and terms and perform computations, along with some basic terminal-style features like error pretty-printing, subexpression highlighting, and file loading. Moreover, it can typecheck a proof or normalize it. These terms can be incomplete, so the typechecking algorithm uses unification to try to construct the missing subterms. BULL uses the syntax of Berardi’s Pure Type Systems to improve the compactness and the modularity of the kernel. Abstract and concrete syntax are mostly aligned and similar to the concrete syntax of Coq. BULL uses a higher-order unification algorithm for terms, while typechecking and partial type inference are done by a bidirectional refinement algorithm, similar to the one found in Matita and Beluga. The refinement can be split into two parts: the essence refinement and the typing refinement. Binders are implemented using the commonly used de Bruijn indices. We have defined a concrete language syntax that allows users to write Δ-terms. We have defined the reduction rules and an evaluator. We have implemented from scratch a refiner which does partial typechecking and type reconstruction. We have tested BULL on classical examples from the intersection and union literature, such as the ones formalized by Pfenning with his Refinement Types in LF and by Pierce. We hope that this line of research will prove useful for experimenting, in a proof-theoretical setting, with forms of polymorphism alternative to Girard’s parametric one.
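
For orientation, the characteristic rule that distinguishes intersection types from products is proof-functional: the same term M must inhabit both components (a standard presentation, not BULL-specific):

\[
  \frac{\Gamma \vdash M : \sigma \qquad \Gamma \vdash M : \tau}
       {\Gamma \vdash M : \sigma \cap \tau}\ (\cap I)
\]

Roughly speaking, it is this sharing of M across the premises (and its dual for union elimination) that the Δ-Framework's notion of essence tracks, and that the split into essence refinement and typing refinement reflects.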

Cite as

Claude Stolze and Luigi Liquori. A Type Checker for a Logical Framework with Union and Intersection Types (System Description). In 5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 167, pp. 37:1-37:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{stolze_et_al:LIPIcs.FSCD.2020.37,
  author =	{Stolze, Claude and Liquori, Luigi},
  title =	{{A Type Checker for a Logical Framework with Union and Intersection Types}},
  booktitle =	{5th International Conference on Formal Structures for Computation and Deduction (FSCD 2020)},
  pages =	{37:1--37:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-155-9},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{167},
  editor =	{Ariola, Zena M.},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.FSCD.2020.37},
  URN =		{urn:nbn:de:0030-drops-123597},
  doi =		{10.4230/LIPIcs.FSCD.2020.37},
  annote =	{Keywords: Intersection types, Union types, Dependent types, Subtyping, Type checker, Refiner, \Delta-Framework}
}
