LIPIcs, Volume 283

34th International Symposium on Algorithms and Computation (ISAAC 2023)



Event

ISAAC 2023, December 3-6, 2023, Kyoto, Japan

Editors

Satoru Iwata
  • University of Tokyo, Tokyo, Japan
  • Hokkaido University, Sapporo, Japan
Naonori Kakimura
  • Keio University, Yokohama, Japan

Publication Details

  • published at: 2023-11-28
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-289-1
  • DBLP: db/conf/isaac/isaac2023

Documents

Document
Complete Volume
LIPIcs, Volume 283, ISAAC 2023, Complete Volume

Authors: Satoru Iwata and Naonori Kakimura


Abstract
LIPIcs, Volume 283, ISAAC 2023, Complete Volume

Cite as

34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 1-960, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@Proceedings{iwata_et_al:LIPIcs.ISAAC.2023,
  title =	{{LIPIcs, Volume 283, ISAAC 2023, Complete Volume}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{1--960},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023},
  URN =		{urn:nbn:de:0030-drops-193011},
  doi =		{10.4230/LIPIcs.ISAAC.2023},
  annote =	{Keywords: LIPIcs, Volume 283, ISAAC 2023, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Satoru Iwata and Naonori Kakimura


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 0:i-0:xvi, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{iwata_et_al:LIPIcs.ISAAC.2023.0,
  author =	{Iwata, Satoru and Kakimura, Naonori},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{0:i--0:xvi},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.0},
  URN =		{urn:nbn:de:0030-drops-193029},
  doi =		{10.4230/LIPIcs.ISAAC.2023.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
Group Fairness: From Multiwinner Voting to Participatory Budgeting (Invited Talk)

Authors: Edith Elkind


Abstract
Many cities around the world allocate a part of their budget based on residents' votes, following a process known as participatory budgeting. It is important to understand which outcomes of this process should be viewed as fair, and whether fair outcomes could be computed efficiently. We summarise recent progress on this topic. We first focus on a special case of participatory budgeting where all candidate projects have the same cost (known as multiwinner voting), formulate progressively more demanding notions of fairness for this setting, and identify efficiently computable voting rules that satisfy them. We then discuss the challenges of extending these ideas to the general model.

Cite as

Edith Elkind. Group Fairness: From Multiwinner Voting to Participatory Budgeting (Invited Talk). In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 1:1-1:3, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{elkind:LIPIcs.ISAAC.2023.1,
  author =	{Elkind, Edith},
  title =	{{Group Fairness: From Multiwinner Voting to Participatory Budgeting}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{1:1--1:3},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.1},
  URN =		{urn:nbn:de:0030-drops-193038},
  doi =		{10.4230/LIPIcs.ISAAC.2023.1},
  annote =	{Keywords: multiwinner voting, participatory budgeting, justified representation}
}
Document
Invited Talk
Faithful Graph Drawing (Invited Talk)

Authors: Seok-Hee Hong


Abstract
Graph drawing aims to compute good geometric representations of graphs in two or three dimensions. It has wide applications in network visualisation, such as social networks and biological networks, as well as networks arising from many other disciplines. This talk will review fundamental theoretical results as well as recent advances in graph drawing, including symmetric graph drawing, generalisations of Tutte’s barycenter theorem, Steinitz’s theorem, and Fáry’s theorem, and the so-called beyond planar graphs such as k-planar graphs. I will conclude my talk with recent progress in the visualisation of big complex graphs, including sublinear-time graph drawing algorithms and faithful graph drawing.

Cite as

Seok-Hee Hong. Faithful Graph Drawing (Invited Talk). In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, p. 2:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{hong:LIPIcs.ISAAC.2023.2,
  author =	{Hong, Seok-Hee},
  title =	{{Faithful Graph Drawing}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{2:1--2:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.2},
  URN =		{urn:nbn:de:0030-drops-193044},
  doi =		{10.4230/LIPIcs.ISAAC.2023.2},
  annote =	{Keywords: Graph drawing, Planar graphs, Beyond planar graphs, Tutte’s barycenter theorem, Steinitz’s theorem, F\'{a}ry’s theorem, Sublinear-time graph drawing algorithm, Faithful graph drawing, Symmetric graph drawing}
}
Document
Realizability of Free Spaces of Curves

Authors: Hugo A. Akitaya, Maike Buchin, Majid Mirzanezhad, Leonie Ryvkin, and Carola Wenk


Abstract
The free space diagram is a popular tool to compute the well-known Fréchet distance. As the Fréchet distance is used in many different fields, many variants have been established to cover the specific needs of these applications. Often the question arises whether a certain pattern in the free space diagram is realizable, i.e., whether there exists a pair of polygonal chains whose free space diagram corresponds to it. The answer to this question may help in deciding the computational complexity of these distance measures, as well as allowing the design of more efficient algorithms for restricted input classes that avoid certain free space patterns. Therefore we study the inverse problem: Given a potential free space diagram, do there exist curves that generate this diagram? Our problem of interest is closely tied to the classic Distance Geometry problem. We settle the complexity of Distance Geometry in ℝ^{>2}, showing ∃ℝ-hardness. We use this to show that for curves in ℝ^{≥2} the realizability problem is ∃ℝ-complete, both for continuous and for discrete Fréchet distance. We prove that the continuous case in ℝ¹ is only weakly NP-hard, and we provide a pseudo-polynomial time algorithm and show that it is fixed-parameter tractable. Interestingly, for the discrete case in ℝ¹ we show that the problem becomes solvable in polynomial time.
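
For intuition on the object whose realizability is studied, here is a minimal Python sketch (not from the paper) of the discrete free-space matrix of two polygonal chains under the Euclidean metric; the chains P, Q and the threshold eps are hypothetical examples. Entry (i, j) is free exactly when the i-th vertex of P and the j-th vertex of Q are within distance eps, and the discrete Fréchet distance is at most eps precisely when a monotone path of free cells connects the top-left to the bottom-right cell.

from math import dist  # Euclidean distance between two points (Python 3.8+)

def discrete_free_space(P, Q, eps):
    # F[i][j] is True iff the i-th vertex of P and the j-th vertex of Q are
    # within distance eps; a monotone path of True cells from (0, 0) to
    # (len(P)-1, len(Q)-1) witnesses discrete Frechet distance <= eps.
    return [[dist(p, q) <= eps for q in Q] for p in P]

# Two small planar chains and an illustrative threshold of 1.0.
P = [(0, 0), (1, 0), (2, 0)]
Q = [(0, 1), (1, 1), (2, 1)]
for row in discrete_free_space(P, Q, 1.0):
    print(row)  # the free diagonal shows the discrete Frechet distance is <= 1.0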

Cite as

Hugo A. Akitaya, Maike Buchin, Majid Mirzanezhad, Leonie Ryvkin, and Carola Wenk. Realizability of Free Spaces of Curves. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 3:1-3:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{a.akitaya_et_al:LIPIcs.ISAAC.2023.3,
  author =	{A. Akitaya, Hugo and Buchin, Maike and Mirzanezhad, Majid and Ryvkin, Leonie and Wenk, Carola},
  title =	{{Realizability of Free Spaces of Curves}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{3:1--3:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.3},
  URN =		{urn:nbn:de:0030-drops-193057},
  doi =		{10.4230/LIPIcs.ISAAC.2023.3},
  annote =	{Keywords: Fr\'{e}chet distance, Distance Geometry, free space diagram, inverse problem}
}
Document
k-Universality of Regular Languages

Authors: Duncan Adamson, Pamela Fleischmann, Annika Huch, Tore Koß, Florin Manea, and Dirk Nowotka


Abstract
A subsequence of a word w is a word u such that u = w[i₁] w[i₂] … w[i_k], for some set of indices 1 ≤ i₁ < i₂ < … < i_k ≤ |w|. A word w is k-subsequence universal over an alphabet Σ if every word in Σ^k appears in w as a subsequence. In this paper, we study the intersection between the set of k-subsequence universal words over some alphabet Σ and regular languages over Σ. We call a regular language L k-∃-subsequence universal if there exists a k-subsequence universal word in L, and k-∀-subsequence universal if every word of L is k-subsequence universal. We give algorithms solving the problems of deciding if a given regular language, represented by a finite automaton recognising it, is k-∃-subsequence universal and, respectively, if it is k-∀-subsequence universal, for a given k. The algorithms are FPT w.r.t. the size of the input alphabet, and their run-time does not depend on k; they run in polynomial time in the number n of states of the input automaton when the size of the input alphabet is O(log n). Moreover, we show that the problem of deciding if a given regular language is k-∃-subsequence universal is NP-complete, when the language is over a large alphabet. Further, we provide algorithms for counting the number of k-subsequence universal words (paths) accepted by a given deterministic (respectively, nondeterministic) finite automaton, and ranking an input word (path) within the set of k-subsequence universal words accepted by a given finite automaton.
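
As a concrete illustration of the definition (the paper's algorithms work on automata rather than on single words), the following sketch computes the largest k for which a given word is k-subsequence universal over an alphabet, using the standard greedy factorisation into arches, each arch being a shortest factor containing every letter; the function name and the example words are illustrative only.

def universality_index(w, sigma):
    # Largest k such that w is k-subsequence universal over alphabet sigma.
    # Greedily cut w into consecutive "arches", each the shortest block that
    # contains every letter of sigma; the number of complete arches is the answer.
    sigma = set(sigma)
    seen, k = set(), 0
    for c in w:
        if c in sigma:
            seen.add(c)
            if seen == sigma:   # one more arch completed
                k += 1
                seen = set()
    return k

print(universality_index("abcabcab", "abc"))  # 2: every word of length 2 over {a,b,c} is a subsequence
print(universality_index("aabb", "ab"))       # 1: "ba" is not a subsequence, so not 2-universal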

Cite as

Duncan Adamson, Pamela Fleischmann, Annika Huch, Tore Koß, Florin Manea, and Dirk Nowotka. k-Universality of Regular Languages. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 4:1-4:21, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{adamson_et_al:LIPIcs.ISAAC.2023.4,
  author =	{Adamson, Duncan and Fleischmann, Pamela and Huch, Annika and Ko{\ss}, Tore and Manea, Florin and Nowotka, Dirk},
  title =	{{k-Universality of Regular Languages}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{4:1--4:21},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.4},
  URN =		{urn:nbn:de:0030-drops-193064},
  doi =		{10.4230/LIPIcs.ISAAC.2023.4},
  annote =	{Keywords: String Algorithms, Regular Languages, Finite Automata, Subsequences}
}
Document
Unified Almost Linear Kernels for Generalized Covering and Packing Problems on Nowhere Dense Classes

Authors: Jungho Ahn, Jinha Kim, and O-joung Kwon


Abstract
Let ℱ be a family of graphs, and let p,r be nonnegative integers. For a graph G and an integer k, the (p,r,ℱ)-Covering problem asks whether there is a set D ⊆ V(G) of size at most k such that if the p-th power of G has an induced subgraph isomorphic to a graph in ℱ, then it is at distance at most r from D. The (p,r,ℱ)-Packing problem asks whether G^p has k induced subgraphs H₁,…,H_k such that each H_i is isomorphic to a graph in ℱ, and for i,j ∈ {1,…,k}, the distance between V(H_i) and V(H_j) in G is larger than r. We show that for every fixed nonnegative integers p,r and every fixed nonempty finite family ℱ of connected graphs, (p,r,ℱ)-Covering with p ≤ 2r+1 and (p,r,ℱ)-Packing with p ≤ 2⌊r/2⌋+1 admit almost linear kernels on every nowhere dense class of graphs, parameterized by the solution size k. As corollaries, we prove that Distance-r Vertex Cover, Distance-r Matching, ℱ-Free Vertex Deletion, and Induced-ℱ-Packing for any fixed finite family ℱ of connected graphs admit almost linear kernels on every nowhere dense class of graphs. Our results extend the results for Distance-r Dominating Set by Drange et al. (STACS 2016) and Eickmeyer et al. (ICALP 2017), and for Distance-r Independent Set by Pilipczuk and Siebertz (EJC 2021).

Cite as

Jungho Ahn, Jinha Kim, and O-joung Kwon. Unified Almost Linear Kernels for Generalized Covering and Packing Problems on Nowhere Dense Classes. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 5:1-5:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{ahn_et_al:LIPIcs.ISAAC.2023.5,
  author =	{Ahn, Jungho and Kim, Jinha and Kwon, O-joung},
  title =	{{Unified Almost Linear Kernels for Generalized Covering and Packing Problems on Nowhere Dense Classes}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{5:1--5:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.5},
  URN =		{urn:nbn:de:0030-drops-193072},
  doi =		{10.4230/LIPIcs.ISAAC.2023.5},
  annote =	{Keywords: kernelization, independent set, dominating set, covering, packing}
}
Document
Geometric TSP on Sets

Authors: Henk Alkema and Mark de Berg


Abstract
In One-of-a-Set TSP, also known as the Generalised TSP, the input is a collection 𝒫 : = {P_1, ..., P_r} of sets in a metric space and the goal is to compute a minimum-length tour that visits one element from each set. In the Euclidean variant of this problem, each P_i is a set of points in ℝ^d that is contained in a given hypercube H_i. We investigate how the complexity of Euclidean One-of-a-Set TSP depends on λ, the ply of the set ℋ := {H_1, ..., H_r} of hypercubes (The ply is the smallest λ such that every point in ℝ^d is in at most λ of the hypercubes). Furthermore, we show that the problem can be solved in 2^O(λ^{1/d} n^{1-1/d}) time, where n : = ∑_{i=1}^r |P_i| is the total number of points. Finally, we show that the problem cannot be solved in 2^o(n) time when λ = Θ(n), unless the Exponential Time Hypothesis (ETH) fails. In Rectilinear One-of-a-Cube TSP, the input is a set ℋ of hypercubes in ℝ^d and the goal is to compute a minimum-length rectilinear tour that visits every hypercube. We show that the problem can be solved in 2^O(λ^{1/d} n^{1-1/d} log n) time, where n is the number of hypercubes.
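
To make the ply parameter concrete, the following sketch (not from the paper) computes the ply for the one-dimensional special case in which the hypercubes are closed intervals, using a standard sweep over endpoint events; for d > 1 the ply of axis-parallel hypercubes is defined analogously. The interval data are illustrative.

def ply_of_intervals(intervals):
    # Maximum number of closed intervals [l, r] that share a common point.
    events = []
    for l, r in intervals:
        events.append((l, 0))   # opening event; 0 sorts before closing events at ties
        events.append((r, 1))   # closing event
    events.sort()
    depth = best = 0
    for _, kind in events:
        if kind == 0:
            depth += 1
            best = max(best, depth)
        else:
            depth -= 1
    return best

# Every point of the line is covered by at most 2 of these intervals, so the ply is 2.
print(ply_of_intervals([(0, 2), (1, 3), (4, 5)]))  # 2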

Cite as

Henk Alkema and Mark de Berg. Geometric TSP on Sets. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 6:1-6:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{alkema_et_al:LIPIcs.ISAAC.2023.6,
  author =	{Alkema, Henk and de Berg, Mark},
  title =	{{Geometric TSP on Sets}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{6:1--6:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.6},
  URN =		{urn:nbn:de:0030-drops-193083},
  doi =		{10.4230/LIPIcs.ISAAC.2023.6},
  annote =	{Keywords: Euclidean TSP, TSP on Sets, Rectilinear TSP, TSP on Neighbourhoods}
}
Document
Depth-Three Circuits for Inner Product and Majority Functions

Authors: Kazuyuki Amano


Abstract
We consider the complexity of depth-three Boolean circuits with limited bottom fan-in that compute some explicit functions. This is one of the simplest circuit classes for which we cannot derive tight bounds on the complexity for many functions. A Σ₃^k-circuit is a depth-three OR ∘ AND ∘ OR circuit in which each bottom gate has fan-in at most k. First, we investigate the complexity of Σ₃^k-circuits computing the inner product mod two function IP_n on n pairs of variables for small values of k. We give an explicit construction of a Σ₃²-circuit of size smaller than 2^{0.952n} for IP_n as well as a Σ₃³-circuit of size smaller than 2^{0.692n}. These improve the known upper bounds of 2^{n-o(n)} for Σ₃²-circuits and 3^{n/2} ∼ 2^{0.792n} for Σ₃³-circuits by Golovnev, Kulikov and Williams (ITCS 2021), and also the upper bound of 2^{(0.965…)n} for Σ₃²-circuits shown in a recent concurrent work by Göös, Guan and Mosnoi (MFCS 2023). Second, we investigate the complexity of the majority function MAJ_n, aiming to explore the effect of negations. Currently, the smallest known depth-three circuit for MAJ_n is a monotone circuit. A Σ₃^{(+k,-𝓁)}-circuit is a Σ₃-circuit in which each bottom gate has at most k positive literals and 𝓁 negative literals as its input. We show that, for k ≤ 2, the minimum size of a Σ₃^{(+k,-∞)}-circuit for MAJ_n is essentially equal to the minimum size of a monotone Σ₃^k-circuit for MAJ_n. In sharp contrast, we also show that, for k = 3,4 and 5, there exists a Σ₃^{(+k, -𝓁)}-circuit computing MAJ_n (for an appropriately chosen 𝓁) that is smaller than the smallest known monotone Σ₃^k-circuit for MAJ_n. Our results suggest that negations may help to speed up the computation of the majority function even for depth-three circuits. All these constructions rely on efficient circuits or formulas on a small number of variables that we found through a computer search.

Cite as

Kazuyuki Amano. Depth-Three Circuits for Inner Product and Majority Functions. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 7:1-7:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{amano:LIPIcs.ISAAC.2023.7,
  author =	{Amano, Kazuyuki},
  title =	{{Depth-Three Circuits for Inner Product and Majority Functions}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{7:1--7:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.7},
  URN =		{urn:nbn:de:0030-drops-193092},
  doi =		{10.4230/LIPIcs.ISAAC.2023.7},
  annote =	{Keywords: Circuit complexity, depth-3 circuits, upper bounds, lower bounds, computer-assisted proof}
}
Document
Recognizing Unit Multiple Intervals Is Hard

Authors: Virginia Ardévol Martínez, Romeo Rizzi, Florian Sikora, and Stéphane Vialette


Abstract
Multiple interval graphs are a well-known generalization of interval graphs introduced in the 1970s to deal with situations arising naturally in scheduling and allocation. A d-interval is the union of d intervals on the real line, and a graph is a d-interval graph if it is the intersection graph of d-intervals. In particular, it is a unit d-interval graph if it admits a d-interval representation where every interval has unit length. Whereas it has been known for a long time that recognizing 2-interval graphs and other related classes such as 2-track interval graphs is NP-complete, the complexity of recognizing unit 2-interval graphs remains open. Here, we settle this question by proving that the recognition of unit 2-interval graphs is also NP-complete. Our proof technique uses a completely different approach from the other hardness results of recognizing related classes. Furthermore, we extend the result to unit d-interval graphs for any d ⩾ 2, which does not follow directly in graph recognition problems; as an example, it took almost 20 years to close the gap between d = 2 and d > 2 for the recognition of d-track interval graphs. Our result has several implications, including that recognizing (x, …, x) d-interval graphs and depth r unit 2-interval graphs is NP-complete for every x ⩾ 11 and every r ⩾ 4.

Cite as

Virginia Ardévol Martínez, Romeo Rizzi, Florian Sikora, and Stéphane Vialette. Recognizing Unit Multiple Intervals Is Hard. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 8:1-8:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{ardevolmartinez_et_al:LIPIcs.ISAAC.2023.8,
  author =	{Ard\'{e}vol Mart{\'\i}nez, Virginia and Rizzi, Romeo and Sikora, Florian and Vialette, St\'{e}phane},
  title =	{{Recognizing Unit Multiple Intervals Is Hard}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{8:1--8:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.8},
  URN =		{urn:nbn:de:0030-drops-193102},
  doi =		{10.4230/LIPIcs.ISAAC.2023.8},
  annote =	{Keywords: Interval graphs, unit multiple interval graphs, recognition, NP-hardness}
}
Document
Non-Clairvoyant Makespan Minimization Scheduling with Predictions

Authors: Evripidis Bampis, Alexander Kononov, Giorgio Lucarelli, and Fanny Pascual


Abstract
We revisit the classical non-clairvoyant problem of scheduling a set of n jobs on a set of m parallel identical machines where the processing time of a job is not known until the job finishes. Our objective is the minimization of the makespan, i.e., the date at which the last job terminates its execution. We adopt the framework of learning-augmented algorithms and we study the question of whether (possibly erroneous) predictions may help design algorithms with a competitive ratio which is good when the prediction is accurate (consistency), deteriorates gradually with respect to the prediction error (smoothness), and not too bad and bounded when the prediction is arbitrarily bad (robustness). We first consider the non-preemptive case and we devise lower bounds, as a function of the error of the prediction, for any deterministic learning-augmented algorithm. Then we analyze a variant of Longest Processing Time first (LPT) algorithm (with and without release dates) and we prove that it is consistent, smooth, and robust. Furthermore, we study the preemptive case and we provide lower bounds for any deterministic algorithm with predictions as a function of the prediction error. Finally, we introduce a variant of the classical Round Robin algorithm (RR), the Predicted Proportional Round Robin algorithm (PPRR), which we prove to be consistent, smooth and robust.
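
For reference, the sketch below implements the classical clairvoyant LPT rule that the paper's learning-augmented variant builds on; the paper's algorithm, which works with (possibly erroneous) predicted processing times rather than the true ones, is not reproduced here, and the instance is illustrative.

import heapq

def lpt_makespan(processing_times, m):
    # Makespan of the Longest Processing Time first (LPT) schedule on m
    # identical machines: sort jobs by non-increasing processing time and
    # assign each to the currently least-loaded machine.
    loads = [0] * m                    # min-heap of machine loads
    heapq.heapify(loads)
    for p in sorted(processing_times, reverse=True):
        least = heapq.heappop(loads)   # least-loaded machine
        heapq.heappush(loads, least + p)
    return max(loads)

print(lpt_makespan([7, 6, 5, 4, 3], 2))  # 14, while the optimum is 13 ({7,6} vs {5,4,3})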

Cite as

Evripidis Bampis, Alexander Kononov, Giorgio Lucarelli, and Fanny Pascual. Non-Clairvoyant Makespan Minimization Scheduling with Predictions. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 9:1-9:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bampis_et_al:LIPIcs.ISAAC.2023.9,
  author =	{Bampis, Evripidis and Kononov, Alexander and Lucarelli, Giorgio and Pascual, Fanny},
  title =	{{Non-Clairvoyant Makespan Minimization Scheduling with Predictions}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{9:1--9:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.9},
  URN =		{urn:nbn:de:0030-drops-193114},
  doi =		{10.4230/LIPIcs.ISAAC.2023.9},
  annote =	{Keywords: scheduling, online, learning-augmented algorithm}
}
Document
Small-Space Algorithms for the Online Language Distance Problem for Palindromes and Squares

Authors: Gabriel Bathie, Tomasz Kociumaka, and Tatiana Starikovskaya


Abstract
We study the online variant of the language distance problem for two classical formal languages, the language of palindromes and the language of squares, and for the two most fundamental distances, the Hamming distance and the edit (Levenshtein) distance. In this problem, defined for a fixed formal language L, we are given a string T of length n, and the task is to compute the minimal distance to L from every prefix of T. We focus on the low-distance regime, where one must compute only the distances smaller than a given threshold k. In this work, our contribution is twofold: 1) First, we show streaming algorithms, which access the input string T only through a single left-to-right scan. Both for palindromes and squares, our algorithms use O(k polylog n) space and time per character in the Hamming-distance case and O(k² polylog n) space and time per character in the edit-distance case. These algorithms are randomised by necessity, and they err with probability inverse-polynomial in n. 2) Second, we show deterministic read-only online algorithms, which are also provided with read-only random access to the already processed characters of T. Both for palindromes and squares, our algorithms use O(k polylog n) space and time per character in the Hamming-distance case and O(k⁴ polylog n) space and amortised time per character in the edit-distance case.
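
As a baseline for the problem statement, the Hamming-distance case with L being the language of palindromes admits a simple naive solution: Hamming distance only compares strings of equal length, so the distance of a prefix of length m is the number of mismatched symmetric pairs, each fixable by one substitution. The sketch below (quadratic time, no threshold k) only illustrates the objective, not the paper's streaming or read-only algorithms; the example string is illustrative.

def prefix_palindrome_distances(T):
    # Hamming distance from every prefix of T to the language of palindromes:
    # for a prefix of length m, count positions i < m // 2 with T[i] != T[m-1-i].
    out = []
    for m in range(1, len(T) + 1):
        out.append(sum(T[i] != T[m - 1 - i] for i in range(m // 2)))
    return out

print(prefix_palindrome_distances("abcba"))  # [0, 1, 1, 2, 0]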

Cite as

Gabriel Bathie, Tomasz Kociumaka, and Tatiana Starikovskaya. Small-Space Algorithms for the Online Language Distance Problem for Palindromes and Squares. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 10:1-10:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bathie_et_al:LIPIcs.ISAAC.2023.10,
  author =	{Bathie, Gabriel and Kociumaka, Tomasz and Starikovskaya, Tatiana},
  title =	{{Small-Space Algorithms for the Online Language Distance Problem for Palindromes and Squares}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{10:1--10:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.10},
  URN =		{urn:nbn:de:0030-drops-193124},
  doi =		{10.4230/LIPIcs.ISAAC.2023.10},
  annote =	{Keywords: Approximate pattern matching, streaming algorithms, palindromes, squares}
}
Document
Sparse Graphs of Twin-Width 2 Have Bounded Tree-Width

Authors: Benjamin Bergougnoux, Jakub Gajarský, Grzegorz Guśpiel, Petr Hliněný, Filip Pokrývka, and Marek Sokołowski


Abstract
Twin-width is a structural width parameter introduced by Bonnet, Kim, Thomassé and Watrigant [FOCS 2020]. Very briefly, its essence is a gradual reduction (a contraction sequence) of the given graph down to a single vertex while maintaining limited difference of neighbourhoods of the vertices, and it can be seen as widely generalizing several other traditional structural parameters. Having such a sequence at hand allows us to solve many otherwise hard problems efficiently. Our paper focuses on a comparison of twin-width to the more traditional tree-width on sparse graphs. Namely, we prove that if a graph G of twin-width at most 2 contains no K_{t,t} subgraph for some integer t, then the tree-width of G is bounded by a polynomial function of t. As a consequence, for any sparse graph class C we obtain a polynomial time algorithm which for any input graph G ∈ C either outputs a contraction sequence of width at most c (where c depends only on C), or correctly outputs that G has twin-width more than 2. On the other hand, we present an easy example of a graph class of twin-width 3 with unbounded tree-width, showing that our result cannot be extended to higher values of twin-width.

Cite as

Benjamin Bergougnoux, Jakub Gajarský, Grzegorz Guśpiel, Petr Hliněný, Filip Pokrývka, and Marek Sokołowski. Sparse Graphs of Twin-Width 2 Have Bounded Tree-Width. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 11:1-11:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bergougnoux_et_al:LIPIcs.ISAAC.2023.11,
  author =	{Bergougnoux, Benjamin and Gajarsk\'{y}, Jakub and Gu\'{s}piel, Grzegorz and Hlin\v{e}n\'{y}, Petr and Pokr\'{y}vka, Filip and Soko{\l}owski, Marek},
  title =	{{Sparse Graphs of Twin-Width 2 Have Bounded Tree-Width}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{11:1--11:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.11},
  URN =		{urn:nbn:de:0030-drops-193130},
  doi =		{10.4230/LIPIcs.ISAAC.2023.11},
  annote =	{Keywords: twin-width, tree-width, excluded grid, sparsity}
}
Document
Substring Complexity in Sublinear Space

Authors: Giulia Bernardini, Gabriele Fici, Paweł Gawrychowski, and Solon P. Pissis


Abstract
Shannon’s entropy is a definitive lower bound for statistical compression. Unfortunately, no such clear measure exists for the compressibility of repetitive strings. Thus, ad hoc measures are employed to estimate the repetitiveness of strings, e.g., the size z of the Lempel–Ziv parse or the number r of equal-letter runs of the Burrows-Wheeler transform. A more recent one is the size γ of a smallest string attractor. Let T be a string of length n. A string attractor of T is a set of positions of T capturing the occurrences of all the substrings of T. Unfortunately, Kempa and Prezza [STOC 2018] showed that computing γ is NP-hard. Kociumaka et al. [LATIN 2020] considered a new measure of compressibility that is based on the function S_T(k) counting the number of distinct substrings of length k of T, also known as the substring complexity of T. This new measure is defined as δ = sup{S_T(k)/k : k ≥ 1} and lower bounds all the relevant ad hoc measures previously considered. In particular, δ ≤ γ always holds and δ can be computed in 𝒪(n) time using Θ(n) working space. Kociumaka et al. showed that one can construct an 𝒪(δ log(n/δ))-sized representation of T supporting efficient direct access and efficient pattern matching queries on T. Given that for highly compressible strings, δ is significantly smaller than n, it is natural to pose the following question: Can we compute δ efficiently using sublinear working space? It is straightforward to show that in the comparison model, any algorithm computing δ using 𝒪(b) space requires Ω(n^{2-o(1)}/b) time through a reduction from the element distinctness problem [Yao, SIAM J. Comput. 1994]. We thus wanted to investigate whether we can indeed match this lower bound. We address this algorithmic challenge by showing the following bounds to compute δ: - 𝒪((n³log b)/b²) time using 𝒪(b) space, for any b ∈ [1,n], in the comparison model. - 𝒪̃(n²/b) time using 𝒪̃(b) space, for any b ∈ [√n,n], in the word RAM model. This gives an 𝒪̃(n^{1+ε})-time and 𝒪̃(n^{1-ε})-space algorithm to compute δ, for any 0 < ε ≤ 1/2. Let us remark that our algorithms compute S_T(k), for all k, within the same complexities.
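
The measure itself is simple to evaluate naively; the sketch below computes S_T(k) for every k and δ directly from the definition (for k > n we have S_T(k) = 0, so the supremum is attained for some k ≤ n). It uses far more than sublinear working space and serves only to illustrate the quantity the paper computes; the example string is illustrative.

from fractions import Fraction

def substring_complexity_delta(T):
    # S[k-1] = number of distinct substrings of T of length k, for k = 1..n;
    # delta = max_k S_T(k) / k, returned as an exact fraction.
    n = len(T)
    S = [len({T[i:i + k] for i in range(n - k + 1)}) for k in range(1, n + 1)]
    delta = max(Fraction(S[k - 1], k) for k in range(1, n + 1))
    return S, delta

S, delta = substring_complexity_delta("abaababa")
print(S)      # S_T(1), ..., S_T(8)
print(delta)  # 2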

Cite as

Giulia Bernardini, Gabriele Fici, Paweł Gawrychowski, and Solon P. Pissis. Substring Complexity in Sublinear Space. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 12:1-12:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{bernardini_et_al:LIPIcs.ISAAC.2023.12,
  author =	{Bernardini, Giulia and Fici, Gabriele and Gawrychowski, Pawe{\l} and Pissis, Solon P.},
  title =	{{Substring Complexity in Sublinear Space}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{12:1--12:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.12},
  URN =		{urn:nbn:de:0030-drops-193143},
  doi =		{10.4230/LIPIcs.ISAAC.2023.12},
  annote =	{Keywords: sublinear-space algorithm, string algorithm, substring complexity}
}
Document
New Support Size Bounds for Integer Programming, Applied to Makespan Minimization on Uniformly Related Machines

Authors: Sebastian Berndt, Hauke Brinkop, Klaus Jansen, Matthias Mnich, and Tobias Stamm


Abstract
Mixed-integer linear programming (MILP) is at the core of many advanced algorithms for solving fundamental problems in combinatorial optimization. The complexity of solving MILPs directly correlates with their support size, which is the minimum number of non-zero integer variables in an optimal solution. A hallmark result by Eisenbrand and Shmonin (Oper. Res. Lett., 2006) shows that any feasible integer linear program (ILP) has a solution with support size s ≤ 2m⋅log(4mΔ), where m is the number of constraints, and Δ is the largest absolute coefficient in any constraint. Our main combinatorial results are improved support size bounds for ILPs. We show that any ILP has a solution with support size s ≤ m⋅(log(3A_max)+√{log(A_max)}), where A_max ≔ ‖A‖₁ denotes the 1-norm of the constraint matrix A. Furthermore, we show support bounds in the linearized form s ≤ 2m⋅log(1.46 A_max). Our upper bounds also hold with A_max replaced by √mΔ, which improves on the previously best constants in the linearized form. Our main algorithmic results are the fastest known approximation schemes for fundamental scheduling problems, which use the improved support bounds as one ingredient. We design an efficient approximation scheme (EPTAS) for makespan minimization on uniformly related machines (Q||C_{max}). Our EPTAS yields a (1+ε)-approximation for Q||C_{max} on N jobs in time 2^𝒪(1/ε log³(1/ε)log(log(1/ε))) + 𝒪(N), which improves over the previously fastest algorithm by Jansen, Klein and Verschae (Math. Oper. Res., 2020) with run time 2^𝒪(1/ε log⁴(1/ε)) + N^𝒪(1). Arguably, our approximation scheme is also simpler than all previous EPTASes for Q||C_{max}, as we reduce the problem to a novel MILP formulation which greatly benefits from the small support.
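
To get a feel for the improvement, the snippet below evaluates the classical bound 2m⋅log(4mΔ) and the new bound m⋅(log(3A_max)+√(log(A_max))) with A_max replaced by √(mΔ), for one hypothetical choice of m and Δ; base-2 logarithms are assumed here, which is an assumption of this illustration rather than a statement from the paper.

from math import log2, sqrt

def eisenbrand_shmonin_bound(m, Delta):
    # Classical support size bound 2m * log2(4 m Delta), assuming base-2 logs.
    return 2 * m * log2(4 * m * Delta)

def new_support_bound(m, Delta):
    # New bound m * (log2(3 A_max) + sqrt(log2(A_max))) with A_max = sqrt(m * Delta).
    a_max = sqrt(m * Delta)
    return m * (log2(3 * a_max) + sqrt(log2(a_max)))

m, Delta = 10, 100   # hypothetical instance: 10 constraints, coefficients up to 100
print(round(eisenbrand_shmonin_bound(m, Delta), 1))  # about 239.3
print(round(new_support_bound(m, Delta), 1))         # about 88.0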

Cite as

Sebastian Berndt, Hauke Brinkop, Klaus Jansen, Matthias Mnich, and Tobias Stamm. New Support Size Bounds for Integer Programming, Applied to Makespan Minimization on Uniformly Related Machines. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 13:1-13:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{berndt_et_al:LIPIcs.ISAAC.2023.13,
  author =	{Berndt, Sebastian and Brinkop, Hauke and Jansen, Klaus and Mnich, Matthias and Stamm, Tobias},
  title =	{{New Support Size Bounds for Integer Programming, Applied to Makespan Minimization on Uniformly Related Machines}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{13:1--13:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.13},
  URN =		{urn:nbn:de:0030-drops-193155},
  doi =		{10.4230/LIPIcs.ISAAC.2023.13},
  annote =	{Keywords: Integer programming, scheduling algorithms, uniformly related machines, makespan minimization}
}
Document
Improved Guarantees for the a Priori TSP

Authors: Jannis Blauth, Meike Neuwohner, Luise Puhlmann, and Jens Vygen


Abstract
We revisit the a priori TSP (with independent activation) and prove stronger approximation guarantees than were previously known. In the a priori TSP, we are given a metric space (V,c) and an activation probability p(v) for each customer v ∈ V. We ask for a TSP tour T for V that minimizes the expected length after cutting T short by skipping the inactive customers. All known approximation algorithms select a nonempty subset S of the customers and construct a master route solution, consisting of a TSP tour for S and two edges connecting every customer v ∈ V⧵S to a nearest customer in S. We address the following questions. If we randomly sample the subset S, what should be the sampling probabilities? How much worse than the optimum can the best master route solution be? The answers to these questions (we provide almost matching lower and upper bounds) lead to improved approximation guarantees: less than 3.1 with randomized sampling, and less than 5.9 with a deterministic polynomial-time algorithm.

Cite as

Jannis Blauth, Meike Neuwohner, Luise Puhlmann, and Jens Vygen. Improved Guarantees for the a Priori TSP. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 14:1-14:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{blauth_et_al:LIPIcs.ISAAC.2023.14,
  author =	{Blauth, Jannis and Neuwohner, Meike and Puhlmann, Luise and Vygen, Jens},
  title =	{{Improved Guarantees for the a Priori TSP}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{14:1--14:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.14},
  URN =		{urn:nbn:de:0030-drops-193161},
  doi =		{10.4230/LIPIcs.ISAAC.2023.14},
  annote =	{Keywords: A priori TSP, random sampling, stochastic combinatorial optimization}
}
Document
An FPT Algorithm for Splitting a Necklace Among Two Thieves

Authors: Michaela Borzechowski, Patrick Schnider, and Simon Weber


Abstract
It is well-known that the 2-Thief-Necklace-Splitting problem reduces to the discrete Ham Sandwich problem. In fact, this reduction was crucial in the proof of the PPA-completeness of the Ham Sandwich problem [Filos-Ratsikas and Goldberg, STOC'19]. Recently, a variant of the Ham Sandwich problem called α-Ham Sandwich has been studied, in which the point sets are guaranteed to be well-separated [Steiger and Zhao, DCG'10]. The complexity of this search problem remains unknown, but it is known to lie in the complexity class UEOPL [Chiu, Choudhary and Mulzer, ICALP'20]. We define the analogue of this well-separation condition in the necklace splitting problem - a necklace is n-separable, if every subset A of the n types of jewels can be separated from the types [n]⧵A by at most n separator points. Since this version of necklace splitting reduces to α-Ham Sandwich in a solution-preserving way it follows that instances of this version always have unique solutions. We furthermore provide two FPT algorithms: The first FPT algorithm solves 2-Thief-Necklace-Splitting on (n-1+𝓁)-separable necklaces with n types of jewels and m total jewels in time 2^O(𝓁log𝓁) + O(m²). In particular, this shows that 2-Thief-Necklace-Splitting is polynomial-time solvable on n-separable necklaces. Thus, attempts to show hardness of α-Ham Sandwich through reduction from the 2-Thief-Necklace-Splitting problem cannot work. The second FPT algorithm tests (n-1+𝓁)-separability of a given necklace with n types of jewels in time 2^O(𝓁²) ⋅ n⁴. In particular, n-separability can thus be tested in polynomial time, even though testing well-separation of point sets is co-NP-complete [Bergold et al., SWAT'22].

Cite as

Michaela Borzechowski, Patrick Schnider, and Simon Weber. An FPT Algorithm for Splitting a Necklace Among Two Thieves. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 15:1-15:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{borzechowski_et_al:LIPIcs.ISAAC.2023.15,
  author =	{Borzechowski, Michaela and Schnider, Patrick and Weber, Simon},
  title =	{{An FPT Algorithm for Splitting a Necklace Among Two Thieves}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{15:1--15:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.15},
  URN =		{urn:nbn:de:0030-drops-193178},
  doi =		{10.4230/LIPIcs.ISAAC.2023.15},
  annote =	{Keywords: Necklace splitting, n-separability, well-separation, ham sandwich, FPT}
}
Document
Fast Convolutions for Near-Convex Sequences

Authors: Cornelius Brand and Alexandra Lassota


Abstract
We develop algorithms for (min,+)-Convolution and related convolution problems such as Super Additivity Testing, Convolution 3-Sum and Minimum Consecutive Subsums which use the degree of convexity of the instance as a parameter. Assuming the min-plus conjecture (Künnemann-Paturi-Schneider, ICALP'17 and Cygan et al., ICALP'17), our results interpolate in an optimal manner between fully convex instances, which can be solved in near-linear time using Legendre transformations, and general non-convex sequences, where the trivial quadratic-time algorithm is conjectured to be best possible, up to subpolynomial factors.
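
For reference, the problem being parameterised is easy to state: the (min,+)-convolution of sequences a and b is the sequence c with c_k = min over i+j = k of (a_i + b_j). Below is the trivial quadratic-time baseline that the paper's algorithms improve upon for near-convex inputs; the example sequences are illustrative.

def min_plus_convolution(a, b):
    # c[k] = min(a[i] + b[j] over all i + j == k); trivial O(len(a)*len(b)) algorithm.
    n, m = len(a), len(b)
    c = [float("inf")] * (n + m - 1)
    for i in range(n):
        for j in range(m):
            c[i + j] = min(c[i + j], a[i] + b[j])
    return c

print(min_plus_convolution([0, 2, 5], [0, 1, 4]))  # [0, 1, 3, 6, 9]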

Cite as

Cornelius Brand and Alexandra Lassota. Fast Convolutions for Near-Convex Sequences. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 16:1-16:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{brand_et_al:LIPIcs.ISAAC.2023.16,
  author =	{Brand, Cornelius and Lassota, Alexandra},
  title =	{{Fast Convolutions for Near-Convex Sequences}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{16:1--16:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.16},
  URN =		{urn:nbn:de:0030-drops-193188},
  doi =		{10.4230/LIPIcs.ISAAC.2023.16},
  annote =	{Keywords: (min,+)-convolution, fine-grained complexity, convex sequences}
}
Document
Matrix Completion: Approximating the Minimum Diameter

Authors: Diptarka Chakraborty and Sanjana Dey


Abstract
In this paper, we focus on the matrix completion problem and aim to minimize the diameter over an arbitrary alphabet. Given a matrix M with missing entries, our objective is to complete the matrix by filling in the missing entries in a way that minimizes the maximum (Hamming) distance between any pair of rows in the completed matrix (also known as the diameter of the matrix). It is worth noting that this problem is already known to be NP-hard. Currently, the best-known upper bound is a 4-approximation algorithm derived by applying the triangle inequality together with a well-known 2-approximation algorithm for the radius minimization variant. In this work, we make the following contributions: - We present a novel 3-approximation algorithm for the diameter minimization variant of the matrix completion problem. To the best of our knowledge, this is the first approximation result that breaks below the straightforward 4-factor bound. - Furthermore, we establish that the diameter minimization variant of the matrix completion problem is (2-ε)-inapproximable, for any ε > 0, even when considering a binary alphabet, under the assumption that 𝖯 ≠ NP. This is the first result that demonstrates a hardness of approximation for this problem.
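
The objective is straightforward to evaluate once a completion is fixed: the diameter of a completed matrix is the maximum Hamming distance over all pairs of rows. The sketch below is a brute-force evaluator of that quantity only; it does not choose how to fill the missing entries, which is what the paper's 3-approximation algorithm does. The matrix is illustrative.

from itertools import combinations

def hamming(u, v):
    # Number of coordinates in which two rows differ.
    return sum(x != y for x, y in zip(u, v))

def diameter(matrix):
    # Maximum pairwise Hamming distance between the rows of a completed matrix.
    return max((hamming(u, v) for u, v in combinations(matrix, 2)), default=0)

M = [
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
]
print(diameter(M))  # 3: rows 0 and 2 differ in columns 0, 2 and 3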

Cite as

Diptarka Chakraborty and Sanjana Dey. Matrix Completion: Approximating the Minimum Diameter. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 17:1-17:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{chakraborty_et_al:LIPIcs.ISAAC.2023.17,
  author =	{Chakraborty, Diptarka and Dey, Sanjana},
  title =	{{Matrix Completion: Approximating the Minimum Diameter}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{17:1--17:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.17},
  URN =		{urn:nbn:de:0030-drops-193197},
  doi =		{10.4230/LIPIcs.ISAAC.2023.17},
  annote =	{Keywords: Incomplete Data, Matrix Completion, Hamming Distance, Diameter Minimization, Approximation Algorithms, Hardness of Approximation}
}
Document
Distance Queries over Dynamic Interval Graphs

Authors: Jingbang Chen, Meng He, J. Ian Munro, Richard Peng, Kaiyu Wu, and Daniel J. Zhang


Abstract
We design the first dynamic distance oracles for interval graphs, which are intersection graphs of a set of intervals on the real line, and for proper interval graphs, which are intersection graphs of a set of intervals in which no interval is properly contained in another. For proper interval graphs, we design a linear space data structure which supports distance queries (computing the distance between two query vertices) and vertex insertion or deletion in O(lg n) worst-case time, where n is the number of vertices currently in G. Under incremental (insertion only) or decremental (deletion only) settings, we design linear space data structures that support distance queries in O(lg n) worst-case time and vertex insertion or deletion in O(lg n) amortized time, where n is the maximum number of vertices in the graph. Under fully dynamic settings, we design a data structure that represents an interval graph G in O(n) words of space to support distance queries in O(n lg n/S(n)) worst-case time and vertex insertion or deletion in O(S(n)+lg n) worst-case time, where n is the number of vertices currently in G and S(n) is an arbitrary function that satisfies S(n) = Ω(1) and S(n) = O(n). This implies an O(n)-word solution with O(√{nlg n})-time support for both distance queries and updates. All four data structures can answer shortest path queries by reporting the vertices in the shortest path between two query vertices in O(lg n) worst-case time per vertex. We also study the hardness of supporting distance queries under updates over an intersection graph of 3D axis-aligned line segments, which generalizes our problem to 3D. Finally, we solve the problem of computing the diameter of a dynamic connected interval graph.
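
For background, distances in a static interval graph can be computed by a textbook greedy that repeatedly hops to an overlapping interval reaching furthest to the right; the sketch below shows this static routine (it is not the paper's dynamic data structure, and the intervals are illustrative).

def interval_distance(intervals, s, t):
    # Number of edges on a shortest path between intervals s and t (indices into
    # `intervals`, a list of closed intervals (l, r)) in their intersection graph.
    # Returns None if s and t lie in different connected components.
    if s == t:
        return 0
    (sl, sr), (tl, tr) = intervals[s], intervals[t]
    if sr >= tl and tr >= sl:            # the two intervals already overlap
        return 1
    if tr < sl:                          # ensure we walk from left to right
        (sl, sr), (tl, tr) = (tl, tr), (sl, sr)
    reach, hops = sr, 1
    while reach < tl:                    # target not yet reachable in one more hop
        nxt = max((r for l, r in intervals if l <= reach < r), default=None)
        if nxt is None:
            return None                  # nothing extends the reach: disconnected
        reach, hops = nxt, hops + 1
    return hops

intervals = [(0, 1), (1, 3), (3, 4), (6, 7)]
print(interval_distance(intervals, 0, 2))  # 2: path (0,1) - (1,3) - (3,4)
print(interval_distance(intervals, 0, 3))  # None: (6,7) lies in another component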

Cite as

Jingbang Chen, Meng He, J. Ian Munro, Richard Peng, Kaiyu Wu, and Daniel J. Zhang. Distance Queries over Dynamic Interval Graphs. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 18:1-18:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{chen_et_al:LIPIcs.ISAAC.2023.18,
  author =	{Chen, Jingbang and He, Meng and Munro, J. Ian and Peng, Richard and Wu, Kaiyu and Zhang, Daniel J.},
  title =	{{Distance Queries over Dynamic Interval Graphs}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{18:1--18:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.18},
  URN =		{urn:nbn:de:0030-drops-193207},
  doi =		{10.4230/LIPIcs.ISAAC.2023.18},
  annote =	{Keywords: interval graph, proper interval graph, intersection graph, geometric intersection graph, distance oracle, distance query, shortest path query, dynamic graph}
}
Document
FPT Approximation Using Treewidth: Capacitated Vertex Cover, Target Set Selection and Vector Dominating Set

Authors: Huairui Chu and Bingkai Lin


Abstract
Treewidth is a useful tool in designing graph algorithms. Although many NP-hard graph problems can be solved in linear time when the input graphs have small treewidth, there are problems which remain hard on graphs of bounded treewidth. In this paper, we consider three vertex selection problems that are W[1]-hard when parameterized by the treewidth of the input graph, namely the capacitated vertex cover problem, the target set selection problem and the vector dominating set problem. We provide two new methods to obtain FPT approximation algorithms for these problems. For the capacitated vertex cover problem and the vector dominating set problem, we obtain (1+o(1))-approximation FPT algorithms. For the target set selection problem, we give an FPT algorithm providing a tradeoff between its running time and the approximation ratio.

Cite as

Huairui Chu and Bingkai Lin. FPT Approximation Using Treewidth: Capacitated Vertex Cover, Target Set Selection and Vector Dominating Set. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 19:1-19:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{chu_et_al:LIPIcs.ISAAC.2023.19,
  author =	{Chu, Huairui and Lin, Bingkai},
  title =	{{FPT Approximation Using Treewidth: Capacitated Vertex Cover, Target Set Selection and Vector Dominating Set}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{19:1--19:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.19},
  URN =		{urn:nbn:de:0030-drops-193216},
  doi =		{10.4230/LIPIcs.ISAAC.2023.19},
  annote =	{Keywords: FPT approximation algorithm, Treewidth, Capacitated vertex cover, Target set selection, Vector dominating set}
}
Document
Improved Approximation for Two-Dimensional Vector Multiple Knapsack

Authors: Tomer Cohen, Ariel Kulik, and Hadas Shachnai


Abstract
We study the uniform 2-dimensional vector multiple knapsack (2VMK) problem, a natural variant of multiple knapsack arising in real-world applications such as virtual machine placement. The input for 2VMK is a set of items, each associated with a 2-dimensional weight vector and a positive profit, along with m 2-dimensional bins of uniform (unit) capacity in each dimension. The goal is to find an assignment of a subset of the items to the bins, such that the total weight of items assigned to a single bin is at most one in each dimension, and the total profit is maximized. Our main result is a (1 - (ln 2)/2 - ε)-approximation algorithm for 2VMK, for every fixed ε > 0, thus improving the best known ratio of (1 - 1/e - ε) which follows as a special case from a result of [Fleischer et al., MOR 2011]. Our algorithm relies on an adaptation of the Round&Approx framework of [Bansal et al., SICOMP 2010], originally designed for set covering problems, to maximization problems. The algorithm uses randomized rounding of a configuration-LP solution to assign items to ≈ m⋅ln 2 ≈ 0.693⋅m of the bins, followed by a reduction to the (1-dimensional) Multiple Knapsack problem for assigning items to the remaining bins.

Cite as

Tomer Cohen, Ariel Kulik, and Hadas Shachnai. Improved Approximation for Two-Dimensional Vector Multiple Knapsack. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 20:1-20:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


BibTeX

@InProceedings{cohen_et_al:LIPIcs.ISAAC.2023.20,
  author =	{Cohen, Tomer and Kulik, Ariel and Shachnai, Hadas},
  title =	{{Improved Approximation for Two-Dimensional Vector Multiple Knapsack}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{20:1--20:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.20},
  URN =		{urn:nbn:de:0030-drops-193229},
  doi =		{10.4230/LIPIcs.ISAAC.2023.20},
  annote =	{Keywords: vector multiple knapsack, two-dimensional packing, randomized rounding, approximation algorithms}
}
Document
A Compact DAG for Storing and Searching Maximal Common Subsequences

Authors: Alessio Conte, Roberto Grossi, Giulia Punzi, and Takeaki Uno


Abstract
Maximal Common Subsequences (MCSs) between two strings X and Y are subsequences of both X and Y that are maximal under inclusion. MCSs relax and generalize the well-known and widely used concept of Longest Common Subsequences (LCSs), which can be seen as MCSs of maximum length. While the number of both LCSs and MCSs can be exponential in the length of the strings, LCSs have been long exploited for string and text analysis, as simple compact representations of all LCSs between two strings, built via dynamic programming or automata, have been known since the '70s. MCSs appear to have a more challenging structure: even listing them efficiently was an open problem until recently; its solution narrowed the complexity difference between the two problems, but the gap remained significant. In this paper we close the complexity gap: we show how to build a DAG of polynomial size - in polynomial time - which allows for efficient operations on the set of all MCSs such as enumeration in Constant Amortized Time per solution (CAT), counting, and random access to the i-th element (i.e., rank and select operations). Other than improving known algorithmic results, this work paves the way for new sequence analysis methods based on MCSs.
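
A brute-force check of the MCS definition above (not the compact DAG construction of the paper): S is an MCS of X and Y exactly when it is a common subsequence and no single-character insertion keeps it one. The example strings are hypothetical.

# Brute-force maximal-common-subsequence test, exponential in spirit but tiny here.

def is_subsequence(s, t):
    it = iter(t)
    return all(ch in it for ch in s)      # consumes t left to right

def is_mcs(s, x, y):
    if not (is_subsequence(s, x) and is_subsequence(s, y)):
        return False
    alphabet = set(x) & set(y)
    for pos in range(len(s) + 1):                 # try every insertion point
        for ch in alphabet:
            longer = s[:pos] + ch + s[pos:]
            if is_subsequence(longer, x) and is_subsequence(longer, y):
                return False                      # s can be extended, not maximal
    return True

print(is_mcs("ab", "acb", "abc"))   # True: "ab" cannot be extended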

Cite as

Alessio Conte, Roberto Grossi, Giulia Punzi, and Takeaki Uno. A Compact DAG for Storing and Searching Maximal Common Subsequences. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 21:1-21:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{conte_et_al:LIPIcs.ISAAC.2023.21,
  author =	{Conte, Alessio and Grossi, Roberto and Punzi, Giulia and Uno, Takeaki},
  title =	{{A Compact DAG for Storing and Searching Maximal Common Subsequences}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{21:1--21:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.21},
  URN =		{urn:nbn:de:0030-drops-193231},
  doi =		{10.4230/LIPIcs.ISAAC.2023.21},
  annote =	{Keywords: Maximal common subsequence, DAG, Compact data structures, Enumeration, Constant amortized time, Random access}
}
Document
Prefix Sorting DFAs: A Recursive Algorithm

Authors: Nicola Cotumaccio


Abstract
In the past thirty years, numerous algorithms for building the suffix array of a string have been proposed. In 2021, the notion of suffix array was extended from strings to DFAs, and it was shown that the resulting data structure can be built in O(m² + n^{5/2}) time, where n is the number of states and m is the number of edges [SODA 2021]. Recently, algorithms running in O(mn) and O(n²log n) time have been described [CPM 2023]. In this paper, we improve the previous bounds by proposing an O(n²) recursive algorithm inspired by Farach’s algorithm for building a suffix tree [FOCS 1997]. To this end, we provide insight into the rich lexicographic and combinatorial structure of a graph, thus contributing to the fascinating journey that might lead to solving the long-standing open problem of building the suffix tree of a graph.

Cite as

Nicola Cotumaccio. Prefix Sorting DFAs: A Recursive Algorithm. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 22:1-22:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{cotumaccio:LIPIcs.ISAAC.2023.22,
  author =	{Cotumaccio, Nicola},
  title =	{{Prefix Sorting DFAs: A Recursive Algorithm}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{22:1--22:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.22},
  URN =		{urn:nbn:de:0030-drops-193242},
  doi =		{10.4230/LIPIcs.ISAAC.2023.22},
  annote =	{Keywords: Suffix Array, Burrows-Wheeler Transform, FM-index, Recursive Algorithms, Graph Theory, Pattern Matching}
}
Document
Clustering in Polygonal Domains

Authors: Mark de Berg, Leyla Biabani, Morteza Monemizadeh, and Leonidas Theocharous


Abstract
We study various clustering problems for a set D of n points in a polygonal domain P under the geodesic distance. We start by studying the discrete k-median problem for D in P. We develop an exact algorithm which runs in time poly(n,m) + n^O(√k), where m is the complexity of the domain. Subsequently, we show that our approach can also be applied to solve the k-center problem with z outliers in the same running time. Next, we turn our attention to approximation algorithms. In particular, we study the k-center problem in a simple polygon and show how to obtain a (1+ε)-approximation algorithm which runs in time 2^{O((k log(k))/ε)} (n log(m) + m). To obtain this, we demonstrate that a previous approach by Bădoiu et al. [Bădoiu et al., 2002; Bădoiu and Clarkson, 2003] that works in ℝ^d carries over to the setting of simple polygons. Finally, we study the 1-center problem in a simple polygon in the presence of z outliers. We show that a coreset C of size O(z) exists, such that the 1-center of C is a 3-approximation of the 1-center of D, when z outliers are allowed. This result is actually more general and carries over to any metric space, which to the best of our knowledge was not known so far. By extending this approach, we show that for the 1-center problem under the Euclidean metric in ℝ², there exists an ε-coreset of size O(z/ε).

Cite as

Mark de Berg, Leyla Biabani, Morteza Monemizadeh, and Leonidas Theocharous. Clustering in Polygonal Domains. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 23:1-23:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{deberg_et_al:LIPIcs.ISAAC.2023.23,
  author =	{de Berg, Mark and Biabani, Leyla and Monemizadeh, Morteza and Theocharous, Leonidas},
  title =	{{Clustering in Polygonal Domains}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{23:1--23:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.23},
  URN =		{urn:nbn:de:0030-drops-193252},
  doi =		{10.4230/LIPIcs.ISAAC.2023.23},
  annote =	{Keywords: clustering, geodesic distance, coreset, outliers}
}
Document
Finding Diverse Minimum s-t Cuts

Authors: Mark de Berg, Andrés López Martínez, and Frits Spieksma


Abstract
Recently, many studies have been devoted to finding diverse solutions in classical combinatorial problems, such as Vertex Cover (Baste et al., IJCAI'20), Matching (Fomin et al., ISAAC'20) and Spanning Tree (Hanaka et al., AAAI'21). Finding diverse solutions is important in settings where the user is not able to specify all criteria of the desired solution. Motivated by an application in the field of system identification, we initiate the algorithmic study of k-Diverse Minimum s-t Cuts which, given a directed graph G = (V, E), two specified vertices s,t ∈ V, and an integer k > 0, asks for a collection of k minimum s-t cuts in G that has maximum diversity. We investigate the complexity of the problem for two diversity measures for a collection of cuts: (i) the sum of all pairwise Hamming distances, and (ii) the cardinality of the union of cuts in the collection. We prove that k-Diverse Minimum s-t Cuts can be solved in strongly polynomial time for both diversity measures via submodular function minimization. We obtain this result by establishing a connection between ordered collections of minimum s-t cuts and the theory of distributive lattices. When restricted to finding only collections of mutually disjoint solutions, we provide a more practical algorithm that finds a maximum set of pairwise disjoint minimum s-t cuts. For graphs with small minimum s-t cut, it runs in the time of a single max-flow computation. These results stand in contrast to the problem of finding k diverse global minimum cuts - which is known to be NP-hard even for the disjoint case (Hanaka et al., AAAI'23) - and partially answer a long-standing open question of Wagner (Networks 1990) about improving the complexity of finding disjoint collections of minimum s-t cuts.
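
For concreteness, the two diversity measures discussed above can be computed directly once a collection of cuts is given as edge sets; the edge sets below are hypothetical, and finding the cuts themselves is the problem the paper addresses.

# The two diversity measures: (i) sum of pairwise Hamming distances between cuts
# (size of the symmetric difference of their edge sets), (ii) size of the union.
from itertools import combinations

def sum_pairwise_hamming(cuts):
    return sum(len(a ^ b) for a, b in combinations(cuts, 2))

def union_size(cuts):
    out = set()
    for c in cuts:
        out |= c
    return len(out)

cuts = [{("s", "a"), ("b", "t")}, {("s", "a"), ("c", "t")}, {("a", "t"), ("b", "t")}]
print(sum_pairwise_hamming(cuts), union_size(cuts))   # 8 4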

Cite as

Mark de Berg, Andrés López Martínez, and Frits Spieksma. Finding Diverse Minimum s-t Cuts. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 24:1-24:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{deberg_et_al:LIPIcs.ISAAC.2023.24,
  author =	{de Berg, Mark and L\'{o}pez Mart{\'\i}nez, Andr\'{e}s and Spieksma, Frits},
  title =	{{Finding Diverse Minimum s-t Cuts}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{24:1--24:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.24},
  URN =		{urn:nbn:de:0030-drops-193267},
  doi =		{10.4230/LIPIcs.ISAAC.2023.24},
  annote =	{Keywords: S-T MinCut, Diversity, Lattice Theory, Submodular Function Minimization}
}
Document
Efficient Algorithms for Euclidean Steiner Minimal Tree on Near-Convex Terminal Sets

Authors: Anubhav Dhar, Soumita Hait, and Sudeshna Kolay


Abstract
The Euclidean Steiner Minimal Tree problem takes as input a set P of points in the Euclidean plane and finds the minimum length network interconnecting all the points of P. In this paper, in continuation to the works of [Du et al., 1987] and [Weng and Booth, 1995], we study Euclidean Steiner Minimal Tree when P is formed by the vertices of a pair of regular, concentric and parallel n-gons. We restrict our attention to the cases where the two polygons are not very close to each other. In such cases, we show that Euclidean Steiner Minimal Tree is polynomial-time solvable, and we describe an explicit structure of a Euclidean Steiner minimal tree for P. We also consider point sets P of size n where the number of input points not on the convex hull of P is f(n) ≤ n. We give an exact algorithm with running time 2^𝒪(f(n) log n) for such input point sets P. Note that when f(n) = 𝒪(n/(log n)), our algorithm runs in single-exponential time, and when f(n) = o(n) the running time is 2^o(n log n) which is better than the known algorithm in [Hwang et al., 1992]. We know that no FPTAS exists for Euclidean Steiner Minimal Tree unless P = NP [Garey et al., 1977]. On the other hand FPTASes exist for Euclidean Steiner Minimal Tree on convex point sets [Scott Provan, 1988]. In this paper, we show that if the number of input points in P not belonging to the convex hull of P is 𝒪(log n), then an FPTAS exists for Euclidean Steiner Minimal Tree. In contrast, we show that for any ε ∈ (0,1], when there are Ω(n^ε) points not belonging to the convex hull of the input set, then no FPTAS can exist for Euclidean Steiner Minimal Tree unless P = NP.
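
The following is not the paper's algorithm; it only illustrates a trivially computable baseline under the definition above: the Euclidean minimum spanning tree of the terminal set P is itself a feasible interconnection (it simply uses no Steiner points), so its length upper-bounds the length of a Euclidean Steiner minimal tree. Prim's algorithm on hypothetical points:

# Euclidean MST length via Prim's algorithm, O(n^2); an upper bound on SMT length.
from math import dist

def euclidean_mst_length(points):
    n = len(points)
    in_tree = [False] * n
    best = [float("inf")] * n
    best[0], total = 0.0, 0.0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        total += best[u]
        for v in range(n):
            if not in_tree[v]:
                best[v] = min(best[v], dist(points[u], points[v]))
    return total

P = [(0, 0), (1, 0), (0, 1), (1, 1)]      # unit square: MST length 3.0
print(euclidean_mst_length(P))            # an SMT would use Steiner points and be shorter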

Cite as

Anubhav Dhar, Soumita Hait, and Sudeshna Kolay. Efficient Algorithms for Euclidean Steiner Minimal Tree on Near-Convex Terminal Sets. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 25:1-25:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{dhar_et_al:LIPIcs.ISAAC.2023.25,
  author =	{Dhar, Anubhav and Hait, Soumita and Kolay, Sudeshna},
  title =	{{Efficient Algorithms for Euclidean Steiner Minimal Tree on Near-Convex Terminal Sets}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{25:1--25:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.25},
  URN =		{urn:nbn:de:0030-drops-193273},
  doi =		{10.4230/LIPIcs.ISAAC.2023.25},
  annote =	{Keywords: Steiner minimal tree, Euclidean Geometry, Almost Convex point sets, FPTAS, strong NP-completeness}
}
Document
Rectilinear-Upward Planarity Testing of Digraphs

Authors: Walter Didimo, Michael Kaufmann, Giuseppe Liotta, Giacomo Ortali, and Maurizio Patrignani


Abstract
A rectilinear-upward planar drawing of a digraph G is a crossing-free drawing of G where each edge is either a horizontal or a vertical segment, and such that no directed edge points downward. Rectilinear-Upward Planarity Testing is the problem of deciding whether a digraph G admits a rectilinear-upward planar drawing. We show that: (i) Rectilinear-Upward Planarity Testing is NP-complete, even if G is biconnected; (ii) it can be solved in linear time when an upward planar embedding of G is fixed; (iii) the problem is polynomial-time solvable for biconnected digraphs of treewidth at most two, i.e., for digraphs whose underlying undirected graph is a series-parallel graph; (iv) for any biconnected digraph the problem is fixed-parameter tractable when parameterized by the number of sources and sinks in the digraph.

Cite as

Walter Didimo, Michael Kaufmann, Giuseppe Liotta, Giacomo Ortali, and Maurizio Patrignani. Rectilinear-Upward Planarity Testing of Digraphs. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 26:1-26:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{didimo_et_al:LIPIcs.ISAAC.2023.26,
  author =	{Didimo, Walter and Kaufmann, Michael and Liotta, Giuseppe and Ortali, Giacomo and Patrignani, Maurizio},
  title =	{{Rectilinear-Upward Planarity Testing of Digraphs}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{26:1--26:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.26},
  URN =		{urn:nbn:de:0030-drops-193283},
  doi =		{10.4230/LIPIcs.ISAAC.2023.26},
  annote =	{Keywords: Graph drawing, orthogonal drawings, upward drawings, rectilinear planarity, upward planarity}
}
Document
A Unified Worst Case for Classical Simplex and Policy Iteration Pivot Rules

Authors: Yann Disser and Nils Mosis


Abstract
We construct a family of Markov decision processes for which the policy iteration algorithm needs an exponential number of improving switches with Dantzig’s rule, with Bland’s rule, and with the Largest Increase pivot rule. This immediately translates to a family of linear programs for which the simplex algorithm needs an exponential number of pivot steps with the same three pivot rules. Our results yield a unified construction that simultaneously reproduces well-known lower bounds for these classical pivot rules, and we are able to infer that any (deterministic or randomized) combination of them cannot avoid an exponential worst-case behavior. Regarding the policy iteration algorithm, pivot rules typically switch multiple edges simultaneously, and our lower bounds for Dantzig’s rule and the Largest Increase rule, which perform only single switches, seem novel. Regarding the simplex algorithm, the individual lower bounds were previously obtained separately via deformed hypercube constructions. In contrast to previous bounds for the simplex algorithm via Markov decision processes, our rigorous analysis is reasonably concise.
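
To make the pivot rules named above concrete, here is a minimal sketch, under the standard textbook definitions rather than anything taken from the paper, of how Dantzig's and Bland's rules choose the entering column in a maximization simplex step; the Largest Increase rule would additionally need the objective gain of each candidate (a ratio test), which is omitted here.

# Entering-variable selection from a vector of reduced costs (maximization
# convention: a column is eligible if its reduced cost is positive).

def dantzig_entering(reduced_costs):
    best = max(range(len(reduced_costs)), key=lambda j: reduced_costs[j])
    return best if reduced_costs[best] > 0 else None   # None: current basis optimal

def bland_entering(reduced_costs):
    for j, c in enumerate(reduced_costs):
        if c > 0:
            return j                                    # smallest eligible index
    return None

rc = [0.0, 2.5, 0.1, 2.5]
print(dantzig_entering(rc), bland_entering(rc))   # both pick index 1 here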

Cite as

Yann Disser and Nils Mosis. A Unified Worst Case for Classical Simplex and Policy Iteration Pivot Rules. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 27:1-27:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{disser_et_al:LIPIcs.ISAAC.2023.27,
  author =	{Disser, Yann and Mosis, Nils},
  title =	{{A Unified Worst Case for Classical Simplex and Policy Iteration Pivot Rules}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{27:1--27:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.27},
  URN =		{urn:nbn:de:0030-drops-193292},
  doi =		{10.4230/LIPIcs.ISAAC.2023.27},
  annote =	{Keywords: Bland’s pivot rule, Dantzig’s pivot rule, Largest Increase pivot rule, Markov decision process, policy iteration, simplex algorithm}
}
Document
Exact Matching: Correct Parity and FPT Parameterized by Independence Number

Authors: Nicolas El Maalouly, Raphael Steiner, and Lasse Wulf


Abstract
Given an integer k and a graph where every edge is colored either red or blue, the goal of the exact matching problem is to find a perfect matching with the property that exactly k of its edges are red. Soon after Papadimitriou and Yannakakis (JACM 1982) introduced the problem, a randomized polynomial-time algorithm solving the problem was described by Mulmuley et al. (Combinatorica 1987). Despite a lot of effort, it is still not known today whether a deterministic polynomial-time algorithm exists. This makes the exact matching problem an important candidate to test the popular conjecture that the complexity classes P and RP are equal. In a recent article (MFCS 2022), progress was made towards this goal by showing that for bipartite graphs of bounded bipartite independence number, a polynomial time algorithm exists. In terms of parameterized complexity, this algorithm was an XP-algorithm parameterized by the bipartite independence number. In this article, we introduce novel algorithmic techniques that allow us to obtain an FPT-algorithm. If the input is a general graph we show that one can at least compute a perfect matching M which has the correct number of red edges modulo 2, in polynomial time. This is motivated by our last result, in which we prove that an FPT algorithm for general graphs, parameterized by the independence number, reduces to the problem of finding in polynomial time a perfect matching M with at most k red edges and the correct number of red edges modulo 2.
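
A brute-force illustration of the Exact Matching problem as defined above, meant only to make the definition concrete (it runs in exponential time; the open question concerns deterministic polynomial time). The small red/blue graph is hypothetical.

# Decide whether some perfect matching uses exactly k red edges, by exhaustive search.

def has_exact_matching(vertices, red, blue, k):
    edges = {frozenset(e): "red" for e in red}
    edges.update({frozenset(e): "blue" for e in blue})

    def solve(unmatched, red_used):
        if not unmatched:
            return red_used == k
        v = min(unmatched)                     # match the smallest unmatched vertex
        for u in unmatched - {v}:
            colour = edges.get(frozenset((u, v)))
            if colour is None:
                continue
            if solve(unmatched - {u, v}, red_used + (colour == "red")):
                return True
        return False

    return solve(frozenset(vertices), 0)

# A 4-cycle with one red edge: exactly one of its two perfect matchings uses it.
V = [1, 2, 3, 4]
red, blue = [(1, 2)], [(2, 3), (3, 4), (4, 1)]
print(has_exact_matching(V, red, blue, 1), has_exact_matching(V, red, blue, 2))   # True False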

Cite as

Nicolas El Maalouly, Raphael Steiner, and Lasse Wulf. Exact Matching: Correct Parity and FPT Parameterized by Independence Number. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 28:1-28:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{elmaalouly_et_al:LIPIcs.ISAAC.2023.28,
  author =	{El Maalouly, Nicolas and Steiner, Raphael and Wulf, Lasse},
  title =	{{Exact Matching: Correct Parity and FPT Parameterized by Independence Number}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{28:1--28:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.28},
  URN =		{urn:nbn:de:0030-drops-193302},
  doi =		{10.4230/LIPIcs.ISAAC.2023.28},
  annote =	{Keywords: Perfect Matching, Exact Matching, Independence Number, Parameterized Complexity}
}
Document
Approximation Guarantees for Shortest Superstrings: Simpler and Better

Authors: Matthias Englert, Nicolaos Matsakis, and Pavel Veselý


Abstract
The Shortest Superstring problem is an NP-hard problem, in which given as input a set of strings, we are looking for a string of minimum length that contains all input strings as substrings. The Greedy Conjecture (Tarhio and Ukkonen, 1988) states that the GREEDY algorithm, which repeatedly merges the two strings of maximum overlap, is 2-approximate. We have recently shown (STOC 2022) that the approximation guarantee of GREEDY is at most (13+√{57})/6 ≈ 3.425. Before that, the best established upper bound for this was 3.5 by Kaplan and Shafrir (IPL 2005), which improved upon the upper bound of 4 by Blum et al. (STOC 1991). To derive our previous result, we established two incomparable upper bounds on the overlap sum of all cycle-closing edges in an optimal cycle cover and utilized lemmas of Blum et al. We improve the more involved one of the two bounds and, at the same time, make its proof more straightforward. This results in an improved approximation guarantee of (√{67}+2)/3 ≈ 3.396 for GREEDY. Additionally, our result implies an algorithm for the Shortest Superstring problem having an approximation guarantee of (√{67}+14)/9 ≈ 2.466, improving slightly upon the previously best guarantee of (√{57}+37)/18 ≈ 2.475 (STOC 2022).
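
The GREEDY algorithm analyzed above is simple enough to sketch directly: repeatedly merge the two distinct strings with maximum overlap until one string remains. A quadratic toy implementation follows, assuming no input string is a substring of another; the input set is hypothetical.

# GREEDY superstring heuristic with naive overlap computation.

def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def greedy_superstring(strings):
    strings = list(strings)
    while len(strings) > 1:
        best = (-1, None, None)
        for i, a in enumerate(strings):
            for j, b in enumerate(strings):
                if i != j:
                    ov = overlap(a, b)
                    if ov > best[0]:
                        best = (ov, i, j)
        ov, i, j = best
        merged = strings[i] + strings[j][ov:]
        strings = [s for t, s in enumerate(strings) if t not in (i, j)] + [merged]
    return strings[0]

print(greedy_superstring(["ate", "tea", "eat"]))   # "eatea" for this input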

Cite as

Matthias Englert, Nicolaos Matsakis, and Pavel Veselý. Approximation Guarantees for Shortest Superstrings: Simpler and Better. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 29:1-29:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{englert_et_al:LIPIcs.ISAAC.2023.29,
  author =	{Englert, Matthias and Matsakis, Nicolaos and Vesel\'{y}, Pavel},
  title =	{{Approximation Guarantees for Shortest Superstrings: Simpler and Better}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{29:1--29:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.29},
  URN =		{urn:nbn:de:0030-drops-193319},
  doi =		{10.4230/LIPIcs.ISAAC.2023.29},
  annote =	{Keywords: Shortest Superstring problem, Approximation Algorithms}
}
Document
Rapid Mixing for the Hardcore Glauber Dynamics and Other Markov Chains in Bounded-Treewidth Graphs

Authors: David Eppstein and Daniel Frishberg


Abstract
We give a new rapid mixing result for a natural random walk on the independent sets of a graph G. We show that when G has bounded treewidth, this random walk - known as the Glauber dynamics for the hardcore model - mixes rapidly for all fixed values of the standard parameter λ > 0, giving a simple alternative to existing sampling algorithms for these structures. We also show rapid mixing for analogous Markov chains on dominating sets, b-edge covers, b-matchings, maximal independent sets, and maximal b-matchings. (For b-matchings, maximal independent sets, and maximal b-matchings we also require bounded degree.) Our results imply simpler alternatives to known algorithms for the sampling and approximate counting problems in these graphs. We prove our results by applying a divide-and-conquer framework we developed in a previous paper, as an alternative to the projection-restriction technique introduced by Jerrum, Son, Tetali, and Vigoda. We extend this prior framework to handle chains for which the application of that framework is not straightforward, strengthening existing results by Dyer, Goldberg, and Jerrum and by Heinrich for the Glauber dynamics on q-colorings of graphs of bounded treewidth and bounded degree.
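
For concreteness, a sketch of the hardcore Glauber dynamics referenced above, in its standard heat-bath form (the update rule is assumed from the usual literature, and the toy graph and parameter are hypothetical): pick a uniformly random vertex and resample its membership in the independent set conditioned on its neighbours.

# One step of the heat-bath Glauber dynamics for the hardcore model with fugacity lam.
import random

def hardcore_glauber_step(adj, indep_set, lam, rng=random):
    v = rng.choice(list(adj))
    indep_set.discard(v)                                   # resample v's state
    occupied_neighbour = any(u in indep_set for u in adj[v])
    if not occupied_neighbour and rng.random() < lam / (1.0 + lam):
        indep_set.add(v)
    return indep_set

adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}         # small example graph
state = set()
for _ in range(1000):
    state = hardcore_glauber_step(adj, state, lam=1.0)
print(state)                                               # a random independent set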

Cite as

David Eppstein and Daniel Frishberg. Rapid Mixing for the Hardcore Glauber Dynamics and Other Markov Chains in Bounded-Treewidth Graphs. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 30:1-30:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{eppstein_et_al:LIPIcs.ISAAC.2023.30,
  author =	{Eppstein, David and Frishberg, Daniel},
  title =	{{Rapid Mixing for the Hardcore Glauber Dynamics and Other Markov Chains in Bounded-Treewidth Graphs}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{30:1--30:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.30},
  URN =		{urn:nbn:de:0030-drops-193324},
  doi =		{10.4230/LIPIcs.ISAAC.2023.30},
  annote =	{Keywords: Glauber dynamics, mixing time, projection-restriction, multicommodity flow}
}
Document
Matching Cuts in Graphs of High Girth and H-Free Graphs

Authors: Carl Feghali, Felicia Lucke, Daniël Paulusma, and Bernard Ries


Abstract
The (Perfect) Matching Cut problem is to decide if a connected graph has a (perfect) matching that is also an edge cut. The Disconnected Perfect Matching problem is to decide if a connected graph has a perfect matching that contains a matching cut. Both Matching Cut and Disconnected Perfect Matching are NP-complete for planar graphs of girth 5, whereas Perfect Matching Cut is known to be NP-complete even for subcubic bipartite graphs of arbitrarily large fixed girth. We prove that Matching Cut and Disconnected Perfect Matching are also NP-complete for bipartite graphs of arbitrarily large fixed girth and bounded maximum degree. Our result for Matching Cut resolves a 20-year old open problem. We also show that the more general problem d-Cut, for every fixed d ≥ 1, is NP-complete for bipartite graphs of arbitrarily large fixed girth and bounded maximum degree. Furthermore, we show that Matching Cut, Perfect Matching Cut and Disconnected Perfect Matching are NP-complete for H-free graphs whenever H contains a connected component with two vertices of degree at least 3. Afterwards, we update the state-of-the-art summaries for H-free graphs and compare them with each other, and with a known and full classification of the Maximum Matching Cut problem, which is to determine a largest matching cut of a graph G. Finally, by combining existing results, we obtain a complete complexity classification of Perfect Matching Cut for H-subgraph-free graphs where H is any finite set of graphs.
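
A small verifier for the central definition above, on a hypothetical graph: a set of edges M is a matching cut of a connected graph G if M is a matching and removing M disconnects G.

# Check the matching property, then test connectivity of G - M by graph search.
from itertools import chain

def is_matching_cut(vertices, edges, M):
    endpoints = list(chain.from_iterable(M))
    if len(endpoints) != len(set(endpoints)):          # two edges of M share a vertex
        return False
    remaining = [e for e in edges if e not in M and (e[1], e[0]) not in M]
    adj = {v: set() for v in vertices}
    for u, v in remaining:
        adj[u].add(v)
        adj[v].add(u)
    start = next(iter(vertices))
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) < len(vertices)                   # M is a cut iff G - M is disconnected

V = {1, 2, 3, 4}
E = [(1, 2), (2, 3), (3, 4), (4, 1)]                   # a 4-cycle
print(is_matching_cut(V, E, {(1, 2), (3, 4)}))         # True: two opposite edges of C4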

Cite as

Carl Feghali, Felicia Lucke, Daniël Paulusma, and Bernard Ries. Matching Cuts in Graphs of High Girth and H-Free Graphs. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 31:1-31:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{feghali_et_al:LIPIcs.ISAAC.2023.31,
  author =	{Feghali, Carl and Lucke, Felicia and Paulusma, Dani\"{e}l and Ries, Bernard},
  title =	{{Matching Cuts in Graphs of High Girth and H-Free Graphs}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{31:1--31:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.31},
  URN =		{urn:nbn:de:0030-drops-193332},
  doi =		{10.4230/LIPIcs.ISAAC.2023.31},
  annote =	{Keywords: matching cut, perfect matching, girth, H-free graph}
}
Document
Computing Paths of Large Rank in Planar Frameworks Deterministically

Authors: Fedor V. Fomin, Petr A. Golovach, Tuukka Korhonen, and Giannos Stamoulis


Abstract
A framework consists of an undirected graph G and a matroid M whose elements correspond to the vertices of G. Recently, Fomin et al. [SODA 2023] and Eiben et al. [arXiv 2023] developed parameterized algorithms for computing paths of rank k in frameworks. More precisely, for vertices s and t of G, and an integer k, they gave FPT algorithms parameterized by k deciding whether there is an (s,t)-path in G whose vertex set contains a subset of elements of M of rank k. These algorithms are based on the Schwartz-Zippel lemma for polynomial identity testing and are therefore randomized, so the existence of a deterministic FPT algorithm for this problem remains open. We present the first deterministic FPT algorithm that solves the problem in frameworks whose underlying graph G is planar. While the running time of our algorithm is worse than the running times of the recent randomized algorithms, our algorithm works on more general classes of matroids. In particular, this is the first FPT algorithm for the case when the matroid M is represented over the rationals. Our main technical contribution is the nontrivial adaptation of the classic irrelevant vertex technique to frameworks to reduce the given instance to one of bounded treewidth. This allows us to employ the toolbox of representative sets to design a dynamic programming procedure solving the problem efficiently on instances of bounded treewidth.

Cite as

Fedor V. Fomin, Petr A. Golovach, Tuukka Korhonen, and Giannos Stamoulis. Computing Paths of Large Rank in Planar Frameworks Deterministically. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 32:1-32:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{fomin_et_al:LIPIcs.ISAAC.2023.32,
  author =	{Fomin, Fedor V. and Golovach, Petr A. and Korhonen, Tuukka and Stamoulis, Giannos},
  title =	{{Computing Paths of Large Rank in Planar Frameworks Deterministically}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{32:1--32:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.32},
  URN =		{urn:nbn:de:0030-drops-193341},
  doi =		{10.4230/LIPIcs.ISAAC.2023.32},
  annote =	{Keywords: Planar graph, longest path, linear matroid, irrelevant vertex}
}
Document
Pattern-Avoiding Binary Trees - Generation, Counting, and Bijections

Authors: Petr Gregor, Torsten Mütze, and Namrata


Abstract
In this paper we propose a notion of pattern avoidance in binary trees that generalizes the avoidance of contiguous tree patterns studied by Rowland and non-contiguous tree patterns studied by Dairyko, Pudwell, Tyner, and Wynn. Specifically, we propose algorithms for generating different classes of binary trees that are characterized by avoiding one or more of these generalized patterns. This is achieved by applying the recent Hartung-Hoang-Mütze-Williams generation framework, by encoding binary trees via permutations. In particular, we establish a one-to-one correspondence between tree patterns and certain mesh permutation patterns. We also conduct a systematic investigation of all tree patterns on at most 5 vertices, and we establish bijections between pattern-avoiding binary trees and other combinatorial objects, in particular pattern-avoiding lattice paths and set partitions.

Cite as

Petr Gregor, Torsten Mütze, and Namrata. Pattern-Avoiding Binary Trees - Generation, Counting, and Bijections. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 33:1-33:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{gregor_et_al:LIPIcs.ISAAC.2023.33,
  author =	{Gregor, Petr and M\"{u}tze, Torsten and Namrata},
  title =	{{Pattern-Avoiding Binary Trees - Generation, Counting, and Bijections}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{33:1--33:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.33},
  URN =		{urn:nbn:de:0030-drops-193350},
  doi =		{10.4230/LIPIcs.ISAAC.2023.33},
  annote =	{Keywords: Generation, binary tree, pattern avoidance, permutation, bijection}
}
Document
Computing a Subtrajectory Cluster from c-Packed Trajectories

Authors: Joachim Gudmundsson, Zijin Huang, André van Renssen, and Sampson Wong


Abstract
We present a near-linear time approximation algorithm for the subtrajectory cluster problem of c-packed trajectories. Given a trajectory T of complexity n, an approximation factor ε, and a desired distance d, the problem involves finding m subtrajectories of T such that their pair-wise Fréchet distance is at most (1 + ε)d. At least one subtrajectory must be of length l or longer. A trajectory T is c-packed if the intersection of T and any ball B with radius r is at most c⋅r in length. Previous results by Gudmundsson and Wong [Gudmundsson and Wong, 2022] established an Ω(n³) lower bound unless the Strong Exponential Time Hypothesis fails, and they presented an O(n³ log² n) time algorithm. We circumvent this conditional lower bound by studying subtrajectory cluster on c-packed trajectories, resulting in an algorithm with an O((c² n/ε²)log(c/ε)log(n/ε)) time complexity.
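
The guarantee above is stated for the (continuous) Fréchet distance between subtrajectories. Purely to illustrate the similarity measure, and not the clustering algorithm itself, here is the textbook dynamic program for the related discrete Fréchet distance between two polygonal curves; the curves below are hypothetical.

# Discrete Frechet distance in O(nm) via memoized recursion.
from math import dist
from functools import lru_cache

def discrete_frechet(P, Q):
    @lru_cache(maxsize=None)
    def c(i, j):
        d = dist(P[i], Q[j])
        if i == 0 and j == 0:
            return d
        if i == 0:
            return max(c(0, j - 1), d)
        if j == 0:
            return max(c(i - 1, 0), d)
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d)
    return c(len(P) - 1, len(Q) - 1)

P = [(0, 0), (1, 0), (2, 0)]
Q = [(0, 1), (1, 1), (2, 1)]
print(discrete_frechet(P, Q))   # 1.0 for these parallel polylines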

Cite as

Joachim Gudmundsson, Zijin Huang, André van Renssen, and Sampson Wong. Computing a Subtrajectory Cluster from c-Packed Trajectories. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 34:1-34:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{gudmundsson_et_al:LIPIcs.ISAAC.2023.34,
  author =	{Gudmundsson, Joachim and Huang, Zijin and van Renssen, Andr\'{e} and Wong, Sampson},
  title =	{{Computing a Subtrajectory Cluster from c-Packed Trajectories}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{34:1--34:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.34},
  URN =		{urn:nbn:de:0030-drops-193364},
  doi =		{10.4230/LIPIcs.ISAAC.2023.34},
  annote =	{Keywords: Subtrajectory cluster, c-packed trajectories, Computational geometry}
}
Document
Shortest Beer Path Queries in Digraphs with Bounded Treewidth

Authors: Joachim Gudmundsson and Yuan Sha


Abstract
A beer digraph G is a real-valued weighted directed graph where some of the vertices have beer stores. A beer path from a vertex u to a vertex v in G is a path in G from u to v that visits at least one beer store. In this paper we consider the online shortest beer path query in beer digraphs with bounded treewidth t. Assume that a tree decomposition of treewidth t on a beer digraph with n vertices is given. We show that after O(t³n) time preprocessing on the beer digraph, (i) a beer distance query can be answered in O(t³α(n)) time, where α(n) is the inverse Ackermann function, and (ii) a shortest beer path can be reported in O(t³α(n)L) time, where L is the number of edges on the path. In the process we show an improved O(t³α(n)L) time shortest path query algorithm, compared with the currently best O(t⁴α(n)L) time algorithm [Chaudhuri & Zaroliagis, 2000]. We also consider queries in a dynamic setting where the weight of an edge in G can change over time. We show two data structures. Assume t is constant and let β be any constant in (0,1). The first data structure uses O(n) preprocessing time, answers a beer distance query in O(α(n)) time and reports a shortest beer path in O(α(n) L) time. It can be updated in O(n^β) time after an edge weight change. The second data structure has O(n) preprocessing time, answers a beer distance query in O(log n) time, reports a shortest beer path in O(log n + L) time, and can be updated in O(log n) time after an edge weight change.
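
A hedged baseline for the beer-distance notion defined above (not the index structure of this paper): assuming non-negative edge weights, the beer distance from u to v is the minimum over beer vertices b of dist(u,b) + dist(b,v), which one Dijkstra run from u plus one run on the reversed digraph from v already yields per query. The toy digraph is hypothetical.

# Naive beer-distance query via two Dijkstra computations.
import heapq

def dijkstra(adj, src):
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def beer_distance(adj, radj, beer, u, v):
    from_u, to_v = dijkstra(adj, u), dijkstra(radj, v)
    return min((from_u.get(b, float("inf")) + to_v.get(b, float("inf")) for b in beer),
               default=float("inf"))

adj  = {"u": [("b1", 2.0), ("v", 1.0)], "b1": [("v", 2.0)], "v": []}
radj = {"b1": [("u", 2.0)], "v": [("u", 1.0), ("b1", 2.0)], "u": []}
print(beer_distance(adj, radj, {"b1"}, "u", "v"))   # 4.0: u -> b1 -> v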

Cite as

Joachim Gudmundsson and Yuan Sha. Shortest Beer Path Queries in Digraphs with Bounded Treewidth. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 35:1-35:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{gudmundsson_et_al:LIPIcs.ISAAC.2023.35,
  author =	{Gudmundsson, Joachim and Sha, Yuan},
  title =	{{Shortest Beer Path Queries in Digraphs with Bounded Treewidth}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{35:1--35:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.35},
  URN =		{urn:nbn:de:0030-drops-193379},
  doi =		{10.4230/LIPIcs.ISAAC.2023.35},
  annote =	{Keywords: Graph algorithms, Shortest Path, Data structures, Bounded treewidth}
}
Document
Coloring and Recognizing Mixed Interval Graphs

Authors: Grzegorz Gutowski, Konstanty Junosza-Szaniawski, Felix Klesen, Paweł Rzążewski, Alexander Wolff, and Johannes Zink


Abstract
A mixed interval graph is an interval graph that has, for every pair of intersecting intervals, either an arc (directed arbitrarily) or an (undirected) edge. We are particularly interested in scenarios where edges and arcs are defined by the geometry of intervals. In a proper coloring of a mixed interval graph G, an interval u receives a lower (different) color than an interval v if G contains arc (u,v) (edge {u,v}). Coloring of mixed graphs has applications, for example, in scheduling with precedence constraints; see a survey by Sotskov [Mathematics, 2020]. For coloring general mixed interval graphs, we present a min {ω(G), λ(G)+1}-approximation algorithm, where ω(G) is the size of a largest clique and λ(G) is the length of a longest directed path in G. For the subclass of bidirectional interval graphs (introduced recently for an application in graph drawing), we show that optimal coloring is NP-hard. This was known for general mixed interval graphs. We introduce a new natural class of mixed interval graphs, which we call containment interval graphs. In such a graph, there is an arc (u,v) if interval u contains interval v, and there is an edge {u,v} if u and v overlap. We show that these graphs can be recognized in polynomial time, that coloring them with the minimum number of colors is NP-hard, and that there is a 2-approximation algorithm for coloring.
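
A sketch that builds the containment interval graph defined above from a hypothetical set of closed intervals: an arc for containment, an undirected edge when the intervals intersect but neither contains the other (the usual reading of "overlap").

# Construct arcs and edges of a containment interval graph from interval endpoints.

def containment_interval_graph(intervals):
    arcs, edges = [], []
    names = list(intervals)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            (la, ra), (lb, rb) = intervals[a], intervals[b]
            if ra < lb or rb < la:                 # disjoint: no relation
                continue
            if la <= lb and rb <= ra:              # a contains b
                arcs.append((a, b))
            elif lb <= la and ra <= rb:            # b contains a
                arcs.append((b, a))
            else:
                edges.append((a, b))               # proper overlap
    return arcs, edges

intervals = {"x": (0, 10), "y": (4, 5), "z": (3, 12)}
print(containment_interval_graph(intervals))
# ([('x', 'y'), ('z', 'y')], [('x', 'z')])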

Cite as

Grzegorz Gutowski, Konstanty Junosza-Szaniawski, Felix Klesen, Paweł Rzążewski, Alexander Wolff, and Johannes Zink. Coloring and Recognizing Mixed Interval Graphs. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 36:1-36:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{gutowski_et_al:LIPIcs.ISAAC.2023.36,
  author =	{Gutowski, Grzegorz and Junosza-Szaniawski, Konstanty and Klesen, Felix and Rz\k{a}\.{z}ewski, Pawe{\l} and Wolff, Alexander and Zink, Johannes},
  title =	{{Coloring and Recognizing Mixed Interval Graphs}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{36:1--36:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.36},
  URN =		{urn:nbn:de:0030-drops-193388},
  doi =		{10.4230/LIPIcs.ISAAC.2023.36},
  annote =	{Keywords: Interval Graphs, Mixed Graphs, Graph Coloring}
}
Document
Shortest Beer Path Queries Based on Graph Decomposition

Authors: Tesshu Hanaka, Hirotaka Ono, Kunihiko Sadakane, and Kosuke Sugiyama


Abstract
Given a directed edge-weighted graph G = (V, E) with beer vertices B ⊆ V, a beer path between two vertices u and v is a path between u and v that visits at least one beer vertex in B, and the beer distance between two vertices is the length of a shortest beer path between them. We consider indexing problems on beer paths, that is, a graph is given a priori, and we construct some data structures (called indexes) for the graph. Then later, we are given two vertices, and we find the beer distance or beer path between them using the data structure. For such a scheme, efficient algorithms using indexes for the beer distance and beer path queries have been proposed for outerplanar graphs and interval graphs. For example, Bacic et al. (2021) present indexes with size O(n) for outerplanar graphs and an algorithm using them that answers the beer distance between two given vertices in O(α(n)) time, where α(⋅) is the inverse Ackermann function; the performance is shown to be optimal. This paper proposes indexing data structures and algorithms for beer path queries on general graphs based on two types of graph decomposition: the tree decomposition and the triconnected component decomposition. We propose indexes with size O(m+nr²) based on the triconnected component decomposition, where r is the size of the largest triconnected component. For a given query u,v ∈ V, our algorithm using the indexes can output the beer distance in query time O(α(m)). In particular, our indexing data structures and algorithms achieve the optimal performance (the space and the query time) for series-parallel graphs, which form a wider class than outerplanar graphs.

Cite as

Tesshu Hanaka, Hirotaka Ono, Kunihiko Sadakane, and Kosuke Sugiyama. Shortest Beer Path Queries Based on Graph Decomposition. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 37:1-37:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{hanaka_et_al:LIPIcs.ISAAC.2023.37,
  author =	{Hanaka, Tesshu and Ono, Hirotaka and Sadakane, Kunihiko and Sugiyama, Kosuke},
  title =	{{Shortest Beer Path Queries Based on Graph Decomposition}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{37:1--37:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.37},
  URN =		{urn:nbn:de:0030-drops-193397},
  doi =		{10.4230/LIPIcs.ISAAC.2023.37},
  annote =	{Keywords: graph algorithm, shortest path problem, SPQR tree}
}
Document
Temporal Separators with Deadlines

Authors: Hovhannes A. Harutyunyan, Kamran Koupayi, and Denis Pankratov


Abstract
We study temporal analogues of the Unrestricted Vertex Separator problem from the static world. An (s,z)-temporal separator is a set of vertices whose removal disconnects vertex s from vertex z for every time step in a temporal graph. The (s,z)-Temporal Separator problem asks to find the minimum size of an (s,z)-temporal separator for the given temporal graph. The (s,z)-Temporal Separator problem is known to be NP-hard in general, although some special cases (such as bounded treewidth) admit efficient algorithms [Fluschnik et al., 2020]. We introduce a generalization of this problem called the (s,z,t)-Temporal Separator problem, where the goal is to find a smallest subset of vertices whose removal eliminates all temporal paths from s to z which take less than t time steps. Let τ denote the number of time steps over which the temporal graph is defined (we consider discrete time steps). We characterize the set of parameters τ and t when the problem is NP-hard and when it is polynomial time solvable. Then we present a τ-approximation algorithm for the (s,z)-Temporal Separator problem and convert it to a τ²-approximation algorithm for the (s,z,t)-Temporal Separator problem. We also present an inapproximability lower bound of Ω(ln(n) + ln(τ)) for the (s,z,t)-Temporal Separator problem assuming that NP ⊄ DTIME(n^{log log n}). Then we consider three special families of graphs: (1) graphs of branchwidth at most 2, (2) graphs G such that the removal of s and z leaves a tree, and (3) graphs of bounded pathwidth. We present polynomial-time algorithms to find a minimum (s,z,t)-temporal separator for (1) and (2). As for (3), we show a polynomial-time reduction from the Discrete Segment Covering problem with bounded-length segments to the (s,z,t)-Temporal Separator problem where the temporal graph has bounded pathwidth.

Cite as

Hovhannes A. Harutyunyan, Kamran Koupayi, and Denis Pankratov. Temporal Separators with Deadlines. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 38:1-38:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{harutyunyan_et_al:LIPIcs.ISAAC.2023.38,
  author =	{Harutyunyan, Hovhannes A. and Koupayi, Kamran and Pankratov, Denis},
  title =	{{Temporal Separators with Deadlines}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{38:1--38:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.38},
  URN =		{urn:nbn:de:0030-drops-193407},
  doi =		{10.4230/LIPIcs.ISAAC.2023.38},
  annote =	{Keywords: Temporal graphs, dynamic graphs, vertex separator, vertex cut, separating set, deadlines, inapproximability, approximation algorithms}
}
Document
Regularization of Low Error PCPs and an Application to MCSP

Authors: Shuichi Hirahara and Dana Moshkovitz


Abstract
In a regular PCP the verifier queries each proof symbol in the same number of tests. This number is called the degree of the proof, and it is at least 1/(sq) where s is the soundness error and q is the number of queries. It is incredibly useful to have regularity and reduced degree in PCP. There is an expander-based transformation by Papadimitriou and Yannakakis that transforms any PCP with a constant number of queries and constant soundness error to a regular PCP with constant degree. There are also transformations for low error projection and unique PCPs. Other PCPs are constructed especially to be regular. In this work we show how to regularize and reduce degree of PCPs with a possibly large number of queries and low soundness error. As an application, we prove NP-hardness of an unweighted variant of the collective minimum monotone satisfying assignment problem, which was introduced by Hirahara (FOCS'22) to prove NP-hardness of MCSP^* (the partial function variant of the Minimum Circuit Size Problem) under randomized reductions. We present a simplified proof and sufficient conditions under which MCSP^* is NP-hard under the standard notion of reduction: MCSP^* is NP-hard under deterministic polynomial-time many-one reductions if there exists a function in E that satisfies certain direct sum properties.

Cite as

Shuichi Hirahara and Dana Moshkovitz. Regularization of Low Error PCPs and an Application to MCSP. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 39:1-39:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{hirahara_et_al:LIPIcs.ISAAC.2023.39,
  author =	{Hirahara, Shuichi and Moshkovitz, Dana},
  title =	{{Regularization of Low Error PCPs and an Application to MCSP}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{39:1--39:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.39},
  URN =		{urn:nbn:de:0030-drops-193411},
  doi =		{10.4230/LIPIcs.ISAAC.2023.39},
  annote =	{Keywords: PCP theorem, regularization, Minimum Circuit Size Problem}
}
Document
Structural Parameterizations of b-Coloring

Authors: Lars Jaffke, Paloma T. Lima, and Roohani Sharma


Abstract
The b-Coloring problem, which given a graph G and an integer k asks whether G has a proper k-coloring such that each color class has a vertex adjacent to all color classes except its own, is known to be FPT parameterized by the vertex cover number and XP and 𝖶[1]-hard parameterized by clique-width. Its complexity when parameterized by the treewidth of the input graph remained an open problem. We settle this question by showing that b-Coloring is XNLP-complete when parameterized by the pathwidth of the input graph. Besides determining the precise parameterized complexity of this problem, this implies that b-Coloring parameterized by pathwidth is 𝖶[t]-hard for all t, and resolves the parameterized complexity of b-Coloring parameterized by treewidth. We complement this result by showing that b-Coloring is FPT when parameterized by neighborhood diversity and by twin cover, two parameters that generalize vertex cover to more dense graphs, but are incomparable to pathwidth.
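
A direct verifier for the b-coloring definition above, on a hypothetical graph and coloring: the coloring must be proper, and every color class must contain a vertex with a neighbor in every other color class.

# Check whether a given vertex coloring of a graph is a b-coloring.

def is_b_coloring(adj, color):
    classes = set(color.values())
    if any(color[u] == color[v] for u in adj for v in adj[u]):
        return False                                   # not a proper coloring
    for c in classes:
        class_vertices = [v for v in color if color[v] == c]
        if not any(classes - {c} <= {color[w] for w in adj[v]} for v in class_vertices):
            return False                               # no b-vertex for class c
    return True

# A path on four vertices colored a-b-a-b: proper, and each class has a vertex
# adjacent to the other class, so this is a b-coloring with two colors.
adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
color = {1: "a", 2: "b", 3: "a", 4: "b"}
print(is_b_coloring(adj, color))   # True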

Cite as

Lars Jaffke, Paloma T. Lima, and Roohani Sharma. Structural Parameterizations of b-Coloring. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 40:1-40:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{jaffke_et_al:LIPIcs.ISAAC.2023.40,
  author =	{Jaffke, Lars and Lima, Paloma T. and Sharma, Roohani},
  title =	{{Structural Parameterizations of b-Coloring}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{40:1--40:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.40},
  URN =		{urn:nbn:de:0030-drops-193429},
  doi =		{10.4230/LIPIcs.ISAAC.2023.40},
  annote =	{Keywords: b-coloring, structural parameterization, XNLP, pathwidth, neighborhood diversity, twin cover}
}
Document
Clustering What Matters in Constrained Settings: Improved Outlier to Outlier-Free Reductions

Authors: Ragesh Jaiswal and Amit Kumar


Abstract
Constrained clustering problems generalize classical clustering formulations, e.g., k-median, k-means, by imposing additional constraints on the feasibility of a clustering. There has been significant recent progress in obtaining approximation algorithms for these problems, both in the metric and the Euclidean settings. However, the outlier version of these problems, where the solution is allowed to leave out m points from the clustering, is not well understood. In this work, we give a general framework for reducing the outlier version of a constrained k-median or k-means problem to the corresponding outlier-free version with only (1+ε)-loss in the approximation ratio. The reduction is obtained by mapping the original instance of the problem to f(k, m, ε) instances of the outlier-free version, where f(k, m, ε) = ((k+m)/ε)^O(m). As specific applications, we get the following results: - First FPT (in the parameters k and m) (1+ε)-approximation algorithm for the outlier version of capacitated k-median and k-means in Euclidean spaces with hard capacities. - First FPT (in the parameters k and m) (3+ε) and (9+ε) approximation algorithms for the outlier version of capacitated k-median and k-means, respectively, in general metric spaces with hard capacities. - First FPT (in the parameters k and m) (2-δ)-approximation algorithm for the outlier version of the k-median problem under the Ulam metric. Our work generalizes the results of Bhattacharya et al. and Agrawal et al. to a larger class of constrained clustering problems. Further, our reduction works for arbitrary metric spaces and so can extend clustering algorithms for outlier-free versions in both Euclidean and arbitrary metric spaces.

Cite as

Ragesh Jaiswal and Amit Kumar. Clustering What Matters in Constrained Settings: Improved Outlier to Outlier-Free Reductions. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 41:1-41:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{jaiswal_et_al:LIPIcs.ISAAC.2023.41,
  author =	{Jaiswal, Ragesh and Kumar, Amit},
  title =	{{Clustering What Matters in Constrained Settings: Improved Outlier to Outlier-Free Reductions}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{41:1--41:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.41},
  URN =		{urn:nbn:de:0030-drops-193433},
  doi =		{10.4230/LIPIcs.ISAAC.2023.41},
  annote =	{Keywords: clustering, constrained, outlier}
}
Document
Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes

Authors: Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk


Abstract
The celebrated notion of important separators bounds the number of small (S,T)-separators in a graph which are "farthest from S" in a technical sense. In this paper, we introduce a generalization of this powerful algorithmic primitive, tailored to undirected graphs, that is phrased in terms of k-secluded vertex sets: sets with an open neighborhood of size at most k. In this terminology, the bound on important separators says that there are at most 4^k maximal k-secluded connected vertex sets C containing S but disjoint from T. We generalize this statement significantly: even when we demand that G[C] avoids a finite set ℱ of forbidden induced subgraphs, the number of such maximal subgraphs is 2^𝒪(k) and they can be enumerated efficiently. This enumeration algorithm allows us to make significant improvements for two problems from the literature. Our first application concerns the Connected k-Secluded ℱ-free subgraph problem, where ℱ is a finite set of forbidden induced subgraphs. Given a graph in which each vertex has a positive integer weight, the problem asks to find a maximum-weight connected k-secluded vertex set C ⊆ V(G) such that G[C] does not contain an induced subgraph isomorphic to any F ∈ ℱ. The parameterization by k is known to be solvable in triple-exponential time via the technique of recursive understanding, which we improve to single-exponential. Our second application concerns the deletion problem to scattered graph classes. A scattered graph class is defined by demanding that every connected component is contained in at least one of the prescribed graph classes Π_1, …, Π_d. The deletion problem to a scattered graph class is to find a vertex set of size at most k whose removal yields a graph from the class. We obtain a single-exponential algorithm whenever each class Π_i is characterized by a finite number of forbidden induced subgraphs. This generalizes and improves upon earlier results in the literature.
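
To make the central notion concrete (an illustrative helper, not taken from the paper): a vertex set C is k-secluded when its open neighbourhood has size at most k. The names adj, C, and k are assumed inputs.

def is_k_secluded(adj, C, k):
    """Check that the open neighbourhood N(C), i.e. the union of N(v) over v in C
    minus C itself, has size at most k; adj maps each vertex to its neighbour set."""
    C = set(C)
    open_neighbourhood = set()
    for v in C:
        open_neighbourhood |= adj[v]
    open_neighbourhood -= C
    return len(open_neighbourhood) <= k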

Cite as

Bart M. P. Jansen, Jari J. H. de Kroon, and Michał Włodarczyk. Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 42:1-42:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{jansen_et_al:LIPIcs.ISAAC.2023.42,
  author =	{Jansen, Bart M. P. and de Kroon, Jari J. H. and W{\l}odarczyk, Micha{\l}},
  title =	{{Single-Exponential FPT Algorithms for Enumerating Secluded ℱ-Free Subgraphs and Deleting to Scattered Graph Classes}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{42:1--42:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.42},
  URN =		{urn:nbn:de:0030-drops-193440},
  doi =		{10.4230/LIPIcs.ISAAC.2023.42},
  annote =	{Keywords: fixed-parameter tractability, important separators, secluded subgraphs}
}
Document
Is the Algorithmic Kadison-Singer Problem Hard?

Authors: Ben Jourdan, Peter Macgregor, and He Sun


Abstract
We study the following KS₂(c) problem: let c ∈ ℝ^+ be some constant, and v₁,…, v_m ∈ ℝ^d be vectors such that ‖v_i‖² ≤ α for any i ∈ [m] and ∑_{i=1}^m ⟨v_i, x⟩² = 1 for any x ∈ ℝ^d with ‖x‖ = 1. The KS₂(c) problem asks to find some S ⊂ [m], such that it holds for all x ∈ ℝ^d with ‖x‖ = 1 that |∑_{i∈S} ⟨v_i, x⟩² - 1/2| ≤ c⋅√α, or report no if such S doesn't exist. Based on the work of Marcus et al. [Adam Marcus et al., 2013] and Weaver [Nicholas Weaver, 2004], the KS₂(c) problem can be seen as the algorithmic Kadison-Singer problem with parameter c ∈ ℝ^+. Our first result is a randomised algorithm with one-sided error for the KS₂(c) problem such that (1) our algorithm finds a valid set S ⊂ [m] with probability at least 1-2/d, if such S exists, or (2) reports no with probability 1, if no valid sets exist. The algorithm has running time O(binom(m,n)⋅poly(m, d)) for n = O(d/ε² log(d) log(1/(c√α))), where ε is a parameter which controls the error of the algorithm. This presents the first algorithm for the Kadison-Singer problem whose running time is quasi-polynomial in m in a certain regime, although having exponential dependency on d. Moreover, it shows that the algorithmic Kadison-Singer problem is easier to solve in low dimensions. Our second result is on the computational complexity of the KS₂(c) problem. We show that the KS₂(1/(4√2)) problem is FNP-hard for general values of d, and solving the KS₂(1/(4√2)) problem is as hard as solving the NAE-3SAT problem.
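
For intuition only (this is not the paper's algorithm), the KS₂(c) condition on a candidate subset S can be checked spectrally: since ∑_{i∈S} ⟨v_i, x⟩² = xᵀ(∑_{i∈S} v_i v_iᵀ)x, the requirement for all unit x is equivalent to all eigenvalues of ∑_{i∈S} v_i v_iᵀ lying within c⋅√α of 1/2. A minimal numpy sketch, with vectors, S, and c as assumed inputs and α taken as max_i ‖v_i‖²:

import numpy as np

def is_valid_ks2_subset(vectors, S, c):
    """Check the KS2(c) condition for a candidate index subset S.
    vectors: (m, d) array whose rows v_i satisfy sum_i v_i v_i^T = I.
    The condition |sum_{i in S} <v_i, x>^2 - 1/2| <= c*sqrt(alpha) for all unit x
    holds iff all eigenvalues of M_S = sum_{i in S} v_i v_i^T lie in
    [1/2 - c*sqrt(alpha), 1/2 + c*sqrt(alpha)], with alpha = max_i ||v_i||^2."""
    V = np.asarray(vectors, dtype=float)
    alpha = (V ** 2).sum(axis=1).max()
    rows = V[list(S)]
    M_S = rows.T @ rows                      # sum of outer products v_i v_i^T over S
    eigs = np.linalg.eigvalsh(M_S)
    return bool(np.all(np.abs(eigs - 0.5) <= c * np.sqrt(alpha)))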

Cite as

Ben Jourdan, Peter Macgregor, and He Sun. Is the Algorithmic Kadison-Singer Problem Hard?. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 43:1-43:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{jourdan_et_al:LIPIcs.ISAAC.2023.43,
  author =	{Jourdan, Ben and Macgregor, Peter and Sun, He},
  title =	{{Is the Algorithmic Kadison-Singer Problem Hard?}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{43:1--43:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.43},
  URN =		{urn:nbn:de:0030-drops-193457},
  doi =		{10.4230/LIPIcs.ISAAC.2023.43},
  annote =	{Keywords: Kadison-Singer problem, spectral sparsification}
}
Document
Succinct Planar Encoding with Minor Operations

Authors: Frank Kammer and Johannes Meintrup


Abstract
Let G be an unlabeled planar and simple n-vertex graph. Unlabeled graphs are graphs where the label information is either not given or is lost during the construction of data structures. We present a succinct encoding of G that provides induced-minor operations, i.e., edge contractions and vertex deletions. Any sequence of such operations is processed in O(n) time in the word-RAM model. At all times the encoding provides constant time (per element output) neighborhood access and degree queries. Optional hash tables extend the encoding with constant expected time adjacency queries and edge deletion (thus, all minor operations are supported) such that any number of edge deletions are computed in O(n) expected time. Constructing the encoding requires O(n) bits and O(n) time. The encoding requires ℋ(n) + o(n) bits of space, with ℋ(n) being the entropy of encoding a planar graph with n vertices. Our data structure is based on the recent result of Holm et al. [ESA 2017], who presented a linear-time contraction data structure that allows one to maintain parallel edges and works for labeled graphs, but uses Θ(n log n) bits of space. We combine the techniques used by Holm et al. with novel ideas and the succinct encoding of Blelloch and Farzan [CPM 2010] for arbitrary separable graphs. Our result partially answers the question raised by Blelloch and Farzan of whether their encoding can be modified to allow modifications of the graph.

Cite as

Frank Kammer and Johannes Meintrup. Succinct Planar Encoding with Minor Operations. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 44:1-44:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{kammer_et_al:LIPIcs.ISAAC.2023.44,
  author =	{Kammer, Frank and Meintrup, Johannes},
  title =	{{Succinct Planar Encoding with Minor Operations}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{44:1--44:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.44},
  URN =		{urn:nbn:de:0030-drops-193460},
  doi =		{10.4230/LIPIcs.ISAAC.2023.44},
  annote =	{Keywords: planar graph, r-division, separator, succinct encoding, graph minors}
}
Document
Improved Approximation Algorithm for Capacitated Facility Location with Uniform Facility Cost

Authors: Mong-Jen Kao


Abstract
We consider the hard-capacitated facility location problem with uniform facility cost (CFL-UFC). This problem arises as an indicator variation between the general CFL problem and the uncapacitated facility location (UFL) problem, and is related to the profound capacitated k-median problem (CKM). In this work, we present a rounding-based 4-approximation algorithm for this problem, built on a two-stage rounding scheme that incorporates a set of novel ideas as well as techniques developed in the past for both facility location and capacitated covering problems. Our result improves the decades-old LP-based ratio of 5 for this problem, due to Levi et al. from 2004. We believe that the techniques developed in this work are of independent interest and may further lead to insights and implications for related problems.

Cite as

Mong-Jen Kao. Improved Approximation Algorithm for Capacitated Facility Location with Uniform Facility Cost. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 45:1-45:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{kao:LIPIcs.ISAAC.2023.45,
  author =	{Kao, Mong-Jen},
  title =	{{Improved Approximation Algorithm for Capacitated Facility Location with Uniform Facility Cost}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{45:1--45:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.45},
  URN =		{urn:nbn:de:0030-drops-193474},
  doi =		{10.4230/LIPIcs.ISAAC.2023.45},
  annote =	{Keywords: Capacitated facility location, Hard capacities, Uniform facility cost}
}
Document
The st-Planar Edge Completion Problem Is Fixed-Parameter Tractable

Authors: Liana Khazaliya, Philipp Kindermann, Giuseppe Liotta, Fabrizio Montecchiani, and Kirill Simonov


Abstract
The problem of deciding whether a biconnected planar digraph G = (V,E) can be augmented to become an st-planar graph by adding a set of oriented edges E' ⊆ V × V is known to be NP-complete. We show that the problem is fixed-parameter tractable when parameterized by the size of the set E'.

Cite as

Liana Khazaliya, Philipp Kindermann, Giuseppe Liotta, Fabrizio Montecchiani, and Kirill Simonov. The st-Planar Edge Completion Problem Is Fixed-Parameter Tractable. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 46:1-46:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{khazaliya_et_al:LIPIcs.ISAAC.2023.46,
  author =	{Khazaliya, Liana and Kindermann, Philipp and Liotta, Giuseppe and Montecchiani, Fabrizio and Simonov, Kirill},
  title =	{{The st-Planar Edge Completion Problem Is Fixed-Parameter Tractable}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{46:1--46:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.46},
  URN =		{urn:nbn:de:0030-drops-193483},
  doi =		{10.4230/LIPIcs.ISAAC.2023.46},
  annote =	{Keywords: st-planar graphs, parameterized complexity, upward planarity}
}
Document
A Combinatorial Certifying Algorithm for Linear Programming Problems with Gainfree Leontief Substitution Systems

Authors: Kei Kimura and Kazuhisa Makino


Abstract
Linear programming (LP) problems with gainfree Leontief substitution systems have been intensively studied in economics and operations research, and include the feasibility problem of a class of Horn systems, which arises in, e.g., polyhedral combinatorics and logic. This subclass of LP problems admits a strongly polynomial-time algorithm, whereas devising such an algorithm for general LP problems is one of the major theoretical open questions in mathematical optimization and computer science. Recently, much attention has been paid to devising certifying algorithms in software engineering, since such algorithms enable one to confirm the correctness of the outputs of programs with simple computations. Devising a combinatorial certifying algorithm for the feasibility of this fundamental class of Horn systems has remained open for almost a decade. In this paper, we provide the first combinatorial (and strongly polynomial-time) certifying algorithm for LP problems with gainfree Leontief substitution systems. As a by-product, we resolve the open question on the feasibility of the class of Horn systems.

Cite as

Kei Kimura and Kazuhisa Makino. A Combinatorial Certifying Algorithm for Linear Programming Problems with Gainfree Leontief Substitution Systems. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 47:1-47:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{kimura_et_al:LIPIcs.ISAAC.2023.47,
  author =	{Kimura, Kei and Makino, Kazuhisa},
  title =	{{A Combinatorial Certifying Algorithm for Linear Programming Problems with Gainfree Leontief Substitution Systems}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{47:1--47:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.47},
  URN =		{urn:nbn:de:0030-drops-193492},
  doi =		{10.4230/LIPIcs.ISAAC.2023.47},
  annote =	{Keywords: linear programming problem, certifying algorithm, Horn system}
}
Document
Reconfiguration of the Union of Arborescences

Authors: Yusuke Kobayashi, Ryoga Mahara, and Tamás Schwarcz


Abstract
An arborescence in a digraph is an acyclic arc subset in which every vertex except a root has exactly one incoming arc. In this paper, we show the reconfigurability of the union of k arborescences for fixed k in the following sense: for any pair of arc subsets that can be partitioned into k arborescences, one can be transformed into the other by exchanging arcs one by one so that every intermediate arc subset can also be partitioned into k arborescences. This generalizes the result by Ito et al. (2023), who showed the case with k = 1. Since the union of k arborescences can be represented as a common matroid basis of two matroids, our result gives a new non-trivial example of matroid pairs for which two common bases are always reconfigurable to each other.
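
As an illustration of the definition (not taken from the paper), the following sketch checks whether an arc set forms an arborescence rooted at a given vertex; vertices, arcs, and root are assumed inputs.

def is_arborescence(vertices, arcs, root):
    """Check whether the arc set `arcs` (pairs (u, v), oriented u -> v) is an
    arborescence on `vertices` rooted at `root`: the root has no incoming arc,
    every other vertex has exactly one, and the arcs form no directed cycle."""
    indeg = {v: 0 for v in vertices}
    children = {v: [] for v in vertices}
    for u, v in arcs:
        indeg[v] += 1
        children[u].append(v)
    if indeg[root] != 0:
        return False
    if any(indeg[v] != 1 for v in vertices if v != root):
        return False
    # With these in-degrees, acyclicity is equivalent to every vertex being
    # reachable from the root along the arcs.
    seen, stack = {root}, [root]
    while stack:
        u = stack.pop()
        for w in children[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == len(vertices)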

Cite as

Yusuke Kobayashi, Ryoga Mahara, and Tamás Schwarcz. Reconfiguration of the Union of Arborescences. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 48:1-48:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{kobayashi_et_al:LIPIcs.ISAAC.2023.48,
  author =	{Kobayashi, Yusuke and Mahara, Ryoga and Schwarcz, Tam\'{a}s},
  title =	{{Reconfiguration of the Union of Arborescences}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{48:1--48:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.48},
  URN =		{urn:nbn:de:0030-drops-193502},
  doi =		{10.4230/LIPIcs.ISAAC.2023.48},
  annote =	{Keywords: Arborescence packing, common matroid basis, combinatorial reconfiguration}
}
Document
An Approximation Algorithm for Two-Edge-Connected Subgraph Problem via Triangle-Free Two-Edge-Cover

Authors: Yusuke Kobayashi and Takashi Noguchi


Abstract
The 2-Edge-Connected Spanning Subgraph problem (2-ECSS) is one of the most fundamental and well-studied problems in the context of network design. We are given an undirected graph G, and the objective is to find a 2-edge-connected spanning subgraph H of G with the minimum number of edges. For this problem, many approximation algorithms have been proposed in the literature. In particular, very recently, Garg, Grandoni, and Ameli gave an approximation algorithm for 2-ECSS with a factor of 1.326, which is the best known approximation ratio. In this paper, under the assumption that a maximum triangle-free 2-matching can be found in polynomial time in a graph, we give a (1.3+ε)-approximation algorithm for 2-ECSS, where ε is an arbitrarily small positive fixed constant. Note that a complicated polynomial-time algorithm for finding a maximum triangle-free 2-matching was announced by Hartvigsen in his PhD thesis, but it has not been peer-reviewed or checked in any other way. In our algorithm, we compute a minimum triangle-free 2-edge-cover in G with the aid of the algorithm for finding a maximum triangle-free 2-matching. Then, with the obtained triangle-free 2-edge-cover, we apply the arguments by Garg, Grandoni, and Ameli.
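
For the definition only (not the paper's algorithm): a spanning subgraph is 2-edge-connected exactly when it is connected and stays connected after deleting any single edge. A brute-force Python checker along these lines, with vertices and edges as assumed inputs:

def is_2_edge_connected_spanning(vertices, edges):
    """Check that the edge list `edges` (pairs (u, v)) spans `vertices` and
    remains connected after deleting any single edge, i.e. that it forms a
    2-edge-connected spanning subgraph."""
    vertices, edges = list(vertices), list(edges)

    def connected(edge_list):
        adj = {v: set() for v in vertices}
        for u, v in edge_list:
            adj[u].add(v)
            adj[v].add(u)
        seen, stack = {vertices[0]}, [vertices[0]]
        while stack:
            x = stack.pop()
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        return len(seen) == len(vertices)

    if not vertices:
        return True
    if not connected(edges):
        return False
    # No bridges: deleting any one edge must keep the subgraph connected.
    return all(connected(edges[:i] + edges[i + 1:]) for i in range(len(edges)))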

Cite as

Yusuke Kobayashi and Takashi Noguchi. An Approximation Algorithm for Two-Edge-Connected Subgraph Problem via Triangle-Free Two-Edge-Cover. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 49:1-49:10, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{kobayashi_et_al:LIPIcs.ISAAC.2023.49,
  author =	{Kobayashi, Yusuke and Noguchi, Takashi},
  title =	{{An Approximation Algorithm for Two-Edge-Connected Subgraph Problem via Triangle-Free Two-Edge-Cover}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{49:1--49:10},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.49},
  URN =		{urn:nbn:de:0030-drops-193514},
  doi =		{10.4230/LIPIcs.ISAAC.2023.49},
  annote =	{Keywords: approximation algorithm, survivable network design, minimum 2-edge-connected spanning subgraph, triangle-free 2-matching}
}
Document
On Min-Max Graph Balancing with Strict Negative Correlation Constraints

Authors: Ting-Yu Kuo, Yu-Han Chen, Andrea Frosini, Sun-Yuan Hsieh, Shi-Chun Tsai, and Mong-Jen Kao


Abstract
We consider the min-max graph balancing problem with strict negative correlation (SNC) constraints. The graph balancing problem arises as an equivalent formulation of the classic unrelated machine scheduling problem, where we are given a hypergraph G = (V,E) with a vertex-dependent edge weight function p: E×V ↦ ℤ^{≥0} that represents the processing time of the edges (jobs). The SNC constraints, which are given as edge subsets C_1,C_2,…,C_k, require that the edges in the same subset cannot be assigned to the same vertex at the same time. Under these constraints, the goal is to compute an edge orientation (assignment) that minimizes the maximum workload of the vertices. In this paper, we conduct a general study on the approximability of this problem. First, we show that, in the presence of SNC constraints, the case with max_{e ∈ E} |e| = max_i |C_i| = 2 is the only case for which approximate solutions can be obtained. Further generalization in either direction, e.g., in max_{e ∈ E} |e| or max_i |C_i|, directly makes even computing a feasible solution NP-complete. Then, we present a 2-approximation algorithm for the case with max_{e ∈ E} |e| = max_i |C_i| = 2, based on a set of structural simplifications and a tailored assignment LP for this problem. We note that our approach is general and can be applied to similar settings, e.g., scheduling with SNC constraints to minimize the weighted completion time, to obtain similar approximation guarantees. Further cases are discussed to describe the landscape of the approximability of this problem. For the case with |V| ≤ 2, which is already known to be NP-hard, we present a fully polynomial-time approximation scheme (FPTAS). On the other hand, we show that the problem is at least as hard to approximate as vertex cover when |V| ≥ 3.
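
To illustrate the problem in its simplest setting max_{e ∈ E} |e| = max_i |C_i| = 2 (an exhaustive-search sketch for intuition only, unrelated to the 2-approximation in the paper), the code below tries every orientation of the edges, discards those violating an SNC pair, and returns the smallest maximum load. The names edges, p, and snc_pairs are assumed inputs.

from itertools import product

def min_max_balancing(edges, p, snc_pairs):
    """Exhaustive search for min-max graph balancing when every edge has two
    endpoints and every SNC constraint is a pair of edge indices: each edge
    e = (u, v) is assigned to one endpoint, the two edges of an SNC pair may
    not go to the same vertex, and the objective is the maximum vertex load
    under processing times p[(edge_index, vertex)]."""
    best = None
    for choice in product(range(2), repeat=len(edges)):
        assignment = [edges[i][choice[i]] for i in range(len(edges))]
        if any(assignment[i] == assignment[j] for i, j in snc_pairs):
            continue                          # violates a strict negative correlation pair
        load = {}
        for i, v in enumerate(assignment):
            load[v] = load.get(v, 0) + p[(i, v)]
        value = max(load.values())
        if best is None or value < best:
            best = value
    return best                               # None if no feasible assignment exists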

Cite as

Ting-Yu Kuo, Yu-Han Chen, Andrea Frosini, Sun-Yuan Hsieh, Shi-Chun Tsai, and Mong-Jen Kao. On Min-Max Graph Balancing with Strict Negative Correlation Constraints. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 50:1-50:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{kuo_et_al:LIPIcs.ISAAC.2023.50,
  author =	{Kuo, Ting-Yu and Chen, Yu-Han and Frosini, Andrea and Hsieh, Sun-Yuan and Tsai, Shi-Chun and Kao, Mong-Jen},
  title =	{{On Min-Max Graph Balancing with Strict Negative Correlation Constraints}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{50:1--50:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.50},
  URN =		{urn:nbn:de:0030-drops-193524},
  doi =		{10.4230/LIPIcs.ISAAC.2023.50},
  annote =	{Keywords: Unrelated Scheduling, Graph Balancing, Strict Correlation Constraints}
}
Document
On the Line-Separable Unit-Disk Coverage and Related Problems

Authors: Gang Liu and Haitao Wang


Abstract
Given a set P of n points and a set S of m disks in the plane, the disk coverage problem asks for a smallest subset of disks that together cover all points of P. The problem is NP-hard. In this paper, we consider a line-separable unit-disk version of the problem where all disks have the same radius and their centers are separated from the points of P by a line 𝓁. We present an m^{2/3} n^{2/3} 2^O(log^*(m+n)) + O((n+m)log(n+m)) time algorithm for the problem. This improves the previously best result of O(nm + n log n) time. Our techniques also solve the line-constrained version of the problem, where centers of all disks of S are located on a line 𝓁 while points of P can be anywhere in the plane. Our algorithm runs in O(m√n + (n+m)log(n+m)) time, which improves the previously best result of O(nm log(m+n)) time. In addition, our results lead to an algorithm of n^{10/3} 2^O(log^*n) time for a half-plane coverage problem (given n half-planes and n points, find a smallest subset of half-planes covering all points); this improves the previously best algorithm of O(n⁴log n) time. Further, if all half-planes are lower ones, our algorithm runs in n^{4/3} 2^O(log^*n) time while the previously best algorithm takes O(n²log n) time.

Cite as

Gang Liu and Haitao Wang. On the Line-Separable Unit-Disk Coverage and Related Problems. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 51:1-51:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{liu_et_al:LIPIcs.ISAAC.2023.51,
  author =	{Liu, Gang and Wang, Haitao},
  title =	{{On the Line-Separable Unit-Disk Coverage and Related Problems}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{51:1--51:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.51},
  URN =		{urn:nbn:de:0030-drops-193535},
  doi =		{10.4230/LIPIcs.ISAAC.2023.51},
  annote =	{Keywords: disk coverage, line-separable, unit-disk, line-constrained, half-planes}
}
Document
Improved Smoothed Analysis of 2-Opt for the Euclidean TSP

Authors: Bodo Manthey and Jesse van Rhijn


Abstract
The 2-opt heuristic is a simple local search heuristic for the Travelling Salesperson Problem (TSP). Although it usually performs well in practice, its worst-case running time is poor. Attempts to reconcile this difference have used smoothed analysis, in which adversarial instances are perturbed probabilistically. We are interested in the classical model of smoothed analysis for the Euclidean TSP, in which the perturbations are Gaussian. This model was previously used by Manthey & Veenstra, who obtained smoothed complexity bounds polynomial in n, the dimension d, and the perturbation strength σ^{-1}. However, their analysis only works for d ≥ 4. The only previous analysis for d ≤ 3 was performed by Englert, Röglin & Vöcking, who used a different perturbation model which can be translated to Gaussian perturbations. Their model yields bounds polynomial in n and σ^{-d}, and super-exponential in d. Since it is somewhat unsatisfactory that no direct analysis with Gaussian perturbations yields polynomial bounds for all d, we perform this missing analysis. Along the way, we improve all existing smoothed complexity bounds for Euclidean 2-opt with Gaussian perturbations.
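
For readers unfamiliar with the heuristic, a plain 2-opt local search for the Euclidean TSP looks roughly as follows (an illustrative sketch, not the model or implementation analyzed in the paper); points and tour are assumed inputs.

import math

def two_opt(points, tour):
    """Plain 2-opt local search: repeatedly reverse a tour segment whenever doing
    so shortens the tour, until no improving move remains. `points` is a list of
    (x, y) pairs and `tour` a permutation of their indices, modified in place."""
    def d(a, b):
        return math.dist(points[a], points[b])

    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                if i == 0 and j == n - 1:
                    continue                  # the two removed edges would share a vertex
                u, v = tour[i], tour[i + 1]
                x, y = tour[j], tour[(j + 1) % n]
                # Replace edges (u, v) and (x, y) by (u, x) and (v, y) if shorter.
                if d(u, v) + d(x, y) > d(u, x) + d(v, y) + 1e-12:
                    tour[i + 1:j + 1] = tour[i + 1:j + 1][::-1]
                    improved = True
    return tour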

Cite as

Bodo Manthey and Jesse van Rhijn. Improved Smoothed Analysis of 2-Opt for the Euclidean TSP. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 52:1-52:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{manthey_et_al:LIPIcs.ISAAC.2023.52,
  author =	{Manthey, Bodo and van Rhijn, Jesse},
  title =	{{Improved Smoothed Analysis of 2-Opt for the Euclidean TSP}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{52:1--52:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.52},
  URN =		{urn:nbn:de:0030-drops-193549},
  doi =		{10.4230/LIPIcs.ISAAC.2023.52},
  annote =	{Keywords: Travelling salesman problem, smoothed analysis, probabilistic analysis, local search, heuristics, 2-opt}
}
Document
On the Complexity of the Eigenvalue Deletion Problem

Authors: Neeldhara Misra, Harshil Mittal, Saket Saurabh, and Dhara Thakkar


Abstract
For any fixed positive integer r and a given budget k, the r-Eigenvalue Vertex Deletion (r-EVD) problem asks if a graph G admits a subset S of at most k vertices such that the adjacency matrix of G⧵S has at most r distinct eigenvalues. The edge deletion, edge addition, and edge editing variants are defined analogously. For r = 1, r-EVD is equivalent to the Vertex Cover problem. For r = 2, it turns out that r-EVD amounts to removing a subset S of at most k vertices so that G⧵ S is a cluster graph where all connected components have the same size. We show that r-EVD is NP-complete even on bipartite graphs with maximum degree four for every fixed r > 2, and FPT when parameterized by the solution size and the maximum degree of the graph. We also establish several results for the special case when r = 2. For the vertex deletion variant, we show that 2-EVD is NP-complete even on triangle-free and 3d-regular graphs for any d ≥ 2, and also NP-complete on d-regular graphs for any d ≥ 8. The edge deletion, addition, and editing variants are all NP-complete for r = 2. The edge deletion problem admits a polynomial time algorithm if the input is a cluster graph, while - in contrast - the edge addition variant is hard even when the input is a cluster graph. We show that the edge addition variant has a quadratic kernel. The edge deletion and vertex deletion variants admit a single-exponential FPT algorithm when parameterized by the solution size alone. Our main contribution is to develop the complexity landscape for the problem of modifying a graph with the aim of reducing the number of distinct eigenvalues in the spectrum of its adjacency matrix. It turns out that this captures, apart from Vertex Cover, also a natural variation of the problem of modifying to a cluster graph as a special case, which we believe may be of independent interest.
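
As a small illustration of the problem statement (a brute-force sketch, not an algorithm from the paper), an r-EVD instance on a tiny graph can be decided by trying all vertex subsets of size at most k; A, r, and k are assumed inputs.

import numpy as np
from itertools import combinations

def has_r_evd_solution(A, r, k, tol=1e-8):
    """Brute-force r-Eigenvalue Vertex Deletion on a small graph: is there a set S
    of at most k vertices whose deletion leaves an adjacency matrix with at most r
    distinct eigenvalues (up to the numerical tolerance `tol`)?
    A is the symmetric 0/1 adjacency matrix as a numpy array."""
    n = A.shape[0]
    for size in range(k + 1):
        for S in combinations(range(n), size):
            keep = [v for v in range(n) if v not in S]
            if not keep:
                return True                   # the empty graph trivially qualifies
            eigs = np.sort(np.linalg.eigvalsh(A[np.ix_(keep, keep)]))
            # Count eigenvalues differing from their predecessor by more than tol.
            distinct = 1 + int(np.sum(np.diff(eigs) > tol))
            if distinct <= r:
                return True
    return False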

Cite as

Neeldhara Misra, Harshil Mittal, Saket Saurabh, and Dhara Thakkar. On the Complexity of the Eigenvalue Deletion Problem. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 53:1-53:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{misra_et_al:LIPIcs.ISAAC.2023.53,
  author =	{Misra, Neeldhara and Mittal, Harshil and Saurabh, Saket and Thakkar, Dhara},
  title =	{{On the Complexity of the Eigenvalue Deletion Problem}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{53:1--53:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.53},
  URN =		{urn:nbn:de:0030-drops-193555},
  doi =		{10.4230/LIPIcs.ISAAC.2023.53},
  annote =	{Keywords: Graph Modification, Rank Reduction, Eigenvalues}
}
Document
Connected Vertex Cover on AT-Free Graphs

Authors: Joydeep Mukherjee and Tamojit Saha


Abstract
An asteroidal triple (AT) in a graph is an independent set of three vertices such that every pair of them is joined by a path avoiding the neighbourhood of the third. A graph is called AT-free if it does not contain any asteroidal triple. A connected vertex cover of a graph is a subset of its vertices which contains at least one endpoint of each edge and induces a connected subgraph. Settling the complexity of computing a minimum connected vertex cover in an AT-free graph was mentioned as an open problem in Escoffier et al. [Escoffier et al., 2010]. In this paper, we answer this question by presenting an exact polynomial-time algorithm for computing a minimum connected vertex cover on AT-free graphs.
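
To make the definition concrete (an illustrative brute-force check, not the paper's polynomial-time algorithm), the sketch below tests AT-freeness directly: for every independent triple it looks for a path between each pair that avoids the neighbourhood of the third vertex; adj is an assumed adjacency-set input.

from itertools import combinations

def is_at_free(adj):
    """Check AT-freeness of a graph given as adjacency sets adj[v]. A triple of
    pairwise non-adjacent vertices is asteroidal if every pair of them is joined
    by a path avoiding the neighbourhood of the third vertex."""
    def connected_avoiding(u, v, forbidden):
        seen, stack = {u}, [u]
        while stack:
            x = stack.pop()
            if x == v:
                return True
            for y in adj[x]:
                if y not in seen and y not in forbidden:
                    seen.add(y)
                    stack.append(y)
        return False

    for a, b, c in combinations(adj, 3):
        if b in adj[a] or c in adj[a] or c in adj[b]:
            continue                          # not an independent triple
        if (connected_avoiding(a, b, adj[c]) and
                connected_avoiding(a, c, adj[b]) and
                connected_avoiding(b, c, adj[a])):
            return False                      # found an asteroidal triple
    return True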

Cite as

Joydeep Mukherjee and Tamojit Saha. Connected Vertex Cover on AT-Free Graphs. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 54:1-54:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{mukherjee_et_al:LIPIcs.ISAAC.2023.54,
  author =	{Mukherjee, Joydeep and Saha, Tamojit},
  title =	{{Connected Vertex Cover on AT-Free Graphs}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{54:1--54:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.54},
  URN =		{urn:nbn:de:0030-drops-193566},
  doi =		{10.4230/LIPIcs.ISAAC.2023.54},
  annote =	{Keywords: Graph Algorithm, AT-free graphs, Connected Vertex Cover, Optimization}
}
Document
On the Fine-Grained Query Complexity of Symmetric Functions

Authors: Supartha Podder, Penghui Yao, and Zekun Ye


Abstract
Watrous conjectured that the randomized and quantum query complexities of symmetric functions are polynomially equivalent, which was resolved by Ambainis and Aaronson [Scott Aaronson and Andris Ambainis, 2014], and was later improved in [André Chailloux, 2019; Shalev Ben-David et al., 2020]. This paper explores a fine-grained version of the Watrous conjecture, including the randomized and quantum algorithms with success probabilities arbitrarily close to 1/2. Our contributions include the following: 1) An analysis of the optimal success probability of quantum and randomized query algorithms of two fundamental partial symmetric Boolean functions given a fixed number of queries. We prove that for any quantum algorithm computing these two functions using T queries, there exist randomized algorithms using poly(T) queries that achieve the same success probability as the quantum algorithm, even if the success probability is arbitrarily close to 1/2. These two classes of functions are instrumental in analyzing general symmetric functions. 2) We establish that for any total symmetric Boolean function f, if a quantum algorithm uses T queries to compute f with success probability 1/2+β, then there exists a randomized algorithm using O(T²) queries to compute f with success probability 1/2 + Ω(δβ²) on a 1-δ fraction of inputs, where β,δ can be arbitrarily small positive values. As a corollary, we prove a randomized version of Aaronson-Ambainis Conjecture [Scott Aaronson and Andris Ambainis, 2014] for total symmetric Boolean functions in the regime where the success probability of algorithms can be arbitrarily close to 1/2. 3) We present polynomial equivalences for several fundamental complexity measures of partial symmetric Boolean functions. Specifically, we first prove that for certain partial symmetric Boolean functions, quantum query complexity is at most quadratic in approximate degree for any error arbitrarily close to 1/2. Next, we show exact quantum query complexity is at most quadratic in degree. Additionally, we give the tight bounds of several complexity measures, indicating their polynomial equivalence. Conversely, we exhibit an exponential separation between randomized and exact quantum query complexity for certain partial symmetric Boolean functions.

Cite as

Supartha Podder, Penghui Yao, and Zekun Ye. On the Fine-Grained Query Complexity of Symmetric Functions. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 55:1-55:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{podder_et_al:LIPIcs.ISAAC.2023.55,
  author =	{Podder, Supartha and Yao, Penghui and Ye, Zekun},
  title =	{{On the Fine-Grained Query Complexity of Symmetric Functions}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{55:1--55:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.55},
  URN =		{urn:nbn:de:0030-drops-193570},
  doi =		{10.4230/LIPIcs.ISAAC.2023.55},
  annote =	{Keywords: Query complexity, Symmetric functions, Quantum advantages}
}
Document
Testing Properties of Distributions in the Streaming Model

Authors: Sampriti Roy and Yadu Vasudev


Abstract
We study distribution testing in the standard access model and the conditional access model when the memory available to the testing algorithm is bounded. In both scenarios, we assume that the samples arrive in an online fashion. The goal is to test properties of the distribution using an optimal number of samples, subject to a memory constraint on how many samples can be stored at a given time. First, we provide a trade-off between the sample complexity and the space complexity for testing identity when the samples are drawn according to the conditional access oracle. We then show that, under an almost optimal constraint on the number of samples that can be stored, we can efficiently learn a succinct representation of a monotone distribution. We also show that the algorithm for monotone distributions can be extended to a larger class of decomposable distributions.

Cite as

Sampriti Roy and Yadu Vasudev. Testing Properties of Distributions in the Streaming Model. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 56:1-56:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{roy_et_al:LIPIcs.ISAAC.2023.56,
  author =	{Roy, Sampriti and Vasudev, Yadu},
  title =	{{Testing Properties of Distributions in the Streaming Model}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{56:1--56:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.56},
  URN =		{urn:nbn:de:0030-drops-193588},
  doi =		{10.4230/LIPIcs.ISAAC.2023.56},
  annote =	{Keywords: Property testing, distribution testing, streaming}
}
Document
A Strongly Polynomial-Time Algorithm for Weighted General Factors with Three Feasible Degrees

Authors: Shuai Shao and Stanislav Živný


Abstract
General factors are a generalization of matchings. Given a graph G with a set π(v) of feasible degrees, called a degree constraint, for each vertex v of G, the general factor problem is to find a (spanning) subgraph F of G such that deg_F(v) ∈ π(v) for every v of G. When all degree constraints are symmetric Δ-matroids, the problem is solvable in polynomial time. The weighted general factor problem is to find a general factor of the maximum total weight in an edge-weighted graph. Strongly polynomial-time algorithms are only known for weighted general factor problems that are reducible to the weighted matching problem by gadget constructions. In this paper, we present a strongly polynomial-time algorithm for a type of weighted general factor problems with real-valued edge weights that is provably not reducible to the weighted matching problem by gadget constructions. As an application, we obtain a strongly polynomial-time algorithm for the terminal backup problem by reducing it to the weighted general factor problem.
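
For the definition only (not part of the paper), checking whether an edge subset F is a general factor amounts to verifying the degree constraints; pi (vertex to feasible degree set) and F are assumed inputs.

def is_general_factor(pi, F):
    """Check the general factor condition: for the edge subset F (pairs (u, v))
    of a simple graph, every vertex v has deg_F(v) in its feasible set pi[v]."""
    deg = {v: 0 for v in pi}
    for u, v in F:
        deg[u] += 1
        deg[v] += 1
    return all(deg[v] in pi[v] for v in pi)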

Cite as

Shuai Shao and Stanislav Živný. A Strongly Polynomial-Time Algorithm for Weighted General Factors with Three Feasible Degrees. In 34th International Symposium on Algorithms and Computation (ISAAC 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 283, pp. 57:1-57:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023)


Copy BibTex To Clipboard

@InProceedings{shao_et_al:LIPIcs.ISAAC.2023.57,
  author =	{Shao, Shuai and \v{Z}ivn\'{y}, Stanislav},
  title =	{{A Strongly Polynomial-Time Algorithm for Weighted General Factors with Three Feasible Degrees}},
  booktitle =	{34th International Symposium on Algorithms and Computation (ISAAC 2023)},
  pages =	{57:1--57:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-289-1},
  ISSN =	{1868-8969},
  year =	{2023},
  volume =	{283},
  editor =	{Iwata, Satoru and Kakimura, Naonori},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2023.57},
  URN =		{urn:nbn:de:0030-drops-193597},
  doi =		{10.4230/LIPIcs.ISAAC.2023.57},
  annote =	{Keywords: matchings, factors, edge constraint satisfaction problems, terminal backup problem, delta matroids}
}
