LIPIcs, Volume 212

32nd International Symposium on Algorithms and Computation (ISAAC 2021)



Event

ISAAC 2021, December 6-8, 2021, Fukuoka, Japan

Editors

Hee-Kap Ahn
  • Pohang University of Science and Technology, Korea
Kunihiko Sadakane
  • The University of Tokyo, Japan

Publication Details

  • Published at: 2021-11-30
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-214-3
  • DBLP: db/conf/isaac/isaac2021

Documents

Document
Complete Volume
LIPIcs, Volume 212, ISAAC 2021, Complete Volume

Authors: Hee-Kap Ahn and Kunihiko Sadakane


Abstract
LIPIcs, Volume 212, ISAAC 2021, Complete Volume

Cite as

32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 1-1188, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@Proceedings{ahn_et_al:LIPIcs.ISAAC.2021,
  title =	{{LIPIcs, Volume 212, ISAAC 2021, Complete Volume}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{1--1188},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021},
  URN =		{urn:nbn:de:0030-drops-154329},
  doi =		{10.4230/LIPIcs.ISAAC.2021},
  annote =	{Keywords: LIPIcs, Volume 212, ISAAC 2021, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Hee-Kap Ahn and Kunihiko Sadakane


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

Hee-Kap Ahn and Kunihiko Sadakane. Front Matter, Table of Contents, Preface, Conference Organization. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 0:i-0:xviii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{ahn_et_al:LIPIcs.ISAAC.2021.0,
  author =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{0:i--0:xviii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.0},
  URN =		{urn:nbn:de:0030-drops-154330},
  doi =		{10.4230/LIPIcs.ISAAC.2021.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
Streaming Pattern Matching (Invited Talk)

Authors: Tatiana Starikovskaya


Abstract
Many classical algorithms for string processing assume that the input can be accessed in full via constant-time random access, which poses a serious limitation in the modern era of data deluge. In this talk, we will focus on the streaming model of computation, which allows us to overcome this issue. In this model of computation, we assume that the input arrives as a stream, one character at a time, which captures situations where the data are sequential measurements or the output of an algorithm. The space complexity is defined as all the space used, including the space used to store any information about the input, which allows us to develop ultra-efficient algorithms. The first streaming algorithm for pattern matching was presented in the seminal paper of Porat and Porat in FOCS 2009. For a pattern of length m, the algorithm uses only O(log m) space, while any classical algorithm requires Ω(m) space. This result served as a foundation of the area of streaming algorithms for pattern matching. After a brief survey of the area, we will discuss two questions in more detail: the k-mismatch problem and the pattern matching with k-edits problem. In the k-mismatch problem, one is given a pattern and a text, and the task is to find all substrings of the text that have at most k mismatches with the pattern. The current best algorithm for this problem was given by Clifford, Kociumaka, and Porat in SODA 2019; for a pattern of length m it uses O(k log m) space and Õ(√k) time per character of the text. In the pattern matching with k-edits problem, the task is similar, but one must find substrings that can be transformed into the pattern by at most k edits, i.e., substitutions, insertions, and deletions of a character. For this problem, the first streaming algorithm was presented by Kociumaka, Porat, and Starikovskaya in FOCS 2021. The algorithm takes Õ(poly(k)) space and Õ(poly(k)) time per character of the text.
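
As an editorial aside (not part of the talk), the following Python sketch makes the k-mismatch problem statement concrete using the naive baseline that buffers the last m characters of the stream, i.e. exactly the Θ(m) space that the streaming algorithms above avoid; the function name and example strings are hypothetical.

from collections import deque

def k_mismatch_stream(pattern, text_stream, k):
    # Naive illustration of the k-mismatch problem in a streaming setting:
    # report every position where the last m = |pattern| stream characters
    # differ from the pattern in at most k places.  This baseline stores the
    # whole window (Theta(m) space); the algorithms surveyed in the talk
    # answer the same question in O(k log m) space.
    m = len(pattern)
    window = deque(maxlen=m)            # last m characters seen so far
    for i, ch in enumerate(text_stream):
        window.append(ch)
        if len(window) == m:
            mismatches = sum(a != b for a, b in zip(window, pattern))
            if mismatches <= k:
                yield i                 # occurrence ending at position i

# Occurrences of "abba" with at most one mismatch end at positions 5 and 10.
print(list(k_mismatch_stream("abba", "ababbaxabca", 1)))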

Cite as

Tatiana Starikovskaya. Streaming Pattern Matching (Invited Talk). In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, p. 1:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{starikovskaya:LIPIcs.ISAAC.2021.1,
  author =	{Starikovskaya, Tatiana},
  title =	{{Streaming Pattern Matching}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{1:1--1:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.1},
  URN =		{urn:nbn:de:0030-drops-154345},
  doi =		{10.4230/LIPIcs.ISAAC.2021.1},
  annote =	{Keywords: Streaming algorithms, Pattern matching, Hamming distance, Edit distance}
}
Document
Invited Talk
Spanning Properties of Variants of the Delaunay Graph (Invited Talk)

Authors: Prosenjit Bose


Abstract
A weighted geometric graph G is a graph whose n vertices are points in the plane and whose m edges are line segments weighted by the Euclidean distance between their endpoints. A t-spanner of G is a connected spanning subgraph G' with the property that for every pair of vertices x, y, the weight of the shortest path from x to y in G' is at most t ≥ 1 times the weight of the shortest path from x to y in G. The parameter t is commonly referred to as the spanning ratio or the stretch factor. Typically, G is a graph with Ω(n²) edges. As such, the goal in this area is to construct a subgraph G' that possesses several desirable properties, such as O(n) edges and a spanning ratio close to 1. In addition, when planarity is one of the desired properties, variants of Delaunay graphs play a vital role in the construction of planar geometric spanners. In this talk, we will provide a comprehensive overview of various results concerning the spanning ratio, among several other properties, of different types of Delaunay graphs and their subgraphs.
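
As an editorial illustration of the spanning-ratio definition (not taken from the talk), the brute-force sketch below computes the stretch factor of a subgraph of the complete Euclidean graph; the function name and the unit-square example are assumptions for illustration.

import math
from itertools import combinations

def spanning_ratio(points, edges):
    # Brute-force evaluation of the definition above: the maximum, over all
    # vertex pairs, of the shortest-path distance in the subgraph G' (given
    # by `edges` as index pairs) divided by the Euclidean distance, which is
    # the shortest path in the complete geometric graph G.  Illustrative
    # only, not an efficient algorithm.
    n = len(points)
    dist = lambda i, j: math.dist(points[i], points[j])
    d = [[math.inf] * n for _ in range(n)]          # Floyd-Warshall on G'
    for i in range(n):
        d[i][i] = 0.0
    for i, j in edges:
        d[i][j] = d[j][i] = dist(i, j)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                d[i][j] = min(d[i][j], d[i][k] + d[k][j])
    return max(d[i][j] / dist(i, j) for i, j in combinations(range(n), 2))

# A 4-cycle on the unit square: the worst pair is a diagonal, detour 2
# versus direct distance sqrt(2), so the spanning ratio is sqrt(2).
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(spanning_ratio(square, [(0, 1), (1, 2), (2, 3), (3, 0)]))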

Cite as

Prosenjit Bose. Spanning Properties of Variants of the Delaunay Graph (Invited Talk). In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, p. 2:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{bose:LIPIcs.ISAAC.2021.2,
  author =	{Bose, Prosenjit},
  title =	{{Spanning Properties of Variants of the Delaunay Graph}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{2:1--2:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.2},
  URN =		{urn:nbn:de:0030-drops-154352},
  doi =		{10.4230/LIPIcs.ISAAC.2021.2},
  annote =	{Keywords: Delaunay Graph, Geometric Graph, Graph Spanner}
}
Document
Subquadratic Algorithms for Some 3Sum-Hard Geometric Problems in the Algebraic Decision Tree Model

Authors: Boris Aronov, Mark de Berg, Jean Cardinal, Esther Ezra, John Iacono, and Micha Sharir


Abstract
We present subquadratic algorithms in the algebraic decision-tree model for several 3Sum-hard geometric problems, all of which can be reduced to the following question: Given two sets A, B, each consisting of n pairwise disjoint segments in the plane, and a set C of n triangles in the plane, we want to count, for each triangle Δ ∈ C, the number of intersection points between the segments of A and those of B that lie in Δ. The problems considered in this paper have been studied by Chan (2020), who gave algorithms that solve them, in the standard real-RAM model, in O((n²/log²n) log^O(1) log n) time. We present solutions in the algebraic decision-tree model whose cost is O(n^{60/31+ε}), for any ε > 0. Our approach is based on a primal-dual range searching mechanism, which exploits the multi-level polynomial partitioning machinery recently developed by Agarwal, Aronov, Ezra, and Zahl (2020). A key step in the procedure is a variant of point location in arrangements, say of lines in the plane, which is based solely on the order type of the lines, a "handicap" that turns out to be beneficial for speeding up our algorithm.

Cite as

Boris Aronov, Mark de Berg, Jean Cardinal, Esther Ezra, John Iacono, and Micha Sharir. Subquadratic Algorithms for Some 3Sum-Hard Geometric Problems in the Algebraic Decision Tree Model. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 3:1-3:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{aronov_et_al:LIPIcs.ISAAC.2021.3,
  author =	{Aronov, Boris and de Berg, Mark and Cardinal, Jean and Ezra, Esther and Iacono, John and Sharir, Micha},
  title =	{{Subquadratic Algorithms for Some 3Sum-Hard Geometric Problems in the Algebraic Decision Tree Model}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{3:1--3:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.3},
  URN =		{urn:nbn:de:0030-drops-154363},
  doi =		{10.4230/LIPIcs.ISAAC.2021.3},
  annote =	{Keywords: Computational geometry, Algebraic decision-tree model, Polynomial partitioning, Primal-dual range searching, Order types, Point location, Hierarchical partitions}
}
Document
Approximate Maximum Halfspace Discrepancy

Authors: Michael Matheny and Jeff M. Phillips


Abstract
Consider the geometric range space (X, H_d) where X ⊂ ℝ^d and H_d is the set of ranges defined by d-dimensional halfspaces. In this setting we consider that X is the disjoint union of a red and a blue set. For each halfspace h ∈ H_d define a function Φ(h) that measures the "difference" between the fraction of red and the fraction of blue points which fall in the range h. In this context the maximum discrepancy problem is to find h^* = arg max_{h ∈ (X, H_d)} Φ(h). We aim to instead find an ĥ such that Φ(h^*) - Φ(ĥ) ≤ ε. This is the central problem in linear classification for machine learning, in spatial scan statistics for spatial anomaly detection, and shows up in many other areas. We provide a solution for this problem in O(|X| + (1/ε^d) log⁴ (1/ε)) time, for constant d, which improves polynomially over the previous best solutions. For d = 2 we show that this is nearly tight through conditional lower bounds. For different classes of Φ we can either provide an Ω(|X|^{3/2 - o(1)}) time lower bound for the exact solution with a reduction to APSP, or an Ω(|X| + 1/ε^{2-o(1)}) lower bound for the approximate solution with a reduction to 3Sum. A key technical result is an ε-approximate halfspace range counting data structure of size O(1/ε^d) with O(log (1/ε)) query time, which we can build in O(|X| + (1/ε^d) log⁴ (1/ε)) time.
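
To make the problem concrete, here is an editorial brute-force sketch that takes Φ(h) = |red fraction in h - blue fraction in h| as one plausible instance of the "difference" function and enumerates halfspaces bounded by lines through two input points; this is a cubic-time baseline, far from the bounds in the paper, and all names in it are hypothetical.

from itertools import permutations

def max_halfspace_discrepancy(red, blue):
    # Enumerate closed halfspaces in the plane bounded by a line through two
    # input points and return the largest difference between the red and
    # blue fractions they contain.
    pts = [(p, 'r') for p in red] + [(p, 'b') for p in blue]
    best = 0.0
    for (p, _), (q, _) in permutations(pts, 2):
        def inside(x):
            # points on or to the left of the directed line p -> q
            return ((q[0] - p[0]) * (x[1] - p[1])
                    - (q[1] - p[1]) * (x[0] - p[0])) >= 0
        r_in = sum(inside(x) for x, c in pts if c == 'r') / len(red)
        b_in = sum(inside(x) for x, c in pts if c == 'b') / len(blue)
        best = max(best, abs(r_in - b_in))
    return best

red = [(0, 0), (1, 0), (0, 1)]
blue = [(3, 3), (4, 3), (3, 4)]
print(max_halfspace_discrepancy(red, blue))   # separable sets -> 1.0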

Cite as

Michael Matheny and Jeff M. Phillips. Approximate Maximum Halfspace Discrepancy. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 4:1-4:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{matheny_et_al:LIPIcs.ISAAC.2021.4,
  author =	{Matheny, Michael and Phillips, Jeff M.},
  title =	{{Approximate Maximum Halfspace Discrepancy}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{4:1--4:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.4},
  URN =		{urn:nbn:de:0030-drops-154377},
  doi =		{10.4230/LIPIcs.ISAAC.2021.4},
  annote =	{Keywords: range spaces, halfspaces, scan statistics, fine-grained complexity}
}
Document
The VC-Dimension of Limited Visibility Terrains

Authors: Matt Gibson-Lopez and Zhongxiu Yang


Abstract
Visibility problems are fundamental to computational geometry, and many versions of geometric set cover where coverage is based on visibility have been considered. In most settings, points can see "infinitely far" so long as visibility is not "blocked" by some obstacle. In many applications, this may be an unreasonable assumption. In this paper, we consider a new model of visibility where no point can see any other point beyond a sight radius ρ. In particular, we consider this visibility model in the context of terrains. We show that the VC-dimension of limited visibility terrains is exactly 7. We give a lower bound construction that shatters a set of 7 points, and we prove that shattering 8 points is not possible.
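
The following editorial sketch spells out the shattering condition behind the VC-dimension bound; a range is given abstractly as the set of points it covers (in the paper, the points a terrain vertex sees within radius ρ), and the two-point example is hypothetical.

from itertools import combinations

def is_shattered(points, ranges):
    # A point set is shattered by a range family if every one of its subsets
    # arises as the intersection of the set with some range.
    pts = frozenset(points)
    realized = {frozenset(r) & pts for r in ranges}
    return all(frozenset(s) in realized
               for k in range(len(pts) + 1)
               for s in combinations(pts, k))

# Two points and the four "visibility sets" {}, {a}, {b}, {a,b}: shattered,
# so the VC-dimension of this toy range space is at least 2.
print(is_shattered({'a', 'b'}, [set(), {'a'}, {'b'}, {'a', 'b'}]))   # True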

Cite as

Matt Gibson-Lopez and Zhongxiu Yang. The VC-Dimension of Limited Visibility Terrains. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 5:1-5:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{gibsonlopez_et_al:LIPIcs.ISAAC.2021.5,
  author =	{Gibson-Lopez, Matt and Yang, Zhongxiu},
  title =	{{The VC-Dimension of Limited Visibility Terrains}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{5:1--5:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.5},
  URN =		{urn:nbn:de:0030-drops-154386},
  doi =		{10.4230/LIPIcs.ISAAC.2021.5},
  annote =	{Keywords: VC-dimension, Terrain Guarding, Limited Visibility}
}
Document
Clustering with Neighborhoods

Authors: Hongyao Huang, Georgiy Klimenko, and Benjamin Raichel


Abstract
In the standard planar k-center clustering problem, one is given a set P of n points in the plane, and the goal is to select k center points, so as to minimize the maximum distance over points in P to their nearest center. Here we initiate the systematic study of the clustering with neighborhoods problem, which generalizes the k-center problem to allow the covered objects to be a set of general disjoint convex objects C rather than just a point set P. For this problem we first show that there is a PTAS for approximating the number of centers. Specifically, if r_opt is the optimal radius for k centers, then in n^O(1/ε²) time we can produce a set of (1+ε)k centers with radius ≤ r_opt. If instead one considers the standard goal of approximating the optimal clustering radius, while keeping k as a hard constraint, we show that the radius cannot be approximated within any factor in polynomial time unless P = NP, even when C is a set of line segments. When C is a set of unit disks we show the problem is hard to approximate within a factor of (√13-√3)/(2-√3) ≈ 6.99. This hardness result complements our main result, where we show that when the objects are disks, of possibly differing radii, there is a (5+2√3) ≈ 8.46 approximation algorithm. Additionally, for unit disks we give an O(n log k) + (k/ε)^O(k) time (1+ε)-approximation to the optimal radius, that is, an FPTAS for constant k whose running time depends only linearly on n. Finally, we show that the one dimensional version of the problem, even when intersections are allowed, can be solved exactly in O(n log n) time.

Cite as

Hongyao Huang, Georgiy Klimenko, and Benjamin Raichel. Clustering with Neighborhoods. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 6:1-6:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{huang_et_al:LIPIcs.ISAAC.2021.6,
  author =	{Huang, Hongyao and Klimenko, Georgiy and Raichel, Benjamin},
  title =	{{Clustering with Neighborhoods}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{6:1--6:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.6},
  URN =		{urn:nbn:de:0030-drops-154398},
  doi =		{10.4230/LIPIcs.ISAAC.2021.6},
  annote =	{Keywords: Clustering, Approximation, Hardness}
}
Document
Approximating Longest Spanning Tree with Neighborhoods

Authors: Ahmad Biniaz


Abstract
We study the following maximization problem in the Euclidean plane: Given a collection of neighborhoods (polygonal regions) in the plane, the goal is to select a point in each neighborhood so that the longest spanning tree on selected points has maximum length. It is not known whether or not this problem is NP-hard. We present an approximation algorithm with ratio 0.548 for this problem. This improves the previous best known ratio of 0.511. The presented algorithm takes linear time after computing a diameter. Even though our algorithm itself is fairly simple, its analysis is rather involved. In part of the analysis we deal with a minimization problem with multiple variables. We use a sequence of geometric transformations to reduce the number of variables and simplify the analysis.

Cite as

Ahmad Biniaz. Approximating Longest Spanning Tree with Neighborhoods. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 7:1-7:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{biniaz:LIPIcs.ISAAC.2021.7,
  author =	{Biniaz, Ahmad},
  title =	{{Approximating Longest Spanning Tree with Neighborhoods}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{7:1--7:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.7},
  URN =		{urn:nbn:de:0030-drops-154401},
  doi =		{10.4230/LIPIcs.ISAAC.2021.7},
  annote =	{Keywords: Euclidean maximum spanning tree, spanning tree with neighborhoods, approximation algorithms}
}
Document
Self-Improving Voronoi Construction for a Hidden Mixture of Product Distributions

Authors: Siu-Wing Cheng and Man Ting Wong


Abstract
We propose a self-improving algorithm for computing Voronoi diagrams under a given convex distance function with constant description complexity. The n input points are drawn from a hidden mixture of product distributions; we are only given an upper bound m = o(√n) on the number of distributions in the mixture, and the property that for each distribution, an input instance is drawn from it with a probability of Ω(1/n). For any ε ∈ (0,1), after spending O(mn log^O(1)(mn) + m^ε n^(1+ε) log(mn)) time in a training phase, our algorithm achieves an O(1/ε n log m + 1/ε n 2^O(log^* n) + 1/ε H) expected running time with probability at least 1 - O(1/n), where H is the entropy of the distribution of the Voronoi diagram output. The expectation is taken over the input distribution and the randomized decisions of the algorithm. For the Euclidean metric, the expected running time improves to O(1/ε n log m + 1/ε H).

Cite as

Siu-Wing Cheng and Man Ting Wong. Self-Improving Voronoi Construction for a Hidden Mixture of Product Distributions. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 8:1-8:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{cheng_et_al:LIPIcs.ISAAC.2021.8,
  author =	{Cheng, Siu-Wing and Wong, Man Ting},
  title =	{{Self-Improving Voronoi Construction for a Hidden Mixture of Product Distributions}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{8:1--8:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.8},
  URN =		{urn:nbn:de:0030-drops-154418},
  doi =		{10.4230/LIPIcs.ISAAC.2021.8},
  annote =	{Keywords: entropy, Voronoi diagram, convex distance function, hidden mixture of product distributions}
}
Document
Connected Coordinated Motion Planning with Bounded Stretch

Authors: Sándor P. Fekete, Phillip Keldenich, Ramin Kosfeld, Christian Rieck, and Christian Scheffer


Abstract
We consider the problem of coordinated motion planning for a swarm of simple, identical robots: From a given start grid configuration of robots, we need to reach a desired target configuration via a sequence of parallel, continuous, collision-free robot motions, such that the set of robots induces a connected grid graph at all integer times. The objective is to minimize the makespan of the motion schedule, i.e., to reach the new configuration in a minimum amount of time. We show that this problem is NP-hard, even for deciding whether a makespan of 2 can be achieved, while it is possible to check in polynomial time whether a makespan of 1 can be achieved. On the algorithmic side, we establish simultaneous constant-factor approximation for two fundamental parameters, by achieving constant stretch for constant scale. Scaled shapes (which arise by increasing all dimensions of a given object by the same multiplicative factor) have been considered in previous seminal work on self-assembly, often with unbounded or logarithmic scale factors; we provide methods for a generalized scale factor, bounded by a constant. Moreover, our algorithm achieves a constant stretch factor: If mapping the start configuration to the target configuration requires a maximum Manhattan distance of d, then the total duration of our overall schedule is 𝒪(d), which is optimal up to constant factors.

Cite as

Sándor P. Fekete, Phillip Keldenich, Ramin Kosfeld, Christian Rieck, and Christian Scheffer. Connected Coordinated Motion Planning with Bounded Stretch. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 9:1-9:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{fekete_et_al:LIPIcs.ISAAC.2021.9,
  author =	{Fekete, S\'{a}ndor P. and Keldenich, Phillip and Kosfeld, Ramin and Rieck, Christian and Scheffer, Christian},
  title =	{{Connected Coordinated Motion Planning with Bounded Stretch}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{9:1--9:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.9},
  URN =		{urn:nbn:de:0030-drops-154423},
  doi =		{10.4230/LIPIcs.ISAAC.2021.9},
  annote =	{Keywords: Motion planning, parallel motion, bounded stretch, scaled shape, makespan, connectivity, swarm robotics}
}
Document
Enclosing Depth and Other Depth Measures

Authors: Patrick Schnider


Abstract
We study families of depth measures defined by natural sets of axioms. We show that any such depth measure is a constant factor approximation of Tukey depth. We further investigate the dimensions of depth regions, showing that the Cascade conjecture, introduced by Kalai for Tverberg depth, holds for all depth measures which satisfy our most restrictive set of axioms, which includes Tukey depth. Along the way, we introduce and study a new depth measure called enclosing depth, which we believe to be of independent interest, and show its relation to a constant-fraction Radon theorem on certain two-colored point sets.

Cite as

Patrick Schnider. Enclosing Depth and Other Depth Measures. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 10:1-10:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{schnider:LIPIcs.ISAAC.2021.10,
  author =	{Schnider, Patrick},
  title =	{{Enclosing Depth and Other Depth Measures}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{10:1--10:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.10},
  URN =		{urn:nbn:de:0030-drops-154431},
  doi =		{10.4230/LIPIcs.ISAAC.2021.10},
  annote =	{Keywords: Depth measures, Tukey depth, Tverberg theorem, Combinatorial Geometry}
}
Document
Illuminating the x-Axis by α-Floodlights

Authors: Bengt J. Nilsson, David Orden, Leonidas Palios, Carlos Seara, and Paweł Żyliński


Abstract
Given a set S of regions with piece-wise linear boundary and a positive angle α < 90°, we consider the problem of computing the locations and orientations of the minimum number of α-floodlights positioned at points in S which suffice to illuminate the entire x-axis. We show that the problem can be solved in O(n log n) time and O(n) space, where n is the number of vertices of the set S.

Cite as

Bengt J. Nilsson, David Orden, Leonidas Palios, Carlos Seara, and Paweł Żyliński. Illuminating the x-Axis by α-Floodlights. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 11:1-11:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{nilsson_et_al:LIPIcs.ISAAC.2021.11,
  author =	{Nilsson, Bengt J. and Orden, David and Palios, Leonidas and Seara, Carlos and \.{Z}yli\'{n}ski, Pawe{\l}},
  title =	{{Illuminating the x-Axis by $\alpha$-Floodlights}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{11:1--11:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.11},
  URN =		{urn:nbn:de:0030-drops-154444},
  doi =		{10.4230/LIPIcs.ISAAC.2021.11},
  annote =	{Keywords: Computational Geometry, Visibility, Art Gallery Problems, Floodlights}
}
Document
On Geometric Priority Set Cover Problems

Authors: Aritra Banik, Rajiv Raman, and Saurabh Ray


Abstract
We study the priority set cover problem for simple geometric set systems in the plane. For pseudo-halfspaces in the plane we obtain a PTAS via local search by showing that the corresponding set system admits a planar support. We show that the problem is APX-hard even for unit disks in the plane and argue that in this case the standard local search algorithm can output a solution that is arbitrarily bad compared to the optimal solution. We then present an LP-relative constant factor approximation algorithm (which also works in the weighted setting) for unit disks via quasi-uniform sampling. As a consequence we obtain a constant factor approximation for the capacitated set cover problem with unit disks. For arbitrary size disks, we show that the problem is at least as hard as the vertex cover problem in general graphs even when the disks have nearly equal sizes. We also present a few simple results for unit squares and orthants in the plane.

Cite as

Aritra Banik, Rajiv Raman, and Saurabh Ray. On Geometric Priority Set Cover Problems. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 12:1-12:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{banik_et_al:LIPIcs.ISAAC.2021.12,
  author =	{Banik, Aritra and Raman, Rajiv and Ray, Saurabh},
  title =	{{On Geometric Priority Set Cover Problems}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{12:1--12:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.12},
  URN =		{urn:nbn:de:0030-drops-154459},
  doi =		{10.4230/LIPIcs.ISAAC.2021.12},
  annote =	{Keywords: Approximation algorithms, geometric set cover, local search, quasi-uniform sampling}
}
Document
The Complexity of Sharing a Pizza

Authors: Patrick Schnider


Abstract
Assume you have a 2-dimensional pizza with 2n ingredients that you want to share with your friend. For this you are allowed to cut the pizza using several straight cuts, and then give every second piece to your friend. You want to do this fairly, that is, your friend and you should each get exactly half of each ingredient. How many cuts do you need? It was recently shown using topological methods that n cuts always suffice. In this work, we study the computational complexity of finding such n cuts. Our main result is that this problem is PPA-complete when the ingredients are represented as point sets. For this, we give a new proof that for point sets n cuts suffice, which does not use any topological methods. We further prove several hardness results as well as a higher-dimensional variant for the case where the ingredients are well-separated.
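
As an editorial aside, one standard way to formalize "give every second piece to your friend" in the point-set version is by the parity of the number of cuts whose positive side contains a point, since adjacent cells of the line arrangement get opposite parities; the sketch below checks fairness under that convention, with a made-up cut and ingredient.

def fair_split(cuts, ingredients):
    # Each cut is a line a*x + b*y + c = 0; a point is assigned to you or to
    # your friend according to the parity of the number of cuts whose
    # positive side (a*x + b*y + c > 0) contains it.  The division is fair
    # iff every ingredient (a point set of even size, no point on a cut) is
    # split exactly in half.
    def side(p):
        return sum(a * p[0] + b * p[1] + c > 0 for a, b, c in cuts) % 2
    return all(2 * sum(side(p) for p in ing) == len(ing) for ing in ingredients)

# One ingredient of four points and a single vertical cut x = 0
# (the line 1*x + 0*y + 0 = 0): two points land on each side.
ingredient = [(-2, 1), (-1, -1), (1, 2), (2, -2)]
print(fair_split([(1, 0, 0)], [ingredient]))   # True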

Cite as

Patrick Schnider. The Complexity of Sharing a Pizza. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 13:1-13:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{schnider:LIPIcs.ISAAC.2021.13,
  author =	{Schnider, Patrick},
  title =	{{The Complexity of Sharing a Pizza}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{13:1--13:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.13},
  URN =		{urn:nbn:de:0030-drops-154460},
  doi =		{10.4230/LIPIcs.ISAAC.2021.13},
  annote =	{Keywords: pizza sharing, Ham-Sandwich theorem, PPA, computational geometry, computational complexity}
}
Document
Dynamic Data Structures for k-Nearest Neighbor Queries

Authors: Sarita de Berg and Frank Staals


Abstract
Our aim is to develop dynamic data structures that support k-nearest neighbors (k-NN) queries for a set of n point sites in O(f(n) + k) time, where f(n) is some polylogarithmic function of n. The key component is a general query algorithm that allows us to find the k-NN spread over t substructures simultaneously, thus reducing an O(tk) term in the query time to O(k). Combining this technique with the logarithmic method allows us to turn any static k-NN data structure into a data structure supporting both efficient insertions and queries. For the fully dynamic case, this technique allows us to recover the deterministic, worst-case, O(log² n / log log n + k) query time for the Euclidean distance claimed before, while preserving the polylogarithmic update times. We adapt this data structure to also support fully dynamic geodesic k-NN queries among a set of sites in a simple polygon. For this purpose, we design a shallow cutting based, deletion-only k-NN data structure. More generally, we obtain a dynamic k-NN data structure for any type of distance function for which we can build vertical shallow cuttings. We apply all of our methods in the plane for the Euclidean distance, the geodesic distance, and general, constant-complexity, algebraic distance functions.
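
The "logarithmic method" referred to above is the classical Bentley-Saxe construction; the editorial sketch below (with a plain linear-scan list standing in for the static k-NN structure) shows how it supports insertions, and why a naive query pays k candidates per substructure, the O(tk) overhead that the paper's general query algorithm reduces to O(k). All class and function names are hypothetical.

import heapq

class StaticKNN:
    # Stand-in for an arbitrary static k-NN structure: a plain list scanned
    # linearly.  (The paper plugs in far more refined static structures.)
    def __init__(self, sites):
        self.sites = list(sites)
    def knn(self, q, k):
        return heapq.nsmallest(k, self.sites,
                               key=lambda s: (s[0]-q[0])**2 + (s[1]-q[1])**2)

class LogMethodKNN:
    # Bentley-Saxe logarithmic method: keep static structures of sizes
    # 2^0, 2^1, ... and rebuild on "carries", turning a static structure
    # into an insertion-only one.  A query asks each of the O(log n)
    # substructures for its k best sites and keeps the overall k best;
    # that naive merge is the per-substructure cost discussed above.
    def __init__(self):
        self.levels = {}                     # level i -> structure of size 2^i
    def insert(self, site):
        carry, i = [site], 0
        while i in self.levels:              # merge equal-sized structures
            carry += self.levels.pop(i).sites
            i += 1
        self.levels[i] = StaticKNN(carry)
    def knn(self, q, k):
        cand = [s for st in self.levels.values() for s in st.knn(q, k)]
        return heapq.nsmallest(k, cand,
                               key=lambda s: (s[0]-q[0])**2 + (s[1]-q[1])**2)

ds = LogMethodKNN()
for p in [(0, 0), (5, 5), (1, 1), (2, 2), (9, 0)]:
    ds.insert(p)
print(ds.knn((0, 0), 3))   # the three sites nearest the origin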

Cite as

Sarita de Berg and Frank Staals. Dynamic Data Structures for k-Nearest Neighbor Queries. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 14:1-14:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{deberg_et_al:LIPIcs.ISAAC.2021.14,
  author =	{de Berg, Sarita and Staals, Frank},
  title =	{{Dynamic Data Structures for k-Nearest Neighbor Queries}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{14:1--14:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.14},
  URN =		{urn:nbn:de:0030-drops-154473},
  doi =		{10.4230/LIPIcs.ISAAC.2021.14},
  annote =	{Keywords: data structure, simple polygon, geodesic distance, nearest neighbor searching}
}
Document
Preference-Based Trajectory Clustering - An Application of Geometric Hitting Sets

Authors: Florian Barth, Stefan Funke, and Claudius Proissl


Abstract
In a road network with multicriteria edge costs we consider the problem of computing a minimum number of driving preferences such that a given set of paths/trajectories is optimal under at least one of these preferences. While the exact formulation and solution of this problem appears theoretically hard, we show that in practice one can solve the problem exactly even for non-homeopathic instance sizes of several thousand trajectories in a road network of several million nodes. We also present a parameterized guaranteed-polynomial-time scheme with very good practical performance.

Cite as

Florian Barth, Stefan Funke, and Claudius Proissl. Preference-Based Trajectory Clustering - An Application of Geometric Hitting Sets. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 15:1-15:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{barth_et_al:LIPIcs.ISAAC.2021.15,
  author =	{Barth, Florian and Funke, Stefan and Proissl, Claudius},
  title =	{{Preference-Based Trajectory Clustering - An Application of Geometric Hitting Sets}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{15:1--15:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.15},
  URN =		{urn:nbn:de:0030-drops-154481},
  doi =		{10.4230/LIPIcs.ISAAC.2021.15},
  annote =	{Keywords: Route planning, personalization, computational geometry}
}
Document
Efficiently Partitioning the Edges of a 1-Planar Graph into a Planar Graph and a Forest

Authors: Sam Barr and Therese Biedl


Abstract
1-planar graphs are graphs that can be drawn in the plane such that any edge intersects with at most one other edge. Ackerman showed that the edges of a 1-planar graph can be partitioned into a planar graph and a forest, and claims that the proof leads to a linear time algorithm. However, it is not clear how one would obtain such an algorithm from his proof. In this paper, we first reprove Ackerman’s result (in fact, we prove a slightly more general statement) and then show that the split can be found in linear time by using an edge-contraction data structure by Holm, Italiano, Karczmarz, Łącki, Rotenberg and Sankowski.

Cite as

Sam Barr and Therese Biedl. Efficiently Partitioning the Edges of a 1-Planar Graph into a Planar Graph and a Forest. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 16:1-16:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{barr_et_al:LIPIcs.ISAAC.2021.16,
  author =	{Barr, Sam and Biedl, Therese},
  title =	{{Efficiently Partitioning the Edges of a 1-Planar Graph into a Planar Graph and a Forest}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{16:1--16:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.16},
  URN =		{urn:nbn:de:0030-drops-154492},
  doi =		{10.4230/LIPIcs.ISAAC.2021.16},
  annote =	{Keywords: 1-planar graphs, edge partitions, algorithms, data structures}
}
Document
On the Kernel and Related Problems in Interval Digraphs

Authors: Mathew C. Francis, Pavol Hell, and Dalu Jacob


Abstract
Given a digraph G, a set X ⊆ V(G) is said to be an absorbing set (resp. dominating set) if every vertex in the graph is either in X or is an in-neighbour (resp. out-neighbour) of a vertex in X. A set S ⊆ V(G) is said to be an independent set if no two vertices in S are adjacent in G. A kernel (resp. solution) of G is an independent and absorbing (resp. dominating) set in G. The problem of deciding if there is a kernel (or solution) in an input digraph is known to be NP-complete. Similarly, the problems of computing a minimum cardinality kernel, absorbing set (or dominating set) and the problems of computing a maximum cardinality kernel, independent set are all known to be NP-hard for general digraphs. We explore the algorithmic complexity of these problems in the well known class of interval digraphs. A digraph G is an interval digraph if a pair of intervals (S_u,T_u) can be assigned to each vertex u of G such that (u,v) ∈ E(G) if and only if S_u ∩ T_v ≠ ∅. Many different subclasses of interval digraphs have been defined and studied in the literature by restricting the kinds of pairs of intervals that can be assigned to the vertices. We observe that several of these classes, like interval catch digraphs, interval nest digraphs, adjusted interval digraphs and chronological interval digraphs, are subclasses of the more general class of reflexive interval digraphs, which arise when we require that the two intervals assigned to a vertex intersect. Our main contribution is the identification of reflexive interval digraphs as an important class of digraphs. We show that all the problems mentioned above are efficiently solvable, in most of the cases even linear-time solvable, in the class of reflexive interval digraphs, but are APX-hard on even the very restricted class of interval digraphs called point-point digraphs, where the two intervals assigned to each vertex are required to be degenerate, i.e. they consist of a single point each. The results we obtain improve and generalize several existing algorithms and structural results for reflexive interval digraphs. We also obtain some new results for undirected graphs along the way: (a) We get an O(n(n+m)) time algorithm for computing a minimum cardinality (undirected) independent dominating set in cocomparability graphs, which slightly improves the existing O(n³) time algorithm for the same problem by Kratsch and Stewart; and (b) We show that the Red Blue Dominating Set problem, which is NP-complete even for planar bipartite graphs, is linear-time solvable on interval bigraphs, a class of bipartite (undirected) graphs closely related to interval digraphs.
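
An editorial sketch of the two definitions at the heart of the paper: adjacency in an interval digraph from the pairs (S_u, T_u), and the reflexivity condition S_u ∩ T_u ≠ ∅; the vertex names and intervals below are arbitrary.

def intersects(a, b):
    # Closed intervals a = (a1, a2) and b = (b1, b2) intersect iff neither
    # ends before the other starts.
    return a[0] <= b[1] and b[0] <= a[1]

def interval_digraph(pairs):
    # Arc set of the interval digraph: each vertex u gets a pair of
    # intervals (S_u, T_u), and (u, v) is an arc iff S_u intersects T_v.
    V = list(pairs)
    return {(u, v) for u in V for v in V
            if intersects(pairs[u][0], pairs[v][1])}

def is_reflexive(pairs):
    # The paper's key class: reflexive interval digraphs require S_u and T_u
    # to intersect for every vertex u, i.e. every loop (u, u) is present.
    return all(intersects(S, T) for S, T in pairs.values())

# Vertex u -> (S_u, T_u)
pairs = {
    'a': ((0, 2), (1, 3)),    # S_a and T_a overlap -> loop at a
    'b': ((4, 6), (5, 7)),
    'c': ((2, 3), (8, 9)),    # S_c = [2,3] and T_c = [8,9] are disjoint
}
print(is_reflexive(pairs))                    # False, because of vertex c
print(('a', 'b') in interval_digraph(pairs))  # S_a=[0,2] misses T_b=[5,7] -> False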

Cite as

Mathew C. Francis, Pavol Hell, and Dalu Jacob. On the Kernel and Related Problems in Interval Digraphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 17:1-17:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{francis_et_al:LIPIcs.ISAAC.2021.17,
  author =	{Francis, Mathew C. and Hell, Pavol and Jacob, Dalu},
  title =	{{On the Kernel and Related Problems in Interval Digraphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{17:1--17:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.17},
  URN =		{urn:nbn:de:0030-drops-154505},
  doi =		{10.4230/LIPIcs.ISAAC.2021.17},
  annote =	{Keywords: Interval digraphs, kernel, absorbing set, dominating set, independent set, algorithms, approximation hardness}
}
Document
Interval Query Problem on Cube-Free Median Graphs

Authors: Soh Kumabe


Abstract
In this paper, we introduce the interval query problem on cube-free median graphs. Let G be a cube-free median graph and 𝒮 be a commutative semigroup. For each vertex v in G, we are given an element p(v) in 𝒮. For each query, we are given two vertices u,v in G and asked to calculate the sum of p(z) over all vertices z belonging to a u-v shortest path. This is a common generalization of range query problems on trees and grids. In this paper, we provide an algorithm to answer each interval query in O(log² n) time. The required data structure is constructed in O(n log³ n) time and O(n log² n) space. To obtain our algorithm, we introduce a new technique, named the staircases decomposition, to decompose an interval of cube-free median graphs into simpler substructures.
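
As an editorial illustration of the 1D special case mentioned above (range queries on a path rather than a median graph), the sketch below answers range queries for an idempotent commutative operation with a classic sparse table; idempotence is an extra assumption made here for simplicity, and all names are hypothetical.

def build_sparse_table(values, op=min):
    # table[j][i] = op over values[i : i + 2^j]; works for commutative,
    # idempotent operations such as min.
    table = [list(values)]
    j = 1
    while (1 << j) <= len(values):
        prev = table[-1]
        table.append([op(prev[i], prev[i + (1 << (j - 1))])
                      for i in range(len(values) - (1 << j) + 1)])
        j += 1
    return table

def query(table, l, r, op=min):
    # op over values[l..r] inclusive, answered with two overlapping blocks.
    j = (r - l + 1).bit_length() - 1
    return op(table[j][l], table[j][r - (1 << j) + 1])

t = build_sparse_table([5, 2, 7, 3, 9, 1, 4])
print(query(t, 2, 5))   # min of 7, 3, 9, 1 -> 1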

Cite as

Soh Kumabe. Interval Query Problem on Cube-Free Median Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 18:1-18:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{kumabe:LIPIcs.ISAAC.2021.18,
  author =	{Kumabe, Soh},
  title =	{{Interval Query Problem on Cube-Free Median Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{18:1--18:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.18},
  URN =		{urn:nbn:de:0030-drops-154510},
  doi =		{10.4230/LIPIcs.ISAAC.2021.18},
  annote =	{Keywords: Data Structures, Range Query Problems, Median Graphs}
}
Document
Untangling Circular Drawings: Algorithms and Complexity

Authors: Sujoy Bhore, Guangping Li, Martin Nöllenburg, Ignaz Rutter, and Hsiang-Yun Wu


Abstract
We consider the problem of untangling a given (non-planar) straight-line circular drawing δ_G of an outerplanar graph G = (V,E) into a planar straight-line circular drawing by shifting a minimum number of vertices to a new position on the circle. For an outerplanar graph G, it is clear that such a crossing-free circular drawing always exists and we define the circular shifting number shift°(δ_G) as the minimum number of vertices that need to be shifted to resolve all crossings of δ_G. We show that the problem Circular Untangling, asking whether shift°(δ_G) ≤ K for a given integer K, is NP-complete. Based on this result we study Circular Untangling for almost-planar circular drawings, in which a single edge is involved in all the crossings. In this case we provide a tight upper bound shift°(δ_G) ≤ ⌊n/2⌋-1, where n is the number of vertices in G, and present a polynomial-time algorithm to compute the circular shifting number of almost-planar drawings.
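
As an editorial illustration of the quantity that the shifts are meant to eliminate: in a straight-line circular drawing, two chords with four distinct endpoints cross exactly when their endpoints interleave along the circle, so the crossings of δ_G can be counted by a simple cyclic-order test. The example order and edges are hypothetical.

def crossings(order, edges):
    # Vertices are placed on a circle in the given cyclic order; two chords
    # {a,b} and {c,d} with four distinct endpoints cross iff exactly one of
    # c, d lies on the circular arc strictly between a and b.
    pos = {v: i for i, v in enumerate(order)}
    def between(x, a, b):                    # is x strictly between a and b?
        a, b, x = pos[a], pos[b], pos[x]
        return (a < x < b) if a < b else (x > a or x < b)
    count = 0
    for i, (a, b) in enumerate(edges):
        for c, d in edges[i + 1:]:
            if len({a, b, c, d}) == 4 and between(c, a, b) != between(d, a, b):
                count += 1
    return count

# The outerplanar 4-cycle drawn in the circular order 1, 3, 2, 4 has one
# crossing ({1,2} against {3,4}); shifting a single vertex removes it.
print(crossings([1, 3, 2, 4], [(1, 2), (2, 3), (3, 4), (4, 1)]))   # 1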

Cite as

Sujoy Bhore, Guangping Li, Martin Nöllenburg, Ignaz Rutter, and Hsiang-Yun Wu. Untangling Circular Drawings: Algorithms and Complexity. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 19:1-19:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{bhore_et_al:LIPIcs.ISAAC.2021.19,
  author =	{Bhore, Sujoy and Li, Guangping and N\"{o}llenburg, Martin and Rutter, Ignaz and Wu, Hsiang-Yun},
  title =	{{Untangling Circular Drawings: Algorithms and Complexity}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{19:1--19:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.19},
  URN =		{urn:nbn:de:0030-drops-154528},
  doi =		{10.4230/LIPIcs.ISAAC.2021.19},
  annote =	{Keywords: graph drawing, straight-line drawing, outerplanarity, NP-hardness, untangling}
}
Document
Algorithms and Complexity on Indexing Elastic Founder Graphs

Authors: Massimo Equi, Tuukka Norri, Jarno Alanko, Bastien Cazaux, Alexandru I. Tomescu, and Veli Mäkinen


Abstract
We study the problem of matching a string in a labeled graph. Previous research has shown that unless the Orthogonal Vectors Hypothesis (OVH) is false, one cannot solve this problem in strongly sub-quadratic time, nor index the graph in polynomial time to answer queries efficiently (Equi et al. ICALP 2019, SOFSEM 2021). These conditional lower bounds cover even deterministic graphs with a binary alphabet, but there also naturally exist graph classes that are easy to index: e.g., Wheeler graphs (Gagie et al. Theor. Comp. Sci. 2017) cover graphs admitting a Burrows-Wheeler-transform-based indexing scheme. However, it is NP-complete to recognize if a graph is a Wheeler graph (Gibney, Thankachan, ESA 2019). We propose an approach to alleviate the construction bottleneck of Wheeler graphs. Rather than starting from an arbitrary graph, we study graphs induced from multiple sequence alignments. Elastic degenerate strings (Bernardini et al. SPIRE 2017, ICALP 2019) can be seen as such graphs, and we introduce here their generalization: elastic founder graphs. We first prove that even such induced graphs are hard to index under OVH. Then we introduce two subclasses that are easy to index. Moreover, we give a near-linear time algorithm to construct indexable elastic founder graphs. This algorithm is based on an earlier segmentation algorithm for gapless multiple sequence alignments inducing non-elastic founder graphs (Mäkinen et al., WABI 2020), but uses more involved techniques to cope with repetitive string collections synchronized with gaps. Finally, we show that one of the subclasses admits a reduction to Wheeler graphs in polynomial time.

Cite as

Massimo Equi, Tuukka Norri, Jarno Alanko, Bastien Cazaux, Alexandru I. Tomescu, and Veli Mäkinen. Algorithms and Complexity on Indexing Elastic Founder Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 20:1-20:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{equi_et_al:LIPIcs.ISAAC.2021.20,
  author =	{Equi, Massimo and Norri, Tuukka and Alanko, Jarno and Cazaux, Bastien and Tomescu, Alexandru I. and M\"{a}kinen, Veli},
  title =	{{Algorithms and Complexity on Indexing Elastic Founder Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{20:1--20:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.20},
  URN =		{urn:nbn:de:0030-drops-154532},
  doi =		{10.4230/LIPIcs.ISAAC.2021.20},
  annote =	{Keywords: orthogonal vectors hypothesis, multiple sequence alignment, segmentation}
}
Document
Partitioning H-Free Graphs of Bounded Diameter

Authors: Christoph Brause, Petr Golovach, Barnaby Martin, Daniël Paulusma, and Siani Smith


Abstract
A natural way of increasing our understanding of NP-complete graph problems is to restrict the input to a special graph class. Classes of H-free graphs, that is, graphs that do not contain some graph H as an induced subgraph, have proven to be an ideal testbed for such a complexity study. However, if the forbidden graph H contains a cycle or claw, then these problems often stay NP-complete. A recent complexity study (MFCS 2019) on the k-Colouring problem shows that we may still obtain tractable results if we also bound the diameter of the H-free input graph. We continue this line of research by initiating a complexity study on the impact of bounding the diameter for a variety of classical vertex partitioning problems restricted to H-free graphs. We prove that bounding the diameter does not help for Independent Set, but leads to new tractable cases for problems closely related to 3-Colouring. That is, we show that Near-Bipartiteness, Independent Feedback Vertex Set, Independent Odd Cycle Transversal, Acyclic 3-Colouring and Star 3-Colouring are all polynomial-time solvable for chair-free graphs of bounded diameter. To obtain these results we exploit a new structural property of 3-colourable chair-free graphs.
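
As an editorial reminder of the central definition, a graph G is H-free when it contains no induced copy of H; the brute-force check below makes this concrete for the claw mentioned above, using a hypothetical 5-cycle example (the check is exponential and purely illustrative).

from itertools import combinations, permutations

def is_H_free(G, H):
    # G and H are undirected graphs given as adjacency dicts (vertex -> set
    # of neighbours).  G is H-free iff no vertex subset of G induces a
    # subgraph isomorphic to H.
    gv, hv = list(G), list(H)
    def adj(D, a, b):
        return b in D[a]
    for sub in combinations(gv, len(hv)):
        for perm in permutations(sub):       # try every vertex correspondence
            if all(adj(G, perm[i], perm[j]) == adj(H, hv[i], hv[j])
                   for i, j in combinations(range(len(hv)), 2)):
                return False                  # found an induced copy of H
    return True

# The claw K_{1,3}: centre 0 joined to leaves 1, 2, 3.
claw = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}
# A 5-cycle has maximum degree 2, so it is claw-free.
C5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print(is_H_free(C5, claw))                    # True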

Cite as

Christoph Brause, Petr Golovach, Barnaby Martin, Daniël Paulusma, and Siani Smith. Partitioning H-Free Graphs of Bounded Diameter. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 21:1-21:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


BibTeX

@InProceedings{brause_et_al:LIPIcs.ISAAC.2021.21,
  author =	{Brause, Christoph and Golovach, Petr and Martin, Barnaby and Paulusma, Dani\"{e}l and Smith, Siani},
  title =	{{Partitioning H-Free Graphs of Bounded Diameter}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{21:1--21:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.21},
  URN =		{urn:nbn:de:0030-drops-154543},
  doi =		{10.4230/LIPIcs.ISAAC.2021.21},
  annote =	{Keywords: vertex partitioning problem, H-free, diameter, complexity dichotomy}
}
Document
Clique-Based Separators for Geometric Intersection Graphs

Authors: Mark de Berg, Sándor Kisfaludi-Bak, Morteza Monemizadeh, and Leonidas Theocharous


Abstract
Let F be a set of n objects in the plane and let 𝒢^{×}(F) be its intersection graph. A balanced clique-based separator of 𝒢^{×}(F) is a set 𝒮 consisting of cliques whose removal partitions 𝒢^{×}(F) into components of size at most δ n, for some fixed constant δ < 1. The weight of a clique-based separator is defined as ∑_{C ∈ 𝒮} log (|C|+1). Recently De Berg et al. (SICOMP 2020) proved that if F consists of convex fat objects, then 𝒢^{×}(F) admits a balanced clique-based separator of weight O(√n). We extend this result in several directions, obtaining the following results.
- Map graphs admit a balanced clique-based separator of weight O(√n), which is tight in the worst case.
- Intersection graphs of pseudo-disks admit a balanced clique-based separator of weight O(n^{2/3} log n). If the pseudo-disks are polygonal and of total complexity O(n) then the weight of the separator improves to O(√n log n).
- Intersection graphs of geodesic disks inside a simple polygon admit a balanced clique-based separator of weight O(n^{2/3} log n).
- Visibility-restricted unit-disk graphs in a polygonal domain with r reflex vertices admit a balanced clique-based separator of weight O(√n + r log(n/r)), which is tight in the worst case.
These results immediately imply sub-exponential algorithms for MAXIMUM INDEPENDENT SET (and, hence, VERTEX COVER), for FEEDBACK VERTEX SET, and for q-Coloring for constant q in these graph classes.
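
An editorial sketch of the two defining conditions of a balanced clique-based separator, namely the weight ∑_{C ∈ 𝒮} log(|C|+1) and components of size at most δn after removing the separator; the small graph and the choice δ = 2/3 are assumptions for illustration.

import math

def separator_weight(cliques):
    # Weight of a clique-based separator: sum over the cliques C of
    # log(|C| + 1) (natural log here; the abstract does not fix a base).
    return sum(math.log(len(C) + 1) for C in cliques)

def is_balanced(graph, cliques, delta):
    # Check that removing the separator's vertices leaves only connected
    # components of size at most delta * n (graph = adjacency dict).
    removed = {v for C in cliques for v in C}
    seen, n = set(removed), len(graph)
    for s in graph:
        if s in seen:
            continue
        stack, comp = [s], 0
        while stack:
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v); comp += 1
            stack.extend(u for u in graph[v] if u not in seen)
        if comp > delta * n:
            return False
    return True

# Small graph in which the triangle {2, 3, 4} separates {0, 1} from {5, 6}.
G = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3, 4], 3: [2, 4, 5], 4: [2, 3, 6],
     5: [3, 6], 6: [4, 5]}
S = [[2, 3, 4]]                      # a single separating clique
print(separator_weight(S))           # log 4, about 1.39
print(is_balanced(G, S, delta=2/3))  # components {0,1} and {5,6} -> True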

Cite as

Mark de Berg, Sándor Kisfaludi-Bak, Morteza Monemizadeh, and Leonidas Theocharous. Clique-Based Separators for Geometric Intersection Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 22:1-22:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{deberg_et_al:LIPIcs.ISAAC.2021.22,
  author =	{de Berg, Mark and Kisfaludi-Bak, S\'{a}ndor and Monemizadeh, Morteza and Theocharous, Leonidas},
  title =	{{Clique-Based Separators for Geometric Intersection Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{22:1--22:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.22},
  URN =		{urn:nbn:de:0030-drops-154556},
  doi =		{10.4230/LIPIcs.ISAAC.2021.22},
  annote =	{Keywords: Computational geometry, intersection graphs, separator theorems}
}
Document
Near-Optimal Distance Oracles for Vertex-Labeled Planar Graphs

Authors: Jacob Evald, Viktor Fredslund-Hansen, and Christian Wulff-Nilsen


Abstract
Given an undirected n-vertex planar graph G = (V,E,ω) with non-negative edge weight function ω:E → ℝ and given an assigned label to each vertex, a vertex-labeled distance oracle is a data structure which for any query consisting of a vertex u and a label λ reports the shortest path distance from u to the nearest vertex with label λ. We show that if there is a distance oracle for undirected n-vertex planar graphs with non-negative edge weights using s(n) space and with query time q(n), then there is a vertex-labeled distance oracle with Õ(s(n)) space and Õ(q(n)) query time. Using the state-of-the-art distance oracle of Long and Pettie [Long and Pettie, 2021], our construction produces a vertex-labeled distance oracle using n^{1+o(1)} space and query time Õ(1) at one extreme, Õ(n) space and n^o(1) query time at the other extreme, as well as such oracles for the full tradeoff between space and query time obtained in their paper. This is the first non-trivial exact vertex-labeled distance oracle for planar graphs and, to our knowledge, for any interesting graph class other than trees.
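For concreteness, a naive baseline that answers one vertex-labeled query with a single Dijkstra run is sketched below (Python with networkx; illustration only, since the whole point of the oracles above is to avoid this per-query cost).

import networkx as nx

def vertex_label_distance(G, labels, u, lam):
    # shortest-path distance from u to the nearest vertex carrying label lam;
    # G is undirected with non-negative edge attribute 'weight', labels maps vertex -> label
    dist = nx.single_source_dijkstra_path_length(G, u, weight="weight")
    hits = [d for v, d in dist.items() if labels.get(v) == lam]
    return min(hits) if hits else float("inf")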

Cite as

Jacob Evald, Viktor Fredslund-Hansen, and Christian Wulff-Nilsen. Near-Optimal Distance Oracles for Vertex-Labeled Planar Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 23:1-23:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{evald_et_al:LIPIcs.ISAAC.2021.23,
  author =	{Evald, Jacob and Fredslund-Hansen, Viktor and Wulff-Nilsen, Christian},
  title =	{{Near-Optimal Distance Oracles for Vertex-Labeled Planar Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{23:1--23:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.23},
  URN =		{urn:nbn:de:0030-drops-154566},
  doi =		{10.4230/LIPIcs.ISAAC.2021.23},
  annote =	{Keywords: distance oracle, vertex labels, color distance oracle, planar graph}
}
Document
A Characterization of Individualization-Refinement Trees

Authors: Markus Anders, Jendrik Brachter, and Pascal Schweitzer


Abstract
Individualization-Refinement (IR) algorithms form the standard method, and currently the only practical method, for symmetry computations of graphs and combinatorial objects in general. Through backtracking, on each graph an IR-algorithm implicitly creates an IR-tree whose order is the determining factor of the running time of the algorithm. We give a precise and constructive characterization of which trees are IR-trees. This characterization applies both when the tree is regarded as an uncolored object and when it is regarded as a colored object whose vertex colors stem from a node invariant. We also provide a construction that, given a tree, produces a corresponding graph whenever possible. This provides a constructive proof that our necessary conditions are also sufficient for the characterization.

Cite as

Markus Anders, Jendrik Brachter, and Pascal Schweitzer. A Characterization of Individualization-Refinement Trees. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 24:1-24:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{anders_et_al:LIPIcs.ISAAC.2021.24,
  author =	{Anders, Markus and Brachter, Jendrik and Schweitzer, Pascal},
  title =	{{A Characterization of Individualization-Refinement Trees}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{24:1--24:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.24},
  URN =		{urn:nbn:de:0030-drops-154578},
  doi =		{10.4230/LIPIcs.ISAAC.2021.24},
  annote =	{Keywords: individualization refinement algorithms, backtracking trees, graph isomorphism}
}
Document
Truly Subquadratic Exact Distance Oracles with Constant Query Time for Planar Graphs

Authors: Viktor Fredslund-Hansen, Shay Mozes, and Christian Wulff-Nilsen


Abstract
We present a truly subquadratic size distance oracle for reporting, in constant time, the exact shortest-path distance between any pair of vertices of an undirected, unweighted planar graph G. For any ε > 0, our distance oracle requires O(n^{5/3+ε}) space and is capable of answering shortest-path distance queries exactly for any pair of vertices of G in worst-case time O(log(1/ε)). Previously, no truly subquadratic size distance oracle with constant query time for exact shortest-path distance queries was known.

Cite as

Viktor Fredslund-Hansen, Shay Mozes, and Christian Wulff-Nilsen. Truly Subquadratic Exact Distance Oracles with Constant Query Time for Planar Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 25:1-25:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{fredslundhansen_et_al:LIPIcs.ISAAC.2021.25,
  author =	{Fredslund-Hansen, Viktor and Mozes, Shay and Wulff-Nilsen, Christian},
  title =	{{Truly Subquadratic Exact Distance Oracles with Constant Query Time for Planar Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{25:1--25:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.25},
  URN =		{urn:nbn:de:0030-drops-154586},
  doi =		{10.4230/LIPIcs.ISAAC.2021.25},
  annote =	{Keywords: distance oracle, planar graph, shortest paths, subquadratic}
}
Document
Interval Edge Coloring of Bipartite Graphs with Small Vertex Degrees

Authors: Anna Małafiejska, Michał Małafiejski, Krzysztof M. Ocetkiewicz, and Krzysztof Pastuszak


Abstract
An edge coloring of a graph G is called an interval edge coloring if for each v ∈ V(G) the set of colors on edges incident to v forms an interval of integers. A graph G is interval colorable if there is an interval coloring of G. For an interval colorable graph G, by the interval chromatic index of G, denoted by χ'_i(G), we mean the smallest number k such that G is interval colorable with k colors. A bipartite graph G is called (α,β)-biregular if each vertex in one part has degree α and each vertex in the other part has degree β. A graph G is called (α*,β*)-bipartite if G is a subgraph of an (α,β)-biregular graph and the maximum degree in one part is α and the maximum degree in the other part is β. In the paper we study the problem of interval edge colorings of (k*,2*)-bipartite graphs, for k ∈ {3,4,5}, and of (5*,3*)-bipartite graphs. We prove that every (5*,2*)-bipartite graph admits an interval edge coloring using at most 6 colors, which can be found in O(n^{3/2}) time, and we prove that an interval edge 5-coloring of a (5*,2*)-bipartite graph can be found in O(n^{3/2}) time, if it exists. We show that every (4*,2*)-bipartite graph admits an interval edge 4-coloring, which can be found in O(n) time. The following two problems of interval edge coloring are known to be NP-complete: 6-coloring of (6,3)-biregular graphs (Asratian and Casselgren (2006)) and 5-coloring of (5*,5*)-bipartite graphs (Giaro (1997)). In the paper we prove NP-completeness of 5-coloring of (5*,3*)-bipartite graphs.
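A short Python sketch of the defining property (our own checker, not an algorithm from the paper): at every vertex the incident edges must receive pairwise distinct colors that form a set of consecutive integers.

def is_interval_edge_coloring(edges, color):
    # edges: list of pairs (u, v); color: dict mapping each such pair to an integer color
    incident = {}
    for e in edges:
        u, v = e
        incident.setdefault(u, []).append(color[e])
        incident.setdefault(v, []).append(color[e])
    for cols in incident.values():
        if len(set(cols)) != len(cols):              # proper: distinct colors at each vertex
            return False
        if max(cols) - min(cols) + 1 != len(cols):   # interval: consecutive integers
            return False
    return True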

Cite as

Anna Małafiejska, Michał Małafiejski, Krzysztof M. Ocetkiewicz, and Krzysztof Pastuszak. Interval Edge Coloring of Bipartite Graphs with Small Vertex Degrees. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 26:1-26:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{malafiejska_et_al:LIPIcs.ISAAC.2021.26,
  author =	{Ma{\l}afiejska, Anna and Ma{\l}afiejski, Micha{\l} and Ocetkiewicz, Krzysztof M. and Pastuszak, Krzysztof},
  title =	{{Interval Edge Coloring of Bipartite Graphs with Small Vertex Degrees}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{26:1--26:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.26},
  URN =		{urn:nbn:de:0030-drops-154596},
  doi =		{10.4230/LIPIcs.ISAAC.2021.26},
  annote =	{Keywords: interval edge coloring, biregular graphs, coloring algorithm}
}
Document
Selected Neighbor Degree Forest Realization

Authors: Amotz Bar-Noy, David Peleg, Dror Rawitz, and Elad Yehezkel


Abstract
The classical degree realization problem is defined as follows: Given a sequence d̄ = (d_1,…,d_n) of positive integers, construct an n-vertex graph in which each vertex u_i has degree d_i (or decide that no such graph exists). In this article, we present and study the related selected neighbor degree realization problem, which requires that each vertex u_i of G has a neighbor of degree d_i. We solve the problem when G is required to be acyclic (i.e., a forest), and present a sufficient and necessary condition for a given sequence to be realizable.
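The realization condition itself is easy to state in code; the following Python check (using networkx) verifies that a given forest G realizes a sequence d in the selected-neighbor sense, i.e. every vertex v has some neighbor whose degree equals d[v]. Deciding whether such a forest exists at all is what the paper characterizes.

import networkx as nx

def realizes_selected_neighbor_degrees(G, d):
    # d maps each vertex v of G to the degree d[v] that some neighbor of v must have
    deg = dict(G.degree())
    return nx.is_forest(G) and all(
        any(deg[w] == d[v] for w in G.neighbors(v)) for v in G.nodes
    )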

Cite as

Amotz Bar-Noy, David Peleg, Dror Rawitz, and Elad Yehezkel. Selected Neighbor Degree Forest Realization. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 27:1-27:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{barnoy_et_al:LIPIcs.ISAAC.2021.27,
  author =	{Bar-Noy, Amotz and Peleg, David and Rawitz, Dror and Yehezkel, Elad},
  title =	{{Selected Neighbor Degree Forest Realization}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{27:1--27:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.27},
  URN =		{urn:nbn:de:0030-drops-154609},
  doi =		{10.4230/LIPIcs.ISAAC.2021.27},
  annote =	{Keywords: network realization, graph algorithms, lower bound}
}
Document
MAX CUT in Weighted Random Intersection Graphs and Discrepancy of Sparse Random Set Systems

Authors: Sotiris Nikoletseas, Christoforos Raptopoulos, and Paul Spirakis


Abstract
Let V be a set of n vertices, M a set of m labels, and let 𝐑 be an m × n matrix of independent Bernoulli random variables with probability of success p; columns of 𝐑 are incidence vectors of label sets assigned to vertices. A random instance G(V, E, 𝐑^T 𝐑) of the weighted random intersection graph model is constructed by drawing an edge with weight equal to the number of common labels (namely [𝐑^T 𝐑]_{v,u}) between any two vertices u, v for which this weight is strictly larger than 0. In this paper we carry out an average-case analysis of Weighted Max Cut, assuming the input is a weighted random intersection graph, i.e. given G(V, E, 𝐑^T 𝐑) we wish to find a partition of V into two sets so that the total weight of the edges having exactly one endpoint in each set is maximized. In particular, we initially prove that the weight of a maximum cut of G(V, E, 𝐑^T 𝐑) is concentrated around its expected value, and then show that, when the number of labels is much smaller than the number of vertices (in particular, m = n^α, α < 1), a random partition of the vertices achieves asymptotically optimal cut weight with high probability. Furthermore, in the case n = m and constant average degree (i.e. p = Θ(1)/n), we show that with high probability, a majority-type randomized algorithm outputs a cut with weight that is larger than the weight of a random cut by a multiplicative constant strictly larger than 1. Then, we formally prove a connection between the computational problem of finding a (weighted) maximum cut in G(V, E, 𝐑^T 𝐑) and the problem of finding a 2-coloring that achieves minimum discrepancy for a set system Σ with incidence matrix 𝐑 (i.e. minimum imbalance over all sets in Σ). We exploit this connection by proposing a (weak) bipartization algorithm for the case m = n, p = Θ(1)/n whose output, when it terminates, can be used to find a 2-coloring with minimum discrepancy in a set system with incidence matrix 𝐑. In fact, with high probability, the latter 2-coloring corresponds to a bipartition with maximum cut-weight in G(V, E, 𝐑^T 𝐑). Finally, we prove that our (weak) bipartization algorithm terminates in polynomial time, with high probability, at least when p = c/n, c < 1.
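A minimal numpy sketch of the model and of the random-cut baseline analysed above; the concrete parameter values are chosen only for illustration.

import numpy as np

rng = np.random.default_rng(0)

def weighted_intersection_graph(n, m, p, rng):
    # sample the m x n Bernoulli(p) label-incidence matrix R; the edge weight between
    # vertices u and v is [R^T R]_{u,v}, the number of labels they share (no self-loops)
    R = (rng.random((m, n)) < p).astype(int)
    W = R.T @ R
    np.fill_diagonal(W, 0)
    return W

def cut_weight(W, side):
    # total weight of edges with exactly one endpoint on each side of the bipartition
    side = np.asarray(side, dtype=bool)
    return int(W[side][:, ~side].sum())

# a uniformly random bipartition, in the regime m = n^alpha with alpha < 1
n, m, p = 200, 40, 0.1
W = weighted_intersection_graph(n, m, p, rng)
print(cut_weight(W, rng.integers(0, 2, size=n)))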

Cite as

Sotiris Nikoletseas, Christoforos Raptopoulos, and Paul Spirakis. MAX CUT in Weighted Random Intersection Graphs and Discrepancy of Sparse Random Set Systems. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 28:1-28:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{nikoletseas_et_al:LIPIcs.ISAAC.2021.28,
  author =	{Nikoletseas, Sotiris and Raptopoulos, Christoforos and Spirakis, Paul},
  title =	{{MAX CUT in Weighted Random Intersection Graphs and Discrepancy of Sparse Random Set Systems}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{28:1--28:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.28},
  URN =		{urn:nbn:de:0030-drops-154612},
  doi =		{10.4230/LIPIcs.ISAAC.2021.28},
  annote =	{Keywords: Random Intersection Graphs, Maximum Cut, Discrepancy}
}
Document
The Impact of Geometry on Monochrome Regions in the Flip Schelling Process

Authors: Thomas Bläsius, Tobias Friedrich, Martin S. Krejca, and Louise Molitor


Abstract
Schelling’s classical segregation model gives a coherent explanation for the widespread phenomenon of residential segregation. We introduce an agent-based saturated open-city variant, the Flip Schelling Process (FSP), in which agents, placed on a graph, have one out of two types and, based on the predominant type in their neighborhood, decide whether to change their types, mimicking a new agent arriving as soon as another agent leaves the vertex. We investigate the probability that an edge {u,v} is monochrome, i.e., that both vertices u and v have the same type in the FSP, and we provide a general framework for analyzing the influence of the underlying graph topology on residential segregation. In particular, for two adjacent vertices, we show that a highly decisive common neighborhood, i.e., a common neighborhood where the absolute value of the difference between the number of vertices with different types is high, supports segregation and, moreover, that large common neighborhoods are more decisive. As an application, we study the expected behavior of the FSP on two common random graph models with and without geometry: (1) For random geometric graphs, we show that the existence of an edge {u,v} makes a highly decisive common neighborhood for u and v more likely. Based on this, we prove the existence of a constant c > 0 such that the expected fraction of monochrome edges after the FSP is at least 1/2 + c. (2) For Erdős-Rényi graphs we show that large common neighborhoods are unlikely and that the expected fraction of monochrome edges after the FSP is at most 1/2 + o(1). Our results indicate that the cluster structure of the underlying graph has a significant impact on the obtained segregation strength.

Cite as

Thomas Bläsius, Tobias Friedrich, Martin S. Krejca, and Louise Molitor. The Impact of Geometry on Monochrome Regions in the Flip Schelling Process. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 29:1-29:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{blasius_et_al:LIPIcs.ISAAC.2021.29,
  author =	{Bl\"{a}sius, Thomas and Friedrich, Tobias and Krejca, Martin S. and Molitor, Louise},
  title =	{{The Impact of Geometry on Monochrome Regions in the Flip Schelling Process}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{29:1--29:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.29},
  URN =		{urn:nbn:de:0030-drops-154623},
  doi =		{10.4230/LIPIcs.ISAAC.2021.29},
  annote =	{Keywords: Agent-based Model, Schelling Segregation, Spin System}
}
Document
Piecewise-Linear Farthest-Site Voronoi Diagrams

Authors: Franz Aurenhammer, Evanthia Papadopoulou, and Martin Suderland


Abstract
Voronoi diagrams induced by distance functions whose unit balls are convex polyhedra are piecewise-linear structures. Nevertheless, analyzing their combinatorial and algorithmic properties in dimensions three and higher is an intriguing problem. The situation turns easier when the farthest-site variants of such Voronoi diagrams are considered, where each site gets assigned the region of all points in space farthest from (rather than closest to) it. We give asymptotically tight upper and lower worst-case bounds on the combinatorial size of farthest-site Voronoi diagrams for convex polyhedral distance functions in general dimensions, and propose an optimal construction algorithm. Our approach is uniform in the sense that (1) it can be extended from point sites to sites that are convex polyhedra, (2) it covers the case where the distance function is additively and/or multiplicatively weighted, and (3) it allows an anisotropic scenario where each site gets allotted its particular convex distance polytope.

Cite as

Franz Aurenhammer, Evanthia Papadopoulou, and Martin Suderland. Piecewise-Linear Farthest-Site Voronoi Diagrams. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 30:1-30:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{aurenhammer_et_al:LIPIcs.ISAAC.2021.30,
  author =	{Aurenhammer, Franz and Papadopoulou, Evanthia and Suderland, Martin},
  title =	{{Piecewise-Linear Farthest-Site Voronoi Diagrams}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{30:1--30:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.30},
  URN =		{urn:nbn:de:0030-drops-154633},
  doi =		{10.4230/LIPIcs.ISAAC.2021.30},
  annote =	{Keywords: Voronoi diagram, farthest-site, polyhedral distance, polyhedral sites, general dimensions}
}
Document
Effective Resistance and Capacitance in Simplicial Complexes and a Quantum Algorithm

Authors: Mitchell Black and William Maxwell


Abstract
We investigate generalizations of the graph theoretic notions of effective resistance and capacitance to simplicial complexes and prove analogs of formulas known in the case of graphs. In graphs the effective resistance between two vertices is O(n); however, we show that in a simplicial complex the effective resistance of a null-homologous cycle may be exponential. This is caused by relative torsion in the simplicial complex. We provide upper bounds on both effective resistance and capacitance that are polynomial in the number of simplices as well as the maximum cardinality of the torsion subgroup of a relative homology group denoted 𝒯_{max}(𝒦). We generalize the quantum algorithm deciding st-connectivity in a graph and obtain an algorithm deciding whether or not a (d-1)-dimensional cycle γ is null-homologous in a d-dimensional simplicial complex 𝒦. The quantum algorithm has query complexity parameterized by the effective resistance and capacitance of γ. Using our upper bounds we find that the query complexity is O (n^{5/2}⋅ d^{1/2} ⋅ 𝒯_{max}(𝒦)²). Under the assumptions that γ is the boundary of a d-simplex (which may or may not be included in the complex) and that 𝒦 is relative torsion-free, we match the O(n^{3/2}) query complexity obtained for st-connectivity. These assumptions always hold in the case of st-connectivity. We provide an implementation of the algorithm whose running time is polynomial in the size of the complex and the relative torsion. Finally, we prove a duality theorem relating effective resistance and capacitance when 𝒦 is d-dimensional and admits an embedding into ℝ^{d+1}.
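For orientation, the graph case of effective resistance, which the paper generalizes to simplicial complexes, can be computed from the Moore-Penrose pseudoinverse of the graph Laplacian via R_eff(s,t) = (e_s - e_t)^T L^+ (e_s - e_t); a short Python sketch with numpy and networkx:

import numpy as np
import networkx as nx

def effective_resistance(G, s, t):
    # effective resistance between s and t, treating each edge as a unit resistor
    nodes = list(G.nodes)
    idx = {v: i for i, v in enumerate(nodes)}
    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    Lp = np.linalg.pinv(L)
    e = np.zeros(len(nodes))
    e[idx[s]], e[idx[t]] = 1.0, -1.0
    return float(e @ Lp @ e)

# sanity check: two unit resistors in series have resistance 2 between the endpoints
print(effective_resistance(nx.path_graph(3), 0, 2))  # 2.0 up to floating-point error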

Cite as

Mitchell Black and William Maxwell. Effective Resistance and Capacitance in Simplicial Complexes and a Quantum Algorithm. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 31:1-31:27, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{black_et_al:LIPIcs.ISAAC.2021.31,
  author =	{Black, Mitchell and Maxwell, William},
  title =	{{Effective Resistance and Capacitance in Simplicial Complexes and a Quantum Algorithm}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{31:1--31:27},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.31},
  URN =		{urn:nbn:de:0030-drops-154641},
  doi =		{10.4230/LIPIcs.ISAAC.2021.31},
  annote =	{Keywords: Simplicial complexes, quantum computing}
}
Document
Anonymity-Preserving Space Partitions

Authors: Úrsula Hébert-Johnson, Chinmay Sonar, Subhash Suri, and Vaishali Surianarayanan


Abstract
We consider a multidimensional space partitioning problem, which we call Anonymity-Preserving Partition. Given a set P of n points in ℝ^d and a collection H of m axis-parallel hyperplanes, the hyperplanes of H partition the space into an arrangement A(H) of rectangular cells. Given an integer parameter t > 0, we call a cell C in this arrangement deficient if 0 < |C ∩ P| < t; that is, the cell contains at least one but fewer than t data points of P. Our problem is to remove the minimum number of hyperplanes from H so that there are no deficient cells. We show that the problem is NP-complete for all dimensions d ≥ 2. We present a polynomial-time d-approximation algorithm, for any fixed d, and we also show that the problem can be solved exactly in time (2d-0.924)^k m^O(1) + O(n), where k is the solution size. The one-dimensional case of the problem, where all hyperplanes are parallel, can be solved optimally in polynomial time, but we show that a related Interval Anonymity problem is NP-complete even in one dimension.
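A small Python sketch of the quantity being controlled, restricted to the plane (d = 2) and assuming no point lies on a hyperplane: it counts the deficient cells of the arrangement of the given vertical and horizontal lines.

from bisect import bisect_right
from collections import Counter

def deficient_cells(points, xs, ys, t):
    # points: list of (x, y); xs, ys: coordinates of the vertical and horizontal lines;
    # a cell is deficient if it contains at least one but fewer than t points
    xs, ys = sorted(xs), sorted(ys)
    counts = Counter((bisect_right(xs, x), bisect_right(ys, y)) for x, y in points)
    return sum(1 for c in counts.values() if 0 < c < t)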

Cite as

Úrsula Hébert-Johnson, Chinmay Sonar, Subhash Suri, and Vaishali Surianarayanan. Anonymity-Preserving Space Partitions. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 32:1-32:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{hebertjohnson_et_al:LIPIcs.ISAAC.2021.32,
  author =	{H\'{e}bert-Johnson, \'{U}rsula and Sonar, Chinmay and Suri, Subhash and Surianarayanan, Vaishali},
  title =	{{Anonymity-Preserving Space Partitions}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{32:1--32:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.32},
  URN =		{urn:nbn:de:0030-drops-154654},
  doi =		{10.4230/LIPIcs.ISAAC.2021.32},
  annote =	{Keywords: Anonymity, Hitting Set, LP, Constant Approximation, Fixed-Parameter Tractable, Space Partitions, Parameterized Complexity}
}
Document
Impatient PPSZ - A Faster Algorithm for CSP

Authors: Shibo Li and Dominik Scheder


Abstract
PPSZ is the fastest known algorithm for (d,k)-CSP problems, for most values of d and k. It goes through the variables in random order and sets each variable randomly to one of the d colors, excluding those colors that can be ruled out by looking at few constraints at a time. We propose and analyze a modification of PPSZ: whenever all but 2 colors can be ruled out for some variable, immediately set that variable randomly to one of the remaining colors. We show that our new "impatient PPSZ" outperforms PPSZ exponentially for all k and all d ≥ 3 on formulas with a unique satisfying assignment.

Cite as

Shibo Li and Dominik Scheder. Impatient PPSZ - A Faster Algorithm for CSP. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 33:1-33:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{li_et_al:LIPIcs.ISAAC.2021.33,
  author =	{Li, Shibo and Scheder, Dominik},
  title =	{{Impatient PPSZ - A Faster Algorithm for CSP}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{33:1--33:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.33},
  URN =		{urn:nbn:de:0030-drops-154662},
  doi =		{10.4230/LIPIcs.ISAAC.2021.33},
  annote =	{Keywords: Randomized algorithms, Constraint Satisfaction Problems, exponential-time algorithms}
}
Document
Fine-Grained Meta-Theorems for Vertex Integrity

Authors: Michael Lampis and Valia Mitsou


Abstract
Vertex Integrity is a graph measure which sits squarely between two more well-studied notions, namely vertex cover and tree-depth, and that has recently gained attention as a structural graph parameter. In this paper we investigate the algorithmic trade-offs involved with this parameter from the point of view of algorithmic meta-theorems for First-Order (FO) and Monadic Second Order (MSO) logic. Our positive results are the following: (i) given a graph G of vertex integrity k and an FO formula ϕ with q quantifiers, deciding if G satisfies ϕ can be done in time 2^O(k²q + q log q) + n^O(1); (ii) for MSO formulas with q quantifiers, the same can be done in time 2^{2^O(k²+kq)} + n^O(1). Both results are obtained using kernelization arguments, which pre-process the input to sizes 2^O(k²)q and 2^O(k²+kq) respectively. The complexities of our meta-theorems are significantly better than the corresponding meta-theorems for tree-depth, which involve towers of exponentials. However, they are worse than the roughly 2^{O(kq)} and 2^{2^{O(k+q)}} complexities known for corresponding meta-theorems for vertex cover. To explain this deterioration we present two formula constructions which lead to fine-grained complexity lower bounds and establish that the dependence of our meta-theorems on k is best possible. More precisely, we show that it is not possible to decide FO formulas with q quantifiers in time 2^o(k²q), and that there exists a constant-size MSO formula which cannot be decided in time 2^{2^o(k²)}, both under the ETH. Hence, the quadratic blow-up in the dependence on k is unavoidable and vertex integrity has a complexity for FO and MSO logic which is truly intermediate between vertex cover and tree-depth.
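To make the parameter concrete, here is a brute-force Python computation of vertex integrity, following the standard definition as the minimum over vertex sets S of |S| plus the order of a largest component of G - S (exponential time, for small graphs only).

from itertools import combinations
import networkx as nx

def vertex_integrity(G):
    # min over S of |S| + size of the largest connected component of G - S
    nodes = list(G.nodes)
    best = len(nodes)
    for r in range(len(nodes) + 1):
        if r >= best:          # any larger deletion set cannot improve the value
            break
        for S in combinations(nodes, r):
            H = G.subgraph(set(nodes) - set(S))
            largest = max((len(c) for c in nx.connected_components(H)), default=0)
            best = min(best, r + largest)
    return best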

Cite as

Michael Lampis and Valia Mitsou. Fine-Grained Meta-Theorems for Vertex Integrity. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 34:1-34:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{lampis_et_al:LIPIcs.ISAAC.2021.34,
  author =	{Lampis, Michael and Mitsou, Valia},
  title =	{{Fine-Grained Meta-Theorems for Vertex Integrity}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{34:1--34:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.34},
  URN =		{urn:nbn:de:0030-drops-154674},
  doi =		{10.4230/LIPIcs.ISAAC.2021.34},
  annote =	{Keywords: Model-Checking, Fine-grained complexity, Vertex Integrity}
}
Document
Essentially Tight Kernels For (Weakly) Closed Graphs

Authors: Tomohiro Koana, Christian Komusiewicz, and Frank Sommer


Abstract
We study kernelization of classic hard graph problems when the input graphs fulfill triadic closure properties. More precisely, we consider the recently introduced parameters closure number c and weak closure number γ [Fox et al., SICOMP 2020] in addition to the standard parameter solution size k. The weak closure number γ of a graph is upper-bounded by the minimum of its closure number c and its degeneracy d. For Capacitated Vertex Cover, Connected Vertex Cover, and Induced Matching we obtain the first kernels of size k^𝒪(γ), k^𝒪(γ), and (γk)^𝒪(γ), respectively. This extends previous results on the kernelization of these problems on degenerate graphs. These kernels are essentially tight as these problems are unlikely to admit kernels of size k^o(γ) by previous results on their kernelization complexity in degenerate graphs [Cygan et al., ACM TALG 2017]. For Capacitated Vertex Cover, we show that even a kernel of size k^o(c) is unlikely. In contrast, for Connected Vertex Cover, we obtain a problem kernel with 𝒪(ck²) vertices. Moreover, we prove that searching for an induced subgraph of order at least k belonging to a hereditary graph class 𝒢 admits a kernel of size k^𝒪(γ) when 𝒢 contains all complete and all edgeless graphs. Finally, we provide lower bounds for the kernelization of Independent Set on graphs with constant closure number c and kernels for Dominating Set on weakly closed split graphs and weakly closed bipartite graphs.
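For intuition, a short Python sketch computing the closure number c in the sense of Fox et al., i.e. (to the best of our reading) the smallest c such that every pair of non-adjacent vertices has fewer than c common neighbours; the weak closure number γ used above is a refinement that is at most c.

from itertools import combinations

def closure_number(G):
    # largest number of common neighbours over non-adjacent pairs, plus one
    worst = 0
    for u, v in combinations(G.nodes, 2):
        if not G.has_edge(u, v):
            worst = max(worst, len(set(G[u]) & set(G[v])))
    return worst + 1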

Cite as

Tomohiro Koana, Christian Komusiewicz, and Frank Sommer. Essentially Tight Kernels For (Weakly) Closed Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 35:1-35:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{koana_et_al:LIPIcs.ISAAC.2021.35,
  author =	{Koana, Tomohiro and Komusiewicz, Christian and Sommer, Frank},
  title =	{{Essentially Tight Kernels For (Weakly) Closed Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{35:1--35:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.35},
  URN =		{urn:nbn:de:0030-drops-154681},
  doi =		{10.4230/LIPIcs.ISAAC.2021.35},
  annote =	{Keywords: Fixed-parameter tractability, kernelization, c-closure, weak \gamma-closure, Independent Set, Induced Matching, Connected Vertex Cover, Ramsey numbers, Dominating Set}
}
Document
Filling Crosswords Is Very Hard

Authors: Laurent Gourvès, Ararat Harutyunyan, Michael Lampis, and Nikolaos Melissinos


Abstract
We revisit a classical crossword filling puzzle which already appeared in Garey and Johnson’s book. We are given a grid with n vertical and horizontal slots and a dictionary with m words and are asked to place words from the dictionary in the slots so that shared cells are consistent. We attempt to pinpoint the source of intractability of this problem by carefully taking into account the structure of the grid graph, which contains a vertex for each slot and an edge if two slots intersect. Our main approach is to consider the case where this graph has a tree-like structure. Unfortunately, if we impose the common rule that words cannot be reused, we discover that the problem remains NP-hard under very severe structural restrictions, namely, if the grid graph is a union of stars and the alphabet has size 2, or the grid graph is a matching (so the crossword is a collection of disjoint crosses) and the alphabet has size 3. The problem does become slightly more tractable if word reuse is allowed, as we obtain an m^{tw} algorithm in this case, where tw is the treewidth of the grid graph. However, even in this case, we show that our algorithm cannot be improved to obtain fixed-parameter tractability. More strongly, we show that under the ETH the problem cannot be solved in time m^o(k), where k is the number of horizontal slots of the instance (which trivially bounds tw). Motivated by these mostly negative results, we also consider the much more restricted case where the problem is parameterized by the number of slots n. Here, we show that the problem does become FPT (if the alphabet has constant size), but the parameter dependence is exponential in n². We show that this dependence is also justified: the existence of an algorithm with running time 2^o(n²), even for a binary alphabet, would contradict the randomized ETH. Finally, we consider an optimization version of the problem, where we seek to place as many words on the grid as possible. Here it is easy to obtain a 1/2-approximation, even on weighted instances, simply by considering only horizontal or only vertical slots. We show that this trivial algorithm is also likely to be optimal, as obtaining a better approximation ratio in polynomial time would contradict the Unique Games Conjecture. The latter two results apply whether word reuse is allowed or not.

Cite as

Laurent Gourvès, Ararat Harutyunyan, Michael Lampis, and Nikolaos Melissinos. Filling Crosswords Is Very Hard. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 36:1-36:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{gourves_et_al:LIPIcs.ISAAC.2021.36,
  author =	{Gourv\`{e}s, Laurent and Harutyunyan, Ararat and Lampis, Michael and Melissinos, Nikolaos},
  title =	{{Filling Crosswords Is Very Hard}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{36:1--36:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.36},
  URN =		{urn:nbn:de:0030-drops-154690},
  doi =		{10.4230/LIPIcs.ISAAC.2021.36},
  annote =	{Keywords: Crossword Puzzle, Treewidth, ETH}
}
Document
Grid Recognition: Classical and Parameterized Computational Perspectives

Authors: Siddharth Gupta, Guy Sa'ar, and Meirav Zehavi


Abstract
Grid graphs, and, more generally, k×r grid graphs, form one of the most basic classes of geometric graphs. Over the past few decades, a large body of works studied the (in)tractability of various computational problems on grid graphs, which often yield substantially faster algorithms than general graphs. Unfortunately, the recognition of a grid graph (given a graph G, decide whether it can be embedded into a grid graph) is particularly hard - it was shown to be NP-hard even on trees of pathwidth 3 already in 1987. Yet, in this paper, we provide several positive results in this regard in the framework of parameterized complexity (additionally, we present new and complementary hardness results). Specifically, our contribution is threefold. First, we show that the problem is fixed-parameter tractable (FPT) parameterized by k+mcc where mcc is the maximum size of a connected component of G. This also implies that the problem is FPT parameterized by td+k where td is the treedepth of G, as td ≤ mcc (to be compared with the hardness for pathwidth 2 where k = 3). (We note that when k and r are unrestricted, the problem is trivially FPT parameterized by td.) Further, we derive as a corollary that strip packing is FPT with respect to the height of the strip plus the maximum of the dimensions of the packed rectangles, which was previously only known to be in XP. Second, we present a new parameterization, denoted a_G, relating graph distance to geometric distance, which may be of independent interest. We show that the problem is para-NP-hard parameterized by a_G, but FPT parameterized by a_G on trees, as well as FPT parameterized by k+a_G. Third, we show that the recognition of k× r grid graphs is NP-hard on graphs of pathwidth 2 where k = 3. Moreover, when k and r are unrestricted, we show that the problem is NP-hard on trees of pathwidth 2, but trivially solvable in polynomial time on graphs of pathwidth 1.

Cite as

Siddharth Gupta, Guy Sa'ar, and Meirav Zehavi. Grid Recognition: Classical and Parameterized Computational Perspectives. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 37:1-37:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{gupta_et_al:LIPIcs.ISAAC.2021.37,
  author =	{Gupta, Siddharth and Sa'ar, Guy and Zehavi, Meirav},
  title =	{{Grid Recognition: Classical and Parameterized Computational Perspectives}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{37:1--37:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.37},
  URN =		{urn:nbn:de:0030-drops-154703},
  doi =		{10.4230/LIPIcs.ISAAC.2021.37},
  annote =	{Keywords: Grid Recognition, Grid Graph, Parameterized Complexity}
}
Document
An Improved Approximation Algorithm for the Matching Augmentation Problem

Authors: Joseph Cheriyan, Robert Cummings, Jack Dippel, and Jasper Zhu


Abstract
We present a 5/3-approximation algorithm for the matching augmentation problem (MAP): given a multi-graph with edges of cost either zero or one such that the edges of cost zero form a matching, find a 2-edge connected spanning subgraph (2-ECSS) of minimum cost. A 7/4-approximation algorithm for the same problem was presented recently, see Cheriyan, et al., "The matching augmentation problem: a 7/4-approximation algorithm," Math. Program., 182(1):315-354, 2020. Our improvement is based on new algorithmic techniques, and some of these may lead to advances on related problems.

Cite as

Joseph Cheriyan, Robert Cummings, Jack Dippel, and Jasper Zhu. An Improved Approximation Algorithm for the Matching Augmentation Problem. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 38:1-38:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{cheriyan_et_al:LIPIcs.ISAAC.2021.38,
  author =	{Cheriyan, Joseph and Cummings, Robert and Dippel, Jack and Zhu, Jasper},
  title =	{{An Improved Approximation Algorithm for the Matching Augmentation Problem}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{38:1--38:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.38},
  URN =		{urn:nbn:de:0030-drops-154714},
  doi =		{10.4230/LIPIcs.ISAAC.2021.38},
  annote =	{Keywords: 2-Edge connected graph, 2-edge covers, approximation algorithms, connectivity augmentation, forest augmentation problem, matching augmentation problem, network design}
}
Document
Multimodal Transportation with Ridesharing of Personal Vehicles

Authors: Qian-Ping Gu and JiaJian Liang


Abstract
Many public transportation systems are unable to keep up with growing passenger demand as the population grows in urban areas. The slow pace of improvement, or lack thereof, in public transportation pushes people to use private transportation modes, such as carpooling and ridesharing. However, the occupancy rate of personal vehicles has been dropping in many cities. In this paper, we describe a centralized transit system that integrates public transit and ridesharing, matching drivers with transit riders so that riders obtain shorter travel times by combining transit and ridesharing. The optimization goal of the system is to assign as many riders to drivers as possible for ridesharing. We give an exact approach and approximation algorithms to achieve the optimization goal. As a case study, we conduct an extensive computational study to show the effectiveness of the transit system for different approximation algorithms, based on real-world traffic data from the city of Chicago; the data sets include both public transit and ridesharing trip information. The experimental results show that our system is able to assign more than 60% of riders to drivers, leading to a substantial increase in the occupancy rate of personal vehicles and a reduction in riders' travel time.

Cite as

Qian-Ping Gu and JiaJian Liang. Multimodal Transportation with Ridesharing of Personal Vehicles. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 39:1-39:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{gu_et_al:LIPIcs.ISAAC.2021.39,
  author =	{Gu, Qian-Ping and Liang, JiaJian},
  title =	{{Multimodal Transportation with Ridesharing of Personal Vehicles}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{39:1--39:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.39},
  URN =		{urn:nbn:de:0030-drops-154727},
  doi =		{10.4230/LIPIcs.ISAAC.2021.39},
  annote =	{Keywords: Multimodal transportation, ridesharing, approximation algorithms, computational study}
}
Document
Algorithms for Normalized Multiple Sequence Alignments

Authors: Eloi Araujo, Luiz C. Rozante, Diego P. Rubert, and Fábio V. Martinez


Abstract
Sequence alignment supports numerous tasks in bioinformatics, natural language processing, pattern recognition, social sciences, and other fields. While the alignment of two sequences may be performed swiftly in many applications, the simultaneous alignment of multiple sequences has proved to be naturally more intricate. Although most multiple sequence alignment (MSA) formulations are NP-hard, several approaches have been developed, as they can outperform pairwise alignment methods or are necessary for some applications. Taking into account not only similarities but also the lengths of the compared sequences (i.e. normalization) can provide better alignment results than either unnormalized or post-normalized approaches. While some normalized methods have been developed for pairwise sequence alignment, none have been proposed for MSA. This work is a first effort towards the development of normalized methods for MSA. We discuss multiple aspects of normalized multiple sequence alignment (NMSA). We define three new criteria for computing normalized scores when aligning multiple sequences, showing NP-hardness and giving exact algorithms for solving NMSA under those criteria. In addition, we provide approximation algorithms for MSA and NMSA for some classes of scoring matrices.

Cite as

Eloi Araujo, Luiz C. Rozante, Diego P. Rubert, and Fábio V. Martinez. Algorithms for Normalized Multiple Sequence Alignments. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 40:1-40:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{araujo_et_al:LIPIcs.ISAAC.2021.40,
  author =	{Araujo, Eloi and Rozante, Luiz C. and Rubert, Diego P. and Martinez, F\'{a}bio V.},
  title =	{{Algorithms for Normalized Multiple Sequence Alignments}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{40:1--40:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.40},
  URN =		{urn:nbn:de:0030-drops-154734},
  doi =		{10.4230/LIPIcs.ISAAC.2021.40},
  annote =	{Keywords: Multiple sequence alignment, Normalized multiple sequence alignment, Algorithms and complexity}
}
Document
Separated Red Blue Center Clustering

Authors: Marzieh Eskandari, Bhavika Khare, and Nirman Kumar


Abstract
We study a generalization of k-center clustering, first introduced by Kavand et al., where instead of one set of centers, we have two types of centers, p red and q blue, and where each red center is at distance at least α from each blue center. The goal is to minimize the covering radius. We provide an approximation algorithm for this problem, and a polynomial-time algorithm for the constrained problem, where all the centers must lie on a line 𝓁.
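A tiny Python helper illustrating what a feasible solution is and how it is scored (our own evaluator, not part of the paper's algorithms): the red and blue centers must be α-separated, and the objective is the covering radius.

import math

def covering_radius(points, reds, blues, alpha):
    # feasibility: every red center is at distance at least alpha from every blue center
    if any(math.dist(r, b) < alpha for r in reds for b in blues):
        return None
    centers = reds + blues
    # objective: largest distance from a point to its nearest center
    return max(min(math.dist(p, c) for c in centers) for p in points)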

Cite as

Marzieh Eskandari, Bhavika Khare, and Nirman Kumar. Separated Red Blue Center Clustering. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 41:1-41:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{eskandari_et_al:LIPIcs.ISAAC.2021.41,
  author =	{Eskandari, Marzieh and Khare, Bhavika and Kumar, Nirman},
  title =	{{Separated Red Blue Center Clustering}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{41:1--41:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.41},
  URN =		{urn:nbn:de:0030-drops-154740},
  doi =		{10.4230/LIPIcs.ISAAC.2021.41},
  annote =	{Keywords: Algorithms, Facility Location, Clustering, Approximation Algorithms, Computational Geometry}
}
Document
On the Extended TSP Problem

Authors: Julián Mestre, Sergey Pupyrev, and Seeun William Umboh


Abstract
We initiate the theoretical study of Ext-TSP, a problem that originates in the area of profile-guided binary optimization. We are given a graph G = (V, E) with positive edge weights w: E → ℝ^+ and a non-increasing discount function f(⋅) such that f(1) = 1 and f(i) = 0 for i > k, for some parameter k that is part of the problem definition. The problem is to sequence the vertices V so as to maximize ∑_{(u, v) ∈ E} f(|d_u - d_v|)⋅ w(u,v), where d_v ∈ {1, …, |V|} is the position of vertex v in the sequence. We show that Ext-TSP is APX-hard in general, and we give a (k+1)-approximation algorithm for general graphs and a PTAS for some sparse graph classes such as planar or treewidth-bounded graphs. Interestingly, the problem remains challenging even on very simple graph classes; indeed, there is no exact n^o(k) time algorithm for trees unless the ETH fails. We complement this negative result with an exact n^O(k) time algorithm for trees.
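The objective is simple to evaluate for a fixed ordering; a short Python sketch follows. The concrete discount function below, with window k = 3, is only an example consistent with the stated requirements f(1) = 1 and f(i) = 0 for i > k.

def ext_tsp_value(edges, order, f):
    # edges: list of (u, v, w) with positive weight w; order: a permutation of the vertices
    pos = {v: i + 1 for i, v in enumerate(order)}          # d_v, the position of v
    return sum(f(abs(pos[u] - pos[v])) * w for u, v, w in edges)

k = 3
def f(i):
    # an example non-increasing discount: f(1) = 1, linear decay, f(i) = 0 for i > k
    return (k + 1 - i) / k if 1 <= i <= k else 0.0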

Cite as

Julián Mestre, Sergey Pupyrev, and Seeun William Umboh. On the Extended TSP Problem. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 42:1-42:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{mestre_et_al:LIPIcs.ISAAC.2021.42,
  author =	{Mestre, Juli\'{a}n and Pupyrev, Sergey and Umboh, Seeun William},
  title =	{{On the Extended TSP Problem}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{42:1--42:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.42},
  URN =		{urn:nbn:de:0030-drops-154751},
  doi =		{10.4230/LIPIcs.ISAAC.2021.42},
  annote =	{Keywords: profile-guided optimization, approximation algorithms, bandwidth, TSP}
}
Document
Probabilistic Analysis of Euclidean Capacitated Vehicle Routing

Authors: Claire Mathieu and Hang Zhou


Abstract
We give a probabilistic analysis of the unit-demand Euclidean capacitated vehicle routing problem in the random setting, where the input distribution consists of n unit-demand customers modeled as independent, identically distributed uniform random points in the two-dimensional plane. The objective is to visit every customer using a set of routes of minimum total length, such that each route visits at most k customers, where k is the capacity of a vehicle. All of the following results are in the random setting and hold asymptotically almost surely. The best known polynomial-time approximation for this problem is the iterated tour partitioning (ITP) algorithm, introduced in 1985 by Haimovich and Rinnooy Kan. They showed that the ITP algorithm is near-optimal when k is either o(√n) or ω(√n), and they asked whether the ITP algorithm was "also effective in the intermediate range". In this work, we show that when k = √n, the ITP algorithm is at best a (1+c₀)-approximation for some positive constant c₀. On the other hand, the approximation ratio of the ITP algorithm was known to be at most 0.995+α due to Bompadre, Dror, and Orlin, where α is the approximation ratio of an algorithm for the traveling salesman problem. In this work, we improve the upper bound on the approximation ratio of the ITP algorithm to 0.915+α. Our analysis is based on a new lower bound on the optimal cost for the metric capacitated vehicle routing problem, which may be of independent interest.
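For reference, one standard variant of the iterated tour partitioning heuristic is easy to state in code. This is a sketch under our own modelling assumptions: the tour is given, and each block of at most k consecutive customers is served by one vehicle; the classical ITP analysis additionally optimizes over the starting offset of the partition.

import math

def itp_cost(depot, customers, tour, k):
    # tour: an ordering (list of indices) of the customers; each group of at most k
    # consecutive customers is served by a vehicle that starts and ends at the depot
    # and follows the tour inside the group
    total = 0.0
    for i in range(0, len(tour), k):
        group = [customers[j] for j in tour[i:i + k]]
        total += math.dist(depot, group[0]) + math.dist(group[-1], depot)
        total += sum(math.dist(a, b) for a, b in zip(group, group[1:]))
    return total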

Cite as

Claire Mathieu and Hang Zhou. Probabilistic Analysis of Euclidean Capacitated Vehicle Routing. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 43:1-43:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{mathieu_et_al:LIPIcs.ISAAC.2021.43,
  author =	{Mathieu, Claire and Zhou, Hang},
  title =	{{Probabilistic Analysis of Euclidean Capacitated Vehicle Routing}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{43:1--43:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.43},
  URN =		{urn:nbn:de:0030-drops-154769},
  doi =		{10.4230/LIPIcs.ISAAC.2021.43},
  annote =	{Keywords: capacitated vehicle routing, iterated tour partitioning, probabilistic analysis, approximation algorithms}
}
Document
Exact and Approximation Algorithms for Many-To-Many Point Matching in the Plane

Authors: Sayan Bandyapadhyay, Anil Maheshwari, and Michiel Smid


Abstract
Given two sets S and T of points in the plane, of total size n, a many-to-many matching between S and T is a set of pairs (p,q) such that p ∈ S, q ∈ T and for each r ∈ S ∪ T, r appears in at least one such pair. The cost of a pair (p,q) is the (Euclidean) distance between p and q. In the minimum-cost many-to-many matching problem, the goal is to compute a many-to-many matching such that the sum of the costs of the pairs is minimized. This problem is a restricted version of minimum-weight edge cover in a bipartite graph, and hence can be solved in O(n³) time. In a more restricted setting where all the points are on a line, the problem can be solved in O(nlog n) time [Justin Colannino et al., 2007]. However, no progress has been made in the general planar case in improving the cubic time bound. In this paper, we obtain an O(n²⋅ poly(log n)) time exact algorithm and an O(n^{3/2}⋅ poly(log n)) time (1+ε)-approximation in the planar case.
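
The following sketch only checks feasibility and evaluates the cost of a candidate many-to-many matching as defined above; it is an illustrative helper, not the paper's near-quadratic exact algorithm, and all names are hypothetical.

import math

def is_many_to_many(pairs, S, T):
    """Check that every pair is of the form (p, q) with p in S and q in T,
    and that every point of S and T appears in at least one pair."""
    covered = {p for p, _ in pairs} | {q for _, q in pairs}
    return all(p in S and q in T for p, q in pairs) \
        and all(r in covered for r in S) and all(r in covered for r in T)

def matching_cost(pairs):
    """Total Euclidean cost of a candidate many-to-many matching."""
    return sum(math.dist(p, q) for p, q in pairs)

S = [(0, 0), (2, 0)]
T = [(1, 0)]
M = [((0, 0), (1, 0)), ((2, 0), (1, 0))]
print(is_many_to_many(M, S, T), matching_cost(M))  # True 2.0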

Cite as

Sayan Bandyapadhyay, Anil Maheshwari, and Michiel Smid. Exact and Approximation Algorithms for Many-To-Many Point Matching in the Plane. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 44:1-44:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{bandyapadhyay_et_al:LIPIcs.ISAAC.2021.44,
  author =	{Bandyapadhyay, Sayan and Maheshwari, Anil and Smid, Michiel},
  title =	{{Exact and Approximation Algorithms for Many-To-Many Point Matching in the Plane}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{44:1--44:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.44},
  URN =		{urn:nbn:de:0030-drops-154779},
  doi =		{10.4230/LIPIcs.ISAAC.2021.44},
  annote =	{Keywords: Many-to-many matching, bipartite, planar, geometric, approximation}
}
Document
Augmenting Graphs to Minimize the Radius

Authors: Joachim Gudmundsson, Yuan Sha, and Fan Yao


Abstract
We study the problem of augmenting a metric graph by adding k edges while minimizing the radius of the augmented graph. We give a simple 3-approximation algorithm and show that there is no polynomial-time (5/3-ε)-approximation algorithm, for any ε > 0, unless P = NP. We also give two exact algorithms for the special case when the input graph is a tree, one of which is generalized to handle metric graphs with bounded treewidth.

Cite as

Joachim Gudmundsson, Yuan Sha, and Fan Yao. Augmenting Graphs to Minimize the Radius. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 45:1-45:20, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{gudmundsson_et_al:LIPIcs.ISAAC.2021.45,
  author =	{Gudmundsson, Joachim and Sha, Yuan and Yao, Fan},
  title =	{{Augmenting Graphs to Minimize the Radius}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{45:1--45:20},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.45},
  URN =		{urn:nbn:de:0030-drops-154785},
  doi =		{10.4230/LIPIcs.ISAAC.2021.45},
  annote =	{Keywords: graph augmentation, radius, approximation algorithm}
}
Document
Linear-Time Approximation Scheme for k-Means Clustering of Axis-Parallel Affine Subspaces

Authors: Kyungjin Cho and Eunjin Oh


Abstract
In this paper, we present a linear-time approximation scheme for k-means clustering of incomplete data points in d-dimensional Euclidean space. An incomplete data point with Δ > 0 unspecified entries is represented as an axis-parallel affine subspace of dimension Δ. The distance between two incomplete data points is defined as the Euclidean distance between the two closest points in the axis-parallel affine subspaces corresponding to the data points. We present an algorithm for k-means clustering of axis-parallel affine subspaces of dimension Δ that yields a (1+ε)-approximate solution in O(nd) time. The constants hidden behind O(⋅) depend only on Δ, ε and k. This improves the O(n² d)-time algorithm by Eiben et al. [SODA'21] by a factor of n.
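
For axis-parallel affine subspaces, the closest pair of points can agree on every coordinate that is unspecified in at least one of the two data points, so only the coordinates specified in both contribute to the distance. The sketch below (illustrative names, not the paper's algorithm) computes this distance.

import math

def incomplete_distance(a, b):
    """Distance between two incomplete points in R^d, each given as a tuple
    with None marking an unspecified entry. The unspecified coordinates span
    an axis-parallel affine subspace, so the closest pair of points agrees on
    every coordinate that is free in at least one input; only coordinates
    specified in *both* points contribute to the distance."""
    return math.sqrt(sum((x - y) ** 2
                         for x, y in zip(a, b)
                         if x is not None and y is not None))

# (1, None, 3) is the line {(1, t, 3)}; (4, 5, None) is the line {(4, 5, t)}.
print(incomplete_distance((1, None, 3), (4, 5, None)))  # 3.0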

Cite as

Kyungjin Cho and Eunjin Oh. Linear-Time Approximation Scheme for k-Means Clustering of Axis-Parallel Affine Subspaces. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 46:1-46:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{cho_et_al:LIPIcs.ISAAC.2021.46,
  author =	{Cho, Kyungjin and Oh, Eunjin},
  title =	{{Linear-Time Approximation Scheme for k-Means Clustering of Axis-Parallel Affine Subspaces}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{46:1--46:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.46},
  URN =		{urn:nbn:de:0030-drops-154794},
  doi =		{10.4230/LIPIcs.ISAAC.2021.46},
  annote =	{Keywords: k-means clustering, affine subspaces}
}
Document
Feedback Vertex Set on Geometric Intersection Graphs

Authors: Shinwoo An and Eunjin Oh


Abstract
In this paper, we present an algorithm that computes a feedback vertex set of size k in a unit disk graph, if one exists, in 2^{O(√k)}(n+m) time, where n and m denote the numbers of vertices and edges, respectively. This improves the 2^{O(√k log k)} n^{O(1)}-time algorithm for this problem on unit disk graphs by Fomin et al. [ICALP 2017]. Moreover, our algorithm is optimal assuming the exponential-time hypothesis. Also, our algorithm can be extended to handle geometric intersection graphs of similarly sized fat objects without increasing the running time.
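
As a reminder of the objective, the sketch below merely verifies that a candidate vertex set is a feedback vertex set, i.e. that its removal leaves a forest; it is an illustrative checker with hypothetical names, unrelated to the subexponential algorithm of the paper.

def is_feedback_vertex_set(n, edges, fvs):
    """Check whether removing the vertex set `fvs` leaves the graph acyclic.
    Vertices are 0..n-1, edges is a list of pairs. Union-find is used: an
    edge whose endpoints are already connected closes a cycle."""
    fvs = set(fvs)
    parent = {v: v for v in range(n) if v not in fvs}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in edges:
        if u in fvs or v in fvs:
            continue
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # this edge closes a cycle
        parent[ru] = rv
    return True

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
print(is_feedback_vertex_set(4, edges, {2}))     # True: removing 2 breaks the triangle
print(is_feedback_vertex_set(4, edges, set()))   # False: the triangle remains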

Cite as

Shinwoo An and Eunjin Oh. Feedback Vertex Set on Geometric Intersection Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 47:1-47:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{an_et_al:LIPIcs.ISAAC.2021.47,
  author =	{An, Shinwoo and Oh, Eunjin},
  title =	{{Feedback Vertex Set on Geometric Intersection Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{47:1--47:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.47},
  URN =		{urn:nbn:de:0030-drops-154807},
  doi =		{10.4230/LIPIcs.ISAAC.2021.47},
  annote =	{Keywords: Feedback vertex set, intersection graphs, parameterized algorithm}
}
Document
Streaming Algorithms for Graph k-Matching with Optimal or Near-Optimal Update Time

Authors: Jianer Chen, Qin Huang, Iyad Kanj, Qian Li, and Ge Xia


Abstract
We present streaming algorithms for the graph k-matching problem in both the insert-only and dynamic models. Our algorithms, while keeping the space complexity matching the best known upper bound, have optimal or near-optimal update time, significantly improving on previous results. More specifically, for the insert-only streaming model, we present a one-pass randomized algorithm that runs in optimal 𝒪(k²) space and has optimal 𝒪(1) update time, and that, w.h.p. (with high probability), computes a maximum weighted k-matching of a weighted graph. Previously, the best upper bound on the update time was 𝒪(log k), which was achieved by a deterministic streaming algorithm that, however, only works for unweighted graphs [Stefan Fafianie and Stefan Kratsch, 2014]. For the dynamic streaming model, we present a one-pass randomized algorithm that, w.h.p., computes a maximum weighted k-matching of a weighted graph in Õ(Wk²) space and with Õ(1) update time, where W is the number of distinct edge weights. Again, the update time of our algorithm improves on the previous best upper bound of Õ(k²) [Rajesh Chitnis et al., 2016]. Moreover, we prove that in the dynamic streaming model, any randomized streaming algorithm for the problem requires k²⋅ Ω(W(log W+1)) bits of space. Hence, both the space and update-time complexities achieved by our algorithm in the dynamic model are near-optimal. A streaming approximation algorithm for k-matching is also presented, whose space complexity matches the best known upper bound with a significantly improved update time.

Cite as

Jianer Chen, Qin Huang, Iyad Kanj, Qian Li, and Ge Xia. Streaming Algorithms for Graph k-Matching with Optimal or Near-Optimal Update Time. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 48:1-48:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{chen_et_al:LIPIcs.ISAAC.2021.48,
  author =	{Chen, Jianer and Huang, Qin and Kanj, Iyad and Li, Qian and Xia, Ge},
  title =	{{Streaming Algorithms for Graph k-Matching with Optimal or Near-Optimal Update Time}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{48:1--48:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.48},
  URN =		{urn:nbn:de:0030-drops-154816},
  doi =		{10.4230/LIPIcs.ISAAC.2021.48},
  annote =	{Keywords: streaming algorithms, matching, parameterized algorithms, lower bounds}
}
Document
Making Three out of Two: Three-Way Online Correlated Selection

Authors: Yongho Shin and Hyung-Chan An


Abstract
Two-way online correlated selection (two-way OCS) is an online algorithm that, at each timestep, takes a pair of elements from the ground set and irrevocably chooses one of the two elements, while ensuring negative correlation in the algorithm's choices. Whilst OCS was initially invented by Fahrbach, Huang, Tao, and Zadimoghaddam to break a natural long-standing barrier in the edge-weighted online bipartite matching problem, it is an interesting technique on its own due to its capability of introducing a powerful algorithmic tool, namely negative correlation, to online algorithms. As such, Fahrbach et al. posed two tantalizing open questions in their paper, one of which was the following: Can we obtain n-way OCS for n > 2, in which the algorithm can be given n > 2 elements to choose from at each timestep? In this paper, we affirmatively answer this open question by presenting a three-way OCS. Our algorithm uses two-way OCS as its building block and is simple to describe; however, as it internally runs two instances of two-way OCS, one of which is fed with the output of the other, the final output probability distribution becomes highly elusive. We tackle this difficulty by approximating the output distribution of OCS by a flat, less correlated function and using it as a safe "surrogate" of the real distribution. Our three-way OCS also yields a 0.5093-competitive algorithm for edge-weighted online matching, demonstrating its usefulness.

Cite as

Yongho Shin and Hyung-Chan An. Making Three out of Two: Three-Way Online Correlated Selection. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 49:1-49:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{shin_et_al:LIPIcs.ISAAC.2021.49,
  author =	{Shin, Yongho and An, Hyung-Chan},
  title =	{{Making Three out of Two: Three-Way Online Correlated Selection}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{49:1--49:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.49},
  URN =		{urn:nbn:de:0030-drops-154822},
  doi =		{10.4230/LIPIcs.ISAAC.2021.49},
  annote =	{Keywords: online correlated selection, multi-way OCS, online algorithms, negative correlation, edge-weighted online bipartite matching}
}
Document
Tight Competitive Analyses of Online Car-Sharing Problems

Authors: Ya-Chun Liang, Kuan-Yun Lai, Ho-Lin Chen, and Kazuo Iwama


Abstract
The car-sharing problem, proposed by Luo, Erlebach and Xu in 2018, mainly focuses on an online model in which there are two locations, 0 and 1, and k cars in total. Each request specifies its pick-up time and pick-up location (either 0 or 1; the other location is the drop-off location) and is released in each stage a fixed amount of time before its specified start (i.e., pick-up) time. The time between the booking (i.e., release) time and the start time is enough to move empty cars between 0 and 1 for relocation if they are not used in that stage. The model, called kS2L-F, assumes that requests in each stage arrive sequentially even when they share the same booking time, and that the decision (accept or reject) must be made immediately. The goal is to accept as many requests as possible. Despite there being only two locations, the analysis does not seem easy, and the (tight) competitive ratio (CR) was previously known only to be 2.0 for k = 2 and 1.5 for restricted values of k, namely multiples of three. In this paper, we close all the remaining gaps: we prove that the CR is 2k/(k + ⌊k/3⌋) for all k ≥ 2. Furthermore, if the algorithm can delay its decision until all requests of a stage have arrived, the CR improves to roughly 4/3. We can push this advantage even further: we achieve a CR of (2+R)/3 if the number of requests in each stage is at most Rk, 1 ≤ R ≤ 2, where we do not need to know the value of R in advance. Finally, we demonstrate that randomization also helps to obtain (slightly) better CRs.
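
The closed-form ratio can be evaluated directly; the snippet below is only a worked instance of the formula 2k/(k + ⌊k/3⌋) from the abstract, with illustrative names.

from fractions import Fraction

def competitive_ratio(k):
    """Tight competitive ratio 2k / (k + floor(k/3)) for kS2L-F, k >= 2."""
    return Fraction(2 * k, k + k // 3)

for k in (2, 3, 4, 6, 9, 100):
    print(k, competitive_ratio(k), float(competitive_ratio(k)))
# k = 2 gives 2, multiples of 3 give 3/2, and the ratio tends to 3/2 as k grows.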

Cite as

Ya-Chun Liang, Kuan-Yun Lai, Ho-Lin Chen, and Kazuo Iwama. Tight Competitive Analyses of Online Car-Sharing Problems. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 50:1-50:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{liang_et_al:LIPIcs.ISAAC.2021.50,
  author =	{Liang, Ya-Chun and Lai, Kuan-Yun and Chen, Ho-Lin and Iwama, Kazuo},
  title =	{{Tight Competitive Analyses of Online Car-Sharing Problems}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{50:1--50:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.50},
  URN =		{urn:nbn:de:0030-drops-154835},
  doi =		{10.4230/LIPIcs.ISAAC.2021.50},
  annote =	{Keywords: Car-sharing, Competitive analysis, On-line scheduling, Randomized algorithm}
}
Document
Skeletons and Minimum Energy Scheduling

Authors: Antonios Antoniadis, Gunjan Kumar, and Nikhil Kumar


Abstract
Consider the problem where n jobs, each with a release time, a deadline, and a required processing time, are to be feasibly scheduled in a single- or multi-processor setting so as to minimize the total energy consumption of the schedule. A processor has two available states: a sleep state, where no energy is consumed but no processing can take place, and an active state, which consumes energy at a rate of one and in which jobs can be processed. Transitioning from the active to the sleep state does not incur any further energy cost, but transitioning from the sleep to the active state requires q energy units. Jobs may be preempted and (in the multi-processor case) migrated. The single-processor case of the problem is known to be solvable in polynomial time via an involved dynamic program, whereas the only known approximation algorithm for the multi-processor case attains an approximation factor of 3 and is based on rounding the solution to a linear programming relaxation of the problem. In this work, we present efficient and combinatorial approximation algorithms for both the single- and the multi-processor setting; previously, only an algorithm based on linear programming was known for the multi-processor case. Our algorithms build upon the concept of a skeleton, a basic (and not necessarily feasible) schedule that captures the fact that some processor(s) must be active at some time point during an interval. Finally, we further demonstrate the power of skeletons by providing a 2-approximation algorithm for the multi-processor case, thus improving upon the recent breakthrough 3-approximation result. This algorithm is based on a novel rounding scheme for a linear-programming relaxation of the problem which incorporates skeletons.
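
A minimal sketch of the cost model, assuming a single processor that starts asleep and is described by its maximal active intervals; the accounting below simply adds the active time and q per wake-up and is not the paper's algorithm.

def schedule_energy(active_intervals, q):
    """Energy of a single-processor schedule given its maximal active
    intervals [(s1, e1), (s2, e2), ...] in increasing order: rate 1 while
    active, plus q units for each sleep-to-active transition (assuming the
    processor starts asleep)."""
    active_time = sum(e - s for s, e in active_intervals)
    wake_ups = len(active_intervals)
    return active_time + q * wake_ups

# Two active intervals of total length 5 with wake-up cost q = 3:
print(schedule_energy([(0, 2), (6, 9)], q=3))  # 5 + 2*3 = 11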

Cite as

Antonios Antoniadis, Gunjan Kumar, and Nikhil Kumar. Skeletons and Minimum Energy Scheduling. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 51:1-51:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{antoniadis_et_al:LIPIcs.ISAAC.2021.51,
  author =	{Antoniadis, Antonios and Kumar, Gunjan and Kumar, Nikhil},
  title =	{{Skeletons and Minimum Energy Scheduling}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{51:1--51:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.51},
  URN =		{urn:nbn:de:0030-drops-154849},
  doi =		{10.4230/LIPIcs.ISAAC.2021.51},
  annote =	{Keywords: scheduling, energy-efficiency, approximation algorithms, dynamic programming, combinatorial algorithms}
}
Document
Machine Covering in the Random-Order Model

Authors: Susanne Albers, Waldo Gálvez, and Maximilian Janke


Abstract
In the Online Machine Covering problem, jobs, defined by their sizes, arrive one by one and have to be assigned to m parallel and identical machines, with the goal of maximizing the load of the least-loaded machine. Unfortunately, the classical model allows only fairly pessimistic performance guarantees: the best possible deterministic ratio of m is achieved by the Greedy-strategy, and the best known randomized algorithm has competitive ratio Õ(√m), which cannot be improved by more than a logarithmic factor. Modern results try to mitigate this by studying semi-online models, where additional information about the job sequence is revealed in advance or extra resources are provided to the online algorithm. In this work we study the Machine Covering problem in the recently popular random-order model. Here no extra resources are present, but instead the adversary is weakened in that it can only decide upon the input set, while the jobs are revealed in uniformly random order. This is particularly relevant to Machine Covering, where lower bounds are usually associated with highly structured input sequences. We first analyze Graham’s Greedy-strategy in this context and establish that its competitive ratio decreases slightly to Θ(m/log(m)), which is asymptotically tight. Then, as our main result, we present an improved Õ(∜m)-competitive algorithm for the problem. This result is achieved by exploiting the extra information coming from the random order of the jobs, using sampling techniques to devise an improved mechanism to distinguish jobs that are relatively large from small ones. We complement this result with a first lower bound, showing that no algorithm can have a competitive ratio of O(log(m)/log log(m)) in the random-order model. This lower bound is achieved by studying a novel variant of the Secretary problem, which could be of independent interest.
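
For concreteness, here is a minimal sketch of the Greedy-strategy mentioned above: each arriving job is placed on a currently least-loaded machine, and the objective is the final minimum load. Names are illustrative.

import heapq

def greedy_machine_covering(jobs, m):
    """Graham's Greedy-strategy for Machine Covering: assign each arriving
    job to a currently least-loaded machine and return the minimum load."""
    loads = [(0, i) for i in range(m)]   # min-heap of (load, machine id)
    heapq.heapify(loads)
    for size in jobs:
        load, i = heapq.heappop(loads)   # least-loaded machine
        heapq.heappush(loads, (load + size, i))
    return min(load for load, _ in loads)

print(greedy_machine_covering([3, 1, 2, 2, 4], m=2))  # 5 (final loads are 5 and 7)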

Cite as

Susanne Albers, Waldo Gálvez, and Maximilian Janke. Machine Covering in the Random-Order Model. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 52:1-52:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{albers_et_al:LIPIcs.ISAAC.2021.52,
  author =	{Albers, Susanne and G\'{a}lvez, Waldo and Janke, Maximilian},
  title =	{{Machine Covering in the Random-Order Model}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{52:1--52:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.52},
  URN =		{urn:nbn:de:0030-drops-154858},
  doi =		{10.4230/LIPIcs.ISAAC.2021.52},
  annote =	{Keywords: Machine Covering, Online Algorithm, Random-Order, Competitive Analysis, Scheduling}
}
Document
Nearly-Tight Lower Bounds for Set Cover and Network Design with Deadlines/Delay

Authors: Noam Touitou


Abstract
In network design problems with deadlines/delay, an algorithm must make transmissions over time to satisfy connectivity requests on a graph. To satisfy a request, a transmission must be made that provides the desired connectivity. In the deadline case, this transmission must occur inside a time window associated with the request. In the delay case, the transmission should be made as soon as possible after the request’s release, to avoid delay cost. In FOCS 2020, frameworks were given which reduce a network design problem with deadlines/delay to its classic, offline variant, while incurring an additional competitiveness loss factor of O(log n), where n is the number of vertices in the graph. Trying to improve upon this loss factor is thus a natural research direction. The frameworks of FOCS 2020 also apply to set cover with deadlines/delay: a universe of sets and elements is given, requests arrive on the elements over time, and the algorithm must transmit sets to serve them. In this paper, we give nearly tight lower bounds for set cover with deadlines/delay. These lower bounds imply nearly-tight lower bounds of Ω(log n / log log n) for a few network design problems, such as node-weighted Steiner forest and directed Steiner tree. Our results imply that the frameworks in FOCS 2020 are essentially optimal, and improve quadratically over the best previously-known lower bounds.

Cite as

Noam Touitou. Nearly-Tight Lower Bounds for Set Cover and Network Design with Deadlines/Delay. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 53:1-53:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{touitou:LIPIcs.ISAAC.2021.53,
  author =	{Touitou, Noam},
  title =	{{Nearly-Tight Lower Bounds for Set Cover and Network Design with Deadlines/Delay}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{53:1--53:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.53},
  URN =		{urn:nbn:de:0030-drops-154865},
  doi =		{10.4230/LIPIcs.ISAAC.2021.53},
  annote =	{Keywords: Network Design, Deadlines, Delay, Online, Set Cover}
}
Document
Cryptographic Hardness Under Projections for Time-Bounded Kolmogorov Complexity

Authors: Eric Allender, John Gouwar, Shuichi Hirahara, and Caleb Robelle


Abstract
A version of time-bounded Kolmogorov complexity, denoted KT, has received attention in the past several years, due to its close connection to circuit complexity and to the Minimum Circuit Size Problem MCSP. Essentially all results about the complexity of MCSP hold also for MKTP (the problem of computing the KT complexity of a string). Both MKTP and MCSP are hard for SZK (Statistical Zero Knowledge) under BPP-Turing reductions; neither is known to be NP-complete. Recently, some hardness results for MKTP were proved that are not (yet) known to hold for MCSP. In particular, MKTP is hard for DET (a subclass of P) under nonuniform ≤^{NC^0}_m reductions. In this paper, we improve this, to show that the complement of MKTP is hard for the (apparently larger) class NISZK_L under not only ≤^{NC^0}_m reductions but even under projections. Also, the complement of MKTP is hard for NISZK under ≤^{P/poly}_m reductions. Here, NISZK is the class of problems with non-interactive zero-knowledge proofs, and NISZK_L is the non-interactive version of the class SZK_L that was studied by Dvir et al. As an application, we provide several improved worst-case to average-case reductions to problems in NP, and we obtain a new lower bound on MKTP (which is currently not known to hold for MCSP).

Cite as

Eric Allender, John Gouwar, Shuichi Hirahara, and Caleb Robelle. Cryptographic Hardness Under Projections for Time-Bounded Kolmogorov Complexity. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 54:1-54:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{allender_et_al:LIPIcs.ISAAC.2021.54,
  author =	{Allender, Eric and Gouwar, John and Hirahara, Shuichi and Robelle, Caleb},
  title =	{{Cryptographic Hardness Under Projections for Time-Bounded Kolmogorov Complexity}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{54:1--54:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.54},
  URN =		{urn:nbn:de:0030-drops-154875},
  doi =		{10.4230/LIPIcs.ISAAC.2021.54},
  annote =	{Keywords: Kolmogorov Complexity, Interactive Proofs, Minimum Circuit Size Problem, Worst-case to Average-case Reductions}
}
Document
Identity Testing Under Label Mismatch

Authors: Clément L. Canonne and Karl Wimmer


Abstract
Testing whether the observed data conforms to a purported model (probability distribution) is a basic and fundamental statistical task, and one that is by now well understood. However, the standard formulation, identity testing, fails to capture many settings of interest; in this work, we focus on one such natural setting, identity testing under promise of permutation. In this setting, the unknown distribution is assumed to be equal to the purported one, up to a relabeling (permutation) of the model: however, due to a systematic error in the reporting of the data, this relabeling may not be the identity. The goal is then to test identity under this assumption: equivalently, whether this systematic labeling error led to a data distribution statistically far from the reference model.
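
The promise can be phrased in elementary terms: the unknown distribution has the same multiset of probabilities as the reference, yet it may still be far from it in total variation distance. The snippet below illustrates this with known (not sampled) distributions and illustrative names; the paper's task is the harder one of testing this from samples.

def tv_distance(p, q):
    """Total variation distance between two distributions on the same domain."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

reference = [0.5, 0.3, 0.2]
relabeled = [0.2, 0.5, 0.3]   # same model after a systematic label error
print(sorted(reference) == sorted(relabeled))  # True: equal up to a permutation
print(tv_distance(reference, relabeled))       # 0.3: yet far from the reference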

Cite as

Clément L. Canonne and Karl Wimmer. Identity Testing Under Label Mismatch. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 55:1-55:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{canonne_et_al:LIPIcs.ISAAC.2021.55,
  author =	{Canonne, Cl\'{e}ment L. and Wimmer, Karl},
  title =	{{Identity Testing Under Label Mismatch}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{55:1--55:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.55},
  URN =		{urn:nbn:de:0030-drops-154880},
  doi =		{10.4230/LIPIcs.ISAAC.2021.55},
  annote =	{Keywords: distribution testing, property testing, permutations, lower bounds}
}
Document
Unique-Neighbor-Like Expansion and Group-Independent Cosystolic Expansion

Authors: Tali Kaufman and David Mass


Abstract
In recent years, high dimensional expanders have been found to have a variety of applications in theoretical computer science, such as efficient CSPs approximations, improved sampling and list-decoding algorithms, and more. Within that, an important high dimensional expansion notion is cosystolic expansion, which has found applications in the construction of efficiently decodable quantum codes and in proving lower bounds for CSPs. Cosystolic expansion is considered with systems of equations over a group where the variables and equations correspond to faces of the complex. Previous works that studied cosystolic expansion were tailored to the specific group 𝔽₂. In particular, Kaufman, Kazhdan and Lubotzky (FOCS 2014), and Evra and Kaufman (STOC 2016) in their breakthrough works, who solved a famous open question of Gromov, have studied a notion which we term "parity" expansion for small sets. They showed that small sets of k-faces have proportionally many (k+1)-faces that contain an odd number of k-faces from the set. Parity expansion for small sets could then be used to imply cosystolic expansion only over 𝔽₂. In this work we introduce a stronger unique-neighbor-like expansion for small sets. We show that small sets of k-faces have proportionally many (k+1)-faces that contain exactly one k-face from the set. This notion is fundamentally stronger than parity expansion and cannot be implied by previous works. We then show, utilizing the new unique-neighbor-like expansion notion introduced in this work, that cosystolic expansion can be made group-independent, i.e., unique-neighbor-like expansion for small sets implies cosystolic expansion over any group.

Cite as

Tali Kaufman and David Mass. Unique-Neighbor-Like Expansion and Group-Independent Cosystolic Expansion. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 56:1-56:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{kaufman_et_al:LIPIcs.ISAAC.2021.56,
  author =	{Kaufman, Tali and Mass, David},
  title =	{{Unique-Neighbor-Like Expansion and Group-Independent Cosystolic Expansion}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{56:1--56:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.56},
  URN =		{urn:nbn:de:0030-drops-154898},
  doi =		{10.4230/LIPIcs.ISAAC.2021.56},
  annote =	{Keywords: High dimensional expanders, Unique-neighbor-like expansion, Cosystolic expansion}
}
Document
Group Evacuation on a Line by Agents with Different Communication Abilities

Authors: Jurek Czyzowicz, Ryan Killick, Evangelos Kranakis, Danny Krizanc, Lata Narayanan, Jaroslav Opatrny, Denis Pankratov, and Sunil Shende


Abstract
We consider the evacuation of a group of n ≥ 2 autonomous mobile agents (or robots) through an unknown exit on an infinite line. The agents are initially placed at the origin of the line, can move in any direction they wish with any speed up to the maximum speed 1, and can all communicate when they are co-located. However, the agents have different wireless communication abilities: some are fully wireless and can send and receive messages at any distance, a subset of the agents are senders, which can only transmit messages wirelessly, and the rest are receivers, which can only receive messages wirelessly. The agents start at the same time, and their communication abilities are known to each other from the start. The goal of the agents is to collectively find a target/exit at an unknown location on the line while minimizing the evacuation time, defined as the time when the last agent reaches the target. We investigate the impact of such a mixed communication model on the evacuation time for a group of cooperating agents on an infinite line. In particular, we provide evacuation algorithms and analyze the resulting competitive ratio (CR) of the evacuation time for such a group of agents. If the group has two agents of two different types, we give an optimal evacuation algorithm with competitive ratio CR = 3+2√2. If there is a single sender or fully wireless agent and multiple receivers, we prove that CR ∈ [2+√5, 5], and if there are multiple senders and a single receiver or fully wireless agent, we show that CR ∈ [3, 5.681319]. Any group consisting of only senders or only receivers requires competitive ratio 9, and any other combination of agents has competitive ratio 3.

Cite as

Jurek Czyzowicz, Ryan Killick, Evangelos Kranakis, Danny Krizanc, Lata Narayanan, Jaroslav Opatrny, Denis Pankratov, and Sunil Shende. Group Evacuation on a Line by Agents with Different Communication Abilities. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 57:1-57:24, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{czyzowicz_et_al:LIPIcs.ISAAC.2021.57,
  author =	{Czyzowicz, Jurek and Killick, Ryan and Kranakis, Evangelos and Krizanc, Danny and Narayanan, Lata and Opatrny, Jaroslav and Pankratov, Denis and Shende, Sunil},
  title =	{{Group Evacuation on a Line by Agents with Different Communication Abilities}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{57:1--57:24},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.57},
  URN =		{urn:nbn:de:0030-drops-154903},
  doi =		{10.4230/LIPIcs.ISAAC.2021.57},
  annote =	{Keywords: Agent, Communication, Evacuation, Mobile, Receiver, Search, Sender}
}
Document
Lower Bounds for Induced Cycle Detection in Distributed Computing

Authors: François Le Gall and Masayuki Miyamoto


Abstract
The distributed subgraph detection problem asks, for a fixed graph H, whether the n-node input graph contains H as a subgraph or not. In the standard CONGEST model of distributed computing, the complexity of clique/cycle detection and listing has received a lot of attention recently. In this paper we consider the induced variant of subgraph detection, where the goal is to decide whether the n-node input graph contains H as an induced subgraph or not. We first show an Ω̃(n) lower bound for detecting the existence of an induced k-cycle for any k ≥ 4 in the CONGEST model. This lower bound is tight for k = 4, and shows that the induced variant of k-cycle detection is much harder than the non-induced version. This lower bound is proved via a reduction from two-party communication complexity. We complement this result by showing that for 5 ≤ k ≤ 7, this Ω̃(n) lower bound cannot be improved via the two-party communication framework. We then show how to prove stronger lower bounds for larger values of k. More precisely, we show that detecting an induced k-cycle for any k ≥ 8 requires Ω̃(n^{2-Θ(1/k)}) rounds in the CONGEST model, nearly matching the known upper bound Õ(n^{2-Θ(1/k)}) for general k-node subgraph detection (which also applies to the induced version) by Eden, Fiat, Fischer, Kuhn, and Oshman [DISC 2019]. Finally, we investigate the case where H is the diamond (obtained by adding an edge to a 4-cycle or, equivalently, removing an edge from a 4-clique), and show non-trivial upper and lower bounds on the complexity of the induced versions of diamond detection and listing.

Cite as

François Le Gall and Masayuki Miyamoto. Lower Bounds for Induced Cycle Detection in Distributed Computing. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 58:1-58:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{legall_et_al:LIPIcs.ISAAC.2021.58,
  author =	{Le Gall, Fran\c{c}ois and Miyamoto, Masayuki},
  title =	{{Lower Bounds for Induced Cycle Detection in Distributed Computing}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{58:1--58:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.58},
  URN =		{urn:nbn:de:0030-drops-154919},
  doi =		{10.4230/LIPIcs.ISAAC.2021.58},
  annote =	{Keywords: Distributed computing, Lower bounds, Subgraph detection}
}
Document
Distributed Approximations of f-Matchings and b-Matchings in Graphs of Sub-Logarithmic Expansion

Authors: Andrzej Czygrinow, Michał Hanćkowiak, and Marcin Witkowski


Abstract
We give a distributed algorithm which, given ε > 0, finds a (1-ε)-factor approximation of a maximum f-matching in graphs G = (V,E) of sub-logarithmic expansion. Using a similar approach, we also give a distributed approximation of a maximum b-matching in the same class of graphs, provided the function b: V → ℤ^+ is L-Lipschitz for some constant L. Both algorithms run in O(log^* n) rounds in the LOCAL model, which is optimal.

Cite as

Andrzej Czygrinow, Michał Hanćkowiak, and Marcin Witkowski. Distributed Approximations of f-Matchings and b-Matchings in Graphs of Sub-Logarithmic Expansion. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 59:1-59:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{czygrinow_et_al:LIPIcs.ISAAC.2021.59,
  author =	{Czygrinow, Andrzej and Han\'{c}kowiak, Micha{\l} and Witkowski, Marcin},
  title =	{{Distributed Approximations of f-Matchings and b-Matchings in Graphs of Sub-Logarithmic Expansion}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{59:1--59:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.59},
  URN =		{urn:nbn:de:0030-drops-154925},
  doi =		{10.4230/LIPIcs.ISAAC.2021.59},
  annote =	{Keywords: Distributed algorithms, f-matching, b-matching}
}
Document
Inverse Suffix Array Queries for 2-Dimensional Pattern Matching in Near-Compact Space

Authors: Dhrumil Patel and Rahul Shah


Abstract
In the 2-dimensional (2D) pattern matching problem, the text is arranged as a matrix 𝖬[1..n, 1..n] and consists of N = n × n symbols drawn from an alphabet set Σ of size σ. The query consists of an m × m square matrix 𝖯[1..m, 1..m] drawn from the same alphabet set Σ, and the task is to find all the locations in 𝖬 where 𝖯 appears as a (contiguous) submatrix. The patterns can be of any size, but as long as they are square in shape, data structures like suffix trees and suffix arrays exist for efficient pattern matching [Raffaele Giancarlo, 1995; Dong Kyue Kim et al., 1998]. These are essentially 2D counterparts of the classic suffix trees and arrays known for traditional 1-dimensional (1D) pattern matching. They are based on a linearization of 2D suffixes that preserves the prefix match property (i.e., every pattern match is a prefix of some suffix). The main limitation of suffix trees and suffix arrays (in 1D) was their space usage of O(N log N) bits, where N is the size of the text. This is suboptimal compared to the N log σ bits of space that are information-theoretically optimal for the text. With the advent of the field of succinct/compressed data structures, it became possible to develop compressed variants of suffix trees and arrays based on the Burrows-Wheeler Transform and LF-mapping (or the Φ function) [Roberto Grossi and Jeffrey Scott Vitter, 2005; Paolo Ferragina and Giovanni Manzini, 2005; Kunihiko Sadakane, 2007]. These data structures indeed achieve O(N log σ) bits of space or better. This gives rise to the question: analogous to the 1D case, can we design a succinct or compressed index for 2D pattern matching? Can there be a 2D compressed suffix tree? Are there analogues of the Burrows-Wheeler Transform or LF-mapping? The problem has been acknowledged for over a decade now, and there have been a few attempts at applying the Φ function [Ankur Gupta, 2004] and at achieving entropy-based compression [Veli Mäkinen and Gonzalo Navarro, 2008]. However, a complexity breakthrough akin to the 1D case has yet to be found. In this paper, we still do not know how to answer suffix array queries in O(N log σ) bits of space, which would lead to efficient pattern matching. However, for the first time, we show that it is indeed possible to answer inverse suffix array (ISA) queries in near-compact space in O(polylog n) time. Our 2D succinct text index design is based on two 1D compressed suffix trees; it takes O(N log log N + N log σ) bits of space, which is much smaller than the naive design that takes O(N log N) bits. Although the main problem remains elusive, this index gives hope for the existence of a full 2D succinct index with all the functionalities of the 1D case.
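
To fix the problem statement (not the indexing techniques discussed above), the following naive routine reports all occurrences of a square pattern in a square text by direct comparison; names are illustrative.

def match_2d(M, P):
    """Report all positions (i, j) where the m x m pattern P occurs as a
    contiguous submatrix of the n x n text M (naive quadratic-per-position
    check, only to make the problem concrete)."""
    n, m = len(M), len(P)
    return [(i, j)
            for i in range(n - m + 1)
            for j in range(n - m + 1)
            if all(M[i + a][j + b] == P[a][b]
                   for a in range(m) for b in range(m))]

M = ["abab",
     "baba",
     "abab",
     "baba"]
P = ["ab",
     "ba"]
print(match_2d(M, P))  # [(0, 0), (0, 2), (1, 1), (2, 0), (2, 2)]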

Cite as

Dhrumil Patel and Rahul Shah. Inverse Suffix Array Queries for 2-Dimensional Pattern Matching in Near-Compact Space. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 60:1-60:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{patel_et_al:LIPIcs.ISAAC.2021.60,
  author =	{Patel, Dhrumil and Shah, Rahul},
  title =	{{Inverse Suffix Array Queries for 2-Dimensional Pattern Matching in Near-Compact Space}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{60:1--60:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.60},
  URN =		{urn:nbn:de:0030-drops-154932},
  doi =		{10.4230/LIPIcs.ISAAC.2021.60},
  annote =	{Keywords: Pattern Matching, Succinct Data Structures}
}
Document
Dynamic Boolean Formula Evaluation

Authors: Rathish Das, Andrea Lincoln, Jayson Lynch, and J. Ian Munro


Abstract
We present a linear-space data structure for Dynamic Evaluation of k-CNF Boolean Formulas which achieves O(m^{1-1/k}) query and variable update time, where m is the number of clauses in the formula and clauses are of size at most a constant k. Our algorithm is additionally able to count the total number of satisfied clauses. We then show how this data structure can be parallelized in the PRAM model to achieve O(log m) span (i.e., parallel time) and still O(m^{1-1/k}) work. This parallel algorithm works in the stronger Binary Fork model. We then give a series of lower bounds on the problem, including an average-case result showing that the lower bounds hold even when the updates to the variables are chosen at random. Specifically, a reduction from k-Clique shows that dynamically counting the number of satisfied clauses takes time at least n^{((2ω-3)/6)√(2k) - 1 - o(√k)}, where 2 ≤ ω < 2.38 is the matrix multiplication constant. We show that the Combinatorial k-Clique Hypothesis implies a lower bound of m^{(1-k^{-1/2})(1-o(1))}, which suggests that our algorithm is close to optimal without involving matrix multiplication or new techniques. We next give an average-case reduction to k-Clique showing that the prior lower bounds hold even when the updates are chosen at random. We use our conditional lower bound to show that any Binary Fork algorithm solving these problems requires at least Ω(log m) span, which is tight against our algorithm in this model. Finally, we give an unconditional linear-space lower bound for Dynamic k-CNF Boolean Formula Evaluation.
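
As a baseline that only fixes the interface of the problem, the sketch below maintains the number of satisfied clauses under single-variable updates by rescanning the clauses containing the updated variable; it does not achieve the O(m^{1-1/k}) update time of the data structure above, and all names are illustrative.

class NaiveDynamicCNF:
    """Maintain the number of satisfied clauses of a k-CNF under
    single-variable updates, rescanning only the clauses that contain the
    updated variable. A positive literal is +i, a negative literal is -i."""

    def __init__(self, clauses, assignment):
        self.clauses = clauses                  # list of tuples of literals
        self.assignment = dict(assignment)      # variable -> bool
        self.occurs = {}                        # variable -> clause indices
        for idx, clause in enumerate(clauses):
            for lit in clause:
                self.occurs.setdefault(abs(lit), []).append(idx)
        self.sat = [self._satisfied(c) for c in clauses]
        self.count = sum(self.sat)

    def _satisfied(self, clause):
        return any(self.assignment[abs(l)] == (l > 0) for l in clause)

    def update(self, var, value):
        """Set `var` to `value` and return the new number of satisfied clauses."""
        self.assignment[var] = value
        for idx in self.occurs.get(var, []):
            new = self._satisfied(self.clauses[idx])
            self.count += new - self.sat[idx]
            self.sat[idx] = new
        return self.count

cnf = NaiveDynamicCNF([(1, -2), (2, 3), (-1, -3)], {1: True, 2: True, 3: False})
print(cnf.count)            # 3
print(cnf.update(3, True))  # clause (-1, -3) becomes unsatisfied -> 2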

Cite as

Rathish Das, Andrea Lincoln, Jayson Lynch, and J. Ian Munro. Dynamic Boolean Formula Evaluation. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 61:1-61:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{das_et_al:LIPIcs.ISAAC.2021.61,
  author =	{Das, Rathish and Lincoln, Andrea and Lynch, Jayson and Munro, J. Ian},
  title =	{{Dynamic Boolean Formula Evaluation}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{61:1--61:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.61},
  URN =		{urn:nbn:de:0030-drops-154945},
  doi =		{10.4230/LIPIcs.ISAAC.2021.61},
  annote =	{Keywords: Data Structures, SAT, Dynamic Algorithms, Boolean Formulas, Fine-grained Complexity, Parallel Algorithms}
}
Document
Shortest Beer Path Queries in Outerplanar Graphs

Authors: Joyce Bacic, Saeed Mehrabi, and Michiel Smid


Abstract
A beer graph is an undirected graph G, in which each edge has a positive weight and some vertices have a beer store. A beer path between two vertices u and v in G is any path in G between u and v that visits at least one beer store. We show that any outerplanar beer graph G with n vertices can be preprocessed in O(n) time into a data structure of size O(n), such that for any two query vertices u and v, (i) the weight of the shortest beer path between u and v can be reported in O(α(n)) time (where α(n) is the inverse Ackermann function), and (ii) the shortest beer path between u and v can be reported in O(L) time, where L is the number of vertices on this path. Both results are optimal, even when G is a beer tree (i.e., a beer graph whose underlying graph is a tree).
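
The weight of a shortest beer path between u and v equals the minimum over beer stores b of dist(u,b) + dist(b,v). The naive sketch below computes it with two Dijkstra runs; it is only meant to illustrate the query semantics, not the O(α(n))-time data structure of the paper, and all names are illustrative.

import heapq

def dijkstra(adj, source):
    """Shortest-path distances from `source` in a weighted undirected graph
    given as adjacency lists {v: [(u, w), ...]}."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, v = heapq.heappop(pq)
        if d > dist.get(v, float("inf")):
            continue
        for u, w in adj[v]:
            if d + w < dist.get(u, float("inf")):
                dist[u] = d + w
                heapq.heappush(pq, (d + w, u))
    return dist

def beer_distance(adj, beer_stores, u, v):
    """Weight of a shortest beer path between u and v: the cheapest route
    that passes through at least one beer store, i.e. the minimum over
    stores b of dist(u, b) + dist(b, v)."""
    du, dv = dijkstra(adj, u), dijkstra(adj, v)
    return min(du[b] + dv[b] for b in beer_stores)

adj = {
    "u": [("a", 1), ("v", 5)],
    "a": [("u", 1), ("v", 1)],
    "v": [("a", 1), ("u", 5)],
}
print(beer_distance(adj, beer_stores=["a"], u="u", v="v"))  # 2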

Cite as

Joyce Bacic, Saeed Mehrabi, and Michiel Smid. Shortest Beer Path Queries in Outerplanar Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 62:1-62:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{bacic_et_al:LIPIcs.ISAAC.2021.62,
  author =	{Bacic, Joyce and Mehrabi, Saeed and Smid, Michiel},
  title =	{{Shortest Beer Path Queries in Outerplanar Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{62:1--62:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.62},
  URN =		{urn:nbn:de:0030-drops-154950},
  doi =		{10.4230/LIPIcs.ISAAC.2021.62},
  annote =	{Keywords: shortest paths, outerplanar graph}
}
Document
Space-Efficient Algorithms for Reachability in Directed Geometric Graphs

Authors: Sujoy Bhore and Rahul Jain


Abstract
The problem of graph Reachability is to decide whether there is a path from one vertex to another in a given graph. In this paper, we study the Reachability problem on three distinct graph families: intersection graphs of Jordan regions, unit contact disk graphs (penny graphs), and chordal graphs. For each of these graph families, we present space-efficient algorithms for the Reachability problem. For intersection graphs of Jordan regions, we show how to obtain a "good" vertex separator in a space-efficient manner and use it to solve Reachability in polynomial time and O(m^{1/2} log n) space, where n is the number of Jordan regions and m is the total number of crossings among the regions. We use a similar approach for chordal graphs and obtain a polynomial-time and O(m^{1/2} log n)-space algorithm, where n and m are the numbers of vertices and edges, respectively. However, for unit contact disk graphs (penny graphs), we use a more involved technique and obtain a better algorithm. We show that for every ε > 0, there exists a polynomial-time algorithm that can solve Reachability in an n-vertex directed penny graph using O(n^{1/4+ε}) space. We note that the method used for penny graphs does not extend naturally to classes of geometric intersection graphs that include arbitrarily large cliques.

Cite as

Sujoy Bhore and Rahul Jain. Space-Efficient Algorithms for Reachability in Directed Geometric Graphs. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 63:1-63:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{bhore_et_al:LIPIcs.ISAAC.2021.63,
  author =	{Bhore, Sujoy and Jain, Rahul},
  title =	{{Space-Efficient Algorithms for Reachability in Directed Geometric Graphs}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{63:1--63:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.63},
  URN =		{urn:nbn:de:0030-drops-154961},
  doi =		{10.4230/LIPIcs.ISAAC.2021.63},
  annote =	{Keywords: Reachablity, Geometric intersection graphs, Space-efficient algorithms}
}
Document
Repetition- and Linearity-Aware Rank/Select Dictionaries

Authors: Paolo Ferragina, Giovanni Manzini, and Giorgio Vinciguerra


Abstract
We revisit the fundamental problem of compressing an integer dictionary that supports efficient rank and select operations by exploiting two kinds of regularities arising in real data: repetitiveness and approximate linearity. Our first contribution is a Lempel-Ziv parsing properly enriched to also capture approximate linearity in the data and still be compressed to the kth order entropy. Our second contribution is a variant of the block tree structure whose space complexity takes advantage of both repetitiveness and approximate linearity, and which is also highly competitive in time. Our third and final contribution is an implementation and experimental evaluation of this last data structure, which achieves new space-time trade-offs compared to known data structures that exploit only one of the two regularities.
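
For readers unfamiliar with the two operations, the uncompressed sketch below fixes their semantics on a sorted list of distinct integers; the paper's contribution is to support them over compressed representations, and the names below are illustrative.

import bisect

class SortedIntDictionary:
    """Plain (uncompressed) rank/select over a sorted list of distinct
    integers: rank(x) = number of stored elements <= x, and
    select(i) = the i-th smallest stored element (1-based)."""

    def __init__(self, values):
        self.values = sorted(values)

    def rank(self, x):
        return bisect.bisect_right(self.values, x)

    def select(self, i):
        return self.values[i - 1]

d = SortedIntDictionary([3, 7, 9, 15, 21])
print(d.rank(9), d.rank(10), d.select(4))  # 3 3 15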

Cite as

Paolo Ferragina, Giovanni Manzini, and Giorgio Vinciguerra. Repetition- and Linearity-Aware Rank/Select Dictionaries. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 64:1-64:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{ferragina_et_al:LIPIcs.ISAAC.2021.64,
  author =	{Ferragina, Paolo and Manzini, Giovanni and Vinciguerra, Giorgio},
  title =	{{Repetition- and Linearity-Aware Rank/Select Dictionaries}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{64:1--64:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.64},
  URN =		{urn:nbn:de:0030-drops-154974},
  doi =		{10.4230/LIPIcs.ISAAC.2021.64},
  annote =	{Keywords: Data compression, Compressed data structures, Entropy}
}
Document
Pattern Masking for Dictionary Matching

Authors: Panagiotis Charalampopoulos, Huiping Chen, Peter Christen, Grigorios Loukides, Nadia Pisanti, Solon P. Pissis, and Jakub Radoszewski


Abstract
Data masking is a common technique for sanitizing sensitive data maintained in database systems, and it is also becoming increasingly important in various application areas, such as in record linkage of personal data. This work formalizes the Pattern Masking for Dictionary Matching (PMDM) problem. In PMDM, we are given a dictionary 𝒟 of d strings, each of length 𝓁, a query string q of length 𝓁, and a positive integer z, and we are asked to compute a smallest set K ⊆ {1,…,𝓁}, so that if q[i] is replaced by a wildcard for all i ∈ K, then q matches at least z strings from 𝒟. Solving PMDM allows one to provide data utility guarantees, in contrast to existing approaches. We first show, through a reduction from the well-known k-Clique problem, that a decision version of the PMDM problem is NP-complete, even for strings over a binary alphabet. We thus approach the problem from a more practical perspective. We show a combinatorial 𝒪((d𝓁)^{|K|/3}+d𝓁)-time and 𝒪(d𝓁)-space algorithm for PMDM for |K| = 𝒪(1). In fact, we show that we cannot hope for a faster combinatorial algorithm, unless the combinatorial k-Clique hypothesis fails [Abboud et al., SIAM J. Comput. 2018; Lincoln et al., SODA 2018]. We also generalize this algorithm for the problem of masking multiple query strings simultaneously so that every string has at least z matches in 𝒟. Note that PMDM can be viewed as a generalization of the decision version of the dictionary matching with mismatches problem: by querying a PMDM data structure with string q and z = 1, one obtains the minimal number of mismatches of q with any string from 𝒟. The query time or space of all known data structures for the more restricted problem of dictionary matching with at most k mismatches incurs some exponential factor with respect to k. A simple exact algorithm for PMDM runs in time 𝒪(2^𝓁 d). We present a data structure for PMDM that answers queries over 𝒟 in time 𝒪(2^{𝓁/2}(2^{𝓁/2}+τ)𝓁) and requires space 𝒪(2^𝓁 d²/τ²+2^{𝓁/2}d), for any parameter τ ∈ [1,d]. We complement our results by showing a two-way polynomial-time reduction between PMDM and the Minimum Union problem [Chlamtáč et al., SODA 2017]. This gives a polynomial-time 𝒪(d^{1/4+ε})-approximation algorithm for PMDM, which is tight under a plausible complexity conjecture.
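
To make the problem statement concrete, the simple exact algorithm mentioned above can be written as a brute force that tries position sets K in order of increasing size; it runs in time 𝒪(2^𝓁 d) in the worst case and is shown here only as an illustration, not as the data structure of the paper. A Python sketch:

from itertools import combinations

def pmdm_brute_force(dictionary, q, z):
    """Smallest set K of positions of q to replace by wildcards so that the
    masked q matches at least z dictionary strings. Worst case O(2^ell d)."""
    ell = len(q)

    def matches(K, s):
        # s matches q when the positions in K are treated as wildcards.
        return all(i in K or s[i] == q[i] for i in range(ell))

    for k in range(ell + 1):                       # try smaller masks first
        for K in map(set, combinations(range(ell), k)):
            if sum(matches(K, s) for s in dictionary) >= z:
                return K
    return None                                    # cannot happen when z <= len(dictionary)

print(pmdm_brute_force(["abab", "abba", "bbab"], "aaab", 2))  # {0, 1}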

Cite as

Panagiotis Charalampopoulos, Huiping Chen, Peter Christen, Grigorios Loukides, Nadia Pisanti, Solon P. Pissis, and Jakub Radoszewski. Pattern Masking for Dictionary Matching. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 65:1-65:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{charalampopoulos_et_al:LIPIcs.ISAAC.2021.65,
  author =	{Charalampopoulos, Panagiotis and Chen, Huiping and Christen, Peter and Loukides, Grigorios and Pisanti, Nadia and Pissis, Solon P. and Radoszewski, Jakub},
  title =	{{Pattern Masking for Dictionary Matching}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{65:1--65:19},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.65},
  URN =		{urn:nbn:de:0030-drops-154982},
  doi =		{10.4230/LIPIcs.ISAAC.2021.65},
  annote =	{Keywords: string algorithms, dictionary matching, wildcards, record linkage, query term dropping}
}
Document
Resilient Level Ancestor, Bottleneck, and Lowest Common Ancestor Queries in Dynamic Trees

Authors: Luciano Gualà, Stefano Leucci, and Isabella Ziccardi


Abstract
We study the problem of designing a resilient data structure maintaining a tree under the Faulty-RAM model [Finocchi and Italiano, STOC'04] in which up to δ memory words can be corrupted by an adversary. Our data structure stores a rooted dynamic tree that can be updated via the addition of new leaves, requires linear size, and supports resilient (weighted) level ancestor queries, lowest common ancestor queries, and bottleneck vertex queries in O(δ) worst-case time per operation.
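
For readers unfamiliar with these query types, all three are easy to state with plain parent pointers; the following naive, non-resilient Python sketch (not the data structure above, and with no protection against memory corruptions) answers each query in time proportional to the depth of the tree:

class Node:
    def __init__(self, parent=None, weight=0):
        self.parent = parent
        self.weight = weight
        self.depth = 0 if parent is None else parent.depth + 1

def level_ancestor(v, d):
    # Ancestor of v at depth d (assumes d <= v.depth).
    while v.depth > d:
        v = v.parent
    return v

def lca(u, v):
    # Lift the deeper node to the same depth, then walk up in lockstep.
    while u.depth > v.depth:
        u = u.parent
    while v.depth > u.depth:
        v = v.parent
    while u is not v:
        u, v = u.parent, v.parent
    return u

def bottleneck(v, a):
    # Minimum-weight vertex on the path from v up to its ancestor a.
    best = v
    while v is not a:
        v = v.parent
        if v.weight < best.weight:
            best = v
    return best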

Cite as

Luciano Gualà, Stefano Leucci, and Isabella Ziccardi. Resilient Level Ancestor, Bottleneck, and Lowest Common Ancestor Queries in Dynamic Trees. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 66:1-66:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{guala_et_al:LIPIcs.ISAAC.2021.66,
  author =	{Gual\`{a}, Luciano and Leucci, Stefano and Ziccardi, Isabella},
  title =	{{Resilient Level Ancestor, Bottleneck, and Lowest Common Ancestor Queries in Dynamic Trees}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{66:1--66:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.66},
  URN =		{urn:nbn:de:0030-drops-154998},
  doi =		{10.4230/LIPIcs.ISAAC.2021.66},
  annote =	{Keywords: level ancestor queries, lowest common ancestor queries, bottleneck vertex queries, resilient data structures, faulty-RAM model, dynamic trees}
}
Document
Computing Shapley Values for Mean Width in 3-D

Authors: Shuhao Tan


Abstract
The Shapley value is a classical concept from game theory, which is used to evaluate the importance of a player in a cooperative setting. Assuming that players are inserted in a uniformly random order, the Shapley value of a player p is the expected increase in the value of the characteristic function when p is inserted. Cabello and Chan (SoCG 2019) recently showed how to adapt this to a geometric context on planar point sets. For example, when the characteristic function is the area of the convex hull, the Shapley value of a point is the average amount by which the convex-hull area increases when this point is added to the set. Shapley values can be viewed as an indication of the relative importance/impact of a point on the function of interest. In this paper, we present an efficient algorithm for computing Shapley values in 3-dimensional space, where the function of interest is the mean width of the point set. Our algorithm runs in O(n³log²n) time and O(n) space. This result can be generalized to any point set in d-dimensional space (d ≥ 3) to compute the Shapley values for the mean volume of the convex hull projected onto a uniformly random (d - 2)-subspace in O(n^d log²n) time and O(n) space. These results are based on a new data structure for a dynamic variant of the convolution problem, which is of independent interest. Our data structure supports incremental modifications to n-element vectors (including cyclical rotation by one position). We show that n operations can be executed in O(n log²n) time and O(n) space.
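
As a sanity check of the definition (and not the O(n³ log² n) algorithm above), Shapley values can always be estimated by sampling random insertion orders and averaging the marginal increase of the characteristic function. A minimal Python sketch with a user-supplied value function and a toy one-dimensional characteristic function:

import random

def shapley_monte_carlo(points, value_fn, trials=10000):
    """Estimate the Shapley value of every point for the characteristic
    function value_fn(subset) by sampling uniform insertion orders."""
    est = {p: 0.0 for p in points}
    for _ in range(trials):
        order = list(points)
        random.shuffle(order)
        prefix, prev = [], value_fn([])
        for p in order:
            prefix.append(p)
            cur = value_fn(prefix)
            est[p] += cur - prev           # marginal contribution of p
            prev = cur
    return {p: s / trials for p, s in est.items()}

# Toy characteristic function: diameter of a set of points on a line.
diam = lambda s: max(s) - min(s) if len(s) > 1 else 0.0
print(shapley_monte_carlo([0.0, 1.0, 3.0], diam))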

Cite as

Shuhao Tan. Computing Shapley Values for Mean Width in 3-D. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 67:1-67:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{tan:LIPIcs.ISAAC.2021.67,
  author =	{Tan, Shuhao},
  title =	{{Computing Shapley Values for Mean Width in 3-D}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{67:1--67:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.67},
  URN =		{urn:nbn:de:0030-drops-155008},
  doi =		{10.4230/LIPIcs.ISAAC.2021.67},
  annote =	{Keywords: Shapley value, mean width, dynamic convolution}
}
Document
Simple Envy-Free and Truthful Mechanisms for Cake Cutting with a Small Number of Cuts

Authors: Takao Asano


Abstract
For the cake-cutting problem, Alijani et al. [Reza Alijani et al., 2017; Masoud Seddighin et al., 2019] and Asano and Umeda [Takao Asano and Hiroyuki Umeda, 2020; Takao Asano and Hiroyuki Umeda, 2020] gave envy-free and truthful mechanisms with a small number of cuts, where the desired part of each player’s valuation function is a single interval on a given cake. In this paper, we give envy-free and truthful mechanisms with a small number of cuts that are much simpler than those proposed by Alijani et al. [Reza Alijani et al., 2017; Masoud Seddighin et al., 2019] and Asano and Umeda [Takao Asano and Hiroyuki Umeda, 2020; Takao Asano and Hiroyuki Umeda, 2020]. Furthermore, we show that this approach can be applied to the envy-free and truthful mechanism proposed by Chen et al. [Yiling Chen et al., 2013], where the valuation function of each player is more general and piecewise uniform. Thus, we obtain an envy-free and truthful mechanism with a small number of cuts even when the valuation function of each player is piecewise uniform, which settles an open problem posed by Alijani et al. [Reza Alijani et al., 2017; Masoud Seddighin et al., 2019].

Cite as

Takao Asano. Simple Envy-Free and Truthful Mechanisms for Cake Cutting with a Small Number of Cuts. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 68:1-68:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{asano:LIPIcs.ISAAC.2021.68,
  author =	{Asano, Takao},
  title =	{{Simple Envy-Free and Truthful Mechanisms for Cake Cutting with a Small Number of Cuts}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{68:1--68:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.68},
  URN =		{urn:nbn:de:0030-drops-155011},
  doi =		{10.4230/LIPIcs.ISAAC.2021.68},
  annote =	{Keywords: cake-cutting problem, envy-freeness, fairness, truthfulness, mechanism design}
}
Document
Adaptive Regularized Submodular Maximization

Authors: Shaojie Tang and Jing Yuan


Abstract
In this paper, we study the problem of maximizing the difference between an adaptive submodular (revenue) function and a non-negative modular (cost) function. The input of our problem is a set of n items, where each item has a particular state drawn from a known prior distribution. The revenue function g is defined over items and states, and the cost function c is defined over items, i.e., each item has a fixed cost. The state of each item is initially unknown, and one must select an item in order to observe its realized state. A policy π specifies which item to pick next based on the observations made so far. Denote by g_{avg}(π) the expected revenue of π and by c_{avg}(π) the expected cost of π. Our objective is to identify the best policy π^o ∈ arg max_π g_{avg}(π)-c_{avg}(π) under a k-cardinality constraint. Since our objective function can take on both negative and positive values, existing results on submodular maximization may not be applicable. To overcome this challenge, we develop a series of effective solutions with performance guarantees. For the case when g is adaptive monotone and adaptive submodular, we develop an effective policy π^l such that g_{avg}(π^l) - c_{avg}(π^l) ≥ (1-1/e-ε) g_{avg}(π^o) - c_{avg}(π^o), using only O(nε^{-2}log ε^{-1}) value oracle queries. For the case when g is adaptive submodular, we present a randomized policy π^r such that g_{avg}(π^r) - c_{avg}(π^r) ≥ (1/e) g_{avg}(π^o) - c_{avg}(π^o).
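
To make the objective concrete, g_avg(π) - c_avg(π) can be estimated for any concrete policy by simulating item states from the prior. The following Python sketch only illustrates these definitions under assumed interfaces (a sample_state function, a revenue function g on the observed item-state pairs, and a cost table); it is not the policy π^l or π^r of the paper:

def estimate_objective(policy, items, sample_state, g, cost, k, trials=1000):
    """Monte Carlo estimate of g_avg(pi) - c_avg(pi). The policy, given the
    observations made so far, returns the next unselected item or None."""
    total = 0.0
    for _ in range(trials):
        states = {i: sample_state(i) for i in items}   # hidden realization
        observed = {}
        for _ in range(k):                             # k-cardinality budget
            i = policy(observed, items)
            if i is None:
                break
            observed[i] = states[i]                    # state revealed on selection
        total += g(observed) - sum(cost[i] for i in observed)
    return total / trials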

Cite as

Shaojie Tang and Jing Yuan. Adaptive Regularized Submodular Maximization. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 69:1-69:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{tang_et_al:LIPIcs.ISAAC.2021.69,
  author =	{Tang, Shaojie and Yuan, Jing},
  title =	{{Adaptive Regularized Submodular Maximization}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{69:1--69:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.69},
  URN =		{urn:nbn:de:0030-drops-155029},
  doi =		{10.4230/LIPIcs.ISAAC.2021.69},
  annote =	{Keywords: Adaptive submodularity, approximation algorithms, active learning}
}
Document
Γ-Graphic Delta-Matroids and Their Applications

Authors: Donggyu Kim, Duksang Lee, and Sang-il Oum


Abstract
For an abelian group Γ, a Γ-labelled graph is a graph whose vertices are labelled by elements of Γ. We prove that a certain collection of edge sets of a Γ-labelled graph forms a delta-matroid, which we call a Γ-graphic delta-matroid, and provide a polynomial-time algorithm to solve the separation problem, which allows us to apply the symmetric greedy algorithm of Bouchet to find a maximum weight feasible set in such a delta-matroid. We present two algorithmic applications on graphs: Maximum Weight Packing of Trees of Order Not Divisible by k, and Maximum Weight S-Tree Packing. We also discuss various properties of Γ-graphic delta-matroids.

Cite as

Donggyu Kim, Duksang Lee, and Sang-il Oum. Γ-Graphic Delta-Matroids and Their Applications. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 70:1-70:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{kim_et_al:LIPIcs.ISAAC.2021.70,
  author =	{Kim, Donggyu and Lee, Duksang and Oum, Sang-il},
  title =	{{\Gamma-Graphic Delta-Matroids and Their Applications}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{70:1--70:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.70},
  URN =		{urn:nbn:de:0030-drops-155038},
  doi =		{10.4230/LIPIcs.ISAAC.2021.70},
  annote =	{Keywords: delta-matroid, group-labelled graph, greedy algorithm, tree packing}
}
Document
An Approximation Algorithm for Maximum Stable Matching with Ties and Constraints

Authors: Yu Yokoi


Abstract
We present a polynomial-time 3/2-approximation algorithm for the problem of finding a maximum-cardinality stable matching in a many-to-many matching model with ties and laminar constraints on both sides. We formulate our problem using a bipartite multigraph whose vertices are called workers and firms, and edges are called contracts. Our algorithm is described as the computation of a stable matching in an auxiliary instance, in which each contract is replaced with three of its copies and all agents have strict preferences on the copied contracts. The construction of this auxiliary instance is symmetric for the two sides, which facilitates a simple symmetric analysis. We use the notion of matroid-kernel for computation in the auxiliary instance and exploit the base-orderability of laminar matroids to show the approximation ratio. In a special case in which each worker is assigned at most one contract and each firm has a strict preference, our algorithm defines a 3/2-approximation mechanism that is strategy-proof for workers.
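
For context, in the classical one-to-one setting with strict, complete preferences and no constraints, a stable matching can be computed by the textbook deferred-acceptance (Gale-Shapley) procedure; the Python sketch below is that baseline, not the 3/2-approximation algorithm for ties and laminar constraints described above:

def gale_shapley(worker_pref, firm_pref):
    """Worker-proposing deferred acceptance for the one-to-one case with
    complete, strictly ordered preferences and equally many workers and firms."""
    rank = {f: {w: r for r, w in enumerate(ws)} for f, ws in firm_pref.items()}
    nxt = {w: 0 for w in worker_pref}     # index of the next firm w proposes to
    match = {}                            # firm -> worker
    free = list(worker_pref)
    while free:
        w = free.pop()
        f = worker_pref[w][nxt[w]]
        nxt[w] += 1
        if f not in match:
            match[f] = w
        elif rank[f][w] < rank[f][match[f]]:   # f prefers w to its current worker
            free.append(match[f])
            match[f] = w
        else:
            free.append(w)                     # rejected; w proposes again later
    return match

print(gale_shapley({'w1': ['f1', 'f2'], 'w2': ['f1', 'f2']},
                   {'f1': ['w2', 'w1'], 'f2': ['w1', 'w2']}))
# {'f1': 'w2', 'f2': 'w1'}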

Cite as

Yu Yokoi. An Approximation Algorithm for Maximum Stable Matching with Ties and Constraints. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 71:1-71:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{yokoi:LIPIcs.ISAAC.2021.71,
  author =	{Yokoi, Yu},
  title =	{{An Approximation Algorithm for Maximum Stable Matching with Ties and Constraints}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{71:1--71:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.71},
  URN =		{urn:nbn:de:0030-drops-155047},
  doi =		{10.4230/LIPIcs.ISAAC.2021.71},
  annote =	{Keywords: Stable matching, Approximation algorithm, Matroid, Strategy-proofness}
}
Document
A Faster Algorithm for Maximum Flow in Directed Planar Graphs with Vertex Capacities

Authors: Julian Enoch, Kyle Fox, Dor Mesica, and Shay Mozes


Abstract
We give an O(k³ Δ n log n min(k, log² n) log²(nC))-time algorithm for computing maximum integer flows in planar graphs with integer arc and vertex capacities bounded by C, and with k sources and sinks. This improves by a factor of max(k², k log² n) over the fastest algorithm previously known for this problem [Wang, SODA 2019]. The speedup is obtained by two independent ideas. First, we replace an iterative procedure of Wang that uses O(k) invocations of an O(k³ n log³ n)-time maximum flow algorithm for planar graphs with k apices [Borradaile et al., FOCS 2012, SICOMP 2017] by an alternative procedure that makes only one invocation of the algorithm of Borradaile et al. Second, we show two alternatives for computing flows in the k-apex graphs that arise in our modification of Wang’s procedure faster than the algorithm of Borradaile et al. In doing so, we introduce and analyze a sequential implementation of the parallel highest-distance push-relabel algorithm of Goldberg and Tarjan [JACM 1988].
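
For background, vertex capacities in general graphs are usually handled by the textbook vertex-splitting reduction, which replaces each vertex by an in-copy and an out-copy joined by an arc carrying the vertex capacity; the catch is that this transformation need not preserve planarity, which is one reason dedicated planar algorithms such as the one above are of interest. A small Python sketch of the reduction:

def split_vertex_capacities(arcs, vcap):
    """Reduce vertex capacities to arc capacities by splitting every vertex v
    into ('in', v) and ('out', v) joined by an arc of capacity vcap[v].
    arcs is a list of (u, v, capacity) triples."""
    new_arcs = [(('in', v), ('out', v), c) for v, c in vcap.items()]
    for u, v, c in arcs:
        new_arcs.append((('out', u), ('in', v), c))   # original arc, rerouted
    return new_arcs

# Any standard s-t max-flow routine on the split graph, run from ('out', s) to
# ('in', t), respects the capacities of all intermediate vertices; use ('in', s)
# and ('out', t) instead if the capacities of s and t must also be enforced.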

Cite as

Julian Enoch, Kyle Fox, Dor Mesica, and Shay Mozes. A Faster Algorithm for Maximum Flow in Directed Planar Graphs with Vertex Capacities. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 72:1-72:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{enoch_et_al:LIPIcs.ISAAC.2021.72,
  author =	{Enoch, Julian and Fox, Kyle and Mesica, Dor and Mozes, Shay},
  title =	{{A Faster Algorithm for Maximum Flow in Directed Planar Graphs with Vertex Capacities}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{72:1--72:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.72},
  URN =		{urn:nbn:de:0030-drops-155057},
  doi =		{10.4230/LIPIcs.ISAAC.2021.72},
  annote =	{Keywords: flow, planar graphs, vertex capacities}
}
Document
Maximum-Weight Matching in Sliding Windows and Beyond

Authors: Leyla Biabani, Mark de Berg, and Morteza Monemizadeh


Abstract
We study the maximum-weight matching problem in the sliding-window model. In this model, we are given an adversarially ordered stream of edges of an underlying edge-weighted graph G(V,E) and a parameter L specifying the window size, and we want to maintain an approximation of the maximum-weight matching of the current graph G(t); here G(t) is defined as the subgraph of G consisting of the edges that arrived during the time interval [max(t-L,1),t], where t is the current time. The goal is to do this with Õ(n) space, where n is the number of vertices of G. We present a deterministic (3.5+ε)-approximation algorithm for this problem, thus significantly improving the (6+ε)-approximation algorithm due to Crouch and Stubbs [Michael S. Crouch and Daniel M. Stubbs, 2014]. We also present generic machinery for approximating subadditive functions in the sliding-window model. A function f is called subadditive if for every two disjoint substreams A, B of a stream S it holds that f(AB) ⩽ f(A) + f(B), where AB denotes the concatenation of A and B. We show that, given an α-approximation algorithm for a subadditive function f in the insertion-only model, we can maintain a (2α+ε)-approximation of f in the sliding-window model. This improves upon a recent result of Krauthgamer and Reitblat [Robert Krauthgamer and David Reitblat, 2019], who obtained a (2α²+ε)-approximation.
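
As a point of comparison only (not the (3.5+ε)-approximation above), the naive approach stores every edge of the current window and recomputes a matching from scratch; even the simple weight-ordered greedy below, a classical 1/2-approximation for maximum-weight matching, uses space proportional to the window size rather than Õ(n). A Python sketch, assuming edges are (u, v, weight) triples:

def greedy_weight_matching(edges):
    """Classical greedy: scan edges by non-increasing weight and keep an edge
    if both endpoints are still free; a 1/2-approximation for maximum weight."""
    used, total = set(), 0.0
    for u, v, w in sorted(edges, key=lambda e: -e[2]):
        if u not in used and v not in used:
            used |= {u, v}
            total += w
    return total

def window_matching(stream, t, L):
    """Naive baseline: keep every edge of the window and recompute from scratch.
    stream[i-1] is the edge (u, v, weight) arriving at time i."""
    window = [e for i, e in enumerate(stream, start=1) if max(t - L, 1) <= i <= t]
    return greedy_weight_matching(window)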

Cite as

Leyla Biabani, Mark de Berg, and Morteza Monemizadeh. Maximum-Weight Matching in Sliding Windows and Beyond. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 73:1-73:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{biabani_et_al:LIPIcs.ISAAC.2021.73,
  author =	{Biabani, Leyla and de Berg, Mark and Monemizadeh, Morteza},
  title =	{{Maximum-Weight Matching in Sliding Windows and Beyond}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{73:1--73:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.73},
  URN =		{urn:nbn:de:0030-drops-155061},
  doi =		{10.4230/LIPIcs.ISAAC.2021.73},
  annote =	{Keywords: maximum-weight matching, sliding-window model, approximation algorithm, and subadditve functions}
}
Document
Quantum Advantage with Shallow Circuits Under Arbitrary Corruption

Authors: Atsuya Hasegawa and François Le Gall


Abstract
Recent works by Bravyi, Gosset and König (Science 2018), Bene Watts et al. (STOC 2019), Coudron, Stark and Vidick (QIP 2019) and Le Gall (CCC 2019) have shown unconditional separations between the computational powers of shallow (i.e., small-depth) quantum and classical circuits: quantum circuits can solve in constant depth computational problems that require logarithmic depth to solve with classical circuits. Using quantum error correction, Bravyi, Gosset, König and Tomamichel (Nature Physics 2020) further proved that a similar separation persists even if quantum circuits are subject to local stochastic noise. In this paper, we consider the case where any constant fraction of the qubits (for instance, huge blocks of qubits) may be arbitrarily corrupted at the end of the computation. We take a first step towards establishing a quantum advantage even in this extremely challenging setting: we show that there exists a computational problem that can be solved in constant depth by a quantum circuit, but such that even solving any large subproblem of this problem requires logarithmic depth with bounded fan-in classical circuits. This gives further compelling evidence of the computational power of shallow quantum circuits. In order to show our result, we consider the Graph State Sampling problem (which was also used in prior works) on expander graphs. We exploit the "robustness" of expander graphs against vertex corruption to show that a subproblem hard for small-depth classical circuits can still be extracted from the output of the corrupted quantum circuit.

Cite as

Atsuya Hasegawa and François Le Gall. Quantum Advantage with Shallow Circuits Under Arbitrary Corruption. In 32nd International Symposium on Algorithms and Computation (ISAAC 2021). Leibniz International Proceedings in Informatics (LIPIcs), Volume 212, pp. 74:1-74:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2021)


Copy BibTex To Clipboard

@InProceedings{hasegawa_et_al:LIPIcs.ISAAC.2021.74,
  author =	{Hasegawa, Atsuya and Le Gall, Fran\c{c}ois},
  title =	{{Quantum Advantage with Shallow Circuits Under Arbitrary Corruption}},
  booktitle =	{32nd International Symposium on Algorithms and Computation (ISAAC 2021)},
  pages =	{74:1--74:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-214-3},
  ISSN =	{1868-8969},
  year =	{2021},
  volume =	{212},
  editor =	{Ahn, Hee-Kap and Sadakane, Kunihiko},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2021.74},
  URN =		{urn:nbn:de:0030-drops-155076},
  doi =		{10.4230/LIPIcs.ISAAC.2021.74},
  annote =	{Keywords: Quantum computing, circuit complexity, constant-depth circuits}
}
