LIPIcs, Volume 181

31st International Symposium on Algorithms and Computation (ISAAC 2020)




Event

ISAAC 2020, December 14-18, 2020, Hong Kong, China (Virtual Conference)

Editors

Yixin Cao
  • Hong Kong Polytechnic University, China
Siu-Wing Cheng
  • Hong Kong University of Science and Technology, China
Minming Li
  • City University of Hong Kong, China

Publication Details

  • published at: 2020-12-04
  • Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik
  • ISBN: 978-3-95977-173-3
  • DBLP: db/conf/isaac/isaac2020

Documents

Document
Complete Volume
LIPIcs, Volume 181, ISAAC 2020, Complete Volume

Authors: Yixin Cao, Siu-Wing Cheng, and Minming Li


Abstract
LIPIcs, Volume 181, ISAAC 2020, Complete Volume

Cite as

31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 1-1012, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@Proceedings{cao_et_al:LIPIcs.ISAAC.2020,
  title =	{{LIPIcs, Volume 181, ISAAC 2020, Complete Volume}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{1--1012},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020},
  URN =		{urn:nbn:de:0030-drops-133439},
  doi =		{10.4230/LIPIcs.ISAAC.2020},
  annote =	{Keywords: LIPIcs, Volume 181, ISAAC 2020, Complete Volume}
}
Document
Front Matter
Front Matter, Table of Contents, Preface, Conference Organization

Authors: Yixin Cao, Siu-Wing Cheng, and Minming Li


Abstract
Front Matter, Table of Contents, Preface, Conference Organization

Cite as

31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 0:i-0:xviii, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{cao_et_al:LIPIcs.ISAAC.2020.0,
  author =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  title =	{{Front Matter, Table of Contents, Preface, Conference Organization}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{0:i--0:xviii},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.0},
  URN =		{urn:nbn:de:0030-drops-133448},
  doi =		{10.4230/LIPIcs.ISAAC.2020.0},
  annote =	{Keywords: Front Matter, Table of Contents, Preface, Conference Organization}
}
Document
Invited Talk
How to Decompose a Graph into a Tree-Like Structure (Invited Talk)

Authors: Sang-il Oum


Abstract
Many NP-hard problems on graphs are known to be tractable if we restrict the input to have a certain decomposition into a tree-like structure. Width parameters of graphs are measures of how easy it is to decompose the input graph into a tree-like structure. Tree-width is one of the most well-studied width parameters of graphs, and rank-width is a generalization of tree-width to dense graphs. This talk will present a survey on width parameters of graphs such as tree-width and rank-width and discuss known algorithms for efficiently finding a decomposition of an input graph into such tree-like structures.

Cite as

Sang-il Oum. How to Decompose a Graph into a Tree-Like Structure (Invited Talk). In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, p. 1:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{oum:LIPIcs.ISAAC.2020.1,
  author =	{Oum, Sang-il},
  title =	{{How to Decompose a Graph into a Tree-Like Structure}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{1:1--1:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.1},
  URN =		{urn:nbn:de:0030-drops-133458},
  doi =		{10.4230/LIPIcs.ISAAC.2020.1},
  annote =	{Keywords: tree-width, rank-width}
}
Document
Invited Talk
Worst-Case Optimal Join Algorithms (Invited Talk)

Authors: Ke Yi


Abstract
Join is the most important operator in relational databases, and remains the most expensive one despite years of research and engineering efforts. Following the ground-breaking work of Atserias, Grohe, and Marx in 2008, worst-case optimal join algorithms have been discovered, which have led not only to a series of beautiful theoretical results, but also to new database systems based on drastically different query evaluation techniques. In this talk, I will present an overview of this topic, including algorithms in various computation models (sequential and parallel), variants of the problem (full, Boolean, and counting), and approximate solutions.
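
(Illustrative sketch, not from the talk.) The following minimal Python sketch evaluates the triangle query R(a,b) ⋈ S(b,c) ⋈ T(a,c) in the generic-join style that underlies worst-case optimal join algorithms: each variable is bound in turn by intersecting the candidate sets contributed by every relation that mentions it. The names and the set-of-pairs representation are assumptions made here for illustration only.

from collections import defaultdict

def triangle_join(R, S, T):
    # Enumerate all (a, b, c) with (a, b) in R, (b, c) in S, (a, c) in T.
    # Generic-join-style evaluation: bind one variable at a time and intersect
    # the candidate sets coming from every relation that mentions that variable.
    R_by_a, S_by_b, T_by_a = defaultdict(set), defaultdict(set), defaultdict(set)
    for a, b in R:
        R_by_a[a].add(b)
    for b, c in S:
        S_by_b[b].add(c)
    for a, c in T:
        T_by_a[a].add(c)
    out = []
    for a in R_by_a.keys() & T_by_a.keys():       # candidates for a
        for b in R_by_a[a] & S_by_b.keys():       # candidates for b given a
            for c in S_by_b[b] & T_by_a[a]:       # candidates for c given a, b
                out.append((a, b, c))
    return out

# One triangle (1, 2, 3) plus a dangling edge (1, 4) in R.
print(triangle_join(R={(1, 2), (1, 4)}, S={(2, 3)}, T={(1, 3)}))  # [(1, 2, 3)]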

Cite as

Ke Yi. Worst-Case Optimal Join Algorithms (Invited Talk). In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, p. 2:1, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{yi:LIPIcs.ISAAC.2020.2,
  author =	{Yi, Ke},
  title =	{{Worst-Case Optimal Join Algorithms}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{2:1--2:1},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.2},
  URN =		{urn:nbn:de:0030-drops-133462},
  doi =		{10.4230/LIPIcs.ISAAC.2020.2},
  annote =	{Keywords: query evaluation}
}
Document
(In)approximability of Maximum Minimal FVS

Authors: Louis Dublois, Tesshu Hanaka, Mehdi Khosravian Ghadikolaei, Michael Lampis, and Nikolaos Melissinos


Abstract
We study the approximability of the NP-complete Maximum Minimal Feedback Vertex Set problem. Informally, this natural problem seems to lie in an intermediate space between two more well-studied problems of this type: Maximum Minimal Vertex Cover, for which the best achievable approximation ratio is √n, and Upper Dominating Set, which does not admit any n^{1-ε} approximation. We confirm and quantify this intuition by showing the first non-trivial polynomial time approximation for Max Min FVS with a ratio of O(n^{2/3}), as well as a matching hardness of approximation bound of n^{2/3-ε}, improving the previously known hardness of n^{1/2-ε}. Along the way, we also obtain an O(Δ)-approximation and show that this is asymptotically best possible, and we improve the bound for which the problem is NP-hard from Δ ≥ 9 to Δ ≥ 6. Having settled the problem’s approximability in polynomial time, we move to the context of super-polynomial time. We devise a generalization of our approximation algorithm which, for any desired approximation ratio r, produces an r-approximate solution in time n^{O(n/r^{3/2})}. This time-approximation trade-off is essentially tight: we show that under the ETH, for any ratio r and ε > 0, no algorithm can r-approximate this problem in time n^{O((n/r^{3/2})^{1-ε})}, hence we precisely characterize the approximability of the problem for the whole spectrum between polynomial and sub-exponential time, up to an arbitrarily small constant in the second exponent.

Cite as

Louis Dublois, Tesshu Hanaka, Mehdi Khosravian Ghadikolaei, Michael Lampis, and Nikolaos Melissinos. (In)approximability of Maximum Minimal FVS. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 3:1-3:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{dublois_et_al:LIPIcs.ISAAC.2020.3,
  author =	{Dublois, Louis and Hanaka, Tesshu and Khosravian Ghadikolaei, Mehdi and Lampis, Michael and Melissinos, Nikolaos},
  title =	{{(In)approximability of Maximum Minimal FVS}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{3:1--3:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.3},
  URN =		{urn:nbn:de:0030-drops-133477},
  doi =		{10.4230/LIPIcs.ISAAC.2020.3},
  annote =	{Keywords: Approximation Algorithms, ETH, Inapproximability}
}
Document
A Faster Subquadratic Algorithm for the Longest Common Increasing Subsequence Problem

Authors: Anadi Agrawal and Paweł Gawrychowski


Abstract
The Longest Common Increasing Subsequence (LCIS) is a variant of the classical Longest Common Subsequence (LCS), in which we additionally require the common subsequence to be strictly increasing. While the well-known "Four Russians" technique can be used to find LCS in subquadratic time, it does not seem directly applicable to LCIS. Recently, Duraj [STACS 2020] used a completely different method based on the combinatorial properties of LCIS to design an 𝒪(n²(log log n)²/log^{1/6}n) time algorithm. We show that an approach based on exploiting tabulation (more involved than "Four Russians") can be used to construct an asymptotically faster 𝒪(n² log log n/√{log n}) time algorithm. As our solution avoids using the specific combinatorial properties of LCIS, it can also be adapted to the Longest Common Weakly Increasing Subsequence (LCWIS).
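
(Illustrative sketch, not from the paper.) For reference, the classical quadratic dynamic program for LCIS, i.e., the baseline that the subquadratic algorithms above improve upon; identifiers are chosen here for illustration.

def lcis_length(a, b):
    # dp[j] = length of the longest common increasing subsequence of the part
    # of `a` processed so far and of b that ends exactly with b[j].
    # O(len(a) * len(b)) time, O(len(b)) space.
    dp = [0] * len(b)
    for x in a:
        best = 0  # best dp[j] over earlier positions j with b[j] < x
        for j in range(len(b)):
            if b[j] == x and best + 1 > dp[j]:
                dp[j] = best + 1
            elif b[j] < x:
                best = max(best, dp[j])
    return max(dp, default=0)

# The LCIS of these two sequences is (1, 4, 5), so the length is 3.
print(lcis_length([3, 1, 4, 1, 5], [1, 4, 5, 9, 2]))  # 3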

Cite as

Anadi Agrawal and Paweł Gawrychowski. A Faster Subquadratic Algorithm for the Longest Common Increasing Subsequence Problem. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 4:1-4:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{agrawal_et_al:LIPIcs.ISAAC.2020.4,
  author =	{Agrawal, Anadi and Gawrychowski, Pawe{\l}},
  title =	{{A Faster Subquadratic Algorithm for the Longest Common Increasing Subsequence Problem}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{4:1--4:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.4},
  URN =		{urn:nbn:de:0030-drops-133487},
  doi =		{10.4230/LIPIcs.ISAAC.2020.4},
  annote =	{Keywords: Longest Common Increasing Subsequence, Four Russians}
}
Document
A Unified Framework of FPT Approximation Algorithms for Clustering Problems

Authors: Qilong Feng, Zhen Zhang, Ziyun Huang, Jinhui Xu, and Jianxin Wang


Abstract
In this paper, we present a framework for designing FPT approximation algorithms for many k-clustering problems. Our results are based on a new technique for reducing search spaces. A reduced search space is a small subset of the input data that is guaranteed to contain k clients close to the facilities opened in an optimal solution, for any clustering problem we consider. We show, somewhat surprisingly, that greedily sampling O(k) clients yields the desired reduced search space, based on which we obtain FPT(k)-time algorithms with improved approximation guarantees for problems such as capacitated clustering, lower-bounded clustering, clustering with service installation costs, fault tolerant clustering, and priority clustering.

Cite as

Qilong Feng, Zhen Zhang, Ziyun Huang, Jinhui Xu, and Jianxin Wang. A Unified Framework of FPT Approximation Algorithms for Clustering Problems. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 5:1-5:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{feng_et_al:LIPIcs.ISAAC.2020.5,
  author =	{Feng, Qilong and Zhang, Zhen and Huang, Ziyun and Xu, Jinhui and Wang, Jianxin},
  title =	{{A Unified Framework of FPT Approximation Algorithms for Clustering Problems}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{5:1--5:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.5},
  URN =		{urn:nbn:de:0030-drops-133495},
  doi =		{10.4230/LIPIcs.ISAAC.2020.5},
  annote =	{Keywords: clustering, approximation algorithms, fixed-parameter tractability}
}
Document
A Reduction of the Dynamic Time Warping Distance to the Longest Increasing Subsequence Length

Authors: Yoshifumi Sakai and Shunsuke Inenaga


Abstract
The similarity between a pair of time series, i.e., sequences of indexed values in time order, is often estimated by the dynamic time warping (DTW) distance, rather than by any measure in the well-studied family that includes the longest common subsequence (LCS) length and the edit distance. Although it may seem as if the DTW and the LCS(-like) measures are essentially different, we reveal that the DTW distance can be represented by the longest increasing subsequence (LIS) length of a sequence of integers, which is the LCS length between the integer sequence and itself sorted. For a given pair of time series of n integers between zero and c, we propose an integer sequence that represents any substring-substring DTW distance as its band-substring LIS length. The length of the produced integer sequence is O(c⁴ n²) or O(c² n²) depending on the variant of the DTW distance used, both of which can be translated to O(n²) for constant cost functions. To demonstrate that techniques developed for the LCS(-like) measures are directly applicable to the analysis of time series via our reduction of DTW to LIS, we present time-efficient algorithms for DTW-related problems utilizing the semi-local sequence comparison technique developed for LCS-related problems.
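
(Illustrative sketch, not from the paper.) The standard quadratic dynamic program for the DTW distance between two integer sequences, with the absolute difference as the local cost; it only illustrates the measure being reduced to LIS above, and all names are chosen here for illustration.

def dtw_distance(x, y):
    # dp[i][j] = cost of the best warping of x[:i] onto y[:j]; matching x[i-1]
    # with y[j-1] costs |x[i-1] - y[j-1]|, and a warping may repeat elements
    # of either sequence but never reorder them. O(len(x) * len(y)) time.
    n, m = len(x), len(y)
    INF = float("inf")
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # repeat y[j-1]
                                  dp[i][j - 1],      # repeat x[i-1]
                                  dp[i - 1][j - 1])  # advance both
    return dp[n][m]

# Warping [0, 1, 1, 2] onto [0, 1, 2] needs no mismatched values at all.
print(dtw_distance([0, 1, 1, 2], [0, 1, 2]))  # 0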

Cite as

Yoshifumi Sakai and Shunsuke Inenaga. A Reduction of the Dynamic Time Warping Distance to the Longest Increasing Subsequence Length. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 6:1-6:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{sakai_et_al:LIPIcs.ISAAC.2020.6,
  author =	{Sakai, Yoshifumi and Inenaga, Shunsuke},
  title =	{{A Reduction of the Dynamic Time Warping Distance to the Longest Increasing Subsequence Length}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{6:1--6:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.6},
  URN =		{urn:nbn:de:0030-drops-133508},
  doi =		{10.4230/LIPIcs.ISAAC.2020.6},
  annote =	{Keywords: algorithms, dynamic time warping distance, longest increasing subsequence, semi-local sequence comparison}
}
Document
Algorithms and Complexity for Geodetic Sets on Planar and Chordal Graphs

Authors: Dibyayan Chakraborty, Sandip Das, Florent Foucaud, Harmender Gahlawat, Dimitri Lajou, and Bodhayan Roy


Abstract
We study the complexity of finding the geodetic number on subclasses of planar graphs and chordal graphs. A set S of vertices of a graph G is a geodetic set if every vertex of G lies on a shortest path between some pair of vertices of S. The Minimum Geodetic Set (MGS) problem is to find a geodetic set of minimum cardinality in a given graph. The problem is known to remain NP-hard on bipartite graphs, chordal graphs, planar graphs and subcubic graphs. We first study MGS on restricted classes of planar graphs: we design a linear-time algorithm for MGS on solid grids, improving on a 3-approximation algorithm by Chakraborty et al. (CALDAM, 2020), and show that MGS remains NP-hard even for subcubic partial grids of arbitrary girth. This unifies some results in the literature. We then turn our attention to chordal graphs, showing that MGS is fixed-parameter tractable for inputs of this class when parameterized by their treewidth (which equals the clique number minus one). This implies a linear-time algorithm for k-trees, for fixed k. Then, we show that MGS is NP-hard on interval graphs, thereby answering a question of Ekim et al. (LATIN, 2012). As interval graphs are very constrained, to prove the latter result we design a rather sophisticated reduction technique to work around their inherent linear structure.
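
(Illustrative sketch, not from the paper.) A small brute-force checker for the geodetic-set definition above, using the fact that a vertex v lies on a shortest s-t path if and only if d(s,v) + d(v,t) = d(s,t); it is meant only for tiny graphs, and all names are chosen here for illustration.

from collections import deque
from itertools import combinations

def bfs_dist(adj, s):
    # Unweighted single-source shortest-path distances from s.
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

def is_geodetic_set(adj, S):
    # Every vertex must lie on a shortest path between two vertices of S,
    # i.e., d(s, v) + d(v, t) = d(s, t) for some pair s, t in S.
    INF = float("inf")
    dist = {s: bfs_dist(adj, s) for s in S}
    covered = set(S)  # a vertex of S trivially covers itself
    for s, t in combinations(S, 2):
        d_st = dist[s].get(t, INF)
        if d_st == INF:
            continue
        for v in adj:
            if dist[s].get(v, INF) + dist[t].get(v, INF) == d_st:
                covered.add(v)
    return covered == set(adj)

# On the 4-cycle 0-1-2-3-0, the two opposite corners {0, 2} form a geodetic set.
C4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(is_geodetic_set(C4, {0, 2}))  # True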

Cite as

Dibyayan Chakraborty, Sandip Das, Florent Foucaud, Harmender Gahlawat, Dimitri Lajou, and Bodhayan Roy. Algorithms and Complexity for Geodetic Sets on Planar and Chordal Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 7:1-7:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{chakraborty_et_al:LIPIcs.ISAAC.2020.7,
  author =	{Chakraborty, Dibyayan and Das, Sandip and Foucaud, Florent and Gahlawat, Harmender and Lajou, Dimitri and Roy, Bodhayan},
  title =	{{Algorithms and Complexity for Geodetic Sets on Planar and Chordal Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{7:1--7:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.7},
  URN =		{urn:nbn:de:0030-drops-133516},
  doi =		{10.4230/LIPIcs.ISAAC.2020.7},
  annote =	{Keywords: Geodetic set, Planar graph, Chordal graph, Interval graph, FPT algorithm}
}
Document
An SPQR-Tree-Like Embedding Representation for Level Planarity

Authors: Guido Brückner and Ignaz Rutter


Abstract
An SPQR-tree is a data structure that efficiently represents all planar embeddings of a biconnected planar graph. It is a key tool in a number of constrained planarity testing algorithms, which seek a planar embedding of a graph subject to some given set of constraints. We develop an SPQR-tree-like data structure that represents all level-planar embeddings of a biconnected level graph with a single source, called the LP-tree, and give a simple algorithm to compute it in linear time. Moreover, we show that LP-trees can be used to adapt three constrained planarity algorithms to the level-planar case by using them as a drop-in replacement for SPQR-trees.

Cite as

Guido Brückner and Ignaz Rutter. An SPQR-Tree-Like Embedding Representation for Level Planarity. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 8:1-8:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{bruckner_et_al:LIPIcs.ISAAC.2020.8,
  author =	{Br\"{u}ckner, Guido and Rutter, Ignaz},
  title =	{{An SPQR-Tree-Like Embedding Representation for Level Planarity}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{8:1--8:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.8},
  URN =		{urn:nbn:de:0030-drops-133526},
  doi =		{10.4230/LIPIcs.ISAAC.2020.8},
  annote =	{Keywords: SPQR-tree, Level planarity, Partial drawings, Simultaneous drawings}
}
Document
Approximating the Packedness of Polygonal Curves

Authors: Joachim Gudmundsson, Yuan Sha, and Sampson Wong


Abstract
In 2012, Driemel et al. [Anne Driemel et al., 2012] introduced the concept of c-packed curves as a realistic input model. In the case when c is a constant, they gave a near linear time (1+ε)-approximation algorithm for computing the Fréchet distance between two c-packed polygonal curves. Since then a number of papers have used the model. In this paper we consider the problem of computing the smallest c for which a given polygonal curve in ℝ^d is c-packed. We present two approximation algorithms. The first algorithm is a 2-approximation algorithm and runs in O(dn² log n) time. In the case d = 2 we develop a faster algorithm that returns a (6+ε)-approximation and runs in O((n/ε³)^{4/3} polylog(n/ε)) time. We also implemented the first algorithm and computed the approximate packedness value for 16 sets of real-world trajectories. The experiments indicate that the notion of c-packedness is a useful realistic input model for many curves and trajectories.

Cite as

Joachim Gudmundsson, Yuan Sha, and Sampson Wong. Approximating the Packedness of Polygonal Curves. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 9:1-9:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{gudmundsson_et_al:LIPIcs.ISAAC.2020.9,
  author =	{Gudmundsson, Joachim and Sha, Yuan and Wong, Sampson},
  title =	{{Approximating the Packedness of Polygonal Curves}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{9:1--9:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.9},
  URN =		{urn:nbn:de:0030-drops-133530},
  doi =		{10.4230/LIPIcs.ISAAC.2020.9},
  annote =	{Keywords: Computational geometry, trajectories, realistic input models}
}
Document
Approximation Algorithms for Generalized Path Scheduling

Authors: Haozhou Pang and Mohammad R. Salavatipour


Abstract
Scheduling problems where the machines can be represented as the edges of a network and each job needs to be processed by a sequence of machines that form a path in this network have been the subject of many research articles (e.g. flow shop is the special case where the network as well as the sequence of machines for each job is a simple path). In this paper we consider one such problem, called the Generalized Path Scheduling (GPS) problem, which can be defined as follows. We are given a set J of non-preemptive jobs and a set M of identical machines (|J| = n and |M| = m). The machines are ordered on a path. Each job j = {P_j = {l_j, r_j}, p_j} is defined by its processing time p_j and a sub-path P_j from the machine with index l_j to the machine with index r_j (l_j, r_j ∈ M, and l_j ≤ r_j), specifying the order of machines it must go through. We assume each machine has a queue of infinite size where jobs can sit to resolve conflicts. Two objective functions, makespan and total completion time, are considered. Machines can be identical or unrelated. In the latter case, this problem generalizes the classical Flow shop problem (in which all jobs have to go through all machines from 1 to m in that order). Generalized Path Scheduling has been studied (e.g. see [Ronald Koch et al., 2009; Zachary Friggstad et al., 2019]). In this paper, we present several improved approximation algorithms for both objectives. For the case where the number of machines is sub-logarithmic in the number of jobs, we present a PTAS for both makespan and total completion time. The PTAS holds even in the unrelated machines setting and therefore generalizes the result of Hall [Leslie A. Hall, 1998] for the classic problem of Flow shop. For the case of identical machines, we present O((log m)/(log log m))-approximation algorithms for both objectives, which improve the previous best results of [Zachary Friggstad et al., 2019]. We also show that the GPS problem is NP-complete for both makespan and total completion time objectives.

Cite as

Haozhou Pang and Mohammad R. Salavatipour. Approximation Algorithms for Generalized Path Scheduling. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 10:1-10:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{pang_et_al:LIPIcs.ISAAC.2020.10,
  author =	{Pang, Haozhou and Salavatipour, Mohammad R.},
  title =	{{Approximation Algorithms for Generalized Path Scheduling}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{10:1--10:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.10},
  URN =		{urn:nbn:de:0030-drops-133547},
  doi =		{10.4230/LIPIcs.ISAAC.2020.10},
  annote =	{Keywords: Approximation Algorithms, Path Scheduling, Flow shop, Job Shop}
}
Document
Approximations for Throughput Maximization

Authors: Dylan Hyatt-Denesik, Mirmahdi Rahgoshay, and Mohammad R. Salavatipour


Abstract
In this paper we study the classical problem of throughput maximization. In this problem we have a collection J of n jobs, each having a release time r_j, deadline d_j, and processing time p_j. They have to be scheduled non-preemptively on m identical parallel machines. The goal is to find a schedule which maximizes the number of jobs scheduled entirely in their [r_j,d_j] window. This problem has been studied extensively (even for the case of m = 1). Several special cases of the problem remain open. Bar-Noy et al. [STOC1999] presented an algorithm with ratio 1-1/(1+1/m)^m for m machines, which approaches 1-1/e as m increases. For m = 1, Chuzhoy-Ostrovsky-Rabani [FOCS2001] presented an algorithm with approximation ratio 1-1/e-ε (for any ε > 0). Recently Im-Li-Moseley [IPCO2017] presented an algorithm with ratio 1-1/e+ε₀ for some absolute constant ε₀ > 0 for any fixed m. They also presented an algorithm with ratio 1-O(√(log m/m))-ε for general m which approaches 1 as m grows. The approximability of the problem for m = O(1) remains a major open question. Even for the case of m = 1 and c = O(1) distinct processing times the problem is open (Sgall [ESA2012]). In this paper we study the case of m = O(1) and show that if there are c distinct processing times, i.e., the p_j’s come from a set of size c, then there is a randomized (1-ε)-approximation that runs in time O(n^{mc⁷ε^{-6}} log T), where T is the largest deadline. Therefore, for constant m and constant c this yields a PTAS. Our algorithm is based on proving structural properties for a near-optimum solution that allow one to use dynamic programming with pruning.

Cite as

Dylan Hyatt-Denesik, Mirmahdi Rahgoshay, and Mohammad R. Salavatipour. Approximations for Throughput Maximization. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 11:1-11:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{hyattdenesik_et_al:LIPIcs.ISAAC.2020.11,
  author =	{Hyatt-Denesik, Dylan and Rahgoshay, Mirmahdi and Salavatipour, Mohammad R.},
  title =	{{Approximations for Throughput Maximization}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{11:1--11:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.11},
  URN =		{urn:nbn:de:0030-drops-133555},
  doi =		{10.4230/LIPIcs.ISAAC.2020.11},
  annote =	{Keywords: Scheduling, Approximation Algorithms, Throughput Maximization}
}
Document
Arithmetic Expression Construction

Authors: Leo Alcock, Sualeh Asif, Jeffrey Bosboom, Josh Brunner, Charlotte Chen, Erik D. Demaine, Rogers Epstein, Adam Hesterberg, Lior Hirschfeld, William Hu, Jayson Lynch, Sarah Scheffler, and Lillian Zhang


Abstract
When can n given numbers be combined using arithmetic operators from a given subset of {+,-,×,÷} to obtain a given target number? We study three variations of this problem of Arithmetic Expression Construction: when the expression (1) is unconstrained; (2) has a specified pattern of parentheses and operators (and only the numbers need to be assigned to blanks); or (3) must match a specified ordering of the numbers (but the operators and parenthesization are free). For each of these variants, and many of the subsets of {+,-,×,÷}, we prove the problem NP-complete, sometimes in the weak sense and sometimes in the strong sense. Most of these proofs make use of a rational function framework which proves equivalence of these problems for values in rational functions and values in positive integers.
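
(Illustrative sketch, not from the paper.) A tiny exponential-time solver for the unconstrained variant (1), in the style of the "24 game": repeatedly combine two of the remaining numbers with an allowed operator until one value is left. All names are chosen here for illustration.

from fractions import Fraction
from itertools import combinations

def can_reach(numbers, target, ops="+-*/"):
    # Can `numbers` (each used exactly once, any parenthesization, operators
    # drawn from `ops`) be combined into an expression whose value is `target`?
    # Exact rational arithmetic avoids rounding issues with division.
    target = Fraction(target)

    def solve(vals):
        if len(vals) == 1:
            return vals[0] == target
        for i, j in combinations(range(len(vals)), 2):
            rest = tuple(vals[k] for k in range(len(vals)) if k not in (i, j))
            a, b = vals[i], vals[j]
            results = set()
            if "+" in ops:
                results.add(a + b)
            if "*" in ops:
                results.add(a * b)
            if "-" in ops:
                results.update({a - b, b - a})
            if "/" in ops:
                if b != 0:
                    results.add(a / b)
                if a != 0:
                    results.add(b / a)
            if any(solve(rest + (r,)) for r in results):
                return True
        return False

    return solve(tuple(Fraction(x) for x in numbers))

# 8 / (3 - 8/3) = 24, the classic hard "24 game" instance.
print(can_reach([3, 3, 8, 8], 24))          # True
print(can_reach([1, 1, 1, 1], 5, ops="+"))  # False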

Cite as

Leo Alcock, Sualeh Asif, Jeffrey Bosboom, Josh Brunner, Charlotte Chen, Erik D. Demaine, Rogers Epstein, Adam Hesterberg, Lior Hirschfeld, William Hu, Jayson Lynch, Sarah Scheffler, and Lillian Zhang. Arithmetic Expression Construction. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 12:1-12:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{alcock_et_al:LIPIcs.ISAAC.2020.12,
  author =	{Alcock, Leo and Asif, Sualeh and Bosboom, Jeffrey and Brunner, Josh and Chen, Charlotte and Demaine, Erik D. and Epstein, Rogers and Hesterberg, Adam and Hirschfeld, Lior and Hu, William and Lynch, Jayson and Scheffler, Sarah and Zhang, Lillian},
  title =	{{Arithmetic Expression Construction}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{12:1--12:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.12},
  URN =		{urn:nbn:de:0030-drops-133568},
  doi =		{10.4230/LIPIcs.ISAAC.2020.12},
  annote =	{Keywords: Hardness, algebraic complexity, expression trees}
}
Document
Between Shapes, Using the Hausdorff Distance

Authors: Marc van Kreveld, Tillmann Miltzow, Tim Ophelders, Willem Sonke, and Jordi L. Vermeulen


Abstract
Given two shapes A and B in the plane with Hausdorff distance 1, is there a shape S with Hausdorff distance 1/2 to and from A and B? The answer is always yes, and depending on convexity of A and/or B, S may be convex, connected, or disconnected. We show a generalization of this result on Hausdorff distances and middle shapes, and show some related properties. We also show that a generalization of such middle shapes implies a morph with a bounded rate of change. Finally, we explore a generalization of the concept of a Hausdorff middle to more than two sets and show how to approximate or compute it.
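
(Illustrative sketch, not from the paper.) The Hausdorff distance for finite point sets, to make statements such as "Hausdorff distance 1" and "distance 1/2 to and from A and B" concrete; a brute-force O(|A|·|B|) computation with names chosen here for illustration.

from math import dist

def directed_hausdorff(A, B):
    # max over a in A of the distance from a to its nearest point of B
    return max(min(dist(a, b) for b in B) for a in A)

def hausdorff(A, B):
    # symmetric Hausdorff distance between finite point sets A and B
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

# Two point sets at Hausdorff distance 1 and a "middle" set halfway in between.
A = [(0.0, 0.0), (1.0, 0.0)]
B = [(0.0, 1.0), (1.0, 1.0)]
S = [(0.0, 0.5), (1.0, 0.5)]
print(hausdorff(A, B))                    # 1.0
print(hausdorff(A, S), hausdorff(B, S))   # 0.5 0.5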

Cite as

Marc van Kreveld, Tillmann Miltzow, Tim Ophelders, Willem Sonke, and Jordi L. Vermeulen. Between Shapes, Using the Hausdorff Distance. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 13:1-13:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{vankreveld_et_al:LIPIcs.ISAAC.2020.13,
  author =	{van Kreveld, Marc and Miltzow, Tillmann and Ophelders, Tim and Sonke, Willem and Vermeulen, Jordi L.},
  title =	{{Between Shapes, Using the Hausdorff Distance}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{13:1--13:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.13},
  URN =		{urn:nbn:de:0030-drops-133572},
  doi =		{10.4230/LIPIcs.ISAAC.2020.13},
  annote =	{Keywords: computational geometry, Hausdorff distance, shape interpolation}
}
Document
Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs

Authors: Trung Thanh Nguyen and Jörg Rothe


Abstract
We study a generalized version of the load balancing problem on unrelated machines with cost constraints: Given a set of m machines (of certain types) and a set of n jobs, each job j processed on machine i requires p_{i,j} time units and incurs a cost c_{i,j}, and the goal is to find a schedule of jobs to machines, which is defined as an ordered partition of the n jobs into m disjoint subsets, in such a way that some objective function of the vector of the completion times of the machines is optimized, subject to the constraint that the total cost incurred by the schedule must be within a given budget B. Motivated by recent results from the literature, our focus is on the case when the number of machine types is a fixed constant, and we develop a bi-criteria approximation scheme for the studied problem. Our result generalizes several known results for certain special cases, such as the case with identical machines, or the case with a constant number of machines with cost constraints. Building on the elegant technique recently proposed by Jansen and Maack [K. Jansen and M. Maack, 2019], we construct a more general approach that can be used to derive approximation schemes for a wider class of load balancing problems with constraints.

Cite as

Trung Thanh Nguyen and Jörg Rothe. Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 14:1-14:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{nguyen_et_al:LIPIcs.ISAAC.2020.14,
  author =	{Nguyen, Trung Thanh and Rothe, J\"{o}rg},
  title =	{{Bi-Criteria Approximation Algorithms for Load Balancing on Unrelated Machines with Costs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{14:1--14:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.14},
  URN =		{urn:nbn:de:0030-drops-133582},
  doi =		{10.4230/LIPIcs.ISAAC.2020.14},
  annote =	{Keywords: bi-criteria approximation algorithm, polynomial-time approximation algorithm, load balancing, machine scheduling}
}
Document
Cake Cutting: An Envy-Free and Truthful Mechanism with a Small Number of Cuts

Authors: Takao Asano and Hiroyuki Umeda


Abstract
The mechanism for the cake-cutting problem based on the expansion process with unlocking proposed by Alijani, Farhadi, Ghodsi, Seddighin, and Tajik [Reza Alijani et al., 2017; Masoud Seddighin et al., 2019] uses a small number of cuts but, contrary to their claim, is not actually envy-free and truthful. In this paper, we consider the same cake-cutting problem and give a new envy-free and truthful mechanism with a small number of cuts, which is not based on their expansion process with unlocking.

Cite as

Takao Asano and Hiroyuki Umeda. Cake Cutting: An Envy-Free and Truthful Mechanism with a Small Number of Cuts. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 15:1-15:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{asano_et_al:LIPIcs.ISAAC.2020.15,
  author =	{Asano, Takao and Umeda, Hiroyuki},
  title =	{{Cake Cutting: An Envy-Free and Truthful Mechanism with a Small Number of Cuts}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{15:1--15:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.15},
  URN =		{urn:nbn:de:0030-drops-133599},
  doi =		{10.4230/LIPIcs.ISAAC.2020.15},
  annote =	{Keywords: cake-cutting problem, envy-freeness, fairness, truthfulness, mechanism design}
}
Document
Compact Routing in Unit Disk Graphs

Authors: Wolfgang Mulzer and Max Willert


Abstract
Let V ⊂ ℝ² be a set of n sites in the plane. The unit disk graph DG(V) of V is the graph with vertex set V where two sites v and w are adjacent if and only if their Euclidean distance is at most 1. We develop a compact routing scheme ℛ for DG(V). The routing scheme ℛ preprocesses DG(V) by assigning a label 𝓁(v) to every site v in V. After that, for any two sites s and t, the scheme ℛ must be able to route a packet from s to t as follows: given the label of a current vertex r (initially, r = s), the label of the target vertex t, and additional information in the header of the packet, the scheme determines a neighbor r' of r. Then, the packet is forwarded to r', and the process continues until the packet reaches its desired target t. The resulting path between the source s and the target t is called the routing path of s and t. The stretch of the routing scheme is the maximum ratio of the total Euclidean length of the routing path and of the shortest path in DG(V), between any two sites s, t ∈ V. We show that for any given ε > 0, we can construct a routing scheme for DG(V) with diameter D that achieves stretch 1+ε, has label size (1/ε)^{O(ε^{-2})} log D log³ n / log log n, and the header has at most O(log² n / log log n) bits. In the past, several routing schemes for unit disk graphs have been proposed. Our scheme achieves poly-logarithmic label and header size, small stretch, and does not use any neighborhood oracles.
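
(Illustrative sketch, not from the paper.) Constructing the unit disk graph DG(V) from a set of plane points exactly as defined above (two sites are adjacent iff their Euclidean distance is at most 1); a quadratic-time brute force with names chosen here for illustration.

from math import dist

def unit_disk_graph(points):
    # Adjacency lists of DG(V): sites i and j are adjacent iff their Euclidean
    # distance is at most 1 (brute force over all pairs of sites).
    adj = {i: [] for i in range(len(points))}
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if dist(points[i], points[j]) <= 1.0:
                adj[i].append(j)
                adj[j].append(i)
    return adj

# Three sites on a line: the outer two are only connected through the middle one.
print(unit_disk_graph([(0.0, 0.0), (0.9, 0.0), (1.8, 0.0)]))
# {0: [1], 1: [0, 2], 2: [1]}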

Cite as

Wolfgang Mulzer and Max Willert. Compact Routing in Unit Disk Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 16:1-16:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{mulzer_et_al:LIPIcs.ISAAC.2020.16,
  author =	{Mulzer, Wolfgang and Willert, Max},
  title =	{{Compact Routing in Unit Disk Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{16:1--16:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.16},
  URN =		{urn:nbn:de:0030-drops-133602},
  doi =		{10.4230/LIPIcs.ISAAC.2020.16},
  annote =	{Keywords: routing scheme, unit disk graph, separator}
}
Document
Complexity of Retrograde and Helpmate Chess Problems: Even Cooperative Chess Is Hard

Authors: Josh Brunner, Erik D. Demaine, Dylan Hendrickson, and Julian Wellman


Abstract
We prove PSPACE-completeness of two classic types of Chess problems when generalized to n × n boards. A "retrograde" problem asks whether it is possible for a position to be reached from a natural starting position, i.e., whether the position is "valid" or "legal" or "reachable". Most real-world retrograde Chess problems ask for the last few moves of such a sequence; we analyze the decision question which gets at the existence of an exponentially long move sequence. A "helpmate" problem asks whether it is possible for a player to become checkmated by any sequence of moves from a given position. A helpmate problem is essentially a cooperative form of Chess, where both players work together to cause a particular player to win; it also arises in regular Chess games, where a player who runs out of time (flags) loses only if they could ever possibly be checkmated from the current position (i.e., the helpmate problem has a solution). Our PSPACE-hardness reductions are from a variant of a puzzle game called Subway Shuffle.

Cite as

Josh Brunner, Erik D. Demaine, Dylan Hendrickson, and Julian Wellman. Complexity of Retrograde and Helpmate Chess Problems: Even Cooperative Chess Is Hard. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 17:1-17:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{brunner_et_al:LIPIcs.ISAAC.2020.17,
  author =	{Brunner, Josh and Demaine, Erik D. and Hendrickson, Dylan and Wellman, Julian},
  title =	{{Complexity of Retrograde and Helpmate Chess Problems: Even Cooperative Chess Is Hard}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{17:1--17:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.17},
  URN =		{urn:nbn:de:0030-drops-133618},
  doi =		{10.4230/LIPIcs.ISAAC.2020.17},
  annote =	{Keywords: hardness, board games, PSPACE}
}
Document
Complexity of Scheduling Few Types of Jobs on Related and Unrelated Machines

Authors: Martin Koutecký and Johannes Zink


Abstract
The task of scheduling jobs to machines while minimizing the total makespan, the sum of weighted completion times, or a norm of the load vector, are among the oldest and most fundamental tasks in combinatorial optimization. Since all of these problems are in general NP-hard, much attention has been given to the regime where there is only a small number k of job types, but possibly the number of jobs n is large; this is the few job types, high-multiplicity regime. Despite many positive results, the hardness boundary of this regime was not understood until now. We show that makespan minimization on uniformly related machines (Q|HM|C_max) is NP-hard already with 6 job types, and that the related Cutting Stock problem is NP-hard already with 8 item types. For the more general unrelated machines model (R|HM|C_max), we show that if either the largest job size p_max, or the number of jobs n are polynomially bounded in the instance size |I|, there are algorithms with complexity |I|^poly(k). Our main result is that this is unlikely to be improved, because Q||C_max is W[1]-hard parameterized by k already when n, p_max, and the numbers describing the speeds are polynomial in |I|; the same holds for R|HM|C_max (without speeds) when the job sizes matrix has rank 2. Our positive and negative results also extend to the objectives 𝓁₂-norm minimization of the load vector and, partially, sum of weighted completion times ∑ w_j C_j. Along the way, we answer affirmatively the question whether makespan minimization on identical machines (P||C_max) is fixed-parameter tractable parameterized by k, extending our understanding of this fundamental problem. Together with our hardness results for Q||C_max this implies that the complexity of P|HM|C_max is the only remaining open case.

Cite as

Martin Koutecký and Johannes Zink. Complexity of Scheduling Few Types of Jobs on Related and Unrelated Machines. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 18:1-18:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{koutecky_et_al:LIPIcs.ISAAC.2020.18,
  author =	{Kouteck\'{y}, Martin and Zink, Johannes},
  title =	{{Complexity of Scheduling Few Types of Jobs on Related and Unrelated Machines}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{18:1--18:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.18},
  URN =		{urn:nbn:de:0030-drops-133620},
  doi =		{10.4230/LIPIcs.ISAAC.2020.18},
  annote =	{Keywords: Scheduling, cutting stock, hardness, parameterized complexity}
}
Document
Complexity of Stability

Authors: Fabian Frei, Edith Hemaspaandra, and Jörg Rothe


Abstract
Graph parameters such as the clique number, the chromatic number, and the independence number are central in many areas, ranging from computer networks to linguistics to computational neuroscience to social networks. In particular, the chromatic number of a graph (i.e., the smallest number of colors needed to color all vertices such that no two adjacent vertices are of the same color) can be applied in solving practical tasks as diverse as pattern matching, scheduling jobs to machines, allocating registers in compiler optimization, and even solving Sudoku puzzles. Typically, however, the underlying graphs are subject to (often minor) changes. To make these applications of graph parameters robust, it is important to know which graphs are stable for them in the sense that adding or deleting single edges or vertices does not change them. We initiate the study of stability of graphs for such parameters in terms of their computational complexity. We show that, for various central graph parameters, the problem of determining whether or not a given graph is stable is complete for Θ₂ᵖ, a well-known complexity class in the second level of the polynomial hierarchy, which is also known as "parallel access to NP."

Cite as

Fabian Frei, Edith Hemaspaandra, and Jörg Rothe. Complexity of Stability. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 19:1-19:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{frei_et_al:LIPIcs.ISAAC.2020.19,
  author =	{Frei, Fabian and Hemaspaandra, Edith and Rothe, J\"{o}rg},
  title =	{{Complexity of Stability}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{19:1--19:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.19},
  URN =		{urn:nbn:de:0030-drops-133631},
  doi =		{10.4230/LIPIcs.ISAAC.2020.19},
  annote =	{Keywords: Stability, Robustness, Complexity, Local Modifications, Colorability, Vertex Cover, Clique, Independent Set, Satisfiability, Unfrozenness, Criticality, DP, coDP, Parallel Access to NP}
}
Document
Computing Dense and Sparse Subgraphs of Weakly Closed Graphs

Authors: Tomohiro Koana, Christian Komusiewicz, and Frank Sommer


Abstract
A graph G is weakly γ-closed if every induced subgraph of G contains one vertex v such that for each non-neighbor u of v it holds that |N(u)∩ N(v)| < γ. The weak closure γ(G) of a graph, recently introduced by Fox et al. [SIAM J. Comp. 2020], is the smallest number such that G is weakly γ-closed. This graph parameter is never larger than the degeneracy (plus one) and can be significantly smaller. Extending the work of Fox et al. [SIAM J. Comp. 2020] on clique enumeration, we show that several problems related to finding dense subgraphs, such as the enumeration of bicliques and s-plexes, are fixed-parameter tractable with respect to γ(G). Moreover, we show that the problem of determining whether a weakly γ-closed graph G has a subgraph on at least k vertices that belongs to a graph class 𝒢 which is closed under taking subgraphs admits a kernel with at most γ k² vertices. Finally, we provide fixed-parameter algorithms for Independent Dominating Set and Dominating Clique when parameterized by γ+k where k is the solution size.

Cite as

Tomohiro Koana, Christian Komusiewicz, and Frank Sommer. Computing Dense and Sparse Subgraphs of Weakly Closed Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 20:1-20:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{koana_et_al:LIPIcs.ISAAC.2020.20,
  author =	{Koana, Tomohiro and Komusiewicz, Christian and Sommer, Frank},
  title =	{{Computing Dense and Sparse Subgraphs of Weakly Closed Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{20:1--20:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.20},
  URN =		{urn:nbn:de:0030-drops-133646},
  doi =		{10.4230/LIPIcs.ISAAC.2020.20},
  annote =	{Keywords: Fixed-parameter tractability, c-closure, degeneracy, clique relaxations, bicliques, dominating set}
}
Document
Constant-Factor Approximation Algorithms for the Parity-Constrained Facility Location Problem

Authors: Kangsan Kim, Yongho Shin, and Hyung-Chan An


Abstract
Facility location is a prominent optimization problem that has inspired a large quantity of both theoretical and practical studies in combinatorial optimization. Although the problem has been investigated under various settings reflecting typical structures within the optimization problems of practical interest, little is known on how the problem behaves in conjunction with parity constraints. This shortfall of understanding was rather discouraging when we consider the central role of parity in the field of combinatorics. In this paper, we present the first constant-factor approximation algorithm for the facility location problem with parity constraints. We are given as the input a metric on a set of facilities and clients, the opening cost of each facility, and the parity requirement - odd, even, or unconstrained - of every facility in this problem. The objective is to open a subset of facilities and assign every client to an open facility so as to minimize the sum of the total opening costs and the assignment distances, but subject to the condition that the number of clients assigned to each open facility must have the same parity as its requirement. Although the unconstrained facility location problem as a relaxation for this parity-constrained generalization has an unbounded gap, we demonstrate that it yields a structured solution whose parity violation can be corrected at small cost. This correction is prescribed by a T-join on an auxiliary graph constructed by the algorithm. This auxiliary graph does not satisfy the triangle inequality, but we show that a carefully chosen set of shortcutting operations leads to a cheap and sparse T-join. Finally, we bound the correction cost by exhibiting a combinatorial multi-step construction of an upper bound.

Cite as

Kangsan Kim, Yongho Shin, and Hyung-Chan An. Constant-Factor Approximation Algorithms for the Parity-Constrained Facility Location Problem. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 21:1-21:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


BibTeX

@InProceedings{kim_et_al:LIPIcs.ISAAC.2020.21,
  author =	{Kim, Kangsan and Shin, Yongho and An, Hyung-Chan},
  title =	{{Constant-Factor Approximation Algorithms for the Parity-Constrained Facility Location Problem}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{21:1--21:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.21},
  URN =		{urn:nbn:de:0030-drops-133652},
  doi =		{10.4230/LIPIcs.ISAAC.2020.21},
  annote =	{Keywords: Facility location problems, approximation algorithms, clustering problems, parity constraints}
}
Document
Contracting to a Longest Path in H-Free Graphs

Authors: Walter Kern and Daniël Paulusma


Abstract
The Path Contraction problem takes as input a graph G and an integer k and asks whether G can be modified to the k-vertex path P_k by a sequence of edge contractions. A graph G is H-free for some graph H if G does not contain H as an induced subgraph. The Path Contraction problem restricted to H-free graphs is known to be NP-complete if H = claw or H = P₆ and polynomial-time solvable if H = P₅. We first settle the complexity of Path Contraction on H-free graphs for every H by developing a common technique. We then compare our classification with a (new) classification of the complexity of the problem Long Induced Path, which is to decide, for a given integer k, whether a given graph can be modified to P_k by a sequence of vertex deletions. Finally, we prove that the complexity classifications of Path Contraction and Cycle Contraction for H-free graphs do not coincide. The latter problem, which has not yet been fully classified for H-free graphs, is to decide whether, for some given integer k, a given graph contains the k-vertex cycle C_k as a contraction.

Cite as

Walter Kern and Daniël Paulusma. Contracting to a Longest Path in H-Free Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 22:1-22:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{kern_et_al:LIPIcs.ISAAC.2020.22,
  author =	{Kern, Walter and Paulusma, Dani\"{e}l},
  title =	{{Contracting to a Longest Path in H-Free Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{22:1--22:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.22},
  URN =		{urn:nbn:de:0030-drops-133664},
  doi =		{10.4230/LIPIcs.ISAAC.2020.22},
  annote =	{Keywords: dichotomy, edge contraction, path, cycle, H-free graph}
}
Document
Counting 4-Patterns in Permutations Is Equivalent to Counting 4-Cycles in Graphs

Authors: Bartłomiej Dudek and Paweł Gawrychowski


Abstract
Permutation σ appears in permutation π if there exists a subsequence of π that is order-isomorphic to σ. The natural algorithmic question is to check whether σ appears in π, and if so, to count the number of occurrences. Only very recently have we learned that for any fixed length k, we can check whether a given pattern of length k appears in a permutation of length n in time linear in n, but being able to count all such occurrences in f(k)⋅ n^o(k/log k) time would refute the exponential time hypothesis (ETH). Together with practical applications in statistics, this motivates a systematic study of the complexity of counting occurrences for different patterns of fixed small length k. We investigate this question for k = 4. Very recently, Even-Zohar and Leng [arXiv 2019] identified two types of 4-patterns. For the first type they designed an 𝒪̃(n) time algorithm, while for the second they were able to provide an 𝒪̃(n^1.5) time algorithm. This brings up the question of whether patterns of the second type are inherently harder than those of the first type. We establish a connection between counting 4-patterns of the second type and counting 4-cycles (not necessarily induced) in a sparse undirected graph. By designing two-way reductions we show that the complexities of both problems are the same, up to polylogarithmic factors. This allows us to leverage the work done on the latter to provide a reasonable argument for why there is a difference in the complexities of counting 4-patterns of the first and the second type. In particular, even for the seemingly simpler problem of detecting a 4-cycle in a graph on m edges, the best known algorithm works in 𝒪(m^{4/3}) time. Our reductions imply that an 𝒪(n^{4/3-ε}) time algorithm for counting occurrences of any 4-pattern of the second type in a permutation of length n would imply an exciting breakthrough for counting (and hence also detecting) 4-cycles. In the other direction, by plugging in the fastest known algorithm for counting 4-cycles, we obtain an algorithm for counting occurrences of any 4-pattern of the second type in 𝒪(n^1.48) time.
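
To pin down the notion of pattern occurrence used above, here is a hedged brute-force Python counter (O(n^k) time, illustration only; the algorithms discussed in the paper are far faster):

from itertools import combinations

def count_pattern_occurrences(pi, sigma):
    """Count subsequences of the permutation pi that are order-isomorphic to
    the pattern sigma. Brute force over all index subsets of size |sigma|."""
    k = len(sigma)
    rank = [sorted(sigma).index(x) for x in sigma]     # relative order of sigma
    count = 0
    for idx in combinations(range(len(pi)), k):
        values = [pi[i] for i in idx]
        if [sorted(values).index(v) for v in values] == rank:
            count += 1
    return count

# Example: count occurrences of the pattern 1 3 2 4 in a small permutation.
print(count_pattern_occurrences([2, 5, 1, 4, 3, 6], [1, 3, 2, 4]))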

Cite as

Bartłomiej Dudek and Paweł Gawrychowski. Counting 4-Patterns in Permutations Is Equivalent to Counting 4-Cycles in Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 23:1-23:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{dudek_et_al:LIPIcs.ISAAC.2020.23,
  author =	{Dudek, Bart{\l}omiej and Gawrychowski, Pawe{\l}},
  title =	{{Counting 4-Patterns in Permutations Is Equivalent to Counting 4-Cycles in Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{23:1--23:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.23},
  URN =		{urn:nbn:de:0030-drops-133678},
  doi =		{10.4230/LIPIcs.ISAAC.2020.23},
  annote =	{Keywords: Permutations, pattern avoidance, counting cycles}
}
Document
Discriminating Codes in Geometric Setups

Authors: Sanjana Dey, Florent Foucaud, Subhas C. Nandy, and Arunabha Sen


Abstract
We study two geometric variations of the discriminating code problem. In the discrete version, a finite set of points P and a finite set of objects S are given in ℝ^d. The objective is to choose a subset S^* ⊆ S of minimum cardinality such that the subsets S_i^* ⊆ S^* of objects covering p_i satisfy S_i^* ≠ ∅ for each i = 1,2,…, n, and S_i^* ≠ S_j^* for each pair (i,j), i ≠ j. In the continuous version, the solution set S^* can be chosen freely among a (potentially infinite) class of allowed geometric objects. In the 1-dimensional case (d = 1), the points are placed on some fixed line L, and the objects in S are finite segments of L (called intervals). We show that the discrete version of this problem is NP-complete. This is somewhat surprising, as the continuous version is known to be polynomial-time solvable. This is also in contrast with most geometric covering problems, which are usually polynomial-time solvable in 1D. We then design a polynomial-time 2-approximation algorithm for the 1-dimensional discrete case. We also design a PTAS for both the discrete and continuous cases when the intervals are all required to have the same length. We then study the 2-dimensional case (d = 2) for axis-parallel unit square objects. We show that both the continuous and discrete versions are NP-hard, and design polynomial-time approximation algorithms with factors 4+ε and 32+ε, respectively (for every fixed ε > 0).
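
To make the definition of a discriminating code concrete, a minimal Python check for the 1-dimensional discrete case (an illustrative sketch, not the paper's algorithms) could look as follows; the representation of intervals as (lo, hi) pairs is an assumption made only for this sketch.

def discriminates(points, chosen_intervals):
    """points: list of numbers; chosen_intervals: list of (lo, hi) pairs.
    True iff every point is covered by a non-empty set of chosen intervals
    and no two points are covered by exactly the same set."""
    signatures = []
    for p in points:
        sig = frozenset(i for i, (lo, hi) in enumerate(chosen_intervals)
                        if lo <= p <= hi)
        if not sig:                                  # S_i^* must be non-empty
            return False
        signatures.append(sig)
    return len(set(signatures)) == len(points)       # all S_i^* pairwise distinct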

Cite as

Sanjana Dey, Florent Foucaud, Subhas C. Nandy, and Arunabha Sen. Discriminating Codes in Geometric Setups. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 24:1-24:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{dey_et_al:LIPIcs.ISAAC.2020.24,
  author =	{Dey, Sanjana and Foucaud, Florent and Nandy, Subhas C. and Sen, Arunabha},
  title =	{{Discriminating Codes in Geometric Setups}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{24:1--24:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.24},
  URN =		{urn:nbn:de:0030-drops-133686},
  doi =		{10.4230/LIPIcs.ISAAC.2020.24},
  annote =	{Keywords: Discriminating code, Approximation algorithm, Segment stabbing, Geometric Hitting set}
}
Document
Distance Oracles for Interval Graphs via Breadth-First Rank/Select in Succinct Trees

Authors: Meng He, J. Ian Munro, Yakov Nekrich, Sebastian Wild, and Kaiyu Wu


Abstract
We present the first succinct distance oracles for (unweighted) interval graphs and related classes of graphs, using a novel succinct data structure for ordinal trees that supports the mapping between preorder (i.e., depth-first) ranks and level-order (breadth-first) ranks of nodes in constant time. Our distance oracles for interval graphs also support navigation queries - testing adjacency, computing node degrees, neighborhoods, and shortest paths - all in optimal time. Our technique also yields optimal distance oracles for proper interval graphs (unit-interval graphs) and circular-arc graphs. Our tree data structure supports all operations provided by different approaches in previous work, as well as mapping to and from level-order ranks and retrieving the last (first) internal node before (after) a given node in a level-order traversal, all in constant time.

Cite as

Meng He, J. Ian Munro, Yakov Nekrich, Sebastian Wild, and Kaiyu Wu. Distance Oracles for Interval Graphs via Breadth-First Rank/Select in Succinct Trees. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 25:1-25:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{he_et_al:LIPIcs.ISAAC.2020.25,
  author =	{He, Meng and Munro, J. Ian and Nekrich, Yakov and Wild, Sebastian and Wu, Kaiyu},
  title =	{{Distance Oracles for Interval Graphs via Breadth-First Rank/Select in Succinct Trees}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{25:1--25:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.25},
  URN =		{urn:nbn:de:0030-drops-133693},
  doi =		{10.4230/LIPIcs.ISAAC.2020.25},
  annote =	{Keywords: succinct data structures, distance oracles, ordinal tree, level order, breadth-first order, interval graphs, proper interval graphs, succinct graph representation}
}
Document
Diverse Pairs of Matchings

Authors: Fedor V. Fomin, Petr A. Golovach, Lars Jaffke, Geevarghese Philip, and Danil Sagunov


Abstract
We initiate the study of the Diverse Pair of (Maximum/Perfect) Matchings problems which, given a graph G and an integer k, ask whether G has two (maximum/perfect) matchings whose symmetric difference is at least k. Diverse Pair of Matchings (asking for two not necessarily maximum or perfect matchings) is NP-complete on general graphs if k is part of the input, and we consider two restricted variants. First, we show that on bipartite graphs the problem is polynomial-time solvable, and second, we show that Diverse Pair of Maximum Matchings is FPT parameterized by k. We round off the work by showing that Diverse Pair of Matchings has a kernel on 𝒪(k²) vertices.
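
A minimal Python sketch of the diversity measure used above (an illustration, not the paper's algorithms): edges are assumed to be represented as frozensets of two vertices so that the symmetric difference is orientation-independent.

def is_matching(edges):
    """edges: iterable of frozenset({u, v}). True iff no vertex is used twice."""
    used = set()
    for e in edges:
        if e & used:
            return False
        used |= e
    return True

def diverse_pair(m1, m2, k):
    """True iff m1 and m2 are matchings whose symmetric difference has size >= k."""
    return is_matching(m1) and is_matching(m2) and len(set(m1) ^ set(m2)) >= k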

Cite as

Fedor V. Fomin, Petr A. Golovach, Lars Jaffke, Geevarghese Philip, and Danil Sagunov. Diverse Pairs of Matchings. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 26:1-26:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{fomin_et_al:LIPIcs.ISAAC.2020.26,
  author =	{Fomin, Fedor V. and Golovach, Petr A. and Jaffke, Lars and Philip, Geevarghese and Sagunov, Danil},
  title =	{{Diverse Pairs of Matchings}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{26:1--26:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.26},
  URN =		{urn:nbn:de:0030-drops-133706},
  doi =		{10.4230/LIPIcs.ISAAC.2020.26},
  annote =	{Keywords: Matching, Solution Diversity, Fixed-Parameter Tractability}
}
Document
Efficient Labeling for Reachability in Directed Acyclic Graphs

Authors: Maciej Dulęba, Paweł Gawrychowski, and Wojciech Janczewski


Abstract
We consider labeling the nodes of a directed graph for reachability queries. A reachability labeling scheme for such a graph assigns a binary string, called a label, to each node. Then, given the labels of nodes u and v and no other information about the underlying graph, it should be possible to determine whether there exists a directed path from u to v. By a simple information-theoretic argument invoking the bound on the number of partial orders, in any scheme some labels need to consist of at least n/4 bits, where n is the number of nodes. On the other hand, it is not hard to design a scheme with labels consisting of n/2+𝒪(log n) bits. In the classical centralised setting, where a single data structure is stored as a whole, Munro and Nicholson designed a structure for reachability queries consisting of n²/4+o(n²) bits (which is optimal, up to the lower-order term). We extend their approach to obtain a scheme with labels consisting of n/3+o(n) bits.
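
To illustrate what a reachability labeling scheme is (a hedged sketch, not the n/3+o(n)-bit construction of the paper), the following Python fragment assigns each node the trivial label consisting of its index and the full bit vector of nodes reachable from it, so that a query is answered from the two labels alone.

def build_labels(nodes, edges):
    """Trivial reachability labeling: label(u) = (index of u, bit vector of all
    nodes reachable from u). Roughly n bits per label; the paper's point is
    that about n/3 bits suffice, which this sketch does not attempt."""
    index = {v: i for i, v in enumerate(nodes)}
    adj = {v: [] for v in nodes}
    for u, v in edges:
        adj[u].append(v)
    labels = {}
    for s in nodes:
        reach, stack = {s}, [s]
        while stack:                                 # DFS from s
            for w in adj[stack.pop()]:
                if w not in reach:
                    reach.add(w)
                    stack.append(w)
        labels[s] = (index[s], [1 if v in reach else 0 for v in nodes])
    return labels

def reachable(label_u, label_v):
    """Decide reachability u -> v using only the two labels."""
    _, bits_u = label_u
    idx_v, _ = label_v
    return bits_u[idx_v] == 1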

Cite as

Maciej Dulęba, Paweł Gawrychowski, and Wojciech Janczewski. Efficient Labeling for Reachability in Directed Acyclic Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 27:1-27:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{duleba_et_al:LIPIcs.ISAAC.2020.27,
  author =	{Dul\k{e}ba, Maciej and Gawrychowski, Pawe{\l} and Janczewski, Wojciech},
  title =	{{Efficient Labeling for Reachability in Directed Acyclic Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{27:1--27:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.27},
  URN =		{urn:nbn:de:0030-drops-133710},
  doi =		{10.4230/LIPIcs.ISAAC.2020.27},
  annote =	{Keywords: informative labeling scheme, reachability, DAG}
}
Document
Efficiently Computing All Delaunay Triangles Occurring over All Contiguous Subsequences

Authors: Stefan Funke and Felix Weitbrecht


Abstract
Given an ordered sequence of points P = {p₁, p₂, … , p_n}, we are interested in computing T, the set of distinct triangles occurring over all Delaunay triangulations of contiguous subsequences within P. We present a deterministic algorithm for this purpose with near-optimal time complexity O(|T|log n). Additionally, we prove that for an arbitrary point set in random order, the expected number of Delaunay triangles occurring over all contiguous subsequences is Θ(nlog n).
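
A brute-force baseline for the set T defined above (a hedged sketch, not the near-optimal algorithm from the paper) can be written with scipy, assuming the points are in general position so that each triangulation is well defined; it triangulates all Θ(n²) contiguous subsequences from scratch.

import numpy as np
from scipy.spatial import Delaunay

def all_delaunay_triangles(points):
    """points: list of (x, y) in general position. Return the set of distinct
    triangles (frozensets of indices into points) occurring in the Delaunay
    triangulation of every contiguous subsequence of length >= 3."""
    n = len(points)
    triangles = set()
    for i in range(n):
        for j in range(i + 3, n + 1):                # subsequence points[i:j]
            tri = Delaunay(np.array(points[i:j]))
            for simplex in tri.simplices:
                triangles.add(frozenset(i + k for k in simplex))
    return triangles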

Cite as

Stefan Funke and Felix Weitbrecht. Efficiently Computing All Delaunay Triangles Occurring over All Contiguous Subsequences. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 28:1-28:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{funke_et_al:LIPIcs.ISAAC.2020.28,
  author =	{Funke, Stefan and Weitbrecht, Felix},
  title =	{{Efficiently Computing All Delaunay Triangles Occurring over All Contiguous Subsequences}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{28:1--28:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.28},
  URN =		{urn:nbn:de:0030-drops-133725},
  doi =		{10.4230/LIPIcs.ISAAC.2020.28},
  annote =	{Keywords: Computational Geometry, Delaunay Triangulation, Randomized Analysis}
}
Document
Enumerating Range Modes

Authors: Kentaro Sumigawa, Sankardeep Chakraborty, Kunihiko Sadakane, and Srinivasa Rao Satti


Abstract
Given a sequence of elements, we consider the problem of indexing the sequence to support range mode queries - given a query range, find the element with maximum frequency in the range. We give indexing data structures for this problem; given a sequence, we construct a data structure that can be used later to process arbitrary queries. Our algorithms are efficient when the maximum frequency is small. We also consider a natural generalization of the problem: the range mode enumeration problem, for which no efficient algorithms were previously known. Our algorithms have query time complexities that are linear in the output size plus small terms.
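
A naive Python baseline for the two query types discussed above (linear time per query, illustration only; the paper's indexes answer queries much faster after preprocessing):

from collections import Counter

def range_mode(seq, left, right):
    """Most frequent element in seq[left:right+1], ties broken arbitrarily."""
    counts = Counter(seq[left:right + 1])
    return max(counts, key=counts.get)

def range_mode_enumerate(seq, left, right):
    """All elements attaining the maximum frequency in seq[left:right+1]."""
    counts = Counter(seq[left:right + 1])
    best = max(counts.values())
    return [x for x, c in counts.items() if c == best]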

Cite as

Kentaro Sumigawa, Sankardeep Chakraborty, Kunihiko Sadakane, and Srinivasa Rao Satti. Enumerating Range Modes. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 29:1-29:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{sumigawa_et_al:LIPIcs.ISAAC.2020.29,
  author =	{Sumigawa, Kentaro and Chakraborty, Sankardeep and Sadakane, Kunihiko and Satti, Srinivasa Rao},
  title =	{{Enumerating Range Modes}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{29:1--29:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.29},
  URN =		{urn:nbn:de:0030-drops-133732},
  doi =		{10.4230/LIPIcs.ISAAC.2020.29},
  annote =	{Keywords: range mode, space-efficient data structure, enumeration algorithm}
}
Document
Finding Temporal Paths Under Waiting Time Constraints

Authors: Arnaud Casteigts, Anne-Sophie Himmel, Hendrik Molter, and Philipp Zschoche


Abstract
Computing a (short) path between two vertices is one of the most fundamental primitives in graph algorithmics. In recent years, the study of paths in temporal graphs, that is, graphs where the vertex set is fixed but the edge set changes over time, gained more and more attention. A path is time-respecting, or temporal, if it uses edges with non-decreasing time stamps. We investigate a basic constraint for temporal paths, where the time spent at each vertex must not exceed a given duration Δ, referred to as Δ-restless temporal paths. This constraint arises naturally in the modeling of real-world processes like packet routing in communication networks and infection transmission routes of diseases where recovery confers lasting resistance. While finding temporal paths without waiting time restrictions is known to be doable in polynomial time, we show that the "restless variant" of this problem becomes computationally hard even in very restrictive settings. For example, it is W[1]-hard when parameterized by the feedback vertex number or the pathwidth of the underlying graph. The main question thus is whether the problem becomes tractable in some natural settings. We explore several natural parameterizations, presenting FPT algorithms for three kinds of parameters: (1) output-related parameters (here, the maximum length of the path), (2) classical parameters applied to the underlying graph (e.g., feedback edge number), and (3) a new parameter called timed feedback vertex number, which captures finer-grained temporal features of the input temporal graph, and which may be of interest beyond this work.
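
To fix the definition of a Δ-restless temporal path, here is a small hedged Python verifier (not one of the paper's algorithms) for a non-empty sequence of time-stamped edges; the (u, v, t) triple representation is an assumption made for this sketch.

def is_restless_temporal_path(walk, delta):
    """walk: non-empty list of time-stamped edges (u, v, t). True iff the edges
    are consecutive, no vertex repeats, time stamps are non-decreasing, and the
    waiting time at every intermediate vertex is at most delta."""
    vertices = [walk[0][0]] + [v for _, v, _ in walk]
    if len(set(vertices)) != len(vertices):          # must be a path
        return False
    for (u1, v1, t1), (u2, v2, t2) in zip(walk, walk[1:]):
        if v1 != u2:                                 # edges must be consecutive
            return False
        if t2 < t1 or t2 - t1 > delta:               # non-decreasing and restless
            return False
    return True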

Cite as

Arnaud Casteigts, Anne-Sophie Himmel, Hendrik Molter, and Philipp Zschoche. Finding Temporal Paths Under Waiting Time Constraints. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 30:1-30:18, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{casteigts_et_al:LIPIcs.ISAAC.2020.30,
  author =	{Casteigts, Arnaud and Himmel, Anne-Sophie and Molter, Hendrik and Zschoche, Philipp},
  title =	{{Finding Temporal Paths Under Waiting Time Constraints}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{30:1--30:18},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.30},
  URN =		{urn:nbn:de:0030-drops-133745},
  doi =		{10.4230/LIPIcs.ISAAC.2020.30},
  annote =	{Keywords: Temporal graphs, disease spreading, waiting-time policies, restless temporal paths, timed feedback vertex set, NP-hard problems, parameterized algorithms}
}
Document
Flexible List Colorings in Graphs with Special Degeneracy Conditions

Authors: Peter Bradshaw, Tomáš Masařík, and Ladislav Stacho


Abstract
For a given ε > 0, we say that a graph G is ε-flexibly k-choosable if the following holds: for any assignment L of lists of size k on V(G), if a preferred color is requested at any set R of vertices, then at least ε |R| of these requests are satisfied by some L-coloring. We consider flexible list colorings in several graph classes with certain degeneracy conditions. We characterize the graphs of maximum degree Δ that are ε-flexibly Δ-choosable for some ε = ε(Δ) > 0, which answers a question of Dvořák, Norin, and Postle [List coloring with requests, JGT 2019]. We also show that graphs of treewidth 2 are 1/3-flexibly 3-choosable, answering a question of Choi et al. [arXiv 2020], and we give conditions for list assignments by which graphs of treewidth k are 1/(k+1)-flexibly (k+1)-choosable. We show furthermore that graphs of treedepth k are 1/k-flexibly k-choosable. Finally, we introduce a notion of flexible degeneracy, which strengthens flexible choosability, and we show that apart from a well-understood class of exceptions, 3-connected non-regular graphs of maximum degree Δ are flexibly (Δ - 1)-degenerate.

Cite as

Peter Bradshaw, Tomáš Masařík, and Ladislav Stacho. Flexible List Colorings in Graphs with Special Degeneracy Conditions. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 31:1-31:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{bradshaw_et_al:LIPIcs.ISAAC.2020.31,
  author =	{Bradshaw, Peter and Masa\v{r}{\'i}k, Tom\'{a}\v{s} and Stacho, Ladislav},
  title =	{{Flexible List Colorings in Graphs with Special Degeneracy Conditions}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{31:1--31:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.31},
  URN =		{urn:nbn:de:0030-drops-133750},
  doi =		{10.4230/LIPIcs.ISAAC.2020.31},
  annote =	{Keywords: Flexibility, List Coloring, Choosability, Degeneracy}
}
Document
Geometric Pattern Matching Reduces to k-SUM

Authors: Boris Aronov and Jean Cardinal


Abstract
We prove that some exact geometric pattern matching problems reduce in linear time to k-SUM when the pattern has a fixed size k. This holds in the real RAM model for searching for a similar copy of a set of k ≥ 3 points within a set of n points in the plane, and for searching for an affine image of a set of k ≥ d+2 points within a set of n points in d-space. As corollaries, we obtain improved real RAM algorithms and decision trees for the two problems. In particular, they can be solved by algebraic decision trees of near-linear height.

Cite as

Boris Aronov and Jean Cardinal. Geometric Pattern Matching Reduces to k-SUM. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 32:1-32:9, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{aronov_et_al:LIPIcs.ISAAC.2020.32,
  author =	{Aronov, Boris and Cardinal, Jean},
  title =	{{Geometric Pattern Matching Reduces to k-SUM}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{32:1--32:9},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.32},
  URN =		{urn:nbn:de:0030-drops-133760},
  doi =		{10.4230/LIPIcs.ISAAC.2020.32},
  annote =	{Keywords: Geometric pattern matching, k-SUM problem, Linear decision trees}
}
Document
Gourds: A Sliding-Block Puzzle with Turning

Authors: Joep Hamersma, Marc van Kreveld, Yushi Uno, and Tom C. van der Zanden


Abstract
We propose a new kind of sliding-block puzzle, called Gourds, where the objective is to rearrange 1×2 pieces on a hexagonal grid board of 2n+1 cells with n pieces, using sliding, turning and pivoting moves. This puzzle has a single empty cell on the board and forms a natural extension of the 15-puzzle to include rotational moves. We analyze the puzzle and completely characterize the cases in which the puzzle can always be solved. We also study the complexity of determining whether a given set of colored pieces can be placed on a colored hexagonal grid board with matching colors. We show this problem is NP-complete for arbitrarily many colors, but solvable in randomized polynomial time if the number of colors is a fixed constant.

Cite as

Joep Hamersma, Marc van Kreveld, Yushi Uno, and Tom C. van der Zanden. Gourds: A Sliding-Block Puzzle with Turning. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 33:1-33:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{hamersma_et_al:LIPIcs.ISAAC.2020.33,
  author =	{Hamersma, Joep and van Kreveld, Marc and Uno, Yushi and van der Zanden, Tom C.},
  title =	{{Gourds: A Sliding-Block Puzzle with Turning}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{33:1--33:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.33},
  URN =		{urn:nbn:de:0030-drops-133773},
  doi =		{10.4230/LIPIcs.ISAAC.2020.33},
  annote =	{Keywords: computational complexity, divide-and-conquer, Hamiltonian cycle, puzzle game, (combinatorial) reconfiguration, sliding-block puzzle}
}
Document
Improved FPT Algorithms for Deletion to Forest-Like Structures

Authors: Kishen N. Gowda, Aditya Lonkar, Fahad Panolan, Vraj Patel, and Saket Saurabh


Abstract
The Feedback Vertex Set problem is undoubtedly one of the most well-studied problems in Parameterized Complexity. In this problem, given an undirected graph G and a non-negative integer k, the objective is to test whether there exists a subset S ⊆ V(G) of size at most k such that G-S is a forest. After a long line of improvements, Li and Nederlof [SODA, 2020] recently designed a randomized algorithm for the problem running in time 𝒪^⋆(2.7^k). In the Parameterized Complexity literature, several problems around Feedback Vertex Set have been studied. Some of these include Independent Feedback Vertex Set (where the set S should be an independent set in G), Almost Forest Deletion and Pseudoforest Deletion. In Pseudoforest Deletion, each connected component in G-S has at most one cycle in it. In Almost Forest Deletion, the input is a graph G and non-negative integers k,𝓁 ∈ ℕ, and the objective is to test whether there exists a vertex subset S of size at most k such that G-S is 𝓁 edges away from a forest. In this paper, using the methodology of Li and Nederlof [SODA, 2020], we obtain the current fastest algorithms for all these problems. In particular, we obtain the following randomized algorithms. 1) Independent Feedback Vertex Set can be solved in time 𝒪^⋆(2.7^k). 2) Pseudoforest Deletion can be solved in time 𝒪^⋆(2.85^k). 3) Almost Forest Deletion can be solved in time 𝒪^⋆(min{2.85^k ⋅ 8.54^𝓁, 2.7^k ⋅ 36.61^𝓁, 3^k ⋅ 1.78^𝓁}).

Cite as

Kishen N. Gowda, Aditya Lonkar, Fahad Panolan, Vraj Patel, and Saket Saurabh. Improved FPT Algorithms for Deletion to Forest-Like Structures. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 34:1-34:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{gowda_et_al:LIPIcs.ISAAC.2020.34,
  author =	{Gowda, Kishen N. and Lonkar, Aditya and Panolan, Fahad and Patel, Vraj and Saurabh, Saket},
  title =	{{Improved FPT Algorithms for Deletion to Forest-Like Structures}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{34:1--34:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.34},
  URN =		{urn:nbn:de:0030-drops-133781},
  doi =		{10.4230/LIPIcs.ISAAC.2020.34},
  annote =	{Keywords: Parameterized Complexity, Independent Feedback Vertex Set, PseudoForest, Almost Forest, Cut and Count, Treewidth}
}
Document
Indexing Isodirectional Pointer Sequences

Authors: Sung-Hwan Kim and Hwan-Gue Cho


Abstract
Many sequential and temporal data have dependency relationships among their elements, which can be represented as a sequence of pointers. In this paper, we introduce a new string matching problem on a particular type of strings, which we call isodirectional pointer sequences, in which each entry has a pointer to another entry. The proposed problem is not only a formalization of real-world dependency matching problems, but also a generalization of variants of the string matching problem such as parameterized pattern matching and Cartesian tree matching. We present a 2n lg σ+2n+o(n)-bit index that preprocesses the text T[1:n] so as to count the number of occurrences of a pattern P[1:m] in 𝒪(m lg σ) time, where σ is the number of distinct lengths of pointers in T. Our index is also easily implementable in practice because it consists of wavelet trees and a range maximum query index, which are widely used building blocks in many other compact data structures. By compressing the wavelet trees, the index can also be stored in 2nH^*₀(T)+2n+o(n) bits, where H^*₀(T) is the 0-th order empirical entropy of the distribution of pointer lengths of T.

Cite as

Sung-Hwan Kim and Hwan-Gue Cho. Indexing Isodirectional Pointer Sequences. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 35:1-35:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{kim_et_al:LIPIcs.ISAAC.2020.35,
  author =	{Kim, Sung-Hwan and Cho, Hwan-Gue},
  title =	{{Indexing Isodirectional Pointer Sequences}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{35:1--35:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.35},
  URN =		{urn:nbn:de:0030-drops-133797},
  doi =		{10.4230/LIPIcs.ISAAC.2020.35},
  annote =	{Keywords: String Matching, Suffix Array, FM-index, Wavelet Tree, Range Minimum Query, Parameterized String Matching, Cartesian Tree Matching}
}
Document
Length-Bounded Cuts: Proper Interval Graphs and Structural Parameters

Authors: Matthias Bentert, Klaus Heeger, and Dušan Knop


Abstract
In this paper, we study the Length-Bounded Cut problem for special graph classes as well as from a parameterized-complexity viewpoint. Here, we are given a graph G, two vertices s and t, and positive integers β and λ. The task is to find a set F of edges of size at most β such that every s-t-path of length at most λ in G contains some edge in F. Bazgan et al. [Networks, 2019] conjectured that Length-Bounded Cut admits a polynomial-time algorithm if the input graph G is a proper interval graph. We confirm this conjecture by providing a dynamic-programming-based polynomial-time algorithm. Moreover, we strengthen the W[1]-hardness result of Dvořák and Knop [Algorithmica, 2018] for Length-Bounded Cut parameterized by pathwidth. Our reduction is shorter, and the target of the reduction has stronger structural properties. Consequently, we establish W[1]-hardness for the combined parameter pathwidth and maximum degree of the input graph. Finally, we prove that Length-Bounded Cut is W[1]-hard for the parameter feedback vertex number. Both of our hardness results complement known XP algorithms.
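
A simple Python verifier for the Length-Bounded Cut definition above (a hedged sketch, not the dynamic program from the paper): removing F must leave no s-t path of length at most λ, which is equivalent to the BFS distance from s to t in G-F exceeding λ.

from collections import deque

def is_length_bounded_cut(n, edges, s, t, F, lam):
    """Undirected graph on vertices 0..n-1 given as an edge list. True iff
    removing the edge set F leaves no s-t path of length at most lam."""
    removed = {frozenset(e) for e in F}
    adj = [[] for _ in range(n)]
    for u, v in edges:
        if frozenset((u, v)) not in removed:
            adj[u].append(v)
            adj[v].append(u)
    dist = [None] * n
    dist[s] = 0
    queue = deque([s])
    while queue:                                     # BFS in G - F
        u = queue.popleft()
        for w in adj[u]:
            if dist[w] is None:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist[t] is None or dist[t] > lam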

Cite as

Matthias Bentert, Klaus Heeger, and Dušan Knop. Length-Bounded Cuts: Proper Interval Graphs and Structural Parameters. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 36:1-36:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{bentert_et_al:LIPIcs.ISAAC.2020.36,
  author =	{Bentert, Matthias and Heeger, Klaus and Knop, Du\v{s}an},
  title =	{{Length-Bounded Cuts: Proper Interval Graphs and Structural Parameters}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{36:1--36:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.36},
  URN =		{urn:nbn:de:0030-drops-133800},
  doi =		{10.4230/LIPIcs.ISAAC.2020.36},
  annote =	{Keywords: Edge-disjoint paths, pathwidth, feedback vertex number}
}
Document
Linear Transformations Between Dominating Sets in the TAR-Model

Authors: Nicolas Bousquet, Alice Joffard, and Paul Ouvrard


Abstract
Given a graph G and an integer k, a token addition and removal (TAR for short) reconfiguration sequence between two dominating sets D_s and D_t of size at most k is a sequence S = ⟨ D₀ = D_s, D₁, …, D_𝓁 = D_t ⟩ of dominating sets of G such that any two consecutive dominating sets differ by the addition or deletion of one vertex, and no dominating set has size bigger than k. We first improve a result of Haas and Seyffarth [R. Haas and K. Seyffarth, 2017] by showing that if k = Γ(G)+α(G)-1 (where Γ(G) is the maximum size of a minimal dominating set and α(G) the maximum size of an independent set), then there exists a linear TAR reconfiguration sequence between any pair of dominating sets. We then improve these results on several graph classes by showing that the same holds for K_𝓁-minor-free graphs as long as k ≥ Γ(G)+O(𝓁 √(log 𝓁)), and for planar graphs whenever k ≥ Γ(G)+3. Finally, we show that if k = Γ(G)+tw(G)+1, then there also exists a linear transformation between any pair of dominating sets.
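
To make the TAR model concrete, here is a small hedged Python verifier (illustration only, not the paper's constructions) that a candidate sequence of vertex sets is a valid TAR reconfiguration sequence for threshold k; the adjacency-dict graph representation is an assumption of this sketch.

def is_dominating_set(adj, D):
    """adj: dict vertex -> set of neighbours. D dominates iff every vertex is
    in D or has a neighbour in D."""
    return all(v in D or adj[v] & D for v in adj)

def is_tar_sequence(adj, sequence, k):
    """sequence: list of vertex sets. True iff every set is a dominating set of
    size at most k and consecutive sets differ by adding or removing one vertex."""
    if not all(is_dominating_set(adj, D) and len(D) <= k for D in sequence):
        return False
    return all(len(a ^ b) == 1 for a, b in zip(sequence, sequence[1:]))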

Cite as

Nicolas Bousquet, Alice Joffard, and Paul Ouvrard. Linear Transformations Between Dominating Sets in the TAR-Model. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 37:1-37:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{bousquet_et_al:LIPIcs.ISAAC.2020.37,
  author =	{Bousquet, Nicolas and Joffard, Alice and Ouvrard, Paul},
  title =	{{Linear Transformations Between Dominating Sets in the TAR-Model}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{37:1--37:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.37},
  URN =		{urn:nbn:de:0030-drops-133812},
  doi =		{10.4230/LIPIcs.ISAAC.2020.37},
  annote =	{Keywords: reconfiguration, dominating sets, addition removal, connectivity, diameter, minor, treewidth}
}
Document
Linear-Time Algorithms for Computing Twinless Strong Articulation Points and Related Problems

Authors: Loukas Georgiadis and Evangelos Kosinas


Abstract
A directed graph G = (V,E) is twinless strongly connected if it contains a strongly connected spanning subgraph without any pair of antiparallel (or twin) edges. The twinless strongly connected components (TSCCs) of a directed graph G are its maximal twinless strongly connected subgraphs. These concepts have several diverse applications, such as the design of telecommunication networks and the structural stability of buildings. A vertex v ∈ V is a twinless strong articulation point of G, if the deletion of v increases the number of TSCCs of G. Here, we present the first linear-time algorithm that finds all the twinless strong articulation points of a directed graph. We show that the computation of twinless strong articulation points reduces to the following problem in undirected graphs, which may be of independent interest: Given a 2-vertex-connected undirected graph H, find all vertices v for which there exists an edge e such that H⧵{v,e} is not connected. We develop a linear-time algorithm that not only finds all such vertices v, but also computes the number of edges e such that H⧵{v,e} is not connected. This also implies that for each twinless strong articulation point v which is not a strong articulation point in a strongly connected digraph G, we can compute the number of TSCCs in G⧵v.

Cite as

Loukas Georgiadis and Evangelos Kosinas. Linear-Time Algorithms for Computing Twinless Strong Articulation Points and Related Problems. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 38:1-38:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{georgiadis_et_al:LIPIcs.ISAAC.2020.38,
  author =	{Georgiadis, Loukas and Kosinas, Evangelos},
  title =	{{Linear-Time Algorithms for Computing Twinless Strong Articulation Points and Related Problems}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{38:1--38:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.38},
  URN =		{urn:nbn:de:0030-drops-133820},
  doi =		{10.4230/LIPIcs.ISAAC.2020.38},
  annote =	{Keywords: 2-connectivity, cut pairs, strongly connected components}
}
Document
Market Pricing for Matroid Rank Valuations

Authors: Kristóf Bérczi, Naonori Kakimura, and Yusuke Kobayashi


Abstract
In this paper, we study the problem of maximizing social welfare in combinatorial markets through pricing schemes. We consider the existence of prices that are capable of achieving optimal social welfare without a central tie-breaking coordinator. In the case of two buyers with matroid rank valuations, we give polynomial-time algorithms that always find such prices when one of the matroids is a simple partition matroid or both matroids are strongly base orderable. This result partially answers a question raised by Dütting and Végh in 2017. We further formalize a weighted variant of the conjecture of Dütting and Végh, and show that the weighted variant can be reduced to the unweighted one based on the weight-splitting theorem for weighted matroid intersection by Frank. We also show that a similar reduction technique works for M^♮-concave functions, or equivalently, for gross substitutes functions.

Cite as

Kristóf Bérczi, Naonori Kakimura, and Yusuke Kobayashi. Market Pricing for Matroid Rank Valuations. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 39:1-39:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{berczi_et_al:LIPIcs.ISAAC.2020.39,
  author =	{B\'{e}rczi, Krist\'{o}f and Kakimura, Naonori and Kobayashi, Yusuke},
  title =	{{Market Pricing for Matroid Rank Valuations}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{39:1--39:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.39},
  URN =		{urn:nbn:de:0030-drops-133833},
  doi =		{10.4230/LIPIcs.ISAAC.2020.39},
  annote =	{Keywords: Pricing schemes, Walrasian equilibrium, gross substitutes valuations, matroid rank functions}
}
Document
Minimization and Parameterized Variants of Vertex Partition Problems on Graphs

Authors: Yuma Tamura, Takehiro Ito, and Xiao Zhou


Abstract
Let Π₁, Π₂, …, Π_c be graph properties for a fixed integer c. Then, (Π₁, Π₂, …, Π_c)-Partition is the problem of asking whether the vertex set of a given graph can be partitioned into c subsets V₁, V₂, …, V_c such that the subgraph induced by V_i satisfies the graph property Π_i for every i ∈ {1,2, …, c}. Minimization and parameterized variants of (Π₁, Π₂, …, Π_c)-Partition have been studied for several specific graph properties, where the size of the vertex subset V₁ satisfying Π₁ is minimized or taken as a parameter. In this paper, we first show that the minimization variant is hard to approximate for any nontrivial additive hereditary graph properties, unless c = 2 and both Π₁ and Π₂ are classes of edgeless graphs. We then give FPT algorithms for the parameterized variant when restricted to the case where c = 2, Π₁ is a hereditary graph property, and Π₂ is the class of acyclic graphs.

Cite as

Yuma Tamura, Takehiro Ito, and Xiao Zhou. Minimization and Parameterized Variants of Vertex Partition Problems on Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 40:1-40:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{tamura_et_al:LIPIcs.ISAAC.2020.40,
  author =	{Tamura, Yuma and Ito, Takehiro and Zhou, Xiao},
  title =	{{Minimization and Parameterized Variants of Vertex Partition Problems on Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{40:1--40:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.40},
  URN =		{urn:nbn:de:0030-drops-133844},
  doi =		{10.4230/LIPIcs.ISAAC.2020.40},
  annote =	{Keywords: Graph Algorithms, Approximability, Fixed-Parameter Tractability, Vertex Partition Problem, Feedback Vertex Set Problem}
}
Document
Multicommodity Flows in Planar Graphs with Demands on Faces

Authors: Nikhil Kumar


Abstract
We consider the problem of multicommodity flows in planar graphs. Seymour [Seymour, 1981] showed that if the union of the supply and demand graphs is planar, then the cut condition is also sufficient for routing demands. Okamura and Seymour [Okamura and Seymour, 1981] showed that if the supply graph is planar and all demands are incident on one face, then again the cut condition is sufficient for routing demands. We consider a common generalization of these settings where the end points of each demand are on the same face of the planar graph. We show that if the source-sink pairs on each face of the graph are such that sources and sinks appear contiguously on the cycle bounding the face, then the flow-cut gap is at most 3. To prove this result, we introduce a notion of approximating demands on a face by a convex combination of laminar demands.

Cite as

Nikhil Kumar. Multicommodity Flows in Planar Graphs with Demands on Faces. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 41:1-41:11, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{kumar:LIPIcs.ISAAC.2020.41,
  author =	{Kumar, Nikhil},
  title =	{{Multicommodity Flows in Planar Graphs with Demands on Faces}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{41:1--41:11},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.41},
  URN =		{urn:nbn:de:0030-drops-133857},
  doi =		{10.4230/LIPIcs.ISAAC.2020.41},
  annote =	{Keywords: Combinatorial Optimization, Multicommodity Flow, Network Design}
}
Document
Multiparty Selection

Authors: Ke Chen and Adrian Dumitrescu


Abstract
Given a sequence A of n numbers and an integer (target) parameter 1 ≤ i ≤ n, the (exact) selection problem is that of finding the i-th smallest element in A. An element is said to be (i,j)-mediocre if it is neither among the top i nor among the bottom j elements of A. The approximate selection problem is that of finding an (i,j)-mediocre element for some given i,j; as such, this variant allows the algorithm to return any element in a prescribed range. In the first part, we revisit the selection problem in the two-party model introduced by Andrew Yao (1979) and then extend our study of exact selection to the multiparty model. In the second part, we deduce some communication complexity benefits that arise in approximate selection. In particular, we present a deterministic protocol for finding an approximate median among k players.
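
A short Python check of (i,j)-mediocrity (a hedged illustration, assuming distinct values in A; not one of the paper's protocols):

def is_mediocre(A, x, i, j):
    """True iff x is neither among the i largest nor among the j smallest
    elements of A (distinct values assumed for simplicity)."""
    larger = sum(1 for a in A if a > x)
    smaller = sum(1 for a in A if a < x)
    return larger >= i and smaller >= j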

Cite as

Ke Chen and Adrian Dumitrescu. Multiparty Selection. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 42:1-42:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{chen_et_al:LIPIcs.ISAAC.2020.42,
  author =	{Chen, Ke and Dumitrescu, Adrian},
  title =	{{Multiparty Selection}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{42:1--42:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.42},
  URN =		{urn:nbn:de:0030-drops-133867},
  doi =		{10.4230/LIPIcs.ISAAC.2020.42},
  annote =	{Keywords: approximate selection, mediocre element, comparison algorithm, i-th order statistic, tournaments, quantiles, communication complexity}
}
Document
Multistage s-t Path: Confronting Similarity with Dissimilarity in Temporal Graphs

Authors: Till Fluschnik, Rolf Niedermeier, Carsten Schubert, and Philipp Zschoche


Abstract
Addressing a quest by Gupta et al. [ICALP'14], we provide a first, comprehensive study of finding a short s-t path in the multistage graph model, referred to as the Multistage s-t Path problem. Herein, given a sequence of graphs over the same vertex set but changing edge sets, the task is to find short s-t paths in each graph ("snapshot") such that in the found path sequence the consecutive s-t paths are "similar". We measure similarity by the size of the symmetric difference of either the vertex set (vertex-similarity) or the edge set (edge-similarity) of any two consecutive paths. We prove that these two variants of Multistage s-t Path are already NP-hard for an input sequence of only two graphs and maximum vertex degree four. Motivated by this fact and natural applications of this scenario, e.g. in traffic route planning, we perform a parameterized complexity analysis. Among other results, we prove parameterized hardness (W[1]-hardness) regarding the parameter path length (solution size) for both variants, vertex- and edge-similarity. As a further conceptual study, we then modify the multistage model by asking for dissimilar consecutive paths. One of our main technical results (employing so-called representative sets known from non-temporal settings) is that dissimilarity allows for fixed-parameter tractability for the parameter solution size, contrasting the W[1]-hardness of the corresponding similarity case. We also provide partially positive results concerning efficient and effective data reduction (kernelization).

Cite as

Till Fluschnik, Rolf Niedermeier, Carsten Schubert, and Philipp Zschoche. Multistage s-t Path: Confronting Similarity with Dissimilarity in Temporal Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 43:1-43:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{fluschnik_et_al:LIPIcs.ISAAC.2020.43,
  author =	{Fluschnik, Till and Niedermeier, Rolf and Schubert, Carsten and Zschoche, Philipp},
  title =	{{Multistage s-t Path: Confronting Similarity with Dissimilarity in Temporal Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{43:1--43:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.43},
  URN =		{urn:nbn:de:0030-drops-133879},
  doi =		{10.4230/LIPIcs.ISAAC.2020.43},
  annote =	{Keywords: Temporal graphs, shortest paths, consecutive similarity, consecutive dissimilarity, parameterized complexity, kernelization, representative sets in temporal graphs}
}
Document
On Girth and the Parameterized Complexity of Token Sliding and Token Jumping

Authors: Valentin Bartier, Nicolas Bousquet, Clément Dallard, Kyle Lomer, and Amer E. Mouawad


Abstract
In the Token Jumping problem we are given a graph G = (V,E) and two independent sets S and T of G, each of size k ≥ 1. The goal is to determine whether there exists a sequence 〈S_0, S_1, ..., S_𝓁〉 of k-sized independent sets in G such that for every i, |S_i| = k, S_i is an independent set, S = S_0, S_𝓁 = T, and |S_i Δ S_{i+1}| = 2. In other words, if we view each independent set as a collection of tokens placed on a subset of the vertices of G, then the problem asks for a sequence of independent sets which transforms S to T by individual token jumps that maintain the independence of the sets. This problem is known to be PSPACE-complete on very restricted graph classes, e.g., planar bounded-degree graphs and graphs of bounded bandwidth. A closely related problem is the Token Sliding problem, where instead of allowing a token to jump to any vertex of the graph we require that a token slides along an edge of the graph. Token Sliding is also known to be PSPACE-complete on the aforementioned graph classes. We investigate the parameterized complexity of both problems on several graph classes, focusing on the effect of excluding certain cycles from the input graph. In particular, we show that both Token Sliding and Token Jumping are fixed-parameter tractable on C_4-free bipartite graphs when parameterized by k. For Token Jumping, we in fact show that the problem admits a polynomial kernel on {C_3,C_4}-free graphs. In the case of Token Sliding, we also show that the problem admits a polynomial kernel on bipartite graphs of bounded degree. We believe both of these results to be of independent interest. We complement these positive results by showing that, for any constant p ≥ 4, both problems are W[1]-hard on {C_4, ..., C_p}-free graphs and that Token Sliding remains W[1]-hard even on bipartite graphs.
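
To fix the Token Jumping reconfiguration rule described above, here is a small hedged Python verifier (illustration only, not one of the paper's algorithms); the adjacency-dict graph representation is an assumption of this sketch.

def is_independent(adj, S):
    """adj: dict vertex -> set of neighbours (no self-loops assumed)."""
    return all(v not in adj[u] for u in S for v in S)

def is_token_jumping_sequence(adj, seq, k):
    """seq: list of vertex sets. True iff every set is an independent set of
    size k and consecutive sets differ by moving exactly one token, i.e. their
    symmetric difference has size 2."""
    if not all(is_independent(adj, S) and len(S) == k for S in seq):
        return False
    return all(len(a ^ b) == 2 for a, b in zip(seq, seq[1:]))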

Cite as

Valentin Bartier, Nicolas Bousquet, Clément Dallard, Kyle Lomer, and Amer E. Mouawad. On Girth and the Parameterized Complexity of Token Sliding and Token Jumping. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 44:1-44:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{bartier_et_al:LIPIcs.ISAAC.2020.44,
  author =	{Bartier, Valentin and Bousquet, Nicolas and Dallard, Cl\'{e}ment and Lomer, Kyle and Mouawad, Amer E.},
  title =	{{On Girth and the Parameterized Complexity of Token Sliding and Token Jumping}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{44:1--44:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.44},
  URN =		{urn:nbn:de:0030-drops-133886},
  doi =		{10.4230/LIPIcs.ISAAC.2020.44},
  annote =	{Keywords: Combinatorial reconfiguration, Independent Set, Token Jumping, Token Sliding, Parameterized Complexity}
}
Document
Online Primal-Dual Algorithms with Configuration Linear Programs

Authors: Nguyễn Kim Thắng


Abstract
In this paper, we present primal-dual algorithms for online problems with non-convex objectives. Problems with convex objectives have been extensively studied in recent years, with analyses relying crucially on convexity and Fenchel duality. Problems with non-convex objectives, however, resist current approaches, and non-convexity represents a strong barrier in optimization in general and in the design of online algorithms in particular. In our approach, we consider configuration linear programs with the multilinear extension of the objectives. We follow the multiplicative weight update framework, in which the novel ingredient is that the primal update is defined via the gradient of the multilinear extension. We introduce new notions, namely (local) smoothness, in order to characterize the competitive ratios of our algorithms. The approach leads to competitive algorithms for several problems with convex/non-convex objectives.

Cite as

Nguyễn Kim Thắng. Online Primal-Dual Algorithms with Configuration Linear Programs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 45:1-45:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{thang:LIPIcs.ISAAC.2020.45,
  author =	{Thắng, Nguy\~{ê}n Kim},
  title =	{{Online Primal-Dual Algorithms with Configuration Linear Programs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{45:1--45:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.45},
  URN =		{urn:nbn:de:0030-drops-133891},
  doi =		{10.4230/LIPIcs.ISAAC.2020.45},
  annote =	{Keywords: Configuration Linear Programs, Primal-Dual, Smoothness}
}
Document
Partial Function Extension with Applications to Learning and Property Testing

Authors: Umang Bhaskar and Gunjan Kumar


Abstract
Partial function extension is a basic problem that underpins multiple research topics in optimization, including learning, property testing, and game theory. Here, we are given a partial function consisting of n points from a domain and a function value at each point. Our objective is to determine whether this partial function can be extended to a function defined on the entire domain that additionally satisfies a given property, such as linearity. We formally study partial function extension for fundamental properties in combinatorial optimization - subadditivity, XOS, and matroid independence. A priori, it is not clear if partial function extension for these properties even lies in NP (or coNP). Our contributions are twofold. Firstly, for the properties studied, we give bounds on the complexity of partial function extension. For subadditivity and XOS, we give tight bounds on approximation guarantees as well. Secondly, we develop new connections between partial function extension and learning and property testing, and use these to give new results for these problems. In particular, for subadditive functions, we give improved lower bounds on learning, as well as the first subexponential-query tester.

Cite as

Umang Bhaskar and Gunjan Kumar. Partial Function Extension with Applications to Learning and Property Testing. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 46:1-46:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{bhaskar_et_al:LIPIcs.ISAAC.2020.46,
  author =	{Bhaskar, Umang and Kumar, Gunjan},
  title =	{{Partial Function Extension with Applications to Learning and Property Testing}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{46:1--46:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.46},
  URN =		{urn:nbn:de:0030-drops-133906},
  doi =		{10.4230/LIPIcs.ISAAC.2020.46},
  annote =	{Keywords: Partial function extension, subadditivity, matroid rank, approximation algorithms, learning, property testing}
}
Document
Quantum-Inspired Algorithms for Solving Low-Rank Linear Equation Systems with Logarithmic Dependence on the Dimension

Authors: Nai-Hui Chia, András Gilyén, Han-Hsuan Lin, Seth Lloyd, Ewin Tang, and Chunhao Wang


Abstract
We present two efficient classical analogues of the quantum matrix inversion algorithm [Harrow et al., 2009] for low-rank matrices. Inspired by recent work of Tang [Tang, 2019], and assuming length-square sampling access to the input data, we implement the pseudoinverse of a low-rank matrix, allowing us to sample from the solution to the problem Ax = b using fast sampling techniques. We construct implicit descriptions of the pseudoinverse by finding an approximate singular value decomposition of A via subsampling and then inverting the singular values. In principle, our approaches can also be used to apply any desired "smooth" function to the singular values. Since many quantum algorithms can be expressed as a singular value transformation problem [András Gilyén et al., 2019], our results indicate that more low-rank quantum algorithms can be effectively "dequantised" into classical length-square sampling algorithms.

Cite as

Nai-Hui Chia, András Gilyén, Han-Hsuan Lin, Seth Lloyd, Ewin Tang, and Chunhao Wang. Quantum-Inspired Algorithms for Solving Low-Rank Linear Equation Systems with Logarithmic Dependence on the Dimension. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 47:1-47:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{chia_et_al:LIPIcs.ISAAC.2020.47,
  author =	{Chia, Nai-Hui and Gily\'{e}n, Andr\'{a}s and Lin, Han-Hsuan and Lloyd, Seth and Tang, Ewin and Wang, Chunhao},
  title =	{{Quantum-Inspired Algorithms for Solving Low-Rank Linear Equation Systems with Logarithmic Dependence on the Dimension}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{47:1--47:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.47},
  URN =		{urn:nbn:de:0030-drops-133916},
  doi =		{10.4230/LIPIcs.ISAAC.2020.47},
  annote =	{Keywords: sublinear algorithms, quantum-inspired, regression, importance sampling, quantum machine learning}
}
Document
Random Access in Persistent Strings

Authors: Philip Bille and Inge Li Gørtz


Abstract
We consider compact representations of collections of similar strings that support random access queries. The collection of strings is given by a rooted tree where each edge is labeled by an edit operation (inserting, deleting, or replacing a character) and a node represents the string obtained by applying the sequence of edit operations on the path from the root to the node. The goal is to compactly represent the entire collection while supporting fast random access to any part of a string in the collection. This problem captures natural scenarios such as representing the past history of an edited document or representing highly repetitive collections. Given a tree with n nodes, we show how to represent the corresponding collection in O(n) space with optimal O(log n / log log n) query time. This improves the previous time-space trade-offs for the problem. To obtain our results, we introduce new techniques and ideas, including a reduction to a new geometric line-segment selection problem together with an efficient solution to it.
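
As a point of reference for the query bound, the following Python sketch (ours; the operation encoding is illustrative) shows the naive alternative of materializing a node's string by replaying the edit operations on the root-to-node path, whose cost grows with the length of that path rather than the O(log n / log log n) per access achieved by the data structure:

def materialize(root_string, edits):
    """Replay the edit operations on the path from the root to a node.
    Each edit is a tuple:
      ("insert",  i, c)  insert character c at position i
      ("delete",  i)     delete the character at position i
      ("replace", i, c)  replace the character at position i with c
    """
    s = list(root_string)
    for op in edits:
        if op[0] == "insert":
            _, i, c = op
            s.insert(i, c)
        elif op[0] == "delete":
            _, i = op
            del s[i]
        else:  # "replace"
            _, i, c = op
            s[i] = c
    return "".join(s)

print(materialize("banana", [("replace", 0, "c"), ("insert", 6, "s")]))  # cananas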

Cite as

Philip Bille and Inge Li Gørtz. Random Access in Persistent Strings. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 48:1-48:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{bille_et_al:LIPIcs.ISAAC.2020.48,
  author =	{Bille, Philip and G{\o}rtz, Inge Li},
  title =	{{Random Access in Persistent Strings}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{48:1--48:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.48},
  URN =		{urn:nbn:de:0030-drops-133922},
  doi =		{10.4230/LIPIcs.ISAAC.2020.48},
  annote =	{Keywords: Data compression, data structures, persistent strings}
}
Document
Recency Queries with Succinct Representation

Authors: William L. Holland, Anthony Wirth, and Justin Zobel


Abstract
In the context of the sliding-window set membership problem, and of caching policies that require knowledge of item recency, we formalize the problem of Recency on a stream. Informally, the query asks, "when was the last time I saw item x?" Existing structures, such as hash tables, can support a recency query by augmenting item occurrences with timestamps. To support recency queries on a window of W items, this might require Θ(W log W) bits. We propose a succinct data structure for Recency. By combining sliding-window dictionaries in a hierarchical structure, and through careful design of the underlying hash tables, we achieve a data structure that returns a (1+ε)-approximation to the recency of every item in O(log(εW)) time, using only (1+o(1))(1+ε)(ℬ + W log(ε^{-1})) bits. Here, ℬ is the information-theoretic lower bound on the number of bits needed to represent a set of size W from a universe of cardinality N.
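
The timestamp-augmented hash table mentioned above is easy to sketch. The following Python class (ours, purely illustrative, using one natural convention for what a recency query returns) is the Θ(W log W)-bit baseline that the succinct structure improves upon:

class NaiveRecency:
    """Store an explicit arrival time per item in the last W arrivals.
    recency(x) answers "how many items ago did I last see x?"; keeping a
    full counter per item is what costs Theta(W log W) bits."""
    def __init__(self, window):
        self.window = window
        self.clock = 0
        self.last_seen = {}  # item -> arrival time

    def insert(self, x):
        self.clock += 1
        self.last_seen[x] = self.clock
        # Evict items that fell out of the sliding window.
        for y, t in list(self.last_seen.items()):
            if self.clock - t >= self.window:
                del self.last_seen[y]

    def recency(self, x):
        t = self.last_seen.get(x)
        return None if t is None else self.clock - t

r = NaiveRecency(window=4)
for item in "abcab":
    r.insert(item)
print(r.recency("a"))  # 1: 'a' was the second-to-last arrival
print(r.recency("c"))  # 2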

Cite as

William L. Holland, Anthony Wirth, and Justin Zobel. Recency Queries with Succinct Representation. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 49:1-49:14, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{holland_et_al:LIPIcs.ISAAC.2020.49,
  author =	{Holland, William L. and Wirth, Anthony and Zobel, Justin},
  title =	{{Recency Queries with Succinct Representation}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{49:1--49:14},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.49},
  URN =		{urn:nbn:de:0030-drops-133931},
  doi =		{10.4230/LIPIcs.ISAAC.2020.49},
  annote =	{Keywords: Succinct Data Structures, Data Streams, Sliding Dictionary}
}
Document
Recursed Is Not Recursive: A Jarring Result

Authors: Erik D. Demaine, Justin Kopinsky, and Jayson Lynch


Abstract
Recursed is a 2D puzzle platform video game featuring "treasure chests" that, when jumped into, instantiate a room that can later be exited (similar to function calls), optionally generating a "jar" that returns back to that room (similar to continuations). We prove that Recursed is RE-complete and thus undecidable (not recursive) by a reduction from the Post Correspondence Problem. Our reduction is "practical": the reduction from PCP results in fully playable levels that abide by all constraints governing levels (including the 15 × 20 room size) designed for the main game. Our reduction is also "efficient": a Turing machine can be simulated by a Recursed level whose size is linear in the encoding size of the Turing machine and whose solution length is polynomial in the running time of the Turing machine.

Cite as

Erik D. Demaine, Justin Kopinsky, and Jayson Lynch. Recursed Is Not Recursive: A Jarring Result. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 50:1-50:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{demaine_et_al:LIPIcs.ISAAC.2020.50,
  author =	{Demaine, Erik D. and Kopinsky, Justin and Lynch, Jayson},
  title =	{{Recursed Is Not Recursive: A Jarring Result}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{50:1--50:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.50},
  URN =		{urn:nbn:de:0030-drops-133940},
  doi =		{10.4230/LIPIcs.ISAAC.2020.50},
  annote =	{Keywords: Computational Complexity, Undecidable, Video Games}
}
Document
Shared vs Private Randomness in Distributed Interactive Proofs

Authors: Pedro Montealegre, Diego Ramírez-Romero, and Ivan Rapaport


Abstract
In distributed interactive proofs, the nodes of a graph G interact with a powerful but untrusted prover who tries to convince them, in a small number of rounds and through short messages, that G satisfies some property. This series of interactions is followed by a phase of distributed verification, which may be either deterministic or randomized, where nodes exchange messages with their neighbors. The nature of this last verification round defines the two types of interactive protocols. We say that the protocol is of Arthur-Merlin type if the verification round is deterministic. We say that the protocol is of Merlin-Arthur type if, in the verification round, the nodes are allowed to use a fresh set of random bits. In the original model introduced by Kol, Oshman, and Saxena [PODC 2018], the randomness was private in the sense that each node had access only to an individual source of random coins. Crescenzi, Fraigniaud, and Paz [DISC 2019] initiated the study of the impact of shared randomness (the situation where the coin tosses are visible to all nodes) in the distributed interactive model. In this work, we continue this line of research by showing that the impact of the two forms of randomness is very different depending on whether we are considering Arthur-Merlin protocols or Merlin-Arthur protocols. While private randomness gives more power to the first type of protocols, shared randomness provides more power to the second. Our results also connect shared randomness in distributed interactive proofs with distributed verification, and yield new lower bounds.

Cite as

Pedro Montealegre, Diego Ramírez-Romero, and Ivan Rapaport. Shared vs Private Randomness in Distributed Interactive Proofs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 51:1-51:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{montealegre_et_al:LIPIcs.ISAAC.2020.51,
  author =	{Montealegre, Pedro and Ram{\'\i}rez-Romero, Diego and Rapaport, Ivan},
  title =	{{Shared vs Private Randomness in Distributed Interactive Proofs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{51:1--51:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.51},
  URN =		{urn:nbn:de:0030-drops-133959},
  doi =		{10.4230/LIPIcs.ISAAC.2020.51},
  annote =	{Keywords: Distributed interactive proofs, Distributed verification, Shared randomness, Private randomness}
}
Document
Shortest-Path Queries in Geometric Networks

Authors: Eunjin Oh


Abstract
A Euclidean t-spanner for a point set V ⊂ ℝ^d is a graph such that, for any two points p and q in V, the distance between p and q in the graph is at most t times the Euclidean distance between p and q. Gudmundsson et al. [TALG 2008] presented a data structure for answering ε-approximate distance queries in a Euclidean spanner in constant time, but it seems unlikely that one can report the path itself using this data structure. In this paper, we present a data structure of size O(nlog n) that answers ε-approximate shortest-path queries in time linear in the size of the output.

Cite as

Eunjin Oh. Shortest-Path Queries in Geometric Networks. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 52:1-52:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{oh:LIPIcs.ISAAC.2020.52,
  author =	{Oh, Eunjin},
  title =	{{Shortest-Path Queries in Geometric Networks}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{52:1--52:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.52},
  URN =		{urn:nbn:de:0030-drops-133963},
  doi =		{10.4230/LIPIcs.ISAAC.2020.52},
  annote =	{Keywords: Shortest path, Euclidean spanner, data structure}
}
Document
Signal Passing Self-Assembly Simulates Tile Automata

Authors: Angel A. Cantu, Austin Luchsinger, Robert Schweller, and Tim Wylie


Abstract
The natural process of self-assembly has been studied through various abstract models due to the abundant applications that benefit from self-assembly. Many of these different models emerged in an effort to capture and understand the fundamental properties of different physical systems and the mechanisms by which assembly may occur. A newly proposed model, known as Tile Automata, offers an abstract toolkit to analyze and compare the algorithmic properties of different self-assembly systems. In this paper, we show that for every Tile Automata system, there exists a Signal-passing Tile Assembly system that can simulate it. Finally, we connect our result with a recent discovery showing that Tile Automata can simulate Amoebot programmable matter systems, thus showing that the Signal-passing Tile Assembly can simulate any Amoebot system.

Cite as

Angel A. Cantu, Austin Luchsinger, Robert Schweller, and Tim Wylie. Signal Passing Self-Assembly Simulates Tile Automata. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 53:1-53:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{cantu_et_al:LIPIcs.ISAAC.2020.53,
  author =	{Cantu, Angel A. and Luchsinger, Austin and Schweller, Robert and Wylie, Tim},
  title =	{{Signal Passing Self-Assembly Simulates Tile Automata}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{53:1--53:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.53},
  URN =		{urn:nbn:de:0030-drops-133978},
  doi =		{10.4230/LIPIcs.ISAAC.2020.53},
  annote =	{Keywords: self-assembly, signal-passing tile assembly model, tile automata, cellular automata, simulation}
}
Document
Size, Depth and Energy of Threshold Circuits Computing Parity Function

Authors: Kei Uchizawa


Abstract
We investigate relations among the size, depth and energy of threshold circuits computing the n-variable parity function PAR_n, where the energy is a complexity measure for sparsity of computation in threshold circuits, defined as the maximum number of gates outputting "1" over all input assignments. We show that PAR_n is hard for threshold circuits of small size, depth and energy: - If a depth-2 threshold circuit C of size s and energy e computes PAR_n, it holds that 2^{n/(e log^e n)} ≤ s; and - if a threshold circuit C of size s, depth d and energy e computes PAR_n, it holds that 2^{n/(e 2^{e+d} log^e n)} ≤ s. We then provide several upper bounds: - PAR_n is computable by a depth-2 threshold circuit of size O(2^{n-2e}) and energy e; - PAR_n is computable by a depth-3 threshold circuit of size O(2^{n/(e-1)} + 2^{e-2}) and energy e; and - PAR_n is computable by a threshold circuit of size O((e+d)2^{n-m}), depth d + O(1) and energy e + O(1), where m = max(((e-1)/(d-1))^{d-1}, ((d-1)/(e-1))^{e-1}). Our lower and upper bounds imply that threshold circuits need exponential size if both depth and energy are constant, which contrasts with the fact that PAR_n is computable by a threshold circuit of size O(n) and depth 2 if there is no restriction on the energy. Our results also suggest that any threshold circuit computing the parity function needs large depth to be sparse (that is, to have small energy) if its size is bounded.

Cite as

Kei Uchizawa. Size, Depth and Energy of Threshold Circuits Computing Parity Function. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 54:1-54:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{uchizawa:LIPIcs.ISAAC.2020.54,
  author =	{Uchizawa, Kei},
  title =	{{Size, Depth and Energy of Threshold Circuits Computing Parity Function}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{54:1--54:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.54},
  URN =		{urn:nbn:de:0030-drops-133988},
  doi =		{10.4230/LIPIcs.ISAAC.2020.54},
  annote =	{Keywords: Circuit complexity, neural networks, threshold circuits, sparse activity, tradeoffs}
}
Document
Sorting by Prefix Block-Interchanges

Authors: Anthony Labarre


Abstract
We initiate the study of sorting permutations using prefix block-interchanges, which exchange any prefix of a permutation with another non-intersecting interval. The goal is to transform a given permutation into the identity permutation using as few such operations as possible. We give a 2-approximation algorithm for this problem, show how to obtain improved lower and upper bounds on the corresponding distance, and determine the largest possible value for that distance.
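
For readers unfamiliar with the operation, a prefix block-interchange is easy to state in code. The following Python sketch (ours; 0-indexed, with illustrative indices) exchanges the prefix with a disjoint block further right:

def prefix_block_interchange(perm, i, j, k):
    """Exchange the prefix perm[0:i] with the disjoint block perm[j:k]
    (0 < i <= j < k <= len(perm)); the middle part perm[i:j] and the
    tail perm[k:] stay in place."""
    assert 0 < i <= j < k <= len(perm)
    return perm[j:k] + perm[i:j] + perm[0:i] + perm[k:]

# A single operation sorts this permutation of {1,...,6}:
p = [4, 5, 3, 1, 2, 6]
print(prefix_block_interchange(p, 2, 3, 5))  # [1, 2, 3, 4, 5, 6]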

Cite as

Anthony Labarre. Sorting by Prefix Block-Interchanges. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 55:1-55:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{labarre:LIPIcs.ISAAC.2020.55,
  author =	{Labarre, Anthony},
  title =	{{Sorting by Prefix Block-Interchanges}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{55:1--55:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.55},
  URN =		{urn:nbn:de:0030-drops-133991},
  doi =		{10.4230/LIPIcs.ISAAC.2020.55},
  annote =	{Keywords: permutations, genome rearrangements, interconnection network, sorting, edit distance, prefix block-interchange}
}
Document
Space Hardness of Solving Structured Linear Systems

Authors: Xuangui Huang


Abstract
Space-efficient Laplacian solvers are closely related to the derandomization of space-bounded randomized computations. We show that if the probabilistic logarithmic-space solver or the deterministic nearly logarithmic-space solver for undirected Laplacian matrices can be extended to solve slightly larger subclasses of linear systems, then they can be used to solve all linear systems with similar space complexity. Previously, Kyng and Zhang [Rasmus Kyng and Peng Zhang, 2017] proved such results in the time complexity setting using reductions between approximate solvers. We prove that their reductions can be implemented using constant-depth, polynomial-size threshold circuits.

Cite as

Xuangui Huang. Space Hardness of Solving Structured Linear Systems. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 56:1-56:12, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{huang:LIPIcs.ISAAC.2020.56,
  author =	{Huang, Xuangui},
  title =	{{Space Hardness of Solving Structured Linear Systems}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{56:1--56:12},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.56},
  URN =		{urn:nbn:de:0030-drops-134001},
  doi =		{10.4230/LIPIcs.ISAAC.2020.56},
  annote =	{Keywords: linear system solver, logarithmic space, threshold circuit}
}
Document
Sparse Hop Spanners for Unit Disk Graphs

Authors: Adrian Dumitrescu, Anirban Ghosh, and Csaba D. Tóth


Abstract
A unit disk graph G on a given set of points P in the plane is a geometric graph where an edge exists between two points p,q ∈ P if and only if |pq| ≤ 1. A subgraph G' of G is a k-hop spanner if and only if for every edge pq ∈ G, the topological shortest path between p,q in G' has at most k edges. We obtain the following results for unit disk graphs. 1) Every n-vertex unit disk graph has a 5-hop spanner with at most 5.5n edges. We analyze the family of spanners constructed by Biniaz (2020) and improve the upper bound on the number of edges from 9n to 5.5n. 2) Using a new construction, we show that every n-vertex unit disk graph has a 3-hop spanner with at most 11n edges. 3) Every n-vertex unit disk graph has a 2-hop spanner with O(nlog n) edges. This is the first nontrivial construction of 2-hop spanners. 4) For every sufficiently large n, there exists a set P of n points on a circle, such that every plane hop spanner on P has hop stretch factor at least 4. Previously, no lower bound greater than 2 was known. 5) For every point set on a circle, there exists a plane 4-hop spanner. As such, this provides a tight bound for points on a circle. 6) The maximum degree of k-hop spanners cannot be bounded from above by a function of k.
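
The two definitions above can be verified directly by brute force. The following Python sketch (ours, a naive checker rather than any construction from the paper) builds the unit disk graph of a point set and tests whether a given edge set is a k-hop spanner:

from itertools import combinations
from math import dist

def unit_disk_graph(points):
    """Edges between pairs of points at Euclidean distance at most 1."""
    return {frozenset({p, q}) for p, q in combinations(points, 2) if dist(p, q) <= 1}

def is_k_hop_spanner(points, spanner_edges, k):
    """Every unit-disk edge pq must be joined in the spanner by a path of at most k edges."""
    adj = {p: set() for p in points}
    for e in spanner_edges:
        p, q = tuple(e)
        adj[p].add(q)
        adj[q].add(p)
    def hops(s, t):  # breadth-first search, counting levels
        frontier, seen, d = {s}, {s}, 0
        while frontier:
            if t in frontier:
                return d
            frontier = {v for u in frontier for v in adj[u]} - seen
            seen |= frontier
            d += 1
        return float("inf")
    return all(hops(*tuple(e)) <= k for e in unit_disk_graph(points))

pts = [(0.0, 0.0), (0.9, 0.0), (1.8, 0.0)]
spanner = {frozenset({pts[0], pts[1]}), frozenset({pts[1], pts[2]})}
print(is_k_hop_spanner(pts, spanner, k=2))  # True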

Cite as

Adrian Dumitrescu, Anirban Ghosh, and Csaba D. Tóth. Sparse Hop Spanners for Unit Disk Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 57:1-57:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{dumitrescu_et_al:LIPIcs.ISAAC.2020.57,
  author =	{Dumitrescu, Adrian and Ghosh, Anirban and T\'{o}th, Csaba D.},
  title =	{{Sparse Hop Spanners for Unit Disk Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{57:1--57:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.57},
  URN =		{urn:nbn:de:0030-drops-134018},
  doi =		{10.4230/LIPIcs.ISAAC.2020.57},
  annote =	{Keywords: graph approximation, \epsilon-net, hop-spanner, unit disk graph, lower bound}
}
Document
Sparsification Lower Bounds for List H-Coloring

Authors: Hubie Chen, Bart M. P. Jansen, Karolina Okrasa, Astrid Pieterse, and Paweł Rzążewski


Abstract
We investigate the List H-Coloring problem, the generalization of graph coloring that asks whether an input graph G admits a homomorphism to the undirected graph H (possibly with loops), such that each vertex v ∈ V(G) is mapped to a vertex on its list L(v) ⊆ V(H). An important result by Feder, Hell, and Huang [JGT 2003] states that List H-Coloring is polynomial-time solvable if H is a so-called bi-arc graph, and NP-complete otherwise. We investigate the NP-complete cases of the problem from the perspective of polynomial-time sparsification: can an n-vertex instance be efficiently reduced to an equivalent instance of bitsize 𝒪(n^(2-ε)) for some ε > 0? We prove that if H is not a bi-arc graph, then List H-Coloring does not admit such a sparsification algorithm unless NP ⊆ coNP/poly. Our proofs combine techniques from kernelization lower bounds with a study of the structure of graphs H which are not bi-arc graphs.

Cite as

Hubie Chen, Bart M. P. Jansen, Karolina Okrasa, Astrid Pieterse, and Paweł Rzążewski. Sparsification Lower Bounds for List H-Coloring. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 58:1-58:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{chen_et_al:LIPIcs.ISAAC.2020.58,
  author =	{Chen, Hubie and Jansen, Bart M. P. and Okrasa, Karolina and Pieterse, Astrid and Rz\k{a}\.{z}ewski, Pawe{\l}},
  title =	{{Sparsification Lower Bounds for List H-Coloring}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{58:1--58:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.58},
  URN =		{urn:nbn:de:0030-drops-134027},
  doi =		{10.4230/LIPIcs.ISAAC.2020.58},
  annote =	{Keywords: List H-Coloring, Sparsification, Constraint Satisfaction Problem}
}
Document
The Complexity of Connectivity Problems in Forbidden-Transition Graphs And Edge-Colored Graphs

Authors: Thomas Bellitto, Shaohua Li, Karolina Okrasa, Marcin Pilipczuk, and Manuel Sorge


Abstract
The notion of forbidden-transition graphs allows for a robust generalization of walks in graphs. In a forbidden-transition graph, every pair of edges incident to a common vertex is permitted or forbidden; a walk is compatible if all pairs of consecutive edges on the walk are permitted. Forbidden-transition graphs and related models have found applications in a variety of fields, such as routing in optical telecommunication networks, road networks, and bioinformatics. We initiate the study of fundamental connectivity problems from the point of view of parameterized complexity, including an in-depth study of tractability with regard to various graph-width parameters. Among several results, we prove that finding a simple compatible path between given endpoints in a forbidden-transition graph is W[1]-hard when parameterized by the vertex-deletion distance to a linear forest (so it is also hard when parameterized by pathwidth or treewidth). On the other hand, we show an algebraic trick that yields tractability, parameterized by treewidth, for finding a properly colored Hamiltonian cycle in an edge-colored graph; properly colored walks in edge-colored graphs form one of the most studied special cases of compatible walks in forbidden-transition graphs.

Cite as

Thomas Bellitto, Shaohua Li, Karolina Okrasa, Marcin Pilipczuk, and Manuel Sorge. The Complexity of Connectivity Problems in Forbidden-Transition Graphs And Edge-Colored Graphs. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 59:1-59:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{bellitto_et_al:LIPIcs.ISAAC.2020.59,
  author =	{Bellitto, Thomas and Li, Shaohua and Okrasa, Karolina and Pilipczuk, Marcin and Sorge, Manuel},
  title =	{{The Complexity of Connectivity Problems in Forbidden-Transition Graphs And Edge-Colored Graphs}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{59:1--59:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.59},
  URN =		{urn:nbn:de:0030-drops-134036},
  doi =		{10.4230/LIPIcs.ISAAC.2020.59},
  annote =	{Keywords: Graph algorithms, fixed-parameter tractability, parameterized complexity}
}
Document
The Online Broadcast Range-Assignment Problem

Authors: Mark de Berg, Aleksandar Markovic, and Seeun William Umboh


Abstract
Let P = {p₀,…,p_{n-1}} be a set of points in ℝ^d, modeling devices in a wireless network. A range assignment assigns a range r(p_i) to each point p_i ∈ P, thus inducing a directed communication graph 𝒢_r in which there is a directed edge (p_i,p_j) iff dist(p_i, p_j) ⩽ r(p_i), where dist(p_i,p_j) denotes the distance between p_i and p_j. The range-assignment problem is to assign the transmission ranges such that 𝒢_r has a certain desirable property, while minimizing the cost of the assignment; here the cost is given by ∑_{p_i ∈ P} r(p_i)^α, for some constant α > 1 called the distance-power gradient. We introduce the online version of the range-assignment problem, where the points p_j arrive one by one and the range assignment has to be updated at each arrival. Following the standard practice in online algorithms, resources given out cannot be taken away - in our case this means that the transmission ranges never decrease. The property we want to maintain is that 𝒢_r has a broadcast tree rooted at the first point p₀. Our results include the following. - We prove that already in ℝ¹, a 1-competitive algorithm does not exist. In particular, for distance-power gradient α = 2, any online algorithm has competitive ratio at least 1.57. - For points in ℝ¹ and ℝ², we analyze two natural strategies for updating the range assignment upon the arrival of a new point p_j. The strategies do not change the assignment if p_j is already within range of an existing point; otherwise they increase the range of a single point, as follows: Nearest-Neighbor (NN) increases the range of NN(p_j), the nearest neighbor of p_j, to dist(p_j, NN(p_j)), and Cheapest Increase (CI) increases the range of the point p_i for which the resulting cost increase needed to reach the new point p_j is minimal. We give lower and upper bounds on the competitive ratio of these strategies as a function of the distance-power gradient α. We also analyze the following variant of NN in ℝ² for α = 2: 2-Nearest-Neighbor (2-NN) increases the range of NN(p_j) to 2⋅dist(p_j, NN(p_j)). - We generalize the problem to points in arbitrary metric spaces, where we present an O(log n)-competitive algorithm.
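
The Nearest-Neighbor (NN) strategy described above is simple enough to sketch. The following Python code (ours, based only on the description in the abstract) runs it on points in the plane and reports the cost ∑ r(p_i)^α:

from math import dist

def nearest_neighbor_strategy(points, alpha=2.0):
    """Online NN strategy: points[0] is the root p0.  If the new point is
    already within range of some point, keep the assignment; otherwise raise
    the range of its nearest neighbor just enough to reach it.  Ranges never
    decrease.  Returns the assignment and its cost sum_i r(p_i)^alpha."""
    ranges = {points[0]: 0.0}
    for p in points[1:]:
        if not any(dist(q, p) <= r for q, r in ranges.items()):
            nn = min(ranges, key=lambda q: dist(q, p))
            ranges[nn] = max(ranges[nn], dist(nn, p))
        ranges[p] = 0.0
    cost = sum(r ** alpha for r in ranges.values())
    return ranges, cost

_, c = nearest_neighbor_strategy([(0.0, 0.0), (1.0, 0.0), (2.5, 0.0)])
print(c)  # 1.0^2 + 1.5^2 = 3.25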

Cite as

Mark de Berg, Aleksandar Markovic, and Seeun William Umboh. The Online Broadcast Range-Assignment Problem. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 60:1-60:15, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{deberg_et_al:LIPIcs.ISAAC.2020.60,
  author =	{de Berg, Mark and Markovic, Aleksandar and Umboh, Seeun William},
  title =	{{The Online Broadcast Range-Assignment Problem}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{60:1--60:15},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.60},
  URN =		{urn:nbn:de:0030-drops-134042},
  doi =		{10.4230/LIPIcs.ISAAC.2020.60},
  annote =	{Keywords: Computational geometry, online algorithms, range assignment, broadcast}
}
Document
The k-Server Problem with Delays on the Uniform Metric Space

Authors: Predrag Krnetić, Darya Melnyk, Yuyi Wang, and Roger Wattenhofer


Abstract
In this paper, we present tight bounds for the k-server problem with delays in the uniform metric space. The problem is defined on n+k nodes in the uniform metric space, which can issue requests over time. These requests can be served directly or with some delay, using k servers, by moving a server to the corresponding node with an open request. The task is to find an online algorithm that serves the requests while minimizing the total moving and delay costs. We first provide a lower bound by showing that the competitive ratio of any deterministic online algorithm cannot be better than (2k+1) in the clairvoyant setting. We then show that conservative algorithms (without delay) can be equipped with an accumulative delay function such that all such algorithms become (2k+1)-competitive in the non-clairvoyant setting. Together, the two bounds establish a tight result for both the clairvoyant and the non-clairvoyant settings.

Cite as

Predrag Krnetić, Darya Melnyk, Yuyi Wang, and Roger Wattenhofer. The k-Server Problem with Delays on the Uniform Metric Space. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 61:1-61:13, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{krnetic_et_al:LIPIcs.ISAAC.2020.61,
  author =	{Krneti\'{c}, Predrag and Melnyk, Darya and Wang, Yuyi and Wattenhofer, Roger},
  title =	{{The k-Server Problem with Delays on the Uniform Metric Space}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{61:1--61:13},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.61},
  URN =		{urn:nbn:de:0030-drops-134056},
  doi =		{10.4230/LIPIcs.ISAAC.2020.61},
  annote =	{Keywords: Online k-Server, Paging, Delayed Service, Conservative Algorithms}
}
Document
Towards Constant-Factor Approximation for Chordal / Distance-Hereditary Vertex Deletion

Authors: Jungho Ahn, Eun Jung Kim, and Euiwoong Lee


Abstract
For a family of graphs ℱ, Weighted ℱ-Deletion is the problem in which the input is a vertex-weighted graph G = (V, E) and the goal is to delete S ⊆ V with minimum weight such that G⧵S ∈ ℱ. Designing constant-factor approximation algorithms for large subclasses of perfect graphs has been an interesting research direction. Block graphs, 3-leaf power graphs, and interval graphs are known to admit constant-factor approximation algorithms, but the question is open for chordal graphs and distance-hereditary graphs. In this paper, we add one more class to this list by presenting a constant-factor approximation algorithm for the case where ℱ is the intersection of chordal graphs and distance-hereditary graphs. These graphs are known as Ptolemaic graphs and form a superclass of both the block graphs and the 3-leaf power graphs mentioned above. Our proof presents new properties and algorithmic results on inter-clique digraphs, as well as an approximation algorithm for a variant of Feedback Vertex Set (named Feedback Vertex Set with Precedence Constraints) that exploits this relationship, each of which may be of independent interest.

Cite as

Jungho Ahn, Eun Jung Kim, and Euiwoong Lee. Towards Constant-Factor Approximation for Chordal / Distance-Hereditary Vertex Deletion. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 62:1-62:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{ahn_et_al:LIPIcs.ISAAC.2020.62,
  author =	{Ahn, Jungho and Kim, Eun Jung and Lee, Euiwoong},
  title =	{{Towards Constant-Factor Approximation for Chordal / Distance-Hereditary Vertex Deletion}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{62:1--62:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.62},
  URN =		{urn:nbn:de:0030-drops-134063},
  doi =		{10.4230/LIPIcs.ISAAC.2020.62},
  annote =	{Keywords: ptolemaic, approximation algorithm, linear programming, feedback vertex set}
}
Document
Update Query Time Trade-Off for Dynamic Suffix Arrays

Authors: Amihood Amir and Itai Boneh


Abstract
The Suffix Array SA(S) of a string S[1 … n] is the array of starting positions of the suffixes of S, listed in lexicographic order of the suffixes. The suffix array is one of the most well-known indexing data structures, and it functions as a key tool in many string algorithms. In this paper, we present a data structure for maintaining the Suffix Array of a dynamic string. For every 1 ≤ k ≤ n, our data structure reports SA[i] in 𝒪̃(n/k) time and handles a text modification in 𝒪̃(k) time. Additionally, our data structure supports the same query time for reporting iSA[i], where iSA is the Inverse Suffix Array of S[1 … n]. Our data structure can be used to construct sub-linear dynamic variants of static string algorithms and data structures that are based on the Suffix Array and the Inverse Suffix Array.
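
For reference, SA and iSA as used above can be computed naively as follows (a Python sketch of the static baseline, not of the dynamic data structure; positions are 1-indexed as in the abstract):

def suffix_array(s):
    """SA[i] is the starting position of the i-th suffix of s in lexicographic order."""
    return sorted(range(1, len(s) + 1), key=lambda i: s[i - 1:])

def inverse_suffix_array(sa):
    """iSA[j] is the lexicographic rank of the suffix starting at position j."""
    isa = [0] * len(sa)
    for rank, start in enumerate(sa, start=1):
        isa[start - 1] = rank
    return isa

sa = suffix_array("banana")
print(sa)                        # [6, 4, 2, 1, 5, 3]
print(inverse_suffix_array(sa))  # [4, 3, 6, 2, 5, 1]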

Cite as

Amihood Amir and Itai Boneh. Update Query Time Trade-Off for Dynamic Suffix Arrays. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 63:1-63:16, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{amir_et_al:LIPIcs.ISAAC.2020.63,
  author =	{Amir, Amihood and Boneh, Itai},
  title =	{{Update Query Time Trade-Off for Dynamic Suffix Arrays}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{63:1--63:16},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.63},
  URN =		{urn:nbn:de:0030-drops-134070},
  doi =		{10.4230/LIPIcs.ISAAC.2020.63},
  annote =	{Keywords: String Algorithms, Dynamic Algorithms, Suffix Array, Inverse Suffix Array}
}
Document
Weakly Submodular Function Maximization Using Local Submodularity Ratio

Authors: Richard Santiago and Yuichi Yoshida


Abstract
Weak submodularity is a natural relaxation of the diminishing returns property, which is equivalent to submodularity. Weak submodularity has been used to show that many (monotone) functions that arise in practice can be efficiently maximized with provable guarantees. In this work we introduce two natural generalizations of weak submodularity for non-monotone functions. We show that an efficient randomized greedy algorithm has provable approximation guarantees for maximizing these functions subject to a cardinality constraint. We then provide a more refined analysis that takes into account that the weak submodularity parameter may change (sometimes improving) throughout the execution of the algorithm. This leads to improved approximation guarantees in some settings. We provide applications of our results to monotone and non-monotone maximization problems.
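
The randomized greedy algorithm referred to above follows a standard template. The following Python sketch (ours, a generic version of that template rather than the paper's exact algorithm or analysis) illustrates it on a toy coverage objective:

import random

def randomized_greedy(f, ground_set, k):
    """Maximize f(S) subject to |S| <= k: at each step, compute the marginal
    gain of every remaining element, keep the k elements with the largest
    gains, and add one of them chosen uniformly at random (skipping it if
    its gain is not positive)."""
    S, remaining = set(), set(ground_set)
    for _ in range(k):
        if not remaining:
            break
        gains = {e: f(S | {e}) - f(S) for e in remaining}
        top = sorted(gains, key=gains.get, reverse=True)[:k]
        e = random.choice(top)
        if gains[e] > 0:
            S.add(e)
        remaining.discard(e)
    return S

# Toy monotone objective: coverage of a small universe.
sets = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1}}
cover = lambda S: len(set().union(*(sets[e] for e in S))) if S else 0
print(randomized_greedy(cover, sets.keys(), k=2))  # e.g. {'a', 'b'}; output varies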

Cite as

Richard Santiago and Yuichi Yoshida. Weakly Submodular Function Maximization Using Local Submodularity Ratio. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 64:1-64:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{santiago_et_al:LIPIcs.ISAAC.2020.64,
  author =	{Santiago, Richard and Yoshida, Yuichi},
  title =	{{Weakly Submodular Function Maximization Using Local Submodularity Ratio}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{64:1--64:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.64},
  URN =		{urn:nbn:de:0030-drops-134082},
  doi =		{10.4230/LIPIcs.ISAAC.2020.64},
  annote =	{Keywords: weakly submodular, non-monotone, local submodularity ratio}
}
Document
Wear Leveling Revisited

Authors: Taku Onodera and Tetsuo Shibuya


Abstract
Wear leveling - a technology designed to balance the write counts among memory cells regardless of the requested accesses - is vital in prolonging the lifetime of certain computer memory devices, especially the type of next-generation non-volatile memory known as phase change memory (PCM). Although researchers have been working extensively on wear leveling, almost all existing studies focus mainly on practical aspects and lack rigorous mathematical analyses. The lack of theory is particularly problematic for security-critical applications. We address this issue by revisiting wear leveling from a theoretical perspective. First, we completely determine the problem parameter regime for which Security Refresh - one of the most well-known existing wear leveling schemes for PCM - works effectively, by providing a positive result and a matching negative result. In particular, Security Refresh is not competitive for the practically relevant regime of large-scale memory. Then, we propose a novel scheme that achieves better lifetime, time/space overhead, and wear-free space for the relevant regime not covered by Security Refresh. Unlike existing studies, we give rigorous theoretical lifetime analyses, which are necessary to assess and control the security risk.

Cite as

Taku Onodera and Tetsuo Shibuya. Wear Leveling Revisited. In 31st International Symposium on Algorithms and Computation (ISAAC 2020). Leibniz International Proceedings in Informatics (LIPIcs), Volume 181, pp. 65:1-65:17, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2020)


Copy BibTex To Clipboard

@InProceedings{onodera_et_al:LIPIcs.ISAAC.2020.65,
  author =	{Onodera, Taku and Shibuya, Tetsuo},
  title =	{{Wear Leveling Revisited}},
  booktitle =	{31st International Symposium on Algorithms and Computation (ISAAC 2020)},
  pages =	{65:1--65:17},
  series =	{Leibniz International Proceedings in Informatics (LIPIcs)},
  ISBN =	{978-3-95977-173-3},
  ISSN =	{1868-8969},
  year =	{2020},
  volume =	{181},
  editor =	{Cao, Yixin and Cheng, Siu-Wing and Li, Minming},
  publisher =	{Schloss Dagstuhl -- Leibniz-Zentrum f{\"u}r Informatik},
  address =	{Dagstuhl, Germany},
  URL =		{https://drops-dev.dagstuhl.de/entities/document/10.4230/LIPIcs.ISAAC.2020.65},
  URN =		{urn:nbn:de:0030-drops-134092},
  doi =		{10.4230/LIPIcs.ISAAC.2020.65},
  annote =	{Keywords: Wear leveling, Randomized algorithm, Non-volatile memory}
}
