Revised: August 10, 2022

Published: December 19, 2023

**Keywords:** binary search trees, dynamic optimality, data structures

**Categories:** algorithms, binary search trees, dynamic optimality, data structures, competitive ratio, APPROX, APPROX-RANDOM 2020 special issue

**ACM Classification:** Theory of computation $\to$ Data structure design and analysis

**AMS Classification:** 68Q25, 68W25

**Abstract:**

The Dynamic Optimality Conjecture, which postulates the existence of an $O(1)$-competitive online algorithm for binary search trees (BSTs), is among the most fundamental open problems in dynamic data structures. The conjecture remains wide open, despite extensive work and some notable progress, including, for example, the $O(\log\log n)$-competitive Tango Trees, which achieve the best currently known competitive ratio. One of the main hurdles towards settling the conjecture is that we currently do not have polynomial-time approximation algorithms achieving better than an $O(\log \log n)$-approximation, even in the offline setting. All known non-trivial algorithms for BSTs rely on comparing the algorithm's cost with the so-called Wilber-1 bound (WB-1). Therefore, establishing the worst-case relationship between this bound and the optimal solution cost appears crucial for further progress, and it is an interesting open question in its own right.

Our contribution is twofold.
First, we show that the gap between WB-1 and the optimal solution value
can be as large as $\Omega(\log \log n/ \log \log \log n)$; in fact, we
show that the gap holds even for several stronger variants of the bound.$^*$
Second, for any integer $D > 0$, we give a $D$-approximation algorithm that
runs in time $\exp\left(O\left (n^{1/2^{\Omega(D)}}\log n\right )\right )$.
In particular, this yields a constant-factor approximation algorithm with
subexponential running time.$^{**}$ Moreover, we obtain a simpler and cleaner
efficient $O(\log \log n)$-approximation algorithm that can be used in
an online setting.
Finally, we propose a new bound, which we call the *Guillotine Bound*;
it is stronger than WB-1 while maintaining its algorithm-friendly nature,
and we hope it will lead to better algorithms. All our results use the
geometric interpretation of the problem, leading to cleaner and simpler
analysis.
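For intuition, the classical Wilber-1 (interleave) bound fixes a static balanced reference tree over the keys and, for each node, counts alternations between accesses falling in its left and right subtrees. The following minimal sketch (not from the paper; the function name and the choice of a perfectly balanced reference tree are illustrative assumptions) computes this bound for an access sequence over integer keys:

```python
def wilber1(accesses, lo=None, hi=None):
    """Wilber-1 (interleave) lower bound for an access sequence.

    Sums, over every node of a fixed balanced reference tree on the
    key range [lo, hi], the number of alternations between accesses
    to the node's left subtree (keys <= mid) and right subtree
    (keys > mid). Illustrative sketch, not the paper's algorithm.
    """
    if lo is None:
        lo, hi = min(accesses), max(accesses)
    if lo >= hi:  # leaf of the reference tree: no alternations
        return 0
    mid = (lo + hi) // 2  # root key of this reference subtree

    # Count side switches in the subsequence hitting this interval.
    alternations = 0
    prev_side = None
    for key in accesses:
        side = 'L' if key <= mid else 'R'
        if prev_side is not None and side != prev_side:
            alternations += 1
        prev_side = side

    left = [key for key in accesses if key <= mid]
    right = [key for key in accesses if key > mid]
    return (alternations
            + wilber1(left, lo, mid)
            + wilber1(right, mid + 1, hi))
```

For the sequential sequence $1, 2, \dots, n$ each reference-tree node contributes one alternation, so the bound is linear in $n$; highly interleaved sequences drive it higher, matching the intuition that WB-1 lower-bounds the cost of any BST algorithm.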

---------------

$^*$ A recent independent paper by Lecomte and Weinstein (ESA'20) shows an even stronger, $\Omega(\log\log n)$, separation.

$^{**}$ The term “subexponential time” in this paper refers to the running time $2^{o(n)}$.

---------------

An extended abstract of this paper appeared in the Proceedings of the 23rd International Conference on Approximation Algorithms and Combinatorial Optimization (APPROX'20).