Theory of Computing
-------------------

Title   : Trading Information Complexity for Error
Authors : Yuval Dagan, Yuval Filmus, Hamed Hatami, and Yaqiao Li
Volume  : 14
Number  : 6
Pages   : 1-73
URL     : https://theoryofcomputing.org/articles/v014a006

Abstract
--------

We consider the standard two-party communication model. The central problem studied in this article is how much one can save in information complexity by allowing an error of $\epsilon$.

* For arbitrary functions, we obtain lower and upper bounds indicating a gain that is of order $\Omega(h(\epsilon))$ and $O(h(\sqrt{\epsilon}))$, respectively. Here $h$ denotes the binary entropy function.

* We analyze the case of the two-bit AND function in detail to show that for this function the gain is $\Theta(h(\epsilon))$. This answers a question of Braverman et al. (STOC'13).

* We obtain sharp bounds for the set disjointness function of order $n$. For the case of distributional error, we introduce a new protocol that achieves a gain of $\Theta(\sqrt{h(\epsilon)})$ provided that $n$ is sufficiently large. We apply these results to answer another question of Braverman et al. regarding the randomized communication complexity of the set disjointness function.

* Answering a question of Braverman (STOC'12), we apply our analysis of the set disjointness function to establish a gap between the two different notions of the prior-free information cost. In light of Braverman (STOC'12), this implies that the amortized randomized communication complexity is not necessarily equal to the amortized distributional communication complexity with respect to the hardest distribution. As a consequence, we show that the $\epsilon$-error randomized communication complexity of the set disjointness function of order $n$ is $n[C_{\mathrm{DISJ}} - \Theta(h(\epsilon))] + o(n)$, where $C_{\mathrm{DISJ}} \approx 0.4827$ is the constant found by Braverman et al. (STOC'13).

A conference version of this paper appeared in the Proceedings of the 32nd IEEE Conference on Computational Complexity, 2017.
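
For readers less familiar with the notation, the binary entropy function $h$ appearing in the bounds above has the standard definition (this formula is standard background and is not stated in the abstract itself):

$$h(\epsilon) = -\epsilon \log_2 \epsilon - (1-\epsilon) \log_2 (1-\epsilon), \qquad \epsilon \in [0,1],$$

with the convention $h(0) = h(1) = 0$. Note that $h(\epsilon) \to 0$ as $\epsilon \to 0$, so the stated gains vanish as the allowed error goes to zero, and the slope of $h$ is unbounded near $0$, which is why a gain of order $h(\epsilon)$ can dominate a gain of order $\epsilon$ for small $\epsilon$.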