Volume 14 (2018) Article 20 pp. 1-29
Simplified Separation of Information and Communication
by
Received: September 20, 2016
Revised: January 2, 2018
Published: December 27, 2018
Keywords: communication complexity, information complexity, fooling distribution, separation of complexity measures
ACM Classification: F.1.1, F.1.2, F.2.3
AMS Classification: 68Q05, 68W10, 68Q17, 68Q85, 68Q87

Abstract:

We give an example of a Boolean function whose information complexity is exponentially smaller than its communication complexity. Such a separation was first demonstrated by Ganor, Kol and Raz (J. ACM 2016). We give a simpler proof of the same result. In the course of this simplification, we make several new contributions: we introduce a new communication lower-bound technique, the notion of a fooling distribution, which allows us to separate information and communication complexity; and we give a more direct proof of the information complexity upper bound.

We also prove a generalization of Shearer's Lemma that may be of independent interest. A version of Shearer's original lemma bounds the expected mutual information between a random subset of random variables and another random variable, provided the subset is chosen independently of all the variables involved. Our generalization allows some dependence between the random subset and the variables, and still yields similar bounds with an appropriate error term.
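For context, a standard distributional form of Shearer's classical lemma, stated here in its entropy version, is sketched below; the precise mutual-information formulation used in the paper, and its generalization, may differ in details:

```latex
% Distributional form of Shearer's Lemma (entropy version; a standard
% statement, not the exact formulation from this paper).
% Let X_1, \dots, X_n be jointly distributed random variables, and let
% S \subseteq [n] be a random subset chosen independently of the X_i's
% such that \Pr[i \in S] \ge \mu for every i \in [n]. Then
\[
  \mathbb{E}_{S}\bigl[ H(X_S) \bigr] \;\ge\; \mu \cdot H(X_1, \dots, X_n),
\]
% where X_S denotes the tuple (X_i)_{i \in S}. The generalization in the
% paper relaxes the requirement that S be independent of the X_i's, at
% the cost of an additional error term in the resulting bound.
```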