Theory of Computing
-------------------
Title : Optimal Cryptographic Hardness of Learning Monotone Functions
Authors : Dana Dachman-Soled, Homin K. Lee, Tal Malkin, Rocco A. Servedio, Andrew Wan, and Hoeteck Wee
Volume : 5
Number : 13
Pages : 257-282
URL : https://theoryofcomputing.org/articles/v005a013
Abstract
--------
Over the years a range of positive algorithmic results have been
obtained for learning various classes of monotone Boolean functions
from uniformly distributed random examples. Prior to our work,
however, the only negative result for learning monotone functions
in this model had been an information-theoretic lower bound showing
that certain superpolynomial-size monotone circuits cannot be learned
to accuracy 1/2 + omega(log n)/sqrt(n) (Blum, Burch, and Langford,
FOCS '98). This is in contrast with the situation for non-monotone
functions, where a wide range of cryptographic hardness results
establish that various "simple" classes of polynomial-size circuits
are not learnable by polynomial-time algorithms.
In this paper we establish cryptographic hardness results for
learning various "simple" classes of monotone circuits, thus giving
a computational analogue of the information-theoretic hardness
results of Blum et al. mentioned above. Some of our results show
the cryptographic hardness of learning polynomial-size monotone
circuits to accuracy only slightly greater than 1/2 + 1/sqrt(n),
which nearly matches the accuracy achievable by the known positive
results of Blum et al. Other results show that, under a plausible
cryptographic hardness assumption, a class of constant-depth,
sub-polynomial-size circuits computing monotone functions is hard
to learn. This result, too, is nearly optimal in terms of the
circuit size parameter, by known positive results (Servedio,
Information and Computation '04). Our main tool is a
complexity-theoretic approach to hardness amplification via the
noise sensitivity of monotone functions, pioneered by O'Donnell
(JCSS '04).
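As an illustrative aside (not part of the paper itself): the noise sensitivity NS_delta(f) of a Boolean function f is the probability that f(x) != f(y), where x is uniform over {0,1}^n and y is obtained from x by flipping each bit independently with probability delta. A minimal Python sketch, estimating this quantity by sampling for the majority function (a canonical monotone function); the function and parameter names are our own choices for illustration:

```python
import random

def majority(x):
    # Majority: outputs 1 iff more than half the input bits are 1.
    # A canonical monotone Boolean function (flipping a 0 to a 1
    # can never change the output from 1 to 0).
    return int(sum(x) > len(x) // 2)

def noise_sensitivity(f, n, delta, trials=2000, seed=0):
    """Monte Carlo estimate of NS_delta(f) = Pr[f(x) != f(y)],
    where x is uniform over {0,1}^n and y flips each bit of x
    independently with probability delta."""
    rng = random.Random(seed)
    disagree = 0
    for _ in range(trials):
        x = [rng.randint(0, 1) for _ in range(n)]
        # Flip each coordinate of x independently with prob. delta.
        y = [b ^ (rng.random() < delta) for b in x]
        disagree += f(x) != f(y)
    return disagree / trials

if __name__ == "__main__":
    # With no noise, f(x) and f(y) always agree.
    print(noise_sensitivity(majority, 11, 0.0))
    # With delta = 0.1, majority on 101 bits is noticeably sensitive.
    print(noise_sensitivity(majority, 101, 0.1))
```

Low noise sensitivity is what limits how well hard functions built from monotone gadgets can be "amplified," which is why the monotone setting behaves differently from the unrestricted one.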