Theory of Computing
-------------------

Title   : Complexity of Linear Operators
Authors : Alexander S. Kulikov, Ivan Mikhailin, Andrey Mokhov, and Vladimir V. Podolskii
Volume  : 21
Number  : 9
Pages   : 1-32
URL     : https://theoryofcomputing.org/articles/v021a009

Abstract
--------

Let $A$ be an $n$-by-$n$ $0/1$-matrix with $z$ zeroes and $u$ ones, and let $x$ be an $n$-dimensional vector of formal variables over a semigroup $(S, \circ)$. How many semigroup operations are required to compute the linear operator $Ax$?

It is easy to compute $Ax$ using $O(u)$ semigroup operations. The main question studied in this paper is: can $Ax$ be computed using $O(z)$ semigroup operations? For the case when the semigroup is commutative, we give a constructive proof of an $O(z)$ upper bound. This implies that in the commutative setting, complements of sparse matrices can be processed as efficiently as sparse matrices, though the corresponding algorithms are more involved. This covers the cases of Boolean and tropical semirings, which have numerous applications, e.g., in graph theory.

On the other hand, we prove that in general this is not possible: for faithful non-commutative semigroups there exists an $n$-by-$n$ $0/1$-matrix with exactly two zeroes in every row (hence $z=2n$) whose complexity is $\Theta(n\alpha(n))$, where $\alpha(n)$ is the inverse Ackermann function.

As a simple application of the linear-size construction presented, we show how to multiply two $n \times n$ matrices over an arbitrary semiring in $O(n^2)$ time if one of these matrices is a $0/1$-matrix with $O(n)$ zeroes (i.e., the complement of a sparse matrix).
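
As a point of reference for the $O(u)$ upper bound mentioned in the abstract, the following Python sketch evaluates $Ax$ row by row, performing one semigroup operation per 1-entry beyond the first in each row. The function and variable names (apply_operator, op) are illustrative and not taken from the paper; the example instantiates the semigroup as the integers under min, i.e., the additive part of the tropical semiring.

    # Naive evaluation of the linear operator Ax over a semigroup (S, o):
    # the i-th output combines x[j] over all j with A[i][j] == 1, so the
    # total number of semigroup operations is O(u), where u is the number
    # of ones in A. Names here are illustrative, not from the paper.

    def apply_operator(A, x, op):
        """Return Ax, combining selected entries of x with the semigroup operation op."""
        result = []
        for row in A:
            acc = None
            for a_ij, x_j in zip(row, x):
                if a_ij == 1:
                    acc = x_j if acc is None else op(acc, x_j)
            result.append(acc)
        return result

    # Example over the tropical (min) semigroup on integers.
    A = [
        [1, 0, 1],
        [1, 1, 0],
        [0, 1, 1],
    ]
    x = [5, 2, 7]
    print(apply_operator(A, x, min))  # [min(5, 7), min(5, 2), min(2, 7)] == [5, 2, 2]

The paper's question is whether the same operator can instead be computed with a number of operations proportional to the number of zeroes of $A$, which is much smaller when $A$ is the complement of a sparse matrix.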