The information complexity of a function f is the minimum amount of information Alice and Bob need to exchange to compute f. In this paper we provide an algorithm for approximating the information complexity of an arbitrary function f to within any additive error ε > 0, thus resolving the open question of whether information complexity is computable.
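To make this notion concrete, here is a hedged sketch of the standard distributional (internal) information cost; the abstract does not specify which variant of information complexity the paper uses, and the symbols μ, Π, and π below are introduced here for illustration only. For inputs (X, Y) drawn from a distribution μ, with Π the transcript of a protocol π that computes f,

\[
\mathrm{IC}_{\mu}(f) \;=\; \inf_{\pi \text{ computes } f} \Bigl[\, I(X ; \Pi \mid Y) \;+\; I(Y ; \Pi \mid X) \,\Bigr],
\]

measuring how much each party learns about the other's input from the transcript; prior-free versions of information complexity are defined from this quantity by additionally quantifying over μ.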
In the process, we give the first explicit upper bound on the rate at which the information complexity of f restricted to b-bit protocols converges to the (unrestricted) information complexity of f.
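As a hedged restatement of this convergence claim: writing IC_b(f) for the information complexity of f restricted to protocols that communicate at most b bits, restricting the infimum to fewer protocols can only increase it, and the result asserts an explicit, computable error term (denoted δ(b) here as a placeholder; its concrete form is given in the paper, not reproduced in this abstract) such that

\[
\mathrm{IC}(f) \;\le\; \mathrm{IC}_b(f) \;\le\; \mathrm{IC}(f) + \delta(b), \qquad \delta(b) \to 0 \text{ as } b \to \infty .
\]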