
Article Information

  • Title: Correlation in Hard Distributions in Communication Complexity
  • Authors: Ralph Christian Bottesch ; Dmitry Gavinsky ; Hartmut Klauck
  • Journal: LIPIcs : Leibniz International Proceedings in Informatics
  • Electronic ISSN: 1868-8969
  • Year: 2015
  • Volume: 40
  • Pages: 544-572
  • DOI: 10.4230/LIPIcs.APPROX-RANDOM.2015.544
  • Publisher: Schloss Dagstuhl -- Leibniz-Zentrum fuer Informatik
  • Abstract: We study the effect that the amount of correlation in a bipartite distribution has on the communication complexity of a problem under that distribution. We introduce a new family of complexity measures that interpolates between the two previously studied extreme cases: the (standard) randomised communication complexity and the distributional complexity under product distributions. Our results are as follows (an arithmetic check of the endpoint cases of the bounds appears after this record):
      - We give a tight characterisation of the randomised complexity of Disjointness under distributions with mutual information k, showing that it is Theta(sqrt(n(k+1))) for all 0 <= k <= n. This smoothly interpolates between the lower bound of Babai, Frankl and Simon for the product-distribution case (k=0) and the bound of Razborov for the randomised case. The upper bounds improve and generalise what was known for product distributions, and imply that any tight bound for Disjointness requires Omega(n) bits of mutual information in the corresponding distribution.
      - We study the same question in the distributional quantum setting and show a lower bound of Omega((n(k+1))^{1/4}), together with an upper bound (via constructing communication protocols) that matches up to a logarithmic factor.
      - We show that there are total Boolean functions f_d that have distributional communication complexity O(log(n)) under all distributions of information up to o(n), while the (interactive) distributional complexity maximised over all distributions is Theta(log(d)) for n <= d <= 2^{n/100}. This shows, in particular, that the correlation needed to show that a problem is hard can be much larger than the communication complexity of the problem.
      - We show that in the setting of one-way communication under product distributions, the dependence of the communication cost on the allowed error epsilon is multiplicative in log(1/epsilon); the previous upper bounds had a dependence of more than 1/epsilon. This result explains, for the first time, how one-way communication complexity under product distributions is stronger than PAC-learning: both tasks are characterised by the VC-dimension, but they have very different error dependence (when learning from examples, it costs more to reduce the error).
  • Keywords: communication complexity; information theory
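
As a quick sanity check of the interpolation claimed in the abstract (plain arithmetic on the stated bounds, not an excerpt from the paper), substituting the extreme values of the mutual information k recovers the two previously known regimes:

  % Endpoints of the classical bound Theta(sqrt(n(k+1))):
  \[
    \left.\sqrt{n(k+1)}\right|_{k=0} = \sqrt{n}
    \quad\text{(product distributions, Babai--Frankl--Simon)},
  \]
  \[
    \left.\sqrt{n(k+1)}\right|_{k=n} = \sqrt{n(n+1)} = \Theta(n)
    \quad\text{(arbitrary distributions, Razborov)}.
  \]
  % The same substitution in the quantum lower bound Omega((n(k+1))^{1/4}):
  \[
    \left.\bigl(n(k+1)\bigr)^{1/4}\right|_{k=0} = n^{1/4},
    \qquad
    \left.\bigl(n(k+1)\bigr)^{1/4}\right|_{k=n} = \bigl(n(n+1)\bigr)^{1/4} = \Theta(\sqrt{n}).
  \]
  % Error dependence in the one-way result: at epsilon = 1/n, a multiplicative
  % log(1/epsilon) factor contributes log(n), whereas a dependence of more than
  % 1/epsilon would contribute at least n.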