
Article Information

  • Title: Mixing Implies Lower Bounds for Space Bounded Learning
  • Full text: Download
  • Authors: Michal Moshkovitz ; Dana Moshkovitz
  • Journal: Electronic Colloquium on Computational Complexity
  • Print ISSN: 1433-8092
  • Year: 2017
  • Volume: 2017
  • Publisher: Universität Trier, Lehrstuhl für Theoretische Computer-Forschung
  • Abstract:

    One can learn any hypothesis class H with O(log|H|) labeled examples. Alas, learning with so few examples requires saving the examples in memory, and this requires |X|^{O(log|H|)} memory states, where X is the set of all labeled examples. A question that arises is how many labeled examples are needed when the memory is bounded. Previous work showed, using techniques such as linear algebra and Fourier analysis, that parities cannot be learned with bounded memory and fewer than |H|^{Ω(1)} examples. One might wonder whether a general combinatorial condition exists for unlearnability with bounded memory, as we have with the condition VCdim(H) = ∞ for PAC unlearnability.

    In this paper we give such a condition. We show that if a hypothesis class H, when viewed as a bipartite graph between hypotheses H and labeled examples X, is mixing, then learning it requires |H|^{Ω(1)} examples under a certain bound on the memory. Note that the class of parities is mixing. As an immediate corollary, we get that most hypothesis classes are unlearnable with bounded memory. Our proof technique is combinatorial in nature and very different from previous analyses.

  • Keywords: bounded space ; lower bound ; mixing ; PAC learning ; time-space tradeoff ; VC-dimension
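The bipartite-graph view in the abstract can be made concrete. The sketch below is an illustration constructed for this page, not code from the paper: it builds the consistency graph of the parity class over {0,1}^n, where a hypothesis h_a is adjacent to a labeled example (x, b) iff ⟨a, x⟩ ≡ b (mod 2), and inspects the singular values of the biadjacency matrix. A large gap between the first and second singular values is one standard spectral certificate that a biregular bipartite graph is mixing.

```python
import numpy as np
from itertools import product

n = 4
points = list(product([0, 1], repeat=n))     # all inputs x in {0,1}^n
hyps = list(product([0, 1], repeat=n))       # parity h_a(x) = <a, x> mod 2
examples = [(x, b) for x in points for b in (0, 1)]  # labeled examples (x, b)

# Biadjacency matrix: M[a, (x, b)] = 1 iff hypothesis a is consistent
# with the labeled example (x, b), i.e. <a, x> mod 2 == b.
M = np.array(
    [[1.0 if sum(ai * xi for ai, xi in zip(a, x)) % 2 == b else 0.0
      for (x, b) in examples]
     for a in hyps]
)

# Singular values in descending order; the second/first ratio measures
# how far the graph is from a perfect mixer (smaller = better mixing).
sigma = np.linalg.svd(M, compute_uv=False)
print(f"second/first singular value: {sigma[1] / sigma[0]:.4f}")
```

For n = 4 the ratio comes out to 1/√17 ≈ 0.24, well below 1, consistent with the abstract's remark that the class of parities is mixing.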