
Article Information

  • Title: Benign overfitting in linear regression
  • Authors: Peter L. Bartlett; Philip M. Long; Gábor Lugosi
  • Journal: Proceedings of the National Academy of Sciences
  • Print ISSN: 0027-8424
  • Electronic ISSN: 1091-6490
  • Year: 2020
  • Volume: 117
  • Issue: 48
  • Pages: 30063-30070
  • DOI: 10.1073/pnas.1907378117
  • Publisher: The National Academy of Sciences of the United States of America
  • Abstract: The phenomenon of benign overfitting is one of the key mysteries uncovered by deep learning methodology: deep neural networks seem to predict well, even with a perfect fit to noisy training data. Motivated by this phenomenon, we consider when a perfect fit to training data in linear regression is compatible with accurate prediction. We give a characterization of linear regression problems for which the minimum norm interpolating prediction rule has near-optimal prediction accuracy. The characterization is in terms of two notions of the effective rank of the data covariance. It shows that overparameterization is essential for benign overfitting in this setting: the number of directions in parameter space that are unimportant for prediction must significantly exceed the sample size. By studying examples of data covariance properties that this characterization shows are required for benign overfitting, we find an important role for finite-dimensional data: the accuracy of the minimum norm interpolating prediction rule approaches the best possible accuracy for a much narrower range of properties of the data distribution when the data lie in an infinite-dimensional space vs. when the data lie in a finite-dimensional space with dimension that grows faster than the sample size.
  • Keywords: statistical learning theory; overfitting; linear regression; interpolation
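
As a rough illustration of the minimum norm interpolating prediction rule described in the abstract, the following sketch fits an overparameterized linear model (dimension much larger than sample size) by the least-norm solution of the interpolation equations and reports its excess risk, along with the two effective-rank quantities r_k and R_k of the covariance used in the paper's characterization. The specific covariance spectrum, signal, noise level, and cutoff k are hypothetical choices for demonstration, not values from the paper.

```python
# Minimal sketch: minimum l2-norm interpolation in an overparameterized
# linear regression, with a covariance chosen (hypothetically) to have a
# heavy tail of small eigenvalues.
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 2000                                   # n samples, d features (d >> n)

# Diagonal covariance: a few large eigenvalues plus a long tail of small ones.
eigvals = np.concatenate([np.ones(20), 1e-2 * np.ones(d - 20)])
X = rng.standard_normal((n, d)) * np.sqrt(eigvals)   # rows ~ N(0, diag(eigvals))
beta_star = np.zeros(d)
beta_star[:20] = 1.0                                 # true signal in the top directions
y = X @ beta_star + 0.1 * rng.standard_normal(n)     # noisy labels

# Minimum-norm interpolator: beta_hat = X^T (X X^T)^{-1} y (equivalently pinv(X) @ y).
beta_hat = X.T @ np.linalg.solve(X @ X.T, y)
print("max training residual:", np.max(np.abs(X @ beta_hat - y)))   # ~0: perfect fit

# Excess prediction risk E[(x^T (beta_hat - beta_star))^2] for diagonal covariance.
risk = np.sum(eigvals * (beta_hat - beta_star) ** 2)
print("excess risk:", risk)

# Effective ranks from the paper, for Sigma = diag(eigvals):
#   r_k(Sigma) = (sum_{i>k} lambda_i) / lambda_{k+1}
#   R_k(Sigma) = (sum_{i>k} lambda_i)^2 / sum_{i>k} lambda_i^2
k = 20
tail = eigvals[k:]
print("r_k:", tail.sum() / tail[0])
print("R_k:", tail.sum() ** 2 / np.square(tail).sum())
```

With this kind of spectrum the training data are fit exactly while the excess risk stays small, which is the behavior the paper's effective-rank conditions are designed to characterize.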