
Article Information

  • Title: Improved convergence guarantees for learning Gaussian mixture models by EM and gradient EM
  • Authors: Nimrod Segol; Boaz Nadler
  • Journal: Electronic Journal of Statistics
  • Print ISSN: 1935-7524
  • Year: 2021
  • Volume: 15
  • Issue: 2
  • Pages: 4510-4544
  • DOI: 10.1214/21-EJS1905
  • Language: English
  • Publisher: Institute of Mathematical Statistics
  • Abstract: We consider the problem of estimating the parameters of a Gaussian Mixture Model with K components of known weights, all with an identity covariance matrix. We make two contributions. First, at the population level, we present a sharper analysis of the local convergence of EM and gradient EM, compared to previous works. Assuming a separation of Ω(√log K), we prove convergence of both methods to the global optimum from an initialization region larger than those of previous works. Specifically, the initial guess of each component can be as far as (almost) half its distance to the nearest Gaussian. This is essentially the largest possible contraction region. Our second contribution is improved sample size requirements for accurate estimation by EM and gradient EM. In previous works, the required number of samples had a quadratic dependence on the maximal separation between the K components, and the resulting error estimate increased linearly with this maximal separation. In this manuscript we show that both quantities depend only logarithmically on the maximal separation.
  • Keywords: 62F10; 62F99; EM algorithm; Gaussian mixture models
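To make the setting in the abstract concrete, here is a minimal textbook-style sketch of EM for a GMM with known mixing weights and identity covariances, where only the K component means are updated. This is a generic illustration of the model class the paper analyzes, not the authors' implementation; all function and variable names are our own.

```python
import numpy as np

def em_gmm_identity(X, mu, weights, n_iter=50):
    """EM for a K-component GMM with known weights and identity
    covariances; only the means mu (K x d) are estimated.
    Generic textbook sketch, not the paper's code."""
    for _ in range(n_iter):
        # E-step: responsibilities gamma[i, k] proportional to
        # w_k * exp(-||x_i - mu_k||^2 / 2)
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)  # (n, K)
        log_g = np.log(weights)[None, :] - 0.5 * sq
        log_g -= log_g.max(axis=1, keepdims=True)  # numerical stability
        g = np.exp(log_g)
        g /= g.sum(axis=1, keepdims=True)
        # M-step: each mean becomes the responsibility-weighted data average.
        # (Gradient EM would instead take a small step in the direction
        # of this update rather than jumping to it.)
        mu = (g.T @ X) / g.sum(axis=0)[:, None]
    return mu
```

Under the separation regime discussed in the abstract, initializing each mean within (almost) half its distance to the nearest other component suffices for convergence; the sketch above converges quickly from such initializations on well-separated synthetic data.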