
Article Information

  • Title: Convergence Rate of Distributed Random Projections
  • Authors: Thinh T. Doan; Joseph Lubars; Carolyn L. Beck
  • Journal: IFAC PapersOnLine
  • Print ISSN: 2405-8963
  • Year: 2018
  • Volume: 51
  • Issue: 23
  • Pages: 373-378
  • DOI: 10.1016/j.ifacol.2018.12.064
  • Language: English
  • Publisher: Elsevier
  • Abstract: Stochastic gradient descent (SGD) is a common algorithm used in machine learning, especially when the loss function is in a separable form. Here, we consider SGD for constrained convex optimization problems, where the constraint is expressed as an intersection of a large number of convex sets. It is expensive to project the result of the gradient descent algorithm onto each of these convex sets at each iteration; thus, there is a growing body of work which considers projections onto a random subset of the convex sets. In this paper, we consider a distributed version (parameter-server model) of the random projections, since a centralized approach is too slow for very large-scale problems. Our main result is as follows: under a mild regularity condition on the convex sets, we show that the rate of convergence of distributed SGD with distributed random projections is the same as that of distributed SGD applied to a problem with no constraints, except for a factor which captures the regularity assumption.
  • Keywords: Distributed optimization; random projections
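The core idea the abstract describes, projecting onto only one randomly chosen constraint set per iteration instead of the full intersection, can be illustrated with a minimal single-machine sketch. This is not the paper's distributed parameter-server algorithm; the quadratic objective, the halfspace constraints, and the 1/k step size below are hypothetical choices made for illustration.

```python
import numpy as np

# Hypothetical problem: minimize f(x) = 0.5 * ||x - c||^2 subject to
# x lying in the intersection of halfspaces {x : A[j] @ x <= b[j]}.
# Each iteration takes a gradient step, then projects onto ONE
# randomly selected constraint set (the "random projection" idea).

rng = np.random.default_rng(0)

c = np.array([2.0, 2.0])          # unconstrained minimizer (infeasible)
A = np.array([[1.0, 0.0],         # constraint 1: x1 <= 1
              [0.0, 1.0]])        # constraint 2: x2 <= 1
b = np.array([1.0, 1.0])

def project_halfspace(x, a, beta):
    """Euclidean projection of x onto the halfspace {y : a @ y <= beta}."""
    viol = a @ x - beta
    if viol <= 0.0:
        return x                  # already feasible for this set
    return x - (viol / (a @ a)) * a

x = np.zeros(2)
for k in range(1, 5001):
    grad = x - c                  # gradient of 0.5 * ||x - c||^2
    x = x - (1.0 / k) * grad      # diminishing step size
    j = rng.integers(len(b))      # pick one constraint at random
    x = project_halfspace(x, A[j], b[j])

print(x)  # approaches the constrained optimum near (1, 1)
```

Because only one set is projected per step, iterates may slightly violate the other sets, but the violation shrinks with the step size; this is the behavior the paper's regularity condition controls in the convergence-rate analysis.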