
Article Information

  • Title: No Quantum Speedup over Gradient Descent for Non-Smooth Convex Optimization
  • Authors: Ankit Garg; Robin Kothari; Praneeth Netrapalli
  • Journal: LIPIcs : Leibniz International Proceedings in Informatics
  • Electronic ISSN: 1868-8969
  • Year: 2021
  • Volume: 185
  • Pages: 53:1-53:20
  • DOI: 10.4230/LIPIcs.ITCS.2021.53
  • Publisher: Schloss Dagstuhl -- Leibniz-Zentrum fuer Informatik
  • Abstract: We study the first-order convex optimization problem, where we have black-box access to a (not necessarily smooth) function f: ℝⁿ → ℝ and its (sub)gradient. Our goal is to find an ε-approximate minimum of f starting from a point that is distance at most R from the true minimum. If f is G-Lipschitz, then the classic gradient descent algorithm solves this problem with O((GR/ε)²) queries. Importantly, the number of queries is independent of the dimension n, and gradient descent is optimal in this regard: no deterministic or randomized algorithm can achieve better complexity that is still independent of the dimension n. In this paper we reprove the randomized lower bound of Ω((GR/ε)²) using a simpler argument than previous lower bounds. We then show that although the function family used in the lower bound is hard for randomized algorithms, it can be solved using O(GR/ε) quantum queries. We then show an improved lower bound against quantum algorithms using a different set of instances and establish our main result that, in general, even quantum algorithms need Ω((GR/ε)²) queries to solve the problem. Hence there is no quantum speedup over gradient descent for black-box first-order convex optimization without further assumptions on the function family.
  • Keywords: Quantum algorithms; Gradient descent; Convex optimization
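The classical baseline the abstract refers to can be sketched concretely. For a G-Lipschitz convex f whose minimizer lies within distance R of the start, the subgradient method with fixed step size R/(G√T) makes the best iterate an O(GR/√T)-approximate minimum, so T = O((GR/ε)²) subgradient queries suffice regardless of the dimension n. The sketch below is illustrative only; all names are ours, not from the paper.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, G, R, T):
    """Run T steps of the fixed-step subgradient method and return the
    best iterate found. With eta = R/(G*sqrt(T)), standard analysis
    gives f(best_x) - min f <= O(G*R/sqrt(T))."""
    x = np.asarray(x0, dtype=float)
    eta = R / (G * np.sqrt(T))          # step size dictated by the theory
    best_x, best_val = x.copy(), f(x)
    for _ in range(T):
        x = x - eta * subgrad(x)        # one black-box (sub)gradient query
        val = f(x)
        if val < best_val:              # track the best iterate seen so far
            best_x, best_val = x.copy(), val
    return best_x, best_val

# Example: f(x) = ||x||_2 is 1-Lipschitz, convex, and non-smooth at its
# minimizer 0, so plain gradient descent analysis does not apply directly.
f = lambda x: np.linalg.norm(x)
subgrad = lambda x: (x / np.linalg.norm(x)
                     if np.linalg.norm(x) > 0 else np.zeros_like(x))

best_x, best_val = subgradient_method(f, subgrad,
                                      x0=[1.0, 0.0], G=1.0, R=1.0, T=10000)
# Guarantee: best_val is within about G*R/sqrt(T) = 0.01 of the minimum 0.
```

Note the method queries only f and a subgradient, matching the black-box first-order model of the paper; the iteration count T, not the dimension of x0, controls the accuracy.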