No Quantum Speedup over Gradient Descent for Non-Smooth Convex Optimization

Pages: 20
Published: Jan 1, 2021
Abstract
We study the first-order convex optimization problem, where we have black-box access to a (not necessarily smooth) function f:ℝⁿ → ℝ and its (sub)gradient. Our goal is to find an ε-approximate minimum of f starting from a point that is distance at most R from the true minimum. If f is G-Lipschitz, then the classic gradient descent algorithm solves this problem with O((GR/ε)²) queries. Importantly, the number of queries is independent of the...
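The query bound quoted in the abstract is the textbook guarantee for the fixed-step subgradient method: with step size η = R/(G√T), the best iterate after T steps is within GR/√T of the optimum, so T = (GR/ε)² queries suffice. A minimal sketch of that classical baseline (not the paper's lower-bound construction), illustrated on the toy non-smooth instance f(x) = |x|:

```python
import math

def subgradient_descent(f, subgrad, x0, G, R, T):
    """Fixed-step subgradient method for a G-Lipschitz convex f.

    With eta = R / (G * sqrt(T)), the best iterate satisfies
    f(best) - f(x*) <= G*R/sqrt(T), so T = (G*R/eps)**2 queries
    give an eps-approximate minimum (standard textbook bound).
    """
    eta = R / (G * math.sqrt(T))  # classic fixed step size
    x = x0
    best_x, best_f = x0, f(x0)
    for _ in range(T):
        x = x - eta * subgrad(x)  # one (sub)gradient query per step
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

# Toy instance: f(x) = |x| is 1-Lipschitz (G = 1); starting at
# x0 = 1.0 puts us at distance R = 1 from the minimizer x* = 0.
f = abs
subgrad = lambda x: 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
eps = 0.1
T = int((1.0 * 1.0 / eps) ** 2)  # (G*R/eps)^2 = 100 queries
x_best, f_best = subgradient_descent(f, subgrad, 1.0, G=1.0, R=1.0, T=T)
```

Note that the query count depends only on G, R, and ε, not on the dimension n; the paper's result is that quantum queries cannot beat this dimension-independent rate.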