Calibrated Stochastic Gradient Descent for Convolutional Neural Networks

Volume: 33, Issue: 01, Pages: 9348–9355
Published: Jul 17, 2019
Abstract
In stochastic gradient descent (SGD) and its variants, the optimized gradient estimators may be as expensive to compute as the true gradient in many scenarios. This paper introduces a calibrated stochastic gradient descent (CSGD) algorithm for deep neural network optimization. A theorem is developed to prove that an unbiased estimator for the network variables can be obtained in a probabilistic way based on the Lipschitz hypothesis. Our work is...
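For context, the sketch below shows the plain mini-batch SGD update that calibrated variants such as CSGD build on. It is a minimal illustrative baseline on a synthetic least-squares problem, with assumed data, loss, and hyperparameters; it is not the calibrated estimator proposed in the paper.

    import numpy as np

    # Generic mini-batch SGD baseline (illustrative assumption, not the paper's CSGD).
    rng = np.random.default_rng(0)

    # Synthetic linear-regression problem: y = X @ w_true + noise
    X = rng.normal(size=(1000, 10))
    w_true = rng.normal(size=10)
    y = X @ w_true + 0.01 * rng.normal(size=1000)

    w = np.zeros(10)      # model parameters ("network variables")
    lr = 0.1              # learning rate
    batch_size = 32

    for step in range(500):
        # A mini-batch gradient is a stochastic, unbiased estimate
        # of the full-batch gradient of the squared loss.
        idx = rng.choice(len(X), size=batch_size, replace=False)
        Xb, yb = X[idx], y[idx]
        grad = 2.0 / batch_size * Xb.T @ (Xb @ w - yb)
        w -= lr * grad

    print("recovery error:", np.linalg.norm(w - w_true))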