Efficient Online and Batch Learning Using Forward Backward Splitting

Volume: 10, Issue: 99, Pages: 2899–2934
Published: Dec 1, 2009
Abstract
We describe, analyze, and experiment with a framework for empirical loss minimization with regularization. Our algorithmic framework alternates between two phases. On each iteration we first perform an unconstrained gradient descent step. We then cast and solve an instantaneous optimization problem that trades off minimization of a regularization term while keeping close proximity to the result of the first phase. This view yields a simple yet...
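The two phases described in the abstract can be sketched in code. The example below is an illustrative minimal implementation, not the paper's own code: it assumes a squared loss and an ℓ1 regularizer, for which the second-phase proximal problem has a closed-form solution (soft-thresholding). The function name `fobos_l1` and all parameter choices are hypothetical.

```python
import numpy as np

def fobos_l1(X, y, lam=0.05, eta=0.05, n_iters=2000):
    """Forward-backward splitting sketch for l1-regularized least squares.

    Each iteration has two phases, mirroring the abstract:
      1. an unconstrained gradient descent step on the loss, then
      2. a proximal step that trades off the regularizer against
         proximity to the phase-1 result (closed form for l1:
         coordinate-wise soft-thresholding).
    All names and hyperparameters here are illustrative assumptions.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        # Phase 1 (forward): gradient step on the squared loss.
        grad = X.T @ (X @ w - y) / n
        w_half = w - eta * grad
        # Phase 2 (backward): prox of eta*lam*||w||_1, i.e. soft-thresholding.
        w = np.sign(w_half) * np.maximum(np.abs(w_half) - eta * lam, 0.0)
    return w

# Toy usage: recover a sparse weight vector from noiseless measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true
w_hat = fobos_l1(X, y)
```

Because the ℓ1 proximal step zeroes out small coordinates exactly, the iterates stay sparse, which is one of the practical attractions of this splitting scheme over plain subgradient descent on the regularized objective.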