Linear-Least-Squares Initialization of Multilayer Perceptrons Through Backpropagation of the Desired Response

Volume: 16, Issue: 2, Pages: 325–337
Published: March 1, 2005
Abstract
Training multilayer neural networks is typically carried out with descent techniques such as gradient-based backpropagation (BP) of error or quasi-Newton approaches, including the Levenberg-Marquardt algorithm. This is largely because no analytical method exists for finding the optimal weights, so iterative local or global optimization is necessary. The success of iterative optimization procedures is strictly...
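The abstract describes replacing purely iterative training with a linear-least-squares initialization, obtained by propagating the desired response back through the network. A minimal sketch of the core idea for a single tanh output layer (names and helper structure are illustrative, not the paper's actual algorithm): the desired response is pulled back through the inverse activation, which turns the weight fit into an ordinary linear least-squares problem.

```python
import numpy as np

rng = np.random.default_rng(0)

def ls_init_output_layer(H, D, eps=1e-6):
    """Initialize output-layer weights by linear least squares.

    H : (n_samples, n_hidden) hidden-layer activations
    D : (n_samples, n_outputs) desired responses, assumed in (-1, 1) for tanh units

    The desired response D is mapped back through arctanh (the inverse of the
    output nonlinearity), so the weights solve a linear least-squares problem.
    """
    Hb = np.hstack([H, np.ones((H.shape[0], 1))])   # append a bias column
    T = np.arctanh(np.clip(D, -1 + eps, 1 - eps))   # invert tanh, guarding +/-1
    W, *_ = np.linalg.lstsq(Hb, T, rcond=None)      # (n_hidden + 1, n_outputs)
    return W

# Toy demo: hidden activations and targets generated by a known linear map,
# so the least-squares step should recover the layer almost exactly.
H = np.tanh(rng.standard_normal((200, 8)))
W_true = 0.2 * rng.standard_normal((9, 2))          # small weights: no saturation
Hb = np.hstack([H, np.ones((200, 1))])
D = np.tanh(Hb @ W_true)

W = ls_init_output_layer(H, D)
pred = np.tanh(Hb @ W)
err = np.max(np.abs(pred - D))                      # reconstruction error
```

A per-layer solve like this gives a starting point near a good solution, after which the usual iterative optimizer (BP or Levenberg-Marquardt) can refine all weights jointly.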