Deep, Big, Simple Neural Nets for Handwritten Digit Recognition

Volume: 22, Issue: 12, Pages: 3207 - 3220
Published: Dec 1, 2010
Abstract
Good old online backpropagation for plain multilayer perceptrons yields a very low 0.35% error rate on the MNIST handwritten digits benchmark. All we need to achieve this best result so far are many hidden layers, many neurons per layer, numerous deformed training images to avoid overfitting, and graphics cards to greatly speed up...
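The approach the abstract describes (a plain multilayer perceptron trained with online, i.e. per-example, backpropagation) can be illustrated with a small sketch. This is not the authors' GPU implementation; the layer sizes, learning rate, and toy data below are assumptions for illustration, whereas the paper trains far larger nets on elastically deformed MNIST images.

```python
# Minimal sketch of a plain MLP trained with online backpropagation.
# Architecture and hyperparameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes: 784 inputs (28x28 pixels), two hidden layers, 10 outputs.
layer_sizes = [784, 500, 300, 10]
weights = [rng.normal(0, 0.05, (m, n)) for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
biases = [np.zeros(n) for n in layer_sizes[1:]]

def forward(x):
    """Forward pass; returns activations of every layer (tanh hidden units, softmax output)."""
    activations = [x]
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = activations[-1] @ W + b
        if i < len(weights) - 1:
            activations.append(np.tanh(z))
        else:
            e = np.exp(z - z.max())          # numerically stable softmax
            activations.append(e / e.sum())
    return activations

def online_backprop_step(x, target, lr=0.01):
    """One online (per-example) gradient step for cross-entropy loss."""
    acts = forward(x)
    delta = acts[-1] - target                # gradient of softmax + cross-entropy
    for i in reversed(range(len(weights))):
        grad_W = np.outer(acts[i], delta)
        grad_b = delta
        if i > 0:                            # backpropagate through the tanh layer below
            delta = (weights[i] @ delta) * (1.0 - acts[i] ** 2)
        weights[i] -= lr * grad_W
        biases[i] -= lr * grad_b

# Toy stand-in for (deformed) MNIST digits: random images and labels, just to show the loop.
for _ in range(100):
    x = rng.random(784)
    y = np.zeros(10)
    y[rng.integers(10)] = 1.0
    online_backprop_step(x, y)

print("output distribution for one example:", forward(rng.random(784))[-1].round(3))
```

In the paper's setting the loop above would run over a stream of freshly deformed training digits each epoch, which is what keeps such large, plain nets from overfitting.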