A theoretically grounded application of dropout in recurrent neural networks

Volume: 29, Pages: 1027–1035
Published: Dec 5, 2016
Abstract
Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with these models is their tendency to overfit, with dropout shown to fail when applied to recurrent layers. Recent results at the intersection of Bayesian modelling and deep learning offer a Bayesian interpretation of common deep learning techniques such as dropout. This grounding of dropout in approximate Bayesian inference suggests an extension of the theoretical results, offering insights into the use of dropout with RNN models. We apply this new variational-inference-based dropout technique in LSTM and GRU models, assessing it on language modelling and sentiment analysis tasks.
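The technique the abstract alludes to is often called variational (or "locked") dropout: rather than sampling a fresh dropout mask at every time step, one mask per sequence is sampled and reused across all steps, on both the inputs and the recurrent hidden state. Below is a minimal sketch in PyTorch, assuming an LSTM unrolled with `nn.LSTMCell`; the module and variable names are illustrative, not the authors' code.

```python
import torch
import torch.nn as nn


class VariationalLSTM(nn.Module):
    """Sketch of variational dropout for an RNN: one dropout mask per
    sequence, shared across all time steps (illustrative, not the paper's
    reference implementation)."""

    def __init__(self, input_size, hidden_size, dropout=0.5):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.hidden_size = hidden_size
        self.dropout = dropout

    def forward(self, x):  # x: (seq_len, batch, input_size)
        seq_len, batch, input_size = x.shape
        h = x.new_zeros(batch, self.hidden_size)
        c = x.new_zeros(batch, self.hidden_size)

        if self.training and self.dropout > 0:
            keep = 1.0 - self.dropout
            # Sample the masks once, then reuse them at every time step;
            # scale by 1/keep (inverted dropout) so no test-time rescaling
            # is needed.
            mask_x = x.new_empty(batch, input_size).bernoulli_(keep) / keep
            mask_h = x.new_empty(batch, self.hidden_size).bernoulli_(keep) / keep
        else:
            mask_x = mask_h = None

        outputs = []
        for t in range(seq_len):
            x_t = x[t] * mask_x if mask_x is not None else x[t]
            # The same mask drops the same hidden units at each step.
            h_in = h * mask_h if mask_h is not None else h
            h, c = self.cell(x_t, (h_in, c))
            outputs.append(h)
        return torch.stack(outputs)  # (seq_len, batch, hidden_size)
```

At evaluation time (`model.eval()`) the module runs deterministically, since the masks are only sampled when `self.training` is set; this matches standard inverted-dropout behaviour.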