Learning word dependencies in text by means of a deep recurrent belief network
Abstract
We propose a deep recurrent belief network with distributed time delays for learning multivariate Gaussians. Learning long time delays in deep belief networks is difficult because gradients vanish or explode as the delay increases. To mitigate this problem and improve the transparency of learning time delays, we introduce the use of Gaussian networks with time delays to initialize the weights of each hidden neuron. From our...
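The abstract's core idea, initializing each hidden neuron's time-delay weights from a Gaussian distribution before training the recurrent network, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the layer sizes, the variance of the Gaussian, and the helper names are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_hidden = 8    # hidden units (illustrative size, not from the paper)
max_delay = 4   # number of time-delay taps per hidden neuron (assumed)

# Gaussian initialization of the time-delay weights: one weight per
# hidden neuron per delay tap, drawn from N(0, 0.1^2). The variance
# here is an arbitrary small value, not the paper's choice.
W_delay = rng.normal(loc=0.0, scale=0.1, size=(n_hidden, max_delay))

def delayed_preactivation(history, W_delay):
    """Combine the last `max_delay` hidden states with the delay weights.

    history: array of shape (max_delay, n_hidden), newest state last.
    Returns one pre-activation value per hidden neuron: each neuron
    weighs its own past activations across the delay taps.
    """
    return np.einsum("dh,hd->h", history, W_delay)

# Example: feed a random activation history through the delay weights.
history = rng.normal(size=(max_delay, n_hidden))
pre_act = delayed_preactivation(history, W_delay)
print(pre_act.shape)  # (8,)
```

Starting from small Gaussian weights rather than zeros or large values keeps early pre-activations in a well-scaled range across all delay taps, which is one common way to reduce vanishing or exploding gradients at the start of training.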
Paper Details
Title: Learning word dependencies in text by means of a deep recurrent belief network
Published Date: Jul 16, 2016
Journal:
Volume: 108
Pages: 144-154