Fast Compression and Optimization of Deep Learning Models for Natural Language Processing

Published: Nov 1, 2019
Abstract
Nowadays, recurrent neural networks (RNNs) and convolutional neural networks (CNNs) play a major role in many natural language domains, such as text document categorization, part-of-speech tagging, chatbots, language modeling, and language translation. Very often RNN models have a few stacked layers occupying several megabytes of memory; the same holds for CNN models. In many domains, such as automatic speech recognition, real-time inference is a...
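The memory footprint mentioned in the abstract is a common target of model compression. As a generic illustration only (not necessarily the method this paper proposes; the weight matrix here is made up), the sketch below shows symmetric int8 quantization of one layer's float32 weights, one widely used way to shrink a model roughly fourfold:

```python
import numpy as np

# Hypothetical float32 weight matrix of a single stacked-RNN layer.
rng = np.random.default_rng(0)
weights = rng.standard_normal((1024, 1024)).astype(np.float32)

# Symmetric linear quantization to int8: map the range
# [-max|w|, +max|w|] onto [-127, 127] with a single scale factor.
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)

# Dequantize for inference; rounding error per weight is at most scale / 2.
deq = q_weights.astype(np.float32) * scale

print(weights.nbytes // q_weights.nbytes)            # 4x memory reduction
print(float(np.abs(weights - deq).max()) <= scale / 2 + 1e-6)
```

Storing the int8 tensor plus one scale per layer cuts memory by about 4x at a small, bounded accuracy cost, which is why low-bit quantization is a standard first step toward real-time inference.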