doi.org/10.18653/v1/p19-1285
Transformer-XL: Attentive Language Models beyond a Fixed-Length Context
Zihang Dai, Zhilin Yang, ..., Ruslan Salakhutdinov (6 authors)
Published: Jan 1, 2019
2,115 Citations
Paper Fields
Electrical engineering, Dependency (UML), Perplexity, Engineering, Language model, Voltage, Artificial intelligence, Natural language processing, Treebank, Computer science, Transformer, Hyperparameter
Paper Details
Title: Transformer-XL: Attentive Language Models beyond a Fixed-Length Context
DOI: doi.org/10.18653/v1/p19-1285
Published Date: Jan 1, 2019