UNKs Everywhere: Adapting Multilingual Language Models to New Scripts

EMNLP 2021
Pages: 10186 - 10203
Published: Aug 26, 2021
Abstract
Massively multilingual language models such as multilingual BERT offer state-of-the-art cross-lingual transfer performance on a range of NLP tasks. However, due to limited model capacity and large differences in pretraining data sizes, there is a profound performance gap between resource-rich and resource-poor target languages. The ultimate challenge is dealing with under-resourced languages not covered at all by the models and written in scripts...