Compositional and Lexical Semantics in RoBERTa, BERT and DistilBERT: A Case Study on CoQA

Pages: 7046 - 7056
Published: Jan 1, 2020
Abstract
Many NLP tasks have benefited from transferring knowledge from contextualized word embeddings; however, the picture of what type of knowledge is transferred is incomplete. This paper studies the types of linguistic phenomena accounted for by language models in the context of a Conversational Question Answering (CoQA) task. We identify the problematic areas for the finetuned RoBERTa, BERT and DistilBERT models through systematic error analysis -...