Abstract
Language models like BERT and SpanBERT pretrained on open-domain data have obtained impressive gains on various NLP tasks. In this paper, we probe the effectiveness of domain-adaptive pretraining objectives on downstream tasks. In particular, three objectives, including a novel objective focusing on modeling predicate-argument relations, are evaluated on two challenging dialogue understanding tasks. Experimental results demonstrate that domain-adaptive pretraining with proper objectives can significantly improve the performance of a strong baseline on these tasks, achieving new state-of-the-art performance.
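As a rough, hedged illustration of the general setting described in the abstract (not the paper's three objectives, and in particular not the predicate-argument objective), the sketch below shows generic domain-adaptive pretraining: continuing masked-language-model training of a pretrained encoder on in-domain dialogue text before fine-tuning on a downstream task. The checkpoint name, corpus file, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch of domain-adaptive pretraining: continue masked-language-model
# (MLM) training of a pretrained encoder on in-domain dialogue text.
# The corpus path and hyperparameters are illustrative, not taken from the paper.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "bert-base-uncased"  # could equally be a SpanBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Hypothetical plain-text file with one dialogue utterance per line.
dataset = load_dataset("text", data_files={"train": "dialogue_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard MLM masking; the paper's domain-adaptive objectives (e.g. modeling
# predicate-argument relations) would replace or extend this masking strategy.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="dapt-dialogue",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()  # the adapted model is then fine-tuned on the dialogue understanding task
```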
Original language | English |
---|---|
Title of host publication | Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing |
Subtitle of host publication | Short Papers |
Publisher | Association for Computational Linguistics |
Pages | 665-669 |
Number of pages | 5 |
Volume | 2 |
ISBN (Print) | 978-1-954085-53-4 |
DOIs | |
Publication status | Published - Aug 2021 |
Event | 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021) - Virtual |
Duration | 1 Aug 2021 → 6 Aug 2021 |
Internet address | https://2021.aclweb.org/ |
Publication series
Name | ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference |
---|---|
Volume | 2 |
Conference
Conference | 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021) |
---|---|
Period | 1/08/21 → 6/08/21 |
Internet address | https://2021.aclweb.org/ |