Domain-Adaptive Pretraining Methods for Dialogue Understanding

Han Wu, Kun Xu, Linfeng Song, Lifeng Jin, Haisong Zhang, Linqi Song

Research output: RGC 32 - Refereed conference paper (with host publication), peer-reviewed

15 Citations (Scopus)

Abstract

Language models like BERT and SpanBERT pretrained on open-domain data have obtained impressive gains on various NLP tasks. In this paper, we probe the effectiveness of domain-adaptive pretraining objectives on downstream tasks. In particular, three objectives, including a novel objective focusing on modeling predicate-argument relations, are evaluated on two challenging dialogue understanding tasks. Experimental results demonstrate that domain-adaptive pretraining with proper objectives can significantly improve the performance of a strong baseline on these tasks, achieving new state-of-the-art performance.
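The abstract does not spell out the three objectives, but domain-adaptive pretraining of BERT-style models typically means continuing a masked-language-modeling objective on in-domain (here, dialogue) text. As a hedged illustration only — not the authors' actual objectives — below is a minimal sketch of BERT-style input corruption (of the selected ~15% of tokens: 80% become [MASK], 10% a random vocabulary token, 10% stay unchanged), using a toy vocabulary:

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, mask_token="[MASK]", rng=None):
    """BERT-style MLM corruption.

    Selects each position with probability `mask_prob`; of the selected
    positions, 80% are replaced by `mask_token`, 10% by a random token
    from `vocab`, and 10% are left unchanged. Returns (corrupted, labels),
    where labels[i] is the original token at selected positions and None
    elsewhere (i.e., only selected positions contribute to the loss).
    """
    rng = rng or random.Random(0)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)          # predict the original token here
            r = rng.random()
            if r < 0.8:
                corrupted.append(mask_token)
            elif r < 0.9:
                corrupted.append(rng.choice(vocab))
            else:
                corrupted.append(tok)   # keep original, still predicted
        else:
            labels.append(None)         # position not selected, no loss
            corrupted.append(tok)
    return corrupted, labels
```

In a real domain-adaptive setup this corruption would be applied to tokenized dialogue turns before continuing pretraining; the paper's novel predicate-argument objective would require additional supervision (e.g., from a semantic parser) that this sketch does not model.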
Original language: English
Title of host publication: Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing
Subtitle of host publication: Short Papers
Publisher: Association for Computational Linguistics
Pages: 665-669
Number of pages: 5
Volume: 2
ISBN (Print): 978-1-954085-53-4
Publication status: Published - Aug 2021
Event: 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021) - Virtual
Duration: 1 Aug 2021 - 6 Aug 2021
https://2021.aclweb.org/

Publication series

Name: ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference
Volume: 2

Conference

Conference: 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021)
Period: 1/08/21 - 6/08/21
