Abstract
Meta-embedding learning, which combines complementary information from different word embeddings, has shown superior performance across a range of Natural Language Processing tasks. However, existing meta-embedding methods still ignore domain-specific knowledge, which results in unstable performance on specific domains. Moreover, the relative importance of general and domain word embeddings depends on the downstream task, and how to regularize meta-embeddings to adapt to downstream tasks remains an unsolved problem. In this paper, we propose a method to incorporate both domain-specific and task-oriented information into meta-embeddings. We conducted extensive experiments on four text classification datasets, and the results show the effectiveness of the proposed method.
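To make the idea of meta-embedding concrete, here is a minimal sketch (not the authors' method) of combining a general-purpose and a domain-specific word vector through a weight that would be tuned or learned per downstream task. All names (`general_emb`, `domain_emb`, `alpha`) are hypothetical illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["protein", "cell", "bank"]

# Stand-ins for two pretrained lookup tables:
# a general-purpose embedding and a domain-specific one.
general_emb = {w: rng.standard_normal(4) for w in vocab}
domain_emb = {w: rng.standard_normal(4) for w in vocab}

def meta_embed(word: str, alpha: float) -> np.ndarray:
    """Convex combination of the two sources; alpha would be
    regularized or learned to suit the downstream task."""
    return alpha * general_emb[word] + (1.0 - alpha) * domain_emb[word]

print(meta_embed("protein", alpha=0.3))
```

In practice the combination weight (or a full attention mechanism over the source embeddings) is trained jointly with the downstream classifier, which is what makes the meta-embedding task-oriented.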
Original language | English |
---|---|
Title of host publication | Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing |
Editors | Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Publisher | Association for Computational Linguistics |
Pages | 3508-3513 |
ISBN (Print) | 9781952148606 |
Publication status | Published - Nov 2020 |
Event | 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020), Virtual; Duration: 16 Nov 2020 → 20 Nov 2020; https://2020.emnlp.org/
Publication series
Name | EMNLP - Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference |
---|---|
Conference
Conference | 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020) |
---|---|
Period | 16/11/20 → 20/11/20 |
Internet address | https://2020.emnlp.org/