Enhancing Low-Resource NLP by Consistency Training With Data and Model Perturbations
Abstract
Natural language processing (NLP) has recently shown significant progress in rich-resource scenarios. However, it is much less effective in low-resource scenarios, where models easily overfit the limited training data and generalize poorly on test data. In recent years, consistency training has been widely adopted and has shown great promise in deep learning, but it remains underexplored in low-resource settings. In this work, we propose DM-CT, a framework that incorporates both data-level and model-level consistency training, as well as advanced data augmentation techniques, for low-resource scenarios. Concretely, the input data is first augmented, and the output distributions of different sub-models generated by model variance are forced to be consistent (model-level consistency). Meanwhile, the predictions for the original input and the augmented input are also constrained to be consistent (data-level consistency). Experiments on different low-resource NLP tasks, including neural machine translation (4 IWSLT14 translation tasks, a multilingual translation task, and WMT16 Romanian → English translation), natural language understanding (the GLUE benchmark), and named entity recognition (CoNLL2003 and WikiGold), demonstrate the superiority of DM-CT through significant and consistent performance improvements. © 2023 IEEE.
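The two consistency terms described in the abstract can be illustrated with a minimal numpy sketch. Everything here is a hypothetical stand-in, not the paper's implementation: a toy linear classifier with dropout plays the role of the model, a second dropout forward pass plays the role of a sub-model induced by model variance, and additive noise plays the role of data augmentation. KL divergence is used as the consistency measure, which is a common choice but an assumption on our part.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 4))  # toy classifier weights: 8 features -> 4 classes

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def forward(x, dropout=0.3):
    # Each call samples a fresh dropout mask, i.e. a different sub-model.
    mask = rng.random(x.shape) >= dropout
    return softmax((x * mask) @ W)

def kl(p, q, eps=1e-12):
    # KL(p || q) with a small epsilon for numerical stability.
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

x = rng.normal(size=8)                 # original input
x_aug = x + 0.1 * rng.normal(size=8)   # noise-based augmentation (illustrative)

p1 = forward(x)       # sub-model 1 on the original input
p2 = forward(x)       # sub-model 2 on the same input
p_aug = forward(x_aug)  # prediction on the augmented input

model_consistency = kl(p1, p2)    # model-level consistency term
data_consistency = kl(p1, p_aug)  # data-level consistency term
loss = model_consistency + data_consistency
```

In training, these consistency terms would be added to the task loss and minimized jointly, pushing the sub-models' output distributions toward one another and the augmented input's prediction toward the original's.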
| Original language | English |
|---|---|
| Pages (from-to) | 189-199 |
| Journal | IEEE/ACM Transactions on Audio, Speech, and Language Processing |
| Volume | 32 |
| Online published | 19 Oct 2023 |
| DOIs | |
| Publication status | Published - 2023 |
Research Keywords
- Consistency training
- data augmentation
- low-resource
- natural language processing
Projects
- 1 Finished
- CRF: Multi-sourced Event Detection and Multi-dimensional Analysis based on Event Cube
Li, Q. (Main Project Coordinator [External]) & YU, X. N. (Principal Investigator / Project Coordinator)
1/03/19 → 31/07/24
Project: Research