Incorporating Effective Global Information via Adaptive Gate Attention for Text Classification
Research output: Working Papers › Working paper
Detail(s)
| Original language | English |
|---|---|
| Number of pages | 7 |
| Publication status | Published - 22 Feb 2020 |
Link(s)
Permanent Link: https://scholars.cityu.edu.hk/en/publications/publication(fd2921bf-4e7d-4c1e-8906-87b6eb658bb7).html
Abstract
Most text classification studies focus on training classifiers using textual instances alone or on introducing external knowledge (e.g., hand-crafted features and domain expert knowledge). In contrast, corpus-level statistical features, such as word frequency and distribution, are not well exploited. Our work shows that such simple statistical information can enhance classification performance both efficiently and significantly compared with several baseline models. In this paper, we propose a classifier with a gate mechanism, named the Adaptive Gate Attention model with Global Information (AGA+GI), in which the adaptive gate mechanism incorporates global statistical features into latent semantic features and the attention layer captures dependency relationships within a sentence. To alleviate overfitting, we propose a novel Leaky Dropout mechanism that improves generalization ability and performance stability. Our experiments show that the proposed method achieves better accuracy than CNN-based and RNN-based approaches without global information on several benchmarks.
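The abstract does not spell out the gate's formulation, but a common way to fuse two feature vectors with a learned sigmoid gate looks like the following minimal numpy sketch. All names (`adaptive_gate`, the weight shapes, the random features) are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def adaptive_gate(h, s, W, b):
    """Hypothetical sketch: fuse latent semantic features h with
    global statistical features s via a learned sigmoid gate.
    The gate g lies in (0, 1), so the output is an element-wise
    convex combination of h and s."""
    g = sigmoid(np.concatenate([h, s]) @ W + b)
    return g * h + (1.0 - g) * s

rng = np.random.default_rng(0)
d = 8
h = rng.normal(size=d)                 # latent semantic features (assumed)
s = rng.normal(size=d)                 # global statistics, e.g. word frequency (assumed)
W = rng.normal(size=(2 * d, d)) * 0.1  # gate weights (assumed shape)
b = np.zeros(d)

fused = adaptive_gate(h, s, W, b)
```

Because the gate output is bounded in (0, 1), each fused coordinate stays between the corresponding coordinates of `h` and `s`, which lets the model interpolate adaptively between semantic and statistical evidence.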
Research Area(s)
- cs.CL
Citation Format(s)
Incorporating Effective Global Information via Adaptive Gate Attention for Text Classification. / Li, Xianming; Li, Zongxi; Zhao, Yingbin et al.
2020.