Improving Short Text Modeling by Two-level Attention Networks for Sentiment Classification
Refereed conference paper presented and published in conference proceedings


Other information
Abstract: Understanding short texts is crucial to many applications, but it has always been challenging due to the sparsity and ambiguity of information in short texts. In addition, the sentiments expressed in such user-generated short texts are often implicit and context-dependent. To address this, we propose a novel model based on two-level attention networks to identify the sentiment of short texts. Our model first adopts an attention mechanism to capture both local features and long-distance dependent features simultaneously, making it more robust against irrelevant information. The attention-based features are then non-linearly combined by a bidirectional recurrent attention network, which enhances the expressive power of the model and automatically captures more relevant feature combinations. We evaluate the performance of our model on the MR, SST-1, and SST-2 datasets. The experimental results show that our model outperforms previous methods.
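
The following is a minimal sketch of the two-level design the abstract describes (word-level attention followed by a bidirectional recurrent layer with attention pooling), assuming a PyTorch implementation; the layer choices, dimensions, and attention formulation are illustrative assumptions, not the authors' released code.

```python
# Illustrative sketch only (assumed PyTorch); hyperparameters and layer choices
# are assumptions, not the implementation from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoLevelAttentionClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Level 1: word-level self-attention over the embedded short text,
        # intended to weight both local and long-distance dependent features.
        self.word_attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
        # Level 2: bidirectional recurrent layer plus an attention pooling step
        # that non-linearly combines the attention-weighted features.
        self.bigru = nn.GRU(embed_dim, hidden_dim, bidirectional=True, batch_first=True)
        self.attn_vector = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        x = self.embedding(token_ids)            # (batch, seq, embed)
        attn_out, _ = self.word_attn(x, x, x)    # level-1 attention features
        states, _ = self.bigru(attn_out)         # (batch, seq, 2*hidden)
        scores = self.attn_vector(states)        # (batch, seq, 1)
        weights = F.softmax(scores, dim=1)       # level-2 attention weights
        pooled = (weights * states).sum(dim=1)   # attention-weighted sentence vector
        return self.classifier(pooled)           # sentiment logits


# Example usage with random token ids for a batch of two short texts.
model = TwoLevelAttentionClassifier(vocab_size=10000)
batch = torch.randint(0, 10000, (2, 12))
print(model(batch).shape)  # torch.Size([2, 2])
```

For SST-1, which has five sentiment labels, `num_classes` would be set to 5; MR and SST-2 are binary tasks.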
All Author(s) List: Yulong Li, Yi Cai, Ho-fung Leung, Qing Li
Name of Conference: 23rd International Conference on Database Systems for Advanced Applications (DASFAA 2018)
Start Date of Conference: 21/05/2018
End Date of Conference: 24/05/2018
Place of Conference: Gold Coast
Country/Region of Conference: Australia
Proceedings Title: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Year: 2018
Volume Number: 10827
Publisher: Springer
Pages: 878–890
ISBN: 978-3-319-91451-0
eISBN: 978-3-319-91452-7
ISSN: 0302-9743
Languages: English (United States)