Topic-Aware Neural Keyphrase Generation for Social Media Language
Other conference paper


Other information
Abstract: A huge volume of user-generated content is produced on social media every day. To facilitate automatic language understanding, we study keyphrase prediction, distilling salient information from massive posts. While most existing methods extract words from source posts to form keyphrases, we propose a sequence-to-sequence (seq2seq) based neural keyphrase generation framework, enabling absent keyphrases to be created. Moreover, our model, being topic-aware, allows joint modeling of corpus-level latent topic representations, which helps alleviate the data sparsity widely exhibited in social media language. Experiments on three datasets collected from English and Chinese social media platforms show that our model significantly outperforms both extraction and generation models that do not exploit latent topics. Further discussions show that our model learns meaningful topics, which explains its superiority in social media keyphrase generation.
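To illustrate the general idea described in the abstract, the sketch below shows one possible way a seq2seq keyphrase generator can be conditioned on a latent topic vector. This is a minimal, hypothetical example only: the module names, dimensions, and the choice to inject the topic vector by concatenating it with each decoder input are assumptions for illustration, not the authors' released model.

```python
# Minimal sketch (not the paper's implementation): a GRU-based seq2seq
# keyphrase generator whose decoder is conditioned on a topic vector,
# e.g. one inferred by a corpus-level neural topic model.
import torch
import torch.nn as nn


class TopicAwareSeq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, topic_dim=50):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Decoder input = word embedding concatenated with the topic vector.
        self.decoder = nn.GRU(emb_dim + topic_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids, topic_vec):
        # src_ids, tgt_ids: (batch, seq_len); topic_vec: (batch, topic_dim)
        _, h = self.encoder(self.embedding(src_ids))       # encode the source post
        tgt_emb = self.embedding(tgt_ids)                   # (B, T, emb_dim)
        topic = topic_vec.unsqueeze(1).expand(-1, tgt_emb.size(1), -1)
        dec_in = torch.cat([tgt_emb, topic], dim=-1)        # inject topic at each step
        dec_out, _ = self.decoder(dec_in, h)
        return self.out(dec_out)                            # per-step vocabulary logits


if __name__ == "__main__":
    model = TopicAwareSeq2Seq(vocab_size=1000)
    src = torch.randint(0, 1000, (2, 20))   # two toy source posts
    tgt = torch.randint(0, 1000, (2, 5))    # target keyphrase tokens
    topic = torch.rand(2, 50)               # stand-in for an inferred topic vector
    print(model(src, tgt, topic).shape)     # torch.Size([2, 5, 1000])
```

Because the decoder generates tokens from the full vocabulary rather than copying only words present in the post, such a setup can in principle produce absent keyphrases, which is the motivation stated in the abstract.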
Acceptance Date: 14/05/2019
All Author(s) List: Yue Wang, Jing Li, Hou Pong Chan, Irwin King, Michael R. Lyu, Shuming Shi
Name of Conference: The 57th Annual Meeting of the Association for Computational Linguistics
Start Date of Conference: 28/07/2019
End Date of Conference: 02/08/2019
Place of Conference: Florence
Country/Region of Conference: Italy
Year: 2019
Languages: English-United States