Contrastive Learning with Dialogue Attributes for Neural Dialogue Generation
Refereed conference paper presented and published in conference proceedings

Other information
Abstract: Designing an effective learning method remains a challenge in neural dialogue generation systems as it requires the training objective to well approximate the intrinsic human-preferred dialogue properties. Conventional training approaches such as maximum likelihood estimation focus on modeling general syntactic patterns and may fail to capture intricate conversational characteristics. Contrastive dialogue learning offers an effective training schema by explicitly training a neural dialogue model on multiple positive and negative conversational pairs. However, constructing contrastive learning pairs is non-trivial, and multiple dialogue attributes have been found to be crucial for governing the human judgments of conversations. This paper proposes to guide the response generation with attribute-aware contrastive learning to improve the overall quality of the generated responses, where contrastive learning samples are generated according to various important dialogue attributes each specializing in a different principle of conversation. Extensive experiments show that our proposed techniques are crucial to achieving superior model performance.
Acceptance Date: 17/02/2023
All Author(s) List: Jie Tan, Hengyi Cai, Hongshen Chen, Hong Cheng, Helen Meng, Zhuoye Ding
Name of Conference: 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Start Date of Conference: 04/06/2023
End Date of Conference: 10/06/2023
Place of Conference: Rhodes Island
Country/Region of Conference: Greece
Proceedings Title: ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Article number: 10097068
Languages: English-United States
Keywords: Dialogue Generation, Contrastive Learning, Conversational Attributes, Adversarial Perturbations
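The abstract describes training on positive and negative conversational pairs grouped by dialogue attribute. A minimal sketch of such an objective, assuming a generic InfoNCE-style formulation over precomputed context-response similarity scores (this is illustrative and not the authors' exact objective; the function names, the attribute keys, and the temperature value are all assumptions):

```python
import math

def contrastive_loss(sim_pos, sim_negs, temperature=0.1):
    """InfoNCE-style loss for one context.

    sim_pos:  similarity between the context and its positive response.
    sim_negs: similarities between the context and negative responses.
    Lower loss means the positive is ranked above the negatives.
    """
    logits = [sim_pos / temperature] + [s / temperature for s in sim_negs]
    # Log-sum-exp with the max subtracted for numerical stability.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(l - m) for l in logits))
    return -(sim_pos / temperature - log_sum)

def attribute_aware_loss(per_attribute_pairs, temperature=0.1):
    """Average the contrastive loss over several dialogue attributes.

    per_attribute_pairs maps an attribute name (e.g. "coherence") to a
    (sim_pos, sim_negs) pair built from samples that contrast responses
    along that attribute only.
    """
    losses = [
        contrastive_loss(sim_pos, sim_negs, temperature)
        for sim_pos, sim_negs in per_attribute_pairs.values()
    ]
    return sum(losses) / len(losses)
```

In this sketch, each attribute contributes its own positive/negative pairs, so a response that is fluent but incoherent can still be penalized on the coherence term; the per-attribute losses are averaged, though a weighted combination would be an equally plausible choice.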

Last updated on 2024-01-31 at 11:42