Title More is Better: Enhancing Open-Domain Dialogue Generation via Multi-Source Heterogeneous Knowledge
Authors Wu, Sixing
Li, Ying
Wang, Minghui
Zhang, Dawei
Zhou, Yang
Wu, Zhonghai
Affiliation Peking Univ, Sch Elect Engn & Comp Sci, Beijing, Peoples R China
Peking Univ, Sch Software & Microelect, Beijing, Peoples R China
Auburn Univ, Auburn, AL, USA
Peking Univ, Natl Res Ctr Software Engn, Beijing, Peoples R China
Peking Univ, Key Lab High Confidence Software Technol MOE, Beijing, Peoples R China
Issue Date 2021
Publisher Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)
Abstract Despite achieving remarkable performance, previous knowledge-enhanced works usually use only a single-source, homogeneous knowledge base with limited knowledge coverage. Consequently, they often degenerate into traditional methods because not all dialogues can be linked to knowledge entries. This paper proposes a novel dialogue generation model, MSKE-Dialog, which solves this issue with three unique advantages: (1) Rather than a single source, MSKE-Dialog can simultaneously leverage multiple heterogeneous knowledge sources (including, but not limited to, commonsense knowledge facts, text knowledge, and infobox knowledge) to improve knowledge coverage; (2) To avoid topic conflicts among the context and the different knowledge sources, we propose a Multi-Reference Selection mechanism to better select context/knowledge; (3) We propose a Multi-Reference Generation mechanism that generates informative responses by referring to multiple generation references at the same time. Extensive evaluations on a Chinese dataset show the superior performance of this work against various state-of-the-art approaches. To the best of our knowledge, this work is the first to use multi-source heterogeneous knowledge in open-domain knowledge-enhanced dialogue generation.
URI http://hdl.handle.net/20.500.11897/654800
ISBN 978-1-955917-09-4
Indexed CPCI-SSH(ISSHP)
CPCI-S(ISTP)
Appears in Collections: School of Electronics Engineering and Computer Science
School of Software and Microelectronics
Key Laboratory of High Confidence Software Technologies (MOE)

Files in This Work
There are no files associated with this item.

License: See PKU IR operational policies.