Title Hierarchical Curriculum Learning for AMR Parsing
Authors Wang, Peiyi
Chen, Liang
Liu, Tianyu
Dai, Damai
Cao, Yunbo
Chang, Baobao
Sui, Zhifang
Affiliation Peking University, Key Laboratory of Computational Linguistics, MOE, Beijing, People's Republic of China
Tencent Cloud Xiaowei, Shenzhen, People's Republic of China
Issue Date 2022
Publisher Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Volume 2: Short Papers
Abstract Abstract Meaning Representation (AMR) parsing aims to translate sentences into semantic representations with a hierarchical structure, and has recently been empowered by pretrained sequence-to-sequence models. However, there is a gap between their flat training objective (i.e., treating all output tokens equally) and the hierarchical AMR structure, which limits model generalization. To bridge this gap, we propose a Hierarchical Curriculum Learning (HCL) framework with a Structure-level Curriculum (SC) and an Instance-level Curriculum (IC). SC switches progressively from core to detailed AMR semantic elements, while IC transitions from structure-simple to structure-complex AMR instances during training. Through these two warm-up processes, HCL reduces the difficulty of learning complex structures, so the flat model can better adapt to the AMR hierarchy. Extensive experiments on AMR 2.0, AMR 3.0, and structure-complex and out-of-distribution settings verify the effectiveness of HCL.
URI http://hdl.handle.net/20.500.11897/649484
ISBN 978-1-955917-22-3
Indexed CPCI-SSH (ISSHP)
CPCI-S (ISTP)
Appears in Collections: Key Laboratory of Computational Linguistics, Ministry of Education
