Title ACMo: Angle-Calibrated Moment Methods for Stochastic Optimization
Authors Huang, Xunpeng; Xu, Runxin; Zhou, Hao; Wang, Zhe; Liu, Zhengyang; Li, Lei
Affiliations ByteDance AI Lab, Shanghai, People's Republic of China; Peking University, Beijing, People's Republic of China; Ohio State University, Columbus, OH 43210, USA; Beijing Institute of Technology, Beijing, People's Republic of China
Issue Date 2021
Publisher Thirty-Fifth AAAI Conference on Artificial Intelligence, Thirty-Third Conference on Innovative Applications of Artificial Intelligence, and the Eleventh Symposium on Educational Advances in Artificial Intelligence
Abstract Stochastic gradient descent (SGD) is widely used for its strong generalization ability and simplicity, and adaptive gradient methods have been proposed to further accelerate optimization. In this paper, we revisit existing adaptive gradient methods with a new interpretation. This new perspective leads to a refreshed understanding of the role of second moments in stochastic optimization. Based on it, we propose the Angle-Calibrated Moment method (ACMo), a novel stochastic optimization method that enjoys the benefits of second moments while maintaining only first-moment updates. Theoretical analysis shows that ACMo achieves the same convergence rate as mainstream adaptive methods. Experiments on a variety of CV and NLP tasks demonstrate that ACMo converges comparably to state-of-the-art Adam-type optimizers and, in most cases, generalizes better. The code is available at https://github.com/Xunpeng746/ACMo.
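Note: the abstract does not spell out ACMo's update rule. The Python sketch below is one hedged reading of "angle calibration" with first-moment-only state: the previous momentum is projected orthogonal to the current gradient before being added back, so the combined direction keeps an acute angle with the gradient. The class name ACMoSketch and the hyperparameters lr, beta, and eps are illustrative assumptions, not the authors' published algorithm; the reference implementation is at https://github.com/Xunpeng746/ACMo.

    import torch

    class ACMoSketch(torch.optim.Optimizer):
        """Minimal sketch of an angle-calibrated momentum step.

        One plausible reading of the ACMo abstract, NOT the authors'
        verified algorithm; see https://github.com/Xunpeng746/ACMo.
        """

        def __init__(self, params, lr=1e-3, beta=0.9, eps=1e-8):
            super().__init__(params, dict(lr=lr, beta=beta, eps=eps))

        @torch.no_grad()
        def step(self, closure=None):
            loss = None
            if closure is not None:
                with torch.enable_grad():
                    loss = closure()
            for group in self.param_groups:
                for p in group["params"]:
                    if p.grad is None:
                        continue
                    g = p.grad
                    m = self.state[p].setdefault("m", torch.zeros_like(p))
                    # Assumed calibration: drop the old momentum's component
                    # along the current gradient, so <m_new, g> ~ ||g||^2 >= 0
                    # and the update keeps an acute angle with the gradient.
                    proj = (m * g).sum() / (g.pow(2).sum() + group["eps"])
                    m_new = g + group["beta"] * (m - proj * g)
                    self.state[p]["m"] = m_new
                    p.add_(m_new, alpha=-group["lr"])
            return loss

Illustrative usage: opt = ACMoSketch(model.parameters(), lr=1e-3), then the usual loss.backward(); opt.step(); opt.zero_grad() training loop.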
URI http://hdl.handle.net/20.500.11897/623146
ISBN 978-1-57735-866-4
ISSN 2159-5399
Indexed CPCI-S (ISTP)
Appears in Collections: Unclaimed

Files in This Work
There are no files associated with this item.

License: See PKU IR operational policies.