Title | Boosting with structural sparsity: A differential inclusion approach |
Authors | Huang, Chendi; Sun, Xinwei; Xiong, Jiechao; Yao, Yuan |
Affiliation | Peking Univ, Sch Math Sci, LMAM LMEQF LMPR, Beijing 100871, Peoples R China; Hong Kong Univ Sci & Technol, Dept Math, Clear Water Bay, Kowloon, Hong Kong, Peoples R China |
Keywords | RECOVERY; LASSO; ALGORITHMS; REGRESSION; SELECTION; PATH |
Issue Date | Jan-2020 |
Journal | APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS |
Abstract | Boosting, viewed as a family of gradient descent algorithms, is a popular method in machine learning. In this paper, a novel Boosting-type algorithm is proposed based on restricted gradient descent with structural sparsity control, whose underlying dynamics are governed by differential inclusions. In particular, we present an iterative regularization path with structural sparsity, in which the parameter is sparse under some linear transform, based on variable splitting and the Linearized Bregman Iteration; hence the algorithm is called Split LBI. Despite its simplicity, Split LBI outperforms the popular generalized Lasso in both theory and experiments. A theory of path consistency is presented, showing that, when equipped with proper early stopping, Split LBI achieves model selection consistency under a family of Irrepresentable Conditions that can be weaker than the necessary and sufficient condition for generalized Lasso. Furthermore, ℓ2 error bounds are given at the minimax optimal rates. The utility and benefit of the algorithm are illustrated by several applications, including image denoising, partial order ranking of sport teams, and world university grouping with crowdsourced ranking data. © 2018 Elsevier Inc. All rights reserved. |
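The abstract describes an iteration built from variable splitting plus the Linearized Bregman Iteration: the loss is augmented with a splitting term `||D beta - gamma||^2 / (2 nu)`, `beta` follows a gradient step, while `gamma` follows a Bregman (dual) update that yields a sparse regularization path. A minimal NumPy sketch of such an iteration is below; the function name `split_lbi`, the default hyperparameters, and the step-size heuristic are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def split_lbi(X, y, D, nu=1.0, kappa=10.0, alpha=None, n_iters=500):
    """Sketch of a Split LBI-style iteration (illustrative, not the paper's exact rule).

    Augmented loss: L(beta, gamma) = ||y - X beta||^2 / (2n) + ||D beta - gamma||^2 / (2 nu).
    beta takes plain gradient steps; gamma is updated via a Linearized Bregman
    (dual accumulation + soft-thresholding) step, producing a sparse path in gamma.
    """
    n, p = X.shape
    m = D.shape[0]
    if alpha is None:
        # conservative step size so the iteration stays stable (heuristic assumption)
        alpha = nu / (kappa * (2.0 + np.linalg.norm(X, 2) ** 2 / n
                               + np.linalg.norm(D, 2) ** 2 / nu))
    beta = np.zeros(p)
    gamma = np.zeros(m)
    z = np.zeros(m)  # dual variable accumulating gradients for gamma
    path = []
    for _ in range(n_iters):
        resid = y - X @ beta
        split = D @ beta - gamma
        grad_beta = -X.T @ resid / n + D.T @ split / nu
        grad_gamma = -split / nu
        beta = beta - kappa * alpha * grad_beta      # gradient step on beta
        z = z - alpha * grad_gamma                   # Bregman dual update
        gamma = kappa * np.sign(z) * np.maximum(np.abs(z) - 1.0, 0.0)  # soft-threshold
        path.append((beta.copy(), gamma.copy()))
    return path
```

Reading the returned `path` from early to late iterations traces a regularization path: early iterates keep `gamma` fully sparse, and support enters as the dual variable `z` crosses the unit threshold, which is where the early-stopping theory in the abstract applies.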
URI | http://hdl.handle.net/20.500.11897/584699 |
ISSN | 1063-5203 |
DOI | 10.1016/j.acha.2017.12.004 |
Indexed | SCI(E) Scopus EI |
Appears in Collections: | School of Mathematical Sciences; Key Laboratory of Mathematics and Its Applications (Ministry of Education) |