Title | Federated Sparse Gaussian Processes |
Authors | Guo, Xiangyang; Wu, Daqing; Ma, Jinwen |
Affiliation | Peking Univ, Dept Informat & Computat Sci, Sch Math Sci, Beijing 100871, Peoples R China; Peking Univ, LMAM, Beijing 100871, Peoples R China |
Issue Date | 2022 |
Published in | INTELLIGENT COMPUTING METHODOLOGIES, PT III |
Abstract | In this paper, we propose a federated sparse Gaussian process (FSGP) model, which combines the sparse Gaussian process (SGP) model with the framework of federated learning (FL). Sparsity reduces the time complexity of training a Gaussian process (GP) from O(N^3) to O(NM^2) and the space complexity from O(N^2) to O(NM), where N is the number of training samples and M (M << N) is the number of inducing points. Furthermore, FL aims at learning a shared model from data distributed across multiple clients, under the condition that the local data on each client cannot be accessed by other clients. Therefore, our proposed FSGP model can not only handle large datasets but also preserve privacy. FSGPs are trained through variational inference and applied to regression problems. In experiments, we compare the performance of FSGPs with that of federated Gaussian processes (FGPs) and of SGPs trained on the union of all local data. The experimental results show that FSGPs are comparable with SGPs and outperform FGPs. |
URI | http://hdl.handle.net/20.500.11897/657412 |
ISBN | 978-3-031-13832-4; 978-3-031-13831-7 |
ISSN | 0302-9743 |
DOI | 10.1007/978-3-031-13832-4_23 |
Indexed | EI CPCI-S(ISTP) |
Appears in Collections: | School of Mathematical Sciences; Key Laboratory of Mathematics and Its Applications (LMAM), Ministry of Education |
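The abstract's complexity claim — that inducing points cut GP training cost from O(N^3) to O(NM^2) — can be illustrated with a minimal NumPy sketch of an inducing-point posterior (the variational/SoR mean, in the spirit of the variational inference the paper uses; the kernel, data, and inducing-point locations below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0):
    # Squared-exponential kernel matrix between two sets of inputs.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
N, M = 500, 20                        # M << N inducing points
X = rng.uniform(-3, 3, size=(N, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)
Z = np.linspace(-3, 3, M)[:, None]    # inducing point locations (assumed grid)

noise = 0.1 ** 2
Kmm = rbf(Z, Z) + 1e-6 * np.eye(M)    # M x M: O(M^2) storage
Kmn = rbf(Z, X)                       # M x N: O(NM) storage, vs O(N^2) for full K

# The dominant cost is forming Kmn @ Kmn.T: O(NM^2).
# Only M x M systems are ever solved, vs the O(N^3) solve of a full GP.
A = Kmm + Kmn @ Kmn.T / noise
mu_u = Kmm @ np.linalg.solve(A, Kmn @ y) / noise   # posterior mean at Z

# Predict at test inputs through the inducing-point posterior.
Xs = np.linspace(-3, 3, 100)[:, None]
f_pred = rbf(Xs, Z) @ np.linalg.solve(Kmm, mu_u)
```

The full N x N kernel matrix is never built; everything flows through the M inducing points, which is also what makes a federated variant natural — clients can share statistics of size O(M) rather than raw data.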