Towards Long-Term Remembering in Federated Continual Learning

Geiping J, Bauermeister H, Dröge H, et al. Inverting gradients: how easy is it to break privacy in federated learning? Adv Neural Inf Process Syst. 2020;33:16937–47.

Konečný J, McMahan HB, Yu FX, et al. Federated learning: strategies for improving communication efficiency. arXiv:1610.05492 [Preprint]. 2016. Available at http://arxiv.org/abs/1610.05492.

Mendieta M, Yang T, Wang P, et al. Local learning matters: rethinking data heterogeneity in federated learning. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022. p. 8397–406.

Fang X, Ye M. Robust federated learning with noisy and heterogeneous clients. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022. p. 10072–81.

McMahan B, Moore E, Ramage D, et al. Communication-efficient learning of deep networks from decentralized data. Artificial Intelligence and Statistics. 2017. p. 1273–82.

Du K, Lyu F, Hu F, et al. AGCN: augmented graph convolutional network for lifelong multi-label image recognition. 2022 IEEE International Conference on Multimedia and Expo (ICME). 2022.

Du K, Lyu F, Li L, et al. Multi-label continual learning using augmented graph convolutional network. IEEE Trans Multimedia. 2023.

Liu D, Lyu F, Li L, et al. Centroid distance distillation for effective rehearsal in continual learning. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 2023. p. 1–5.

Xiong F, Liu Z, Huang K, et al. State primitive learning to overcome catastrophic forgetting in robotics. Cogn Comput. 2021;13:394–402.


Smith JS, Tian J, Halbe S, et al. A closer look at rehearsal-free continual learning. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2023. p. 2409–19.

Rolnick D, Ahuja A, Schwarz J, et al. Experience replay for continual learning. Adv Neural Inf Process Syst. 2019;32.

Mai Z, Li R, Kim H, et al. Supervised contrastive replay: revisiting the nearest class mean classifier in online class-incremental continual learning. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops. 2021. p. 3589–99.

Li Z, Hoiem D. Learning without forgetting. IEEE Trans Pattern Anal Mach Intell. 2017;40(12):2935–47.


Zenke F, Poole B, Ganguli S. Continual learning through synaptic intelligence. International Conference on Machine Learning. 2017. p. 3987–95.

Aljundi R, Babiloni F, Elhoseiny M, et al. Memory aware synapses: learning what (not) to forget. Proceedings of the European Conference on Computer Vision. 2018. p. 139–54.

Serra J, Suris D, Miron M, et al. Overcoming catastrophic forgetting with hard attention to the task. International Conference on Machine Learning. 2018. p. 4548–57.

Hendryx SM, KC DR, Walls B, et al. Federated reconnaissance: efficient, distributed, class-incremental learning. arXiv:2109.00150 [Preprint]. 2021. Available at http://arxiv.org/abs/2109.00150.

Usmanova A, Portet F, Lalanda P, et al. A distillation-based approach integrating continual learning and federated learning for pervasive services. arXiv:2109.04197 [Preprint]. 2021. http://arxiv.org/abs/2109.04197.

Li D, Wang J. FedMD: heterogeneous federated learning via model distillation. Advances in Neural Information Processing Systems Workshops. 2019.

Yao X, Sun L. Continual local training for better initialization of federated models. IEEE International Conference on Image Processing. 2020. p. 1736–40.

Bonawitz K, Eichner H, Grieskamp W, et al. Towards federated learning at scale: system design. Proceedings of Machine Learning and Systems. 2019;1:374–88.

Kirkpatrick J, Pascanu R, Rabinowitz N, et al. Overcoming catastrophic forgetting in neural networks. Proc Natl Acad Sci. 2017;114(13):3521–6.

Li T, Sahu AK, Zaheer M, et al. Federated optimization in heterogeneous networks. Proceedings of Machine Learning and Systems. 2020;2:429–50.

Lin T, Kong L, Stich SU, et al. Ensemble distillation for robust model fusion in federated learning. Adv Neural Inf Process Syst. 2020;33:2351–63.


Huang W, Ye M, Du B. Learn from others and be yourself in heterogeneous federated learning. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022. p. 10143–53.

Zhang L, Shen L, Ding L, et al. Fine-tuning global model via data-free knowledge distillation for non-IID federated learning. arXiv:2203.09249 [Preprint]. 2022. Available at http://arxiv.org/abs/2203.09249.

Huang Y, Chu L, Zhou Z, et al. Personalized cross-silo federated learning on non-IID data. Proceedings of the AAAI Conference on Artificial Intelligence, vol. 35, no. 9. 2021. p. 7865–73.

Li Q, He B, Song D. Model-contrastive federated learning. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2021. p. 10713–22.

Ezzeldin YH, Yan S, He C, et al. FairFed: enabling group fairness in federated learning. Proceedings of the AAAI Conference on Artificial Intelligence. 2023. p. 7494–502.
