Journal of Beijing University of Posts and Telecommunications


Journal of Beijing University of Posts and Telecommunications, 2023, Vol. 46, Issue 1: 12-18.


A Personalized Federated Learning Algorithm Based on Meta-Learning and Knowledge Distillation

孙艳华, 史亚会, 王朱伟, 李萌, 司鹏搏

  1. Beijing University of Technology
  • Received: 2021-12-29; Revised: 2022-02-12; Online: 2023-02-28; Published: 2023-02-22
  • Corresponding author: 孙艳华, E-mail: sunyanhua@bjut.edu.cn
  • Supported by: Beijing Natural Science Foundation


Abstract: In federated learning (FL), the data distributions across clients are typically heterogeneous, so a single unified model trained by FL cannot meet the performance requirements of every client. To address this problem, a personalized federated learning algorithm, meta-distillation federated learning, is proposed, which combines knowledge distillation and meta-learning with FL and embeds the personalization process into FL training. In each global iteration, the local model of each client (the student model) distills the global model (the teacher model) while feeding its own state back to the teacher model, which is continually updated; each client thereby obtains a better teacher model for personalized learning. Simulation results show that, compared with existing personalization algorithms, the proposed algorithm improves personalization accuracy while achieving a better trade-off between global accuracy and personalization accuracy.
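The per-client interaction described in the abstract (the student distills the teacher while the teacher is updated from the student's feedback) can be illustrated in code. Below is a minimal PyTorch sketch of one plausible reading of this update; the function client_update, the distillation temperature tau, the weighting alpha, and all other names and hyperparameters are illustrative assumptions rather than the authors' published implementation, and the meta-learning component (e.g., a MAML-style inner/outer update) is omitted for brevity.

# Hypothetical sketch of the per-client round described in the abstract:
# the local (student) model distills the global (teacher) model, and the
# teacher is simultaneously refined by the student's feedback on local data.
# All names, losses, and hyperparameters are illustrative assumptions.
import torch
import torch.nn.functional as F

def client_update(teacher, student, loader, tau=2.0, alpha=0.5,
                  lr_s=1e-2, lr_t=1e-3, local_epochs=1):
    """One client's personalization round: student distills teacher,
    teacher is updated from the student's feedback."""
    opt_s = torch.optim.SGD(student.parameters(), lr=lr_s)
    opt_t = torch.optim.SGD(teacher.parameters(), lr=lr_t)
    for _ in range(local_epochs):
        for x, y in loader:
            # Student step: supervised loss + KL distillation from teacher.
            with torch.no_grad():
                t_logits = teacher(x)
            s_logits = student(x)
            kd = F.kl_div(F.log_softmax(s_logits / tau, dim=1),
                          F.softmax(t_logits / tau, dim=1),
                          reduction="batchmean") * tau * tau
            loss_s = (1 - alpha) * F.cross_entropy(s_logits, y) + alpha * kd
            opt_s.zero_grad()
            loss_s.backward()
            opt_s.step()

            # Teacher feedback step: teacher moves toward the (detached)
            # student's predictions on this client's data.
            with torch.no_grad():
                s_soft = F.softmax(student(x) / tau, dim=1)
            t_logits = teacher(x)
            loss_t = F.kl_div(F.log_softmax(t_logits / tau, dim=1),
                              s_soft, reduction="batchmean") * tau * tau
            opt_t.zero_grad()
            loss_t.backward()
            opt_t.step()
    return student.state_dict(), teacher.state_dict()

Under this reading, the server would aggregate the returned teacher states (e.g., by federated averaging) into the next global model, while each client keeps its student as its personalized model.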

Key words: federated learning, meta-learning, knowledge distillation, personalization
