Netinfo Security ›› 2024, Vol. 24 ›› Issue (10): 1562-1569.doi: 10.3969/j.issn.1671-1122.2024.10.010


A Data-Free Personalized Federated Learning Algorithm Based on Knowledge Distillation

CHEN Jing1,2, ZHANG Jian1,2,3()   

  1. College of Computer Science, Nankai University, Tianjin 300350, China
    2. Tianjin Key Laboratory of Network and Data Security Technology, Tianjin 300350, China
    3. College of Cyber Science, Nankai University, Tianjin 300350, China
  • Received: 2024-05-03 Online: 2024-10-10 Published: 2024-09-27

Abstract:

Federated learning algorithms typically face large differences among clients, and this heterogeneity degrades the performance of the global model; knowledge distillation approaches can mitigate the problem. To remove the dependence on public data and further improve model performance, DFP-KD trained a robust federated learning global model using a data-free knowledge distillation method: it used ReACGAN as the generator component and adopted a step-by-step EMA fast-update strategy, which sped up the update rate of the global model while avoiding catastrophic forgetting. Comparison experiments, ablation experiments, and parameter-influence experiments show that DFP-KD outperforms classical data-free knowledge distillation algorithms in terms of accuracy, stability, and update rate.
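The step-by-step EMA update mentioned in the abstract can be illustrated with a minimal sketch. The names (`ema_update`, `stepwise_ema`) and the linear decay schedule below are illustrative assumptions, not the paper's actual formulation; the idea shown is only the general mechanism of blending newly distilled student weights into the global model so that old knowledge is not overwritten at once.

```python
# Hypothetical sketch of an EMA-style global-model update (illustrative
# only; parameter names and the decay schedule are assumptions, not the
# DFP-KD paper's exact method).

def ema_update(theta_global, theta_student, decay=0.9):
    """Blend distilled student weights into the global model:
    theta_g <- decay * theta_g + (1 - decay) * theta_s."""
    return [decay * g + (1.0 - decay) * s
            for g, s in zip(theta_global, theta_student)]

def stepwise_ema(theta_global, student_rounds, decay_start=0.99, decay_end=0.9):
    """Step-by-step schedule (assumed): decay shrinks over rounds, so early
    rounds preserve prior knowledge (guarding against catastrophic
    forgetting) while later rounds adopt new knowledge faster."""
    n = len(student_rounds)
    for i, theta_student in enumerate(student_rounds):
        decay = decay_start + (decay_end - decay_start) * i / max(n - 1, 1)
        theta_global = ema_update(theta_global, theta_student, decay)
    return theta_global
```

A higher decay keeps the global model close to its previous state; lowering it step by step is one way to trade stability for a faster update rate, which matches the trade-off the abstract describes.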

Key words: federated learning, heterogeneity, knowledge distillation, image generation
