Netinfo Security ›› 2025, Vol. 25 ›› Issue (6): 920-932.doi: 10.3969/j.issn.1671-1122.2025.06.007


A Masking-Based Selective Federated Distillation Scheme

ZHU Shuaishuai1,2, LIU Keqian2

  1. Key Laboratory of Network and Information Security under PAP, Xi’an 710086, China
    2. College of Cryptography Engineering, Engineering University of PAP, Xi’an 710086, China
  • Received: 2025-03-26  Online: 2025-06-10  Published: 2025-07-11

Abstract:

With the continued advance of machine learning technology, privacy protection has become increasingly important. Federated learning, a distributed machine learning framework, is now widely applied, yet in practice it still faces challenges in privacy leakage and efficiency. To address these challenges, this article proposed masking-based selective federated distillation (MSFD), which exploits the fact that federated distillation transfers knowledge rather than model parameters, thereby resisting white-box attacks and reducing communication overhead. By introducing an AES-based encrypted masking mechanism into the shared soft labels, the scheme resolved the vulnerability of plaintext soft-label sharing in selective federated distillation to black-box attacks, significantly improving black-box attack resistance and thus the overall security of selective federated distillation schemes. By embedding dynamic encrypted masks into client soft labels to obfuscate private information, and combining secret-channel negotiation with a per-round key-update mechanism, the scheme markedly reduced the risk of black-box attacks while preserving model performance, balancing the security and communication efficiency of federated learning. Security analysis and experimental results show that MSFD significantly reduces the success rate of black-box attacks on multiple datasets while maintaining classification accuracy, effectively improving privacy protection capabilities.
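The masking step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`mask_soft_labels`, `unmask_soft_labels`) and the use of SHA-256 in counter mode as a stand-in for the paper's AES keystream are assumptions for demonstration; the shared key would come from the secret-channel negotiation, and the round identifier models the round key-update mechanism.

```python
import hashlib
import struct

def keystream_mask(key: bytes, round_id: int, length: int, scale: float = 1.0):
    """Derive a pseudorandom additive mask from a shared key and round number.

    SHA-256 in counter mode is used here as a stdlib stand-in for the
    AES-CTR-style keystream in the scheme; each round_id yields a fresh mask,
    modeling the round key update.
    """
    mask = []
    counter = 0
    while len(mask) < length:
        block = hashlib.sha256(key + struct.pack(">QQ", round_id, counter)).digest()
        for i in range(0, len(block), 4):
            # Map each 4-byte chunk to a float in [-scale, scale).
            v = int.from_bytes(block[i:i + 4], "big") / 2**31 - 1.0
            mask.append(v * scale)
        counter += 1
    return mask[:length]

def mask_soft_labels(soft_labels, key: bytes, round_id: int):
    """Client side: add the keyed mask to the soft labels before upload."""
    m = keystream_mask(key, round_id, len(soft_labels))
    return [s + x for s, x in zip(soft_labels, m)]

def unmask_soft_labels(masked, key: bytes, round_id: int):
    """Key holder side: subtract the same mask to recover the soft labels."""
    m = keystream_mask(key, round_id, len(masked))
    return [s - x for s, x in zip(masked, m)]
```

An eavesdropper without the key sees only the masked values, while any party holding the negotiated key recovers the original soft labels exactly; changing `round_id` each round ensures masks are never reused.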

Key words: federated learning, knowledge distillation, masking mechanism, federated distillation, privacy protection
