Netinfo Security, 2025, Vol. 25, Issue (4): 524-535. doi: 10.3969/j.issn.1671-1122.2025.04.002


Research on Differential Privacy Methods for Medical Diagnosis Based on Knowledge Distillation

LI Xiao, SONG Xiao, LI Yong

School of Cyber Science and Technology, Beihang University, Beijing 100191, China
Received: 2024-11-25    Online: 2025-04-10    Published: 2025-04-25

Abstract:

With the rapid development of intelligent medical systems, the shortage of labeled data has become a key factor restricting research progress. Knowledge distillation, as an effective data utilization strategy, can alleviate this problem. However, in the intelligent medical field, models are often used in place of manual diagnosis of images and data, which not only imposes higher requirements on the protection of medical privacy but also makes model accuracy decisive for the reliability of diagnostic results. Therefore, this paper proposed a knowledge distillation scheme combined with differential privacy and applied it to graph neural network models, aiming to protect users' sensitive information during the knowledge distillation process while maintaining high medical diagnostic accuracy. To verify the effectiveness of the proposed method, this paper constructed a graph attention network (GAT) model and a convolutional neural network (CNN) model as a control group, and conducted experiments on three real-world medical image datasets. The experimental results show that the proposed GAT model achieves higher accuracy than the CNN model, rising from 61% to 68%, from 83% to 93%, and from 67% to 80% on the three datasets, respectively. Given the high resource overhead of the GAT model, this paper further designed a lightweight GAT architecture, which significantly reduces resource consumption while maintaining classification performance superior to the CNN model, thereby effectively improving medical diagnostic outcomes under differential privacy protection.
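The abstract does not spell out how differential privacy is injected into the distillation process, so the following is only a minimal illustrative sketch, assuming (as one common approach) that calibrated Gaussian noise is added to the teacher's outputs before the student is trained on them. The function name dp_distillation_loss and the parameters temperature, alpha, and noise_scale are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F

def dp_distillation_loss(teacher_logits, student_logits, labels,
                         temperature=2.0, alpha=0.5, noise_scale=0.1):
    """Distillation loss with noise-perturbed teacher targets.

    A hedged sketch of privacy-protected knowledge distillation: the student
    never sees the teacher's exact scores, only a noisy version of them.
    The specific noise mechanism and privacy accounting used in the paper
    are not described in the abstract; Gaussian noise is assumed here.
    """
    # Perturb the teacher's logits so its exact predictions are not exposed
    noisy_teacher = teacher_logits + noise_scale * torch.randn_like(teacher_logits)

    # Soft targets from the (noisy) teacher and log-probabilities from the student
    soft_targets = F.softmax(noisy_teacher / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)

    # KL divergence between student and noisy-teacher distributions
    distill = F.kl_div(soft_student, soft_targets,
                       reduction="batchmean") * temperature ** 2
    # Ordinary cross-entropy on whatever hard labels are available
    hard = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1 - alpha) * hard
```

In this sketch the student (e.g., a lightweight GAT or a CNN baseline) would be trained by minimizing this combined loss, trading off fidelity to the noisy teacher against fit to the labeled data via alpha.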

Key words: knowledge distillation, differential privacy, graph neural networks, intelligent healthcare, lightweight
