Netinfo Security ›› 2023, Vol. 23 ›› Issue (7): 98-110.doi: 10.3969/j.issn.1671-1122.2023.07.010


A Multi-Server Federated Learning Scheme Based on Differential Privacy and Secret Sharing

CHEN Jing1, PENG Changgen1,2(), TAN Weijie1,2, XU Dequan1   

  1. State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025, China
    2. Guizhou Big Data Academy, Guizhou University, Guiyang 550025, China
  • Received:2023-03-27 Online:2023-07-10 Published:2023-07-14

Abstract:

Federated learning relies on a central-server scheduling mechanism to complete multi-user joint training without data leaving the local domain. However, most current federated learning schemes and their associated privacy-protection mechanisms rely on a single central server to perform encryption, decryption, and gradient computation, which tends to reduce the server's computational efficiency and, once the server suffers an external attack or internal malicious collusion, leaks a large amount of private information. Therefore, this paper combined differential privacy and secret sharing techniques to propose a multi-server federated learning scheme. Noise satisfying (ε,δ)-approximate differential privacy was added to the models trained by local users, preventing multiple servers from colluding to recover private data. The noised gradients were then distributed to multiple servers via a secret sharing protocol, ensuring the security of the transmitted gradients while balancing the computational load across servers to improve overall efficiency. Experiments on public datasets evaluating the scheme's model performance, training overhead, and security show that the scheme is highly secure, that its performance loss is only about 4% relative to the better-performing plaintext scheme, and that its overall computational overhead is nearly 53% lower than that of a single-server encryption scheme.
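The two building blocks named in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' actual protocol: it assumes the standard Gaussian mechanism for (ε,δ)-differential privacy (with a hypothetical unit sensitivity bound) and plain additive secret sharing of the noised gradient over the reals; the function names and parameters are illustrative only.

```python
import numpy as np

def gaussian_mechanism(grad, epsilon, delta, sensitivity=1.0, rng=None):
    """Add Gaussian noise calibrated for (epsilon, delta)-DP.

    Assumes the gradient's L2 sensitivity is bounded by `sensitivity`
    (e.g. via clipping, omitted here for brevity).
    """
    rng = rng or np.random.default_rng()
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return grad + rng.normal(0.0, sigma, size=grad.shape)

def make_shares(vec, n_servers, rng):
    """Split `vec` into n additive shares: n-1 random vectors plus a
    final share chosen so that all shares sum back to `vec`."""
    shares = [rng.standard_normal(vec.shape) for _ in range(n_servers - 1)]
    shares.append(vec - sum(shares))
    return shares

def reconstruct(shares):
    # Any party holding fewer than all n shares learns nothing;
    # summing all shares recovers the (noised) gradient exactly.
    return sum(shares)

rng = np.random.default_rng(42)
grad = np.ones(4)
noised = gaussian_mechanism(grad, epsilon=1.0, delta=1e-5, rng=rng)
shares = make_shares(noised, n_servers=3, rng=rng)
assert np.allclose(reconstruct(shares), noised)
```

Because the shares are additive, the servers can also aggregate locally: each server sums the shares it received from all users, and reconstructing from those per-server sums yields the noised global gradient, which is what allows the computational load to be spread across servers.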

Key words: federated learning, differential privacy, secret sharing, multi-server, privacy security
