Netinfo Security ›› 2023, Vol. 23 ›› Issue (7): 22-30.doi: 10.3969/j.issn.1671-1122.2023.07.003


Efficient Neural Network Inference Protocol Based on Secure Two-Party Computation

XU Chungen1, XUE Shaokang2(), XU Lei1, ZHANG Pan3   

  1. School of Mathematics and Statistics, Nanjing University of Science and Technology, Nanjing 210094, China
    2. School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
    3. School of Cyber Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
  Received: 2023-02-20   Online: 2023-07-10   Published: 2023-07-14

Abstract:

Despite the rapid growth of Machine Learning as a Service (MLaaS) in recent years, many performance and security issues remain in real-world applications, including the risk of leaking user data and enterprise neural network model parameters. Some privacy-preserving machine learning schemes already exist, but they suffer from low computational efficiency and high communication overhead. To address these problems, the paper proposed an efficient neural network inference protocol based on secure two-party computation, in which the linear layers use secret sharing to protect the privacy of the input data, and the nonlinear layers use a low-communication comparison function based on oblivious transfer to compute the activation function. Experimental results show that, compared with existing solutions on two benchmark datasets, the protocol is at least 23% more efficient and reduces communication overhead by at least 51%.
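The linear-layer idea in the abstract rests on a standard property: additive secret shares commute with linear maps, so each party can apply a linear layer to its own share and the results reconstruct to the true output. The sketch below illustrates only this generic property over a prime ring (the modulus, variable names, and use of NumPy are illustrative assumptions, not the paper's actual protocol or parameters; the OT-based comparison for the nonlinear layer is not shown).

```python
import numpy as np

P = 2**31 - 1  # illustrative prime modulus; real protocols pick their own ring
rng = np.random.default_rng(0)

def share(x):
    """Split x into two additive shares over Z_P (one per party)."""
    x0 = rng.integers(0, P, size=x.shape, dtype=np.int64)
    x1 = (x - x0) % P
    return x0, x1

def reconstruct(x0, x1):
    """Recombine additive shares: (x0 + x1) mod P recovers x."""
    return (x0 + x1) % P

# Toy client input and linear-layer weights (small non-negative integers
# stand in for a fixed-point encoding of real values).
x = np.array([3, 1, 4], dtype=np.int64)
W = np.array([[2, 0, 1],
              [1, 5, 2]], dtype=np.int64)

x0, x1 = share(x)
# Each party applies the linear map to its own share; by linearity,
# the two results are additive shares of W @ x.
y0 = (W @ x0) % P
y1 = (W @ x1) % P

assert np.array_equal(reconstruct(y0, y1), (W @ x) % P)
print("linear layer on shares matches plaintext result")
```

Neither share alone reveals anything about `x` (each is uniformly random in Z_P), which is why the linear layers in such protocols need no extra communication beyond establishing the shares; interaction is concentrated in the nonlinear comparison step.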

Key words: machine learning, secure two-party computation, neural network, secret sharing, oblivious transfer
