Netinfo Security ›› 2023, Vol. 23 ›› Issue (7): 22-30. doi: 10.3969/j.issn.1671-1122.2023.07.003

• Technical Research •

Efficient Neural Network Inference Protocol Based on Secure Two-Party Computation

XU Chungen1, XUE Shaokang2, XU Lei1, ZHANG Pan3

  1. School of Mathematics and Statistics, Nanjing University of Science and Technology, Nanjing 210094, China
    2. School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
    3. School of Cyber Science and Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
  • Received: 2023-02-20; Online: 2023-07-10; Published: 2023-07-14
  • Corresponding author: XUE Shaokang, xueshaok@njust.edu.cn
  • About the authors: XU Chungen (born 1969), male, from Anhui, professor, Ph.D., CCF member; research interests: cryptography and cyberspace security. XUE Shaokang (born 1999), male, from Henan, master's student; research interest: secure multi-party computation. XU Lei (born 1990), male, from Anhui, associate professor, Ph.D., CCF member; research interests: applied cryptography and information security. ZHANG Pan (born 1997), male, from Anhui, Ph.D. candidate; research interests: federated learning and differential privacy.
  • Funding:
    National Natural Science Foundation of China (62072240); National Natural Science Foundation of China (62202228); Natural Science Foundation of Jiangsu Province (BK20210330)

Abstract:

Machine Learning as a Service (MLaaS) has grown rapidly in recent years, yet real-world deployments still face significant performance bottlenecks as well as the risk of leaking user data and the parameters of enterprise neural network models. Existing privacy-preserving machine learning schemes suffer from low computational efficiency and high communication overhead. To address these problems, this paper proposes an efficient neural network inference protocol based on secure two-party computation: the linear layers use secret sharing to protect the privacy of the input data, while the nonlinear layers evaluate the activation function with a low-communication comparison protocol based on oblivious transfer. Experimental results show that, compared with existing solutions, the protocol improves efficiency by at least 23% and reduces communication overhead by at least 51% on two benchmark datasets.
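
To make the two building blocks named in the abstract more concrete, the following Python sketch illustrates additive secret sharing over a ring for a linear layer and a comparison-driven ReLU for the nonlinear layer. It is a minimal illustration under simplifying assumptions, not the paper's protocol: the weight matrix is treated as public, the function names (share, reconstruct, linear_layer_on_share, ideal_secure_comparison) are hypothetical, and the comparison is shown as an ideal functionality that reconstructs the value, whereas the actual protocol obtains the comparison result via oblivious transfer without revealing the plaintext to either party.

```python
import random

MOD = 2 ** 32        # computation ring Z_{2^32}
HALF = MOD // 2      # ring elements >= HALF are interpreted as negative


def share(x):
    """Split x into two additive shares with x0 + x1 = x (mod 2^32)."""
    x0 = random.randrange(MOD)
    return x0, (x - x0) % MOD


def reconstruct(x0, x1):
    return (x0 + x1) % MOD


def to_signed(v):
    """Map a ring element back to a signed integer."""
    return v - MOD if v >= HALF else v


def linear_layer_on_share(weights, x_share):
    """Apply the (public, for this sketch) weight matrix to one party's share.

    Additive sharing is linear, so W @ x0 + W @ x1 = W @ x (mod 2^32); each party
    can therefore evaluate the linear layer locally on its own share. In the real
    protocol the weights are the server's secret, and this step instead uses an
    interactive secure multiplication."""
    return [sum(w * xi for w, xi in zip(row, x_share)) % MOD for row in weights]


def ideal_secure_comparison(v0, v1):
    """Stand-in for the OT-based comparison: 1 if the shared value is positive.

    Here the value is simply reconstructed; the actual protocol produces this bit
    without either party seeing the plaintext."""
    return 1 if to_signed(reconstruct(v0, v1)) > 0 else 0


if __name__ == "__main__":
    x = [3, -7, 2]                     # client's private input vector
    W = [[1, 2, 0], [4, 1, 5]]         # weight matrix (public in this sketch)

    # 1. The client secret-shares its input; each party holds one share per element.
    pairs = [share(v % MOD) for v in x]
    x0 = [p[0] for p in pairs]
    x1 = [p[1] for p in pairs]

    # 2. Linear layer: each party works only on its own share.
    y0 = linear_layer_on_share(W, x0)
    y1 = linear_layer_on_share(W, x1)

    # 3. Nonlinear layer (ReLU): multiply each share by the comparison bit,
    #    so the result stays additively shared between the two parties.
    bits = [ideal_secure_comparison(a, b) for a, b in zip(y0, y1)]
    z0 = [(bit * a) % MOD for bit, a in zip(bits, y0)]
    z1 = [(bit * b) % MOD for bit, b in zip(bits, y1)]

    result = [to_signed(reconstruct(a, b)) for a, b in zip(z0, z1)]
    expected = [max(sum(w * v for w, v in zip(row, x)), 0) for row in W]
    print(result)    # e.g. [0, 15]
    print(expected)  # matches the plaintext ReLU(W @ x)
```

Because additive shares are linear, the matrix-vector product can be applied to each share locally without any communication; only the comparison step requires interaction, which is why the paper focuses on a low-communication comparison protocol built from oblivious transfer.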

Key words: machine learning, secure two-party computation, neural network, secret sharing, oblivious transfer

CLC number: