Paper 2021/736

Adam in Private: Secure and Fast Training of Deep Neural Networks with Adaptive Moment Estimation

Nuttapong Attrapadung, Koki Hamada, Dai Ikarashi, Ryo Kikuchi, Takahiro Matsuda, Ibuki Mishina, Hiraku Morita, and Jacob C. N. Schuldt

Abstract

Machine Learning (ML) algorithms, especially deep neural networks (DNNs), have proven to be extremely useful tools for data analysis, and are increasingly being deployed in systems operating on sensitive data, such as recommendation systems, banking fraud detection, and healthcare systems. This underscores the need for privacy-preserving ML (PPML) systems, and has inspired a line of research into how such systems can be constructed efficiently. We contribute to this line of research by proposing a framework that allows the efficient and secure evaluation of full-fledged state-of-the-art ML algorithms via secure multi-party computation (MPC). This is in contrast to most prior works on PPML, which require advanced ML algorithms to be substituted with approximated, ``MPC-friendly'' variants before MPC techniques are applied to obtain a PPML algorithm. A drawback of the latter approach is that it requires careful fine-tuning of the combined ML and MPC algorithms, and might lead to less efficient algorithms or inferior-quality ML (such as lower prediction accuracy). This is an issue for secure training of DNNs in particular, as this involves several arithmetic operations that are thought to be ``MPC-unfriendly'', namely, integer division, exponentiation, inversion, and square-root extraction. In this work, we propose secure and efficient protocols for these seemingly MPC-unfriendly computations, which are nevertheless essential to DNN training. Our protocols are three-party protocols in the honest-majority setting, and we propose both passively secure and actively secure (with abort) variants. A notable feature of our protocols is that they simultaneously provide high accuracy and efficiency. This framework enables us to efficiently and securely compute modern ML algorithms such as Adam (adaptive moment estimation) and the softmax function ``as is'', without resorting to approximations. As a result, we obtain secure DNN training that outperforms state-of-the-art three-party systems: our \textit{full} training is up to $6.7$ times faster than just the \textit{online} phase of the recently proposed FALCON (Wagh et al., PETS'21) on the standard benchmark network for secure training of DNNs. To further demonstrate the scalability of our protocols, we perform measurements on the real-world DNNs AlexNet and VGG16, complex networks containing millions of parameters. Compared to FALCON, our framework is faster by a factor of about $12$ to $14$ for AlexNet and $46$ to $48$ for VGG16 when training to an accuracy of $70\%$ and $75\%$, respectively.
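For reference, the standard, publicly known definitions of the Adam update (Kingma and Ba, ICLR 2015) and the softmax function, which the abstract states are computed ``as is'', make concrete why the four operations above are unavoidable:

\begin{align*}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\, g_t, & v_t &= \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2,\\
\hat{m}_t &= m_t/(1-\beta_1^t), & \hat{v}_t &= v_t/(1-\beta_2^t),\\
\theta_t &= \theta_{t-1} - \alpha\, \hat{m}_t/\bigl(\sqrt{\hat{v}_t} + \epsilon\bigr), & \mathrm{softmax}(z)_i &= e^{z_i}\Big/\sum\nolimits_j e^{z_j}.
\end{align*}

Here $g_t$ is the gradient at step $t$, $\theta_t$ the model parameters, and $\beta_1, \beta_2, \alpha, \epsilon$ the usual hyperparameters. Evaluating these expressions directly requires division, square-root extraction, exponentiation, and inversion, precisely the operations the abstract identifies as seemingly MPC-unfriendly and that prior frameworks replaced with approximations.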

Metadata
Available format(s)
PDF
Category
Cryptographic protocols
Publication info
Preprint. MINOR revision.
Keywords
Secure multiparty computation, machine learning, training, deep learning, Adam
Contact author(s)
kikuchi_ryo @ fw ipsj or jp
History
2021-06-03: received
Short URL
https://ia.cr/2021/736
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2021/736,
      author = {Nuttapong Attrapadung and Koki Hamada and Dai Ikarashi and Ryo Kikuchi and Takahiro Matsuda and Ibuki Mishina and Hiraku Morita and Jacob C. N. Schuldt},
      title = {Adam in Private: Secure and Fast Training of Deep Neural Networks with Adaptive Moment Estimation},
      howpublished = {Cryptology ePrint Archive, Paper 2021/736},
      year = {2021},
      note = {\url{https://eprint.iacr.org/2021/736}},
      url = {https://eprint.iacr.org/2021/736}
}