Paper 2020/167

Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning

Jinhyun So, Basak Guler, and A. Salman Avestimehr

Abstract

Federated learning is gaining significant interest as it enables model training over a large volume of data that is stored in a distributed fashion across many users, while protecting the privacy of the individual users. However, a major bottleneck in scaling federated learning to a large number of users is the overhead of secure model aggregation across many users. In fact, the overhead of state-of-the-art protocols for secure model aggregation grows quadratically with the number of users. We propose a new scheme, named Turbo-Aggregate, that in a network with $N$ users achieves a secure aggregation overhead of $O(N\log{N})$, as opposed to $O(N^2)$, while tolerating a user dropout rate of up to $50\%$. Turbo-Aggregate employs a multi-group circular strategy for efficient model aggregation, and leverages additive secret sharing and novel coding techniques for injecting aggregation redundancy in order to handle user dropouts while guaranteeing user privacy. We experimentally demonstrate that Turbo-Aggregate achieves a total running time that grows almost linearly in the number of users, and provides up to a $40\times$ speedup over state-of-the-art schemes with up to $N=200$ users. We also experimentally evaluate the impact of several key network parameters (e.g., user dropout rate, bandwidth, and model size) on the performance of Turbo-Aggregate.
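To illustrate the additive secret sharing primitive that the abstract refers to, the following is a minimal sketch of how masked model updates can be aggregated so that only the sum is revealed. This is not the authors' protocol or implementation; the field size, number of parties, and helper names (`share`, `aggregate`) are illustrative assumptions, and the multi-group circular structure and dropout-handling codes of Turbo-Aggregate are omitted.

```python
# Minimal sketch (illustrative assumptions): additive secret sharing of
# quantized model updates, the basic building block of secure aggregation.
import numpy as np

FIELD = 2**31 - 1          # assumed prime modulus for exact modular arithmetic
MODEL_DIM = 8              # toy model size (assumption)

def share(update, num_shares, rng):
    """Split an update into additive shares that sum to it modulo FIELD."""
    shares = [rng.integers(0, FIELD, size=update.shape, dtype=np.int64)
              for _ in range(num_shares - 1)]
    last = (update - sum(shares)) % FIELD
    return shares + [last]

def aggregate(per_party_sums):
    """Sum the parties' share-sums; individual updates stay hidden, the total is exact."""
    return sum(per_party_sums) % FIELD

rng = np.random.default_rng(0)
updates = [rng.integers(0, 1000, size=MODEL_DIM, dtype=np.int64) for _ in range(3)]

# Each user secret-shares its update among 3 parties; each party only sees shares.
per_party = [np.zeros(MODEL_DIM, dtype=np.int64) for _ in range(3)]
for u in updates:
    for p, s in enumerate(share(u, 3, rng)):
        per_party[p] = (per_party[p] + s) % FIELD

recovered = aggregate(per_party)
assert np.array_equal(recovered, sum(updates) % FIELD)  # only the sum is recovered
```

In Turbo-Aggregate, this kind of masking is combined with a multi-group circular aggregation order and coded redundancy so that the per-user communication and computation scale as $O(N\log N)$ rather than $O(N^2)$, and dropped users' contributions can be recovered from the injected redundancy.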

Metadata
Available format(s)
PDF
Category
Cryptographic protocols
Publication info
Published elsewhere. Minor revision. arXiv:2002.04156
Keywords
Federated Learning
secure aggregation
privacy-preserving machine learning
Contact author(s)
jinhyuns @ usc edu
History
2020-05-24: last of 3 revisions
2020-02-13: received
Short URL
https://ia.cr/2020/167
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2020/167,
      author = {Jinhyun So and Basak Guler and A. Salman Avestimehr},
      title = {Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning},
      howpublished = {Cryptology ePrint Archive, Paper 2020/167},
      year = {2020},
      note = {\url{https://eprint.iacr.org/2020/167}},
      url = {https://eprint.iacr.org/2020/167}
}