Nishat Koti (IISc Bangalore), Arpita Patra (IISc Bangalore), Rahul Rachuri (Aarhus University, Denmark), Ajith Suresh (IISc Bangalore)

Mixing arithmetic and boolean circuits to perform privacy-preserving machine learning has become increasingly popular. Towards this, we propose Tetrad, a framework for the four-party setting with at most one active corruption.

Tetrad works over rings and supports two levels of security: fairness and robustness. The fair multiplication protocol requires communicating only 5 ring elements, improving over the state-of-the-art Trident (Chaudhari et al., NDSS'20). A key feature of Tetrad is that robustness comes for free on top of the fair protocols. Other highlights across the two variants include (a) probabilistic truncation without overhead, (b) multi-input multiplication protocols, and (c) conversion protocols to switch between the computational domains, along with a tailor-made garbled circuit approach.
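To illustrate the idea behind probabilistic truncation over rings, the following is a minimal Python sketch of the well-known SecureML-style local truncation over Z_{2^k} with two-party additive sharing. It is included purely for exposition and is an assumed, generic construction; it is not Tetrad's four-party protocol, which realises the property within its own sharing semantics.

    import random

    K = 64                              # computation ring Z_{2^K}
    F = 13                              # fractional bits of the fixed-point encoding
    MOD = 1 << K

    def share(x):
        # Additively share x in Z_{2^K} between two parties.
        x0 = random.randrange(MOD)
        return x0, (x - x0) % MOD

    def reconstruct(x0, x1):
        return (x0 + x1) % MOD

    def trunc_party0(x0):
        # Party 0 locally right-shifts its share by F bits.
        return x0 >> F

    def trunc_party1(x1):
        # Party 1 shifts the negated share and negates it back in the ring.
        return (MOD - ((MOD - x1) >> F)) % MOD

    # A product of two fixed-point values carries 2F fractional bits;
    # truncating by F bits rescales it back to F fractional bits.
    prod = int(3.25 * (1 << F)) * (1 << F)          # 3.25 * 1.0 in fixed point
    p0, p1 = share(prod)
    result = reconstruct(trunc_party0(p0), trunc_party1(p1))
    print(result, int(3.25 * (1 << F)))             # equal up to +/-1 with overwhelming probability

The point of such constructions is that each party acts only on its local share, at the cost of a one-bit rounding error and a small failure probability when the shared value approaches the ring size; Tetrad's contribution is achieving this kind of truncation without additional overhead in its setting.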

Benchmarking of Tetrad for both training and inference is conducted over deep neural networks such as LeNet and VGG16. Compared with Trident, Tetrad is up to 4 times faster in ML training and up to 5 times faster in ML inference. Tetrad is also lightweight in terms of deployment cost, costing up to 6 times less than Trident.
