Privacy-Preserving Distributed Linear Regression on High-Dimensional Data
Adrià Gascón, Phillipp Schoppmann, Borja Balle, Mariana Raykova, Jack Doerner, Samee Zahur, David Evans
Proceedings on Privacy Enhancing Technologies, Vol. 2017, Iss. 4, pp. 345–364
TLDR
A hybrid multi-party computation protocol combining Yao's garbled circuits with tailored protocols for computing inner products is proposed. It is suitable for secure computation because it uses an efficient fixed-point representation of real numbers while maintaining accuracy and convergence rates comparable to a classical solution using floating-point numbers.

Abstract
We propose privacy-preserving protocols for computing linear regression models, in the setting where the training dataset is vertically distributed among several parties. Our main contribution is a hybrid multi-party computation protocol that combines Yao’s garbled circuits with tailored protocols for computing inner products. Like many machine learning tasks, building a linear regression model involves solving a system of linear equations. We conduct a comprehensive evaluation and comparison of different techniques for securely performing this task, including a new Conjugate Gradient Descent (CGD) algorithm. This algorithm is suitable for secure computation because it uses an efficient fixed-point representation of real numbers while maintaining accuracy and convergence rates comparable to what can be obtained with a classical solution using floating point numbers. Our technique improves on Nikolaenko et al.’s method for privacy-preserving ridge regression (S&P 2013), and can be used as a building block in other analyses. We implement a complete system and demonstrate that our approach is highly scalable, solving data analysis problems with one million records and one hundred features in less than one hour of total running time.
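The abstract's key idea, running conjugate gradient descent on fixed-point numbers instead of floats, can be sketched in the clear (no secure computation) as follows. This is a minimal model of the arithmetic only: the scaling factor, truncation point, and problem sizes below are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

# Hypothetical fixed-point format: values scaled by 2^FRAC_BITS and
# stored as integers. An MPC protocol would operate on secret shares
# of such integers; here we just emulate the truncation it causes.
FRAC_BITS = 20
SCALE = 1 << FRAC_BITS

def encode(x):
    """Encode real numbers as scaled 64-bit integers (fixed point)."""
    return np.round(np.asarray(x) * SCALE).astype(np.int64)

def decode(x):
    """Decode scaled integers back to floats."""
    return np.asarray(x, dtype=np.float64) / SCALE

def cgd_fixed_point(A, b, iters=20):
    """Conjugate gradient on A x = b, truncating the iterate back to
    fixed point each step, as a plain (non-secure) model of CGD over
    a fixed-point representation."""
    x = np.zeros_like(b, dtype=np.float64)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(iters):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x = decode(encode(x + alpha * p))  # truncate to fixed point
        r = r - alpha * Ap
        rs_new = r @ r
        if rs_new < 1e-30:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

# Ridge regression via the normal equations (X^T X + lam*I) w = X^T y.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = X @ rng.standard_normal(5) + 0.01 * rng.standard_normal(200)
lam = 0.1
A = X.T @ X + lam * np.eye(5)
b = X.T @ y
w = cgd_fixed_point(A, b)
print(np.max(np.abs(w - np.linalg.solve(A, b))))
```

Despite truncating every iterate to 20 fractional bits, the result stays within roughly the truncation granularity of the exact solution, which is the property the abstract claims makes fixed-point CGD competitive with a floating-point solver.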
Citations
Proceedings ArticleDOI
Vertically Federated Graph Neural Network for Privacy-Preserving Node Classification
TL;DR: Wang et al. propose a federated GNN learning paradigm for the privacy-preserving node classification task in the vertically partitioned data setting, which can be generalized to existing GNN models.
Posted Content
Confined Gradient Descent: Privacy-preserving Optimization for Federated Learning.
TL;DR: The authors propose the Confined Gradient Descent (CGD) method to enhance the privacy of federated learning by eliminating the sharing of global model parameters.
Book ChapterDOI
Steal from Collaboration: Spy Attack by a Dishonest Party in Vertical Federated Learning
Hongbin Chen, Chaohao Fu, Na Ruan +2 more
TL;DR: The authors propose a new attack path, unexplored by existing works, by which an attacker steals private data from the collaboration, and design two methods to execute spy attacks for the cases where the attacker is the active party and where it is the passive party.
Proceedings ArticleDOI
MPC-Friendly Commitments for Publicly Verifiable Covert Security
TL;DR: Addresses the problem of efficiently verifying a commitment in two-party computation: a party P1 commits to a value x to be used in a subsequent secure computation with another party P2, who wants assurance that x was indeed the value input into that computation.
Book ChapterDOI
Amortizing Division and Exponentiation
TL;DR: The authors use vector oblivious linear function evaluation (VOLE) to generate correlated multiplication triples, and use these triples for the multiplications in division and exponentiation protocols.
References
Book
Numerical Optimization
Jorge Nocedal, Stephen J. Wright
TL;DR: Numerical Optimization presents a comprehensive and up-to-date description of the most effective methods in continuous optimization, responding to the growing interest in optimization in engineering, science, and business by focusing on the methods that are best suited to practical problems.
Book
Machine Learning : A Probabilistic Perspective
TL;DR: This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Book
The algebraic eigenvalue problem
TL;DR: Covers theoretical background, perturbation theory, error analysis, solution of linear algebraic equations, Hermitian matrices, reduction of a general matrix to condensed form, eigenvalues of matrices of condensed forms, the LR and QR algorithms, and iterative methods.
Book
The Algorithmic Foundations of Differential Privacy
Cynthia Dwork, Aaron Roth
TL;DR: The preponderance of this monograph is devoted to fundamental techniques for achieving differential privacy, and application of these techniques in creative combinations, using the query-release problem as an ongoing example.
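One of the fundamental techniques the monograph covers is the Laplace mechanism, which can be sketched in a few lines. The query, epsilon, and count below are illustrative choices, not taken from the book.

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng):
    """Return an epsilon-differentially-private answer by adding
    Laplace noise with scale sensitivity / epsilon."""
    return true_answer + rng.laplace(0.0, sensitivity / epsilon)

rng = np.random.default_rng(42)
# A counting query ("how many records satisfy a predicate") changes
# by at most 1 when one record changes, so its sensitivity is 1.
true_count = 1234
noisy = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5, rng=rng)
print(noisy)  # the true count perturbed by Laplace(2) noise
```

Smaller epsilon gives stronger privacy but larger noise, which is the accuracy/privacy trade-off the query-release discussion in the monograph revolves around.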