Journal ArticleDOI

Differentially Private Publication of Vertically Partitioned Data

TLDR
This paper presents a differentially private latent tree (DPLT) approach, which is, to the best of our knowledge, the first approach to solving the challenging problem of publishing vertically partitioned data under differential privacy.
Abstract
In this paper, we study the problem of publishing vertically partitioned data under differential privacy, where different attributes of the same set of individuals are held by multiple parties. In this setting, with the assistance of a semi-trusted curator, the involved parties aim to collectively generate an integrated dataset while satisfying differential privacy for each local dataset. Based on the latent tree model (LTM), we present a differentially private latent tree (DPLT) approach, which is, to the best of our knowledge, the first approach to solving this challenging problem. In DPLT, the parties and the curator collaboratively identify the latent tree that best approximates the joint distribution of the integrated dataset, from which a synthetic dataset can be generated. The fundamental advantage of adopting LTM is that we can use the connections between a small number of latent attributes derived from each local dataset to capture the cross-dataset dependencies of the observed attributes in all local datasets such that the joint distribution of the integrated dataset can be learned with little injected noise and low computation and communication costs. DPLT is backed up by a series of novel techniques, including two-phase latent attribute generation (TLAG), tree index based correlation quantification (TICQ) and distributed Laplace perturbation protocol (DLPP). Extensive experiments on real datasets demonstrate that DPLT offers desirable data utility with low computation and communication costs.
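To make the pipeline described above concrete, below is a minimal, self-contained Python sketch of the general idea: estimate pairwise dependencies from Laplace-perturbed 2-way marginals and keep a maximum spanning tree as the template for synthetic data, in the spirit of Chow-Liu tree learning. It is not the paper's actual TLAG, TICQ, or DLPP construction; the attribute domains, per-marginal budget, and toy data are assumptions made for the example.

# Illustrative sketch only: a Chow-Liu-style dependency tree learned from
# noisy pairwise marginals. DPLT itself builds the tree over a small set of
# latent attributes derived from each party's local dataset.
import itertools
import numpy as np

def noisy_pairwise_mi(data, i, j, eps):
    """Mutual information between binary columns i and j, estimated from a
    Laplace-perturbed 2x2 contingency table (noise scale 1/eps, assuming
    unit sensitivity for the pairwise counts)."""
    table = np.zeros((2, 2))
    for a, b in zip(data[:, i], data[:, j]):
        table[a, b] += 1
    table += np.random.laplace(scale=1.0 / eps, size=(2, 2))
    table = np.clip(table, 1e-6, None)          # keep probabilities positive
    p = table / table.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    return float((p * np.log(p / (px * py))).sum())

def maximum_spanning_tree(n_attrs, weights):
    """Kruskal's algorithm over edge weights {(i, j): w}; returns tree edges."""
    parent = list(range(n_attrs))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    edges = []
    for (i, j), w in sorted(weights.items(), key=lambda kv: -kv[1]):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            edges.append((i, j))
    return edges

np.random.seed(0)
data = np.random.randint(0, 2, size=(1000, 4))   # toy "integrated" dataset
eps_per_pair = 0.1                               # assumed per-marginal budget
weights = {(i, j): noisy_pairwise_mi(data, i, j, eps_per_pair)
           for i, j in itertools.combinations(range(data.shape[1]), 2)}
print("tree edges:", maximum_spanning_tree(data.shape[1], weights))
# A synthetic dataset could then be sampled by traversing the tree and
# drawing each attribute from its noisy conditional marginal.

The point of the latent-tree formulation in DPLT is that the tree is built over far fewer (latent) attributes than the observed ones, which keeps both the injected noise and the cross-party communication small.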


Citations
Journal ArticleDOI

Differential Privacy for Deep and Federated Learning: A Survey

TL;DR: This study reveals the gap between the theory and application of DP, examines its accuracy and robustness, and illustrates the types of probability distributions that satisfy the DP mechanism, along with their properties and use cases.
Journal ArticleDOI

Differential Privacy for Deep and Federated Learning: A Survey

01 Jan 2022
TL;DR: In this paper, the authors propose a two-layer privacy protection approach to overcome the limitations of existing differential privacy-based approaches, which consists of adding noise to the clients' gradients before sharing them with the server.
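The gradient-perturbation step described in that summary can be sketched generically. The snippet below is a standard DP-SGD-style construction, not necessarily the cited paper's exact mechanism; the clipping bound and noise multiplier are placeholder values.

# Generic sketch of client-side gradient perturbation before upload:
# clip the gradient's L2 norm, then add Gaussian noise scaled to the bound.
import numpy as np

def perturb_gradient(grad, clip_norm=1.0, noise_multiplier=1.1):
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    # the L2 sensitivity of the clipped gradient is bounded by clip_norm;
    # the noise multiplier would be chosen to meet a target (epsilon, delta)
    noise = np.random.normal(scale=noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

client_grad = np.array([0.8, -2.3, 0.1])
print(perturb_gradient(client_grad))   # noisy update sent to the server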
Journal ArticleDOI

Differentially private data publishing for arbitrarily partitioned data

TL;DR: This paper presents the first differentially private solution for anonymizing arbitrarily partitioned data from two parties in a semi-honest model, and proposes a distributed differentially private anonymization algorithm that guarantees each step satisfies the definition of secure two-party computation.
Journal ArticleDOI

Analysis of Privacy-Enhancing Technologies in Open-Source Federated Learning Frameworks for Driver Activity Recognition

TL;DR: The experiments showed that the current implementation of the privacy-preserving techniques in open-source FL frameworks limits the practical application of FL to cross-silo settings.
Proceedings ArticleDOI

Differentially Private Publication of Multi-Party Sequential Data

TL;DR: In this article, a distributed prediction suffix tree construction (DPST) solution is proposed to publish a synthetic dataset that preserves approximate sequentiality information of the integrated dataset while satisfying differential privacy for each local dataset.
References
Journal ArticleDOI

How to share a secret

TL;DR: This technique enables the construction of robust key management schemes for cryptographic systems that can function securely and reliably even when misfortunes destroy half the pieces and security breaches expose all but one of the remaining pieces.
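For concreteness, the technique this paper introduces is (k, n) threshold secret sharing: the secret is the constant term of a random degree-(k-1) polynomial over a prime field, each party receives one evaluation of the polynomial, and any k shares recover the secret by Lagrange interpolation at x = 0. A compact sketch follows; the prime modulus and the toy secret are arbitrary choices for the example.

import random

PRIME = 2**61 - 1  # a Mersenne prime, large enough for toy secrets

def make_shares(secret, k, n):
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, k=3, n=5)
print(reconstruct(shares[:3]))   # any 3 of the 5 shares recover 123456789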
Journal ArticleDOI

k-anonymity: a model for protecting privacy

TL;DR: The solution provided in this paper includes a formal protection model named k-anonymity and a set of accompanying policies for deployment, and examines re-identification attacks that can be realized on releases that adhere to k-anonymity unless the accompanying policies are respected.
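Concretely, a release is k-anonymous with respect to a set of quasi-identifiers if every combination of quasi-identifier values it contains appears in at least k records. A tiny check of that property is sketched below; the column names and records are invented for the example.

from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    # group records by their quasi-identifier values and check group sizes
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values()) >= k

release = [
    {"zip": "021**", "age": "30-39", "diagnosis": "flu"},
    {"zip": "021**", "age": "30-39", "diagnosis": "cold"},
    {"zip": "021**", "age": "40-49", "diagnosis": "flu"},
    {"zip": "021**", "age": "40-49", "diagnosis": "asthma"},
]
print(is_k_anonymous(release, ["zip", "age"], k=2))   # True: each group has 2 rows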
Book ChapterDOI

Calibrating noise to sensitivity in private data analysis

TL;DR: In this article, the authors show that for several particular applications substantially less noise is needed than was previously understood to be the case, and also present separation results demonstrating the increased value of interactive sanitization mechanisms over non-interactive ones.
Journal Article

Calibrating noise to sensitivity in private data analysis

TL;DR: The study is extended to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f, which is the amount that any single argument to f can change its output.
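The calibration rule summarized above is the Laplace mechanism: for a query f with global sensitivity Δf (the most any single individual's data can change f), releasing f(D) plus noise drawn from Lap(Δf/ε) satisfies ε-differential privacy. A minimal sketch follows; the data and the count query are made up for illustration.

import numpy as np

def laplace_mechanism(true_value, sensitivity, eps):
    # add Laplace noise with scale = sensitivity / epsilon
    return true_value + np.random.laplace(scale=sensitivity / eps)

ages = np.array([23, 35, 41, 29, 52, 38])
true_count = int((ages > 30).sum())          # f(D) = number of people over 30, sensitivity 1
print(laplace_mechanism(true_count, sensitivity=1.0, eps=0.5))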
Book

Introduction to Nonparametric Estimation

TL;DR: The main idea is to introduce the fundamental concepts of the theory while keeping the exposition suitable for a first approach to the field; many important and useful results on optimal and adaptive estimation are also provided.