Luc Rocher
Researcher at Imperial College London
Publications - 10
Citations - 634
Luc Rocher is an academic researcher at Imperial College London. He has contributed to research on noise and differential privacy. He has an h-index of 5 and has co-authored 10 publications receiving 368 citations. Previous affiliations of Luc Rocher include Université catholique de Louvain.
Papers
Journal ArticleDOI
Estimating the success of re-identifications in incomplete datasets using generative models
TL;DR: A generative copula-based method that can accurately estimate the likelihood of a specific person to be correctly re-identified, even in a heavily incomplete dataset, casting doubt on the adequacy of current anonymization practices.
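The core problem this paper addresses can be illustrated with a toy simulation. This is not the authors' copula-based model; the population, attribute cardinalities, and sample size below are invented for illustration. It shows the empirical baseline the paper improves on: a record that looks unique in an incomplete sample often matches several people in the full population, so a re-identification claimed from the sample alone is frequently wrong.

```python
import random
from collections import Counter

random.seed(42)

# Hypothetical population: 10 ZIP codes x 50 birth years x 2 genders = 1,000 combos
population = [(random.randrange(10), random.randrange(50), random.randrange(2))
              for _ in range(10_000)]
pop_counts = Counter(population)

# The attacker only sees an incomplete 10% sample of the population
sample = random.sample(population, 1_000)
sample_counts = Counter(sample)

# Records that appear exactly once in the sample look "unique" to the attacker
sample_unique = [r for r in sample if sample_counts[r] == 1]
share_unique_in_sample = len(sample_unique) / len(sample)

# But each such record still matches pop_counts[r] people in the population,
# so a re-identification based on the sample is correct only 1/pop_counts[r]
# of the time on average
expected_correct = sum(1 / pop_counts[r] for r in sample_unique) / len(sample_unique)
```

With these toy numbers, roughly a third of sampled records are unique within the sample, yet the expected probability that a sample-based match points to the right individual is far lower, which is the gap that estimating population uniqueness (as the paper does with generative copulas) is meant to close.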
Journal ArticleDOI
On the privacy-conscientious use of mobile phone data
Yves-Alexandre de Montjoye, Sébastien Gambs, Vincent D. Blondel, Geoffrey Canright, Nicolas de Cordes, Sébastien Deletaille, Kenth Engø-Monsen, Manuel García-Herranz, Jake Kendall, Cameron F. Kerry, Gautier Krings, Emmanuel Letouzé, Miguel Luengo-Oroz, Nuria Oliver, Luc Rocher, Alex Rutherford, Zbigniew Smoreda, Jessica Steele, Erik Wetter, Alex Pentland, Linus Bengtsson +22 more
TL;DR: Researchers have compared the recent availability of large-scale behavioral datasets, such as the ones generated by mobile phones, to the invention of the microscope, giving rise to the new field of computational social science.
Journal Article
bandicoot: a python toolbox for mobile phone metadata
TL;DR: Bandicoot makes it easy for machine learning researchers and practitioners to load mobile phone data, to analyze and visualize them, and to extract robust features which can be used for various classification and clustering tasks.
Journal Article
Solving artificial intelligence’s privacy problem
TL;DR: There is strong evidence that de-identification, the tool used historically to find a balance between using data in aggregate and protecting people's privacy, does not scale to modern large-scale datasets.
Proceedings Article
When the Signal is in the Noise: Exploiting Diffix's Sticky Noise
TL;DR: In this paper, the authors present a new class of noise-exploitation attacks, exploiting the noise added by the system to infer private information about individuals in the dataset, and demonstrate that adding data-dependent noise, as done by Diffix, is not sufficient to prevent inference of private attributes.
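The class of attack described above can be sketched with a toy simulation. This is not Diffix's actual query syntax or noise mechanism; the dataset, the `noisy_count` simulator, and the query variants are invented assumptions. The sketch shows the general principle of a noise-averaging attack: if each distinct query text gets its own fixed ("sticky") noise draw, repeating the identical query gains nothing, but averaging many syntactically different yet semantically equivalent queries shrinks the noise enough to run a differencing attack on a single individual.

```python
import random
import statistics

# Toy dataset; record 15 is the target whose sensitive attribute we try to infer
records = [{"age": 20 + i % 40, "hiv": i == 15} for i in range(100)]

def noisy_count(predicate, query_text, scale=2.0):
    # "Sticky" noise: a deterministic function of the query text, so
    # re-running the identical query returns the identical answer
    rng = random.Random(query_text)
    true_count = sum(1 for r in records if predicate(r))
    return true_count + rng.gauss(0, scale)

def averaged_count(predicate, query_variants):
    # Syntactically distinct but equivalent queries each draw independent
    # sticky noise, so averaging across variants suppresses it
    return statistics.mean(noisy_count(predicate, q) for q in query_variants)

k = 200
over_30 = lambda r: r["age"] >= 30
over_30_not_hiv = lambda r: r["age"] >= 30 and not r["hiv"]

n_all = averaged_count(over_30, [f"age >= {30 - j} + {j}" for j in range(k)])
n_excl = averaged_count(over_30_not_hiv,
                        [f"age >= 30 AND NOT hiv #{j}" for j in range(k)])

# n_all - n_excl is close to 1, suggesting the target satisfies hiv=True
```

The paper's point is sharper than this sketch: even noise that depends on the data itself, as in Diffix, can be exploited, because the noise's data-dependence leaks information about the records that produced it.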