Eric Price
Researcher at University of Texas at Austin
Publications - 132
Citations - 7816
Eric Price is an academic researcher at the University of Texas at Austin. He has contributed to research on topics including upper and lower bounds and compressed sensing. He has an h-index of 32 and has co-authored 127 publications receiving 6,062 citations. His previous affiliations include the Max Planck Society and the Massachusetts Institute of Technology.
Papers
Proceedings Article
Equality of opportunity in supervised learning
TL;DR: This work proposes a criterion for measuring discrimination with respect to a specified sensitive attribute in supervised learning, where the goal is to predict a target based on available features, and shows how to optimally adjust any learned predictor so as to remove discrimination according to this definition.
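The post-processing idea above can be illustrated with a toy sketch. This is not the paper's algorithm (which solves a small linear program over group-conditional flipping probabilities to equalize both true- and false-positive rates); the snippet below only equalizes true positive rates by randomly demoting positive predictions in the advantaged group, on synthetic data with hypothetical variable names:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: target y, sensitive attribute a, and scores s that are
# noisier (less informative) for group 0 -- a deliberately biased setup.
n = 10_000
a = rng.integers(0, 2, n)
y = rng.integers(0, 2, n)
s = y + rng.normal(0.0, 1.0 + 0.5 * (a == 0), n)

yhat = (s > 0.5).astype(int)          # a fixed, already-learned predictor

def tpr(pred, y, a, g):
    """True positive rate of pred within group g."""
    mask = (a == g) & (y == 1)
    return pred[mask].mean()

t0, t1 = tpr(yhat, y, a, 0), tpr(yhat, y, a, 1)

# Post-process only the advantaged group: randomly demote a fraction of
# its positive predictions so that, in expectation, its TPR matches the
# other group's.
hi = 0 if t0 > t1 else 1
p_keep = min(t0, t1) / max(t0, t1)    # fraction of positives to keep
adj = yhat.copy()
flip = (a == hi) & (yhat == 1) & (rng.random(n) > p_keep)
adj[flip] = 0

print(tpr(adj, y, a, 0), tpr(adj, y, a, 1))   # now approximately equal
```

Because the demotions are random and independent of the true label, the advantaged group's TPR shrinks by the factor `p_keep`, matching the other group up to sampling noise; the paper's full method additionally equalizes false positive rates.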
Posted Content
Compressed Sensing using Generative Models
TL;DR: In this paper, the authors show that if the vectors lie near the range of an L-Lipschitz generative model, such as a variational autoencoder or generative adversarial network, then roughly O(k log L) random Gaussian measurements suffice for recovery.
Posted Content
Equality of Opportunity in Supervised Learning
TL;DR: In this paper, the authors propose a criterion for measuring discrimination with respect to a specified sensitive attribute in supervised learning, where the goal is to predict a target based on available features. Assuming data about the predictor, the target, and membership in the protected group are available, they show how to optimally adjust any learned predictor so as to remove discrimination according to their definition.
Proceedings ArticleDOI
Simple and practical algorithm for sparse Fourier transform
TL;DR: This work considers the sparse Fourier transform problem and proposes a new algorithm that leverages techniques from digital signal processing, notably Gaussian and Dolph-Chebyshev filters, and is faster than the FFT for sufficiently sparse signals, both in theory and in practice.
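One ingredient of sparse Fourier transform algorithms like the one above is that subsampling a signal in time aliases its spectrum into a small number of buckets, and a sparse spectrum leaves most buckets empty. The sketch below demonstrates only that aliasing step on exact on-grid frequencies (the signal length, bucket count, and frequencies are made-up example values); the actual algorithm additionally uses window filters and random spectrum permutations to separate colliding frequencies:

```python
import numpy as np

n, B = 1024, 32                  # signal length and bucket count (B divides n)
k_freqs = [100, 300, 801]        # a 3-sparse spectrum (hypothetical example)

t = np.arange(n)
x = sum(np.exp(2j * np.pi * f * t / n) for f in k_freqs)

# Taking every (n // B)-th time sample aliases frequency f into bucket
# f mod B, so a cheap size-B FFT shows which residues are occupied.
buckets = np.fft.fft(x[:: n // B]) / B

occupied = {f % B for f in k_freqs}
detected = {b for b in range(B) if abs(buckets[b]) > 0.5}
print(sorted(detected))          # the residues 100, 300, 801 mod 32
```

The size-B FFT costs O(B log B) rather than O(n log n), which is the source of the speedup when only k ≪ n frequencies are significant.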
Proceedings Article
Compressed sensing using generative models
TL;DR: This work shows how to achieve guarantees similar to standard compressed sensing but without employing sparsity at all, and proves that, if G is L-Lipschitz, then roughly O(k log L) random Gaussian measurements suffice for an l2/l2 recovery guarantee.
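The recovery approach described above optimizes over the generator's latent space rather than assuming sparsity. The sketch below illustrates that idea with a toy linear "generator" G(z) = Wz (the paper uses deep networks and analyzes general L-Lipschitz G; all dimensions and names here are made-up example values), minimizing ||A G(z) − y||² by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, m = 200, 5, 25             # ambient dim, latent dim, measurements (m << n)

# Toy "generative model": a fixed linear map G(z) = W z.  A linear G keeps
# the problem convex; real generators are nonlinear neural networks.
W = rng.normal(size=(n, k)) / np.sqrt(n)
z_true = rng.normal(size=k)
x_true = W @ z_true

A = rng.normal(size=(m, n)) / np.sqrt(m)   # random Gaussian measurements
y = A @ x_true

# Recover z by gradient descent on f(z) = ||A G(z) - y||^2.
z = np.zeros(k)
M = A @ W                                  # composed forward map
for _ in range(2000):
    z -= 0.5 * (M.T @ (M @ z - y))         # gradient step

err = np.linalg.norm(W @ z - x_true) / np.linalg.norm(x_true)
print(err)                                 # relative recovery error
```

With m = 25 measurements of a 200-dimensional vector, the signal is recovered because it lies exactly in the k = 5-dimensional range of G; the paper proves analogous guarantees (up to noise and representation error) for nonlinear generators.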