Jean-Christophe Pesquet
Researcher at Université Paris-Saclay
Publications - 387
Citations - 14714
Jean-Christophe Pesquet is an academic researcher from Université Paris-Saclay. He has contributed to research on topics including convex optimization and wavelets, has an h-index of 50, and has co-authored 364 publications receiving 13,264 citations. Previous affiliations of Jean-Christophe Pesquet include the University of Marne-la-Vallée and CentraleSupélec.
Papers
Posted Content
Tutorial on Stochastic Simulation and Optimization Methods in Signal Processing
Marcelo Pereyra,Philip Schniter,Emilie Chouzenoux,Jean-Christophe Pesquet,Jean-Yves Tourneret,Alfred O. Hero,Steve McLaughlin +6 more
TL;DR: This tutorial addresses a variety of high-dimensional Markov chain Monte Carlo (MCMC) methods as well as deterministic surrogate methods, such as variational Bayes, the Bethe approach, belief and expectation propagation, and approximate message passing algorithms.
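The MCMC side of the tutorial can be illustrated with the simplest such sampler. The sketch below (function names and parameters are my own, not taken from the paper) implements random-walk Metropolis–Hastings on a one-dimensional target specified only through its unnormalized log-density:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: builds a Markov chain whose
    samples approximate the (possibly unnormalized) target density."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x)).
        if rng.random() < math.exp(min(0.0, log_target(proposal) - log_target(x))):
            x = proposal
        samples.append(x)
    return samples

# Standard Gaussian target, known only up to a normalizing constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
```

The sample mean and variance of the chain approach 0 and 1, the moments of the standard Gaussian target.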
Posted Content
Epigraphical splitting for solving constrained convex formulations of inverse problems with proximal tools
TL;DR: A proximal approach is proposed to deal with a class of convex variational problems involving nonlinear constraints, based on Non-Local Total Variation; it leads to significant improvements in terms of convergence speed over existing algorithms for solving similar constrained problems.
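Epigraphical splitting replaces a constraint on a function value by constraints on auxiliary epigraph variables, so the algorithm only needs projections onto epigraphs. As a minimal illustration (a toy case of my own choosing, not the paper's Non-Local Total Variation setting), here is the closed-form Euclidean projection onto the epigraph of the absolute value:

```python
def project_epi_abs(x, t):
    """Euclidean projection of the point (x, t) onto the epigraph
    {(x, t) : t >= |x|} of the absolute-value function."""
    if t >= abs(x):
        return (x, t)          # already feasible: the point is its own projection
    if t <= -abs(x):
        return (0.0, 0.0)      # closest point is the apex of the cone
    s = (abs(x) + t) / 2.0     # otherwise project onto the active face t = |x|
    return (s if x >= 0 else -s, s)
```

For instance, `project_epi_abs(2.0, 0.0)` returns `(1.0, 1.0)`, the nearest point on the face `t = x`.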
Journal ArticleDOI
Convex multiresolution analysis
TL;DR: A nonlinear extension of multiresolution analysis is proposed in which the vector subspaces are replaced by convex subsets, chosen so as to provide a recursive, monotone approximation scheme that allows various signal and image features to be investigated.
Journal ArticleDOI
A Proximal Interior Point Algorithm with Applications to Image Processing
TL;DR: A new proximal interior point algorithm (PIPA) is introduced that can handle convex optimization problems involving various constraints, where the objective function is the sum of a Lipschitz-differentiable term and a possibly nonsmooth one.
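PIPA itself combines a logarithmic barrier with proximal steps; as a much simpler illustration of the proximal machinery such methods build on, the sketch below (the toy scalar objective and all names are my own assumptions, not the paper's algorithm) runs forward-backward splitting on a smooth-plus-nonsmooth sum, with a gradient step on the smooth term and a prox step on the l1 penalty:

```python
def soft_threshold(v, gamma):
    """Proximal operator of gamma * |.| (soft-thresholding)."""
    if v > gamma:
        return v - gamma
    if v < -gamma:
        return v + gamma
    return 0.0

def proximal_gradient(b, lam, step=0.5, n_iter=100):
    """Forward-backward splitting on 0.5*(x - b)^2 + lam*|x|:
    explicit gradient step on the smooth quadratic term,
    implicit prox step on the nonsmooth l1 term."""
    x = 0.0
    for _ in range(n_iter):
        x = soft_threshold(x - step * (x - b), step * lam)
    return x
```

Here the minimizer has a closed form, `soft_threshold(b, lam)`, so the iteration can be checked directly: `proximal_gradient(3.0, 1.0)` converges to `2.0`.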
Journal ArticleDOI
A random block-coordinate Douglas–Rachford splitting method with low computational complexity for binary logistic regression
TL;DR: The authors propose a new optimization algorithm for sparse logistic regression based on a stochastic version of the Douglas–Rachford splitting method: it sweeps the training set by randomly selecting a mini-batch of data at each iteration and updates the variables in a block-coordinate manner.
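Setting aside the stochastic mini-batch and block-coordinate ingredients, the deterministic Douglas–Rachford iteration the method builds on can be sketched on a toy one-dimensional problem (the helper names and the toy objective are illustrative assumptions, not the paper's logistic-regression formulation):

```python
def douglas_rachford(prox_f, prox_g, y0=0.0, n_iter=200):
    """Douglas-Rachford splitting for minimizing f(x) + g(x),
    using only the proximal operators of f and g separately."""
    y = y0
    for _ in range(n_iter):
        x = prox_f(y)            # prox step on f
        z = prox_g(2 * x - y)    # prox step on g at the reflected point
        y = y + z - x            # update the governing sequence
    return prox_f(y)

# Toy problem: minimize 0.5*(x - b)^2 subject to x >= 0, with b = -1.
b = -1.0
prox_f = lambda v: max(v, 0.0)    # projection onto the constraint x >= 0
prox_g = lambda v: (v + b) / 2.0  # prox of 0.5*(x - b)^2 with unit step
x_star = douglas_rachford(prox_f, prox_g)  # -> 0.0, the constrained minimizer
```

Because `f` is handled only through a projection and `g` only through its prox, neither function needs to be differentiable, which is what makes the splitting attractive for sparse penalties.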