
Jean-Christophe Pesquet

Researcher at Université Paris-Saclay

Publications -  387
Citations -  14714

Jean-Christophe Pesquet is an academic researcher from Université Paris-Saclay. The author has contributed to research in topics: Convex optimization & Wavelet. The author has an h-index of 50, co-authored 364 publications receiving 13264 citations. Previous affiliations of Jean-Christophe Pesquet include University of Marne-la-Vallée & CentraleSupélec.

Papers
Posted Content

Multidimensional Wavelet-based Regularized Reconstruction for Parallel Acquisition in Neuroimaging

TL;DR: The 4D-UWR-SENSE algorithm outperforms the SENSE reconstruction at the subject and group levels (15 subjects) for different contrasts of interest (e.g., motor or computation tasks) and using different parallel acceleration factors on 2 × 2 × 3 mm EPI images.
Journal ArticleDOI

Convergence of Proximal Gradient Algorithm in the Presence of Adjoint Mismatch

TL;DR: In this article, the proximal gradient algorithm for solving penalized least-squares minimization problems arising in data science is considered, and conditions are given on the step size and on the gradient of the smooth part of the objective function under which convergence of the algorithm to a fixed point is guaranteed.
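For context, the iteration studied is the standard proximal gradient step. A minimal sketch, specialized to an ℓ1-penalized least-squares problem with an exact adjoint (the paper's adjoint-mismatch setting replaces A.T with an approximate adjoint, which this sketch does not model; all names are illustrative):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximity operator of t * ||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, n_iter=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 via
    # x_{k+1} = prox_{gamma*lam*||.||_1}(x_k - gamma * A^T (A x_k - b))
    x = np.zeros(A.shape[1])
    gamma = 1.0 / np.linalg.norm(A, 2) ** 2  # step size 1/L, L = ||A||_2^2
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - gamma * grad, gamma * lam)
    return x
```

The step-size condition gamma <= 1/L, with L the Lipschitz constant of the gradient of the smooth part, is exactly the kind of assumption the convergence analysis refines.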
Proceedings ArticleDOI

A quadratic MISO contrast function for blind equalization

TL;DR: A new, so-called "reference" CF is proposed, based on cross-statistics between the estimated output and a reference signal; its advantage over other CFs is that it is a quadratic function, which makes its optimization much easier to carry out.
Posted Content

Deep Transform and Metric Learning Network: Wedding Deep Dictionary Learning and Neural Networks

TL;DR: A novel DDL approach is proposed in which each DL layer can be formulated as the combination of one linear layer and a recurrent neural network (RNN), which is shown to flexibly account for the layer-associated learned metric.
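One familiar way to picture a dictionary-learning layer as "linear layer + recurrence" is a LISTA-style unrolling: a linear analysis step followed by a recurrent soft-thresholding refinement of the sparse code. This is a generic sketch of that idea, not the paper's exact formulation; all names and parameters are illustrative:

```python
import numpy as np

def ddl_layer(x, D, theta=0.1, n_rec=5):
    # Sketch of a dictionary-learning layer: a linear layer (analysis with
    # dictionary D) followed by a recurrent soft-thresholding update on the
    # code z (LISTA-style unrolled sparse coding).
    b = D.T @ x                       # linear layer
    z = np.zeros(D.shape[1])
    for _ in range(n_rec):            # recurrent refinement of the code
        z = z + b - D.T @ (D @ z)     # gradient-style recurrence
        z = np.sign(z) * np.maximum(np.abs(z) - theta, 0.0)
    return z
```

In a trained network, D and theta would be learned per layer rather than fixed.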
Journal ArticleDOI

A Random Block-Coordinate Douglas-Rachford Splitting Method with Low Computational Complexity for Binary Logistic Regression

TL;DR: In this paper, the authors propose a new optimization algorithm for sparse logistic regression based on a stochastic version of the Douglas-Rachford splitting method. The algorithm sweeps the training set by randomly selecting a mini-batch of data at each iteration, and allows the variables to be updated in a block-coordinate manner.
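To illustrate the two ingredients (mini-batch sweeps and block-coordinate updates), here is a simplified stochastic proximal-gradient stand-in for ℓ1-regularized logistic regression; it is not the paper's Douglas-Rachford scheme, and all names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sparse_logreg_minibatch(X, y, lam, step=0.1, batch=8, n_epochs=200, seed=0):
    # l1-regularized logistic regression, stochastic proximal-gradient sketch:
    # each iteration takes a gradient step on a random mini-batch, then
    # soft-thresholds a randomly chosen block of coordinates.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    blocks = np.array_split(np.arange(d), max(1, d // 4))
    for _ in range(n_epochs):
        idx = rng.choice(n, size=batch, replace=False)   # mini-batch sweep
        g = X[idx].T @ (sigmoid(X[idx] @ w) - y[idx]) / batch
        blk = blocks[rng.integers(len(blocks))]          # random block
        w[blk] -= step * g[blk]
        w[blk] = np.sign(w[blk]) * np.maximum(np.abs(w[blk]) - step * lam, 0.0)
    return w
```

Updating only one block per iteration keeps the per-iteration cost low, which is the motivation behind the low-complexity design described above.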