
Christos Thrampoulidis

Researcher at University of British Columbia

Publications - 127
Citations - 2207

Christos Thrampoulidis is an academic researcher from the University of British Columbia. The author has contributed to research in topics: Gaussian & Computer science. The author has an h-index of 22 and has co-authored 105 publications receiving 1560 citations. Previous affiliations of Christos Thrampoulidis include the Massachusetts Institute of Technology & the University of California, Santa Barbara.

Papers
Journal ArticleDOI

Precise Error Analysis of Regularized $M$-Estimators in High Dimensions

TL;DR: In this article, using a convex Gaussian min-max analysis, the squared error of regularized $M$-estimators was shown to converge in probability to a nontrivial limit that is given as the solution to a minimax convex-concave optimization problem over four scalar optimization variables.
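As a rough sketch of the setting behind this result (the notation below is generic and assumed here, not quoted from the paper): a regularized M-estimator combines a convex loss on the residual with a regularizer, and the precise analysis pins down the limit of its squared error through a low-dimensional deterministic problem.

```latex
% Generic regularized M-estimator (illustrative notation, not the paper's):
% observations y = A x_0 + z, with A an i.i.d. Gaussian measurement matrix.
\[
  \hat{\mathbf{x}} \;=\; \arg\min_{\mathbf{x}} \;
      \mathcal{L}\!\left(\mathbf{y} - \mathbf{A}\mathbf{x}\right) \;+\; \lambda\, f(\mathbf{x}).
\]
% The precise-analysis result states that the squared error concentrates,
\[
  \big\|\hat{\mathbf{x}} - \mathbf{x}_0\big\|_2^2 \;\xrightarrow{\;P\;}\; \alpha_\star^2,
\]
% where alpha_star is characterized as the solution of a deterministic minimax
% convex-concave optimization problem over a handful of scalar variables
% (four in the paper's statement).
```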
Proceedings Article

Regularized Linear Regression: A Precise Analysis of the Estimation Error

TL;DR: This paper focuses on the problem of linear regression and considers a general class of optimization methods that minimize a loss function measuring the misfit of the model to the observations, with an added structure-inducing regularization term.
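A minimal numerical sketch of this kind of estimator, assuming a Gaussian design and an ℓ1 (sparsity-inducing) regularizer; the dimensions, regularization strength, and use of scikit-learn's Lasso are illustrative choices rather than the paper's setup.

```python
# Illustrative sketch: estimation error of regularized linear regression (LASSO)
# under a Gaussian design. All parameter values are arbitrary demonstration choices.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k, sigma = 400, 800, 40, 0.5            # samples, dimension, sparsity, noise level

x0 = np.zeros(p)
x0[rng.choice(p, size=k, replace=False)] = rng.standard_normal(k)   # sparse signal
A = rng.standard_normal((n, p))                                     # Gaussian design
y = A @ x0 + sigma * rng.standard_normal(n)                         # noisy observations

# Loss measuring the misfit to the observations plus a structure-inducing
# (here, sparsity-promoting) regularization term.
lasso = Lasso(alpha=0.1, fit_intercept=False, max_iter=20000)
lasso.fit(A, y)
x_hat = lasso.coef_

print("squared estimation error:", round(float(np.sum((x_hat - x0) ** 2)), 3))
```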
Proceedings Article

LASSO with non-linear measurements is equivalent to one with linear measurements

TL;DR: In this article, it was shown that the Lloyd-Max quantizer is the optimal quantizer of the measurements that minimizes the estimation error of the Generalized LASSO.
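A toy experiment in the spirit of this result, assuming the simplest non-linearity (1-bit sign measurements): the quantized observations are fed to a LASSO as if they were linear, and the signal direction is recovered up to an unknown scaling. The specific dimensions, the regularization value, and the use of scikit-learn are assumptions for illustration only.

```python
# Illustrative sketch: the Generalized LASSO applied to non-linear (here, 1-bit
# sign-quantized) measurements as if they were linear; only the signal direction
# is identifiable. All parameter values are arbitrary choices.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p, k = 500, 1000, 30

x0 = np.zeros(p)
x0[rng.choice(p, size=k, replace=False)] = rng.standard_normal(k)
x0 /= np.linalg.norm(x0)                      # unit-norm sparse signal

A = rng.standard_normal((n, p))               # Gaussian measurement matrix
y = np.sign(A @ x0)                           # non-linear (quantized) measurements

lasso = Lasso(alpha=0.08, fit_intercept=False, max_iter=20000)
lasso.fit(A, y)                               # treat the 1-bit data as if linear
x_hat = lasso.coef_

# Compare directions: the magnitude of x0 is not identifiable from sign data.
cosine = x_hat @ x0 / (np.linalg.norm(x_hat) * np.linalg.norm(x0) + 1e-12)
print("cosine similarity with the true direction:", round(float(cosine), 3))
```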
Journal ArticleDOI

Optimal Placement of Distributed Energy Storage in Power Networks

TL;DR: In this paper, the authors formulate the optimal placement, sizing, and control of storage devices in a power network so as to minimize generation costs through load shifting, and prove that when the generation costs are convex and nondecreasing, there always exists an optimal storage-capacity allocation that places zero storage at generation-only buses connected to the rest of the network via single links.
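A heavily simplified sketch of the sizing-and-control part of such a formulation, assuming a single bus, a quadratic generation cost, and a capacity cost; it deliberately ignores the network structure that the paper's placement result is about, and all names and numbers (the cvxpy modeling, the toy load profile) are illustrative assumptions.

```python
# Simplified sketch: jointly size and operate one storage device to shift load
# and flatten generation, minimizing a convex generation cost. Not the paper's
# network formulation; a single-bus toy problem for illustration only.
import cvxpy as cp
import numpy as np

T = 24
demand = 50 + 30 * np.sin(np.linspace(0, 2 * np.pi, T))   # toy daily load profile

g = cp.Variable(T, nonneg=True)       # generation in each hour
s = cp.Variable(T + 1, nonneg=True)   # storage state of charge
C = cp.Variable(nonneg=True)          # storage capacity to be sized

constraints = [s[0] == 0, s[T] == 0, s <= C]
for t in range(T):
    # power balance: generation covers demand plus net charging of the storage
    constraints.append(g[t] == demand[t] + (s[t + 1] - s[t]))

# convex generation cost (nondecreasing on g >= 0) plus a per-unit capacity cost
cost = cp.sum_squares(g) + 5.0 * C
prob = cp.Problem(cp.Minimize(cost), constraints)
prob.solve()

print("optimal storage capacity:", round(float(C.value), 2))
print("peak generation: without storage", round(float(demand.max()), 1),
      "/ with storage", round(float(g.value.max()), 1))
```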
Journal ArticleDOI

A Model of Double Descent for High-dimensional Binary Linear Classification

TL;DR: In this article, the authors investigated the dependence of the classification error on the overparameterization ratio for binary linear classification, where the classifier is obtained by running gradient descent (GD) on the logistic loss.
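A simple empirical sketch of this setting: train a linear classifier by gradient descent on the logistic loss over Gaussian data and trace the test error as the ratio p/n is swept. The data model, noise level, step size, and iteration count are arbitrary assumptions; the paper characterizes the corresponding error curve analytically rather than by simulation.

```python
# Illustrative sketch: test error of a binary linear classifier trained by
# gradient descent on the logistic loss, swept over the overparameterization
# ratio p/n. All parameter values are arbitrary demonstration choices.
import numpy as np

rng = np.random.default_rng(2)
n, iters, lr, noise = 200, 3000, 0.1, 0.5

def test_error(p):
    w_star = rng.standard_normal(p)
    w_star /= np.linalg.norm(w_star)                            # ground-truth direction
    X = rng.standard_normal((n, p))
    y = np.sign(X @ w_star + noise * rng.standard_normal(n))    # noisy labels

    w = np.zeros(p)
    for _ in range(iters):                                      # GD on the logistic loss
        margins = np.clip(y * (X @ w), -30, 30)
        grad = -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0)
        w -= lr * grad

    X_te = rng.standard_normal((5 * n, p))
    y_te = np.sign(X_te @ w_star + noise * rng.standard_normal(5 * n))
    return np.mean(np.sign(X_te @ w) != y_te)

for ratio in [0.25, 0.5, 1.0, 2.0, 4.0, 8.0]:                   # sweep of p/n
    print(f"p/n = {ratio:4.2f}   test error = {test_error(int(ratio * n)):.3f}")
```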