Mert Pilanci

Researcher at Stanford University

Publications: 132
Citations: 1926

Mert Pilanci is an academic researcher at Stanford University. He has contributed to research on topics including convex optimization and computer science. He has an h-index of 15 and has co-authored 103 publications receiving 1,475 citations. His previous affiliations include Bilkent University and the University of California, Berkeley.

Papers
Journal ArticleDOI

Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence

TL;DR: In this paper, the authors propose a randomized second-order optimization method, the Newton sketch, which performs an approximate Newton step using a randomly projected Hessian.
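The idea can be illustrated with a minimal NumPy sketch for l2-regularized logistic regression (a toy setup for illustration, not code from the paper; all dimensions and parameters are assumptions). Instead of forming the exact Hessian A'DA/n + lam*I, each iteration sketches the n x d Hessian square root down to m rows with a Gaussian random matrix and takes a Newton step against the resulting d x d approximation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy l2-regularized logistic regression; all sizes are illustrative.
n, d, m, lam = 3000, 20, 200, 1e-2   # m = sketch dimension (d < m << n)
A = rng.standard_normal((n, d))
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-A @ rng.standard_normal(d)))).astype(float)

def grad(x):
    p = 1.0 / (1.0 + np.exp(-A @ x))
    return A.T @ (p - y) / n + lam * x

x = np.zeros(d)
for _ in range(20):
    p = 1.0 / (1.0 + np.exp(-A @ x))
    w = np.sqrt(p * (1.0 - p) / n)                 # row weights: Hessian square root
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketching matrix
    SB = S @ (w[:, None] * A)                      # sketch of the n x d Hessian square root
    H_approx = SB.T @ SB + lam * np.eye(d)         # d x d approximate Hessian
    x -= np.linalg.solve(H_approx, grad(x))        # approximate Newton step
```

With m well above d but far below n, the per-iteration cost is dominated by the sketch rather than by forming the exact Hessian, which is the regime the paper targets.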
Journal ArticleDOI

Randomized sketches for kernels: Fast and optimal nonparametric regression

TL;DR: In this article, a lower bound on the minimax risk of kernel regression is established in terms of the localized Rademacher complexity, and randomized kernel sketches with projection dimension matching this complexity are shown to achieve the optimal risk at substantially lower computational cost.
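A rough NumPy illustration of the kernel-sketching idea (an assumed toy setup, not the paper's code): the n-dimensional kernel weight vector is restricted to a random m-dimensional subspace, w = S' alpha, and the resulting m-dimensional sketched ridge objective is solved in place of full kernel ridge regression:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D regression problem; bandwidth, penalty, and sizes are illustrative.
n, m, lam = 300, 40, 1e-3            # m = sketch dimension, lam = ridge penalty
X = np.sort(rng.uniform(-1.0, 1.0, n))
y = np.sin(3.0 * X) + 0.1 * rng.standard_normal(n)

# Gaussian RBF kernel matrix on the training inputs.
K = np.exp(-((X[:, None] - X[None, :]) ** 2) / 0.1)

# Restrict the n kernel weights to w = S.T @ alpha for a random m x n sketch S.
S = rng.standard_normal((m, n)) / np.sqrt(m)
SK = S @ K
# Normal equations of the sketched ridge objective:
# (S K K S^T / n + lam * S K S^T) alpha = S K y / n
alpha = np.linalg.solve(SK @ SK.T / n + lam * (SK @ S.T), SK @ y / n)
f_hat = K @ (S.T @ alpha)            # fitted values at the training points
```

Solving the m x m sketched system replaces the n x n kernel solve; the paper's point is that m proportional to the statistical dimension of the kernel suffices for optimal rates.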
Journal ArticleDOI

Randomized Sketches of Convex Programs With Sharp Guarantees

TL;DR: This work analyzes random projection (RP)-based approximations of convex programs, in which the original optimization problem is approximated by solving a lower-dimensional problem, and proves that the approximation ratio of this procedure can be bounded in terms of the geometry of the constraint set.
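The sketch-and-solve procedure can be demonstrated on the simplest convex program, unconstrained least-squares (a toy instance with assumed dimensions, not taken from the paper): the data (A, b) is replaced by a random m-dimensional projection (SA, Sb), and the quality of the result is measured by the ratio of its cost to the optimal cost:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unconstrained least-squares as the simplest convex program; sizes are illustrative.
n, d, m = 5000, 10, 100              # m = projection (sketch) dimension
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + rng.standard_normal(n)

x_star, *_ = np.linalg.lstsq(A, b, rcond=None)        # exact minimizer

# Sketch-and-solve: solve the lower-dimensional projected problem instead.
S = rng.standard_normal((m, n)) / np.sqrt(m)
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

cost = lambda x: np.linalg.norm(A @ x - b) ** 2
ratio = cost(x_sketch) / cost(x_star)   # approximation ratio; near 1 when m >> d
```

The ratio is always at least 1 (x_star is the true minimizer), and for an unconstrained problem it concentrates near 1 once m is a modest multiple of d; the paper's sharper guarantees bound m in terms of the geometry of a general constraint set.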
Journal Article

Iterative Hessian sketch: Fast and accurate solution approximation for constrained least-squares

TL;DR: In this paper, the authors study randomized sketching methods for approximately solving a least-squares problem with a general convex constraint, and provide a general lower bound on any randomized method that sketches both the data matrix and the vector.
Posted Content

Iterative Hessian sketch: Fast and accurate solution approximation for constrained least-squares

TL;DR: This work provides a general lower bound on any randomized method that sketches both the data matrix and vector in a least-squares problem. It then presents a new method, the iterative Hessian sketch, which obtains approximations to the original least-squares problem using a projection dimension proportional to the statistical complexity of the least-squares minimizer and a logarithmic number of iterations.
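The iterative scheme can be sketched in NumPy for the unconstrained case (a toy setup with assumed dimensions, not the paper's code): each iteration keeps the exact gradient computed from the full data but solves against a freshly sketched Hessian (SA)'(SA), so the iterates converge geometrically to the exact least-squares solution rather than stopping at a one-shot sketched approximation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Unconstrained least-squares for simplicity; all sizes are illustrative.
n, d, m = 4000, 15, 150              # m = projection dimension, proportional to d
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)   # reference exact solution

x = np.zeros(d)
for _ in range(20):                  # a logarithmic number of iterations suffices
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # fresh sketch every iteration
    SA = S @ A
    g = A.T @ (b - A @ x)            # exact gradient direction (uses full data)
    x += np.linalg.solve(SA.T @ SA, g)   # step preconditioned by sketched Hessian
```

Unlike the one-shot sketch-and-solve approach, which only approximates the optimal cost, the refreshed sketches here shrink the distance to the exact minimizer by a constant factor per iteration.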