
Dmitriy Drusvyatskiy

Researcher at University of Washington

Publications -  116
Citations -  3059

Dmitriy Drusvyatskiy is an academic researcher from the University of Washington. The author has contributed to research in topics: Convex function & Subgradient method. The author has an h-index of 26 and has co-authored 108 publications receiving 2,310 citations. Previous affiliations of Dmitriy Drusvyatskiy include Cornell University.

Papers
Journal ArticleDOI

Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods

TL;DR: The proximal gradient algorithm for minimizing the sum of a smooth and a nonsmooth convex function often converges linearly even without strong convexity; this paper explains the observed linear convergence by proving that the underlying error bound is equivalent to a natural quadratic growth condition.
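Below is a minimal sketch of the proximal gradient iteration this paper analyzes, instantiated on an illustrative lasso-type objective (least squares plus an $\ell_1$ penalty) so that the proximal step has the familiar soft-thresholding form; the data, penalty weight, and step size are assumptions for the example, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, x0, step, iters=500):
    """Proximal gradient method for 0.5*||Ax - b||^2 + lam*||x||_1:
    a gradient step on the smooth part followed by the prox of the
    nonsmooth part."""
    x = x0.copy()
    for _ in range(iters):
        grad = A.T @ (A @ x - b)              # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Illustrative usage on synthetic data (hypothetical, not from the paper)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, with L the Lipschitz constant of the gradient
x_hat = proximal_gradient(A, b, lam=0.1, x0=np.zeros(20), step=step)
```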
Journal ArticleDOI

Stochastic model-based minimization of weakly convex functions

TL;DR: This work shows that, under weak convexity and Lipschitz conditions, the stochastic model-based algorithm drives the expected norm of the gradient of the Moreau envelope to zero at a rate of $O(k^{-1/4})$.
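As one concrete instance of a stochastic model-based scheme, the sketch below runs the proximal stochastic subgradient update (the linear-model case) on the weakly convex phase-retrieval loss $f(x) = \frac{1}{m}\sum_i |\langle a_i, x\rangle^2 - b_i|$; the synthetic data, constant step size, and iteration budget are purely illustrative assumptions, not the parameter choices from the paper.

```python
import numpy as np

def stochastic_subgradient_phase_retrieval(A, b, x0, beta=1e-3, iters=5000, seed=0):
    """Proximal stochastic subgradient method -- one instance of the
    stochastic model-based scheme -- on the weakly convex phase-retrieval
    loss f(x) = mean_i |<a_i, x>^2 - b_i|.

    Minimizing the sampled linear model plus (1/(2*beta))*||x - x_k||^2
    reduces to a plain subgradient step of length beta.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        i = rng.integers(len(b))
        inner = A[i] @ x
        g = np.sign(inner**2 - b[i]) * 2.0 * inner * A[i]   # subgradient of |<a_i,x>^2 - b_i|
        x = x - beta * g
    return x

# Illustrative usage on synthetic measurements (all sizes and seeds are arbitrary)
rng = np.random.default_rng(1)
A = rng.standard_normal((300, 10))
x_true = rng.standard_normal(10)
b = (A @ x_true) ** 2
x_hat = stochastic_subgradient_phase_retrieval(A, b, x0=rng.standard_normal(10))
```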
Journal ArticleDOI

Efficiency of minimizing compositions of convex functions and smooth maps

TL;DR: In this paper, the authors consider the global efficiency of algorithms for minimizing the sum of a convex function and a composition of a Lipschitz convex function with a smooth map, and show that when the subproblems can only be solved by first-order methods, a simple combination of smoothing, the prox-linear method, and a fast-gradient scheme yields an algorithm with improved complexity guarantees.
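The prox-linear method referenced here repeatedly linearizes the smooth map inside the convex outer function and solves the resulting convex subproblem with a proximal regularization term. As a hedged sketch, the snippet below uses the squared-norm outer function, for which each subproblem has a closed-form solution (a proximally regularized Gauss-Newton / Levenberg-Marquardt step); the paper's setting allows general Lipschitz convex outer functions, with subproblems solved approximately by first-order methods, and the curve-fitting example is purely illustrative.

```python
import numpy as np

def prox_linear_least_squares(c, jac, x0, t=1.0, iters=50):
    """Prox-linear sketch for min_x 0.5*||c(x)||^2 (outer function h = 0.5*||.||^2).

    Each step linearizes the smooth map c at x_k and minimizes
    0.5*||c(x_k) + J d||^2 + (1/(2t))*||d||^2 in closed form.
    """
    x = x0.copy()
    for _ in range(iters):
        r, J = c(x), jac(x)
        d = np.linalg.solve(J.T @ J + np.eye(len(x)) / t, -J.T @ r)
        x = x + d
    return x

# Illustrative usage: fit y ~ exp(a*s) + b on synthetic data (hypothetical example)
s = np.linspace(0.0, 1.0, 40)
y = np.exp(1.5 * s) + 0.5

def residual(x):                    # c(x): smooth map whose norm is minimized
    a, b = x
    return np.exp(a * s) + b - y

def jacobian(x):                    # Jacobian of the residual map
    a, b = x
    return np.column_stack([s * np.exp(a * s), np.ones_like(s)])

x_hat = prox_linear_least_squares(residual, jacobian, x0=np.array([0.0, 0.0]))
```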
Posted Content

Error bounds, quadratic growth, and linear convergence of proximal methods

TL;DR: This work explains the observed linear convergence of proximal methods by proving the equivalence of the underlying error bound to a natural quadratic growth condition, and extends the linear convergence analysis to proximal methods for minimizing compositions of nonsmooth functions with smooth mappings.
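Schematically, and up to the constants and localization made precise in the paper, the two conditions being shown equivalent for an objective $f$ with solution set $S = \operatorname{argmin} f$ are the error bound $\operatorname{dist}(x, S) \le \gamma \, \|\mathcal{G}_t(x)\|$, where $\mathcal{G}_t$ denotes the prox-gradient mapping with step $t$, and the quadratic growth condition $f(x) \ge \min f + \tfrac{\alpha}{2} \operatorname{dist}^2(x, S)$.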
Posted Content

Stochastic subgradient method converges on tame functions

TL;DR: In particular, this article shows that the stochastic subgradient method, applied to any locally Lipschitz tame function, produces limit points that are all first-order stationary, even in the absence of smoothness and convexity.
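As a small illustration of the setting (nonsmooth, nonconvex, yet tame), the sketch below runs the stochastic subgradient method on the piecewise-smooth, semialgebraic squared loss of a tiny one-hidden-layer ReLU network; the architecture, synthetic data, step size, and the choice relu'(0) = 0 (one valid subgradient selection) are all assumptions made for the example.

```python
import numpy as np

def sgd_relu_net(A, y, W0, v0, step=1e-2, iters=3000, seed=0):
    """Stochastic subgradient method on the nonsmooth, tame loss
    f(W, v) = mean_i (v^T relu(W a_i) - y_i)^2 of a tiny ReLU network.

    The loss is piecewise smooth and semialgebraic, hence locally
    Lipschitz and tame; relu'(0) is taken to be 0 (a valid subgradient).
    """
    rng = np.random.default_rng(seed)
    W, v = W0.copy(), v0.copy()
    for _ in range(iters):
        i = rng.integers(len(y))
        z = W @ A[i]                                     # pre-activations
        h = np.maximum(z, 0.0)                           # relu
        err = v @ h - y[i]
        gv = 2.0 * err * h                               # subgradient w.r.t. v
        gW = 2.0 * err * np.outer(v * (z > 0.0), A[i])   # subgradient w.r.t. W
        v -= step * gv
        W -= step * gW
    return W, v

# Illustrative usage on synthetic data (all sizes and seeds are arbitrary)
rng = np.random.default_rng(2)
A = rng.standard_normal((500, 5))
y = np.maximum(A @ rng.standard_normal(5), 0.0)
W_hat, v_hat = sgd_relu_net(A, y, W0=rng.standard_normal((8, 5)), v0=rng.standard_normal(8))
```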