
Alexander Gasnikov

Researcher at Moscow Institute of Physics and Technology

Publications - 317
Citations - 3533

Alexander Gasnikov is an academic researcher at the Moscow Institute of Physics and Technology. His research focuses on topics such as convex optimization and convex functions. He has an h-index of 27 and has co-authored 248 publications receiving 2612 citations. His previous affiliations include Moscow State University and Adyghe State University.

Papers
Proceedings Article

Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn’s Algorithm

TL;DR: In this article, two algorithms are presented for approximating the general optimal transport (OT) distance between two discrete distributions of size n, up to accuracy ε.
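
For context, here is a minimal sketch of the Sinkhorn baseline that the paper's accelerated-gradient approach is compared against: matrix-scaling iterations for entropy-regularized OT between two histograms. The regularization parameter, iteration count, and toy cost matrix below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sinkhorn(C, r, c, gamma=0.05, n_iter=1000):
    """Entropy-regularized OT between histograms r and c with cost matrix C
    via Sinkhorn's matrix-scaling iterations (a standard baseline, not the
    accelerated method analyzed in the paper)."""
    K = np.exp(-C / gamma)             # Gibbs kernel
    u = np.ones_like(r)
    for _ in range(n_iter):
        v = c / (K.T @ u)              # scale columns to match marginal c
        u = r / (K @ v)                # scale rows to match marginal r
    P = u[:, None] * K * v[None, :]    # approximate transport plan
    return np.sum(P * C)               # approximate OT cost

# Toy example: two random histograms of size n = 50 on a 1-D grid.
rng = np.random.default_rng(0)
n = 50
r = rng.random(n); r /= r.sum()
c = rng.random(n); c /= c.sum()
C = np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / n
print(sinkhorn(C, r, c))
```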
Journal ArticleDOI

A dual approach for optimal algorithms in distributed optimization over networks

TL;DR: This work studies dual-based algorithms for distributed convex optimization problems over networks, and proposes distributed algorithms that achieve the same optimal rates as their centralized counterparts (up to constant and logarithmic factors), with an additional optimal cost related to the spectral properties of the network.
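
To illustrate the dual viewpoint in a decentralized setting, the sketch below runs dual gradient ascent on a consensus-constrained problem min Σᵢ ½(xᵢ − aᵢ)² subject to Wx = 0, where W is a graph Laplacian. The path graph, quadratic local objectives, and step size are toy assumptions; this is not the optimal accelerated scheme of the paper.

```python
import numpy as np

# Path-graph Laplacian over 5 nodes (assumed toy network).
n = 5
W = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
W[0, 0] = W[-1, -1] = 1

a = np.array([1.0, 3.0, 2.0, 5.0, 4.0])      # local data held by each node

# Dual gradient ascent on: min sum_i 0.5*(x_i - a_i)^2  s.t.  W x = 0.
# For these quadratic local objectives the Lagrangian minimizer is
# x*(lam) = a - W lam, and the dual gradient is W x*(lam).
lam = np.zeros(n)
eta = 1.0 / np.linalg.eigvalsh(W @ W).max()  # safe step for the smooth dual
x = a.copy()
for _ in range(2000):
    x = a - W @ lam              # primal minimizer for current multipliers
    lam = lam + eta * (W @ x)    # ascent step; driving W x -> 0 enforces consensus

print(x)          # all entries approach the network average of a
print(a.mean())
```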
Journal ArticleDOI

Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle

TL;DR: The first method extends the Intermediate Gradient Method proposed by Devolder, Glineur and Nesterov for problems with a deterministic inexact oracle; it can be applied to problems with a composite objective function, handles both deterministic and stochastic inexactness of the oracle, and allows a non-Euclidean setup.
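
The non-Euclidean setup mentioned in the summary can be illustrated with a bare-bones stochastic mirror-descent step using an entropic prox on the probability simplex. The quadratic test objective, noise level, and step-size schedule below are toy assumptions; this is a generic sketch, not the paper's Stochastic Intermediate Gradient Method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem on the probability simplex: minimize f(x) = 0.5 * x^T A x,
# observed only through a noisy (stochastically inexact) gradient oracle.
n = 20
A = np.diag(np.linspace(1.0, 5.0, n))

def noisy_grad(x, sigma=0.1):
    return A @ x + sigma * rng.standard_normal(n)

# Entropic mirror-descent step (non-Euclidean prox): multiplicative update
# followed by renormalization back onto the simplex.
x = np.full(n, 1.0 / n)
avg = np.zeros(n)
T = 5000
for t in range(1, T + 1):
    step = 0.5 / np.sqrt(t)                  # diminishing step size
    x = x * np.exp(-step * noisy_grad(x))
    x = x / x.sum()
    avg += x
avg /= T
print(0.5 * avg @ A @ avg)    # approaches the minimum over the simplex
```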
Posted Content

Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn's Algorithm

TL;DR: The first algorithm analyzed not only has better dependence on $\varepsilon$ in the complexity bound, but is also not specific to entropic regularization and can solve the OT problem with different regularizers.
Posted Content

An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization

TL;DR: A non-accelerated derivative-free algorithm is proposed with a complexity bound similar to that of the stochastic-gradient-based algorithm; that is, the bound has no dimension-dependent factor except a logarithmic one.
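
The core idea behind such derivative-free methods can be sketched with a two-point finite-difference gradient estimator along a random direction, plugged into plain stochastic gradient descent. The smoothing parameter, step size, and quadratic test function below are illustrative assumptions, not the accelerated scheme from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Smooth convex test function (assumed): f(x) = 0.5 * ||x - b||^2,
# accessed only through noisy function values (no gradients).
n = 10
b = rng.standard_normal(n)

def f(x, sigma=1e-3):
    return 0.5 * np.sum((x - b) ** 2) + sigma * rng.standard_normal()

def two_point_grad_estimate(x, tau=1e-2):
    """Derivative-free gradient estimate: finite difference of two function
    values along a random unit direction e, scaled by the dimension n."""
    e = rng.standard_normal(n)
    e /= np.linalg.norm(e)
    return n * (f(x + tau * e) - f(x - tau * e)) / (2 * tau) * e

x = np.zeros(n)
for t in range(1, 20001):
    x -= (0.1 / np.sqrt(t)) * two_point_grad_estimate(x)

print(np.linalg.norm(x - b))   # distance to the minimizer shrinks to a small noise floor
```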