
Qihang Lin

Researcher at University of Iowa

Publications -  116
Citations -  3152

Qihang Lin is an academic researcher at the University of Iowa. The author has contributed to research on topics including convex optimization and subgradient methods. The author has an h-index of 27 and has co-authored 103 publications receiving 2707 citations. Previous affiliations of Qihang Lin include Microsoft and the College of Business Administration.

Papers
Journal ArticleDOI

Smoothing proximal gradient method for general structured sparse regression

TL;DR: This paper proposes a general optimization approach, the smoothing proximal gradient method, which can solve structured sparse regression problems with any smooth convex loss under a wide spectrum of structured sparsity-inducing penalties.
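For orientation, here is a minimal sketch (an illustration under simple assumptions, not the paper's reference implementation) of the smoothing-proximal-gradient pattern on an overlapping-group-lasso regression: the non-smooth group penalty is replaced by its Nesterov-smoothed surrogate, whose gradient on group g is x_g / max(mu, ||x_g||_2), while a simple l1 term stays in the prox (soft-thresholding) of an accelerated FISTA-style loop. The problem sizes, penalty weights, smoothing parameter, and group structure are invented for illustration.

```python
# Sketch only: smoothing proximal gradient for
#   minimize 0.5*||A x - b||^2 + lam_group * sum_g ||x_g||_2 + lam_l1 * ||x||_1
# The structured penalty is smoothed; its gradient on group g is
# x_g / max(mu, ||x_g||_2). The l1 term is handled by soft-thresholding.
import numpy as np

def spg_sketch(A, b, groups, lam_group=0.1, lam_l1=0.01, mu=1e-3, iters=500):
    n = A.shape[1]
    # Conservative Lipschitz bound: smooth loss part + one 1/mu term per group.
    L = np.linalg.norm(A, 2) ** 2 + lam_group * len(groups) / mu

    def smooth_grad(x):
        g_pen = np.zeros(n)
        for g in groups:
            xg = x[g]
            g_pen[g] += xg / max(mu, np.linalg.norm(xg))  # grad of smoothed ||x_g||_2
        return A.T @ (A @ x - b) + lam_group * g_pen

    def soft_threshold(v, t):
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    x = np.zeros(n)
    y, t = x.copy(), 1.0
    for _ in range(iters):
        x_new = soft_threshold(y - smooth_grad(y) / L, lam_l1 / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)     # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# Toy usage with overlapping groups (illustrative data only).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20); x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(50)
groups = [np.arange(0, 8), np.arange(5, 13), np.arange(12, 20)]
x_hat = spg_sketch(A, b, groups)
```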
Posted Content

Non-Convex Min-Max Optimization: Provable Algorithms and Applications in Machine Learning

TL;DR: This paper proposes a proximally guided stochastic subgradient method and a proximally guided stochastic variance-reduced method for expected and finite-sum saddle-point problems, respectively, and establishes the computational complexity of both methods for finding a nearly stationary point of the corresponding minimization problem.
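As a rough illustration only (a toy construction, not the paper's algorithm or its variance-reduced variant), the "proximally guided" idea can be sketched as follows: each outer step approximately solves the proximally regularized subproblem min_x max_y f(x, y) + (gamma/2)||x - x_center||^2 with plain stochastic gradient descent-ascent and iterate averaging, then recenters at the averaged point. The toy objective, step sizes, and gamma below are assumptions made for the sketch.

```python
# Sketch of the proximally guided pattern for min_x max_y f(x, y) with
# f nonconvex in x, concave (linear) in y. Toy objective:
#   f(x, y) = sum_i y_i * sin(w_i . x),  y constrained to the box [0, 1]^m.
import numpy as np

rng = np.random.default_rng(0)
d, m = 10, 5
W = rng.standard_normal((m, d))

def stoch_grads(x, y):
    i = rng.integers(m)                      # sample one term uniformly
    r = float(W[i] @ x)
    gx = m * y[i] * np.cos(r) * W[i]         # unbiased estimate of grad_x f
    gy = np.zeros(m); gy[i] = m * np.sin(r)  # unbiased estimate of grad_y f
    return gx, gy

def proximally_guided_sketch(gamma=10.0, outer=50, inner=200, eta=0.01):
    x_center = np.zeros(d)
    for _ in range(outer):
        # Approximately solve the regularized subproblem centered at x_center.
        x, y = x_center.copy(), np.full(m, 0.5)
        x_avg = np.zeros(d)
        for _ in range(inner):
            gx, gy = stoch_grads(x, y)
            x = x - eta * (gx + gamma * (x - x_center))  # descend regularized obj.
            y = np.clip(y + eta * gy, 0.0, 1.0)          # ascend, project to box
            x_avg += x
        x_center = x_avg / inner                         # recenter at the average
    return x_center

x_out = proximally_guided_sketch()
```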
Proceedings Article

An Accelerated Proximal Coordinate Gradient Method

TL;DR: This paper shows how to apply the APCG method to solve the dual of the regularized empirical risk minimization (ERM) problem, and how to devise efficient implementations that avoid full-dimensional vector operations.
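A minimal sketch of this setting follows, using plain (non-accelerated) dual coordinate ascent for ridge-regularized ERM with squared loss, i.e. the kind of coordinate update that APCG accelerates; the acceleration itself is omitted. The implementation point echoed here is keeping the primal vector w = (1/(lambda*n)) * sum_i alpha_i * a_i up to date incrementally, so each coordinate step touches only one example rather than recomputing full-dimensional quantities. Data and the regularization weight are invented.

```python
# Sketch: dual coordinate ascent for ridge-regularized ERM with squared loss.
# APCG (not shown) adds Nesterov-type acceleration on top of such updates.
import numpy as np

rng = np.random.default_rng(0)

def dual_coordinate_ascent(A, b, lam=0.1, epochs=20):
    n, d = A.shape
    alpha = np.zeros(n)
    w = np.zeros(d)                                   # w = A.T @ alpha / (lam * n)
    for _ in range(epochs):
        for i in rng.permutation(n):
            a_i = A[i]
            # Closed-form maximizer of the dual objective along coordinate i
            # for squared loss, using the conjugate l*(u) = u^2/2 + u*b_i.
            delta = (b[i] - alpha[i] - a_i @ w) / (1.0 + (a_i @ a_i) / (lam * n))
            alpha[i] += delta
            w += delta * a_i / (lam * n)              # incremental primal update
    return w

# Illustrative data only.
A = rng.standard_normal((200, 30))
w_true = rng.standard_normal(30)
b = A @ w_true + 0.1 * rng.standard_normal(200)
w_hat = dual_coordinate_ascent(A, b)
```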
Journal ArticleDOI

An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization

TL;DR: An accelerated randomized proximal coordinate gradient (APCG) method is developed for minimizing the sum of two convex functions: one is smooth and given by a gradient oracle, and the other is separable over blocks of coordinates and has a simple known structure over each block.
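For context, here is a sketch of that problem template with the plain (non-accelerated) randomized proximal coordinate gradient update: f is a least-squares loss, the separable term is an l1 penalty, each step uses the coordinate-wise Lipschitz constant as its step size and a per-coordinate soft-threshold prox. APCG's Nesterov-type extrapolation is deliberately left out to keep the sketch short; all data are illustrative.

```python
# Sketch: randomized proximal coordinate gradient for
#   minimize 0.5*||A x - b||^2 + lam * ||x||_1
# Each step updates one coordinate with its own Lipschitz step size and prox.
import numpy as np

def rand_prox_coordinate_gradient(A, b, lam=0.1, iters=20000, seed=0):
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    L = (A ** 2).sum(axis=0)                 # coordinate-wise Lipschitz constants
    x = np.zeros(n)
    r = A @ x - b                            # residual, maintained incrementally
    for _ in range(iters):
        i = rng.integers(n)
        g = A[:, i] @ r                      # partial gradient grad_i f(x)
        step = x[i] - g / L[i]
        x_new_i = np.sign(step) * max(abs(step) - lam / L[i], 0.0)  # coordinate prox
        r += (x_new_i - x[i]) * A[:, i]      # cheap residual update
        x[i] = x_new_i
    return x

# Illustrative data only.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 40))
x_true = np.zeros(40); x_true[:6] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = rand_prox_coordinate_gradient(A, b)
```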