Jinshan Zeng
Researcher at Jiangxi Normal University
Publications - 76
Citations - 2277
Jinshan Zeng is an academic researcher at Jiangxi Normal University. He has contributed to research on topics including computer science and artificial neural networks, has an h-index of 15, and has co-authored 66 publications receiving 1,534 citations. His previous affiliations include Xi'an Jiaotong University.
Papers
Journal ArticleDOI
Global Convergence of ADMM in Nonconvex Nonsmooth Optimization
Yu Wang,Wotao Yin,Jinshan Zeng +2 more
TL;DR: In this paper, the convergence of the alternating direction method of multipliers (ADMM) is analyzed for minimizing a nonconvex, possibly nonsmooth objective function subject to coupled linear equality constraints.
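To make the iteration being analyzed concrete, here is a minimal sketch of scaled-form ADMM on a toy convex instance (an l1-regularized least-squares problem split as x = y). This is our own illustration, not the paper's implementation, and the paper's analysis covers far more general nonconvex, nonsmooth objectives; all function names here are assumptions for the example.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1 (shrinkage operator)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm(A, b, lam=0.1, rho=1.0, iters=200):
    """ADMM sketch for min_x 0.5*||Ax - b||^2 + lam*||y||_1  s.t.  x = y."""
    n = A.shape[1]
    x = np.zeros(n); y = np.zeros(n); u = np.zeros(n)  # u: scaled dual variable
    AtA, Atb = A.T @ A, A.T @ b
    M = np.linalg.inv(AtA + rho * np.eye(n))           # factor once, reuse
    for _ in range(iters):
        x = M @ (Atb + rho * (y - u))          # x-update: quadratic subproblem
        y = soft_threshold(x + u, lam / rho)   # y-update: prox of the l1 term
        u = u + x - y                          # dual ascent on constraint x = y
    return y
```

The three-step pattern (primal subproblem, prox step, dual update) is the structure whose fixed points and convergence the paper studies in the nonconvex setting.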
Posted Content
Global Convergence of ADMM in Nonconvex Nonsmooth Optimization
Yu Wang,Wotao Yin,Jinshan Zeng +2 more
TL;DR: ADMM might be a better choice than ALM for some nonconvex nonsmooth problems, because ADMM is not only easier to implement but also more likely to converge in the scenarios considered.
Journal ArticleDOI
L1/2 Regularization: Convergence of Iterative Half Thresholding Algorithm
TL;DR: It is shown that, under certain conditions, the iterative half thresholding algorithm converges to a local minimizer of the L1/2-regularized problem with an eventually linear convergence rate, providing a theoretical guarantee for the algorithm's wide range of applications.
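The scheme in question is a proximal-gradient iteration for least squares with an L1/2 penalty. Below is a hedged sketch of that iteration; for simplicity the scalar half-threshold prox is computed here by solving its stationarity cubic numerically, rather than via the closed trigonometric form used in the L1/2 literature, and all names are our own.

```python
import numpy as np

def half_prox(t, lam):
    """Prox of lam*|.|^{1/2} at t: argmin_x 0.5*(x - t)^2 + lam*sqrt(|x|)."""
    a = abs(t)
    # Positive stationary points satisfy s^3 - a*s + lam/2 = 0 with s = sqrt(x).
    roots = np.roots([1.0, 0.0, -a, lam / 2.0])
    cands = [0.0] + [float(s.real) ** 2 for s in roots
                     if abs(s.imag) < 1e-10 and s.real > 0]
    obj = lambda x: 0.5 * (x - a) ** 2 + lam * np.sqrt(x)
    return np.sign(t) * min(cands, key=obj)   # compare candidates against x = 0

def half_thresholding(A, b, lam=0.05, mu=None, iters=300):
    """Iterative half thresholding for min_x 0.5*||Ax-b||^2 + lam*sum sqrt(|x_i|)."""
    n = A.shape[1]
    if mu is None:
        mu = 1.0 / np.linalg.norm(A, 2) ** 2       # step size below 1/L
    x = np.zeros(n)
    for _ in range(iters):
        g = x - mu * (A.T @ (A @ x - b))            # gradient step on the LS term
        x = np.array([half_prox(t, lam * mu) for t in g])  # half-threshold step
    return x
```

The paper's result concerns iterations of exactly this gradient-then-threshold shape: under its conditions, the iterates converge to a local minimizer at an eventually linear rate.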
Journal ArticleDOI
On Nonconvex Decentralized Gradient Descent
Jinshan Zeng,Wotao Yin +1 more
TL;DR: In particular, when diminishing (or constant) step sizes are used, convergence to a consensus stationary solution (or to a neighborhood of one) is proved under some regularity assumptions, even in the nonconvex setting.
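The decentralized gradient descent (DGD) iteration studied here alternates a mixing step over the network with a local gradient step. A minimal illustrative sketch, assuming a symmetric ring of agents with simple quadratic local objectives (a toy convex instance, not the paper's nonconvex setting):

```python
import numpy as np

def dgd(c, alpha=0.05, iters=500):
    """Toy DGD: n agents on a ring, agent i minimizes f_i(x) = 0.5*(x - c_i)^2."""
    n = len(c)
    W = np.zeros((n, n))
    for i in range(n):                  # doubly stochastic ring mixing weights
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    x = np.zeros(n)                     # agent i holds scalar estimate x[i]
    for _ in range(iters):
        grad = x - c                    # local gradients of 0.5*(x_i - c_i)^2
        x = W @ x - alpha * grad        # mix with neighbors, then descend
    return x

c = np.array([1.0, 2.0, 3.0, 4.0])
x = dgd(c)
# With a constant step size the agents land in a neighborhood of the
# consensus minimizer mean(c) = 2.5, mirroring the paper's constant-step result.
```

With a diminishing step size the residual consensus error also vanishes, which is the distinction the TL;DR draws between the two step-size regimes.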
Journal ArticleDOI
ExtraPush for Convex Smooth Decentralized Optimization over Directed Networks
Jinshan Zeng,Wotao Yin +1 more
TL;DR: This note extends the algorithms EXTRA and subgradient-push to a new algorithm, ExtraPush, for consensus optimization with convex differentiable objective functions over a directed network, and proposes a simplified variant called Normalized ExtraPush, which is significantly faster than subgradient-push.
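The push-sum mechanism underlying subgradient-push and ExtraPush corrects for the imbalance of a directed network by tracking an auxiliary weight and de-biasing with the ratio z = x / w. A minimal sketch of that building block alone (plain push-sum averaging, without the gradient steps of the full algorithms; the names are ours):

```python
import numpy as np

def push_sum_average(values, P, iters=100):
    """Push-sum averaging over a digraph with column-stochastic mixing matrix P."""
    x = np.array(values, dtype=float)   # each agent's running value
    w = np.ones_like(x)                 # auxiliary push-sum weights
    for _ in range(iters):
        x = P @ x                       # push values along directed edges
        w = P @ w                       # push weights along the same edges
    return x / w                        # de-biased ratio: converges to mean(values)
```

ExtraPush interleaves this directed mixing with gradient corrections, and Normalized ExtraPush simplifies the normalization step; the ratio trick shown here is what lets both work without a doubly stochastic matrix.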