
Hongchao Zhang

Researcher at Louisiana State University

Publications -  80
Citations -  5080

Hongchao Zhang is an academic researcher from Louisiana State University. The author has contributed to research in the topics of rate of convergence and convex optimization. The author has an h-index of 27 and has co-authored 72 publications receiving 4,232 citations. Previous affiliations of Hongchao Zhang include IBM and the University of Florida.

Papers
Journal ArticleDOI

A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search

TL;DR: A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed; the line search relies on an approximation to the usual sufficient decrease criterion that can be evaluated with greater precision in a neighborhood of a local minimum.
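The core of the method is a conjugate gradient direction update whose β coefficient guarantees descent. A minimal NumPy sketch of that update is below; the function name is illustrative and this is not the paper's CG_DESCENT code, only the β formula commonly attributed to this method:

```python
import numpy as np

def hz_direction(g_new, g_old, d_old):
    """Sketch of the Hager-Zhang conjugate gradient direction update.

    g_new, g_old: gradients at the new and previous iterates.
    d_old: previous search direction.
    Returns the new search direction d = -g_new + beta * d_old.
    """
    y = g_new - g_old                  # gradient change
    dy = d_old @ y                     # curvature-like scalar d^T y
    # beta = (y - 2 d ||y||^2 / (d^T y))^T g_new / (d^T y)
    beta = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
    return -g_new + beta * d_old
```

On a simple quadratic with a steepest-descent first step, the resulting direction satisfies the descent condition d^T g < 0.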

A survey of nonlinear conjugate gradient methods

TL;DR: In this article, the development of different versions of nonlinear conjugate gradient methods is reviewed, with special attention given to the global convergence properties of the different methods.
Journal ArticleDOI

A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization

TL;DR: For the L-BFGS method and the unconstrained optimization problems in the CUTE library, the new nonmonotone line search algorithm used fewer function and gradient evaluations, on average, than either the monotone or the traditional nonmonotone scheme.
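The nonmonotone technique replaces the usual "sufficient decrease below the current f-value" test with a decrease below a weighted average of past function values. A minimal backtracking sketch under that assumption is shown below; the parameter names and the halving rule are illustrative, not taken from the paper's implementation:

```python
import numpy as np

def nonmonotone_backtracking(f, grad, x, d, C, Q, eta=0.85, delta=1e-4, alpha=1.0):
    """One iteration of a nonmonotone Armijo backtracking line search (sketch).

    C is a running weighted average of past function values and Q its total
    weight; eta controls how nonmonotone the scheme is (eta=0 recovers the
    classical monotone Armijo test). A step alpha is accepted when
    f(x + alpha d) <= C + delta * alpha * g^T d.
    """
    g = grad(x)
    gd = g @ d                                  # directional derivative (< 0 for descent d)
    while f(x + alpha * d) > C + delta * alpha * gd:
        alpha *= 0.5                            # simple backtracking
    x_new = x + alpha * d
    Q_new = eta * Q + 1.0                       # update the averaging weight
    C_new = (eta * Q * C + f(x_new)) / Q_new    # fold the new f-value into the average
    return x_new, C_new, Q_new
```

Because the acceptance test compares against the average C rather than f(x), occasional increases in f are tolerated, which is what lets the method take full steps more often than a monotone scheme.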
Journal ArticleDOI

Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization

TL;DR: In this paper, a randomized stochastic projected gradient (RSPG) algorithm was proposed to solve a nonconvex stochastic composite optimization problem, in which a proper mini-batch of samples is taken at each iteration depending on the total budget of stochastic samples allowed; a post-optimization phase was also proposed to reduce the variance of the solutions returned by the algorithm.
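The mini-batch idea can be illustrated in a few lines: average several stochastic gradient samples at the current point, take a gradient step, and project back onto the simple constraint set. This is only a sketch of the basic step, not the paper's exact RSPG algorithm (which also randomizes the iterate returned and includes a post-optimization phase); the function and parameter names are assumptions:

```python
import numpy as np

def rspg_step(x, grads_batch, gamma, project):
    """One mini-batch projected stochastic gradient step (illustrative sketch).

    grads_batch: array of stochastic gradient samples evaluated at x,
                 one sample per row.
    gamma:       stepsize.
    project:     projection onto the feasible (simple) constraint set.
    """
    g = np.mean(grads_batch, axis=0)   # averaging a mini-batch reduces variance
    return project(x - gamma * g)      # projected gradient step
```

For example, with f(x) = 0.5 ||x||^2 (so the true gradient is x), noisy samples whose noise cancels, unit stepsize, and projection onto the box [-1, 1]^2, a single step from x = (2, -3) lands at the minimizer 0.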
Journal ArticleDOI

Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent

TL;DR: This article studies the convergence behavior of the algorithm; extensive numerical tests and comparisons with other methods for large-scale unconstrained optimization are given.