Journal ArticleDOI

Some comments on Wolfe's 'away step'

J Guélat, +1 more
01 May 1986 · Vol. 35, Iss. 1, pp. 110-119
Abstract
We give a detailed proof, under slightly weaker conditions on the objective function, that a modified Frank-Wolfe algorithm based on Wolfe's ‘away step’ strategy can achieve geometric convergence, provided a strict complementarity assumption holds.
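The away-step strategy the abstract refers to keeps track of the active vertices whose convex combination forms the current iterate; at each iteration the method either takes the usual Frank-Wolfe step toward the best vertex or steps *away* from the worst active vertex, which is what enables geometric (linear) convergence on strongly convex problems. A minimal Python sketch on the probability simplex with a quadratic objective (the function name, the objective f(x) = ½xᵀQx − bᵀx, and the simplex domain are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def away_step_frank_wolfe(Q, b, n_iters=200):
    """Minimize f(x) = 0.5 x'Qx - b'x over the probability simplex
    with Frank-Wolfe plus Wolfe's away steps (a hedged sketch)."""
    n = len(b)
    x = np.zeros(n); x[0] = 1.0   # start at a vertex of the simplex
    weights = {0: 1.0}            # barycentric weights of active vertices
    for _ in range(n_iters):
        grad = Q @ x - b
        # Frank-Wolfe vertex: best simplex vertex w.r.t. the gradient
        s = int(np.argmin(grad))
        d_fw = -x.copy(); d_fw[s] += 1.0
        # Away vertex: worst *active* vertex w.r.t. the gradient
        v = max(weights, key=lambda i: grad[i])
        d_away = x.copy(); d_away[v] -= 1.0
        if -grad @ d_fw >= -grad @ d_away:            # Frank-Wolfe step
            d, gamma_max, fw_step = d_fw, 1.0, True
        else:                                          # away step
            d = d_away
            gamma_max = weights[v] / (1.0 - weights[v])
            fw_step = False
        # Exact line search for the quadratic objective
        denom = d @ Q @ d
        gamma = gamma_max if denom <= 0 else min(gamma_max, (-grad @ d) / denom)
        x = x + gamma * d
        # Maintain the barycentric representation of x
        if fw_step:   # shrink all weights, grow vertex s
            weights = {i: (1 - gamma) * w for i, w in weights.items()}
            weights[s] = weights.get(s, 0.0) + gamma
        else:         # grow all weights, shrink vertex v
            weights = {i: (1 + gamma) * w for i, w in weights.items()}
            weights[v] -= gamma
        weights = {i: w for i, w in weights.items() if w > 1e-12}
    return x
```

The away step is capped at γ_max = α_v/(1 − α_v) so the weight of the away vertex cannot go negative; when it hits the cap the vertex drops out of the active set entirely, which is the mechanism behind the geometric rate.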


Citations
Journal ArticleDOI

Efficient Projection-Free Online Methods with Stochastic Recursive Gradient.

TL;DR: In this article, two efficient projection-free online methods called ORGFW and MORGFW are proposed for solving stochastic and adversarial online convex optimization problems, respectively, by employing a recursive gradient estimator.
Posted Content

Restarting Frank-Wolfe

TL;DR: In this article, a new variant of the Frank-Wolfe algorithm is presented, which dynamically adapts to the function's geometric properties using restarts and thus smoothly interpolates between the sublinear and linear regimes.
Posted Content

Projection-Free Optimization on Uniformly Convex Sets

TL;DR: This work shows accelerated convergence rates when the set is only locally uniformly convex and provides similar results in online linear optimization, highlighting that the Frank-Wolfe algorithm is adaptive to much more generic constraint set structures, thus explaining faster empirical convergence.
Posted Content

Revisiting the Approximate Carathéodory Problem via the Frank-Wolfe Algorithm

TL;DR: The approximate Carathéodory problem is revisited by solving the primal problem via the Frank-Wolfe algorithm, providing a simplified analysis and leading to an efficient practical method.
Posted Content

Functional Frank-Wolfe Boosting for General Loss Functions

TL;DR: A novel Frank-Wolfe-type boosting algorithm (FWBoost) for general loss functions is proposed, which reduces to a variant of AdaBoost for binary classification when the exponential loss is used.