Journal ArticleDOI

Some comments on Wolfe's `away step'

J Guélat, +1 more
01 May 1986, Vol. 35, Iss. 1, pp. 110-119
Abstract
We give a detailed proof, under slightly weaker conditions on the objective function, that a modified Frank-Wolfe algorithm based on Wolfe's ‘away step’ strategy can achieve geometric convergence, provided a strict complementarity assumption holds.
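To make the 'away step' idea concrete, here is a minimal sketch of a Frank-Wolfe variant with away steps, specialized to minimizing a quadratic over the unit simplex (the paper treats general polytopes and objectives, so the function, dimensions, and tolerances below are illustrative assumptions, not the paper's exact procedure). At each iteration the method compares the standard Frank-Wolfe direction toward the best vertex with an 'away' direction that shifts weight off the worst active vertex, and takes whichever promises more descent:

```python
import numpy as np

def away_step_fw_simplex(b, n, iters=1000):
    """Minimize f(x) = 0.5*||x - b||^2 over the unit simplex using
    Frank-Wolfe with away steps. Illustrative sketch only."""
    x = np.zeros(n)
    x[0] = 1.0                                 # start at a vertex
    for _ in range(iters):
        g = x - b                              # gradient of f at x
        s = int(np.argmin(g))                  # Frank-Wolfe vertex e_s
        active = np.flatnonzero(x > 1e-12)     # vertices carrying weight
        a = int(active[np.argmax(g[active])])  # away vertex e_a
        d_fw = -x.copy()
        d_fw[s] += 1.0                         # direction e_s - x
        d_aw = x.copy()
        d_aw[a] -= 1.0                         # direction x - e_a
        if -g @ d_fw >= -g @ d_aw:             # standard step promises more descent
            d, gamma_max = d_fw, 1.0
        else:                                  # away step: shift weight off e_a
            d = d_aw
            gamma_max = x[a] / (1.0 - x[a]) if x[a] < 1.0 else 0.0
        denom = d @ d
        if denom < 1e-18:                      # no usable direction left
            break
        # exact line search for the quadratic, clipped to stay feasible
        gamma = min(max(-(g @ d) / denom, 0.0), gamma_max)
        x = x + gamma * d
    return x
```

The away step is what prevents the zig-zagging that slows plain Frank-Wolfe near a face of the polytope; under the paper's strict complementarity assumption this is what yields the geometric rate.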


Citations
Journal ArticleDOI

An Active-Set Algorithmic Framework for Non-Convex Optimization Problems over the Simplex

TL;DR: In this article, an active-set algorithm for minimizing a non-convex function over the unit simplex is proposed, which makes use of a rule for identifying active variables (i.e., variables that are zero at a stationary point) and specific directions satisfying a new "nonorthogonality" type of condition.
Posted Content

Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions

TL;DR: In this article, the convergence rate of a simple Frank-Wolfe variant that uses the open-loop step-size strategy is established, obtaining an O(1/t) convergence rate for this class of functions.
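The "open-loop" strategy referred to above fixes the step size by a predetermined schedule, typically gamma_t = 2/(t+2), rather than by line search. A minimal sketch (the objective, linear minimization oracle, and iteration count below are illustrative assumptions, not that paper's setting):

```python
import numpy as np

def fw_open_loop(grad, lmo, x0, iters=2000):
    """Frank-Wolfe with the open-loop step size gamma_t = 2/(t+2).
    `lmo(g)` returns a vertex minimizing <g, v> over the feasible set."""
    x = x0.copy()
    for t in range(iters):
        v = lmo(grad(x))              # linear minimization oracle call
        x = x + 2.0 / (t + 2) * (v - x)  # predetermined, search-free step
    return x

# Example: project b onto the unit simplex, i.e. minimize 0.5*||x - b||^2.
b = np.array([0.2, 0.5, 0.3])
lmo = lambda g: np.eye(3)[int(np.argmin(g))]   # simplex LMO: best unit vector
x = fw_open_loop(lambda x: x - b, lmo, np.eye(3)[0])
```

The appeal of the open-loop schedule is that it needs no function evaluations or curvature estimates, at the cost of the slower O(1/t) rate compared with the geometric rate of the away-step variant above.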

Zero-Order Stochastic Conditional Gradient Sliding Method for Non-smooth Convex Optimization

TL;DR: In this article, a non-smooth stochastic convex optimization problem with constraints is studied using a smoothing technique and an accelerated batched first-order Stochastic Conditional Gradient Sliding method.
Proceedings Article

Conditional Gradients for the Approximately Vanishing Ideal

TL;DR: In CGAVI, the set of generators is constructed by solving instances of (constrained) convex optimization problems with the Pairwise Frank-Wolfe algorithm; the generators inherit the LASSO generalization bound and vanish not only on the training data but also on out-of-sample data.
Posted Content

Greedy Frank-Wolfe Algorithm for Exemplar Selection

TL;DR: This paper identifies a subset of a data set that efficiently describes the entire data set, in a way formalized via convex optimization, and formulates a Boolean selection optimization problem designed to recover the exemplar set S.