Journal ArticleDOI

At what Points is the Projection Mapping Differentiable?

01 Aug 1982-American Mathematical Monthly (Informa UK Limited)-Vol. 89, Iss: 7, pp 456-458
Abstract: (1982). At what Points is the Projection Mapping Differentiable? The American Mathematical Monthly: Vol. 89, No. 7, pp. 456-458.
Citations
Journal ArticleDOI
TL;DR: This theory offers a framework in which previously proposed retractions can be analyzed, as well as a toolbox for constructing new ones, for submanifolds of Euclidean spaces.
Abstract: This paper deals with constructing retractions, a key step when applying optimization algorithms on matrix manifolds. For submanifolds of Euclidean spaces, we show that the operation consisting of taking a tangent step in the embedding Euclidean space followed by a projection onto the submanifold is a retraction. We also show that the operation remains a retraction if the projection is generalized to a projection-like procedure that consists of coming back to the submanifold along “admissible” directions, and we give a sufficient condition on the admissible directions for the generated retraction to be second order. This theory offers a framework in which previously proposed retractions can be analyzed, as well as a toolbox for constructing new ones. Illustrations are given for projection-like procedures on some specific manifolds for which we have an explicit, easy-to-compute expression.
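The "tangent step then projection" construction can be made concrete on the simplest submanifold, the unit sphere, where the metric projection of a nonzero point is just normalization. The following sketch is my own illustration of that idea, not code from the paper:

```python
import numpy as np

def sphere_retraction(x, v):
    """Retract tangent vector v at point x on the unit sphere:
    take a step in the embedding space, then project back."""
    y = x + v                      # tangent step in the ambient space
    return y / np.linalg.norm(y)   # metric projection onto the sphere

x = np.array([1.0, 0.0, 0.0])      # point on the sphere
v = np.array([0.0, 0.5, 0.0])      # tangent vector (v is orthogonal to x)
r = sphere_retraction(x, v)
# r lies on the sphere and, for small v, agrees with x + v to first order
```

The two defining properties of a retraction are visible here: a zero step returns the base point, and the map agrees with the identity on the tangent space to first order.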

291 citations

Journal Article
TL;DR: A model targeted at classification tasks, where sparse activity and sparse connectivity are used to enhance classification capabilities is proposed, and it is shown that the projection is differentiable almost everywhere and can thus be implemented as a smooth neuronal transfer function.
Abstract: Sparseness is a useful regularizer for learning in a wide range of applications, in particular in neural networks. This paper proposes a model targeted at classification tasks, where sparse activity and sparse connectivity are used to enhance classification capabilities. The tool for achieving this is a sparseness-enforcing projection operator which finds the closest vector with a pre-defined sparseness for any given vector. In the theoretical part of this paper, a comprehensive theory for such a projection is developed. In conclusion, it is shown that the projection is differentiable almost everywhere and can thus be implemented as a smooth neuronal transfer function. The entire model can hence be tuned end-to-end using gradient-based methods. Experiments on the MNIST database of handwritten digits show that classification performance can be boosted by sparse activity or sparse connectivity. With a combination of both, performance can be significantly better compared to classical non-sparse approaches.
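The paper's sparseness-enforcing operator is involved, but the key property it relies on, that a metric projection onto a closed convex set is differentiable almost everywhere, is easy to illustrate with a simpler projection. The sketch below (my own example, not the paper's operator) projects onto the nonnegative orthant, giving the ReLU-like map mentioned in the abstract:

```python
import numpy as np

def project_nonneg(x):
    """Closest nonnegative vector to x in the Euclidean norm:
    the componentwise map max(x_i, 0)."""
    return np.maximum(x, 0.0)

def project_nonneg_jacobian(x):
    """Diagonal of the Jacobian where it exists (all x_i != 0):
    1 for positive components, 0 for negative ones."""
    return (x > 0).astype(float)

x = np.array([-1.5, 0.3, 2.0])
p = project_nonneg(x)           # negative entry clipped to 0
j = project_nonneg_jacobian(x)  # differentiable since no x_i is 0
```

The projection fails to be differentiable only where some component is exactly zero, a set of measure zero, which is what lets such operators serve as smooth-enough transfer functions for gradient-based training.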

50 citations


Cites background from "At what Points is the Projection Ma..."

  • ...For example, the projection onto a closed, convex set is guaranteed to be differentiable almost everywhere (Hiriart-Urruty, 1982)....


Journal ArticleDOI
TL;DR: In this paper, the differentiability properties of the projection onto the cone of positive semidefinite matrices was studied and the expression of the Clarke generalized Jacobian for the projection at any symmetric matrix is given.
Abstract: This paper studies the differentiability properties of the projection onto the cone of positive semidefinite matrices. In particular, the expression of the Clarke generalized Jacobian of the projection at any symmetric matrix is given.
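The projection onto the PSD cone has a well-known closed form: diagonalize the symmetric matrix and clip its negative eigenvalues at zero. A minimal sketch of that formula (an illustration of the projection being studied, not the paper's Jacobian computation):

```python
import numpy as np

def project_psd(a):
    """Closest positive semidefinite matrix to symmetric a
    in the Frobenius norm: clip negative eigenvalues at zero."""
    w, u = np.linalg.eigh(a)                 # a = u @ diag(w) @ u.T
    return (u * np.maximum(w, 0.0)) @ u.T    # rescale columns, recompose

a = np.array([[1.0, 0.0],
              [0.0, -2.0]])
p = project_psd(a)
# p == [[1, 0], [0, 0]]: the eigenvalue -2 is clipped to zero
```

The projection is differentiable at matrices with no zero eigenvalue; the Clarke generalized Jacobian characterized in the paper describes what remains at the singular ones.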

42 citations

Journal ArticleDOI
TL;DR: This work considers a family of convex programming problems that depend on a vector parameter, characterizing those values of parameters at which solutions and associated Lagrange multipliers are Gâteaux differentiable.
Abstract: We consider a family of convex programming problems that depend on a vector parameter, characterizing those values of parameters at which solutions and associated Lagrange multipliers are Gâteaux differentiable. These results are specialized to the problem of the metric projection onto a convex set. At those points where the projection mapping is not differentiable the form of Clarke's generalized derivative of this mapping is derived.
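A one-dimensional example shows the kind of non-differentiable point at which Clarke's generalized derivative takes over. The projection onto the interval [-1, 1] is the clip function; at the boundary point x = 1 the one-sided difference quotients disagree, so the classical derivative fails there. This sketch is my own illustration, not computed in the paper:

```python
def project_interval(x, lo=-1.0, hi=1.0):
    """Metric projection of a real number onto [lo, hi]."""
    return min(max(x, lo), hi)

h = 1e-6
# One-sided difference quotients at the boundary point x = 1:
left = (project_interval(1.0) - project_interval(1.0 - h)) / h   # -> 1
right = (project_interval(1.0 + h) - project_interval(1.0)) / h  # -> 0
# The two limits differ, so the projection is not differentiable at 1;
# Clarke's generalized derivative there is the whole interval [0, 1].
```

Inside the interval the derivative is 1 and outside it is 0, so the generalized derivative at the boundary is exactly the convex hull {0, 1} of the nearby classical derivatives, matching the structure derived in the paper.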

31 citations


Cites background from "At what Points is the Projection Ma..."

  • ...Hiriart-Urruty in [5], who posed, among others, the following two unsolved problems: 1....


  • ...when Φ(h) is independent of h, problem (Ph) becomes the problem of the metric projection onto the convex set Φ, considered among others in [5]....


References
Journal ArticleDOI
Abstract: © Bulletin de la S. M. F., 1965, all rights reserved. Access to the archives of the journal "Bulletin de la S. M. F." (http://smf.emath.fr/Publications/Bulletin/Presentation.html) implies agreement with its general terms of use (http://www.numdam.org/legal.php). Any commercial use or systematic printing constitutes a criminal offense. Any copy or printout of this file must include this copyright notice.

1,546 citations

Journal ArticleDOI
TL;DR: In this paper, a-projections (projections with respect to a bilinear form a, not necessarily symmetric) are studied in Chapters 2 and 3.

369 citations

Journal ArticleDOI
TL;DR: A short set of notes that includes a complete proof of the Inverse Function Theorem, with later notes to cover smooth manifolds, immersions, and submersions.
Abstract: This short set of notes includes a complete proof of the Inverse Function Theorem. There will be more notes later covering smooth manifolds, immersions, and submersions. Our goal is to prove the following theorem:

298 citations