Author

Thomas Cason

Bio: Thomas Cason is an academic researcher from Université catholique de Louvain. The author has contributed to research on topics including optimization problems and singular values, has an h-index of 4, and has co-authored 5 publications receiving 76 citations.

Papers
Book ChapterDOI
16 Mar 2009
TL;DR: This work employs a trust-region method to minimize the cost function on the Stiefel manifold of p-frames in ℝ^n; the method does not rely on second-order statistics alone to estimate the dimension reduction and is therefore called soft dimension reduction.
Abstract: Joint diagonalization for ICA is often performed on the orthogonal group after a pre-whitening step. Here we assume that we only want to extract a few sources after pre-whitening, and hence work on the Stiefel manifold of p-frames in ℝ^n. The resulting method does not rely on second-order statistics alone to estimate the dimension reduction and is therefore called soft dimension reduction. We employ a trust-region method for minimizing the cost function on the Stiefel manifold. Applications to a toy example and functional MRI data show higher numerical efficiency, especially when p is much smaller than n, and more robust performance in the presence of strong noise than methods based on pre-whitening.
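As a rough illustration only (plain Riemannian gradient descent with a QR retraction, not the authors' trust-region method), the following numpy sketch minimizes an off-diagonality cost sum_k ||off(Uᵀ C_k U)||_F² over the Stiefel manifold; the symmetric matrices Cs, the step size, and the iteration budget are assumptions made for the example.

```python
# A minimal sketch, not the authors' trust-region code: Riemannian gradient
# descent on the Stiefel manifold St(n, p) for a joint-diagonalization cost
# sum_k ||off(U^T C_k U)||_F^2, with p possibly much smaller than n.
import numpy as np

def off(D):
    """Zero out the diagonal of a square matrix."""
    return D - np.diag(np.diag(D))

def qr_retract(U):
    """Map an n x p matrix onto the Stiefel manifold via thin QR."""
    Q, R = np.linalg.qr(U)
    return Q * np.sign(np.diag(R))  # enforce a positive diagonal of R

def cost_and_grad(U, Cs):
    """Cost and Riemannian gradient; each C in Cs is assumed symmetric."""
    cost, egrad = 0.0, np.zeros_like(U)
    for C in Cs:
        D = U.T @ C @ U
        cost += np.linalg.norm(off(D)) ** 2
        egrad += 4 * C @ U @ off(D)          # Euclidean gradient term
    sym = 0.5 * (U.T @ egrad + egrad.T @ U)
    return cost, egrad - U @ sym             # project onto tangent space

def soft_dim_reduction(Cs, n, p, step=1e-3, iters=500, seed=0):
    U = qr_retract(np.random.default_rng(seed).standard_normal((n, p)))
    for _ in range(iters):
        _, grad = cost_and_grad(U, Cs)
        U = qr_retract(U - step * grad)      # gradient step, then retract
    return U
```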

55 citations

Journal ArticleDOI
TL;DR: The convergence properties of an algorithm for computing a low-rank approximation of the similarity matrix S introduced by Blondel et al. in [1] are analyzed, and accumulation points of the iteration are proved to be stationary points of Φ(S).

20 citations

Book ChapterDOI
01 Jan 2011
TL;DR: A number of optimization problems defined on a manifold for comparing two matrices, possibly of different order, are considered, together with how they relate to various specific problems from the literature.
Abstract: In this paper, we review a number of optimization problems defined on a manifold in order to compare two matrices, possibly of different order. We consider several variants and show how these problems relate to various specific problems from the literature.

5 citations

Proceedings Article
01 Jan 2009
TL;DR: In this article, an algorithm to compute a low-rank approximation of the similarity matrix S introduced by Blondel et al. in [1] is analyzed; its accumulation points are shown to be stationary points of Φ(S), and the method is compared to the full-rank algorithm of [1].
Abstract: In this paper, we analyze an algorithm to compute a low-rank approximation of the similarity matrix S introduced by Blondel et al. in [1]. This problem can be reformulated as an optimization problem of a continuous function Φ(S) = tr(Sᵀ M²(S)), where S is constrained to have unit Frobenius norm and M² is a non-negative linear map. We restrict the feasible set to the set of matrices of unit Frobenius norm with either k nonzero identical singular values or at most k nonzero (not necessarily identical) singular values. We first characterize the stationary points of the associated optimization problems and further consider iterative algorithms to find one of them. We analyze the convergence properties of our algorithm and prove that accumulation points are stationary points of Φ(S). We finally compare our method in terms of speed and accuracy to the full-rank algorithm proposed in [1].
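As a rough sketch of the objects in play (not the authors' exact algorithm, whose feasible sets and convergence analysis are more careful than a plain SVD truncation), the following assumes the similarity map of Blondel et al., M(S) = B S Aᵀ + Bᵀ S A for adjacency matrices A and B, and truncates even iterates of the normalized map to rank at most k:

```python
# A minimal sketch, assuming M(S) = B S A^T + B^T S A is the similarity map
# of Blondel et al. [1]; the rank-k step here is a naive SVD truncation.
import numpy as np

def M(S, A, B):
    """Non-negative linear map whose fixed point is the similarity matrix."""
    return B @ S @ A.T + B.T @ S @ A

def lowrank_similarity(A, B, k, iters=100):
    S = np.ones((B.shape[0], A.shape[0]))     # standard all-ones start
    S /= np.linalg.norm(S)                    # unit Frobenius norm
    for _ in range(iters):
        S = M(M(S, A, B), A, B)               # even iterates: apply M twice
        U, s, Vt = np.linalg.svd(S, full_matrices=False)
        S = (U[:, :k] * s[:k]) @ Vt[:k]       # keep at most k singular values
        S /= np.linalg.norm(S)                # renormalize
    phi = np.trace(S.T @ M(M(S, A, B), A, B)) # objective tr(S^T M^2(S))
    return S, phi
```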

1 citation


Cited by
Journal Article
TL;DR: The Manopt toolbox, as discussed by the authors, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms.
Abstract: Optimization on manifolds is a rapidly developing branch of nonlinear optimization. Its focus is on problems where the smooth geometry of the search space can be leveraged to design efficient numerical algorithms. In particular, optimization on manifolds is well suited to deal with rank and orthogonality constraints. Such structured constraints appear pervasively in machine learning applications, including low-rank matrix completion, sensor network localization, camera network registration, independent component analysis, metric learning, dimensionality reduction, and so on. The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms. By dealing internally with most of the differential geometry, the package aims particularly at lowering the barrier to entry.
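Manopt itself is a MATLAB toolbox, so the following toolbox-free numpy sketch only illustrates the kind of computation it automates: Riemannian gradient ascent of the Rayleigh quotient on the unit sphere, whose maximizers are dominant eigenvectors of a symmetric matrix. The step size and iteration count are arbitrary illustrative choices.

```python
# A toolbox-free sketch of what a manifold-optimization package automates:
# maximizing the Rayleigh quotient x^T A x over the unit sphere, which
# yields a dominant eigenvector of the symmetric matrix A.
import numpy as np

def dominant_eigvec(A, step=0.1, iters=500, seed=0):
    x = np.random.default_rng(seed).standard_normal(A.shape[0])
    x /= np.linalg.norm(x)                   # start on the sphere
    for _ in range(iters):
        egrad = 2 * A @ x                    # Euclidean gradient of x^T A x
        rgrad = egrad - (x @ egrad) * x      # project onto tangent space
        x += step * rgrad                    # ascent step
        x /= np.linalg.norm(x)               # retraction: renormalize
    return x
```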

775 citations

Journal ArticleDOI
TL;DR: This work proposes a new algorithm for matrix completion that minimizes the least-squares distance on the sampling set over the Riemannian manifold of fixed-rank matrices, and proves convergence of a regularized version of the algorithm under the assumption that the restricted isometry property holds for incoherent matrices throughout the iterations.
Abstract: The matrix completion problem consists of finding or approximating a low-rank matrix based on a few samples of this matrix. We propose a new algorithm for matrix completion that minimizes the least-squares distance on the sampling set over the Riemannian manifold of fixed-rank matrices. The algorithm is an adaptation of classical nonlinear conjugate gradients, developed within the framework of retraction-based optimization on manifolds. We describe all the objects from differential geometry necessary to perform optimization over this low-rank matrix manifold, seen as a submanifold embedded in the space of matrices. In particular, we describe how metric projection can be used as a retraction and how vector transport lets us obtain the conjugate search directions. Finally, we prove convergence of a regularized version of our algorithm under the assumption that the restricted isometry property holds for incoherent matrices throughout the iterations. The numerical experiments indicate that our approach...
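A heavily simplified sketch of two of these ingredients: plain gradient steps on the sampled least-squares loss combined with a truncated SVD as the metric-projection retraction. The paper's conjugate directions and vector transport are omitted, and mask, M_obs, and the unit step size are assumptions for the example.

```python
# A minimal sketch, not the paper's Riemannian CG: gradient steps on the
# sampled least-squares loss, followed by metric projection (truncated SVD)
# back to the rank-k matrices.
import numpy as np

def svd_project(X, k):
    """Metric projection of X onto matrices of rank at most k."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

def complete(M_obs, mask, k, step=1.0, iters=200):
    """Minimize 0.5 * ||mask * (X - M_obs)||_F^2 over rank-k matrices."""
    X = svd_project(mask * M_obs, k)         # rank-k start from the samples
    for _ in range(iters):
        residual = mask * (X - M_obs)        # gradient of the sampled loss
        X = svd_project(X - step * residual, k)
    return X
```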

512 citations

Journal ArticleDOI
TL;DR: In this article, the authors introduce set-valued analysis and convex analysis, as well as Sobolev spaces and the Fourier transform.
Abstract: The Projection Theorem. Theorems on Extension and Separation. Dual Spaces and Transposed Operators. The Banach Theorem and the Banach-Steinhaus Theorem. Construction of Hilbert Spaces. L² Spaces and Convolution Operators. Sobolev Spaces of Functions of One Variable. Some Approximation Procedures in Spaces of Functions. Sobolev Spaces of Functions of Several Variables and the Fourier Transform. Introduction to Set-Valued Analysis and Convex Analysis. Elementary Spectral Theory. Hilbert-Schmidt Operators and Tensor Products. Boundary Value Problems. Differential-Operational Equations and Semigroups of Operators. Viability Kernels and Capture Basins. First-Order Partial Differential Equations. Selection of Results. Exercises. Bibliography. Index.

302 citations

Posted Content
TL;DR: The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms, aimed particularly at lowering the barrier to entry.
Abstract: Optimization on manifolds is a rapidly developing branch of nonlinear optimization. Its focus is on problems where the smooth geometry of the search space can be leveraged to design efficient numerical algorithms. In particular, optimization on manifolds is well suited to deal with rank and orthogonality constraints. Such structured constraints appear pervasively in machine learning applications, including low-rank matrix completion, sensor network localization, camera network registration, independent component analysis, metric learning, dimensionality reduction, and so on. The Manopt toolbox, available at www.manopt.org, is a user-friendly, documented piece of software dedicated to simplifying experimentation with state-of-the-art Riemannian optimization algorithms. We aim particularly at reaching practitioners outside our field.

139 citations

Journal ArticleDOI
TL;DR: In this article, the authors derive convergence results for projected line-search methods on the real-algebraic variety of real $m \times n$ matrices of rank at most $k$.
Abstract: The aim of this paper is to derive convergence results for projected line-search methods on the real-algebraic variety $\mathcal{M}_{\le k}$ of real $m \times n$ matrices of rank at most $k$. Such methods extend Riemannian optimization methods, which are successfully used on the smooth manifold $\mathcal{M}_k$ of rank-$k$ matrices, to its closure by taking steps along gradient-related directions in the tangent cone, and afterwards projecting back to $\mathcal{M}_{\le k}$. Considering such a method circumvents the difficulties which arise from the nonclosedness and the unbounded curvature of $\mathcal{M}_k$. The pointwise convergence is obtained for real-analytic functions on the basis of a Łojasiewicz inequality for the projection of the antigradient to the tangent cone. If the derived limit point lies on the smooth part of $\mathcal{M}_{\le k}$, i.e., in $\mathcal{M}_k$, this boils down to more or less known results, but with the benefit that asymptotic convergence rate estimates (for specific step-sizes...
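A crude sketch of one step of such a method under simplifying assumptions: it backtracks along the plain Euclidean antigradient and projects back onto the rank-at-most-k matrices with a truncated SVD, whereas the paper works with the projection of the antigradient onto the tangent cone; the callables f and grad_f are assumed to be supplied by the user.

```python
# A crude sketch of one projected line-search step on the variety of
# matrices of rank at most k; the paper uses the projection of the
# antigradient onto the tangent cone, which this sketch omits.
import numpy as np

def project_rank_k(X, k):
    """Truncated SVD: nearest matrix of rank at most k."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k]

def projected_linesearch_step(X, f, grad_f, k, step=1.0, beta=0.5, c=1e-4):
    """One Armijo backtracking step that stays on rank <= k matrices."""
    G = grad_f(X)
    fX, slope = f(X), np.sum(G * G)           # ||G||_F^2 as the Armijo slope
    while step > 1e-12:
        X_new = project_rank_k(X - step * G, k)
        if f(X_new) <= fX - c * step * slope:  # sufficient decrease test
            return X_new
        step *= beta                           # shrink the step and retry
    return X                                   # no acceptable step found
```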

110 citations