Manifold Optimization Over the Set of Doubly Stochastic Matrices: A Second-Order Geometry
Citations
Geoopt: Riemannian Optimization in PyTorch
Probabilistic Permutation Synchronization Using the Riemannian Structure of the Birkhoff Polytope
Geometry-aware Domain Adaptation for Unsupervised Alignment of Word Embeddings
High order discriminant analysis based on Riemannian optimization
References
Convex Optimization
Neural Networks for Pattern Recognition
Numerical Optimization
The Matrix Cookbook
Optimization Algorithms on Matrix Manifolds
Frequently Asked Questions (12)
Q2. What are the contributions mentioned in the paper "Manifold optimization over the set of doubly stochastic matrices: a second-order geometry" ?
The paper addresses a subset of convex optimization problems using a different, Riemannian approach: it introduces three manifolds for solving convex programs under particular box constraints.
Q3. What is the general idea behind unconstrained Euclidean numerical optimization methods?
The general idea behind unconstrained Euclidean numerical optimization methods is to start with an initial point X0 and to iteratively update it according to certain predefined rules in order to obtain a sequence {Xt} that converges to a local minimizer of the objective function.
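As a minimal sketch of this idea (not the paper's Riemannian algorithm), the following implements the simplest such update rule, gradient descent, where each iterate moves against the gradient until the gradient norm falls below a tolerance:

```python
import numpy as np

def gradient_descent(grad_f, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Iteratively apply x_{t+1} = x_t - step * grad_f(x_t) until
    the gradient norm drops below tol, yielding a sequence that
    converges to a local minimizer for suitable step sizes."""
    x = x0
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Minimize f(x) = ||x - c||^2, whose unique minimizer is c.
c = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: 2 * (x - c), np.zeros(2))
```

Riemannian methods follow the same template, except the update direction lives in the tangent space and a retraction maps it back onto the manifold.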
Q4. What is the explanation for the performance of the symmetric multinomial?
One can note that the symmetric multinomial performs better than the positive one, which can be explained by the fact that the optimal solution has vanishing eigenvalues, making the retraction onto the cone of positive matrices inefficient.
Q5. How many optimization problems are set up using regularizers?
For each of the manifolds, an optimization problem is set up using regularizers in order to reach the optimal solution of optimization problem (70) with M = SP+n .
Q6. What is the retraction of the set of doubly stochastic matrices?
Although the projection onto the set of doubly stochastic matrices is difficult [33], this paper proposes a highly efficient retraction that takes advantage of the structure of both the manifold and its tangent space.
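A sketch of one such structure-exploiting retraction, in the spirit of an element-wise exponential followed by Sinkhorn-Knopp normalization (the paper's exact construction may differ in details), could look like:

```python
import numpy as np

def sinkhorn(A, n_iter=100):
    """Alternately normalize rows and columns of a positive matrix
    (Sinkhorn-Knopp), converging to a doubly stochastic matrix."""
    for _ in range(n_iter):
        A = A / A.sum(axis=1, keepdims=True)
        A = A / A.sum(axis=0, keepdims=True)
    return A

def retraction(X, xi):
    """Map a tangent direction xi at a doubly stochastic X back onto
    the manifold: element-wise exponential scaling keeps all entries
    positive, and Sinkhorn normalization restores the row/column sums."""
    return sinkhorn(X * np.exp(xi / X))

X = np.full((3, 3), 1.0 / 3)           # a doubly stochastic starting point
xi = np.array([[ 0.1, -0.1, 0.0],
               [-0.1,  0.1, 0.0],
               [ 0.0,  0.0, 0.0]])     # zero row/column sums: a tangent direction
Y = retraction(X, xi)
```

This is cheap because it never solves the (hard) exact projection problem: positivity and the sum constraints are handled by the manifold's own structure.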
Q7. What is the reason why the objective function is simpler than the manifold?
This can be explained by the fact that not only is the objective function (72) simpler than (73), but also the manifold contains fewer degrees of freedom, which makes the projections more efficient.
Q8. What is the role of the tangent space in the optimization algorithms?
The tangent space plays a fundamental role in optimization algorithms over manifolds, in the same way that the derivative of a function plays an important role in Euclidean optimization.
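To make this concrete, here is an illustrative projection onto the tangent space of the doubly stochastic matrices under the plain embedded Euclidean metric, namely the subspace of matrices with zero row and column sums (the paper works with the Fisher information metric, under which the projection takes a different, weighted form):

```python
import numpy as np

def project_tangent(Z):
    """Orthogonally project Z onto the subspace of matrices with zero
    row and column sums via double centering with J = I - (1/n) 11^T."""
    n = Z.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return J @ Z @ J

rng = np.random.default_rng(0)
Z = rng.standard_normal((4, 4))
P = project_tangent(Z)
```

Just as a Euclidean method follows the (projected) gradient, a Riemannian method follows the Riemannian gradient, which lives in this tangent space.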
Q9. What is the retraction of the definite symmetric multinomial manifold?
Unlike the previous retractions, which rely on the Euclidean structure of the embedding space, this retraction is obtained by directly verifying the properties of a retraction given in Section II.
Q10. What is the plot of the proposed framework?
The plot reveals that the proposed framework is highly efficient in high dimensions, with a significant gain over the specialized algorithm.
Q11. Why is the symmetric manifold more efficient than the unconstrained one?
This is mainly due to the fact that the Riemannian optimization approach converts a constrained optimization problem into an unconstrained one over a constrained set.
Q12. What is the gain in performance of the proposed algorithm?
The gain in performance can be explained by the fact that the proposed method exploits the geometry of the problem efficiently, unlike generic solvers, which convert the problem into a standard form and solve it using standard methods.