Open Access Posted Content

An Optimal-Storage Approach to Semidefinite Programming using Approximate Complementarity

TL;DR
The paper develops a new storage-optimal algorithm that provably solves generic semidefinite programs (SDPs) in standard form and is particularly effective for weakly constrained SDPs.
Abstract
This paper develops a new storage-optimal algorithm that provably solves generic semidefinite programs (SDPs) in standard form. This method is particularly effective for weakly constrained SDPs. The key idea is to formulate an approximate complementarity principle: Given an approximate solution to the dual SDP, the primal SDP has an approximate solution whose range is contained in the eigenspace with small eigenvalues of the dual slack matrix. For weakly constrained SDPs, this eigenspace has very low dimension, so this observation significantly reduces the search space for the primal solution. This result suggests an algorithmic strategy that can be implemented with minimal storage: (1) solve the dual SDP approximately; (2) compress the primal SDP to the eigenspace with small eigenvalues of the dual slack matrix; (3) solve the compressed primal SDP. The paper also provides numerical experiments showing that this approach is successful for a range of interesting large-scale SDPs.
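The three-step strategy lends itself to a compact implementation. The Python sketch below illustrates steps (2) and (3) under the assumption that an approximate dual solution y_hat is already available from step (1); the function name compressed_primal_solve, the use of CVXPY for the small compressed problem, and the exact equality constraints are illustrative choices, not the authors' implementation (the paper works with approximate feasibility of the compressed problem).

```python
# Illustrative sketch of the compress-then-solve strategy; not the authors' code.
import numpy as np
import cvxpy as cp  # assumption: CVXPY handles only the small k x k compressed SDP


def compressed_primal_solve(C, A_list, b, y_hat, k):
    """Recover an approximate primal solution X from an approximate dual y_hat.

    C      : (n, n) symmetric cost matrix
    A_list : list of (n, n) symmetric constraint matrices A_i, with <A_i, X> = b_i
    b      : (m,) right-hand side
    y_hat  : (m,) approximate dual solution from step (1)
    k      : dimension of the search eigenspace (small for weakly constrained SDPs)
    """
    # Dual slack matrix S = C - sum_i y_i A_i for a standard-form SDP.
    S = C - sum(y * A for y, A in zip(y_hat, A_list))

    # Step (2): basis for the eigenspace of the k smallest eigenvalues of S.
    # np.linalg.eigh returns eigenvalues in ascending order.
    _, eigvecs = np.linalg.eigh(S)
    V = eigvecs[:, :k]  # by approximate complementarity, range(X) is close to range(V)

    # Step (3): compressed primal SDP over a k x k PSD variable Z, with X ~ V Z V^T.
    Z = cp.Variable((k, k), PSD=True)
    C_small = V.T @ C @ V
    A_small = [V.T @ A @ V for A in A_list]
    constraints = [cp.trace(Ai @ Z) == bi for Ai, bi in zip(A_small, b)]
    cp.Problem(cp.Minimize(cp.trace(C_small @ Z)), constraints).solve()

    return V @ Z.value @ V.T  # approximate primal solution of rank at most k
```

Once the dual iteration is finished, only the n x k basis V and the compressed k x k problem data need to be stored, which is where the storage savings for weakly constrained SDPs come from.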


Citations
Journal Article

Scalable Semidefinite Programming

TL;DR: Numerical evidence shows that this provably correct algorithm for solving large SDPs, which economizes on both storage and arithmetic costs, is effective for a range of applications, including relaxations of MaxCut, abstract phase retrieval, and quadratic assignment.
Posted Content

A Survey of Recent Scalability Improvements for Semidefinite Programming with Applications in Machine Learning, Control, and Robotics.

TL;DR: This article surveys recent approaches to scalable semidefinite programming, including those that exploit structure, those that produce low-rank approximate solutions to semidefinite programs, and those that use more scalable algorithms relying on augmented Lagrangian techniques and the alternating direction method of multipliers.
Posted Content

A hierarchy of spectral relaxations for polynomial optimization

TL;DR: This article shows that any constrained polynomial optimization problem (POP) has an equivalent formulation on a variety contained in a Euclidean sphere, and that the resulting semidefinite relaxations in the moment-SOS hierarchy have the constant trace property (CTP) for the matrices involved.
References
Book

Convex Optimization

TL;DR: This book gives a comprehensive introduction to convex optimization, with the focus not on the optimization problems themselves but on recognizing convex optimization problems and finding the most appropriate technique for solving them.
Book

Distributed Optimization and Statistical Learning Via the Alternating Direction Method of Multipliers

TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
Journal Article

A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems

TL;DR: This paper presents a new fast iterative shrinkage-thresholding algorithm (FISTA) that preserves the computational simplicity of ISTA but has a global rate of convergence that is proven to be significantly better, both theoretically and practically.
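Since FISTA's mechanics fit in a few lines, a minimal sketch applied to the l1-regularized least-squares (LASSO) problem min_x 0.5*||Ax - b||^2 + lam*||x||_1 is given below; the constant step size 1/L, the iteration count, and the helper soft are illustrative choices rather than details taken from the cited paper.

```python
# Minimal FISTA sketch for LASSO: min_x 0.5*||Ax - b||^2 + lam*||x||_1 (illustrative only).
import numpy as np


def fista_lasso(A, b, lam, n_iters=500):
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part's gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    soft = lambda v, thr: np.sign(v) * np.maximum(np.abs(v) - thr, 0.0)  # soft-thresholding prox
    for _ in range(n_iters):
        grad = A.T @ (A @ y - b)              # gradient of the smooth part at y
        x_next = soft(y - grad / L, lam / L)  # ISTA (proximal gradient) step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x
```

The extrapolation step is the only per-iteration change relative to ISTA, yet it improves the worst-case convergence rate from O(1/k) to O(1/k^2).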