Open Access Journal ArticleDOI

Conic Optimization via Operator Splitting and Homogeneous Self-Dual Embedding

TLDR
In this article, the alternating direction method of multipliers is used to solve the homogeneous self-dual embedding, an equivalent feasibility problem that involves finding a nonzero point in the intersection of a subspace and a cone.
Abstract
We introduce a first-order method for solving very large convex cone programs. The method uses an operator splitting method, the alternating direction method of multipliers (ADMM), to solve the homogeneous self-dual embedding, an equivalent feasibility problem involving finding a nonzero point in the intersection of a subspace and a cone. This approach has several favorable properties. Compared to interior-point methods, first-order methods scale to very large problems, at the cost of requiring more time to reach very high accuracy. Compared to other first-order methods for cone programs, our approach finds both primal and dual solutions when available, or a certificate of infeasibility or unboundedness otherwise; it is parameter free; and the per-iteration cost of the method is the same as applying a splitting method to the primal or dual alone. We discuss efficient implementation of the method in detail, including direct and indirect methods for computing the projection onto the subspace, scaling the original problem data, and stopping criteria. We describe an open-source implementation, which handles the usual (symmetric) nonnegative, second-order, and semidefinite cones as well as the (non-self-dual) exponential and power cones and their duals. We report numerical results that show speedups over interior-point cone solvers for large problems, and scaling to very large general cone programs.
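The abstract's iteration, ADMM applied to the homogeneous self-dual embedding, can be sketched in a few lines. The following is a minimal illustration, not the SCS implementation: it hand-builds the skew-symmetric embedding matrix Q for a tiny linear program (the data A, b, c are invented for the example), alternates a subspace projection (here a dense "direct method" solve) with a cone projection, and recovers the primal solution by dividing out the homogenizing variable tau.

```python
import numpy as np

# Tiny LP in conic form: minimize c^T x  s.t.  Ax + s = b, s >= 0.
# A = -I, b = -1 encodes x >= 1, so the optimum is x = (1, 1).
A = -np.eye(2)
b = -np.ones(2)
c = np.ones(2)
n, m = 2, 2
N = n + m + 1  # variables u = (x, y, tau)

# Skew-symmetric matrix Q of the homogeneous self-dual embedding
Q = np.zeros((N, N))
Q[:n, n:n + m] = A.T
Q[:n, -1] = c
Q[n:n + m, :n] = -A
Q[n:n + m, -1] = b
Q[-1, :n] = -c
Q[-1, n:n + m] = -b

def proj_C(z):
    """Project onto C = R^n x R^m_+ x R_+ (x free, dual y and tau nonneg)."""
    out = z.copy()
    out[n:] = np.maximum(out[n:], 0.0)
    return out

u = np.zeros(N)
v = np.zeros(N)
u[-1] = 1.0  # initialize tau = 1 ...
v[-1] = 1.0  # ... and kappa = 1, to steer away from the zero solution

M = np.linalg.inv(np.eye(N) + Q)  # "direct method": factor once, reuse

for _ in range(5000):
    u_tilde = M @ (u + v)      # projection onto the subspace v = Qu
    u = proj_C(u_tilde - v)    # projection onto the cone
    v = v - u_tilde + u        # dual update

x = u[:n] / u[-1]  # recover the primal solution from the embedding
```

A dense inverse is used only because the example is tiny; the paper's direct method caches a sparse permuted LDL^T factorization, and the indirect method replaces the solve with conjugate gradients.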



Citations
Proceedings Article

Fast Active Set Methods for Online Spike Inference from Calcium Imaging

TL;DR: This work presents a fast online active set method for the sparse nonnegative deconvolution problem arising in spike inference, applied to whole-brain zebrafish imaging data, enabling real-time online spike inference during the imaging session with a remarkable increase in processing speed.
Proceedings ArticleDOI

Infeasibility Detection in the Alternating Direction Method of Multipliers for Convex Optimization

TL;DR: It is shown that in the limit the ADMM iterates either satisfy a set of first-order optimality conditions or produce a certificate of either primal or dual infeasibility for a wide class of convex optimization problems including both quadratic and conic programs.
Proceedings ArticleDOI

Fast ADMM for semidefinite programs with chordal sparsity

TL;DR: In this paper, chordal decomposition is applied to solve SDPs with chordal sparsity based on the alternating direction method of multipliers (ADMM), resulting in scaled versions of ADMM algorithms with the same computational cost.
Journal ArticleDOI

Bivariate Partial Information Decomposition: The Optimization Perspective

TL;DR: The solution of the Bertschinger, Rauh, Olbrich, Jost, and Ay Convex Program is discussed from theoretical and practical points of view.
Book ChapterDOI

Synthesis in pMDPs: A Tale of 1001 Parameters

TL;DR: It is shown that the synthesis problem for parametric Markov decision processes whose transitions are equipped with affine functions over a finite set of parameters can be formulated as a quadratically-constrained quadratic program (QCQP) and is non-convex in general.
References

Journal ArticleDOI

Regression Shrinkage and Selection via the Lasso

TL;DR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
Book

Matrix Computations

Gene H. Golub and Charles F. Van Loan
Book

Convex Optimization

TL;DR: A comprehensive introduction to convex optimization, with a focus not on the underlying theory alone but on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Book

Distributed Optimization and Statistical Learning Via the Alternating Direction Method of Multipliers

TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.