Topic

Convex optimization

About: Convex optimization is a research topic. Over its lifetime, 24,906 publications have been published within this topic, receiving 908,795 citations. The topic is also known as: convex optimisation.


Papers
Proceedings Article
08 Dec 2008
TL;DR: A sharp bound, holding with high probability, is given on the excess risk of the output of an online algorithm in terms of the average regret; this allows one to use recent algorithms with logarithmic cumulative regret guarantees to achieve fast convergence rates for the excess risk with high probability.
Abstract: This paper examines the generalization properties of online convex programming algorithms when the loss function is Lipschitz and strongly convex. Our main result is a sharp bound, that holds with high probability, on the excess risk of the output of an online algorithm in terms of the average regret. This allows one to use recent algorithms with logarithmic cumulative regret guarantees to achieve fast convergence rates for the excess risk with high probability. As a corollary, we characterize the convergence rate of PEGASOS (with high probability), a recently proposed method for solving the SVM optimization problem.
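The PEGASOS method mentioned in the abstract is a projected stochastic sub-gradient scheme for the regularized hinge loss. Below is a minimal, hedged sketch of a Pegasos-style solver in NumPy; the function name, toy interface, and default parameters are illustrative assumptions, not code from the paper.

```python
# Sketch of a Pegasos-style stochastic sub-gradient method for the SVM
# objective (lam/2)*||w||^2 + (1/m) * sum_i max(0, 1 - y_i * <w, x_i>).
# Names and defaults are illustrative assumptions, not the paper's code.
import numpy as np

def pegasos(X, y, lam=0.1, n_iters=1000, seed=0):
    rng = np.random.default_rng(seed)
    m, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.integers(m)                  # sample one training example
        eta = 1.0 / (lam * t)                # step size 1/(lambda * t)
        margin = y[i] * (X[i] @ w)
        w *= (1.0 - eta * lam)               # shrink: gradient of the L2 term
        if margin < 1.0:                     # hinge loss is active
            w += eta * y[i] * X[i]
        # optional projection onto the ball of radius 1/sqrt(lambda)
        norm = np.linalg.norm(w)
        radius = 1.0 / np.sqrt(lam)
        if norm > radius:
            w *= radius / norm
    return w
```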

169 citations

Journal ArticleDOI
TL;DR: A new method, the Inexact Spectral Projected Gradient method (ISPG), is introduced for large-scale convex constrained optimization; it generalizes the Spectral Projected Gradient method (SPG) and can be used when projections are difficult to compute.
Abstract: A new method is introduced for large-scale convex constrained optimization. The general model algorithm involves, at each iteration, the approximate minimization of a convex quadratic on the feasible set of the original problem and global convergence is obtained by means of nonmonotone line searches. A specific algorithm, the Inexact Spectral Projected Gradient method (ISPG), is implemented using inexact projections computed by Dykstra's alternating projection method and generates interior iterates. The ISPG method is a generalization of the Spectral Projected Gradient method (SPG), but can be used when projections are difficult to compute. Numerical results for constrained least-squares rectangular matrix problems are presented.
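As an illustration of the idea, the sketch below pairs a basic spectral projected gradient loop (which treats the projection as a black box) with Dykstra's alternating projections for approximately projecting onto the intersection of two convex sets. It uses a simple Armijo backtracking search rather than the paper's nonmonotone line search, and all names, defaults, and the toy example are assumptions for this sketch.

```python
# Sketch: Dykstra's alternating projections plus a basic spectral (BB-step)
# projected gradient loop. Not the paper's ISPG implementation.
import numpy as np

def dykstra(z, proj_C, proj_D, n_iters=50):
    """Approximate projection of z onto the intersection of C and D."""
    x, p, q = z.copy(), np.zeros_like(z), np.zeros_like(z)
    for _ in range(n_iters):
        y = proj_C(x + p)
        p = x + p - y
        x = proj_D(y + q)
        q = y + q - x
    return x

def spg(f, grad, project, x0, n_iters=100):
    """Projected gradient with a Barzilai-Borwein (spectral) step length."""
    x = project(np.asarray(x0, dtype=float))
    g = grad(x)
    lam = 1.0
    for _ in range(n_iters):
        d = project(x - lam * g) - x          # projected gradient direction
        # simple Armijo backtracking (the paper uses a nonmonotone search)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * (g @ d) and alpha > 1e-10:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, yv = x_new - x, g_new - g
        lam = float(np.clip((s @ s) / (s @ yv), 1e-10, 1e10)) if s @ yv > 0 else 1e10
        x, g = x_new, g_new
    return x

# toy example: project onto the intersection of a box and the unit ball
c = np.array([3.0, -2.0])
proj_box = lambda v: np.clip(v, -1.0, 1.0)
proj_ball = lambda v: v / max(1.0, np.linalg.norm(v))
project = lambda v: dykstra(v, proj_box, proj_ball)
print(spg(lambda x: np.sum((x - c) ** 2), lambda x: 2 * (x - c), project, np.zeros(2)))
```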

169 citations

Journal ArticleDOI
TL;DR: This paper characterizes properties of the null space of the linear operator defining the constraint set that are necessary and sufficient for the nuclear-norm heuristic to succeed, and obtains dimension-free bounds under which these null space properties hold almost surely as the matrix dimensions tend to infinity.
Abstract: Minimizing the rank of a matrix subject to constraints is a challenging problem that arises in many applications in machine learning, control theory, and discrete geometry. This class of optimization problems, known as rank minimization, is NP-hard, and for most practical problems there are no efficient algorithms that yield exact solutions. A popular heuristic replaces the rank function with the nuclear norm—equal to the sum of the singular values—of the decision variable and has been shown to provide the optimal low rank solution in a variety of scenarios. In this paper, we assess the practical performance of this heuristic for finding the minimum rank matrix subject to linear equality constraints. We characterize properties of the null space of the linear operator defining the constraint set that are necessary and sufficient for the heuristic to succeed. We then analyze linear constraints sampled uniformly at random, and obtain dimension-free bounds under which our null space properties hold almost surely as the matrix dimensions tend to infinity. Finally, we provide empirical evidence that these probabilistic bounds provide accurate predictions of the heuristic’s performance in non-asymptotic scenarios.
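The nuclear-norm heuristic itself is a convex program and can be prototyped directly with an off-the-shelf modeling tool. The sketch below, which is not from the paper, uses CVXPY to minimize the nuclear norm subject to random trace-form linear equality constraints on a synthetic low-rank matrix; the problem sizes and random data are illustrative assumptions.

```python
# Sketch of the nuclear-norm heuristic for rank minimization under linear
# equality constraints, prototyped with CVXPY on synthetic data.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, r, m = 12, 2, 100                     # matrix size, true rank, number of measurements

# synthetic low-rank ground truth and random trace-form linear measurements
X_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
A = rng.standard_normal((m, n, n))
b = np.array([np.sum(Ai * X_true) for Ai in A])

# nuclear-norm heuristic: minimize the sum of singular values of X
X = cp.Variable((n, n))
constraints = [cp.sum(cp.multiply(A[i], X)) == b[i] for i in range(m)]
cp.Problem(cp.Minimize(cp.normNuc(X)), constraints).solve()

print("relative recovery error:",
      np.linalg.norm(X.value - X_true) / np.linalg.norm(X_true))
```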

169 citations

Journal ArticleDOI
TL;DR: A heuristic online algorithm, referred to as look-ahead water-filling, is described that jointly adapts to both channel fading state and backlog and achieves a considerable reduction in energy relative to water-filling based solely on channel states.
Abstract: This paper investigates the problem of energy-efficient transmission of data packets in a wireless network by jointly adapting to backlog and channel condition. Specifically, we consider minimum-energy scheduling problems over multiple-access channels, broadcast channels, and channels with fading, when packets of all users need to be transmitted before a deadline T. Earlier work has considered a similar setup and demonstrated significant transmission energy saving by adapting to backlog for channels that are time invariant and when transmission is restricted to time-division. For concreteness, throughout the paper, rates and powers corresponding to optimal coding over discrete-time additive white Gaussian noise (AWGN) channels are assumed. The results, however, hold for more general channels and coding schemes where the total transmitted power is convex in the transmission rates. The offline scheduling problems for all the channels considered are shown to reduce to convex optimization problems with linear constraints. An iterative algorithm, referred to as FlowRight, that finds optimal offline schedules is presented. A heuristic online algorithm that we call look-ahead water-filling, which jointly adapts to both channel fading state and backlog is described. By the use of a small buffer which introduces an almost fixed delay, this algorithm achieves a considerable reduction in energy relative to water filling solely on channel states.
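For context, classic water-filling allocates power p_i = max(0, mu - 1/g_i) across channel gains g_i, with the water level mu chosen to exhaust the power budget; the paper's look-ahead heuristic adds backlog adaptation on top of this kind of channel-state allocation. The sketch below is a minimal bisection implementation of plain water-filling, not the paper's algorithm; the function name and toy inputs are assumptions.

```python
# Sketch of classic water-filling over known channel gains (not the paper's
# look-ahead algorithm): allocate p_i = max(0, mu - 1/g_i) so that the total
# power equals the budget, with mu found by bisection.
import numpy as np

def water_fill(gains, power_budget, tol=1e-9):
    """Power allocation maximizing sum_i log(1 + g_i * p_i) under a power budget."""
    inv_g = 1.0 / np.asarray(gains, dtype=float)
    lo, hi = 0.0, inv_g.max() + power_budget      # bracket the water level mu
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        used = np.maximum(mu - inv_g, 0.0).sum()
        if used > power_budget:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - inv_g, 0.0)

print(water_fill([1.0, 0.5, 2.0], power_budget=3.0))
```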

169 citations

Journal ArticleDOI
TL;DR: For minimizing functions that are compositions of convex or prox-regular functions with smooth vector functions, an algorithmic framework is proposed based on a subproblem constructed from a linearized approximation to the objective plus a regularization term.
Abstract: We consider minimization of functions that are compositions of convex or prox-regular functions (possibly extended-valued) with smooth vector functions. A wide variety of important optimization problems fall into this framework. We describe an algorithmic framework based on a subproblem constructed from a linearized approximation to the objective and a regularization term. Properties of local solutions of this subproblem underlie both a global convergence result and an identification property of the active manifold containing the solution of the original problem. Preliminary computational results on both convex and nonconvex examples are promising.
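To make the framework concrete, the sketch below performs one linearize-and-regularize step for a composite objective h(c(x)) with h taken as the l1 norm, solving the convex subproblem with CVXPY. The toy map c, its Jacobian, the parameter mu, and the fixed iteration count are illustrative assumptions, not details from the paper.

```python
# Sketch of one prox-linear step for min_x h(c(x)) with h = ||.||_1:
# linearize c at the current x, add a quadratic regularization term, and
# solve the resulting convex subproblem with CVXPY. Illustrative only.
import numpy as np
import cvxpy as cp

def prox_linear_step(c, jac, x, mu=1.0):
    """One linearize-and-regularize step for the composite objective ||c(x)||_1."""
    cx, J = c(x), jac(x)
    d = cp.Variable(x.shape[0])
    objective = cp.norm1(cx + J @ d) + (mu / 2) * cp.sum_squares(d)
    cp.Problem(cp.Minimize(objective)).solve()
    return x + d.value

# toy smooth vector map c(x) and its Jacobian (hypothetical example)
c = lambda x: np.array([x[0] ** 2 - 1.0, np.sin(x[1])])
jac = lambda x: np.array([[2.0 * x[0], 0.0], [0.0, np.cos(x[1])]])

x = np.array([2.0, 1.0])
for _ in range(10):
    x = prox_linear_step(c, jac, x)
print(x)   # x[0] approaches a root of x^2 - 1 and x[1] moves toward a root of sin
```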

168 citations


Network Information
Related Topics (5)
Optimization problem: 96.4K papers, 2.1M citations, 94% related
Robustness (computer science): 94.7K papers, 1.6M citations, 89% related
Linear system: 59.5K papers, 1.4M citations, 88% related
Markov chain: 51.9K papers, 1.3M citations, 86% related
Control theory: 299.6K papers, 3.1M citations, 83% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    392
2022    849
2021    1,461
2020    1,673
2019    1,677
2018    1,580