Topic

Convex optimization

About: Convex optimization is a research topic. Over its lifetime, 24,906 publications on the topic have been published, receiving 908,795 citations. The topic is also known as: convex optimisation.


Papers
Proceedings ArticleDOI
06 Nov 2011
TL;DR: It is shown that tracking multiple people whose paths may intersect can be formulated as a convex global optimization problem; the resulting method preserves identities better than state-of-the-art algorithms while keeping similar MOTA scores.
Abstract: In this paper, we show that tracking multiple people whose paths may intersect can be formulated as a convex global optimization problem. Our proposed framework is designed to exploit image appearance cues to prevent identity switches. Our method is effective even when such cues are only available at distant time intervals. This is unlike many current approaches that depend on appearance being exploitable from frame to frame. We validate our approach on three multi-camera sport and pedestrian datasets that contain long and complex sequences. Our algorithm preserves identities better than state-of-the-art algorithms while keeping similar MOTA scores.

263 citations
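A toy sketch of the kind of convex formulation involved: multi-target data association posed as a linear-programming relaxation of min-cost flow on a spatio-temporal graph. This is an illustration under assumptions, not the authors' actual formulation; the graph, costs, and track count below are invented, and the cvxpy package is assumed available.

import cvxpy as cp
import numpy as np

frames, dets = 3, 2                        # 3 frames, 2 detections per frame
edges, costs = [], []
for i in range(dets):                      # virtual source -> first frame
    edges.append(("S", ("d", 0, i))); costs.append(0.0)
for t in range(frames - 1):                # frame-to-frame transition edges
    for i in range(dets):
        for j in range(dets):
            edges.append((("d", t, i), ("d", t + 1, j)))
            costs.append(0.1 if i == j else 1.0)   # made-up motion costs
for i in range(dets):                      # last frame -> virtual sink
    edges.append((("d", frames - 1, i), "T")); costs.append(0.0)

f = cp.Variable(len(edges), nonneg=True)   # relaxed edge-flow variables
cons = [f <= 1]
for n in [("d", t, i) for t in range(frames) for i in range(dets)]:
    # Flow conservation at every detection node: inflow equals outflow.
    cons.append(sum(f[k] for k, e in enumerate(edges) if e[1] == n)
                == sum(f[k] for k, e in enumerate(edges) if e[0] == n))
cons.append(sum(f[k] for k, e in enumerate(edges) if e[0] == "S") == 2)  # 2 tracks

prob = cp.Problem(cp.Minimize(np.array(costs) @ f), cons)
prob.solve()
print(np.round(f.value, 2))   # near-binary: the network-flow polytope is integral

Because the constraint matrix of a network flow is totally unimodular, the LP relaxation typically returns integral flows, i.e., actual trajectories, which is what makes a globally optimal convex treatment of tracking possible.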

Journal ArticleDOI
TL;DR: The potential for convex optimization methods to be much more widely used in signal processing is shown, and the disciplined convex programming framework, already useful for transforming problems to a standard form, may be extended to create the solvers themselves.
Abstract: This article shows the potential for convex optimization methods to be much more widely used in signal processing. In particular, automatic code generation makes it easier to create convex optimization solvers that are made much faster by being designed for a specific problem family. The disciplined convex programming framework that has been shown useful in transforming problems to a standard form may be extended to create solvers themselves. Much work remains to be done in exploring the capabilities and limitations of automatic code generation. As computing power increases, and as automatic code generation improves, the authors expect convex optimization solvers to be found more and more often in real-time signal processing applications.

263 citations
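A rough illustration of the disciplined convex programming style the article builds on, using the open-source cvxpy modeling package (an assumption; the article's focus, automatic code generation, would then compile such a specification into a fast problem-specific solver). The data here are random placeholders.

import cvxpy as cp
import numpy as np

np.random.seed(0)
m, n = 20, 10
A, b = np.random.randn(m, n), np.random.randn(m)

x = cp.Variable(n)
# Every expression is built from atoms of known curvature, so the modeling
# layer can certify convexity and reduce the problem to a standard cone form.
objective = cp.Minimize(cp.sum_squares(A @ x - b) + 0.5 * cp.norm1(x))
constraints = [cp.norm(x, "inf") <= 1]
prob = cp.Problem(objective, constraints)
prob.solve()
print(prob.status, np.round(x.value, 3))

A code generator pushes the same reduction one step further: instead of re-verifying and re-canonicalizing the problem at every solve, it emits standalone solver code specialized to the problem family, which is what makes real-time use plausible.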

Journal ArticleDOI
TL;DR: A computer-aided design (CAD) method and associated architectures are proposed for linear controllers based on recent results that parameterize all controllers that stabilize a given plant.
Abstract: A computer-aided design (CAD) method and associated architectures are proposed for linear controllers. The design method and architecture are based on recent results that parameterize all controllers that stabilize a given plant. With this architecture, the design of controllers is a convex programming problem that can be solved numerically. Constraints on the closed-loop system, such as asymptotic tracking, decoupling, limits on peak excursions of variables, step response, settling time, and overshoot, as well as frequency-domain inequalities, are readily incorporated in the design. The minimization objective is quite general, with LQG (linear quadratic Gaussian), $H_\infty$, and new $\ell_1$ types as special cases. The constraints and objective are specified in a control specification language which is natural for the control engineer, referring directly to step responses, noise powers, transfer functions, and so on.

263 citations
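A minimal sketch (my construction, not the paper's CAD system) of why this controller design is convex: for a stable SISO discrete-time plant, the Youla parameterization makes every achievable closed-loop map T = PQ linear in the free parameter Q, so time-domain specifications like the ones listed above become linear constraints on Q's coefficients. The plant, horizon, and specification numbers below are invented, and cvxpy is assumed.

import cvxpy as cp
import numpy as np

T_len, N = 60, 20
k = np.arange(T_len)
p = 0.5 * (0.8 ** k)             # made-up stable plant impulse response

q = cp.Variable(N)               # FIR Youla parameter Q(z)
conv = np.zeros((T_len, N))      # convolution matrix: h = p * q is linear in q
for i in range(N):
    conv[i:, i] = p[: T_len - i]
h = conv @ q                     # closed-loop impulse response
step = cp.cumsum(h)              # closed-loop step response

constraints = [
    step[T_len - 1] == 1,        # asymptotic tracking (unit DC gain)
    step <= 1.05,                # at most 5% overshoot
    step[30:] >= 0.98,           # settled within 30 samples
]
# A convex surrogate for control effort: keep the Youla parameter small.
prob = cp.Problem(cp.Minimize(cp.norm(q, 2)), constraints)
prob.solve()
print(prob.status)

Every achievable closed loop corresponds to some stable Q and vice versa, so searching over Q loses nothing; this is the parameterization of all stabilizing controllers that the abstract refers to.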

Journal ArticleDOI
TL;DR: This framework applies to arbitrary structure-inducing norms as well as to a wide range of measurement ensembles, and allows us to give sample complexity bounds for problems such as sparse phase retrieval and low-rank tensor completion.
Abstract: Recovering structured models (e.g., sparse or group-sparse vectors, low-rank matrices) given a few linear observations has been well studied recently. In various applications in signal processing and machine learning, the model of interest is structured in several ways, for example, a matrix that is simultaneously sparse and low rank. Often norms that promote the individual structures are known, and allow for recovery using an orderwise optimal number of measurements (e.g., the $\ell_1$ norm for sparsity, the nuclear norm for matrix rank). Hence, it is reasonable to minimize a combination of such norms. We show that, surprisingly, using multiobjective optimization with these norms can do no better, orderwise, than exploiting only one of the structures, thus revealing a fundamental limitation in sample complexity. This result suggests that to fully exploit the multiple structures, we need an entirely new convex relaxation. Further, specializing our results to the case of sparse and low-rank matrices, we show that a nonconvex formulation recovers the model from very few measurements (on the order of the degrees of freedom), whereas the convex problem combining the $\ell_1$ and nuclear norms requires many more measurements, illustrating a gap between the performance of the convex and nonconvex recovery problems. Our framework applies to arbitrary structure-inducing norms as well as to a wide range of measurement ensembles. This allows us to give sample complexity bounds for problems such as sparse phase retrieval and low-rank tensor completion.

263 citations
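A toy version of the combined-norm convex program the paper analyzes, with made-up data, an arbitrary trade-off weight, and cvxpy assumed. The ground truth is simultaneously sparse and rank one, and the estimator minimizes an $\ell_1$ plus nuclear-norm objective subject to the measurements:

import cvxpy as cp
import numpy as np

np.random.seed(1)
n, m = 10, 60
u = np.zeros(n); u[:3] = np.random.randn(3)
X0 = np.outer(u, u)                  # ground truth: sparse AND rank one
A = np.random.randn(m, n * n)        # generic Gaussian measurement ensemble
y = A @ X0.ravel(order="F")          # column-major, matching cp.vec below

X = cp.Variable((n, n))
tau = 1.0                            # trade-off weight (arbitrary choice)
obj = cp.Minimize(cp.sum(cp.abs(X)) + tau * cp.normNuc(X))
prob = cp.Problem(obj, [A @ cp.vec(X) == y])
prob.solve()
print(np.linalg.norm(X.value - X0))  # recovery error

The paper's point is precisely that this combined relaxation can need many more measurements than the degrees of freedom of the model, so for small m the printed error may remain large even though a suitable nonconvex formulation would succeed.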

Journal ArticleDOI
TL;DR: This paper presents a systematic way to construct ZGS algorithms, shows that a subset of them converges exponentially, and obtains lower and upper bounds on their convergence rates in terms of the convexity characteristics of the problem and the network topology, including its algebraic connectivity.
Abstract: This technical note presents a set of continuous-time distributed algorithms that solve unconstrained, separable, convex optimization problems over undirected networks with fixed topologies. The algorithms are developed using a Lyapunov function candidate that exploits convexity, and are called Zero-Gradient-Sum (ZGS) algorithms as they yield nonlinear networked dynamical systems that evolve invariantly on a zero-gradient-sum manifold and converge asymptotically to the unknown optimizer. We also describe a systematic way to construct ZGS algorithms, show that a subset of them actually converges exponentially, and obtain lower and upper bounds on their convergence rates in terms of the network topologies, problem characteristics, and algorithm parameters, including the algebraic connectivity, Laplacian spectral radius, and function curvatures. The findings of this technical note may be regarded as a natural generalization of several well-known algorithms and results for distributed consensus to distributed convex optimization.

262 citations
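A minimal numerical sketch (mine, not the paper's) of a Zero-Gradient-Sum flow on a 4-node undirected cycle with scalar quadratic local costs f_i(x) = (a_i/2)(x - c_i)^2. Each node starts at its own minimizer, which places the network on the manifold sum_i f_i'(x_i) = 0; the flow keeps it there while all states converge to the optimizer of sum_i f_i.

import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0])   # local curvatures (Hessians of f_i)
c = np.array([0.0, 1.0, 2.0, 3.0])   # local minimizers of f_i
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # undirected cycle graph

x = c.copy()                          # x_i(0) = argmin f_i  (ZGS initialization)
dt = 0.01
for _ in range(5000):                 # forward-Euler integration of the flow
    # ZGS dynamics: x_i' = (f_i''(x_i))^{-1} * sum_{j in N_i} (x_j - x_i);
    # summing f_i''(x_i) * x_i' over i gives zero on an undirected graph,
    # so sum_i f_i'(x_i(t)) stays zero along trajectories.
    dx = np.array([sum(x[j] - x[i] for j in adj[i]) / a[i] for i in range(4)])
    x = x + dt * dx

x_star = (a @ c) / a.sum()            # closed-form optimizer of sum_i f_i
print(x, x_star)                      # every x_i approaches x_star = 2.0

Note that no gradients are exchanged at run time; only the states x_i cross the network, and convexity (here, the curvatures a_i) enters through the invariant manifold and the Lyapunov argument.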


Network Information
Related Topics (5)
Optimization problem: 96.4K papers, 2.1M citations (94% related)
Robustness (computer science): 94.7K papers, 1.6M citations (89% related)
Linear system: 59.5K papers, 1.4M citations (88% related)
Markov chain: 51.9K papers, 1.3M citations (86% related)
Control theory: 299.6K papers, 3.1M citations (83% related)
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2023    392
2022    849
2021    1,461
2020    1,673
2019    1,677
2018    1,580