Topic

Convex optimization

About: Convex optimization is a research topic. Over its lifetime, 24,906 publications have been published within this topic, receiving 908,795 citations. The topic is also known as convex optimisation.


Papers
Journal ArticleDOI
TL;DR: This paper analyzes several new methods for solving optimization problems whose objective function is formed as a sum of two terms: one smooth and given by a black-box oracle, and the other a simple general convex function with known structure.
Abstract: In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is a simple general convex function with known structure. Despite the absence of good properties of the sum, such problems, both in convex and nonconvex cases, can be solved with efficiency typical for the first part of the objective. For convex problems of the above structure, we consider primal and dual variants of the gradient method (with convergence rate $$O\left({1 \over k}\right)$$ ), and an accelerated multistep version with convergence rate $$O\left({1 \over k^2}\right)$$ , where $$k$$ is the iteration counter. For nonconvex problems with this structure, we prove convergence to a point from which there is no descent direction. In contrast, we show that for general nonsmooth, nonconvex problems, even resolving the question of whether a descent direction exists from a point is NP-hard. For all methods, we suggest some efficient “line search” procedures and show that the additional computational work necessary for estimating the unknown problem class parameters can only multiply the complexity of each iteration by a small constant factor. We present also the results of preliminary computational experiments, which confirm the superiority of the accelerated scheme.

1,444 citations
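The composite structure described above (a smooth black-box term plus a simple structured term) is the setting of proximal gradient methods. Below is a minimal NumPy sketch of an accelerated multistep scheme in the FISTA style applied to a toy LASSO instance; the problem data, fixed step size, and choice of the simple term as an l1 penalty are illustrative assumptions, not the paper's exact algorithm or its line-search procedures.

import numpy as np

def soft_threshold(v, t):
    """Prox operator of t * ||.||_1, one simple choice of the structured term g."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_proximal_gradient(grad_f, prox_g, x0, step, n_iters=500):
    """Accelerated (O(1/k^2)) scheme for min_x f(x) + g(x), f smooth, g simple."""
    x = y = x0.copy()
    t = 1.0
    for _ in range(n_iters):
        x_new = prox_g(y - step * grad_f(y), step)     # gradient step on f, prox step on g
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # multistep (momentum) extrapolation
        x, t = x_new, t_new
    return x

# Toy LASSO instance: f(x) = 0.5*||Ax - b||^2 (smooth, black-box), g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
grad_f = lambda x: A.T @ (A @ x - b)
prox_g = lambda v, s: soft_threshold(v, lam * s)
step = 1.0 / np.linalg.norm(A, 2) ** 2                 # 1/L with L = ||A||_2^2
x_hat = accelerated_proximal_gradient(grad_f, prox_g, np.zeros(100), step)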

Book
Elad Hazan
10 Aug 2016
TL;DR: This monograph portrays optimization as a process: one applies an optimization method that learns as it goes along, drawing on experience as more aspects of the problem are observed.
Abstract: This monograph portrays optimization as a process. In many practical applications the environment is so complex that it is infeasible to lay out a comprehensive theoretical model and use classical algorithmic theory and mathematical optimization. It is necessary as well as beneficial to take a robust approach, by applying an optimization method that learns as one goes along, learning from experience as more aspects of the problem are observed. This view of optimization as a process has become prominent in varied fields and has led to some spectacular success in modeling and systems that are now part of our daily lives.

1,438 citations
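A concrete instance of "optimization as a process" is online gradient descent, a basic algorithm of online convex optimization. The sketch below shows the generic play/observe/update loop over a Euclidean ball; the step-size schedule, feasible set, and loss interface are assumptions chosen for illustration, not a particular algorithm from the monograph.

import numpy as np

def online_gradient_descent(loss_grads, x0, radius=1.0):
    """Play x_t, observe the gradient of the t-th loss, then take a projected step."""
    x = x0.copy()
    decisions = []
    for t, grad in enumerate(loss_grads, start=1):
        decisions.append(x.copy())
        eta = radius / np.sqrt(t)        # diminishing step size, a standard choice
        x = x - eta * grad(x)            # descend on the loss just revealed
        norm = np.linalg.norm(x)
        if norm > radius:                # project back onto the feasible ball
            x = x * (radius / norm)
    return decisions

# Example: a stream of quadratic losses f_t(x) = ||x - z_t||^2 with gradients 2(x - z_t)
rng = np.random.default_rng(0)
targets = [0.3 * rng.standard_normal(3) for _ in range(100)]
grads = [lambda x, z=z: 2.0 * (x - z) for z in targets]
played = online_gradient_descent(grads, np.zeros(3))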

Journal ArticleDOI
TL;DR: An algorithm involving convex optimization is proposed to design a controller guaranteeing a suboptimal maximal delay such that the system can be stabilized for all admissible uncertainties.
Abstract: This paper concerns a problem of robust stabilization of uncertain state-delayed systems. A new delay-dependent stabilization condition using a memoryless controller is formulated in terms of matrix inequalities. An algorithm involving convex optimization is proposed to design a controller guaranteeing a suboptimal maximal delay such that the system can be stabilized for all admissible uncertainties.

1,432 citations
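The convex optimization step in stabilization results of this kind typically amounts to a linear matrix inequality (LMI) feasibility problem. The paper's delay-dependent condition is considerably more involved; the CVXPY sketch below only illustrates the generic pattern on the standard Lyapunov inequality A^T P + P A < 0, where the system matrix and strictness margin are illustrative assumptions.

import cvxpy as cp
import numpy as np

# Generic LMI feasibility pattern: find P > 0 with A^T P + P A < 0.
A = np.array([[-1.0,  2.0],
              [ 0.0, -3.0]])           # example stable system matrix (assumption)
eps = 1e-6                             # small margin to enforce strict inequalities

P = cp.Variable((2, 2), symmetric=True)
constraints = [P >> eps * np.eye(2),
               A.T @ P + P @ A << -eps * np.eye(2)]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()
print(prob.status, P.value)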

Journal ArticleDOI
TL;DR: This paper provides a general framework to convert notions of simplicity into convex penalty functions, resulting in convex optimization solutions to linear, underdetermined inverse problems.
Abstract: In applications throughout science and engineering one is often faced with the challenge of solving an ill-posed inverse problem, where the number of available measurements is smaller than the dimension of the model to be estimated. However in many practical situations of interest, models are constrained structurally so that they only have a few degrees of freedom relative to their ambient dimension. This paper provides a general framework to convert notions of simplicity into convex penalty functions, resulting in convex optimization solutions to linear, underdetermined inverse problems. The class of simple models considered includes those formed as the sum of a few atoms from some (possibly infinite) elementary atomic set; examples include well-studied cases from many technical fields such as sparse vectors (signal processing, statistics) and low-rank matrices (control, statistics), as well as several others including sums of a few permutation matrices (ranked elections, multiobject tracking), low-rank tensors (computer vision, neuroscience), orthogonal matrices (machine learning), and atomic measures (system identification). The convex programming formulation is based on minimizing the norm induced by the convex hull of the atomic set; this norm is referred to as the atomic norm. The facial structure of the atomic norm ball carries a number of favorable properties that are useful for recovering simple models, and an analysis of the underlying convex geometry provides sharp estimates of the number of generic measurements required for exact and robust recovery of models from partial information. These estimates are based on computing the Gaussian widths of tangent cones to the atomic norm ball. When the atomic set has algebraic structure the resulting optimization problems can be solved or approximated via semidefinite programming. The quality of these approximations affects the number of measurements required for recovery, and this tradeoff is characterized via some examples. Thus this work extends the catalog of simple models (beyond sparse vectors and low-rank matrices) that can be recovered from limited linear information via tractable convex programming.

1,431 citations
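For the simplest atomic set, the signed standard basis vectors, the induced atomic norm is the l1 norm and the recovery program is basis pursuit. A small CVXPY sketch of that special case follows; the dimensions and data are illustrative assumptions, and richer atomic sets (for instance low-rank matrices via the nuclear norm) follow the same minimize-the-atomic-norm pattern.

import cvxpy as cp
import numpy as np

# Sparse vectors: atoms are +/- standard basis vectors, so the atomic norm is ||.||_1.
rng = np.random.default_rng(1)
n, m, k = 100, 40, 5
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))        # m generic linear measurements
b = A @ x_true

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == b])   # basis pursuit
prob.solve()
print(np.linalg.norm(x.value - x_true))                      # near-exact recovery expected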

Journal ArticleDOI
TL;DR: In this paper, the minimum throughput over all ground users in the downlink communication was maximized by optimizing the multiuser communication scheduling and association jointly with the UAV's trajectory and power control.
Abstract: Due to the high maneuverability, flexible deployment, and low cost, unmanned aerial vehicles (UAVs) have attracted significant interest recently in assisting wireless communication. This paper considers a multi-UAV enabled wireless communication system, where multiple UAV-mounted aerial base stations are employed to serve a group of users on the ground. To achieve fair performance among users, we maximize the minimum throughput over all ground users in the downlink communication by optimizing the multiuser communication scheduling and association jointly with the UAV’s trajectory and power control. The formulated problem is a mixed integer nonconvex optimization problem that is challenging to solve. As such, we propose an efficient iterative algorithm for solving it by applying the block coordinate descent and successive convex optimization techniques. Specifically, the user scheduling and association, UAV trajectory, and transmit power are alternately optimized in each iteration. In particular, for the nonconvex UAV trajectory and transmit power optimization problems, two approximate convex optimization problems are solved, respectively. We further show that the proposed algorithm is guaranteed to converge. To speed up the algorithm convergence and achieve good throughput, a low-complexity and systematic initialization scheme is also proposed for the UAV trajectory design based on the simple circular trajectory and the circle packing scheme. Extensive simulation results are provided to demonstrate the significant throughput gains of the proposed design as compared to other benchmark schemes.

1,361 citations
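The algorithmic pattern described above, alternately optimizing one block of variables while the others are held fixed and convexifying the nonconvex blocks, is block coordinate descent with successive convex approximation. The toy sketch below illustrates only the alternating structure on a small biconvex function; it is not the paper's UAV scheduling, trajectory, or power-control formulation.

import numpy as np

def block_coordinate_descent(x0, y0, n_iters=50, reg=0.1):
    """Minimize the nonconvex f(x, y) = (x*y - 1)^2 + reg*(x^2 + y^2) by exactly
    solving the convex subproblem in one block while the other block is fixed."""
    x, y = x0, y0
    for _ in range(n_iters):
        x = y / (y * y + reg)   # argmin over x with y fixed (convex quadratic in x)
        y = x / (x * x + reg)   # argmin over y with x fixed (convex quadratic in y)
    return x, y

x_opt, y_opt = block_coordinate_descent(2.0, 0.5)
print(x_opt, y_opt, (x_opt * y_opt - 1.0) ** 2)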


Network Information
Related Topics (5)
Optimization problem: 96.4K papers, 2.1M citations, 94% related
Robustness (computer science): 94.7K papers, 1.6M citations, 89% related
Linear system: 59.5K papers, 1.4M citations, 88% related
Markov chain: 51.9K papers, 1.3M citations, 86% related
Control theory: 299.6K papers, 3.1M citations, 83% related
Performance Metrics
No. of papers in the topic in previous years

Year    Papers
2023    392
2022    849
2021    1,461
2020    1,673
2019    1,677
2018    1,580