Accelerated gradient methods and dual decomposition in distributed model predictive control
TLDR
The evaluation shows that the proposed distributed optimization algorithm for mixed ℓ1/ℓ2-norm optimization, based on accelerated gradient methods using dual decomposition, can outperform the state-of-the-art optimization software CPLEX and MOSEK.
About:
This article is published in Automatica. The article was published on 2013-03-01 and is currently open access. It has received 265 citations to date. The article focuses on the topics: Optimization problem & Duality (optimization).
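The combination of dual decomposition and accelerated gradients that the article evaluates can be illustrated on a toy problem. The sketch below is illustrative only (the quadratic subproblems, variable names, and two-subsystem coupling are assumptions, not the paper's MPC formulation): two local subproblems are solved independently for a fixed price vector, and the coupling-constraint residual drives a Nesterov-accelerated update of the dual variable.

```python
import numpy as np

def dual_decomposition_accel(a1, a2, c, n_iter=200):
    """Solve  min 0.5||x1-a1||^2 + 0.5||x2-a2||^2  s.t.  x1 + x2 = c
    by accelerated gradient ascent on the dual variable lam."""
    lam = np.zeros_like(c)
    lam_prev = lam.copy()
    L = 2.0  # Lipschitz constant of the dual gradient (two subsystems)
    for k in range(1, n_iter + 1):
        # Nesterov extrapolation on the dual iterate
        mu = lam + (k - 1) / (k + 2) * (lam - lam_prev)
        # local subproblems decouple for fixed price mu (closed form here)
        x1 = a1 - mu
        x2 = a2 - mu
        # dual gradient = coupling-constraint residual
        lam_prev = lam
        lam = mu + (x1 + x2 - c) / L
    return x1, x2, lam
```

In a distributed MPC setting each `x_i` would be a subsystem's predicted trajectory and the subproblem solves would run in parallel; only the residual `x1 + x2 - c` needs to be communicated.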
Citations
Proceedings ArticleDOI
Deep learning-based embedded mixed-integer model predictive control
Benjamin Karg, Sergio Lucia +1 more
TL;DR: It is suggested that using deep learning networks to learn model predictive controllers is a powerful alternative to online optimization, especially when the underlying problems are complex, as in the case of mixed-integer quadratic programs.
Journal ArticleDOI
A comparison of distributed MPC schemes on a hydro-power plant benchmark
José M. Maestre, Miguel A. Ridao, Attila Kozma, Carlo Savorgnan, Moritz Diehl, Minh Dang Doan, Anna Sadowska, Tamas Keviczky, B. De Schutter, Holger Scheu, Wolfgang Marquardt, Felipe Valencia, Jairo Espinosa +13 more
TL;DR: In this article, the authors compared five distributed model predictive control (DMPC) schemes on a hydro-power plant benchmark, providing qualitative and quantitative comparisons between the schemes on a common benchmark, a type of assessment that is rare in the literature.
Journal ArticleDOI
High-dimensional microarray dataset classification using an improved adam optimizer (iAdam)
TL;DR: The proposed Improved Adam (iAdam) technique combines a look-ahead mechanism with a per-parameter adaptive learning rate; results demonstrate that iAdam is suitable for classifying high-dimensional data and prevents the model from overfitting by effectively handling the bias-variance trade-off.
Journal ArticleDOI
Distributed algorithm for dynamic economic power dispatch with energy storage in smart grids
TL;DR: In this article, the authors considered the dynamic economic dispatch problem with energy storage in a smart grid scenario, which aims at minimising the aggregate generation costs over multiple periods on condition that the time-varying demand is met, while physical constraints on generation and storage as well as system spinning reserve requirement are satisfied.
Journal ArticleDOI
Fast gradient-based distributed optimisation approach for model predictive control and application in four-tank benchmark
TL;DR: In this article, a Lagrangian dual method is introduced to deal with the optimisation problem, in which the primal problem is solved by a parallel coordinate descent method and a fast dual ascent method is adopted to solve the dual problem iteratively.
References
Book
Convex Optimization
Stephen Boyd, Lieven Vandenberghe +1 more
TL;DR: In this article, a comprehensive introduction to convex optimization is given; the focus is not on the theory of optimization itself but on recognizing convex optimization problems and then finding the most appropriate technique for solving them.
Journal ArticleDOI
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
Amir Beck, Marc Teboulle +1 more
TL;DR: Proposes a new fast iterative shrinkage-thresholding algorithm (FISTA) that preserves the computational simplicity of ISTA but achieves a global rate of convergence proven to be significantly better, both theoretically and practically.
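The algorithm summarized above can be sketched concretely for the ℓ1-regularized least-squares (LASSO) problem, a standard FISTA application; the problem instance and parameter names below are illustrative, not taken from the paper:

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||x||_1 (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def fista(A, b, lam, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 with FISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)           # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # momentum extrapolation
        x, t = x_new, t_new
    return x
```

Each iteration costs the same as plain ISTA (one gradient step plus a shrinkage); the momentum extrapolation on `y` is what improves the convergence rate from O(1/k) to O(1/k²).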
Book
Introductory Lectures on Convex Optimization: A Basic Course
TL;DR: Presents polynomial-time interior-point methods for convex optimization, whose importance lies not only in the complexity bounds but also in the theoretical prediction of their high efficiency, which is supported by excellent computational results.
Journal ArticleDOI
Smooth minimization of non-smooth functions
TL;DR: A new approach for constructing efficient schemes for non-smooth convex optimization is proposed, based on a special smoothing technique, which can be applied to functions with explicit max-structure, and can be considered as an alternative to black-box minimization.