Convex Optimization for Big Data
TL;DR: A review of convex optimization algorithms for Big Data that aim to reduce the computational, storage, and communications bottlenecks.
Abstract: This article reviews recent advances in convex optimization algorithms for Big Data, which aim to reduce the computational, storage, and communications bottlenecks. We provide an overview of this emerging field, describe contemporary approximation techniques like first-order methods and randomization for scalability, and survey the important role of parallel and distributed computation. The new Big Data algorithms are based on surprisingly simple principles and attain staggering accelerations even on classical problems.
Citations
Journal ArticleDOI
A survey of machine learning for big data processing
TL;DR: A literature survey of the latest advances in research on machine learning for big data processing, highlighting promising methods from recent studies such as representation learning, deep learning, distributed and parallel learning, transfer learning, active learning, and kernel-based learning.
Journal ArticleDOI
A Unified Algorithmic Framework for Block-Structured Optimization Involving Big Data: With applications in machine learning and signal processing
TL;DR: Discusses various features and properties of the block successive upper-bound minimization (BSUM) framework from the viewpoint of design flexibility, computational efficiency, parallel/distributed implementation, and the required communication overhead.
Proceedings ArticleDOI
Fast and flexible convolutional sparse coding
TL;DR: The proposed method is the first efficient approach to allow proper boundary conditions to be imposed; it also supports feature learning from incomplete data as well as general reconstruction problems.
Journal ArticleDOI
Bayesian computation: a summary of the current state, and samples backwards and forwards
TL;DR: The difficulties of modelling and then handling ever more complex datasets most likely call for a new type of tool for computational inference that dramatically reduces the dimension and size of the raw data while capturing its essential aspects.
Journal ArticleDOI
Live Data Analytics With Collaborative Edge and Cloud Processing in Wireless IoT Networks
TL;DR: Proposes a novel framework for coordinated processing between edge and cloud computing by integrating the advantages of both platforms, and identifies various synergies and distinctions between cloud and edge processing.
References
Book
Distributed Optimization and Statistical Learning Via the Alternating Direction Method of Multipliers
TL;DR: It is argued that the alternating direction method of multipliers is well suited to distributed convex optimization, and in particular to large-scale problems arising in statistics, machine learning, and related areas.
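The ADMM iteration the survey describes can be illustrated on the lasso, a classic splitting target. The sketch below is a minimal illustration of the method, not code from the referenced book; the function names (`admm_lasso`, `soft_threshold`) and the fixed penalty `rho` are my own choices.

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of k * ||.||_1 (elementwise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    # ADMM sketch for: minimize 0.5 * ||A x - b||^2 + lam * ||x||_1,
    # split as x-update (ridge-like solve), z-update (shrinkage), dual update.
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    # Factor (A^T A + rho I) once; this is the expensive step, reused each iteration.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        z = soft_threshold(x + u, lam / rho)
        u = u + x - z
    return z
```

Caching the Cholesky factor is what makes the per-iteration cost low: each pass reduces to two triangular solves and a shrinkage.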
Journal ArticleDOI
A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
Amir Beck, Marc Teboulle, et al.
TL;DR: A new fast iterative shrinkage-thresholding algorithm (FISTA) that preserves the computational simplicity of ISTA but has a global rate of convergence proven to be significantly better, both theoretically and practically.
Journal ArticleDOI
A First-Order Primal-Dual Algorithm for Convex Problems with Applications to Imaging
Antonin Chambolle, Thomas Pock, et al.
TL;DR: A first-order primal-dual algorithm for non-smooth convex optimization problems with known saddle-point structure; it achieves O(1/N²) convergence on problems where the primal or the dual objective is uniformly convex, and linear convergence, i.e. O(ω^N) for some ω ∈ (0, 1), on smooth problems.
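The primal-dual iteration alternates a dual ascent step, a primal descent step, and an over-relaxation. This is a minimal sketch of that scheme for problems of the form min_x g(x) + f(Kx); the function names and the step-size choice (equal τ and σ scaled to satisfy τσ‖K‖² ≤ 1) are my own illustration, not the authors' implementation.

```python
import numpy as np

def chambolle_pock(K, prox_fstar, prox_g, x0, n_iter=500):
    # Primal-dual sketch for: minimize g(x) + f(K x).
    # prox_fstar(y, sigma) is the prox of sigma * f* (f's convex conjugate);
    # prox_g(v, tau) is the prox of tau * g.
    Lk = np.linalg.norm(K, 2)          # operator norm of K
    tau = sigma = 0.99 / Lk            # ensures tau * sigma * ||K||^2 < 1
    x = x0.copy()
    xbar = x0.copy()
    y = np.zeros(K.shape[0])
    for _ in range(n_iter):
        y = prox_fstar(y + sigma * (K @ xbar), sigma)  # dual ascent
        x_new = prox_g(x - tau * (K.T @ y), tau)       # primal descent
        xbar = 2.0 * x_new - x                         # over-relaxation
        x = x_new
    return x
```

With K a discrete difference operator, f = λ‖·‖₁, and g the quadratic data fit, this recovers 1D total-variation denoising, a standard imaging test case for the method.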
Book
Introductory Lectures on Convex Optimization: A Basic Course
TL;DR: Presents a polynomial-time interior-point method for linear optimization whose merit lay not only in its complexity bound but also in the theoretical prediction of its high efficiency, which was supported by excellent computational results.
Journal ArticleDOI
Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions
TL;DR: This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation, and presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions.
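The two-stage structure surveyed here (sketch the range with a random matrix, then decompose a small projected problem) can be shown in a few lines. This is a minimal sketch of that randomized low-rank scheme; the name `randomized_svd` and the default oversampling of 10 are my own choices for illustration.

```python
import numpy as np

def randomized_svd(A, k, oversample=10, seed=0):
    # Randomized rank-k SVD sketch.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Stage 1: capture the (approximate) range of A with a Gaussian sketch.
    Omega = rng.standard_normal((n, k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)       # orthonormal basis, m x (k + oversample)
    # Stage 2: deterministic SVD of the small projected matrix B = Q^T A.
    B = Q.T @ A
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_hat
    return U[:, :k], s[:k], Vt[:k]
```

The expensive pass over A happens only in the two matrix products, which is what makes the approach attractive at Big Data scale; all dense linear algebra runs on the small (k + oversample)-sized factors.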