Compressive parameter estimation via K-median clustering
Dian Mo, Marco F. Duarte +1 more
TLDR
The use of the earth mover's distance (EMD), applied to a pair of true and estimated parametric dictionary (PD) coefficient vectors, is proposed to measure the parameter estimation error, and it is shown that the EMD provides a better-suited metric for parameter estimation performance than the Euclidean distance.
About
This article is published in Signal Processing. It was published on 2018-01-01 and is currently open access. It has received 6 citations to date. The article focuses on the topics: Estimation theory & Sparse approximation.
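As a concrete illustration of the metric the TLDR describes: for two nonnegative coefficient vectors of equal total mass supported on the same one-dimensional grid, the EMD reduces to the ℓ1 norm of the difference of their cumulative sums. The sketch below (illustrative names, not the authors' code) contrasts EMD with the Euclidean distance for a shifted spike:

```python
import numpy as np

def emd_1d(a, b):
    """EMD between two equal-mass 1-D coefficient vectors on the same grid.

    In 1-D this equals the L1 norm of the difference of cumulative sums.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    assert np.isclose(a.sum(), b.sum()), "EMD requires equal total mass"
    return float(np.abs(np.cumsum(a - b)).sum())

true_c = np.array([0.0, 1.0, 0.0, 0.0])   # spike at index 1
est_c  = np.array([0.0, 0.0, 1.0, 0.0])   # estimate shifted to index 2
print(emd_1d(true_c, est_c))              # 1.0: one unit of mass moved one bin
print(np.linalg.norm(true_c - est_c))     # ≈ 1.414, the same for any shift
```

The Euclidean distance between two disjoint spikes is √2 no matter how far apart they are, while the EMD grows with the shift, which is why it better reflects parameter estimation error.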
Citations
Journal ArticleDOI
Overlap Aware Compressed Signal Classification
TL;DR: This paper proposes the use of a machine learning method known as overlap-aware learning, together with CSP, which generates a smoother decision boundary and hence improves classification accuracy at higher undersampling factors; simulation results show a trend of improved classification accuracy using the proposed method.
Proceedings ArticleDOI
A Comprehensive Review of Machine Learning in Multi-objective Optimization
TL;DR: In this article, the authors provide a global view of ML methods for multi-objective optimization problems (MOPs) and a reference for applying ML methods to solve specific types of MOPs.
Dissertation
Compressive Acquisition and Processing of Sparse Analog Signals
TL;DR: This dissertation examines how the application of a compressive measurement kernel impacts signal recovery performance and investigates methods to infer the current signal complexity from the compressive observations.
Journal ArticleDOI
An improved (1+1) evolutionary algorithm for k-median clustering problem with performance guarantee
TL;DR: It is proved that the (1+1) EA can obtain a performance guarantee of 5 for the k-median problem in polynomial expected runtime O(mn² ⋅ d_max) if all distances between data points and cluster centers are integers.
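A minimal sketch of a (1+1) EA for k-median, assuming a single-center swap mutation and elitist acceptance (the paper's exact operator and analysis may differ):

```python
import random

def kmedian_cost(points, centers, dist):
    """Sum over points of the distance to the nearest center."""
    return sum(min(dist(p, c) for c in centers) for p in points)

def one_plus_one_ea(points, k, dist, iters=2000, seed=0):
    """(1+1) EA sketch: one parent solution, mutate one center, keep if not worse."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    cost = kmedian_cost(points, centers, dist)
    for _ in range(iters):
        child = centers[:]
        child[rng.randrange(k)] = rng.choice(points)  # mutate: replace one center
        c = kmedian_cost(points, child, dist)
        if c <= cost:                                 # elitist selection
            centers, cost = child, c
    return centers, cost

points = [0.0, 1.0, 2.0, 10.0, 11.0, 12.0]
centers, cost = one_plus_one_ea(points, k=2, dist=lambda a, b: abs(a - b))
print(sorted(centers), cost)   # best found, e.g. [1.0, 11.0] with cost 4.0
```

Restricting centers to data points matches the discrete k-median setting the guarantee is stated for; accepting equal-cost children lets the EA drift across plateaus.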
Journal ArticleDOI
Partially Coupled Stochastic Gradient Estimation for Multivariate Equation-Error Systems
TL;DR: By expanding the scalar innovation of each subsystem model to an innovation vector, a partially coupled multi-innovation generalized stochastic gradient algorithm is proposed; simulations indicate that the proposed algorithms are effective and have good parameter estimation performance.
References
Journal ArticleDOI
Algorithms for the Assignment and Transportation Problems
TL;DR: In this paper, algorithms for the solution of the general assignment and transportation problems are presented, and the assignment algorithm is generalized to one for the transportation problem.
Journal ArticleDOI
Greed is good: algorithmic results for sparse approximation
TL;DR: This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries and develops a sufficient condition under which OMP can identify atoms from an optimal approximation of a nonsparse signal.
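The greedy loop that OMP uses can be sketched as follows (a minimal illustration, not the article's implementation): at each step, select the dictionary atom most correlated with the current residual, refit all selected atoms by least squares, and update the residual.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal matching pursuit sketch: greedily pick k atoms of dictionary D.

    Assumes D has unit-norm columns; y is the observed signal.
    """
    residual = y.copy()
    support = []
    for _ in range(k):
        # pick the atom most correlated with the residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # least-squares refit on all selected atoms, then update residual
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

# usage on a random redundant dictionary (illustrative sizes)
rng = np.random.default_rng(0)
D = rng.standard_normal((30, 60))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms
x_true = np.zeros(60)
x_true[[5, 21]] = [1.5, -2.0]
x_hat = omp(D, D @ x_true, k=2)
```

Because the least-squares refit makes the residual orthogonal to every selected atom, OMP never re-selects an atom already in the support.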
Journal ArticleDOI
Compressive Sensing [Lecture Notes]
TL;DR: This lecture note presents a new method, called compressive sensing, to capture and represent compressible signals at a rate significantly below the Nyquist rate; it employs nonadaptive linear projections that preserve the structure of the signal.
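The "nonadaptive linear projections" in this note amount to measuring y = Φx with a fixed random matrix Φ that has far fewer rows than the signal dimension; a toy sketch (all names and sizes illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 128, 32, 4          # ambient dimension, measurements (m << n), sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # k-sparse signal
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # nonadaptive random projection
y = Phi @ x                   # m linear measurements, far below the Nyquist rate
print(y.shape)                # (32,)
```

The projection is nonadaptive in that Φ is drawn once, independently of x; recovering x from y then requires a sparse solver such as OMP or CoSaMP from the other entries in this list.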
Journal ArticleDOI
Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
David L. Donoho, Michael Elad +1 more
TL;DR: This article obtains parallel results in a more general setting, where the dictionary D can arise from two or several bases, frames, or even less structured systems, and sketches three applications: separating linear features from planar ones in 3D data, noncooperative multiuser encoding, and identification of over-complete independent component models.
Journal ArticleDOI
CoSaMP: iterative signal recovery from incomplete and inaccurate samples
Deanna Needell, Joel A. Tropp +1 more
TL;DR: This extended abstract describes a recent algorithm, called CoSaMP, that accomplishes the data recovery task and was the first known method to offer near-optimal guarantees on resource usage.
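CoSaMP's iteration can be sketched as follows (a simplified illustration based on the algorithm's usual description, not the authors' code): form a signal proxy from the residual, merge its 2k largest entries with the current support, solve least squares over the merged support, and prune back to the k largest coefficients.

```python
import numpy as np

def cosamp(A, y, k, iters=10):
    """CoSaMP sketch: proxy, support merge, least-squares fit, prune to k."""
    n = A.shape[1]
    x = np.zeros(n)
    residual = y.copy()
    for _ in range(iters):
        proxy = A.T @ residual
        # candidate support: 2k largest proxy entries plus the current support
        omega = np.argsort(np.abs(proxy))[-2 * k:]
        support = np.union1d(omega, np.flatnonzero(x))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        # prune to the k largest coefficients
        b = np.zeros(n)
        b[support] = coef
        keep = np.argsort(np.abs(b))[-k:]
        x = np.zeros(n)
        x[keep] = b[keep]
        residual = y - A @ x
        if np.linalg.norm(residual) < 1e-10:
            break
    return x
```

Unlike OMP, which adds one atom per iteration, CoSaMP selects and prunes in batches, which is what enables the near-optimal resource-usage guarantees the abstract mentions.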