Open Access Proceedings Article

Robust Inverse Covariance Estimation under Noisy Measurements

TLDR
Unlike previous linear-programming-based methods, which cannot guarantee a positive semi-definite covariance matrix, this method adjusts the learned matrix to satisfy that condition, which in turn facilitates forecasting future values.
Abstract
This paper proposes a robust method to estimate the inverse covariance matrix under noisy measurements. The method estimates each column of the inverse covariance matrix independently via robust regression, which enables parallelization. Unlike previous linear-programming-based methods, which cannot guarantee a positive semi-definite covariance matrix, our method adjusts the learned matrix to satisfy this condition, which in turn facilitates forecasting future values. Experiments on time series prediction and classification under noisy conditions demonstrate the effectiveness of the approach.
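The column-wise estimation described in the abstract follows the standard regression characterization of the precision matrix: regressing variable j on the remaining variables yields the j-th column up to scaling by the residual variance, and the columns can be computed independently in parallel. The sketch below illustrates that idea only; it is not the authors' implementation. scikit-learn's HuberRegressor stands in for the robust regression step, and eigenvalue clipping is used as one possible positive semi-definite adjustment.

```python
# Minimal sketch of column-wise precision estimation with a PSD adjustment.
# HuberRegressor is a stand-in for the paper's robust regression step; the
# eigenvalue clipping is one of several possible PSD corrections.
import numpy as np
from joblib import Parallel, delayed
from sklearn.linear_model import HuberRegressor

def _precision_column(X, j):
    """Estimate column j of the precision matrix by regressing X_j on the rest."""
    p = X.shape[1]
    rest = np.delete(np.arange(p), j)
    reg = HuberRegressor().fit(X[:, rest], X[:, j])
    resid = X[:, j] - reg.predict(X[:, rest])
    sigma2 = max(np.mean(resid ** 2), 1e-8)      # residual variance of variable j
    col = np.empty(p)
    col[j] = 1.0 / sigma2                        # diagonal entry
    col[rest] = -reg.coef_ / sigma2              # off-diagonal entries
    return col

def robust_precision(X, eps=1e-6, n_jobs=-1):
    p = X.shape[1]
    # Each column is an independent regression problem, hence the parallel loop.
    cols = Parallel(n_jobs=n_jobs)(delayed(_precision_column)(X, j) for j in range(p))
    Theta = np.column_stack(cols)
    Theta = (Theta + Theta.T) / 2                # symmetrize
    w, V = np.linalg.eigh(Theta)
    return V @ np.diag(np.clip(w, eps, None)) @ V.T   # clip eigenvalues -> PSD
```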



Citations
Posted Content

Convex programming approach to robust estimation of a multivariate Gaussian model

TL;DR: This paper develops a nonasymptotic approach to estimating the parameters of a multivariate Gaussian distribution when the data are corrupted, proposing a convex-programming estimator that robustly recovers the population mean and covariance matrix even when the sample contains a significant proportion of outliers.
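To make the problem setting concrete, the short example below contaminates a Gaussian sample with outliers and fits scikit-learn's MinCovDet, a classical robust estimator used here purely as a stand-in; the cited paper develops a different, convex-programming estimator with nonasymptotic guarantees.

```python
# Problem-setting illustration only: robust estimation of a Gaussian's mean and
# covariance from a contaminated sample. MinCovDet (minimum covariance
# determinant) is a classical stand-in, not the cited convex-programming method.
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
clean = rng.multivariate_normal(np.zeros(3), np.eye(3), size=450)
outliers = rng.multivariate_normal(np.full(3, 8.0), np.eye(3), size=50)
X = np.vstack([clean, outliers])        # 10% of the sample is grossly corrupted

mcd = MinCovDet(random_state=0).fit(X)
print(mcd.location_)                    # robust mean estimate, close to zero
print(mcd.covariance_)                  # robust covariance estimate, close to identity
```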
Proceedings ArticleDOI

Conic Optimization for Robust Quadratic Regression: Deterministic Bounds and Statistical Analysis

TL;DR: It is proved that, when the number of measurements is sufficiently large, the iterative conic optimization method recovers the unknown state precisely in the Gaussian case, even when up to a constant fraction of the equations are arbitrarily wrong.
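A common way to make quadratic regression amenable to conic optimization is to lift the unknown state into a positive semi-definite matrix and penalize residuals with an L1 norm, which tolerates a fraction of arbitrarily wrong equations. The cvxpy sketch below shows that generic lifting on illustrative data; it is not necessarily the exact formulation or the iterative scheme of the cited paper.

```python
# Generic SDP-lifting sketch for robust quadratic regression (illustrative data;
# not necessarily the cited paper's exact formulation or iterative method).
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 60
x_true = rng.standard_normal(n)
Q = [(A + A.T) / 2 for A in (rng.standard_normal((n, n)) for _ in range(m))]
y = np.array([x_true @ A @ x_true for A in Q])
y[:6] += 10.0                                    # a tenth of the equations are grossly wrong

X = cp.Variable((n, n), PSD=True)                # lift X ~ x x^T
residuals = cp.hstack([y[i] - cp.trace(Q[i] @ X) for i in range(m)])
cp.Problem(cp.Minimize(cp.norm1(residuals))).solve()   # L1 loss absorbs sparse gross errors

w, V = np.linalg.eigh(X.value)
x_hat = np.sqrt(max(w[-1], 0.0)) * V[:, -1]      # recover x (up to sign) from the top eigenvector
print(min(np.linalg.norm(x_hat - x_true), np.linalg.norm(x_hat + x_true)))
```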
Posted Content

Sequential Inverse Approximation of a Regularized Sample Covariance Matrix

TL;DR: In this article, the authors derive sequential update rules that approximate the inverse of a shrinkage estimator of the covariance matrix, paving the way for improved large-scale machine learning methods that involve sequential updates.
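The basic device behind sequential inverse updates of this kind is the Sherman-Morrison identity, which refreshes an inverse after a rank-one change in O(p^2) time instead of re-inverting from scratch. The sketch below applies it to a ridge-regularized scatter matrix as a generic illustration; the cited paper's update rules for the shrinkage estimator differ in their details.

```python
# Generic illustration of sequential inverse updating via Sherman-Morrison,
# here for A_t = sum_{i<=t} x_i x_i^T + lam*I (not the cited paper's exact rules).
import numpy as np

def sequential_inverse(samples, lam=1.0):
    p = samples.shape[1]
    A_inv = np.eye(p) / lam                          # inverse of the initial lam*I
    for x in samples:
        Ax = A_inv @ x
        A_inv -= np.outer(Ax, Ax) / (1.0 + x @ Ax)   # rank-one update of the inverse, O(p^2)
        yield A_inv.copy()

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
*_, last = sequential_inverse(X)                     # inverse after all 200 samples
direct = np.linalg.inv(X.T @ X + np.eye(5))          # recompute from scratch for comparison
print(np.allclose(last, direct))                     # True up to numerical error
```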
Journal Article

Conic Optimization for Quadratic Regression Under Sparse Noise

TL;DR: This paper develops two methods for the quadratic regression problem, both based on conic optimization and able to incorporate any available prior knowledge about the solution, and derives sufficient conditions that guarantee correct recovery of the unknown state for each method.
References
Journal ArticleDOI

The Elements of Statistical Learning: Data Mining, Inference, and Prediction

TL;DR: The Elements of Statistical Learning is a widely used reference for data mining and machine learning, focusing on inference and prediction.
Book

Simulation Modeling and Analysis

TL;DR: The text is designed for a one-term or two-quarter course in simulation offered in departments of industrial engineering, business, computer science and operations research.
Book

Machine Learning : A Probabilistic Perspective

TL;DR: This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach, and is suitable for upper-level undergraduates with an introductory-level college math background and beginning graduate students.
Journal ArticleDOI

Model selection and estimation in regression with grouped variables

TL;DR: In this paper, instead of selecting factors by stepwise backward elimination, the authors focus on the accuracy of estimation and consider extensions of the lasso, the LARS algorithm and the non-negative garrotte for factor selection.
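As context for this reference, the group lasso penalty it studies can be minimized with a simple proximal-gradient loop whose proximal step is block soft-thresholding. The sketch below shows that generic solver on synthetic data; it is not the LARS-style algorithms developed in the paper, and the step size, group weights, and data are illustrative.

```python
# Minimal proximal-gradient (ISTA-style) sketch of the group lasso; the prox
# step is block soft-thresholding. Illustrative only, not the paper's algorithms.
import numpy as np

def group_lasso(X, y, groups, lam=0.5, n_iter=500):
    n, p = X.shape
    beta = np.zeros(p)
    step = n / np.linalg.norm(X, 2) ** 2             # 1/L for the (1/n)-scaled squared loss
    for _ in range(n_iter):
        z = beta - step * X.T @ (X @ beta - y) / n   # gradient step
        for g in np.unique(groups):
            idx = groups == g
            norm = np.linalg.norm(z[idx])
            thresh = step * lam * np.sqrt(idx.sum()) # group-size weighted threshold
            # Block soft-thresholding: shrink the whole group toward zero.
            beta[idx] = 0.0 if norm <= thresh else (1.0 - thresh / norm) * z[idx]
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 9))
groups = np.repeat([0, 1, 2], 3)                     # three groups of three coefficients
beta_true = np.concatenate([[1.0, -2.0, 1.5], np.zeros(6)])  # only group 0 is active
y = X @ beta_true + 0.1 * rng.standard_normal(100)
print(group_lasso(X, y, groups).round(2))            # groups 1 and 2 are set exactly to zero
```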