Journal ArticleDOI

An Iterative Technique for Absolute Deviations Curve Fitting

TLDR
In this paper, an iterative technique is proposed for the absolute deviations regression of data that uses any standard least squares curve fitting algorithm as its core step; the resulting regression procedure is computationally simple, requires less storage, and is faster than the linear programming algorithm.
Abstract
An iterative technique is proposed for the absolute deviations regression of data. At the heart of the technique is any standard least squares curve fitting algorithm. Hence, the resulting regression procedure is computationally simple, requires less storage and is faster than the linear programming algorithm recently proposed as a solution to this problem. Problems associated with non-unique solutions in least absolute deviations regression are also discussed.
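The abstract describes the method only at a high level, but reaching a least absolute deviations fit through repeated least squares solves is, in spirit, iteratively reweighted least squares. The sketch below is a minimal illustration of that idea under this reading, not the paper's actual algorithm; the function name, iteration count, and the damping constant eps are choices made here for the example.

```python
import numpy as np

def lad_fit_irls(X, y, n_iter=50, eps=1e-8):
    """Least absolute deviations fit via repeated weighted least squares.

    Each pass solves a weighted least-squares problem whose weights are
    roughly 1/|residual|, so the weighted sum of squares approximates the
    sum of absolute deviations at the current iterate.
    """
    # Start from the ordinary least-squares solution.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    for _ in range(n_iter):
        r = y - X @ beta                      # current residuals
        w = 1.0 / np.maximum(np.abs(r), eps)  # weights ~ 1/|r_i|, floored by eps
        Xw = X * w[:, None]
        # Weighted least-squares normal equations: (X' W X) beta = X' W y
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)
    return beta

# Example: fit a straight line with one gross outlier.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = 2.0 + 3.0 * x + 0.05 * rng.standard_normal(20)
y[5] += 5.0                                   # outlier
X = np.column_stack([np.ones_like(x), x])
print(lad_fit_irls(X, y))                     # roughly [2, 3]
```

The eps floor keeps the weights finite when a residual approaches zero; without a safeguard of this kind the weighted solves become ill conditioned.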


Citations
Journal ArticleDOI

Enhancing Sparsity by Reweighted ℓ1 Minimization

TL;DR: A novel method for sparse signal recovery is proposed that in many situations outperforms ℓ1 minimization, in the sense that substantially fewer measurements are needed for exact recovery.
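Reweighted ℓ1 minimization is commonly implemented by alternating a weighted ℓ1 solve with a weight update of the form w_i = 1/(|x_i| + ε). The sketch below follows that recipe; the LP reformulation via x = u − v, the use of scipy.optimize.linprog, and the problem sizes in the example are illustrative choices made here, not anything prescribed by the cited paper.

```python
import numpy as np
from scipy.optimize import linprog

def weighted_l1_min(A, y, w):
    """min sum_i w_i |x_i|  s.t.  A x = y, as an LP with x = u - v, u, v >= 0."""
    m, n = A.shape
    c = np.concatenate([w, w])                # objective on [u; v]
    A_eq = np.hstack([A, -A])                 # A u - A v = y
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * n), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

def reweighted_l1(A, y, n_iter=5, eps=1e-3):
    """Alternate weighted ell-1 solves with weight update w_i = 1/(|x_i| + eps)."""
    w = np.ones(A.shape[1])
    for _ in range(n_iter):
        x = weighted_l1_min(A, y, w)
        w = 1.0 / (np.abs(x) + eps)
    return x

# Example: recover a sparse vector from a modest number of random measurements.
rng = np.random.default_rng(1)
n, m, k = 50, 25, 3
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))
x_hat = reweighted_l1(A, A @ x_true)
print(np.allclose(x_hat, x_true, atol=1e-6))  # usually True at this sparsity level
```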
Journal ArticleDOI

A Tutorial on MM Algorithms

TL;DR: The principle behind MM algorithms is explained, some methods for constructing them are suggested, and some of their attractive features are discussed.
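An MM algorithm replaces the objective at each step with a surrogate that majorizes it and is easier to minimize. As a concrete, self-contained instance chosen here for illustration (not taken from the tutorial), the sum of absolute deviations Σ|y_i − θ| can be majorized term by term with quadratics, turning the computation of a median into a sequence of weighted averages:

```python
import numpy as np

def median_by_mm(y, n_iter=100, eps=1e-9):
    """Minimize sum_i |y_i - theta| with an MM algorithm.

    At iterate theta_k, each term |y_i - theta| is majorized by the quadratic
    (y_i - theta)^2 / (2 |y_i - theta_k|) + |y_i - theta_k| / 2, which touches
    it at theta_k.  The surrogate's minimizer is a weighted average, so each
    MM update is available in closed form.
    """
    theta = y.mean()                          # any starting point works
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(y - theta), eps)
        theta = np.sum(w * y) / np.sum(w)     # minimizer of the surrogate
    return theta

y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
print(median_by_mm(y), np.median(y))          # both close to 3.0
```

The same term-by-term majorization of the absolute value is what links MM algorithms to least-squares-based schemes for absolute deviations fitting.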
Journal ArticleDOI

Optimization Transfer Using Surrogate Objective Functions

TL;DR: Because optimization transfer algorithms often exhibit the slow convergence of EM algorithms, two methods of accelerating optimization transfer are discussed and evaluated in the context of specific problems.
Journal ArticleDOI

Coordinate descent algorithms for lasso penalized regression

TL;DR: In this article, the authors proposed two algorithms for estimating regression coefficients with a lasso penalty, one based on greedy coordinate descent and another based on Edgeworth's algorithm for ordinary ℓ1 regression.
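The sketch below uses plain cyclic coordinate descent with soft thresholding for the lasso objective (1/2)‖y − Xβ‖² + λ‖β‖₁, the standard textbook update rather than either of the two specific algorithms proposed in the article; the data sizes and λ are arbitrary choices for the demonstration.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Cyclic coordinate descent for (1/2)||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y - X @ beta                           # running residual
    col_sq = (X ** 2).sum(axis=0)              # ||X_j||^2 for each column
    for _ in range(n_sweeps):
        for j in range(p):
            r_partial = r + X[:, j] * beta[j]  # residual with feature j removed
            z = X[:, j] @ r_partial
            beta_new = soft_threshold(z, lam) / col_sq[j]
            r = r_partial - X[:, j] * beta_new
            beta[j] = beta_new
    return beta

# Example: recover a sparse coefficient vector.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 10))
beta_true = np.array([3.0, 0, 0, -2.0, 0, 0, 0, 0, 0, 0])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
print(np.round(lasso_cd(X, y, lam=5.0), 2))    # mostly zeros except features 0 and 3
```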
Journal ArticleDOI

Normal/Independent Distributions and Their Applications in Robust Regression

TL;DR: In this article, the properties of normal/independent distributions are reviewed and several new results are presented for adaptive, robust regression with non-normal error distributions, such as the t, slash, and contaminated normal families.
References
Book

Applied Regression Analysis

TL;DR: In this book, fitting a straight line by least squares is presented as the basic case, and the Durbin-Watson test is used for checking the straight-line fit.
Book

Linear programming: methods and applications

Saul I. Gass
TL;DR: A straightforward introduction to the concepts of linear programming and their applications, including several applications to real-life problems in management and numerous exercises.
Journal ArticleDOI

Linear Programming Techniques for Regression Analysis

TL;DR: In this article, a linear programming approach is used to solve least absolute deviations and least maximum deviations regression problems, and fitting by the Chebyshev criterion is shown to lead to a standard-form linear programming model with p+1 equations.
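The reduction of least absolute deviations regression to a linear program is standard: split each residual into nonnegative parts u_i, v_i and minimize their sum. A small sketch of that formulation, using scipy.optimize.linprog as a generic LP solver (the solver choice and example data are assumptions made here, not part of the article), is:

```python
import numpy as np
from scipy.optimize import linprog

def lad_fit_lp(X, y):
    """Least absolute deviations regression posed as a linear program.

    Write each residual as y_i - x_i' b = u_i - v_i with u_i, v_i >= 0 and
    minimize sum_i (u_i + v_i) over (b, u, v).
    """
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(2 * n)])    # cost on u, v only
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])          # X b + u - v = y
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)   # b free; u, v >= 0
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

x = np.linspace(0, 1, 10)
y = 1.0 + 2.0 * x
y[3] += 4.0                                               # gross outlier
X = np.column_stack([np.ones_like(x), x])
print(lad_fit_lp(X, y))                                    # near [1, 2]
```

This is the kind of LP formulation the paper's iterative least-squares technique is compared against: it is exact but carries p + 2n variables, which is where the storage and speed differences come from.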
Journal ArticleDOI

Linear Curve Fitting Using Least Deviations

TL;DR: In this paper, a method is developed for finding a straight line of best fit to a set of two dimensional points such that the sum of the absolute values of the vertical deviations of the points from the line is a minimum.
Journal ArticleDOI

A Note on Curve Fitting with Minimum Deviations by Linear Programming

TL;DR: For some time it has been well known among specialists in mathematical programming that the statistical problem of fitting a linear multiple regression under the criterion of minimizing the sum of absolute deviations from the regression function (rather than squared deviations) may be reduced to a linear programming problem, as discussed by the authors.