Open Access · Journal Article

Residual Ratio Thresholding for Linear Model Order Selection

TLDR
In this article, the authors propose to use residual ratio thresholding (RRT) for model order selection (MOS) in linear regression models and provide a rigorous mathematical analysis of RRT for MOS.
Abstract
Model order selection (MOS) in linear regression models is a widely studied problem in signal processing. Penalized log-likelihood techniques based on information theoretic criteria (ITC) are the algorithms of choice in MOS problems. Recently, a number of model selection problems have been successfully solved with explicit finite-sample guarantees using a concept called residual ratio thresholding (RRT). This paper proposes to use RRT for MOS in linear regression models and provides a rigorous mathematical analysis of RRT. RRT is numerically shown to deliver a highly competitive performance compared to popular MOS criteria such as the Akaike information criterion, the Bayesian information criterion, and penalized adaptive likelihood, especially when the sample size is small. We also analytically establish an interesting interpretation of RRT based on ITC, thereby linking these two model selection principles.
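The snippet below is a minimal sketch of the residual-ratio idea for model order selection, assuming nested least-squares fits over the first k columns of a design matrix and a fixed user-supplied threshold; the paper's actual threshold is order-dependent and derived from finite-sample (Beta-distribution) arguments that are not reproduced here. The function name residual_ratio_mos and the threshold parameter are illustrative, not from the paper.

```python
import numpy as np

def residual_ratio_mos(y, X, k_max, threshold=0.9):
    """Sketch of residual-ratio based model order selection (not the paper's exact rule)."""
    # Residual norm of the empty model (no regressors).
    res_norms = [np.linalg.norm(y)]
    for k in range(1, k_max + 1):
        # Least-squares fit of the nested model using the first k columns of X.
        beta, *_ = np.linalg.lstsq(X[:, :k], y, rcond=None)
        res_norms.append(np.linalg.norm(y - X[:, :k] @ beta))

    # Residual ratios RR(k) = ||r_k|| / ||r_{k-1}||.
    ratios = [res_norms[k] / res_norms[k - 1] for k in range(1, k_max + 1)]

    # A sharp drop in the residual norm (small ratio) suggests the k-th
    # regressor carries real signal energy; select the largest such order.
    selected = [k for k in range(1, k_max + 1) if ratios[k - 1] < threshold]
    return max(selected) if selected else 0
```

With y of length n and X of shape (n, k_max), residual_ratio_mos(y, X, k_max) returns an estimated model order between 0 and k_max; in the paper the fixed threshold is replaced by a data- and order-dependent quantity with finite-sample guarantees.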


Citations
Journal Article

High SNR consistent compressive sensing without signal and noise statistics

TL;DR: In this paper, the authors proposed two techniques, namely residual ratio minimization (RRM) and residual ratio thresholding with adaptation (RRTA), to operate the OMP algorithm without a priori knowledge of the noise variance and signal sparsity.
Posted Content

Generalized Residual Ratio Thresholding

TL;DR: A novel technique called generalized residual ratio thresholding (GRRT) is presented for operating SOMP and BOMP without a priori knowledge of signal sparsity and noise variance, and finite-sample and finite signal-to-noise-ratio (SNR) guarantees for exact support recovery are derived.
Journal Article

New Efficient Approach to Solve Big Data Systems Using Parallel Gauss-Seidel Algorithms

TL;DR: This work proposes two new parallel iterative algorithms, as extensions of the Gauss–Seidel algorithm (GSA), for solving big-data regression problems involving many variables and large matrices; a serial baseline sketch is given below.
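As context for this entry, the snippet below shows the classical serial Gauss–Seidel sweep applied to the least-squares normal equations; the parallel variants proposed in the cited paper are not reproduced here, and the function name is illustrative.

```python
import numpy as np

def gauss_seidel_regression(X, y, n_iter=100):
    """Serial Gauss-Seidel iteration on the normal equations (X^T X) b = X^T y.

    Assumes X has full column rank, so that A = X^T X is symmetric positive
    definite and the iteration converges.
    """
    A = X.T @ X
    b = X.T @ y
    p = A.shape[0]
    beta = np.zeros(p)
    for _ in range(n_iter):
        for i in range(p):
            # Coordinate update using the most recent values of the other coefficients.
            sigma = A[i, :] @ beta - A[i, i] * beta[i]
            beta[i] = (b[i] - sigma) / A[i, i]
    return beta
```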
Proceedings Article

Adaptive Algorithms for Model Order Selection problems

TL;DR: Three model order selection algorithms whose penalty terms depend on the observed data are described, and these algorithms are applied to the problem of estimating the number of signals with unknown amplitudes.
Proceedings Article

Estimating the number of signals with unknown parameters under Gaussian noises

TL;DR: A class of signal parameters for which the widely used maximum likelihood method is useless for estimating the number of signals is described, and it is established that the amplitude parameters belong to this class.
References
Journal Article

MDL denoising

TL;DR: The so-called denoising problem, relative to normal models for noise, is formalized such that "noise" is defined as the incompressible part in the data while the compressible part defines the meaningful information-bearing signal.
Book

A first course in linear model theory

TL;DR: A First Course in Linear Model Theory, Second Edition is an intermediate-level statistics text that fills an important gap by presenting the theory of linear statistical models at a level appropriate for senior undergraduate or first-year graduate students.
Journal Article

A strongly consistent procedure for model selection in a regression problem

C. Radhakrishna Rao et al. · 01 Jun 1989
TL;DR: In this paper, a decision rule for the choice of a model which is strongly consistent for the true model as n → ∞ is presented. However, the decision rule is not applicable to the case where the distribution of the components of E_n is unknown.
Journal Article

Maximum likelihood principle and model selection when the true model is unspecified

TL;DR: In this paper, the authors examined the asymptotic properties of the maximum likelihood estimate for independent observations coming from an unknown distribution and applied these results to model selection problems when the true model is unspecified.
Journal Article

Asymptotic MAP criteria for model selection

TL;DR: This paper derives maximum a posteriori (MAP) rules for several different families of competing models and obtains forms that are similar to AIC and naive MDL; for some families, however, the derived penalties are found to be different.
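For reference, the standard penalized log-likelihood forms that such MAP rules are typically compared against are the textbook definitions below (not taken from the cited paper):

```latex
\mathrm{AIC}(k) = -2\log L(\hat{\theta}_k) + 2k, \qquad
\mathrm{BIC}(k) = -2\log L(\hat{\theta}_k) + k\log n,
```

where L(\hat{\theta}_k) is the maximized likelihood of the order-k model and n is the sample size; the two-part ("naive") MDL criterion coincides with BIC up to a factor of one half.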