Proceedings ArticleDOI

Bayesian Test with Quadratic Criterion for Multiple Hypothesis Testing Problem

TL;DR: A Bayesian test with a modified quadratic loss function is proposed to solve a multiple hypothesis testing (MHT) problem and the conditional asymptotic equivalence between these two tests is theoretically established.
Abstract: A Bayesian test has previously been proposed for a multiple hypothesis testing (MHT) problem with a quadratic loss function, so that the problem fits real applications in which concurrent hypotheses must be distinguished. However, this MHT problem and its quadratic loss function are insufficient for other applications, such as simultaneous intrusion detection and localization in a wireless sensor network (WSN). This kind of application can be formulated as an MHT problem with a null hypothesis. Therefore, a Bayesian test with a modified quadratic loss function is proposed to solve this MHT problem. Non-asymptotic bounds for analyzing the performance of the proposed test and of the Bayesian test with the 0-1 loss function are obtained, from which the conditional asymptotic equivalence between these two tests is theoretically established. The effectiveness of these bounds and the analysis of the conditional asymptotic equivalence are verified by simulation results.
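A Bayesian test of the kind described in the abstract picks the hypothesis minimizing the posterior expected loss. A minimal sketch with a generic loss matrix (not the paper's modified quadratic loss, which is not given here); under the 0-1 loss it reduces to the MAP rule:

```python
import numpy as np

def bayes_test(posterior, loss):
    """Pick the hypothesis index minimizing the posterior expected loss.

    posterior : (K,) array, posterior probabilities P(H_k | X)
    loss      : (K, K) array, loss[j, k] = cost of deciding H_j when H_k is true
    """
    expected_loss = loss @ posterior        # posterior risk of each decision
    return int(np.argmin(expected_loss))

posterior = np.array([0.1, 0.6, 0.3])
zero_one = 1.0 - np.eye(3)                  # 0-1 loss: cost 1 unless decision = truth
# Under the 0-1 loss the Bayes test coincides with the MAP decision.
assert bayes_test(posterior, zero_one) == int(np.argmax(posterior))
```

Choosing a different loss matrix (e.g. one penalizing a missed null hypothesis more heavily) changes the decision without changing the posterior, which is the point of modifying the loss function.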
Citations
Journal ArticleDOI
TL;DR: In the proposed strategy, the decision results of the spectrum prediction and monitoring techniques are fused using AND and OR fusion rules to detect the emergence of the primary user (PU) during data transmission.

37 citations

References
Book
01 Jan 1959
TL;DR: A comprehensive treatment of hypothesis testing covering the general decision problem, the probability background, uniformly most powerful tests, unbiasedness, invariance, linear hypotheses, the minimax principle, and multiple testing, as discussed by the authors.
Abstract: The General Decision Problem.- The Probability Background.- Uniformly Most Powerful Tests.- Unbiasedness: Theory and First Applications.- Unbiasedness: Applications to Normal Distributions.- Invariance.- Linear Hypotheses.- The Minimax Principle.- Multiple Testing and Simultaneous Inference.- Conditional Inference.- Basic Large Sample Theory.- Quadratic Mean Differentiable Families.- Large Sample Optimality.- Testing Goodness of Fit.- General Large Sample Methods.

6,480 citations

Book
22 Dec 2012
TL;DR: An overview of statistical decision theory, emphasizing the use and application of its philosophical ideas and mathematical structure.
Abstract: 1. Basic concepts 2. Utility and loss 3. Prior information and subjective probability 4. Bayesian analysis 5. Minimax analysis 6. Invariance 7. Preposterior and sequential analysis 8. Complete and essentially complete classes Appendices.

5,573 citations


"Bayesian Test with Quadratic Criter..." refers methods in this paper

  • ...As is introduced in [1][7][9], the quality of a test δ(X) is evaluated with the Bayes risk....


Book
12 Jul 1967

1,454 citations


"Bayesian Test with Quadratic Criter..." refers background or methods in this paper

  • ...However, in the case of a general prior distribution, the following theorem, derived from that established by [7], gives the Bayesian test based on a Gaussian distribution, i....


  • ...Historically, for the aforementioned MHT problem with null hypothesis, [7] has firstly proposed a Bayesian test with a 0−1 loss function which is given by L0−1[θ, θδ(X)] = { 1 if θ ≠ θδ(X), 0 if θ = θδ(X) }, for all θ ∈ Θ....


  • ...As is introduced in [1][7][9], the quality of a test δ(X) is evaluated with the Bayes risk....


  • ...Specifically, [7] has proposed a Bayesian test with respect to a prior distribution invariant under G giving equal weight to θ1, ....

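The 0-1 loss Bayes test quoted above is the MAP rule, and its Bayes risk is the probability of a wrong decision. A minimal Monte Carlo sketch, assuming a hypothetical Gaussian model X ~ N(θ_k, 1) under H_k with illustrative means and a uniform prior:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: four hypotheses (index 0 playing the role of the null),
# observation X ~ N(theta_k, 1) under H_k, uniform prior over hypotheses.
thetas = np.array([0.0, 2.0, 4.0, 6.0])
prior = np.full(len(thetas), 1.0 / len(thetas))

def map_test(x):
    # Bayes test under the 0-1 loss = maximum a posteriori decision.
    log_post = np.log(prior) - 0.5 * (x - thetas) ** 2
    return int(np.argmax(log_post))

# Monte Carlo estimate of the Bayes risk r = P(theta_delta(X) != theta).
n = 20000
truth = rng.integers(len(thetas), size=n)
x = thetas[truth] + rng.standard_normal(n)
risk = np.mean([map_test(xi) != k for xi, k in zip(x, truth)])
```

With the means and prior assumed here, the estimated risk settles around 0.24; widening the gaps between the θ_k drives it toward zero, which is the regime where the asymptotic equivalence results of the paper apply.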

Journal ArticleDOI
TL;DR: A multihypothesis testing framework for studying the tradeoffs between detection and parameter estimation (classification) for a finite discrete parameter set is developed and it is observed that Rissanen's order selection penalty method is nearly min-max optimal in some nonasymptotic regimes.
Abstract: This paper addresses the problem of finite sample simultaneous detection and estimation which arises when estimation of signal parameters is desired but signal presence is uncertain. In general, a joint detection and estimation algorithm cannot simultaneously achieve optimal detection and optimal estimation performance. We develop a multihypothesis testing framework for studying the tradeoffs between detection and parameter estimation (classification) for a finite discrete parameter set. Our multihypothesis testing problem is based on the worst case detection and worst case classification error probabilities of the class of joint detection and classification algorithms which are subject to a false alarm constraint. This framework leads to the evaluation of greatest lower bounds on the worst case decision error probabilities and a construction of decision rules which achieve these lower bounds. For illustration, we apply these methods to signal detection, order selection, and signal classification for a multicomponent signal in noise model. For two or fewer signals, an SNR of 3 dB, and a signal space dimension of N=10, numerical results are obtained which establish the existence of fundamental tradeoffs between three performance criteria: probability of signal detection, probability of correct order selection, and probability of correct classification. Furthermore, based on numerical performance comparisons between our optimal decision rule and other suboptimal penalty function methods, we observe that Rissanen's (1978) order selection penalty method is nearly min-max optimal in some nonasymptotic regimes.

97 citations


Additional excerpts

  • ...Remark 1: The MHT problem with an unknown prior distribution has been tackled in a minimax framework [2][5][6]....
