
Showing papers on "Sequential probability ratio test published in 2000"


Journal ArticleDOI
TL;DR: In this article, the authors present the explicit solution of the Bayesian problem of sequential testing of two simple hypotheses about the intensity of an observed Poisson process; the method reduces the initial problem to a free-boundary differential-difference problem and solves it by use of the principles of smooth and continuous fit.
Abstract: We present the explicit solution of the Bayesian problem of sequential testing of two simple hypotheses about the intensity of an observed Poisson process. The method of proof consists of reducing the initial problem to a free-boundary differential-difference Stefan problem, and solving the latter by use of the principles of smooth and continuous fit. A rigorous proof of the optimality of Wald's sequential probability ratio test in the variational formulation of the problem is obtained as a consequence of the solution of the Bayesian problem.
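
As a rough companion sketch (not the paper's derivation), Wald's SPRT for a Poisson intensity can be simulated as follows; the thresholds use Wald's approximations, the hypothesized intensities `lam0`/`lam1` are illustrative, and checking the boundaries only at event times is a simplification of the continuous-time test:

```python
import math
import random

def sprt_poisson(lam0, lam1, alpha, beta, true_lam, seed=0):
    """Wald SPRT for the intensity of a Poisson process (sketch).

    Interarrival times are simulated as exponentials under `true_lam`;
    the boundaries are Wald's approximate thresholds. For simplicity the
    log-likelihood ratio is checked only at event times.
    """
    rng = random.Random(seed)
    a = math.log(beta / (1 - alpha))      # lower boundary: accept H0 (lam0)
    b = math.log((1 - beta) / alpha)      # upper boundary: accept H1 (lam1)
    t, n, llr = 0.0, 0, 0.0
    while a < llr < b:
        t += rng.expovariate(true_lam)    # time of next event
        n += 1
        # LLR after n events observed over total time t
        llr = n * math.log(lam1 / lam0) - (lam1 - lam0) * t
    return ("H1" if llr >= b else "H0"), n, t
```

Under a true intensity equal to `lam1`, the test typically terminates after a handful of events, which is the appeal of the sequential formulation over a fixed observation window.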

106 citations


Journal ArticleDOI
TL;DR: This work devises a procedure analogous to Page's test for dependent observations, which is applied to the detection of a change in hidden Markov modeled observations, i.e., a switch from one HMM to another.
Abstract: Addressed here is the quickest detection of transient signals which can be represented as hidden Markov models (HMMs). Relying on the fact that Page's test is equivalent to a repeated sequential probability ratio test (SPRT), we are able to devise a procedure analogous to Page's test for dependent observations. By using the so-called forward variable of an HMM, such a procedure is applied to the detection of a change in hidden Markov modeled observations, i.e., a switch from one HMM to another. Performance indices of Page's test, the average run length (ARL) under both hypotheses, are approximated and confirmed via simulation. Several important examples are investigated in depth to illustrate the advantages of the proposed scheme.
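
The equivalence between Page's test and a repeated SPRT has a compact form: the cumulative log-likelihood ratio is reset to zero whenever it goes negative, and an alarm is raised when it exceeds a threshold. A minimal sketch (in the paper's setting the increments would come from the HMM forward variable; here they are generic numbers, and the threshold `h` is an assumed design parameter):

```python
def page_test(llr_increments, h):
    """Page's CUSUM test: restart the SPRT statistic at zero whenever it
    would go negative; declare a change when it reaches threshold h.

    Returns the 1-based index of the alarm sample, or None if no alarm.
    """
    s = 0.0
    for k, z in enumerate(llr_increments, 1):
        s = max(0.0, s + z)   # the reset is what makes this a repeated SPRT
        if s >= h:
            return k
    return None
```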

71 citations


Journal ArticleDOI
TL;DR: This article introduces a real-time fault detection method, applicable to sewer networks, for the follow-up of rain events; it provided encouraging results during the analysis of several rain events on the sewer network of Seine-Saint-Denis County, France.

25 citations


01 Jun 2000
TL;DR: In this paper, the authors compared three item-selection criteria for the sequential probability ratio test: the Fisher information function, the Kullback-Leibler information function, and a weighted log-odds ratio.
Abstract: This paper presents comparisons among three item-selection criteria for the sequential probability ratio test. The criteria were compared in terms of their efficiency in selecting items, as indicated by average test length (ATL) and the percentage of correct decisions (PCD). The item-selection criteria applied in this study were the Fisher information function, the Kullback-Leibler information function, and a weighted log-odds ratio. We also examined the effects of the cutoff scores, the width of the indifference region, the item pool size, and the item exposure rate under the different item-selection criteria. The results of the computer simulations showed that the three criteria yielded very small differences in the outcome measures, regardless of the conditions imposed.
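
To make the Fisher-information criterion concrete, here is a hedged sketch that scores items under a standard 2PL IRT model and selects the most informative item at the cutoff ability; the 2PL parameterization and the `(a, b)` pool format are illustrative assumptions, not details taken from the study:

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under the standard 2PL IRT model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Fisher information of a 2PL item at ability theta: a^2 * p * (1 - p)."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def pick_item(theta_cut, pool):
    """Select the item (a, b) with maximal Fisher information at the cutoff.

    This is the 'maximum information at the cutoff score' flavour of
    item selection for an SPRT-based classification test.
    """
    return max(pool, key=lambda ab: fisher_info(theta_cut, *ab))
```

An item whose difficulty `b` sits near the cutoff and whose discrimination `a` is high wins, which matches the intuition that the SPRT should probe right at the decision boundary.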

19 citations


Journal ArticleDOI
TL;DR: In this article, the double sequential probability ratio test and the double triangular test were evaluated with simulated data for odds ratios equal to 1.5, 2.0 and 2.5 and various type I and type II error probabilities both under H0 and under H1.
Abstract: Sequential analysis of randomized controlled clinical trials and epidemiological prospective (matched) case-control studies can have ethical or economical advantages over a fixed sample size approach. It offers the possibility to stop early when enough evidence for an apparent effect of the risk factor or lack of the expected effect is achieved. In clinical trials it is well accepted to stop the trial early in favour of the alternative hypothesis. In epidemiological studies, in general, the need is not felt to stop early in case of a clear exposure effect. Little attention has been paid, however, to early stopping and accepting the null hypothesis. In metabolic epidemiological studies, where analysis destroys the biological material, the question of efficient use of samples, for example, those stored in a biobank, becomes crucial. Also a slow accrual of cases or the costs of follow-up of a study nested within a cohort can make it desirable to stop a study early once it becomes clear that no relevant exposure effect will be found. Matching can further reduce the amount of information necessary to reach a conclusion. We derived test statistics Z (efficient score) and V (Fisher's information) for the sequential analysis of studies with dichotomous data where each case can be matched to one or more controls. A variable matching ratio is allowed. These test statistics can be entered into the software PEST to monitor the course of the study. The double sequential probability ratio test and the double triangular test were evaluated with simulated data for odds ratios equal to 1.5, 2.0 and 2.5 and various type I and type II error probabilities both under H0 and under H1. Our simulations resulted in average and median values for the amount of information (V) that are far less than those for a fixed sample size study. Efficiency gain ranges from 32 per cent to 60 per cent.
The proposed sequential analysis was applied in an investigation of the possible relationship between the polymorphism of the MTHFR-gene and rectal cancer in a cohort of women, with cases matched by age to one and to three controls. A sequential analysis of matched data can lead to early stopping in favour of H0 or H1, thus conserving valuable resources for future testing. A sequentially designed study can be more economical and less arbitrary than a study that makes use of conditional power or conditional coverage probability calculations to decide early stopping.
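
The abstract does not give the Z and V formulas; for the simplest special case of 1:1 matched pairs with a binary exposure, the efficient score and Fisher's information reduce to familiar McNemar-type quantities, sketched here (treat this as an illustrative special case, not the paper's general variable-matching-ratio result):

```python
def matched_pairs_zv(n10, n01):
    """Efficient score Z and Fisher's information V for 1:1 matched pairs
    testing odds ratio = 1 (McNemar special case; a hedged sketch).

    n10: discordant pairs with exposed case / unexposed control
    n01: discordant pairs with unexposed case / exposed control
    """
    z = (n10 - n01) / 2.0   # efficient score: half the discordant-pair difference
    v = (n10 + n01) / 4.0   # Fisher's information under the null
    return z, v
```

Note that Z**2 / V recovers McNemar's chi-squared statistic, and (Z, V) are exactly the kind of quantities one would feed to monitoring software such as PEST as the matched sets accumulate.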

17 citations


Proceedings ArticleDOI
03 Sep 2000
TL;DR: This paper implements speaker-dependent utterance-length control using the SPRT in speaker identification (SI) for the first time, by making the SPRT applicable to multiclass classification, and shows that the proposed SI method is superior in both computation time and error rate to a conventional method with fixed-length utterances.
Abstract: In speaker recognition there are usually a small number of speakers whose utterances are difficult to recognize correctly, and recognition errors mainly come from those speakers. Recognition performance for those speakers may be improved if they are urged to produce more utterances. Such speaker-dependent utterance-length control has recently been realized in speaker verification (SV) using the sequential probability ratio test (SPRT). The SPRT is in principle a two-class classification procedure and was therefore naturally applied to SV. This paper implements speaker-dependent utterance-length control using the SPRT in speaker identification (SI) for the first time, by making it applicable to multiclass classification. Experimental results show that the proposed SI method is superior in both computation time and error rate to a conventional method with fixed-length utterances.
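
One way to picture speaker-dependent utterance length in identification is a multiclass SPRT-style rule: keep requesting speech until the leading speaker's accumulated log-likelihood beats the runner-up's by a margin. The sketch below is a generic illustration of that idea, not the paper's algorithm; the margin value and the per-frame log-likelihood streams are assumptions:

```python
def sequential_identify(loglik_streams, log_margin):
    """Sketch of multiclass sequential identification.

    loglik_streams: one list of per-frame log-likelihoods per enrolled speaker.
    Stops early once the best speaker leads the runner-up by log_margin;
    otherwise decides on all available frames (forced decision).
    """
    scores = [0.0] * len(loglik_streams)
    n = 0
    for frame in zip(*loglik_streams):                 # one value per speaker
        n += 1
        scores = [s + ll for s, ll in zip(scores, frame)]
        order = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
        if scores[order[0]] - scores[order[1]] >= log_margin:
            return order[0], n                         # early decision
    best = max(range(len(scores)), key=scores.__getitem__)
    return best, n                                     # forced decision
```

Easy speakers are identified after a few frames while confusable ones are automatically asked for more data, which is exactly the utterance-length control the abstract describes.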

7 citations


Journal ArticleDOI
TL;DR: In this article, a monotonicity assumption is used for a sequential test of a dose-response effect in pre-clinical studies and the objective of the test procedure is to compare several dose groups with a zero-dose control.
Abstract: Methods for a sequential test of a dose-response effect in pre-clinical studies are investigated. The objective of the test procedure is to compare several dose groups with a zero-dose control. The sequential testing is conducted within a closed family of one-sided tests. The procedures investigated are based on a monotonicity assumption. These closed procedures strongly control the familywise error rate while providing information about the shape of the dose-response relationship. The performance of the sequential testing procedures is compared via a Monte Carlo simulation study. We illustrate the procedures by application to a real data set.

6 citations


Journal ArticleDOI
01 Jan 2000
TL;DR: In this article, a class of probability density functions is considered, which covers several life-testing models as specific cases, and sequential probability ratio tests are developed for testing simple and composite hypotheses regarding the parameters of the probabilistic model.
Abstract: A class of probability density functions is considered, which covers several life-testing models as specific cases. Sequential probability ratio tests are developed for testing simple and composite hypotheses regarding the parameters of the probabilistic model. Expressions for the operating characteristic and the average sample number functions are derived and their behaviour is studied by means of graph-plotting.

4 citations


01 Jan 2000
TL;DR: Methods of pulse detection and bandwidth estimation that can be implemented on an FPGA and are designed to function in electronic intelligence (ELINT) applications are addressed.
Abstract: The theory of optimum radar detection is well known and is generally implemented in expensive ASICs or supercomputers. However, today's state-of-the-art FPGAs are capable of performing relatively complex algorithms and provide the added benefit of being reconfigurable with new algorithms or methods on-site. Los Alamos National Laboratory has undertaken the goal of developing a receiver that is capable of performing detection and bandwidth estimation of pulsed radar systems. It is designed to function in electronic intelligence (ELINT) applications, where the goal is to determine the capabilities of threatening systems, such as radars which guide aircraft or missiles to targets. This thesis addresses methods of pulse detection and bandwidth estimation that can be implemented on an FPGA. The framework is that which is commonly used in this application: a polyphase filter bank subband frequency decomposition of the RF signal, followed by statistical detection methods. The optimal fixed-sample-size (FSS) estimator for this subband decomposition is shown to be the F-test, based on the output statistics of the filter bank, which are found to be chi-squared. An alternative to fixed-sample-size methods, the sequential probability ratio test (SPRT), is, however, more suited to ELINT due to its ability to adapt the test length to the received data. The SPRT is shown to achieve a higher probability of detection with approximately 1/5 the required sample size of the FSS method. The complexity of the SPRT is equivalent to that of the FSS method, and the statistic that results from the optimal SPRT implementation also lends itself to easy calculation of the bandwidth of the signal.
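
As a simplified illustration of the detection stage (not the thesis's exact statistic), an SPRT on subband energies can be written for exponential energy models, since a chi-squared variable with 2 degrees of freedom is an exponential up to scale; the mean energies `s0`, `s1` and the error rates are assumed design parameters:

```python
import math

def sprt_energy(samples, s0, s1, alpha, beta):
    """SPRT on subband energy samples (sketch).

    Under H0 (noise only) the energy is exponential with mean s0; under
    H1 (pulse present) exponential with mean s1 > s0. Boundaries are
    Wald's approximate thresholds.
    """
    a = math.log(beta / (1 - alpha))       # accept H0 below this
    b = math.log((1 - beta) / alpha)       # accept H1 above this
    llr = 0.0
    for n, x in enumerate(samples, 1):
        # per-sample LLR increment for exponential densities
        llr += math.log(s0 / s1) + x * (1.0 / s0 - 1.0 / s1)
        if llr >= b:
            return "H1", n
        if llr <= a:
            return "H0", n
    return "undecided", len(samples)
```

The adaptive test length is visible directly: strong pulses terminate in a few samples, which is the source of the roughly fivefold sample-size saving over the fixed-sample-size F-test reported above.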

4 citations


Proceedings ArticleDOI
22 Aug 2000
TL;DR: In this paper, the authors apply high-dimensional analysis of variance (HANOVA) and sequential probability ratio test (SPRT) to detect buried land mines from array ground penetrating radar (GPR) measurements.
Abstract: We apply high-dimensional analysis of variance (HANOVA) and the sequential probability ratio test (SPRT) to detect buried land mines from array ground penetrating radar (GPR) measurements. The GPR array surveys a region of interest in a progressive manner, starting at a known position and moving step by step in a fixed direction. Our detection method consists of two stages. Because at each stop of the array the path lengths from every transmitter/receiver pair to a mine target differ, there is a statistically significant difference among the received signals when a mine target is present. Thus, the first stage of our processing consists of a HANOVA test to detect this statistical difference at each stop. HANOVA does not incorporate new data as the GPR array moves down-track. So, secondly, we resort to a sequential probability ratio test to look for changes in the HANOVA statistics as the array proceeds down-track. The SPRT allows for real-time processing as new data are obtained by the GPR array. Finally, real sensor data are processed to verify the method.
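
As a stand-in for the HANOVA stage (a high-dimensional generalization not reproduced here), the per-stop test for a difference among the array's received signals can be illustrated with an ordinary one-way ANOVA F statistic, treating each transmitter/receiver pair's samples as a group:

```python
def f_statistic(groups):
    """One-way ANOVA F statistic (illustrative stand-in for HANOVA).

    groups: list of sample lists, one per channel. A large F indicates a
    statistically significant difference among channels, as expected
    when a mine target is present.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (n - k))
```

In the paper's scheme a sequence of such per-stop statistics would then be fed to an SPRT to detect a change as the array moves down-track.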

4 citations


Journal ArticleDOI
TL;DR: In this paper, a sequential probability ratio test is developed for the problem of selecting the best of k multinomial parameter estimation procedures when only one observation per estimation procedure is possible but the estimation procedures can be repeated many times.
Abstract: A sequential probability ratio test is developed for the problem of selecting the best of k multinomial parameter estimation procedures when only one observation per estimation procedure is possible but the k estimation procedures can be repeated many times.

DOI
01 Nov 2000
TL;DR: In this paper, several test procedures, such as the likelihood ratio test, uniformly most powerful unbiased test and the Wald test, are proposed for testing the response probability in a multiple logistic regression set up when the observations are independent binomial variables.
Abstract: In this paper we propose several test procedures, such as the likelihood ratio test, uniformly most powerful unbiased test and the Wald test, for testing the response probability in a multiple logistic regression set up when the observations are independent binomial variables. An application of the tests is provided.


Proceedings ArticleDOI
Ismail Jouny1
17 Aug 2000
TL;DR: The M-ary sequential probability ratio test (MSPRT) is used to recognize unknown non-cooperative radar targets, and scenarios representing various degrees of azimuth uncertainty are examined in this paper.
Abstract: The M-ary sequential probability ratio test (MSPRT) is used to recognize unknown non-cooperative radar targets. Radar returns representing the unknown target backscatter coefficients are tested sequentially using the MSPRT. At each stage of the recognition process all observations are used in the MSPRT; if no identification decision can be made, additional information is requested and the MSPRT is applied again. The goal is either to minimize the number of observations needed to identify an unknown target assuming a certain predetermined error probability, or to minimize the probability of error assuming a predetermined maximum number of observations. The experimental phase of this study involves radar cross-section signatures of four commercial aircraft models recorded in a compact range environment. Scenarios representing various degrees of azimuth uncertainty are examined. In all cases, it is assumed that the unknown target is corrupted with additive white Gaussian noise.
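
A minimal Bayesian-style MSPRT sketch: posteriors over the M candidate targets are updated with each new observation, and the test stops as soon as some posterior is sufficiently close to one. The stopping level `eps` and the uniform prior are illustrative assumptions, not parameters from the paper:

```python
import math

def msprt(prior, loglik_frames, eps):
    """M-ary SPRT sketch over M candidate classes.

    prior: initial class probabilities (length M).
    loglik_frames: per-observation log-likelihoods, one length-M list each.
    Stops when some posterior reaches 1 - eps; returns (class, n, posterior),
    or (None, n, posterior) if the data run out undecided.
    """
    post = list(prior)
    n = 0
    for frame in loglik_frames:
        n += 1
        w = [p * math.exp(ll) for p, ll in zip(post, frame)]
        z = sum(w)
        post = [x / z for x in w]           # Bayes update
        best = max(range(len(post)), key=post.__getitem__)
        if post[best] >= 1.0 - eps:
            return best, n, post            # confident identification
    return None, n, post                    # request more observations
```

The `None` branch corresponds to the abstract's "additional information is requested": the caller gathers more radar returns and continues the same posterior recursion.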

Journal ArticleDOI
TL;DR: In this article, expressions are derived for the distribution and moment generating functions, and the first and second moments, of the maximum likelihood estimate (MLE) of the drift θ of a Brownian motion X(t) observed under Wald's sequential probability ratio test.
Abstract: We consider Wald's (1947) sequential probability ratio tests for a Brownian motion X(t) with drift θ. Expressions are derived for the distribution and moment generating functions, and the first and second moments, of the maximum likelihood estimate (MLE). Asymptotic behavior of the MLE is also discussed.
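
For reference, the SPRT for the drift of a Brownian motion has an explicit form (a standard textbook presentation; here σ denotes the known diffusion coefficient and α, β the error probabilities, which are notational assumptions of this sketch rather than symbols from the paper):

```latex
% Testing H_0:\theta=\theta_0 against H_1:\theta=\theta_1
% from X_t = \theta t + \sigma W_t, the log-likelihood ratio process is
\lambda_t = \frac{\theta_1 - \theta_0}{\sigma^2}
            \left( X_t - \frac{(\theta_0 + \theta_1)\,t}{2} \right),
% and Wald's SPRT stops at
\tau = \inf\{ t \ge 0 : \lambda_t \notin (a, b) \},
\qquad a = \log\frac{\beta}{1-\alpha}, \quad b = \log\frac{1-\beta}{\alpha},
% deciding H_1 if \lambda_\tau \ge b and H_0 if \lambda_\tau \le a.
```

The MLE of the drift at the stopping time is X_τ / τ, and it is the sampling behavior of this stopped estimator that the paper characterizes.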