
Showing papers on "Sequential probability ratio test published in 2007"


Book ChapterDOI
29 Mar 2007

50 citations


Journal ArticleDOI
TL;DR: Results from simulation studies using three item selection methods, Fisher information (FI), posterior-weighted FI (FIP), and MI, are provided for an adaptive four-category classification test and it is shown that in general, MI item selection classifies the highest proportion of examinees correctly and yields the shortest test lengths.
Abstract: A general approach for item selection in adaptive multiple-category classification tests is provided. The approach uses mutual information (MI), a special case of the Kullback-Leibler distance, or relative entropy. MI works efficiently with the sequential probability ratio test and alleviates the difficulties encountered with using other local- and global-information measures in the multiple-category classification setting. Results from simulation studies using three item selection methods, Fisher information (FI), posterior-weighted FI (FIP), and MI, are provided for an adaptive four-category classification test. Both across and within the four classification categories, it is shown that in general, MI item selection classifies the highest proportion of examinees correctly and yields the shortest test lengths. The next best performance is observed for FIP item selection, followed by FI.
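
As a rough illustration of the MI criterion described above, the following sketch computes the mutual information between a candidate item's binary response and the examinee's category membership; the posterior and the per-category response probabilities are hypothetical inputs, not the paper's data or notation:

```python
import math

def item_mutual_information(posterior, p_correct):
    """Mutual information between an item response and category membership.

    posterior: P(category k | responses so far), length K, sums to 1.
    p_correct: P(correct | category k) for the candidate item, length K.
    Returns MI in nats; item selection would pick the item maximizing this.
    """
    mi = 0.0
    for u in (0, 1):  # possible response outcomes (incorrect, correct)
        # Marginal probability of this response under the current posterior.
        p_u = sum(pk * (p if u else 1 - p) for pk, p in zip(posterior, p_correct))
        for pk, p in zip(posterior, p_correct):
            p_u_given_k = p if u else 1 - p
            joint = pk * p_u_given_k
            if joint > 0 and p_u > 0:
                mi += joint * math.log(p_u_given_k / p_u)
    return mi
```

An item whose response distribution is identical across categories carries no information, while one that discriminates between categories yields positive MI.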

43 citations


Journal ArticleDOI
TL;DR: It is shown that a class of SPRT boundaries is minimax with respect to resampling risk and a truncated version of boundaries in that class is recommended by comparing their resampling risk (RR) to the RR of fixed boundaries with the same maximum resampling size.
Abstract: When designing programs or software for the implementation of Monte Carlo (MC) hypothesis tests, we can save computation time by using sequential stopping boundaries. Such boundaries imply stopping resampling after relatively few replications if the early replications indicate a very large or a very small p value. We study a truncated sequential probability ratio test (SPRT) boundary and provide a tractable algorithm to implement it. We review two properties desired of any MC p value, the validity of the p value and a small resampling risk, where resampling risk is the probability that the accept/reject decision will be different than the decision from complete enumeration. We show how the algorithm can be used to calculate a valid p value and confidence intervals for any truncated SPRT boundary. We show that a class of SPRT boundaries is minimax with respect to resampling risk and recommend a truncated version of boundaries in that class by comparing their resampling risk (RR) to the RR of fixed boundaries with the same maximum resampling size.
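
A minimal sketch of the sequential-stopping idea, using a plain Wald SPRT on the replication indicators (the paper's truncated boundary and p-value construction are not reproduced here; the error levels and the indifference half-width `delta` are illustrative):

```python
import math

def mc_sprt(indicators, alpha=0.05, delta=0.02, a=0.01, b=0.01, max_n=10000):
    """Sequential stopping for a Monte Carlo p-value.

    Each indicator is 1 if a resampled statistic was at least as extreme as
    the observed one. We test H0: p >= alpha + delta against
    H1: p <= alpha - delta, where p is the true resampling p-value.
    """
    p0, p1 = alpha + delta, alpha - delta
    upper = math.log((1 - b) / a)    # decide in favor of H1 (small p)
    lower = math.log(b / (1 - a))    # decide in favor of H0 (large p)
    llr = 0.0
    for n, x in enumerate(indicators, 1):
        # Bernoulli log-likelihood-ratio increment of H1 against H0.
        llr += x * math.log(p1 / p0) + (1 - x) * math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "p-small", n      # reject: p clearly below alpha
        if llr <= lower:
            return "p-large", n      # do not reject: p clearly above alpha
        if n >= max_n:
            return "truncated", n
    return "undecided", len(indicators)
```

If the early replications rarely exceed the observed statistic, the test stops quickly with a "p-small" decision instead of running the full resampling budget.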

33 citations


Proceedings ArticleDOI
01 May 2007
TL;DR: This paper revisits the problem of detecting greedy behavior in the IEEE 802.11 MAC protocol by evaluating the performance of two previously proposed schemes: DOMINO and the sequential probability ratio test (SPRT), and derives a new analytical formulation of the SPRT that takes into account the discrete nature of the problem.
Abstract: This paper revisits the problem of detecting greedy behavior in the IEEE 802.11 MAC protocol by evaluating the performance of two previously proposed schemes: DOMINO and the sequential probability ratio test (SPRT). The evaluation is carried out in four steps. We first derive a new analytical formulation of the SPRT that takes into account the discrete nature of the problem. Then we develop a new tractable analytical model for DOMINO. As a third step, we evaluate the theoretical performance of SPRT and DOMINO with newly introduced metrics that take into account the repeated nature of the tests. This theoretical comparison provides two major insights into the problem: it confirms the optimality of SPRT and motivates us to define yet another test, a nonparametric CUSUM statistic that shares the same intuition as DOMINO but gives better performance. We finalize the paper with experimental results, confirming our theoretical analysis and validating the introduction of the new nonparametric CUSUM statistic.
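
The nonparametric CUSUM idea mentioned above can be sketched as follows; the observable (a station's chosen backoff values), the `slack` term, and the `threshold` are illustrative assumptions, not the paper's formulation:

```python
def cusum_detector(samples, nominal_mean, slack, threshold):
    """Nonparametric CUSUM on observed backoff values.

    A greedy node systematically picks small backoffs, so we accumulate
    how far each observation falls below the nominal mean, minus a slack
    term that absorbs normal fluctuation. Returns the alarm time, or None.
    """
    s = 0.0
    for t, x in enumerate(samples, 1):
        # Drift is negative for honest behavior, positive for greedy behavior.
        s = max(0.0, s + (nominal_mean - slack - x))
        if s > threshold:
            return t  # alarm: greedy behavior suspected
    return None
```

Like DOMINO, this needs no parametric model of the backoff distribution, but the cumulative statistic retains evidence across observations instead of resetting each test window.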

33 citations


Journal ArticleDOI
TL;DR: This study deals with simultaneous testing of two systems, one "basic" (subscript b) and the other "new" (n), both with an exponential distribution describing the times between failures, and tests whether the MTBFn/MTBFb ratio equals a given value, versus whether it is smaller than the given value.
Abstract: This study deals with simultaneous testing of two systems, one "basic" (subscript b) and the other "new" (n), both with an exponential distribution describing the times between failures. We test whether the MTBFn/MTBFb ratio equals a given value, versus whether it is smaller than that value. These tests yield a binomial pattern. A recursive algorithm calculates the probability of a given combination of failure numbers in the systems, permitting rapid, accurate determination of the test characteristics. The influence of truncation of Wald's Sequential Probability Ratio Test (SPRT) on its characteristics is analysed, and relationships are derived for calculating the coordinates of the truncation apex (TA). A test planning methodology is presented for the most common cases.
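
The binomial pattern arises because, with both systems on test simultaneously, each successive failure comes from the new system with a fixed probability determined by the MTBF ratio. A hedged sketch of the resulting Bernoulli SPRT (not the paper's truncated test or its recursive algorithm):

```python
import math

def ratio_sprt(failure_sources, r0, r1, alpha=0.1, beta=0.1):
    """Bernoulli SPRT for the MTBF ratio of two simultaneously tested systems.

    For exponential times between failures, each failure belongs to the new
    system with probability p = 1 / (1 + MTBFn/MTBFb), so testing the ratio
    r reduces to a binomial SPRT. failure_sources: 1 if the failure occurred
    on the new system, 0 if on the basic one. Requires r1 < r0.
    """
    p0, p1 = 1.0 / (1.0 + r0), 1.0 / (1.0 + r1)  # r1 < r0 implies p1 > p0
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for k, x in enumerate(failure_sources, 1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "reject", k   # ratio smaller than r0: new system worse
        if llr <= lower:
            return "accept", k   # ratio consistent with r0
    return "undecided", len(failure_sources)
```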

21 citations


Journal ArticleDOI
TL;DR: In this paper, a partial sequential sampling scheme is introduced to develop a sequential rank-based nonparametric test for the identity of two unknown univariate continuous distribution functions against a one-sided shift in location occurring at an unknown time point.
Abstract: In the present work, we introduce a partial sequential sampling scheme to develop a sequential rank-based nonparametric test for the identity of two unknown univariate continuous distribution functions against a one-sided shift in location occurring at an unknown time point. Our work is motivated by Wolfe (1977) as well as Orban and Wolfe (1980). We provide a detailed discussion of the asymptotic properties of the proposed test. We compare the proposed test with a usual rank-based test. Some simulation studies are also presented.

20 citations


Journal Article
TL;DR: The SLAM problem is tackled here for cases when only bearing measurements are available, and the computational complexity of the GSF is reduced by applying the sequential probability ratio test (SPRT) to remove under-performing EKFs.
Abstract: A Gaussian sum filter (GSF) is proposed in this paper on simultaneous localization and mapping (SLAM) for mobile robot navigation. In particular, the SLAM problem is tackled here for cases when only bearing measurements are available. Within the stochastic mapping framework using an extended Kalman filter (EKF), a Gaussian probability density function (pdf) is assumed to describe the range-and-bearing sensor noise. In the case of a bearing-only sensor, a sum of weighted Gaussians is used to represent the non-Gaussian robot-landmark range uncertainty, resulting in a bank of EKFs for estimation of the robot and landmark locations. In our approach, the Gaussian parameters are designed on the basis of minimizing the representation error. The computational complexity of the GSF is reduced by applying the sequential probability ratio test (SPRT) to remove under-performing EKFs. Extensive experimental results are included to demonstrate the effectiveness and efficiency of the proposed techniques.

16 citations


Proceedings ArticleDOI
27 Apr 2007
TL;DR: The sequential probability ratio test (SPRT) developed by Wald is implemented within the previously developed sensor management framework to allow cell-level decisions of "target" or "no target" to be made based on the observed sensor data.
Abstract: Previous work by the authors using information-based sensor management for static target detection has utilized a probability of error performance metric that assumes knowledge of the number of targets present in a grid of cells. Using this probability of error performance metric, target locations are estimated as the N cells with the largest posterior state probabilities of containing a target. In a realistic application, however, the number of targets is not known a priori. The sequential probability ratio test (SPRT) developed by Wald is therefore implemented within the previously developed sensor management framework to allow cell-level decisions of "target" or "no target" to be made based on the observed sensor data. Using these cell-level decisions, more traditional performance metrics such as probability of detection and probability of false alarm may then be calculated for the entire region of interest. The resulting sensor management framework is implemented on a large set of data from the U.S. Army's autonomous mine detection sensors (AMDS) program that has been collected using both ground penetrating radar (GPR) and electromagnetic induction (EMI) sensors. The performance of the sensor manager is compared to two different direct search techniques, and the sensor manager is found to achieve the same Pd performance at a lower cost than either of the direct search techniques. Furthermore, uncertainty in the sensor performance characteristics is also modeled, and the use of uncertainty modeling allows a higher Pd to be obtained than is possible when uncertainty is not modeled within the sensor management framework.

13 citations


Journal ArticleDOI
TL;DR: In this paper, a non-local criterion of asymptotic relative efficiency based on Bahadur slopes is applied for the first time to the problem of unit root testing.
Abstract: The unrestricted estimator of the information matrix is shown to be inconsistent for an autoregressive process with a root lying in a neighbourhood of unity with radial length proportional to or smaller than n^(-1), i.e. a root that takes the form ρ = 1 + c/n^α, α ≥ 1. In this case the information evaluated at the estimate converges to a non-degenerate random variable and contributes to the asymptotic distribution of a Wald test for the null hypothesis of a random walk versus a stable AR(1) alternative. With this newly derived asymptotic distribution, the above Wald test is found to improve its performance. A non-local criterion of asymptotic relative efficiency based on Bahadur slopes is applied for the first time to the problem of unit root testing. The Wald test derived in the paper is found to be as efficient as the Dickey-Fuller t ratio test and to outperform the non-studentised Dickey-Fuller test and a Lagrange Multiplier test.

12 citations


Book ChapterDOI
30 Nov 2007

6 citations


01 Jan 2007
TL;DR: It is argued that the actual goal of classification testing is a composite hypothesis (Weitzman, 1982) that an examinee’s ability θ is in the region of θ either above or below the cutscore, rather than equal to an arbitrarily defined point.
Abstract: Reckase (1983) proposed a widely used method of applying the sequential probability ratio test (SPRT; Wald, 1947) to computerized classification testing with item response theory. This method formulates the classification problem as a point hypothesis that an examinee’s ability, θ, is equal to a point, θ1, below the cutscore or a point, θ2, above the cutscore. The current paper argues that the actual goal of classification testing is a composite hypothesis (Weitzman, 1982) that an examinee’s ability θ is in the region of θ either above or below the cutscore, rather than equal to an arbitrarily defined point. A formulation of the SPRT to reflect this testing paradigm is proposed.
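
For context, Reckase's point-hypothesis formulation that the paper argues against can be sketched as follows, here under an assumed Rasch response model with hypothetical cut points theta1 and theta2:

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def classification_sprt(responses, difficulties, theta1, theta2,
                        alpha=0.05, beta=0.05):
    """Point-hypothesis SPRT for pass/fail classification (Reckase-style).

    Accumulates the log likelihood ratio of H2 (theta = theta2, above the
    cutscore) against H1 (theta = theta1, below it) after each scored item.
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for u, b in zip(responses, difficulties):
        p1, p2 = rasch_p(theta1, b), rasch_p(theta2, b)
        llr += math.log(p2 / p1) if u == 1 else math.log((1 - p2) / (1 - p1))
        if llr >= upper:
            return "pass"
        if llr <= lower:
            return "fail"
    return "undecided"  # test length exhausted before a decision
```

The paper's composite-hypothesis reformulation replaces the two arbitrary points theta1 and theta2 with regions above and below the cutscore; the sketch above shows only the point-hypothesis baseline.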

01 Jan 2007
TL;DR: The simulation results show that the SPRT method is more suitable than WSSR for detecting soft failures of aircraft engine sensors.
Abstract: A method for troubleshooting soft failures of aircraft engine sensors was proposed based on Kalman filters and the sequential probability ratio test (SPRT). This approach was used to process Kalman filter residuals and detect soft sensor failures with a revised SPRT. Thereafter, the method was compared with WSSR (Weighted Sum of Squared Residuals) in troubleshooting soft sensor failures. The simulation results show that the SPRT method is more suitable for detecting soft failures of aircraft engine sensors.
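
The residual-testing step can be sketched as a scalar SPRT for a mean shift in Gaussian filter residuals; the noise level `sigma`, the fault magnitude `shift`, and the error rates are illustrative, and the paper's revised SPRT is not reproduced:

```python
import math

def residual_sprt(residuals, sigma, shift, alpha=0.01, beta=0.01):
    """SPRT on filter residuals for soft-fault detection.

    H0: residual ~ N(0, sigma^2)      (healthy sensor)
    H1: residual ~ N(shift, sigma^2)  (biased sensor)
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for k, r in enumerate(residuals, 1):
        # Gaussian log-likelihood-ratio increment of H1 against H0.
        llr += (shift / sigma**2) * (r - shift / 2.0)
        if llr >= upper:
            return "fault", k
        if llr <= lower:
            return "healthy", k
    return "undecided", len(residuals)
```

Because the statistic accumulates over time, even a small bias that a single-sample threshold would miss eventually drives the log likelihood ratio across the fault boundary.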

Journal Article
TL;DR: Simulation results show that the recursive track-before-detect (TBD) algorithm has favorable detection and tracking performance and can detect the disappearance of targets in time by setting an appropriate sample size.
Abstract: A recursive track-before-detect (TBD) algorithm based on a particle filter, combining the sequential probability ratio test and a fixed-sample-size (SPRT-FSS) likelihood ratio test, is presented for on-line detection and tracking of weak targets in low-SNR environments. The particle filter is used to solve the nonlinear and non-Gaussian problem; the SPRT can increase the SNR to detect the existence of targets with the shortest delay by sequentially accumulating multi-frame measurements; the FSS likelihood ratio test is used to detect targets successively, and it can detect the disappearance of targets in time by setting an appropriate sample size. Simulation results show that the TBD algorithm has favorable detection and tracking performance.

Journal Article
XU Zhi-gao
TL;DR: A novel method for detecting sensor faults is presented, and its effectiveness is validated by a simulated example with many-sided sensor faults occurring in the vacuum system of a 125 MW power generating set.
Abstract: A novel method for detecting sensor faults is presented. The conventional principal component analysis method doesn't work well with non-linear systems, so a model based on kernel PCA is constructed to extract the system's non-linear redundant information and then reconstruct the data in the input space. Proper kernel functions and parameters, which serve as a guide for constructing the model, are chosen by minimizing the mean square prediction error. The model may also be used effectively during on-line fault detection to denoise the system by projecting real-time data into the KPCA space. The sequential probability ratio test is used to detect the reconstructed residual error. Thereby not only obvious sensor faults, such as drifting, can be diagnosed, but early fault symptoms may also be noticed in time. The method's effectiveness has been validated by a simulated example with many-sided sensor faults occurring in the vacuum system of a 125 MW power generating set.

Proceedings ArticleDOI
26 Aug 2007
TL;DR: This paper provides a counterexample to show that the best sequential decision rule cannot be obtained by the choice of any reward function in the reinforcement learning framework, and can only be learned via a rather unconventional formulation of reinforcement learning.
Abstract: Reinforcement learning deals with how to find the best policy under an uncertain environment to maximize some notion of long-term reward. In sequential decision making, it is often expected that the best policy can be designed by choosing an appropriate reward or penalty for each action. In this paper, we provide a counterexample to show that the best sequential decision rule cannot be obtained by the choice of any reward function in the reinforcement learning framework. In fact, the best policy, namely the randomized sequential probability ratio test, can only be learned via a rather unconventional formulation of reinforcement learning. The implication for the design of classifier combining methods is also discussed.

Journal Article
TL;DR: This article proposes the two-step sequential mesh test, and the results show that it is better than the sequential mesh test.
Abstract: The sequential probability ratio test (SPRT) is widely used, while the sequential mesh test is more powerful. In this article we propose the two-step sequential mesh test. The results show that it is better than the sequential mesh test.

Proceedings Article
01 Jul 2007
TL;DR: A Sequential Probability Ratio Test (SPRT) algorithm helps to increase the reliability and speed of radiation detection and is further improved to reduce spatial gaps and false alarms.
Abstract: A Sequential Probability Ratio Test (SPRT) algorithm helps to increase the reliability and speed of radiation detection. This algorithm is further improved to reduce spatial gaps and false alarms. The SPRT, using a Last-in-First-Elected-Last-Out (LIFELO) technique, reduces the error between the measured radiation and the resulting alarm. Statistical analysis determines the reduction in spatial error and false alarms.
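
A minimal sketch of an SPRT for count data of this kind, testing a background-only rate against a background-plus-source rate per counting interval (the rates and error levels are illustrative, and the LIFELO refinement is not modeled):

```python
import math

def radiation_sprt(counts, bg_rate, src_rate, alpha=0.001, beta=0.01):
    """SPRT on Poisson counts: background-only vs background-plus-source.

    Each entry in `counts` is the number of events in one counting interval;
    the rates are expected counts per interval. Requires src_rate > bg_rate.
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for k, n in enumerate(counts, 1):
        # Poisson log-likelihood-ratio increment (factorial terms cancel).
        llr += n * math.log(src_rate / bg_rate) - (src_rate - bg_rate)
        if llr >= upper:
            return "alarm", k
        if llr <= lower:
            return "clear", k
    return "undecided", len(counts)
```

The strongly asymmetric defaults reflect the usual design goal of a very low false-alarm rate while still deciding quickly when a source is present.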

Book ChapterDOI
12 Sep 2007
TL;DR: A novel filter, the Evolution Strategies based particle filter (ESP), developed by recognizing the similarities and differences between particle filters and Evolution Strategies, is applied here to fault detection in nonlinear stochastic state space models.
Abstract: Fault detection in dynamic systems has attracted considerable attention in designing systems for safety and reliability. Though a large number of methods have been proposed for solving the fault detection problem, they hardly apply to nonlinear stochastic state space models. A novel filter, the Evolution Strategies based particle filter (ESP), developed by recognizing the similarities and differences between particle filters and Evolution Strategies, is applied here to fault detection in nonlinear stochastic state space models. Numerical simulation studies have been conducted to exemplify the applicability of this approach.

Journal Article
TL;DR: A novel approach is brought forward to resolve the conflict between real-time performance and accuracy when detecting small offset faults of a sensor from multiple data, using a Kalman filter to acquire the sensor residuals and a sequential probability ratio test on multiple residuals to detect the fault.
Abstract: To resolve the conflict between real-time performance and accuracy when detecting small offset faults of a sensor from multiple data, a novel approach is brought forward in this paper. The method uses a Kalman filter to acquire the residuals of the sensor and a sequential probability ratio test on multiple residuals to detect the fault; it obtains the least number of residuals by dynamic calculation and thus has an adaptive characteristic. The method resolves the real-time problem of detecting a fault from residuals and improves the detection accuracy.

31 Dec 2007
TL;DR: In this paper, a sequential test for circular data is considered; an illustrative application of the test is performed on generated data, the results are evaluated, and some interpretations of how to use this test are given.
Abstract: In this study, a sequential test for circular data is considered. The theoretical details, operating characteristic function, and average sample number function of the Sequential Probability Ratio Test for the von Mises distribution are given. Also, an illustrative application of the test is performed on generated data. Finally, the results obtained from the application are evaluated and some interpretations of how to use this test are given.
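
The core likelihood-ratio computation for a von Mises SPRT is simple when the concentration parameter kappa is known, since the normalizing constants cancel; the following sketch assumes known kappa and two hypothesized mean directions (illustrative only, not the paper's OC/ASN analysis):

```python
import math

def von_mises_sprt(angles, mu0, mu1, kappa, alpha=0.05, beta=0.05):
    """SPRT for the mean direction of von Mises data with known kappa.

    log f(x; mu1)/f(x; mu0) = kappa * (cos(x - mu1) - cos(x - mu0));
    the Bessel-function normalizing constant cancels. Angles in radians.
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for k, x in enumerate(angles, 1):
        llr += kappa * (math.cos(x - mu1) - math.cos(x - mu0))
        if llr >= upper:
            return "accept mu1", k
        if llr <= lower:
            return "accept mu0", k
    return "undecided", len(angles)
```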

Journal ArticleDOI
TL;DR: In this article, an experimental method denoted as Impulse Method is proposed as a cost-effective non-destructive technique for the on-site evaluation of concrete elastic modulus in existing structures.
Abstract: An experimental method denoted as the Impulse Method is proposed as a cost-effective non-destructive technique for the on-site evaluation of the concrete elastic modulus in existing structures: on the basis of Hertz's quasi-static theory of elastic impact and with the aid of simple portable testing equipment, it makes it possible to collect series of local measurements of the elastic modulus easily and in a very short time. A Hypothesis Testing procedure is developed in order to provide a statistical tool for processing the data collected by means of the Impulse Method and assessing the possible occurrence of significant variations in the elastic modulus without exceeding some prescribed error probabilities. It is based on a particular formulation of the renowned sequential probability ratio test and proves to be optimal with respect to the error probabilities and the required number of observations, thus further improving the time-effectiveness of the Impulse Method. The results of an experimental investigation on different types of plain concrete prove the validity of the Impulse Method in estimating the unknown value of the elastic modulus and attest to the effectiveness of the proposed Hypothesis Testing procedure in identifying significant variations in the elastic modulus.

01 Jan 2007
TL;DR: This section summarizes the McSPRT algorithm for solving correlated selection problems with a focus on finding a treatment with lowest (or highest) expected utility.
Abstract: This section summarizes the McSPRT algorithm for solving correlated selection problems. The section is brief; a more detailed discussion appears in [Gratch94]. McSPRT stands for "Multiple Comparison Sequential Probability Ratio Test." A sequential procedure [Govindarjulu81] is one that draws data a little at a time until enough has been taken to make a statistical decision of some pre-specified quality. Sequential procedures tend to be more efficient than fixed-size procedures. The sequential probability ratio test (SPRT) is a sequential procedure which can be used (among other things) to decide the sign of the expected value of a distribution. A multiple comparison procedure [Hochberg87] is a statistical procedure that makes some global decision based on many separate decisions (called comparisons). McSPRT is a multiple comparison procedure for finding a treatment with lowest (or highest) expected utility. This is decided by comparing the differences in expected utility between treatments. In particular, after each training example, the treatment with lowest estimated utility is compared pair-wise with each other treatment. If the difference in expected value between the best and an alternative is significantly negative (as decided by SPRT), the alternative is discarded.
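
The pairwise comparison at the core of McSPRT can be sketched as an SPRT for the sign of the mean utility difference between two treatments; the Gaussian assumption and the `effect` and `sigma` parameters are illustrative, not [Gratch94]'s exact formulation:

```python
import math

def sprt_sign(diffs, effect, sigma, alpha=0.05, beta=0.05):
    """Wald SPRT for the sign of a mean utility difference.

    H0: E[d] = +effect  vs  H1: E[d] = -effect, with d ~ N(mean, sigma^2).
    McSPRT would run one such test per (best, alternative) pair and discard
    an alternative once its test concludes "negative".
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for d in diffs:
        # Gaussian log LR of H1 (mean -effect) against H0 (mean +effect).
        llr += -2.0 * effect * d / sigma**2
        if llr >= upper:
            return "negative"   # difference significantly below zero
        if llr <= lower:
            return "positive"
    return "undecided"
```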

Proceedings ArticleDOI
26 Dec 2007
TL;DR: It is shown that IMHWSPRT is robust to system model faults and that the effect of the faults can be reduced by using the null matrix of the measurement model matrix.
Abstract: An algorithm for resolving the GPS integer ambiguity is introduced. This algorithm, called the improved multiple-hypothesis Wald sequential probability ratio test (IMHWSPRT), is based on a log formulation of the multiple-hypothesis Wald sequential probability ratio test (MHWSPRT). In this method the dynamic information is used as a constraint, which speeds up the convergence time of resolving the integer ambiguity of GPS carrier phase measurements. The improvement in convergence speed over the MHWSPRT is due to the increase in the probability ratio from using the dynamic information constraint. The dynamic information makes the state error covariance small, which helps to speed up the resolution convergence time. But this method has a weakness when the system model has faults. It is possible to reduce the effect of the faults by using the null matrix of the measurement model matrix. It is shown that IMHWSPRT is robust to system model faults. The performance of IMHWSPRT is demonstrated using numerical simulations.