Showing papers on "Sequential probability ratio test" published in 2012


Journal ArticleDOI
TL;DR: The results of this study reveal that the SPRT chart is more effective than the CUSUM chart and the $\bar{X}$ chart by 58% and 126%, respectively, from an overall viewpoint.

111 citations


Journal ArticleDOI
TL;DR: Simulation results show that the proposed scheme, using even 1 bit, can outperform its uniform sampling counterpart that uses an infinite number of bits under changing target error probabilities, SNR values, and numbers of SUs.
Abstract: We propose a new framework for cooperative spectrum sensing in cognitive radio networks that is based on a novel class of nonuniform samplers, called event-triggered samplers, and on sequential detection. In the proposed scheme, each secondary user (SU) computes its local sensing decision statistic based on its own channel output; whenever this statistic crosses certain predefined threshold values, the SU sends one (or several) bits of information to the fusion center (FC). The FC asynchronously receives the bits from different SUs and updates the global sensing decision statistic to perform a sequential probability ratio test (SPRT) and reach a sensing decision. We provide an asymptotic analysis of the above scheme and, under different conditions, compare it against a cooperative sensing scheme based on traditional uniform sampling and sequential detection. Simulation results show that the proposed scheme, using even 1 bit, can outperform its uniform sampling counterpart that uses an infinite number of bits under changing target error probabilities, SNR values, and numbers of SUs.
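
A minimal sketch of the fusion-side logic may help make the scheme concrete. The bit semantics here (each received bit standing for a fixed log-likelihood ratio step of size delta) and all parameter values are illustrative assumptions, not the paper's exact algorithm:

    import math

    # Hedged sketch of level-triggered sensing + SPRT fusion: each SU is
    # assumed to transmit a +/-1 bit whenever its local LLR changes by
    # delta; the fusion center adds delta per bit and runs Wald's SPRT.
    def fusion_sprt(bits, alpha=0.01, beta=0.01, delta=0.5):
        upper = math.log((1 - beta) / alpha)   # decide H1 (primary present)
        lower = math.log(beta / (1 - alpha))   # decide H0 (band vacant)
        llr = 0.0
        for bit in bits:                       # bits arrive asynchronously
            llr += delta * bit                 # bit is +1 or -1
            if llr >= upper:
                return "H1"
            if llr <= lower:
                return "H0"
        return "undecided"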

73 citations


Journal ArticleDOI
TL;DR: The most commonly applied signal detection algorithms are presented, covering simple frequentist methods like the proportional reporting ratio or the reporting odds ratio, more advanced Bayesian techniques for spontaneous and longitudinal data, and a drug monitoring technique based on Wald's sequential probability ratio test.
Abstract: Post-marketing detection and surveillance of potential safety hazards are crucial tasks in pharmacovigilance. To uncover such safety risks, a wide set of techniques has been developed for spontaneous reporting data and, more recently, for longitudinal data. This paper gives a broad overview of the signal detection process and introduces some types of data sources typically used. The most commonly applied signal detection algorithms are presented, covering simple frequentist methods like the proportional reporting ratio or the reporting odds ratio, more advanced Bayesian techniques for spontaneous and longitudinal data, e.g., the Bayesian Confidence Propagation Neural Network or the Multi-item Gamma-Poisson Shrinker, and methods developed for longitudinal data only, like the IC temporal pattern detection. Additionally, the problem of adjustment for underlying confounding is discussed, and the most common strategies to automatically identify false-positive signals are addressed. A drug monitoring technique based on Wald's sequential probability ratio test is presented. For each method, a real-life application is given, and a wide set of literature for further reading is referenced.
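
For concreteness, the two frequentist measures named above have standard closed forms over the usual 2x2 table of spontaneous reports (a: target drug and target event; b: target drug, other events; c: other drugs, target event; d: neither); the counts in the example are invented for illustration:

    # Proportional reporting ratio and reporting odds ratio from a 2x2 table.
    def prr(a, b, c, d):
        return (a / (a + b)) / (c / (c + d))

    def ror(a, b, c, d):
        return (a * d) / (b * c)

    # Invented counts: the event is reported ~20x more often with the drug.
    print(prr(20, 980, 100, 98900))   # ~19.8
    print(ror(20, 980, 100, 98900))   # ~20.2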

58 citations


Journal ArticleDOI
TL;DR: A new data fusion technique is proposed that uses a variable number of samples and adds a reputation-based mechanism to the Sequential Probability Ratio Test (SPRT); the resulting scheme is the most robust against Byzantine failures among the data fusion techniques considered.

48 citations


Journal ArticleDOI
TL;DR: In this article, the asymptotic expansion of the distribution of the gradient test statistic is derived for a composite hypothesis under a sequence of Pitman alternative hypotheses converging to the null hypothesis at rate $n^{-1/2}$, $n$ being the sample size.
Abstract: The asymptotic expansion of the distribution of the gradient test statistic is derived for a composite hypothesis under a sequence of Pitman alternative hypotheses converging to the null hypothesis at rate $n^{-1/2}$, $n$ being the sample size. Comparisons of the local powers of the gradient, likelihood ratio, Wald and score tests reveal no uniform superiority property. The power performance of all four criteria in the one-parameter exponential family is examined.
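
For reference, the gradient statistic in question is usually written, with $U(\cdot)$ the score function and $\tilde{\theta}$, $\hat{\theta}$ the restricted and unrestricted maximum likelihood estimates, as

    $T_G = U(\tilde{\theta})^{\top} (\hat{\theta} - \tilde{\theta}),$

and it shares the first-order asymptotic $\chi^2$ null distribution of the likelihood ratio, Wald and score statistics, which is what makes the comparison of local powers meaningful.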

41 citations


Journal ArticleDOI
TL;DR: In this paper, the idea of step-up and step-down methods for multiple comparisons is extended to sequential designs for testing multiple hypotheses, resulting in a statistical decision for each individual test while controlling the family-wise error rate and the family-wise power.

38 citations


Journal ArticleDOI
TL;DR: In this article, the authors propose stopping rules and decision rules for simultaneous testing of multiple hypotheses in sequential experiments and derive asymptotically optimal procedures under Pitman alternatives.
Abstract: Sequential procedures are developed for simultaneous testing of multiple hypotheses in sequential experiments. The proposed stopping rules and decision rules achieve strong control of both family-wise error rates I and II. The optimal procedure is sought that minimizes the expected sample size under these constraints. Bonferroni methods for multiple comparisons are extended to the sequential setting and are shown to attain an approximately 50% reduction in the expected sample size compared with earlier approaches. Asymptotically optimal procedures are derived under Pitman alternatives.
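
As a rough illustration of the Bonferroni idea in a sequential setting (a generic sketch, not the paper's procedure or its optimized variant), one can run $m$ Wald SPRTs in parallel with per-test error levels $\alpha/m$ and $\beta/m$:

    import math

    # Generic sketch: m parallel SPRTs at Bonferroni-corrected levels, so the
    # family-wise error rates I and II are controlled at about alpha and beta.
    def bonferroni_sprt(llr_streams, alpha=0.05, beta=0.10):
        m = len(llr_streams)
        upper = math.log((1 - beta / m) / (alpha / m))
        lower = math.log((beta / m) / (1 - alpha / m))
        llrs, decisions = [0.0] * m, [None] * m
        # llr_streams[i] yields per-observation log-likelihood ratios of test i.
        for increments in zip(*llr_streams):
            for i, inc in enumerate(increments):
                if decisions[i] is None:
                    llrs[i] += inc
                    if llrs[i] >= upper:
                        decisions[i] = "reject H0"
                    elif llrs[i] <= lower:
                        decisions[i] = "accept H0"
            if all(d is not None for d in decisions):
                break              # stop sampling once every test has decided
        return decisions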

35 citations


Journal ArticleDOI
TL;DR: The focus of the paper is on the design of space-time codes for a general multiple-input, multiple-output detection problem when multiple observations are available at the receiver; the figure of merit used for optimization purposes is a convex combination of the Kullback-Leibler divergences between the densities of the observations under the two hypotheses.
Abstract: The focus of the paper is on the design of space-time codes for a general multiple-input, multiple-output detection problem, when multiple observations are available at the receiver. The figure of merit used for optimization purposes is the convex combination of the Kullback-Leibler divergences between the densities of the observations under the two hypotheses, and different system constraints are considered. This approach permits control of the average sample number (i.e., the time for taking a decision) in a sequential probability ratio test and asymptotic minimization of the probability of miss in a likelihood ratio test: the solutions offer interesting insight into the optimal transmit policies, encapsulated in the rank of the code matrix, which rules the amount of diversity to be generated, as well as in the power allocation policy along the active eigenmodes. A study of the region of achievable divergence pairs, whose availability permits optimization of a wide range of merit figures, is also undertaken. A set of numerical results is finally given, in order to analyze and discuss the performance and validate the theoretical results.
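
The link between divergence and decision time invoked here is Wald's classical approximation for the average sample number (ASN) of an SPRT with false-alarm probability $\alpha$ and miss probability $\beta$: under $H_1$,

    $E_1[N] \approx \frac{(1-\beta)\ln\frac{1-\beta}{\alpha} + \beta\ln\frac{\beta}{1-\alpha}}{D(f_1 \| f_0)},$

so code designs that increase the Kullback-Leibler divergence $D(f_1 \| f_0)$ directly shorten the expected time to a decision.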

28 citations


Journal ArticleDOI
TL;DR: The presented method can improve the accuracy of the sequential probability ratio test by reducing the false and missed alarm probabilities caused by improper model parameters.
Abstract: The sequential probability ratio test is widely used in in-situ monitoring, anomaly detection, and decision making for electronics, structures, and process controls. However, because model parameters for this method, such as the system disturbance magnitudes and false and missed alarm probabilities, are selected by users primarily based on experience, the actual false and missed alarm probabilities are typically higher than the requirements of the users. This paper presents a systematic method to select model parameters for the sequential probability ratio test by using a cross-validation technique. The presented method can improve the accuracy of the sequential probability ratio test by reducing the false and missed alarm probabilities caused by improper model parameters. A case study of anomaly detection of resettable fuses is used to demonstrate the application of the cross-validation method to select model parameters for the sequential probability ratio test.
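
The underlying test being tuned is the canonical Wald SPRT; a minimal sketch for a Gaussian mean shift is below. The paper's contribution is the cross-validated choice of the disturbance magnitude and of $\alpha$, $\beta$, not the test itself, and the parameter values here are placeholders:

    import math

    # Canonical Wald SPRT for N(mu1, sigma^2) vs N(mu0, sigma^2) data.
    def sprt_gaussian(xs, mu0, mu1, sigma, alpha=0.01, beta=0.01):
        upper = math.log((1 - beta) / alpha)   # alarm boundary
        lower = math.log(beta / (1 - alpha))   # no-alarm boundary
        llr = 0.0
        for n, x in enumerate(xs, start=1):
            # Per-sample log-likelihood ratio for the two Gaussian means.
            llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
            if llr >= upper:
                return "anomaly", n
            if llr <= lower:
                return "normal", n
        return "undecided", len(xs)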

27 citations


Journal ArticleDOI
TL;DR: Despite using only a basic linear dynamic model, the IPSPRT is shown to achieve an exponentially growing advantage in time-to-decision over the windowed-average approach as the probabilities of false and missed alarms are decreased.

19 citations


Proceedings Article
18 Oct 2012
TL;DR: It is shown that among all permutations of ordering of the observations, the average sample number (ASN) is minimum for the order in which the area under the receiver operating characteristic (ROC) curve for each of the non-identically distributed observations is monotonically decreasing.
Abstract: The effect of the ordering of independent and non-identical observations on the average number of samples needed to make a decision in a sequential binary hypothesis test is analyzed in this paper. We show that among all permutations of ordering of the observations, the average sample number (ASN) is minimum for the order in which the area under the receiver operating characteristic (ROC) curve for each of the non-identically distributed observations is monotonically decreasing. The claim is verified by computing the ASN of a generalized sequential probability ratio test (GSPRT) for different orderings of observations, which are independent and non-identical Gaussian random variables, using a combination of analytical and numerical techniques.
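
A small Monte Carlo check of the claim is easy to set up for the Gaussian case, where observation $i$ is $N(0,1)$ under $H_0$ and $N(\mu_i,1)$ under $H_1$, so its AUC is $\Phi(\mu_i/\sqrt{2})$ and decreasing AUC means decreasing $\mu_i$. The sketch below uses a plain Wald SPRT as a stand-in for the paper's GSPRT and simulates under $H_1$ only:

    import math, random

    # ASN of an SPRT over independent, non-identical Gaussian observations,
    # taken in the given order (simulated under H1; illustrative stand-in).
    def asn(mus, alpha=0.01, beta=0.01, trials=2000):
        upper = math.log((1 - beta) / alpha)
        lower = math.log(beta / (1 - alpha))
        total = 0
        for _ in range(trials):
            llr, n = 0.0, 0
            for mu in mus:
                x = random.gauss(mu, 1.0)     # observation under H1
                llr += mu * (x - mu / 2)      # LLR of N(mu,1) vs N(0,1)
                n += 1
                if llr >= upper or llr <= lower:
                    break
            total += n
        return total / trials

    mus = [2.0, 1.5, 1.0, 0.5] * 5
    print(asn(sorted(mus, reverse=True)))   # decreasing AUC: smaller ASN
    print(asn(sorted(mus)))                 # increasing AUC: larger ASN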

Journal ArticleDOI
TL;DR: An algorithm for the optimal design of the SPRT chart based on the Average Extra Quadratic Loss (AEQL) is developed, and the results of systematic comparative studies show that this design algorithm is indeed effective.

Patent
01 Oct 2012
TL;DR: In this patent, a method of detecting a compromised machine on a network is proposed: email messages sent by the machine are classified as spam or non-spam, and a sequential probability ratio test over these classifications decides whether the machine is compromised.
Abstract: A method of detecting a compromised machine on a network. The method receives an email message from a machine on the network and classifies it as either spam or non-spam. A probability ratio is then updated, according to whether the message was spam or non-spam, by applying a sequential probability ratio test. If the probability ratio is greater than or equal to a first threshold, the machine is deemed compromised. If the probability ratio is less than or equal to a second threshold, the machine is deemed normal. The operations of receiving a message, classifying it, and updating the probability ratio are repeated for a plurality of messages, performed on each message one at a time as it is received, until the probability ratio crosses one of the two thresholds.
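
The per-message update described in the claim reduces to adding a fixed log-likelihood increment per spam or non-spam outcome. A hedged sketch follows, where theta1 and theta0 (the assumed spam probabilities for compromised and normal machines) and the error targets are hypothetical placeholders:

    import math

    # Sketch of the claimed per-message SPRT (parameters are hypothetical).
    def classify_machine(spam_flags, theta0=0.2, theta1=0.8,
                         alpha=0.01, beta=0.01):
        first = math.log((1 - beta) / alpha)    # first threshold: compromised
        second = math.log(beta / (1 - alpha))   # second threshold: normal
        ratio = 0.0                             # log probability ratio
        for is_spam in spam_flags:              # one message at a time
            if is_spam:
                ratio += math.log(theta1 / theta0)
            else:
                ratio += math.log((1 - theta1) / (1 - theta0))
            if ratio >= first:
                return "compromised"
            if ratio <= second:
                return "normal"
        return "undetermined"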

Journal ArticleDOI
TL;DR: This work proposes a contention-based protocol for the general decentralized detection problem in the context of wireless sensor networks, using a novel Bayesian update algorithm that exploits both sensing information and channel feedback, and shows that exploiting the capture effect can significantly improve communication and energy efficiency.
Abstract: In this work, we propose a contention-based protocol for the general decentralized detection problem in the context of wireless sensor networks. In this scheme, the fusion task is implemented in a multi-stage fashion: sensors are first grouped according to the informativeness of their data; the fusion center then polls the sensor sets sequentially, in the order of their informativeness, until a target performance is reached. Within one stage, all polled sensors compete for a common channel medium subject to the near-far effect, Rayleigh fading, and shadowing. To determine the optimal transmission probability, we propose a novel Bayesian update algorithm utilizing both sensing information and channel feedback. The proposed dynamic protocol is applied to signal detection in Gaussian noise. As shown by our simulations, incorporating sensing information greatly improves efficiency over a generic Bayesian update scheme relying only on channel feedback. Our results also show that exploiting the capture effect can significantly improve communication and energy efficiency. Comparison with the fixed sample size test and the sequential probability ratio test shows that the proposed scheme achieves significant efficiency gains over existing fusion strategies.

Journal ArticleDOI
TL;DR: This work derives explicit solutions for the error rate and probability distribution function of decision times for a group of independent, (possibly) nonidentical decision makers using one of three simple rules: Race, Majority Total, and Majority First.
Abstract: The sequential probability ratio test (SPRT) and related drift-diffusion model (DDM) are optimal for choosing between two hypotheses using the minimal (average) number of samples and relevant for modeling the decision-making process in human observers. This work extends these models to group decision making. Previous works have focused almost exclusively on group accuracy; here, we explicitly address group decision time. First, we derive explicit solutions for the error rate and probability distribution function of decision times for a group of independent, (possibly) nonidentical decision makers using one of three simple rules: Race, Majority Total, and Majority First. We illustrate our solutions with a group of $N$ i.i.d. decision makers who each make an individual decision using the SPRT-based DDM, then compare the performance of each group rule under different constraints. We then generalize these group rules to the $\eta$-Total and $\eta$-First schemes, to demonstrate the flexibility and power of our approach in characterizing the performance of a group, given the performance of its individual members.
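
A compact way to see the three rules in action is to simulate each member as a discrete-time random walk to a symmetric bound (a crude stand-in for the SPRT-based DDM; drift and bound values below are illustrative):

    import random

    # One member: random walk with drift to +/-bound; returns (vote, time).
    def member(drift=0.1, bound=2.0):
        x, t = 0.0, 0
        while abs(x) < bound:
            x += random.gauss(drift, 1.0)
            t += 1
        return x >= bound, t

    # One group trial under the three rules for n i.i.d. members.
    def group_trial(n=9):
        results = sorted((member() for _ in range(n)), key=lambda r: r[1])
        votes = [v for v, _ in results]
        race = (votes[0], results[0][1])          # fastest member decides
        maj_total = (sum(votes) > n / 2, results[-1][1])  # wait for all votes
        yes = no = 0          # Majority First: stop once n//2 + 1 agree
        for v, t in results:
            yes, no = yes + v, no + (not v)
            if max(yes, no) > n // 2:
                return race, maj_total, (yes > no, t)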

Journal ArticleDOI
TL;DR: In this paper, an analytic redundancy-based fault diagnosis technique (ARFDT) is applied to an onboard maintenance system (OMS) to enhance the functions of redundancy management and the built-in test equipment (BITE) monitor.

Posted Content
TL;DR: In this paper, the authors proposed a change detection test based on Doob's maximal inequality and showed that it is an approximation of the sequential probability ratio test (SPRT); the relationship between the threshold value used in the proposed test and its size and power is deduced from the approximation.
Abstract: A martingale framework for concept change detection based on testing data exchangeability was recently proposed (Ho, 2005). In this paper, we describe the proposed change-detection test based on Doob's maximal inequality and show that it is an approximation of the sequential probability ratio test (SPRT). The relationship between the threshold value used in the proposed test and its size and power is deduced from the approximation. The mean delay time before a change is detected is estimated using the average sample number of an SPRT. The performance of the test using various threshold values is examined on five different data stream scenarios simulated using two synthetic data sets. Finally, experimental results show that the test is effective in detecting changes in time-varying data streams simulated using three benchmark data sets.
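
For reference, the maximal inequality at the heart of the test states that for a nonnegative submartingale $(M_k)$ and any $\lambda > 0$,

    $\lambda \, P\big(\max_{k \le n} M_k \ge \lambda\big) \le E[M_n],$

so under exchangeability the martingale is unlikely ever to reach a large value $\lambda$, and an observed crossing of $\lambda$ can be treated as evidence of a change, much like an SPRT boundary crossing.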

Journal ArticleDOI
TL;DR: Wald's approximations are shown to be applicable even though the problem setting deviates from that of the traditional sequential probability ratio test (SPRT), and the proposed scheme achieves significant savings in the cost of data fusion.
Abstract: The problem of decentralized detection in a large wireless sensor network is considered. An adaptive decentralized detection scheme, group-ordered sequential probability ratio test (GO-SPRT), is proposed. This scheme groups sensors according to the informativeness of their data. Fusion center collects sensor data sequentially, starting from the most informative data and terminates the process when the target performance is reached. Wald's approximations are shown to be applicable even though the problem setting deviates from that of the traditional sequential probability ratio test (SPRT). To analyze the efficiency of GO-SPRT, the asymptotic equivalence between the average sample number of GO-SPRT, which is a function of a multinomial random variable, and a function of a normal random variable, is established. Closed-form approximations for the average sample number are then obtained. Compared with fixed sample size test and traditional SPRT, the proposed scheme achieves significant savings in the cost of data fusion.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a methodology for planning of a truncated sequential probability ratio test (SPRT) in which two systems with exponentially distributed times between failures (TBFs) are compared.
Abstract: Purpose – This paper aims to propose a methodology for planning of a truncated sequential probability ratio test (SPRT) in which two systems with exponentially distributed times between failures (TBFs) are compared. The study is concerned with tests with arbitrary probabilities of type I and type II errors. Design/methodology/approach – The study methodology, based on the proposed optimality criteria for these tests, permitted comparison of different modes of truncation and obviated the drawbacks of discreteness and multidimensionality of their characteristics. Findings – The solution permits planning of a heavily-truncated test with an average sample number exceeding its counterpart for the optimal (non-truncated) test by at most a specified percentage. Relationships are outlined for optimal selection of the truncated test boundaries. So are optimality estimation criteria for the constructed test. The superiority of the SPRTs, truncated by the proposed methodology, over their counterparts, processed according ...

Journal ArticleDOI
TL;DR: The proposed approach notably outperforms the state-of-the-art detectors based on WaldBoost and could be applied wherever the goal is to find the strongest response of a classifier among a set of classified samples.
Abstract: Detection of objects in images using statistical classifiers is a well studied and documented technique. Different applications of such detectors often require selection of the image position with the highest response of the detector: they perform non-maxima suppression. This article introduces the concept of early non-maxima suppression, which aims to reduce necessary computations by making the non-maxima suppression decision early, based on incomplete information provided by a partially evaluated classifier. We show that the error of one such speculative decision with respect to a decision made based on the response of the complete classifier can be estimated by collecting statistics on unlabeled data. The article then considers a sequential strategy of multiple early non-maxima suppression tests which follows the structure of the soft-cascade detectors commonly used for object detection. We also show that an optimal (fastest for a requested error rate) suppression strategy can be created by a novel variant of Wald's sequential probability ratio test (SPRT), which we call the conditioned SPRT (CSPRT). Experimental results show that early non-maxima suppression significantly reduces the amount of computation in the case of object localization, while the error rates are limited to low predefined values. The proposed approach notably outperforms state-of-the-art detectors based on WaldBoost. The potential applications of the early non-maxima suppression approach are not limited to object localization; it could be applied wherever the goal is to find the strongest response of a classifier among a set of classified samples.

Proceedings ArticleDOI
23 Feb 2012
TL;DR: This paper uses a combination of simulated annealing and sequential hypothesis testing to reduce the number of samples required for parameter discovery of stochastic models, and uses probabilistic bounded linear temporal logic (PBLTL) to express the desired behavioral specification of a model.
Abstract: Stochastic models are often used to study the behavior of biochemical systems and biomedical devices. While the structure of such models is often readily available from first principles, several quantitative features of the model are not easily determined. These quantitative features are often incorporated into the model as parameters. The algorithmic discovery of parameter values from experimentally observed facts (including extreme-scale data) remains a challenge for the computational systems biology community. In this paper, we present a new parameter discovery algorithm based on Wald's sequential probability ratio test (SPRT). Our algorithm uses a combination of simulated annealing and sequential hypothesis testing to reduce the number of samples required for parameter discovery of stochastic models. We use probabilistic bounded linear temporal logic (PBLTL) to express the desired behavioral specification of a model. We also present theoretical results on the correctness of our algorithm, and demonstrate the effectiveness of our algorithm by studying a detailed model of glucose and insulin metabolism.
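
The SPRT ingredient here is the standard sequential test from statistical model checking: decide whether the probability $p$ that a sampled trace satisfies the property is above or below a target $\theta$, with an indifference region of width $2\delta$. A hedged sketch follows; the simulate callback and all parameter values are hypothetical stand-ins, not the paper's interface:

    import math

    # Test H0: p >= theta + delta vs H1: p <= theta - delta from Bernoulli
    # outcomes of a stochastic simulator. Returns True when H1 is accepted,
    # i.e., evidence that the property holds with probability below theta.
    def sprt_satisfaction(simulate, params, theta=0.9, delta=0.05,
                          alpha=0.01, beta=0.01):
        p0, p1 = theta + delta, theta - delta
        upper = math.log((1 - beta) / alpha)
        lower = math.log(beta / (1 - alpha))
        llr = 0.0
        while lower < llr < upper:
            if simulate(params):          # trace satisfies the property
                llr += math.log(p1 / p0)
            else:
                llr += math.log((1 - p1) / (1 - p0))
        return llr >= upper

An annealing loop would then call such a test at each candidate parameter point, accepting moves that make satisfaction more likely.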

Journal ArticleDOI
TL;DR: A novel anomaly detection method, the cross-validation-based sequential probability ratio test, is developed and applied to the failure precursor parameters of the resettable circuit protection devices to conduct anomaly detection.
Abstract: As circuit protection devices, failure or abnormal behavior of polymer positive-temperature-coefficient resettable devices can cause damage to circuits. It is necessary to detect anomalies in the resettable circuit protection devices to provide early warning of failure and avoid damage to a circuit. In this paper, a novel anomaly detection method, the cross-validation-based sequential probability ratio test, is developed and applied to the failure precursor parameters of the resettable circuit protection devices to conduct anomaly detection. The cross-validation-based sequential probability ratio test integrates the advantages of both the sequential probability ratio test for in situ anomaly detection and the cross-validation technique for model parameter selection to reduce the probability of false and missed alarms in anomaly detection. The cross-validation-based sequential probability ratio test solves the model parameter selection difficulty of the traditional sequential probability ratio test and improves its performance in anomaly detection.

Book ChapterDOI
11 Dec 2012
TL;DR: The experiments show that the GMAM method is effective in detecting concept changes in two synthetic time-varying data streams and a real-world dataset, the 'Respiration' dataset.
Abstract: In this paper, we propose a Geometric Moving Average Martingale (GMAM) method for detecting changes in data streams. There are two components underpinning the GMAM method. The first is the exponential weighting of observations, which has the capability of reducing false changes. The second is the use of the GMAM value for hypothesis testing. When a new data point is observed, the hypothesis testing decides whether any change has occurred based on the GMAM value. Once a change is detected, all variables of the GMAM algorithm are re-initialized in order to find other changes. The experiments show that the GMAM method is effective in detecting concept changes in two synthetic time-varying data streams and a real-world dataset, the 'Respiration' dataset.

Proceedings ArticleDOI
27 Jun 2012
TL;DR: It is proved that the Wald-Wolfowitz theorem holds for sequential tests with multiple sensors, and an optimal sequential test obtained using dynamic programming is shown to correspond to the Sequential Probability Ratio Test (SPRT) when the sensor selection process is stationary.
Abstract: We study the problem of sequential detection for binary hypothesis testing using multiple sensors. We consider a randomized sensor selection strategy in which one sensor can be active at any given time step. We obtain an optimal sequential test using dynamic programming and show that it corresponds to the Sequential Probability Ratio Test (SPRT) when the sensor selection process is stationary. Further, we prove that the Wald-Wolfowitz theorem holds for sequential tests with multiple sensors.

Book ChapterDOI
04 Jun 2012
TL;DR: The goal is to rigorously formalize the problem in terms of mathematical decision theory, find the optimal solution to the problem, and derive concrete bounds for its expected loss (number of mistakes the SPIT filter will make in the worst case).
Abstract: This paper presents the first formal framework for identifying and filtering SPIT calls (SPam in Internet Telephony) in an outbound scenario with provable optimal performance. In so doing, our work deviates from related earlier work where this problem is only addressed by ad-hoc solutions. Our goal is to rigorously formalize the problem in terms of mathematical decision theory, find the optimal solution to the problem, and derive concrete bounds for its expected loss (number of mistakes the SPIT filter will make in the worst case). This goal is achieved by considering a scenario amenable to theoretical analysis, namely SPIT detection in an outbound scenario with pure sources. Our methodology is to first define the cost of making an error, apply Wald's sequential probability ratio test, and then determine analytically error probabilities such that the resulting expected loss is minimized. The benefits of our approach are: (1) the method is optimal (in a sense defined in the paper); (2) the method does not rely on manual tuning and tweaking of parameters but is completely self-contained and mathematically justified; (3) the method is computationally simple and scalable. These are desirable features that would make our method a component of choice in larger, autonomic frameworks.

Journal ArticleDOI
TL;DR: In this article, a Poisson generalized linear mixed model (GLMM) was proposed for the design of a multicenter randomized clinical trial that compares two preventive treatments for surgical site infections.

Proceedings Article
09 Jul 2012
TL;DR: A modified decomposition-and-fusion approach for target tracking in the presence of range-gate-pull-off (RGPO) is proposed, which overcomes the deficiencies of the likelihood ratio test, and fits well with the RGPO detection problem.
Abstract: A modified decomposition-and-fusion approach for target tracking in the presence of range-gate-pull-off (RGPO) is proposed. The RGPO detection problem consists of two parts: onset detection and termination detection. The likelihood ratio test used in the decomposition-and-fusion approach is replaced with sequential change detection, such as the cumulative sum test and Shiryayev's sequential probability ratio test. The proposed approach overcomes the deficiencies of the likelihood ratio test, such as uncontrollable detection probability and neglect of old information, and fits well with the RGPO detection problem. These detectors are evaluated, and simulation results show that the proposed solution substantially outperforms the original solution, since the missed-detection rate is greatly reduced by sequential detection.
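
The cumulative sum test named above has a one-line recursion in its standard log-likelihood form; a minimal, generic sketch (the per-sample LLR function and threshold are supplied by the user, and the example values are illustrative):

    # CUSUM for onset detection: accumulate per-sample LLR, clipped at zero,
    # and declare a change (e.g., RGPO onset) when the statistic exceeds h.
    def cusum(xs, llr, h):
        g = 0.0
        for n, x in enumerate(xs, start=1):
            g = max(0.0, g + llr(x))   # clipping discards stale evidence
            if g > h:
                return n               # alarm at sample n
        return None                    # no change detected

    # Example LLR for a Gaussian mean shift from 0 to 1 (unit variance).
    llr = lambda x: x - 0.5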

Posted Content
TL;DR: In this article, a formal framework for identifying and filtering SPIT calls (SPam in Internet Telephony) in an outbound scenario with provable optimal performance is presented; unlike related previous work, the goal is to rigorously formalize the problem in terms of mathematical decision theory, find the optimal solution, and derive concrete bounds for its expected loss (the number of mistakes the SPIT filter will make in the worst case).
Abstract: This paper presents a formal framework for identifying and filtering SPIT calls (SPam in Internet Telephony) in an outbound scenario with provable optimal performance. In so doing, our work is largely different from related previous work: our goal is to rigorously formalize the problem in terms of mathematical decision theory, find the optimal solution to the problem, and derive concrete bounds for its expected loss (number of mistakes the SPIT filter will make in the worst case). This goal is achieved by considering an abstracted scenario amenable to theoretical analysis, namely SPIT detection in an outbound scenario with pure sources. Our methodology is to first define the cost of making an error (false positive and false negative), apply Wald's sequential probability ratio test to the individual sources, and then determine analytically error probabilities such that the resulting expected loss is minimized. The benefits of our approach are: (1) the method is optimal (in a sense defined in the paper); (2) the method does not rely on manual tuning and tweaking of parameters but is completely self-contained and mathematically justified; (3) the method is computationally simple and scalable. These are desirable features that would make our method a component of choice in larger, autonomic frameworks.

Proceedings ArticleDOI
26 Jun 2012
TL;DR: A low-complexity sequential probability ratio test (SPRT) is developed for effectively detecting the vacant spectrum while meeting the requirements of the sensing duty cycle, and simulation results show that the proposed detector outperforms conventional energy detection (ED), especially in low-SNR environments.
Abstract: Spectrum sensing is a crucial technique used to discover available bands that are not occupied by primary users in cognitive networks (CNs). With good sensing capability in terms of a low probability of a miss occurrence, secondary users can effectively recycle the spectrum resource without disturbing active primary users. With a low probability of a false alarm occurrence, spectral utilization can be kept high. Energy detection (ED) is a relatively simple spectrum sensing technique; in practice, however, a cognitive radio (CR) receiver has to operate at low signal-to-noise ratio (SNR) regimes because of channel fades and noise, and low SNR inevitably degrades the performance of ED dramatically. In this paper, a sequential test detector based on higher-order statistics (HOS) is investigated to conduct effective spectrum sensing, especially in low-SNR environments. By taking advantage of cumulant statistics, spectrum sensing reliability can be significantly improved, as Gaussian noise can be suppressed. Based on binary hypothesis testing, a low-complexity sequential probability ratio test (SPRT) is thus developed for effectively detecting the vacant spectrum while meeting the requirements of the sensing duty cycle. Simulation results show that the proposed detector outperforms conventional ED, especially in low-SNR environments.

Proceedings ArticleDOI
25 Mar 2012
TL;DR: This work proposes and analyzes an off-line randomized sensor selection strategy for the sequential hypothesis testing problem constrained by sensor measurement costs, introduces a quantity, called the efficiency of a sensor, and shows that it is critical to sensor selection in the SPRT.
Abstract: We propose and analyze an off-line randomized sensor selection strategy for the sequential hypothesis testing problem constrained by sensor measurement costs. Within the framework of Wald's approximation, the sequential probability ratio test (SPRT) with sensor selection is designed to minimize the expected total measurement cost subject to reliability and sensor usage constraints. In the case of symmetric hypotheses, we introduce a quantity, called the efficiency of a sensor, and show that it is critical to sensor selection in the SPRT. Furthermore, an algorithm with linear time complexity is proposed to obtain the optimal sensor selection probabilities.
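
The abstract does not spell out the definition of efficiency. One plausible formalization, consistent with Wald's approximation (under which expected decision time scales inversely with the per-observation Kullback-Leibler divergence), is information per unit cost,

    $e_i = \frac{D_i}{c_i},$

where $D_i$ is the divergence between sensor $i$'s observation densities under the two hypotheses and $c_i$ its measurement cost; this is stated here as an illustrative assumption, not necessarily the paper's exact quantity.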