
Showing papers on "Sequential probability ratio test published in 2013"


Journal ArticleDOI
TL;DR: The feasibility of a strategy of fault detection capable of controlling misclassification probabilities, i.e., balancing false and missed alarms, is investigated.
Abstract: In this paper, we investigate the feasibility of a strategy of fault detection capable of controlling misclassification probabilities, i.e., balancing false and missed alarms. The novelty of the proposed strategy consists of i) a signal grouping technique and signal reconstruction modeling technique (one model for each subgroup), and ii) a statistical method for defining the fault alarm level. We consider a real case study concerning 46 signals of the Reactor Coolant Pump (RCP) of a typical Pressurized Water Reactor (PWR). In the application, the reconstructions are provided by a set of Auto-Associative Kernel Regression (AAKR) models, whose input signals have been selected by a hybrid approach based on Correlation Analysis (CA) and Genetic Algorithm (GA) for the identification of the groups. Sequential Probability Ratio Test (SPRT) is used to define the alarm level for a given expected classification performance. A practical guideline is provided for optimally setting the SPRT parameters' values.
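
The abstract does not spell out the SPRT formulas it uses; as a minimal sketch, a standard Wald SPRT applied to reconstruction residuals (assuming zero-mean Gaussian residuals when healthy and a hypothesized mean shift under a fault; the function name, shift, and error targets below are illustrative, not taken from the paper) could look like this:

```python
import numpy as np

def sprt_fault_alarm(residuals, sigma, shift, alpha=0.01, beta=0.01):
    """Wald SPRT on a stream of reconstruction residuals.

    H0: residual ~ N(0, sigma^2)      (healthy)
    H1: residual ~ N(shift, sigma^2)  (faulty)
    alpha / beta bound the false-alarm / missed-alarm probabilities.
    Returns ('H0' | 'H1' | 'undecided', number of samples consumed).
    """
    upper = np.log((1.0 - beta) / alpha)   # crossing it raises the alarm
    lower = np.log(beta / (1.0 - alpha))   # crossing it declares "healthy"
    llr = 0.0
    for n, r in enumerate(residuals, start=1):
        # log-likelihood ratio increment for a Gaussian mean shift
        llr += (shift / sigma**2) * (r - shift / 2.0)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", len(residuals)

# Illustrative use on synthetic residual streams
rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 0.5, size=200)
faulty = rng.normal(1.0, 0.5, size=200)
print(sprt_fault_alarm(healthy, sigma=0.5, shift=1.0))
print(sprt_fault_alarm(faulty, sigma=0.5, shift=1.0))
```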

77 citations


Journal ArticleDOI
TL;DR: This paper considers the problem of how to quickly and accurately determine the availability of each spectrum band for a multi-band primary system using one or a few sensors and shows that the optimal scanning algorithm is a concatenated sequential probability ratio test (C-SPRT).
Abstract: This paper considers the problem of how to quickly and accurately determine the availability of each spectrum band for a multi-band primary system using one or a few sensors. Such a problem is referred to as spectrum scanning. Two cases of practical interest are studied: 1) a single sensor case in which only one spectrum band is observed at a time; and 2) a multiple sensor case in which multiple spectrum bands are observed simultaneously. For each case, scenarios with and without a scanning delay constraint are investigated. Using mathematical tools from optimal stopping theory, optimal spectrum scanning algorithms are developed to minimize a cost function that strikes a desirable trade-off between detection performance and sensing delay. In the non-delay-constrained case, it is shown that the optimal scanning algorithm is a concatenated sequential probability ratio test (C-SPRT). In the delay-constrained case, the optimal scanning algorithm has a high implementation complexity, and truncation algorithms are developed as alternative low-complexity options. Numerical examples are provided to illustrate the effectiveness of the proposed algorithms.
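
The C-SPRT itself is derived via optimal stopping; the sketch below only illustrates the concatenation idea, assuming hypothetical Gaussian sensing statistics for idle and busy bands and illustrative error targets, with one sensor visiting the bands in turn:

```python
import numpy as np

def sprt_one_band(samples, mu0, mu1, sigma, alpha, beta):
    """Wald SPRT on one band: H0 idle ~ N(mu0, sigma^2), H1 busy ~ N(mu1, sigma^2)."""
    upper = np.log((1 - beta) / alpha)
    lower = np.log(beta / (1 - alpha))
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += ((mu1 - mu0) / sigma**2) * (x - (mu0 + mu1) / 2.0)
        if llr >= upper:
            return "busy", n
        if llr <= lower:
            return "idle", n
    return "undecided", len(samples)

def concatenated_scan(band_streams, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Single-sensor scan: finish the SPRT on one band, then start afresh on the next."""
    return [sprt_one_band(s, mu0, mu1, sigma, alpha, beta) for s in band_streams]

rng = np.random.default_rng(1)
streams = [rng.normal(0.0, 1.0, 500), rng.normal(1.0, 1.0, 500), rng.normal(0.0, 1.0, 500)]
print(concatenated_scan(streams, mu0=0.0, mu1=1.0, sigma=1.0))
```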

45 citations


Journal ArticleDOI
TL;DR: An asymptotic analysis on the average decision delay for the proposed channel-aware scheme is provided, and it is shown that the asymptotic decision delay is characterized by a Kullback-Leibler information number.
Abstract: We consider decentralized detection through distributed sensors that perform level-triggered sampling and communicate with a fusion center (FC) via noisy channels. Each sensor computes its local log-likelihood ratio (LLR), samples it using the level-triggered sampling mechanism, and at each sampling instant transmits a single bit to the FC. Upon receiving a bit from a sensor, the FC updates the global LLR and performs a sequential probability ratio test (SPRT) step. We derive the fusion rules under various types of channels. We further provide an asymptotic analysis on the average decision delay for the proposed channel-aware scheme, and show that the asymptotic decision delay is characterized by a Kullback-Leibler information number. The delay analysis facilitates the choice of the appropriate signaling schemes under different channel types for sending the 1-bit information from the sensors to the FC.
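
A schematic of the mechanism under idealized, error-free channel conditions: each sensor transmits one bit whenever its accumulated local LLR has moved by a step of size delta since its last transmission, and the fusion center adds or subtracts delta per received bit and applies Wald's SPRT. The trigger level, error targets, and data model below are illustrative.

```python
import numpy as np
from itertools import zip_longest

def level_triggered_bits(llr_increments, delta):
    """Emit +1/-1 bits whenever the accumulated local LLR has moved by delta
    since the last transmission (idealized level-triggered sampling)."""
    bits, acc = [], 0.0
    for inc in llr_increments:
        acc += inc
        while acc >= delta:
            bits.append(+1)
            acc -= delta
        while acc <= -delta:
            bits.append(-1)
            acc += delta
    return bits

def fusion_sprt(bit_streams, delta, alpha=0.01, beta=0.01):
    """Fusion center: each received bit moves the global LLR by +/- delta;
    a Wald SPRT decides between H0 and H1 (noiseless reception assumed)."""
    upper = np.log((1 - beta) / alpha)
    lower = np.log(beta / (1 - alpha))
    global_llr = 0.0
    for round_bits in zip_longest(*bit_streams, fillvalue=0):  # interleave sensors
        for b in round_bits:
            if b == 0:
                continue
            global_llr += b * delta
            if global_llr >= upper:
                return "H1", global_llr
            if global_llr <= lower:
                return "H0", global_llr
    return "undecided", global_llr

# Two sensors observing H1-like Gaussian data; the local LLR increment for
# H1: N(0.6, 1) versus H0: N(0, 1) is 0.6 * (x - 0.3).
rng = np.random.default_rng(2)
local_incs = [0.6 * (rng.normal(0.6, 1.0, 400) - 0.3) for _ in range(2)]
streams = [level_triggered_bits(inc, delta=0.4) for inc in local_incs]
print(fusion_sprt(streams, delta=0.4))
```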

42 citations


Journal ArticleDOI
TL;DR: This paper explores supply chain collaboration strategies that use the theory of constraints to adjust the target inventory level dynamically, applying three time-series data-mining techniques to detect the timing of market demand changes.

41 citations


Journal ArticleDOI
TL;DR: In a new sequential sampling plan based on the sequential probability ratio test for fuzzy hypothesis testing, triangular fuzzy numbers (TFNs) are used to express the fuzziness of the sampling plan's parameters.
Abstract: In this paper we introduce a new sequential sampling plan based on the sequential probability ratio test for fuzzy hypothesis testing. In order to deal with the uncertainty involved, triangular fuzzy numbers (TFNs) are used to express the fuzziness of the sampling plan's parameters. In this plan the acceptable quality level (AQL) and the lot tolerance percent defective (LTPD) are TFNs. For such a plan, we design a decision criterion of acceptance and rejection for every arbitrary λ-cut, and a particular table of rejection and acceptance is calculated. This plan is well defined since, if the parameters are crisp, it reduces to a classical plan.
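
The fuzzy machinery is not reproduced here, but the crisp plan that the scheme reduces to, Wald's sequential sampling by attributes with a crisp AQL p0 and LTPD p1, can be sketched as follows (the quality levels and risks are made-up illustrative values):

```python
import math

def sequential_sampling_plan(inspections, p0, p1, alpha=0.05, beta=0.10):
    """Wald sequential sampling by attributes (crisp case).

    p0 = AQL (acceptable quality level), p1 = LTPD (lot tolerance percent defective).
    inspections is an iterable of 0/1 outcomes (1 = defective item).
    Returns ('accept' | 'reject' | 'continue', number of items inspected).
    """
    a = math.log((1 - beta) / alpha)        # rejection boundary
    b = math.log(beta / (1 - alpha))        # acceptance boundary
    log_def = math.log(p1 / p0)             # LLR increment for a defective item
    log_ok = math.log((1 - p1) / (1 - p0))  # LLR increment for a conforming item
    llr, n = 0.0, 0
    for n, d in enumerate(inspections, start=1):
        llr += log_def if d else log_ok
        if llr >= a:
            return "reject", n
        if llr <= b:
            return "accept", n
    return "continue", n

print(sequential_sampling_plan([0] * 60, p0=0.01, p1=0.05))      # accepts after ~55 good items
print(sequential_sampling_plan([1, 0, 1, 1], p0=0.01, p1=0.05))  # rejects quickly
```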

24 citations


Journal ArticleDOI
TL;DR: A new algorithm is proposed that improves the algorithm accuracy and robustness by employing an M-estimator cost function to decide on the best estimated model from the randomly selected samples and improves the time performance of the algorithm by utilizing a statistical pretest based on Wald's sequential probability ratio test.
Abstract: This paper addresses the problem of fitting a functional model to data corrupted with outliers using a multilayered feed-forward neural network. Although it is of high importance in practical applications, this problem has not received careful attention from the neural network research community. One recent approach to solving this problem is to use a neural network training algorithm based on the random sample consensus (RANSAC) framework. This paper proposes a new algorithm that offers two enhancements over the original RANSAC algorithm. The first one improves the algorithm accuracy and robustness by employing an M-estimator cost function to decide on the best estimated model from the randomly selected samples. The other one improves the time performance of the algorithm by utilizing a statistical pretest based on Wald's sequential probability ratio test. The proposed algorithm is successfully evaluated on synthetic and real data, contaminated with varying degrees of outliers, and compared with existing neural network training algorithms.
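
The pretest in the paper is based on Wald's SPRT; the sketch below follows the generic randomized-RANSAC style of early-rejecting a candidate model by sequentially testing whether points look like inliers. It is not the authors' exact algorithm, and the inlier probabilities and decision threshold are hypothetical.

```python
import math
import random

def sprt_model_pretest(errors, tol, eps_good=0.5, eps_bad=0.1, A=30.0):
    """SPRT pretest for a candidate model inside a RANSAC-style loop (schematic).

    H_good: a point is an inlier (|error| <= tol) with probability eps_good.
    H_bad : a point is an inlier with probability eps_bad (much smaller).
    The likelihood ratio of 'bad' over 'good' is accumulated point by point;
    the candidate is rejected early once the ratio exceeds A.
    """
    lam = 1.0
    for e in errors:
        if abs(e) <= tol:
            lam *= eps_bad / eps_good                # inlier observed
        else:
            lam *= (1 - eps_bad) / (1 - eps_good)    # outlier observed
        if lam > A:
            return False     # reject early; do not waste time scoring this model
    return True              # model survives the pretest; score it in full

# Illustrative use: pretest two candidate line fits on noisy 1-D data
random.seed(0)
data = [(x, 2.0 * x + random.gauss(0, 0.1)) for x in range(100)]
candidates = {"good": lambda x: 2.0 * x, "bad": lambda x: -1.0 * x + 5.0}
for name, model in candidates.items():
    errs = [y - model(x) for x, y in data]
    print(name, sprt_model_pretest(errs, tol=0.3))
```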

18 citations


Journal ArticleDOI
TL;DR: A novel collision resolution scheme for random access in wireless sensor networks that achieves significant channel and power efficiency gain, compared with fixed sample size test, traditional sequential probability ratio test, and Pseudo-Bayesian based contention protocol.
Abstract: In this paper, we propose a novel collision resolution scheme for random access in wireless sensor networks. If a collision occurs during the fusion process, a splitting algorithm is applied to resolve the collision dynamically and recursively, based on past channel states alone or on channel states and sensing information together. The novelty of our splitting algorithm is two-fold: 1. we perform splitting based on the informativeness of sensor data, ensuring that more informative data will be collected first; 2. we optimize splitting intervals based on local summaries of sensors collected at each fusion step. As shown in our simulation results, the proposed schemes achieve significant channel and power efficiency gains compared with the fixed sample size test, the traditional sequential probability ratio test, and a Pseudo-Bayesian based contention protocol.

16 citations


Journal ArticleDOI
TL;DR: In this paper, a new health monitoring method based on the Multivariate State Estimation Technique (MSET) and the Sequential Probability Ratio Test (SPRT) is proposed, and a detailed experiment on a lithium-ion battery is performed to demonstrate the performance gain.
Abstract: Monitoring lithium-ion battery health is becoming a challenge because battery performance is affected by many environmental factors. To address this problem, a new health monitoring method based on the Multivariate State Estimation Technique (MSET) and the Sequential Probability Ratio Test (SPRT) is proposed in this paper. In order to demonstrate the performance gain of the method, a detailed experiment is performed on a lithium-ion battery. By comparing the actual residuals of the performance parameters with the healthy residuals derived from the training data using MSET, fault detection can be implemented with the SPRT.

11 citations


Journal ArticleDOI
TL;DR: In this paper, the SPRT is theoretically investigated for two different phase-type queueing systems, with hyperexponential and mixed Erlang structure.
Abstract: The control of traffic intensity is one of the important problems in the study of queueing systems. Rao et al. (1984) developed a method, based on the Sequential Probability Ratio Test (SPRT), to detect changes in the traffic intensity of queueing systems of certain standard types. In this paper, the SPRT is theoretically investigated for two different phase-type queueing systems, with hyperexponential and mixed Erlang structure. Also, for testing the null hypothesis on the traffic intensity against the alternative, Operating Characteristic (OC) and Average Sample Number (ASN) functions are obtained with numerical methods using multipoint derivative equations for different settings of the Type I and Type II errors. Afterward, numerical illustrations for each model are provided using Matlab.
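
The phase-type analysis is beyond a short example, but the basic construction can be illustrated for the simple M/M/1 case: the number of arrivals during a service time is geometric with a parameter that depends on the traffic intensity rho, and an SPRT of rho0 against rho1 accumulates the corresponding log-likelihood ratios. The numbers below are purely illustrative.

```python
import math
import numpy as np

def sprt_traffic_intensity(arrivals_per_service, rho0, rho1, alpha=0.05, beta=0.05):
    """SPRT for the traffic intensity rho of an M/M/1 queue (simple classical case).

    Observation k_i = number of arrivals during the i-th service time, with
    P(K = k) = (1 / (1 + rho)) * (rho / (1 + rho))**k  (a geometric law).
    Tests H0: rho = rho0 against H1: rho = rho1 (> rho0).
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    const = math.log((1 + rho0) / (1 + rho1))
    slope = math.log(rho1 * (1 + rho0) / (rho0 * (1 + rho1)))
    llr = 0.0
    for n, k in enumerate(arrivals_per_service, start=1):
        llr += const + k * slope
        if llr >= upper:
            return "H1: rho = %.2f" % rho1, n
        if llr <= lower:
            return "H0: rho = %.2f" % rho0, n
    return "undecided", len(arrivals_per_service)

# Simulate arrivals-per-service counts for a queue whose true rho is 0.9
rng = np.random.default_rng(3)
true_rho = 0.9
counts = rng.geometric(1.0 / (1.0 + true_rho), size=500) - 1  # shift support to {0, 1, ...}
print(sprt_traffic_intensity(counts, rho0=0.5, rho1=0.9))
```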

10 citations


Journal ArticleDOI
TL;DR: In this article, inductive integral equations governing SPRT and Page's cumulative sum test are developed under very general settings, where the bounds can be time-varying and the LLRs are assumed independent but nonstationary.
Abstract: The sequential probability ratio test (SPRT) is a fundamental tool for sequential analysis. It forms the basis of numerous sequential techniques for different applications; for example, the truncated SPRT and Page's cumulative sum test (CUSUM). The performance of SPRT is characterized by two important functions—operating characteristic (OC) and average sample number (ASN), and CUSUM's performance is revealed by the average run length (ARL) function. These functions have been studied extensively under the assumption of independent and identically distributed log-likelihood ratios (LLRs) with constant bounds, which is too stringent for many applications. In this article, inductive integral equations governing these functions are developed under very general settings—the bounds can be time-varying and the LLRs are assumed independent but nonstationary. These inductive equations provide a theoretical foundation for performance analysis. Unfortunately, they have nonunique solutions in the general case...
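
For the classical case that this paper generalizes (i.i.d. LLRs and constant bounds), Wald's closed-form approximations to the OC and ASN are a convenient reference point; a small sketch for a Gaussian mean-shift SPRT is given below. The approximations neglect overshoot of the boundaries, and the parameter values are illustrative.

```python
import math

def wald_oc_asn(theta, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's classical approximations for an SPRT of N(mu0, sigma^2) vs N(mu1, sigma^2).

    Returns (OC, ASN) when the true mean is theta, where OC is the approximate
    probability of accepting H0 and ASN the approximate expected sample number.
    Valid for i.i.d. observations with constant bounds; overshoot is neglected.
    """
    a = math.log((1 - beta) / alpha)      # upper log-boundary
    b = math.log(beta / (1 - alpha))      # lower log-boundary
    c = (mu1 - mu0) / sigma**2
    m = 0.5 * (mu0 + mu1)
    h = (mu0 + mu1 - 2 * theta) / (mu1 - mu0)   # nonzero root of E_theta[exp(h*z)] = 1
    mean_z = c * (theta - m)                    # mean LLR increment under theta
    if abs(h) < 1e-8:                           # theta exactly at the midpoint
        oc = a / (a - b)
        asn = -a * b / (c**2 * sigma**2)
    else:
        oc = (math.exp(h * a) - 1) / (math.exp(h * a) - math.exp(h * b))
        asn = (oc * b + (1 - oc) * a) / mean_z
    return oc, asn

for theta in (0.0, 0.25, 0.5, 0.75, 1.0):
    oc, asn = wald_oc_asn(theta, mu0=0.0, mu1=1.0, sigma=1.0)
    print(f"theta={theta:4.2f}  OC~{oc:5.3f}  ASN~{asn:5.1f}")
```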

10 citations


Journal ArticleDOI
TL;DR: SB for random simulation with multiple responses (outputs), called multi-response SB (MSB), is examined through extensive Monte Carlo experiments that satisfy all MSB assumptions, and through a case study representing a logistic system in China; the results are very promising.
Abstract: Factor screening searches for the really important inputs (factors) among the many inputs that are changed in a realistic simulation experiment. Sequential bifurcation (SB) is a sequential method that changes groups of inputs simultaneously. SB is the most efficient and effective method if the following assumptions are satisfied: (i) second-order polynomials are adequate approximations of the input/output functions implied by the simulation model; (ii) the signs of all first-order effects are known; (iii) if two inputs have no important first-order effects, then they have no important second-order effects either (heredity property). This paper examines SB for random simulation with multiple responses (outputs), called multiresponse SB (MSB). This MSB selects groups of inputs such that within a group all inputs have the same sign for a specific type of output, so no cancellation of first-order effects occurs. MSB also applies Wald’s sequential probability ratio test (SPRT) to obtain enough replicates for correctly classifying a group effect or an individual effect as important or unimportant. MSB enables efficient selection of the initial number of replicates in SPRT. The paper also proposes a procedure to validate the three assumptions of MSB. The performance of MSB is examined through extensive Monte Carlo experiments that satisfy all MSB assumptions, and through a case study representing a logistic system in China; the MSB performance is very promising.

Journal ArticleDOI
TL;DR: A planning methodology is proposed for the sequential probability ratio test (SPRT) for the purpose of practical application; a key quality feature, the relative efficiency, represents the ratio of the test's weighted average sample number until stopping to its counterpart for the nontruncated SPRT.
Abstract: A planning methodology is proposed for the sequential probability ratio test (SPRT) for the purpose of practical application. The SPRT is the most common acceptance test in the field of reliability and quality control. In it, the hypothesis is checked that the percentage of defective items does not exceed a specified value. Truncation is resorted to in order to compensate for the absence of a limit on the test duration, but it complicates the planning process. Moreover, the discreteness and multidimensionality of the characteristics of such tests prevent their direct comparison and optimization. To remedy these drawbacks, quality features of the test are proposed, one of which, the relative efficiency, represents the ratio of the test's weighted average sample number until stopping to its counterpart for the nontruncated SPRT. It facilitates solution of the problems in automatic planning of the test. Another important advantage of this relative efficiency is that it yields accurate and simple formulas for the stopping boundary. Besides, these formulas permit a sound choice of the truncation level already at early stages of the planning process. A planner's algorithm and an industrial example are also included. The proposed methodology can also be applied to the exponential SPRT. The advantages of tests based on the proposed methodology over those in IEC-61123 (the binomial case) and IEC-61124 (the exponential case) are demonstrated, and revision of the standards is recommended. Copyright © 2012 John Wiley & Sons, Ltd.
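
The paper's optimized truncation rule is not reproduced here; as a baseline, the standard nontruncated Wald boundaries of a binomial SPRT can be tabulated as acceptance and rejection numbers per sample size, with illustrative quality levels and risks:

```python
import math

def binomial_sprt_boundaries(n_max, p0, p1, alpha=0.05, beta=0.10):
    """Acceptance/rejection numbers of a (nontruncated) Wald binomial SPRT.

    p0 = acceptable fraction defective, p1 = rejectable fraction defective.
    Returns a list of (n, accept_if_defectives_leq, reject_if_defectives_geq),
    with None where no decision of that kind is yet possible.
    """
    A = math.log((1 - beta) / alpha)
    B = math.log(beta / (1 - alpha))
    g1 = math.log(p1 / p0)
    g0 = math.log((1 - p0) / (1 - p1))
    s = g0 / (g1 + g0)          # common slope of both decision lines
    h_acc = -B / (g1 + g0)      # intercept of the acceptance line
    h_rej = A / (g1 + g0)       # intercept of the rejection line
    rows = []
    for n in range(1, n_max + 1):
        acc = math.floor(n * s - h_acc)   # accept the lot if defectives <= acc
        rej = math.ceil(n * s + h_rej)    # reject the lot if defectives >= rej
        rows.append((n, acc if acc >= 0 else None, rej if rej <= n else None))
    return rows

for n, acc, rej in binomial_sprt_boundaries(60, p0=0.01, p1=0.05)[9::10]:
    print(f"n={n:3d}  accept if d <= {acc}   reject if d >= {rej}")
```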

Proceedings ArticleDOI
26 May 2013
TL;DR: A generalized sequential probability ratio test for composite hypotheses is presented wherein the thresholds are updated in an adaptive manner, based on the data recorded up to the current sample, using the parametric bootstrap; the resulting test avoids the asymptotic assumption usually made in earlier works.
Abstract: We present a generalized sequential probability ratio test for composite hypotheses wherein the thresholds are updated in an adaptive manner based on the data recorded up to the current sample using the parametric bootstrap. The resulting test avoids the asymptotic assumption usually made in earlier works. The increase of the average sample number of the proposed method is not significant compared to the sequential probability ratio test which is based on known parameters, especially in a low SNR region. In addition, the probability of false alarm and the probability of missed detection are maintained below the preset values. A comparison shows that the thresholds based on the parametric bootstrap are in close agreement with the thresholds based on Monte-Carlo simulations.

Journal Article
TL;DR: In this paper, the authors examined whether the unidimensional sequential probability ratio test (SPRT) can be productively combined with multidimensional adaptive testing (MAT) and concluded that MAT will result in a higher percentage of correct classifications than UCAT when more than two dimensions are measured.
Abstract: It is examined whether the unidimensional Sequential Probability Ratio Test (SPRT) can be productively combined with multidimensional adaptive testing (MAT). With a simulation study, it is investigated whether this combination results in more accurate simultaneous classifications on two or three dimensions compared to several instances of unidimensional adaptive testing (UCAT) in combination with SPRT. The number of cut scores and the correlation between the dimensions measured were varied. The average test length was mainly influenced by the number of cut scores (one, four) and the adaptive algorithm (MAT, UCAT). With MAT, a lower average test length was achieved in comparison to the UCAT. It is concluded that MAT will result in a higher percentage of correct classifications than UCAT when more than two dimensions are measured. Key words: classification, computerized adaptive testing, item response theory, multidimensional adaptive testing, sequential probability ratio test. Multidimensional adaptive testing (MAT) is a special approach to the assessment of two or more latent abilities in which the selection of the test items presented to the examinee is based on the responses given by the examinee to previously administered items (e.g., Frey & Seitz, 2009). The main advantage of MAT is its capacity to substantially increase measurement efficiency compared to sequential testing or unidimensional computerized adaptive testing (UCAT). Most of the studies on MAT focus on its application for assessing individual abilities located on continuous scales. Currently, only very little is known about the capabilities of MAT regarding the classification of test takers into one of several ability categories (e.g., pass vs. fail). To fill this gap, the present paper focuses on the combination of MAT with the sequential probability ratio test (SPRT; e.g., Kingsbury & Weiss, 1983; Reckase, 1983). The SPRT is a classification method that has already been used successfully in combination with UCAT (e.g., Eggen, 1999; Eggen & Straetmans, 2000; Spray & Reckase, 1996; Thompson, 2007b). Regarding MAT, Spray, Abdel-fattah, Huang, and Lau (1997) made an attempt to modify the SPRT in order to use it with MAT based on items with within-item multidimensionality. Items with within-item multidimensionality are allowed to measure more than one dimension simultaneously (Wang, Wilson, & Adams, 1997). Dealing with within-item multidimensionality, the multidimensional item response theory (IRT) model used with MAT is a compensatory model (e.g., Reckase, 2009). With such an IRT model, the linear combination of the abilities measured leads to a curvilinear function. Therefore, the test statistic of the SPRT, which is a likelihood ratio test, cannot be updated by the two unique values required by the SPRT. For details, see Spray et al. (1997). Considering multidimensional pass-fail tests, Spray and colleagues did not find a satisfactory solution for implementing a multidimensional SPRT into such a MAT. Nevertheless, from a practical point of view, tests entailing items measuring exactly one dimension each (between-item multidimensionality) are much more common than tests based on an item pool with within-item multidimensionality. Hence, the present paper focuses on the combination of MAT and SPRT for items with between-item multidimensionality.
Note that when the MAT approach of Segall (1996) is used for items with between-item multidimensionality, information from items which measure one dimension is used as information about the person's score on other dimensions. This is done by incorporating assumptions about the multivariate ability distribution in terms of correlations between the measured dimensions. Several studies showed that using this information results in a substantial increase in measurement efficiency compared to using several unidimensional adaptive tests (e. …
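
For context, the unidimensional SPRT classification that the study builds on compares the likelihood of the observed response pattern at two ability values bracketing the cut score; a minimal Rasch-model sketch (not the multidimensional adaptive procedure, and with made-up item and examinee parameters) is shown below.

```python
import math
import random

def rasch_prob(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def sprt_classify(responses, difficulties, cut, delta=0.3, alpha=0.05, beta=0.05):
    """Unidimensional SPRT pass/fail classification around a cut score.

    H0: theta = cut - delta (fail)     H1: theta = cut + delta (pass)
    responses[i] in {0, 1}; difficulties[i] is the Rasch difficulty of item i.
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for n, (u, b) in enumerate(zip(responses, difficulties), start=1):
        p1, p0 = rasch_prob(cut + delta, b), rasch_prob(cut - delta, b)
        llr += math.log(p1 / p0) if u == 1 else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "pass", n
        if llr <= lower:
            return "fail", n
    return "undecided", len(responses)

# Illustrative use: an examinee with true ability 0.8, cut score at 0.0
random.seed(4)
true_theta = 0.8
items = [random.uniform(-1.5, 1.5) for _ in range(60)]
resp = [1 if random.random() < rasch_prob(true_theta, b) else 0 for b in items]
print(sprt_classify(resp, items, cut=0.0))
```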

Proceedings ArticleDOI
24 Oct 2013
TL;DR: The sequential probability ratio test (SPRT) method can decrease the test sample size while giving almost the same operating characteristic as the classical method based on the binomial distribution; the results show that the test sample size is markedly reduced compared with the classical method.
Abstract: Testability plays an important role in the readiness of equipment, as a good design for testability (DFT) can greatly decrease the fault detection and isolation time, which accelerates maintenance actions. Testability verification is a procedure to check whether testability indexes such as the fault detection rate (FDR) and fault isolation rate (FIR) meet the requirements in the contract. Currently, the standards and statistical methods used in testability verification suffer from problems such as large sample sizes and long test periods. The sequential probability ratio test (SPRT) method can decrease the test sample size while giving almost the same operating characteristic as the classical method based on the binomial distribution. The SPRT method and its truncation rules are introduced, and the spectrum of the expected test number is proposed. Then, the sample size allocation method and the failure mode selection method based on failure rate, as used in sequential testability verification, are illustrated. Testability verification of a control system is implemented with the given method and steps. Software named Testability Demonstration and Evaluation System (TDES), which can calculate the decision criteria, plot the decision chart, select failure modes, and make judgments, is used to assist the test. The results show that the test sample size is markedly reduced compared with the classical method.

Proceedings ArticleDOI
19 Dec 2013
TL;DR: In this article, the authors discuss the challenging issues in virtual-sensing, introduce and ultimately combine the Hidden Markov Model and the Edge-based methods, and the resulting solution, based on a Multiple-hypothesis Sequential Probability Ratio Test, combines the advantages of the two methods and delivers significant improvement in disaggregation performance.
Abstract: Virtual-Sensing, which is achieved through the disaggregation of composite power metering signals, is a solution towards achieving fine-grained smart power monitoring. In this work we discuss the challenging issues in Virtual-Sensing, introduce and ultimately combine the Hidden Markov Model and the Edge-based methods. The resulting solution, based on a Multiple-hypothesis Sequential Probability Ratio Test, combines the advantages of the two methods and delivers significant improvement in disaggregation performance. A robust version of the test is also proposed to filter the impulse noise common in real-time monitoring of the plug-in loads power consumption.
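
The appliance models, edge detection, and robust filtering are not reproduced here; a generic multiple-hypothesis SPRT of the kind the abstract refers to can be sketched as follows, with an illustrative Gaussian observation model (candidate power levels) and an illustrative stopping margin.

```python
import numpy as np

def msprt(samples, level_means, sigma, margin=6.0):
    """Multiple-hypothesis SPRT (schematic): hypothesis k says samples ~ N(level_means[k], sigma^2).

    Accumulate each hypothesis's log-likelihood and stop as soon as the leading
    hypothesis is ahead of the runner-up by `margin` nats.
    Returns (winning hypothesis index or None, samples used).
    """
    means = np.asarray(level_means, dtype=float)
    loglik = np.zeros(len(means))
    for n, x in enumerate(samples, start=1):
        loglik += -0.5 * ((x - means) / sigma) ** 2   # common terms cancel in comparisons
        order = np.argsort(loglik)
        best, runner_up = order[-1], order[-2]
        if loglik[best] - loglik[runner_up] >= margin:
            return int(best), n
    return None, len(samples)

# Illustrative use: decide which of three power levels (watts) a plug load is drawing
rng = np.random.default_rng(5)
levels = [0.0, 60.0, 150.0]                 # made-up appliance power levels
obs = rng.normal(60.0, 20.0, size=200)      # noisy readings around 60 W
print(msprt(obs, levels, sigma=20.0))
```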

Proceedings ArticleDOI
19 Aug 2013
TL;DR: In this paper, the authors proposed a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions, under the assumption that prediction errors are Gaussian.
Abstract: We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.

01 Jan 2013
TL;DR: A joint performance measure is proposed for JDE algorithms in dynamic problems where data are made available sequentially, and the performance of the SPRT is analyzed without the requirement of identically distributed log-likelihood ratios, which is too stringent for many applications.
Abstract: This dissertation mainly consists of three parts. The first part proposes generalized linear minimum mean-square error (GLMMSE) estimation for nonlinear point estimation. The second part proposes a recursive joint decision and estimation (RJDE) algorithm for joint decision and estimation (JDE). The third part analyzes the performance of the sequential probability ratio test (SPRT) when the log-likelihood ratios (LLRs) are independent but not identically distributed. The linear minimum mean-square error (LMMSE) estimation plays an important role in nonlinear estimation. It searches for the best estimator in the set of all estimators that are linear in the measurement. A GLMMSE estimation framework is proposed in this dissertation. It employs a vector-valued measurement transform function (MTF) and finds the best estimator among all estimators that are linear in the MTF. Several design guidelines for the MTF, based on a numerical example, are provided. A RJDE algorithm based on a generalized Bayes risk is proposed in this dissertation for dynamic JDE problems. It is computationally efficient for dynamic problems where data are made available sequentially. Further, since existing performance measures for estimation or decision alone are not effective for evaluating JDE algorithms, a joint performance measure is proposed for JDE algorithms in dynamic problems. The RJDE algorithm is demonstrated by applications to joint tracking and classification as well as joint tracking and detection in target tracking. The characteristics and performance of the SPRT are described by two important functions: the operating characteristic (OC) and the average sample number (ASN). These two functions have been studied extensively under the assumption of independent and identically distributed (i.i.d.) LLRs, which is too stringent for many applications. This dissertation relaxes the requirement of identical distribution. Two inductive equations governing the OC and ASN are developed. Unfortunately, they have non-unique solutions in the general case. They do have unique solutions in two special cases: (a) the LLR sequence converges in distribution, and (b) the LLR sequence has periodic distributions. Further, the analysis can be readily extended to evaluate the performance of the truncated SPRT and the cumulative sum test.

Journal ArticleDOI
TL;DR: With equivalent parameters, the R-SPRT and LC-CUSUM formulations of sequential tests produced different outcomes, demonstrating that the choice of test method, as well as the choice of parameters, is important in designing a training scheme.
Abstract: Objective To assess clinical measurement competency by two sequential test formulations [resetting sequential probability ratio test (R-SPRT) and learning curve cumulative summation (LC-CUSUM)]. Design Numerical simulation and retrospective observational study. Setting Obstetric ultrasound department. Participants Cohorts of 10 000 simulated trainees and 62 obstetric sonographers training in nuchal translucency (NT) measurement at the 11–14-week pregnancy scan with limited case availability. Intervention Application of LC-CUSUM and R-SPRT to clinical measurement training. Main Outcome Measures Proportions of real trainees achieving competency by LC-CUSUM and R-SPRT, proportions of simulated competent trainees not achieving competency (Type I error), proportions of simulated incompetent trainees achieving competency (Type II error), distribution of case number required to achieve competency (run length) and frequency of resets. Results For simulated cohorts, significant differences in run-length distribution and true test error rates were found between the R-SPRT and LC-CUSUM tests with equivalent parameters. Increasing the cases available to each trainee reduced the Type I error rate but increased the Type II error rate for both sequential tests for all choices of unacceptable failure rate. Discontinuities in the proportion of trainees expected to be test competent were found at critical values of unacceptable failure rate. Conclusions With equivalent parameters, the R-SPRT and LC-CUSUM formulations of sequential tests produced different outcomes, demonstrating that the choice of test method, as well as the choice of parameters, is important in designing a training scheme. The R-SPRT detects incompetence as well as competence and may indicate need for further training. Simulations are valuable in estimating the proportions of trainees expected to be assessed as competent.
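
For Bernoulli pass/fail attempts (an NT measurement within tolerance counting as a success), the two formulations compared in the study can be sketched as follows; the failure rates and decision limits are illustrative and are not the values used in the paper.

```python
import math

def llr_attempt(is_failure, p_fail_not_competent=0.20, p_fail_competent=0.05):
    """Log-likelihood ratio of 'competent' over 'not yet competent' for one attempt."""
    if is_failure:
        return math.log(p_fail_competent / p_fail_not_competent)
    return math.log((1 - p_fail_competent) / (1 - p_fail_not_competent))

def lc_cusum(failures, h=2.0):
    """LC-CUSUM: signals once the record supports competence (one-sided, clipped at 0)."""
    s = 0.0
    for n, f in enumerate(failures, start=1):
        s = max(0.0, s + llr_attempt(f))
        if s >= h:
            return "competent", n
    return "not signalled", len(failures)

def resetting_sprt(failures, h_comp=2.0, h_incomp=-2.0):
    """R-SPRT: signals competence at the upper bound, continued incompetence at the
    lower bound, and resets the statistic after every signal."""
    s, signals = 0.0, []
    for n, f in enumerate(failures, start=1):
        s += llr_attempt(f)
        if s >= h_comp:
            signals.append(("competent", n))
            s = 0.0
        elif s <= h_incomp:
            signals.append(("needs more training", n))
            s = 0.0
    return signals

record = [True] * 3 + [False] * 30   # three early failures, then a run of successes
print(lc_cusum(record))
print(resetting_sprt(record))
```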

Journal ArticleDOI
TL;DR: In this paper, the Wald sequential probability ratio test (SPRT) of two simple hypotheses regarding the Lévy-Khintchine triplet of a wide family of Lévy processes is analyzed.
Abstract: The Wald sequential probability ratio test (SPRT) of two simple hypotheses regarding the Lévy-Khintchine triplet of a wide family of Lévy processes is analyzed: we concentrate on continuous-path and pure increasing jump Lévy processes. Appealing to the theory of Markov processes, we employ a general method for determining the stopping boundaries and the expected length of the SPRT for a given admissible pair (α, β) of error probabilities. The well-known results of Wiener and Poisson sequential testing can be derived accordingly. The explicit solution for the SPRT of two simple hypotheses about the parameter p ∈ (0, 1) of a Lévy negative binomial process is shown.

Journal ArticleDOI
TL;DR: This work proposes a distribution function constraint along with an empirical likelihood ratio test that significantly outperforms the robust Kolmogorov-Smirnov test and the Cramér-von Mises test when the null hypothesis is nested in the alternative hypothesis.
Abstract: In this work, we study a non-parametric hypothesis testing problem with distribution function constraints. The empirical likelihood ratio test has been widely used in testing problems with moment (in)equality constraints. However, some detection problems cannot be described using moment (in)equalities. We propose a distribution function constraint along with an empirical likelihood ratio test. This detector is applicable to a wide variety of robust parametric/non-parametric detection problems. Since the distribution function constraints provide a more exact description of the null hypothesis, the test outperforms the empirical likelihood ratio test with moment constraints as well as many popular goodness-of-fit tests, such as the robust Kolmogorov-Smirnov test and the Cramér-von Mises test. Examples from communication systems with real-world noise samples are provided to show their performance. Specifically, the proposed test significantly outperforms the robust Kolmogorov-Smirnov test and the Cramér-von Mises test when the null hypothesis is nested in the alternative hypothesis. The same example is repeated when we assume no noise uncertainty. By doing so, we are able to claim that in our case, it is necessary to include uncertainty in the noise distribution. Additionally, the asymptotic optimality of the proposed test is provided.

Journal ArticleDOI
TL;DR: The method does not rely on manual tuning and tweaking of parameters but is completely self-contained and mathematically justified; it is also computationally simple and scalable, desirable features that would make it a component of choice in larger, autonomic frameworks.

Journal Article
TL;DR: This work shows how the k-nearest neighbor classification algorithm in machine learning can be utilized as a mathematical framework to derive a variety of novel sequential sampling models, and proposes a common mathematical framework combining these methods and providing a systematic explanation for understanding different methods.

Proceedings Article
26 Jul 2013
TL;DR: This article presents a verification experiment on actuator fault detection for the autonomous GNC system of deep-space probes; as the simulation results show, the sequential probability ratio detection method can detect the fault signal in real time.
Abstract: This paper proposes a fault diagnosis method for satellite actuators based on a parametric sequential probability ratio test. The sequential probability ratio test is a hypothesis test grounded in statistical learning; its greatest advantage is that it does not require the number of observed sample groups to be fixed in advance, but instead compares the test statistic for each hypothesis against preset threshold values. First, the hypotheses are formulated and the thresholds are derived from the given false alarm rate and missed alarm rate. Second, the characteristic parameters of the signal to be inspected are extracted. Finally, the likelihood values are calculated and compared with the thresholds, so that the presence of a fault signal can be judged. A verification experiment is carried out on actuator fault detection for the autonomous GNC system of deep-space probes; the simulation results show that the sequential probability ratio detection method can detect the fault signal in real time. Compared with a fault diagnosis method based on a state observer, this method greatly reduces the average fault inspection time.

Journal ArticleDOI
TL;DR: This study proposes a Sequential Energy Detection scheme to reduce the average required sample number and sensing time for spectrum sensing in the low signal-to-noise ratio regime while retaining the high sample efficiency of the sequential probability ratio test.
Abstract: Reliable and swift spectrum sensing is a crucial technical challenge of cognitive radio. This study proposes a Sequential Energy Detection (SED) scheme to reduce the average required sample number and sensing time for spectrum sensing in the low signal-to-noise ratio regime. In the scheme, the data samples are first grouped into data blocks, and the Sequential Probability Ratio Test (SPRT) uses the energies of the data blocks as the test statistics. The resulting detection rule is simple to implement and to analyze and retains the high sample efficiency of the sequential probability ratio test. The detection performance in terms of Average Sample Number (ASN) is evaluated theoretically. Simulation results are provided to verify the theoretical analysis.
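
A sketch of the block-energy idea, assuming zero-mean Gaussian noise and a Gaussian primary signal so that block energies are scaled chi-square variables; the block length, error targets, and SNR are illustrative.

```python
import numpy as np

def sequential_energy_detection(samples, block_len, noise_var, busy_var,
                                alpha=0.05, beta=0.05):
    """SPRT on block energies (schematic sequential energy detector).

    H0: samples ~ N(0, noise_var)   (channel idle)
    H1: samples ~ N(0, busy_var)    (primary user present; busy_var = noise + signal power)
    The energy of each block of block_len samples is the test statistic.
    """
    upper = np.log((1 - beta) / alpha)
    lower = np.log(beta / (1 - alpha))
    const = 0.5 * block_len * np.log(noise_var / busy_var)
    coeff = 0.5 * (1.0 / noise_var - 1.0 / busy_var)
    llr, used = 0.0, 0
    for start in range(0, len(samples) - block_len + 1, block_len):
        energy = np.sum(samples[start:start + block_len] ** 2)
        llr += const + coeff * energy
        used += block_len
        if llr >= upper:
            return "occupied", used
        if llr <= lower:
            return "idle", used
    return "undecided", used

# Low-SNR illustration: signal power is one tenth of the noise power (-10 dB)
rng = np.random.default_rng(6)
noise_var, snr = 1.0, 0.1
occupied = rng.normal(0.0, np.sqrt(noise_var * (1 + snr)), size=20000)
print(sequential_energy_detection(occupied, block_len=50,
                                  noise_var=noise_var, busy_var=noise_var * (1 + snr)))
```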

Proceedings ArticleDOI
01 Dec 2013
TL;DR: This work proposes two classifiers: an estimate-then-classify classifier, and a modified MSPRT classifier based on the average likelihood function considering partial knowledge of the PU traffic parameters; both can achieve higher classification performance than the traditional maximum likelihood classifier that uses a constant number of samples.
Abstract: We propose a primary user (PU) traffic distribution classifier for dynamic spectrum access networks based on the multi-hypothesis sequential probability ratio test (MSPRT). Specifically, we propose two classifiers: (i) an estimate-then-classify classifier, and (ii) a modified MSPRT classifier based on the average likelihood function considering partial knowledge of the PU traffic parameters. Using the sequential algorithm, we show that our proposed classifiers can achieve higher classification performance compared to the traditional maximum likelihood classifier using a constant number of samples.

01 Jan 2013
TL;DR: In this article, an adaptive nonparametric kernel density estimator derived from an in-control sample of observations is combined with a smoothed bootstrap algorithm, enabling the CUSUM to work effectively for reasonably sized sets of in-control data.
Abstract: Cumulative sum (CUSUM) algorithms are used for monitoring in various applications, including manufacturing, network monitoring, financial markets, biosurveillance, and many more. A popular CUSUM technique for detecting a change in the in-control distribution of an independent data sequence is based on repeated use of the sequential probability ratio test (SPRT). Some optimality results have been derived for the SPRT-based CUSUM when the in-control and out-of-control distributions are fully known. We introduce an approximation formula for the threshold value of an SPRT-based CUSUM. Limited research has been performed on CUSUM techniques when the distributions are not fully specified. This research is concerned with how to use the CUSUM when the underlying in-control distribution is arbitrary and unknown, and the out-of-control density is either an additive or a multiplicative transformation of the in-control density. The proposed solution combines an adaptive nonparametric kernel density estimator derived from an in-control sample of observations with a smoothed bootstrap algorithm that enables the CUSUM to work effectively for reasonably sized sets of in-control data.
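
For the fully specified case (both densities known) the SPRT-based CUSUM reduces to the familiar clipped recursion below; the adaptive kernel-density and smoothed-bootstrap machinery that the abstract describes for the unknown-distribution case is not reproduced here, and the mean-shift example is illustrative.

```python
import numpy as np

def cusum_repeated_sprt(data, llr_increment, threshold):
    """SPRT-based CUSUM: accumulate the out-of-control vs in-control log-likelihood
    ratio, clip at zero (equivalent to restarting the SPRT whenever it would accept
    'in control'), and alarm when the statistic crosses `threshold`.
    Returns the index of the first alarm, or None if no alarm is raised."""
    s = 0.0
    for t, x in enumerate(data):
        s = max(0.0, s + llr_increment(x))
        if s >= threshold:
            return t
    return None

# Illustrative use: detect a shift from N(0, 1) to N(1, 1) occurring at index 300
rng = np.random.default_rng(7)
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(1, 1, 300)])
unit_shift_llr = lambda x: x - 0.5   # LLR increment for a unit mean shift, unit variance
print(cusum_repeated_sprt(data, unit_shift_llr, threshold=5.0))
```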

Posted Content
TL;DR: Wald's sequential probability ratio test for deciding whether a sequence of independent and identically distributed observations comes from a specified phase-type distribution or from an exponentially tilted alternative distribution is considered.
Abstract: We consider Wald's sequential probability ratio test for deciding whether a sequence of independent and identically distributed observations comes from a specified phase-type distribution or from an exponentially tilted alternative distribution. In this setting, we derive exact decision boundaries for given Type I and Type II errors by establishing a link with ruin theory. Information on the mean sample size of the test can be retrieved as well. The approach relies on the use of matrix-valued scale functions associated to a certain one-sided Markov additive process. By suitable transformations the results also apply to other types of distributions including some distributions with regularly varying tail.

Journal ArticleDOI
TL;DR: In this article, one-sided cumulative sum (CUSUM) control charts are constructed for controlling the parameters of a random variable with an Erlang-truncated exponential distribution; the rejection of Wald's sequential probability ratio test (SPRT) is viewed as defining the decision lines of a CUSUM control chart for which the variate is a quality characteristic.
Abstract: In this article, we construct one-sided cumulative sum (CUSUM) control charts for controlling the parameters of a random variable with an Erlang-truncated exponential distribution. The rejection of Wald's sequential probability ratio test (SPRT) is viewed as the decision lines of a CUSUM control chart for which the variate is a quality characteristic. Parameters of the CUSUM chart, e.g. the lead distance and mask angle, are presented. The results show that the Average Run Length (ARL) of the resulting control charts changes substantially for a slight shift in the parameters of the distribution.