
Showing papers on "Sequential probability ratio test published in 2019"


Journal ArticleDOI
TL;DR: To mitigate sensitivity to synchronization errors, a decision feedback equalizer, sequential probability ratio test (DFE-SPRT) based receiver and an approximate maximum-likelihood delay estimator are derived.
Abstract: Molecular communication has been proposed as a signaling mechanism to enable communication between nano-machines via molecular diffusion. Precise synchronization of transmitters and receivers is particularly challenging in diffusive molecular communication environments due to the propagation characteristics. To mitigate sensitivity to synchronization errors, a decision feedback equalizer, sequential probability ratio test (DFE-SPRT) based receiver and an approximate maximum-likelihood delay estimator are derived. The performance of the delay estimator is shown to be close to a Cramér-Rao bound at low transmission rates. Numerical results show that the DFE-SPRT makes a decision well before the full symbol interval is exhausted; the desirability of this effect is theoretically justified through the adaptation of the analysis of the log-loss function for mismatched Poisson random variables. Performance bounds on the DFE-SPRT are employed to design optimized modulation for the mis-synchronized communication channel. The proposed receiver and modulation designs achieve strongly improved asynchronous detection performance for the same data rate relative to recently proposed receiver designs that also exploit decision feedback.

15 citations
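Several of the entries on this page rely on the same core mechanism: Wald's sequential probability ratio test accumulates a log-likelihood ratio and stops as soon as it crosses one of two error-controlled boundaries, which is why decisions often come well before a fixed sample would be exhausted. A minimal, generic sketch for Poisson observations (as in the receiver above); the rates, thresholds, and function name are illustrative, not the authors' DFE-SPRT:

```python
import math

def sprt_poisson(samples, lam0, lam1, alpha=0.01, beta=0.01):
    """Wald SPRT deciding between Poisson rates lam0 (H0) and lam1 (H1).

    Returns (decision, n_used): decision is "H0", "H1", or None if the
    sample stream ran out before either boundary was crossed.
    """
    a = math.log(beta / (1 - alpha))   # lower (accept-H0) boundary
    b = math.log((1 - beta) / alpha)   # upper (accept-H1) boundary
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # log-likelihood-ratio increment for one Poisson count x
        llr += x * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr <= a:
            return "H0", n
        if llr >= b:
            return "H1", n
    return None, len(samples)

# Counts that look like rate-5 data: the test stops early.
decision, n = sprt_poisson([4, 6, 5, 5, 4, 6], lam0=2.0, lam1=5.0)
# -> ("H1", 3): decided after 3 of the 6 available observations
```

With alpha = beta = 0.01 the boundaries are roughly ±4.6 nats, so a handful of informative counts suffices; this is the "decision well before the full symbol interval" effect the abstract describes.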


Journal ArticleDOI
TL;DR: Three localization-based detection approaches are proposed and extensive results illustrate that SRD outperforms TriRSD and ROSD-RSD, and other existing detection algorithms based on the sequential probability ratio test and maximum likelihood estimation in terms of both false alarm and detection rates.
Abstract: Radiation source detection is an important problem in homeland security-related applications. Deploying a network of detectors is expected to provide improved detection due to the combined, albeit dispersed, capture area of multiple detectors. Recently, localization-based detection algorithms provided performance gains beyond the simple “aggregated” area as a result of localization being enabled by the networked detectors. We propose the following three localization-based detection approaches: 1) source-attractor radiation detection (SRD); 2) triangulation-based radiation source detection (TriRSD); and 3) the ratio of square distance-based radiation source detection (ROSD-RSD). We use canonical datasets from Domestic Nuclear Detection Office's intelligence radiation sensors systems tests to assess the performance of these methods. Extensive results illustrate that SRD outperforms TriRSD and ROSD-RSD, and other existing detection algorithms based on the sequential probability ratio test and maximum likelihood estimation in terms of both false alarm and detection rates.

15 citations


Journal ArticleDOI
26 Jun 2019-Sensors
TL;DR: This paper defines a survival probability which is dependent on target state, and labels individual extracted targets and corresponding particles, and transforms the target confirmation problem into a hypothesis test problem, and utilizes sequential probability ratio test to distinguish real targets and false alarms in real time.
Abstract: Radar target detection probability will decrease as the target echo signal-to-noise ratio (SNR) decreases, which has an adverse influence on the result of multi-target tracking. The performances of standard multi-target tracking algorithms degrade significantly under low detection probability in practice, especially when continuous miss detection occurs. Based on sequential Monte Carlo implementation of Probability Hypothesis Density (PHD) filter, this paper proposes a heuristic method called the Refined PHD (R-PHD) filter to improve multi-target tracking performance under low detection probability. In detail, this paper defines a survival probability which is dependent on target state, and labels individual extracted targets and corresponding particles. When miss detection occurs due to low detection probability, posterior particle weights will be revised according to the prediction step. Finally, we transform the target confirmation problem into a hypothesis test problem, and utilize sequential probability ratio test to distinguish real targets and false alarms in real time. Computer simulations with respect to different detection probabilities, average numbers of false alarms and continuous miss detection durations are provided to corroborate the superiority of the proposed method, compared with standard PHD filter, Cardinalized PHD (CPHD) filter and Cardinality Balanced Multi-target Multi-Bernoulli (CBMeMBer) filter.

14 citations
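The target-confirmation step above, distinguishing real targets from false alarms in real time, can be illustrated with a plain Bernoulli SPRT over a hit/miss sequence. The detection probabilities and error levels below are assumed for illustration and are not the paper's exact statistic:

```python
import math

def confirm_track(hits, pd_true=0.8, pd_false=0.2, alpha=0.05, beta=0.05):
    """Bernoulli SPRT for confirming a tentative track.

    H1: detections occur with probability pd_true (real target)
    H0: detections occur with probability pd_false (false alarm)
    Returns ("confirmed"/"deleted"/"tentative", scans used).
    """
    lo = math.log(beta / (1 - alpha))
    hi = math.log((1 - beta) / alpha)
    llr = 0.0
    for n, hit in enumerate(hits, start=1):
        if hit:
            llr += math.log(pd_true / pd_false)
        else:
            llr += math.log((1 - pd_true) / (1 - pd_false))
        if llr >= hi:
            return "confirmed", n   # real target
        if llr <= lo:
            return "deleted", n     # false alarm
    return "tentative", len(hits)

print(confirm_track([1, 1, 0, 1, 1, 1]))  # ('confirmed', 5)
print(confirm_track([0, 0, 0]))           # ('deleted', 3)
```

A single miss (as in the first sequence) only delays confirmation rather than killing the track, which is the behaviour desired under low detection probability.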


Journal ArticleDOI
15 Oct 2019
TL;DR: A modification of the WaldBoost based on iterative refinement of the decision boundaries, which can significantly reduce the number of used weak classifiers required for pattern recognition with a given accuracy is proposed.
Abstract: The implementation of the WaldBoost algorithm is considered, and a modification is proposed that significantly reduces the number of weak classifiers needed to achieve a given classification accuracy. The paper studies modifications of compositions (ensembles) of algorithms for solving real-time pattern recognition problems. The aim of the study is to improve known machine learning algorithms so that pattern recognition uses a minimum amount of time (the minimum number of classifiers) while meeting a given accuracy of the results. We consider the implementation of the WaldBoost algorithm, which combines two algorithms: adaptive boosting of weak classifiers, AdaBoost, which has a high generalizing ability, and the sequential probability ratio test, SPRT (Wald test), which is the optimal decision rule for distinguishing two hypotheses. It is noted that when using WaldBoost, the actual probabilities of classification error are, as a rule, smaller than specified because of the approximate boundaries of the SPRT, so the classification process uses an excessive series of weak classifiers. In this regard, we propose a modification of WaldBoost based on iterative refinement of the decision boundaries, which can significantly reduce the number of weak classifiers required for pattern recognition with a given accuracy. The efficiency of the proposed algorithm is shown by specific examples, and the results are confirmed by statistical modeling on several data sets. The results can also be applied to refine other cascade classification algorithms.

7 citations
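The WaldBoost idea, a boosted cascade with SPRT-style early exits, reduces to accumulating the weak classifiers' signed responses and stopping as soon as the running sum leaves a two-sided band. A schematic sketch, where theta_a/theta_b stand in for Wald's trained boundaries and the weak-classifier scores are assumed to be given:

```python
def waldboost_classify(scores, theta_a, theta_b):
    """WaldBoost-style cascade decision.

    `scores` are the signed responses h_t(x) of the boosted weak
    classifiers in evaluation order.  The running sum is compared
    against SPRT-style boundaries after every stage, so easy inputs
    exit after a few stages instead of evaluating the full ensemble.
    Returns (label, stages_used); label is None if no boundary was
    crossed before the stages ran out.
    """
    total = 0.0
    for t, h in enumerate(scores, start=1):
        total += h
        if total >= theta_b:
            return +1, t   # accept early (e.g. "object present")
        if total <= theta_a:
            return -1, t   # reject early
    return None, len(scores)

print(waldboost_classify([0.9, 0.7, 1.1], theta_a=-2.0, theta_b=2.0))
# (1, 3) -- accepted after three stages
print(waldboost_classify([-1.5, -0.8, -0.4], theta_a=-2.0, theta_b=2.0))
# (-1, 2) -- rejected after two stages
```

The modification the paper proposes amounts to iteratively tightening theta_a/theta_b so the realized error rates match the specified ones, which shortens the average cascade length.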


Journal ArticleDOI
TL;DR: This paper demonstrates that CMaxSPRT can be performed under nonflat thresholds too, and offers a rule of thumb for establishing the best shape of the signaling threshold in the sense of minimizing expected time to signal and expected sample size.
Abstract: Sequential analysis hypothesis testing is now an important tool for postmarket drug and vaccine safety surveillance. When the number of adverse events accruing in time is assumed to follow a Poisson distribution, and if the baseline Poisson rate is assessed only with uncertainty, the conditional maximized sequential probability ratio test, CMaxSPRT, is a formal solution. CMaxSPRT is based on comparing monitored data with historical matched data, and it was primarily developed under a flat signaling threshold. This paper demonstrates that CMaxSPRT can be performed under nonflat thresholds too. We pose the discussion in the light of the alpha spending approach. In addition, we offer a rule of thumb for establishing the best shape of the signaling threshold in the sense of minimizing expected time to signal and expected sample size. An example involving surveillance for adverse events after influenza vaccination is used to illustrate the method.

5 citations


Journal ArticleDOI
30 Nov 2019-Sensors
TL;DR: A novel method for integrating the multiple hypothesis tracker with detection processing, where the detector acquires an adaptive detection threshold from the output of themultiple hypothesis tracker algorithm, and then the obtained detection threshold is employed to compute the score function and sequential probability ratio test threshold for the data association and track estimation tasks.
Abstract: In extant radar signal processing systems, detection and tracking are carried out independently, and detected measurements are utilized as inputs to the tracking procedure. Therefore, the tracking performance is highly associated with detection accuracy, and this performance may severely degrade when detections include a mass of false alarms and missed-targets errors, especially in dense clutter or closely-spaced trajectories scenarios. To deal with this issue, this paper proposes a novel method for integrating the multiple hypothesis tracker with detection processing. Specifically, the detector acquires an adaptive detection threshold from the output of the multiple hypothesis tracker algorithm, and then the obtained detection threshold is employed to compute the score function and sequential probability ratio test threshold for the data association and track estimation tasks. A comparative analysis of three tracking algorithms in a clutter dense scenario, including the proposed method, the multiple hypothesis tracker, and the global nearest neighbor algorithm, is conducted. Simulation results demonstrate that the proposed multiple hypothesis tracker integrated with detection processing method outperforms both the standard multiple hypothesis tracker algorithm and the global nearest neighbor algorithm in terms of tracking accuracy.

5 citations


Journal ArticleDOI
TL;DR: It is shown that the proposed truncated sequential algorithm, T-SeqRDT, requires even fewer assumptions on the signal model, while guaranteeing the error probabilities to be below pre-specified levels and at the same time makes a decision faster compared to its optimal fixed-sample-size counterpart, BlockRDT.
Abstract: In this paper, we propose a new algorithm for sequential non-parametric hypothesis testing based on Random Distortion Testing (RDT). The data-based approach is non-parametric in the sense that the underlying signal distributions under each hypothesis are assumed to be unknown. Our previously proposed non-truncated sequential algorithm, SeqRDT, was shown to achieve desired error probabilities under a few assumptions on the signal model. In this paper, we show that the proposed truncated sequential algorithm, T-SeqRDT, requires even fewer assumptions on the signal model, while guaranteeing the error probabilities to be below pre-specified levels and at the same time makes a decision faster compared to its optimal fixed-sample-size counterpart, BlockRDT. We derive bounds on the error probabilities and the average stopping times of the algorithm. Via numerical simulations, we compare the performance of T-SeqRDT with SeqRDT, BlockRDT, the sequential probability ratio test, and composite sequential probability ratio tests. We also show the robustness of the proposed approach compared with the standard likelihood ratio based approaches.

5 citations


Journal ArticleDOI
TL;DR: This paper derives exact critical values for CMaxSPRT, as well as statistical power and expected time to signal, for both continuous and group sequential analysis, and for different rejection boundaries.
Abstract: Sequential analysis is now commonly used for post-market drug and vaccine safety surveillance, and a Poisson stochastic process is typically used for rare adverse events. The conditional maximized ...

5 citations


Proceedings ArticleDOI
20 May 2019
TL;DR: The development of a sequential probability ratio test-based detector, which allows for additional observations in the presence of uncertainty due to mis-synchronisation at the receiver, and a modulation design which is optimised for this receiver strategy, are considered.
Abstract: Precise synchronisation of transmitters and receivers is particularly challenging in diffusive molecular communication environments. To this end, a point-to-point molecular communication system is examined wherein the design of the transceiver offers resilience to synchronisation errors. In particular, the development of a sequential probability ratio test-based detector, which allows for additional observations in the presence of uncertainty due to mis-synchronisation at the receiver, and a modulation design which is optimised for this receiver strategy, are considered. The structure of the probability of molecules hitting a receiver within a particular time slot is exploited. An approximate maximum log-likelihood estimator for the synchronisation error is derived and the Cramér-Rao bound (CRB) computed, to show that the performance of the proposed estimator is close to the CRB at low transmission rates. The proposed receiver and modulation designs achieve asynchronous detection performance improved by a factor of 3 to 5 on average over a decision-feedback-based receiver at the same data rate.

5 citations


Journal ArticleDOI
TL;DR: In this article, the target process is modeled by a multivariate state-space model which may be non-stationary, and the likelihood ratio method, the sequential probability ratio test and the Shiryaev-Roberts procedure are applied to derive control charts signaling a change from the supposed mean structure.
Abstract: In nearly all papers on process control for time-dependent data, it is assumed that the underlying target process is stationary. In the present paper, the target process is modeled by a multivariate state-space model which may be non-stationary. Our aim is to monitor its mean behavior. The likelihood ratio method, the sequential probability ratio test and the Shiryaev–Roberts procedure are applied to derive control charts signaling a change from the supposed mean structure. These procedures depend on certain reference values which have to be chosen by the practitioners. The corresponding generalized approaches are considered as well, and generalized control charts are determined for state-space processes. These schemes do not have further design parameters. In an extensive simulation study, the behavior of the introduced schemes is compared with each other using various performance criteria like the average run length, the average delay, the probability of a successful detection, and the probability of a false detection.

4 citations


Proceedings ArticleDOI
18 Nov 2019
TL;DR: This paper proposes a copula-based distributed sequential detection scheme that characterizes the spatial dependence in wireless sensor networks and shows the asymptotic optimality of the proposed copul-based sequential test.
Abstract: In this paper, we consider the problem of distributed sequential detection using wireless sensor networks (WSNs) in the presence of imperfect communication channels between the sensors and the fusion center (FC). We assume that sensor observations are spatially dependent. We propose a copula-based distributed sequential detection scheme that characterizes the spatial dependence. Specifically, each local sensor collects observations regarding the phenomenon of interest and forwards the information obtained to the FC over noisy channels. The FC fuses the received messages using a copula-based sequential test. Moreover, we show the asymptotic optimality of the proposed copula-based sequential test. Numerical experiments are conducted to demonstrate the effectiveness of our approach.

Proceedings ArticleDOI
01 Jan 2019
TL;DR: An improved CSS scheme, where each cognitive node employs the multi-taper sensing method on every collected observation vector and performs truncated sequential probability ratio test (T-SPRT) on MTM results, decreases sensing latency and improves throughput with high detection probability.
Abstract: The key technique of cognitive radio (CR) is spectrum sensing. Cooperative spectrum sensing (CSS) can improve the detection probability. However, in CSS for highly real-time Internet of Things (IoT) scenarios, the sensing and reporting periods cannot be neglected, which leads to high latency and low throughput. In this paper, we consider an improved CSS scheme in which each cognitive node employs the multi-taper sensing method (MTM) on every collected observation vector and performs a truncated sequential probability ratio test (T-SPRT) on the MTM results. Theoretical analysis and expressions for latency and throughput are derived based on the designed frame structure. Simulation results demonstrate that the proposed scheme decreases sensing latency and improves throughput while maintaining a high detection probability.

Proceedings ArticleDOI
02 May 2019
TL;DR: The paper compares the traditional method WSPRT (Wald Sequential Probability Ratio Test) with MaxSPRT in the verification process, and the effectiveness and rapidity of the proposed method are verified.
Abstract: Fault diagnosis technology of the temperature sensor on the general processing module (GPM) of integrated modular avionics (IMA) was studied. Through the accelerated life testing of the complex programmable logic device (CPLD) on GPM, the corresponding analytical relationship between the oscillator frequency and temperature was obtained, and the analytical redundancy model between temperature and oscillator frequency was constructed. The fault diagnosis algorithm is designed based on statistical hypothesis testing. The moving mean method is used in the alarm process. The paper compares the traditional method WSPRT (Wald Sequential Probability Ratio Test) with MaxSPRT in the verification process. For MaxSPRT, its hypothesis testing model of the normal distribution is deduced. The simulation model of sensor fault is designed. The effectiveness and rapidity of the proposed method are verified.

Posted Content
Zhenyu Zhao1, Mandie Liu1, Anirban Deb1
TL;DR: This paper proposes a methodology for rolling out features in an automated way using an adaptive experimental design and presents one monitoring algorithm and three ramping up algorithms including time-based, power- based, and risk-based (a Bayesian approach) schedules.
Abstract: During the rapid development cycle for Internet products (websites and mobile apps), new features are developed and rolled out to users constantly. Features with code defects or design flaws can cause outages and significant degradation of user experience. The traditional method of code review and change management can be time-consuming and error-prone. In order to make the feature rollout process safe and fast, this paper proposes a methodology for rolling out features in an automated way using an adaptive experimental design. Under this framework, a feature is gradually ramped up from a small proportion of users to a larger population based on real-time evaluation of the performance of important metrics. If any regression is detected during a ramp-up step, the ramp-up process stops and the feature developer is alerted. There are two main algorithm components powering this framework: 1) a continuous monitoring algorithm - using a variant of the sequential probability ratio test (SPRT) to monitor the feature performance metrics and alert feature developers when a metric degradation is detected, 2) an automated ramp-up algorithm - deciding when and how to ramp up to the next stage with a larger sample size. This paper presents one monitoring algorithm and three ramping-up algorithms including time-based, power-based, and risk-based (a Bayesian approach) schedules. These algorithms are evaluated and compared on both simulated data and real data. There are three benefits provided by this framework for feature rollout: 1) for defective features, it can detect regressions early and reduce their negative effects, 2) for healthy features, it rolls the feature out quickly, 3) it reduces the need for manual intervention by automating the feature rollout process.
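The continuous-monitoring component can be pictured as an SPRT running over a metric stream: "healthy" vs. "degraded" mean, with an alert raised when the degraded hypothesis is accepted. The sketch below is a textbook Gaussian-mean SPRT with illustrative parameters, not the paper's exact variant:

```python
import math

def monitor_metric(stream, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Sequentially monitor a Gaussian metric for degradation.

    H0: mean mu0 (healthy)  vs.  H1: mean mu1 (degraded), known sigma.
    Returns ("alert", n) when H1 is accepted, ("healthy", n) when H0
    is accepted, or ("monitoring", n) if no boundary was crossed yet.
    """
    a = math.log(beta / (1 - alpha))
    b = math.log((1 - beta) / alpha)
    llr = 0.0
    for n, x in enumerate(stream, start=1):
        # Gaussian log-likelihood-ratio increment for observation x
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2
        if llr >= b:
            return "alert", n
        if llr <= a:
            return "healthy", n
    return "monitoring", len(stream)

degraded = [1.2, 0.9, 1.5, 1.1, 1.3, 0.8, 1.0]
print(monitor_metric(degraded, mu0=0.0, mu1=1.0, sigma=1.0))
# ('alert', 5) -- the regression is flagged before the window ends
```

Because the boundaries control both error rates, the same loop either alerts early on a defective feature or clears a healthy one quickly, which is exactly the trade-off the rollout framework exploits.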

Proceedings ArticleDOI
01 Mar 2019
TL;DR: This paper introduces the signal processing model and basic inspection strategy of a nuclide recognition method based on statistical testing, proposes a Monte Carlo method for selecting the test threshold, and constructs a radionuclide characteristic gamma-ray identifier for 137Cs based on the Wald sequential probability ratio test.
Abstract: Radionuclide detection is a key step in the control of radioactive materials. Influenced by detector performance, ambient background noise and the data analysis model, fast identification of radionuclides is a great challenge for traditional gamma-ray spectrometry analysis methods. This paper introduces the signal processing model and basic inspection strategy of a nuclide recognition method based on statistical testing, proposes a Monte Carlo method for selecting the test threshold, and constructs a radionuclide characteristic gamma-ray identifier for 137Cs based on the Wald sequential probability ratio test. Finally, the detection performance of the identifier is analyzed through simulation experiments.

01 Jan 2019
TL;DR: In this paper, a study of web usage logs to verify whether it is possible to achieve good recognition rates in the task of distinguishing between human users and automated bots using computational intelligence techniques is presented.
Abstract: This work reports on a study of web usage logs to verify whether good recognition rates can be achieved in the task of distinguishing between human users and automated bots using computational intelligence techniques. Two problem statements are given: offline (for completed sessions) and online (for sequences of individual HTTP requests). The former is solved with several standard computational intelligence tools. For the latter, a learning version of Wald’s sequential probability ratio test is used.

Journal ArticleDOI
TL;DR: A novel method called the Scaled Sequential Probability Ratio Test (SSPRT) produces a 2D array of data via a special cumulative sum calculation, and a peak determination algorithm has been developed to find significant peaks and store the corresponding data for further evaluation.
Abstract: Accurate event detection has high priority in many technical applications. Events in acquired data series, their duration, and their statistical parameters provide useful information about the observed system and its current state. This information can be used for condition monitoring, state identification, and many kinds of forecasting as well. In some cases background noise covers the events, and simple threshold or power monitoring methods cannot be used effectively. A novel method called the Scaled Sequential Probability Ratio Test (SSPRT) produces a 2D array of data via a special cumulative sum calculation. A peak determination algorithm has also been developed to find significant peaks and to store the corresponding data for further evaluation. The method provides direct information about the endpoints and likely duration of the detected events, as well as their significance level. The new method also gives representative visual information about the structure of detected events. An application example for thermomechanical fatigue test monitoring and another for vibration-based rotational speed estimation of a four-cylinder internal combustion engine are discussed in this paper.

Journal ArticleDOI
TL;DR: In this article, the authors obtain upper and lower bounds for the probability that a random walk leaves the strip through the upper boundary.

Book ChapterDOI
01 Jan 2019
TL;DR: The average time between false alarms is introduced and used as a performance metric in addition to the probability of detection and average delay before detection in this chapter.
Abstract: In most applications of remote sensing, signals have an unknown arrival time (e.g., arising from an unknown range in active sensing) and often an unknown duration. Detectors for such signals, which sequentially incorporate and test data as it is measured, are derived and evaluated in this chapter. Sliding incoherent sum and sliding M-of-N (binary integration) detectors are presented for cases where the signal duration is known but the starting time is not. For the opposite scenario, where the starting time is known and the signal duration is not, the sequential probability ratio test (SPRT) is used. When neither the starting time nor signal duration is known, Page’s test is shown to arise as a generalized likelihood ratio detector. For each of these detectors, the probability of false alarm is one because they will eventually declare a detection when left to run for an infinitely long time. As such, the average time between false alarms is introduced and used as a performance metric in addition to the probability of detection and average delay before detection. Various techniques are presented for evaluating the performance measures including approximations and a quantization approach. The chapter is concluded with a design example applying Page’s test to data from a cell-averaging normalizer.
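Page's test mentioned above is a CUSUM: it accumulates the same log-likelihood-ratio increments as the SPRT but clips the statistic at zero, so it effectively restarts after noise-only stretches and reacts to a signal whose onset time is unknown. A minimal sketch for a Gaussian mean shift (parameters illustrative):

```python
import math

def pages_test(stream, mu0, mu1, sigma, h):
    """Page's CUSUM test for a signal with unknown onset time.

    The statistic g accumulates the SPRT log-likelihood-ratio
    increments but is clipped at zero, so noise-only data cannot
    drive it arbitrarily negative.  A detection is declared when g
    exceeds the threshold h; returns the 1-based alarm index or None.
    """
    g = 0.0
    for n, x in enumerate(stream, start=1):
        g = max(0.0, g + (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma**2)
        if g >= h:
            return n
    return None

# Noise for three samples, then a unit-mean signal starting at index 4:
print(pages_test([0.1, -0.2, 0.0, 1.1, 1.3, 0.9, 1.2],
                 mu0=0.0, mu1=1.0, sigma=1.0, h=2.0))  # 7
```

Left to run forever this test will eventually raise a false alarm with probability one, which is why the chapter measures performance by the average time between false alarms (a function of h) rather than a false-alarm probability.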

Book ChapterDOI
01 Jan 2019
TL;DR: Three methods are included for selecting the items: random item selection, maximization at the current ability estimate, and the weighting method, which maximizes information based on a combination of the cutoff points weighted by their distance to the ability estimate.
Abstract: Multidimensional computerized classification testing can be used when classification decisions are required for constructs that have a multidimensional structure. Here, two methods for making those decisions are included for two types of multidimensionality. In the case of between-item multidimensionality, each item is intended to measure just one dimension. In the case of within-item multidimensionality, items are intended to measure multiple or all dimensions. Wald’s (1947) sequential probability ratio test and Kingsbury and Weiss’s (1979) confidence interval method can be applied to multidimensional classification testing. Three methods are included for selecting the items: random item selection, maximization at the current ability estimate, and the weighting method. The last method maximizes information based on a combination of the cutoff points weighted by their distance to the ability estimate. Two examples illustrate the use of the classification and item selection methods.

Patent
23 Aug 2019
TL;DR: In this paper, a gearbox malfunction identification method based on the sequential hypothesis test was proposed, in which an identification system carries out an adaptive intelligent query to a propagation channel with available data.
Abstract: The present invention discloses a gearbox malfunction identification method based on the sequential hypothesis test, in which an identification system carries out an adaptive intelligent query of a propagation channel with available data. Firstly, an extracted vibration signal is pretreated with the wavelet packet analysis method. Secondly, a sequence of kurtosis values of the extracted vibration signal is taken as the test object of the sequential probability ratio test. Then effective identification of the mode and of gear crack degradation under four states of the gearbox is performed according to the sequential probability ratio test algorithm. Finally, the gearbox vibration signal is subjected to a three-layer sequential ratio test combining the sequential probability ratio test and the root mean square error algorithm. The invention solves the problems of slow identification, inaccurate target identification and low identification efficiency in malfunction detection and multi-malfunction identification.

Journal ArticleDOI
TL;DR: The authors bring the multi-hypothesis test (MHT) into the multi-target model for co-located MIMO radar and apply the sequential probability ratio test (SPRT) to locate multiple range-located targets efficiently.
Abstract: Distinguishing and detecting multiple close targets is of great significance in many areas, and the typical tool is GLRT detection. However, there is distinguishing difficulty and mutual interference when the targets present are very close. Furthermore, it is difficult for binary decision theory to tell where the targets are located exactly, as it can only decide whether one or two targets are present. Because it cannot utilise prior information on the target number, it also suffers poor detection performance. To overcome these disadvantages, the authors bring the multi-hypothesis test (MHT) into the multi-target model for co-located MIMO radar. They also apply the sequential probability ratio test (SPRT) to locate the multiple range-located targets, whose angles are known a priori, efficiently. The simulations show that the authors’ SPRT-based detection is faster and more consistent than the existing GLRT-based detection performed cell-by-cell.

01 Jan 2019
TL;DR: Ottoboni et al. as mentioned in this paper used permutation tests and software to address particular questions in randomized and natural experiments, including identifying what, if anything, student evaluations of teaching measure, and whether voting machines malfunctioned in Georgia's November 2018 election.
Abstract: Author(s): Ottoboni, Kellie | Advisor(s): Stark, Philip B | Hypothesis testing has come under fire in the past decade as misuses have become increasingly visible. It is common to use tests whose assumptions don’t reflect how the data were collected, and editorial policies of many journals reward “p-hacking” by setting the arbitrary threshold of 0.05 to determine whether a result merits publication. In fact, properly designed hypothesis tests are an invaluable tool for inference and decision-making. Classical nonparametric tests, once reserved for problems that could be worked out with pencil and paper or approximated asymptotically, can now be applied to complex datasets with the help of modern computing power. This dissertation tailors some nonparametric tests to modern applications for social good. Permutation tests are a class of hypothesis tests for data that involve random (or plausibly random) assignment. The parametric assumptions for common tests, like the t-test and linear regression, may not hold for randomized experiments; in contrast, the assumptions of permutation tests are implied by the experimental design. But off-the-shelf permutation tests are not a panacea: tests must be tailored to fit the experimental design, and there are subtle numerical issues with implementing the tests in software. We construct permutation tests and software to address particular questions in randomized and natural experiments, including identifying what, if anything, student evaluations of teaching measure, and whether voting machines malfunctioned in Georgia’s November 2018 election. Risk-limiting post-election audits (RLAs) have existed for a decade, but have not been adopted widely, in part due to logistical hurdles. This thesis uses classical nonparametric techniques, including Fisher’s combination method and Wald’s sequential probability ratio test, to build new RLA methods that accommodate the idiosyncratic logistics of statewide elections.
A new, more flexible method for using stratified samples in RLAs makes it easier and more efficient to audit elections conducted on heterogeneous voting equipment. This thesis also develops an RLA method based on Bernoulli sampling, which allows ballots to be audited “in parallel” across precincts on Election Day. The RLA method for stratified samples of ballots was piloted in Michigan to study its performance in the face of real-world constraints.

Proceedings ArticleDOI
01 Aug 2019
TL;DR: In this article, the authors compared the sampling characteristics and average sampling times of the single sampling method, SPRT method and SPOT method under the same constraint parameters in testability verification test.
Abstract: The classical sample size determination methods for the testability verification test are introduced, including the single, double, and multiple sampling methods and the sequential probability ratio test (SPRT) method. The sampling characteristics of the methods and the expressions for their average sampling times are analyzed and compared. The results show that the SPRT method has sampling characteristics similar to those of the single sampling method under the same requirements on bilateral testability indexes and risks, while having the smaller average sampling times; the single sampling method and the SPRT method are therefore recommended for testability verification tests. Because these four classical methods do not use prior information, the sample size of the testability verification test is often large. Taking testability prior information into account, Bayesian theory is used to improve the SPRT method, yielding the Sequential Posterior Odds Test (SPOT) method. The sampling characteristics and average sampling times of the single sampling method, the SPRT method, and the SPOT method are compared and analyzed under the same constraint parameters. The results indicate that the average sampling times of the SPOT method are smaller than those of the single sampling method and the SPRT method, while the sampling characteristics remain approximately the same. It is recommended to use prior information judiciously to reduce the sample size.
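The binomial SPRT compared above can be sketched in a few lines. The hypotheses, risk values, and function name below are illustrative choices for a pass/fail testability demonstration, not parameters taken from the paper: `p0` is the acceptable fault-detection rate, `p1` the rejectable one, and `alpha`/`beta` the producer's and consumer's risks, with Wald's classical stopping boundaries.

```python
import math

def sprt_step(successes, failures, p0, p1, alpha, beta):
    """One evaluation of Wald's binomial SPRT for pass/fail trials (sketch).

    p0 > p1 are the acceptable and rejectable fault-detection rates;
    alpha and beta are the two risks.  Returns 'accept' (rate looks like
    p0), 'reject' (rate looks like p1), or 'continue' (keep sampling).
    """
    # log-likelihood ratio of H1 (rate p1) against H0 (rate p0)
    llr = (successes * math.log(p1 / p0)
           + failures * math.log((1 - p1) / (1 - p0)))
    if llr >= math.log((1 - beta) / alpha):
        return "reject"      # evidence favours the low rate p1
    if llr <= math.log(beta / (1 - alpha)):
        return "accept"      # evidence favours the high rate p0
    return "continue"
```

For example, with p0 = 0.95, p1 = 0.80 and 10% risks, a run of 13 detected faults with no misses already accepts, while just two misses with no detections rejects, illustrating why the average sampling times stay small.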

Patent
25 Jul 2019
TL;DR: In this paper, a dynamic experimentation evaluation system provides a framework in which a continuous stream of metric data is monitored to establish a causal relationship between changes in a software program and their effects on user-observable behavior.
Abstract: A dynamic experimentation evaluation system provides a framework in which a continuous stream of metric data is monitored to establish a causal relationship between changes in a software program and their effects on user-observable behavior. In one aspect, an A/B test is performed continuously on a stream of metric data representing the usage of a control version and a treatment version of the software product. A sequential probability ratio test (SPRT) is used as the test statistic to determine when to terminate the test and produce results within a specific confidence interval and with a controlled error rate.
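A continuously monitored A/B test of this flavor can be sketched with a Gaussian-mean SPRT on the stream of treatment-minus-control metric differences. The patent does not specify this likelihood model; the normality assumption, the known effect size `delta` and noise scale `sigma`, and all names below are assumptions of this illustration.

```python
import math

def sequential_ab_test(diff_stream, delta, sigma, alpha=0.05, beta=0.05):
    """Sketch of an always-on A/B monitor built on Wald's SPRT.

    diff_stream yields per-interval metric differences (treatment minus
    control), modelled as N(0, sigma^2) under H0 (no effect) and
    N(delta, sigma^2) under H1 (an effect of size delta).
    """
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr, n = 0.0, 0
    for d in diff_stream:
        n += 1
        # Gaussian log-likelihood-ratio increment for a mean shift of delta
        llr += (delta / sigma ** 2) * (d - delta / 2.0)
        if llr >= upper:
            return "effect detected", n
        if llr <= lower:
            return "no effect", n
    return "continue monitoring", n
```

Unlike a fixed-horizon t-test, the stream can be inspected after every interval without inflating the error rate, which is what makes the "continuous" evaluation sound.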

Patent
23 Aug 2019
TL;DR: In this paper, a flaw identification, detection, and positioning method based on the sequential probability ratio test is proposed; it can be widely applied to the detection and positioning of different types of structural flaws, with high accuracy and universality.
Abstract: The invention discloses a flaw reflection signal identification method and device based on sequential hypothesis testing. An ultrasonic emitting probe is adopted to emit an ultrasonic signal into a flaw test specimen, and an ultrasonic receiving probe is adopted to receive an ultrasonic echo signal on the same surface of the flaw test specimen. After the echo signal is collected, it is subjected to wavelet packet transformation de-noising processing; the de-noised echo signal is then subjected to sequential probability ratio test identification processing to obtain the starting point of a diffraction signal, and from this starting point the position of the flaw is determined. The invention thus puts forward an identification, detection, and positioning method based on the sequential probability ratio test that can be widely applied to the detection and positioning of different types of structural flaws, with high accuracy and universality.
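The onset-detection step can be sketched as an SPRT that watches for the sample energy to jump from the noise level to the signal level. This is not the patented method: modelling the de-noised trace as zero-mean Gaussian with standard deviation `sigma0` (noise) versus `sigma1` (echo/diffraction signal), and the CUSUM-style clamp that keeps the statistic from drifting to minus infinity before the onset, are both assumptions of this sketch.

```python
import math

def detect_onset(samples, sigma0, sigma1, alpha=0.01, beta=0.01):
    """Return the first index at which an SPRT on sample energy decides a
    signal is present (variance sigma1^2 vs. noise variance sigma0^2),
    or None if the trace never leaves the noise regime.  Illustrative only.
    """
    upper = math.log((1 - beta) / alpha)
    llr = 0.0
    for i, x in enumerate(samples):
        # LLR increment for N(0, sigma1^2) against N(0, sigma0^2)
        llr += (math.log(sigma0 / sigma1)
                + x * x * (1 / (2 * sigma0 ** 2) - 1 / (2 * sigma1 ** 2)))
        llr = max(llr, 0.0)  # CUSUM-style reset so late onsets are found
        if llr >= upper:
            return i  # estimated start of the diffraction signal
    return None
```

On a trace of 50 low-amplitude noise samples followed by 50 high-amplitude samples, the detector fires within a sample or two of the true onset, which is what makes the flaw position estimate usable.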

Proceedings ArticleDOI
01 Dec 2019
TL;DR: In the asymptotic regime of small detection error probabilities, it is shown that every stochastic encryption degrades the performance of the EC to a greater extent, in the sense that the expected sample size at the EC is no smaller than that required at the LC.
Abstract: We consider sequential detection based on one-bit quantized data in the Internet of Things (IoT) with an eavesdropper. A lightweight physical-layer encryption algorithm, called stochastic encryption, is employed as a countermeasure: it flips the quantization bits at each IoT sensor according to certain probabilities, and the flipping probabilities are known only to the legitimate cloud (LC), not to the eavesdropping cloud (EC). Owing to the optimality of the sequential probability ratio test (SPRT), the LC employs the SPRT for sequential detection, whereas the EC employs a mismatched SPRT (MSPRT). We characterize the asymptotic performance of the MSPRT in terms of the expected sample size as a function of the vanishing error probabilities. We show that every symmetric stochastic encryption is ineffective in the sense that it leads to the same expected sample size at the LC and EC when the LC and EC have the same detection accuracy. Next, in the asymptotic regime of small detection error probabilities, we show that every stochastic encryption degrades the performance of the EC to a greater extent, in the sense that the expected sample size required at the EC is no smaller than that required at the LC. Moreover, the optimal stochastic encryption is derived in the sense of maximizing the difference between the expected sample sizes required at the EC and the LC.
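The mismatch effect can be illustrated with a small Monte Carlo sketch: a detector that builds its likelihood ratio with assumed bit probabilities takes longer to decide when the true bit probability has been pulled toward 1/2 by flipping, because the mean LLR drift shrinks. All parameter values, the symmetric boundaries, and the function name below are illustrative, not from the paper.

```python
import math
import random

def avg_samples(q_true, q_model, trials=2000, alpha=0.01, seed=1):
    """Average SPRT stopping time when bits are i.i.d. Bernoulli(q_true)
    but the detector's likelihood ratio assumes bit probabilities
    q_model = (q0, q1) under H0/H1.  With q_true == q_model[1] this is
    the matched LC; a q_true shrunk toward 1/2 by random bit flips
    models the mismatched EC.  Illustrative sketch only."""
    random.seed(seed)
    a = math.log((1 - alpha) / alpha)  # symmetric stopping boundaries
    q0, q1 = q_model
    total = 0
    for _ in range(trials):
        llr, n = 0.0, 0
        while abs(llr) < a:
            n += 1
            bit = 1 if random.random() < q_true else 0
            llr += math.log(q1 / q0) if bit else math.log((1 - q1) / (1 - q0))
        total += n
    return total / trials
```

For instance, flipping each bit with probability 0.2 moves a true bit probability of 0.8 to 0.8(0.8) + 0.2(0.2) = 0.68 at the EC, and the simulated average sample size grows accordingly, matching the qualitative claim that the EC needs at least as many samples as the LC.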

Patent
23 Aug 2019
TL;DR: In this paper, a centrifugal pump fault diagnosis method based on principal component analysis and the sequential probability ratio test is proposed.
Abstract: The invention discloses a centrifugal pump fault diagnosis method based on principal component analysis and the sequential probability ratio test. The method comprises the following steps: building a model by using a normal impeller and a faulty impeller, and acquiring an original vibration signal with a centrifugal pump vibration signal acquisition system; performing noise reduction on the signal by applying wavelet packet transformation, and extracting characteristic parameters of the signal with a time-domain analysis method; reducing the dimension of the extracted characteristic parameters by principal component analysis, and selecting the principal component with the largest contribution ratio as the test sequence; and finally, analyzing the running status of the centrifugal pump with a sequential probability ratio test algorithm and classifying faults in combination with a root-mean-square algorithm. The disclosed method mainly utilizes principal component analysis and the sequential probability ratio test to diagnose fault states and establishes a classification criterion, and it achieves higher effectiveness and accuracy in fault diagnosis and recognition for centrifugal pumps.
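The "select the principal component with the largest contribution ratio" step can be sketched for the two-feature case, where the leading eigenvector of the covariance matrix has a closed form. This is a minimal illustration, not the patent's implementation; in practice the pipeline would use a full eigendecomposition over many vibration features.

```python
import math

def first_principal_component(rows):
    """Two-feature PCA sketch: centre the data, form the 2x2 sample
    covariance matrix, and return the leading eigenvector together with
    the contribution ratio of its eigenvalue (the fraction of variance
    it explains).  The selected component would serve as the SPRT's
    test sequence."""
    n = len(rows)
    mx = sum(r[0] for r in rows) / n
    my = sum(r[1] for r in rows) / n
    cxx = sum((r[0] - mx) ** 2 for r in rows) / (n - 1)
    cyy = sum((r[1] - my) ** 2 for r in rows) / (n - 1)
    cxy = sum((r[0] - mx) * (r[1] - my) for r in rows) / (n - 1)
    # closed-form eigenvalues of the symmetric 2x2 covariance matrix
    tr, det = cxx + cyy, cxx * cyy - cxy * cxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc
    # eigenvector for lam1 (handle the axis-aligned case cxy == 0)
    if abs(cxy) > 1e-12:
        v = (lam1 - cyy, cxy)
    else:
        v = (1.0, 0.0) if cxx >= cyy else (0.0, 1.0)
    norm = math.hypot(*v)
    return (v[0] / norm, v[1] / norm), lam1 / (lam1 + lam2)
```

On perfectly correlated features the contribution ratio is 1 and the component points along the diagonal, so the single retained component loses no information.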

Patent
30 May 2019
TL;DR: In this paper, the authors present a system that performs prognostic surveillance operations based on sensor signals from a power plant and critical assets in the transmission and distribution grid, using an inferential model trained on previously received signals to generate estimated values for the signals.
Abstract: We present a system that performs prognostic surveillance operations based on sensor signals from a power plant and critical assets in the transmission and distribution grid. The system obtains signals comprising time-series data obtained from sensors during operation of the power plant and associated transmission grid. The system uses an inferential model trained on previously received signals to generate estimated values for the signals. The system then performs a pairwise differencing operation between actual values and the estimated values for the signals to produce residuals. The system subsequently performs a sequential probability ratio test (SPRT) on the residuals to detect incipient anomalies that arise during operation of the power plant and associated transmission grid. While performing the SPRT, the system dynamically updates SPRT parameters to compensate for non-Gaussian artifacts that arise in the sensor data due to changing operating conditions. When an anomaly is detected, the system generates a notification.
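The SPRT-on-residuals step is the classic mean-shift test used in prognostic surveillance, and can be sketched as follows. The Gaussian residual model, the fixed disturbance magnitude `m`, and the restart-after-decision policy are assumptions of this sketch; the patent's contribution, dynamically retuning the SPRT parameters for non-Gaussian residuals, is not reproduced here.

```python
import math

def sprt_residual_monitor(residuals, sigma, m, alpha=0.001, beta=0.001):
    """Mean-shift SPRT on model residuals (actual minus estimate).

    Tests H1: residual mean = m (incipient degradation) against
    H0: residual mean = 0 (healthy), assuming N(., sigma^2) noise.
    Returns the indices at which degradation alarms fire; after each
    decision the statistic restarts so monitoring continues."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr, alarms = 0.0, []
    for i, r in enumerate(residuals):
        # Gaussian LLR increment for a mean shift of m
        llr += (m / sigma ** 2) * (r - m / 2.0)
        if llr >= upper:
            alarms.append(i)  # incipient-anomaly alarm
            llr = 0.0
        elif llr <= lower:
            llr = 0.0         # healthy decision: restart the test
    return alarms
```

Fed 20 zero-mean residuals followed by 20 residuals sitting at the disturbance magnitude, the monitor stays quiet through the healthy stretch and then alarms repeatedly, a few samples after the shift and again after each restart.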