
Showing papers on "Bonferroni correction published in 2018"


Journal ArticleDOI
TL;DR: This paper discusses how to test multiple hypotheses simultaneously while controlling the type I error rate, which is inflated when many tests are performed (α inflation), and explains the differences between MCTs so that researchers can apply them appropriately.
Abstract: Multiple comparison tests (MCTs) are performed on the means of several experimental conditions. When the omnibus null hypothesis is rejected, MCTs are used to determine which experimental conditions differ significantly in their means, or whether a specific pattern exists among the group means. A problem occurs because the error rate increases as multiple hypothesis tests are performed simultaneously. Consequently, in an MCT, it is necessary to control the error rate at an appropriate level. In this paper, we discuss how to test multiple hypotheses simultaneously while limiting the type I error rate, which is inflated by α inflation. To choose the appropriate test, we must balance statistical power against the type I error rate. If a test is too conservative, a type I error is unlikely to occur; however, the test may then have insufficient power, increasing the probability of a type II error. Most researchers hope to adjust the type I error rate so as to discriminate real differences in the observed data without sacrificing too much statistical power. It is expected that this paper will help researchers understand the differences between MCTs and apply them appropriately.
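To make the α inflation concrete, here is a minimal Python sketch (the significance level and test counts are illustrative choices, not values from the paper) of how the family-wise error rate grows with the number of independent tests, and how the Bonferroni correction restores it:

```python
# Minimal sketch of alpha inflation under m independent tests at level alpha;
# the values of alpha and m are illustrative, not taken from the paper.
alpha = 0.05
for m in (1, 5, 10, 20):
    fwer_uncorrected = 1 - (1 - alpha) ** m      # P(at least one false positive)
    fwer_bonferroni = 1 - (1 - alpha / m) ** m   # each test run at alpha / m
    print(f"m={m:2d}  uncorrected FWER={fwer_uncorrected:.3f}  "
          f"Bonferroni FWER={fwer_bonferroni:.3f}")
```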

446 citations


Journal ArticleDOI
TL;DR: This study shows that a multivariate method such as SCCAN outperforms VLSM in a number of scenarios, including functional dependency on single or multiple areas, different sample sizes, different multi-area combinations, and different thresholding mechanisms.

144 citations


Journal ArticleDOI
TL;DR: Applying the Non-Parametric Combination (NPC) test for partial and combined tests, it is concluded that Crohn's Disease patients and Ulcerative Colitis patients differ on most of the examined variables.
Abstract: Statistical methodology is a powerful tool in health research; however, there is wide agreement that statistical methodologies are not always used properly. In particular, when multiple comparisons are needed, it is necessary to check the rate of false positive results and the potential inflation of type I errors. In this case, permutation testing methods are useful to check the simultaneous significance level and identify the most significant factors. In this paper, an application of permutation tests in the medical context of Inflammatory Bowel Diseases is performed. The main goal is to assess the existence of significant differences between Crohn's Disease (CD) and Ulcerative Colitis (UC). The Sequentially Rejective Multiple Test (Bonferroni-Holm procedure) is used to find which of the partial tests are effectively significant and to solve the problem of multiplicity control. Applying the Non-Parametric Combination (NPC) test for partial and combined tests, we conclude that Crohn's Disease patients and Ulcerative Colitis patients differ on most of the examined variables. UC patients, compared with CD patients, have a higher age at diagnosis and are less likely to be smokers; the proportion of patients treated with immunosuppressants or with biological drugs is lower than among CD patients, even though the duration of such therapies is longer. CD patients have a higher rate of re-hospitalization. Diabetes is more prevalent in the sub-population of UC patients. Analyzing the Charlson score, we can highlight that UC patients have a more severe clinical situation than CD patients. Finally, CD patients undergo surgery more frequently than UC patients. Applying the Bonferroni-Holm procedure, which provides adjusted p-values, we note that only nine of the examined variables are statistically significant: Smoking habit, Immunosuppressive therapy, Surgery, Biological Drug, Diabetes, Adverse Events, Re-hospitalization, Gender and Duration of Immunosuppressive Therapy. Therefore, we can conclude that these are the variables that effectively discriminate between the Crohn's Disease and Ulcerative Colitis groups while satisfying the multiplicity control.
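The Bonferroni-Holm step applied here can be reproduced generically with statsmodels; a minimal sketch, using made-up p-values rather than the study's data:

```python
# Sketch of the sequentially rejective Bonferroni-Holm procedure on a vector
# of per-variable p-values; the p-values are made up, not the study's data.
import numpy as np
from statsmodels.stats.multitest import multipletests

pvals = np.array([0.001, 0.004, 0.019, 0.032, 0.048, 0.120, 0.350, 0.610])
reject, p_adjusted, _, _ = multipletests(pvals, alpha=0.05, method="holm")
for p, padj, r in zip(pvals, p_adjusted, reject):
    print(f"raw p={p:.3f}  Holm-adjusted p={padj:.3f}  reject={r}")
```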

30 citations


Journal ArticleDOI
01 Nov 2018
TL;DR: New aggregation operators are presented that combine these concepts, forming the Bonferroni induced heavy ordered weighted average and several particular formulations, providing a more general framework for analysing data in scenarios where the numerical values may have complexities that should be assessed with complex attitudinal characters.
Abstract: Averaging aggregation operators analyse a set of data and provide a summary of the results. This study focuses on the Bonferroni mean and on induced and heavy aggregation operators. The aim of the work is to present new aggregation operators that combine these concepts, forming the Bonferroni induced heavy ordered weighted average and several particular formulations. This approach represents Bonferroni means with order-inducing variables and with weighting vectors whose sum can exceed one. The paper also develops some extensions by using distance measures, forming the Bonferroni induced heavy ordered weighted average distance and several particular cases. The study ends with an application to a risk management problem for large companies in Colombia. The main advantage of this approach is that it provides a more general framework for analysing data in scenarios where the numerical values may have complexities that should be assessed with complex attitudinal characters.
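For reference, the classical Bonferroni mean that these operators build on can be sketched in a few lines of Python (data and parameters below are illustrative, not from the paper's Colombian case study):

```python
# Sketch of the classical Bonferroni mean BM^{p,q}, the building block the
# paper extends with induced ordering and heavy weighting vectors:
# BM^{p,q}(a_1,...,a_n) = ( (1/(n(n-1))) * sum_{i != j} a_i^p a_j^q )^(1/(p+q))
import itertools

def bonferroni_mean(values, p=1.0, q=1.0):
    n = len(values)
    total = sum(a ** p * b ** q
                for a, b in itertools.permutations(values, 2))  # ordered pairs i != j
    return (total / (n * (n - 1))) ** (1.0 / (p + q))

# Illustrative data, not taken from the paper.
print(bonferroni_mean([0.4, 0.7, 0.9, 0.3], p=1, q=1))
```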

30 citations


Journal ArticleDOI
TL;DR: The Bayes Factor, a widely accepted statistical procedure, can be computed as the ratio of two normal densities: the first, of the estimate of the marker effect over its posterior standard deviation; the second, of the null-hypothesis value of 0 over the prior standard deviation.
Abstract: Bayesian models for genomic prediction and association mapping are increasingly used in the genetic analysis of quantitative traits. Given a point estimate of variance components, the popular methods SNP-BLUP and GBLUP yield joint estimates of the effect of all markers on the analyzed trait; single- and multiple-marker frequentist tests (EMMAX) can be constructed from these estimates. Indeed, BLUP methods can be seen simultaneously as Bayesian or frequentist methods. So far there is no formal method to produce Bayesian statistics from GBLUP. Here we show that the Bayes Factor (BF), a widely accepted statistical procedure, can be computed as the ratio of two normal densities: the first, of the estimate of the marker effect over its posterior standard deviation; the second, of the null-hypothesis value of 0 over the prior standard deviation. We extend the BF to pool evidence from several markers and several traits. We analyze, with our method and existing ones, a real data set of 630 horses genotyped for 41,711 polymorphic SNPs for the trait "outcome of the qualification test" (which addresses gait, or ambling, of horses), for which a known major gene exists. In the horse data, single-marker EMMAX shows a significant effect at the right location at the Bonferroni level. The BF points to the same location, although with low numerical values. The strength of evidence combining information from several consecutive markers increases using the BF and decreases using EMMAX, which stems from a fundamental difference between the Bayesian and frequentist schools of hypothesis testing. We conclude that our BF method complements frequentist EMMAX analyses because it provides better pooling of evidence across markers, although its use for primary detection is unclear due to the lack of defined rejection thresholds.
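A hedged sketch of the density-ratio idea, reading the abstract as a Savage-Dickey-type ratio; `ahat`, `sd_post`, and `sd_prior` are placeholder values, and the exact normalization should be checked against the paper:

```python
# Hedged sketch of a Savage-Dickey-style Bayes factor computed from GBLUP
# output, following the abstract's description of a ratio of two normal
# densities. All numbers are placeholders; verify the exact normalization
# against the paper before use.
from scipy.stats import norm

ahat = 0.15      # posterior estimate of the marker effect (placeholder)
sd_post = 0.04   # posterior standard deviation of the effect (placeholder)
sd_prior = 0.10  # prior standard deviation of the effect (placeholder)

# Posterior density at the null value 0 over the prior density at 0:
# small bf01 (equivalently large bf10) favors a nonzero marker effect.
bf01 = norm.pdf(0.0, loc=ahat, scale=sd_post) / norm.pdf(0.0, loc=0.0, scale=sd_prior)
print(f"BF10 = {1.0 / bf01:.2f}")
```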

23 citations


Journal ArticleDOI
Zhiming Zhang
TL;DR: The aim of the paper was to propose the interval-valued intuitionistic fuzzy geometric Bonferroni mean and the weighted interval-valued intuitionistic fuzzy geometric Bonferroni mean for aggregating interval-valued intuitionistic fuzzy sets, taking into account the interrelationship between interval-valued intuitionistic fuzzy arguments.
Abstract: The aim of the paper was to propose the interval-valued intuitionistic fuzzy geometric Bonferroni mean and the weighted interval-valued intuitionistic fuzzy geometric Bonferroni mean for aggregating interval-valued intuitionistic fuzzy sets, taking into account the interrelationship between interval-valued intuitionistic fuzzy arguments. Then, some useful properties and special cases of the developed operators are investigated. Furthermore, the developed operators are used to put forward an approach for multiple attribute group decision making with interval-valued intuitionistic fuzzy information. Finally, an illustrative example is furnished to show the feasibility and practicality of the developed approach.

21 citations


Journal ArticleDOI
TL;DR: This paper develops a rigorous statistical framework for connexel-wise significance testing based on Gaussian random field theory, which can identify altered functional connectivities in a major depressive disorder dataset where existing methods fail.

19 citations


Book ChapterDOI
10 Jan 2018
TL;DR: This paper compares a Bonferroni adjustment that is based on finite-sample considerations with certain 'asymptotic' adjustments previously suggested in the literature.
Abstract: In many multiple testing problems, the individual null hypotheses (i) concern univariate parameters and (ii) are one-sided. In such problems, power gains can be obtained for bootstrap multiple testing procedures in scenarios where some of the parameters are ‘deep in the null’ by making certain adjustments to the null distribution under which to resample. In this paper, we compare a Bonferroni adjustment that is based on finite-sample considerations with certain ‘asymptotic’ adjustments previously suggested in the literature.

14 citations


Journal ArticleDOI
TL;DR: Improved exact multiple testing procedures are proposed for the setting where two parallel groups are compared on multiple binary endpoints, together with an optimization algorithm based on constrained optimization and integer linear programming.

13 citations


Journal ArticleDOI
TL;DR: In this paper, a Holm-type step-down exact parametric procedure is proposed for situations in which between-endpoint correlations are known a priori or estimable, since the existing class of group-sequential weighted Bonferroni procedures does not leverage the correlations between endpoints.
Abstract: Maurer and Bretz developed a class of group-sequential weighted Bonferroni procedures with multiple endpoints. Performed as a step-down consonant shortcut to the group-sequential closed testing procedure, the class of procedures of Maurer and Bretz is simple to use for testing multiple endpoints in classical group-sequential settings. This class uses the correlations of sequential statistics, but does not leverage the correlations between endpoints. Thus, there is room for power improvement by suitably using the between-endpoint correlations in a group-sequential trial while maintaining strong control of the family-wise error rate. To this end, we propose a Holm-type step-down exact parametric procedure for situations in which between-endpoint correlations are known a priori or estimable. An adaptive strategy is suggested for situations in which such correlations are unknown. In addition, we briefly discuss a natural group-sequential extension of the partially parametric Seneta–Chen procedure.
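The general idea of an exact parametric step-down procedure with a known correlation matrix can be sketched as follows; this is an illustration under assumed one-sided z-statistics, not the paper's exact construction:

```python
# Hedged sketch of an exact parametric step-down test that exploits a known
# correlation matrix R between one-sided endpoint z-statistics; the paper's
# group-sequential procedure is a refinement of this basic scheme.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import multivariate_normal

def critical_value(R, alpha):
    """Find c with P(max_i Z_i > c) = alpha for Z ~ N(0, R)."""
    k = R.shape[0]
    f = lambda c: (1.0 - multivariate_normal.cdf(np.full(k, c),
                                                 mean=np.zeros(k), cov=R)) - alpha
    return brentq(f, 0.0, 10.0)

def exact_stepdown(z, R, alpha=0.025):
    """Step down from the largest statistic, recalibrating at each step."""
    active = list(np.argsort(z)[::-1])   # most significant first
    rejected = []
    while active:
        c = critical_value(R[np.ix_(active, active)], alpha)
        i = active[0]
        if z[i] > c:
            rejected.append(i)
            active.pop(0)                # retest the rest at a sharper level
        else:
            break
    return rejected

R = np.array([[1.0, 0.6],
              [0.6, 1.0]])              # assumed known endpoint correlation
z = np.array([2.6, 2.1])               # illustrative observed z-statistics
print(exact_stepdown(z, R))
```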

11 citations


Journal ArticleDOI
TL;DR: Diverse clinical factors influenced the long-term risk of completed suicide in this general population sample and only the 6 variables listed above were robust predictors of suicide in the fully adjusted analyses with multiple test correction.

Journal ArticleDOI
TL;DR: In this article, the authors evaluated the Student's t-test and the following corrections: Bonferroni, Holm, Hochberg, Hommel, Holland, Rom, Finner, Benjamini–Hochberg, Benjamini–Yekutieli and Li with respect to their power and Type I error rate.
Abstract: Multiple comparisons of treatment means are common in several fields of knowledge. The Student's t-test is one of the first procedures to be used in multiple comparisons; however, the p-values associated with it are inaccurate, since there is no control of the family-wise Type I error. To solve this problem, several corrections were developed. In this work, based on Monte Carlo simulations, we evaluated the t-test and the following corrections: Bonferroni, Holm, Hochberg, Hommel, Holland, Rom, Finner, Benjamini–Hochberg, Benjamini–Yekutieli and Li with respect to their power and Type I error rate. The study varied the sample size, the sample distribution and the degree of variability. In all instances we considered three balanced treatments, and the probability distributions considered were Gumbel, Logistic and Normal. Although the corrections converged as the sample size increased, our study reveals that the BH correction provides the best family-wise Type I error rate and is the second most powerful correction overall.
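Several of the corrections studied here can be compared on a single p-value vector with statsmodels, which implements a subset of them (Holland, Rom, Finner, and Li are not available there); the p-values below are made up:

```python
# Sketch comparing several of the corrections studied in the paper on one
# made-up p-value vector; statsmodels implements only a subset of them.
import numpy as np
from statsmodels.stats.multitest import multipletests

pvals = np.array([0.002, 0.008, 0.012, 0.041, 0.090, 0.220])
methods = ["bonferroni", "holm", "simes-hochberg", "hommel", "fdr_bh", "fdr_by"]

for m in methods:
    reject, padj, _, _ = multipletests(pvals, alpha=0.05, method=m)
    print(f"{m:15s} rejections: {reject.sum()}")
```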

Journal ArticleDOI
TL;DR: In this paper, the authors present a rapid approach to the BH step-up and step-down tests, and show that the Bonferroni bound is sharp under dependence for control of the family-wise error rate.

Journal ArticleDOI
TL;DR: The aim of this study is to provide a computer program in R that performs classical and improved Bonferroni procedures for circular data problems.
Abstract: Bonferroni correction procedures are commonly used for performing multiple hypothesis tests in linear data problems. Moreover, several improved Bonferroni type procedures have been proposed and sho...

Journal ArticleDOI
TL;DR: In this article, the authors investigate the bootstrap validity for the subset Anderson and Rubin (1949, AR) test when the nuisance structural parameter, the unrestricted slope coefficient of endogenous regressor, may be weakly identified, and propose a Bonferroni-based size correction method that yields correct asymptotic size for all the test statistics considered.

Journal ArticleDOI
TL;DR: In this paper, the authors consider the problem of simultaneous inference of varying coefficient models for sparse and irregularly observed longitudinal data via the local linear kernel method, where the error and covariate processes are modeled as very general classes of non-Gaussian and non-stationary processes and are allowed to be statistically dependent.
Abstract: Longitudinal data arise frequently in many scientific inquiries. To capture the dynamic relationship between longitudinal covariates and response, varying coefficient models have been proposed with point-wise inference procedures. This paper considers the challenging problem of asymptotically accurate simultaneous inference of varying coefficient models for sparse and irregularly observed longitudinal data via the local linear kernel method. The error and covariate processes are modeled as very general classes of non-Gaussian and non-stationary processes and are allowed to be statistically dependent. Simultaneous confidence bands (SCBs) with asymptotically correct coverage probabilities are constructed to assess the overall pattern and magnitude of the dynamic association between the response and covariates. A simulation based method is proposed to overcome the problem of slow convergence of the asymptotic results. Simulation studies demonstrate that the proposed inference procedure performs well in realistic settings and is favored over the existing point-wise and Bonferroni methods. A longitudinal dataset from the Chicago Health and Aging Project is used to illustrate our methodology.

Journal ArticleDOI
TL;DR: This paper explores the practical use of Cochran's Q test and the pairwise McNemar test to examine the proportion of responses derived from the results of Multiple Responses Analysis (MRA).
Abstract: When utilizing single-response questions in a survey, researchers often overlook the possibility that an item can have a smorgasbord of viable answers. This results in a loss of information, as it forces respondents to select a best-fit option. A multiple-response question allows the respondent to select any number of answers from a set of preformatted options. The ability to capture a flexible number of responses allows collectively exhaustive concepts to manifest for deductive verification. This paper explores the practical use of Cochran's Q test and the pairwise McNemar test to examine the proportion of responses derived from the results of Multiple Responses Analysis (MRA). This includes applying Cochran's Q test to an MRA data table using a simulated data set. Cochran's Q test detects whether there is a difference in the proportions of multiple concepts. In the case of a significant result, a post hoc analysis is required to pinpoint the exact difference in pairwise proportions. This pairwise difference can be detected by utilizing the pairwise McNemar test with Bonferroni correction. This paper serves as a reference for researchers and practitioners who need to examine the proportions of collectively exhaustive concepts collected from a multiple-response item.
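A minimal sketch of this workflow on simulated binary multiple-response data, using statsmodels' cochrans_q and mcnemar (the data are randomly generated for illustration only):

```python
# Sketch of the paper's workflow on simulated binary multiple-response data:
# Cochran's Q across k response options, then pairwise McNemar tests at a
# Bonferroni-corrected level. Data are randomly generated for illustration.
import itertools
import numpy as np
from statsmodels.stats.contingency_tables import cochrans_q, mcnemar

rng = np.random.default_rng(0)
# 200 respondents x 4 response options, 1 = option selected
x = rng.binomial(1, [0.60, 0.55, 0.40, 0.35], size=(200, 4))

q = cochrans_q(x)
print(f"Cochran's Q = {q.statistic:.2f}, p = {q.pvalue:.4f}")

pairs = list(itertools.combinations(range(x.shape[1]), 2))
alpha_bonf = 0.05 / len(pairs)   # Bonferroni-corrected level for 6 pairs
for i, j in pairs:
    # 2x2 agreement/disagreement table between options i and j
    table = np.array([[np.sum((x[:, i] == a) & (x[:, j] == b))
                       for b in (0, 1)] for a in (0, 1)])
    res = mcnemar(table, exact=True)
    print(f"options {i} vs {j}: p = {res.pvalue:.4f}  "
          f"significant at Bonferroni level: {res.pvalue < alpha_bonf}")
```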

Journal ArticleDOI
TL;DR: This paper models a system of fuzzy soft differential equations to analyze an individual's behavior over time, depending on their companion's actions in a particular situation regarding some decision, and presents a novel efficient technique for analyzing people's future attitudes given their present decisions.
Abstract: In this paper, we model a system of fuzzy soft differential equations to analyze an individual's behavior over time, depending on their companion's actions in any particular situation regarding some decision. The Bonferroni mean (BM) is a very useful tool for group decision-making problems when arguments are interrelated, as the Bonferroni mean can capture the interrelationship of the individual arguments. Using this ability of the BM, we define the Bonferroni fuzzy soft matrix (BFSM) and the weighted Bonferroni fuzzy soft matrix (WBFSM) for data representation. The WBFSM is a decision matrix and provides the optimum fuzzy soft constant (OFSC), which is the key element of the fuzzy soft differential equation. By utilizing the OFSC, we develop a system of fuzzy soft differential equations to study a dynamical process with nonlinear, uncertain and vague data. Second, we present a novel efficient technique for analyzing people's future attitudes given their present decisions. To illustrate the practicality and feasibility of the proposed technique, an illustrative example is also discussed with the help of phase portraits and line graphs.

Journal ArticleDOI
TL;DR: In this article, the authors apply the Shapley value and the balance of inequality to the Bonferroni inequality index, compare it with the Gini concentration index, and highlight interesting properties of the index.
Abstract: Additive decomposability is an interesting feature of inequality indices which, however, is not always fulfilled; solutions to overcome such an issue have been given by Deutsch and Silber (2007) and by Di Maio and Landoni (2017). In this paper, we apply these methods, based on the “Shapley value” and the “balance of inequality” respectively, to the Bonferroni inequality index. We also discuss a comparison with the Gini concentration index and highlight interesting properties of the Bonferroni index.
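For concreteness, one common finite-population form of the Bonferroni index is sketched next to the Gini index; conventions vary across the literature, so the definitions should be checked against the paper:

```python
# Hedged sketch of the Bonferroni inequality index alongside the Gini index,
# using one common finite-population definition of each (conventions vary;
# check against the paper's definitions before relying on these).
import numpy as np

def bonferroni_index(x):
    x = np.sort(np.asarray(x, dtype=float))
    n, mu = len(x), x.mean()
    partial_means = np.cumsum(x)[:-1] / np.arange(1, n)   # means of the i poorest
    return np.mean((mu - partial_means) / mu)

def gini_index(x):
    x = np.asarray(x, dtype=float)
    n, mu = len(x), x.mean()
    return np.abs(x[:, None] - x[None, :]).sum() / (2 * n * n * mu)

incomes = [10, 20, 30, 40, 100]   # illustrative data
print(f"Bonferroni: {bonferroni_index(incomes):.3f}  Gini: {gini_index(incomes):.3f}")
```

As expected under these definitions, the Bonferroni index gives more weight to the lower tail and exceeds the Gini index on the same data.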

Journal ArticleDOI
TL;DR: The 2-dimensional uncertain linguistic improved weighted Bonferroni harmonic mean (2DULIWBHM) operator, which combines the 2DULWBM with the harmonic mean, is proposed, and a new method is developed to deal with multi-attribute group decision-making problems in a 2-dimensional uncertain linguistic environment.

Journal ArticleDOI
TL;DR: It is shown in this article how SCIs can be formulated in a simple and transparent way for any MTP in the Burman et al. (2009) class.
Abstract: The Bonferroni-based sequentially rejective graphical procedures of Bretz et al. (2009) and Burman et al. (2009) include well-known multiple testing procedures (MTPs) such as the fixed-sequence MTP

Posted Content
TL;DR: This paper introduces the new Fast Closed Testing (FACT) method for multiple testing, controlling the family-wise error rate, and proposes the Simes-higher criticism fusion test, which is powerful for detecting both a few strong signals, and also many moderate signals.
Abstract: Multiple hypothesis testing problems arise naturally in science. In this paper, we introduce the new Fast Closed Testing (FACT) method for multiple testing, controlling the family-wise error rate. This error rate is state of the art in many important application areas, and is preferred to false discovery rate control for many reasons, including that it leads to stronger reproducibility. The closure principle rejects an individual hypothesis if all global nulls of subsets containing it are rejected using some test statistics; naively, computing the closure takes exponential time in the worst case. When the tests are symmetric and monotone, our method is an exact algorithm for computing the closure, quadratic in the number of tests and linear in the number of discoveries. Our framework generalizes most examples of closed testing, such as Holm's and the Bonferroni method. As a special case of our method, we propose the Simes-higher criticism fusion test, which is powerful for detecting both a few strong signals and many moderate signals.
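The closure principle with Bonferroni global tests, which the abstract cites as a special case, can be sketched naively as follows; the enumeration over all subsets illustrates the exponential cost that FACT avoids (on these made-up p-values the result matches Holm's procedure):

```python
# Sketch of the closure principle with Bonferroni global tests, which
# reproduces Holm's procedure. The naive enumeration of all 2^m - 1 subsets
# illustrates the exponential worst-case cost that FACT avoids.
from itertools import combinations

def closed_testing_bonferroni(pvals, alpha=0.05):
    m = len(pvals)
    idx = range(m)
    rejected = []
    for i in idx:
        # H_i is rejected iff every subset S containing i satisfies
        # min_{j in S} p_j <= alpha / |S|  (Bonferroni global test).
        ok = all(min(pvals[j] for j in S) <= alpha / len(S)
                 for size in range(1, m + 1)
                 for S in combinations(idx, size) if i in S)
        if ok:
            rejected.append(i)
    return rejected

print(closed_testing_bonferroni([0.010, 0.015, 0.030, 0.200]))  # -> [0, 1], as Holm
```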

Proceedings ArticleDOI
01 Nov 2018
TL;DR: A new fuzzy aggregation operator called the triangular fuzzy partitioned Bonferroni mean (TFPBM) operator for aggregating triangular fuzzy numbers is proposed, and an approach to deal with multiple attribute decision-making problems in a triangular fuzzy environment is developed.
Abstract: The Bonferroni mean (BM) operator, introduced by Bonferroni, is a powerful tool to capture the interrelationship among aggregated arguments. Various generalizations and extensions of the BM have been developed and applied to solve many real-world problems. Recently, the notion of the partitioned Bonferroni mean (PBM) operator has been proposed under the assumption that interrelationships do not always exist among all of the attributes. This work studies the PBM operator in a triangular fuzzy environment. First, we propose a new fuzzy aggregation operator called the triangular fuzzy partitioned Bonferroni mean (TFPBM) operator for aggregating triangular fuzzy numbers. Some properties and special cases of the new aggregation operator are also investigated. For situations where the input arguments have different importance, we then define the triangular fuzzy weighted partitioned Bonferroni mean (TFWPBM) operator. Furthermore, based on the TFWPBM operator, an approach to deal with multiple attribute decision-making problems in a triangular fuzzy environment is developed. Finally, a practical example is provided to illustrate the developed approach.
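Before the triangular fuzzy extension, the underlying crisp partitioned Bonferroni mean can be sketched as follows; the partition and input values are illustrative:

```python
# Sketch of the crisp partitioned Bonferroni mean (PBM) that the paper
# extends to triangular fuzzy numbers: attributes are split into classes,
# a Bonferroni mean is taken within each class, and the class-level values
# are averaged. Partition and data below are illustrative only.
import itertools

def bonferroni_mean(values, p=1.0, q=1.0):
    n = len(values)
    s = sum(a ** p * b ** q for a, b in itertools.permutations(values, 2))
    return (s / (n * (n - 1))) ** (1.0 / (p + q))

def partitioned_bonferroni_mean(values, partition, p=1.0, q=1.0):
    classes = [[values[i] for i in cls] for cls in partition]
    return sum(bonferroni_mean(c, p, q) for c in classes) / len(classes)

a = [0.6, 0.8, 0.3, 0.5, 0.9]
print(partitioned_bonferroni_mean(a, partition=[[0, 1], [2, 3, 4]]))
```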

Posted Content
TL;DR: The wyoung command controls the family-wise error rate when performing multiple hypothesis tests by estimating adjusted p-values using the free step-down resampling methodology of Westfall and Young (1993).
Abstract: wyoung controls the family-wise error rate when performing multiple hypothesis tests by estimating adjusted p-values using the free step-down resampling methodology of Westfall and Young (1993). See also Jones, D., D. Molitor, and J. Reif. 2018. "What Do Workplace Wellness Programs Do? Evidence from the Illinois Workplace Wellness Study." National Bureau of Economic Research Working Paper No. 24229.
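A hedged sketch of the free step-down (maxT) resampling idea for a two-group comparison across several outcomes; the actual wyoung Stata command differs in interface and options, and the data here are simulated:

```python
# Hedged sketch of the free step-down (maxT) resampling idea behind wyoung,
# for two-group comparisons across m outcomes via label permutations.
import numpy as np
from scipy import stats

def westfall_young_stepdown(x, y, n_perm=2000, seed=0):
    """Step-down resampling-adjusted p-values (Westfall & Young, 1993 idea).
    x: (n1, m) and y: (n2, m) outcome matrices for the two groups."""
    rng = np.random.default_rng(seed)
    pooled = np.vstack([x, y])
    n1 = x.shape[0]
    t_obs = np.abs(stats.ttest_ind(x, y, axis=0).statistic)
    asc = np.argsort(t_obs)                    # least significant first
    count = np.zeros(t_obs.size)
    for _ in range(n_perm):
        perm = rng.permutation(pooled)         # shuffle group labels
        t_star = np.abs(stats.ttest_ind(perm[:n1], perm[n1:], axis=0).statistic)
        # successive maxima over the hypotheses at or below each rank
        q = np.maximum.accumulate(t_star[asc])
        count[asc] += q >= t_obs[asc]
    p_adj = count / n_perm
    # enforce monotonicity: adjusted p-values cannot cross significance ranks
    p_adj[asc] = np.maximum.accumulate(p_adj[asc][::-1])[::-1]
    return p_adj

rng = np.random.default_rng(1)
x = rng.normal(size=(40, 5)); x[:, 0] += 0.8   # one true effect (simulated)
y = rng.normal(size=(40, 5))
print(westfall_young_stepdown(x, y).round(3))
```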

Book ChapterDOI
01 Jan 2018
TL;DR: The testing procedure is based on an invariance principle which provides distributional approximations of functionals of non-Gaussian vectors by those of Gaussian ones and is dependence-adjusted and has an asymptotically correct size and power.
Abstract: We present a systematic theory for tests for means of high-dimensional data. Our testing procedure is based on an invariance principle which provides distributional approximations of functionals of non-Gaussian vectors by those of Gaussian ones. Differently from the widely used Bonferroni approach, our procedure is dependence-adjusted and has an asymptotically correct size and power. To obtain cutoff values of our test, we propose a half-sampling method which avoids estimating the underlying covariance matrix of the random vectors. The latter method is shown via extensive simulations to have an excellent performance.

Journal ArticleDOI
TL;DR: This article introduces a novel procedure for improving the power of multiple testing procedures (MTPs) for interval hypotheses that "filters" a set of P-values, discarding those with values above a certain pre-selected threshold while correcting the rest by the value of the threshold.
Abstract: In this article, we introduce a novel procedure for improving power of multiple testing procedures (MTPs) of interval hypotheses. When testing interval hypotheses the null hypothesis P-values tend to be stochastically larger than standard uniform if the true parameter is in the interior of the null hypothesis. The new procedure starts with a set of P-values and discards those with values above a certain pre-selected threshold, while the rest are corrected (scaled-up) by the value of the threshold. Subsequently, a chosen family-wise error rate (FWER) or false discovery rate MTP is applied to the set of corrected P-values only. We prove the general validity of this procedure under independence of P-values, and for the special case of the Bonferroni method, we formulate several sufficient conditions for the control of the FWER. It is demonstrated that this "filtering" of P-values can yield considerable gains of power.
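A minimal sketch of the filtering procedure with Bonferroni as the follow-up MTP; the threshold `tau` and the p-values are illustrative:

```python
# Sketch of the filtering procedure described above: discard p-values above
# a pre-selected threshold tau, scale the survivors up by 1/tau, then apply
# an ordinary FWER procedure (Bonferroni here) to the corrected set only.
import numpy as np

def filtered_bonferroni(pvals, tau=0.5, alpha=0.05):
    pvals = np.asarray(pvals)
    keep = pvals <= tau               # filter step
    corrected = pvals[keep] / tau     # scale-up step
    m = keep.sum()                    # Bonferroni uses only the retained count
    reject = np.zeros(len(pvals), dtype=bool)
    reject[np.flatnonzero(keep)[corrected <= alpha / max(m, 1)]] = True
    return reject

pvals = [0.004, 0.011, 0.030, 0.240, 0.480, 0.760, 0.900]  # illustrative
print(filtered_bonferroni(pvals, tau=0.5, alpha=0.05))
```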

Journal ArticleDOI
TL;DR: Two group sequential testing procedures with improved secondary power are proposed: the improved Bonferroni procedure and the improved Pocock procedure, which use the correlation between the interim and final statistics for the secondary endpoint while applying graphical approaches to transfer the significance level from the primary endpoint to thesecondary endpoint.
Abstract: In two-stage group sequential trials with a primary and a secondary endpoint, the overall type I error rate for the primary endpoint is often controlled by an α-level boundary, such as an O'Brien-Fleming or Pocock boundary. Following a hierarchical testing sequence, the secondary endpoint is tested only if the primary endpoint achieves statistical significance either at an interim analysis or at the final analysis. To control the type I error rate for the secondary endpoint, this is tested using a Bonferroni procedure or any α-level group sequential method. In comparison with marginal testing, there is an overall power loss for the test of the secondary endpoint since a claim of a positive result depends on the significance of the primary endpoint in the hierarchical testing sequence. We propose two group sequential testing procedures with improved secondary power: the improved Bonferroni procedure and the improved Pocock procedure. The proposed procedures use the correlation between the interim and final statistics for the secondary endpoint while applying graphical approaches to transfer the significance level from the primary endpoint to the secondary endpoint. The procedures control the familywise error rate (FWER) strongly by construction and this is confirmed via simulation. We also compare the proposed procedures with other commonly used group sequential procedures in terms of control of the FWER and the power of rejecting the secondary hypothesis. An example is provided to illustrate the procedures.

Posted ContentDOI
08 Sep 2018-bioRxiv
TL;DR: An overview of approaches for FWE correction as well as evidence for the faulty implementation of the Benjamini and Yekutieli procedure by Narum using the equations from the respective papers, data from both papers, and the results of simulation are provided.
Abstract: In 2006, Narum published a paper in Conservation Genetics that was motivated by the stringent nature of the Bonferroni approach to family-wise error correction. It was suggested that the approach of Benjamini and Yekutieli (2001) provided adequate correction and was more biologically relevant. However, there are crucial differences between the original Benjamini and Yekutieli (2001) procedure and that described by Narum. After carefully reviewing both papers, we believe that the Narum procedure both differs from the BY procedure and does not adequately control the family-wise error rate. We provide an overview of approaches to FWE correction as well as evidence for the faulty implementation of the BY procedure by Narum, using the equations from the respective papers, data from both papers, and the results of simulations.
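The contrast at issue can be illustrated by comparing the full BY step-up procedure with a single fixed cutoff of α/(m·Σ1/i); whether the latter matches Narum's exact implementation should be checked against the papers, and the p-values below are made up:

```python
# Sketch contrasting the full Benjamini-Yekutieli step-up procedure with a
# single fixed cutoff alpha / (m * sum(1/i)) sometimes used in its place;
# the step-up version is the one BY (2001) proved controls the FDR under
# arbitrary dependence. P-values are illustrative.
import numpy as np
from statsmodels.stats.multitest import multipletests

pvals = np.array([0.0005, 0.003, 0.006, 0.010, 0.040, 0.150])
m = len(pvals)
c_m = np.sum(1.0 / np.arange(1, m + 1))    # harmonic correction factor

reject_by, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_by")
reject_fixed = pvals <= 0.05 / (m * c_m)   # single-cutoff reading

print("BY step-up rejections:   ", reject_by)
print("fixed-cutoff rejections: ", reject_fixed)
```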

Journal ArticleDOI
TL;DR: Methods for deriving multiplicity-adjusted upper limits on extra risk and lower bounds on the benchmark dose under the Abbott-adjusted log-logistic model are presented and compared.
Abstract: In risk assessment, it is often desired to make inferences on the risk at certain low doses or on the dose(s) at which a specific benchmark risk (BMR) is attained. At times, several dose levels or BMRs are of interest, and some form of multiplicity adjustment is necessary to ensure valid simultaneous inference. The Bonferroni correction is often employed in practice for such purposes. Though relatively simple to implement, the Bonferroni strategy can suffer from extreme conservatism (Nitcheva et al., 2005; Al-Saidy et al., 2003). Recently, Kerns (2017) proposed the use of simultaneous hyperbolic and three-segment bands to perform multiple inferences in risk assessment under the Abbott-adjusted log-logistic model with the dose level constrained to a given interval. In this paper, we present and compare methods for deriving multiplicity-adjusted upper limits on extra risk and lower bounds on the benchmark dose under the Abbott-adjusted log-logistic model. Monte Carlo simulations evaluate the characteristics of the simu...

Posted Content
Yuchao Liu, Jiaqi Guo
TL;DR: A Bonferroni type testing procedure based on permutation tests is proposed, and it is shown that the proposed test loses no first-order asymptotic power compared to tests with full knowledge of potential elevated submatrix.
Abstract: Given a large matrix containing independent data entries, we consider the problem of detecting a submatrix inside the data matrix that contains larger-than-usual values. Different from previous literature, we do not have exact information about the dimension of the potential elevated submatrix. We propose a Bonferroni type testing procedure based on permutation tests, and show that our proposed test loses no first-order asymptotic power compared to tests with full knowledge of potential elevated submatrix. In order to speed up the calculation during the test, an approximation net is constructed and we show that Bonferroni type permutation test on the approximation net loses no power on the first order asymptotically.