
Showing papers in "Communications in Statistics - Simulation and Computation in 2017"


Journal ArticleDOI
TL;DR: This article aims to review and describe some available and popular robust techniques, including some recently developed ones, and compare them in terms of breakdown point and efficiency.
Abstract: Ordinary least-squares (OLS) estimators for a linear model are very sensitive to unusual values in the design space or outliers among the y values. Even one single atypical value may have a large effect...

160 citations


Journal ArticleDOI
TL;DR: The main focus is on estimation from a frequentist point of view, yet some statistical and reliability characteristics for the model are also derived.
Abstract: This article addresses various properties and estimation methods for the Exponentiated Chen distribution. Although our main focus is on estimation from a frequentist point of view, some statist...

75 citations


Journal ArticleDOI
TL;DR: This paper compares the bias of standard errors and statistical power of marginal effects for generalized estimating equations (a design-based method) and generalized/linear mixed effects models (model-based methods) with small sample sizes via a simulation study.
Abstract: Two classes of methods properly account for clustering of data: design-based methods and model-based methods. Estimates from both methods have been shown to be approximately equal with large sample...

60 citations


Journal ArticleDOI
TL;DR: A CV chart is proposed by using the variable sample size and sampling interval (VSSI) feature to improve the performance of the basic CV chart, for detecting small and moderate shifts in the CV.
Abstract: This article proposes a CV chart by using the variable sample size and sampling interval (VSSI) feature to improve the performance of the basic CV chart, for detecting small and moderate shifts in the CV. The proposed VSSI CV chart is designed by allowing the sample size and the sampling interval to vary. The VSSI CV chart's statistical performance is measured by using the average time to signal (ATS) and expected average time to signal (EATS) criteria and is compared with that of existing CV charts. The Markov chain approach is employed in the design of the chart.

57 citations
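The underlying chart logic can be sketched without the VSSI machinery: a fixed-sample Shewhart-style CV chart whose control limits are Monte Carlo quantiles of the in-control CV distribution. This is an illustrative stand-in, not the article's VSSI design, and all process parameters below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 5, 10.0, 2.0        # in-control process: CV = sigma/mu = 0.2

def sample_cv(m, sd, rng):
    """CV of m subgroups of size n drawn from a Normal(mu, sd) process."""
    x = rng.normal(mu, sd, size=(m, n))
    return x.std(axis=1, ddof=1) / x.mean(axis=1)

# Control limits as the 0.135% / 99.865% quantiles of the in-control CV
# distribution (the usual 3-sigma false-alarm rate of about 0.27%).
cv0 = sample_cv(200_000, sigma, rng)
lcl, ucl = np.quantile(cv0, [0.00135, 0.99865])

# A shift that doubles sigma doubles the CV and should trigger alarms often.
cv1 = sample_cv(10_000, 2 * sigma, rng)
alarm_rate = np.mean((cv1 < lcl) | (cv1 > ucl))
```

A VSSI chart would additionally switch between two sample sizes and two sampling intervals depending on where the last CV fell, with performance summarized by ATS/EATS rather than a per-sample alarm rate.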


Journal ArticleDOI
TL;DR: Seven popular goodness-of-fit tests for normality are employed, namely the Shapiro–Wilk, Anderson–Darling, Cramér–von Mises, Pearson chi-square, Shapiro–Francia, Lilliefors, and Jarque–Bera tests, together with a searching algorithm to estimate a single transformation parameter.
Abstract: The Box–Cox power transformation is a commonly used methodology to transform the distribution of the data into a normal distribution. The methodology relies on a single transformation parameter. In this study, we focus on the estimation of this parameter. For this purpose, we employ seven popular goodness-of-fit tests for normality, namely the Shapiro–Wilk, Anderson–Darling, Cramér–von Mises, Pearson chi-square, Shapiro–Francia, Lilliefors, and Jarque–Bera tests, together with a searching algorithm. The searching algorithm is based on finding the argument of the minimum or maximum depending on the test, i.e., the maximum for the Shapiro–Wilk and Shapiro–Francia tests and the minimum for the rest. The artificial covariate method of Dag et al. (2014) is also included for comparison purposes. Simulation studies are implemented to compare the performances of the methods. Results show that Shapiro–Wilk and the artificial covariate method are more effective than the others and Pearson chi-square is the worst performing method. T...

52 citations
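The search the abstract describes — pick the Box–Cox parameter whose transformed data score best on a normality test — can be sketched as a simple grid search over the Shapiro–Wilk W statistic (the grid range and the lognormal example data are illustrative, not from the study):

```python
import numpy as np
from scipy import stats

def boxcox_by_shapiro(x, lambdas=np.linspace(-2, 2, 401)):
    """Grid-search the Box-Cox parameter maximizing the Shapiro-Wilk W statistic."""
    best_lam, best_w = None, -np.inf
    for lam in lambdas:
        # Box-Cox transform: log(x) at lambda = 0, else (x**lam - 1) / lam
        y = np.log(x) if abs(lam) < 1e-12 else (x**lam - 1.0) / lam
        w, _ = stats.shapiro(y)
        if w > best_w:
            best_lam, best_w = lam, w
    return best_lam, best_w

rng = np.random.default_rng(1)
x = rng.lognormal(size=200)   # the log transform (lambda near 0) should win here
lam, w = boxcox_by_shapiro(x)
```

For tests such as Anderson–Darling or Cramér–von Mises one would instead minimize the test statistic, as the abstract notes.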


Journal ArticleDOI
TL;DR: Two new model selection criteria are introduced that explicitly account for varying dispersion, and a fast two-step model selection scheme is proposed that is considerably more accurate and computationally less costly than the usual joint model selection.
Abstract: We address the issue of model selection in beta regressions with varying dispersion. The model consists of two submodels, namely: one for the mean and one for the dispersion. Our focus is on the selection of the covariates for each submodel. Our Monte Carlo evidence reveals that the joint selection of covariates for the two submodels is not accurate in finite samples. We introduce two new model selection criteria that explicitly account for varying dispersion and propose a fast two-step model selection scheme which is considerably more accurate and computationally less costly than the usual joint model selection. Monte Carlo evidence is presented and discussed. We also present the results of an empirical application.

44 citations


Journal ArticleDOI
TL;DR: The inverse power Lindley distribution was found to be a good alternative for modeling survival data and was applied considering two real datasets and compared with the fits obtained for already-known distributions.
Abstract: Several probability distributions have been proposed in the literature, especially with the aim of obtaining models that are more flexible with respect to the behaviors of the density and hazard rate functions. Recently, a new generalization of the Lindley distribution was proposed by Ghitany et al. (2013), called the power Lindley distribution. Another generalization was proposed by Sharma et al. (2015a), known as the inverse Lindley distribution. In this paper a distribution obtained from these two generalizations, named the inverse power Lindley distribution, is introduced. Some properties of this distribution and a study of the behavior of its maximum likelihood estimators are presented and discussed. The distribution is also fitted to two real datasets and compared with the fits obtained for already-known distributions. In these applications, the inverse power Lindley distribution was found to be a good alternative for modeling survival data.

41 citations
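A sampling sketch makes the construction concrete: the Lindley distribution is a standard exponential/gamma mixture, the power Lindley variable is a power transform of it, and the inverse power Lindley is its reciprocal. The mixture representation is standard; the transform chain below is a sketch of the construction the abstract describes:

```python
import numpy as np

def rlindley(n, theta, rng):
    """Lindley(theta) via its mixture representation:
    Exp(theta) w.p. theta/(theta+1), else Gamma(2, theta)."""
    mix = rng.random(n) < theta / (theta + 1.0)
    exp_part = rng.exponential(1.0 / theta, size=n)
    gam_part = rng.gamma(2.0, 1.0 / theta, size=n)
    return np.where(mix, exp_part, gam_part)

def r_inverse_power_lindley(n, alpha, theta, rng):
    # If L ~ Lindley(theta), then L**(1/alpha) is power Lindley,
    # and its reciprocal L**(-1/alpha) is inverse power Lindley.
    return rlindley(n, theta, rng) ** (-1.0 / alpha)

rng = np.random.default_rng(5)
z = r_inverse_power_lindley(10_000, alpha=2.0, theta=1.0, rng=rng)
```

With alpha = 1 this reduces to the inverse Lindley distribution of Sharma et al. (2015a).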


Journal ArticleDOI
TL;DR: The potential usefulness of the three-parameter transmuted generalized exponential distribution for analyzing lifetime data is investigated, and various generalizations of the two-parameter exponential distribution are compared using maximum likelihood estimation.
Abstract: In this article, we investigate the potential usefulness of the three-parameter transmuted generalized exponential distribution for analyzing lifetime data. We compare it with various generalizations of the two-parameter exponential distribution using maximum likelihood estimation. Some mathematical properties of the new extended model including expressions for the quantile and moments are investigated. We propose a location-scale regression model, based on the log-transmuted generalized exponential distribution. Two applications with real data are given to illustrate the proposed family of lifetime distributions.

38 citations


Journal ArticleDOI
TL;DR: New methods are introduced to estimate the shrinkage parameters of the Liu-type logistic estimator proposed by Inan and Erdogan (2013), which is a generalization of the Liu-type estimator defined by Liu (2003) for the linear model.
Abstract: Binary logistic regression is a widely used statistical method when the dependent variable is binary or dichotomous. In some logistic regression settings, the independent variables are collinear, which leads to the problem of multicollinearity. It is known that multicollinearity inflates the variance of the maximum likelihood estimator (MLE). Thus, this article introduces new methods to estimate the shrinkage parameters of the Liu-type logistic estimator proposed by Inan and Erdogan (2013), which is a generalization of the Liu-type estimator defined by Liu (2003) for the linear model. A Monte Carlo study is used to show the effectiveness of the proposed methods over the MLE using the mean squared error (MSE) and mean absolute error (MAE) criteria. A real-data application illustrates the benefits of the new methods. According to the simulation and application results, the proposed methods perform better than the MLE.

36 citations
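The shrinkage idea can be illustrated without the specific Liu-type estimator: under collinear predictors the logistic MLE has inflated variance, and an L2 (ridge) penalty shrinks the coefficient vector. The sketch below is a generic ridge-penalized Newton–Raphson fit, not the Inan–Erdogan estimator; the data and penalty value are invented:

```python
import numpy as np

def logistic_fit(X, y, lam=0.0, iters=50):
    """Newton-Raphson for logistic regression; lam > 0 adds an L2 (ridge) penalty."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = np.clip(X @ beta, -30, 30)
        p = 1.0 / (1.0 + np.exp(-eta))
        w = np.maximum(p * (1.0 - p), 1e-10)          # IRLS weights
        H = X.T @ (w[:, None] * X) + lam * np.eye(X.shape[1])
        g = X.T @ (y - p) - lam * beta
        beta = beta + np.linalg.solve(H, g)
    return beta

rng = np.random.default_rng(0)
n = 300
z = rng.normal(size=n)
X = np.column_stack([z, z + 0.1 * rng.normal(size=n)])   # nearly collinear columns
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-(X[:, 0] + X[:, 1])))).astype(float)

beta_mle = logistic_fit(X, y)             # unpenalized MLE
beta_ridge = logistic_fit(X, y, lam=5.0)  # shrunken fit
```

The penalized fit never has a larger coefficient norm than the MLE; the article's comparisons are in terms of MSE and MAE against the true coefficients.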


Journal ArticleDOI
TL;DR: Confidence intervals for the single coefficient of variation and the difference of coefficients of variation in the two-parameter exponential distributions are examined using the method of variance of estimates recovery (MOVER), the generalized confidence interval (GCI), and the asymptotic confidence intervals (ACI).
Abstract: This article examines confidence intervals for the single coefficient of variation and the difference of coefficients of variation in two-parameter exponential distributions, using the method of variance of estimates recovery (MOVER), the generalized confidence interval (GCI), and the asymptotic confidence interval (ACI). In simulation, the results indicate that the coverage probabilities of the GCI maintain the nominal level in general. The MOVER performs well in terms of coverage probability when the data consist only of positive values, but it has a wider expected length. The coverage probabilities of the ACI satisfy the target for large sample sizes. We also illustrate our confidence intervals using a real-world example in the area of medical science.

36 citations


Journal ArticleDOI
TL;DR: This work considers the standard two-sided power distribution to define other classes like the beta-G and the Kumaraswamy-G classes and extends the idea of two-sidedness to other ordinary distributions like the normal.
Abstract: The ordinary-G class of distributions is defined by the cumulative distribution function (cdf) F(G), where F is the cdf of an ordinary distribution whose range is the unit interval, evaluated at the baseline cdf G; it generalizes the ordinary distribution. In this work, we consider the standard two-sided power distribution to define other classes like the beta-G and the Kumaraswamy-G classes. We extend the idea of two-sidedness to other ordinary distributions like the normal. After studying the basic properties of the new class in a general setting, we consider the two-sided generalized normal distribution together with a maximum likelihood estimation procedure.

Journal ArticleDOI
TL;DR: The numerical evidence shows that the bias-corrected estimators are extremely accurate even for very small sample sizes and are superior to the previous estimators in terms of bias and root mean squared error.
Abstract: The two-parameter weighted Lindley distribution is useful for modeling survival data, but its maximum likelihood estimators (MLEs) are biased in finite samples. This motivates us to construct nearly unbiased estimators for the unknown parameters. We adopt a “corrective” approach to derive modified MLEs that are bias-free to second order. We also consider an alternative bias-correction mechanism based on Efron's bootstrap resampling. Monte Carlo simulations are conducted to compare the performance of the proposed methods with two previous methods in the literature. The numerical evidence shows that the bias-corrected estimators are extremely accurate even for very small sample sizes and are superior to the previous estimators in terms of bias and root mean squared error. Finally, applications to two real datasets are presented for illustrative purposes.
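The bootstrap half of the comparison has a simple generic form: Efron's bias correction subtracts the bootstrap estimate of the bias, giving theta_bc = 2*theta_hat - mean(theta_boot). The example estimator below (the exponential rate 1/x-bar, biased upward by a factor n/(n-1)) is a stand-in for illustration, not the weighted Lindley MLE:

```python
import numpy as np

def bootstrap_bias_corrected(x, estimator, B=2000, rng=None):
    """Efron bootstrap bias correction: theta_bc = 2*theta_hat - mean(theta_boot)."""
    if rng is None:
        rng = np.random.default_rng(0)
    theta_hat = estimator(x)
    # estimate the bias by re-estimating on B resamples of the data
    boots = np.array([estimator(rng.choice(x, size=x.size, replace=True))
                      for _ in range(B)])
    return theta_hat, 2.0 * theta_hat - boots.mean()

rng = np.random.default_rng(42)
x = rng.exponential(scale=1.0, size=20)   # small sample, true rate = 1
raw, corrected = bootstrap_bias_corrected(x, lambda s: 1.0 / s.mean())
```

Since 1/x-bar overestimates the rate on average, the correction pulls the estimate downward.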

Journal ArticleDOI
TL;DR: Simulated results show that the change point approach has good performance over any possible single-step change in process parameters for two special cases of generalized linear profiles, namely Poisson and binomial profiles.
Abstract: In this article, we adopt the change point approach to monitor generalized linear profiles in phase II statistical process control (SPC). Generalized linear profiles include a large class of profiles defined in one framework. In contrast to the conventional change point approach, we adopt the Rao score test rather than the likelihood ratio test. Simulated results show that our approach has good performance over any possible single-step change in process parameters for two special cases of generalized linear profiles, namely Poisson and binomial profiles. Some diagnostic aids are also given and a real example is introduced to shed light on the merits of our approach in real applications.

Journal ArticleDOI
TL;DR: An acceptance sampling plan for the weighted exponential distribution under a truncated life test is developed; for various acceptance numbers, consumer's confidence levels, and values of the ratio of the experimental time to the specified mean lifetime, the minimum sample size necessary to ensure a certain mean lifetime is obtained.
Abstract: Gupta and Kundu proposed a new class of weighted exponential distributions using the idea of Azzalini. In this article, we develop an acceptance sampling plan for the weighted exponential distribution under a truncated life test. For various acceptance numbers, consumer's confidence levels, and values of the ratio of the experimental time to the specified mean lifetime, the minimum sample size necessary to ensure a certain mean lifetime is obtained. The operating characteristic function values and the associated producer's risks are also presented. A numerical example is provided to illustrate the acceptance sampling plan.
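The core computation is a binomial search: find the smallest n such that observing at most c failures by the truncation time is sufficiently unlikely when the true mean equals the specified mean. The sketch below uses a plain exponential lifetime as a stand-in for the weighted exponential, so the failure probability p has a closed form; the ratio and confidence values are illustrative:

```python
from math import comb, exp

def min_sample_size(ratio, c, pstar):
    """Smallest n with P(at most c failures by time t | true mean = mu0) <= 1 - pstar,
    for an exponential lifetime and ratio = t / mu0 (a stand-in model)."""
    p = 1.0 - exp(-ratio)          # failure probability by the truncation time
    n = c + 1
    while sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(c + 1)) > 1 - pstar:
        n += 1
    return n

n_min = min_sample_size(ratio=0.942, c=2, pstar=0.95)
```

For c = 0 the loop reduces to the closed form n = ceil(-log(1 - pstar) / ratio), which is a handy sanity check.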

Journal ArticleDOI
TL;DR: A Bayesian technique, the sampling/importance resampling method (SIR), is used to estimate the parameters of multi-server queueing systems in which inter-arrival and service times are exponentially distributed (Markovian).
Abstract: In this article, we focus on multi-server queueing systems in which inter-arrival and service times are exponentially distributed (Markovian). We use a Bayesian technique, the sampling/importance r...
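The SIR mechanics are generic and easy to sketch: draw parameters from the prior, weight each draw by its likelihood, and resample proportionally to the weights. The example below estimates a single exponential service rate, one ingredient of the Markovian queueing setting; the prior and data are invented for the sketch:

```python
import numpy as np

def sir_posterior(data, prior_draws, loglik, n_out=1000, rng=None):
    """Sampling/importance resampling: weight prior draws by likelihood, resample."""
    if rng is None:
        rng = np.random.default_rng(0)
    logw = np.array([loglik(th, data) for th in prior_draws])
    w = np.exp(logw - logw.max())   # stabilized importance weights
    w /= w.sum()
    return rng.choice(prior_draws, size=n_out, replace=True, p=w)

rng = np.random.default_rng(1)
data = rng.exponential(scale=1.0 / 2.0, size=100)   # service times, true rate 2
prior = rng.gamma(1.0, 10.0, size=20_000)           # vague Gamma prior on the rate
loglik = lambda lam, x: x.size * np.log(lam) - lam * x.sum()
post = sir_posterior(data, prior, loglik, rng=rng)
```

The resampled draws approximate the posterior; here the exact posterior is a Gamma with mean close to the true rate.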

Journal ArticleDOI
TL;DR: Two modifications of the parameter structure of the gamma process are proposed: the first implies that the random effects affect only the volatility, the second only the rate.
Abstract: The random effects in a gamma process are introduced in terms of its scale parameter. However, the scale parameter affects both its mean and variance. Hence, the variation of the degradation rates ...

Journal ArticleDOI
TL;DR: A flexible cure rate survival model is proposed by assuming that the number of competing causes of the event of interest has a Poisson distribution and the time to event has an OLL-L distribution.
Abstract: We define two new lifetime models called the odd log-logistic Lindley (OLL-L) and odd log-logistic Lindley Poisson (OLL-LP) distributions with various hazard rate shapes such as increasing, decreasing, upside-down bathtub, and bathtub. Various structural properties are derived. Certain characterizations of the OLL-L distribution are presented. The maximum likelihood estimators of the unknown parameters are obtained. We propose a flexible cure rate survival model by assuming that the number of competing causes of the event of interest has a Poisson distribution and the time to event has an OLL-L distribution. The applicability of the new models is illustrated by means of real datasets.

Journal ArticleDOI
TL;DR: A variable sample size (VSS) scheme directly monitoring the coefficient of variation (CV), instead of monitoring the transformed statistics, is proposed, which provides an easier alternative as no transformation is involved.
Abstract: A variable sample size (VSS) scheme directly monitoring the coefficient of variation (CV), instead of monitoring transformed statistics, is proposed. Optimal chart parameters are computed based on two criteria: (i) minimizing the out-of-control ARL (ARL1) and (ii) minimizing the out-of-control ASS (ASS1). The performances are then compared between these two criteria. The advantages of the proposed chart over the VSS chart based on transformed statistics in the existing literature are that the former (i) provides an easier alternative, as no transformation is involved, and (ii) requires fewer observations to detect a shift when ASS1 is minimized.

Journal ArticleDOI
TL;DR: The proposed fuzzy robust regression model can be used for modeling natural phenomena whose available observations are reported as imprecise rather than crisp and performs better than the other models in suspended load estimation for the particular dataset.
Abstract: Fuzzy least-square regression can be very sensitive to unusual data (e.g., outliers). In this article, we describe how to fit an alternative robust-regression estimator in a fuzzy environment, which attempts to identify and ignore unusual data. The proposed approach draws on classical robust regression and estimation methods that are insensitive to outliers. In this regard, based on the least trimmed squares estimation method, an estimation procedure is proposed for determining the coefficients of the fuzzy regression model for crisp input-fuzzy output data. The investigated fuzzy regression model is applied to real-world bedload transport data to forecast suspended load from discharge. The accuracy of the proposed method is compared with that of the well-known fuzzy least-square regression model. The comparison results reveal that the fuzzy robust regression model performs better than the other models in suspended load estimation for the particular dataset. This comparison is done based on a s...

Journal ArticleDOI
TL;DR: A set of automatic routines useful for simulating and analyzing time series under a copula-based serial dependence and fully automated routines for obtaining maximum likelihood estimates for given time series data and then drawing a Shewhart-type control chart are provided.
Abstract: Modeling serial dependence in time series is an important step in statistical process control. We provide a set of automatic routines useful for simulating and analyzing time series under copula-based serial dependence. First, we introduce routines that generate time series data under a given copula. Second, we provide fully automated routines for obtaining maximum likelihood estimates for given time series data and then drawing a Shewhart-type control chart. Finally, real data are analyzed for illustration. We make the routines available as the “Copula.Markov” package in R.
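The kind of routine the package automates can be sketched for one concrete family: the Clayton copula's conditional distribution has a closed-form inverse, so a serially dependent chain with uniform marginals can be generated one step at a time. This stand-alone Python sketch (Clayton only, theta > 0) is an illustration, not the R package itself:

```python
import numpy as np

def clayton_markov(n, theta, rng):
    """Markov chain with Uniform(0,1) marginals and Clayton(theta) serial
    dependence, via the closed-form inverse of the conditional copula."""
    u = np.empty(n)
    u[0] = rng.random()
    for t in range(1, n):
        w = rng.random()
        # conditional inverse: v = [(w**(-theta/(1+theta)) - 1) * u**(-theta) + 1]**(-1/theta)
        u[t] = ((w ** (-theta / (1.0 + theta)) - 1.0)
                * u[t - 1] ** (-theta) + 1.0) ** (-1.0 / theta)
    return u

rng = np.random.default_rng(7)
u = clayton_markov(50_000, theta=2.0, rng=rng)
```

Feeding each u[t] through the inverse cdf of a target marginal then gives a serially dependent process with that marginal, which is the setting in which the chart is drawn.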

Journal ArticleDOI
TL;DR: GMC(Y|X), a generalized correlation measure introduced in 2012, uses kernel regressions to overcome the linearity of Pearson's correlation coefficient.
Abstract: New generalized correlation measures of 2012, GMC(Y|X), use kernel regressions to overcome the linearity of Pearson's correlation coefficients. A new matrix of generalized correlation coefficients ...

Journal ArticleDOI
TL;DR: In this article, a time-weighted chart is used to monitor the time between events (TB) in attribute charts, which is an alternative to the traditional Shewhart-type attribute charts.
Abstract: Shewhart-type attribute charts are known to be inefficient for small changes in monitoring nonconformities. An alternative way is to use a time-weighted chart to monitor the time between events (TB...

Journal ArticleDOI
TL;DR: The one-dimensional Ornstein–Uhlenbeck (OU) processes with marginal law given by tempered stable and tempered infinitely divisible distributions are investigated; the transition law between consecutive observations is characterized and the characteristic function of integrated tempered OU processes is evaluated with a view toward practical applications.
Abstract: We study the one-dimensional Ornstein–Uhlenbeck (OU) processes with marginal law given by tempered stable and tempered infinitely divisible distributions. We investigate the transition law between consecutive observations of these processes and evaluate the characteristic function of integrated tempered OU processes with a view toward practical applications. We then analyze how to draw a random sample from this class of processes by considering both the classical inverse transform algorithm and an acceptance–rejection method based on simulating a stable random sample. Using a maximum likelihood estimation method based on the fast Fourier transform, we empirically assess the simulation algorithm performance.
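The tempered stable case needs the specialized algorithms studied in the article, but exact transition sampling itself is easiest to see in the Gaussian special case, where the OU transition is normal with known mean and variance. This is a simpler analogue, not the article's method, and the parameter values are illustrative:

```python
import numpy as np

def simulate_ou(x0, lam, mu, sigma, dt, n, rng):
    """Exact transition sampling of the Gaussian OU process
    dX_t = -lam (X_t - mu) dt + sigma dW_t on a grid of step dt."""
    a = np.exp(-lam * dt)
    s = sigma * np.sqrt((1.0 - a ** 2) / (2.0 * lam))   # conditional std dev
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = mu + (x[t - 1] - mu) * a + s * rng.normal()
    return x

rng = np.random.default_rng(11)
x = simulate_ou(x0=0.0, lam=2.0, mu=1.0, sigma=0.5, dt=0.1, n=200_000, rng=rng)
```

Because the transition is sampled exactly, the long-run output matches the stationary law (mean mu, variance sigma^2 / (2 lam)) with no discretization bias.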

Journal ArticleDOI
TL;DR: The proposed directional dependence based on the Gaussian copula beta regression model reveals that the directional dependence from WTI to the S&P 500 is greater than that from the S&P 500 to WTI.
Abstract: This article proposes a new directional dependence by using the Gaussian copula beta regression model. In particular, we consider an asymmetric Generalized AutoRegressive Conditional Heteroscedasti...

Journal ArticleDOI
TL;DR: A competing risks model based on exponential distributions is considered under the adaptive Type-II progressive censoring scheme introduced by Ng et al. for life testing or reliability experiments, and Bayes estimates and the corresponding two-sided Bayesian probability intervals are obtained.
Abstract: In this article, a competing risks model based on exponential distributions is considered under the adaptive Type-II progressive censoring scheme introduced by Ng et al. [2009, Naval Research Logistics 56:687-698] for life testing or reliability experiments. Moreover, we assume that some causes of failure are unknown. The maximum likelihood estimators (MLEs) of the unknown parameters are established. The exact conditional and asymptotic distributions of the obtained estimators are derived to construct confidence intervals, as well as two different bootstrap intervals, for the unknown parameters. Under suitable priors on the unknown parameters, Bayes estimates and the corresponding two-sided Bayesian probability intervals are obtained. Also, to evaluate the average bias and mean square error of the MLEs and to compare the confidence intervals based on all the mentioned methods, a simulation study was carried out. Finally, we present one real dataset to conduct the proposed me...

Journal ArticleDOI
TL;DR: The results show that the new optimization design can improve the performance of the AEWMA control chart from an overall point of view relative to the various designs presented by Capizzi and Masarotto (2003).
Abstract: The adaptive exponentially weighted moving average (AEWMA) control chart is a smooth combination of the Shewhart and exponentially weighted moving average (EWMA) control charts. This chart was proposed by Capizzi and Masarotto (2003) to achieve reasonable performance for both small and large shifts. Capizzi and Masarotto (2003) used a pair of shifts in designing their control chart. In this study, however, the process mean shift is considered as a random variable with a certain probability distribution, and the AEWMA control chart is optimized for a wide range of mean shifts according to that probability distribution, not just for a pair of shifts. Using the Markov chain technique, the results show that the new optimization design can improve the performance of the AEWMA control chart from an overall point of view relative to the various designs presented by Capizzi and Masarotto (2003). Optimal design parameters that achieve the desired in-control average run length (ARL) are computed in s...
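The smooth Shewhart/EWMA combination works through a score function applied to the prediction error: small errors are down-weighted as in an EWMA, while large errors pass through almost unchanged, as in a Shewhart chart. The Huber-type score below is one score function of this general form; the constants are illustrative, not the optimized designs from the study:

```python
import numpy as np

def huber_score(e, lam, k):
    """Huber-type AEWMA score: EWMA-like (lam*e) for |e| <= k,
    Shewhart-like (e minus a constant) beyond k; continuous at |e| = k."""
    return np.where(np.abs(e) <= k, lam * e, e - (1.0 - lam) * k * np.sign(e))

def aewma(x, lam=0.1, k=3.0, target=0.0):
    """AEWMA statistic z_t = z_{t-1} + phi(x_t - z_{t-1})."""
    z = np.empty(len(x))
    prev = target
    for t, xt in enumerate(x):
        prev = prev + huber_score(xt - prev, lam, k)
        z[t] = prev
    return z
```

A small error of 0.5 moves the statistic by only lam * 0.5, while a large outlier of 10 moves it almost all the way, which is exactly the adaptive behavior the abstract describes.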

Journal ArticleDOI
TL;DR: The Shapiro–Wilk test was seen to have the best global performance overall, though in some circumstances the Shapiro–Francia or the D'Agostino tests offered better results, and the differences between the tests were not as clear for smaller sample sizes.
Abstract: There are several statistical hypothesis tests available for assessing normality assumptions, which is an a priori requirement for most parametric statistical procedures. The usual method for compa...

Journal ArticleDOI
TL;DR: A Markov model for Phase II profile monitoring with an autocorrelated binary response variable is introduced, and two control charts are extended in which the covariance matrix is derived from the Fisher information matrix.
Abstract: This paper introduces a Markov model for Phase II profile monitoring with an autocorrelated binary response variable. In the proposed approach, a logistic regression model is extended to describe the w...

Journal ArticleDOI
TL;DR: The parameters of the Gompertz distribution are estimated using the maximum likelihood method and a Bayesian method under three different loss functions, and the existence and uniqueness of the maximum likelihood estimator (MLE) are proved.
Abstract: In this article, statistical inference for the Gompertz distribution based on Type-II progressively hybrid censored data is discussed. The parameters of the Gompertz distribution are estimated using the maximum likelihood method and a Bayesian method under three different loss functions. We also prove the existence and uniqueness of the MLE. One-sample Bayesian prediction intervals are obtained. The work is done for different values of the parameters. We apply Monte Carlo simulation to compare the proposed methods, and an example is discussed to construct the prediction intervals.
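For the uncensored case, the ML fit can be sketched with SciPy's built-in Gompertz family; handling Type-II progressively hybrid censoring, as the article does, would require writing the censored likelihood by hand. Simulated data and the true shape value below are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# SciPy's parameterization: pdf(x; c) = c * exp(x) * exp(-c * (exp(x) - 1))
data = stats.gompertz.rvs(c=0.5, size=2000, random_state=rng)

# shape-only MLE with location and scale held fixed
c_hat, loc_hat, scale_hat = stats.gompertz.fit(data, floc=0, fscale=1)
```

With a couple of thousand uncensored observations, the shape estimate lands close to the true value, consistent with the MLE's uniqueness in this model.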

Journal ArticleDOI
TL;DR: In this article, a new class of distributions is introduced that generalizes the linear failure rate distribution and is obtained by compounding this distribution with the power series class of distributions; it contains new distributions such as the linear failure rate-geometric, linear failure rate-Poisson, linear failure rate-logarithmic, and linear failure rate-binomial distributions, as well as the Rayleigh-power series class of distributions.
Abstract: In this article, a new class of distributions is introduced, which generalizes the linear failure rate distribution and is obtained by compounding this distribution with the power series class of distributions. This new class is called the linear failure rate-power series distributions and contains some new distributions such as the linear failure rate-geometric, linear failure rate-Poisson, linear failure rate-logarithmic, and linear failure rate-binomial distributions, as well as the Rayleigh-power series class of distributions. Some earlier models, such as the exponential-power series class of distributions and the exponential-geometric, exponential-Poisson, and exponential-logarithmic distributions, are special cases of the new proposed model. The linear failure rate-power series class of distributions can accommodate five possible hazard rate shapes, namely increasing, decreasing, upside-down bathtub (unimodal), bathtub, and increasing-decreasing-increasing. Several properties of this class...
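The compounding construction can be made concrete: a linear failure rate (LFR) variable has an invertible cdf, and a linear failure rate-geometric draw is the minimum of a Geometric(p) number of i.i.d. LFR draws. The parameter values below are illustrative:

```python
import numpy as np

def lfr_ppf(u, a, b):
    """Inverse cdf of the linear failure rate distribution,
    F(x) = 1 - exp(-(a*x + b*x**2/2)), for a >= 0, b > 0."""
    s = -np.log1p(-u)
    return (-a + np.sqrt(a * a + 2.0 * b * s)) / b

def lfr_geometric(n, a, b, p, rng):
    """Linear failure rate-geometric draw: the minimum of N i.i.d. LFR
    variables with N ~ Geometric(p) on {1, 2, ...}."""
    out = np.empty(n)
    for i in range(n):
        m = rng.geometric(p)
        out[i] = lfr_ppf(rng.random(m), a, b).min()
    return out
```

Taking the minimum over a random power-series-distributed count is what produces the extra hazard rate shapes the abstract lists; the geometric case has the closed-form survival function p*S(x) / (1 - (1-p)*S(x)).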