
Showing papers in "Quality and Reliability Engineering International in 2011"


Journal ArticleDOI
Bong-Jin Yum1, Kwan-Woo Kim1
TL;DR: This paper presents a bibliography of approximately 530 journal papers and books on process capability indices for the period 2000–2009, classified into books, review/overview papers, theory- and method-related papers, and special applications such as acceptance sampling plans, supplier selection, and tolerance design and other optimizations.
Abstract: This paper contains a bibliography of approximately 530 journal papers and books on process capability indices for the period 2000–2009. The related literature is classified into four major categories, namely, books, review/overview papers, theory- and method-related papers, and special applications. Theory- and method-related papers are further classified into univariate and multivariate cases, and special applications include acceptance sampling plans, supplier selection, and tolerance design and other optimizations. Copyright © 2010 John Wiley & Sons, Ltd.

141 citations
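
A minimal sketch of the two classical univariate indices that dominate this literature, with invented data and specification limits purely for illustration:

```python
import numpy as np

def cp_cpk(x, lsl, usl):
    """Estimate the classical univariate capability indices.

    Cp  = (USL - LSL) / (6 * sigma)              -- potential capability
    Cpk = min(USL - mu, mu - LSL) / (3 * sigma)  -- accounts for centering
    """
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

rng = np.random.default_rng(1)
x = rng.normal(10.02, 0.05, size=100)   # simulated measurements
print(cp_cpk(x, lsl=9.85, usl=10.15))
```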


Journal ArticleDOI
TL;DR: This paper explores how a manufacturing process can use a systematic methodology to move towards a world-class quality level, dealing with the application of the Six Sigma methodology to reduce defects in a fine grinding process of an automotive company in India.
Abstract: Six Sigma is a data-driven leadership approach using specific tools and methodologies that lead to fact-based decision making. This paper deals with the application of the Six Sigma methodology to reducing defects in a fine grinding process of an automotive company in India. The DMAIC (Define–Measure–Analyse–Improve–Control) approach has been followed here to solve the underlying problem of reducing process variation and improving the process yield. This paper explores how a manufacturing process can use a systematic methodology to move towards a world-class quality level. The application of the Six Sigma methodology reduced defects in the fine grinding process from 16.6% to 1.19%. The DMAIC methodology has had a significant financial impact on the profitability of the company in terms of reduced scrap cost, man-hour savings on rework and increased output. A saving of approximately US$2.4 million per annum was reported from this project.

108 citations
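
As a rough arithmetic companion to the figures above, the sketch below converts a defect rate into the conventional "sigma level", assuming the customary 1.5-sigma long-term shift allowance; this is the generic textbook conversion, not a computation from the paper.

```python
from scipy.stats import norm

def sigma_level(defect_rate, shift=1.5):
    # Sigma level = z-quantile of the yield plus the conventional
    # 1.5-sigma long-term shift allowance.
    return norm.ppf(1 - defect_rate) + shift

for p in (0.166, 0.0119):   # defect rates quoted in the abstract
    print(f"defect rate {p:.4f} -> about {sigma_level(p):.2f} sigma")
```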


Journal ArticleDOI
TL;DR: In this paper, two runs rules schemes are proposed to be applied to EWMA control charts, and their performance is evaluated in terms of the average run length (ARL) for small and moderate shifts.
Abstract: Control charts are extensively used in processes and are very helpful in detecting special cause variations so that timely action may be taken to eliminate them. One of the charting procedures is the Shewhart-type control chart, which is used mainly to detect large shifts. Two alternatives to the Shewhart-type control charts are the cumulative sum (CUSUM) control charts and the exponentially weighted moving average (EWMA) control charts, which are specially designed to detect small and moderate sustained changes in quality. Enhancing the ability of control chart design structures is always desirable, and this can be done in different ways. In this article, we propose two runs rules schemes to be applied to EWMA control charts and evaluate their performance in terms of the average run length (ARL). Comparisons of the proposed schemes are made with some existing representative CUSUM- and EWMA-type counterparts used for small and moderate shifts, including the classical EWMA, the classical CUSUM, the fast initial response CUSUM and EWMA, the weighted CUSUM, the double CUSUM, the distribution-free CUSUM and the runs rules schemes-based CUSUM. The findings of the study reveal that the proposed schemes perform better than all the other schemes under investigation. Copyright © 2010 John Wiley & Sons, Ltd.

104 citations
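
A minimal sketch of the general idea, not the authors' specific schemes: an EWMA statistic with an optional "two in a row beyond a warning limit" runs rule, with the ARL estimated by Monte Carlo. The smoothing constant, control limit and warning width are illustrative choices.

```python
import numpy as np

def ewma_arl(shift, lam=0.2, L=2.86, runs_rule=False, w=2.0, reps=2000, rng=None):
    """Monte Carlo ARL of an EWMA chart on N(shift, 1) observations.

    Signals when |z| exceeds the L-sigma control limit or, optionally,
    when two successive points fall beyond a w-sigma warning limit
    (an illustrative runs rule, not the paper's schemes).
    """
    rng = rng or np.random.default_rng(0)
    sig = np.sqrt(lam / (2 - lam))          # asymptotic std dev of the EWMA
    run_lengths = []
    for _ in range(reps):
        z, prev_warn, t = 0.0, False, 0
        while True:
            t += 1
            z = lam * rng.normal(shift, 1.0) + (1 - lam) * z
            if abs(z) > L * sig:
                break
            warn = abs(z) > w * sig
            if runs_rule and warn and prev_warn:
                break
            prev_warn = warn
        run_lengths.append(t)
    return float(np.mean(run_lengths))

print(ewma_arl(0.0), ewma_arl(0.5, runs_rule=True))
```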


Journal ArticleDOI
TL;DR: This study proposes two runs rules schemes for the CUSUM charts and reveals that the proposed schemes perform better for small and moderate shifts, whereas they reasonably maintain their efficiency for large shifts as well.
Abstract: The control chart is an important statistical technique that is used to monitor the quality of a process. Shewhart control charts are used to detect larger disturbances in the process parameters, whereas CUSUM and EWMA charts are meant for smaller and moderate changes. Runs rules schemes are generally used to enhance the performance of Shewhart control charts. In this study, we propose two runs rules schemes for the CUSUM charts. The performance of these two schemes is compared with the usual CUSUM, the weighted CUSUM, the fast initial response CUSUM and the usual EWMA schemes. The comparisons revealed that the proposed schemes perform better for small and moderate shifts, whereas they reasonably maintain their efficiency for large shifts as well. Copyright © 2010 John Wiley & Sons, Ltd.

97 citations
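
For contrast with the runs-rules variants, a sketch of the plain two-sided tabular CUSUM these schemes build on, with the ARL estimated by simulation; k and h are the textbook reference value and decision interval, not the paper's parameters.

```python
import numpy as np

def cusum_run_length(shift, k=0.5, h=4.0, rng=None):
    """Run length of a two-sided tabular CUSUM on N(shift, 1) observations.

    C+_t = max(0, C+_{t-1} + x_t - k),  C-_t = max(0, C-_{t-1} - x_t - k);
    the chart signals when either statistic exceeds h.
    """
    rng = rng or np.random.default_rng()
    cp = cm = 0.0
    t = 0
    while True:
        t += 1
        x = rng.normal(shift, 1.0)
        cp = max(0.0, cp + x - k)
        cm = max(0.0, cm - x - k)
        if cp > h or cm > h:
            return t

rng = np.random.default_rng(42)
arl = np.mean([cusum_run_length(0.5, rng=rng) for _ in range(5000)])
print(f"estimated ARL at a 0.5-sigma shift: {arl:.1f}")
```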


Journal ArticleDOI
TL;DR: In this paper, a method based on the likelihood ratio approach is developed to determine the location of shifts, and a numerical simulation is used to evaluate the performance of the proposed method.
Abstract: In certain cases, the quality of a process or a product can be effectively characterized by two or more multiple linear regression profiles in which the response variables are correlated. This structure can be modeled as multivariate multiple linear regression profiles. When the linear profiles are monitored separately, the correlation between response variables is ignored and misleading results can be expected. To overcome this problem, the use of methods that consider the multivariate structure between response variables is inevitable. In this paper, we propose four methods to monitor this structure in Phase II. The performance of the methods is compared through simulation studies in terms of the average run length criterion. Furthermore, a method based on the likelihood ratio approach is developed to determine the location of shifts, and a numerical simulation is used to evaluate the performance of the proposed method. Finally, the use of the methods is illustrated by a numerical example. Copyright © 2010 John Wiley & Sons, Ltd.

92 citations


Journal ArticleDOI
TL;DR: This expository paper reviews the methods implemented for Bernoulli processes in health-related monitoring, offers advice to practitioners and presents a comprehensive literature review for researchers.
Abstract: Bernoulli processes have been monitored using a wide variety of techniques in statistical process control. The data consist of information on successive items classified as conforming (nondefective) or nonconforming (defective). In some cases, the probability of obtaining a nonconforming item is very small; this is known as a high quality process. This area of statistical process control is also applied to health-related monitoring, where the incidence rate of a medical problem such as a congenital malformation is of interest. In these applications, standard Shewhart control charts based on the binomial distribution are no longer useful. In our expository paper, we review the methods implemented for these scenarios and present ideas for future work in this area. We offer advice to practitioners and present a comprehensive literature review for researchers. Copyright © 2011 John Wiley & Sons, Ltd.

80 citations
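
One standard tool for such high-quality and health-related settings, covered by reviews of this area, is the CCC (geometric) chart that monitors the count of items between nonconforming ones; a sketch with an assumed incidence rate follows.

```python
from scipy.stats import geom

def ccc_limits(p, alpha=0.0027):
    """Probability limits for a cumulative count of conforming (CCC) chart.

    X = number of items inspected until the first nonconforming one,
    X ~ Geometric(p) on {1, 2, ...}.  Counts below the LCL suggest the
    process has deteriorated; counts above the UCL suggest improvement.
    """
    lcl = geom.ppf(alpha / 2, p)
    ucl = geom.ppf(1 - alpha / 2, p)
    return int(lcl), int(ucl)

# Illustrative incidence rate, e.g. 1 nonconforming item in 1000:
print(ccc_limits(0.001))
```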


Journal ArticleDOI
TL;DR: Cost-based selective maintenance decision-making, which selects the best maintenance actions for a group of machines in a machine line under limited maintenance durations, is presented; the results show that fault losses can be further reduced by optimizing the maintenance interval and maintenance duration.
Abstract: A machine line is a type of manufacturing system in which machines are connected in series or in parallel. In the maintenance programs of such systems, it is important both to ensure reliability and to reduce the total cost of maintenance and failure losses. Cost-based selective maintenance decision-making, which selects the best maintenance actions for a group of machines in a machine line, is presented under limited maintenance durations. Fault losses and maintenance costs of a single machine are calculated for different maintenance actions, i.e. minimal repair, preventive maintenance and overhaul, based on their effects on the fault rate of the machine. An algorithm combining heuristic rules and tabu search is proposed to solve the presented selective maintenance model. Finally, a case study on the maintenance decision-making problem of a connecting rod machining line in an automobile engine workshop is presented to illustrate the applicability of the proposed method. The results show that fault losses can be further reduced by optimizing the maintenance interval and maintenance duration. Copyright © 2010 John Wiley & Sons, Ltd.

77 citations


Journal ArticleDOI
TL;DR: The t-charts are tested for implementation in short production runs to monitor the process mean and their statistical properties are evaluated; the results show that the t-charts can be successfully implemented to monitor a short run.
Abstract: Short-run productions are common in manufacturing environments like job shops, which are characterized by a high degree of flexibility and production variety. Owing to the limited number of possible inspections during a short run, often the Phase I control chart cannot be performed and correct estimates of the population mean and standard deviation are not available. Thus, the hypothesis of known in-control population parameters cannot be assumed and the usual control chart statistics to monitor the sample mean are not applicable. t-charts have recently been proposed in the literature to protect against errors in estimating the population standard deviation due to the limited number of available sampling measures. In this paper the t-charts are tested for implementation in short production runs to monitor the process mean, and their statistical properties are evaluated. Statistical performance measures properly designed to test chart sensitivity during short runs have been used to compare the performance of Shewhart and EWMA t-charts. Two initial setup conditions for the short run have been modelled: fixing the population mean exactly equal to the process target or, alternatively, introducing an initial setup error that influences the distribution of the statistic. The numerical study considers several out-of-control process operating conditions, including one-step shifts in the population mean and/or standard deviation. The obtained results show that the t-charts can be successfully implemented to monitor a short run. Finally, an illustrative example is presented to show the use of the investigated t-charts. Copyright © 2010 John Wiley & Sons, Ltd.

75 citations
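
A sketch of the t-chart idea in its simplest Shewhart-type form, assuming each subgroup is standardized by its own sample standard deviation so that no Phase I estimate of sigma is needed; the EWMA variant and the setup-error modelling studied in the paper are omitted.

```python
import numpy as np
from scipy.stats import t as t_dist

def t_chart(subgroups, target, alpha=0.0027):
    """Plot statistic and limits for a Shewhart-type t chart.

    Each subgroup of size n yields T = (xbar - target) / (s / sqrt(n)),
    which is t-distributed with n-1 df when the process is on target --
    no Phase I estimate of sigma is required.
    """
    subgroups = np.asarray(subgroups, dtype=float)
    n = subgroups.shape[1]
    xbar = subgroups.mean(axis=1)
    s = subgroups.std(axis=1, ddof=1)
    t_stats = (xbar - target) / (s / np.sqrt(n))
    lim = t_dist.ppf(1 - alpha / 2, df=n - 1)
    return t_stats, -lim, lim

rng = np.random.default_rng(7)
data = rng.normal(5.0, 0.1, size=(10, 5))    # 10 subgroups of size 5
stats, lcl, ucl = t_chart(data, target=5.0)
print(np.round(stats, 2), lcl, ucl)
```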


Journal ArticleDOI
TL;DR: It is pointed out that Six Sigma has Statistical Thinking as its foundation; for Six Sigma to continue to be effective, it is important that users have a clear understanding of the nature of Six Sigma and be able to address the related challenges in practice.
Abstract: Six Sigma has enjoyed considerable popularity in industry for about a quarter of a century. While the standard contents of Six Sigma have been described and discussed widely, some little-articulated aspects of Six Sigma implementation deserve the attention of serious practitioners. In this paper, a ‘5W+1H’ (What, Why, When, Where, Who, How) format is used to elucidate the nature of Six Sigma in a non-mathematical discussion, followed by observations peculiar to the usual mode of development of Six Sigma professionals. It is pointed out that Six Sigma has Statistical Thinking as its foundation; for Six Sigma and its associated frameworks such as Design for Six Sigma and Lean Six Sigma to continue to be effective, it is important that users have a clear understanding of the nature of Six Sigma and be able to address the related challenges in practice. Copyright © 2010 John Wiley & Sons, Ltd.

67 citations


Journal ArticleDOI
TL;DR: A unifying and quantitative conceptual framework for healthcare processes is developed from the viewpoint of process improvement, linking to process improvement methodologies such as business process reengineering, six sigma, lean thinking, theory of constraints, and total quality management.
Abstract: This paper aims to develop a unifying and quantitative conceptual framework for healthcare processes from the viewpoint of process improvement. The work adapts standard models from operations management to the specifics of healthcare processes. We propose concepts for organizational modeling of healthcare processes, breaking down work into micro processes, tasks, and resources. In addition, we propose an axiological model which breaks down general performance goals into process metrics. The connection between the two types of models is made explicit as a system of metrics for process flow and resource efficiency. The conceptual models offer exemplars for practical support in process improvement efforts, suggesting to project leaders how to make a diagrammatic representation of a process, which data to gather, and how to analyze and diagnose a process’s flow and resource utilization. The proposed methodology links to process improvement methodologies such as business process reengineering, six sigma, lean thinking, theory of constraints, and total quality management. In these approaches, opportunities for process improvement are identified from a diagnosis of the process under study. By providing conceptual models and practical templates for process diagnosis, the framework relates many disconnected strands of research and application in healthcare process improvement to the unifying pursuit of process improvement. Copyright © 2011 John Wiley & Sons, Ltd.

66 citations


Journal ArticleDOI
TL;DR: A classification of the different types of failures is presented, and policies are established for analyzing data at the system and component levels, taking the failure types into account.
Abstract: This paper proposes a method to statistically analyze maintenance data for complex medical devices with censoring and missing information. It presents a classification of the different types of failures and establishes policies for analyzing data at the system and component levels, taking into account the failure types. The results of this analysis can be used as basic assumptions in the development of a maintenance/inspection optimization model. As a case study, we present the reliability analysis of a general infusion pump from a hospital. Copyright © 2010 John Wiley & Sons, Ltd.
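
A generic sketch of one ingredient of such an analysis, fitting a Weibull model to right-censored records by maximizing the censored log-likelihood; the records and starting values are hypothetical, and the paper's failure-type classification is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_negloglik(params, times, failed):
    """Negative log-likelihood for right-censored Weibull(beta, eta) data.

    Exact failures contribute the density; censored records contribute
    the survival function exp(-(t/eta)^beta).
    """
    beta, eta = np.exp(params)          # optimize on the log scale (>0)
    z = (times / eta) ** beta
    ll = np.sum(failed * (np.log(beta / eta) + (beta - 1) * np.log(times / eta)) - z)
    return -ll

# Hypothetical maintenance records: times in days, 1 = failure, 0 = censored.
times = np.array([120., 340., 95., 410., 600., 230., 510., 75.])
failed = np.array([1, 1, 1, 0, 0, 1, 0, 1])
res = minimize(weibull_negloglik, x0=np.log([1.0, 300.0]), args=(times, failed))
beta_hat, eta_hat = np.exp(res.x)
print(f"beta = {beta_hat:.2f}, eta = {eta_hat:.0f} days")
```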

Journal ArticleDOI
TL;DR: Cost comparisons of fixed-ratio sampling, VSI, VSS, VSIVSS with DWL, and multivariate exponentially weighted moving average (MEWMA) charts are made, indicating the economic efficacy of using either VSIVSS with DWL or MEWMA charts in practice when cost minimization is of interest to the control chart user.
Abstract: Recent studies have shown that enhancing the common T2 control chart with variable sample sizes (VSS) and variable sampling intervals (VSI) policies combined with a double warning line scheme (DWL) improves shift detection times over either pure VSI or VSS schemes for almost all shifts in the process mean. In this paper, we look at this problem from an economic perspective, certainly at least as important a criterion as shift detection time if one considers what occurs in industry today. Our method is first to construct a cost model to find the economic statistical design (ESD) of the DWL T2 control chart using the general model of Lorenzen and Vance (Technometrics 1986; 28:3–11). Subsequently, we find the values of the chart parameters that minimize the cost model using a genetic algorithm optimization method. Cost comparisons of fixed-ratio sampling, VSI, VSS, VSIVSS with DWL, and multivariate exponentially weighted moving average (MEWMA) charts are made, which indicate the economic efficacy of using either VSIVSS with DWL or MEWMA charts in practice if cost minimization is of interest to the control chart user. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this paper, a new non-parametric CUSUM Mean Chart is proposed to monitor possible small mean shifts in the process; it showed better detection ability than the two existing charts in monitoring and detecting small process mean shifts.
Abstract: Not all data in practice come from a process with a normal distribution. When the process distribution is non-normal or unknown, the commonly used Shewhart control charts are not suitable. In this paper, a new non-parametric CUSUM Mean Chart is proposed to monitor possible small mean shifts in the process. The sampling properties of the new monitoring statistics are examined and the average run lengths of the proposed chart are evaluated. Two numerical examples are used to illustrate the proposed chart and to compare it with two existing charts that assume normality and a Beta distribution, respectively. The CUSUM Mean Chart showed better detection ability than those two charts in monitoring and detecting small process mean shifts. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: By redefining and listing a set of control charting rules, this article evaluates their performance on the X-bar, R, S and S2 charts; application of a few of these rules to real data sets shows their detection ability and usefulness for practitioners.
Abstract: In the literature a number of control charting rules have been proposed to decide whether a process is in control or out of control. Some issues with these rules are highlighted in this article. By redefining and listing a set of rules, we evaluate their performance on the X-bar, R, S and S2 charts. We also compare the performance of these rules using their power curves to identify the superior ones. Application of a few of these rules to real data sets shows their detection ability and usefulness for practitioners.
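
As a minimal example of the power computations involved, the sketch below evaluates the simplest rule, a single subgroup mean beyond the k-sigma limits of an X-bar chart, as a function of the mean shift; the article's redefined rule set is not reproduced.

```python
import numpy as np
from scipy.stats import norm

def rule_power(shift, n=5, k=3.0):
    """Power of the basic rule 'one subgroup mean beyond the k-sigma
    limits' on an X-bar chart, for a mean shift given in units of the
    process standard deviation."""
    d = shift * np.sqrt(n)                  # shift of the standardized mean
    return norm.sf(k - d) + norm.cdf(-k - d)

for s in (0.0, 0.5, 1.0, 1.5):
    print(f"shift {s:>3}: P(signal) = {rule_power(s):.4f}")
```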

Journal ArticleDOI
TL;DR: A sample size model is developed and the acceptance/rejection criteria are updated; the OCC developed for the Cpk sampling plan shows higher accuracy (by an order of magnitude) in classifying lots correctly.
Abstract: Common acceptance sampling plans for variables do not take the process performance into account when determining the required sample size. An attempt to overcome this gap was made by Negrin et al. (Quality Eng. 2009; 21:306–318), who developed a sampling plan based on the Cpk index. The plan is a multistage acceptance sampling plan based on Cpk for variables taken as a random sample from a lot of size N having an (approximately) normal distribution with a known variance. In the current research, we relax the assumption of a known variance and develop a Cpk sampling plan based on an unknown variance. We develop a sample size model and update the acceptance/rejection criteria of the previous sampling plan to the new, more realistic model with unknown variance. In addition, we develop the operating characteristic curve (OCC). The sampling plan is compared with the commonly used plan MIL-STD-414 (Sampling Procedures and Tables for Inspection by Variables for Percent Defective. Department of Defense: Washington, DC, 1957) and it is found (via simulations) that the Cpk sampling plan has a smaller probability of accepting defective lots, and the required sample size is smaller for large lots. In addition, a comparison is made between the two OCCs and it is found that the OCC developed for the Cpk sampling plan is more accurate (by an order of magnitude) in classifying lots correctly. Copyright © 2010 John Wiley & Sons, Ltd.
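
A sketch of the flavour of such a decision rule, not Negrin et al.'s actual plan: estimate Cpk using the sample standard deviation (variance unknown) and accept the lot if the estimate clears a critical value; the critical value and sample below are illustrative.

```python
import numpy as np

def cpk_accept(sample, lsl, usl, c_crit):
    """Accept the lot if the estimated Cpk clears a critical value.

    With unknown variance, sigma is replaced by the sample standard
    deviation; c_crit would come from the plan's tables (here it is
    just an illustrative number).
    """
    mu, s = np.mean(sample), np.std(sample, ddof=1)
    cpk_hat = min(usl - mu, mu - lsl) / (3 * s)
    return cpk_hat, cpk_hat >= c_crit

rng = np.random.default_rng(3)
sample = rng.normal(50.1, 0.8, size=40)
print(cpk_accept(sample, lsl=47.0, usl=53.0, c_crit=1.0))
```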

Journal ArticleDOI
TL;DR: The results reveal that the method proposed by Kim et al. (J. Qual. Technol. 2003) can be designed to be robust to non-normality for both highly skewed and heavy-tailed distributions.
Abstract: In some statistical process control (SPC) applications, it is assumed that a quality characteristic or a vector of quality characteristics of interest follows a univariate or multivariate normal distribution, respectively. However, in certain applications this assumption may fail to hold and could lead to misleading results. In this paper, we study the effect of non-normality when the quality of a process or product is characterized by a linear profile. Skewed and heavy-tailed symmetric non-normal distributions are used to evaluate the non-normality effect numerically. The results reveal that the method proposed by Kim et al. (J. Qual. Technol. 2003; 35:317–328) can be designed to be robust to non-normality for both highly skewed and heavy-tailed distributions. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: A new multivariate CUSUM control chart is proposed, based on self-adaptation of its reference value according to information from the current process readings, to quickly detect multivariate process mean shifts; the chart achieves good overall performance in detecting a particular range of shifts.
Abstract: We propose a new multivariate CUSUM control chart, which is based on self-adaptation of its reference value according to information from the current process readings, to quickly detect multivariate process mean shifts. By specifying the minimum magnitude of the process mean shift in terms of its non-centrality parameter, our proposed control chart can achieve good overall performance in detecting a particular range of shifts. This adaptive feature of our method is based on two EWMA operators that estimate the current process mean level and make the detection at each step approximately optimal. Moreover, we compare our chart with the conventional multivariate CUSUM chart; the proposed chart offers substantially improved detection over the existing charts across the targeted range of shifts. The Markov chain method, through which the average run length can be computed, is also presented. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: This paper provides a procedure for optimal designs of the multivariate synthetic T2 chart for the process mean, based on MRL, for both the zero and steady-state modes.
Abstract: The average run length (ARL) is usually used as the sole measure of performance of a multivariate control chart. The Hotelling's T2, multivariate exponentially weighted moving average (MEWMA) and multivariate cumulative sum (MCUSUM) charts are commonly designed to be optimal based on the ARL. As in univariate quality control, in multivariate quality control the shape of the run length distribution changes with the magnitude of the shift in the mean vector, from highly skewed when the process is in control to nearly symmetric for large shifts. Because the shape of the run length distribution changes with the magnitude of the shift in the mean vector, the median run length (MRL) provides additional and more meaningful information about the in-control and out-of-control performances of multivariate charts, not given by the ARL. This paper provides a procedure for optimal designs of the multivariate synthetic T2 chart for the process mean, based on MRL, for both the zero-state and steady-state modes. Two Mathematica programs, one for each of the zero-state and steady-state modes, are given for a quick computation of the optimal parameters of the synthetic T2 chart designed based on MRL. These optimal parameters are provided in the paper for the bivariate case with sample sizes n ∈ {4, 7, 10}. The MRL performances of the synthetic T2, MEWMA and Hotelling's T2 charts are also compared. Copyright © 2011 John Wiley & Sons, Ltd.
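
A simulation sketch of why the MRL is informative: run lengths of a plain Hotelling T2 chart with known in-control parameters (no synthetic rule) are strongly right-skewed, so the median and mean run lengths differ noticeably; all parameters are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def t2_run_length(delta, n=4, p=2, alpha=0.005, rng=None):
    """Run length of a Hotelling T2 chart with known in-control parameters.

    Subgroup means of size n from N(mu, I); T2 = n * ||xbar - mu0||^2 is
    chi-square(p) in control, so the UCL is the (1 - alpha) quantile.
    """
    rng = rng or np.random.default_rng()
    ucl = chi2.ppf(1 - alpha, df=p)
    mean = np.array([delta] + [0.0] * (p - 1))   # shift along the first axis
    t = 0
    while True:
        t += 1
        xbar = rng.normal(mean, 1.0 / np.sqrt(n))
        if n * xbar @ xbar > ucl:
            return t

rng = np.random.default_rng(11)
rls = np.array([t2_run_length(0.5, rng=rng) for _ in range(5000)])
print(f"ARL = {rls.mean():.1f}, MRL = {np.median(rls):.0f}")  # right-skewed: MRL < ARL
```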

Journal ArticleDOI
TL;DR: The general architecture of the suggested MMF is described and modelled through diagrams elucidating the general operation of PAS 55, and this development outlines the operational structure of a software tool that can incorporate MIMOSA standards and be made suitable for e-maintenance functions, as an alternative to commercial systems.
Abstract: This article shows the process of modelling a reference maintenance management framework (MMF) that represents the general requirements of the asset management specification PAS 55. The modelled MMF is expressed using the standardized and publicly available Business Process Modelling (BPM) languages UML 2.1 (Unified Modelling Language) and BPMN 1.0 (BPM Notation). The features of these notations allow the modelled processes to be easily integrated into the general information system of an organization and allow the creation of a flexible structure that can be quickly, and even automatically, adapted to new necessities. This article presents a brief review of the usage of UML in maintenance projects, the general characteristics of PAS 55, modelling concepts and their applications in the project of modelling the MMF. The arguments underlying the methodology and the choice of UML and BPMN are set out. The general architecture of the suggested MMF is described and modelled through diagrams elucidating the general operation of PAS 55. This development outlines the operational structure of a software tool that can incorporate MIMOSA standards and can be made suitable for e-maintenance functions, as an alternative to commercial systems. Finally, some conclusions about the modelled framework are presented. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: A graph-theoretic systems approach is applied in this paper to quantify human error in maintenance activities, modelling the identified factors and their interactions/interrelationships as a human error digraph.
Abstract: Assessment of human error in maintenance requires identification of the contributing factors that lead to human error(s). These factors are called human error inducing factors (HEIFs), which take into consideration both the active and latent error-contributing aspects related to man, machine and environment. A graph-theoretic systems approach is applied in this paper to quantify human error in maintenance activities, modelling the identified factors and their interactions/interrelationships as a human error digraph. The nodes in the digraph represent the HEIFs and the edges represent their interrelationships. The digraph is converted into an equivalent matrix and an expression based on this matrix is developed, which is characteristic of the human error in maintenance. This expression is used to evaluate a human error index by substituting the numerical values of the factors and their interrelations. The index is a measure of the human error potential involved in the maintenance of systems. A higher value of the index indicates a greater error likelihood for the associated tasks, meaning more effort is required to make the system less prone to human error. The proposed methodology is illustrated using a case study. The approach is anticipated to play a significant role in identifying sources of human error and predicting their impact, and will help to integrate human factors during the design stage with the objective of reducing human error in maintenance. Copyright © 2011 John Wiley & Sons, Ltd.
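
In this graph-theoretic literature the expression obtained from the digraph matrix is typically a permanent-like function; the sketch below computes a plain matrix permanent for a hypothetical four-factor digraph, with all scores invented for illustration.

```python
import numpy as np
from itertools import permutations

def permanent(m):
    """Permanent of a square matrix -- like the determinant but with all
    terms added; fine for the small matrices of a factor digraph."""
    n = m.shape[0]
    return sum(np.prod([m[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

# Hypothetical 4-factor digraph: diagonal = factor scores (e.g. on 1-9),
# off-diagonal = strength of the directed influence between factors
# (0 where no edge exists).  All numbers are illustrative.
M = np.array([[6, 2, 0, 3],
              [1, 5, 2, 0],
              [0, 3, 7, 1],
              [2, 0, 1, 4]], dtype=float)
print(f"human error index = {permanent(M):.0f}")
```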

Journal ArticleDOI
TL;DR: It is demonstrated that the hierarchical modeling approach is flexible and powerful in modeling a complex degradation process with nonlinear function and random coefficient.
Abstract: In this case study, we investigate the degradation process of light-emitting diodes (LEDs), which are used as a light source in DNA sequencing machines. Accelerated degradation tests are applied by varying temperature and forward current, and the light outputs are measured by a computerized measuring system. A degradation path model, which connects to the LED function recommended in Mitsuo (1991), is used to describe the degradation process. We consider variations both in measurement errors and in degradation paths among individual test units. It is demonstrated that the hierarchical modeling approach is flexible and powerful in modeling a complex degradation process with a nonlinear function and random coefficients. After fitting the model by maximum likelihood estimation, the failure time distribution can be obtained by simulation. Copyright © 2010 John Wiley & Sons, Ltd.
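
A generic sketch of the random-coefficient degradation idea, assuming a simple exponential-decay path rather than the paper's LED model: simulate unit-specific paths, define failure as the output crossing a threshold, and read the failure-time distribution off the crossing times.

```python
import numpy as np

def simulate_failure_times(n_units=10000, threshold=0.5, rng=None):
    """Failure times from a random-coefficient degradation model.

    Each unit's light output follows L(t) = exp(-r * t) with a unit-specific
    rate r ~ lognormal; failure = output dropping below `threshold`.
    The functional form and parameters are illustrative assumptions.
    """
    rng = rng or np.random.default_rng(0)
    r = rng.lognormal(mean=np.log(2e-4), sigma=0.3, size=n_units)
    # Solve exp(-r * t) = threshold for t:
    return -np.log(threshold) / r

ft = simulate_failure_times()
print(f"median life = {np.median(ft):.0f} h, "
      f"B10 life = {np.quantile(ft, 0.10):.0f} h")
```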

Journal ArticleDOI
TL;DR: NNs are studied for the detection and determination of mean and/or variance shifts as well as for pattern recognition in SPC charts, and the use of NNs in multivariate control charts is addressed.
Abstract: Neural networks (NNs) are massively parallel computing mechanisms emulating the human brain. They have been shown to perform satisfactorily in a wide variety of applications. In recent years, the efficiencies provided by NNs have also begun to be applied in statistical process control (SPC). SPC charts have become one of the most commonly used tools for monitoring process stability and variability in today's manufacturing environment. These tools are used to determine whether a process is statistically in control or out of control, but in some cases, such as in the presence of autocorrelation or of a specific pattern in the data, they do not make it possible to correctly and quickly detect and classify the existing fault. These problems have led many researchers to propose alternative methods for monitoring processes, such as the use of NNs. In this paper, we discuss issues concerning the combination of both tools. Specifically, we study NNs for the detection and determination of mean and/or variance shifts as well as for pattern recognition in SPC charts. Furthermore, the use of NNs when the data are correlated is discussed. Finally, the use of NNs in multivariate control charts is addressed. The network architectures used for each case, their mode of operation and the performance of the proposed NN applications are pointed out. Copyright © 2011 John Wiley & Sons, Ltd.
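
A toy version of the shift-detection task described above, assuming scikit-learn is available: a small feed-forward network is trained to flag a 1.5-sigma mean shift from a window of 20 standardized observations; the architecture and window length are arbitrary choices, not those of the reviewed studies.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Generate labelled windows: in-control N(0, 1) vs shifted N(1.5, 1).
rng = np.random.default_rng(0)
n, w = 4000, 20
shifts = rng.choice([0.0, 1.5], size=n)
X = rng.normal(0, 1, size=(n, w)) + shifts[:, None]
y = (shifts > 0).astype(int)

# Small feed-forward network; train on 3000 windows, test on the rest.
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
clf.fit(X[:3000], y[:3000])
print(f"test accuracy: {clf.score(X[3000:], y[3000:]):.2f}")
```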

Journal ArticleDOI
TL;DR: This paper proposes control charts for monitoring changes in the Weibull shape parameter β based on the range of a random sample from the smallest extreme value distribution, and derives control limits for both one- and two-sided control charts, unbiased with respect to the ARL.
Abstract: In this paper, we propose control charts for monitoring changes in the Weibull shape parameter β. These charts are based on the range of a random sample from the smallest extreme value distribution. The control chart limits depend only on the sample size, the desired stable average run length (ARL), and the stable value of β. We derive control limits for both one- and two-sided control charts. They are unbiased with respect to the ARL. We discuss sample size requirements if the stable value of β is estimated from past data. The proposed method is applied to data on the breaking strengths of carbon fibers. We recommend one-sided charts for detecting specific changes in β because they are expected to signal out-of-control conditions sooner than the two-sided charts. Copyright © 2010 John Wiley & Sons, Ltd.
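
The pivotal fact behind these charts, in a short simulation sketch: if T is Weibull with shape β, then ln T is a smallest extreme value (SEV) variable with scale 1/β, so the range of a log-transformed sample is 1/β times the range of a standard SEV sample and its quantiles can be simulated once. The limits below are illustrative, not the paper's tables.

```python
import numpy as np

def log_range_limits(beta0, n=5, alpha=0.0027, reps=100000, rng=None):
    """Probability limits for the range of a log-transformed Weibull sample.

    If T ~ Weibull(shape beta), then ln T is a smallest-extreme-value (SEV)
    variable with scale 1/beta, so range(ln T_1..ln T_n) equals 1/beta times
    the range of a standard SEV sample.  Quantiles of the standard range are
    simulated and rescaled by the stable shape value beta0.
    """
    rng = rng or np.random.default_rng(0)
    # Standard SEV variates by inversion: W = ln(-ln U), U ~ Uniform(0, 1)
    w = np.log(-np.log(rng.uniform(size=(reps, n))))
    r = w.max(axis=1) - w.min(axis=1)
    lcl, ucl = np.quantile(r, [alpha / 2, 1 - alpha / 2]) / beta0
    return lcl, ucl

print(log_range_limits(beta0=2.0))
```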

Journal ArticleDOI
TL;DR: An alternate way of computing the information matrix, a key consideration in planning an accelerated life test, is proposed; the generalized linear model approach allows optimal designs to be computed using iteratively weighted least-square solutions rather than a maximum likelihood method.
Abstract: Optimal experimental design practices are prominent in many applications. This paper proposes an alternate way of computing the information matrix, a key consideration in planning an accelerated life test. The generalized linear model approach allows optimal designs to be computed using iteratively weighted least-square solutions versus a maximum likelihood method. This approach is demonstrated with an assumed exponential distribution and allows the practitioner to observe the underlying structure of the optimal experimental design matrix and its relationship to important factors such as censoring and a nonlinear response function. Optimality criteria are discussed for both parameter estimation and prediction variance at an intended usage condition, which is typically outside the feasible accelerated test region. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: An exponentially weighted moving average (EWMA)-based control chart that plots only one statistic at a time is proposed to simultaneously monitor the mean and variability with individual observations.
Abstract: A traditional approach to monitoring both the location and the scale parameters of a quality characteristic is to use two separate control charts. These schemes have some difficulties in concurrent tracking and interpretation. To overcome these difficulties, some researchers have proposed schemes consisting of only one chart. However, none of these schemes is designed to work with individual observations. In this research, an exponentially weighted moving average (EWMA)-based control chart that plots only one statistic at a time is proposed to simultaneously monitor the mean and variability with individual observations. The performance of the proposed scheme is compared by simulation with that of two other existing combination charts. The results show that, in general, the proposed chart performs significantly better than the other combination charts. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: The model robust regression (MRR) technique, a semiparametric method, is proposed to improve the quality of model estimation and adapt its fits of each response to the desirability function approach, one of the most popular MRO techniques.
Abstract: Multi-response optimization (MRO) in response surface methodology is quite common in applications. Before the optimization phase, appropriate fitted models for each response are required. A common problem is model misspecification, which occurs when any of the models built for the responses are misspecified, resulting in an erroneous optimal solution. The model robust regression (MRR) technique, a semiparametric method, has been shown to be more robust to misspecification than either parametric or nonparametric methods. In this study, we propose the use of MRR to improve the quality of model estimation and adapt its fits of each response to the desirability function approach, one of the most popular MRO techniques. A case study and simulation studies are presented to illustrate the procedure and to compare the semiparametric method with the parametric and nonparametric methods. The results show that MRR performs much better than the other two methods in terms of model comparison criteria in most situations during the modeling stage. In addition, the simulated optimization results for MRR are more reliable during the optimization stage. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: This methodology handles the robustness objective function and the input-variable constraints simultaneously and is applicable to general functions of the system performance with random variables.
Abstract: Robust design is an efficient method for product and process improvement that combines experimentation with optimization to create a system that is less sensitive to uncontrollable variation. In this article, a simple and integrated modeling methodology for robust design is proposed. This methodology handles the robustness objective function and the input-variable constraints simultaneously. The objective function is written in terms of the multivariate process capability vector (MCpm) of several competing features of the system under study. The proposed methodology is applicable to general functions of the system performance with random variables. The effectiveness of the methodology is verified using two real-world examples, whose results are compared with those of other robust design methods. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: A novel approach is proposed, based on goal programming, to find the best combination of factors so as to optimize multiresponse-multicovariate surfaces with consideration of location and dispersion effects.
Abstract: In many complex experiments, nuisance factors may have large effects that must be accounted for. Covariates are one of the most important kinds of nuisance factors: they can be measured but cannot be controlled within the experimental runs. In this paper a novel approach based on goal programming is proposed to find the best combination of factors so as to optimize multiresponse-multicovariate surfaces with consideration of location and dispersion effects. Furthermore, it is supposed that several covariates considered in the experiment have probability distributions of known form. One objective is to find the most probable values of each covariate. For this purpose, a multiobjective mathematical optimization model is proposed and its efficacy is demonstrated by two numerical examples. Copyright © 2010 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: A high-dimensional (HD) control chart approach for profile monitoring that is based on the adaptive Neyman test statistic for the coefficients of discrete Fourier transform of profiles is proposed.
Abstract: Profile monitoring is an important and rapidly emerging area of statistical process control. In many industries, the quality of processes or products can be characterized by a profile that describes a relationship or a function between a response variable and one or more independent variables. A change in the profile relationship can indicate a change in the quality characteristic of the process or product and, therefore, needs to be monitored for control purposes. We propose a high-dimensional (HD) control chart approach for profile monitoring that is based on the adaptive Neyman test statistic for the coefficients of discrete Fourier transform of profiles. We investigate both linear and nonlinear profiles, and we study the robustness of the HD control chart for monitoring profiles with stationary noise. We apply our control chart to monitor the process of nonlinear woodboard vertical density profile data of Walker and Wright (J. Qual. Technol. 2002; 34:118–129) and compare the results with those presented in Williams et al. (Qual. Reliab. Eng. Int. 2007; to appear). Copyright © 2010 John Wiley & Sons, Ltd.
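
A sketch of the statistic's shape with two simplifications flagged as assumptions: an orthonormal DCT stands in for the paper's discrete Fourier transform, and the noise level is taken as known.

```python
import numpy as np
from scipy.fft import dct

def adaptive_neyman(profile, reference, sigma):
    """Adaptive Neyman statistic on frequency-domain profile deviations.

    The deviation from the reference profile is mapped to the frequency
    domain with an orthonormal DCT (standing in for a discrete Fourier
    transform), standardized by the known noise level, and screened
    adaptively:  T = max_m  sum_{j<=m} (z_j^2 - 1) / sqrt(2m).
    """
    d = np.asarray(profile, float) - np.asarray(reference, float)
    z = dct(d, norm='ortho') / sigma      # orthonormal => iid N(0,1) under H0
    cum = np.cumsum(z**2 - 1.0)
    m = np.arange(1, len(z) + 1)
    return np.max(cum / np.sqrt(2.0 * m))

rng = np.random.default_rng(5)
x = np.linspace(0, 1, 64)
ref = np.sin(2 * np.pi * x)
print(adaptive_neyman(ref + rng.normal(0, 0.1, 64), ref, sigma=0.1))          # in control
print(adaptive_neyman(ref + 0.15 + rng.normal(0, 0.1, 64), ref, sigma=0.1))   # shifted
```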