Journal ArticleDOI

Experiments: Planning, Analysis, and Parameter Design Optimization

01 Jan 2002-Journal of Quality Technology (Informa UK Limited)-Vol. 34, Iss: 1, pp 134-136
TL;DR: This work discusses the practice of problem solving, testing hypotheses about statistical parameters, calculating and interpreting confidence limits, tolerance limits and prediction limits, and setting up and interpreting control charts.
Abstract: The best adjective to describe this work is "sweeping." The range of subject matter is so broad that it can almost be described as containing everything except fuzzy set theory. Included are explicit discussions of the basics of probability (relegated to an appendix); the practice of problem solving; testing hypotheses about statistical parameters; calculating and interpreting confidence limits; tolerance limits and prediction limits; setting up and interpreting control charts; design of experiments; analysis of variance; line and surface fitting; and maximum likelihood procedures. If you can think of something that is not in this list, then it probably means I have overlooked it.
Citations
Journal ArticleDOI
TL;DR: Attention is drawn to a procedure in which an adaptive sequential design is employed to derive surrogate models and to estimate sensitivity indices for different sub-groups of inputs; the procedure is particularly useful when there is little prior knowledge about the response surface.
Abstract: If a computer model is run many times with different inputs, the results obtained can often be used to derive a computationally cheaper approximation, or surrogate model, of the original computer code. Thereafter, the surrogate model can be employed to reduce the computational cost of a variance-based sensitivity analysis (VBSA) of the model output. Here, we draw attention to a procedure in which an adaptive sequential design is employed to derive surrogate models and estimate sensitivity indices for different sub-groups of inputs. The results of such group-wise VBSAs are then used to select inputs for a final VBSA. Our procedure is particularly useful when there is little prior knowledge about the response surface and the aim is to explore both the global variability and local nonlinear features of the model output. Our conclusions are based on computer experiments involving the process-based river basin model INCA-N, in which outputs like the average annual riverine load of nitrogen can be regarded as functions of 19 model parameters.
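The variance-based sensitivity indices discussed in this abstract are typically estimated by Monte Carlo. As a minimal sketch of the idea (pure Python, a hypothetical two-input toy model rather than INCA-N, and no surrogate model or sequential design), first-order Sobol indices can be computed with the standard pick-and-freeze estimator:

```python
import random

def sobol_first_order(f, d, n=20000, seed=0):
    """Estimate first-order Sobol indices S_i of f: [0,1]^d -> R with the
    pick-and-freeze Monte Carlo estimator, a basic VBSA building block."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(d)] for _ in range(n)]
    B = [[rng.random() for _ in range(d)] for _ in range(n)]
    fA = [f(x) for x in A]
    fB = [f(x) for x in B]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    indices = []
    for i in range(d):
        # Replace column i of B with column i of A, so x_i is "frozen"
        # to match A while all other inputs are resampled.
        fABi = [f(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        cov = sum(ya * (yi - yb) for ya, yb, yi in zip(fA, fB, fABi)) / n
        indices.append(cov / var)
    return indices

# Toy model dominated by its first input; for independent U(0,1) inputs the
# exact values are S1 = 1/1.01 ~ 0.990 and S2 = 0.01/1.01 ~ 0.010.
S1, S2 = sobol_first_order(lambda x: x[0] + 0.1 * x[1], d=2)
```

A surrogate-based VBSA, as in the paper, would substitute a cheap approximation of the model for `f` once the surrogate has been fitted.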

284 citations


Cites methods from "Experiments: Planning, Analysis, an..."

  • ...This was confirmed by a preliminary investigation using an optimal 2^(19-10) resolution V fractional factorial design (Wu and Hamada, 2000) in which the whole set of 19 inputs varied simultaneously....

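For readers unfamiliar with the 2^(19-10) notation: a two-level fractional factorial is built by aliasing additional factors to interactions of a smaller full factorial. A toy sketch of the construction (a 2^(3-1) half fraction with generator C = AB, not the far larger resolution V design used in the study above):

```python
from itertools import product

# Full 2^2 design in coded factors A and B; the third factor C is
# generated by the defining relation C = A*B, so only 4 of the 8
# possible A/B/C combinations are run (a 2^(3-1) half fraction).
runs = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]
```

Every run satisfies A*B*C = +1, which is exactly the aliasing that lets 3 factors be screened in 4 runs instead of 8; the study's 2^(19-10) design screens 19 inputs in 512 runs instead of 524,288.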

Journal ArticleDOI
TL;DR: The results of this experiment do not support the hypotheses that pair programming in general reduces the time required to solve the tasks correctly or increases the proportion of correct solutions.
Abstract: A total of 295 junior, intermediate, and senior professional Java consultants (99 individuals and 98 pairs) from 29 international consultancy companies in Norway, Sweden, and the UK were hired for one day to participate in a controlled experiment on pair programming. The subjects used professional Java tools to perform several change tasks on two alternative Java systems with different degrees of complexity. The results of this experiment do not support the hypotheses that pair programming in general reduces the time required to solve the tasks correctly or increases the proportion of correct solutions. On the other hand, there is a significant 84 percent increase in effort to perform the tasks correctly. However, on the more complex system, the pair programmers had a 48 percent increase in the proportion of correct solutions but no significant differences in the time taken to solve the tasks correctly. For the simpler system, there was a 20 percent decrease in time taken but no significant differences in correctness. However, the moderating effect of system complexity depends on the programmer expertise of the subjects. The observed benefits of pair programming in terms of correctness on the complex system apply mainly to juniors, whereas the reductions in duration to perform the tasks correctly on the simple system apply mainly to intermediates and seniors. It is possible that the benefits of pair programming will exceed the results obtained in this experiment for larger, more complex tasks and if the pair programmers have a chance to work together over a longer period of time.

274 citations

Journal ArticleDOI
TL;DR: Investigators in preventive medicine and related areas should begin considering factorial experiments alongside other approaches, and experimental designs should be chosen from a resource management perspective.

225 citations

Journal ArticleDOI
TL;DR: The degradation modeling framework presented herein addresses the challenge of building a degradation database by utilizing failure time data, which are easier to obtain, and readily available (relative to sensor-based degradation signals) from historical maintenance/repair records.
Abstract: Recent developments in degradation modeling have been targeted towards utilizing degradation-based sensory signals to predict residual life distributions. Typically, these models consist of stochastic parameters that are estimated with the aid of an historical database of degradation signals. In many applications, building a degradation database, where components are run-to-failure, may be very expensive and time consuming, as in the case of generators or jet engines. The degradation modeling framework presented herein addresses this challenge by utilizing failure time data, which are easier to obtain, and readily available (relative to sensor-based degradation signals) from historical maintenance/repair records. Failure time values are first fitted to a Bernstein distribution whose parameters are then used to estimate the prior distributions of the stochastic parameters of an initial degradation model. Once a complete realization of a degradation signal is observed, the assumptions of the initial degradation model are revised and improved for future predictions. This approach is validated using real world vibration-based degradation information from a rotating machinery application.
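The prior-then-update scheme described in this abstract can be illustrated with a much simpler stand-in model. The sketch below uses a hypothetical linear degradation path with a conjugate normal prior on its slope — not the paper's Bernstein-based formulation — to show how observing part of a degradation signal revises the parameter estimate and hence the predicted threshold-crossing time:

```python
import random

def posterior_slope(times, signal, mu0, var0, noise_var):
    """Conjugate normal update for the slope theta of a linear degradation
    model y(t) = theta * t + eps, with eps ~ N(0, noise_var) and prior
    theta ~ N(mu0, var0). Returns the posterior mean and variance."""
    prec = 1.0 / var0 + sum(t * t for t in times) / noise_var
    mean = (mu0 / var0 + sum(t * y for t, y in zip(times, signal)) / noise_var) / prec
    return mean, 1.0 / prec

# Synthetic vibration-like signal with true slope 2.0, observed at t = 1..20.
rng = random.Random(1)
ts = list(range(1, 21))
ys = [2.0 * t + rng.gauss(0, 0.5) for t in ts]

# Prior (in the paper, obtained from failure-time records) deliberately off.
mean, var = posterior_slope(ts, ys, mu0=1.5, var0=1.0, noise_var=0.25)

# Remaining life estimate: time until the mean path crosses threshold D = 100.
rul = 100.0 / mean - ts[-1]
```

The data pull the slope estimate from the prior guess of 1.5 toward the true 2.0, shrinking its variance; the paper's framework does the analogous update for the stochastic parameters of its degradation model.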

224 citations


Cites methods from "Experiments: Planning, Analysis, an..."

  • ...This is done by testing the hypothesis [23]...


References
Journal ArticleDOI
TL;DR: In this article, a new method for nonparametric multiple regression is presented, which models the regression surface as a sum of general smooth functions of linear combinations of the predictor variables, fitted in an iterative manner.
Abstract: A new method for nonparametric multiple regression is presented. The procedure models the regression surface as a sum of general smooth functions of linear combinations of the predictor variables in an iterative manner. It is more general than standard stepwise and stagewise regression procedures, does not require the definition of a metric in the predictor space, and lends itself to graphical interpretation.

2,224 citations

Journal ArticleDOI
TL;DR: A resource management perspective on making a complete or reduced factorial design decision is advocated, in which the investigator seeks a strategic balance between service to scientific objectives and economy.
Abstract: An investigator who plans to conduct an experiment with multiple independent variables must decide whether to use a complete or reduced factorial design. This article advocates a resource management perspective on making this decision, in which the investigator seeks a strategic balance between service to scientific objectives and economy. Considerations in making design decisions include whether research questions are framed as main effects or simple effects; whether and which effects are aliased (confounded) in a particular design; the number of experimental conditions that must be implemented in a particular design and the number of experimental subjects the design requires to maintain the desired level of statistical power; and the costs associated with implementing experimental conditions and obtaining experimental subjects. In this article 4 design options are compared: complete factorial, individual experiments, single factor, and fractional factorial. Complete and fractional factorial designs and single-factor designs are generally more economical than conducting individual experiments on each factor. Although relatively unfamiliar to behavioral scientists, fractional factorial designs merit serious consideration because of their economy and versatility.
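The economy argument in this abstract ultimately comes down to counting experimental conditions. A quick sketch of the arithmetic (illustrative numbers, not figures from the article):

```python
def n_conditions(k, p=0):
    """Conditions in a 2^(k-p) two-level (fractional) factorial."""
    return 2 ** (k - p)

# For k = 5 two-level factors:
full = n_conditions(5)        # complete factorial: 32 conditions
half = n_conditions(5, p=1)   # half-fraction 2^(5-1): 16 conditions
individual = 5 * 2            # five separate two-arm experiments: 10 cells,
                              # but each factor's effect is then estimated
                              # from only one fifth of the total subjects
```

This is the trade-off the article frames as resource management: fractional designs cut the number of conditions while still letting every subject contribute to every main-effect estimate.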

319 citations
