
Showing papers by "Daniel D. Frey published in 2007"


Journal ArticleDOI
TL;DR: A large number of inventions documented by US patents claiming robustness as a key advantage over the prior art were grouped by the general strategies they employed; the strategies can be usefully organized via the P-diagram.
Abstract: The term ‘robust design’ denotes various engineering methods intended to make a product's function more consistent despite variations in the environment, manufacturing, deterioration, and customer use patterns. Most robust design methods are employed at the detailed design stage, but the benefits derived may be significantly higher if efforts are made earlier in the design process so that the design concept itself is inherently capable of being made robust. To make progress toward these ends, we studied a large number of inventions documented by US patents that claimed robustness as a key advantage over the prior art. We grouped these patents on the basis of the general strategies they employed. We found that the strategies can be usefully organized via the P-diagram. This paper will describe a few of these strategies by means of examples and explain the relationship of the strategies to the P-diagram.
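The P-diagram referenced above classifies the factors acting on a system into four categories: signal, control factors, noise factors, and response. A minimal sketch of that classification as a data structure (the class, field names, and the brake example are our own illustrative choices, not from the paper):

```python
from dataclasses import dataclass, field

# Illustrative only: the P-diagram's four factor categories.
# Field names and example values are assumptions, not from the paper.
@dataclass
class PDiagram:
    signal: str                                   # input the user controls
    response: str                                 # intended system output
    control_factors: list = field(default_factory=list)  # designer's choices
    noise_factors: list = field(default_factory=list)    # uncontrolled variation

# Hypothetical example: a disc brake.
brake = PDiagram(
    signal="pedal force",
    response="stopping distance",
    control_factors=["pad material", "rotor diameter"],
    noise_factors=["road moisture", "pad wear", "ambient temperature"],
)
```

Robust design strategies can then be read as operations on this structure, e.g. desensitizing the response to the noise-factor list rather than eliminating the noises themselves.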

70 citations


Proceedings ArticleDOI
01 Jan 2007
TL;DR: The models developed suggest that Pugh’s method, under a substantial range of assumptions, results in better design outcomes than those from these alternative procedures.
Abstract: This paper evaluates a method known as Pugh Controlled Convergence and its relationship to recent developments in design theory. Computer executable models are proposed simulating a team of people involved in iterated cycles of evaluation, ideation, and investigation. The models suggest that: 1) convergence of the set of design concepts is facilitated by the selection of a strong datum concept; 2) iterated use of an evaluation matrix can facilitate convergence of expert opinion, especially if used to plan investigations conducted between matrix runs; and 3) ideation stimulated by the Pugh matrices can provide large benefits both by improving the set of alternatives and by facilitating convergence. As a basis of comparison, alternatives to Pugh’s methods were assessed such as using a single summary criterion or using a Borda count. The models we developed suggest that Pugh’s method, under a substantial range of assumptions, results in better design outcomes than those from these alternative procedures. © 2007 ASME
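One of the comparison procedures named above is the Borda count, a rank-aggregation rule. A minimal sketch of how it scores design concepts (the scoring function and the toy rankings are illustrative assumptions, not the paper's simulation):

```python
# Illustrative Borda count: each criterion ranks all concepts (best first),
# and a concept earns (n - 1 - rank) points per criterion. The data below
# are invented for demonstration.
def borda_count(rankings):
    """rankings: list of orderings of concepts, best first."""
    scores = {}
    for ranking in rankings:
        n = len(ranking)
        for rank, concept in enumerate(ranking):
            scores[concept] = scores.get(concept, 0) + (n - 1 - rank)
    return scores

rankings = [
    ["A", "B", "C"],  # criterion 1 prefers concept A
    ["B", "A", "C"],  # criterion 2 prefers concept B
    ["A", "C", "B"],  # criterion 3 prefers concept A
]
print(borda_count(rankings))  # -> {'A': 5, 'B': 3, 'C': 1}
```

Unlike a Pugh matrix, which compares every concept against a datum and invites iteration, the Borda count collapses the rankings to a single aggregate score in one pass, which is precisely the kind of procedural difference the paper's models examine.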

32 citations


Journal IssueDOI
TL;DR: This paper examines engineering practices related to part count through three theories—the Theory of Inventive Problem Solving, Axiomatic Design, and Highly Optimized Tolerance—and concludes that, at the overall system level, jet engine part count has generally increased in response to escalating demands for system robustness, as suggested by the theory of Highly Optimized Tolerance.
Abstract: Systems engineering frequently includes efforts to reduce part count with the goal of cutting costs, enhancing performance, or improving reliability. This paper examines the engineering practices related to part count, applying three different theories—Theory of Inventive Problem Solving, Axiomatic Design, and Highly Optimized Tolerance. Case studies from the jet engine industry are used to illustrate the complicated trade-offs involved in real-world part count reduction efforts. The principal conclusions are that: (1) part consolidation at the component level has generally been accomplished as technological advancements enable it, which is consistent with the “law of ideality” in the Theory of Inventive Problem Solving; (2) part count reduction frequently increases coupling among functional requirements, design parameters, and processing variables while also delivering higher reliability, which conflicts with the theory of Axiomatic Design; and (3) at the overall system level, jet engine part count has generally increased in response to escalating demands for system robustness, as suggested by the theory of Highly Optimized Tolerance. © 2007 Wiley Periodicals, Inc. Syst Eng 10: 203–221, 2007. This paper was presented at the 16th Annual International Symposium of the International Council on Systems Engineering (INCOSE), July 9–14, 2006, Orlando, FL.

32 citations


Journal ArticleDOI
TL;DR: The compound noise strategy is found to be very effective for systems that exhibit effect sparsity, and an alternative formulation is proposed that requires less information than Taguchi's.
Abstract: This paper evaluates compound noise as a robust design method. Application of compound noise as a robust design method leads to a reduction in experimental effort. The compound noise strategy was applied to two types of situation: the first type has been described with active effects up to two-factor interactions and the second type has been described with effects up to three-factor interactions. These two situations are illustrated with the help of case studies. The paper provides theoretical justification for the effectiveness of the compound noise strategy as formulated by Taguchi and Phadke. For example, we found that the compound noise strategy is very effective for systems that exhibit effect sparsity. This paper gives an alternative procedure to formulate a compound noise, distinctly different from Taguchi's formulation. The alternative method requires less information to formulate compound noise than Taguchi's formulation does. Overall, the paper studies the effectiveness of such an alternative formulation, outlines scenarios where compound noise as a robust design method can be effectively used, and gives alternative strategies for the systems on which compound noise cannot be effective. Copyright © 2006 John Wiley & Sons, Ltd.
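The experimental savings described above come from collapsing many noise factors into one compound factor with two extreme settings. A minimal sketch of the Taguchi-style construction (the sign convention and example are our own assumptions; the paper's alternative formulation differs in what information it requires):

```python
# Illustrative Taguchi-style compound noise: given each noise factor's known
# directional effect on the response (+1 or -1), build just two extreme
# compound settings instead of testing all 2**k noise combinations.
# The effect signs below are invented for demonstration.
def compound_noise_levels(noise_effects):
    """noise_effects: dict mapping factor name -> sign of its effect (+1/-1).
    Returns (high, low): all factors pushing the response up, then all down."""
    high = {f: s for f, s in noise_effects.items()}
    low = {f: -s for f, s in noise_effects.items()}
    return high, low

effects = {"temperature": +1, "humidity": -1, "wear": +1}
hi, lo = compound_noise_levels(effects)
# With 3 noise factors, 2 compound runs replace 2**3 = 8 full combinations.
```

This is why compound noise cuts experimental effort, and also why it needs prior knowledge of each noise's effect direction, the information requirement the paper's alternative formulation is said to reduce.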

8 citations


Proceedings ArticleDOI
01 Jan 2007
TL;DR: In this paper, the problem of achieving improvements through adaptive experimentation is considered and it is shown that, in a Bayesian framework, one-factor-at-a-time experimentation is an optimally efficient response to step-by-step accrual of sample information.
Abstract: This paper considers the problem of achieving improvements through adaptive experimentation. To limit the focus we consider only design spaces with discrete two-level factors. We prove that, in a Bayesian framework, one-factor-at-a-time experimentation is an optimally efficient response to step-by-step accrual of sample information. We derive Bayesian predictive distributions for experimentation outcomes given natural conjugate priors. Using an example based on fatigue life of weld-repaired castings, we show how to use our results. Copyright © 2007 by ASME
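The adaptive one-factor-at-a-time (OFAT) procedure analyzed above can be sketched in a few lines: toggle each two-level factor once, and retain the change only if the observed response improves. The toy response function and factor count below are our own illustrative assumptions, not the paper's casting example:

```python
# Illustrative adaptive OFAT over two-level factors: flip one factor per
# experiment, keep the new level only if the response improves.
def adaptive_ofat(n_factors, response):
    x = [0] * n_factors          # baseline setting
    best = response(x)           # initial experiment
    for i in range(n_factors):   # one additional experiment per factor
        trial = x.copy()
        trial[i] = 1 - trial[i]  # flip factor i to its other level
        y = response(trial)
        if y > best:             # retain the change only on improvement
            x, best = trial, y
    return x, best

# Toy deterministic response: main effects plus one interaction (invented).
def f(x):
    return 2 * x[0] - 1 * x[1] + 3 * x[2] + 1.5 * x[0] * x[2]

setting, value = adaptive_ofat(3, f)
print(setting, value)  # -> [1, 0, 1] 6.5
```

With n factors this uses only n + 1 experiments, and because each flip is chosen in light of the data so far, it exploits the step-by-step accrual of information that the paper's Bayesian analysis shows makes OFAT efficient; under observation noise, the "keep if improved" rule becomes a decision under the posterior rather than a deterministic comparison.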