
Showing papers in "Journal of Mechanical Design" in 2012


Journal ArticleDOI
TL;DR: In this paper, the authors illustrate the identifiability problem in model updating, namely the difficulty of distinguishing the effects of calibration parameters from those of model discrepancy, explain the mechanisms behind it, and attempt to shed light on when a system may or may not be identifiable; a companion paper demonstrates that using multiple responses, each of which depends on a common set of calibration parameters, can substantially enhance identifiability.
Abstract: To use predictive models in engineering design of physical systems, one should first quantify the model uncertainty via model updating techniques employing both simulation and experimental data. While calibration is often used to tune unknown calibration parameters of a computer model, the addition of a discrepancy function has been used to capture model discrepancy due to underlying missing physics, numerical approximations, and other inaccuracies of the computer model that would exist even if all calibration parameters are known. One of the main challenges in model updating is the difficulty in distinguishing between the effects of calibration parameters versus model discrepancy. We illustrate this identifiability problem with several examples, explain the mechanisms behind it, and attempt to shed light on when a system may or may not be identifiable. In some instances, identifiability is achievable under mild assumptions, whereas in other instances, it is virtually impossible. In a companion paper, we demonstrate that using multiple responses, each of which depends on a common set of calibration parameters, can substantially enhance identifiability.
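
A common way to write the model-updating formulation behind this discussion is the Kennedy–O'Hagan form; the notation below is a standard illustration, not an excerpt from the paper:

y_e(x) = y_m(x, \theta^*) + \delta(x) + \epsilon,

where y_e is the experimental response at input x, y_m the computer model evaluated at the true but unknown calibration parameters \theta^*, \delta(x) the discrepancy function, and \epsilon the observation error. The identifiability problem arises because different pairs (\theta, \delta) can reproduce the experimental data equally well.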

284 citations


Journal ArticleDOI
TL;DR: A cognitive engineering design study is presented that examines the effect of the distance of analogical design stimuli on design solution generation, and places those findings in context of results from the literature.
Abstract: This work lends insight into the meaning and impact of “near” and “far” analogies. A cognitive engineering design study is presented that examines the effect of the distance of analogical design stimuli on design solution generation, and places those findings in the context of results from the literature. The work ultimately sheds new light on the impact of analogies in the design process and the significance of their distance from a design problem. In this work, the design repository from which analogical stimuli are chosen is the U.S. patent database, a natural choice, as it is one of the largest and easily accessed catalogued databases of inventions. The “near” and “far” analogical stimuli for this study were chosen based on a structure of patents, created using a combination of Latent Semantic Analysis and a Bayesian-based algorithm for discovering structural form, resulting in clusters of patents connected by their relative similarity. The findings of this engineering design study are contextualized with the findings of recent work in design by analogy, by mapping the analogical stimuli used in the earlier work into similar structures along with the patents used in the current study. Doing so allows the discovery of a relationship between all of the stimuli and their relative distance from the design problem. The results confirm that “near” and “far” are relative terms, and depend on the characteristics of the potential stimuli. Further, although the literature has shown that “far” analogical stimuli are more likely to lead to the generation of innovative solutions with novel characteristics, there is such a thing as too far. That is, if the stimuli are too distant, they can become harmful to the design process. Importantly, as well, the data mapping approach to identify analogies works, and is able to impact the effectiveness of the design process. This work has implications not only in the area of finding inspirational designs to use for design by analogy processes in practice, but also for synthesis, or perhaps even unification, of future studies in the field of design by analogy.

222 citations


Journal ArticleDOI
TL;DR: This paper presents a novel approach, referred to as the WordTree design-by-analogy method, for identifying distant-domain analogies as part of the ideation process and highlights potential improvements for the method and areas for future research in engineering design theory.
Abstract: This paper presents a novel approach, referred to as the WordTree design-by-analogy method, for identifying distant-domain analogies as part of the ideation process. The WordTree method derives its effectiveness through a design team's knowledge and readily available information sources (e.g., patent databases, Google) and does not require specialized computational knowledge bases. A controlled cognitive experiment and an evaluation of the method with redesign projects illustrate the method's influence in assisting engineers in design-by-analogy. Individuals using the WordTree method identified significantly more analogies and searched outside the problem domain as compared to the control group. The team redesign projects demonstrate the WordTree method's effectiveness in longer-term, more realistic, higher-validity team projects and with a variety of different design problems. Teams successfully identified effective analogies, analogous domains, and analogous patents. Unexpected and unique solutions are identified using the method. For example, one of the teams identified a dump truck and panning for gold as effective analogies for the design of a self-cleaning cat litter box. In the controlled experiment, a cherry pitter was identified and implemented as a solution for designing a machine to shell peanuts. The experimental results also highlight potential improvements for the method and areas for future research in engineering design theory. [DOI: 10.1115/1.4006145] Keywords: analogy, conceptual design, innovation, design method, idea generation

198 citations


Journal ArticleDOI
TL;DR: In this article, a nested extreme response surface (NERS) approach was proposed to efficiently carry out time-dependent reliability analysis and determine the optimal designs for RBDO with probabilistic constraints.
Abstract: A primary concern in practical engineering design is ensuring high system reliability throughout a product's lifecycle, which is subject to time-variant operating conditions and component deteriorations. Thus, the capability of dealing with time-dependent probabilistic constraints in reliability-based design optimization (RBDO) is of vital importance in practical engineering design applications. This paper presents a nested extreme response surface (NERS) approach to efficiently carry out time-dependent reliability analysis and determine the optimal designs. This approach employs the kriging model to build a nested response surface of time corresponding to the extreme value of the limit state function. The efficient global optimization (EGO) technique is integrated with the NERS approach to extract the extreme time responses of the limit state function for any given system design. An adaptive response prediction and model maturation (ARPMM) mechanism is developed based on the mean square error (MSE) to concurrently improve the accuracy and computational efficiency of the proposed approach. With the nested response surface of time, the time-dependent reliability analysis can be converted into the time-independent reliability analysis, and existing advanced reliability analysis and design methods can be used. The NERS approach is compared with existing time-dependent reliability analysis approaches and integrated with RBDO for engineered system design with time-dependent probabilistic constraints. Two case studies are used to demonstrate the efficacy of the proposed NERS approach.
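
The conversion that the nested response surface enables can be sketched as follows (illustrative notation, not quoted from the paper):

R(0, T) = Pr{ G(X, t) > 0 \ \forall t \in [0, T] } = Pr{ \min_{t \in [0, T]} G(X, t) > 0 }.

If a surrogate t^*(X) \approx \arg\min_t G(X, t) is available, built here with kriging and refined by the EGO procedure, the time-dependent problem reduces to the time-independent one Pr{ G(X, t^*(X)) > 0 }, to which existing reliability analysis and design methods apply.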

190 citations


Journal ArticleDOI
TL;DR: In this paper, a monolithic torsional spring is presented as the basic component of a modular compliant system for series elastic actuators; the spring, whose design was refined through an iterative FEA-based optimization process, has an external diameter of 85 mm, a thickness of 3 mm, and a weight of 61.5 g.
Abstract: The introduction of intrinsic compliance in the actuation system of assistive robots improves safety and dynamical adaptability. Furthermore, in the case of wearable robots for gait assistance, the exploitation of conservative compliant elements as energy buffers can mimic the intrinsic dynamical properties of legs during locomotion. However, commercially available compliant components do not generally make it possible to meet the desired requirements in terms of admissible peak load, as typically required by gait assistance, while guaranteeing low stiffness and a compact and lightweight design. This paper presents a novel compact monolithic torsional spring to be used as the basic component of a modular compliant system for series elastic actuators. The spring, whose design was refined through an iterative FEA-based optimization process, has an external diameter of 85 mm, a thickness of 3 mm, and a weight of 61.5 g. The spring, characterized using a custom dynamometric test bed, shows a linear torque versus angle characteristic. The compliant element has a stiffness of 98 N·m/rad and is capable of withstanding a maximum torque of 7.68 N·m. Good agreement between simulated and experimental data was observed, with a maximum resultant error of 6%. By arranging a number of identical springs in series or in parallel, it is possible to render different torque versus angle characteristics, in order to match specific application requirements.
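
As a quick illustration of the modularity claim, the usual spring-combination relations apply (the 98 N·m/rad figure is from the paper; the arrangements below are standard relations, not results reported there):

k_parallel = n k  (two springs: 196 N·m/rad at the same angular range),
k_series = k / n  (two springs: 49 N·m/rad, with twice the deflection at a given torque).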

106 citations


Journal ArticleDOI
TL;DR: In this article, an adjustable constant-force mechanism (ACFM) is proposed to passively regulate the contact force of a robot end-effector by combining the negative stiffness of a bistable mechanism with the positive stiffness of a linear spring to generate a constant-force output.
Abstract: Force regulation is a challenging problem for robot end-effectors when interacting with an unknown environment. It often requires sophisticated sensors with computerized control. This paper presents an adjustable constant-force mechanism (ACFM) to passively regulate the contact force of a robot end-effector. The proposed ACFM combines the negative stiffness of a bistable mechanism and the positive stiffness of a linear spring to generate a constant-force output. By prestressing the linear spring, the constant-force magnitude can be adjusted to adapt to different working environments. The ACFM is a monolithic compliant mechanism that has no frictional wear and is capable of miniaturization. We propose a design formulation to find optimal mechanism configurations that produce the most nearly constant force. The resulting force-displacement curve and maximum-stress curve can be easily adjusted to fit different application requirements. Experiments show that an end-effector equipped with the ACFM can adapt to a surface of variable height without additional motion programming. Since sensors and control effort are minimized, we expect this mechanism to provide a reliable alternative that allows robot end-effectors to interact gently with an environment.
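
The stiffness-cancellation idea can be stated compactly (illustrative notation, not the paper's):

F_total(x) = F_bistable(x) + k_spring x.

Over the working stroke the bistable element operates in its negative-stiffness region, so dF_bistable/dx \approx -k_spring and dF_total/dx \approx 0, giving a nearly constant output force. Prestressing the linear spring shifts F_total up or down without changing its slope, which is how the constant-force level is adjusted.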

104 citations



Journal ArticleDOI
Chen Jiang, Xu Han, Li Wang, Jie Liu, Z. Zhang
TL;DR: In this paper, a new reliability analysis technique is developed based on a hybrid uncertain model, which can deal with problems with limited information, where uncertain parameters are treated as random variables, while some of their distribution parameters are not given precise values but variation intervals.
Abstract: Traditional reliability analysis generally uses a probabilistic approach to quantify uncertainty, but it needs a great amount of information to construct precise distributions of the uncertain parameters. In this paper, a new reliability analysis technique is developed based on a hybrid uncertain model, which can deal with problems with limited information. All uncertain parameters are treated as random variables, while some of their distribution parameters are not given precise values but variation intervals. Due to the existence of the interval parameters, a limit-state strip enclosed by two bounding hyper-surfaces is obtained in the transformed normal space, instead of the single hyper-surface obtained in conventional reliability analysis. All the limit-state strips are then summarized into two different classes, and corresponding reliability analysis models are proposed for them. A monotonicity analysis is carried out for the probability transformations of the random variables, through which the effects of the interval distribution parameters on the limit state can be well revealed. Based on the monotonicity analysis, two algorithms are then formulated to solve the proposed hybrid reliability models. Three numerical examples are investigated to demonstrate the effectiveness of the present method.
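
Because the distribution parameters are only known to within intervals, the reliability itself becomes an interval; a compact way to express this (illustrative notation, not quoted from the paper): with distribution parameters \theta \in [\theta^L, \theta^U],

P_f^L = \min_{\theta} Pr{ g(X; \theta) \le 0 } \le P_f \le \max_{\theta} Pr{ g(X; \theta) \le 0 } = P_f^U.

The two bounding hyper-surfaces of the limit-state strip correspond to the parameter values attaining these extremes, and the monotonicity analysis is what makes locating those extreme values tractable.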

92 citations


Journal ArticleDOI
TL;DR: In this paper, a large-range CPM with enhanced out-of-plane stiffness (LRXYCPMEOS) is presented, which is obtained from a 4-PP-E (E: planar) decoupled parallel mechanism.
Abstract: There is an increasing need for compact large-range XY compliant parallel manipulators (CPMs). This paper deals with a novel large-range XY CPM with enhanced out-of-plane stiffness (LRXYCPMEOS). Unlike most XY CPMs based on the 4-PP (P: prismatic) decoupled parallel mechanism, the LRXYCPMEOS is obtained from a 4-PP-E (E: planar) decoupled parallel mechanism by replacing each P joint with a planar double multibeam parallelogram module (DMBPM) and the E joint with a spatial double multibeam parallelogram module. Normalized analytical models for the LRXYCPMEOS are then presented. As a case study, an LRXYCPMEOS with a motion range of 10 mm × 10 mm in both positive directions is presented in detail, covering the geometrical parameter determination, performance characteristics analysis, actuation force check, and buckling check. The analytical models are compared with the finite element analysis (FEA) models. Finally, dynamic considerations, manufacturability, out-of-plane stiffness, and result interpretation are discussed. It is shown that the LRXYCPMEOS in the case study has the following merits: large range of motion up to 20 mm × 20 mm, enhanced out-of-plane stiffness which is approximately 7.1 times larger than that of the associated planar XY CPM without the spatial compliant leg, and well-constrained parasitic motion with the parasitic translation along the Z-axis less than 2 × 10−4 mm, the parasitic rotation about the X-axis/Y-axis less than 2 × 10−6 rad, and the parasitic rotation about the Z-axis below 1 × 10−6 rad.

83 citations


Journal ArticleDOI
TL;DR: In this article, the authors presented a symbolic formulation for analytical compliance analysis and synthesis of flexure mechanisms with serial, parallel, or hybrid topologies based on the screw theory that characterizes flexure deformations with motion twists and loadings with force wrenches.
Abstract: This paper presents a symbolic formulation for analytical compliance analysis and synthesis of flexure mechanisms with serial, parallel, or hybrid topologies. Our approach is based on screw theory, which characterizes flexure deformations with motion twists and loadings with force wrenches. In this work, we first derive a symbolic formulation of the compliance and stiffness matrices for commonly used flexure elements, flexure joints, and simple chains. Elements of these matrices are all explicit functions of flexure parameters. To analyze a general flexure mechanism, we subdivide it into multiple structural modules, which we identify as serial, parallel, or hybrid chains. We then analyze each module with the known flexure structures in the library. Finally, we use a bottom-up approach to obtain the compliance/stiffness matrix for the overall mechanism. This is done by taking appropriate coordinate transformations of twists and wrenches in space. Four practical examples are provided to demonstrate the approach. A numerical example is employed to compare analytical compliance models against a finite element model. The results show that the errors are sufficiently small (2%, compared with the finite element (FE) model) if the range of motion is limited to linear deformations. This work provides a systematic approach for compliance analysis and synthesis of general flexure mechanisms. The symbolic formulation enables subsequent design tasks, such as compliance synthesis or sensitivity analysis. [DOI: 10.1115/1.4006441]
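
The bottom-up composition described above follows the standard screw-theory rules (illustrative notation, not copied from the paper): for modules with local compliance matrices C_i and 6x6 adjoint transformations Ad_i to a common frame,

serial chain:   C = \sum_i Ad_i C_i Ad_i^T,
parallel chain: K = \sum_i ( Ad_i C_i Ad_i^T )^{-1},  with C = K^{-1}.

Twists transform as t' = Ad t and wrenches as w' = Ad^{-T} w, which is why compliance transforms congruently under Ad and why serial compliances and parallel stiffnesses add.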

81 citations


Journal ArticleDOI
TL;DR: The results show that the FBS linkage model is promising and improves current methods in several ways: the model accounts explicitly for all possible dependencies between product elements, allows capturing and modeling of all relevant change requests, and improves the understanding of why and how changes propagate.
Abstract: Engineering change (EC) is a source of uncertainty. While the number of changes to a design can be optimized, their existence cannot be eliminated. Each change is accompanied by intended and unintended impacts, both of which might propagate and cause further knock-on changes. Such change propagation causes uncertainty in design time, cost, and quality and thus needs to be predicted and controlled. Current engineering change propagation models map the product connectivity into a single-domain network and model change propagation as spread within this network. Those models leave out most dependencies from other domains and suffer from “hidden dependencies”. This paper proposes the function-behavior-structure (FBS) linkage model, a multidomain model which combines concepts of the function-behavior-structure model from Gero and colleagues with the change prediction method (CPM) from Clarkson and colleagues. The FBS linkage model is represented in a network and a corresponding multidomain matrix of structural, behavioral, and functional elements and their links. Change propagation is described as spread in that network using principles of graph theory. The model is applied to a diesel engine. The results show that the FBS linkage model is promising and improves current methods in several ways: The model (1) accounts explicitly for all possible dependencies between product elements, (2) allows capturing and modeling of all relevant change requests, (3) improves the understanding of why and how changes propagate, (4) is scalable to different levels of decomposition, and (5) is flexible enough to present the results at different levels of abstraction. All these features of the FBS linkage model can help control and counteract change propagation and reduce uncertainty and risk in design.
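
The propagation calculation behind such models can be sketched with a small, single-domain CPM-style computation; the FBS linkage model extends the same idea to a multidomain network of functional, behavioral, and structural elements. The likelihood matrix, function names, and depth cutoff below are illustrative assumptions, not the paper's implementation.

# Minimal sketch of CPM-style change propagation (illustrative only).
# L[i][j] is the direct likelihood that a change in element j causes a change in element i.
def combined_likelihood(L, src, dst, max_depth=3):
    """Likelihood that a change in src eventually reaches dst, combining
    every simple path of at most max_depth edges as an independent route."""
    n = len(L)
    path_probs = []

    def dfs(node, visited, prob, depth):
        if depth == max_depth:
            return
        for nxt in range(n):
            p = L[nxt][node]                      # direct likelihood node -> nxt
            if p > 0.0 and nxt not in visited:
                if nxt == dst:
                    path_probs.append(prob * p)   # complete propagation path found
                else:
                    dfs(nxt, visited | {nxt}, prob * p, depth + 1)

    dfs(src, {src}, 1.0, 0)
    survive = 1.0
    for q in path_probs:
        survive *= 1.0 - q                        # change survives none of the paths
    return 1.0 - survive

# Toy 3-element network: 0 -> 1 -> 2 plus a weaker direct link 0 -> 2.
L = [[0.0, 0.0, 0.0],
     [0.5, 0.0, 0.0],
     [0.1, 0.4, 0.0]]
print(combined_likelihood(L, src=0, dst=2))       # 1 - (1 - 0.1)(1 - 0.5*0.4) = 0.28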

Journal ArticleDOI
TL;DR: In this article, the axial migration of the rollers relative to the nut in the planetary roller screw mechanism (PRSM) is predicted, which is an undesirable phenomenon that can cause binding and eventually lead to the destruction of the mechanism.
Abstract: This paper develops a kinematic model to predict the axial migration of the rollers relative to the nut in the planetary roller screw mechanism (PRSM). This axial migration is an undesirable phenomenon that can cause binding and eventually lead to the destruction of the mechanism. It is shown that this migration is due to slip at the nut–roller interface, which is caused by a pitch mismatch between the spur-ring gear and the effective nut–roller helical gear pairs. This pitch circle mismatch can be due to manufacturing errors, deformations of the mechanism due to loading, and uncertainty in the radii of contact between the components. This paper derives the angle through which slip occurs and the subsequent axial migration of the roller. It is shown that this roller migration does not affect the overall lead of the PRSM. In addition, the general orbital mechanics, in-plane slip velocity at the nut–roller interface, and the axial slip velocities at the nut–roller and the screw–roller interfaces are also derived. Finally, an example problem is developed using a range of pitch mismatch values for the given roller screw dimensions, and the axial migration and slip velocities are determined.

Journal ArticleDOI
TL;DR: In this article, the authors modify the first-order reliability method (FORM) so that the truncated random variables are transformed into truncated standard normal variables, and saddlepoint approximation is then used to estimate the reliability.
Abstract: In many engineering applications, the probability distributions of some random variables are truncated; these truncated distributions result from restricting the domains of other probability distributions. If the first-order reliability method (FORM) is directly used, the truncated random variables will be transformed into unbounded standard normal variables. This treatment may result in large errors in reliability analysis. In this work, we modify FORM so that the truncated random variables are transformed into truncated standard normal variables. After the first-order approximation and variable transformation, saddlepoint approximation is then used to estimate the reliability. Without increasing the computational cost, the proposed method is generally more accurate than the original FORM for problems with truncated random variables. [DOI: 10.1115/1.4007150]
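
For reference, the transformation at issue can be written with standard relations (illustrative, not quoted from the paper). If X has parent CDF F_X restricted to [a, b], its truncated CDF is

F_T(x) = ( F_X(x) - F_X(a) ) / ( F_X(b) - F_X(a) ),   a \le x \le b.

Standard FORM maps u = \Phi^{-1}( F_T(x) ), producing an unbounded standard normal variable even though X is bounded; the modification described here instead maps X to a truncated standard normal variable, and the saddlepoint approximation is then used to estimate the probability content of the linearized limit state.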

Journal ArticleDOI
TL;DR: A new methodology for uncertainty quantification in systems that require multidisciplinary iterative analysis between two or more coupled component models is proposed, based on computing the probability of satisfying the interdisciplinary compatibility equations, conditioned on specific values of the coupling variables.
Abstract: This paper proposes a new methodology for uncertainty quantification in systems that require multidisciplinary iterative analysis between two or more coupled component models. This methodology is based on computing the probability of satisfying the interdisciplinary compatibility equations, conditioned on specific values of the coupling (or feedback) variables, and this information is used to estimate the probability distributions of the coupling variables. The estimation of the coupling variables is analogous to likelihood-based parameter estimation in statistics and thus leads to the proposed likelihood approach for multidisciplinary analysis (LAMDA). Using the distributions of the feedback variables, the coupling can be removed in any one direction without loss of generality, while still preserving the mathematical relationship between the coupling variables. The calculation of the probability distributions of the coupling variables is theoretically exact and does not require a fully coupled system analysis. The proposed method is illustrated using a mathematical example and an aerospace system application—a fire detection satellite.

Journal ArticleDOI
TL;DR: In this paper, experimental results for the TRIZ ideation method are presented: the task was the redesign of a traffic light that uses light-emitting diodes (LEDs) instead of incandescent bulbs, which challenges the designer with snow accumulation since LEDs do not melt the snow, and TRIZ improved the novelty and variety of the concepts generated.
Abstract: The objective of this paper is to present experimental results of a specific ideation method: TRIZ. Our hypothesis is that TRIZ improves the creativity of subjects using it, as observed in the produced design outcomes. The experiments were conducted simultaneously at two institutions: the University of Texas at El Paso and the University of Maryland. The same ideation task was used at both institutions: a redesign of a traffic light that uses light-emitting diodes (LEDs) instead of incandescent bulbs, which challenges the designer with the issue of snow accumulation since LEDs do not melt the snow. The assessment was performed on the outcome (i.e., ideas generated) using quantity, novelty, and variety as metrics. Numerical results show that using TRIZ improved the novelty and variety of the concepts generated by students at both institutions. © 2012 ASME

Journal ArticleDOI
TL;DR: It is proposed in this work that in choice modeling for usage context-based design, usage context should be a part of the primary descriptors in the definition of a customer profile, in addition to the socio-demographic attributes for modeling customers’ heterogeneity.
Abstract: Usage Context-Based Design (UCBD) is an area of growing interest within the design community. Usage context is the set of scenarios in which a product (or service) is to be used, including the environments in which the product is used, the types of tasks the product performs, and the conditions under which the product is purchased and operates. It is proposed in this work that in choice modeling for usage context-based design, usage context should be a part of the primary descriptors in the definition of a customer profile, in addition to the socio-demographic attributes for modeling customers’ heterogeneity. As customers become more technology-savvy and market-educated, current choice modeling methods in engineering design could greatly benefit from exploiting the rich contextual information existing in product usage. In this work, we propose a choice modeling framework for Usage Context-based Design (UCBD) to quantify the impact of usage context on customer choices. We start with defining a taxonomy for UCBD. By explicitly modeling usage context’s influence on both product performances and customer preferences, a step-by-step choice modeling procedure is proposed to support UCBD. Two case studies, a jigsaw example with stated preference data and a hybrid electric vehicle example with revealed preference data, demonstrate the needs and benefits of incorporating usage context in choice modeling.
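
A generic way to express the kind of choice model proposed (illustrative notation; the paper's exact specification may differ): with socio-demographic attributes S_n, usage context C_n, and product attributes x_j, the usage context enters both the achievable performance and the preference weights,

W_{jn} = \beta(S_n, C_n)^T A_j(x_j, C_n),   P_n(j) = exp(W_{jn}) / \sum_k exp(W_{kn}),

a multinomial-logit form in which the customer profile is described by (S_n, C_n) rather than by socio-demographic attributes alone.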

Journal ArticleDOI
TL;DR: In this article, the authors investigated the nature of the contact between the load transferring surfaces in the roller screw mechanism, i.e., between the screw and roller threads and between the nut and roller thread.
Abstract: This paper investigates the nature of the contact between the load transferring surfaces in the roller screw mechanism, i.e., between the screw and roller threads and between the nut and roller threads. The analysis is applied to both planetary roller screws and recirculating roller screws. Prior work has neglected to take a fundamental approach toward understanding the mechanics of the contact between these components, and as a consequence, detailed analysis of aspects such as contact mechanics, friction, lubrication, and wear are not carried out correctly. Accordingly, in this paper, the principle of conjugate surfaces is used to establish contact at the screw-roller and nut-roller interfaces. The in-plane angles to the contact points are derived and it is shown that for the screw-roller interface, the contact point cannot lie on the bodies’ line of centers as has been the assumption in previous papers. Then, based on the curved profile of the roller thread, the radii of contact on the roller, screw, and nut bodies are also derived. Knowledge of the contact point locations is necessary to understand the interaction forces between the key components of the roller screw mechanism. In addition, accurate estimates of the radii of contact are necessary for minimizing the phenomenon of roller migration, a condition that can cause binding between components and eventually lead to the destruction of the mechanism. Last, the principal radii of curvature at the contact points and the angle between the principal axes are derived. These are essential for further development of the contact mechanics, such as the surface stresses, deformations, and consideration of wear.Copyright © 2012 by ASME

Journal ArticleDOI
TL;DR: The approach of object-oriented graph grammars for the computational synthesis of product models based on a Function–Behavior–Structure (FBS) representation is introduced and advances in terms of extendibility, efficiency, and flexible formalization of declarative and procedural engineering knowledge are achieved.
Abstract: Computational design synthesis aims to iteratively and automatically generate solution spaces of standard and novel design alternatives to support the innovation process. New approaches are required to generate alternative solutions at the function and behavior level as well as to ease the computational modeling of design knowledge. This paper introduces the approach of object-oriented graph grammars for the computational synthesis of product models based on a Function–Behavior–Structure (FBS) representation. The approach combines the advantages of a generic and systematic design method with a highly computable graph representation and object-oriented concepts. Through this combination, advances in terms of extendibility, efficiency, and flexible formalization of declarative and procedural engineering knowledge are achieved. Validation of the method is given through the synthesis of hybrid powertrains. The generation of hybrid powertrain solution spaces is shown, especially focusing on the impact of an evolving vocabulary, or building blocks, for synthesis. Future work includes integrating search methods in the synthesis process along with quantitative evaluation using simulation methods.

Journal ArticleDOI
TL;DR: In this article, the tip width of the inner rotor is controlled by inserting a circular-arc curve between the hypocycloid and epicycloid curves, and the outer rotor is designed using the closed-form equation for the inner rotor and the width correction coefficient.
Abstract: In the case of internal gear pumps, the eccentricity of the outer rotor, which resembles a circular lobe, must be limited to a certain value in order to avoid the formation of cusps and loops; furthermore, the tip width of the inner rotor, which has a hypocycloid curve and an epicycloid curve, should not be allowed to exceed the limit value. In this study, we suggest that the tip width of the inner rotor be controlled by inserting a circular-arc curve between the hypocycloid and epicycloid curves. We also suggest that the outer rotor be designed using the closed-form equation for the inner rotor and the width correction coefficient. Thus, it is possible to design a gerotor for which there is no upper limit on the eccentricity, as in this case, undercut is prevented and there is no restriction on the tip width. We also develop an automated program for rotor design and calculation of the flow rate and flow rate irregularity. We demonstrate the superior performance of the gerotor developed in this study by analyzing the internal fluid flow using a commercial computational fluid dynamics (CFD)-code.
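
For reference, the two cycloidal segments that make up the inner-rotor tooth profile have the standard parametric forms (textbook relations, not the paper's specific design equations). For a rolling circle of radius r on a base circle of radius R:

epicycloid:  x(t) = (R + r) cos t - r cos( (R + r) t / r ),   y(t) = (R + r) sin t - r sin( (R + r) t / r );
hypocycloid: x(t) = (R - r) cos t + r cos( (R - r) t / r ),   y(t) = (R - r) sin t - r sin( (R - r) t / r ).

The circular-arc segment proposed in the paper is inserted between these two curves to cap the inner-rotor tip width.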

Journal ArticleDOI
TL;DR: The method is a sequential experimentation around the approximate most probable point (MPP) at each step of the optimization process and is compared with the methods of MPP-based sampling, lifted surrogate function, and nonsequential random sampling.
Abstract: Reliability-based design optimization (RBDO) has a probabilistic constraint that is used for evaluating the reliability or safety of the system. In modern engineering design, this task is often performed by a computer simulation tool such as the finite element method (FEM). This type of computer simulation or computer experiment can be treated as a black box, as its analytical function is implicit. This paper presents an efficient sampling strategy for learning the probabilistic constraint function under the design optimization framework. The method is a sequential experimentation around the approximate most probable point (MPP) at each step of the optimization process. Our method is compared with the methods of MPP-based sampling, lifted surrogate function, and nonsequential random sampling. We demonstrate it through examples.
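
The probabilistic constraint being learned has the standard RBDO form (illustrative notation):

\min_d f(d)   s.t.   Pr{ G_i(d, X) \le 0 } \le p_{t,i},   i = 1, ..., m,

where d are design variables, X are random variables, G_i \le 0 denotes failure, and p_{t,i} are target failure probabilities. Concentrating new simulation runs near the approximate MPP of each constraint places them where the surrogate's accuracy most affects the estimated failure probability.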

Journal ArticleDOI
TL;DR: In this article, the authors develop a general mathematical model of trochoidal gearing that can be applied to gerotor pumps and cyclo reducers, in which the inner gear profile is described by peritrochoid equidistance and the outer gear profile by a circular arc.
Abstract: This paper explains the development of a general mathematical model of trochoidal gearing that can be applied to gerotor pumps and cyclo reducers. The model analyzes the geometry and physics of the gearing pair in a trochoidal pump where the outer gear has one tooth more than the inner gear. The inner gear profile is described by peritrochoid equidistance and the outer gear profile by a circular arc. The mathematical model of gearing with clearances is based on the principle of ideal profile development. The minimum clearance height between tooth profiles in relation to the instantaneous gear ratio is determined. The influence of gear profile geometrical parameters on the gearing process, clearance height change, and pulsation of the drive moment is analyzed and presented in numerical examples. The obtained results can be used for the design of trochoidal gearing where accurate and silent operation is required. [DOI: 10.1115/1.4005621]

Journal ArticleDOI
TL;DR: In this article, the design and testing of a powered ankle prosthesis is discussed; a prototype based on the optimization was designed, fabricated, and tested, achieving 93.3% of the simulated theoretical ankle moment.
Abstract: This article discusses the design and testing of a powered ankle prosthesis. This new prosthesis mimics nonamputee (normal) ankle moments during the stance phase of gait through the use of an optimized spring-loaded four-bar mechanism. A prototype prosthesis based on the optimization was designed, fabricated, and tested. The experimental results achieved 93.3% of the simulated theoretical ankle moment, giving substantial evidence that this approach is viable for designing powered ankle prostheses.

Journal ArticleDOI
TL;DR: In this article, the authors propose a model for positioning a remanufactured product by considering original product design, target market (i.e., customer preferences and competing products), and recovery economics.
Abstract: In a market with rapid changes in technology and customer preferences, technological obsolescence of end-of-life products poses a significant challenge to product recovery. Remanufacturing with optimal part upgrades can be a promising solution for overcoming the obsolescence. This paper proposes a model for positioning a remanufactured product. By considering original product design, target market (i.e., customer preferences and competing products), and recovery economics, the model helps to find optimal specifications and the selling price of a remanufactured product at which maximum remanufacturing profit is expected. Two versions of the model are presented under different assumptions on product takeback. The first model assumes that the remanufacturer passively accepts all returns without paying any financial incentives. The second model assumes that the remanufacturer buys back end-of-life products so as to control the quality and quantity of returns. The two models are illustrated with the example of a desktop computer. [DOI: 10.1115/1.4023000]

Journal ArticleDOI
TL;DR: The divergent thinking test presented in this paper is designed to evaluate four direct measures (fluency, flexibility, originality, and quality) and four indirect measures (abstractability, afixability, detailability, and decomplexability).
Abstract: A number of cognitive skills relevant to conceptual design were identified previously. They include divergent thinking (DT), visual thinking (VT), spatial reasoning (SR), qualitative reasoning (QR), and problem formulation (PF). A battery of standardized tests is being developed for these design skills. This paper focuses only on the divergent thinking test. This particular test has been given to over 500 engineering students and a smaller number of practicing engineers. It is designed to evaluate four direct measures (fluency, flexibility, originality, and quality) and four indirect measures (abstractability, afixability, detailability, and decomplexability). The eight questions on the test overlap in some measures and the responses can be used to evaluate several measures independently (e.g., fluency and originality can be evaluated separately from the same idea set). The data on the twenty-three measured variables were factor analyzed using both exploratory and confirmatory procedures. A four-factor solution with correlated (oblique) factors was deemed the best available solution after examining solutions with more factors. The indirect measures did not appear to correlate strongly either among themselves or with the other direct measures. The four-factor structure was then taken into a confirmatory factor analytic procedure that adjusted for the missing data. It was found to provide a reasonable fit. Estimated correlations among the four factors (F) ranged from a high of 0.32 for F1 and F2 to a low of 0.06 for F3 and F4. All factor loadings were statistically significant. [DOI: 10.1115/1.4005594]


Journal ArticleDOI
TL;DR: In this article, the energy absorption properties of hexagonal honeycomb structures of varying cellular geometries under high speed in-plane crushing were investigated and a randomly filled, non-repeating design of experiments (DOEs) was generated to determine the effects of these geometric parameters on the output of energy absorbed.
Abstract: This paper presents the energy absorption properties of hexagonal honeycomb structures of varying cellular geometries under high speed in-plane crushing. While the crushing responses in terms of energy absorption and densification strains have been extensively researched and reported, a gap is identified in the generalization of honeycombs with controlled and varying geometric parameters. This paper addresses this gap through a series of finite element (FE) simulations where the cell angle and the inclined wall thickness are varied while maintaining a constant mass of the honeycomb structure. A randomly filled, nonrepeating design of experiments (DOE) is generated to determine the effects of these geometric parameters on the output of energy absorbed, and a statistical sensitivity analysis is used to determine the parameters significant for the crushing energy absorption of honeycombs. It is found that while an increase in the inclined wall thickness enhances the energy absorption of the structure, increases in either the cell angle or the ratio of cell angle to inclined wall thickness have adverse effects on the output. Finally, the optimization results suggest that a cellular geometry with a positive cell angle and a high inclined wall thickness provides for maximum energy absorption, which is verified with a 6% error when compared to an FE simulation. [DOI: 10.1115/1.4006739]


Journal ArticleDOI
TL;DR: In this paper, the authors propose a classification for foldable/unfoldable surfaces that includes non-fully-developable (and also non-fully-foldable) surfaces, together with a method for describing the folding motion.
Abstract: Origami and paperfolding techniques may inspire the design of structures that have the ability to be folded and unfolded: their geometry can be changed from an extended, servicing state to a compact one, and back and forth. In traditional origami, folds are introduced in a sheet of paper (a developable surface) to transform its shape, with artistic or decorative intent; in recent times, the ideas behind origami techniques have been transferred to various design disciplines to build developable foldable/unfoldable structures, mostly in the aerospace industry (Miura, 1985, “Method of Packaging and Deployment of Large Membranes in Space,” Inst. Space Astronaut. Sci. Rep., 618, pp. 1–9; Ikema, 2009, “Deformation Analysis of a Joint Structure Designed for Space Suit With the Aid of an Origami Technology,” 27th International Symposium on Space Technology and Science (ISTS)). The geometrical arrangement of folds allows a folding mechanism of great efficiency and is often derived from the buckling patterns of simple geometries, like a plane or a cylinder (e.g., the Miura-ori and Yoshimura folding patterns) (Wu, 2007, “Optimization of Crush Characteristics of the Cylindrical Origami Structure,” Int. J. Veh. Des., 43, pp. 66–81; Hunt and Ario, 2005, “Twist Buckling and the Foldable Cylinder: An Exercise in Origami,” Int. J. Non-Linear Mech., 40(6), pp. 833–843). Here, we are interested in the conception of foldable/unfoldable structures for civil engineering and architecture. In those disciplines, the need for folding efficiency comes along with the need for structural efficiency (stiffness); for this purpose, we will explore nondevelopable foldable/unfoldable structures: those structures exhibit potential stiffness because, when unfolded, they cannot be flattened to a plane (nondevelopability). In this paper, we propose a classification for foldable/unfoldable surfaces that comprehends non-fully-developable (and also non-fully-foldable) surfaces, together with a method for the description of the folding motion. Then, we propose innovative geometrical configurations for those structures by generalizing the Miura-ori folding pattern to nondevelopable surfaces that, once unfolded, exhibit curvature.

Journal ArticleDOI
TL;DR: A genetic algorithm is adopted to solve the nonlinear constrained optimization problem, minimizing cost and maximizing power output, and results show that, given a projected participation rate, the most crucial plots prior to the negotiation process with landowners can be identified, increasing the efficiency of wind farm development.
Abstract: Current wind farm layout optimization research assumes a continuous piece of land is readily available and focuses on advancing optimization methods. In reality, projects rely on landowners’ permission for success. When a viable site is identified, local residents are approached for permission to build turbines on their land, typically in exchange for monetary compensation. Landowners play a crucial role in the development process, and some land parcels are more important to the success of the project than others. This paper relaxes the assumption that a continuous piece of land is available, developing a novel approach that includes a model of landowner participation rates. A genetic algorithm (GA) is adopted to solve the nonlinear constrained optimization problem, minimizing cost and maximizing power output. The optimization results show that, given a projected participation rate, we can identify the most crucial plots prior to the negotiation process with landowners. This will ultimately increase the efficiency of wind farm development. [DOI: 10.1115/1.4006999]
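
To make the landowner-participation idea concrete, here is a deliberately simplified genetic-algorithm sketch in Python. The 5 x 5 parcel grid, toy wake penalty, penalty weight, and parameter values are illustrative assumptions, and the cost and power objectives are collapsed into one scalar fitness; none of this reproduces the models or GA settings used in the paper.

# Toy GA sketch: place turbines on a parcel grid where only some landowners participate.
import random

random.seed(1)

N_PARCELS = 25            # candidate land parcels (5 x 5 grid)
PARTICIPATION = 0.6       # projected landowner participation rate
N_TURBINES = 8

# Sample which parcels are available, given the projected participation rate.
available = [random.random() < PARTICIPATION for _ in range(N_PARCELS)]

def power(layout):
    """Toy power model: one unit per turbine, minus a wake-like penalty
    for turbines sharing a grid column (placeholder physics)."""
    cols = [i % 5 for i, used in enumerate(layout) if used]
    penalty = sum(cols.count(c) - 1 for c in set(cols)) * 0.2
    return sum(layout) - penalty

def fitness(layout):
    """Power output, heavily penalized for turbines on non-participating parcels."""
    blocked = sum(1 for used, avail in zip(layout, available) if used and not avail)
    return power(layout) - 5.0 * blocked

def random_layout():
    idx = set(random.sample(range(N_PARCELS), N_TURBINES))
    return [i in idx for i in range(N_PARCELS)]

def crossover(a, b):
    """Uniform crossover followed by a repair step keeping exactly N_TURBINES."""
    child = [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]
    on = [i for i, v in enumerate(child) if v]
    off = [i for i, v in enumerate(child) if not v]
    while len(on) > N_TURBINES:
        child[on.pop(random.randrange(len(on)))] = False
    while len(on) < N_TURBINES:
        j = off.pop(random.randrange(len(off)))
        child[j] = True
        on.append(j)
    return child

def mutate(layout, rate=0.2):
    """With probability `rate`, move one turbine to a random empty parcel."""
    if random.random() < rate:
        on = [i for i, v in enumerate(layout) if v]
        off = [i for i, v in enumerate(layout) if not v]
        layout[random.choice(on)] = False
        layout[random.choice(off)] = True
    return layout

population = [random_layout() for _ in range(40)]
for _ in range(100):                                  # generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                         # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(30)]
    population = parents + children

best = max(population, key=fitness)
print("best fitness:", round(fitness(best), 3))
print("turbines on participating parcels only:",
      all(avail for used, avail in zip(best, available) if used))

In this toy setting, rerunning the loop with different sampled availability maps is a crude stand-in for the paper's idea of identifying which parcels matter most before negotiating with landowners.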