
Showing papers in "Structural and Multidisciplinary Optimization in 2004"


Journal ArticleDOI
TL;DR: A survey of current continuous nonlinear multi-objective optimization concepts and methods finds that no single approach is superior; the choice of method depends on the type of information provided in the problem, the user's preferences, the solution requirements, and the availability of software.
Abstract: A survey of current continuous nonlinear multi-objective optimization (MOO) concepts and methods is presented. It consolidates and relates seemingly different terminology and methods. The methods are divided into three major categories: methods with a priori articulation of preferences, methods with a posteriori articulation of preferences, and methods with no articulation of preferences. Genetic algorithms are surveyed as well. Commentary is provided on three fronts, concerning the advantages and pitfalls of individual methods, the different classes of methods, and the field of MOO as a whole. The characteristics of the most significant methods are summarized. Conclusions are drawn that reflect often-neglected ideas and applicability to engineering problems. It is found that no single approach is superior. Rather, the selection of a specific method depends on the type of information that is provided in the problem, the user’s preferences, the solution requirements, and the availability of software.
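As a concrete illustration of a priori articulation of preferences, the weighted-sum method scalarizes the objectives before optimizing. The bi-objective toy problem and the crude grid search below are hypothetical, chosen only to show the mechanics; they are not taken from the survey:

```python
def weighted_sum(objectives, weights):
    """A priori articulation of preferences: collapse several
    objectives into a single scalar with user-chosen weights."""
    return sum(w * f for w, f in zip(weights, objectives))

# Hypothetical bi-objective problem in one design variable:
# minimize f1(x) = x^2 and f2(x) = (x - 2)^2 simultaneously.
def f1(x):
    return x * x

def f2(x):
    return (x - 2.0) ** 2

def solve_weighted(w1, w2, lo=-5.0, hi=5.0, steps=10001):
    """Crude grid search over [lo, hi]; real studies would use a
    proper optimizer, this is only to show the scalarization."""
    best_x, best_val = lo, float("inf")
    for i in range(steps):
        x = lo + (hi - lo) * i / (steps - 1)
        val = weighted_sum((f1(x), f2(x)), (w1, w2))
        if val < best_val:
            best_x, best_val = x, val
    return best_x

# Sweeping the weights recovers different Pareto-optimal designs;
# equal weights give x = 1, the midpoint compromise.
front = [solve_weighted(w, 1.0 - w) for w in (0.1, 0.5, 0.9)]
```

A well-known pitfall of this particular method is that a simple weighted sum cannot reach points on non-convex portions of a Pareto front, which illustrates the survey's conclusion that no single approach dominates.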

4,263 citations


Journal ArticleDOI
TL;DR: Several common themes arose from the discussion, including differentiating between design of experiments and design and analysis of computer experiments, visualizing experimental results and data from approximation models, capturing uncertainty with approximation methods, and handling problems with large numbers of variables.
Abstract: This paper summarizes the discussion at the Approximation Methods Panel that was held at the 9th AIAA/ISSMO Symposium on Multidisciplinary Analysis & Optimization in Atlanta, GA on September 2–4, 2002. The objective of the panel was to discuss the current state-of-the-art of approximation methods and identify future research directions important to the community. The panel consisted of five representatives from industry and government: (1) Andrew J. Booker from The Boeing Company, (2) Dipankar Ghosh from Vanderplaats Research & Development, (3) Anthony A. Giunta from Sandia National Laboratories, (4) Patrick N. Koch from Engineous Software, Inc., and (5) Ren-Jye Yang from Ford Motor Company. Each panelist was asked to (i) give one or two brief examples of typical uses of approximation methods by his company, (ii) describe the current state-of-the-art of these methods used by his company, (iii) describe the current challenges in the use and adoption of approximation methods within his company, and (iv) identify future research directions in approximation methods. Several common themes arose from the discussion, including differentiating between design of experiments and design and analysis of computer experiments, visualizing experimental results and data from approximation models, capturing uncertainty with approximation methods, and handling problems with large numbers of variables. These are discussed in turn along with the future directions identified by the panelists, which emphasized educating engineers in using approximation methods.

424 citations


Journal ArticleDOI
TL;DR: Reliability-Based Topology Optimization (RBTO) as mentioned in this paper integrates reliability analysis into topology optimization problems, in which reliability constraints are introduced into a deterministic topology optimization formulation.
Abstract: The objective of this work is to integrate reliability analysis into topology optimization problems. The new model, in which we introduce reliability constraints into a deterministic topology optimization formulation, is called Reliability-Based Topology Optimization (RBTO). Several applications show the importance of this integration. The application of the RBTO model gives a different topology relative to deterministic topology optimization. We also find that the RBTO model yields structures that are more reliable than those produced by deterministic topology optimization (for the same weight).

337 citations


Journal ArticleDOI
TL;DR: In this paper, the reliability-based design optimization (RBDO) approach was used to evaluate the crashworthiness of a large-scale vehicle side impact under probabilistic constraints using the Reliability Index Approach (RIA) and the Performance Measure Approach (PMA).
Abstract: With the advent of powerful computers, vehicle safety issues have recently been addressed using computational methods of vehicle crashworthiness, resulting in reductions in cost and time for new vehicle development. Vehicle design demands multidisciplinary optimization coupled with a computational crashworthiness analysis. However, simulation-based optimization generates deterministic optimum designs, which are frequently pushed to the limits of design constraint boundaries, leaving little or no room for tolerances (uncertainty) in modeling, simulation uncertainties, and/or manufacturing imperfections. Consequently, deterministic optimum designs that are obtained without consideration of uncertainty may result in unreliable designs, indicating the need for Reliability-Based Design Optimization (RBDO). Recent development in RBDO allows evaluations of probabilistic constraints in two alternative ways: using the Reliability Index Approach (RIA) and the Performance Measure Approach (PMA). The PMA using the Hybrid Mean Value (HMV) method is shown to be robust and efficient in the RBDO process, whereas RIA yields instability for some problems. This paper presents an application of PMA and HMV for RBDO for the crashworthiness of a large-scale vehicle side impact. It is shown that the proposed RBDO approach is very effective in obtaining a reliability-based optimum design.

316 citations


Journal ArticleDOI
TL;DR: The numerical results presented indicate that the particle swarm optimization algorithm is able to reliably find the optimum design for the problem presented, and recommendations for the utilization of the algorithm in future multidisciplinary optimization applications are presented.
Abstract: The purpose of this paper is to demonstrate the application of particle swarm optimization to a realistic multidisciplinary optimization test problem. The paper's new contributions to multidisciplinary optimization are the application of a new algorithm for dealing with the unique challenges associated with multidisciplinary optimization problems, and recommendations for the utilization of the algorithm in future multidisciplinary optimization applications. The selected example is a bi-level optimization problem that demonstrates severe numerical noise and has a combination of continuous and discrete design variables. The use of traditional gradient-based optimization algorithms is thus not practical. The numerical results presented indicate that the particle swarm optimization algorithm is able to reliably find the optimum design for the problem presented. The algorithm is capable of dealing with the unique challenges posed by multidisciplinary optimization, as well as the numerical noise and discrete variables present in the current example problem.
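As a rough illustration of the particle swarm mechanics, a minimal continuous PSO might look like the sketch below; the paper's algorithm additionally handles discrete variables and a bi-level problem structure, and the inertia and acceleration coefficients used here are common textbook values, not the paper's settings:

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=200, seed=0,
                 w=0.7, c1=1.5, c2=1.5):
    """Minimal continuous particle swarm optimizer (illustrative
    sketch only). bounds is a list of (lo, hi) pairs per variable."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Because the update uses only function values, never gradients, the method is insensitive to the numerical noise that makes gradient-based optimizers impractical on problems like the one in the paper.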

249 citations


Journal ArticleDOI
TL;DR: In this paper, the authors define six sigma in an engineering design context and present an implementation of a robust optimization formulation that incorporates approaches from structural reliability and robust design with the concepts and philosophy of six sigma.
Abstract: The current push in industry is focused on ensuring not only that a product performs as desired but also that the product consistently performs as desired. To ensure consistency in product performance, "quality" is measured, improved, and controlled. Most quality initiatives have originated and been implemented in the product manufacturing stages. More recently, however, it has been observed that much of a product's performance and quality is determined by early design decisions, by the design choices made early in the product design cycle. Consequently, quality pushes have made their way into the design cycle, and "design for quality" is the primary objective. How is this objective measured and met? The most recent quality philosophy, also originating in a manufacturing setting, is six sigma. The concepts of six sigma quality can be defined in an engineering design context through relation to the concepts of design reliability and robustness --- probabilistic design approaches. Within this context, design quality is measured with respect to probability of constraint satisfaction and sensitivity of performance objectives, both of which can be related to a design "sigma level". In this paper, we define six sigma in an engineering design context and present an implementation of design for six sigma --- a robust optimization formulation that incorporates approaches from structural reliability and robust design with the concepts and philosophy of six sigma. This formulation is demonstrated using a complex automotive application: vehicle side impact crash simulation. Results presented illustrate the tradeoff between performance and quality when optimizing for six sigma reliability and robustness.

232 citations


Journal ArticleDOI
TL;DR: In this paper, a topology optimization methodology for the conceptual design of aeroelastic structures accounting for the fluid-structure interaction is presented, where the geometrical layout of the internal structure, such as the layout of stiffeners in a wing, is optimized by material topology optimization.
Abstract: A topology optimization methodology is presented for the conceptual design of aeroelastic structures accounting for the fluid–structure interaction. The geometrical layout of the internal structure, such as the layout of stiffeners in a wing, is optimized by material topology optimization. The topology of the wet surface, that is, the fluid–structure interface, is not varied. The key components of the proposed methodology are a Sequential Augmented Lagrangian method for solving the resulting large-scale parameter optimization problem, a staggered procedure for computing the steady-state solution of the underlying nonlinear aeroelastic analysis problem, and an analytical adjoint method for evaluating the coupled aeroelastic sensitivities. The fluid–structure interaction problem is modeled by a three-field formulation that couples the structural displacements, the flow field, and the motion of the fluid mesh. The structural response is simulated by a three-dimensional finite element method, and the aerodynamic loads are predicted by a three-dimensional finite volume discretization of a nonlinear Euler flow. The proposed methodology is illustrated by the conceptual design of wing structures. The optimization results show the significant influence of the design dependency of the loads on the optimal layout of flexible structures when compared with results that assume a constant aerodynamic load.

212 citations


Journal ArticleDOI
Ren-Jye Yang1, Lei Gu1
TL;DR: In this work, several approximate RBDO methods are coded, discussed, and tested against a double loop algorithm through four design problems.
Abstract: Traditional reliability-based design optimization (RBDO) requires a double loop iteration process. The inner optimization loop is to find the most probable point (MPP) and the outer is the regular optimization loop to optimize the RBDO problem with reliability objectives or constraints. It is well known that the computation can be prohibitive when the associated function evaluation is expensive. As a result, many approximate RBDO methods, which convert the double loop to a single loop, have been developed. In this work, several approximate RBDO methods are coded, discussed, and tested against a double loop algorithm through four design problems.
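The inner MPP search of the double-loop process can be sketched with the classic Hasofer-Lind-Rackwitz-Fiessler (HL-RF) iteration in standard normal space. The finite-difference gradients and the linear limit state used below are illustrative assumptions, not the paper's test problems:

```python
import math

def hlrf_mpp(g, dim, tol=1e-8, max_iter=100, h=1e-6):
    """HL-RF iteration: the inner loop of a double-loop RBDO process,
    searching for the most probable point (MPP) of the limit state
    g(u) = 0 in standard normal u-space. Gradients are taken by
    forward finite differences (a sketch; production codes would use
    analytic or adjoint sensitivities)."""
    u = [0.0] * dim
    for _ in range(max_iter):
        gu = g(u)
        grad = []
        for d in range(dim):
            up = u[:]
            up[d] += h
            grad.append((g(up) - gu) / h)
        norm2 = sum(gd * gd for gd in grad)
        dot = sum(gd * ud for gd, ud in zip(grad, u))
        u_new = [(dot - gu) / norm2 * gd for gd in grad]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    beta = math.sqrt(sum(v * v for v in u))   # reliability index
    return u, beta
```

For a linear limit state such as g(u) = 3 - u1 - u2 the iteration converges immediately to the MPP (1.5, 1.5) with β = 3/√2 ≈ 2.12. The outer design loop re-runs this search at every design iterate, which is exactly the cost that the single-loop methods compared in the paper aim to avoid.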

183 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an efficient strategy for dealing with topology optimization associated with the problem of mass minimization under material failure constraints by combining an augmented Lagrangian technique for the stress constraints and a trust region box-type algorithm for the side constraints.
Abstract: This work presents an efficient strategy for dealing with topology optimization associated with the problem of mass minimization under material failure constraints. Although this problem characterizes one of the oldest mechanical requirements in structural design, only a few works dealing with this subject are found in the literature. Several reasons explain this situation, among them the numerical difficulties introduced by the usually large number of stress constraints. The original formulation of the topological problem (existence/non-existence of material) is partially relaxed by following the SIMP (Solid Isotropic Microstructure with Penalization) approach and using a continuous density field ρ as the design variable. The finite element approximation is used to solve the equilibrium problem, as well as to control ρ through nodal parameters. The formulation accepts any failure criterion written in terms of stress and/or strain invariants. The whole minimization problem is solved by combining an augmented Lagrangian technique for the stress constraints and a trust-region box-type algorithm for dealing with side constraints (0 < ρmin ≤ ρ ≤ 1). Numerical results show the efficiency of the proposed approach in terms of computational costs as well as satisfaction of material failure constraints. It is also possible to see that the final designs define quite different shapes from the ones obtained in classical compliance problems.
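The SIMP relaxation referenced above interpolates material stiffness as a power law of the density design variable. A minimal sketch, using commonly chosen values (E0 = 1, Emin = 1e-9, penalization p = 3) rather than the paper's:

```python
def simp_modulus(rho, E0=1.0, Emin=1e-9, p=3.0):
    """SIMP interpolation of Young's modulus: stiffness follows a
    power law of the density design variable rho in [0, 1]. A small
    Emin keeps void elements numerically stable. E0, Emin and p are
    assumed common defaults, not values from the paper."""
    return Emin + rho ** p * (E0 - Emin)
```

With p = 3, a half-dense element supplies only one eighth of the solid stiffness while still costing half the mass, so intermediate densities become uneconomical and the optimizer is steered toward near-0/1 (void/solid) layouts.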

168 citations


Journal ArticleDOI
TL;DR: The homogenization method is extended to minimum stress design and yields an efficient numerical algorithm for topology optimization, using special microstructures which are sequential laminated composites.
Abstract: This paper is devoted to minimum stress design in structural optimization. The homogenization method is extended to such a framework and yields an efficient numerical algorithm for topology optimization. The main idea is to use a partial relaxation of the problem obtained by introducing special microstructures which are sequential laminated composites. Indeed, the so-called corrector terms of such microgeometries are explicitly known, which allows us to compute the relaxed objective function. These correctors can be interpreted as stress amplification factors, caused by the underlying microstructure.

150 citations


Journal ArticleDOI
TL;DR: This paper reviews how increases in computer power were utilized in structural optimization and concludes that problems with the highest possible complexity can be solved in only two of the three components of model, analysis procedure or optimization.
Abstract: Rapid increases in computer processing power, memory and storage space have not eliminated computational cost and time constraints on the use of structural optimization for design. This is due to the constant increase in the required fidelity (and hence complexity) of analysis models. Anecdotal evidence seems to indicate that analysis models of acceptable accuracy have required at least six to eight hours of computer time (an overnight run) throughout the last thirty years. This poses a severe challenge for global optimization or reliability-based design. In this paper, we review how increases in computer power were utilized in structural optimization. We resolve problem complexity into components relating to complexity of analysis model, analysis procedure and optimization methodology. We explore the structural optimization problems that we can solve at present and conclude that we can solve problems with the highest possible complexity in only two of the three components of model, analysis procedure or optimization. We use examples of optimum design of composite structures to guide the discussion due to our familiarity with such problems. However, these are supplemented with other structural optimization examples to illustrate the universality of the message.

Journal ArticleDOI
TL;DR: The authors conclude that the proposed node-based implementation is viable for continued usage in continuum topology optimization and immune to element-wise checkerboarding instabilities that are a concern with element-based design variables.
Abstract: A node-based design variable implementation for continuum structural topology optimization in a finite element framework is presented and its properties are explored in the context of solving a number of different design examples. Since the implementation ensures C0 continuity of design variables, it is immune to element-wise checkerboarding instabilities that are a concern with element-based design variables. Nevertheless, in a subset of design examples considered, especially those involving compliance minimization with coarse meshes, the implementation is found to introduce a new phenomenon that takes the form of “layering” or “islanding” in the material layout design. In the examples studied, this phenomenon disappears with mesh refinement or the enforcement of sufficiently restrictive design perimeter constraints, the latter sometimes being necessary in design problems involving bending to ensure convergence with mesh refinement. Based on its demonstrated performance characteristics, the authors conclude that the proposed node-based implementation is viable for continued usage in continuum topology optimization.

Journal ArticleDOI
TL;DR: A level-set method is used as a region representation with a moving boundary model to approach the problem of structural shape and topology optimization, and demonstrates outstanding flexibility in handling topological changes, the fidelity of boundary representation, and the degree of automation.
Abstract: In this paper we present a new framework to approach the problem of structural shape and topology optimization. We use a level-set method as a region representation with a moving boundary model. As a boundary optimization problem, the structural boundary description is implicitly embedded in a scalar function as its “iso-surfaces.” Such level-set models are flexible in handling complex topological changes and are concise in describing the material regions of the structure. Furthermore, by using a simple Hamilton–Jacobi convection equation, the movement of the implicit moving boundaries of the structure is driven by a transformation of the objective and the constraints into a speed function that defines the level-set propagation. The result is a 3D structural optimization technique that demonstrates outstanding flexibility in handling topological changes, the fidelity of boundary representation, and the degree of automation, comparing favorably with other methods in the literature based on explicit boundary variation or homogenization. We present two numerical techniques of conjugate mapping and variational regularization for further enhancement of the level-set computation, in addition to the use of efficient up-wind schemes. The method is tested with several examples of a linear elastic structure that are widely reported in the topology optimization literature.
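The Hamilton–Jacobi convection step that moves the implicit boundary can be sketched in one dimension with a Godunov-type upwind scheme. The grid, uniform speed field, and time step below are illustrative assumptions; the paper works in 3D with the speed derived from the objective and constraints:

```python
def hj_step(phi, speed, dx, dt):
    """One explicit Godunov-type upwind step for the level-set
    equation phi_t + V |grad phi| = 0 on a 1D grid with spacing dx."""
    n = len(phi)
    new = phi[:]
    for i in range(n):
        dminus = (phi[i] - phi[i - 1]) / dx if i > 0 else 0.0
        dplus = (phi[i + 1] - phi[i]) / dx if i < n - 1 else 0.0
        V = speed[i]
        if V > 0.0:   # boundary moves outward: pick upwind slopes
            grad = (max(dminus, 0.0) ** 2 + min(dplus, 0.0) ** 2) ** 0.5
        else:
            grad = (min(dminus, 0.0) ** 2 + max(dplus, 0.0) ** 2) ** 0.5
        new[i] = phi[i] - dt * V * grad
    return new

# Illustrative use: phi = |x| - 1 embeds a "structure" on [-1, 1] as
# its negative region; a uniform unit speed expands it outward.
phi = [abs(-2.0 + 0.1 * i) - 1.0 for i in range(41)]
for _ in range(50):               # advance to t = 0.5 (CFL = 0.1)
    phi = hj_step(phi, [1.0] * 41, 0.1, 0.01)
# the zero iso-contour has moved from |x| = 1 to roughly |x| = 1.5
```

The appeal for topology optimization is that merging or splitting of the zero iso-contour requires no special handling: the scalar function phi changes smoothly while its topology changes freely.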

Journal ArticleDOI
TL;DR: In this paper, a probabilistic sufficiency factor approach is proposed that combines the safety factor and the probability of failure; it represents a factor of safety relative to a target probability of failure.
Abstract: A probabilistic sufficiency factor approach is proposed that combines safety factor and probability of failure. The probabilistic sufficiency factor approach represents a factor of safety relative to a target probability of failure. It provides a measure of safety that can be used more readily than the probability of failure or the safety index by designers to estimate the required weight increase to reach a target safety level. The probabilistic sufficiency factor can be calculated from the results of Monte Carlo simulation with little extra computation. The paper presents the use of probabilistic sufficiency factor with a design response surface approximation, which fits it as a function of design variables. It is shown that the design response surface approximation for the probabilistic sufficiency factor is more accurate than that for the probability of failure or for the safety index. Unlike the probability of failure or the safety index, the probabilistic sufficiency factor does not suffer from accuracy problems in regions of low probability of failure when calculated by Monte Carlo simulation. The use of the probabilistic sufficiency factor accelerates the convergence of reliability-based design optimization.
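The probabilistic sufficiency factor can be estimated from Monte Carlo samples as a quantile of the safety factor s = capacity/response. The sketch below, including the random capacity and response models in the usage, is an illustrative reading of the idea, not the paper's code:

```python
import random

def probabilistic_sufficiency_factor(capacity, response, p_target,
                                     n_samples=100_000, seed=1):
    """Estimate the probabilistic sufficiency factor (PSF) as the
    p_target quantile of the safety factor s = capacity / response
    over Monte Carlo samples. PSF < 1 means the target reliability
    is not met; PSF > 1 indicates spare margin. capacity and response
    are callables drawing one random realization from the given rng."""
    rng = random.Random(seed)
    s = sorted(capacity(rng) / response(rng) for _ in range(n_samples))
    k = max(0, min(n_samples - 1, int(p_target * n_samples)))
    return s[k]

# Hypothetical limit state: capacity ~ N(2, 0.1), response ~ N(1, 0.1),
# target failure probability 1e-3. The distributions are assumptions.
psf = probabilistic_sufficiency_factor(
    lambda r: r.gauss(2.0, 0.1), lambda r: r.gauss(1.0, 0.1), 1e-3)
```

Because the PSF reuses the same samples that estimate the failure probability, it costs little extra computation, and unlike a raw probability it tells the designer directly by what factor the design margin exceeds (or falls short of) the target.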

Journal ArticleDOI
TL;DR: This paper employs parallel updates by searching an expected improvement surface generated from a radial basis function model, and considers optimization based on both standard and gradient-enhanced models.
Abstract: Approximation methods are often used to construct surrogate models, which can replace expensive computer simulations for the purposes of optimization. One of the most important aspects of such optimization techniques is the choice of model updating strategy. In this paper we employ parallel updates by searching an expected improvement surface generated from a radial basis function model. We look at optimization based on standard and gradient-enhanced models. Given Np processors, the best Np local maxima of the expected improvement surface are highlighted and further runs are performed on these designs. To test these ideas, simple analytic functions and a finite element model of a simple structure are analysed and various approaches compared.
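The expected improvement criterion used to rank update sites has a standard closed form under a Gaussian predictor; a minimal sketch follows (the surrogate itself, a radial basis function model in the paper, is not reproduced here):

```python
import math

def expected_improvement(mu, sigma, f_min):
    """Closed-form expected improvement of a Gaussian surrogate
    prediction (mean mu, standard deviation sigma) over the best
    observed value f_min, for a minimization problem."""
    if sigma <= 0.0:
        return max(f_min - mu, 0.0)
    z = (f_min - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # phi(z)
    return (f_min - mu) * cdf + sigma * pdf
```

For parallel updating in the spirit of the paper, one would evaluate this criterion over many candidate designs, keep the Np best local maxima of the resulting surface, and dispatch those designs to the Np processors.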

Journal ArticleDOI
TL;DR: In this article, a design methodology is proposed that combines reliability-based design optimization and high-fidelity aeroelastic simulations for the analysis and design of aero-elastic structures.
Abstract: Aeroelastic phenomena are most often either ignored or roughly approximated when uncertainties are considered in the design optimization process of structures subject to aerodynamic loading, affecting the quality of the optimization results. Therefore, a design methodology is proposed that combines reliability-based design optimization and high-fidelity aeroelastic simulations for the analysis and design of aeroelastic structures. To account for uncertainties in design and operating conditions, a first-order reliability method (FORM) is employed to approximate the system reliability. To limit model uncertainties while accounting for the effects of given uncertainties, a high-fidelity nonlinear aeroelastic simulation method is used. The structure is modelled by a finite element method, and the aerodynamic loads are predicted by a finite volume discretization of a nonlinear Euler flow. The usefulness of the employed reliability analysis in both describing the effects of uncertainties on a particular design and as a design tool in the optimization process is illustrated. Though computationally more expensive than a deterministic optimum, due to the necessity of solving additional optimization problems for reliability analysis within each step of the broader design optimization procedure, a reliability-based optimum is shown to be an improved design. Conventional deterministic aeroelastic tailoring, which exploits the aeroelastic nature of the structure to enhance performance, is shown to often produce designs that are sensitive to variations in system or operational parameters.

Journal ArticleDOI
TL;DR: A fixed cost local search, which sequentially becomes global, is developed in this work, and is particularly adapted to tackling multimodal, discontinuous, constrained optimization problems, for which it is uncertain that a global optimization can be afforded.
Abstract: One of the fundamental difficulties in engineering design is the multiplicity of local solutions. This has triggered much effort in the development of global search algorithms. Globality, however, often has a prohibitively high numerical cost for real problems. A fixed cost local search, which sequentially becomes global, is developed in this work. Globalization is achieved by probabilistic restarts. A spatial probability of starting a local search is built based on past searches. An improved Nelder–Mead algorithm is the local optimizer. It accounts for variable bounds and nonlinear inequality constraints. It is additionally made more robust by reinitializing degenerated simplexes. The resulting method, called the Globalized Bounded Nelder–Mead (GBNM) algorithm, is particularly adapted to tackling multimodal, discontinuous, constrained optimization problems, for which it is uncertain that a global optimization can be afforded. Numerical experiments are given on two analytical test functions and two composite laminate design problems. The GBNM method compares favorably with an evolutionary algorithm, both in terms of numerical cost and accuracy.

Journal ArticleDOI
TL;DR: In this paper, a robust algorithm based on a modified isoline technique is presented that generates the appropriate loading surface which remains on the boundary of potential structural domains during the topology evolution.
Abstract: This paper describes a new computational approach for optimum topology design of 2D continuum structures subjected to design-dependent loading. Both the locations and directions of the loads may change as the structural topology changes. A robust algorithm based on a modified isoline technique is presented that generates the appropriate loading surface which remains on the boundary of potential structural domains during the topology evolution. Issues in connection with tracing the variable loading surface are discussed and treated in the paper. Our study indicates that the influence of the variation of element material density is confined within a small neighbourhood of the element. With this fact in mind, the cost of the calculation of the sensitivities of loads may be reduced remarkably. Minimum compliance is considered as the design problem. There are several models available for such designs. In the present paper, a simple formulation with weighted unit cost constraints based on the expression of potential energy is employed. Compared to the traditional models (i.e., the SIMP model), it provides an alternative way to implement the topology design of continuum structures. Some 2D examples are tested to show the differences between the designs obtained for fixed, design-independent loading, and for variable, design-dependent loading. The general and special features of the optimization with design-dependent loads are shown in the paper, and the validity of the algorithm is verified. An algorithm dealing with 3D design problems is described in Part II, which is developed from the 2D algorithm in the present Part I of the paper.

Journal ArticleDOI
TL;DR: In this paper, an integrated approach is developed to provide the user with the freedom of combining sizing, shape, and topology optimization in a single process, since excluding the interaction of sizing and shape variables with topology modification limits the design space.
Abstract: Topology optimization has become very popular in industrial applications, and most FEM codes have implemented certain capabilities of topology optimization. However, most codes do not allow simultaneous treatment of sizing and shape optimization during the topology optimization phase. This poses a limitation on the design space and therefore prevents finding possible better designs since the interaction of sizing and shape variables with topology modification is excluded. In this paper, an integrated approach is developed to provide the user with the freedom of combining sizing, shape, and topology optimization in a single process.

Journal ArticleDOI
TL;DR: The reduction in large-scale MDO solution times through HPC is significant in that it now makes it possible for such technologies to impact the vehicle design cycle and improve the engineering productivity.
Abstract: Multidisciplinary Design Optimization of a vehicle system for safety, NVH (noise, vibration and harshness) and weight, in a scalable HPC environment, is addressed. High performance computing, utilizing several hundred processors in conjunction with approximation methods, formal MDO strategies and engineering judgement are effectively used to obtain superior design solutions with significantly reduced elapsed computing times. The increased computational complexity in this MDO work is due to addressing multiple safety modes including frontal crash, offset crash, side impact and roof crush, in addition to the NVH discipline, all with detailed, high fidelity models and analysis tools. The reduction in large-scale MDO solution times through HPC is significant in that it now makes it possible for such technologies to impact the vehicle design cycle and improve the engineering productivity.

Journal ArticleDOI
TL;DR: The objective of this work is to present a new method for finding hinge-free designs using a multiscale wavelet-based topology optimization formulation with a translation-invariant wavelet shrinkage method.
Abstract: In topology optimization applications for the design of compliant mechanisms, the formation of hinges is typically encountered. Often such hinges are unphysical artifacts that appear due to the choice of discretization spaces for design and analysis. The objective of this work is to present a new method to find hinge-free designs using a multiscale wavelet-based topology optimization formulation. The specific method developed in this work does not require refinement of the analysis model and it consists of a translation-invariant wavelet shrinkage method where a hinge-free condition is imposed in the multiscale design space. To imbed the shrinkage method implicitly in the optimization formulation and thus facilitate sensitivity analysis, the shrinkage method is made differentiable by means of differentiable versions of logical operators. The validity of the present method is confirmed by solving typical two-dimensional compliant mechanism design problems.

Journal ArticleDOI
TL;DR: Inspired by branching systems in nature, a growing and branching tree model is proposed in this paper as a new and direct topology optimization method for generating stiffener layout patterns for plate structures.
Abstract: From the inspiration of branching systems in nature, this paper suggests a new and direct topology optimization method for the generation of stiffener layout patterns for plate structures by introducing the growing and branching tree model. The growth technique begins with the growing of ground baby stiffeners around a given set of seeds. Each stiffener extends by obeying growing and branching rules like those for trees. A potential for branching is assigned to stiffeners whose cross-sectional areas are greater than a specified threshold dimension, and the best growing direction of a branch is selected depending on the effect of the extension of a new branch. During the growing of a stiffener, the volume growth rate is controlled so as to make it possible to create new branches and to eliminate degenerated stiffeners. The growing process stops when the stiffener volume reaches a given upper limit. Some numerical examples are used to illustrate the effectiveness of the proposed method, and the influence of various factors on the method is discussed.

Journal ArticleDOI
TL;DR: In this paper, the authors describe an innovative multidisciplinary optimisation method based on robust design techniques: MORDACE (multidisciplinary optimisation and robust design approaches applied to concurrent engineering), which allows concurrently designing different aspects or parts of a complex product.
Abstract: The increasing economic competition in all industrial markets and the growing complexity of engineering problems lead to a progressive specialisation and distribution of expertise, tools and work sites. Most industrial sectors manage this fragmentation using the concurrent engineering approach, which is based on tool integration and shared databases and requires significant investments in design and work organisation. In addition, multidisciplinary design optimisation (MDO) is increasingly used to search for optimal solutions across multiple coupled disciplines. The paper describes an innovative multidisciplinary optimisation method based on robust design techniques: MORDACE (multidisciplinary optimisation and robust design approaches applied to concurrent engineering). By managing the uncertainty arising from collaboration between design teams, our automatic optimisation strategy allows different aspects or parts of a complex product to be designed concurrently. The method ensures effective distribution of design work and high-quality optimisation results while containing CPU time. In addition, our strategy is suited to the early stages of the design cycle, where design goals and constraints may still evolve and exhaustive information about the design space is necessary. A roll stabiliser fin optimisation is presented as an example of this method applied to an industrial design problem.

Journal ArticleDOI
TL;DR: In this article, a methodology for selecting the product platform by using information obtained from the individual optimization of the product variants is presented, under the assumption that the product variety requires only mild design changes, a performance deviation vector is derived by taking into consideration individual optimal designs and sensitivities of functional requirements.
Abstract: Identification of the product platform is a key step in designing a family of products. This article presents a methodology for selecting the product platform by using information obtained from the individual optimization of the product variants. Under the assumption that the product variety requires only mild design changes, a performance deviation vector is derived by taking into consideration individual optimal designs and sensitivities of functional requirements. Commonality decisions are based on values of the performance deviation vector, and the product family is designed optimally with respect to the chosen platform. The proposed methodology is applied to the design of a family of automotive body structures. Variants are defined by changing the functional requirements they need to satisfy and/or the geometry of the associated finite element models.
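The abstract does not give the paper's exact formula; a plausible first-order estimate of each variant's performance deviation, built from the individual optima and their sensitivities (the function name and signature are illustrative assumptions), might be:

```python
def performance_deviation(x_opts, grads, x_shared):
    """First-order estimate of the performance change for each variant
    when its design variables move from the individual optimum x_opt
    to the shared platform values x_shared.

    x_opts  : list of optimal design vectors, one per variant
    grads   : list of sensitivity vectors (d performance / d x) at each optimum
    x_shared: candidate platform values for the shared variables
    """
    dev = []
    for x_opt, g in zip(x_opts, grads):
        dev.append(sum(gi * (xs - xo)
                       for gi, xs, xo in zip(g, x_shared, x_opt)))
    return dev
```

Variables whose entries in the deviation vector stay small across all variants are natural candidates for commonality, since sharing them costs little performance.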

Journal ArticleDOI
TL;DR: In this paper, flexural lamination parameters are used as continuous design variables for unstiffened composite panels, and genetic optimization is compared with continuous optimization for the stacking sequence that accounts for discreteness of the design space and constraints on the number of contiguous plies of the same orientation.
Abstract: Unstiffened composite panels are optimized by using flexural lamination parameters as continuous design variables for the case in which the amounts of 0°, ±45°, and 90° plies are given. It is shown that for this case, the lamination parameters are located in a hexagonal domain. Continuous optimization is compared with genetic optimization for the stacking sequence that accounts for the discreteness of the design space and constraints on the number of contiguous plies of the same orientation. It is shown that only for very thin panels with low aspect ratios is there a significant difference between the continuous and discrete solutions.
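The contiguity constraint handled by the genetic optimization is a standard laminate design rule; a minimal feasibility check for a candidate stacking sequence (the limit of four contiguous plies is a common convention, assumed here rather than taken from the paper) can be written as:

```python
def contiguity_ok(stack, max_contiguous=4):
    """Return True if no more than max_contiguous adjacent plies in the
    stacking sequence share the same orientation (angles in degrees)."""
    run = 1
    for prev, cur in zip(stack, stack[1:]):
        run = run + 1 if cur == prev else 1
        if run > max_contiguous:
            return False
    return True
```

In a genetic algorithm such a check is typically applied as a repair step or a penalty on infeasible chromosomes.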

Journal ArticleDOI
TL;DR: The topology description function (TDF) approach is a method for describing geometries in a discrete fashion, i.e. without intermediate densities, and is used to carry out topology optimization.
Abstract: The topology description function (TDF) approach is a method for describing geometries in a discrete fashion, i.e. without intermediate densities. Hence, the TDF approach may be used to carry out topology optimization, i.e. to solve the material distribution problem. However, the material distribution problem may be ill-posed. This ill-posedness can be avoided by limiting the complexity of the design, which is accomplished automatically by limiting the number of design parameters used for the TDF. An important feature is that the TDF design description is entirely decoupled from a finite element (FE) model. The basic idea of the TDF approach is as follows. In the TDF approach, the design variables are parameters that determine a function on the so-called reference domain. Using a cut-off level, this function unambiguously determines a geometry. Then, the performance of this geometry is determined by a FE analysis. Several optimization techniques are applied to the TDF approach to carry out topology optimization. First, a genetic algorithm is applied, with (too) large computational costs. The TDF approach is shown to work using a heuristic iterative adaptation of the design parameters. For more effective and sound optimization methods, design sensitivities are required. The first results on design sensitivity analysis are presented, and their accuracy is studied. Numerical examples are provided for illustration.
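The abstract leaves the TDF parameterisation open; one illustrative choice (a sum of Gaussian bumps, with the parameter layout and the 0.5 cut-off level assumed here, not taken from the paper) shows how a few design parameters plus a cut-off yield a discrete 0/1 material layout decoupled from the FE mesh:

```python
import math

def tdf_value(x, y, params):
    """Topology description function on the unit reference domain,
    here a sum of Gaussian bumps. Each design parameter tuple is
    (cx, cy, weight, radius) -- an assumed parameterisation."""
    return sum(w * math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / r ** 2)
               for cx, cy, w, r in params)

def geometry(nx, ny, params, cutoff=0.5):
    """Sample the TDF at element centres of an nx-by-ny grid: an
    element is solid (1) where the function exceeds the cut-off,
    void (0) elsewhere -- no intermediate densities."""
    return [[1 if tdf_value((i + 0.5) / nx, (j + 0.5) / ny, params) > cutoff
             else 0
             for i in range(nx)] for j in range(ny)]
```

Note that design complexity is bounded by the number of parameter tuples, which is exactly how the TDF approach regularises the material distribution problem.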

Journal ArticleDOI
TL;DR: In this paper, the problem of topology optimization of 3D structures with design-dependent loading is considered, and an algorithm for generating the valid loading surface of the 3D structure is presented, constituting an extension of the algorithm for 2D structures developed in Part I of this paper.
Abstract: The problem of topology optimization of 3D structures with design-dependent loading is considered. An algorithm for generating the valid loading surface of the 3D structure is presented, constituting an extension of the algorithm for 2D structures developed in Part I of this paper on the basis of a modified isoline technique. In this way the complicated calculation of the fit of the loading surface of a 3D structure may be avoided. Since the finite element mesh is fixed in the admissible 3D design domain during the period of topology evolution, the design-dependent loading surface may intersect the elements as the design changes. Independent interpolation functions are introduced along the loading surface so that the surface integral for generating the loading on the surface of the 3D structure can be performed more efficiently and simply. The bilinear 4-node serendipity surface element is constructed to describe the variable loading surface, and this matches well with the 8-node isoparametric 3D elements which have been used for the discretization of the 3D design domain. The validity of the algorithm is verified by numerical examples for 3D problems. Results of designing with design-dependent loads and with corresponding fixed loads are presented, and some important features of the computational results are discussed.
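The bilinear 4-node serendipity surface element mentioned above uses the standard shape functions; a minimal sketch of those functions and of interpolating a nodal quantity (e.g. a pressure) over the loading surface, with assumed counter-clockwise node ordering, is:

```python
def bilinear_shape(xi, eta):
    """Shape functions of the 4-node bilinear surface element, nodes
    ordered counter-clockwise at (-1,-1), (1,-1), (1,1), (-1,1)."""
    return [0.25 * (1 + sx * xi) * (1 + sy * eta)
            for sx, sy in ((-1, -1), (1, -1), (1, 1), (-1, 1))]

def interpolate(nodal_values, xi, eta):
    """Interpolate a nodal field (e.g. surface pressure) at the local
    coordinate (xi, eta) inside the element."""
    return sum(N * v for N, v in zip(bilinear_shape(xi, eta), nodal_values))
```

The surface integral for the design-dependent load would then be evaluated by Gauss quadrature over (xi, eta), which is omitted here.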

Journal ArticleDOI
TL;DR: In this paper, the authors present results from a major research program funded by the European Union and involving 14 partners from across the Union, which was able to optimise a large civil airliner wing for weight, drag and cost.
Abstract: This paper presents results from a major research programme funded by the European Union and involving 14 partners from across the Union. It shows how a complex tool set was assembled which was able to optimise a large civil airliner wing for weight, drag and cost. A multi-level MDO process was constructed and implemented through a hierarchical system in which cost comprised the top level. Conventional structural sizing parameters were employed to optimise structural weight but the upper-level optimisation used 6 overall design variables representing major design parameters. The paper concludes by presenting results from a case study which included all the components of the total design system.

Journal ArticleDOI
TL;DR: A general decomposition method developed by Haftka and Watson is applied to global-local structural optimization problems and allows much of the search for a global optimum to be conducted in low dimensions for each component separately.
Abstract: A general decomposition method developed by Haftka and Watson is applied to global-local structural optimization problems. First, a large number of component optimizations for maximization of margins are performed. Response surface approximations (RSAs) for maximum margins of component optimization are constructed. At the system-level optimization, the RSAs of maximum margins are used as surrogates for the components. One advantage of the decomposition approach is that it allows much of the search for a global optimum to be conducted in low dimensions for each component separately. Minimization of a portal frame weight with eight local optima is used to demonstrate the approach.
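The abstract does not state the form of the response surface approximations; a common choice is a polynomial fitted by least squares. As a one-variable sketch (the quadratic basis and function name are assumptions, not the paper's), fitting an RSA of a component's maximum margin from sampled optimizations looks like:

```python
def fit_quadratic_rsa(xs, ys):
    """Least-squares quadratic response surface y ~ a + b*x + c*x**2
    fitted to sampled component-optimization results (xs, ys), via the
    3x3 normal equations and Gaussian elimination (no numpy)."""
    A = [[sum(x ** (i + j) for x in xs) for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    # forward elimination with partial pivoting
    for k in range(3):
        p = max(range(k, 3), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, 3):
            f = A[r][k] / A[k][k]
            for col in range(k, 3):
                A[r][col] -= f * A[k][col]
            b[r] -= f * b[k]
    # back substitution
    coef = [0.0] * 3
    for k in (2, 1, 0):
        coef[k] = (b[k] - sum(A[k][j] * coef[j]
                              for j in range(k + 1, 3))) / A[k][k]
    return coef  # [a, b, c]
```

At the system level the fitted polynomial replaces the expensive component optimization, which is what keeps the global search low-dimensional per component.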

Journal ArticleDOI
TL;DR: In this article, a simple and effective robust optimization formulation is proposed to improve robustness of the objective function by minimizing a gradient index (GI), defined as a function of gradients of performance functions with respect to uncertain variables.
Abstract: This paper discusses a simple and effective robust optimization formulation and illustrates its application to MicroElectroMechanical Systems (MEMS) devices. The proposed formulation improves robustness of the objective function by minimizing a gradient index (GI), defined as a function of gradients of performance functions with respect to uncertain variables. The level of constraint feasibility is also enhanced by adding a term determined by a constraint value and the gradient index. In the robust optimal design procedure, a deterministic optimization for performance improvement is followed by a sensitivity analysis with respect to uncertainties such as MEMS fabrication errors and changes of material properties. During the process of the deterministic optimization and sensitivity analysis, dominant performances and critical uncertain variables are identified to define the GI. Our approach for robust design requires no statistical information on the uncertainties and yet achieves robustness effectively. Two MEMS application examples, a micro accelerometer and a resonant-type micro probe, are presented.
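The abstract defines the GI only as "a function of gradients"; one simple realisation (the sum of absolute forward-difference sensitivities and the additive weighting are assumptions for illustration, not the paper's exact definition) is:

```python
def gradient_index(f, x, u, h=1e-6):
    """Gradient index: sum of absolute sensitivities of the performance
    f(x, u) with respect to the uncertain variables u, estimated by
    forward finite differences with step h."""
    f0 = f(x, u)
    gi = 0.0
    for i in range(len(u)):
        up = list(u)
        up[i] += h
        gi += abs((f(x, up) - f0) / h)
    return gi

def robust_objective(f, x, u, weight=1.0):
    """Deterministic performance plus a weighted gradient index:
    minimizing this trades nominal performance for insensitivity
    to the uncertain variables."""
    return f(x, u) + weight * gradient_index(f, x, u)
```

Because the GI needs only gradients at the nominal point, no probability distributions of the fabrication errors are required, matching the paper's claim of needing no statistical information.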