
Showing papers on "Design tool published in 1997"


Journal Article
TL;DR: In this article, a numerical analysis technique for predicting welding-induced buckling is presented, which combines two-dimensional welding simulations with three-dimensional structural analyses in a decoupled approach.
Abstract: This paper presents a numerical analysis technique for predicting welding-induced distortion. The technique combines two-dimensional welding simulations with three-dimensional structural analyses in a decoupled approach and is applied in particular to the evaluation of welding-induced buckling. The numerical predictions can be utilized either as a design-evaluation tool or as a manufacturing-analysis tool. As a design tool, the effect of the welding procedures can be determined and incorporated into the evaluation and optimization of the design configurations. As a manufacturing analysis tool, for a fixed design, different welding processes and procedures can be evaluated to minimize welding distortion. Experimental results obtained from small- and large-scale mock-up panels verify the numerical modeling approach.
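
A hedged companion calculation makes the buckling link concrete: welding-induced buckling occurs when the longitudinal residual compressive stress left by the weld exceeds the panel's critical buckling stress. The sketch below applies the classical simply supported plate formula; the panel dimensions and residual stress are assumed values, not figures from the paper.

```python
# Classical plate-buckling screen for a welded panel (illustrative values).
import math

E, NU = 210e9, 0.3        # steel Young's modulus (Pa), Poisson's ratio
t, b = 6e-3, 0.8          # panel thickness and width between stiffeners (m)
K = 4.0                   # simply supported plate in uniaxial compression
sigma_resid = 60e6        # assumed residual compressive stress from weld model (Pa)

sigma_cr = K * math.pi**2 * E / (12 * (1 - NU**2)) * (t / b)**2
print(f"critical buckling stress: {sigma_cr/1e6:.1f} MPa")
print("buckling risk:", "yes" if sigma_resid > sigma_cr else "no")
```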

244 citations


Journal ArticleDOI
TL;DR: A model of a Stewart Platform based machine tool is developed that provides the framework for inclusion of all relevant error sources and an error analysis is presented based on an error model formed through differentiation of the kinematic equations.
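
The differentiation the TL;DR mentions can be sketched numerically: the inverse kinematics of a Stewart platform map the pose to the six leg lengths, a finite-difference Jacobian of that map is the error model, and its inverse propagates leg-length errors to pose error. The anchor geometry and error magnitudes below are invented for the demonstration, not taken from the paper.

```python
# Numerical error model for a hexapod: pose -> leg lengths, then Jacobian.
import numpy as np

ANG_B = np.deg2rad([0, 60, 120, 180, 240, 300])
ANG_P = ANG_B + np.deg2rad([40, -40, 40, -40, 40, -40])      # crossed legs
BASE = np.c_[np.cos(ANG_B), np.sin(ANG_B), np.zeros(6)]      # radius 1 m
PLAT = 0.5 * np.c_[np.cos(ANG_P), np.sin(ANG_P), np.zeros(6)]

def leg_lengths(pose):
    """Inverse kinematics: pose = [x, y, z, rx, ry, rz] (small angles)."""
    t, r = pose[:3], pose[3:]
    R = np.eye(3) + np.array([[0, -r[2], r[1]],              # small-angle rotation
                              [r[2], 0, -r[0]],
                              [-r[1], r[0], 0]])
    return np.linalg.norm(PLAT @ R.T + t - BASE, axis=1)

pose0 = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
eps = 1e-7
J = np.array([(leg_lengths(pose0 + d) - leg_lengths(pose0 - d)) / (2 * eps)
              for d in np.eye(6) * eps]).T                   # d(lengths)/d(pose)

dl = np.full(6, 10e-6)                  # assumed 10 um error on every leg
dx = np.linalg.solve(J, dl)             # resulting platform pose error
print("pose error [m, m, m, rad, rad, rad]:", dx.round(8))
```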

132 citations


Journal ArticleDOI
TL;DR: An optimization model for the design of rectangular reinforced concrete beams subject to a specified set of constraints is presented; it minimizes the cost of the beam based on strength design procedures, while also considering the costs of concrete, steel and shuttering.

Abstract: We present an optimization model for the design of rectangular reinforced concrete beams subject to a specified set of constraints. Our model is more realistic than previously published models because it minimizes the cost of the beam based on strength design procedures, while also considering the costs of concrete, steel and shuttering. Thus our method leads to very practical designs. As there is an infinite number of possible beam dimensions and reinforcement ratios that yield the same moment of resistance, an efficient search technique is preferred over the more traditional iterative methods. We employ a simple genetic algorithm as the search engine, and we compare our results with those obtained via geometric programming. Since the adjustment of parameters in a genetic algorithm (e.g., population size, crossover and mutation rates, and maximum number of generations) is a significant problem for any application, we present our own methodology to deal with this problem. A prototype of this system is currently being tested in Mexico, in order to evaluate its potential as a reliable design tool for real-world applications.
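
As a rough illustration of the search scheme the authors describe, the sketch below runs a simple genetic algorithm over beam width, effective depth and steel ratio, penalising designs whose Whitney stress-block moment capacity falls short of a required moment of resistance. All costs, strengths and GA settings are invented placeholders, not the paper's data.

```python
# Toy GA for least-cost rectangular RC beam design (illustrative values only).
import random

FC, FY = 25.0, 415.0          # concrete / steel strength (MPa), assumed
M_REQ = 250e6                 # required moment of resistance (N*mm), assumed
C_CONC, C_STEEL, C_FORM = 1.0, 60.0, 0.02   # relative unit costs, assumed

def moment_capacity(b, d, rho):
    """Whitney stress-block flexural capacity (N*mm)."""
    return rho * FY * b * d**2 * (1.0 - 0.59 * rho * FY / FC)

def cost(ind):
    b, d, rho = ind
    c = C_CONC * b * d + C_STEEL * rho * b * d + C_FORM * (b + 2 * d)
    deficit = max(0.0, M_REQ - moment_capacity(b, d, rho))
    return c + 1e-3 * deficit           # penalise strength violations

def random_ind():
    return [random.uniform(200, 500),     # b (mm)
            random.uniform(300, 900),     # d (mm)
            random.uniform(0.005, 0.03)]  # steel ratio rho

def ga(pop_size=60, generations=200, pm=0.2):
    pop = [random_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, c = random.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, c)]   # arithmetic crossover
            if random.random() < pm:                      # mutate one gene
                i = random.randrange(3)
                child[i] *= random.uniform(0.9, 1.1)
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

best = ga()
print("b=%.0f mm  d=%.0f mm  rho=%.4f  cost=%.1f" % (*best, cost(best)))
```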

112 citations


Journal ArticleDOI
TL;DR: In this article, the authors used the building energy simulation computer program DOE-2 to carry out a parametric study of a generic high-rise air-conditioned office building in Hong Kong.

86 citations


Journal ArticleDOI
01 Mar 1997
TL;DR: This paper discusses design technology issues for embedded systems using processor cores, with a focus on software compilation tools, and conducts a comprehensive survey of both existing and new software compilation techniques that are considered important in the context of embedded processors.
Abstract: The increasing use of embedded software, often implemented on a core processor in a single-chip system, is a clear trend in the telecommunications, multimedia, and consumer electronics industries. A companion paper (Paulin et al., 1997) presents a survey of application and architecture trends for embedded systems in these growth markets. However, the lack of suitable design technology remains a significant obstacle in the development of such systems. One of the key requirements is more efficient software compilation technology. Especially in the case of fixed-point digital signal processor (DSP) cores, it is often cited that commercially available compilers are unable to take full advantage of the architectural features of the processor. Moreover, due to the shorter lifetimes and the architectural specialization of many processor cores, processor designers are often compelled to neglect the issue of compiler support. This situation has resulted in an increased research activity in the area of design tool support for embedded processors. This paper discusses design technology issues for embedded systems using processor cores, with a focus on software compilation tools. Architectural characteristics of contemporary processor cores are reviewed and tool requirements are formulated. This is followed by a comprehensive survey of both existing and new software compilation techniques that are considered important in the context of embedded processors.

75 citations


Journal ArticleDOI
TL;DR: An abstract model, called Labyrinth, allows the design of platform-independent hypermedia applications; the categorisation, generalisation and abstraction of sparse, unstructured, heterogeneous information in multiple interconnected levels; and the creation of personalisations in multiuser hyperdocuments.

35 citations


Journal ArticleDOI
TL;DR: In this article, von Mises stress and strain distributions are calculated by the finite-element method (FEM); knowledge of these distributions provides a good design guideline for accurately locating the piezoresistors.
Abstract: The main features involved in the design of a pressure sensor are the maximum non-destructive pressure and the sensitivity. In this work, these two characteristics are related to the following design variables: the dimensions of the membrane and the mechanical properties of the selected material. Von Mises stress and strain distributions have been calculated by the finite-element method (FEM). Knowledge of these distributions provides a good design guideline for accurately locating the piezoresistors. The results obtained have been applied to the design of silicon microsensors for biomedical and domestic applications.
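
A quick plate-theory estimate shows the kind of design guideline the abstract refers to: for a clamped square membrane the peak bending stress sits at the midpoints of the edges, which is where piezoresistors are normally placed. The coefficients are the classical Timoshenko values; the silicon properties and geometry are illustrative assumptions.

```python
# Clamped square membrane under pressure: small-deflection plate estimates.
E, NU = 169e9, 0.22          # Si Young's modulus (Pa), Poisson ratio (assumed)
a, h, p = 1e-3, 20e-6, 100e3 # side (m), thickness (m), pressure (Pa), assumed

D = E * h**3 / (12 * (1 - NU**2))        # flexural rigidity
w_max = 0.00126 * p * a**4 / D           # centre deflection (m)
sigma_max = 0.308 * p * (a / h)**2       # peak stress at edge midpoint (Pa)

print(f"centre deflection: {w_max*1e6:.2f} um")
print(f"peak edge stress : {sigma_max/1e6:.1f} MPa")
```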

32 citations


Journal ArticleDOI
18 May 1997
TL;DR: In this article, a general design tool for small and extra-small electric and magnetic devices is described. Due to their dimensions, and especially their short axial length compared to the radial dimensions, a three-dimensional analysis of the field is required.
Abstract: The paper describes a general design tool that can be used for small and extra-small electric and magnetic devices. Such devices generate micro motions. Due to their dimensions, and especially their short axial length compared to the radial dimensions, a three-dimensional analysis of the field is required. Complex geometries call for a well-tuned engineering tool to guarantee reliable results. Here, a standard three-dimensional finite element approach is chosen to compute the electric and magnetic field quantities. In combination with a user-friendly computer interface controlling the necessary finite element procedures, a powerful engineering tool with a general application range is obtained. For that reason it was possible to apply numerical optimization algorithms to realize a fully automated design tool. Various configurations are studied using the same software tools. The paper focuses on the application of the finite element method (FEM).

32 citations


Journal Article
TL;DR: RaPID as mentioned in this paper is a tool for the design and implementation of optimal PID controllers that integrates data acquisition, system identification and optimal PID control design, which allows for an easy model-based optimization of any PID controller.
Abstract: RaPID (Robust Advanced PID Control) is a new and original engineering tool for the design and implementation of optimal PID controllers. It integrates data acquisition, system identification and optimal PID control design. Special care has been taken in emphasizing engineering intuition and requirements as opposed to the mathematical details. The result is an extremely user-friendly design tool which allows for an easy model-based optimization of any PID controller.
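
The abstract gives no algorithmic detail, but a model-based PID optimisation of the kind RaPID automates can be sketched as follows: simulate an identified first-order-plus-dead-time model in closed loop and minimise the integrated squared error of a step response over the three gains. The plant model, dead time and optimiser choice are assumptions for illustration, not the tool's actual method.

```python
# Model-based PID gain optimisation on an identified FOPDT plant (a sketch).
import numpy as np
from scipy.optimize import minimize

K, TAU, LAG, DT, T_END = 2.0, 5.0, 1.0, 0.01, 30.0   # assumed identified model

def ise(gains):
    kp, ki, kd = gains
    buf = [0.0] * int(LAG / DT)        # dead-time buffer on the plant input
    y = integ = prev_e = cost = 0.0
    for _ in range(int(T_END / DT)):
        e = 1.0 - y                    # unit setpoint step
        integ += e * DT
        deriv = (e - prev_e) / DT
        prev_e = e
        buf.append(kp * e + ki * integ + kd * deriv)
        y += DT * (-y + K * buf.pop(0)) / TAU   # first-order plant update
        if not np.isfinite(y):
            return 1e9                 # unstable gains: reject
        cost += e * e * DT
    return cost

res = minimize(ise, x0=[1.0, 0.2, 0.0], method="Nelder-Mead")
print("Kp=%.3f  Ki=%.3f  Kd=%.3f  ISE=%.4f" % (*res.x, res.fun))
```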

28 citations


Patent
01 Aug 1997
TL;DR: In this article, a component design tool extracts a parameter associated with a component defined in a model of a physical system, discretizes the component and generates a matrix representative of a specified parameter.
Abstract: A component design tool extracts a parameter associated with a component defined in a model of a physical system. The design tool discretizes the component and generates a matrix representative of a specified parameter. By subdividing the matrix into a hierarchy of submatrices and iteratively compressing and blending the submatrices, the design tool produces a compressed matrix. The compressed matrix is efficiently solved using iterative techniques. From the solution of the matrix, the design tool calculates the specified parameter.
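
The patent text maps naturally onto what is now called hierarchical matrix compression. The sketch below is an interpretation rather than the patented method: it recursively splits a smooth kernel matrix, replaces off-diagonal blocks by truncated-SVD low-rank factors when that saves storage, and provides the fast matrix-vector product an iterative solver (e.g. CG or GMRES) would call.

```python
# Hierarchical block compression of a smooth kernel matrix (illustrative).
import numpy as np

def compress(block, tol=1e-8):
    """Low-rank factorisation of an off-diagonal block, if it saves storage."""
    u, s, vt = np.linalg.svd(block, full_matrices=False)
    r = max(1, int(np.sum(s > tol * s[0])))
    m, n = block.shape
    if r * (m + n) < m * n:
        return ("lowrank", u[:, :r] * s[:r], vt[:r])
    return ("dense", block)

def build(A, leaf=64):
    """Subdivide into a hierarchy: diagonal blocks recurse, off-diagonals compress."""
    n = A.shape[0]
    if n <= leaf:
        return ("dense", A)
    h = n // 2
    return ("split", h, build(A[:h, :h], leaf), compress(A[:h, h:]),
            compress(A[h:, :h]), build(A[h:, h:], leaf))

def matvec(node, x):
    """y = A @ x using the compressed hierarchy."""
    if node[0] == "dense":
        return node[1] @ x
    if node[0] == "lowrank":
        return node[1] @ (node[2] @ x)
    _, h, a11, a12, a21, a22 = node
    return np.concatenate([matvec(a11, x[:h]) + matvec(a12, x[h:]),
                           matvec(a21, x[:h]) + matvec(a22, x[h:])])

pts = np.linspace(0.0, 1.0, 512)
A = 1.0 / (1.0 + np.abs(pts[:, None] - pts[None, :]))   # smooth kernel
tree = build(A)
x = np.random.default_rng(0).standard_normal(512)
err = np.linalg.norm(matvec(tree, x) - A @ x) / np.linalg.norm(A @ x)
print(f"relative matvec error: {err:.2e}")
```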

28 citations


Journal ArticleDOI
TL;DR: The three psychological experiments presented here indicate that it is possible to express interactive behavior in a more direct fashion by letting the designers compose software from interaction elements with built-in behavior.

Journal ArticleDOI
TL;DR: In this article, the authors present an algorithm to extend the design application of the Statistical Energy Analysis (SEA) method through prediction of the variances of RMS responses in automotive structures and interior spaces.
Abstract: Sound and vibration transmission modeling methods are important to the design process for high-quality automotive vehicles. Statistical Energy Analysis (SEA) is an emerging design tool for the automotive industry that was initially developed in the 1960's to estimate root-mean-square sound and vibration levels in structures and interior spaces. Although developed to estimate statistical mean values, automotive design application of SEA needs the additional ability to predict statistical variances of the predicted mean values of sound and vibration. This analytical ability would allow analysis of vehicle sound and vibration response sensitivity to changes in vehicle design specifications and their statistical distributions. This paper presents an algorithm to extend the design application of the SEA method through prediction of the variances of RMS responses of vibro-acoustic automobile structures and interior spaces from variances in SEA automotive model physical parameters. The variance analysis is applied to both a simple, complete illustrative example and a more complex automotive vehicle example. Example variance results are verified through comparison with a Monte Carlo test of 2,000 SEA responses whose physical parameters were given Gaussian distributions with means at design values. Analytical predictions of the response statistics agree with the statistics generated by the Monte Carlo method but require only about 1/300 of the computational effort.
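
The paper's variance algorithm is not reproduced in the abstract, but first-order propagation of parameter variances through an SEA power balance, checked against a 2,000-sample Monte Carlo run as in the paper, can be sketched on a toy two-subsystem model. The loss factors, their scatter and the excitation are invented values.

```python
# First-order variance propagation vs Monte Carlo on a toy SEA power balance.
import numpy as np

OMEGA, P_IN = 2 * np.pi * 500.0, 1.0     # band centre (rad/s), input power (W)

def response(p):
    """Energy of subsystem 2 from the two-subsystem SEA power balance."""
    eta1, eta2, eta12, eta21 = p
    A = OMEGA * np.array([[eta1 + eta12, -eta21],
                          [-eta12,       eta2 + eta21]])
    return np.linalg.solve(A, [P_IN, 0.0])[1]

mean = np.array([0.01, 0.02, 0.005, 0.003])   # loss factors, assumed
std = 0.1 * mean                              # 10 % parameter scatter, assumed

# first-order propagation: Var(f) ~ sum_i (df/dp_i)^2 Var(p_i)
grad = np.array([(response(mean + h) - response(mean - h)) / (2 * h[i])
                 for i, h in enumerate(np.diag(1e-6 * mean))])
var_analytic = np.sum(grad**2 * std**2)

# Monte Carlo check: 2,000 Gaussian samples with means at design values
rng = np.random.default_rng(0)
samples = rng.normal(mean, std, size=(2000, 4))
var_mc = np.var([response(s) for s in samples])

print(f"first-order std: {np.sqrt(var_analytic):.3e}")
print(f"Monte Carlo std: {np.sqrt(var_mc):.3e}")
```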

Patent
27 Oct 1997
TL;DR: In this article, the authors present a method for generating a base design in the form of one or more data files including assignment data, and a variation design is created by adding at least one additional assignment associated with the variation design to the assignment data.
Abstract: A method is provided in which a base design is generated in the form of one or more data files including assignment data. A variation design is created by adding at least one additional assignment associated with the variation design to the assignment data. The assignment data has an identifier that is associated with an entity defined within the base design, a first data field that can be used in making an assignment to the entity within the base design and a second data field for use in making the additional assignment to the entity within the variation design. The data files are compiled to generate a base output file and one or more variation output design files that can include one or more common result values. Comparison data is generated by comparing the common result values associated with the base design file and the variation design file. A design tool is provided for use with a computer system having a processor. The design tool includes a selector and a variation mechanism. Both the selector and variation mechanism are configured to run on the processor and are capable of accepting inputs from a user. The selector generates a base design in the form of one or more data files including assignment data. The variation mechanism generates a variation design by adding at least one additional assignment associated with the variation design.
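
A toy rendering of the patent's data model may help: each assignment carries an entity identifier, a base-design field and a variation field; "compiling" both designs and comparing the common result values yields the comparison data. The entities and the result computation below are invented for the demonstration.

```python
# Base design plus variation via two-field assignments (illustrative model).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Assignment:
    entity: str                        # identifier of an entity in the design
    base: float                        # first data field: base-design value
    variation: Optional[float] = None  # second data field: variation value

assignments = [Assignment("clock_mhz", 50.0),
               Assignment("bus_width", 16.0, variation=32.0)]

def compile_design(use_variation):
    """Stand-in 'compiler': resolve assignments, then derive result values."""
    values = {a.entity: a.variation if use_variation and a.variation is not None
              else a.base for a in assignments}
    values["throughput_MBps"] = values["clock_mhz"] * values["bus_width"] / 8.0
    return values

base_out, var_out = compile_design(False), compile_design(True)
comparison = {k: (base_out[k], var_out[k])
              for k in base_out if base_out[k] != var_out[k]}
print("comparison data:", comparison)
```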

Journal ArticleDOI
TL;DR: In this paper, the authors discuss two-dimensional distributed transducer shape and shading and their implications for the active control of plates, and show that transducers can be combined with gain-weighting to provide a close approximation of continuously shaded transducer distributions.

Proceedings ArticleDOI
07 Apr 1997
TL;DR: A finite element code was developed for the prediction of the on-orbit static and modal dynamic performance of inflatable antennas and inflatable solar concentrators and enables one to design an inflatable antenna and predict its surface accuracy and RF antenna parameters.
Abstract: A finite element code was developed for the prediction of the on-orbit static and modal dynamic performance of inflatable antennas and inflatable solar concentrators. The computer program for the Finite Element Analysis of Inflatable Membranes (FAIM) is a geometric nonlinear finite element solver with nonlinear material capability. The code was interfaced to an RF antenna code, a ray-tracing code, and a commercially available graphical pre- and post-processor. The result was an integrated set of tools for the analysis and design of inflatable antennas and concentrators. This enables one to design an inflatable antenna and predict its surface accuracy and RF antenna parameters. For solar concentrators, the companion code, RAYTRK, calculates the solar collected intensity and concentration ratios. The code calculates the deformations and stresses due to the applied loads and outputs the stiffness and mass matrices for use by a companion code that calculates the natural frequencies and mode shapes.

Proceedings ArticleDOI
13 Jun 1997
TL;DR: In this article, an integrated finite element-control methodology for the design/analysis of smart composite structures is presented, which includes finite element modeling; control algorithms; and deterministic, fuzzy and probabilistic optimization and integrity assessment of the structures and control systems.
Abstract: This paper presents an integrated finite element-control methodology for the design/analysis of smart composite structures. The method forms part of an effort to develop an integrated computational tool that includes finite element modeling; control algorithms; and deterministic, fuzzy and probabilistic optimization and integrity assessment of the structures and control systems. The finite element analysis is based on a 20-node thermopiezoelectric composite element for modeling the composite structure with surface-bonded piezoelectric sensors and actuators; control is based on the linear quadratic regulator and the independent modal space control methods. The method has been implemented in a computer code called SMARTCOM. Several example problems have been used to verify various aspects of the formulations, and the analysis results from the present study compare well against other numerical or experimental results. Being based on the finite element method, the present formulation can be conveniently used for the analysis and design of smart composite structures with complex geometrical configurations and loadings.
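
The control half of the methodology can be illustrated in a few lines: an LQR gain computed for a small modal model of a structure driven by one piezoelectric actuator. The modal frequencies, damping, actuator influence vector and weights are assumed values; the paper couples such a regulator to its 20-node thermopiezoelectric finite element model.

```python
# LQR design on a two-mode modal model with one actuator (illustrative).
import numpy as np
from scipy.linalg import solve_continuous_are

w = np.array([2 * np.pi * 12.0, 2 * np.pi * 47.0])   # modal freqs (rad/s)
z = 0.01                                             # modal damping ratio
b = np.array([0.7, -0.4])                            # modal actuator influence

# state x = [q1, q2, qdot1, qdot2]
A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [-np.diag(w**2),  -np.diag(2 * z * w)]])
B = np.concatenate([np.zeros(2), b]).reshape(4, 1)

Q = np.diag([w[0]**2, w[1]**2, 1.0, 1.0])            # weight modal energy
R = np.array([[1e-3]])                               # actuator effort weight

P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)                      # feedback law u = -K x
print("LQR gain:", K.round(2))
print("damping improved:",
      np.max(np.linalg.eigvals(A - B @ K).real) < np.max(np.linalg.eigvals(A).real))
```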

Proceedings ArticleDOI
20 Apr 1997
TL;DR: Dynamic simulation is used to optimize the design of an existing micro-electromechanical device, called the manipulation chip (M-Chip), which contains in excess of 10,000 moving actuators, called resonators, that oscillate torsionally at a few kHz.
Abstract: We use dynamic simulation to optimize the design of an existing micro-electromechanical (MEM) device, called the manipulation chip (M-Chip). This device contains in excess of 10,000 moving actuators, called resonators, which oscillate torsionally at a few kHz. Parts dropped on the chip's surface are conveyed in a single direction. Given the enormous number of moving parts, it is impractical to attempt to measure the device's (or part's) dynamic state during a manipulation task. Yet knowing this information is crucial for redesign and optimization. We make use of a powerful dynamic simulation tool, called "Impulse", to generate synthetic measurements over a range of experiments. From these results, we suggest redesign options which debug existing problems and improve the feed rate. The array is found to behave similarly to a viscous spring-loaded conveyor belt; most of its energy is spent on driving the part vertically, calling for a more efficient design.

Proceedings ArticleDOI
12 May 1997
TL;DR: In this article, the authors examine the implications of high frequencies for the theoretical basis of acoustic impedance models and double the upper limit of validated models to around 12,000-13,000 Hz.
Abstract: The ability to design, build, and test miniaturized acoustic treatment panels on scale-model fan rigs representative of the full-scale engine provides not only a cost saving but also an opportunity to optimize the treatment by allowing tests of different designs. To be able to use scale-model treatment as a full-scale design tool, the designer must be able to reliably translate the scale-model design and performance to an equivalent full-scale design. The key to this accomplishment is the acoustic treatment impedance parameter. Current full-scale impedance models for acoustic treatment panels have been validated by measurement only up to about 6000 Hz. The purpose of this study is to examine the implications of high frequencies for the theoretical basis of the acoustic impedance models and to double the upper limit of validated models to around 12,000-13,000 Hz. Ultimately, it will be necessary to extend the models to 40,000 Hz or higher to realize the full potential of scale-model treatment as a design tool.
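
One reason high frequencies strain the theory can be shown with a scaling check: in a 1:2 scale model run at twice the frequency, treatment hole radii shrink by a factor of 2 but the viscous boundary layer in the holes shrinks only by the square root of 2, so viscosity matters relatively more in the model and simple impedance scaling degrades. The hole radius below is an assumed value, not from the paper.

```python
# Viscous boundary layer vs hole radius at full scale and in a 1:2 model.
import math

NU = 1.5e-5            # kinematic viscosity of air (m^2/s)
scale = 2.0            # 1:2 scale model
f_full = 6000.0        # validated full-scale limit (Hz)
r_full = 0.5e-3        # facesheet hole radius, full scale (m), assumed

for name, f, r in [("full scale", f_full, r_full),
                   ("1:2 model ", scale * f_full, r_full / scale)]:
    delta = math.sqrt(2 * NU / (2 * math.pi * f))   # viscous boundary layer
    print(f"{name}: f = {f:7.0f} Hz  r/delta = {r / delta:5.1f}")
```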

01 Jan 1997
TL;DR: The ENERGY-10 program automates many of the time-consuming tasks of building energy simulation, shortening the time required from hours or days to minutes; planned enhancements, including a capability for photovoltaics simulation, are outlined.
Abstract: A major barrier to using energy simulation tools during the design process of a building has been the difficulty of using the available programs. The ENERGY-10 program overcomes this hurdle by automating many of the time-consuming tasks, shortening the time required from hours or days to minutes. Building descriptions are created automatically based on defaults. The APPLY and RANK features speed the process of comparing the performance of energy-efficient strategies by automatically modifying the building description and sequencing the operations. Graphical output greatly aids the process of assimilating and understanding the results. This paper describes the program’s features, simulation engines, the associated design guidelines book, and the workshop training program. It also outlines planned enhancements including a capability for photovoltaics simulation, and the steps required to make the program useful outside the United States.

Journal ArticleDOI
TL;DR: A method of morphological filter design using GAs is described, together with an efficient parallel implementation which allows the use of massively parallel computers or inhomogeneous clusters of workstations.
Abstract: Mathematical morphology has produced an important class of nonlinear filters. Unfortunately, existing design methods for these types of filter tend to be computationally intractable or require expert knowledge of mathematical morphology. Genetic algorithms (GAs) provide useful tools for optimization problems made difficult by substantial complexity and uncertainty. Although genetic algorithms are easy to understand and simple to implement in comparison with deterministic design methods, they tend to require long computation times. But the structure of a genetic algorithm lends itself well to parallel implementation, and by parallelizing the GA, major improvements in computation time can be achieved. A method of morphological filter design using GAs is described, together with an efficient parallel implementation which allows the use of massively parallel computers or inhomogeneous clusters of workstations.
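
A compact version of the approach, with invented test data: each chromosome encodes a 3x3 binary structuring element, fitness is the error of the resulting opening filter against a clean reference image, and the population's fitness evaluations are distributed across worker processes, the step that parallelises so well.

```python
# GA design of a morphological structuring element with parallel fitness.
import numpy as np
from multiprocessing import Pool
from scipy.ndimage import binary_opening

rng = np.random.default_rng(1)
clean = np.zeros((64, 64), bool); clean[16:48, 16:48] = True
noisy = clean ^ (rng.random(clean.shape) < 0.05)       # salt noise

def fitness(chrom):
    se = np.array(chrom, bool).reshape(3, 3)
    if not se.any():
        return 1.0
    return np.mean(binary_opening(noisy, structure=se) != clean)

def evolve(pop_size=32, gens=40):
    pop = [tuple(rng.integers(0, 2, 9)) for _ in range(pop_size)]
    with Pool() as pool:                      # parallel fitness evaluation
        for _ in range(gens):
            fit = pool.map(fitness, pop)
            order = np.argsort(fit)
            elite = [pop[i] for i in order[:pop_size // 2]]
            kids = []
            for _ in range(pop_size - len(elite)):
                a, b = rng.choice(len(elite), 2, replace=False)
                cut = rng.integers(1, 8)
                kid = list(elite[a][:cut] + elite[b][cut:])  # 1-point crossover
                if rng.random() < 0.2:                       # bit-flip mutation
                    i = rng.integers(9); kid[i] ^= 1
                kids.append(tuple(kid))
            pop = elite + kids
        fit = pool.map(fitness, pop)
    return pop[int(np.argmin(fit))]

if __name__ == "__main__":
    print(np.array(evolve(), bool).reshape(3, 3).astype(int))
```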

Proceedings ArticleDOI
02 Jun 1997
TL;DR: In this article, a genetic algorithm coupled with a versatile preliminary design tool is employed to demonstrate the concept of an autonomous design procedure for gas turbine combustors with user-specified performance criteria.
Abstract: A genetic algorithm, coupled with a versatile preliminary design tool, is employed to demonstrate the concept of an autonomous design procedure for gas turbine combustors with user-specified performance criteria. The chosen preliminary design program utilises a network-based approach which provides considerable geometric flexibility, allowing a wide variety of combustor types to be represented. The physical combustor is represented by a number of independent, though interconnected, semi-empirical sub-flows or elements. A full conjugate heat transfer model allows convection, conduction and radiative heat transfer to be modelled, and a constrained equilibrium calculation simulates the combustion process. The genetic algorithm, whose main advantage lies in its robustness, uses the network solver to progress towards the optimum design parameters defined by the user. The capabilities of the genetic program are demonstrated for some simple design requirements, for example zone fuel/air ratio, pressure drop and wall temperatures.

01 Jan 1997
TL;DR: It is argued that the extensive quantitative input required by such tools produces a psychological separation between the act of design and the act of analysis, and that existing building design and performance analysis tools poorly serve the needs of the building designer.
Abstract: A significant amount of the research referred to by Manning has been directed into the development of computer software for building simulation and performance analysis. A wide range of computational tools are now available and see relatively widespread use in both research and commercial applications. The focus of development in this area has long been on the accurate simulation of fundamental physical processes, such as the mechanisms of heat flow through materials, turbulent air movement and the inter-reflection of light. The adequate description of boundary conditions for such calculations usually requires a very detailed mathematical model. This has tended to produce tools with a very engineering-oriented and solution-based approach. Whilst becoming increasingly popular amongst building services engineers, there has been a relatively slow response to this technology amongst architects. There are some areas of the world, particularly the UK and Germany, where the use of such tools on larger projects is routine. However, this is almost exclusively during the latter stages of a project and usually for purposes of plant sizing or final design validation. The original conceptual work, the building form and the selection of materials are instead the result of an aesthetic and intuitive process, sometimes based solely on precedent. There is no argument that an experienced designer is capable of producing an excellent design in this way. However, not all building designers are experienced, and even fewer have a complete understanding of the fundamental physical processes involved in building performance. These processes can be complex, highly inter-related and often even counter-intuitive. It is the central argument of this thesis that the needs of the building designer are quite different from the needs of the building services engineer, and that existing building design and performance analysis tools poorly serve these needs. It will be argued that the extensive quantitative input requirement in such tools acts to produce a psychological separation between the act of design and the act of analysis. At the conceptual stage, building geometry is fluid and subject to constant change, with solid quantitative information relatively scarce. Having to measure off surface areas or search out the emissivity of a particular material forces the designer to think mathematically at a time when they are thinking intuitively. It is, however, at this intuitive stage that the greatest potential exists for performance efficiencies and environmental economies. The right orientation and fenestration choice can halve the air-conditioning requirement. Incorporating passive solar elements and natural ventilation pathways can eliminate it altogether. The building form can even be designed to provide shading using its own fabric, without any need for additional structure or applied shading. It is significantly more difficult and costly to retrofit these features at a later stage in a project's development. If the role of the design tool is to serve the design process, then a new approach is required to accommodate the conceptual phase. This thesis presents a number of ideas on what that approach may be, accompanied by some example software that demonstrates their implementation.

Proceedings ArticleDOI
22 Mar 1997
TL;DR: Users of this software construction kit can design layouts for virtual spaces based on Kevin Lynch's elements of the city image: districts, paths, edges, nodes, and landmarks.
Abstract: Users of this software construction kit can design layouts for virtual spaces. The elements of the software kit are based on Kevin Lynch's elements of the city image: districts, paths, edges, nodes, and landmarks (Lynch, 1960; Banerjee & Southworth, 1990).

Patent
01 Jul 1997
TL;DR: In this paper, a process and design tool are presented for the accurate prediction of design parameters (42) for components (38) of an integrated circuit (22) during the early stages of the design of that integrated circuit.
Abstract: A process (20) and design tool (62) are presented for the accurate prediction of design parameters (42) for components (38) of an integrated circuit (22) during the early stages of the design of that integrated circuit (22). These predicted design parameters (42) include pin count parameters (50), propagation delay parameters (52), layout area parameters (54), dynamic power parameters (56), static power parameters (58), and total power parameters (60). With these parameters, the designer interactively modifies the design prior to the layout and prototyping of the integrated circuit (22). The dynamic power parameters (56) and total power parameters (60) may be repetitively predicted with differing input items to establish a power usage pattern for the integrated circuit (22).
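
The flavour of such pre-layout prediction can be shown with the standard CMOS power relations; the gate count, per-gate capacitance, switching activity and leakage below are illustrative assumptions, not figures from the patent.

```python
# Early (pre-layout) dynamic/static/total power estimate for an IC.
N_GATES = 50_000       # estimated gate count, assumed
C_GATE  = 15e-15       # switched capacitance per gate (F), assumed
V_DD    = 3.3          # supply voltage (V)
F_CLK   = 33e6         # clock frequency (Hz)
ALPHA   = 0.15         # average switching activity, assumed
I_LEAK  = 10e-12       # leakage current per gate (A), assumed

p_dynamic = ALPHA * C_GATE * V_DD**2 * F_CLK * N_GATES   # P = a*C*V^2*f per gate
p_static  = I_LEAK * V_DD * N_GATES
print(f"dynamic {p_dynamic:.3f} W  static {p_static*1e6:.1f} uW  "
      f"total {p_dynamic + p_static:.3f} W")
```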

Journal ArticleDOI
TL;DR: The cascade strategy available in the combined COMETBOARDS, FLOPS, and NEPP design tool converges to the same global optimum solution even when it starts from different design points; this reliable and robust tool eliminates manual intervention in the design of aircraft and of air-breathing propulsion engines and eases the cycle analysis procedures.
Abstract: Design optimization for subsonic and supersonic aircraft and for air-breathing propulsion engine concepts has been accomplished by soft-coupling the Flight Optimization System (FLOPS) and the NASA Engine Performance Program analyzer (NEPP) to the NASA Lewis multidisciplinary optimization tool COMETBOARDS. Aircraft and engine design problems, with their associated constraints and design variables, were cast as nonlinear optimization problems with aircraft weight and engine thrust as the respective merit functions. Because of the diversity of constraint types and the overall distortion of the design space, the most reliable single optimization algorithm available in COMETBOARDS could not produce a satisfactory feasible optimum solution. Some of COMETBOARDS' unique features, which include a cascade strategy, variable and constraint formulations, and scaling devised especially for difficult multidisciplinary applications, successfully optimized the performance of both aircraft and engines. The cascade method has two principal steps: in the first, the solution initiates from a user-specified design and optimizer; in the second, the optimum design obtained in the first step, with some random perturbation, is used to begin the next specified optimizer. The second step is repeated for a specified sequence of optimizers or until a successful solution of the problem is achieved. A successful solution should satisfy the specified convergence criteria and have several active constraints but no violated constraints. The cascade strategy available in the combined COMETBOARDS, FLOPS, and NEPP design tool converges to the same global optimum solution even when it starts from different design points. This reliable and robust design tool eliminates manual intervention in the design of aircraft and of air-breathing propulsion engines and eases the cycle analysis procedures. The combined code is also much easier to use, which is an added benefit. This paper describes COMETBOARDS and its cascade strategy and illustrates the capability of the combined design tool through the optimization of a subsonic aircraft and a high-bypass-turbofan wave-rotor-topped engine.
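
The cascade idea is easy to prototype. The sketch below is a stand-in rather than the COMETBOARDS implementation: it chains optimisers on a small constrained test problem, perturbing each incumbent before handing it to the next optimiser, and shows two different starting designs reaching the same optimum.

```python
# Cascade of optimisers with random perturbation between stages (a sketch).
import numpy as np
from scipy.optimize import minimize

def merit(x):                     # stand-in for aircraft weight / engine thrust
    return (x[0] - 1) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

cons = [{"type": "ineq", "fun": lambda x: 1.5 - x[0] ** 2 - x[1] ** 2}]

def cascade(x0, optimizers=("COBYLA", "SLSQP", "SLSQP"), seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    for i, method in enumerate(optimizers):
        res = minimize(merit, x, method=method, constraints=cons)
        x = res.x
        if i < len(optimizers) - 1:           # perturb before the next stage
            x = x * (1 + 0.05 * rng.standard_normal(x.shape))
    return res

for start in ([2.0, 2.0], [-1.5, 0.5]):       # different design points
    r = cascade(start)
    print(f"start {start} -> x* = {r.x.round(4)}, f* = {r.fun:.6f}")
```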

Journal ArticleDOI
TL;DR: An approach to hardware-software partitioning for real-time embedded systems is presented in which components are modeled at the system level, so that cost and performance tradeoffs can be studied early in the design process and a large design space can be explored.
Abstract: In this paper, we present an approach to hardware-software partitioning for real-time embedded systems. Hardware and software components are modeled at the system level, so that cost and performance tradeoffs can be studied early in the design process and a large design space can be explored. A feasibility factor is introduced to measure the likelihood that a real-time system is feasible, and it is used as both a constraint and an attribute during the optimization process. An imprecise value function is employed to model the tradeoffs among multiple performance attributes. Optimal partitioning is achieved through the use of an existing computer-aided design tool. We demonstrate the application of our approach through the design of an example embedded system.
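
On a small system the partitioning tradeoff can be made concrete by exhaustive search: each task runs either in software (slow, no extra cost) or in hardware (fast, costly), and the cheapest partition meeting the deadline wins. The task data and deadline are invented, and the paper's feasibility factor and imprecise value function are richer than this binary check.

```python
# Toy hardware-software partitioning by exhaustive design-space exploration.
from itertools import product

# (sw_time, hw_time, hw_cost) per task; software cost taken as zero
tasks = [(9, 2, 5), (7, 3, 4), (5, 1, 6), (8, 2, 7)]
DEADLINE = 20

best = None
for choice in product((0, 1), repeat=len(tasks)):       # 0 = SW, 1 = HW
    time = sum(t[1] if c else t[0] for t, c in zip(tasks, choice))
    cost = sum(t[2] for t, c in zip(tasks, choice) if c)
    if time <= DEADLINE and (best is None or cost < best[0]):
        best = (cost, choice, time)

cost, choice, time = best
print("partition:", ["HW" if c else "SW" for c in choice],
      f" cost={cost}  time={time}")
```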

Journal ArticleDOI
TL;DR: In this article, a multilevel computation system for predicting ship motions and wave loads, up through and including extreme sea conditions, is presented, including a traditional strip theory approach and newly developed linear and nonlinear three-dimensional time-domain methods.
Abstract: Despite the limits inherent in linearized frequency-domain ship motion and wave load computer codes, strip theory has been found to provide the design community with a fairly robust, practical design tool with reasonable accuracy for most conventional displacement monohulls. However, the advent of new design concepts, including multi-hulls, the application of new materials, and the push to incorporate reliability methods within surface ship structural design criteria have highlighted the need for more rigorous methods of developing a lifetime load spectrum. In this paper, a multilevel computation system for predicting ship motions and wave loads, up through and including extreme sea conditions, is presented. This system includes a traditional strip theory approach and newly developed linear and nonlinear three-dimensional time-domain methods. The new nonlinear methods are currently being validated by the U.S. Navy. The status of the current development is presented. Sample numerical results from the new nonlinear methods are compared with both linear frequency-domain predictions and model tests.

Proceedings ArticleDOI
01 Jan 1997
TL;DR: The SMALLSAT Model as discussed by the authors is a computer-aided Phase A design and technology evaluation tool for small satellites, which enables satellite designers, mission planners, and technology program managers to observe the likely consequences of their decisions in terms of satellite configuration, non-recurring and recurring cost, and mission life cycle costs and availability statistics.
Abstract: A toolset for the rapid development of small satellite systems has been created. The objective of this tool is to support the definition of spacecraft mission concepts to satisfy a given set of mission and instrument requirements. The objective of this report is to provide an introduction to understanding and using the SMALLSAT Model. SMALLSAT is a computer-aided Phase A design and technology evaluation tool for small satellites. SMALLSAT enables satellite designers, mission planners, and technology program managers to observe the likely consequences of their decisions in terms of satellite configuration, non-recurring and recurring cost, and mission life cycle costs and availability statistics. It was developed by Princeton Synergetic, Inc. and User Systems, Inc. as a revision of the previous TECHSAT Phase A design tool, which modeled medium-sized Earth observation satellites. Both TECHSAT and SMALLSAT were developed for NASA.

Proceedings ArticleDOI
W.F. Hoffman, A. Locascio
05 May 1997
TL;DR: A system for evaluating the environmental impact of product designs is presented, built around a staged approach that matches analysis tools to appropriate stages in product design; it can be extended to evaluate the environmental impact of other mechanical design applications, including housings and packaging.
Abstract: Increasing market opportunities in the European Community require product lifecycle impact analysis early in the design stage. Decisions such as material selection and processing ultimately have a big impact on our environment. In this paper, we present a system for evaluating the environmental impact of product designs. At its core, this system has a staged approach which matches analysis tools to appropriate stages in product design. In the first stage, when little detailed information is available about a product, matrix approaches are used. These have been pioneered by Graedel and Allenby (1994, 1995). We have adapted the matrix method to match the sphere of influence of our design team and the likelihood that change can be made. The focus is to make changes where possible while still including impacts from outside our sphere of influence. The second stage in product design starts during the design of piece parts. After a considerable search, no design tool was available for this stage in product development at the time. To meet the needs of designers during this stage we developed a software system for determining an environmental score, beginning with RF shield components and continuing with other design applications. A team of engineers and researchers from across the corporation determined eight criteria to measure the shield's environmental score. The criteria are diverse (different ranges, units, and importance levels); they are often in competition with each other (improvements in one result in worsening of others); and each impacts the overall score differently. These factors are all considered in the scoring system. A decision analysis method was used to combine the eight criteria into a single metric. A value function for each criterion was constructed based on its impact, and the resulting values were aggregated into an overall multicriteria value function. The overall environmental score depends on the current values of all criteria as well as the importance weightings for each; the weightings represent allowable tradeoffs between competing criteria. The environmental scoring software can easily be extended to evaluate the environmental impact of other mechanical design applications, including housings and packaging. Work is underway to extend the system application to whole products. In the final stage of product design, prototype manufacture, traditional life cycle tools can be employed. The scoring algorithms developed in the previous tool can be used for assessment of the results of a Life Cycle Inventory. Although Life Cycle Assessment (LCA) has not been performed on a Motorola product, it is expected that, as LCA matures, LCA studies will become more common.
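
The aggregation step can be sketched directly: each criterion gets a value function mapping its raw measurement onto [0, 1] (1 = best), and the overall score is the weighted sum. The criteria, ranges and weights below are placeholders, not Motorola's actual eight-criterion set.

```python
# Weighted multicriteria value function for an environmental score (a sketch).
def linear_value(x, worst, best):
    """Linear value function; also handles criteria where lower is better."""
    v = (x - worst) / (best - worst)
    return min(1.0, max(0.0, v))

# criterion: (measurement, worst, best, weight) -- illustrative only
criteria = {
    "mass (g)":             (120.0, 300.0,  50.0, 0.20),
    "recycled content (%)":  (30.0,   0.0, 100.0, 0.15),
    "energy use (kJ)":      (800.0, 2000.0, 100.0, 0.25),
    "toxic materials (#)":    (1.0,   5.0,   0.0, 0.40),
}

score = sum(w * linear_value(x, worst, best)
            for x, worst, best, w in criteria.values())
print(f"environmental score: {score:.2f} (0 = worst, 1 = best)")
```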

DOI
02 Apr 1997
TL;DR: The new version of DAISY (Activated Sludge DYnamic simulator), version 2.0, introduces an optimization module that automatically estimates the optimum dimensions of an activated sludge plant so as to minimize a global cost function that can include effluent quality, construction cost and plant restrictions.
Abstract: This paper presents the theoretical basis and some practical applications of the simulation program DAISY (Activated Sludge DYnamic simulator) version 2.0, with the aim of optimizing the design of complex activated sludge plants for wastewater treatment. The software, developed by CEIT, was designed for the Spanish firm CADAGUA S.A. and is currently used as a design tool by the engineering department of the company. The new version of the simulator introduces an optimization module that automatically estimates the optimum dimensions of an activated sludge plant so as to minimize a global cost function that can include effluent quality, construction cost and plant restrictions. This global cost function can combine continuous cost and penalty functions. The automatic optimization is based on a direct search algorithm introduced into the previously developed simulation software. This algorithm determines the path towards an optimum by evaluating the objective function at several points without calculating derivatives. The paper includes some examples to illustrate the vast possibilities of this kind of optimization procedure in the design of complex wastewater treatment plants, including organic matter and nutrient removal.
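
The direct search described, polling the objective at several points without derivatives, is essentially a compass or pattern search. The sketch below pairs one with a made-up construction-cost-plus-effluent-penalty stand-in for DAISY's global cost function; the cost model and starting point are invented.

```python
# Derivative-free pattern (compass) search on a toy plant-design cost.
def global_cost(x):
    volume, recycle = x
    construction = 50.0 * volume ** 0.8 + 200.0 * recycle
    effluent = 400.0 / (volume * (1.0 + recycle))     # crude quality proxy
    penalty = max(0.0, effluent - 10.0) ** 2 * 1e3    # limit violation
    return construction + effluent + penalty

def pattern_search(x0, step=1.0, tol=1e-4):
    x, fx = list(x0), global_cost(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):                   # poll neighbouring points
                trial = x[:]; trial[i] += d
                if trial[i] > 0 and (ft := global_cost(trial)) < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                               # no better point: refine
    return x, fx

x_opt, f_opt = pattern_search([10.0, 1.0])
print(f"volume = {x_opt[0]:.2f}, recycle = {x_opt[1]:.2f}, cost = {f_opt:.1f}")
```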