
Showing papers on "Design tool published in 2002"


Journal ArticleDOI
TL;DR: In this paper, the authors present a new three-dimensional electron gun and collector design tool, which targets problem classes including gridded-guns, sheet-beam guns, multibeam devices, and anisotropic collectors.
Abstract: The development of a new three-dimensional electron gun and collector design tool is reported. This new simulation code has been designed to address the shortcomings of current beam optics simulation and modeling tools used for vacuum electron devices, ion sources, and charged-particle transport. The design tool specifically targets problem classes including gridded guns, sheet-beam guns, multibeam devices, and anisotropic collectors, with a focus on improved physics models. The code includes both structured and unstructured grid systems for meshing flexibility. A new method for accurate particle tracking through the mesh is discussed. In the area of particle emission, new models for thermionic beam representation are included that support primary emission and secondary emission. Also discussed are new methods for temperature-limited and space-charge-limited (Child's law) emission, including the Longo-Vaughn formulation. A new secondary emission model is presented that captures true secondaries and the full range of rediffused electrons. A description of the MICHELLE code is presented.
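The space-charge-limited (Child's law) emission mentioned above has a closed form for the idealized planar diode, which the sketch below evaluates. This is only the textbook Child-Langmuir expression, not MICHELLE's emission model, and the 1 kV / 1 mm numbers are made up for illustration.

```python
import math

def child_langmuir_j(voltage_v, gap_m):
    """Space-charge-limited current density (A/m^2) for an ideal planar
    diode: J = (4*eps0/9) * sqrt(2*e/m) * V**1.5 / d**2 (Child's law)."""
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    e = 1.602176634e-19       # electron charge, C
    m = 9.1093837015e-31      # electron mass, kg
    return (4.0 * eps0 / 9.0) * math.sqrt(2.0 * e / m) * voltage_v**1.5 / gap_m**2

# Illustrative numbers: 1 kV across a 1 mm gap gives roughly 7e4 A/m^2
j = child_langmuir_j(1000.0, 1e-3)
```

Real gun codes go far beyond this one-dimensional limit (grid fields, temperature-limited transition, secondary yield), but the V^(3/2)/d^2 scaling above is the baseline they all reproduce.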

211 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a Monte Carlo simulation technique that makes explicit allowance for the probability-distributed nature of the key flood-producing variables, and for the dependencies between them, to determine derived flood frequency curves.

186 citations


Journal ArticleDOI
TL;DR: In this paper, quantitative metrics are developed that allow designers to identify products that are similar in a manner critical to the success of a design, based on the functional similarity of products.
Abstract: During the design and development of new products, design engineers use many techniques to generate and define new and "good" concepts. Inherent in this search for solutions is the conscious and unconscious reliance on prior experience and knowledge, or design-by-analogy. In this paper, a quantitative metric for design-by-analogy is developed. This metric is based on the functional similarity of products. By using this product-similarity metric, designers are able to formalize and quantify design-by-analogy techniques during concept and layout design. The methods, as developed in this paper, allow a designer with limited experience to develop sophisticated solutions that enhance the overall design of a new product. Also, a designer's current design-by-analogy vocabulary can be extended beyond his or her immediate experience, providing access and contributions to new domains by discovering different products with common functions. The similarity metric and its application are clarified and validated through a case study. The case study is the original design of a pickup winder. [DOI: 10.1115/1.1475317]

During the design and development of new products, design engineers use many techniques to generate and define new and "good" concepts. Inherent in this search for solutions is the conscious and unconscious reliance on prior experience and knowledge. Numerous attempts have been made to organize, qualify, and make accessible the critical design experience and knowledge needed to solve particular problems. Some of these techniques take the form of knowledge-based design, expert design systems, and design rules or design guidelines. In this paper, quantitative metrics are developed that allow designers to identify products that are similar in a manner critical to the success of a design. This focused identification allows these similar products to be reviewed within the context of the design problem at hand for configuration, concept, and embodiment information. These metrics allow formalized design-by-analogy efforts by identifying products that have design-critical similarity.

The paper is organized in the following way. First, the notion of similarity as used here is clarified. Toward the goal of finding the important product similarities, groundwork is developed to make comparisons between products. In the remainder of this paper, these notions of product similarity in the search for analogies are explored. Also, a procedure for applying these techniques to a design problem is presented. Lastly, an example application of the design-by-analogy techniques is applied to an original design case study. The paper concludes with a brief discussion of the contributions of the work presented here.

2 Relevant Analogies

The notions of similarity and analogies based on similarity are broad. From Moody charts to the Periodic Table, organizing schemes based on similarities and differences are critical tools in engineering and science. In fluid mechanics, the comparison of different objects based on similarities in the Reynolds number, the Biot number, or other meaningful metrics for comparison is not only commonplace but critical to the fundamental understanding of the relevant physics that affect the systems. Before developing a design tool based on analogy, the basis for making the comparison is necessary. For example, based on a color comparison, a car and a watch may be similar. In fact, they also may share the similarity of manufacturing country of origin. Reviewing a watch as an exercise to find alternative ways to mix fuel and air in the car is likely a fruitless exercise. Before searching for design information in existing and similar designs, the notion of similarity needs to be understood in the context of design.
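The paper's metric rests on functional similarity between products. As a rough illustration of the idea (not the authors' actual formulation), one can represent each product as a sparse vector of function-flow occurrences and compare products by cosine similarity; the (verb, flow) pairs below are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two sparse function-occurrence vectors
    (dicts mapping a function descriptor to a count)."""
    dot = sum(v * b.get(k, 0) for k, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical (verb, flow) function vectors for three products
drill = {("convert", "electricity"): 1, ("rotate", "solid"): 1}
mixer = {("convert", "electricity"): 1, ("rotate", "solid"): 1, ("mix", "material"): 1}
clock = {("indicate", "time"): 1}
```

Under this toy encoding a drill scores high against a mixer and zero against a clock, which is the kind of "design-critical" ranking the paper's metric is meant to deliver at scale.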

136 citations


Journal ArticleDOI
TL;DR: The strut-and-tie method (STM) is gaining recognition as a code-worthy and consistent design methodology for discontinuity (D-) regions in structural concrete as mentioned in this paper, however, the development of code provisions for the STM has been hampered by uncertainties in defining strength and dimensions of an idealized load-resisting truss.
Abstract: The strut-and-tie method (STM) is gaining recognition as a code-worthy and consistent design methodology for discontinuity (D-) regions in structural concrete. However, the development of code provisions for the STM has been hampered by uncertainties in defining strength and dimensions of an idealized load-resisting truss. In addition, the STM has been encumbered by an iterative and time-consuming design procedure in which many geometric details must be considered. To deal with the problem, researchers are developing computer-based tools, such as the authors' computer-aided strut-and-tie (CAST) design tool. CAST provides a graphical working environment for all aspects of the design process, including definition of the D- region, selection of the STM, truss analysis, member definitions, and creation of a design summary. This study details the STM, barriers to its advancement, the capabilities of computer-based design tools, and the CAST program. Suggestions are also made for future STM research.

70 citations


Journal ArticleDOI
TL;DR: In this article, an integrated, life-cycle-oriented design tool (LEGOE) is proposed and its general criteria identified for use in practice including the scope of methods as well as their scalability over the manufacture, construction, use, cleaning, maintenance, refurbishment and recycling/waste cycles of the building and its components.
Abstract: Although assessment models exist for evaluating buildings after they are designed, the problem is creating an appropriate tool for the design team to use during the design process to create sustainable (green) buildings. An integrated, life-cycle-oriented design tool (LEGOE) is proposed and its general criteria identified for use in practice including the scope of methods as well as their scalability over the manufacture, construction, use, cleaning, maintenance, refurbishment and recycling/waste cycles of the building and its components. The design tool is compared with assessment tools for green buildings. The integration of basic methods of life-cycle analysis (LCA), in particular energy and massflow analysis, into the normal practice environment (CAAD, specification and quantity surveying, energy need and building physics calculation, comfort and health risk appreciation) is considered. Feedback from the initial practical applications of the integrated tool is discussed. Conclusions concern the time-s...

64 citations


Journal ArticleDOI
TL;DR: In this article, the OPTI program (the office building module) is designed to help architects, engineers and design departments take into account the impact of design choices on energy consumption when designing a project.

63 citations


Patent
30 Apr 2002
TL;DR: In this article, a network based computer implemented system for designing products is disclosed, which comprises a network server such as an Internet web server (30) operable to manage interaction of the system with external computers, and a product design module (204) communicating with the network server.
Abstract: A network based computer implemented system for designing products is disclosed. The system comprises a network server, such as an Internet web server (30) operable to manage interaction of the system with external computers, and a product design module (204) communicating with the network server. The product design module (204) is operable to generate product designs. The system further comprises a live agent support system (44) communicating with the network server. The live agent support system (44) is operable to provide live assistance to a customer of the system. An agent (312) of the live agent support system (44) and a customer of the system simultaneously view the same page of the design tool. The customer and agent (312) also mutually communicate while collaboratively manipulating the page.

62 citations


Journal ArticleDOI
TL;DR: This paper will explore the use of a random-guided search method for multiobjective optimization of compliant mechanisms through genetic programming techniques, and describe and demonstrate the successful use of genetic programming to create a general design tool for topological synthesis of compliant mechanism design.
Abstract: Compliant mechanisms achieve desired force and displacement characteristics through elastic deformation of their structure. Current research in the synthesis of compliant mechanism topology has pursued multiobjective optimization using gradient-based search methods. This paper will explore the use of a random-guided search method for multiobjective optimization of compliant mechanisms through genetic programming techniques. The combination of genetic algorithms and compliant mechanisms is an effective and interesting combination of two biologically inspired engineering design areas. This paper will describe and demonstrate the successful use of genetic programming to create a general design tool for topological synthesis of compliant mechanisms. Features that exploit the implementation of genetic algorithms to compliant mechanism design, such as multiple criteria specification, multiple-design parameter variation, and final selections from a family of solutions, will be presented and discussed. Finally, the use of this design tool will be demonstrated on several familiar examples for validation and discussion.
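A minimal sketch of the genetic-algorithm machinery such tools build on, reduced to a bitstring "one-max" toy objective; the actual tool evolves compliant-mechanism topologies against multiple structural objectives, which this does not attempt.

```python
import random

def ga_maximize(fitness, n_bits=16, pop_size=30, gens=60, p_mut=0.05, seed=1):
    """Minimal generational GA: binary-tournament selection, one-point
    crossover, occasional single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]         # one-point crossover
            if rng.random() < p_mut:
                child[rng.randrange(n_bits)] ^= 1  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy objective ("one-max"): maximize the number of set bits
best = ga_maximize(sum)
```

In a topology-synthesis setting the bitstring would encode material presence in a design grid and the fitness would weigh several criteria, returning a family of candidate solutions rather than a single optimum.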

61 citations


Proceedings ArticleDOI
09 Mar 2002
TL;DR: In this article, the authors present a cost/benefit analysis for a prognostics and health management (PHM) system design tool that integrates a model-based FMECA methodology with state-of-the-art system simulation directly linked to downstream life cycle costs.
Abstract: Provides an update to the developments associated with a prognostics and health management (PHM) system design tool that integrates a model-based FMECA methodology with state-of-the-art system simulation directly linked to downstream life cycle costs (LCC). This design tool will seek out recommended PHM system designs based on a cost function that accurately represents key LCC variables such as system availability, maintainability, reliability, and failure mode observability. The tool will be capable of assessing PHM sensor requirement specifications at the component and subsystem levels, and will then allow for integration into a broader integrated system model. Tradeoff, sensitivity and "what if" analysis will then allow the designer/user to examine the cost/benefit relationship of either adding or removing sensors and algorithms under consideration for the PHM design. A simplified example of a health management system cost/benefit analysis on an aircraft electromechanical valve is provided for illustration of the concepts introduced.

61 citations


Proceedings ArticleDOI
08 Dec 2002
TL;DR: In this paper, an artificial neural network (ANN) metamodel is developed for the simulation model of an asynchronous assembly system and simulated annealing (SA) is used to optimize the buffer sizes in the system.
Abstract: When the systems under investigation are complex, the analytical solutions to these systems become impossible. Because of the complex stochastic characteristics of the systems, simulation can be used as an analysis tool to predict the performance of an existing system or a design tool to test new systems under varying circumstances. However, simulation is extremely time consuming for most problems of practical interest. As a result, it is impractical to perform any parametric study of system performance, especially for systems with a large parameter space. One approach to overcome this limitation is to develop a simpler model to explain the relationship between the inputs and outputs of the system. Simulation metamodels are increasingly being used in conjunction with the original simulation, to improve the analysis and understanding of decision-making processes. In this study, an artificial neural network (ANN) metamodel is developed for the simulation model of an asynchronous assembly system and an ANN metamodel together with simulated annealing (SA) is used to optimize the buffer sizes in the system.
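The metamodel-plus-SA idea can be sketched as follows, with a simple quadratic standing in for the trained ANN surrogate of the assembly system; the optimum buffer sizes (5, 7) are invented for the example.

```python
import math
import random

def simulated_annealing(cost, start, neighbor, t0=10.0, cooling=0.95, steps=400, seed=2):
    """Generic SA loop: accept uphill moves with probability exp(-d/t),
    track the best point seen while the temperature decays geometrically."""
    rng = random.Random(seed)
    x, t, best = start, t0, start
    for _ in range(steps):
        y = neighbor(x, rng)
        d = cost(y) - cost(x)
        if d <= 0 or rng.random() < math.exp(-d / t):
            x = y
            if cost(x) < cost(best):
                best = x
        t *= cooling
    return best

# Hypothetical surrogate: throughput loss is minimal at buffer sizes (5, 7)
cost = lambda b: (b[0] - 5) ** 2 + (b[1] - 7) ** 2
step = lambda b, rng: (max(1, b[0] + rng.choice((-1, 1))),
                       max(1, b[1] + rng.choice((-1, 1))))
best_buffers = simulated_annealing(cost, (1, 1), step)
```

The point of the metamodel is exactly this: evaluating `cost` is a cheap function call rather than a full discrete-event simulation run, so SA's many evaluations become affordable.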

61 citations


Journal ArticleDOI
TL;DR: In this paper, a performance-based optimization (PBO) technique is proposed for automatically producing optimal strut-and-tie models for the design and detailing of structural concrete members, which is treated as an optimal topology design problem of continuum structures.
Abstract: Conventional trial-and-error methods are not efficient in developing appropriate strut-and-tie models in complex structural concrete members. This paper describes a performance-based optimization (PBO) technique for automatically producing optimal strut-and-tie models for the design and detailing of structural concrete. The PBO algorithm utilizes the finite element method as a modeling and analytical tool. Developing strut-and-tie models in structural concrete is treated as an optimal topology design problem of continuum structures. The optimal strut-and-tie model that idealizes the load transfer mechanism in cracked structural concrete is generated by gradually removing regions that are ineffective in carrying loads from a structural concrete member based on overall stiffness performance criteria. A performance index is derived for evaluating the performance of strut-and-tie systems in an optimization process. Fundamental concepts underlying the development of strut-and-tie models are introduced. Design examples of a low-rise concrete shearwall with openings and a bridge pier are presented to demonstrate the validity and effectiveness of the PBO technique as a rational and reliable design tool for structural concrete.

Dissertation
01 Jan 2002
TL;DR: In this paper, a program valuation tool is developed and demonstrated that measures the overall program value associated with a set of either one or two new aircraft concepts, based on a combination of a performance model, a development and manufacturing cost model; a revenue model; and a dynamic programming-based algorithm that accounts for uncertainty in future market conditions.
Abstract: This research considers the commercial aircraft design process from the perspective of program value. Whereas traditionally, the conceptual design of aircraft has often focused on minimum weight, or sometimes minimum cost, this approach demonstrates the feasibility and usefulness of design based on maximum value to the aircraft manufacturer. A program valuation tool is developed and demonstrated that measures the overall program value associated with a set of either one or two new aircraft concepts. The tool is based on a combination of a performance model; a development and manufacturing cost model; a revenue model; and a dynamic programming-based algorithm that accounts for uncertainty in future market conditions and the program's ability to cope with such uncertainty through real-time decision-making. The cost model, using a component-based representation of the aircraft, allows for the consideration of the effects of part commonality on development and production costs. The revenue model, based on an analysis of existing commercial aircraft, estimates a market price and demand forecast for a new aircraft based on several key characteristics. The dynamic programming algorithm, used to find program value, treats annual aircraft quantity demanded as a stochastic process, evolving unpredictably with time. The algorithm borrows from Real Options theory to discount future cash flows using risk-neutral expectations and models the aircraft program as an actively managed project with real-time decision-making to maximize expected program value. Several examples are drawn from the Blended-Wing-Body aircraft concept to demonstrate the operation of the program valuation tool. The results suggest that the value of part commonality between aircraft may be strongly sensitive to the weight penalty and increased fuel burn resulting from a common derivative design.
More generally, the example results illustrate the usefulness of the explicit consideration of flexibility in program valuation and the feasibility of a conceptual aircraft design tool based on the metric of program value.

Thesis Supervisor: Karen Willcox
Title: Assistant Professor, Aeronautics and Astronautics
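The dynamic-programming treatment of uncertain demand can be illustrated with a bare-bones recombining binomial lattice in which the manager may abandon the program each year for a salvage value. All numbers are illustrative and the thesis's models are far richer (market-derived prices, component-level costs, risk-neutral calibration).

```python
def value_with_abandonment(q0, margin, up, down, p_up, r, years, salvage=0.0):
    """Backward induction over a recombining binomial lattice of annual
    demand q. Each year the program either continues (cash q*margin plus
    the discounted risk-neutral expectation of next year's value) or is
    abandoned for `salvage`, whichever is worth more."""
    disc = 1.0 / (1.0 + r)
    values = [0.0] * (years + 1)          # program worth nothing past the horizon
    for t in range(years - 1, -1, -1):
        new = []
        for i in range(t + 1):            # i = number of down-moves so far
            q = q0 * up ** (t - i) * down ** i
            cont = q * margin + disc * (p_up * values[i] + (1 - p_up) * values[i + 1])
            new.append(max(cont, salvage))  # option to abandon
        values = new
    return values[0]

# Illustrative only: 30 aircraft/yr initially, $10M margin each, 10-year horizon
v = value_with_abandonment(30.0, 10.0, up=1.2, down=0.8, p_up=0.5, r=0.1, years=10)
```

The `max(cont, salvage)` at every node is what distinguishes an actively managed program from a static discounted-cash-flow forecast: the option to cut losses raises expected value whenever demand can fall far enough.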

Proceedings ArticleDOI
13 Jan 2002
TL;DR: This work unifies two important threads of research in intelligent user interfaces which share the common element of explicit task modeling by implementing a collection of tools which generate both a GUI and a collaborative interface agent from the same task model.
Abstract: This work unifies two important threads of research in intelligent user interfaces which share the common element of explicit task modeling. On the one hand, longstanding research on task-centered GUI design (sometimes called model-based design) has explored the benefits of explicitly modeling the task to be performed by an interface and using this task model as an integral part of the interface design process. More recently, research on collaborative interface agents has shown how an explicit task model can be used to control the behavior of a software agent that helps a user perform tasks using a GUI. This paper describes a collection of tools we have implemented which generate both a GUI and a collaborative interface agent from the same task model. Our task-centered GUI design tool incorporates a number of novel features which help the designer to integrate the task model into the design process without being unduly distracted. Our implementation of collaborative interface agents is built on top of the COLLAGEN middleware for collaborative interface agents.

Journal ArticleDOI
TL;DR: The sensitivity of the optimum system design to the tap water draw-off and the draw-off pattern has been determined using the optimisation algorithm.

Journal ArticleDOI
TL;DR: In this article, the authors present a discussion of the choice of stress and strain measures used in this large-deformation context, and the simulation of the compaction of an industrial PM part, intended to illustrate the usefulness of the simulation approach in the task of improving the design of PM parts and processes.

Journal ArticleDOI
TL;DR: This paper attempts to synthesize the guidelines and empirical data related to the formatting of screen layouts into a well-defined model and suggests that esthetic characteristics of this model are important to prospective viewers.
Abstract: Gestalt psychologists promulgated the principles of visual organization in the early twentieth century. These principles have been discussed and re-emphasized, and their importance and relevance to user interface design are understood. However, a limited number of systems represent and make adequate use of this knowledge in the form of a design tool that supports certain aspects of the user interface design process. The graphic design rules that these systems use are extremely rudimentary and often vastly oversimplified. Most of them have no concept of design basics such as visual balance or rhythm. In this paper, we attempt to synthesize the guidelines and empirical data related to the formatting of screen layouts into a well-defined model. Fourteen esthetic characteristics have been selected for the purpose. The results of our exercise suggest that these characteristics are important to prospective viewers.
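One of the esthetic characteristics such models include, visual balance, admits a simple formalization. The version below (area-weighted horizontal centroid measured against the screen midline) is a plausible stand-in, not necessarily the paper's exact definition of balance.

```python
def balance(components, screen_w):
    """Horizontal balance in [0, 1]: 1.0 when the area-weighted centroid
    of the components lies on the screen's vertical midline, falling
    linearly toward 0 as it approaches an edge. Components are
    (x, y, w, h) rectangles in screen coordinates."""
    total = sum(w * h for _, _, w, h in components)
    cx = sum((x + w / 2.0) * w * h for x, y, w, h in components) / total
    return 1.0 - abs(cx - screen_w / 2.0) / (screen_w / 2.0)

centered = [(40, 10, 20, 10), (30, 30, 40, 20)]   # both centred on x = 50
lopsided = [(0, 0, 20, 20)]                        # hugging the left edge
```

A layout tool would compute a score like this for each of the selected characteristics and aggregate them, flagging screens whose composition drifts far from the Gestalt ideals.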

01 Jan 2002
TL;DR: In this article, a model of curiosity in design is developed as the selection of design actions with the goal of generating novel artefacts, called "curious design agents", and the behaviour of these agents is demonstrated with a range of applications to visual and non-visual design domains.
Abstract: Creative products are generally recognised as satisfying two requirements: firstly they are useful, and secondly they are novel. Much effort in AI and design computing has been put into developing systems that can recognise the usefulness of the products that they generate. In contrast, the work presented in this thesis has concentrated on developing computational systems that are able to recognise the novelty of their work. The research has shown that when computational systems are given the ability to recognise both the novelty and the usefulness of their products they gain a level of autonomy that opens up new possibilities for the study of creative behaviour in single agents and the emergence of social creativity in multi-agent systems. The work presented in this thesis has developed a model of curiosity in design as the selection of design actions with the goal of generating novel artefacts. Agents that embody this model of curiosity are called "curious design agents". The behaviour of curious design agents is demonstrated with a range of applications to visual and non-visual design domains. Visual domains include rectilinear drawings, Spirograph patterns, and "genetic artworks" similar to the work of Karl Sims. Non-visual domains include an illustrative abstract design space useful for visualising the behaviour of curious agents and the design of doorways to accommodate the passage of large crowds. The design methods used in the different domains show that the model of curiosity is applicable to models of designing by direct manipulation, parametric configuration or by using a separate design tool that embodies the generative aspects of the design process. In addition, an approach to developing multi-agent systems with autonomous notions of creativity called artificial creativity is presented.
The opportunities for studying social creativity in design are illustrated with an artificial creativity system used to study the emergence of social notions of who and what are creative in a society of curious design agents. Developing similar artificial creativity systems promises to be a useful synthetic approach to the study of socially situated, creative design.

Proceedings ArticleDOI
04 Sep 2002
TL;DR: In this article, a design methodology is proposed that combines reliability-based design optimization and high-fidelity aeroelastic simulations for the analysis and design of aero-elastic structures.
Abstract: Aeroelastic phenomena are most often either ignored or roughly approximated when uncertainties are considered in the design optimization process of structures subject to aerodynamic loading, affecting the quality of the optimization results. Therefore, a design methodology is proposed that combines reliability-based design optimization and high-fidelity aeroelastic simulations for the analysis and design of aeroelastic structures. To account for uncertainties in design and operating conditions, a first-order reliability method (FORM) is employed to approximate the system reliability. To limit model uncertainties while accounting for the effects of given uncertainties, a high-fidelity nonlinear aeroelastic simulation method is used. The structure is modelled by a finite element method, and the aerodynamic loads are predicted by a finite volume discretization of a nonlinear Euler flow. The usefulness of the employed reliability analysis in both describing the effects of uncertainties on a particular design and as a design tool in the optimization process is illustrated. Though computationally more expensive than a deterministic optimum, due to the necessity of solving additional optimization problems for reliability analysis within each step of the broader design optimization procedure, a reliability-based optimum is shown to be an improved design. Conventional deterministic aeroelastic tailoring, which exploits the aeroelastic nature of the structure to enhance performance, is shown to often produce designs that are sensitive to variations in system or operational parameters.
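For a linear limit state with independent normal variables, FORM reduces to a closed form, which gives a feel for the reliability index used in such analyses; real aeroelastic applications require iterative FORM (e.g. the Hasofer-Lind-Rackwitz-Fiessler algorithm) on nonlinear limit states evaluated by the coupled simulation.

```python
import math

def form_linear(mu_r, sd_r, mu_s, sd_s):
    """FORM for the linear limit state g = R - S with independent normal
    resistance R and load S: beta = mu_g / sigma_g, Pf = Phi(-beta)."""
    mu_g = mu_r - mu_s
    sd_g = math.sqrt(sd_r ** 2 + sd_s ** 2)
    beta = mu_g / sd_g
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))   # standard normal CDF at -beta
    return beta, pf

# Made-up margins: mean resistance 100 +/- 10 against mean load 60 +/- 10
beta, pf = form_linear(100.0, 10.0, 60.0, 10.0)
```

In reliability-based optimization, each candidate design triggers a search of this kind for its most probable failure point, which is why the paper notes the approach costs much more than a deterministic optimization.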

01 Mar 2002
TL;DR: The evolutionary paradigm is shown to be the only successful design system on which this new phase of design tool could be based and any characterisation of design as a search problem is argued to be a serious misconception.
Abstract: Design tools that aim not only to analyse and evaluate, but also to generate and explore alternative design proposals are now under development. An evolutionary paradigm is presented as a basis for creating such tools. First, the evolutionary paradigm is shown to be the only successful design system on which this new phase of design tool could be based. Secondly, any characterisation of design as a search problem is argued to be a serious misconception. Instead it is proposed that evolutionary design systems should be seen as generative processes that are able to evaluate their own output. Thirdly, a generic framework for generative evolutionary design systems is presented. Fourth, the generative process is introduced as a key element within this generic framework. The role of the environment within this process is fundamental. Finally, the direction of future research within the evolutionary design paradigm is discussed with possible short and long term goals being presented.

Journal ArticleDOI
TL;DR: In this paper, a generative evolutionary design system is presented, which is based on the evolutionary paradigm, and the role of the environment within this process is discussed. And the direction of future research within the evolutionary design paradigm is discussed with possible short and long term goals being presented.
Abstract: Design tools that aim not only to analyse and evaluate, but also to generate and explore alternative design proposals are now under development. An evolutionary paradigm is presented as a basis for creating such tools. First, the evolutionary paradigm is shown to be the only successful design system on which this new phase of design tool could be based. Secondly, any characterisation of design as a search problem is argued to be a serious misconception. Instead it is proposed that evolutionary design systems should be seen as generative processes that are able to evaluate their own output. Thirdly, a generic framework for generative evolutionary design systems is presented. Fourth, the generative process is introduced as a key element within this generic framework. The role of the environment within this process is fundamental. Finally, the direction of future research within the evolutionary design paradigm is discussed with possible short and long term goals being presented.

Patent
31 Jul 2002
TL;DR: In this article, a system to leverage functional knowledge by a design tool in an engineering project includes a functional knowledge repository created by modeling a plurality of requirements of the engineering project and a requirement and space planning tool interfacing between the functional knowledge repositories and the design tool.
Abstract: A system to leverage functional knowledge by a design tool in an engineering project includes a functional knowledge repository created by modeling a plurality of requirements of the engineering project and a requirement and space planning tool interfacing between the functional knowledge repository and the design tool. The requirement and space planning tool includes a requirements wizard to capture the plurality of requirements of the engineering project, a designer graphical user interface to display a preliminary design representing the plurality of requirements and functionality to modify the plurality of requirements, and a design tool interface to transfer the plurality of requirements and the preliminary design between the design tool and the functional knowledge repository.

Journal ArticleDOI
TL;DR: A new design tool framework called IMPACCT is proposed, which correctly combines the state-of-the-art techniques at the system level, thereby saving even experienced designers from many pitfalls of system-level power management.
Abstract: Power-aware systems are those that must exploit a wide range of power/performance trade-offs in order to adapt to the power availability and application requirements. They require the integration of many novel power management techniques, ranging from voltage scaling to subsystem shutdown. However, those techniques do not always compose synergistically with each other; in fact, they can combine subtractively and often yield counterintuitive, and sometimes incorrect, results in the context of a complete system. This can become a serious problem as more of these power-aware systems are being deployed in mission-critical applications. To address the problem of technique integration for power-aware embedded systems, we propose a new design tool framework called IMPACCT and the associated design methodology. The system modeling methodology includes an application model for capturing timing/power constraints and mode dependencies at the system level. The tool performs power-aware scheduling and mode selection to ensure that all timing/power constraints are satisfied and that all overhead is taken into account. IMPACCT then synthesizes the implementation targeting a symmetric multiprocessor platform. Experimental results show that the increased dynamic range of power/performance settings enabled a Mars rover to achieve significant acceleration while using less energy. More importantly, our tool correctly combines the state-of-the-art techniques at the system level, thereby saving even experienced designers from many pitfalls of system-level power management.

Book ChapterDOI
01 Jan 2002
TL;DR: In this paper, the unit load method has been developed to achieve a pre-defined target configuration of section forces in cable-stayed bridges by optimizing the tensioning of the stay-cables.
Abstract: The unit load method has been developed to achieve a pre-defined target configuration of section forces in cable-stayed bridges by optimizing the tensioning of the stay cables. The method has been further developed into a versatile design tool that allows the definition of a target distribution of section forces or deflections in any structure. This paper briefly describes the method and provides three application examples where this method has been used: a cable-stayed bridge, a concrete arch, and the application of the method to the automated simulation of the incremental launching process of bridges.
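With linear structural behaviour, the unit load method amounts to assembling an influence matrix of section forces per unit cable tension and solving for the tensions that reach the target distribution; the two-cable numbers below are hypothetical.

```python
def solve_2x2(a, b):
    """Cramer's rule for a 2x2 linear system A x = b."""
    det = a[0][0] * a[1][1] - a[0][1] * a[1][0]
    return [(b[0] * a[1][1] - b[1] * a[0][1]) / det,
            (a[0][0] * b[1] - a[1][0] * b[0]) / det]

# influence[i][j]: section force at location i per unit tension in cable j
influence = [[0.8, 0.3],
             [0.2, 0.9]]
dead = [60.0, 80.0]       # section forces under permanent loads alone
target = [100.0, 100.0]   # designer's target distribution

# Tensions that close the gap between the dead-load state and the target
tensions = solve_2x2(influence, [t - d for t, d in zip(target, dead)])
```

A real bridge has dozens of stays and target quantities, so the system is larger (and often solved in a least-squares sense), but the structure of the problem is exactly this linear solve.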

Proceedings ArticleDOI
24 Feb 2002
TL;DR: A method for extending existing VHDL design and verification software available for the Xilinx Virtex series of FPGAs to the design of dynamically reconfigurable logic (DRL) through the conversion of a dynamic design into multiple static designs, suitable for input to standard synthesis and APR tools.
Abstract: This paper reports on a method for extending existing VHDL design and verification software available for the Xilinx Virtex series of FPGAs. It allows the designer to apply standard hardware design and verification tools to the design of dynamically reconfigurable logic (DRL). The technique involves the conversion of a dynamic design into multiple static designs, suitable for input to standard synthesis and APR tools. For timing and functional verification after APR, the sections of the design can then be recombined into a single dynamic system. The technique has been automated by extending an existing DRL design tool named DCSTech, which is part of the Dynamic Circuit Switching (DCS) CAD framework. The principles behind the tools are generic and should be readily extensible to other architectures and CAD toolsets. Implementation of the dynamic system involves the production of partial configuration bitstreams to load sections of circuitry. The process of creating such bitstreams, the final stage of our design flow, is summarized.

Journal ArticleDOI
TL;DR: In this paper, the authors summarise the results of a 3-year research project by the CRC for Catchment Hydrology (CRCCH) on the joint probability approach (Monte Carlo simulation technique) to design flood estimation and the subsequent research activities to further the CRCCH method towards industrial applications.
Abstract: This paper summarises the results of a 3-year research project by the CRC for Catchment Hydrology (CRCCH) on the joint probability approach (Monte Carlo simulation technique) to design flood estimation and the subsequent research activities to further the CRCCH method towards industrial applications. It identifies significant shortcomings in the current design event approach to rainfall-based design flood estimation, and argues that substantial improvements in the accuracy and reliability of flood estimates can be obtained from a more rigorous treatment of probability aspects in the generation of design floods. Applications of the proposed Monte Carlo simulation approach to test catchments in Victoria and Queensland have produced promising results, and demonstrated the feasibility and in-principle advantages of the approach. More recently, the Monte Carlo simulation approach has been integrated with the industry-based flood estimation model URBS, thus significantly broadening its range of application. The paper discusses how far the recent research on the joint probability approach has advanced towards resolving the main research issues, and outlines desirable future development work to allow the new method to be routinely applied as a design tool.
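The joint probability idea can be illustrated with a toy Monte Carlo sketch: sample the flood-producing variables from their distributions, convert each sampled event to a peak flow, and read design floods off the resulting empirical frequency curve. The distributions and the runoff relation below are illustrative assumptions, not the CRCCH model.

```python
import math
import random

# Toy joint-probability simulation: storm depth and an event loss are
# sampled jointly, a simple rational-method-style relation converts
# each event to a peak flow, and the empirical distribution of peaks
# gives the derived flood frequency curve.
random.seed(42)

def simulate_peak():
    depth_mm = random.lognormvariate(math.log(40.0), 0.5)  # storm depth
    loss_mm = random.uniform(5.0, 25.0)                    # initial loss
    runoff_mm = max(depth_mm - loss_mm, 0.0)
    return 2.5 * runoff_mm                                 # peak flow, m^3/s

peaks = sorted(simulate_peak() for _ in range(20000))

def design_flood(aep):
    """Flood magnitude with the given annual exceedance probability."""
    return peaks[int((1.0 - aep) * len(peaks)) - 1]

q100 = design_flood(0.01)  # roughly the 1-in-100 AEP event
```

The design event approach, by contrast, fixes variables such as losses at single "representative" values, which is the probability-neutrality assumption the paper argues against.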

01 Jan 2002
TL;DR: In this paper, the authors present a 3D modelling program for three proposed hydropower projects in Northern Manitoba: the Gull Generating Station (680 MW) on the Nelson River, near Gillam, Manitoba, and the Notigi (100 MW) and Wuskwatim (200 MW) Generating Stations.
Abstract: Hydropower dam designers and developers are discovering the capabilities of computational fluid dynamics for a range of applications, from hydraulic design to the analysis of dam break flooding. The design and engineering assessment of hydroelectric facilities involves developing an understanding of the very complex behaviour of moving water. To accomplish this, the engineer must develop a thorough understanding of the complexities of fluid flow phenomena - complexities that are often highly two- and three-dimensional in nature. In early years, physical model studies would have been the only practical medium available to gain insight into the three-dimensional and time-dependent nature of fluid flow. However, physical modelling is typically only undertaken during the final stages of design, and can be costly to execute. With the advancements in computing power made since the 1980s, CFD analysis has emerged as a powerful alternative design tool, and can be used to provide insight into hydraulic design at all levels of study. Manitoba Hydro and Acres International Ltd., both of Winnipeg, Manitoba, Canada, have undertaken an extensive three-dimensional modelling program as a part of pre-commitment level studies for three proposed hydropower projects in Northern Manitoba. These three projects are the Gull Generating Station (680 MW) on the Nelson River, near Gillam, Manitoba, and the Notigi (100 MW) and Wuskwatim (200 MW) Generating Stations, both on the Burntwood River, near

Proceedings ArticleDOI
07 Jul 2002
TL;DR: The SCORES-II program as mentioned in this paper is a conceptual-level analysis tool for the performance prediction of liquid propellant rocket engines that can be run on all major computing platforms (e.g., UNIX, PC, Mac).
Abstract: The SCORES-II program is a conceptual-level engineering design and analysis tool for the performance prediction of liquid propellant rocket engines. The tool is written in the C++ programming language and can be compiled and executed on all major computing platforms (e.g. UNIX, PC, Mac). SCORES-II can be executed through a command line/console application or via the ModelCenter© environment with its filewrapper. The Analysis Server© filewrapper allows for automated execution and integration with other disciplinary analysis tools. SCORES-II can support a wide variety of engine configurations. In addition to the built-in propellant options, the chemical equilibrium routine is capable of handling any generic fuel and oxidizer combination. Numerous options exist for sizing an engine, including thrust and throat area matching. An 'expert system' of engine efficiencies is also included, allowing for cycle, chemical reaction, injector/combustor, and nozzle influences on performance. The equations used for the performance calculations, equilibrium chemistry model, and engine sizing will be presented and discussed. Chemical equilibrium results obtained from implementing these equations will then be compared with results from another equilibrium code widely used in industry. Performance predictions from SCORES-II will be compared with known performance values for a number of existing engines. Sample results demonstrating SCORES-II's throttled engine analysis will be presented for a notional engine design. The tool's user-interface options will also be discussed. Finally, the paper will include a future work section detailing additional improvements and capabilities planned for future incarnations of SCORES.
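The kind of ideal-nozzle relations a conceptual performance tool builds on can be sketched briefly: characteristic velocity c* from chamber conditions, thrust coefficient Cf from the nozzle pressure ratio, then thrust F = Cf·Pc·At and Isp = c*·Cf/g0. The perfect-gas assumption and the LOX/RP-1-like numbers below are illustrative, not SCORES-II's actual model.

```python
import math

G0 = 9.80665     # standard gravity, m/s^2
R_UNIV = 8314.5  # universal gas constant, J/(kmol*K)

def c_star(gamma, tc_k, mol_wt_kg_kmol):
    """Characteristic velocity of a perfect gas at chamber conditions."""
    r = R_UNIV / mol_wt_kg_kmol
    return math.sqrt(r * tc_k / gamma) * \
        ((gamma + 1.0) / 2.0) ** ((gamma + 1.0) / (2.0 * (gamma - 1.0)))

def thrust_coeff(gamma, pe_over_pc):
    """Ideal thrust coefficient for a matched-pressure nozzle."""
    a = 2.0 * gamma ** 2 / (gamma - 1.0)
    b = (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (gamma - 1.0))
    c = 1.0 - pe_over_pc ** ((gamma - 1.0) / gamma)
    return math.sqrt(a * b * c)

# Illustrative LOX/RP-1-like combustion products
cs = c_star(gamma=1.2, tc_k=3600.0, mol_wt_kg_kmol=22.0)
cf = thrust_coeff(gamma=1.2, pe_over_pc=0.01)
isp_s = cs * cf / G0          # specific impulse, seconds
thrust_n = cf * 7.0e6 * 0.01  # F = Cf * Pc * At (Pc = 7 MPa, At = 0.01 m^2)
```

A tool like SCORES-II replaces the fixed gamma and molecular weight here with an equilibrium chemistry solve, and layers empirical efficiency factors on top of the ideal relations.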

Journal ArticleDOI
TL;DR: A visual, Windows-based, user-friendly design software package has been developed using the VB programming language, and the results presented here indicate its value as a design tool.

Journal ArticleDOI
TL;DR: In this article, the OPTI program (the dwellings module) is designed to help architects take into account the impact of design choices on energy consumption when designing a dwelling.

Book ChapterDOI
01 Jan 2002
TL;DR: This chapter discusses methods for understanding user needs early in the product development cycle and how these methods can help user experience practitioners understand workflow, improve efficiency, and eliminate weak areas that hinder users.
Abstract: This chapter discusses methods for understanding user needs early in the product development cycle and how these methods can help user experience practitioners understand workflow, improve efficiency, and eliminate weak areas that hinder users. User needs analysis sets the foundation for the entire design process. The principal purpose of this stage of design is to define the design goals and constraints and develop an understanding of the audience and what they do. User needs analysis involves four primary activities: investigation, analysis, specification, and documentation. The functional specs are referenced throughout the design and production of the site to verify that the system being produced corresponds to the necessary functionality. Several forms of background research are used to uncover user needs, such as surveys, scenarios, competitive analysis, interviews, and focus groups. These give a clearer picture of the true user profile, user needs, and user preferences. Task analysis can also be performed to specify how the information and functionality found in the requirements analysis will be used. In addition to codifying user procedures, task analysis can also be used as a design tool.