scispace - formally typeset

Showing papers on "Design tool" published in 2005


Journal ArticleDOI
TL;DR: It is speculated that if valid human posture and motion prediction models are developed and used, these can be combined with psychophysical and biomechanical models to provide a much greater understanding of dynamic human performance and population specific limitations.
Abstract: This paper presents the need to improve existing digital human models (DHMs) so they are better able to serve as effective ergonomics analysis and design tools. Existing DHMs are meant to be used by a designer early in a product development process when attempting to improve the physical design of vehicle interiors and manufacturing workplaces. The emphasis in this paper is placed on developing future DHMs that include valid posture and motion prediction models for various populations. It is argued that existing posture and motion prediction models now used in DHMs must be changed to become based on real motion data to assure validity for complex dynamic task simulations. It is further speculated that if valid human posture and motion prediction models are developed and used, these can be combined with psychophysical and biomechanical models to provide a much greater understanding of dynamic human performance and population specific limitations and that these new DHM models will ultimately provide a powerful ergonomics design tool.

198 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explore the feasibility of integrating noise and emissions as optimization objectives at the aircraft conceptual design stage, thereby allowing a quantitative analysis of the tradeoffs between environmental performance and operating cost.
Abstract: Although civil aircraft environmental performance has been important since the beginnings of commercial aviation, continuously increasing air traffic and a rise in public awareness have made aircraft noise and emissions two of the most pressing issues hampering commercial aviation growth today. This, in turn, has created the demand for an understanding of the impact of noise and emissions requirements on the design of the aircraft. In response, the purpose of this research is to explore the feasibility of integrating noise and emissions as optimization objectives at the aircraft conceptual design stage, thereby allowing a quantitative analysis of the tradeoffs between environmental performance and operating cost. A preliminary design tool that uses a multiobjective genetic algorithm to determine optimal aircraft configurations and to estimate the sensitivities between the conflicting objectives of low noise, low emissions, and operating costs was developed. Beyond evaluating the ability of a design to meet regulations and establishing environmental performance trades, the multidisciplinary design tool allows the generation of conventional but extremely low-noise and low-emissions designs that could, in the future, dramatically decrease the environmental impact of commercial aviation, albeit at the expense of increased operating cost. The tool incorporates ANOPP, a noise prediction code developed at NASA Langley Research Center, NASA Glenn Research Center's Engine Performance Program engine simulator, and aircraft design, analysis, and optimization modules developed at Stanford University. The trend that emerges from this research among the seemingly conflicting objectives of noise, fuel consumption, and NOx emissions is the opportunity for significant reductions in environmental impact by designing the aircraft to fly slower and at lower altitude.
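The tradeoff analysis above hinges on Pareto dominance among conflicting objectives. As a minimal sketch of that ranking step, with invented objective values standing in for the outputs of codes like ANOPP (none of these numbers come from the paper):

```python
def dominates(a, b):
    """a dominates b when it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Filter a population down to its non-dominated members."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical designs scored as (noise level, relative operating cost);
# smaller is better for both.
designs = [(90, 5.0), (85, 6.0), (95, 4.5), (85, 5.5), (100, 7.0)]
front = pareto_front(designs)
```

A genetic algorithm layers selection and variation on top of this dominance test; the sketch shows only the filtering that produces the tradeoff front.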

165 citations


Journal ArticleDOI
TL;DR: The transformation of an existing set of heterogeneous product knowledge into a coherent design repository that supports product design knowledge archival and web-based search, display, and design model and tool generation is described.
Abstract: This paper describes the transformation of an existing set of heterogeneous product knowledge into a coherent design repository that supports product design knowledge archival and web-based search, display, and design model and tool generation. Guided by design theory, existing product information was analyzed and compared against desired outputs to ascertain what information management structure was needed to produce design resources pertinent to the design process. Several test products were catalogued to determine what information was essential without being redundant in representation. This set allowed for the creation of a novel single point of entry application for product information and the development of a relational database for design knowledge archival. Web services were then implemented to support design knowledge retrieval through search, browse, and real-time design tool generation. Further explored in this paper are the fundamental enabling technologies of the design repository system. Additionally, repository-generated design tools are scrutinized alongside human-generated design tools for validation. Through this process researchers have been able to improve the way in which artifact data are gathered, archived, distributed and used.
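The repository described above couples a relational database for design knowledge with web-based search. A toy sketch of that idea, using SQLite; the table and column names here are hypothetical illustrations, not the actual repository schema:

```python
import sqlite3

# Minimal artifact/function schema in the spirit of a design repository.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE artifact (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    CREATE TABLE function (
        id INTEGER PRIMARY KEY,
        artifact_id INTEGER NOT NULL REFERENCES artifact(id),
        description TEXT NOT NULL
    );
""")
con.execute("INSERT INTO artifact (name) VALUES ('hair dryer')")
aid = con.execute("SELECT id FROM artifact WHERE name = 'hair dryer'").fetchone()[0]
con.execute(
    "INSERT INTO function (artifact_id, description) VALUES (?, ?)",
    (aid, "convert electrical energy to thermal energy"),
)
# The kind of join a web search front end might issue.
rows = con.execute(
    "SELECT a.name, f.description FROM artifact a "
    "JOIN function f ON f.artifact_id = a.id"
).fetchall()
```

The single point of entry the paper describes would sit in front of inserts like these, enforcing that each piece of artifact data is captured once and reused everywhere.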

150 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a method based on the coupling of three different types of simulation models that is economical in terms of computing time, and thereby, suitable for design purposes.

126 citations


Proceedings ArticleDOI
25 Jul 2005
TL;DR: The Prometheus Design Tool as mentioned in this paper is a graphical editor that supports the design tasks specified within the Prometheus methodology for designing agent systems, propagating information where possible and ensuring consistency between various parts of the design.
Abstract: The Prometheus Design Tool is a graphical editor which supports the design tasks specified within the Prometheus methodology for designing agent systems. The tool propagates information where possible and ensures consistency between various parts of the design.

91 citations


Journal ArticleDOI
TL;DR: A debugger is described which uses the design artifacts of the Prometheus agent-oriented software engineering methodology to alert the developer testing the system that a specification has been violated; detailed information is provided regarding the error, which can help the developer locate its source.

85 citations


Proceedings ArticleDOI
18 Oct 2005
TL;DR: This paper presents a design tool in a format that can be understood by practitioners who come from a range of backgrounds and is intended as a tool to guide the design research and development process in the application of smart technologies.
Abstract: The research and development of attractive smart clothing, with truly embedded technologies, demands the merging of science and technology with art and design. This paper looks at a comparatively new and unique design discipline that has been given little prior consideration. The concept of 'wearables' crosses the boundaries between many disciplines. A gap exists for a common 'language' that facilitates creativity and a systematic design process. Designers of smart clothing require guidance in their enquiry, as gaining an understanding of issues such as usability, manufacture, fashion, consumer culture, recycling, and end-user needs can seem forbidding and difficult to prioritise. This paper presents a design tool in a format that can be understood by practitioners who come from a range of backgrounds. The representation of the 'critical path' is intended as a tool to guide the design research and development process in the application of smart technologies.

84 citations


Book ChapterDOI
Alain Martel1
01 Jan 2005
TL;DR: In this paper, a mathematical programming approach to design international production-distribution networks for make-to-stock products with convergent manufacturing processes is proposed, and a typical model is presented, and the use of successive mixed-integer programming to solve it with commercial solvers is discussed.
Abstract: This text proposes a mathematical programming approach to design international production-distribution networks for make-to-stock products with convergent manufacturing processes. Various formulations of the elements of production-distribution network design models are discussed. The emphasis is put on modeling issues encountered in practice which have a significant impact on the quality of the logistics network designed. The elements discussed include the choice of an objective function, the definition of the planning horizon, the manufacturing process and product structures, the logistics network structure, demand and service requirements, facility layouts and capacity options, product flows and inventory modeling, as well as financial flows modeling. Major contributions from the literature are reviewed and a number of new formulation elements are introduced. A typical model is presented, and the use of successive mixed-integer programming to solve it with commercial solvers is discussed. A more general version of the model presented and the solution method described were implemented in a commercial supply chain design tool which is now available on the market.
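The chapter's models are solved with successive mixed-integer programming and commercial solvers; the core tradeoff they capture (fixed facility costs versus transport costs) can be sketched on a toy instance small enough to solve by enumeration. All plant names and numbers below are invented for illustration:

```python
from itertools import combinations

fixed_cost = {"A": 100, "B": 120}        # cost of opening each plant
transport = {                             # (plant, customer) -> unit shipping cost
    ("A", "c1"): 4, ("A", "c2"): 9,
    ("B", "c1"): 7, ("B", "c2"): 3,
}
demand = {"c1": 10, "c2": 10}

def total_cost(open_plants):
    """Fixed costs plus cheapest-source transport for every customer."""
    cost = sum(fixed_cost[p] for p in open_plants)
    for customer, qty in demand.items():
        cost += qty * min(transport[(p, customer)] for p in open_plants)
    return cost

plants = list(fixed_cost)
candidates = [s for r in range(1, len(plants) + 1) for s in combinations(plants, r)]
best = min(candidates, key=total_cost)   # a realistic model goes to a MIP solver
```

Real production-distribution models add multi-period demand, inventory, and financial flows, which is why the chapter turns to successive MIP rather than enumeration.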

83 citations


Journal ArticleDOI
TL;DR: Two large-grain, architectural design patterns that solve specific design tool integration problems that have been implemented and used in real-life engineering processes are described and compared.
Abstract: Design tool integration is a highly relevant area of software engineering that can greatly improve the efficiency of development processes. Design patterns have been widely recognized as important contributors to the success of software systems. This paper describes and compares two large-grain, architectural design patterns that solve specific design tool integration problems. Both patterns have been implemented and used in real-life engineering processes.

68 citations


Journal ArticleDOI
TL;DR: The novel integration of linear physical programming within the collaborative optimization framework is described, which enables designers to formulate multiple system-level objectives in terms of physically meaningful parameters.
Abstract: Multidisciplinary design optimization (MDO) is a concurrent engineering design tool for large-scale, complex systems design that can be affected through the optimal design of several smaller functional units or subsystems. Due to the multiobjective nature of most MDO problems, recent work has focused on formulating the MDO problem to resolve tradeoffs between multiple, conflicting objectives. In this paper, we describe the novel integration of linear physical programming within the collaborative optimization framework, which enables designers to formulate multiple system-level objectives in terms of physically meaningful parameters. The proposed formulation extends our previous multiobjective formulation of collaborative optimization, which uses goal programming at the system and subsystem levels to enable multiple objectives to be considered at both levels during optimization. The proposed framework is demonstrated using a racecar design example that consists of two subsystem level analyses — force and aerodynamics — and incorporates two system-level objectives: (1) minimize lap time and (2) maximize normalized weight distribution. The aerodynamics subsystem also seeks to minimize rear-wheel downforce as a secondary objective. The racecar design example is presented in detail to provide a benchmark problem for other researchers. It is solved using the proposed formulation and compared against a traditional formulation without collaborative optimization or linear physical programming. The proposed framework capitalizes on the disciplinary organization encountered during large-scale systems design.
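The goal-programming layer mentioned above scores a design by its weighted deviations from target values. A minimal sketch with invented targets, weights, and objective values (not the paper's racecar data):

```python
def goal_deviation(values, goals, weights):
    """Weighted sum of shortfalls past each goal; zero when every
    objective meets its target (all treated as smaller-is-better)."""
    return sum(w * max(0.0, v - g) for v, g, w in zip(values, goals, weights))

# Hypothetical system-level objectives: (lap time [s], weight-distribution
# penalty). Goals and weights are illustrative only.
goals, weights = (75.0, 0.1), (1.0, 50.0)
design_a = (76.5, 0.05)   # slower lap, good balance
design_b = (74.0, 0.30)   # faster lap, poor balance
score_a = goal_deviation(design_a, goals, weights)
score_b = goal_deviation(design_b, goals, weights)
```

Linear physical programming replaces the fixed weights with piecewise-linear preference ranges per objective; the deviation bookkeeping is the same.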

60 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide a probabilistic basis for appropriate combinations of loads to facilitate fire-resistant structural design and recommend specific load combinations for this purpose, for measuring compliance with performance objectives, for comparing alternatives, and for making the role of uncertainty in the decision process transparent.
Abstract: Fire protection of building structural systems traditionally has relied on component qualification testing, with acceptance criteria based on component survival during a "standard" fire for a prescribed rating period. These test procedures do not address the impact of the fire on a structural system. With advances in fire science and the advent of advanced structural analysis, the routine use of the computer as a design tool and limit states design, it is becoming possible to consider realistic fire scenarios and effects explicitly as part of the structural design process. In this modern engineering design approach, load requirements for considering structural actions due to fire in combination with other loads are essential, but have yet to be implemented in standards and codes in the United States. This paper provides a probabilistic basis for appropriate combinations of loads to facilitate fire-resistant structural design and recommends specific load combinations for this purpose. The probabilistic basis is essential for measuring compliance with performance objectives, for comparing alternatives, and for making the role of uncertainty in the decision process transparent.
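The probabilistic basis for load combinations can be illustrated with a small Monte Carlo exceedance estimate. The distributions below are invented for illustration; they are not the paper's calibrated load models:

```python
import random

def exceedance_probability(capacity, n=100_000, seed=0):
    """Monte Carlo estimate of P(dead load + live load > capacity),
    with loads normalized to the nominal dead load."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        dead = rng.gauss(1.0, 0.10)          # dead load: modest scatter
        live = rng.expovariate(1 / 0.3)      # arbitrary-point-in-time live load
        if dead + live > capacity:
            hits += 1
    return hits / n

p_tight = exceedance_probability(1.2)      # small margin over mean load
p_generous = exceedance_probability(3.0)   # large margin
```

Code-style load combinations compress exactly this kind of calculation into fixed factors chosen so that the exceedance probability meets a target reliability.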

Proceedings ArticleDOI
01 Jul 2005
TL;DR: Parametric mass and volume models are used to assess an aerospace hybrid system design for a commercial aircraft electrical power unit; based on preliminary analyses, a SOFC/gas turbine system may be a potential solution.
Abstract: In aerospace power systems, mass and volume are key considerations to produce a viable design. The utilization of fuel cells is being studied for a commercial aircraft electrical power unit. Based on preliminary analyses, a SOFC/gas turbine system may be a potential solution. This paper describes the parametric mass and volume models that are used to assess an aerospace hybrid system design. The design tool utilizes input from the thermodynamic system model and produces component sizing, performance, and mass estimates. The software is designed such that the thermodynamic model is linked to the mass and volume model to provide immediate feedback during the design process. It allows for automating an optimization process that accounts for mass and volume in its figure of merit. Each component in the system is modeled with a combination of theoretical and empirical approaches. A description of the assumptions and design analyses is presented.

Proceedings Article
30 Jan 2005
TL;DR: A number of beautification issues and requirements for sketching-based design tools are discussed, illustrated with examples from two quite different sketching-based applications.
Abstract: With the advent of the Tablet PC and stylus-based PDAs, sketching-based user interfaces for design tools have become popular. However, a major challenge with such interfaces is the need for appropriate "beautification" of the sketches. This includes both interactive beautification as content is sketched and post-design conversion of sketches to formalised, computer-drawn diagrams. We discuss a number of beautification issues and requirements for sketching-based design tools, illustrating these with examples from two quite different sketching-based applications. We illustrate ways of supporting beautification, user interface design and implementation challenges, and results from preliminary evaluations of such interfaces.
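One concrete beautification step is snapping a wobbly stroke to the primitive it approximates. A minimal sketch of straight-line detection; the ratio test and tolerance are illustrative choices, not the algorithms from these applications:

```python
import math

def beautify_stroke(points, tol=0.95):
    """Snap a sketched polyline to a straight segment when its endpoint
    (chord) distance is close to its total path length, i.e. it barely
    wanders off a straight course."""
    path = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    chord = math.dist(points[0], points[-1])
    if path and chord / path >= tol:
        return [points[0], points[-1]]     # formalize as a straight line
    return points                          # leave ambiguous strokes alone

nearly_straight = [(0, 0), (1, 0.05), (2, 0.1), (3, 0.1)]
zigzag = [(0, 0), (1, 2), (2, 0)]
```

Interactive beautification applies tests like this as content is sketched; post-design conversion runs them over the whole drawing at once.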

Proceedings ArticleDOI
07 Jul 2005
TL;DR: InkKit as mentioned in this paper is an informal design platform that uses pen input on a tablet PC to imitate the informality of a low fidelity tool, which is used to provide a foundation for further research into domain specific sketch support.
Abstract: In this paper, we describe the design philosophy, implementation and evaluation of InkKit, an informal design platform that uses pen input on a tablet PC to imitate the informality of a low fidelity tool. The aim is for this toolkit to provide a foundation for further research into domain specific sketch support. Designers initially hand-sketch their ideas [3, 6] because informal tools, such as pen and paper, offer the freedom to work with partly formed or ambiguous designs. The emergence of electronic pen input systems has seen a number of exploratory projects applying pen-based sketch software to the design process. Even though these projects differ, most of them use the same general framework. Thus a significant part of the implementation incorporates the same basic functionalities.

Journal ArticleDOI
01 Jan 2005-Robotica
TL;DR: A dual stage system with the coarse and fine actuators is adopted to achieve sub-micron accuracy with a large working space for the proposed new three degree-of-freedom (DOF) miniaturized micro parallel mechanism with high mobility.
Abstract: In Part I of this paper (previous issue of Robotica), a dual-stage system with coarse and fine actuators was adopted to achieve sub-micron accuracy with a large working space for the proposed new three degree-of-freedom (DOF) miniaturized micro parallel mechanism with high mobility, and one architecture with vertical actuator locations in all three legs (C-VV type) was selected from among six possible coarse actuator architectures. In this part of the paper, an optimal kinematic parameter set is determined for the selected coarse actuator architecture. To determine this set, the design tool of the physical model of the solution space (PMSS) and the evaluation of the conditioning index (CI) and global mobility conditioning index (GMCI) are used. The basic size of the micro parallel mechanism is 45.0 mm×22.5 mm×22.9 mm with 100° mobility, a workspace of 5.0 mm (y-axis)×5.0 mm (z-axis), and sub-micron resolution. After finishing the design of the main coarse actuator architecture, one of six possible fine actuator architectures is selected to achieve sub-micron positioning accuracy based on the requirements of continuous fine motion and smaller platform resolution. The selected coarse-and-fine actuator combination is used for a micro positioning platform for laser-machining applications.

Journal ArticleDOI
TL;DR: A knowledge-based expert system for the earthquake resistant design of reinforced concrete buildings is described; it is conceived as an interactive analysis/design tool in which the structural engineer is guided from the preliminary earthquake design to the final detailed design, including non-linear dynamic analysis.
Abstract: Earthquake engineering includes a wide range of disciplines, among them the analysis and evaluation of earthquake risk and the analysis and design of different types of engineering structures. The most frequently encountered of these specialties is the analysis and design of buildings for earthquake excitations. In this case, the design engineer has to deal with concepts and requirements not normally encountered when designing for gravity or wind loads; for example, strength and ductility are key considerations in earthquake resistant design. Additionally, uncertainties are associated with the determination of the earthquake forces; the stiffness and strength of the different structural elements; the selection of the mathematical models which represent the structural behaviour; and the form and intensity of the design earthquakes. Thus, earthquake resistant design of buildings requires extensive knowledge of the conceptual design of structures, the mathematical models and analysis assumptions used in structural analysis, and good element detailing. Computer-based earthquake resistant design assistants are needed to provide practicing engineers with decision support tools and to guide them through the earthquake resistant design process. This paper is concerned with the description of a knowledge-based expert system for the earthquake resistant design of reinforced concrete buildings. The system is conceived as an interactive analysis/design tool in which the structural engineer is guided from the preliminary earthquake design to the final detailed design, including non-linear dynamic analysis. The earthquake design methodology included in this system, which is based on the ductile design approach concept, is briefly described.

Journal ArticleDOI
TL;DR: PAD is a chart-based design environment dedicated to the design of analog circuits aiming to optimize design and quality by finding good tradeoffs and its interactive interface enables instantaneous visualization of design tradeoffs.
Abstract: This paper presents a new Procedural Analog Design tool called PAD. It is a chart-based design environment dedicated to the design of analog circuits aiming to optimize design and quality by finding good tradeoffs. This interactive tool allows step-by-step design of analog cells by using guidelines for each analog topology. Its interactive interface enables instantaneous visualization of design tradeoffs. At each step, the user modifies interactively one subset of design parameters and observes the effect on other circuit parameters. At the end, an optimized design is ready for simulation (verification and fine-tuning). The present version of PAD covers the design of basic analog structures (one transistor or groups of transistors) and the procedural design of transconductance amplifiers (OTAs) and different operational amplifier topologies. The basic analog structures' calculator embedded in PAD uses the complete set of equations of the EKV MOS model, which links the equations for weak and strong inversion in a continuous way [1, 2]. Furthermore, PAD provides a layout generator for matched substructures such as current mirrors, cascode stages and differential pairs.

Journal ArticleDOI
TL;DR: In this paper, a hybrid methodology for conceptual design of large systems with the goal of enhancing system reliability is presented, which integrates the features of several design methodologies and maintenance planning concepts with the traditional reliability analysis.
Abstract: This paper presents a hybrid methodology for conceptual design of large systems with the goal of enhancing system reliability. It integrates the features of several design methodologies and maintenance planning concepts with the traditional reliability analysis. The methodology considers the temporal quality characteristic “reliability” as the main objective and determines the optimal system design. Key ideas from several design methodologies, namely axiomatic design, robust design, and the theory of inventive problem solving, have been integrated with the functional prioritization framework provided by reliability-centered maintenance. A case study of the conceptual design of a multiphase pumping station for crude oil production is presented. The methodology provides a new design tool for determining system configurations with enhanced reliability taking into account maintenance resources and variability.

Proceedings ArticleDOI
08 Jun 2005
TL;DR: A hybrid, hierarchical architecture for mission control of autonomous underwater vehicles (AUVs) with semiautomatic verification of safety and performance specifications as a primary capability in addition to the usual requirements such as real-time constraints, scheduling, shared-data integrity, etc.
Abstract: We present a hybrid, hierarchical architecture for mission control of autonomous underwater vehicles (AUVs). The architecture is model based and is designed with semiautomatic verification of safety and performance specifications as a primary capability in addition to the usual requirements such as real-time constraints, scheduling, shared-data integrity, etc. The architecture is realized using a commercially available graphical hybrid systems design and code generation tool. While the tool facilitates rapid redesign and deployment, it is crucial to include safety and performance verification into each step of the (re)design process. A formal model of the interacting hybrid automata in the design tool is outlined, and a tool is presented to automatically convert hybrid automata descriptions from the design tool into a format required by two hybrid verification tools. The application of this mission control architecture to a survey AUV is described and the procedures for verification outlined.

Patent
19 Apr 2005
TL;DR: In this paper, the authors present a system for configuring interfaces between system solution components and component behaviors cooperative with a system solution design tool, in which each connection or interface between system components defined by the user results in the display of a first prompt to select a basic, automatic, or advanced mode of defining the interface parameters and component configuration options.
Abstract: A system for configuring interfaces between system solution components and component behaviors cooperative with a system solution design tool in which each connection or interface between system components defined by the user results in the display of a first prompt to select a basic, automatic, or advanced mode of defining the interface parameters and component configuration options. If the user has a low level of expertise in the components being used, the user may select the automatic or basic option, following which the system design tool employs a nearly fully pre-configured deployment descriptor, prompting the user for a minimum of parameter choices. If the user has a high level of expertise with the components being used, the system design tool then prompts the user for a greater set of options and choices, following which a configurable deployment descriptor is configured according to the expert user's choices.

Book ChapterDOI
01 Jan 2005
TL;DR: The chapter shows how Greedypipe can be used to obtain good assignments in an environment where the total number of processor stages in all the pipelines is constrained.
Abstract: This chapter presents Greedypipe, a heuristic approach that quickly performs near-optimal task-to-pipeline-stage assignments in a multiple flow, multiple pipeline environment. Though obtaining optimal assignments is an NP-hard problem, Greedypipe quickly obtains very good assignments. While Greedypipe can be used in obtaining throughput performance in a variety of pipelining environments, its development was motivated by the growing importance of embedded processor systems that are based on having multiple pipelines on a single chip. This sort of design is used extensively in current network processors. The chapter also illustrates how Greedypipe can be employed as a design tool in situations where the effects of pipeline depth, task sharing, and task partitioning are to be explored as part of the design process. This is done for a generic design case, and a case where three flows are present, each flow requiring one or more networking functions (Longest Prefix Match, Encryption, and Compression). The chapter shows how Greedypipe can be used to obtain good assignments in an environment where the total number of processor stages in all the pipelines is constrained. New methods are now being explored to incorporate into Greedypipe performance issues related to memory constraints and contention, and to use both dynamic programming and statistical optimization techniques to obtain optimal solutions.
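Greedypipe itself is not spelled out here, but the flavor of greedy stage assignment can be sketched: partition an ordered task chain into pipeline stages while trying to balance the bottleneck stage time. This generic heuristic is an illustrative assumption, not the published algorithm:

```python
def assign_stages(task_times, n_stages):
    """Greedily close each pipeline stage once it reaches the ideal
    per-stage workload, keeping tasks in their required order."""
    target = sum(task_times) / n_stages
    stages, current = [], []
    for t in task_times:
        current.append(t)
        if sum(current) >= target and len(stages) < n_stages - 1:
            stages.append(current)
            current = []
    stages.append(current)   # remaining tasks form the last stage
    return stages

# Six ordered tasks (e.g. header parse, lookup, encrypt steps) onto 3 stages.
stages = assign_stages([4, 3, 2, 5, 1, 3], n_stages=3)
bottleneck = max(sum(s) for s in stages)   # pipeline throughput limiter
```

Pipeline throughput is set by the slowest stage, so a design tool built on such a heuristic can quickly explore how pipeline depth and task partitioning move the bottleneck.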

Book ChapterDOI
01 Jan 2005
TL;DR: This effort describes a computer-aided design tool suite, LINK-UP, which supports the design process for a specific genre of systems that crosses many domains, notification systems, and contrasts its underlying concepts with typical task-based modelling approaches.
Abstract: We propose an interface design process compatible with scenario-based design methods, but specifically intended to facilitate three primary goals: design knowledge reuse, comparison of design products, and long-term research growth within HCI. This effort describes a computer-aided design tool suite, LINK-UP, which supports the design process for a specific genre of systems that crosses many domains: notification systems. We describe the vision for LINK-UP, contrasting its underlying concepts with typical task-based modelling approaches. To achieve its stated goals, the design process is organised and guided by critical parameters, presenting several challenges that we reflect on through the results of a design simulation study. The possibilities envisioned through this approach have important implications for the integration of reusable design knowledge, HCI processes, and design support tools.

Proceedings ArticleDOI
27 Sep 2005
TL;DR: An integrated aided design tool based on a Virtual Desktop is developed that modifies the relation existing between the designer and his/her model and generates a new type of augmented interaction.
Abstract: Many CAD tools already allow designers to create and manipulate ideas directly in a digital way. However, designers still use the pen-and-paper technique during the early design phase of their projects. Indeed, existing CAD tools constrain the creative work. There is a need for spontaneous human-computer interaction in design computing. To answer this need, we developed an integrated aided design tool based on a Virtual Desktop. The designer sits in front of a classical desktop where s/he can create and manipulate drawings and generated models. The paper relates the observations made from an experiment on the use of the Virtual Desktop by a designer. That experiment demonstrates that the immersive aspect of our system's interface modifies the relation existing between the designer and his/her model and, in this way, generates a new type of augmented interaction.

01 Jan 2005
TL;DR: In this paper, the authors have documented the parallel paths of program support and technology development currently employed at Marshall Space Flight Center in an effort to move CFD to the forefront of injector design.
Abstract: New programs are forcing American propulsion system designers into unfamiliar territory. For instance, industry s answer to the cost and reliability goals set out by the Next Generation Launch Technology Program are engine concepts based on the Oxygen- Rich Staged Combustion Cycle. Historical injector design tools are not well suited for this new task. The empirical correlations do not apply directly to the injector concepts associated with the ORSC cycle. These legacy tools focus primarily on performance with environment evaluation a secondary objective. Additionally, the environmental capability of these tools is usually one-dimensional while the actual environments are at least two- and often three-dimensional. CFD has the potential to calculate performance and multi-dimensional environments but its use in the injector design process has been retarded by long solution turnaround times and insufficient demonstrated accuracy. This paper has documented the parallel paths of program support and technology development currently employed at Marshall Space Flight Center in an effort to move CFD to the forefront of injector design. MSFC has established a long-term goal for use of CFD for combustion devices design. The work on injector design is the heart of that vision and the Combustion Devices CFD Simulation Capability Roadmap that focuses the vision. The SRL concept, combining solution fidelity, robustness and accuracy, has been established as a quantitative gauge of current and desired capability. Three examples of current injector analysis for program support have been presented and discussed. These examples are used to establish the current capability at MSFC for these problems. Shortcomings identified from this experience are being used as inputs to the Roadmap process. The SRL evaluation identified lack of demonstrated solution accuracy as a major issue. 
Accordingly, the MSFC view of code validation and current MSFC-funded validation efforts were discussed in some detail, and the objectives of each effort were noted. Issues specific to code validation for injector design were also discussed. The requirement for CFD support during the design of the experiment was noted and discussed in terms of instrumentation placement and experimental rig uncertainty. In conclusion, MSFC has made significant progress in the last two years in advancing CFD toward the goal of application to injector design. Parallel efforts on program support and on technology development via the SCIT Task have enabled this progress.

Proceedings ArticleDOI
25 Jul 2005
TL;DR: This paper studies and formalizes important generic properties of commitment protocols that can ease their correct development significantly and provide algorithms that can directly be used to check these properties in such a design tool.
Abstract: Interaction protocols enable agents to communicate with each other effectively. Whereas several approaches exist to specify interaction protocols, none of them offers design tools that can help protocol designers catch semantic protocol errors at design time. As research in networking protocols has shown, flawed protocol specifications can have disastrous consequences. Hence, it is crucial to analyze protocols systematically and early enough to ensure correct specification. This paper studies and formalizes important generic properties of commitment protocols that can ease their correct development significantly. Since these properties are formal, they can easily be incorporated in a software tool to (semi-)automate the design and specification of commitment protocols. Where appropriate, we provide algorithms that can directly be used to check these properties in such a design tool.
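One of the generic properties such a design tool could check is that every run of a commitment protocol resolves all the commitments it creates. The sketch below is illustrative only: the run representation, operation names, and commitment labels are assumptions, not the paper's formalization.

```python
# Illustrative sketch: checking that no terminal run of a commitment
# protocol leaves a commitment unresolved. The representation of runs
# as (operation, commitment) pairs is a simplifying assumption.

def active_commitments(run):
    """Replay commitment operations along a run; return those left unresolved."""
    active = set()
    for op, commitment in run:
        if op == "create":
            active.add(commitment)
        elif op in ("discharge", "cancel", "release"):
            active.discard(commitment)
    return active

def protocol_is_resolvable(runs):
    """The protocol passes if every terminal run ends with no open commitments."""
    return all(not active_commitments(run) for run in runs)

runs = [
    [("create", "C(merchant,customer,goods)"),
     ("discharge", "C(merchant,customer,goods)")],
    [("create", "C(customer,merchant,pay)"),
     ("cancel", "C(customer,merchant,pay)")],
]
print(protocol_is_resolvable(runs))  # True: every run resolves its commitments
```

A tool built on this idea would enumerate runs from the protocol specification rather than take them as input, but the check itself stays this simple.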

Journal ArticleDOI
TL;DR: In this paper, the authors present a simple computer-based design method to improve thermal comfort conditions in the built environment by controlling wind access, and therefore natural ventilation. The criterion applied to control the access or obstruction of prevailing winds at a site is the desirability or undesirability of those winds.

Proceedings Article
10 May 2005
TL;DR: The GUIDE tool as discussed by the authors allows the user to explore a design in UML interactively by playing a game, where the game incorporates both the design model and a specification of what it means for the design to be correct.
Abstract: In this paper we present our design tool GUIDE, which allows the user to explore a design in UML interactively by playing a game. The game incorporates both the design model and a specification of what it means for the design to be correct. The central idea of this approach is that the designer can increment the game during a play and gradually add more detail to it. Specification and design are refined by repeated plays of the game. The designer stops playing when design and specification are detailed enough for his purpose and fit each other. The interactive game approach helps to cope with the incompleteness and informal definition of UML models, which make strictly formal verification techniques difficult. The designer may resolve these problems when they arise during a play, or let the GUIDE tool determine how the play should proceed.

01 Jan 2005
TL;DR: In this paper, a multidisciplinary design tool is used to embed downstream processes for conceptual design and evaluation allowing simulation of life cycle properties, and a knowledge enabled engineering approach was used to capture the engineering activities for designing and evaluation of jet engine component flanges.
Abstract: The actual product ownership often remains with the manufacturer as functional (total care) products emerge in aerospace business agreements. The business risk is then transferred to the manufacturer, which is why downstream knowledge needs to be available in the concept phase to consider all product life cycle aspects. The aim of this work is to study how a multidisciplinary design tool can be used to embed downstream processes in conceptual design and evaluation, allowing simulation of life cycle properties. A knowledge enabled engineering approach was used to capture the engineering activities for design and evaluation of jet engine component flanges. For every design change, the cost of manufacturing operations, maintenance and performance aspects can be directly assessed. The design tool ensures that the engineering activities are performed according to the company design specification, which creates better control over process quality. It also creates a better understanding, enabling the engineers to optimize the concept in real time from an overall product life cycle view. The new tool will be the basis for optimizing the total product system and will be used not only between companies but also between product development departments in large global companies.
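The core idea, re-evaluating downstream life-cycle measures for every design change, can be sketched as a single evaluation function. Everything below is a hypothetical illustration: the parameters, cost formulas, and numbers are assumptions for demonstration, not the paper's models.

```python
# Hypothetical sketch: for each flange design change, manufacturing cost,
# maintenance cost, and a performance (mass) penalty are re-evaluated so the
# concept can be assessed from an overall life-cycle view. All coefficients
# here are illustrative assumptions.

def evaluate_flange(bolt_count, thickness_mm):
    """Return an illustrative total life-cycle cost for one flange concept."""
    manufacturing_cost = 50.0 + 2.5 * bolt_count + 1.2 * thickness_mm  # machining + drilling
    maintenance_cost = 200.0 / bolt_count   # illustrative: more bolts, easier load distribution
    mass_penalty = 0.8 * thickness_mm       # performance proxy: heavier flange costs fuel
    return manufacturing_cost + maintenance_cost + mass_penalty

baseline = evaluate_flange(bolt_count=12, thickness_mm=10)
variant = evaluate_flange(bolt_count=16, thickness_mm=8)
# A design change is assessed immediately against the baseline:
print(f"baseline={baseline:.1f}, variant={variant:.1f}")
```

In the tool described above, each term would come from captured engineering knowledge (manufacturing operations, maintenance procedures, performance models) rather than from fixed coefficients.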

Proceedings ArticleDOI
07 Jul 2005
TL;DR: This work examined beautification and its value in supporting the design process by prototyping a design tool incorporating several beautification techniques and described the design, construction and evaluation of the grid based design environment.
Abstract: Beautification of vague, imprecise, sketchy ink input is an interesting area for exploration, especially with the emergence of pen-based systems such as the Tablet PC. Fifty percent of the total time spent creating drawings on a computer goes to formalisation operations [3]; why waste this time when the same result is achievable via recognition and beautification techniques? We examined beautification and its value in supporting the design process by prototyping a design tool incorporating several beautification techniques. The following is a description of the design, construction and evaluation of our grid-based design environment.
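A minimal form of the beautification the abstract describes is snapping sketchy stroke endpoints to a grid and replacing the wobbly polyline with a clean segment. The sketch below is an assumption-laden illustration of that idea, not the authors' implementation.

```python
# Illustrative sketch (not the paper's tool): one grid-based beautification
# step, snapping the endpoints of a sketchy ink stroke to a grid and
# replacing the stroke with a straight segment between them.

def snap_to_grid(point, spacing=20):
    """Snap a (x, y) point to the nearest grid intersection."""
    x, y = point
    return (round(x / spacing) * spacing, round(y / spacing) * spacing)

def beautify_stroke(stroke, spacing=20):
    """Replace a sketchy polyline by a clean segment between snapped endpoints."""
    start = snap_to_grid(stroke[0], spacing)
    end = snap_to_grid(stroke[-1], spacing)
    return [start, end]

sketchy = [(3, 18), (9, 22), (41, 39)]  # a roughly drawn diagonal stroke
print(beautify_stroke(sketchy))  # [(0, 20), (40, 40)]
```

A full beautifier would also classify the stroke (line, arc, rectangle) before formalizing it; grid snapping is only the simplest such rule.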

01 Jan 2005
TL;DR: Requirements on future sketching media have been derived which aim to combine the advantages from paper-sketches with the opportunities provided by CAD, and they are met by the 3D-Sketcher, which is the prototype of a digital and truly 3-dimensional sketching device for tomorrow’s design workplaces.
Abstract: Sketching is a basic activity in conceptual design. It combines aspects of design methodology, cognitive psychology and work science. Surveys among designers from industry have shown that the paper sketch is the main design tool besides CAD. These two tools have not yet been integrated into a single technological and methodological framework, however. Experimental studies have revealed that sketches are capable of representing geometrical design information at a high level of abstraction. Due to effects from basic processes of perception and action, a sketch is a highly efficient tool for the development of conceptual design solutions. Requirements for future sketching media have been derived which aim to combine the advantages of paper sketches with the opportunities provided by CAD. They are met by the 3D-Sketcher, the prototype of a digital and truly 3-dimensional sketching device for tomorrow's design workplaces.