
Showing papers on "Design tool" published in 1994


DOI
22 Sep 1994
TL;DR: Through configuration-level analysis, cost and performance tradeoffs can be studied early in the design process and a large design space can be explored.
Abstract: In this paper, we present an approach to hardware/software partitioning for real-time embedded systems. The abstraction level we have adopted is referred to as the configuration level, where hardware is modeled as resources with no detailed functionality and software is modeled as tasks utilizing the resources. Through configuration-level analysis, cost and performance tradeoffs can be studied early in the design process and a large design space can be explored. A feasibility factor is introduced to measure the possibility of a real-time system being feasible, and is used as both a constraint and an attribute during the optimization process. Optimal partitioning is achieved through the use of an existing computer-aided design tool.
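The configuration-level exploration the abstract describes can be sketched in a few lines. Everything below is illustrative: a made-up task set, a single processor, and a utilization-based schedulability test standing in for the paper's feasibility factor.

```python
from itertools import product

# Hypothetical task set: (name, software_exec_time, hardware_exec_time, hardware_cost),
# all tasks sharing one period. Hardware resources are assumed contention-free,
# so only the software side needs a schedulability check in this sketch.
TASKS = [
    ("filter",  8.0, 2.0, 5),
    ("encode", 12.0, 3.0, 8),
    ("log",     2.0, 1.0, 4),
]
PERIOD = 20.0
UTIL_BOUND = 1.0  # single processor: software utilization must not exceed 1

def feasibility(partition):
    """Feasibility-factor stand-in: slack between the utilization bound and
    actual software utilization (>= 0 means the configuration is schedulable)."""
    sw_util = sum(t[1] / PERIOD for t, in_hw in zip(TASKS, partition) if not in_hw)
    return UTIL_BOUND - sw_util

def cost(partition):
    return sum(t[3] for t, in_hw in zip(TASKS, partition) if in_hw)

# Explore the whole configuration-level design space (2^n partitions) and
# keep the cheapest feasible one.
best = min(
    (p for p in product([False, True], repeat=len(TASKS)) if feasibility(p) >= 0),
    key=cost,
)
```

With these numbers the all-software configuration is infeasible, and moving only the cheap "log" task into hardware is the minimum-cost feasible partition.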

68 citations


Journal ArticleDOI
TL;DR: This paper describes an interactive (decision maker-computer) methodology for multiple response optimization of simulation models based on a multiple criteria optimization technique called the STEP method.
Abstract: Simulation is a popular tool for the design and analysis of manufacturing systems. The popularity of simulation is due to its flexibility, its ability to model systems when analytical methods have failed, and its ability to model the time dynamic behavior of systems. However, in and of itself, simulation is not a design tool; therefore, in order to optimize a simulation model, it often must be used in conjunction with an optimum-seeking method. This paper describes an interactive (decision maker-computer) methodology for multiple response optimization of simulation models. This approach is based on a multiple criteria optimization technique called the STEP method. The proposed methodology is illustrated with an example involving the optimization of a manufacturing system.
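The core of a STEP-style iteration is easy to sketch: normalize each response, locate the ideal point, and propose the candidate minimizing the maximum normalized deviation from it; the decision maker then accepts the compromise or relaxes a criterion and iterates. The candidate designs and response values below are invented placeholders for simulation output.

```python
# Each candidate design maps to two responses to minimize
# (say, mean cycle time and work-in-process); values are illustrative,
# as if returned by simulation runs.
candidates = {
    "A": (10.0, 40.0),
    "B": (14.0, 25.0),
    "C": (20.0, 20.0),
}

responses = list(zip(*candidates.values()))
ideal = [min(r) for r in responses]   # best achievable value of each response
nadir = [max(r) for r in responses]   # worst value, used for normalization

def chebyshev(point):
    # Maximum normalized deviation from the ideal point (a minimax criterion
    # of the kind STEP uses to propose a compromise solution).
    return max((p - i) / (n - i) for p, i, n in zip(point, ideal, nadir))

compromise = min(candidates, key=lambda k: chebyshev(candidates[k]))
# The decision maker would now accept "compromise" or relax one criterion.
```

Design "B" wins here because it is moderately good on both responses, while "A" and "C" are each worst on one of them.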

62 citations


Journal ArticleDOI
TL;DR: In this article, the authors describe the derivation and validation of a first-order thermal model which has a clear physical interpretation, is based on uncomplicated calculation procedures and requires limited input information.

60 citations


Journal ArticleDOI
TL;DR: In this paper, a simplified but sufficiently accurate simulation procedure for passive buildings was recently developed and implemented in the computer program, EASY, which provided a sound basis for the development of a design tool with the aid of which any type of evaporative cooler can be analysed.

38 citations


Journal ArticleDOI
TL;DR: In this paper, the authors describe how heat flow into the ground can be simplified and incorporated in an efficient design tool based on a first order thermal model calculated by means of straightforward equations.

31 citations


Journal ArticleDOI
TL;DR: It has taken several decades longer than for imaging optics, but commercial software such as Lighting Technologies' FiELD and Lighting Sciences' CAD-LITE can now analyze the authors' nonimaging luminaire designs, and the ability of these programs to accurately predict a design's photometric distribution makes them invaluable design analysis tools.
Abstract: Introduction In the past, luminaires were designed using a combination of ray-tracing techniques and hard-won experience. Designing reflectors and lenses by hand was a time-consuming and arduous task. While designs could be quickly evaluated using red-green diagrams (e.g., Elmer) and other shortcuts, a complete photometric performance analysis required tracing hundreds of rays using a calculator, pencil, and paper. It was often easier to build a prototype and measure its performance. A designer's successes and failures eventually led to a sixth sense regarding luminaire design, or early retirement. The advent of computers in the late 1940s allowed designers to quickly trace thousands of rays. Other design approaches such as finite element radiative transfer (radiosity) techniques could also be employed. Designers of imaging optical systems were quick to adopt this new technology. It has taken us several decades longer, but we now have commercial software such as Lighting Technologies' FiELD and Lighting Sciences' CAD-LITE to analyze our nonimaging luminaire designs. The ability of these programs to accurately predict a design's photometric distribution makes them invaluable design analysis tools. Notice the adjective analysis. These programs must trace millions of rays or calculate flux transfers between thousands of reflector elements if they are to generate accurate predictions. Either approach takes time: hours or even days on even the fastest desktop computers. As useful as these programs are, they are too slow to be used as interactive design tools. We must still rely on trial-and-error experimentation and hard-won experience for our design ideas. We could create an interactive luminaire design tool by tracing a limited number of rays, sufficient only to determine an approximate photometric distribution. We don't need the features of a commercial product; a serviceable tool can be created within a computer-aided drafting program such as Autodesk's AutoCAD by using its script language facilities.

29 citations


Proceedings ArticleDOI
28 Apr 1994
TL;DR: DesignSpace is a computer-aided design (CAD) system that facilitates dexterous manipulation of mechanical design representations; it encapsulates technologies from several CDR projects within a conceptual design environment to serve as a testbed for evaluating their effective application.
Abstract: The pointing device shifted the paradigm and allowed visualization without explicit numerical references. DesignSpace is a computer-aided design (CAD) system that facilitates dexterous manipulation of mechanical design representations. The system consists of an interactive simulation programmed with a seamless extended model of the designer's physical environment and driven with continuous instrumentation of the designer's physical actions. The simulation displays consistent visual and aural images of the virtual environment without occluding the designer's sensation of the physical surroundings. Developed at Stanford University's Center for Design Research (CDR), DesignSpace serves as an experimental testbed for design theory and methodology research. DesignSpace includes significant contributions from recent CDR development projects: TalkingGlove, CutPlane, VirtualHand, TeleSign, and VirtualGrasp. The current DesignSpace prototype provides modeling facility for only crude conceptual design and assembly, but can network multiple systems to share a common virtual space and arbitrate the collaborative interaction. The DesignSpace prototype employs three head-tracked rear projection images, head-coupled binaural audio, hand instrumentation, and electromagnetic position tracking. 3D CAD faces similar resistance today while workstation systems channel the design interaction through one- and two-dimensional interfaces. A design tool should maintain full dimensionality in the design process and not subject the design to unnecessary constraints in the communication between designers, the design media, and the final realized artifact. DesignSpace embraces this ideal by providing facilities for interactive simulation, dexterous manipulation, and remote collaboration. BACKGROUND CDR was founded in 1983 as an industry-academia collaborative and interdisciplinary R&D center to improve the engineering and product design processes.
The Center accepts design problems from industry and government, and confronts them with creative design teams, for the purpose of design process observation and study, experimental design practice, and new design tool development. A long-term CDR goal is to aid the design process so that problem complexity does not impede creativity, design knowledge reuse, and human skill. CDR researchers and designers collaborate on projects to study the design process, to develop devices and interfaces to better map manual skills to data operations, to experiment with alternative means of design knowledge storage and retrieval, and to investigate design tool effectiveness. DesignSpace encapsulates technologies from several CDR projects within a conceptual design environment context to serve as a testbed for evaluating their effective application.

27 citations


DissertationDOI
01 Jan 1994
TL;DR: In this paper, the design of MEMS (Micro Electro Mechanical Systems) on the millimeter to micron length scales is examined in detail, and a broad base of knowledge has been developed concerning the etching processes commonly used in MEMS fabrication.
Abstract: The design of MEMS (Micro Electro Mechanical Systems) on the millimeter to micron length scales will be examined in this thesis. A very broad base of knowledge has been developed concerning the etching processes commonly used in MEMS fabrication. The fundamental problem we have set out to study is how to model the shape transformations that occur in MEMS fabrication. The ultimate goal is to determine the required input mask geometry for a desired output etched shape. The body of work begins with the crystal structure of silicon and ends with etched shapes. The underlying crystal structure causes different rates for different directions; this behavior has been modeled to obtain rate models. The information in these rate models has then been used in a number of shape modelers. High level models like the Eshape model provide not only simulation but a framework for true design. Other models such as the Cellular Automata model take a different approach and provide flexible and robust simulators. The tools were used to develop real world MEMS applications such as compensation structures. As important as the individual models is the ability to integrate them together into a coherent design tool and allow information to flow between different parts. This synthesis allows a fuller understanding of the etching process from start to finish. It is important to note that while this thesis deals with etching, the methods developed are very general and are applicable to many shape transformation processes.
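The cellular-automata style of shape modeler mentioned above can be illustrated with a deterministic 2-D miniature. The neighbor-count rule below is a stand-in for real orientation-dependent etch rates, not the thesis's actual model: cells with two or more faces touching etchant (convex corners) etch readily, while flat sidewall faces resist, which is what produces facet-like profiles.

```python
N = 7
ALL = {(x, y) for x in range(N) for y in range(N)}
solid = set(ALL)            # silicon substrate, 7 x 7 cells
OPENING = {2, 3, 4}         # unmasked columns; etchant enters from row y = -1

def exposed_faces(cell, etch):
    """Count faces of a solid cell in contact with etchant (0..4)."""
    x, y = cell
    return sum(n in etch for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)))

for _ in range(3):          # three etch time steps
    etch = {(x, -1) for x in OPENING} | (ALL - solid)
    # Anisotropy stand-in: convex corners (2+ etchant faces) always etch;
    # single-face cells etch only directly beneath the mask opening.
    solid -= {c for c in solid
              if exposed_faces(c, etch) >= 2
              or (exposed_faces(c, etch) >= 1 and c[0] in OPENING)}
```

After three steps the rule has dug a straight-walled trench three cells wide and three deep under the opening, with no undercut beneath the mask.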

25 citations


Journal ArticleDOI
TL;DR: The solutions to some real-world flight control problems and a very powerful design tool (QFT CAD package) are presented and one of these solutions was successfully implemented and flight tested.
Abstract: The quantitative feedback theory (QFT) technique has emerged as a powerful multivariable control system design method. This design method addresses real-world problems with structured plant parameter uncertainties. The paper provides the control community with an overview and the basic mathematical concepts of the QFT technique. The solutions to some real-world flight control problems and a very powerful design tool (the QFT CAD package) are presented. One of these solutions was successfully implemented and flight tested.

22 citations


Journal ArticleDOI
TL;DR: The article reviews what has been achieved in areas where the C-concepts can be applied fruitfully in the study of the displacement type finite element method and provides a more complete paradigmatic understanding of the issues involved.

22 citations


Proceedings ArticleDOI
26 Jun 1994
TL;DR: The imprecise design tool (IDT) presented in this paper implements the method of imprecision, which incorporates the designer's uncertainty in choice into design calculations, using a mathematics derived from fuzzy sets.
Abstract: The imprecise design tool (IDT) presented in this paper implements the method of imprecision, which incorporates the designer's uncertainty in choice into design calculations, using a mathematics derived from fuzzy sets. IDT is intended to be a computational tool for preliminary engineering design.
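A minimal sketch of the idea, under the usual fuzzy-set reading of the method of imprecision: the designer's preferences on each design attribute are membership functions, they are aggregated conservatively with min, and the preferred design maximizes the combined membership. The preference ranges and the linear analysis model below are assumptions for illustration, not IDT's actual inputs.

```python
def triangular(lo, peak, hi):
    """Fuzzy membership function expressing a designer's preference."""
    def mu(x):
        if x <= lo or x >= hi:
            return 0.0
        return (x - lo) / (peak - lo) if x <= peak else (hi - x) / (hi - peak)
    return mu

pref_thickness = triangular(2.0, 5.0, 10.0)    # preferred beam thickness (mm)
pref_mass = triangular(0.0, 2.0, 6.0)          # preferred mass (kg)

def mass(t):
    return 0.8 * t                             # assumed analysis model (kg per mm)

candidates = [t / 10 for t in range(20, 101)]  # thickness candidates, 2.0..10.0 mm
# Conservative (min) aggregation of the two preferences for each candidate.
overall = {t: min(pref_thickness(t), pref_mass(mass(t))) for t in candidates}
best = max(overall, key=overall.get)
```

The optimum lands between the two individual peaks: thickness preference pulls upward, mass preference pulls downward, and the min-aggregated membership peaks where they balance.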

Proceedings ArticleDOI
10 Oct 1994
TL;DR: The design tool, JOSHUA, uses an integer linear programming (ILP) formulation to solve the three interdependent high-level synthesis subproblems of scheduling, allocation, and binding simultaneously and optimally for multiblock behavioral descriptions.
Abstract: Presents a novel approach to the high-level synthesis problems of scheduling, allocation, and binding for multiblock behavioral descriptions. Our design tool, JOSHUA, uses an integer linear programming (ILP) formulation to solve the three interdependent subproblems simultaneously and optimally. The system allows the designer to minimize time, area, and the number of microwords for the entire design, or for specific segments of the design. A diverse module library provides a selection of modules that can perform a specific operation in differing amounts of time (control steps). A novel feature is the ability to select an implementation for part of an algorithm from among a set of implementation alternatives. The system can also handle the issues of path frequencies, loops, parallel threads of execution, and register allocation.
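The flavor of the formulation can be shown on a toy instance. JOSHUA would hand constraints like these to an ILP solver over binary "operation o starts at step s" variables; this sketch simply enumerates the same 0/1 decision space (the operations, dependencies, and resource count are invented).

```python
from itertools import product

OPS = ("a", "b", "c")                  # three operations (say, additions)
DEPS = (("a", "c"), ("b", "c"))        # c consumes the results of a and b
STEPS = range(3)                       # control steps available
ADDERS = 1                             # one functional unit allocated

def valid(assign):
    # Resource constraint: at most ADDERS operations scheduled per control step.
    if any(sum(s == step for s in assign.values()) > ADDERS for step in STEPS):
        return False
    # Precedence constraint: an operation starts after its predecessors.
    return all(assign[u] < assign[v] for u, v in DEPS)

schedules = [a for a in (dict(zip(OPS, s)) for s in product(STEPS, repeat=len(OPS)))
             if valid(a)]
latency = min(max(a.values()) for a in schedules)  # optimal completion step
```

With one adder and two operations feeding a third, only two schedules satisfy both constraint families, and the optimal latency is three control steps (final operation in step 2).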

Proceedings Article
31 Oct 1994
TL;DR: It is argued that current design methodologies are oriented towards cathedrals, and object-oriented design techniques and tools suitable for farmhouses are proposed.
Abstract: The ambition of every designer is the software equivalent of a cathedral. But maintenance programmers are more comfortable in a farmhouse than a cathedral. We argue that current design methodologies are oriented towards cathedrals, and we propose object-oriented design techniques and tools that are suitable for farmhouses. During the lifetime of a useful program, its users' requirements change and the code changes to track the requirements. The code drifts away from the original design, becomes increasingly brittle, and eventually can no longer be maintained; each repair introduces new faults. The cure for these ills, design for change, is well known, but current design methodologies and tools do not facilitate useful changes. We describe a design tool that supports evolutionary object-oriented design. Designers can create and modify designs, view them in textual and graphical form, check their internal consistency, and match them to requirements and code. To accomplish this, we use text, tables, and diagrams with multiple levels of formality. The tool processes formal entities completely (as a compiler can process source code completely); it stores, retrieves, and displays informal entities (whereas a compiler discards comments); and it can perform limited operations on semiformal entities. Our work borrows from formal specification techniques, but is intended for software that evolves.
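One of the tool's formal operations, internal-consistency checking, can be sketched directly: the formal part of a design entry (names and attribute types) is checked completely, while the informal notes pass through untouched, mirroring the formal/informal split described above. All names below are hypothetical.

```python
PRIMITIVES = {"int", "str", "float"}

# A tiny design model: formal parts are attribute/type pairs,
# informal parts are free-text notes the checker never interprets.
design = {
    "Order":     {"attrs": {"lines": "OrderLine", "total": "Money"},
                  "note": "totals recomputed lazily"},
    "OrderLine": {"attrs": {"qty": "int"}, "note": ""},
}

def inconsistencies(design):
    """Return (class, attribute, type) triples whose type is never defined."""
    defined = set(design) | PRIMITIVES
    return [(cls, attr, t)
            for cls, entry in design.items()
            for attr, t in entry["attrs"].items()
            if t not in defined]

errors = inconsistencies(design)   # "Money" is referenced but never defined
```

A real evolutionary tool would run such checks continuously as the design drifts, so each change is matched back against requirements and code.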

Proceedings ArticleDOI
13 Jun 1994
TL;DR: A transonic compressor stage was designed for the Naval Postgraduate School Turbopropulsion Laboratory; the rotor was designed using an Euler code augmented by a distributed body force model to account for viscous effects.
Abstract: A transonic compressor stage has been designed for the Naval Postgraduate School Turbopropulsion Laboratory. The design relied heavily on CFD techniques while minimizing conventional empirical design methods. The low aspect ratio (1.2) rotor has been designed for a specific head ratio of 0.25 and a tip relative inlet Mach number of 1.3. Overall stage pressure ratio is 1.56. The rotor was designed using an Euler code augmented by a distributed body force model to account for viscous effects. This provided a relatively quick-running design tool, and was used for both rotor and stator calculations. The initial stator sections were sized using a compressible, cascade panel code. In addition to being used as a case study for teaching purposes, the compressor stage will be used as a research stage. Detailed measurements, including non-intrusive LDV, will be compared with the design computations, and with the results of other CFD codes, as a means of assessing and improving the computational codes as design tools. Copyright © 1994 by ASME

Journal ArticleDOI
TL;DR: In this article, an object-oriented paradigm is applied as a tool to break down the material flow system model, as well as the design environment, into modules, and the model is built, evaluated and improved using a collection of software applications that have been developed to address partial problems in the design process.
Abstract: In this paper we propose a framework for the management of the complexity involved in building material flow system models. The object-oriented paradigm is applied as a tool to break down the material flow system model, as well as the design environment, into modules. The designer is provided with a generic material flow system model from which to start the design process. The model is built, evaluated and improved using a collection of software applications that have been developed to address partial problems in the material flow system design process. The organization of design data and design tools into logical units clearly establishes precedence between software and data. This feature opens up possibilities for computer-assisted design tool sequencing.

Proceedings ArticleDOI
01 May 1994
TL;DR: In this article, a 2-D distributed transducer shape and shading and their implications for the active control of plates are discussed, and an optimization method is described to fit the approximation to a continuous distribution.
Abstract: This paper discusses 2-D distributed transducer shape and shading and their implications for the active control of plates. Two-dimensional transducer shaping is shown to be a useful design tool for the control problem. In addition, transducer shaping can be combined with gain-weighting to provide close approximation of continuously shaded transducer distributions. An optimization method is described which can be used to fit the approximation to a continuous distribution. The analysis is applied to two examples of transducers used to control a rectangular, simply supported plate. The first relies on shaping alone and is shown to spatially filter out the even-even modes. The second was developed using the optimization technique and is shown to provide "all-mode" controllability and observability.
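The spatial-filtering effect of transducer shaping can be seen in a 1-D analogue of the simply supported plate: the gain for mode m is the integral of the shape function against sin(mπx/L), so a shape symmetric about the span center cannot couple to even modes (the 1-D counterpart of filtering out the even-even plate modes). The parabolic shading profile below is illustrative.

```python
from math import pi, sin

L, N = 1.0, 1000                       # span length and integration points

def modal_gain(shape, m):
    """Riemann-sum approximation of the integral of shape(x)*sin(m*pi*x/L)."""
    dx = L / N
    return sum(shape(i * dx) * sin(m * pi * i * dx / L) for i in range(N + 1)) * dx

def shape(x):
    return x * (L - x)                 # shading symmetric about the span center

gains = {m: modal_gain(shape, m) for m in (1, 2, 3, 4)}
```

The even-mode gains vanish (to floating-point precision) purely by symmetry, while the odd modes remain controllable and observable through this transducer.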

Journal ArticleDOI
TL;DR: In this article, the authors describe the implementation and verification of the new flow model derived in Part 1 of this paper into such an integrated natural ventilation design tool, which can be used successfully to optimize the design of a naturally ventilated factory building.

Proceedings ArticleDOI
01 Jan 1994
TL;DR: Previously, an innovative COM modeling technique for transversely coupled SAW resonator filters was proposed, along with the method of experimentally determining the COM and electrostatic parameters necessary for the model; in this paper, detailed design examples are given which are based on this modeling technique and the parameter database.
Abstract: The design of transversely coupled SAW resonator filters is more complex than that of conventional SAW filters because of its two-dimensional nature. Previously, an innovative COM modeling technique for TCFs was proposed, along with the method of experimentally determining the COM and electrostatic parameters necessary for the model. In this paper, detailed design examples are given which are based on this modeling technique and the parameter database. The paper gives all design information (number of electrodes, beam width, grating and transducer periods, etc.) and corresponding COM model parameters. Also, the complete measured and predicted electric performances are presented. An excellent agreement is found between the theoretical predictions and the experiments over a wide frequency range. As a practical design tool, this approach has shown excellent accuracy and high efficiency for optimum design procedures.

Proceedings ArticleDOI
07 Sep 1994
TL;DR: In this paper, a new approach to shape design sensitivity analysis and optimisation of mechanical components is proposed, which draws upon the capabilities offered by state-of-the-art associative modelers to provide for shape design parameterization and design velocity computations.
Abstract: In this paper, a new approach to shape design sensitivity analysis and optimisation of mechanical components is proposed. The approach draws upon the capabilities offered by state-of-the-art associative modelers to provide for shape design parameterization and design velocity computations, and uses a commercial finite element code for performance analysis and adjoint computations. The material derivative concept of continuum mechanics and a domain method of shape design sensitivity analysis are used for design sensitivity computations. The approach makes shape design optimisation a practical design tool, bringing design, analysis, and manufacturing closer together. The integrated design environment allows designers to create models that capture design intent, create finite element meshes for finite element analysis, perform geometric as well as performance what-if studies, modify the design to improve performance, and generate 2-D production drawings at any design iteration. A number of approaches have been proposed in recent years for shape design parameterization. The ability to obtain an improved or optimal shape has been hindered so far by the difficulty of coupling the analysis model with the design model. The earliest proposed approach used the position of the boundary nodes as design parameters (Zienkiewicz and Campbell). The method presented some severe drawbacks, as pointed out in later work. Furthermore, as design variables were assigned to nodes on the moving part of the boundary, the number of design variables becomes very large, leading to high computational effort. Also, undesirable shapes may result due to the difficulty of ensuring boundary slope continuity. Since only boundary nodes move, it is difficult to maintain the finite element mesh during the optimisation process.

Book ChapterDOI
01 Jan 1994
TL;DR: In this article, a method for the analysis of three dimensional (3D) electron optical components using the finite difference method has been developed and implemented in a suite of computer programs which can be run on a desktop or laptop personal computer.
Abstract: A method has been developed for the analysis of three dimensional (3D) electron optical components using the finite difference method. Electrodes, dielectric materials (including any applied surface charge distributions), and ferromagnetic materials of quite general 3D shapes including the coil windings can all be analyzed in a unified way. The new finite difference equations have been derived in full mathematical detail. The method has been implemented in a suite of computer programs which have been developed and run on a desktop or laptop personal computer. This has necessitated making the software as fast and memory efficient as possible. The programs compute the 3D electric and magnetic field distributions, perform direct electron ray-tracing through the combined fields and provide graphical output of the fields and trajectories. The accuracy of the software has been established by various analytic tests and comparisons with existing software. Computed fields in round electron lenses compared favorably with those from rotationally symmetric finite element analyses. Tolerances for lens electrode machining errors agreed to within a few percent with the results of a perturbation analysis. Optical properties for electrostatic and magnetic lenses have been computed by direct ray-tracing using the 3D programs and the results compared with those from 2D programs using a paraxial ray-trace and aberration integrals. In all cases the agreement is good. Examples which highlight the effectiveness of the programs as a design tool have been presented. 
These include: 3D aspects of photomultiplier tube design, with specific examples in the photocathode region and the dynode stack; magnetic immersion lenses for a scanning electron microscope (SEM), which allow specimen access into the lenses via side slots; surface charging effects which occur during inspection of insulating specimens in the SEM and a simulation of the line scan; a Wien filter for monochromating an electron beam; and a combined electric and magnetic focussing and deflection system in the presence of stray 3D fields. It has been shown, therefore, that complicated 3D analyses of a variety of electron optical components can be performed to a high degree of accuracy on the new generation of small personal or portable microcomputers. This provides designers of electron beam equipment with a relatively cheap and convenient design tool.
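The field computations described above rest on finite-difference solution of Laplace's equation; this 2-D Gauss-Seidel miniature (one electrode edge held at 100 V, the other edges grounded) shows the relaxation update that the 3-D programs generalize to electrodes, dielectrics, and magnetic materials.

```python
N = 20
V = [[0.0] * N for _ in range(N)]      # potential grid; edge values stay fixed
for j in range(N):
    V[0][j] = 100.0                    # electrode edge at 100 V, others grounded

for _ in range(500):                   # Gauss-Seidel relaxation sweeps
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            # Interior potential = average of the four neighbors (Laplace).
            V[i][j] = 0.25 * (V[i + 1][j] + V[i - 1][j] + V[i][j + 1] + V[i][j - 1])

center = V[N // 2][N // 2]             # roughly a quarter of the electrode voltage
```

By superposition, the center of a square with one side at 100 V sits near 25 V; making such sweeps fast and memory-lean is exactly what let the authors fit 3-D analyses onto a personal computer.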

Book ChapterDOI
01 Jan 1994
TL;DR: The COMBINE project as mentioned in this paper is a first step towards the development of intelligent integrated building design systems (IIBDS) through which the energy use, services, functional and other performance characteristics of a planned building can be analyzed.
Abstract: The COMBINE project is a first step towards the development of intelligent integrated building design systems (IIBDS) through which the energy use, services, functional and other performance characteristics of a planned building can be analyzed. The first phase of the research has concentrated on data integration, based on the concept of a set of separate actors grouped around a central common data repository. At this stage the selection of actors is dominated by the need to address energy and HVAC performance aspects in the early design stages. The resulting data exchange system consists of a suite of Design Tool Prototypes (DTP) communicating through an Integrated Data Model (IDM) in a standard format. Research objectives and results are discussed and an outline of the next phase of the project is given.

Book ChapterDOI
01 Jan 1994
TL;DR: The designer of a material flow system is expected to analyse the role of each component as a part of the total system and consider its influence on the overall system performance.
Abstract: Material flow is a significant factor in the design of manufacturing systems. The designer of a material flow system is faced not only with the specification of individual system components but also with the overall objective of the manufacturing system. The association between components and the interaction of the material flow system with the manufacturing system are the basis by which its performance is judged. A material flow system design may be optimal in itself, but if the design cannot be integrated into the overall manufacturing system, it may have a negative impact on the manufacturing system performance. Therefore, the designer is expected to analyse the role of each component as a part of the total system and consider its influence on the overall system performance.

01 Sep 1994
TL;DR: A systematic approach to engineering a complex textile structure with the help of modern computational techniques is proposed in this paper, where it is suggested that textile designers and engineers should start from first principles and then develop powerful new approaches to suit the special complexities of various textile structures in order to meet the needs of customers and manufacturers.
Abstract: A systematic approach to engineering a complex textile structure with the help of modern computational techniques is proposed. It is suggested that textile designers and engineers should start from first principles and then develop powerful new approaches to suit the special complexities of various textile structures in order to meet the needs of customers and manufacturers. It is also suggested that textile engineers should accept the cultural change and develop advanced software to create, manipulate and evaluate new fabric structures in order to make fabric mechanics a valuable design tool.


01 May 1994
TL;DR: The purpose of the design tool is to serve as a platform for experimentation with existing and future synthesis-for-test techniques, and it can currently generate designs optimized for both parallel and circular built-in self-test architectures.
Abstract: Hardware synthesis techniques automatically generate a structural hardware implementation given an abstract (e.g., functional, behavioral, register transfer) description of the behavior of the design. Existing hardware synthesis systems typically use cost and performance as the main criteria for selecting the best hardware implementation, and seldom even consider test issues during the synthesis process. We have developed and implemented a computer-aided design tool whose primary objective is to generate the lowest-cost, highest-performance hardware implementation that also meets specified testability requirements. By considering testability during the synthesis process, the tool is able to generate designs that are optimized for specific test techniques. The input to the tool is a behavioral VHDL specification that consists of high-level software language constructs such as conditional statements, assignment statements, and loops, and the output is a structural VHDL description of the design. Implemented synthesis procedures include compiler optimizations, inter-process analysis, high-level synthesis operations (scheduling, allocation, and binding) and control logic generation. The purpose of our design tool is to serve as a platform for experimentation with existing and future synthesis-for-test techniques, and it can currently generate designs optimized for both parallel and circular built-in self-test architectures.

Journal ArticleDOI
TL;DR: In this article, a prototype Seismic Design Assistant (SDA) is developed specifically to assist, advise, and guide design engineers engaged in the preliminary seismic design of reinforced concrete buildings (with coupled shear walls).
Abstract: Many structural engineers only rarely need to be concerned with seismic design. It is a relatively difficult process that can involve advanced analytical techniques and concepts such as ductility which are not normally encountered except in this context. A prototype Seismic Design Assistant (SDA) has been developed specifically to assist, advise, and guide design engineers engaged in the preliminary seismic design of reinforced concrete buildings (with coupled shear walls). Available seismic design methods are reviewed, and a particular method is outlined that incorporates sophisticated analytical procedures to enable ductility requirements to be satisfied. This method provides the basis of the knowledge base employed within the SDA. The prototype design tool has been implemented on a Sun workstation using Quintec-Prolog, Quintec-Flex, and Fortran 77. Details are presented of the architecture of the SDA, of the knowledge representation that has been employed, and of the integration of traditional procedural software within a knowledge-based approach.

Journal ArticleDOI
TL;DR: In this article, a physical model of a technology in its early stages of development is presented, referred to as ex-situ forced aeration of soil piles, or, briefly, "soil pile aeration."
Abstract: This paper presents a physical model of a technology in its early stages of development, which is here referred to as "ex-situ forced aeration of soil piles," or, briefly, "soil pile aeration." The model can be used in a dual capacity. First, it can be used to screen the technology; that is, to determine if, based on soil characteristics and contamination levels, the technology is at all applicable. Second, if the technology is deemed applicable, the model can be used as a design tool to optimize the basic design parameters, which are the pile radius and length, and the vacuum blower characteristics. The physical model presented in this paper provides a useful framework for understanding the influence that the design and soil parameters have on the effectiveness and/or viability of the technology, and can be a useful tool for preliminary design of aeration piles.

01 Jan 1994
TL;DR: In this paper, the authors describe the development of a concept design tool for an innovative monohull ro-ro ship based on a multi-attribute decision-making approach; the mathematical model of a fast ro-ro ship is structured to describe the main design aspects.
Abstract: This paper describes the development of a concept design tool based on multi-attribute decision-making approach for an innovative monohull ro-ro ship. The mathematical model of a fast ro-ro ship is structured to describe the main design aspects. Preliminary analysis of hull forms within predicted size and speed limits, provides initial design guidance. Deck area utilization for loading/unloading of trailers is modelled. Intact and damaged stability are accounted for. Added resistance and other dynamic effects are taken from systematic seakeeping calculations. Power prediction is tuned on regression analysis of experimental data. A structural design study provides an insight into weight breakdown and into location of the centre of gravity. An analysis of potential traffic flow in the Mediterranean area serves as a basis for ship specifications. Port and shipyard facilities data are addressed as constraints. Although emphasis is on technical design rather than economics, the set of feasible and efficient non-dominated ship designs provides a basis for technical decisions related to the resultant economic consequences.

Patent
31 Aug 1994
TL;DR: In this paper, a computer design tool is presented that determines a predictive indication of slit width variation so that an arbor setup in rotary slitting of metal can be adjusted in accordance with that indication.
Abstract: A computer design tool for determining a predictive indication of slit width variation to adjust an arbor setup in rotary slitting of metal in accordance with the indication of slit width variation. A method of using such a design tool is also provided. The design tool performs an analysis on historical slitting runs to provide the predictive indication of slit width variation.
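The "analysis on historical slitting runs" could plausibly be as simple as a least-squares fit; the sketch below regresses slit-width variation on knife clearance using made-up run data (the patent does not disclose its actual model, so both the predictor variable and the numbers are assumptions).

```python
# Hypothetical historical runs: (knife clearance in mm, observed slit width
# variation in mm). A linear trend is assumed purely for illustration.
runs = [(0.05, 0.011), (0.08, 0.016), (0.10, 0.021), (0.12, 0.024)]

# Ordinary least squares, closed form for one predictor.
n = len(runs)
sx = sum(c for c, _ in runs)
sy = sum(v for _, v in runs)
sxx = sum(c * c for c, _ in runs)
sxy = sum(c * v for c, v in runs)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# Predict the variation for a proposed arbor setup (0.09 mm clearance),
# so the setup can be adjusted before the run.
predicted = slope * 0.09 + intercept
```

The fitted trend lets the operator anticipate the width variation of a new setup from past runs instead of discovering it on the coil.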

Proceedings ArticleDOI
26 Jun 1994
TL;DR: A new fuzzy control design system realized with a digital signal processor (DSP) is proposed; because of the merits of the DSP, the system can perform the systematic tuning experiment, repeatedly changing the parameters and carrying out the fuzzy control, in an extremely short time.
Abstract: One of the most significant problems and requirements in fuzzy control is the establishment of a tuning method for the parameters of the fuzzy controller. A suitable tuning method that clearly meets this demand has not yet been developed. Therefore, a fuzzy control system design tool that can perform a systematic tuning experiment on the parameters in a short time is required. This paper proposes a new fuzzy control design system realized using a digital signal processor (DSP). Because of the merits of the DSP, this system can perform the systematic tuning experiment, repeatedly changing the parameters and carrying out the fuzzy control, in an extremely short time. The use of a DSP in the design system allows the fuzzy calculations to be performed at high speed and the parameters of the fuzzy system to be changed easily in software. This design system is therefore well suited to determining the optimum parameter set during the development of a new fuzzy control system, especially one that requires an extremely short control period.
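The systematic tuning experiment can be miniaturized in software: sweep a controller parameter, simulate the closed loop, and keep the best setting. The tiny two-rule fuzzy controller and first-order plant below are illustrative, not the DSP system's actual rules or plant.

```python
def fuzzy_output(error, gain):
    # Two overlapping triangular sets on the error ("negative", "positive"),
    # defuzzified as a weighted average of singleton outputs -1 and +1.
    pos = min(max((error + 1.0) / 2.0, 0.0), 1.0)
    return gain * (2.0 * pos - 1.0)

def tracking_cost(gain, setpoint=1.0, dt=0.1, steps=100):
    """Simulate the loop and accumulate squared tracking error."""
    y, cost = 0.0, 0.0
    for _ in range(steps):
        u = fuzzy_output(setpoint - y, gain)
        y += dt * (-y + u)             # first-order plant: y' = -y + u
        cost += (setpoint - y) ** 2
    return cost

# Systematic experiment: sweep the output scaling gain 0.1..3.0 and keep the
# setting with the lowest tracking cost.
best_gain = min((g / 10 for g in range(1, 31)), key=tracking_cost)
```

With a pure tracking-error cost the optimum sits at the top of the sweep (higher gain means less steady-state error for this plant); a real experiment would also penalize control effort, which is what makes fast repeated trials on the DSP valuable.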