
Showing papers on "Design tool" published in 1992


Dissertation
01 Jan 1992
TL;DR: A polynomial-time programming algorithm for embedding the desired circuit graph onto the prefabricated routing resources is presented, and is implemented as part of a general design tool for specifying, manipulating and comparing circuit netlists.
Abstract: This thesis develops a theoretical model for the wiring complexity of wide classes of systems, relating the degree of connectivity of a circuit to the dimensionality of its interconnect technology. This model is used to design an efficient, hierarchical interconnection network capable of accommodating large classes of circuits. Predesigned circuit elements can be incorporated into this hierarchy, permitting semi-customization for particular classes of systems (e.g., photoreceptors included on vision chips). A polynomial-time programming algorithm for embedding the desired circuit graph onto the prefabricated routing resources is presented, and is implemented as part of a general design tool for specifying, manipulating and comparing circuit netlists. This thesis presents a system intended to facilitate analog circuit design. At its core is a VLSI chip that is electrically configured in the field by selectively connecting predesigned elements to form a desired circuit, which is then tested electrically. The system may be considered a hardware accelerator for simulation, and its large capacity permits testing system ideas, which is impractical using current means. A fast-turnaround simulator permitting rapid conception and evaluation of circuit ideas is an invaluable aid to developing an understanding of system design in a VLSI context. We have constructed systems using both reconfigurable interconnection switches and laser-programmed interconnect. Prototypes capable of synthesizing circuits consisting of over 1000 transistors have been constructed. The flexibility of the system has been demonstrated, and data from parametric tests have proven the validity of the approach. Finally, this thesis presents several new circuits that have become key components in many analog VLSI systems. 
Fast, dense and provably safe one-phase latches and hierarchical arbiters are presented, as are a low-noise analog switch, an isotropic novelty filter, a dense, active high-resistance element, and a subthreshold differential amplifier with a large linear input range.

303 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a set of basic tools which can be used to construct systematic procedures for nonlinear feedback design, including a backstepping procedure for observer-based global stabilization and tracking of a class of nonlinear systems.

268 citations


Journal ArticleDOI
TL;DR: This work designs tool integration relationships as separate components called mediators, and designs tools to implicitly invoke mediators that integrate them, and applies this model both to analyze existing mechanisms and in the design of a mechanism for C++.
Abstract: Common software design approaches complicate both tool integration and software evolution when applied in the development of integrated environments. We illustrate this by tracing the evolution of three different designs for a simple integrated environment as representative changes are made to the requirements. We present an approach that eases integration and evolution by preserving tool independence in the face of integration. We design tool integration relationships as separate components called mediators, and we design tools to implicitly invoke mediators that integrate them. Mediators separate tools from each other, while implicit invocation allows tools to remain independent of mediators. To enable the use of our approach on a range of platforms, we provide a formalized model and requirements for implicit invocation mechanisms. We apply this model both to analyze existing mechanisms and in the design of a mechanism for C++.
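The mediator idea can be sketched briefly. The following is a minimal illustration of implicit invocation with invented names (Editor, Compiler, SaveTriggersCompile); the paper's actual mechanism design targets C++ environments:

```python
# Tools announce events without naming their listeners; a mediator
# registers for those events and performs the integration, so neither
# tool depends on the other or on the mediator.

class Tool:
    """Base class: a tool announces events to whoever has registered."""
    def __init__(self):
        self._listeners = []

    def register(self, callback):
        self._listeners.append(callback)

    def announce(self, event, data):
        for cb in list(self._listeners):
            cb(event, data)

class Editor(Tool):
    def save(self, text):
        self.announce("saved", text)     # implicit invocation: no direct call

class Compiler(Tool):
    def __init__(self):
        super().__init__()
        self.compiled = []

    def compile(self, text):
        self.compiled.append(text)

class SaveTriggersCompile:
    """Mediator: integrates Editor and Compiler; neither tool names the other."""
    def __init__(self, editor, compiler):
        self.compiler = compiler
        editor.register(self.on_event)

    def on_event(self, event, data):
        if event == "saved":
            self.compiler.compile(data)

editor, compiler = Editor(), Compiler()
SaveTriggersCompile(editor, compiler)
editor.save("main.c")                    # the mediator relays this to the compiler
```

Changing the integration policy means replacing only the mediator; the Editor and Compiler classes evolve independently, which is the property the abstract argues common designs lose.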

168 citations


Book ChapterDOI
01 Jan 1992
TL;DR: The static and predetermined capabilities of many knowledge-based design systems prevent them from acquiring design experience for future use. To overcome this limitation, techniques for reasoning and learning by analogy that can aid the design process have been developed.
Abstract: The static and predetermined capabilities of many knowledge-based design systems prevent them from acquiring design experience for future use. To overcome this limitation, techniques for reasoning and learning by analogy that can aid the design process have been developed. These techniques, along with a nonmonotonic reasoning capability, have been incorporated into Argo, a tool for building knowledge-based systems. Closely integrated into Argo's analogical reasoning facilities are modules for the acquisition, storage, retrieval, evaluation, and application of previous experience. Problem-solving experience is acquired in the form of problem-solving plans represented as rule-dependency graphs. From increasingly abstract versions of these graphs, Argo calculates sets of macrorules. These macrorules are partially ordered according to an abstraction relation for plans, from which the system can efficiently retrieve the most specific plan applicable for solving a new problem. Knowledge-based applications written in Argo can use these plan abstractions to solve problems that are not necessarily identical, but just analogous, to those solved previously. Experiments with an application for designing VLSI digital circuits are yielding insights into how design tools can improve their capabilities as they are used.

Introduction and Background. A number of knowledge-based systems for design have been developed recently. These systems are particularly suited to situations in which heuristic expert knowledge must be employed because algorithmic techniques are unavailable or prohibitively expensive. Restrictions on the types of problems or domains handled by such design systems are progressively being eased. Unfortunately, the knowledge embodied in many of these systems is static: it fails to capture the iterative aspects of the design process that involve solving new problems by building upon the experience of previous design efforts. Given the same problem ten times, these systems will solve it the same way each time, taking as long for the tenth as for the first. The work reported here is based on the contention that a truly intelligent design system should improve as it is used, i.e., it should have the means for remembering the relevant parts of previous design efforts and be able to employ this accumulated experience in solving future design problems. Learning from experience is a powerful technique used by humans to improve their problem-solving ability. For a design tool, the remembered experience should consist of design results, design plans, and preferences among these results and plans. These constitute different aspects of previous design efforts that the design tool can use as training examples.

Learning from Experience. Existing approaches to learning from experience attempt to generalize these training examples in order to obtain more widely applicable results. The STRIPS problem-solving system incorporates a technique for generalizing plans and their preconditions based on the formation of macro operators (MACROPs). In this technique, an existing plan, consisting of a sequence of operators whose execution yields a goal state, is stored in a data structure called a triangle table. This table represents the preconditions and postconditions for each operator in the plan. The plan is generalized by replacing all precondition constants by distinct parameters and then correcting for overgeneralization by substituting for inconsistent parameters. The resultant generalized plan, a MACROP, is stored and later used as either a plan, a set of subplans, or an execution monitor. A better procedure for generalization, developed in the context of learning from examples, uses a proof-based explanation or verification mechanism, often termed explanation-based generalization (EBG). It is an improvement over the use of a triangle table in that it does not require any heuristics to compensate for possible over-generalizations. The proof employed comprises information about why a training example satisfies a particular goal. The procedure involves, first, a modified regression of the goal through the proof structure, whereby sufficient constraints on the domain of training examples for which the proof holds are computed. These constraints are based on the codomain of goals allowed. The second stage of the procedure is to reapply the proof structure to the resultant generalized domain to obtain a generalized codomain. In the terminology used above, a plan is like a proof, a plan precondition is the domain for the proof, and the resultant design is the codomain. For design problems, EBG-like generalizations are limited in that they arbitrarily give equal weight to all portions of the examples, without regard to whether each portion is relevant or important to solving future problems. More abstract generalizations can be obtained by taking this factor into account. Abstract planning, i.e., choosing a partial sequence of operators to reach a goal, is accomplished in ABSTRIPS by ignoring operator preconditions considered to be details. Criticality values are attached to the preconditions of operators to determine their importance. These values are computed based on the presumed difficulty of satisfying each precondition if it is not already satisfied. Only if a plan succeeds at an abstract level is it expanded by the addition of subplans to handle the details at a subsequent level. Another technique for reusing past design experience is to replay a previously recorded plan or design history. This approach is interesting in its flexibility with respect to replaying portions of a stored plan to solve, or at least partially solve, a new problem. Unfortunately, the correspondence between the stored plan and subproblems of a partial design is difficult to establish automatically. The transfer of experience from previous problem-solving efforts to new problems has also been accomplished via analogical reasoning methods. Analogical reasoning is a mapping from a base domain to a target domain that allows the sharing of features between these domains. With respect to problem solving, many of the previously reported methods are limited by their requirements that either new problems be very similar to previously solved ones, or analogies be supplied by a user and match perfectly.

44 citations


Journal ArticleDOI
TL;DR: In this paper, a design tool that combines the simplified real frequency technique of broadbanding with an appropriate application of algebraic network decomposition and replacement techniques is presented, which is used for constructing practical MMIC matching networks, since the fringing lumped-element parasitics which arise due to the physical implementation can easily be absorbed within the mixed-element network structure.
Abstract: A design tool that combines the simplified real frequency technique of broadbanding with an appropriate application of algebraic network decomposition and replacement techniques is presented. The method incorporates the outstanding merits of modern single-variable broadbanding techniques, i.e. during the course of the design, there is no need to choose a circuit topology with mixed elements in advance, nor is it necessary to invent a realizable transfer function in two variables to measure the system performance. Examples are presented to exhibit the application of the tool, which is expected to be useful for constructing practical MMIC matching networks, since the fringing lumped-element parasitics which arise due to the physical implementation can easily be absorbed within the mixed-element network structure.

42 citations


Journal ArticleDOI
TL;DR: In this paper, a new tool for the interactive design of garments in 3D is introduced; making use of an elastic surface model, animation allows one to examine the garment design in three dimensions dynamically.

35 citations


Journal ArticleDOI
TL;DR: The Garnet toolkit was specifically designed to make highly interactive graphical programs easier to design and implement and has been used to create three different interactive design tools: Gilt, a simple interface builder for laying out widgets; Lapidary, a sophisticated design tool for constructing application-specific graphics and custom widgets; and C32, a spreadsheet interface to constraints.
Abstract: The Garnet toolkit was specifically designed to make highly interactive graphical programs easier to design and implement. Visual, interactive, user-interface design tools clearly fall into this category. At this point, we have used the Garnet toolkit to create three different interactive design tools: Gilt, a simple interface builder for laying out widgets; Lapidary, a sophisticated design tool for constructing application-specific graphics and custom widgets; and C32, a spreadsheet interface to constraints. The features of the Garnet toolkit that made these easier to create include the use of a prototype-instance object system instead of the usual class-instance model, integration of constraints with the object system, a graphics model that supports automatic graphical update and saving to disk of on-screen objects, separation of specifying the graphics of objects from their behavior, automatic layout of graphical objects in a variety of styles, and a widget set that supports such commonly used operations as selection, moving and growing objects, and displaying and setting their properties.
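The prototype-instance model with constraint (formula) slots can be sketched in a few lines. The following is an illustrative toy, not Garnet's actual KR API; the names Obj, get, and set are invented for the example:

```python
# Minimal prototype-instance object system with formula slots:
# instances inherit slots from a prototype, and callable slots act as
# constraints that recompute from other slots each time they are read.

class Obj:
    def __init__(self, proto=None, **slots):
        self.proto = proto          # delegation parent (prototype), or None
        self.slots = dict(slots)    # local slot values and formulas

    def _lookup(self, name):
        if name in self.slots:
            return self.slots[name]
        if self.proto is not None:
            return self.proto._lookup(name)   # walk the prototype chain
        raise AttributeError(name)

    def get(self, name):
        v = self._lookup(name)
        return v(self) if callable(v) else v  # formulas recompute on read

    def set(self, name, value):
        self.slots[name] = value    # a local slot overrides the inherited one

# A prototype rectangle; `right` is a constraint derived from left + width.
proto_rect = Obj(left=0, width=10,
                 right=lambda o: o.get("left") + o.get("width"))
r = Obj(proto=proto_rect)   # an instance of the prototype, no local slots yet
r.set("left", 5)            # override inherited `left` locally
print(r.get("right"))       # constraint recomputed with the instance's left
```

Because slots delegate to the prototype and formulas are evaluated against the receiving object, editing the prototype changes every instance that has not overridden the slot, which is the behavior the class-instance model makes awkward.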

34 citations


Journal ArticleDOI
TL;DR: In this article, a method which extends the use of the well-established chart for the design of solar shading devices was developed. It provides the quantitative proportion of the opening's area that is exposed at any given time to direct solar radiation, in addition to the binary answer returned by the old chart as to whether any part of the opening is exposed to such radiation at all.

30 citations


Journal ArticleDOI
TL;DR: A practical computer code has been developed which uses the accepted two-fluid model to simulate He II flow in complicated systems, retaining the coupling between the pressure, temperature and velocity fields.

22 citations


Journal ArticleDOI
TL;DR: The authors illustrate the use of a new PC-based interactive computer program that has been developed specifically for the design of dry-type transformers that enables transformer designers to test, modify, and optimize their designs.
Abstract: The authors illustrate the use of a new PC-based interactive computer program that has been developed specifically for the design of dry-type transformers. The BASIC language executes on a DOS-based PC with 512 kbyte of RAM and a CGA display. No additional software is needed. Simulation of a design case requires approximately 10 s of execution time on a 4.7 MHz PC/XT. The program enables transformer designers to test, modify, and optimize their designs. Once a base design is established, sensitivity to operating conditions can be easily determined by changing the appropriate input data and re-executing the program. This program can be a useful training tool in the classroom and an effective production design tool in the power electronics industry.

19 citations


Book
01 Sep 1992
TL;DR: Learning to design, designing to learn - a more creative role for technology is discussed in this article, where the authors transform the curriculum using professional tools: software design as a learning environment and the future of CAD - technological support for kids building artifacts via design practice.
Abstract: Learning to design, designing to learn - a more creative role for technology. Part 1 Transforming the curriculum using professional tools: software design as a learning environment. Part 2 The future of CAD - technological support for kids building artifacts: science via design practice - where students actively investigate and persuade. Part 3 Learning to think like a designer - a microworld and a tutor: using structure as a design tool in algorithmic problem solving.

Journal ArticleDOI
TL;DR: This paper is an exploration of how different computer techniques can be brought to bear in the analysis of catalogues of design precedents, including automatic classification, various kinds of frequency analysis, clustering, and connectionist techniques.
Abstract: This paper is an exploration of how different computer techniques can be brought to bear in the analysis of catalogues of design precedents. The techniques include automatic classification, various kinds of frequency analysis, clustering, and connectionist techniques. Connectionist techniques are demonstrated as forming the basis of a useful design tool that facilitates a rudimentary kind of design dialogue. This dialogue involves a kind of exploration within the constraints of an atomistic representation schema in which key features are determined a priori.
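As a rough illustration of feature-based retrieval over a precedent catalogue (a toy overlap ranking under an atomistic feature schema, not the paper's connectionist model; all names are invented):

```python
# Design precedents as sets of a-priori features; a partial query is
# matched by ranking precedents on shared features.

precedents = {
    "courtyard_house": {"courtyard", "single_storey", "masonry"},
    "tower_block":     {"multi_storey", "concrete", "lift_core"},
    "patio_villa":     {"courtyard", "single_storey", "timber"},
}

def retrieve(query):
    """Rank stored precedents by the number of features shared with the query."""
    return sorted(precedents,
                  key=lambda name: len(precedents[name] & query),
                  reverse=True)

ranked = retrieve({"courtyard", "masonry"})
print(ranked)
```

The rudimentary "design dialogue" the paper describes amounts to iterating this loop: the designer adds or removes query features and inspects how the ranking of precedents shifts.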

Book ChapterDOI
01 Jan 1992
TL;DR: A general framework for configuration design is presented in which the design specifications are separated into basic functions, performance goals, and constraints, and a knowledge-based design tool, called HYSYN (HYdraulics SYNthesizer), was developed.
Abstract: Configuration design is a type of design activity in which a set of pre-defined components can be combined in certain ways to design a system (Mittal and Frayman 1989). This paper focuses on the configuration design of power transmission systems in general and hydraulic systems in particular. We present a general framework for configuration design in which the design specifications are separated into basic functions, performance goals, and constraints. The design space is divided into a functional space and a physical space, each further organized into hierarchies of functional modules and generic physical devices, respectively. Functional modules are representations of behaviors of physical devices and are domain-independent. Starting with design specifications, a skeletal design consisting of a network of essential functions is formed. Functions are then mapped to physical devices that satisfy performance goals and constraints. Based on the framework presented in this paper, a knowledge-based design tool, called HYSYN (HYdraulics SYNthesizer), was developed. A design example of systematic configuration of hydraulic systems is also presented. Issues in automated configuration design, such as function-sharing, granularity of the building blocks, and combinatorial explosion, are also discussed.

Journal ArticleDOI
TL;DR: It is argued that the natural structure of such fields is given by tubes of flux and/or slices bounded by equipotentials, and these two descriptions are shown to provide upper and lower bounds for the system energy and for the equivalent circuit parameters describing that energy.
Abstract: To overcome the difficulty experienced by students in visualizing electric and magnetic fields, a graphical method is proposed using a personal computer to display the field. It is argued that the natural structure of such fields is given by tubes of flux and/or slices bounded by equipotentials. These two descriptions are shown to provide upper and lower bounds for the system energy and for the equivalent circuit parameters describing that energy. The calculations are extremely simple and can be used by the student in an interactive mode. The method is, therefore, suitable as a design tool. It can also be usefully combined with a finite element calculation to improve the accuracy. A suitable computer package called TAS is briefly described.
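The dual-bound idea behind tubes and slices can be shown numerically. The sketch below (an illustrative example with an assumed resistivity profile, not the TAS package itself) discretizes a nonuniform conducting sheet two ways: cells chained in series along assumed current tubes, the tubes then combined in parallel, versus cells combined in parallel along assumed equipotential slices, the slices then chained in series. Constraining the current with insulating cuts can only raise the resistance, while adding conducting cuts can only lower it, so the two estimates bracket the true value:

```python
# Tubes-and-slices bounds on the resistance of a square conducting sheet
# with nonuniform resistivity, current flowing in the x direction.

N = 50                         # grid cells per side
L = 1.0                        # side length, m
t = 1.0e-3                     # sheet thickness, m
dx = dy = L / N

def rho(x, y):                 # assumed resistivity profile, ohm*m
    return 1.0 + 2.0 * x * y

# Tubes: each row is a series chain of cell resistances; rows in parallel.
# Insulating cuts along the rows constrain the current -> UPPER bound on R.
row_R = [sum(rho((i + 0.5) * dx, (j + 0.5) * dy) * dx / (dy * t)
             for i in range(N))
         for j in range(N)]
R_upper = 1.0 / sum(1.0 / R for R in row_R)

# Slices: each column's cells conduct in parallel; columns in series.
# Conducting cuts along the columns add current paths -> LOWER bound on R.
col_G = [sum(dy * t / (rho((i + 0.5) * dx, (j + 0.5) * dy) * dx)
             for j in range(N))
         for i in range(N)]
R_lower = sum(1.0 / G for G in col_G)

print(R_lower, R_upper)        # the true resistance lies between these
```

For a uniform sheet the two estimates coincide; the gap between them widens with the inhomogeneity and gives an immediate error estimate, which is what makes the method usable interactively.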

01 Jan 1992
TL;DR: In this paper, an evaluation of beach response to detached breakwaters was conducted using the numerical shoreline response model GENESIS, which is intended for determining preliminary, feasibility-level project specifications.
Abstract: An evaluation of beach response to detached breakwaters was conducted using the numerical shoreline response model GENESIS. Functional design parameters such as structure length, gap distance, wave breaker height, wave period, depth at structure, and structure transmission were varied to develop a series of design nomographs for predicting morphological response. This type of design tool is intended for determining preliminary, feasibility-level project specifications. More detailed evaluations of candidate designs using a prototype test, and/or numerical/physical models with site-specific parameters are required to determine final project specifications. This paper discusses the study methodology and presents preliminary breakwater design nomographs.

01 Sep 1992
TL;DR: In this paper, composite transport fuselage crown panel design and manufacturing plans were optimized to have projected cost and weight savings of 18 percent and 45 percent, respectively, by combining cost and performance constraints with a random search optimization algorithm.
Abstract: Composite transport fuselage crown panel design and manufacturing plans were optimized to have projected cost and weight savings of 18 percent and 45 percent, respectively. These savings are close to those quoted as overall NASA ACT program goals. Three local optimization tasks were found to influence the cost and weight of fuselage crown panels. This paper summarizes the effect of each task and describes in detail the task associated with a design cost model. Studies were performed to evaluate the relationship between manufacturing cost and design details. A design tool was developed to aid in these investigations. The development of the design tool included combining cost and performance constraints with a random search optimization algorithm. The resulting software was used in a series of optimization studies that evaluated the sensitivity of design variables, guidelines, criteria, and material selection on cost. The effect of blending adjacent design points in a full scale panel subjected to changing load distributions and local variations was shown to be important. Technical issues and directions for future work were identified.
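A random-search optimizer with constraint rejection, of the general kind combined with the cost model above, fits in a few lines. The objective and constraint below are toy stand-ins; the actual crown-panel cost and performance models are not reproduced here:

```python
# Random search over a box of design variables, rejecting points that
# violate a feasibility (performance) constraint and keeping the cheapest.
import random

def random_search(objective, feasible, bounds, iters=2000, seed=1):
    """Minimize `objective` over box `bounds`, skipping infeasible points."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(iters):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if not feasible(x):
            continue                     # performance constraint violated
        f = objective(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Toy stand-in: minimize "cost" x + 2y subject to a "strength" floor x*y >= 1.
best_x, best_f = random_search(
    objective=lambda v: v[0] + 2.0 * v[1],
    feasible=lambda v: v[0] * v[1] >= 1.0,
    bounds=[(0.1, 5.0), (0.1, 5.0)],
)
```

Because each trial is independent, sensitivity studies of the kind the paper describes reduce to re-running the search with altered bounds, constraints, or cost coefficients and comparing the resulting optima.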

Proceedings Article
Arthur G. Ryman1
09 Nov 1992
TL;DR: The combined use of Entity-Relationship modelling and GraphLog to bridge the textual and graphical views is described.
Abstract: 4Thought, a prototype design tool, is based on the notion that design artifacts are complex, formal, mathematical objects that require complementary textual and graphical views to be adequately comprehended. This paper describes the combined use of Entity-Relationship modelling and GraphLog to bridge the textual and graphical views. These techniques are illustrated by an example that is formally specified in Z Notation.

Journal ArticleDOI
TL;DR: A prototype system in the language of Routine Design (DSPL) is implemented that covers part of the domain of protocol design for thin film epoxy-resin composite materials and describes the overall problem-solving architecture.
Abstract: Current design approaches for composite materials may typically be categorized as either (a) dependent on the experience of seasoned designers of manufacturing protocols for composites or (b) dependent on fundamental studies of the materials involved. We are currently undertaking research to combine these two approaches in an artificial intelligence-based problem-solving system. We report research in progress aimed at automating capability (a) by leveraging a known AI technique: Routine Design. We have implemented a prototype system in the language of Routine Design (DSPL) that covers part of the domain of protocol design for thin film epoxy-resin composite materials. Results encourage further development. We conclude by describing the overall problem-solving architecture we are developing.

Dissertation
01 Oct 1992
TL;DR: In this article, a CCD camera mounted under the front of the vehicle senses obstacles as they emerge into the projection area and reflect the light pattern, and a light coding system has been designed which simplifies the image analysis task and allows a low-cost embedded microcontroller to carry out image processing, code recognition and obstacle avoidance planning functions.
Abstract: Most industrial Automated Guided Vehicles (AGVs) follow fixed guide paths embedded in the floor or bonded to the floor surface. Whilst reliable in their basic operation, these AGV systems fail if unexpected obstacles are placed in the vehicle path. This can be a problem, particularly in semi-automated factories where men and AGVs share the same environment. The performance of line-guided AGVs may therefore be enhanced with a capability to avoid unexpected obstructions in the guide path. The research described in this thesis addresses some fundamental problems associated with obstacle avoidance for automated vehicles. A new obstacle avoidance system has been designed which operates by detecting obstacles as they disturb a light pattern projected onto the floor ahead of the AGV. A CCD camera mounted under the front of the vehicle senses obstacles as they emerge into the projection area and reflect the light pattern. Projected light patterns have been used as an aid to static image analysis in the fields of Computer Aided Design and Engineering; this research extends these ideas in a real-time mobile application. A novel light coding system has been designed which simplifies the image analysis task and allows a low-cost embedded microcontroller to carry out the image processing, code recognition, and obstacle avoidance planning functions. An AGV simulation package has been developed as a design tool for obstacle avoidance algorithms. This enables potential strategies to be developed in a high-level language and tested via a graphical user interface. The algorithms designed using the simulation package were successfully translated to assembler language and implemented on the embedded system. An experimental automated vehicle has been designed and built as a test bed for the research, and the complete obstacle avoidance system was evaluated in the Flexible Manufacturing laboratory at the University of Huddersfield.

Book ChapterDOI
01 Jan 1992
TL;DR: A simplified numerical code for thermal analysis of a helical heat exchanger for use in long-term thermal energy storage in soil was developed and provides a convenient and reliable design tool for such a system.
Abstract: A simplified numerical code for thermal analysis of a helical heat exchanger for use in long-term thermal energy storage in soil was developed. The model was verified for a particular case for which an analytical solution was available from the literature, and was validated with experimental data obtained from field experiments. The differences between predicted and measured data were in the range of ±1°C, which is considered satisfactory for engineering design purposes. The model was prepared for use on a personal computer and thus provides a convenient and reliable design tool for such a system. The computer code may be easily modified for the study of the influence of incorporating phase-change material elements in the storage well.
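The general shape of such a simplified thermal code can be sketched with a generic explicit finite-difference scheme for 1-D conduction into soil. All property values below are assumptions for illustration; the paper's actual helical-coil model is more involved:

```python
# Explicit 1-D finite-difference model of heat diffusing from a storage
# wall into surrounding soil (illustrative parameters, not the paper's).

alpha = 1.0e-6                 # assumed soil thermal diffusivity, m^2/s
dx = 0.05                      # grid spacing, m
dt = 0.4 * dx**2 / alpha       # time step within the explicit stability limit
N = 40                         # grid points spanning 2 m of soil

T = [10.0] * N                 # initial (undisturbed) soil temperature, deg C
T[0] = 50.0                    # storage wall held at the charging temperature

for _ in range(500):           # march the diffusion equation forward in time
    Tn = T[:]
    for i in range(1, N - 1):
        Tn[i] = T[i] + alpha * dt / dx**2 * (T[i+1] - 2.0 * T[i] + T[i-1])
    Tn[0], Tn[-1] = 50.0, 10.0 # fixed-temperature boundary conditions
    T = Tn
```

The stability constraint alpha*dt/dx² ≤ 0.5 is what makes such explicit codes simple enough to run interactively on a personal computer, at the cost of small time steps.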

01 Mar 1992
TL;DR: The implications of packaging and interconnection technologies for reduced instruction set computing (RISC) microprocessor memory hierarchies are examined and a first-order model is developed that allows interactive investigation of tradeoffs at prenetlist phases of design.
Abstract: This report presents a prototype early analysis tool for exploring trade-offs in cache architecture and packaging and interconnection (P/I) technology for RISC microprocessor based systems. We define early analysis as the evaluation of system trade-offs, including implementation technology effects, at pre-netlist phases of development. Prior work in cache performance estimation and P/I modeling is combined and extended to be more specific to RISC systems. After describing the model, several case studies are presented. Although limited by the accuracy of the first-order model as well as by assumptions and estimations regarding input data, these studies indicate general trends and quantify several important trade-offs for multi-chip module (MCM) systems. MCM characteristics favor larger off-chip caches, with improved performance as well as yield advantages. When combined with flip-chip mounting, MCM technology also permits cache system architectural changes which can significantly improve performance. The prototype is not intended as a design tool, but rather to demonstrate the utility and importance of an interactive early analysis tool combining architecture and implementation technology issues. Suggestions for future tool development are made.
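The kind of first-order trade-off such a tool evaluates can be illustrated with a textbook average-memory-access-time model. The cycle counts below are invented for illustration and are not the report's data; the point is only the shape of the trade-off, in which cheaper package crossings tilt the balance toward a larger off-chip cache:

```python
# First-order cache model: average memory access time (AMAT) in cycles.

def amat(hit_time, miss_rate, miss_penalty):
    """AMAT = hit time + miss rate * miss penalty (all in cycles)."""
    return hit_time + miss_rate * miss_penalty

# Illustrative numbers (assumed): a small on-chip cache misses often; a
# large off-chip cache misses rarely but pays a package crossing on every
# access, which MCM packaging makes cheaper than a PCB crossing.
on_chip   = amat(hit_time=1, miss_rate=0.10, miss_penalty=40)
mcm_cache = amat(hit_time=3, miss_rate=0.02, miss_penalty=40)
pcb_cache = amat(hit_time=6, miss_rate=0.02, miss_penalty=40)

print(on_chip, mcm_cache, pcb_cache)
```

With these assumed numbers the off-chip cache loses on a PCB but wins on an MCM, mirroring the report's qualitative conclusion that MCM characteristics favor larger off-chip caches.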

Journal ArticleDOI
TL;DR: In this article, a 2D model of the solar cell was developed, based on an approximate analysis that allows the obtaining of a closed-form solution of the fundamental transport equations of the device.
Abstract: A 2-D model of the grooved solar cell has been developed, based on an approximate analysis that allows a closed-form solution of the fundamental transport equations of the device to be obtained. This model represents a flexible tool for the analysis of the cell, for understanding the basis of its operation, and as a design tool. In order to assess the validity of the proposed model, the analytical solution is also compared with a full 2-D solution obtained from PISCES simulation.

Proceedings ArticleDOI
05 Nov 1992
TL;DR: In this article, a simulation design strategy for both 1-D and 2-D composite array configurations, with specific emphasis placed on the latter structure, is described, and their imaging potential for applications in underwater visualization systems, in which the array will be configured as part of a scanning system, is investigated.
Abstract: Ceramic-epoxy composite transducers, in the form of a matrix of piezoceramic rods embedded in an epoxy substrate, offer significant advantages for sonar array design in the frequency range 100 kHz - 2 MHz. Good matching to the water load, coupled with high sensitivity, low lateral crosstalk, and wideband performance are extremely attractive features for modern array design. This paper describes a simulation design strategy for both 1-D and 2-D composite array configurations, with specific emphasis placed on the latter structure. The development and subsequent evaluation of an interactive software design tool for the performance assessment of composite arrays, and in particular, their imaging potential for applications in underwater visualization systems, in which the array will be configured as part of a scanning system, is investigated. Firstly, finite element analysis (FEA) is used to evaluate the various factors which relate to the micro-structure of the composite material i.e., volume fraction, and the size and number of ceramic rods under the electrode. A linear systems approach is then adopted to investigate the macro-structure of the composite transducer, and considers the effects of material composition; ceramic-epoxy volume fraction and transducer backing impedance. Finally, the scattering responses from arbitrary target structures are considered, using practical array dimensions, as specified by the FEA and linear systems analysis. The model employed for target scattering is capable of simulating a wide range of composite configurations, and also of varying mechanical and electrical load conditions. In addition, it permits the analysis of realistic target scenarios, comprising an arbitrary arrangement of arcs, circles, and rectangles. A range of examples are presented, including both 1-D and 2-D transducer array configurations, and a selection of imaging results obtained via a 10 X 10 2-D array operating at 1.2 MHz. 
Good agreement between theory and experiment is obtained. © 1992 SPIE--The International Society for Optical Engineering.
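The kind of linear-systems array modelling the abstract describes can be illustrated with a minimal far-field calculation. The sketch below computes the array factor of a 10 × 10 planar array at 1.2 MHz, the configuration reported in the paper; the speed of sound in water, the half-wavelength element pitch, and the treatment of elements as point sources are assumptions for illustration, not parameters from the paper.

```python
import numpy as np

# Hypothetical parameters (not from the paper): sound speed in water and
# half-wavelength element pitch for a 10 x 10 array at 1.2 MHz.
C_WATER = 1500.0          # m/s, assumed speed of sound in water
F = 1.2e6                 # Hz, operating frequency from the abstract
WAVELENGTH = C_WATER / F  # ~1.25 mm
PITCH = WAVELENGTH / 2.0  # assumed element spacing
N = 10                    # elements per side (10 x 10 array)

def array_factor(theta_deg, phi_deg=0.0):
    """Normalised far-field array factor of an N x N planar array of
    point sources at broadside, for polar angle theta in plane phi."""
    theta = np.radians(theta_deg)
    phi = np.radians(phi_deg)
    k = 2.0 * np.pi / WAVELENGTH
    # Element positions centred on the array origin
    idx = np.arange(N) - (N - 1) / 2.0
    x, y = np.meshgrid(idx * PITCH, idx * PITCH)
    # Phase of each element's contribution in the far field
    psi = k * (x * np.sin(theta) * np.cos(phi) + y * np.sin(theta) * np.sin(phi))
    # Coherent sum of unit-amplitude elements (uniform apodisation)
    return np.abs(np.sum(np.exp(1j * psi))) / N**2

print(array_factor(0.0))   # broadside response is exactly 1.0
print(array_factor(30.0))  # response falls off away from the main lobe
```

A full linear-systems model would additionally convolve this spatial response with each element's electromechanical transfer function and the backing impedance effects discussed in the abstract.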

Proceedings ArticleDOI
01 Nov 1992
TL;DR: The authors describe an approach towards modeling the heterogeneous aspects of design environments which is based on a paradigm of separation and integration, yielding an adequate, well structured, non-redundant, and integrated design model for generic design environments.
Abstract: The authors describe an approach towards modeling the heterogeneous aspects of design environments which is based on a paradigm of separation and integration, yielding an adequate, well structured, non-redundant, and integrated design model for generic design environments. The design model consists of five partial models: (1) design flow model; (2) design tool model; (3) design structure model; (4) design object model; and (5) design subject model. The design structure model is introduced for modeling, on a higher level of abstraction, exactly those aspects of design objects which are necessary for design methodology management. The applied concept is called design object abstraction.

Book
01 Jan 1992
TL;DR: This cutting-edge resource explores the design of RT level components; the application of these components in a core-based design; and the development of a complete processor design with its hardware and software as a core in a system-on-a-chip (SoC).
Abstract: The classic VHDL: Modular Design and Synthesis of Cores and Systems has been fully updated to cover methodologies of modern design and the latest uses of VHDL for digital system design. The book shows you how to utilize VHDL to create specific constructs for specific hardware parts, focusing on VHDL's new libraries and packages. This cutting-edge resource explores the design of RT-level components; the application of these components in a core-based design; and the development of a complete processor design, with its hardware and software, as a core in a system-on-a-chip (SoC). Filled with over 150 illustrations, VHDL: Modular Design and Synthesis of Cores and Systems features an entire toolkit for register-transfer-level digital system design and testbench development techniques. New to this edition: coverage of the latest uses of VHDL for digital system design, design of IP cores, interactive and self-checking testbench development, and VHDL's new libraries and packages.


Book ChapterDOI
01 Jan 1992
TL;DR: A scenario demonstrating how DIDS will be used to build configuration systems is presented, with a set of tools capable of rapidly constructing configuration-design systems from a library of reusable software elements, called mechanisms.
Abstract: This paper describes the Domain-Independent Design System (DIDS). DIDS provides a set of tools capable of rapidly constructing configuration-design systems from a library of reusable software elements, called mechanisms. The power of DIDS comes from its model of configuration design, which enables reusable mechanisms to be identified. DIDS contains four components. The first component, the Problem-Solving-Method (PSM) Editor, builds PSMs by combining mechanisms. The Code Generator, DIDS's second component, generates a problem solver from the PSM description created in the editor. The third component, the Knowledge-Acquisition Tool Generator, builds a knowledge acquisition (KA) tool that interviews the domain expert to gather the knowledge required by the DIDS-generated problem solver. The final component, the Debugging Tool, monitors the execution of the problem solver to uncover errors made during KA and to improve the performance of the design tool. This paper presents a scenario demonstrating how DIDS will be used to build configuration systems.
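The idea of assembling a configuration-design problem solver from reusable mechanisms can be sketched in miniature. The toy below is not DIDS itself: the mechanism names (`propose`, `verify`), the catalogue, and the constraints are all hypothetical, and a real PSM would combine many more mechanism types. It only illustrates the pattern of chaining small, reusable steps into a problem solver.

```python
# Toy sketch (names and data hypothetical, not DIDS's actual API):
# a problem-solving method (PSM) assembled from reusable "mechanisms".

def propose(state, catalogue):
    """Mechanism: propose the first catalogue part not yet configured."""
    for part in catalogue:
        if part not in state["configured"]:
            return part
    return None

def verify(state, constraints):
    """Mechanism: check the current configuration against all constraints."""
    return all(c(state["configured"]) for c in constraints)

def run_psm(catalogue, constraints):
    """PSM built by chaining mechanisms: propose parts until none remain,
    keeping only those that leave the configuration consistent."""
    state = {"configured": []}
    while (part := propose(state, catalogue)) is not None:
        state["configured"].append(part)
        if not verify(state, constraints):
            state["configured"].pop()  # reject the inconsistent part
            catalogue = [p for p in catalogue if p != part]
    return state["configured"]

# Hypothetical example: at most two boards within a power budget of 10.
catalogue = [("cpu", 6), ("dsp", 5), ("io", 3)]
constraints = [
    lambda cfg: len(cfg) <= 2,
    lambda cfg: sum(p[1] for p in cfg) <= 10,
]
print(run_psm(catalogue, constraints))  # [('cpu', 6), ('io', 3)]
```

In DIDS terms, the PSM Editor would select and wire such mechanisms, and the Code Generator would emit the resulting solver.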


Journal ArticleDOI
TL;DR: A hypermedia-based intelligent system for design, with particular reference to fluid film journal bearings, is described; it was developed by integrating hypermedia, artificial intelligence, and intelligent database technologies.

Journal Article
TL;DR: A design tool currently being developed to improve traffic circulation so as to encourage a safer and more efficient pattern of movement is described, based on a heuristic algorithm that attempts to minimize the overall level of 'conflict' generated through the interaction between traffic streams on an urban network.
Abstract: Conventional traffic models can be used to predict how road-users will respond to traffic management measures, but they are not capable of making intelligent decisions about how the measures themselves should be configured. This paper describes a design tool currently being developed to improve traffic circulation so as to encourage a safer and more efficient pattern of movement. It is based on a heuristic algorithm that attempts to minimize the overall level of 'conflict' generated through the interaction between traffic streams on an urban network.
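A heuristic that minimizes the overall level of conflict between interacting traffic streams can be illustrated with a toy junction model. Everything below is an assumption for illustration, not the paper's algorithm: the crossing rule, the product-of-flows conflict measure, and the greedy "ban the worst movement" step are hypothetical stand-ins for the (unspecified) heuristic the abstract describes.

```python
import itertools

# Toy model (all data hypothetical): streams at a 4-arm junction are
# (approach, exit) arm pairs with a flow; two streams "conflict" when
# they cross, and the conflict level is the product of their flows.

def crosses(s1, s2):
    """Assumed crossing rule: streams on arms 0..3 cross unless they
    share an approach or exit arm (a deliberate simplification)."""
    (a1, e1), (a2, e2) = s1, s2
    return len({a1, e1, a2, e2}) == 4

def total_conflict(streams, flows):
    """Sum of pairwise conflict levels over all crossing stream pairs."""
    return sum(flows[s1] * flows[s2]
               for s1, s2 in itertools.combinations(streams, 2)
               if crosses(s1, s2))

def greedy_ban(streams, flows, n_bans=1):
    """Greedy heuristic: repeatedly ban the movement whose removal
    most reduces the junction's total conflict."""
    streams = list(streams)
    for _ in range(n_bans):
        best = min(streams, key=lambda s: total_conflict(
            [t for t in streams if t != s], flows))
        streams.remove(best)
    return streams

flows = {(0, 2): 400, (1, 3): 300, (0, 1): 100}
streams = list(flows)
print(total_conflict(streams, flows))  # only (0,2) and (1,3) cross
print(greedy_ban(streams, flows))      # banning one leaves no crossings
```

A network-level version would repeat such local moves across junctions while rerouting the displaced flows, which is where the heuristic (rather than exhaustive) character of the approach matters.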