
Showing papers presented at "Computer Aided Systems Theory in 1993"


Book ChapterDOI
22 Feb 1993
TL;DR: This paper presents an approach to the representation of the evolution of these systems, based on Le Moigne's theory of the General System, and a prototype developed in an object-oriented programming language which implements the concepts used.
Abstract: The more commonly used software development methods assign the activities and results of the process of modification and evolution of software systems to maintenance. In this paper we present an approach to the representation of the evolution of these systems, based on Le Moigne's theory of the General System, and a prototype developed in an object-oriented programming language which implements the concepts used.

11 citations


Book ChapterDOI
22 Feb 1993
TL;DR: An architecture is proposed that facilitates automatic generation of different plans of sequencing operations, synthesis of action plans for robots servicing the devices, synthesis of the workcell's simulation model, and verification of control variants based on simulation of the overall cell's architecture.
Abstract: A comprehensive framework for design of an intelligent cell-controller requires integration of several layers of support methods and tools. We have proposed an architecture that facilitates an automatic generation of different plans of sequencing operations, synthesis of action plan for robots servicing the devices, synthesis of the workcell's simulation model, and verification of control variants based on simulation of the overall cell's architecture. The real-time discrete event simulator is used next to generate a sequence of future events of the virtual cell in a given time-window. These events are compared with current states of the real cell and are used to predict motion commands of robots and to monitor the process flow. The architecture, called Computer Assisted Workcell, offers support methods and tools at the following layers of control: organization, coordination, and execution.

9 citations


Book ChapterDOI
22 Feb 1993
TL;DR: General automata are considered with respect to normalization over semirings, and the identity of possibilistic automata strongly consistent with stochastic automata is shown.
Abstract: General automata are considered with respect to normalization over semirings. Possibilistic automata are defined as normal pessimistic fuzzy automata. Possibilistic automata are analogous to stochastic automata where stochastic (+/×) semirings are replaced by possibilistic (∨/∧) semirings; but where stochastic automata must be normal, fuzzy automata may be (resulting in possibilistic automata) or may not be (resulting in fuzzy automata proper). While possibilistic automata are direct generalizations of nondeterministic automata, stochastic automata are not. Some properties of possibilistic automata are considered, and the identity of possibilistic automata strongly consistent with stochastic automata is shown.
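The semiring substitution this abstract describes can be sketched in a few lines (an illustrative reconstruction, not code from the paper; all names and weights are assumptions): one generic path-evaluation routine yields a stochastic automaton when instantiated with the (+, ×) semiring and a possibilistic automaton when instantiated with the (∨, ∧), i.e. (max, min), semiring.

```python
# Illustrative reconstruction (names and weights are assumptions, not from
# the paper): one generic path-evaluation routine, instantiated with the
# stochastic (+,*) semiring and with the possibilistic (max,min) semiring.

def run(T, init, final, word, add, mul, zero):
    """Weight of `word` over all state paths, in a generic semiring."""
    v = list(init)                          # current state-weight vector
    for sym in word:
        nxt = [zero] * len(v)
        for i, wi in enumerate(v):
            for j, tij in enumerate(T[sym][i]):
                nxt[j] = add(nxt[j], mul(wi, tij))
        v = nxt
    total = zero
    for wj, fj in zip(v, final):
        total = add(total, mul(wj, fj))
    return total

init, final = [1.0, 0.0], [0.0, 1.0]
# Stochastic normalization: each row sums to 1.
T_stoch = {'a': [[0.6, 0.4], [1.0, 0.0]]}
# Possibilistic normalization: each row attains the value 1.
T_poss = {'a': [[1.0, 0.4], [1.0, 0.0]]}

p_stoch = run(T_stoch, init, final, "aa",
              add=lambda x, y: x + y, mul=lambda x, y: x * y, zero=0.0)
p_poss = run(T_poss, init, final, "aa", add=max, mul=min, zero=0.0)
```

Note how the two normalization conditions differ, as the abstract emphasizes: a stochastic row must sum to 1, while a possibilistic (normal fuzzy) row must contain a 1.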

8 citations


Book ChapterDOI
22 Feb 1993
TL;DR: A feasibility study concerning the possibility of integrating Medtool (an expert system shell) with STIMS (an object-oriented modelling and simulation environment) is presented; the relationship between the formalisms implemented by the two environments makes it possible to relate them to each other.
Abstract: A feasibility study concerning the possibility of integrating Medtool (an expert system shell) with STIMS (an object-oriented modelling and simulation environment) is presented. The relationship between the formalisms implemented by the two environments has made it possible to relate the formalisms to each other. The generalized magnitudes formalism on which Medtool is based appears as a formalism that may be considered for systems modelling and simulation, introducing new Artificial Intelligence features into Computer Aided Systems Theory. Finally, having associated the concepts in one formalism with the concepts in the other, the integration of the two shells can be accomplished in an easier fashion and with more meaning.

7 citations


Book ChapterDOI
22 Feb 1993
TL;DR: A new theoretical approach to intensional descriptions of extensionally defined subsets, which consists of finding a property, based on the attributes associated with the elements, that allows classifying all the elements of the domain into their corresponding subsets.
Abstract: In this paper we present a new theoretical approach to intensional descriptions of extensionally defined subsets. In this approach some subsets of a certain domain are initially defined in an extensional way, such that for each element of the subsets some attributes are known (the attributes are the same for each element). Obtaining an intensional description of the defined subsets consists of finding a property, based on the attributes associated with the elements, that allows classifying all the elements of the domain into their corresponding subsets.

5 citations


Book ChapterDOI
22 Feb 1993
TL;DR: In this article, a brief introduction to the notion of categorical shape theory to model approximating situations is given, along with a brief explanation of shape theory in the context of complex systems.
Abstract: Categorical modelling is a useful tool in the study of systems. To motivate this, interpretation of elementary category theory ideas are illustrated in terms of time-varying complex systems. A brief introduction to the notion of categorical shape theory to model approximating situations is given.

5 citations


Book ChapterDOI
22 Feb 1993
TL;DR: A meta model to describe project models, based on the theory of many sorted algebra, is introduced, which is applied to describe parts of the meta models of Softlab's MAESTROII and IBM's ADPS.
Abstract: An integral component of a Software Engineering Environment (SEE) is a project model, which is a description of a class of intended processes. Changing the SEE need not mean losing the already established project model: via migration it is possible to transfer the project model from one SEE to another. In order to manage the migration process, a meta model to describe project models, based on the theory of many-sorted algebra, is introduced. As an example, the theory is applied to describe parts of the meta models of Softlab's MAESTROII and IBM's ADPS.

4 citations


Book ChapterDOI
22 Feb 1993
TL;DR: It appears that Computer-Aided Systems Technology (CAST) may provide the necessary powerful scientific framework to handle the inherent complexity associated with such requirements of next generation intelligent machines.
Abstract: Next generation intelligent machines which will work autonomously or semi-autonomously as well as computer-aided problem solving environments including computer-aided design and computer-aided software system design environments will have sophisticated computerization requirements. It appears that Computer-Aided Systems Technology (CAST) may provide the necessary powerful scientific framework to handle the inherent complexity associated with such requirements. A system theoretic concept, coupling, has been shown to provide a rich paradigm for this purpose. The following types of couplings are discussed: Data coupling, stamp coupling, control coupling, external coupling, common coupling, and content coupling; 1-system coupling, n-system coupling; external and internal couplings; nested, feedback, cascade, and conjunctive couplings; time-varying coupling and multimodels; coupling applied to events, processes, and multimodel process models; time-varying coupling and multifacetted models; hard and soft couplings; and coupling and object-oriented formalism.

4 citations


Book ChapterDOI
22 Feb 1993
TL;DR: This paper states requirements posed on a CAST (Computer Aided Systems Theory) tool for reactive system design, discusses existing approaches, and evaluates some of the commercially available tools with respect to the requirements.
Abstract: There is a general consensus on the lack of appropriate methods and tools to support complex reactive system design, i.e., design of systems which are, to a large extent, event-driven, continuously having to react to external and internal stimuli. Design methods and tools developed for transformational systems have been shown to be inadequate for complex reactive system design. These methods have no means to represent the event-driven behavior of reactive systems and they do not provide any analysis and test methods essential for reactive system design. In this paper we want to give a short statement of the problem of complex reactive system design from our systems theory point of view. We state requirements we pose on a CAST (Computer Aided Systems Theory) tool for reactive system design and discuss existing approaches. We also evaluate some of the commercially available tools with respect to the requirements.

4 citations


Book ChapterDOI
22 Feb 1993
TL;DR: An approach for the FSM state assignment problem based on an enhanced algorithm for shift register realizations is presented, reducing the hardware overhead for testability purposes by exploiting special optimization potentials.
Abstract: An approach for the FSM state assignment problem based on an enhanced algorithm for shift register realizations is presented. An FSM is transformed into a generalized shift register structure, where memory elements are put together into one or more shift registers. The proposed procedure considers a final scan path architecture of FSM memory cells already during state assignment, reducing the hardware overhead for testability purposes by exploiting special optimization potentials. Theoretically founded criteria are used to cancel the computation at an early stage if no fruitful shift register realization is possible. Experimental results for two-level and multi-level implementations are given for MCNC benchmark machines.

3 citations


Book ChapterDOI
M. Kilger, T. Dietl
22 Feb 1993
TL;DR: A hierarchically structured real-time image sequence processing architecture is presented in which low-level parameters are adapted according to the results of the high-level scene interpretation, which ensures an optimised performance in varying operating conditions.
Abstract: A hierarchically structured real-time image sequence processing architecture is presented in which low-level parameters are adapted according to the results of the high-level scene interpretation. This adaptive parameter control ensures an optimised performance in varying operating conditions. These general concepts are applied to traffic monitoring, where long outdoor sequences have to be analysed. The algorithms have been extensively evaluated on our real-time image processing system.

Book ChapterDOI
22 Feb 1993
TL;DR: Any generalizing theory will have scientific and practical validity only if it reduces theoretical complexities in special theories caused by inherent limitations and provides new effective analysis methods and algorithms for the solution of important practical problems.
Abstract: Any generalizing theory will have scientific and practical validity only if it:
1. reduces theoretical complexities in special theories caused by some inherent limitations;
2. contains the special theories as special cases; moreover, results obtained beforehand must be reproduced on a new, wider scale;
3. leads to new theoretical results that are impossible in the special theories; moreover, it raises new questions and problems as it solves the old ones;
4. provides new effective analysis methods and algorithms for the solution of important practical problems.

Book ChapterDOI
22 Feb 1993
TL;DR: This work considers some one-dimensional diffusion processes arising in single neurons' activity modelling and discusses some of the related theoretical and computational first passage time problems.
Abstract: In this work we consider some one-dimensional diffusion processes arising in single neurons' activity modelling and discuss some of the related theoretical and computational first passage time problems. With reference to the Wiener and the Ornstein-Uhlenbeck processes, we outline some theoretical methods and algorithmic procedures. In particular, the relevance of the computational methods to infer about asymptotic trends of the firing pdf is pointed out.
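As a toy companion to the models discussed in this abstract, the first passage time of an Ornstein-Uhlenbeck process through a firing threshold can be estimated by plain Euler-Maruyama simulation (a hedged sketch; all parameter values and names are illustrative assumptions, not taken from the paper).

```python
# Hedged sketch: Monte Carlo estimate of the first passage time (FPT) of an
# Ornstein-Uhlenbeck process through a firing threshold, in the spirit of
# the neuronal models above. All parameter values are illustrative.
import random

def ou_first_passage(theta=1.0, mu=1.2, sigma=0.5, x0=0.0,
                     threshold=1.0, dt=1e-3, t_max=50.0, rng=None):
    """Euler-Maruyama simulation of dX = theta*(mu - X) dt + sigma dW,
    stopped when X first crosses `threshold` (capped at t_max)."""
    rng = rng or random.Random()
    x, t = x0, 0.0
    while x < threshold and t < t_max:
        x += theta * (mu - x) * dt + sigma * rng.gauss(0.0, dt ** 0.5)
        t += dt
    return t

rng = random.Random(0)
times = [ou_first_passage(rng=rng) for _ in range(200)]
mean_fpt = sum(times) / len(times)  # crude estimate of the mean firing time
```

With the drift level mu above the threshold, almost every trajectory fires well before the cap, so the sample mean is a rough stand-in for the firing pdf's mean; the paper's concern with asymptotic trends of the firing pdf is precisely what such brute-force simulation handles poorly.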

Book ChapterDOI
22 Feb 1993
TL;DR: A fundamental concept in Artificial Vision and Image Processing is that of Complete Description of a data field, which is determined by the receptive fields and the functional performed on them.
Abstract: A fundamental concept in Artificial Vision and Image Processing is that of Complete Description of a data field. From the analytical point of view, completeness requires conservation of the number of degrees of freedom of the visual environment. This constancy has allowed us to establish a special conservation principle, which is determined by the receptive fields and the functional performed on them.

Book ChapterDOI
22 Feb 1993
TL;DR: This paper shows how formal reasoning tools may be used to help address this complexity problem and allow the designer to explore the design space with impunity, thanks to the rigour afforded by the mathematical formalism, in the sure knowledge that the final design behaviour will satisfy the specification.
Abstract: The complexity of future systems level problems is driving the need for alternative approaches to examining the problem of architectural synthesis at higher levels of abstraction. In this paper we show how formal reasoning tools may be used to help address this complexity problem and allow the designer to explore the design space with impunity, thanks to the rigour afforded by the mathematical formalism, in the sure knowledge that the final design behaviour will satisfy the specification.

Book ChapterDOI
22 Feb 1993
TL;DR: It is shown that the minimum identification problem is polynomially transformable into a problem of determining a simplest congruence of the so-called basic hypothesis, and it is proved that the method produces a weakly exclusive set of simplest hypotheses.
Abstract: The paper proposes a problem transformation method for solving the minimum automaton identification problem. An algebraic characterization of a set of all simplest hypotheses explaining a given set of input-experiments is performed. It is shown that the minimum identification problem is polynomially transformable into a problem of determining a simplest congruence of the so-called basic hypothesis. It is proved that the method produces a weakly exclusive set of simplest hypotheses.

Book ChapterDOI
22 Feb 1993
TL;DR: The issues of building up a hierarchical knowledge representation of the environment with limited sensor input that can be actively acquired by an agent capable of interacting with the environment are examined.
Abstract: One of the fundamental issues in building autonomous agents is to be able to sense, represent and react to the world. Some of the earlier work [Mor83, Elf90, AyF89] has aimed towards a reconstructionist approach, where a number of sensors are used to obtain input that is used to construct a model of the world that mirrors the real world. Sensing and sensor fusion was thus an important aspect of such work. Such approaches have had limited success, and some of the main problems were the issues of uncertainty arising from sensor error and errors that accumulated in metric, quantitative models. Recent research has therefore looked at different ways of examining the problems. Instead of attempting to get the most accurate and correct model of the world, these approaches look at qualitative models to represent the world, which maintain relative and significant aspects of the environment rather than all aspects of the world. The relevant aspects of the world that are retained are determined by the task at hand which in turn determines how to sense. That is, task directed or purposive sensing is used to build a qualitative model of the world, which though inaccurate and incomplete is sufficient to solve the problem at hand. This paper examines the issues of building up a hierarchical knowledge representation of the environment with limited sensor input that can be actively acquired by an agent capable of interacting with the environment. Different tasks require different aspects of the environment to be abstracted out. For example, low level tasks such as navigation require aspects of the environment that are related to layout and obstacle placement. For the agent to be able to reposition itself in an environment, significant features of spatial situations and their relative placement need to be kept. 
For the agent to reason about objects in space, for example to determine the position of one object relative to another, the representation needs to retain information on relative locations of start and finish of the objects, that is endpoints of objects on a grid. For the agent to be able to do high level planning, the agent may need only the relative position of the starting point and destination, and not the low level details of endpoints, visual clues and so on. This indicates that a hierarchical approach would be suitable, such that each level in the hierarchy is at a different level of abstraction, and thus suitable for a different task. At the lowest level, the representation contains low level details of agent's motion and visual clues to allow the agent to navigate and reposition itself. At the next level of abstraction the aspects of the representation allow the agent to perform spatial reasoning, and finally the highest level of abstraction in the representation can be used by the agent for high level planning.

Book ChapterDOI
22 Feb 1993
TL;DR: This paper addresses the question concerning the direction in which traditional Systems Theory should be extended in order to meet the requirements of new technologies which have been developed for the engineering professions and proposes an extension to newly established fields such as automation engineering and mechatronics.
Abstract: This paper addresses the question concerning the direction in which traditional Systems Theory should be extended in order to meet the requirements of new technologies which have been developed for the engineering professions. It is of primary importance to include the available computer technology. Although computer science and computer engineering offer a highly developed theoretical framework, it seems that new approaches, which more fully take the systems aspects into account, have to be developed. In the near future an extension of the traditional Systems Theory is required to enable it to provide theoretical support to the application of "Systems Technology" for complex computer-based engineering systems. Different activities such as the IEEE task force on CBSE or the ESPRIT project "ATMOSPHERE" show that there is common awareness of this problem area. Of specific interest would be such an extension to newly established fields such as automation engineering and mechatronics.

Book ChapterDOI
22 Feb 1993
TL;DR: The diversity, specificity and complexity of anatomo-physiological contacts and the variety of local processes carried out by those contacts make one think of authentic subcellular microcomputation.
Abstract: Linear and non-linear local computation with self-programming facilities is the most widely used model of biological neural nets. The diversity, specificity and complexity of anatomo-physiological contacts (dendrite-dendrite, axon-axon, axon-dendrite,...) and the variety of local processes carried out by those contacts make one think of authentic subcellular microcomputation.

Book ChapterDOI
22 Feb 1993
TL;DR: The objective of the method is to add rigor to the prevailing inexactness in the development of heterogeneous systems by introducing well-defined formal synchronization points — the models — into the design process.
Abstract: This paper describes an approach to the design of heterogeneous Hardware-Software systems. It defines a strict sequence of transformations that begins with a system specification, and leads to an implementation of the system. The steps in the sequence are defined by system models on decreasing levels of abstraction, and every step in the sequence transforms an input model into an output model. The final output model is equivalent to the system implementation. The objective of our method is to add rigor to the prevailing inexactness in the development of heterogeneous systems by introducing well-defined formal synchronization points — the models — into the design process. The feasibility of our approach has been proven with two examples. Further we discuss the use of existing tools to support our method.

Book ChapterDOI
22 Feb 1993
TL;DR: The goals of this paper are to develop effective algorithms to achieve acceptable performance in what is called Visually Detectable Defects (V.D.D.), and to define a test strategy from systems concepts.
Abstract: The goals of this paper are to develop effective algorithms that achieve acceptable performance on what we call Visually Detectable Defects (V.D.D.), and to define a test strategy from systems concepts. The main problem in V.D.D. has finally been identified as that of the detection and labelling of subtle changes in the texture of an image. In consequence, none of the standard procedures for texture discrimination gave good results, so it was decided to increase the complexity of the process to provide a new decision level.

Book ChapterDOI
Zdenek Zdrahal
22 Feb 1993
TL;DR: It is demonstrated that different variations of the basic diagnostic problem are defined by the choice of the epistemological level of the system and the model.
Abstract: The aim of the paper is to provide an overview of various alternatives to consistency-based diagnostic reasoning and to pinpoint their relationship with General Systems Theory (GST). The inconsistencies between the observations of the system to be diagnosed and predictions provided by its model are used to calculate all possible culprits, i.e. to diagnose the system. It is demonstrated that different variations of the basic diagnostic problem are defined by the choice of the epistemological level of the system and the model. Proposing the most informative next measurement is an important part of diagnosis. The paper also indicates how the techniques of model-based diagnosis can be applied to program debugging.

Book ChapterDOI
22 Feb 1993
TL;DR: Qualitative simulation, the main reasoning technique used in qualitative reasoning, necessarily generates spurious solutions and therefore cannot be used as a substitute for empirical evidence.
Abstract: Qualitative simulation, the main reasoning technique used in qualitative reasoning, necessarily generates spurious solutions [9,

Book ChapterDOI
Erwin M. Thurner
22 Feb 1993
TL;DR: A graph-based representation of Timing Diagrams has been developed, the Trigger-Graph, for constructing a bus interface, where the partial protocols of a bus are considered by their meaning and are represented as a Simple Operation.
Abstract: While constructing a bus interface, mainly two problems arise: On the one hand, the protocols of both sides of the interface have to be described; on the other hand, the transformation between these protocols has to be done. To cope with the first problem, a graph-based representation of Timing Diagrams has been developed, the Trigger-Graph. For the second problem, the partial protocols of a bus are considered by their meaning and are represented as a Simple Operation (SIOP). The SIOPs are independent of their realisation and can be transformed into each other by formal means.

Book ChapterDOI
22 Feb 1993
TL;DR: A model of a standard cordless telephone system is presented and results are presented describing the behaviour of the system both qualitatively and quantitatively, which derive a general environment capable of supporting the modelling and performance analysis of a range of such systems.
Abstract: A model of a standard cordless telephone system is presented and analysed. Some results are presented describing the behaviour of the system both qualitatively and quantitatively. By considering the problems involved in treating this complex stochastic system in some generality, we proceed to derive a general environment capable of supporting the modelling and performance analysis of a range of such systems.

Book ChapterDOI
22 Feb 1993
TL;DR: An automatic adjustment procedure of the sampling period that tries to maintain constant the number of samples during the rise time is presented and the results obtained in simulation confirm the possibilities of the method.
Abstract: The sampling period can be considered as an additional parameter in the design of a controller. In this paper we present an automatic adjustment procedure for the sampling period that tries to keep the number of samples during the rise time constant. If the sampling period is modified, the parameters of the discrete model will obviously change. However, from the knowledge of the initial sampling period, the final sampling period and the initial values of the parameters, it is possible to deduce a good approximation of the final parameters once the sampling period has been adjusted. In this sense a new approximate method, very simple to implement, is proposed. The results obtained in simulation confirm the possibilities of the method.
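For a first-order plant the idea behind this abstract can be sketched as follows (an assumed illustration, not the paper's exact procedure; all names and numbers are ours): choose the new period so that a fixed number of samples N spans the measured rise time, and map the old discrete pole, a = exp(-T/tau), to the new period via a_new = a_old**(T_new/T_old).

```python
# Illustrative sketch (not the paper's exact procedure): keep N samples
# during the rise time, then approximate the new discrete pole from the old
# one. For a first-order plant a = exp(-T/tau), so changing the period from
# T_old to T_new rescales the pole as a_new = a_old**(T_new/T_old).

def adjust_sampling(t_rise, n_samples, T_old, a_old):
    T_new = t_rise / n_samples          # N samples span the rise time
    a_new = a_old ** (T_new / T_old)    # approximate new discrete pole
    return T_new, a_new

# Example: measured rise time 0.8 s, target 10 samples,
# old period 0.1 s, old discrete pole 0.9.
T_new, a_new = adjust_sampling(0.8, 10, 0.1, 0.9)
```

The appeal matches the abstract's claim of a method "very simple to implement": the new parameters follow from the two periods and the old parameters alone, with no re-identification of the plant.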

Book ChapterDOI
22 Feb 1993
TL;DR: A repository-based CAST-CAD environment which uses Finite State Automata-based tools is described, tackling complexity issues as well as modelling and model processing formalisms.
Abstract: Design is a model-based activity and is essential in any engineering field. Computer-Aided Design (CAD) can benefit from Computer-Aided Systems Technology (CAST), since the systems theories on which CAST is based provide powerful bases to tackle complexity issues as well as modelling and model processing formalisms. In computerization, another type of complexity, tool interface complexity, arises. For a system where n software tools communicate, the order of the interface complexity is n², which may become unmanageable. With the use of a repository, the tool interface complexity is reduced to n. The article describes an example repository-based CAST-CAD environment which uses Finite State Automata-based tools.
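The n² versus n argument in this abstract is easy to make concrete (a minimal illustration; the function names are ours, not the article's): pairwise communication needs a translator for every ordered tool pair, while a repository needs only one adapter per tool against its common format.

```python
# Minimal illustration of the interface-complexity argument: n tools
# communicating pairwise need n*(n-1) directed translators (order n^2);
# with a shared repository, each tool needs one adapter (order n).

def pairwise_interfaces(n):
    return n * (n - 1)      # one translator per ordered tool pair

def repository_interfaces(n):
    return n                # one adapter per tool, to the common format

for n in (5, 10, 20):
    print(n, pairwise_interfaces(n), repository_interfaces(n))
```

Already at 20 tools the pairwise scheme needs 380 translators against 20 repository adapters, which is the "unmanageable" growth the article invokes.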

Book ChapterDOI
22 Feb 1993
TL;DR: A Bundle Graph representation of nonlinear systems is defined and is used to solve problems in control theory with algorithmic methods.
Abstract: A Bundle Graph representation of nonlinear systems is defined and is used to solve problems in control theory with algorithmic methods.

Book ChapterDOI
22 Feb 1993
TL;DR: Methods of program design for the automation of practical industrial processes are discussed and knowledge-based techniques by means of the programming language Prolog are introduced to enable logic reasoning as a tool to be used in program construction and verification.
Abstract: Methods of program design for the automation of practical industrial processes are discussed. The Grafcet standard is used as a means to reduce the complexity of large programming tasks and to formalise the state-machine structure often used in the control of discrete-event dynamic systems. Knowledge-based techniques by means of the programming language Prolog are introduced to enable logic reasoning as a tool to be used in program construction and verification.

Book ChapterDOI
22 Feb 1993
TL;DR: The polynomial systems theory for time-varying and distributed parameter systems is applied to design of a vision based control system for exact positioning of a camera above a moving object.
Abstract: The polynomial systems theory for time-varying and distributed parameter systems is briefly introduced. The theory is applied to design of a vision based control system for exact positioning of a camera above a moving object. The algebraic structure needed here is a ring of skew polynomials with skew polynomial coefficients. Both an observer for estimating nonmeasurable outputs and a stabilizing controller are designed.