Journal ArticleDOI

A theory of discontinuities in physical system models

TL;DR: Develops an algorithm for inferring the correct new mode and state variable values in the hybrid modeling framework, and a verification scheme, based on the principles of divergence of time and temporal evolution in behavior transitions, that ensures hybrid models conform to physical system principles.
Abstract: Physical systems are by nature continuous, but often display nonlinear behaviors that make them hard to analyze. Typically, these nonlinearities occur at a time scale that is much smaller than the time scale at which gross system behavior needs to be described. In other situations, nonlinear effects are small and of a parasitic nature. To achieve efficiency and clarity in building complex system models, and to reduce computational complexity in the analysis of system behavior, modelers often abstract away any parasitic component parameter effects, and analyze the system at more abstract time scales. However, these abstractions often introduce abrupt, instantaneous changes in system behavior. To accommodate mixed continuous and discrete behavior, this paper develops a hybrid modeling formalism that dynamically constructs bond graph model fragments that govern system behavior during continuous operation. When threshold values are crossed, a meta-level control model invokes discontinuous state and model configuration changes. Discontinuities violate physical principles of conservation of energy and continuity of power, but the principle of invariance of state governs model behavior when the control module is active. Conservation of energy and continuity of power again govern behavior generation as soon as a new model configuration is established. This allows for maximally constrained continuous model fragments. The two primary contributions of this paper are an algorithm for inferring the correct new mode and state variable values in the hybrid modeling framework, and a verification scheme that ensures hybrid models conform to physical system principles based on the principles of divergence of time and temporal evolution in behavior transitions. These principles are employed in energy phase space analysis to verify physical consistency of models.
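The abstract's execution pattern (continuous evolution under the active model fragment, a meta-level transition when a threshold is crossed, then resumed continuous behavior) can be illustrated with a minimal sketch. This is not the paper's bond-graph formalism; the function names, the explicit Euler integrator, and the bouncing-ball example are illustrative assumptions only.

```python
# Hypothetical sketch of a hybrid simulation loop: the active mode's model
# fragment generates continuous behavior until a guard (threshold) triggers,
# then a meta-level step selects the new mode and maps the state.

def simulate(modes, guards, reset, mode, x, t_end, dt=1e-3):
    """modes: mode -> derivative function f(x); guards: mode -> predicate;
    reset: (mode, x) -> (new_mode, new_x), applied when the guard triggers."""
    t, trace = 0.0, [(0.0, mode, list(x))]
    while t < t_end:
        f = modes[mode](x)
        x = [xi + dt * fi for xi, fi in zip(x, f)]  # continuous (Euler) step
        t += dt
        if guards[mode](x):                         # threshold crossed:
            mode, x = reset(mode, x)                # discrete transition
        trace.append((t, mode, list(x)))
    return trace

# Bouncing ball: free fall until the ball reaches the floor while moving
# down, then an instantaneous velocity reversal; position carries over
# unchanged across the transition (the state-invariance analogue here).
modes  = {"fall": lambda s: [s[1], -9.81]}          # s = [height, velocity]
guards = {"fall": lambda s: s[0] <= 0.0 and s[1] < 0.0}

def bounce(mode, s):
    return mode, [0.0, -0.8 * s[1]]                 # restitution 0.8

trace = simulate(modes, guards, bounce, "fall", [1.0, 0.0], t_end=2.0)
```

Because the reset runs before the state is recorded, the recorded trajectory never penetrates the floor, mirroring how the meta-level control model re-establishes a physically consistent configuration before continuous behavior generation resumes.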
Citations
Journal ArticleDOI
01 Nov 1999
TL;DR: Monitoring, prediction, and fault isolation methods for abrupt faults in complex dynamic systems are developed and successfully applied to monitoring of the secondary sodium cooling loop of a fast breeder reactor.
Abstract: The complexity of present day embedded systems (continuous processes controlled by digital processors), and the increased demands on their reliability motivate the need for monitoring and fault isolation capabilities in the embedded processors. This paper develops monitoring, prediction, and fault isolation methods for abrupt faults in complex dynamic systems. The transient behavior in response to these faults is analyzed in a qualitative framework using parsimonious topological system models. Predicted transient effects of hypothesized faults are captured in the form of signatures that specify future faulty behavior as higher order time-derivatives. The dynamic effects of faults are analyzed by a progressive monitoring scheme until transient analysis mechanisms have to be suspended in favor of steady state analysis. This methodology has been successfully applied to monitoring of the secondary sodium cooling loop of a fast breeder reactor.
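The signature idea described above can be sketched in a few lines: each fault hypothesis predicts qualitative signs for a measurement's deviation and its time-derivatives, and hypotheses inconsistent with the observed transient are discarded. This is an illustrative toy, not the paper's method; the fault names and signature values are invented for the example.

```python
# Hypothetical signature-based fault isolation: signatures map each fault
# to predicted qualitative signs (-1, 0, +1) of the measurement deviation
# and its first derivative; observations are qualified and matched.

def sign(v, eps=1e-6):
    """Qualify a numeric value as -1, 0, or +1."""
    return 0 if abs(v) < eps else (1 if v > 0 else -1)

def isolate(signatures, observed):
    """Keep only the fault hypotheses whose predicted signature matches
    the qualified observation tuple exactly."""
    obs = tuple(sign(v) for v in observed)
    return {fault for fault, sig in signatures.items() if sig == obs}

signatures = {
    "leak":        (-1, -1),   # below nominal and falling
    "blockage":    (-1, +1),   # below nominal but recovering
    "sensor-bias": (+1,  0),   # offset with no transient
}

# Observed: deviation is negative, first derivative is positive.
candidates = isolate(signatures, observed=(-0.4, 0.9))  # -> {"blockage"}
```

A progressive scheme in the paper's spirit would re-run this match as higher-order derivative estimates become reliable, shrinking the candidate set over time.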

280 citations


Cites background or methods from "A theory of discontinuities in phys..."

  • ...The modeling methodology is further developed by Mosterman and Biswas [11], [13]....

  • ...Abrupt changes in the parameter values of energy storage elements may cause an abrupt change in some measured variables [10], [13]....

  • ...Conservation of state [13] was applied when capacitance and inductance failures occurred....

Proceedings Article
09 Aug 2003
TL;DR: This paper presents a methodology for online tracking and diagnosis of hybrid systems and demonstrates the effectiveness of the approach with experiments conducted on the fuel-transfer system of fighter aircraft.
Abstract: Recent years have seen a proliferation of embedded systems that combine a digital (discrete) supervisory controller with an analog (continuous) plant. Diagnosing faults in such hybrid systems requires techniques that are different from those used for discrete and continuous systems. In addition, these algorithms have to be deployed online to meet the real-time requirements of embedded systems. This paper presents a methodology for online tracking and diagnosis of hybrid systems. We demonstrate the effectiveness of the approach with experiments conducted on the fuel-transfer system of fighter aircraft.

261 citations

Journal ArticleDOI
01 May 2007
TL;DR: In this paper, the authors present a methodology for online tracking and diagnosis of hybrid systems that combine digital (discrete) supervisory controllers with analog (continuous) plants, and demonstrate the effectiveness of the approach with experiments conducted on the fuel transfer system of fighter aircraft.
Abstract: Techniques for diagnosing faults in hybrid systems that combine digital (discrete) supervisory controllers with analog (continuous) plants need to be different from those used for discrete or continuous systems. This paper presents a methodology for online tracking and diagnosis of hybrid systems. We demonstrate the effectiveness of the approach with experiments conducted on the fuel-transfer system of fighter aircraft.

245 citations

Journal ArticleDOI
TL;DR: The need for finding generic approaches for modular, stable, and accurate coupling of simulation units, as well as expressing the adaptations required to ensure that the coupling is correct, is identified.
Abstract: Modeling and simulation techniques are today extensively used in both industry and science. Parts of larger systems are, however, typically modeled and simulated by different techniques, tools, and algorithms. In addition, experts from different disciplines use various modeling and simulation techniques. Both these facts make it difficult to study coupled heterogeneous systems. Co-simulation is an emerging enabling technique, where global simulation of a coupled system can be achieved by composing the simulations of its parts. Due to its potential and interdisciplinary nature, co-simulation is being studied in different disciplines but with limited sharing of findings. In this survey, we study the state-of-the-art techniques for co-simulation, with the goal of enhancing future research and highlighting the main challenges. To study this broad topic, we start by focusing on discrete-event-based co-simulation, followed by continuous-time-based co-simulation. Finally, we explore the interactions between these two paradigms in hybrid co-simulation. To survey the current techniques, tools, and research challenges, we systematically classify recently published research literature on co-simulation and summarize it into a taxonomy. As a result, we identify the need for finding generic approaches for modular, stable, and accurate coupling of simulation units, as well as expressing the adaptations required to ensure that the coupling is correct.
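The coupling of simulation units that this survey studies can be illustrated with a minimal Jacobi-type co-simulation master: each unit advances over a communication step using the other unit's last exchanged output held constant, then outputs are exchanged. The unit interface and the toy first-order dynamics below are assumptions for the sketch, not anything prescribed by the survey.

```python
# Hypothetical Jacobi co-simulation master for two units. Each "unit" is a
# closure holding internal state; step(u, h) advances it by one
# communication step h with coupling input u and returns its output.

def make_unit(x0, k):
    """Toy unit with internal state x and dynamics dx/dt = k * (u - x)."""
    state = {"x": x0}
    def step(u, h):
        state["x"] += h * k * (u - state["x"])  # one explicit Euler step
        return state["x"]
    return step

def cosimulate(step_a, step_b, ya, yb, t_end, h):
    """Jacobi scheme: both units step from the values exchanged at time t,
    then the new outputs are exchanged for the next step."""
    t, trace = 0.0, [(ya, yb)]
    while t < t_end - 1e-12:
        ya, yb = step_a(yb, h), step_b(ya, h)   # RHS uses old ya, yb
        t += h
        trace.append((ya, yb))
    return trace

unit_a = make_unit(1.0, 1.0)
unit_b = make_unit(0.0, 1.0)
trace = cosimulate(unit_a, unit_b, 1.0, 0.0, t_end=10.0, h=0.1)
```

With these symmetric dynamics the two outputs relax toward a common value while their sum is preserved exactly by the scheme, a small instance of the stability and accuracy questions the survey raises about coupling simulation units.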

210 citations

References
Book
01 Jan 1974
TL;DR: This text introduces the basic data structures and programming techniques often used in efficient algorithms, and covers the use of lists, push-down stacks, queues, trees, and graphs.
Abstract: From the Publisher: With this text, you gain an understanding of the fundamental concepts of algorithms, the very heart of computer science. It introduces the basic data structures and programming techniques often used in efficient algorithms, and covers the use of lists, push-down stacks, queues, trees, and graphs. Later chapters go into sorting, searching, and graph algorithms, string-matching algorithms, and the Schonhage-Strassen integer-multiplication algorithm. Provides numerous graded exercises at the end of each chapter.

9,262 citations

Journal ArticleDOI
TL;DR: A general framework for the formal specification and algorithmic analysis of hybrid systems is presented, which considers symbolic model-checking and minimization procedures that are based on the reachability analysis of an infinite state space.

2,091 citations

Book
01 Jan 1978
TL;DR: DeMarco's "Structured Analysis and System Specification" is the final paper chosen for inclusion in this book of classic articles on the structured revolution, and provides a good summary of the state of the art in structured analysis.
Abstract: DeMarco's "Structured Analysis and System Specification" is the final paper chosen for inclusion in this book of classic articles on the structured revolution. It is the last of three on the subject of analysis, and, together with Ross/Schoman [Paper 22] and Teichroew/Hershey [Paper 23], provides a good idea of the direction that structured analysis will be taking in the next few years. Any competent systems analyst undoubtedly could produce a five-page essay on "What's Wrong with Conventional Analysis." DeMarco, being an ex-analyst, does so with pithy remarks, describing conventional analysis as follows: "Instead of a meaningful interaction between analyst and user, there is often a period of fencing followed by the two parties' studiously ignoring each other... The cost-benefit study is performed backwards by deriving the development budget as a function of expected savings. (Expected savings were calculated by prorating cost reduction targets handed down from On High.)" In addition to providing refreshing prose, DeMarco's approach differs somewhat --- in terms of emphasis --- from that of Teichroew/Hershey and of Ross/Schoman. Unlike his colleagues, DeMarco stresses the importance of the maintainability of the specification. Take, for instance, the case of one system consisting of six million lines of COBOL and written over a period of ten years by employees no longer with the organization. Today, nobody knows what the system does. Not only have the program listings and source code been lost --- a relatively minor disaster that we all have seen too often --- but the specifications are completely out of date. Moreover, the system has grown so large that neither the users nor the data processing people have the faintest idea of what the system is supposed to be doing, let alone how the mysterious job is being accomplished!
The example is far from hypothetical, for this is the fate that all large systems eventually will suffer, unless steps are taken to keep the specifications both current and understandable across generations of users. The approach that DeMarco suggests --- an approach generally known today as structured analysis --- is similar in form to that proposed by Ross and Schoman, and emphasizes a top-down, partitioned, graphic model of the system-to-be. However, in contrast to Ross and Schoman, DeMarco also stresses the important role of a data dictionary and the role of scaled-down specifications, or minispecs, to be written in a rigorous subset of the English language known as Structured English. DeMarco also explains carefully how the analyst proceeds from a physical description of the user's current system, through a logical description of that same system, and eventually into a logical description of the new system that the user wants. Interestingly, DeMarco uses top-down, partitioned dataflow diagrams to illustrate this part of the so-called Project Life Cycle --- thus confirming that such a graphic model can be used to portray virtually any system. As in other short papers on the subject, the details necessary for carrying out DeMarco's approach are missing or are dealt with in a superficial manner. Fortunately, the details can be found: Listed at the end of the paper are references to three full-length books and one videotape training course, all dealing with the kind of analysis approach recommended by DeMarco.

1,655 citations

Journal ArticleDOI
Johan de Kleer1, John Seely Brown1
TL;DR: A fairly encompassing account of qualitative physics, which introduces causality as an ontological commitment for explaining how devices behave, and presents algorithms for determining the behavior of a composite device from the generic behavior of its components.

1,550 citations

Journal ArticleDOI
01 Dec 1989

1,381 citations