
Showing papers on "Formal system published in 2000"


Book ChapterDOI
25 Mar 2000
TL;DR: KIV is a tool for formal systems development, supporting the development of safety-critical systems from formal requirements specifications to executable code, including the verification of safety requirements and the correctness of implementations.
Abstract: KIV is a tool for formal systems development. It can be employed, for example, for the development of safety-critical systems from formal requirements specifications to executable code, including the verification of safety requirements and the correctness of implementations; for semantical foundations of programming languages, from a specification of the semantics to a verified compiler; and for building security models and architectural models as they are needed for high-level ITSEC [7] or CC [1] evaluations.

137 citations


Journal ArticleDOI
TL;DR: It is argued that there is a formal system for common-sense conception that underlies the acquisition of an important class of generic knowledge and represents the stable knowledge the authors have about kinds of things.

135 citations



Book
01 Mar 2000
TL;DR: This work proposes the πL-calculus, an extension of the π-calculus, as a formal foundation for software composition, defines a language in terms of it, and illustrates how this language can be used to plug components together.
Abstract: A composition language based on a formal semantic foundation will facilitate precise specification of glue abstractions and compositions, and will support reasoning about their behaviour. The semantic foundation, however, must address a set of requirements: encapsulation, objects as processes, components as abstractions, plug compatibility, a formal object model, and scalability. In this work, we propose the πL-calculus, an extension of the π-calculus, as a formal foundation for software composition, define a language in terms of it, and illustrate how this language can be used to plug components together.

56 citations


Journal ArticleDOI
TL;DR: This paper systematically reviews studies published from 1985 through 1998 on relationships involving self-, informal, and formal care within these settings. The findings suggest that formal services are not used to displace or substitute for informal care, but rather tend to supplement and complement the care provided by the informal network.
Abstract: Increasing emphasis is being placed on the need to have older adults, their families and formal service providers work together collaboratively or “in partnerships” to provide long-term care, both in community and residential care settings. There is therefore a need to determine how such relationships are currently structured. This paper systematically reviews the results of studies published from 1985 through 1998 on relationships involving self-, informal and formal care within these settings. The findings suggest that formal services are not used to displace or substitute for informal care but rather, that formal services tend to be used to supplement and complement the care provided by the informal network. This is true both in community and residential care settings. Exactly how these partnerships are structured and the relationships between self-care and both informal and formal systems of care are less clear. The findings point to a need to refocus attention away from the creation of partnerships and protecting against unnecessary substitution, towards broader concerns with supporting the partnerships that already exist.

49 citations


01 Jan 2000
TL;DR: This thesis presents the design of the MetaPRL formal system, which builds on the experience with logical frameworks and with structured programming concepts like inheritance and re-use to provide an efficient, highly abstract, logical machine.
Abstract: This thesis is primarily about the design of formal programming environments for building large software systems. The work presented here is based on two essential concepts. First, design methods for large software systems must include multiple languages, methodologies, and refinement techniques that are suited to problem subdomains. This means that any formal system must provide the ability to define multiple logics, and it is by definition a logical framework. Second, the framework must provide the ability to express formal relations between logical theories to address the problem of system decomposition. This thesis presents the design of the MetaPRL formal system. The MetaPRL design goal has been to provide a modular, abstract logical framework where multiple design methods can be expressed and related. The MetaPRL design builds on our experience with logical frameworks and with structured programming concepts like inheritance and re-use to provide an efficient, highly abstract, logical machine. The contribution includes several parts. (1) The development of an untyped meta-logic using explicit substitution. (2) The definition of a very-dependent function type in the Nuprl type theory. The very-dependent function type provides the logical specification of theories and modules, and it provides the basis for compositional reasoning. (3) A system architecture for generic multi-logical development. The architecture ties together a refiner (the theorem prover in a PRL system), a logical library of system components, an interactive proof system, and a compiler. (4) A generic refiner that provides automation and enforcement for the multiple logical theories in a logical environment. The refiner provides the basis for defining and relating logics, and it is also a critical performance bottleneck. We have been able to keep the refiner generic but efficient. (5) A module system for logics and theories. The module system provides an inheritance mechanism that is the basis for re-use of proofs and programs as well as for expressing relations between theories. (6) A generic distributed interactive theorem prover. The distribution mechanism is tactic-based, interactive, and multi-logical. System faults, such as machine and network failures, are handled transparently, allowing a developer to make maximum use of the available computational resources for development.

49 citations


Book ChapterDOI
14 Aug 2000
TL;DR: The aim of this paper is to design the theoretical models required for the extension of Formal Concept Analysis to any kind of lattice-structured set of properties (“generalized formal concept”), and especially the case of predicates.
Abstract: Formal Concept Analysis is an algebraic model based on a Galois connection that is used for symbolic knowledge exploration from an elementary form of the representation of data (“formal context”). The aim of this paper is to design the theoretical models required for the extension of Formal Concept Analysis to any kind of lattice-structured set of properties (“generalized formal concept”), and especially the case of predicates. Beyond their theoretical interest, the models aim at better solving real applied problems thanks to an improvement of the knowledge representation skills.
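The Galois connection underlying classical FCA can be made concrete with a small sketch. The toy context below (animals and their attributes) is invented for illustration; the two derivation operators map object sets to shared attributes and back, and their closed pairs are the formal concepts.

```python
from itertools import chain, combinations

# Toy formal context: object -> set of attributes it has (invented data).
context = {
    "duck":  {"flies", "swims"},
    "eagle": {"flies", "hunts"},
    "shark": {"swims", "hunts"},
}
attributes = {"flies", "swims", "hunts"}

def intent(objs):
    # Attributes shared by all objects in objs; vacuously all of them
    # for the empty object set.
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def extent(attrs):
    # Objects possessing every attribute in attrs.
    return {o for o, have in context.items() if attrs <= have}

# Closing every object subset under the two derivations yields all
# formal concepts (extent, intent) of the context.
concepts = set()
for objs in chain.from_iterable(combinations(context, r) for r in range(len(context) + 1)):
    i = intent(set(objs))
    concepts.add((frozenset(extent(i)), frozenset(i)))

for e, i in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(e), "<->", sorted(i))
```

Ordering the resulting concepts by extent inclusion gives the concept lattice that the generalized models in the paper replace with an arbitrary lattice of properties.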

49 citations


Journal ArticleDOI
TL;DR: There is strong evidence that in signed language classifiers the authors have what, from the point of view of traditional (spoken-language based) linguistics, is a qualitatively new communication mode: formal, structured systems of visual representation that exist side-by-side with linguistic modalities, within the total signed communication system.
Abstract: It is argued that signed communication systems differ from spoken ones in having not one but two structured systems of representation. In addition to the linguistic mode (which is shared with spoken communication, and which appears to be fundamentally identical across spoken and signed modalities), signers also command distinctive, formal systems of schematic visual representation. These are the forms of signing known as classifier predicates. For the past two decades, signed classifier predicates have been modeled as linguistic. However, the basic formal units of such signing, the combination of these units, and their breakdown, all differ both from patterns seen in other signed forms that have long been recognized as linguistic, and from the classic patterns of language in general. Classifier predicates continue to be modeled as linguistic mostly on the basis of assumptions about alternatives, specifically about the form and acquisition of systems of visual-spatial representation. These assumptions are shown to be incorrect. Signed classifiers are shown to correspond in many respects not merely to visual representation, but to a particular strategy of depiction known as schematic visual representation. This is the mode of depiction that appears to be most natural for both children and adults to master, and that is commonly seen in drawing. There is thus strong evidence that in signed language classifiers we have what, from the point of view of traditional (spoken-language based) linguistics, is a qualitatively new communication mode: formal, structured systems of visual representation that exist side-by-side with linguistic modalities, within the total signed communication system.

47 citations


Book ChapterDOI
13 Aug 2000
TL;DR: In this paper, a generalization of FCA is proposed, in which sets of attributes are replaced by expressions of an almost arbitrary logic, and all FCA can be reconstructed on this basis.
Abstract: We propose a generalization of Formal Concept Analysis (FCA) in which sets of attributes are replaced by expressions of an almost arbitrary logic. We prove that all FCA can be reconstructed on this basis. We show that from any logic that is used in place of sets of attributes can be derived a contextualized logic that takes into account the formal context and that is isomorphic to the concept lattice. We then justify the generalization of FCA compared with existing extensions and in the perspective of its application to information systems.

46 citations


Journal ArticleDOI
TL;DR: The influence of Richard Bellman's work on the authors' research is felt primarily in the attitude they take towards it: whenever the right mathematical tool for a problem under consideration is not available, some mathematical invention is required.

33 citations


Journal ArticleDOI
TL;DR: The main result is a generalization of the classical Splitting Lemma that leads to an algorithm for computing formal invariants and solutions in a more efficient way.
Abstract: In this paper we present a new method for the formal reduction of linear differential systems. We generalize classical results and concepts and obtain new characterizations of existing notions. Our main result is a generalization of the classical Splitting Lemma. This leads to an algorithm for computing formal invariants and solutions in a more efficient way.

Journal ArticleDOI
TL;DR: There has been a debate about whether or not Knut Wicksell had put forward a theory of capital and interest that is closed in the sense that the data, or independent variables, from which he started suffice to determine the unknowns, or dependent variables, especially the "natural" rates of wages, rents, and interest as discussed by the authors.
Abstract: Triggered by a stimulating paper by Bo Sandelin (1980), there has been a debate about Knut Wicksell's theory of capital and distribution that is known as "Wicksell's missing equation." To this debate have contributed, among others, Sandelin (1980, 1982), Takashi Negishi (1982a; 1982b; 1985, chap. 9), Larry Samuelson (1982), and Tom Kompas (1992, chap. 4). The issue under consideration is whether or not Knut Wicksell had put forward a theory of capital and interest that is closed in the sense that the data, or independent variables, from which he started suffice to determine the unknowns, or dependent variables, especially the "natural" rates of wages, rents, and interest. The mentioned authors claim that there is one equation "missing" in Wicksell's theory and that his formal system of equations is underdetermined. The question then is how to close the system in a way that is faithful to Wicksell. The authors under consideration differ in terms of the closures they suggest.

Journal ArticleDOI
TL;DR: In this paper, the relation between formal topology and the theory of domains is discussed, and it is shown that some open problems in one of the two fields could already have a solution in the other, and that is why an intensification of contact should be rewarding.

Proceedings ArticleDOI
15 Mar 2000
TL;DR: This work applies the formal specification technique cTLA which is based on L. Lamport's Temporal Logic of Actions, TLA to complement the UML based design by additional formal models which refine UML diagrams to precise formal models.
Abstract: The Unified Modeling Language UML is well-suited for the design of real-time systems. In particular the design of dynamic system behaviors is supported by interaction diagrams and statecharts. Real-time aspects of behaviors can be described by time constraints. The semantics of the UML, however, is non-formal. In order to enable formal design verification, we therefore propose to complement the UML based design by additional formal models which refine UML diagrams to precise formal models. We apply the formal specification technique cTLA which is based on L. Lamport's Temporal Logic of Actions, TLA. In particular cTLA supports modular definitions of process types and the composition of systems from coupled process instances. Since process composition has superposition character each process system has all of the relevant properties of its constituting processes. Therefore mostly small subsystems are sufficient for the verification of system properties and it is not necessary to use complete and complex formal system models. We present this approach by means of an example and also exemplify the formal verification of its hard real-time properties.

Book
01 Jan 2000
TL;DR: It is pointed out that the construction and analysis of computer programs currently serve as objects of systems theory research, and why this is so is explained.
Abstract: In the first two chapters I repeatedly pointed out that the construction and analysis of computer programs currently serve as objects of systems theory research, and explained why this is so. Here I would like to sum up systematically once again the several reasons for this state of affairs.

Book
01 Jan 2000
TL;DR: The main results in this paper are the theorems that formulate the notion of compositionality and the completeness of the derivation system that supports this property in a component-based system.
Abstract: Motivated by our earlier work on the IWIM model and the Manifold language, in this paper, we attend to some of the basic issues in component-based software. We present a formal model for such systems, a formal-logic-based component interface description language that conveys the observable semantics of components, a formal system for deriving the semantics of a composite system out of the semantics of its constituent components, and the conditions under which this derivation system is sound and complete. Our main results in this paper are the theorems that formulate the notion of compositionality and the completeness of the derivation system that supports this property in a component-based system.

Proceedings ArticleDOI
04 Sep 2000
TL;DR: A new integration proposal is defined, based on first-order dynamic logic, that integrates both levels of the modeling-notation architecture into a single conceptual framework and allows both static and dynamic aspects of either the model or the modeled system to be expressed within a first-order formalism.
Abstract: Describes and classifies the different solutions that have been proposed to realize the integration of graphic modeling languages, which are known and accepted by software developers, with formal modeling languages having analysis and verification tools. Inspired by that classification, we define a new integration proposal, based on first-order dynamic logic. The principal benefits of the proposed formalization can be summarized as follows. (i) The different views of a system are integrated in a single formal model; this allows us to define compatibility rules between the separate views, on both a syntactic and a semantic level. (ii) Using formal manipulation, it is possible to deduce further knowledge from the specification. (iii) Faults in the specifications, expressed using a user-friendly notation, can be revealed using analysis and verification techniques based on the formal kernel model. The principal difference between this model and other object-oriented formal models is that it integrates both of the levels in the modeling notation architecture into a single conceptual framework. The integration of modeling entities and modeled entities into a single formalism allows us to express both static and dynamic aspects of either the model or the modeled system within a first-order formalism.

Journal ArticleDOI
TL;DR: To be suitable for verification of so-called co-operating systems, a modified type of satisfaction relation for system properties (approximate satisfaction) is considered.
Abstract: Behaviour of systems is described by formal languages: the sets of all sequences of actions. Regarding abstraction, alphabetic language homomorphisms are used to compute abstract behaviours. To avoid loss of important information when moving to the abstract level, abstracting homomorphisms have to satisfy a certain property called simplicity on the concrete (i.e. not abstracted) behaviour. To be suitable for verification of so-called co-operating systems, a modified type of satisfaction relation for system properties (approximate satisfaction) is considered. The well-known state-space explosion problem is tackled by a compositional method formalized by so-called co-operation products of formal languages.
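The abstraction step described above can be sketched in a few lines. The action names and the toy behaviour below are invented, not from the paper: an alphabetic homomorphism maps each concrete action to an abstract action, with internal actions mapped to the empty word so that they disappear at the abstract level.

```python
# Alphabetic homomorphism: concrete action -> abstract action
# ("" hides an internal action). Names are illustrative only.
h = {"send_req": "req", "retry": "", "recv_ack": "ack"}

# A behaviour is a set of action sequences (two toy runs here).
behaviour = [
    ["send_req", "recv_ack"],           # run without retransmission
    ["send_req", "retry", "recv_ack"],  # run with an internal retry
]

# Apply h letter by letter; empty images are dropped.
abstract_behaviour = {tuple(h[a] for a in run if h[a]) for run in behaviour}

print(abstract_behaviour)
```

Here the two concrete runs collapse to the single abstract run ('req', 'ack'); whether such a collapse loses verification-relevant information is exactly what a condition like the paper's simplicity property is meant to control.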

01 Jan 2000
TL;DR: Newton, as mentioned in this paper, established a close mapping between a formal system and a set of aspects of our world, which allowed complex inferences to be made about many different aspects of the world.
Abstract: Perhaps the most important legacy of Newton is in the use of formal systems. He established, for the first time, a close mapping between a formal system and a set of aspects of our world. This mapping allowed complex inferences to be made about many different aspects of the world. This went beyond merely exploring mathematics for its own sake and beyond using mathematics to merely express relations found in data. In fact, the mapping seemed to be so accurate in so many different ways, explaining so many diverse phenomena, making so many novel yet correct predictions about the world and bringing so many different types of phenomena together into a single descriptive framework, that it was taken (with some justification) for the literal truth. Here the full potential of formal systems was revealed.

Proceedings ArticleDOI
16 Apr 2000
TL;DR: The purpose of this paper is to present the PLATUS simulation environment, providing the tools for formal system verification and code generation of real control systems, making the development of such systems a straightforward step from specification.
Abstract: The purpose of this paper is to present the PLATUS simulation environment. PLATUS allows the formal modeling of control systems using graph grammars. In PLATUS, control systems are viewed as a collection of three distinct abstractions that allow the system designer to keep the specification of control elements free of simulation aspects, providing the tools for formal system verification and code generation of real control systems, making the development of such systems a straightforward step from specification.

Journal ArticleDOI
TL;DR: The tolerance of imprecision necessarily required in many real‐life software systems can be represented in the clear and structured mathematics of the fuzzy information granulation theory within the extended framework of formal methods.
Abstract: Formal methods are used to improve the quality of complex computer software by documenting system specifications in a precise and structured manner; the most popular specification language for formal methods is Z. However, based on classical set theory and classical logic, this mathematical language can only deal effectively with well-defined problems. This is a limitation that classical set operators and classical predicate logic impose on formal methods. In this paper, the theory of fuzzy information granulation is discussed with an attempt to build toward flexible formal software specifications in which many aspects of human reasoning and natural language can be effectively addressed in mathematical terms. In other words, the tolerance of imprecision necessarily required in many real-life software systems can be represented in the clear and structured mathematics of fuzzy information granulation theory within the extended framework of formal methods.
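The contrast between a crisp and a graded specification clause can be illustrated with a toy sketch (invented example, not from the paper): a vague requirement like "the response arrives in about one second" becomes a membership function returning a degree in [0, 1] rather than a true/false predicate.

```python
def about_one_second(t):
    # Triangular fuzzy membership centred at 1.0 s with half-width 0.5 s
    # (an assumed shape, chosen purely for illustration).
    return max(0.0, 1.0 - abs(t - 1.0) / 0.5)

for t in (0.5, 0.9, 1.0, 1.25, 2.0):
    print(f"t = {t:.2f} s -> membership {about_one_second(t):.2f}")
```

A classical Z-style predicate would have to pick an arbitrary cutoff (say t <= 1.0), losing the gradation that the fuzzy granule preserves.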

Journal ArticleDOI
TL;DR: Evaluability assessments were used initially as a means of learning how Philadelphia delinquency prevention programs define success, and examination of programs' definitions of success over time has facilitated program development and policy making.
Abstract: In program evaluation, determining whether a program is reaching its goals is key to evaluating program success. Having an understanding of the stated goals of each organization within a system rather than simply knowing the formal system goals can aid in program development and inform policy making. Evaluability assessments were used initially as a means of learning how Philadelphia delinquency prevention programs define success. Subsequent surveys of programs were conducted to ascertain whether any changes in definitions of success had occurred. Examination of programs' definitions of success over time has facilitated program development and policy making.

01 May 2000
TL;DR: The main goal of this paper is to investigate the strong generative power of Lambek categorial grammars in the context of crossing dependencies, in view of the recent work of Tiede (1998).
Abstract: The relationship between strong and weak generative powers of formal systems is explored, in particular from the point of view of squeezing more strong power out of a formal system without increasing its weak generative power. We examine a whole range of old and new results from this perspective. However, the main goal of this paper is to investigate the strong generative power of Lambek categorial grammars in the context of crossing dependencies, in view of the recent work of Tiede (1998). Introduction: Strong generative power (SGP) relates to the set of structural descriptions (such as derivation trees, dags, proof trees, etc.) assigned by a formal system to the strings that it specifies. Weak generative power (WGP) refers to the set of strings characterized by the formal system. SGP is clearly the primary object of interest from the linguistic point of view. WGP is often used to locate a formal system within one or another hierarchy of formal grammars. Clearly a study of the relationship between WGP and SGP is highly relevant, both formally and linguistically. Although there has been interest in the study of this relationship almost from the beginning of the work in mathematical linguistics, the results are few, as this relationship is quite complex and not always easy to study mathematically (see Miller (1969) for a comprehensive discussion of SGP). Our main goals in this paper are (1) to look at some old and recent results and try to put them in a general framework, a framework that can best be described by the slogan "How to squeeze more strong generative power out of a grammatical system?", and (2) to present a new result concerning Lambek categorial grammars. Our general discussion of the relationship of SGP and WGP will be in the context of context-free grammars, categorial grammars, and lexicalized tree-adjoining grammars. (This work was partially supported by NSF Grant SBR8920230. SGP is also relevant in the context of annotated corpora: the annotations reflect aspects of SGP, not of the rules of a grammar and therefore not of WGP.) 
1. Context-Free Grammar (CFG): McCawley (1967) was the first person to point out that the use of context-sensitive rules by linguists was really for checking structural descriptions (thus related to SGP) and not for generating strings (i.e., WGP), suggesting that this use of context-sensitive rules possibly does not give more SGP than CFGs. Peters and Ritchie (1969) showed that this was indeed the case. These results are closely related to the notion of recognizable sets of trees (structural descriptions), as explained below. In a CFG G, the derivation trees of G correspond to the possible structural descriptions assignable by G. It is easily shown that there are tree sets whose yield language is context-free but which are not the tree sets of any CFG. That is, we are able to squeeze more strong power out of CFGs indirectly. Here is a simple example.
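The WGP/SGP distinction above can be illustrated with a toy sketch (invented here, not the paper's example): two grammars that generate the same strings (weakly equivalent) while assigning different derivation trees (not strongly equivalent).

```python
# G1: S -> a S | a   (right-branching trees)
# G2: S -> S a | a   (left-branching trees)
# Both have string language {a^n : n >= 1}; their tree sets differ.

def tree_g1(n):
    # Derivation tree of a^n under G1, as a nested tuple.
    return ("S", "a") if n == 1 else ("S", "a", tree_g1(n - 1))

def tree_g2(n):
    # Derivation tree of a^n under G2.
    return ("S", "a") if n == 1 else ("S", tree_g2(n - 1), "a")

def yield_of(t):
    # The string a tree derives: concatenation of its leaf labels.
    return "".join(yield_of(c) if isinstance(c, tuple) else c for c in t[1:])

for n in (1, 2, 3):
    # Same weak generative power on these strings...
    assert yield_of(tree_g1(n)) == yield_of(tree_g2(n)) == "a" * n

# ...but different structural descriptions for the same string.
print(tree_g1(3))
print(tree_g2(3))
```

Since the trees differ while the strings coincide, any claim about "squeezing more strong power" must compare tree sets, not string sets.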

Book ChapterDOI
01 Jan 2000
TL;DR: Both the constructive and predicative approaches to mathematics arose during the period of what was felt to be a foundational crisis in the early part of this century as discussed by the authors. But the positive redevelopment of mathematics along constructive, resp. predicative grounds did not emerge as really viable alternatives to classical, set-theoretically based mathematics until the 1960s.
Abstract: Both the constructive and predicative approaches to mathematics arose during the period of what was felt to be a foundational crisis in the early part of this century. Each critiqued an essential logical aspect of classical mathematics, namely concerning the unrestricted use of the law of excluded middle on the one hand, and of apparently circular “impredicative” definitions on the other. But the positive redevelopment of mathematics along constructive, resp. predicative grounds did not emerge as really viable alternatives to classical, set-theoretically based mathematics until the 1960s. Now we have a massive amount of information, to which this lecture will constitute an introduction, about what can be done by what means, and about the theoretical interrelationships between various formal systems for constructive, predicative and classical analysis.

Proceedings ArticleDOI
01 Mar 2000
TL;DR: This paper shows how Gurevich's Abstract State Machines (ASMs) have been successfully employed to describe the parallel semantics of the skeletons of P3L, completely abstracting from any implementation detail.
Abstract: Some formalizations have been developed in order to give a complete description of the parallel semantics of P3L, a programming language which ensures both task and data parallelism. However, the description level provided by the employed formalisms is either too 'abstract', unable to grasp the operational behavior of the skeletons of the language, or too 'concrete', close to implementation details and therefore related to a particular machine model. This paper shows how Gurevich's Abstract State Machines (ASMs) have been successfully employed to describe the parallel semantics of the skeletons of P3L, completely abstracting from any implementation detail. The proposed model describes both the operation of the compiler, which defines a network of processes starting from a given program, and the computation of the running processes. The model has also been taken as a theoretical basis to prove some of the most interesting rewriting rules used by the compiler to obtain better program performance and efficiency.

Journal ArticleDOI
TL;DR: In this article, the question whether confirmation transfer yields a logical system is answered in the negative, despite confirmation transfer having the standard properties of a consequence relation, on the grounds that validity of sequents in the system is not determined by the meanings of the connectives that occur in formulas.
Abstract: This article begins by exploring a lost topic in the philosophy of science: the properties of the relations "evidence confirming h confirms h′" and, more generally, "evidence confirming each of h1, h2, ..., hm confirms at least one of h1′, h2′, ..., hn′". The Bayesian understanding of confirmation as positive evidential relevance is employed throughout. The resulting formal system is, to say the least, oddly behaved. Some aspects of this odd behaviour the system has in common with some of the non-classical logics developed in the twentieth century. One aspect – its "parasitism" on classical logic – it does not, and it is this feature that makes the system an interesting focus for discussion of questions in the philosophy of logic. We gain some purchase on an answer to a recently prominent question, namely, what is a logical system? More exactly, we ask whether satisfaction of formal constraints is sufficient for a relation to be considered a (logical) consequence relation. The question whether confirmation transfer yields a logical system is answered in the negative, despite confirmation transfer having the standard properties of a consequence relation, on the grounds that validity of sequents in the system is not determined by the meanings of the connectives that occur in formulas. Developing the system in a different direction, we find it bears on the project of "proof-theoretic semantics": conferring meaning on connectives by means of introduction (and possibly elimination) rules is not an autonomous activity; rather, it presupposes a prior, non-formal notion of consequence. Some historical ramifications are also addressed briefly.
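The positive-relevance notion used in the article, and the non-obviousness of confirmation transfer, can be checked numerically. The small probability space below is invented for illustration: e confirms h, h confirms h2, yet e disconfirms h2, so confirmation does not automatically travel along a chain.

```python
from fractions import Fraction

# Uniform probability space over eight points (invented for illustration).
omega = set(range(1, 9))

def P(A):
    return Fraction(len(A), len(omega))

def P_cond(A, B):
    # P(A | B) on a uniform finite space.
    return Fraction(len(A & B), len(B))

e  = {1, 2, 3}   # evidence
h  = {3, 4}      # hypothesis confirmed by e
h2 = {4, 5, 6}   # hypothesis confirmed by h, yet disconfirmed by e

assert P_cond(h, e) > P(h)    # e confirms h:     1/3 > 1/4
assert P_cond(h2, h) > P(h2)  # h confirms h2:    1/2 > 3/8
assert P_cond(h2, e) < P(h2)  # e disconfirms h2:   0 < 3/8
print("confirmation does not transfer along the chain e -> h -> h2")
```

Exact rational arithmetic (fractions.Fraction) is used so the three comparisons hold by construction rather than up to floating-point error.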

Journal ArticleDOI
TL;DR: In this paper, the authors argue that if the Internet is to serve as a major communication, entertainment, and information medium in the 21st Century, a system is needed that integrates the strengths of both informal and formal systems of control while respecting the social, intellectual, and political freedom of the Internet community.
Abstract: The rapid growth of the Internet during the past 10 years has resulted in many disagreements over who should have the power to make and enforce the rules of on-line content and conduct, and what form such rules and enforcement should take: informal or formal. The extremes at which each of these potentially complementary systems of social control are currently practiced have contributed to an atmosphere of inconsistency, contradiction, uncertainty, and excessive discretion amongst state agencies, Internet service providers, system operators, and Internet users. If the Internet is to serve as a major communication, entertainment, and information medium in the 21st century, a system is needed that integrates the strengths of both informal and formal systems of control while respecting the social, intellectual, and political freedom of the Internet community.

Journal ArticleDOI
TL;DR: A formal logical system for sortal quantifiers, sortal identity and (second order) quantification over sortal concepts is formulated and the absolute consistency of the system is proved.
Abstract: A formal logical system for sortal quantifiers, sortal identity and (second order) quantification over sortal concepts is formulated. The absolute consistency of the system is proved. A completeness proof for the system is also constructed. This proof is relative to a concept of logical validity provided by a semantics, which assumes as its philosophical background an approach to sortals from a modern form of conceptualism.

Dissertation
01 Jan 2000
TL;DR: In this work particular emphasis is given to how representation models are constructed in Classical Mechanics and Nuclear Physics and what conceptual resources are used in their construction.
Abstract: Analyses of the nature and structure of scientific theories have predominantly focused on formalisation. The Received View of scientific theories considers theories as axiomatised sets of sentences. In Hilbert-style formalisation, theories are considered formal axiomatic calculi to which interpretation is supplied by a set of correspondence rules. The Received View has long been abandoned. The Semantic View of scientific theories also considers theories as formal systems. In the Semantic conception, a theory is identified with the class of intended models of the formal language, if the theory were to be given such linguistic form. The proponents of the Semantic View, however, hold that this class of models can be directly defined without recourse to a formal language. Just like its predecessor, the Semantic View is also not free of untenable implications. The uniting feature of the arguments in this work is the topic of theoretical representation of phenomena. The Semantic View implies that theoretical representation comes about by the use of some model, which belongs to the class that constitutes the theory. However, this is not what we see when we scrutinise the features of actual representation models in physics. In this work particular emphasis is given to how representation models are constructed in Classical Mechanics and Nuclear Physics and what conceptual resources are used in their construction. The characteristics that these models demonstrate instruct us that to regard them as families of theoretical models, as the Semantic View purports, is to obscure how they are constructed, what is used for their construction, how they function and how they relate to the theory. For instance, representation models are devices that frequently postulate physical mechanisms for which the theory does not provide explanations. 
Thus it seems more appropriate to claim that these representation devices mediate between theory and experiment, and at the same time possess a partial independence from theory. Furthermore, when we focus our attention to the ways by which representation models are constructed we discern that they are the result of the processes of abstraction and concretisation. These processes are operative in theoretical representation and they demand our attention if we are to explicate how theories represent phenomena in their domains.

Journal ArticleDOI
TL;DR: In this paper, the authors discuss inconsistencies between the statements in the text and those expressed in propositional logic and examine some of the conclusions reached by Patten in the context of dynamic systems theory.