
Showing papers on "Domain (software engineering) published in 1991"


Journal ArticleDOI
TL;DR: The utility of compositional modeling is illustrated by outlining the organization of a large-scale, multi-grain, multi-perspective model the authors have built for engineering thermodynamics, and by showing how the model composition algorithm can be used to automatically select the appropriate knowledge to answer questions in a tutorial setting.

467 citations


Proceedings ArticleDOI
24 Aug 1991
TL;DR: This paper proposes a scheme for integrating concrete domains into concept languages, defines a terminological and an assertional language, and considers the important inference problems of subsumption, instantiation, and consistency.
Abstract: A drawback of concept languages based on KL-ONE is that all the terminological knowledge has to be defined on an abstract logical level. In many applications, one would like to be able to refer to concrete domains and predicates on these domains when defining concepts. Examples of such concrete domains are the integers, the real numbers, or also non-arithmetic domains, and predicates could be equality, inequality, or more complex predicates. In the present paper we propose a scheme for integrating such concrete domains into concept languages rather than describing a particular extension by some specific concrete domain. We define a terminological and an assertional language, and consider the important inference problems such as subsumption, instantiation, and consistency. The formal semantics as well as the reasoning algorithms are given on the scheme level. In contrast to existing KL-ONE based systems, these algorithms are not only sound but also complete. They generate subtasks which have to be solved by a special purpose reasoner of the concrete domain.
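
The scheme's division of labour is easy to picture in code. Below is a minimal, purely illustrative sketch (all names invented, not the paper's calculus): a concept combines an abstract terminological part with concrete-domain predicates, and instance checking hands the concrete subtask off to a special-purpose reasoner, as the abstract describes.

```python
# Minimal sketch (not the paper's calculus): a concept combines abstract
# parents with a concrete-domain constraint; instance checking delegates
# the concrete part to the predicate's own decision procedure, mirroring
# the subtask generation described above. All names are illustrative.
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class ConcretePredicate:
    name: str
    holds: Callable[..., bool]   # decided by the concrete-domain reasoner

@dataclass
class Concept:
    abstract_parents: set        # purely terminological part
    concrete_constraints: list   # ((feature, value), predicate) pairs

# Concrete domain: the integers with a comparison predicate.
GEQ = ConcretePredicate("geq", lambda x, y: x >= y)

# "Adult" = Person whose age feature satisfies geq(age, 18).
ADULT = Concept(abstract_parents={"Person"},
                concrete_constraints=[(("age", 18), GEQ)])

def instantiates(individual: Dict[str, Any], concept: Concept) -> bool:
    """Instance check: abstract part by set inclusion, concrete part
    handed off to the concrete-domain reasoner."""
    if not concept.abstract_parents <= individual["types"]:
        return False
    return all(pred.holds(individual["features"][feat], value)
               for (feat, value), pred in concept.concrete_constraints)

print(instantiates({"types": {"Person"}, "features": {"age": 30}}, ADULT))  # True
```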

404 citations


Journal ArticleDOI
TL;DR: A computer model, Cascade, is described that accounts for the self-explanation effect, and computational experiments indicate that Cascade's learning mechanisms are jointly sufficient to reproduce it.
Abstract: Several investigators have taken protocols of students learning sophisticated skills, such as physics problem solving and LISP coding, by studying examples and solving problems. These investigations uncovered the self-explanation effect: Students who explain examples to themselves learn better, make more accurate self-assessments of their understanding, and use analogies more economically while solving problems. We describe a computer model, Cascade, that accounts for these findings. Explaining an example causes Cascade to acquire both domain knowledge and derivational knowledge. Derivational knowledge is used analogically to control search during problem solving. Domain knowledge is acquired when the current domain knowledge is incomplete and causes an impasse. If the impasse can be resolved by applying an overly general rule, then a specialization of the rule becomes a new domain rule. Computational experiments indicate that Cascade's learning mechanisms are jointly sufficient to reproduce the self-explanation effect.
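
The impasse-driven acquisition step lends itself to a short sketch. The following is an illustrative reconstruction in Python (hypothetical rule representation and example, not the actual Cascade implementation): when no domain rule explains a step, an overly general rule is tried, and if it resolves the impasse, a specialization of it is stored as a new domain rule.

```python
# Illustrative sketch of the learning step described above (invented
# representation, not Cascade's code): resolve an impasse with an overly
# general rule, then keep a specialization of it as a new domain rule.
def explain_step(step, domain_rules, general_rules):
    for rule in domain_rules:
        if rule["applies"](step):
            return rule, domain_rules
    # Impasse: the current domain knowledge is incomplete.
    for rule in general_rules:
        if rule["applies"](step):
            specialized = {
                "name": f"{rule['name']}-specialized-to-{step['situation']}",
                # Restrict the general rule to the impasse situation.
                "applies": (lambda r, s0: lambda s:
                            r["applies"](s) and s["situation"] == s0
                            )(rule, step["situation"]),
            }
            return specialized, domain_rules + [specialized]
    return None, domain_rules  # unresolved impasse

# Hypothetical physics example: an over-general "forces balance" rule
# gets specialized to inclined-plane situations after one impasse.
general = [{"name": "forces-balance", "applies": lambda s: s["static"]}]
rule, rules = explain_step({"situation": "inclined-plane", "static": True},
                           [], general)
print(rule["name"])  # forces-balance-specialized-to-inclined-plane
```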

331 citations


Journal ArticleDOI
TL;DR: In this article, the authors study positive solutions of problem (1.1) below on a smooth bounded domain, where $2 < p < 2^* = \frac{2N}{N-2}$ and $\lambda \in \mathbb{R}^+ \cup \{0\}$.
Abstract: In this paper we are concerned with the following problem:
$$-\Delta u + \lambda u = u^{p-1} \ \text{in } \Omega, \qquad u > 0 \ \text{in } \Omega, \qquad u = 0 \ \text{on } \partial\Omega, \tag{1.1}$$
where $\Omega \subset \mathbb{R}^N$, $N \geq 3$, is a smooth bounded domain, $2 < p < 2^* = \frac{2N}{N-2}$, and $\lambda \in \mathbb{R}^+ \cup \{0\}$. It is well known that problem (1.1) has at least one solution for every $p \in (2, 2^*)$ and for every $\lambda \in (-\lambda_1, +\infty)$, and that this solution can be found by minimizing the functional $E_\lambda(u) = \int_\Omega \left(|\nabla u|^2 + \lambda u^2\right) dx$ on the manifold ...
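
The scraped abstract breaks off at "on the manifold". For reference, the standard variational setting for this type of problem is given below; the constraint manifold is supplied here as background knowledge, not recovered from the page.

```latex
% Standard variational setting for problem (1.1); the constraint manifold
% below is the usual one for this problem and is stated as an assumption,
% since the scraped abstract breaks off mid-sentence.
\[
  E_\lambda(u) \;=\; \int_\Omega \left( |\nabla u|^2 + \lambda u^2 \right) dx,
  \qquad
  V \;=\; \Bigl\{\, u \in H_0^1(\Omega) \;:\; \int_\Omega |u|^{p}\,dx = 1 \,\Bigr\}.
\]
% A minimizer of $E_\lambda$ on $V$, suitably rescaled, solves (1.1).
```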

276 citations



Journal ArticleDOI
TL;DR: The purpose of this paper is to outline basic approaches and basic types of algorithms available to deal with this problem and to review their convergence analysis.
Abstract: A generalized fractional programming problem is specified as a nonlinear program where a nonlinear function defined as the maximum over several ratios of functions is to be minimized on a feasible domain of $\mathbb{R}^n$. The purpose of this paper is to outline basic approaches and basic types of algorithms available to deal with this problem and to review their convergence analysis. The conclusion includes results and comments on the numerical efficiency of these algorithms.
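
One of the basic algorithm families such surveys cover is the parametric (Dinkelbach-type) approach. A minimal sketch follows, assuming positive denominators on the feasible set and using a crude grid search as a stand-in for a real subproblem solver; it is purely illustrative, not the paper's algorithm.

```python
# Minimal Dinkelbach-type sketch for generalized fractional programming:
# minimize over x the value max_i f_i(x)/g_i(x), with g_i > 0 on the
# feasible set. At each step the parameter lam is the best ratio at the
# incumbent point, and a parametric subproblem is solved; the method is
# optimal when the subproblem's value reaches zero.
def generalized_dinkelbach(fs, gs, candidates, tol=1e-9, max_iter=100):
    x = candidates[0]
    for _ in range(max_iter):
        lam = max(f(x) / g(x) for f, g in zip(fs, gs))
        # Parametric subproblem: minimize max_i (f_i(y) - lam * g_i(y)).
        x = min(candidates,
                key=lambda y: max(f(y) - lam * g(y) for f, g in zip(fs, gs)))
        value = max(f(x) - lam * g(x) for f, g in zip(fs, gs))
        if abs(value) < tol:
            return x, lam
    return x, lam

# Toy instance: minimize max((x^2+1)/(x+1), (x+2)/(2x+1)) over a grid x > 0.
xs = [i / 1000 for i in range(1, 5000)]
x_star, ratio = generalized_dinkelbach(
    [lambda x: x * x + 1, lambda x: x + 2],
    [lambda x: x + 1, lambda x: 2 * x + 1],
    xs)
print(round(x_star, 3), round(ratio, 3))  # 1.0 1.0
```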

267 citations


Journal ArticleDOI
TL;DR: Assessing testability from program specifications is discussed, along with an experiment which shows that it takes less time to build and test a program developed from a domain-testable specification than a similar program developed from a nondomain-testable specification.
Abstract: The concept of domain testability of software is defined by applying the concepts of observability and controllability to software. It is shown that a domain-testable program does not exhibit any input-output inconsistencies and supports small test sets in which test outputs are easily understood. Metrics that can be used to assess the level of effort required in order to modify a program so that it is domain-testable are discussed. Assessing testability from program specifications and an experiment which shows that it takes less time to build and test a program developed from a domain-testable specification than a similar program developed from a nondomain-testable specification are also discussed.
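
The observability/controllability idea behind domain testability is easy to illustrate. The hypothetical example below (not from the paper) reworks a function whose output depends on hidden state into one whose output is determined by its inputs alone.

```python
# Illustrative sketch of observability/controllability (hypothetical code,
# not the paper's). The class version is hard to domain-test: its output
# depends on hidden history, so identical inputs can yield different
# outputs (an input-output inconsistency).
class RunningAverager:
    def __init__(self):
        self._count, self._total = 0, 0.0
    def add(self, x: float) -> float:
        self._count += 1
        self._total += x
        return self._total / self._count   # depends on hidden history

# Domain-testable rework: the state is an explicit input and output, so a
# tester can control it (set any state directly) and observe it fully.
def add_observation(count: int, total: float, x: float):
    count, total = count + 1, total + x
    return count, total, total / count

# A small, easily understood test set now pins down the behaviour:
assert add_observation(0, 0.0, 4.0) == (1, 4.0, 4.0)
assert add_observation(1, 4.0, 2.0) == (2, 6.0, 3.0)
```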

255 citations


Journal ArticleDOI
Gilligan, Carol, Ward, Janie Victoria, Taylor, Jill McLean

205 citations


Proceedings Article
14 Jul 1991
TL;DR: This paper examines how the Clarke tax could be used as an effective "preference revealer" in the domain of automated agents, reducing the need for explicit negotiation.
Abstract: When autonomous agents attempt to coordinate action, it is often necessary that they reach some kind of consensus. Reaching such a consensus has traditionally been dealt with in the Distributed Artificial Intelligence literature via the mechanism of negotiation. Another alternative is to have agents bypass negotiation by using a voting mechanism; each agent expresses its preferences, and a group choice mechanism is used to select the result. Some choice mechanisms are better than others, and ideally we would like one that cannot be manipulated by an untruthful agent. One such non-manipulable choice mechanism is the Clarke tax [Clarke, 1971]. Though theoretically attractive, the Clarke tax presents a number of difficulties when one attempts to use it in a practical implementation. This paper examines how the Clarke tax could be used as an effective "preference revealer" in the domain of automated agents, reducing the need for explicit negotiation.
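
The mechanics of the Clarke tax itself are compact. Below is a minimal sketch of the pivotal-mechanism computation over reported valuations (illustrative code and data, not the paper's implementation): the group picks the alternative maximizing total reported value, and each agent pays exactly how much its report worsens the outcome for everyone else, which is what makes truthful reporting a dominant strategy.

```python
# Minimal sketch of the Clarke tax (pivotal mechanism) for choosing one
# alternative from reported valuations; illustrative only.
def clarke_tax(valuations):
    """valuations[i][a] = agent i's reported value for alternative a."""
    alternatives = valuations[0].keys()
    total = lambda vs, a: sum(v[a] for v in vs)
    choice = max(alternatives, key=lambda a: total(valuations, a))
    taxes = []
    for i in range(len(valuations)):
        others = valuations[:i] + valuations[i + 1:]
        choice_without_i = max(alternatives, key=lambda a: total(others, a))
        # Tax = others' best attainable welfare minus what they get now.
        taxes.append(total(others, choice_without_i) - total(others, choice))
    return choice, taxes

# Three agents voting over two joint plans; agent 0 is pivotal and taxed.
reports = [{"plan-A": 10, "plan-B": 0},
           {"plan-A": 0,  "plan-B": 4},
           {"plan-A": 0,  "plan-B": 4}]
print(clarke_tax(reports))  # ('plan-A', [8, 0, 0])
```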

199 citations


Journal ArticleDOI
TL;DR: A denotational semantics for SCCS based on the domain of synchronization trees is given, and proved fully abstract with respect to bisimulation.
Abstract: Some basic topics in the theory of concurrency are studied from the point of view of denotational semantics, and particularly the "domain theory in logical form" developed by the author. A domain of synchronization trees is defined by means of a recursive domain equation involving the Plotkin powerdomain. The logical counterpart of this domain is described, and shown to be related to it by Stone duality. The relationship of this domain logic to the standard Hennessy-Milner logic for transition systems is studied; the domain logic can be seen as a rational reconstruction of Hennessy-Milner logic from the standpoint of a very general and systematic theory. Finally, a denotational semantics for SCCS based on the domain of synchronization trees is given, and proved fully abstract with respect to bisimulation.
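
For readers unfamiliar with the construction, one standard way to write such a recursive domain equation is shown below; it is supplied here as background for concreteness, and the paper's exact equation may differ in details such as lifting or the treatment of the empty process.

```latex
% One common presentation of a domain of synchronization trees (an
% assumption for illustration, not necessarily the paper's exact form):
\[
  D \;\cong\; P_{\mathrm{Pl}}\!\Bigl(\textstyle\sum_{a \in \mathrm{Act}} D\Bigr)
\]
% where $P_{\mathrm{Pl}}$ is the Plotkin powerdomain, the separated sum
% ranges over the action alphabet, and solutions are synchronization trees.
```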

188 citations


Proceedings Article
LiMin Fu1
14 Jul 1991
TL;DR: The KT algorithm is a novel algorithm for efficiently generating rules from an adapted net; it is able to deal with both single-layer and multi-layer networks, and can learn both confirming and disconfirming rules.
Abstract: If the back propagation network can produce an inference structure with high and robust performance, then it is sensible to extract rules from it. The KT algorithm is a novel algorithm for generating rules from an adapted net efficiently. The algorithm is able to deal with both single-layer and multi-layer networks, and can learn both confirming and disconfirming rules. Empirically, the algorithm is demonstrated in the domain of wind shear detection by infrared sensors with success.
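
The general search idea behind extracting confirming rules from a trained threshold unit can be sketched briefly. The code below is a simplification with invented data, not KT's actual procedure: a set of positive-weight attributes forms a confirming rule if their combined weight clears the bias even when every remaining input counts maximally against the conclusion.

```python
# Hedged sketch of confirming-rule extraction from one threshold unit
# (a simplification of the KT idea; all data invented).
from itertools import combinations

def confirming_rules(weights, bias, max_size=3):
    """weights: {attribute: weight} for a unit firing when
    sum(w_i * x_i) + bias > 0, with inputs x_i in {0, 1}."""
    pos = [a for a, w in weights.items() if w > 0]
    # Worst-case contribution of the attributes outside the rule.
    worst_rest = lambda used: sum(min(w, 0.0) for a, w in weights.items()
                                  if a not in used)
    rules = []
    for size in range(1, max_size + 1):
        for combo in combinations(pos, size):
            support = sum(weights[a] for a in combo)
            if support + worst_rest(set(combo)) + bias > 0:
                # Keep only rules with no already-found subset rule.
                if not any(set(r) <= set(combo) for r in rules):
                    rules.append(combo)
    return rules

# Toy unit: "gusty" and "temp-drop" together guarantee firing even if the
# disconfirming input "clear-sky" is present.
w = {"gusty": 2.0, "temp-drop": 1.5, "clear-sky": -1.0}
print(confirming_rules(w, bias=-2.2))  # [('gusty', 'temp-drop')]
```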

Patent
30 Apr 1991
TL;DR: In this patent, a system and method of logically and physically clustering data (tuples) in a database is presented, where data objects stored in the domains may be stored in a particular domain based upon a locality-of-reference algorithm in which a tuple of data is placed in a domain if and only if all objects referenced by the tuple are contained in the domain.
Abstract: A system and method of logically and physically clustering data (tuples) in a database. The database management system of the invention partitions (declusters) a set of relations into smaller so-called local relations and reclusters the local relations into constructs called domains. The domains are self-contained in that a domain contains the information for properly accessing and otherwise manipulating the data it contains. In other words, the data objects stored in the domains may be stored in a particular domain based upon a locality-of-reference algorithm in which a tuple of data is placed in a domain if and only if all objects referenced by the tuple are contained in the domain. On the other hand, the data objects stored in a domain may be clustered so that a tuple of data is placed in a domain based on the domain of the object referenced by a particular element of the tuple. By clustering the related object data in this manner, the database management system may more efficiently cache data to a user application program requesting data related to a particular data object. The system may also more efficiently lock and check-in and check-out data from the database so as to improve concurrency. Moreover, versioning may be more readily supported by copying tuples of a particular domain into a new domain which can then be updated as desired.
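
The locality-of-reference placement rule reduces to a containment test. An illustrative sketch follows (invented data structures and names; the patent specifies the rule, not this code).

```python
# Sketch of the locality-of-reference placement rule described above: a
# tuple may live in a domain if and only if every object it references is
# already contained in that domain. All identifiers are invented.
def place_tuple(tup_refs, domains, fallback):
    """tup_refs: set of object ids the tuple references.
    domains: {domain_name: set of object ids stored there}."""
    for name, objects in domains.items():
        if tup_refs <= objects:          # all references are local
            return name
    return fallback                      # no self-contained home exists

domains = {"orders-eu": {"cust-7", "item-3"},
           "orders-us": {"cust-9", "item-3"}}
print(place_tuple({"cust-7", "item-3"}, domains, "orders-misc"))  # orders-eu
print(place_tuple({"cust-7", "cust-9"}, domains, "orders-misc"))  # orders-misc
```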

Journal ArticleDOI
TL;DR: A formal method to allow designers to explicitly make trade-off decisions is presented, and the details of the mathematical formulation are presented and discussed, along with two design examples.
Abstract: A formal method to allow designers to explicitly make trade-off decisions is presented. The methodology can be used when an engineer wishes to rate the design by the weakest aspect, or by cooperatively considering the overall performance, or a combination of these strategies. The design problem is formulated with preference rankings, similar to a utility theory or fuzzy sets approach. This approach separates the design trade-off strategy from the performance expressions. The details of the mathematical formulation are presented and discussed, along with two design examples: one from the preliminary design domain, and one from the parameter design domain.
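
The two rating strategies can be made concrete with simple aggregation functions. A hedged sketch (illustrative functions and data, not the paper's exact formulation): rating by the weakest aspect is a min over preferences in [0, 1], while cooperative rating compensates across aspects, here via a weighted geometric mean.

```python
# Illustrative aggregation strategies over per-aspect preferences in [0, 1]
# (not the paper's exact mathematics).
def weakest_aspect(prefs):
    # Non-compensating: the design is only as good as its worst aspect.
    return min(prefs.values())

def cooperative(prefs, weights):
    # Compensating: strong aspects partially offset weak ones via a
    # weighted geometric mean.
    score, total_w = 1.0, sum(weights.values())
    for aspect, p in prefs.items():
        score *= p ** (weights[aspect] / total_w)
    return score

prefs = {"cost": 0.9, "weight": 0.4, "stiffness": 0.8}
w = {"cost": 1.0, "weight": 2.0, "stiffness": 1.0}
print(weakest_aspect(prefs))            # 0.4 (rated by its weakest aspect)
print(round(cooperative(prefs, w), 3))  # ~0.58 (compensating overall rating)
```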

Journal ArticleDOI
TL;DR: A model of computer-supported negotiation is presented which can be used to address conflicts in systems analysis directly, and has been used to develop a system called Synoptic, which provides a set of tools to support the exploration of conflicts.

Journal ArticleDOI
TL;DR: A performance evaluation of 15 text-analysis systems was recently conducted to realistically assess the state of the art for detailed information extraction from unconstrained continuous text, supporting the claim that systems incorporating natural language-processing techniques are more effective than systems based on stochastic techniques alone.
Abstract: A performance evaluation of 15 text-analysis systems was recently conducted to realistically assess the state of the art for detailed information extraction from unconstrained continuous text. Reports associated with terrorism were chosen as the target domain, and all systems were tested on a collection of previously unseen texts released by a government agency. Based on multiple strategies for computing each metric, the competing systems were evaluated for recall, precision, and overgeneration. The results support the claim that systems incorporating natural language-processing techniques are more effective than systems based on stochastic techniques alone. A wide range of language-processing strategies was employed by the top-scoring systems, indicating that many natural language-processing techniques provide a viable foundation for sophisticated text analysis. Further evaluation is needed to produce a more detailed assessment of the relative merits of specific technologies and establish true performance limits for automated information extraction.
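
The three scores named above are simple ratios over slot fills. A sketch of how they relate (simplified: real MUC-style scoring also gave partial credit and used multiple computation strategies, as the abstract notes):

```python
# Recall, precision, and overgeneration over slot fills (simplified).
def score(correct, spurious, missing):
    actual = correct + spurious          # slots the system produced
    possible = correct + missing         # slots in the answer key
    return {"recall": correct / possible,
            "precision": correct / actual,
            "overgeneration": spurious / actual}

# Hypothetical system output: 60 correct fills, 15 spurious, 25 missed.
print(score(correct=60, spurious=15, missing=25))
# {'recall': 0.705..., 'precision': 0.8, 'overgeneration': 0.2}
```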

Journal ArticleDOI
TL;DR: Memory measures suggest that the retention of training effects is due to memory for the rule system rather than to memory for the specific details of the example problems, contrary to what would be expected if Ss were using direct analogies to solve the test problems.
Abstract: Ss were trained on the law of large numbers in a given domain through the use of example problems. They were then tested either on that domain or on another domain either immediately or after a 2-week delay. Strong domain independence was found when testing was immediate. This transfer of training was not due simply to Ss' ability to draw direct analogies between problems in the trained domain and in the untrained domain. After the 2-week delay, it was found that (a) there was no decline in performance in the trained domain and (b) although there was a significant decline in performance in the untrained domain, performance was still better than for control Ss. Memory measures suggest that the retention of training effects is due to memory for the rule system rather than to memory for the specific details of the example problems, contrary to what would be expected if Ss were using direct analogies to solve the test problems.

Journal ArticleDOI
TL;DR: This article develops a theory of questions and of question asking, motivated by both cognitive and computational considerations, discusses the theory in the context of story understanding, and presents a computer model of an active reader that learns about novel domains by reading newspaper stories.
Abstract: This article focuses on knowledge goals, that is, the goals of a reasoner to acquire or reorganize knowledge. Knowledge goals, often expressed as questions, arise when the reasoner's model of the domain is inadequate in some reasoning situation. This leads the reasoner to focus on the knowledge it needs, to formulate questions to acquire this knowledge, and to learn by pursuing its questions. I develop a theory of questions and of question asking, motivated both by cognitive and computational considerations, and I discuss the theory in the context of the task of story understanding. I present a computer model of an active reader that learns about novel domains by reading newspaper stories.

Proceedings ArticleDOI
01 Jun 1991
TL;DR: A novel architectural style specifically suited for this application domain is presented and a synopsis of a novel synthesis script typically oriented towards this architecture is described (architecture-driven synthesis).
Abstract: The goal of this paper is to extend the synthesis of real time digital signal processing (DSP) algorithms towards the domain of high throughput applications. A novel architectural style specifically suited for this application domain is presented. Furthermore, a synopsis of a novel synthesis script typically oriented towards this architecture is described (architecture-driven synthesis). The emphasis in the script is on the design of the data-paths which are dedicated to the application, and special attention is paid to the memory synthesis problem. In this paper only the data-path related tasks, namely data-path partitioning and data-path definition, are discussed. The new methodology is demonstrated using an image processing application.

Journal ArticleDOI
TL;DR: The BACK project was begun at the Technical University Berlin in January 1985 and supports complex representation of a domain terminology, description of domain objects using that terminology, and database access via a uniform interface language.
Abstract: The BACK project was begun at the Technical University Berlin in January 1985 as part of a larger project within the ESPRIT programme. Our task in the project group was the specification, design and implementation of a knowledge representation system which we called BACK ("Berlin Advanced Computational Knowledge Representation System"). It is based on a terminological logic (term description language) and supports complex representation of a domain terminology, description of domain objects using that terminology, and database access via a uniform interface language.


Journal ArticleDOI
01 Dec 1991
TL;DR: Examples are described of problems of semantic heterogeneity in databases due to “domain evolution” as it occurs in both single- and multidatabase systems.
Abstract: We describe examples of problems of semantic heterogeneity in databases due to “domain evolution”, as it occurs in both single- and multidatabase systems. These problems occur when the semantics of values of a particular domain change over time in ways that are not amenable to applying simple mappings between “old” and “new” values. The paper also proposes facilities and strategies for solving such problems.
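
A small invented example shows why a simple old-to-new value map falls short: the semantics of a stored value depend on when it was recorded, so the translation must consult the tuple's timestamp, not just the value itself.

```python
# Illustrative sketch (invented example) of the kind of facility the
# paper argues for: value translation that is conditioned on when the
# value was recorded.
from datetime import date

def normalize_grade(value, recorded):
    # Hypothetical domain evolution: before 1990, "grade" values were
    # letters on an A-E scale; afterwards they are percentages. Mapping
    # letter -> percentage needs the recording date, not just the value.
    if recorded < date(1990, 1, 1):
        return {"A": 95, "B": 85, "C": 75, "D": 65, "E": 50}[value]
    return int(value)

print(normalize_grade("B", date(1988, 6, 1)))   # 85
print(normalize_grade("88", date(1992, 6, 1)))  # 88
```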

ReportDOI
24 Aug 1991
TL;DR: A qualitative method is developed for understanding and representing phase space structures of complex systems; it generates a complete, high-level symbolic description of the phase space structure through a combination of numerical, combinatorial, and geometric computations and spatial reasoning techniques.
Abstract: We develop a qualitative method for understanding and representing phase space structures of complex systems. To demonstrate this method, a program called MAPS has been constructed that understands qualitatively different regions of a phase space and represents and extracts geometric shape information about these regions, using deep domain knowledge of dynamical system theory. Given a dynamical system specified as a system of governing equations, MAPS applies a successive sequence of operations to incrementally extract the qualitative information and generates a complete, high level symbolic description of the phase space structure, through a combination of numerical, combinatorial, and geometric computations and spatial reasoning techniques. The high level description is sensible to human beings and manipulable by other programs. We are currently applying the method to a difficult engineering design domain in which controllers for complex systems are to be automatically synthesized to achieve desired properties, based on the knowledge of the phase space "shapes" of the systems.
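
One ingredient of such an analysis, locating an equilibrium and classifying it from the local linearization, fits in a few lines. The following is a small numerical sketch in this spirit (a toy system, not the MAPS program):

```python
# Locate an equilibrium of a 2-D system and classify it from the
# eigenvalues of a finite-difference Jacobian; one ingredient of labelling
# qualitatively distinct phase-space regions. Toy system, not MAPS.
import numpy as np
from scipy.optimize import fsolve

def f(state):                      # damped pendulum-like toy system
    x, v = state
    return [v, -np.sin(x) - 0.5 * v]

def jacobian(state, eps=1e-6):
    state = np.asarray(state, dtype=float)
    J = np.zeros((2, 2))
    for j in range(2):
        d = np.zeros(2); d[j] = eps
        J[:, j] = (np.array(f(state + d)) - np.array(f(state - d))) / (2 * eps)
    return J

eq = fsolve(f, x0=[0.1, 0.1])      # should converge to the origin
eigs = np.linalg.eigvals(jacobian(eq))
kind = "stable" if all(e.real < 0 for e in eigs) else "unstable"
print(np.round(eq, 6), kind)       # [0. 0.] stable (a spiral sink)
```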

Journal ArticleDOI
TL;DR: In this article, the authors explore the analytic nature and domain of macromarketing from a systems theoretic perspective and provide a rich conceptual framework in which to unify the traditional themes of macromarketing and accelerate empirical research.
Abstract: This article explores the analytic nature and domain of macromarketing from a systems theoretic perspective. Macromarketing is developed as the study of the complex coordination and control processes underpinning growth, evolution, and design of exchange systems. This approach provides a rich conceptual framework in which to unify the traditional themes of macromarketing and, potentially, to accelerate empirical research.

ReportDOI
01 May 1991
TL;DR: The TRAINS project serves as an umbrella for research that involves pushing the state of the art in real-time planning, planning in uncertain worlds, plan monitoring and execution, natural language understanding techniques applicable to spoken language, and natural language dialog and discourse modelling.
Abstract: The TRAINS project is a long-term research effort on building an intelligent planning assistant that is conversationally proficient in natural language. The TRAINS project serves as an umbrella for research that involves pushing the state of the art in real-time planning, planning in uncertain worlds, plan monitoring and execution, natural language understanding techniques applicable to spoken language, and natural language dialog and discourse modelling. Significant emphasis is being put on the knowledge representation issues that arise in supporting the tasks in the domain. This report describes the general goals of the TRAINS project and the particular research directions that we are pursuing. Keywords: planning, natural language understanding, dialog systems.


Book ChapterDOI
Ramana B. Rao1
15 Jul 1991
TL;DR: It is claimed that an ostensibly broader view of reflection, which is called implementational reflection, can be applied to the design of other kinds of systems, accruing the same benefits that arise in the programming language case.
Abstract: The value of computational reflection has been explored in a number of programming language efforts. The major claim of this paper is that an ostensibly broader view of reflection, which we call implementational reflection, can be applied to the design of other kinds of systems, accruing the same benefits that arise in the programming language case. The domain of window systems in general, and the Silica window system in particular are used to illustrate how reflection can be applied more broadly. Silica is a CLOS-based window system that is a part of the Common Lisp Interface Manager, an emerging user interface programming standard for Common Lisp.

Book ChapterDOI
01 Mar 1991
TL;DR: The Forte theory revision system provides theory revision capabilities similar to those of the propositional systems, but works with domain theories stated in first-order logic.
Abstract: Recent learning systems have combined explanation-based and inductive learning techniques to revise propositional domain theories (e.g., EITHER, RTLS, KBANN). Inductive systems working in first order logic have also been developed (e.g., CIGOL, FOIL, FOCL). This paper presents a theory revision system, Forte, that merges these two developments. Forte provides theory revision capabilities similar to those of the propositional systems, but works with domain theories stated in first-order logic.

Book
01 Feb 1991
TL;DR: The authors describe an alternative approach, one that views simulation models as logical statements, and provide a standard notation for the construction of ecological simulation programs that can be readily understood by those who lack modeling, mathematical, or programming skills.
Abstract: When applied to problems in poorly structured domains such as government decision making, conventional methods for simulating - for example, a controversial problem like acid rain - prove to be flawed. In "Eco-Logic," the authors describe an alternative approach, one that views simulation models as logical statements. Using the techniques of logic programming, they provide a standard notation for the construction of ecological simulation programs that can be readily understood by those who lack modeling, mathematical, or programming skills. The authors demonstrate this approach in the domain of ecological modeling by building a series of computer programs to assist ecologists to build simulation models. They show how the description of an ecological situation can be represented and then incrementally refined into a simulation model. This enables ecologists to describe problems initially in ecological terms, rather than in mathematical or programming terms. It also enables the final model to be formally related to the original ecological description. This permits a computer program to give ecologically meaningful explanations of the results of the model, and facilitates rapid remodeling if the underlying assumptions of the model are modified.

Proceedings Article
14 Jul 1991
TL;DR: This paper analyzes the complexity of blocks-world planning, presenting two main results: finding optimal plans is NP-hard even if the goal state is completely specified, and classical deleted-condition interactions such as Sussman's anomaly are easy to handle with the right planning algorithm.
Abstract: Although blocks-world planning is well-known, its complexity has not previously been analyzed, and different planning researchers have expressed conflicting opinions about its difficulty. In this paper, we present the following results: 1. Finding optimal plans in a well-known formulation of the blocks-world planning domain is NP-hard, even if the goal state is completely specified. 2. Classical examples of deleted-condition interactions such as Sussman's anomaly and creative destruction are not difficult to handle in this domain, provided that the right planning algorithm is used. Instead, the NP-hardness of the problem results from difficulties in determining which of several different actions will best help to achieve multiple goals.
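
The "right planning algorithm" the abstract alludes to can be illustrated by a standard blocks-world strategy, stated here from general knowledge rather than from the paper's text: move a block straight to its final position when possible, otherwise unstack a misplaced clear block to the table. This handles Sussman's anomaly without deleted-condition trouble, though it does not guarantee optimal plans (optimality is the NP-hard part).

```python
# Standard table-based blocks-world strategy (illustrative, not taken from
# the paper): constructive moves when possible, otherwise park a misplaced
# clear block on the table.
def solve(on, goal_on):
    """on/goal_on: {block: support}, where support is a block or 'table'."""
    def clear(b, state):
        return all(s != b for s in state.values())
    def placed(b, state):
        s = state[b]
        return s == goal_on[b] and (s == "table" or placed(s, state))
    plan = []
    while any(not placed(b, on) for b in on):
        for b in sorted(on):
            if placed(b, on) or not clear(b, on):
                continue
            dest = goal_on[b]
            if dest == "table" or (placed(dest, on) and clear(dest, on)):
                on[b] = dest                  # constructive move
            elif on[b] != "table":
                on[b] = "table"               # get it out of the way
            else:
                continue                      # already parked; try another
            plan.append((b, on[b]))
            break
    return plan

# Sussman's anomaly: start with C on A, B on the table; goal A on B on C.
start = {"A": "table", "B": "table", "C": "A"}
goal = {"A": "B", "B": "C", "C": "table"}
print(solve(dict(start), goal))
# [('C', 'table'), ('B', 'C'), ('A', 'B')]
```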

Journal ArticleDOI
TL;DR: A simple and efficient algorithm is described for automatic decomposition of an arbitrary finite element domain into a specified number of subdomains for finite element and substructuring analysis in a multi-processor computer environment.