
Showing papers presented at "Conference on Tools With Artificial Intelligence in 1993"


Proceedings ArticleDOI
08 Nov 1993
TL;DR: Results are presented which suggest that genetic algorithms can be used to increase the robustness of feature selection algorithms without a significant decrease in computational efficiency.
Abstract: Selecting a set of features which is optimal for a given task is a problem which plays an important role in a wide variety of contexts, including pattern recognition, adaptive control and machine learning. Experience with traditional feature selection algorithms in the domain of machine learning leads to an appreciation for their computational efficiency and a concern for their brittleness. The authors describe an alternative approach to feature selection which uses genetic algorithms as the primary search component. Results are presented which suggest that genetic algorithms can be used to increase the robustness of feature selection algorithms without a significant decrease in computational efficiency.
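The abstract does not specify the authors' encoding or fitness function, but the general scheme it describes can be sketched as follows: each chromosome is a bit mask over the features, fitness scores the masked subset, and selection, crossover and mutation search the space of subsets. The toy fitness below (a hypothetical set of "informative" features with a size penalty) stands in for evaluating a real learner; all names and parameters are illustrative assumptions, not from the paper.

```python
import random

random.seed(0)

N_FEATURES = 10
# Toy ground truth: only features 0, 3 and 7 are informative. This is an
# illustrative stand-in for evaluating a subset with a real learner.
INFORMATIVE = {0, 3, 7}

def fitness(mask):
    """Score a subset: reward informative features, mildly penalize size."""
    hits = sum(1 for i, bit in enumerate(mask) if bit and i in INFORMATIVE)
    return hits - 0.1 * sum(mask)

def crossover(a, b):
    point = random.randrange(1, N_FEATURES)   # single-point crossover
    return a[:point] + b[point:]

def mutate(mask, rate=0.05):
    return [bit ^ (random.random() < rate) for bit in mask]

def ga_feature_selection(pop_size=30, generations=40):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]       # truncation selection + elitism
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return max(pop, key=fitness)

best = ga_feature_selection()
selected = [i for i, bit in enumerate(best) if bit]
```

Under this toy fitness the search typically converges on the informative subset; the paper's point is that such a population-based search degrades more gracefully than greedy sequential selection.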

178 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: A new class of constraint recording algorithms called Nogood Recording is proposed that may be used for solving both static and dynamic CSPs and offers an interesting compromise, polynomially bounded in space, between an ATMS-like approach and the usual static constraint satisfaction algorithms.
Abstract: Many AI synthesis problems such as planning, scheduling or design may be encoded in a constraint satisfaction problem (CSP). A CSP is typically defined as the problem of finding any consistent labeling for a fixed set of variables satisfying all given constraints between these variables. However, for many real tasks, the set of constraints to consider may evolve because of the environment or because of user interactions. The problem considered here is the solution maintenance problem in such a dynamic CSP (DCSP). The authors propose a new class of constraint recording algorithms called Nogood Recording that may be used for solving both static and dynamic CSPs. It offers an interesting compromise, polynomially bounded in space, between an ATMS-like approach and the usual static constraint satisfaction algorithms.
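A minimal sketch of the nogood-recording idea, not the authors' algorithm: a naive backtracking solver that records failing partial assignments as "nogoods" and prunes any branch that would repeat one, so recorded nogoods can survive across re-solves of a changed (dynamic) CSP. The graph-colouring instance and all identifiers below are illustrative; the paper's method records justification-based nogoods within a polynomial space bound.

```python
# Illustrative 3-variable graph-colouring CSP (a triangle), not the paper's.
VARIABLES = ["x", "y", "z"]
DOMAIN = [0, 1, 2]
EDGES = [("x", "y"), ("y", "z"), ("x", "z")]   # "must differ" constraints

nogoods = set()   # recorded inconsistent partial assignments

def consistent(assign):
    return all(assign[a] != assign[b]
               for a, b in EDGES if a in assign and b in assign)

def violates_nogood(assign):
    items = set(assign.items())
    return any(ng <= items for ng in nogoods)

def solve(assign=None):
    assign = {} if assign is None else assign
    if len(assign) == len(VARIABLES):
        return dict(assign)
    var = VARIABLES[len(assign)]
    for val in DOMAIN:
        assign[var] = val
        if consistent(assign) and not violates_nogood(assign):
            result = solve(assign)
            if result is not None:
                return result
        else:
            # Record the failing partial assignment so the same dead end
            # is pruned if reached again, e.g. after a constraint change
            # in a dynamic CSP. (A real nogood recorder stores minimal
            # justifications, not whole assignments.)
            nogoods.add(frozenset(assign.items()))
        del assign[var]
    return None

solution = solve()
```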

162 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors describe the first implementation of an AGM belief revision system based on classical first-order logic, and it efficiently computes expansions, contractions and revision satisfying the AGM postulates for rational belief change.
Abstract: Belief revision is increasingly being seen as central to a number of fundamental problems in artificial intelligence such as nonmonotonic reasoning, reasoning about action, truth maintenance and database update. The authors describe the first implementation of an AGM belief revision system. The system is based on classical first-order logic, and for any finitely representable belief state, it efficiently computes expansions, contractions and revision satisfying the AGM postulates for rational belief change. The system uses a finite base to represent a belief set, and interprets a partially specified entrenchment as representing a unique most conservative entrenchment-this is motivated by considerations of evidence and by the close connections between belief revision and nonmonotonic reasoning. The authors describe in detail the algorithms for belief change, and give some examples of the system's operation.

26 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: A model of a cognitive agent being currently developed at LIFIA is detailed and taking this model as reference, the authors examine alternatives for supporting cognitive agents in distributed and heterogeneous environments.
Abstract: The authors discuss the implementation and run-time support for multi-agent systems (MAS). They start presenting MAS in the context of open distributed processing (ODP). Next, a model of a cognitive agent being currently developed at LIFIA is detailed. Taking this model as reference, the authors examine alternatives for supporting cognitive agents in distributed and heterogeneous environments. Finally, a distributed processing tool developed by the authors is presented. This tool follows the active object model and it is shown that active object and agent are strongly related concepts.

26 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: A genetic approach for determining the priority order in the commitment of thermal units in power generation to meet the load demand and the reserve requirement at each time interval, such that the overall system generation cost is a minimum, while satisfying various operational constraints.
Abstract: The authors present a genetic approach for determining the priority order in the commitment of thermal units in power generation. The objective of the problem is to properly schedule the on/off states of all thermal units in a system to meet the load demand and the reserve requirement at each time interval, such that the overall system generation cost is a minimum, while satisfying various operational constraints. The authors examine the feasibility of using genetic algorithms and report some simulation results in near-optimal commitment of thermal units in a power system.
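A rough sketch of the GA formulation the abstract describes, on a made-up three-unit, four-hour instance: a schedule is a bit matrix of on/off states, and the fitness adds fuel cost (dispatching cheapest committed units first) plus a heavy penalty for any hour where committed capacity falls short of demand plus reserve. All numbers and parameters are illustrative assumptions, not the paper's; the population is seeded with the trivially feasible all-on schedule so elitism guarantees a feasible answer.

```python
import random

random.seed(2)

# Made-up instance: 3 thermal units over 4 hours (all numbers illustrative).
CAPACITY = [100, 80, 50]       # MW available per unit when committed
COST = [20, 25, 40]            # $/MWh, cheapest unit first
DEMAND = [120, 150, 180, 90]   # MW load per hour
RESERVE = 0.1                  # 10% spinning-reserve requirement

def cost(schedule):
    """Fuel cost of a schedule, with a heavy penalty per infeasible hour."""
    total = 0.0
    for t, load in enumerate(DEMAND):
        on = [u for u in range(len(CAPACITY)) if schedule[u][t]]
        if sum(CAPACITY[u] for u in on) < load * (1 + RESERVE):
            total += 1e6               # unmet demand + reserve
            continue
        remaining = load
        for u in sorted(on, key=lambda u: COST[u]):  # dispatch cheapest first
            gen = min(CAPACITY[u], remaining)
            total += COST[u] * gen
            remaining -= gen
    return total

def random_schedule():
    return [[random.randint(0, 1) for _ in DEMAND] for _ in CAPACITY]

def ga(pop_size=40, generations=60, rate=0.1):
    # Seed with the trivially feasible all-on schedule; elitism keeps it.
    pop = [[[1] * len(DEMAND) for _ in CAPACITY]]
    pop += [random_schedule() for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=cost)
        pop = pop[:pop_size // 2]
        while len(pop) < pop_size:
            a, b = random.sample(pop[:10], 2)
            child = [[random.choice(bits) for bits in zip(ra, rb)]
                     for ra, rb in zip(a, b)]        # uniform crossover
            pop.append([[bit ^ (random.random() < rate) for bit in row]
                        for row in child])
    return min(pop, key=cost)

best = ga()
```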

25 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors present the techniques and architecture of a knowledge-based support system currently under development for risk analysis in software generation and reuse processes and describe the main components and their relationships.
Abstract: The authors present the techniques and architecture of a knowledge-based support system currently under development for risk analysis in software generation and reuse processes. They describe the main components and their relationships. Another tool that supports risk reduction is for consistency management. This tool analyzes the effects of changes and repairs.

17 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: This work argues that objects and frames can and should be related by implementation and uses a specific frame language called FrameTalk for illustrating how frames may be implemented.
Abstract: Object-oriented programming is a programming paradigm that emphasizes the role of objects as being the primary concern in the programming task. The notion of frames as introduced by Minsky (1975) emphasizes their role for the representation of knowledge. The two concepts are often confused because they operate with overlapping terminology. The basic premise of this work is that objects and frames can and should be related by implementation. For illustrating how frames may be implemented, the author uses a specific frame language called FrameTalk and the prototypical example of a default slot description. In FrameTalk, frames are implemented by classes, and slot descriptions are transformed into a set of slot accessor methods. The implementation makes use of the meta level concepts of the Common Lisp Object System.

16 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors wish to extend to natural language understanding (NLU) systems a paradigm now seen as essential for AI: the use of meta-knowledge and the capability a system may have to observe its own functioning.
Abstract: The authors wish to extend to natural language understanding (NLU) systems a paradigm now seen as essential for AI: the use of meta-knowledge and the capability a system may have to observe its own functioning. First, they recall why multi-expert systems seem the best architecture for dealing efficiently with most NL constraints. Then they give general information about reflective reasoning models and propose some extensions to currently admitted ideas in the domain of distributed AI. Lastly, they illustrate these ideas with CARAMEL (in French: Comprehension Automatique de Recits, Apprentissage et Modelisation des Echanges Langagiers), a system developed in their group that is able to perform various tasks using NLU.

12 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: Object-based knowledge representation system TROPES, containing three important intertwined features: constraints, composite objects and tasks, uses constraints to propagate input and output data from one task to another.
Abstract: TROPES is an object-based knowledge representation system, containing three important intertwined features: constraints, composite objects and tasks. Constraints provide a declarative means to define and maintain relations between objects. The part-whole relation profits by constraints for sharing attribute values between a composite object and its components. The task model benefits from composite objects for task modeling and from classification and instantiation mechanisms for task execution; it also uses constraints to propagate input and output data from one task to another.

11 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: The author describes a new implementation approach by artificial neural networks of a linear transformation stage with a nonzero kernel and a vector quantization stage that reveals that a biologically-like growing lateral inhibition influence leads to a speed-up of the learning convergence of that model.
Abstract: One of the most popular encoding techniques for sensor data is transform coding. This encoding schema is composed of two stages: a linear transformation stage with a nonzero kernel and a vector quantization stage. For the first stage, the author describes a new implementation approach by artificial neural networks. The problem of determining the optimal transformation coefficients is solved by learning the coefficients by a lateral inhibited neural network. After a short introduction to the topic, the author focuses on this model and a local stability analysis of the fixpoints for the serial dynamics is provided. The resulting parameter regime is used in a network simulation example using picture statistics. Additionally, the simulations reveal that a biologically-like growing lateral inhibition influence leads to a speed-up of the learning convergence of that model.

11 citations


Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors describe a TPS and the modifications of the Rete pattern matching algorithm required to incorporate interval-based temporal semantics and temporal knowledge maintenance routines.
Abstract: Although researchers stress the importance of relative time representations, most expert system shells with temporal reasoning target real-time domains and use absolute time representations. This ontological assumption limits these systems to domains in which temporal events relate to times of occurrence. A temporal production system (TPS) is a traditional production system augmented with temporal knowledge maintenance routines and a temporal knowledge representation. These additions provide the ability to model relationships between events and they allow, but do not require, events to be associated with times of occurrence. The authors describe a TPS and the modifications of the Rete pattern matching algorithm required to incorporate interval-based temporal semantics and temporal knowledge maintenance routines.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: HML is implemented as one of the subsystems of the GLS (Global Learning Scheme) discovery system, developed by the authors for knowledge discovery in databases, and serves as one of the multiple learning phases of GLS.
Abstract: Introduces an approach called HML (hierarchical model learning) for refining and managing knowledge discovered from databases. HML is implemented as one of the subsystems of the GLS (Global Learning Scheme) discovery system, developed by the authors for knowledge discovery in databases, and serves as one of the multiple learning phases of GLS. By means of HML, concept clusters which are discovered from a database can be added to a knowledge base as classification knowledge with hierarchical models and can be easily refined and managed. HML is based on the model representation of multi-layer logic (MLL). A key feature of HML is the quantitative evaluation for selecting the best representation of the MLL formula by cooperatively using a criterion based on information theory and domain knowledge. Experience with a prototype of HML implemented in the knowledge-based system KAUS is discussed.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: A 3D vector version of the backpropagation algorithm is proposed for multilayered neural networks in which a vector product operation is performed, and whose weights, threshold values, input and output signals are all 3D real-valued vectors.
Abstract: A 3D vector version of the backpropagation algorithm is proposed for multilayered neural networks in which a vector product operation is performed, and whose weights, threshold values, input and output signals are all 3D real-valued vectors. This new algorithm can be used to learn patterns composed of 3D vectors in a natural way. A 3D example was used to successfully test the new formulation.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: A number of conventional and new features that can be used in concert with adaptive clustering schemes are explored, and experiences of the performance of these features are presented.
Abstract: Neural nets are considered as the underlying computing mechanism for a robust approach to the problem of handwritten character recognition. It is expected that recognition mechanisms will be developed through learning algorithms. A key factor to this problem is the set of primitive features which are used to form the raw input vectors representing the digitized image of a character. The authors have explored a number of conventional and new features that can be used in concert with adaptive clustering schemes. Experiences of the performance of these features are presented. A feature which the authors call shadow and which is presented here has produced particularly encouraging results.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors discuss both theoretical and practical problems of solving ICSPs, and present solution approaches taken in the new interval constraint satisfaction tool INC++.
Abstract: Numerical design and planning problems can often be formulated conveniently as a set of equations constraining the values of related variables, i.e., as a numerical or more generally as an interval constraint satisfaction problem (ICSP). However, due to theoretical and practical problems, no tools for solving ICSPs properly in the general case have thus far been designed and implemented. The authors discuss both theoretical and practical problems of solving ICSPs, and present solution approaches taken in the new interval constraint satisfaction tool INC++. The tool can be applied, for example, as the basis for next generation interval constraint spreadsheets.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: The experimental results show that AUGURS can recognize a user-drawn symbol with better accuracy and plausibility than the other networks with the least amount of recognition time when the number of training examples is limited.
Abstract: A new neural network called AUGURS is designed to assist a user of a computer-aided design package in utilizing standard graphical symbols. AUGURS is similar to the Zipcode Net by Le Cun et al. (1989, 1990) in its encoding of transformation knowledge into its network structure, but is much more compact and efficient. The experiments compare AUGURS with two versions of the Zipcode Net and a traditional layered feedforward network with an unconstrained structure. The experimental results show that AUGURS can recognize a user-drawn symbol with better accuracy and plausibility than the other networks with the least amount of recognition time when the number of training examples is limited.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: A classification problem tool called DECISIONBOX has been developed with the aim of providing experts with an integrated knowledge representation capability, which shows that suitability of representation depends on a given situation.
Abstract: Describes an approach to integrating various knowledge representations for classification problems. Knowledge representation forms have been analyzed: the analysis shows that suitability of representation depends on a given situation. Therefore, a multiple representation form capability and a form conversion capability are necessary to support developing knowledge bases for wide areas of application. A classification problem tool called DECISIONBOX has been developed with the aim of providing experts with an integrated knowledge representation capability. A knowledge base can be represented in a tabular form, a rule form or a tree form, and form conversion can be done at all times. With this integrated representation, an expert is able to build a knowledge base using the most appropriate form.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: By observing various properties inherent in Bayesian networks, one can successfully develop a hill-climbing strategy which does not require an energy function and experimental results indicate that finding the most probable explanation can be accomplished fairly easily.
Abstract: Integer linear programming (ILP) has long been an important tool for operations research, akin to the AI search heuristics for NP-hard problems. However, there has been relatively little incentive to use it in AI, even though it also deals with optimization. The problem stems from the misperception that because the general ILP problem is difficult to solve, it must be difficult in all cases. AI search at first glance also seems this way, until one begins to apply it to a specific domain. Clearly, there are many gains to be had from studying the problem from a different perspective like ILP. The authors look at probabilistic reasoning with Bayesian networks, an area long stalled by its computational complexities. Algorithms have been designed for small classes of networks, but have been mainly inextensible to the general case. In particular, the authors consider belief revision in Bayesian networks, which is the search for the most probable explanation for some given evidence. They present a new approach for computing belief revision from the ILP point of view. By observing various properties inherent in Bayesian networks, one can successfully develop a hill-climbing strategy which does not require an energy function. This approach can handle the entire class of Bayesian networks. Furthermore, experimental results indicate that finding the most probable explanation can be accomplished fairly easily.
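The abstract's ILP-based derivation cannot be reproduced from the text, but the hill-climbing idea itself can be illustrated on a made-up two-node network: flip one unobserved variable at a time whenever doing so raises the joint probability of the candidate explanation. All probabilities and names below are assumptions for illustration, not from the paper.

```python
# Toy network A -> B with made-up CPTs; binary variables, evidence b = 1.
P_A = {0: 0.4, 1: 0.6}
P_B_GIVEN_A = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}

def joint(state):
    """P(a, b) for a full assignment -- the score being hill-climbed."""
    return P_A[state["a"]] * P_B_GIVEN_A[(state["a"], state["b"])]

def hill_climb_mpe(evidence, start):
    """Flip one unobserved variable at a time while the joint improves."""
    state = dict(start, **evidence)
    free = [v for v in ("a", "b") if v not in evidence]
    improved = True
    while improved:
        improved = False
        for var in free:
            flipped = dict(state)
            flipped[var] ^= 1
            if joint(flipped) > joint(state):
                state, improved = flipped, True
    return state

mpe = hill_climb_mpe(evidence={"b": 1}, start={"a": 0, "b": 1})
# Here P(a=1, b=1) = 0.48 > P(a=0, b=1) = 0.04, so the climb flips a to 1.
```

On real networks a plain climb can stall in local maxima; the paper's contribution is exploiting network structure so that no energy function is needed to guide the moves.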

Proceedings ArticleDOI
B.T. Low1
08 Nov 1993
TL;DR: The author sketches a new architecture for representing knowledge and performing commonsense reasoning that is an acyclic directed graph formalism with a neural network computation model and a Prolog-style unification mechanism called Neural-Logic Belief Network.
Abstract: The author sketches a new architecture for representing knowledge and performing commonsense reasoning. It is an acyclic directed graph formalism with a neural network computation model and a Prolog-style unification mechanism called Neural-Logic Belief Network. In this representation, a concept is either believed, its negation is believed, unknown, or in the state of contradiction. Each proposition also has a degree-of-belief value to represent its reliability and/or certainty. Every directed link carries a tuple of real numbers to model a three-valued logic and other relations such as the commonsense IF-THEN rules. Due to the nature of network computation, it has an extreme level of tolerance to contradictory input knowledge.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors implemented the traffic simulation environment ISYS to develop and test strategies and concepts for driver models and innovative driver assistant systems and demonstrates the interactions among different agents simulated with ISYS.
Abstract: Road users in the traffic world make up a system of autonomous acting agents. Understanding and describing the phenomena of the traffic world raises questions in the field of distributed artificial intelligence. The authors implemented the traffic simulation environment ISYS to develop and test strategies and concepts for driver models and innovative driver assistant systems. The simulation environment provides a framework to model/construct autonomous agents individually and with models of different granularity. It integrates knowledge-based techniques with numerical methods for simulation of vehicle dynamics. An application example demonstrates the interactions among different agents simulated with ISYS.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors present an approach, based on formal methods, to the modification of reusable software components: from a two-tiered hierarchy of reusable software components, candidate components that are analogous to the query specification are retrieved.
Abstract: Using formal specifications to represent software components facilitates the determination of reusability because they more precisely characterize the functionality of the software, and the well-defined syntax makes processing amenable to automation. The authors present an approach, based on formal methods, to the modification of reusable software components. From a two-tiered hierarchy of reusable software components, the candidate components that are analogous to the query specification are retrieved from the hierarchy. A retrieved component is compared to the query specification to determine what changes need to be applied to the corresponding program component in order to make it satisfy the query specification.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: Preliminary results have indicated that such a knowledge-based tool for operating system performance tuning is viable; increases the productivity of system maintenance personnel and reduces the cost of training; and provides users of an operating system with prompt solutions to their system performance problems.
Abstract: The authors present the design and implementation of a knowledge-based tool for performance tuning of the UNIX operating system. The tool, called BMS, provides intelligent support and maintenance for identifying bottlenecks in UNIX performance and recommending solutions to the problems. Currently, it handles problems in UNIX resource managers, such as memory utilization, disk utilization, CPU scheduling and I/O devices. BMS has been implemented in the EXSYS environment and tested on UNIX V.3. Preliminary results have indicated that such a knowledge-based tool for operating system performance tuning is viable; increases the productivity of system maintenance personnel and reduces the cost of training; and provides users of an operating system with prompt solutions to their system performance problems.

Proceedings ArticleDOI
01 Apr 1993
TL;DR: A modification of Kohonen's (1982) self-organizing feature maps offers solutions for a wide range of graph mapping problems by replacing the Euclidean metric of the target space by a new metric.
Abstract: A modification of Kohonen's (1982) self-organizing feature maps offers solutions for a wide range of graph mapping problems. This is achieved by replacing the Euclidean metric of the target space by a new metric. A sample implementation for mapping parallel programs onto processor networks demonstrates the properties of this method.
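The key substitution the abstract describes can be sketched as follows: in the SOM update rule, the neighbourhood strength around the winning node is computed from shortest-path (hop) distance in the target graph rather than from Euclidean distance. The ring graph, scalar weights and parameters below are illustrative assumptions, not the paper's processor-network setup.

```python
import math
import random
from collections import deque

random.seed(1)

# Target space: a ring of 6 "processors" (an illustrative graph).
N = 6
NEIGHBOURS = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}

def hop_distance(src):
    """BFS shortest-path distances from src: the metric that replaces the
    Euclidean metric of the standard SOM target space."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in NEIGHBOURS[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

weights = [random.random() for _ in range(N)]   # one scalar weight per node

def train(samples, epochs=50, lr=0.3, radius=1.5):
    for _ in range(epochs):
        for x in samples:
            winner = min(range(N), key=lambda i: abs(weights[i] - x))
            dist = hop_distance(winner)
            for i in range(N):
                # Gaussian neighbourhood over graph distance, not Euclidean.
                h = math.exp(-dist[i] ** 2 / (2 * radius ** 2))
                weights[i] += lr * h * (x - weights[i])

train([i / 5 for i in range(6)])
```

Because only the metric changes, everything else in Kohonen's algorithm carries over; for mapping parallel programs, the graph would be the processor network and the samples the program's communication structure.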

Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors have implemented a system to perform abductive reasoning to clarify hidden information and resolve the problem of noun phrase reference.
Abstract: The authors developed a language QUIXOTE as a tool to deal with various information in natural language processing. QUIXOTE is a hybrid language of a deductive object-oriented database and constraint logic programming language. The new mechanism of QUIXOTE is a combination of an object-orientation concept such as object identity and the concept of a module that classifies a large knowledge base. In addition, its logical inference system is extended to be able to perform restricted abduction. The authors first apply QUIXOTE to the sorted feature structure of constraint-based grammar formalisms. Next, it is shown that QUIXOTE can contribute to the description of situation-based semantics. The authors have implemented a system to perform abductive reasoning to clarify hidden information. Also, they resolve the problem of noun phrase reference.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: Basic aspects of knowledge representation and reasoning in SILO, a system integrating logic and objects that gives pre-eminence to objects, are presented.
Abstract: Most of the systems that integrate logic and objects (frames or classes) give pre-eminence to logic. This, however, results in a number of disadvantages. Basic aspects of knowledge representation and reasoning in SILO, a system integrating logic and objects that gives pre-eminence to objects, are presented. A SILO object comprises declarative facets, from frames, as well as methods and message passing, from classes. A kind of an (extended) first-order many-sorted logic is used to express object-internal knowledge. Message passing, alongside inheritance, plays a significant role in the reasoning process.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors describe the main features of the Plog programming language, a logical language that supports such a paradigm and defines the combined paradigm as the combination of the main properties of both paradigms.
Abstract: Object-oriented programming and logic programming are two of the most used programming paradigms in artificial intelligence. The authors describe a proposal to combine these two paradigms into a common logical framework. The combined framework encompasses the main features of both paradigms, making it a suitable tool for developing AI applications. First, the combined paradigm is defined as the combination of the main properties of both paradigms. Then, the authors describe the main features of the Plog programming language, a logical language that supports such a paradigm.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: The neural network is found to yield better predictions than an optimum ARMA (autoregressive moving average) model and the problem of selecting the number of input and hidden nodes for modeling is studied by the predictive minimum description length (MDL) principle.
Abstract: An optimization tool for neural network architecture selection is presented. The main aim of the optimization tool is to reduce the size and complexity of the network and use the least number of weights and nodes for modeling and predictions on nonlinear time series. The problem of selecting the number of input and hidden nodes for modeling is studied using the predictive minimum description length (MDL) principle. The authors discuss comparatively the performance of neural networks and conventional methods in predicting nonlinear time series. The neural network is found to yield better predictions than an optimum ARMA (autoregressive moving average) model.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: The authors describe and prototype a tool that enables programmers to describe a general specification for a function, in a language independent of detailed design constructs, and the computer can identify, using case-based reasoning, an existing code sample or collection of code samples that matches the specification.
Abstract: For a software component to be reusable, it must have two characteristics: it must be designed for reuse, and it must be available for reuse. Object-oriented development enables the reuse of designs and code in future projects. However, the extent to which object-oriented development currently lends itself to such reuse is frequently overstated. The authors seek to improve the specification and retrieval mechanisms for reusable components in object-oriented languages. They describe and prototype a tool that enables programmers to describe a general specification for a function, in a language independent of detailed design constructs. The computer can then identify, using case-based reasoning, an existing code sample or collection of code samples that matches the specification.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: BEXA is a covering algorithm whose specialization model generalizes those of CN2 and members of Michalski's AQ family (1986) such as AQ15, and it is shown that a beam search rarely generates better-quality descriptions than a beamwidth of one.
Abstract: BEXA is a covering algorithm whose specialization model generalizes those of CN2 and members of Michalski's AQ family (1986) such as AQ15. All of these algorithms employ a time-consuming beam search to construct concept descriptions. It is shown that, for BEXA, a beam search rarely generates better-quality descriptions than a beamwidth of one. Three pruning strategies are also evaluated for BEXA, namely CN2's significance test, a novel stop-growth test, and Quinlan's post-pruning scheme for production rules (J. Quinlan, 1986). The stop-growth test produced better results than CN2's significance test for a beamwidth of one, while both schemes produced the best results for the same number of test databases when performing a beam search. Post-pruning produced better results than a stop-growth or significance test for a beam search, but the stop-growth test yielded results similar to post-pruning for a beamwidth of one.

Proceedings ArticleDOI
08 Nov 1993
TL;DR: A classification tool, based on the SOC (self-organizing classifier) neural model, is presented as an alternative solution to the problem of world modeling, aimed at navigation planning of an autonomous mobile robot.
Abstract: A classification tool, based on the SOC (self-organizing classifier) neural model, is presented as an alternative solution to the problem of world modeling, aimed at navigation planning of an autonomous mobile robot. Starting from rough sensorial data, the knowledge about the explored environment of a mobile robot can be incrementally organized by means of self-organizing maps and a set of heuristic rules, avoiding the computational overhead due to classical geometric approaches to world modeling. The resulting classification strategy, called SON (self-organizing navigation), makes it possible to map neural information onto symbols; the authors call such emergent symbols 'navigation situations'. The prototype has been successfully tested both with simulated and real data.