
Showing papers in "International Journal of Pattern Recognition and Artificial Intelligence in 1989"


Journal ArticleDOI
TL;DR: The computational realisation of the advocated approach to combining evidence is demonstrated for the application of edge labelling, with details of how both the measurement generation process and the world-model may be expressed probabilistically and of the advantages to be gained from using different forms of support-function.
Abstract: We present a specification of the problem of combining evidence that arises in the approach to consistent object-labelling known as probabilistic relaxation. This specification differs from others in several important respects. Firstly, we ensure internal consistency by distinguishing between directly and indirectly interacting objects. Secondly, we avoid certain problems of interpretation and meaning by regarding the iterative updating of probabilities as a filtering process on the measurements for objects. Finally, we overcome the problem of the exponential complexity of the resulting evidence combining formula by deriving practical support functions of at most polynomial complexity. The computational realisation of the advocated approach to combining evidence is demonstrated for the application of edge labelling. This application includes details of how both the measurement generation process and the world-model may be expressed probabilistically and of the advantages to be gained from using different forms of support-function.

154 citations
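A minimal sketch in Python of the general probabilistic relaxation update this paper builds on, assuming a simple arithmetic-average support function and made-up compatibility coefficients; the paper's own polynomial-complexity support functions and measurement model are not reproduced here.

import numpy as np

def relaxation_labelling(p, r, neighbours, n_iter=10):
    """Iteratively update label probabilities p[i, l] with a simple
    arithmetic-average support function.  p: (n_objects, n_labels);
    r[(i, j)]: (n_labels, n_labels) compatibilities between objects i and j."""
    n, _ = p.shape
    for _ in range(n_iter):
        q = np.zeros_like(p)
        for i in range(n):
            for j in neighbours[i]:
                q[i] += r[(i, j)] @ p[j]        # support from neighbour j
            q[i] /= max(len(neighbours[i]), 1)  # average over neighbours
        p = p * q
        p /= p.sum(axis=1, keepdims=True)       # renormalise per object
    return p

# Toy example: two neighbouring objects, labels {edge, no-edge}.
p0 = np.array([[0.6, 0.4], [0.5, 0.5]])
compat = np.array([[0.9, 0.1], [0.1, 0.9]])     # neighbouring labels tend to agree
r = {(0, 1): compat, (1, 0): compat}
print(relaxation_labelling(p0, r, {0: [1], 1: [0]}))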


Journal ArticleDOI
TL;DR: By combining the split and the merge operations for shape matching, it becomes unnecessary to apply any edit operation to a model shape, which makes the distance between the input shape and the model shape more meaningful and stable, and improves recognition results.
Abstract: Due to noise and distortion, segmentation uncertainty is a key problem in structural pattern analysis. In this paper we propose the use of the split operation for shape recognition by attributed string matching. After illustrating the disadvantage of attributed string matching using the merge operation, the split operation is proposed. Under the guidance of the model shape, an input shape can be reapproximated, using the split operation, into a new attributed string representation. By combining the split and the merge operations for shape matching it is unnecessary to apply any type of edit operation to a model shape. This makes the distance between the input shape and the model shape more meaningful and stable, and improves recognition results. An algorithm for attributed string matching by split-and-merge is proposed. To eliminate the effect of the number of primitives in the model shape on the shape distance, shape recognition based on a similarity measure is also proposed. Good experimental results demonstrate the feasibility of the proposed approach for general shape recognition.

36 citations
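A rough Python sketch of attributed string matching via a weighted edit distance and a model-normalised similarity measure; the primitives, costs and normalisation are illustrative guesses, and the paper's split and merge operations (re-approximating the input under guidance of the model) are not implemented.

# An attributed string is a list of primitives, each (length, angle_in_degrees).
def prim_cost(a, b):
    """Dissimilarity between two primitives (hypothetical weighting)."""
    dlen = abs(a[0] - b[0])
    dang = min(abs(a[1] - b[1]), 360 - abs(a[1] - b[1])) / 180.0
    return dlen + dang

def edit_distance(inp, model):
    """Weighted edit distance between attributed strings (change/insert/delete only)."""
    n, m = len(inp), len(model)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = d[i - 1][0] + inp[i - 1][0]          # delete cost = primitive length
    for j in range(1, m + 1):
        d[0][j] = d[0][j - 1] + model[j - 1][0]        # insert cost = primitive length
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(d[i - 1][j] + inp[i - 1][0],
                          d[i][j - 1] + model[j - 1][0],
                          d[i - 1][j - 1] + prim_cost(inp[i - 1], model[j - 1]))
    return d[n][m]

def similarity(inp, model):
    """Distance normalised by model size, so models with many primitives are not
    unfairly penalised (one plausible reading of the paper's similarity measure)."""
    total = sum(p[0] for p in model)
    return 1.0 / (1.0 + edit_distance(inp, model) / total)

shape = [(2.0, 0), (1.5, 90), (2.0, 180), (1.5, 270)]
noisy = [(1.0, 0), (1.1, 5), (1.5, 90), (2.0, 182), (1.4, 268)]
print(similarity(noisy, shape))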


Journal ArticleDOI
TL;DR: A hybrid technique is proposed, involving the use of structural prototypes from pattern recognition combined with production systems from artificial intelligence to enable the computer interface to observe, detect, identify and anticipate conceptual structures.
Abstract: The objects of pattern recognition have in the past been primarily physical in nature, describable in terms of some physical units of measurement. The patterns of interest in this paper are conceptual or logical. They are to be representative of structures known to be operative in the human mind. A hybrid technique is proposed, involving the use of structural prototypes from pattern recognition combined with production systems from artificial intelligence. The purpose is to enable the computer interface to observe, detect, identify and anticipate conceptual structures. As a result, the interface system should have a more comprehensive “perspective” on what the user is doing and likely to do next. It should therefore be better prepared to provide him/her with adaptive assistance.

8 citations


Journal ArticleDOI
TL;DR: The paper discusses methods of representing spatial knowledge, with particular focus on the broad categories known as analogical and propositional representations, and presents several solutions to an example problem of neurological localization.
Abstract: The use of spatial knowledge is necessary in a variety of artificial intelligence and expert systems applications. The need is not only in tasks with spatial goals such as image interpretation and robot motion, but also in tasks not involving spatial goals, e.g. diagnosis and language understanding. The paper discusses methods of representing spatial knowledge, with particular focus on the broad categories known as analogical and propositional representations. The problem of neurological localization is considered in some detail as an example of intelligent problem-solving that requires the use of spatial knowledge. Several solutions for the problem are presented: the first uses an analogical representation only, the second uses a propositional representation and the third uses an integrated representation. Conclusions about the different representations for building intelligent systems are drawn.

7 citations
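A small Python sketch contrasting the two representation styles on one query, with invented region labels: an analogical occupancy grid answers an adjacency question by scanning the layout directly, while a propositional store answers it by looking up an explicit relation.

# Analogical representation: a grid in which spatial structure is preserved directly.
grid = [
    ["lesion", "cortex", "cortex"],
    ["cortex", "cortex", "ventricle"],
]

def adjacent_in_grid(grid, a, b):
    """Scan the grid for a cell of type a next to a cell of type b (4-neighbourhood)."""
    for r, row in enumerate(grid):
        for c, cell in enumerate(row):
            if cell != a:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < len(grid) and 0 <= cc < len(grid[0]) and grid[rr][cc] == b:
                    return True
    return False

# Propositional representation: the same knowledge stored as explicit relational facts.
facts = {("adjacent", "lesion", "cortex"), ("adjacent", "cortex", "ventricle")}

def adjacent_in_facts(facts, a, b):
    return ("adjacent", a, b) in facts or ("adjacent", b, a) in facts

print(adjacent_in_grid(grid, "lesion", "cortex"), adjacent_in_facts(facts, "lesion", "cortex"))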


Journal ArticleDOI
TL;DR: This work deals with the question of the number N of solutions of SAT and shows that for a given SAT-instance, it should be possible to find an estimate of N with a margin of confidence in polynomial time.
Abstract: In propositional (zero-order) logic a system of logical rules may be put in the form of a conjunction of disjunctions, i.e. a “satisfiability” or SAT problem. SAT is central to NP-complete problems. Any result obtained on SAT would have consequences for many problems important in artificial intelligence. We deal with the question of the number N of solutions of SAT. Firstly, any system of SAT clauses may be transformed into a system of independent clauses by an exponential process; N may then be computed exactly. Secondly, by a statistical approach, results are obtained showing that for a given SAT instance, it should be possible to find an estimate of N with a margin of confidence in polynomial time. Thirdly, we demonstrate the usefulness of these ideas on large knowledge bases.

7 citations
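A minimal Python sketch of one way to estimate the number N of satisfying assignments with a confidence margin, using naive Monte Carlo sampling over random assignments; this only illustrates the flavour of a statistical estimate and is not the paper's method or its transformation to independent clauses.

import math
import random

def satisfies(clauses, assignment):
    """clauses: list of lists of non-zero ints, DIMACS style (-3 means 'not x3')."""
    return all(any((lit > 0) == assignment[abs(lit)] for lit in clause)
               for clause in clauses)

def estimate_count(clauses, n_vars, samples=20000, z=1.96):
    """Monte Carlo estimate of the number N of satisfying assignments,
    with an approximate 95% margin (normal approximation)."""
    hits = 0
    for _ in range(samples):
        assignment = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
        hits += satisfies(clauses, assignment)
    p = hits / samples
    margin = z * math.sqrt(p * (1 - p) / samples)
    scale = 2 ** n_vars
    return p * scale, margin * scale

clauses = [[1, 2], [-1, 3]]     # (x1 or x2) and (not x1 or x3); exactly 4 of 8 assignments satisfy it
print(estimate_count(clauses, 3))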


Journal ArticleDOI
TL;DR: Pattern recognition methods and techniques are used in the subsystems DIALOGUE, ANALYTIC and HOMEOSTAT, and the recognition algorithms that operate on the information retained in the database and knowledge base are described.
Abstract: A definition of expert systems is given, their pragmatic requirements are stated and their structure is described. Pattern recognition methods and techniques are used in the subsystems DIALOGUE, ANALYTIC and HOMEOSTAT. The recognition algorithms that operate on the information retained in the database and knowledge base are described. The problems of recognition that arise in the construction of expert systems are noted.

6 citations


Journal ArticleDOI
TL;DR: Experimental results show that with the modification and multiprocessor implementations, the SPTA was considerably speeded up.
Abstract: A modification is proposed to the Safe Point Thinning Algorithm (SPTA) to speed it up. The modified algorithm was implemented on a single processor. It was then implemented on the Homogeneous Multiprocessor Proper using two techniques: data decomposition and function decomposition. Experimental results show that with our modification and multiprocessor implementations, the SPTA was considerably speeded up.

6 citations
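A small Python sketch of the data-decomposition idea only, assuming a thread pool and a simplified, placeholder deletion test in place of the real SPTA safe-point check: each worker scans a band of image rows for removable contour points, and the deletions are applied after all workers report.

from concurrent.futures import ThreadPoolExecutor
import numpy as np

def deletable(img, r, c):
    """Placeholder test standing in for the SPTA safe-point check:
    a black contour point with 2..6 black 8-neighbours."""
    if img[r, c] == 0:
        return False
    nb = img[r - 1:r + 2, c - 1:c + 2].sum() - 1
    on_contour = (img[r - 1, c] == 0 or img[r + 1, c] == 0 or
                  img[r, c - 1] == 0 or img[r, c + 1] == 0)
    return on_contour and 2 <= nb <= 6

def thin_pass(img, n_workers=4):
    """One pass with data decomposition: each worker scans a band of rows and
    reports the points it would delete; deletions are applied afterwards."""
    rows = list(range(1, img.shape[0] - 1))
    bands = np.array_split(rows, n_workers)

    def scan(band):
        return [(r, c) for r in band
                for c in range(1, img.shape[1] - 1) if deletable(img, r, c)]

    with ThreadPoolExecutor(n_workers) as pool:
        to_delete = [p for part in pool.map(scan, bands) for p in part]
    for r, c in to_delete:
        img[r, c] = 0
    return len(to_delete)

img = np.zeros((8, 10), dtype=int)
img[2:6, 2:8] = 1                      # a thick 4x6 bar of object pixels
print(thin_pass(img), "contour points removed in one parallel pass")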


Journal ArticleDOI
TL;DR: Two general principles of diagnostic modeling are discussed, based upon observations from expert protocols: model variables assume values relative to normal levels (the normality principle) and these values are propagated through the model to account for the production of single outputs (the single output principle).
Abstract: Diagnostic models of complex mechanisms are designed specifically to support diagnostic reasoning. Diagnostic reasoning is that aspect of troubleshooting which determines a set of component faults that can account for observed abnormalities in system performance. A system’s performance is evaluated by comparing observations of the system’s behavior with its functional specifications. As such, diagnostic models must incorporate elements of both function and behavior. We define two necessary properties of diagnostic models reflecting this outlook: architectural fidelity and functional adequacy. We then discuss two general principles of diagnostic modeling, based upon observations from expert protocols: model variables assume values relative to normal levels (the normality principle) and these values are propagated through the model to account for the production of single outputs (the single output principle). These two principles allow significant simplification in the definition of diagnostic models of complex mechanisms. We illustrate our approach to diagnostic modeling and reasoning with examples from xerography.

4 citations
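A toy Python sketch, with an invented copier-flavoured component chain, of the two principles as described: variables take qualitative values relative to normal, values are propagated toward a single output, and a single-fault hypothesis is kept if it accounts for the observed abnormality.

# Qualitative values relative to normal levels (the "normality principle").
LOW, NORMAL, HIGH = "low", "normal", "high"

# Hypothetical serial mechanism: each healthy component passes its input level
# through unchanged; a faulty one forces the deviation listed for it.
COMPONENTS = ["charger", "exposure", "developer", "fuser"]
FAULT_EFFECT = {"charger": LOW, "exposure": HIGH, "developer": LOW, "fuser": LOW}

def simulate(fault=None, system_input=NORMAL):
    """Propagate one value through the chain toward the single output
    (the "single output principle"); at most one component is assumed faulty."""
    level = system_input
    for comp in COMPONENTS:
        level = FAULT_EFFECT[comp] if comp == fault else level
    return level

def diagnose(observed_output):
    """Return the single-fault hypotheses that account for the observation."""
    return [c for c in COMPONENTS if simulate(fault=c) == observed_output]

print(diagnose(LOW))    # components whose fault alone would explain a too-light print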


Journal ArticleDOI
TL;DR: Hexagonal pyramid automata are considered and shown to accept some languages generated by hexagonal array grammars; the languages accepted by the square-array automata are compared with those generated by array grammars and those accepted by one-way 2-D cellular automata.
Abstract: Systolic pyramid automata accepting square arrays are defined. Homogeneous and semi-homogeneous pyramid automata are shown to have equal power though regular pyramid automata are more powerful. Languages accepted by these automata are compared with languages generated by array grammars and languages accepted by one-way 2-D cellular automata. Hexagonal pyramid automata are also considered and are shown to accept some languages generated by hexagonal array grammars.

3 citations
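A small Python sketch of the bottom-up pyramid idea for square arrays of side 2^n, with an illustrative cell transition not taken from the paper: each internal cell combines the states of its four children, and the root's state decides acceptance; the example language is "all symbols equal".

REJECT = object()                      # sentinel state meaning "reject"

def combine(a, b, c, d):
    """Transition of an internal pyramid cell: keep the common symbol, else reject."""
    return a if a is not REJECT and a == b == c == d else REJECT

def pyramid_accepts(array):
    """Bottom-up pyramid evaluation over a 2^n x 2^n array of symbols."""
    states = [list(row) for row in array]            # leaf states = input symbols
    while len(states) > 1:
        half = len(states) // 2
        states = [[combine(states[2 * i][2 * j],     states[2 * i][2 * j + 1],
                           states[2 * i + 1][2 * j], states[2 * i + 1][2 * j + 1])
                   for j in range(half)] for i in range(half)]
    return states[0][0] is not REJECT

print(pyramid_accepts(["aa", "aa"]), pyramid_accepts(["ab", "aa"]))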


Journal ArticleDOI
TL;DR: In this paper a new method for testing algorithms and programs of multivariate statistical procedures, the so-called “exact samples method”, is introduced, and the simple linear regression programs from four of the most popular standard packages are tested and compared with the help of the new method.
Abstract: Programs for multivariate statistical procedures are sometimes included in expert systems. The requirements of accuracy, exactness and reliability for such programs are very high. In this paper a new method for testing algorithms and programs of multivariate statistical procedures, the so-called “exact samples method”, is introduced. The simple linear regression programs from four of the most popular standard packages are tested and compared with the help of the new method.

3 citations
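A tiny Python sketch of the idea as described, assuming numpy's polyfit stands in for a package under test: build a sample whose least-squares line is known exactly by construction, run the program on it, and report the deviation from the exact coefficients.

import numpy as np

# "Exact sample": a data set constructed so the least-squares fit is known
# exactly; for x = 1, 2, 3 and y = 1, 2, 4 the fit is y = (3/2) x - 2/3.
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 4.0])
exact_slope, exact_intercept = 3.0 / 2.0, -2.0 / 3.0

# Program under test: here numpy's polyfit stands in for a statistical package.
slope, intercept = np.polyfit(x, y, 1)

print("slope error:    ", abs(slope - exact_slope))
print("intercept error:", abs(intercept - exact_intercept))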


Journal ArticleDOI
TL;DR: The proposed framework and methods can be considered as a hybrid methodology in which both structural and decision-theoretic pattern recognition are integrated and can result in rendering tractable the possibly hard original inductive learning problem associated with the given task.
Abstract: A new framework is introduced which allows the formulation of difficult structural classification tasks in terms of decision-theoretic-based pattern recognition. It is based on extending the classical formulation of generalized linear discriminant functions so as to permit each given object to have a different vector representation in each class. The proposed extension properly accounts for the corresponding extension of the classical learning techniques of linear discriminant functions in a way such that the convergence of the extended techniques can still be proved. The proposed framework can be considered as a hybrid methodology in which both structural and decision-theoretic pattern recognition are integrated. Furthermore, it can be considered as a means to achieve convenient tradeoffs between the inductive and deductive ways of knowledge acquisition, which can result in rendering tractable the possibly hard original inductive learning problem associated with the given task. The proposed framework and methods are illustrated through their use in two difficult structural classification tasks, showing both the appropriateness and the capability of these methods to obtain useful results.
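A short Python sketch of the central extension, with invented data and feature maps: each object has a different vector representation under each class hypothesis, a single weight vector scores all of them, and a perceptron-style correction is applied whenever the true class does not score highest.

import numpy as np

def train(samples, n_classes, dim, epochs=20, eta=1.0):
    """Perceptron-style learning with class-dependent representations: each sample
    provides one feature vector per class (a dict class -> vector), and w is
    corrected whenever the true class does not achieve the highest score."""
    w = np.zeros(dim)
    for _ in range(epochs):
        for vecs, true_c in samples:
            scores = [w @ vecs[c] for c in range(n_classes)]
            pred = int(np.argmax(scores))
            if pred != true_c:
                w += eta * (vecs[true_c] - vecs[pred])   # classical-style correction
    return w

# Toy data: two classes; each object is represented differently under each class
# hypothesis (e.g. features of its best structural parse as class 0 vs class 1).
samples = [
    ({0: np.array([1.0, 0.2]), 1: np.array([0.1, 0.9])}, 0),
    ({0: np.array([0.9, 0.1]), 1: np.array([0.2, 1.0])}, 0),
    ({0: np.array([0.2, 1.0]), 1: np.array([1.0, 0.1])}, 1),
]
w = train(samples, n_classes=2, dim=2)
print(w, [int(np.argmax([w @ v[c] for c in range(2)])) for v, _ in samples])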

Journal ArticleDOI
TL;DR: An approach to ESIA development based on pyramidal-recursive structures, which serve both as a means of image representation and as a way of establishing a correspondence between elements of the two hierarchies, is proposed.
Abstract: Peculiarities of expert systems for image analysis (ESIA), as well as their distinctions from traditional expert systems, are considered. Two interconnected components of the ESIA, which simulate the left and right cerebral hemisphere mechanisms of human visual perception, are pointed out. The left, formal-logical mechanism is simulated by a “conceptual hierarchy” reflecting the knowledge of problem-area specialists. The right, spatial-pattern mechanism is simulated by a “visual hierarchy” and image processing expertise. An approach to ESIA development based on pyramidal-recursive structures, which serve both for image representation and for establishing a correspondence between elements of the two hierarchies, is proposed.
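A minimal Python sketch of a pyramidal-recursive image representation, assuming a plain quadtree: homogeneous square regions become leaves of the visual hierarchy, mixed regions split into four quadrants; linking the leaves to concepts of the conceptual hierarchy is left out.

def build_quadtree(img, r0, c0, size):
    """Recursively split a square image region into quadrants until it is
    homogeneous; returns ('leaf', value) or ('node', [four subtrees])."""
    vals = {img[r][c] for r in range(r0, r0 + size) for c in range(c0, c0 + size)}
    if len(vals) == 1 or size == 1:
        return ("leaf", vals.pop())
    h = size // 2
    return ("node", [build_quadtree(img, r0, c0, h),
                     build_quadtree(img, r0, c0 + h, h),
                     build_quadtree(img, r0 + h, c0, h),
                     build_quadtree(img, r0 + h, c0 + h, h)])

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 0, 1],
       [0, 0, 0, 0]]
print(build_quadtree(img, 0, 0, 4))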

Journal ArticleDOI
TL;DR: This work considers the class of logical decision rules and its applications for the solution of various problems of multivariate statistical analysis: discriminant and regression analysis, and cluster analysis.
Abstract: We consider the class of logical decision rules and its applications to various problems of multivariate statistical analysis: discriminant and regression analysis, and cluster analysis. Some useful properties of the statistical analysis methods using the class under consideration are shown. Particular attention is paid to the possibility, not offered by other methods, of presenting statistical results in a language of logical statements close to natural language.
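A tiny Python sketch of a logical decision rule as a conjunction of elementary attribute conditions, with an invented rule and record; the point is only that such a rule can be both evaluated and printed as a near-natural-language statement.

# A logical decision rule: a conjunction of elementary attribute conditions
# (illustrative rule and data, not from the paper).
rule = [("age", ">=", 40), ("blood_pressure", ">", 140), ("smoker", "==", True)]

OPS = {">=": lambda a, b: a >= b, ">": lambda a, b: a > b, "==": lambda a, b: a == b}

def rule_applies(rule, record):
    return all(OPS[op](record[attr], value) for attr, op, value in rule)

def rule_to_text(rule):
    return "IF " + " AND ".join(f"{a} {op} {v}" for a, op, v in rule) + " THEN class = risk"

record = {"age": 52, "blood_pressure": 150, "smoker": True}
print(rule_to_text(rule))
print(rule_applies(rule, record))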

Journal ArticleDOI
TL;DR: It is shown that semantic information is required to parse grammatical agreement in Hindi.
Abstract: Hindi exhibits an obligatory grammatical agreement pattern (subject-verb, object-verb, or neutral agreement), and any adequate parser must reject or mark strings that violate grammatical agreement. We develop a combination of strategies for parsing grammatical agreement in Hindi and implement them in an Augmented Transition Network (ATN) parser. It is shown that semantic information is required to parse grammatical agreement in Hindi.
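A rough Python sketch of the agreement pattern as summarised above, not of the ATN parser itself; the lexicon entries, case markers and defaults are illustrative: the verb agrees with an unmarked subject, with an unmarked object if the subject is ergative, and otherwise shows neutral agreement.

# Toy lexicon of agreement features: gender ('m'/'f') and number ('sg'/'pl').
# Words and features are illustrative, not a real Hindi lexicon fragment.
FEATURES = {"larka": ("m", "sg"), "larki": ("f", "sg"), "kitab": ("f", "sg")}

def agreement_controller(subject, subj_marker, obj, obj_marker):
    """Very rough version of the Hindi pattern: the verb agrees with an unmarked
    subject; with the object if the subject is ergative ('ne') and the object is
    unmarked; otherwise agreement is neutral (default masculine singular)."""
    if subj_marker is None:
        return FEATURES[subject]
    if subj_marker == "ne" and obj_marker is None:
        return FEATURES[obj]
    return ("m", "sg")                      # neutral agreement

def check_agreement(verb_feats, subject, subj_marker, obj, obj_marker):
    return verb_feats == agreement_controller(subject, subj_marker, obj, obj_marker)

# Ergative subject, unmarked feminine singular object, verb marked f/sg: accepted.
print(check_agreement(("f", "sg"), "larki", "ne", "kitab", None))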

Journal ArticleDOI
TL;DR: It is natural that the efficiency condition requires that T0 and pr0 be defined effectively as well; this condition is satisfied by defining T0 and pr0 effectively.
Abstract: It is natural that the efficiency condition requires that T0 and pr0 be defined effectively as well.