
Showing papers on "Generalization published in 1988"


Journal Article
TL;DR: The relationship between 'learning' in adaptive layered networks and the fitting of data with high-dimensional surfaces is discussed, leading naturally to a picture of 'generalization' in terms of interpolation between known data points and suggesting a rational approach to the theory of such networks.
Abstract: The relationship between 'learning' in adaptive layered networks and the fitting of data with high-dimensional surfaces is discussed. This leads naturally to a picture of 'generalization' in terms of interpolation between known data points and suggests a rational approach to the theory of such networks. A class of adaptive networks is identified which makes the interpolation scheme explicit. This class has the property that learning is equivalent to the solution of a set of linear equations. These networks thus represent nonlinear relationships while having a guaranteed learning rule.

3,538 citations
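The central observation of this entry (that learning in such networks reduces to solving a set of linear equations) can be sketched with a minimal radial-basis-function interpolator. The Gaussian basis, its width, and the toy data below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf_fit(X, y, width=1.0):
    """Fit an interpolating RBF network: solve Phi @ w = y exactly."""
    # Pairwise squared distances between training points
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2 * width ** 2))  # Gaussian basis matrix
    w = np.linalg.solve(Phi, y)           # learning = linear equations
    return w

def rbf_predict(X_train, w, X_new, width=1.0):
    """Evaluate the fitted network at new points."""
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2)) @ w

# Toy data: the network interpolates the training points exactly
X = np.array([[0.0], [1.0], [2.0]])
y = np.array([0.0, 1.0, 4.0])
w = rbf_fit(X, y)
y_hat = rbf_predict(X, w, X)
```

Because the basis matrix is square and nonsingular here, the "learning rule" is a single linear solve, which is exactly the guarantee the abstract highlights.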


Journal ArticleDOI
TL;DR: This paper will derive a generalization of backpropagation to recurrent systems (which input their own output), such as hybrids of perceptron-style networks and Grossberg/Hopfield networks, and does not require the storage of intermediate iterations to deal with continuous recurrence.

960 citations


Journal ArticleDOI
TL;DR: In this paper, the quantum mechanical dynamics of a particle coupled to a heat bath is treated by functional integral methods and a generalization of the Feynman-Vernon influence functional is derived.

880 citations


Journal ArticleDOI
Jacob Cohen1
TL;DR: Set correlation is a realization of the general multivariate linear model, can be viewed as a multivariate generalization of multiple correlation analysis, and may be employed in the analysis of m... as mentioned in this paper.
Abstract: Set correlation is a realization of the general multivariate linear model, can be viewed as a multivariate generalization of multiple correlation analysis, and may be employed in the analysis of m...

684 citations


Proceedings Article
01 Jan 1988
TL;DR: In this paper, the authors propose a means of using the knowledge in a network to determine the functionality or relevance of individual units, both for the purpose of understanding the network's behavior and improving its performance.
Abstract: This paper proposes a means of using the knowledge in a network to determine the functionality or relevance of individual units, both for the purpose of understanding the network's behavior and improving its performance. The basic idea is to iteratively train the network to a certain performance criterion, compute a measure of relevance that identifies which input or hidden units are most critical to performance, and automatically trim the least relevant units. This skeletonization technique can be used to simplify networks by eliminating units that convey redundant information; to improve learning performance by first learning with spare hidden units and then trimming the unnecessary ones away, thereby constraining generalization; and to understand the behavior of networks in terms of minimal "rules."

482 citations
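The trim step in this skeletonization scheme can be sketched in a few lines, assuming relevance is measured directly as the error increase when a unit's output is zeroed (the paper derives a cheaper approximation to this quantity; all names and data here are illustrative).

```python
import numpy as np

def relevance(hidden, w_out, y):
    """Relevance of each hidden unit: increase in squared error
    when that unit's output is zeroed (ablated)."""
    base = ((hidden @ w_out - y) ** 2).mean()
    rels = []
    for j in range(hidden.shape[1]):
        h = hidden.copy()
        h[:, j] = 0.0  # ablate unit j
        rels.append(((h @ w_out - y) ** 2).mean() - base)
    return np.array(rels)

def skeletonize(hidden, w_out, y, keep):
    """Trim the least relevant hidden units, keeping `keep` of them."""
    order = np.argsort(relevance(hidden, w_out, y))[::-1]
    return np.sort(order[:keep])

rng = np.random.default_rng(0)
H = rng.normal(size=(50, 4))
w = np.array([2.0, 0.0, -1.5, 0.0])  # units 1 and 3 carry no information
y = H @ w
kept = skeletonize(H, w, y, keep=2)
```

Units whose ablation leaves the error unchanged are exactly the redundant ones the abstract proposes to eliminate.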


Journal ArticleDOI
TL;DR: It is argued that in computer-assisted generalization, the spatial modelling process can be simulated only by strategies based on understanding and not by a mere sequence of operational processing steps.
Abstract: This paper reviews the prospects of computer-assisted generalization of spatial data. Generalization as a general human activity is first considered in a broad context and map generalization is defined as a special variant of spatial modelling. It is then argued that in computer-assisted generalization, the spatial modelling process can be simulated only by strategies based on understanding and not by a mere sequence of operational processing steps. A conceptual framework for knowledge-based generalization is then presented which can be broken down into five steps: structure recognition, process recognition, process modelling, process execution and display. With reference to the goals of map generalization, tasks of statistical and cartographic generalization are identified. The use of these types of tasks is discussed in relation to the concepts of digital landscape models (DLM) and digital cartographic models (DCM). A literature review is then presented in the context of this conceptual framework...

328 citations


Journal ArticleDOI
TL;DR: This thesis presents a logical formalism for representing and reasoning with probabilistic knowledge which offers combined, interacting, but still clearly separated, plausible inductive inference and sound deductive inference.
Abstract: This thesis presents a logical formalism for representing and reasoning with probabilistic knowledge. The formalism differs from previous efforts in this area in a number of ways. Most previous work has investigated ways of assigning probabilities to the sentences of a logical language. Such an assignment fails to capture an important class of probabilistic assertions, empirical generalizations. Such generalizations are particularly important for AI, since they can be accumulated through experience with the world. Thus, they offer the possibility of reasoning in very general domains, domains where no experts are available to gather subjective probabilities from. A logic is developed which can represent these empirical generalizations. Reasoning can be performed through a proof theory which is shown to be sound and complete. Furthermore, the logic can represent and reason with a very general set of assertions, including many non-numeric assertions. This also is important for AI, as numbers are usually not available. The logic makes it clear that there is an essential difference between empirical, or statistical, probabilities and probabilities assigned to sentences, e.g., subjective probabilities. The second part of the formalism is an inductive mechanism for assigning degrees of belief to sentences based on the empirical generalizations expressed in the logic. These degrees of belief have a strong advantage over subjective probabilities: they are founded on objective statistical knowledge about the world. Furthermore, the mechanism of assigning degrees of belief gives a natural answer to the question "Where do the probabilities come from?": they come from our experience with the world. The two parts of the formalism offer combined, interacting, but still clearly separated, plausible inductive inference and sound deductive inference.

310 citations


Journal ArticleDOI
TL;DR: In this paper, a continuously defined two-parameter generalization of the Tukey lambda family of distributions, which holds promise of a variety of additional applications, is variously studied.
Abstract: The Tukey lambda family of distributions together with its extensions have played an important role in statistical practice. In this paper a continuously defined two-parameter generalization of this family, which holds promise of a variety of additional applications, is variously studied. The coefficients of skewness and kurtosis and the density shapes of its members are examined and the family is related to the classical Pearsonian system of distributions.

226 citations
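Since the Tukey lambda family is defined through its quantile function, the flavour of such a generalization is easy to sketch. The two-parameter form below, with separate left and right tail-shape parameters, is one common extension given purely for illustration; it is an assumption, not necessarily the paper's exact parameterization.

```python
def tukey_lambda_quantile(u, lam):
    """Original one-parameter Tukey lambda quantile function."""
    return (u ** lam - (1 - u) ** lam) / lam

def generalized_quantile(u, lam3, lam4):
    """A two-parameter extension with separate tail-shape parameters
    (assumed form for illustration; the paper's version may differ)."""
    return (u ** lam3 - 1) / lam3 - ((1 - u) ** lam4 - 1) / lam4

# Equal shape parameters give a distribution symmetric about zero
q_med = generalized_quantile(0.5, 0.5, 0.5)
q_hi = generalized_quantile(0.9, 0.5, 0.5)
q_lo = generalized_quantile(0.1, 0.5, 0.5)
```

Letting the two shape parameters differ produces the skewed members whose skewness and kurtosis the abstract says are examined.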



Journal ArticleDOI
01 Mar 1988
TL;DR: Compatible maps, a generalization of commuting maps, are characterized in terms of coincidence points, and common fixed point theorems for compatible maps and commuting maps on compact metric spaces are obtained.
Abstract: Compatible maps, a generalization of commuting maps, are characterized in terms of coincidence points, and common fixed point theorems for compatible maps and commuting maps on compact metric spaces are obtained.

208 citations


Journal ArticleDOI
05 Jun 1988
TL;DR: In this paper, a generalization of the epipolar-plane image-analysis mapping technique is presented, which enables varying view direction, including varying over time, and provides three-dimensional connectivity information for building coherent spatial descriptions of observed objects.
Abstract: The previous implementations of the authors' epipolar-plane image-analysis mapping technique demonstrated the feasibility and benefits of the approach, but were carried out for restricted camera geometries. The question of more general geometries made the technique's utility for autonomous navigation uncertain. The authors have developed a generalization of the analysis that: (1) enables varying view direction, including varying over time; (2) provides three-dimensional connectivity information for building coherent spatial descriptions of observed objects; and (3) operates sequentially, allowing initiation and refinement of scene feature estimates while the sensor is in motion. To implement this generalization it was necessary to develop an explicit description of the evolution of images over time. They achieved this by building a process that creates a set of two-dimensional manifolds defined at the zeros of a three-dimensional spatiotemporal Laplacian. These manifolds represent explicitly both the spatial and temporal structure of the temporally evolving imagery and are termed spatiotemporal surfaces.

Proceedings ArticleDOI
24 Jul 1988
TL;DR: The authors apply neural networks to a generalization problem of predicting the ratings of corporate bonds, where conventional mathematical modeling techniques have yielded poor results and it is difficult to build rule-based artificial-intelligence systems.
Abstract: The authors apply neural networks to a generalization problem of predicting the ratings of corporate bonds, where conventional mathematical modeling techniques have yielded poor results and it is difficult to build rule-based artificial-intelligence systems. The results indicate that neural nets are a useful approach to generalization problems in such nonconservative domains, performing much better than mathematical modeling techniques like regression.

Journal ArticleDOI
TL;DR: The standard version of the inviscid two-invariant cap model is considered and a viscoplastic rate-dependent generalization is proposed; for the inviscid case, a new algorithm based on the notion of closest-point projection is proposed, together with a systematic procedure for fitting the model to experimental data.
Abstract: In this paper, the standard version of the inviscid two-invariant cap model is considered and a viscoplastic rate-dependent generalization is proposed. For the inviscid case, a new algorithm is propo...

Proceedings Article
01 Jan 1988
TL;DR: A generalization of the Winner-Take-All Network is presented and rigorously analyzed: the K-Winners-Take-All Network, which identifies the K largest of a set of N real numbers.
Abstract: We present and rigorously analyze a generalization of the Winner-Take-All Network: the K-Winners-Take-All Network. This network identifies the K largest of a set of N real numbers. The network model used is the continuous Hopfield model.
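The network's input/output behaviour is easy to state directly; the sketch below computes the same function in plain code (the continuous Hopfield dynamics that realize it in the paper are not reproduced here).

```python
def k_winners_take_all(values, k):
    """Mark the k largest of the inputs with 1 and the rest with 0,
    i.e. the function the K-WTA network computes at convergence."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    winners = [0] * len(values)
    for i in order[:k]:
        winners[i] = 1
    return winners

# With k = 1 this reduces to the ordinary Winner-Take-All behaviour
out = k_winners_take_all([0.3, 0.9, 0.1, 0.7, 0.5], 2)
```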

Journal ArticleDOI
Stefano Olla1
TL;DR: A large deviation principle for Gibbs random fields on Zd is proven and a corresponding large deviations proof of the Gibbs variational formula is given and a generalization of the Lanford theory of large deviations is obtained.
Abstract: A large deviation principle for Gibbs random fields on Zd is proven and a corresponding large deviations proof of the Gibbs variational formula is given. A generalization of the Lanford theory of large deviations is also obtained.

Journal ArticleDOI
TL;DR: In this paper, the concept of M-valued sets was introduced to solve the problem of the quotient w.r.t. a similarity relation, which is closely related to Poincare's paradox and central problems in cluster analysis.

Journal ArticleDOI
TL;DR: It is argued that explanation-based generalisation as recently proposed in the machine learning literature is essentially equivalent to partial evaluation, a well-known technique in the functional and logic programming literature.

Journal ArticleDOI
TL;DR: In this paper, a generalization of correspondence analysis to multivariate categorical data is proposed, where all two-way contingency tables of a set of categorical variables are simultaneously fitted by weighted least-squares.
Abstract: SUMMARY A generalization of correspondence analysis to multivariate categorical data is proposed, where all two-way contingency tables of a set of categorical variables are simultaneously fitted by weighted least-squares. An alternating least-squares algorithm is developed to perform the fitting. This technique has a number of advantages over the usual generalization known as multiple correspondence analysis. It is also an analogue of least-squares factor analysis for categorical data.

Journal ArticleDOI
TL;DR: This paper describes a method for causal attribution that can produce the analyses of examples that the generalization methods require, in the domain of simple procedures in human-computer interaction, and argues that none of the current analysis-based generalization methods fully captures Wertheimer's notion of understanding.

Journal ArticleDOI
TL;DR: This paper illustrates how the application of integer programming to logic can reveal parallels between logic and mathematics and lead to new algorithms for inference in knowledge-based systems.
Abstract: This paper illustrates how the application of integer programming to logic can reveal parallels between logic and mathematics and lead to new algorithms for inference in knowledge-based systems. If logical clauses (stating that at least one of a set of literals is true) are written as inequalities, then the resolvent of two clauses corresponds to a certain cutting plane in integer programming. By properly enlarging the class of cutting planes to cover clauses that state that at least a specified number of literals are true, we obtain a generalization of resolution that involves both cancellation-type and circulant-type sums. We show its completeness by proving that it generates all prime implications, generalizing an early result by Quine. This leads to a cutting-plane algorithm as well as a generalized resolution algorithm for checking whether a set of propositions, perhaps representing a knowledge base, logically implies a given proposition. The paper is intended to be readable by persons with either an operations research or an artificial intelligence background.
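The clause-to-inequality encoding described above is simple to illustrate: a clause becomes a 0-1 inequality whose left side contributes x for a positive literal and (1 - x) for a negated one. The sketch below (function and variable names are illustrative) checks both ordinary clauses (at least one literal true) and the paper's extended clauses (at least a specified number true).

```python
def clause_holds(assignment, literals, k=1):
    """Inequality form of a clause: sum of literal values >= k.
    A literal is (variable, polarity); its value is x if positive,
    1 - x if negated, with x in {0, 1}."""
    total = sum(assignment[v] if pos else 1 - assignment[v]
                for v, pos in literals)
    return total >= k

# Ordinary clause (x0 or not x1 or x2) as x0 + (1 - x1) + x2 >= 1
a = {0: 0, 1: 1, 2: 1}
ordinary = clause_holds(a, [(0, True), (1, False), (2, True)])

# Extended clause: at least 2 of {x0, x1, x2} must be true
extended = clause_holds(a, [(0, True), (1, True), (2, True)], k=2)
```

Raising k above 1 is precisely the enlargement of the clause class that makes the generalized resolution of the paper strictly stronger than ordinary resolution.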

Proceedings ArticleDOI
01 Feb 1988
TL;DR: The authors show how the process of object class definition by generalization can be incorporated into object-oriented systems by identifying types of semantic relationships that may hold between a generalization class's subclasses and their attributes.
Abstract: The authors show how the process of object class definition by generalization can be incorporated into object-oriented systems. Traditional message handling, which is mainly based on downward property inheritance, is revised and extended to upward property inheritance, so that a maximum of reusability of code and data can be achieved. Different types of semantic relationships that may hold between a generalization class's subclasses and their attributes are identified. The different semantic relationships can then be utilized to produce different default treatments of messages and upward property propagation.
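As a toy model of the upward property propagation described above (entirely illustrative; the paper's mechanism operates on message handling in a full object-oriented system), a generalization class can derive its attributes as those shared by every one of its subclasses:

```python
def upward_propagate(subclasses):
    """Derive a generalization class's attributes as the set of
    attributes common to all of its subclasses (toy model of
    upward property inheritance; names are hypothetical)."""
    attr_sets = [set(attrs) for attrs in subclasses.values()]
    return set.intersection(*attr_sets)

# Hypothetical subclasses of a "Vehicle" generalization class
subs = {
    "Car":   {"wheels", "engine", "doors"},
    "Truck": {"wheels", "engine", "cargo_bed"},
}
vehicle_attrs = upward_propagate(subs)
```

Properties that propagate upward this way need to be defined only once on the generalization class, which is the reuse of code and data the abstract aims for.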

Journal ArticleDOI
TL;DR: A modified finite-volume method, which is a direct generalization of the standard finite-difference method to arbitrary polygonal grids, is shown to be the most accurate.

Book
01 Jan 1988
TL;DR: The Third Edition has been meticulously updated and continues the successful pedagogical approach of the two previous editions, guiding students through the fundamental elements of formal deductive logic, classification and definition, fallacies, basic argument analysis, inductive generalization, statistical reasoning, and explanation.
Abstract: The Third Edition has been meticulously updated and continues the successful pedagogical approach of the two previous editions, guiding students through the fundamental elements of formal deductive logic, classification and definition, fallacies, basic argument analysis, inductive generalization, statistical reasoning, and explanation.

Journal ArticleDOI
TL;DR: A syntax-directed generalization of Owicki–Gries's Hoare logic for a parallel while language is presented, based on Hoare asserted programs of the form {Γ, A} p {B, Δ}, where Γ, Δ are sets of first-order formulas.

Journal ArticleDOI
TL;DR: In this paper, a method for expanding arbitrary powers of the characteristic polynomial of a matrix is developed, expressed in terms of matrix functions generalizing those of the permanent and determinant.

Journal ArticleDOI
TL;DR: In this paper, a generalization of the uniform mean centred directional sampling in the standardized n-dimensional Gaussian space is given, where two modifications of different nature are involved: shifting the origin to a point different from the mean and defining the sampling distribution in such a way that the exact probability on a given half-space is obtained by a single simulation.


Book ChapterDOI
Serge Haddad1
01 Jun 1988
TL;DR: This paper presents the generalization to coloured nets of the most efficient reductions defined by Berthelot for Petri nets, and defines extensions of the implicit place transformation and the pre and post agglomeration of transitions; the methodology does not require the storage of intermediate steps and is independent of the reduction one wants to generalize.
Abstract: This paper presents the generalization to coloured nets of the most efficient reductions defined by Berthelot for Petri nets. First, a generalization methodology is given that is independent of the reduction one wants to generalize. Then, based on that methodology, we define extensions of the implicit place transformation and the pre and post agglomeration of transitions. For each reduction we prove that the reduced net has exactly the same properties as the original net. Finally we completely reduce an improved model of database management with multiple copies, thus showing its correctness.

Journal ArticleDOI
TL;DR: In this article, the authors analyzed the validity generalization critique of James, Demaree, and Mulaik and concluded that their analysis is not relevant to the real-world use of validity generalization in organizations and has overlooked the bulk of the evidence against the situational specificity hypothesis.
Abstract: In this article we analyzed the James, Demaree, and Mulaik (1986) critique of validity generalization. We demonstrated that the James et al. article (a) is not relevant to the real-world use of validity generalization in organizations, (b) has overlooked the bulk of the evidence against the situational specificity hypothesis and, therefore, the substantive conclusion that the situational specificity hypothesis is "alive and well" cannot be supported, and (c) has confused the processes of hypothesis testing and parameter estimation in validity generalization and has made incorrect statements about the assumptions underlying both. In addition, (d) James et al.'s critique of the 75% rule is a statistical power argument and, as such, does not add to earlier statistical power studies; (e) the procedures for use of confidence intervals that they advocate are erroneous; (f) there is no double correction of artifacts in validity generalization, as they contend; (g) the bias in the correlation (r) and the sampling error formula for r that they discuss is well-known, trivial in magnitude, and has no empirical significance; and (h) the use of the Fisher's z transformation of r in validity generalization studies and other meta-analyses (which they advocate) creates an unnecessary inflationary bias in estimates of true validities and provides no benefits. In light of these facts, we conclude that the James et al. substantive conclusions and methodological recommendations are seriously flawed. This article is an analysis of the James, Demaree, and Mulaik (1986) critique of validity generalization methods and conclusions, a long and detailed article. In the interests of brevity, we will focus only on those portions of James et al. that we judge to be most in need of critical evaluation. Their remaining arguments are left to the reader to evaluate in light of the analysis presented here.

Proceedings ArticleDOI
15 Jun 1988
TL;DR: In this paper, a generalization of Kharitonov's four polynomial concept to the case of linearly dependent coefficient perturbations and more general zero location regions is presented.
Abstract: From a systems-theoretic point of view, Kharitonov's seminal theorem on stability of interval polynomials suffers from two fundamental limitations: First, the theorem only applies to polynomials with independent coefficient perturbations. Note that uncertainty in the physical parameters of a linear system typically results in dependent perturbations in the coefficients of the characteristic polynomial. Secondly, Kharitonov's Theorem only applies to zeros in the left half plane; more general zero location regions are not accommodated. In view of this motivation, the main result of this paper is a generalization of Kharitonov's four polynomial concept to the case of linearly dependent coefficient perturbations and more general zero location regions.