Journal ArticleDOI

Learning Matrices and Their Applications

TLDR
A survey of the learning circuits which became known as learning matrices and some of their possible technological applications is given.
Abstract
The paper gives a survey of the learning circuits which became known as learning matrices and of some of their possible technological applications. The first section describes the principle of learning matrices. So-called conditioned connections between the characteristics of an object and the meaning of an object are formed in the learning phase. In the operation that connects an object's characteristics with its meaning (EB operation of the knowing phase), presenting the object characteristics causes the most similar associated meaning to be produced as a signal by maximum-likelihood decoding. Conversely, in the operation from an object's meaning to its characteristics (BE operation), applying an object meaning yields the associated object characteristics as signals by parallel reading. Depending on the characteristic signals processed (binary or analog), a distinction must be made between binary and nonbinary learning matrices. In the binary learning matrix the conditioned connections are a statistical measure of how frequently object characteristics and object meaning have been coordinated; in the nonbinary learning matrix they are a measure of an analog value proportional to a characteristic. Both types of matrix tolerate, within limits, unsystematic disturbances of the characteristic sets applied during EB operation. Moreover, the nonbinary learning matrix is invariant to systematic deviations between presented and learned characteristic sets (invariance to affine transformation, i.e. translation, rotation and skew).
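As an illustration of the EB and BE operations described above, the following is a minimal sketch of a binary learning matrix in Python (NumPy). It assumes a simple count-based weight for each conditioned connection and a maximum decision in place of the paper's circuit-level implementation; all names and the majority-vote read-out threshold are hypothetical choices, not taken from the paper.

```python
import numpy as np

class BinaryLearningMatrix:
    """Sketch of a binary learning matrix (Lernmatrix).

    Each weight counts how often a binary object characteristic
    co-occurred with an object meaning during the learning phase,
    i.e. the 'statistical measure of frequency' in the abstract.
    """

    def __init__(self, n_meanings, n_characteristics):
        self.W = np.zeros((n_meanings, n_characteristics))
        self.counts = np.zeros(n_meanings)

    def learn(self, characteristics, meaning):
        # Learning phase: strengthen the conditioned connections
        # between the presented characteristics and the meaning.
        self.W[meaning] += characteristics
        self.counts[meaning] += 1

    def eb(self, characteristics):
        # EB operation: characteristics -> most similar meaning,
        # realized here as a simple maximum decision over match scores.
        scores = self.W @ characteristics
        return int(np.argmax(scores))

    def be(self, meaning):
        # BE operation: meaning -> characteristic set, read in parallel.
        # A characteristic is read as 1 if it appeared in more than
        # half of the learned presentations of this meaning.
        return (self.W[meaning] > 0.5 * self.counts[meaning]).astype(int)

m = BinaryLearningMatrix(n_meanings=2, n_characteristics=4)
m.learn(np.array([1, 1, 0, 0]), meaning=0)
m.learn(np.array([0, 0, 1, 1]), meaning=1)
print(m.eb(np.array([1, 0, 0, 0])))  # -> 0, despite a disturbed characteristic set
print(m.be(1))                       # -> [0 0 1 1]
```

The EB call above also shows the tolerance to unsystematic disturbances: one of the two learned characteristics is missing, yet the maximum decision still selects the correct meaning.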


Citations
Journal ArticleDOI

30 years of adaptive neural networks: perceptron, Madaline, and backpropagation

TL;DR: The history, origination, operating characteristics, and basic theory of several supervised neural-network training algorithms (including the perceptron rule, the least-mean-square algorithm, three Madaline rules, and the backpropagation technique) are described.
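The least-mean-square (Widrow-Hoff) rule named in this survey can be sketched in a few lines: each sample nudges the weights in proportion to the prediction error. The data, learning rate, and epoch count below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lms_fit(X, y, lr=0.1, epochs=50):
    """Least-mean-square (Widrow-Hoff) rule: w += lr * error * x."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, y):
            error = target - w @ x   # prediction error for this sample
            w += lr * error * x      # gradient step on squared error
    return w

# Learn y = 2*x1 - x2 from a few consistent samples (illustrative data).
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = np.array([2.0, -1.0, 1.0, 3.0])
w = lms_fit(X, y)
print(np.round(w, 2))  # converges toward [2., -1.]
```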
Journal ArticleDOI

Locally Weighted Learning

TL;DR: The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, and applications of locally weighted learning.
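The combination of a distance function, a weighting function, and a local model structure discussed in that survey can be illustrated with locally weighted linear regression under a Gaussian smoothing kernel. The bandwidth, data, and function names below are illustrative assumptions.

```python
import numpy as np

def locally_weighted_predict(X, y, query, bandwidth=1.0):
    """Locally weighted linear regression at a single query point.

    Each training point is weighted by a Gaussian kernel of its
    distance to the query; a weighted least-squares line is then
    fit and evaluated at the query.
    """
    Xb = np.hstack([np.ones((len(X), 1)), X])     # add bias column
    qb = np.concatenate([[1.0], query])
    d2 = np.sum((X - query) ** 2, axis=1)         # squared distances
    w = np.exp(-d2 / (2 * bandwidth ** 2))        # smoothing weights
    W = np.diag(w)
    beta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)
    return qb @ beta

# Noisy samples of y = x^2; a locally fitted line tracks the curve
# near the query even though no single global line could.
rng = np.random.default_rng(0)
X = np.linspace(-2, 2, 40).reshape(-1, 1)
y = X.ravel() ** 2 + rng.normal(0, 0.05, 40)
print(locally_weighted_predict(X, y, np.array([1.0]), bandwidth=0.3))
```

Shrinking the bandwidth makes the fit more local (lower bias, higher variance), which is exactly the smoothing-parameter trade-off the survey analyzes.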
Proceedings Article

End-to-end memory networks

TL;DR: This paper proposes an end-to-end memory network with a recurrent attention model over a possibly large external memory, which can be seen as an extension of RNNsearch to the case where multiple computational steps (hops) are performed per output symbol.
Posted Content

End-To-End Memory Networks

TL;DR: A neural network with a recurrent attention model over a possibly large external memory that is trained end-to-end, and hence requires significantly less supervision during training, making it more generally applicable in realistic settings.
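A single attention "hop" over an external memory, the core read operation of the end-to-end memory networks summarized above, can be sketched as follows. For simplicity this sketch uses one memory matrix for both addressing and read-out, whereas the actual model learns separate input and output embeddings; all names here are hypothetical.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def memory_hop(query, memory):
    """One soft-attention hop over an external memory.

    The query is matched against every memory slot by inner product;
    the softmax scores weight a sum over the memory contents, which
    is returned as the read-out alongside the attention weights.
    """
    scores = memory @ query          # match score per slot
    p = softmax(scores)              # soft attention distribution
    return p @ memory, p             # weighted read-out, weights

memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0]])
query = np.array([4.0, 0.0, 0.0])
readout, attention = memory_hop(query, memory)
print(np.round(attention, 3))  # attention mass concentrates on the first slot
```

Because every step is differentiable, stacking several such hops can be trained end-to-end from input-output pairs alone, which is the source of the reduced supervision the TL;DR mentions.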
Book ChapterDOI

Stochastic Computing Systems

TL;DR: The invention of the stored-program digital computer during the Second World War made it possible to replace the lower-level mental processes of man by electronic data processing in machines, but we still lack the "steam engine" or "digital computer" that would provide the necessary technology for learning and pattern recognition by machines.
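The basic idea of stochastic computing, representing a value as the probability of a 1 in a random bitstream so that a single AND gate multiplies two values, can be sketched as follows. The stream length and seed are illustrative assumptions.

```python
import random

def bitstream(p, n, rng):
    """Encode probability p as a random bitstream of length n."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def stochastic_multiply(p, q, n=100_000, seed=0):
    """Multiply two probabilities with one AND gate per bit pair.

    The fraction of 1s in the AND of two independent streams
    estimates the product p * q.
    """
    rng = random.Random(seed)
    a = bitstream(p, n, rng)
    b = bitstream(q, n, rng)
    ones = sum(x & y for x, y in zip(a, b))
    return ones / n

print(stochastic_multiply(0.5, 0.6))  # estimates 0.5 * 0.6 = 0.30
```

The trade-off is precision for hardware simplicity: the estimate's standard error shrinks only with the square root of the stream length.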
References
Journal ArticleDOI

Nichtdigitale Lernmatrizen als Perzeptoren

TL;DR: The properties of learning matrices capable of learning and processing patterns of a non-digital nature are investigated, and it is shown that the recognition of patterns by means of such non-digital learning matrices offers extraordinary possibilities for the formation of invariants.
Journal ArticleDOI

STELLA: A scheme for a learning machine

TL;DR: A scheme for a learning machine is described; the machine, which is being constructed in the form of a mechanical tortoise and takes its name from its laboratory of origin, explores the possibilities of its future actions with a view to modifying its performance.
Journal ArticleDOI

A consistency technique for pattern association

TL;DR: A new technique based on the consistency of partial classifications is compared with other techniques for pattern association in the light of computer simulations and other considerations.