
Showing papers on "Artificial neural network published in 1972"


Book
01 Jan 1972
TL;DR: Using metaphors to describe the brain, this book depicts the distributed interactions that underlie intelligence in humans, animals, and machines (robots), and covers basic ideas about neural networks, both artificial and biological.
Abstract: From the Publisher: Shows how highly distributed cooperative computation deepens our understanding of the human mind/brain and catalyzes the development of computing machinery. Using metaphors to describe the brain, the book depicts the distributed interactions that underlie intelligence in humans, animals, and machines (robots). It covers basic ideas about neural networks, both artificial and biological, and provides models showing how the brain works and how schemas mediate our perception, knowledge, and action.

171 citations


Journal ArticleDOI
TL;DR: A neural network model is introduced which approximates the behavior of an integrator; the response of the "integrator" to sustained inputs is derived and shown to approximate the desired proportionality between input magnitude and rate of change of output.

Abstract: A neural network model is introduced which approximates the behavior of an integrator. It is developed from cells that are physiologically realistic. Requirements for insensitivity of network function to cell threshold distribution are derived, and network performance as the forward-path element of a neuromuscular control system is analyzed. The response of the "integrator" to sustained inputs is derived and shown to approximate the desired proportionality between input magnitude and rate of change of output. Finally, deviations from physiological realism are discussed, and the visual accommodation system is described as an example of a biological servomechanism in which a neural integrator seems probable.
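The defining property described in the abstract, that a sustained input produces an output whose rate of change is proportional to the input's magnitude, can be sketched in a few lines of discrete-time simulation. This is a generic illustration of an integrator, not the paper's cell-level network; the gain `k`, time step `dt`, and step count are assumptions.

```python
# Discrete-time sketch of an "integrator": for a sustained (constant)
# input u, the output's rate of change is proportional to u.
# Generic illustration only; not the paper's physiological model.

def integrate(u, k=1.0, dt=0.01, steps=1000):
    """Accumulate a constant input u over steps*dt seconds of simulated
    time; implements dy/dt = k * u by Euler steps. Returns final y."""
    y = 0.0
    for _ in range(steps):
        y += k * u * dt  # dy/dt = k * u
    return y

# A sustained input of 2.0 held for 10 s of simulated time yields
# y close to 20.0: the output grows at a rate proportional to the input.
print(integrate(2.0))
```

Doubling the input doubles the slope of the output, which is the proportionality the paper derives for its neural "integrator."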

35 citations


Journal ArticleDOI
Luigi Accardi, A. Aiello
TL;DR: The geometric concept of “quadrant-degeneration” is studied and shown to be independent of the algebraic concept of rank-degeneration; the results are employed to solve some global problems of synthesis without the use of the theory of linear inequalities.
Abstract: Some properties of the global behaviour of a model of neural network are considered. The geometric concept of “quadrant-degeneration” is studied and it is shown to be independent of the algebraic concept of rank-degeneration. The results obtained are employed to solve some global problems of synthesis (i.e., independent of the initial state of the network) without the use of the theory of linear inequalities.

5 citations


Journal ArticleDOI
TL;DR: Physical principles underlying some recent studies on information processing in nonlinear neurons and neural networks are outlined in simple mathematical terms.
Abstract: Physical principles underlying some recent studies on information processing in nonlinear neurons and neural networks are outlined in simple mathematical terms.

3 citations


Journal ArticleDOI
Makoto Oonuki
TL;DR: The time characteristics of a linear network in the brain are obtained by the method of the “time partition function,” which is analogous to a grand partition function or a distribution function in statistical mechanics.
Abstract: The time characteristics of a linear network in the brain are obtained by the method of the “time partition function,” which is analogous to a grand partition function or a distribution function in statistical mechanics. The analogy between the average density in a many-particle system and the reciprocal of the frequency in a network is shown. By this method, the frequency distribution functions are obtained with respect to a network composed of two layers, the network used in information retrieval and the network generating a brain wave.

2 citations


Journal ArticleDOI
TL;DR: The Perceptron is the most commonly used model among adaptive neural nets, but is shown to be inadequate to account for perception.
Abstract: Adaptive neural nets are collections of threshold logic units (TLUs) connected by variable weights. They are often compared to biological systems, serving as models for memory and perception, and are frequently used for Optical Character Recognition (OCR). The Perceptron is the most commonly used model among these nets. Taking it as an example, we show it to be inadequate to account for perception. The following notation is used: φ1, φ2, ... denote individual characters; Φ denotes the class of all characters φ1, φ2, ....
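The threshold logic unit and variable-weight adaptation described above can be sketched as a single perceptron trained with the classic error-correction rule. The toy data (logical AND standing in for character features), learning rate, and epoch count are assumptions for illustration; this is not the paper's experiment.

```python
# Minimal perceptron: one threshold logic unit (TLU) with variable
# weights, trained by the classic rule w += lr * (target - output) * x.
# Toy data only; an illustration, not the paper's setup.

def tlu(weights, bias, x):
    """Threshold logic unit: fire (1) if the weighted sum exceeds 0."""
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

def train_perceptron(data, n_inputs, lr=0.1, epochs=20):
    """Adapt weights and bias with the perceptron learning rule."""
    w = [0.0] * n_inputs
    b = 0.0
    for _ in range(epochs):
        for x, target in data:
            err = target - tlu(w, b, x)
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# A linearly separable toy problem (logical AND) is learned correctly.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data, n_inputs=2)
```

A single TLU can only realize linearly separable classifications (XOR, for instance, is not learnable), which is one concrete sense in which the Perceptron falls short as a model of perception.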

2 citations