# A logical calculus of the ideas immanent in nervous activity

01 Jan 1990 - Bulletin of Mathematical Biology (Kluwer Academic Publishers) - Vol. 52, Iss. 4, pp. 99-115

TL;DR: In this article, it is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under another and gives the same results, although perhaps not in the same time.

About: This article was published in the Bulletin of Mathematical Biology on 1990-01-01 and has received 14,937 citations to date. It focuses on the topics: Propositional calculus & Existential quantification.

##### Citations



Aston University

TL;DR: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.

Abstract: From the Publisher:
This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts, the book examines techniques for modelling probability density functions and the properties and merits of the multi-layer perceptron and radial basis function network models. Also covered are various forms of error functions, principal algorithms for error function minimization, learning and generalization in neural networks, and Bayesian techniques and their applications. Designed as a text, with over 100 exercises, this fully up-to-date work will benefit anyone involved in the fields of neural computation and pattern recognition.

19,056 citations

### Cites background from "A logical calculus of the ideas imm..."

...1 can generate any Boolean function, provided the number M of hidden units is sufficiently large (McCulloch and Pitts, 1943)....

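The quoted claim — that a network of threshold units can generate any Boolean function given enough hidden units — can be illustrated with a small sketch. The construction below (names and structure are illustrative, not taken from either cited text) dedicates one McCulloch-Pitts unit to each input pattern on which the target function is true, then ORs the hidden units together:

```python
# Sketch: a two-layer net of McCulloch-Pitts threshold units realizing an
# arbitrary Boolean function via a DNF-style construction (one hidden unit
# per true row of the truth table). Illustrative only.

def mp_unit(inputs, weights, threshold):
    """A McCulloch-Pitts unit: fires (1) iff the weighted sum reaches the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def boolean_net(truth_table):
    """Build hidden units, one per input pattern mapped to 1."""
    hidden = []
    for pattern, output in truth_table:
        if output == 1:
            # This unit fires only on its designated pattern: weight +1 for
            # inputs that are 1, -1 for inputs that are 0; the threshold is
            # the number of 1s in the pattern.
            weights = [1 if x == 1 else -1 for x in pattern]
            hidden.append((weights, sum(pattern)))
    def net(inputs):
        h = [mp_unit(inputs, w, t) for w, t in hidden]
        # The output unit is an OR over the hidden units.
        return mp_unit(h, [1] * len(h), 1)
    return net

# XOR: not computable by a single threshold unit, but fine with two hidden units.
xor_table = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
xor = boolean_net(xor_table)
print([xor(p) for p, _ in xor_table])  # → [0, 1, 1, 0]
```

This construction needs up to 2^n hidden units for n inputs, which is why the quoted passage hedges with "provided the number M of hidden units is sufficiently large."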


TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium; it reviews deep supervised learning, unsupervised learning, reinforcement learning, evolutionary computation, and indirect search for short programs encoding deep and large networks.

14,635 citations


TL;DR: The chapter discusses two important directions of research to improve learning algorithms: the dynamic node generation, which is used by the cascade correlation algorithm; and designing learning algorithms where the choice of parameters is not an issue.

Abstract: Publisher Summary This chapter provides an account of different neural network architectures for pattern recognition. A neural network consists of several simple processing elements called neurons. Each neuron is connected to some other neurons and possibly to the input nodes. Neural networks provide a simple computing paradigm to perform complex recognition tasks in real time. The chapter categorizes neural networks into three types: single-layer networks, multilayer feedforward networks, and feedback networks. It discusses the gradient descent and the relaxation method as the two underlying mathematical themes for deriving learning algorithms. A lot of research activity is centered on learning algorithms because of their fundamental importance in neural networks. The chapter discusses two important directions of research to improve learning algorithms: the dynamic node generation, which is used by the cascade correlation algorithm; and designing learning algorithms where the choice of parameters is not an issue. It closes with the discussion of performance and implementation issues.
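The chapter's theme of error-driven learning can be made concrete with one of its simplest instances: the classic perceptron rule for a single-layer network. The sketch below is illustrative (names and parameters are my own, not the chapter's); it updates weights only on misclassified examples, in the direction that reduces the error:

```python
# Minimal sketch of the perceptron learning rule for a single-layer
# threshold network on linearly separable binary data. Illustrative only.

def train_perceptron(samples, n_inputs, lr=1.0, epochs=20):
    """Learn weights (plus a bias) for linearly separable binary data."""
    w = [0.0] * (n_inputs + 1)  # last entry is the bias weight
    for _ in range(epochs):
        for x, target in samples:
            xs = list(x) + [1.0]  # append constant bias input
            y = 1 if sum(wi * xi for wi, xi in zip(w, xs)) >= 0 else 0
            err = target - y
            # Update only on mistakes; err is +1 or -1, steering each
            # weight toward the correct side of the decision boundary.
            w = [wi + lr * err * xi for wi, xi in zip(w, xs)]
    return w

def predict(w, x):
    xs = list(x) + [1.0]
    return 1 if sum(wi * xi for wi, xi in zip(w, xs)) >= 0 else 0

# AND is linearly separable, so the rule converges.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = train_perceptron(data, n_inputs=2)
print([predict(w, x) for x, _ in data])  # → [0, 0, 0, 1]
```

The chapter's point about multilayer networks and gradient descent is precisely that this rule fails on non-separable problems (e.g. XOR), which motivates hidden layers and differentiable error functions.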

13,033 citations


TL;DR: This article will be concerned primarily with the second and third questions, which are still subject to a vast amount of speculation, and where the few relevant facts currently supplied by neurophysiology have not yet been integrated into an acceptable theory.

Abstract: The first of these questions is in the province of sensory physiology, and is the only one for which appreciable understanding has been achieved. This article will be concerned primarily with the second and third questions, which are still subject to a vast amount of speculation, and where the few relevant facts currently supplied by neurophysiology have not yet been integrated into an acceptable theory. With regard to the second question, two alternative positions have been maintained. The first suggests that storage of sensory information is in the form of coded representations or images, with some sort of one-to-one mapping between the sensory stimulus

8,434 citations


01 Jan 1988

TL;DR: The second and third questions are still subject to a vast amount of speculation, and the few relevant facts currently supplied by neurophysiology have not yet been integrated into an acceptable theory.


8,134 citations

##### References



01 Jun 1937

TL;DR: Carnap's entire theory of language structure appeared in The Logical Syntax of Language (1934), which led to his famous "principle of tolerance," by which everyone is free to mix and match the rules of his language, and therefore his logic, in any way he wishes.

Abstract: Rudolf Carnap's entire theory of language structure "came to me," he reports, "like a vision during a sleepless night in January 1931, when I was ill." This theory appeared in The Logical Syntax of Language (1934). Carnap argued that many philosophical controversies really depend upon whether a particular language form should be used. This leads him to his famous "principle of tolerance," by which everyone is free to mix and match the rules of his language, and therefore his logic, in any way he wishes. In this way, philosophical issues become reduced to a discussion of syntactical properties, plus reasons of practical convenience for preferring one form of language to another. In a tour de force of precise reasoning, Carnap also indicated how two model languages could be constructed. This is one of three books which Open Court is making available in paperback reprint in its Open Court Classics series. The other two are Carnap's The Logical Structure of the World and Schlick's General Theory of Knowledge.

894 citations