Open Access Journal Article (DOI)

Number detectors spontaneously emerge in a deep neural network designed for visual object recognition

TLDR
It is shown that network units tuned to abstract numerosity, and therefore reminiscent of real number neurons, spontaneously emerge in a biologically inspired deep neural network that was merely trained on visual object recognition.
Abstract
Humans and animals have a "number sense," an innate capability to intuitively assess the number of visual items in a set, its numerosity. This capability implies that mechanisms to extract numerosity indwell the brain's visual system, which is primarily concerned with visual object recognition. Here, we show that network units tuned to abstract numerosity, and therefore reminiscent of real number neurons, spontaneously emerge in a biologically inspired deep neural network that was merely trained on visual object recognition. These numerosity-tuned units underlay the network's number discrimination performance that showed all the characteristics of human and animal number discriminations as predicted by the Weber-Fechner law. These findings explain the spontaneous emergence of the number sense based on mechanisms inherent to the visual system.
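For readers who want to see what such an analysis looks like in practice, below is a minimal sketch, assuming torchvision's pretrained AlexNet as a stand-in for the paper's network. The dot-display generator `make_dot_image` and the peak-response selection of tuned units are illustrative simplifications; analyses like the paper's additionally control for low-level stimulus features (e.g., total dot area and density), which this sketch omits.

```python
# Minimal sketch (not the authors' exact pipeline): probe a CNN trained
# only on object recognition with dot displays of varying numerosity,
# then look for units whose mean response peaks at a preferred number.
# Per the Weber-Fechner law, tuning curves of real number neurons grow
# broader at larger numerosities (symmetric on a logarithmic axis).
import numpy as np
import torch
from torchvision import models

def make_dot_image(n_dots, size=224, rng=None):
    """Render n_dots white dots at random positions on a black image."""
    rng = rng or np.random.default_rng()
    img = np.zeros((size, size), dtype=np.float32)
    for _ in range(n_dots):
        cx, cy, r = rng.integers(20, size - 20), rng.integers(20, size - 20), 6
        y, x = np.ogrid[:size, :size]
        img[(x - cx) ** 2 + (y - cy) ** 2 <= r ** 2] = 1.0
    return torch.from_numpy(img).expand(3, -1, -1)  # gray -> 3 channels

# Pretrained object-recognition network; numerosity was never a training target.
model = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1).eval()
features = model.features  # convolutional stage (ImageNet normalization omitted)

numerosities = [1, 2, 4, 8, 16, 32]
responses = {n: [] for n in numerosities}
with torch.no_grad():
    for n in numerosities:
        for _ in range(20):  # 20 random displays per numerosity
            x = make_dot_image(n).unsqueeze(0)
            responses[n].append(features(x).flatten(1).squeeze(0))

# Mean tuning curve per unit; a numerosity-tuned unit shows a clear peak.
curves = torch.stack([torch.stack(responses[n]).mean(0) for n in numerosities])
preferred = curves.argmax(0)  # each unit's preferred numerosity index
```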


Citations
Journal Article (DOI)

Number: the Language of Science

J. B. S. Haldane
04 Jan 1941
TL;DR: Number: the Language of Science by Tobias Dantzig shows why science is inevitably and increasingly bound to use mathematics as its language.
Journal Article (DOI)

Temporal and spatial enumeration processes in the primate parietal cortex

TL;DR: The authors found that temporal and spatial enumeration processes engaged different populations of neurons in the intraparietal sulcus of behaving monkeys, and that another neuronal population represented the cardinality of a set irrespective of whether it had been cued in a spatial layout or across time.
Journal Article (DOI)

The Adaptive Value of Numerical Competence

TL;DR: Internal number representations determine how animals perceive stimulus magnitude, which in turn constrains their spontaneous decisions; these findings are placed in a framework that allows a more quantitative analysis of the adaptive value of, and selection pressures on, numerical competence.
Journal Article (DOI)

Theory of Mind May Have Spontaneously Emerged in Large Language Models

Michal Kosinski
04 Feb 2023
TL;DR: Theory of mind (ToM), the ability to impute unobservable mental states to others, is central to human social interactions, communication, empathy, self-consciousness, and morality; the paper reports evidence that ToM-like abilities may have spontaneously emerged in large language models.
Journal Article (DOI)

Structure learning and the posterior parietal cortex

TL;DR: The authors propose that structure learning is grounded in the actions primates take when they reach for objects or fixate them with their eyes, and they sketch a model of how this might occur in neural circuits.
References
Proceedings Article

ImageNet Classification with Deep Convolutional Neural Networks

TL;DR: A large, deep convolutional neural network, consisting of five convolutional layers, some followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax, achieved state-of-the-art performance on large-scale image classification.
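As a concrete illustration of this layout, here is a minimal PyTorch sketch of an AlexNet-style network; the channel widths and kernel sizes follow the commonly cited AlexNet configuration and are assumptions, not values taken from this summary.

```python
import torch.nn as nn

# Sketch of the described layout: 5 conv layers (some followed by
# max-pooling) and 3 fully-connected layers ending in a 1000-way classifier.
# Channel widths and kernel sizes follow the common AlexNet variant.
alexnet_like = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2), nn.ReLU(),
    nn.MaxPool2d(kernel_size=3, stride=2),
    nn.Conv2d(64, 192, kernel_size=5, padding=2), nn.ReLU(),
    nn.MaxPool2d(kernel_size=3, stride=2),
    nn.Conv2d(192, 384, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(256, 256, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(kernel_size=3, stride=2),
    nn.Flatten(),
    nn.Linear(256 * 6 * 6, 4096), nn.ReLU(),   # 6x6 feature maps for 224x224 input
    nn.Linear(4096, 4096), nn.ReLU(),
    nn.Linear(4096, 1000),  # logits; the softmax is applied by the loss
)
```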
Proceedings Article

Very Deep Convolutional Networks for Large-Scale Image Recognition

TL;DR: The authors investigated the effect of convolutional network depth on accuracy in the large-scale image recognition setting and showed that a significant improvement over prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.
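For a concrete sense of the 16-19 weight-layer configurations, the sketch below loads the two deepest variants as packaged in torchvision and counts their learnable layers; this is an illustration, not the paper's training setup.

```python
import torch.nn as nn
from torchvision import models

# Illustration only (untrained here; weights=None): the two deepest
# configurations from the paper, as packaged in torchvision.
vgg16 = models.vgg16(weights=None)  # 13 conv + 3 fully-connected layers
vgg19 = models.vgg19(weights=None)  # 16 conv + 3 fully-connected layers

def count_weight_layers(net):
    """Count layers carrying learnable weights (conv and linear)."""
    return sum(isinstance(m, (nn.Conv2d, nn.Linear)) for m in net.modules())

print(count_weight_layers(vgg16), count_weight_layers(vgg19))  # -> 16 19
```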
Journal Article

Dropout: a simple way to prevent neural networks from overfitting

TL;DR: It is shown that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets.
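A minimal sketch of dropout as described here: during training each unit's output is zeroed with probability p (and the survivors rescaled by 1/(1-p)), so no unit can co-adapt to rely on any other; at evaluation time the layer is deterministic. The module and layer sizes below are illustrative.

```python
import torch
import torch.nn as nn

classifier = nn.Sequential(
    nn.Linear(512, 256), nn.ReLU(),
    nn.Dropout(p=0.5),          # randomly zeroes half the activations in training
    nn.Linear(256, 10),
)

x = torch.randn(4, 512)
classifier.train()              # dropout active: stochastic outputs
y_train = classifier(x)
classifier.eval()               # dropout disabled: deterministic outputs
y_eval = classifier(x)
```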
Journal Article (DOI)

ImageNet classification with deep convolutional neural networks

TL;DR: A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes and employed a recently developed regularization method called "dropout" that proved to be very effective.
Proceedings Article

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.
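A minimal sketch of batch normalization as summarized here: each channel is standardized over the mini-batch and then rescaled by learned parameters, stabilizing layer inputs so that higher learning rates, and hence far fewer training steps, become feasible. The block below is illustrative.

```python
import torch
import torch.nn as nn

block = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.BatchNorm2d(16),   # standardizes each of the 16 channels per mini-batch
    nn.ReLU(),
)

x = torch.randn(8, 3, 32, 32)    # batch of 8 images
out = block(x)
print(out.shape)                 # torch.Size([8, 16, 32, 32])
```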