Author

Kunihiko Fukushima

Bio: Kunihiko Fukushima is an academic researcher from Osaka University. The author has contributed to research on topics including the Neocognitron and artificial neural networks. The author has an h-index of 17 and has co-authored 63 publications receiving 2,136 citations. Previous affiliations of Kunihiko Fukushima include the University of Electro-Communications and Tokyo University of Technology.


Papers
Book Chapter
01 Jan 1982
TL;DR: A neural network model, called a “neocognitron”, is proposed as a mechanism of visual pattern recognition, and computer simulation shows that it has characteristics similar to those of the visual systems of vertebrates.
Abstract: A neural network model, called a “neocognitron”, is proposed as a mechanism of visual pattern recognition. It is demonstrated by computer simulation that the neocognitron has characteristics similar to those of the visual systems of vertebrates.

969 citations

Journal Article
01 Sep 1983
TL;DR: In this article, a learning-with-a-teacher process is used to reinforce the modifiable synapses in the new large-scale model, instead of the learning-without-a-teacher process applied to a previous model.
Abstract: Recognition with a large-scale network is simulated on a PDP-11/34 minicomputer, and the network is shown to have a great capability for visual pattern recognition. The model consists of nine layers of cells. The authors demonstrate that the model can be trained to recognize handwritten Arabic numerals even with considerable deformations in shape. A learning-with-a-teacher process is used for the reinforcement of the modifiable synapses in the new large-scale model, instead of the learning-without-a-teacher process applied to a previous model. The authors focus on the mechanism for pattern recognition rather than that for self-organization.
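As a rough illustration of the kind of architecture the abstract describes (alternating feature-extracting S-layers and shift-tolerant C-layers), the following minimal NumPy sketch implements a single S-layer/C-layer pair. The kernel sizes, random weights, and toy input are assumptions made purely for illustration; this is not the paper's nine-layer trained model.

```python
# Illustrative sketch only: one S-layer (feature extraction) followed by one
# C-layer (pooling for shift tolerance). Sizes and weights are assumptions.
import numpy as np

def s_layer(x, kernels):
    """Feature-extracting layer: correlate the input with each kernel and
    half-wave rectify, loosely analogous to S-cell responses."""
    h, w = x.shape
    kh, kw = kernels.shape[1:]
    out = np.zeros((len(kernels), h - kh + 1, w - kw + 1))
    for k, kern in enumerate(kernels):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[k, i, j] = max(0.0, np.sum(x[i:i+kh, j:j+kw] * kern))
    return out

def c_layer(x, pool=2):
    """Pooling layer: take the local maximum over small windows in each feature
    plane, giving tolerance to small positional shifts (C-cell-like behaviour)."""
    planes, h, w = x.shape
    out = np.zeros((planes, h // pool, w // pool))
    for k in range(planes):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[k, i, j] = x[k, i*pool:(i+1)*pool, j*pool:(j+1)*pool].max()
    return out

# Toy forward pass over a random 16x16 "image" with 4 random 3x3 kernels.
rng = np.random.default_rng(0)
image = rng.random((16, 16))
kernels = rng.standard_normal((4, 3, 3))
features = c_layer(s_layer(image, kernels))
print(features.shape)  # (4, 7, 7)
```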

720 citations

Journal Article
TL;DR: The process of feature extraction by an S-cell is analyzed mathematically in this paper, and the role of the C-cells in deformation-invariant pattern recognition is discussed.
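As a pointer to the kind of expression analyzed there, the S-cell response in the neocognitron literature is commonly written in roughly the following simplified form (a: excitatory connection weights, u_C: inputs from the preceding C-layer, b and v: inhibitory weight and inhibitory-cell output, r: selectivity parameter). The exact indices and normalization follow the paper and are not reproduced here.

```latex
% Simplified, commonly cited form of the S-cell response; notation condensed.
\[
  u_S = r\,\varphi\!\left[
    \frac{1 + \sum_{\nu} a(\nu)\, u_C(\nu)}
         {1 + \dfrac{r}{1+r}\, b\, v} - 1
  \right],
  \qquad
  \varphi[x] = \max(x, 0)
\]
```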

142 citations

Journal Article
TL;DR: It is shown that various new functions can be realized by introducing, for example, top-down connections into the neocognitron: a mechanism of selective attention, recognition and completion of partly occluded patterns, restoration of occluded contours, and so on.

100 citations

Book
01 Jul 1988
TL;DR: Recognition with a large-scale network is simulated on a PDP-11/34 minicomputer; the network is shown to have a great capability for visual pattern recognition and can be trained to recognize handwritten Arabic numerals even with considerable deformations in shape.
Abstract: Recognition with a large-scale network is simulated on a PDP-11/34 minicomputer, and the network is shown to have a great capability for visual pattern recognition. The model consists of nine layers of cells. The authors demonstrate that the model can be trained to recognize handwritten Arabic numerals even with considerable deformations in shape. A learning-with-a-teacher process is used for the reinforcement of the modifiable synapses in the new large-scale model, instead of the learning-without-a-teacher process applied to a previous model. The authors focus on the mechanism for pattern recognition rather than that for self-organization.

92 citations


Cited by
Book
01 Jan 1995
TL;DR: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition, and is designed as a text, with over 100 exercises, to benefit anyone involved in the fields of neural computation and pattern recognition.
Abstract: From the Publisher: This is the first comprehensive treatment of feed-forward neural networks from the perspective of statistical pattern recognition. After introducing the basic concepts, the book examines techniques for modelling probability density functions and the properties and merits of the multi-layer perceptron and radial basis function network models. Also covered are various forms of error functions, principal algorithms for error function minimization, learning and generalization in neural networks, and Bayesian techniques and their applications. Designed as a text, with over 100 exercises, this fully up-to-date work will benefit anyone involved in the fields of neural computation and pattern recognition.
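For a concrete instance of one model family the book treats, the following is a minimal sketch of a radial basis function network fit by linear least squares on a toy 1-D regression problem. The number of centres, basis width, and data are illustrative assumptions, not anything prescribed by the book.

```python
# Minimal RBF-network sketch: Gaussian basis functions with fixed centres,
# output weights found by linear least squares. All choices are illustrative.
import numpy as np

def rbf_design_matrix(x, centres, width):
    """Gaussian basis functions evaluated at each input, plus a bias column."""
    phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width ** 2))
    return np.hstack([phi, np.ones((len(x), 1))])

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2.0 * np.pi * x) + 0.1 * rng.standard_normal(len(x))  # noisy targets

centres = np.linspace(0.0, 1.0, 10)   # fixed basis-function centres
width = 0.1
Phi = rbf_design_matrix(x, centres, width)

# Least-squares solution for the output-layer weights (sum-of-squares error).
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ w
print("training RMS error:", np.sqrt(np.mean((y - y_hat) ** 2)))
```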

19,056 citations

Journal Article
TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, and reviews deep supervised learning, unsupervised learning, reinforcement learning and evolutionary computation, and indirect search for short programs encoding deep and large networks.

14,635 citations

Proceedings Article
18 Jun 2018
TL;DR: In this article, the non-local operation computes the response at a position as a weighted sum of the features at all positions and can be used to capture long-range dependencies.
Abstract: Both convolutional and recurrent operations are building blocks that process one local neighborhood at a time. In this paper, we present non-local operations as a generic family of building blocks for capturing long-range dependencies. Inspired by the classical non-local means method [4] in computer vision, our non-local operation computes the response at a position as a weighted sum of the features at all positions. This building block can be plugged into many computer vision architectures. On the task of video classification, even without any bells and whistles, our non-local models can compete with or outperform current competition winners on both the Kinetics and Charades datasets. In static image recognition, our non-local models improve object detection/segmentation and pose estimation on the COCO suite of tasks. Code will be made available.
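A minimal sketch of the generic non-local operation the abstract describes, using a dot-product ("embedded Gaussian") pairwise function: the response at each position is a normalized weighted sum of embedded features at all positions. The dimensions and random projection matrices below are illustrative assumptions; the paper's full block additionally projects the output back to the input dimension and adds a residual connection.

```python
# Illustrative non-local operation: y_i = (1/C(x)) * sum_j f(x_i, x_j) g(x_j),
# with f given by a softmax over dot products of embedded features.
import numpy as np

def nonlocal_block(x, w_theta, w_phi, w_g):
    """x: (N, C) features at N positions; returns (N, D) non-local responses."""
    theta, phi, g = x @ w_theta, x @ w_phi, x @ w_g     # embeddings
    scores = theta @ phi.T                               # pairwise similarities
    scores -= scores.max(axis=1, keepdims=True)          # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)        # normalization C(x)
    return weights @ g                                    # weighted sum over all j

rng = np.random.default_rng(0)
N, C, D = 49, 32, 16                  # e.g. a flattened 7x7 feature map
x = rng.standard_normal((N, C))
w_theta, w_phi, w_g = (rng.standard_normal((C, D)) for _ in range(3))
y = nonlocal_block(x, w_theta, w_phi, w_g)
print(y.shape)  # (49, 16)
```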

8,059 citations

Journal Article
TL;DR: The objective of this review paper is to summarize and compare some of the well-known methods used in various stages of a pattern recognition system and identify research topics and applications which are at the forefront of this exciting and challenging field.
Abstract: The primary goal of pattern recognition is supervised or unsupervised classification. Among the various frameworks in which pattern recognition has been traditionally formulated, the statistical approach has been most intensively studied and used in practice. More recently, neural network techniques and methods imported from statistical learning theory have been receiving increasing attention. The design of a recognition system requires careful attention to the following issues: definition of pattern classes, sensing environment, pattern representation, feature extraction and selection, cluster analysis, classifier design and learning, selection of training and test samples, and performance evaluation. In spite of almost 50 years of research and development in this field, the general problem of recognizing complex patterns with arbitrary orientation, location, and scale remains unsolved. New and emerging applications, such as data mining, web searching, retrieval of multimedia data, face recognition, and cursive handwriting recognition, require robust and efficient pattern recognition techniques. The objective of this review paper is to summarize and compare some of the well-known methods used in various stages of a pattern recognition system and identify research topics and applications which are at the forefront of this exciting and challenging field.

6,527 citations

Journal Article
TL;DR: A broad survey of recent advances in convolutional neural networks can be found in this article, where the authors discuss improvements to CNNs in several aspects, namely layer design, activation functions, loss functions, regularization, optimization, and fast computation.

3,125 citations