scispace - formally typeset
Author

Nobuyuki Matsui

Bio: Nobuyuki Matsui is an academic researcher from the University of Hyogo. He has contributed to research on topics including artificial neural networks and cellular automata. He has an h-index of 23, has co-authored 195 publications, and has received 1,980 citations. Previous affiliations of Nobuyuki Matsui include Hyogo University and the Artificial Intelligence Center.


Papers
Journal Article
TL;DR: This paper shows by experiments that the quaternion version of the Back Propagation algorithm achieves correct geometrical transformations in three-dimensional space, as well as in color space for an image compression problem, whereas real-valued BP algorithms fail.
Abstract: Quaternion neural networks are models in which the neurons' computations are based on quaternions, the four-dimensional extension of complex numbers. This paper shows by experiments that the quaternion version of the Back Propagation (BP) algorithm achieves correct geometrical transformations in three-dimensional space, as well as in color space for an image compression problem, whereas real-valued BP algorithms fail. Simulations also show that the quaternion neural network converges faster than a real-valued neural network on the 3-bit parity check problem.
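The 3D geometrical transformations that quaternion networks handle natively come from quaternion algebra itself. As a minimal illustration (not the paper's network or learning code, and with all function names chosen here), the sketch below implements the Hamilton product and rotates a 3D vector by the conjugation q v q̄:

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(q):
    """Quaternion conjugate: negate the vector part."""
    return np.array([q[0], -q[1], -q[2], -q[3]])

def rotate(v, axis, angle):
    """Rotate 3-vector v about a unit axis by the given angle via q v q̄."""
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * np.asarray(axis)])
    vq = np.concatenate([[0.0], v])
    return qmul(qmul(q, vq), qconj(q))[1:]

# A 90° rotation about the z axis maps the x unit vector to the y unit vector.
print(np.round(rotate(np.array([1.0, 0.0, 0.0]), [0, 0, 1], np.pi / 2), 6))
```

A single quaternion weight thus encodes a full 3D rotation, which is the structural advantage the experiments exploit over real-valued BP.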

186 citations

Book ChapterDOI
03 Sep 2003
TL;DR: This paper shows by experiments that the quaternion version of the Back Propagation (BP) algorithm achieves correct geometrical transformations in color space for an image compression problem, whereas real-valued BP algorithms fail.
Abstract: Quaternion neural networks are models in which the neurons' computations are based on quaternions, the four-dimensional extension of complex numbers. This paper shows by experiments that the quaternion version of the Back Propagation (BP) algorithm achieves correct geometrical transformations in color space for an image compression problem, whereas real-valued BP algorithms fail.

154 citations

Journal ArticleDOI
TL;DR: Simulations have shown that the Qubit model solves learning problems with significantly improved efficiency compared to the classical model, and they suggest that the improved performance is due to the use of superposition of neural states and the use of a probability interpretation in the observation of the model's output states.
Abstract: Neural networks have attracted much interest in the last two decades for their potential to realistically describe brain functions, but so far they have failed to provide models that can be simulated in a reasonable time on computers; rather they have been limited to toy models. Quantum computing is a possible candidate for improving the computational efficiency of neural networks. In this framework of quantum computing, the Qubit neuron model, proposed by Matsui and Nishimura, has shown a high efficiency in solving problems such as data compression. Simulations have shown that the Qubit model solves learning problems with significantly improved efficiency as compared to the classical model. In this paper, we confirm our previous results in further detail and investigate what contributes to the efficiency of our model through 4-bit and 6-bit parity check problems, which are known as basic benchmark tests. Our simulations suggest that the improved performance is due to the use of superposition of neural states and the use of probability interpretation in the observation of the output states of the model.
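The two ingredients credited with the speedup, superposition of neural states and a probability interpretation of the output, can be caricatured in a few lines. The sketch below is a deliberately simplified toy, not the actual Matsui–Nishimura qubit neuron model: each input carries a phase, the phases combine linearly, and the output is the Born-rule probability of observing |1⟩:

```python
import numpy as np

def qubit_neuron(phases, weights, bias):
    """Toy qubit-style neuron: combine input phases linearly, then read out
    the probability of observing |1> from the resulting one-qubit state."""
    theta = np.dot(weights, phases) + bias   # aggregated phase
    amp1 = np.sin(theta)                     # amplitude of the |1> component
    return amp1 ** 2                         # Born-rule probability

# In-phase inputs reinforce; opposite phases interfere destructively.
print(qubit_neuron(np.array([np.pi / 4,  np.pi / 4]), np.ones(2), 0.0))  # ≈ 1.0
print(qubit_neuron(np.array([np.pi / 4, -np.pi / 4]), np.ones(2), 0.0))  # ≈ 0.0
```

The interference between phases, invisible to a classical weighted sum of real activations, is the kind of effect the paper's benchmarks probe.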

118 citations

Journal ArticleDOI
TL;DR: Associative memory networks based on the quaternionic Hopfield neural network are investigated, and it is clarified that there exist at most 16 stable states, called multiplet components, as the degenerated stored patterns, and that each of these states has its own basin in the quaternionic networks.
Abstract: Associative memory networks based on the quaternionic Hopfield neural network are investigated in this paper. These networks are composed of quaternionic neurons, and the inputs, outputs, thresholds, and connection weights are represented by quaternions, which form a class of hypercomplex number systems. The energy function of the network and the Hebbian rule for embedding patterns are introduced. The stable states and their basins are explored for networks with three and four neurons. It is clarified that there exist at most 16 stable states, called multiplet components, as the degenerated stored patterns, and that each of these states has its own basin in the quaternionic networks.
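The Hebbian storage and stable-state recall described above can be illustrated with the familiar real-valued Hopfield analogue; the paper's networks generalize the states and weights below to quaternions, which is where the 16 "multiplet" stable states arise. All names in this sketch are illustrative:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: W = (1/n) * sum_p x_p x_p^T, with zero self-connections."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=20):
    """Iterate sign updates until the state stops changing (a stable state)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1
        if np.array_equal(new, state):
            break
        state = new
    return state

patterns = np.array([[1,  1, 1,  1, -1, -1, -1, -1],
                     [1, -1, 1, -1,  1, -1,  1, -1]])
W = hebbian_weights(patterns)
noisy = patterns[0].copy()
noisy[0] = -1                    # corrupt one component of the stored pattern
print(recall(W, noisy))          # recovers the first stored pattern
```

The corrupted input falls in the stored pattern's basin of attraction and the update rule pulls it back, which is the associative-memory behavior the paper analyzes in the quaternionic setting.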

112 citations

Journal ArticleDOI
TL;DR: In this paper, a qubit-like neural network is constructed for a 3-bit quantum circuit, the minimum quantum logical gate describing all basic logical operations, and it is investigated how the circuit parameters can be determined by learning.
Abstract: Investigations into quantum computations have begun from the pioneering theoretical studies of Feynman, Deutsch, and others, and detailed studies have been done since the discovery of a quantum algorithm which can solve the problem of factorizing a large integer in polynomial time by Shor in 1994. Recently, the existence of nonalgorithmic quantum computations in microtubules inside a neural circuit has been debated, resulting in the proposal of the concept of the quantum neural computation theory, although detailed studies have not as yet been made. In this paper, in order to construct a new framework for describing the cohesiveness of the distribution and synthesis inherent in a neural network, a neural state is described quantum dynamically and a qubitlike neural network corresponding to the quantum circuit of quantum computations is studied. Specifically, a qubitlike neural network is constructed for a 3-bit quantum circuit, which is the minimum quantum logical gate describing all basic logical operations, and in this model we investigate how to determine circuit parameters by learning. © 2000 Scripta Technica, Electron Comm Jpn Pt 3, 83(10): 67–73, 2000
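Determining circuit parameters by learning can be illustrated on a far smaller scale than the paper's 3-bit circuit. The toy below is an assumption-laden sketch, not the paper's model or learning rule: it searches for the angle θ at which a single-qubit R_y rotation acts as a NOT gate on |0⟩:

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation gate R_y(theta) as a 2x2 real matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s],
                     [s,  c]])

# "Learn" theta so that R_y implements NOT: it should map |0> to |1>.
target = np.array([0.0, 1.0])                      # the |1> state
thetas = np.linspace(0, 2 * np.pi, 1001)
errs = [np.sum((ry(t) @ np.array([1.0, 0.0]) - target) ** 2) for t in thetas]
best = thetas[int(np.argmin(errs))]
print(round(best, 3))                              # ≈ pi (3.142)
```

A grid search stands in here for the paper's learning procedure; the point is only that gate parameters can be fit to a target input-output behavior.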

67 citations


Cited by
Book
26 Jul 2012
TL;DR: The foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory, 'contextuality' and 'quantum entanglement', are introduced; these allow cognitive phenomena to be modeled in non-reductionist ways.
Abstract: Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, 'contextuality', is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, 'quantum entanglement', allows cognitive phenomena to be modeled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light. Introducing the basic principles in an easy-to-follow way, this book does not assume a physics background or a quantum brain and comes complete with a tutorial and fully worked-out applications in important areas of cognition and decision.

745 citations

Journal ArticleDOI
TL;DR: A broad spectrum of FNN optimization methodologies, including conventional and metaheuristic approaches, is summarized, posing interesting challenges for future research in the present information-processing era.

398 citations

Journal ArticleDOI
TL;DR: An enhanced and generalized PNN (EPNN) is presented that uses local decision circles (LDCs) to incorporate local information and heterogeneity in the training data, overcoming a shortcoming of the conventional PNN and improving its robustness to noise in the data.
Abstract: In recent years the Probabilistic Neural Network (PNN) has been used in a large number of applications due to its simplicity and efficiency. PNN assigns the test data to the class with maximum likelihood compared with the other classes. The likelihood of the test data with respect to each training sample is computed in the pattern layer through a kernel density estimation using a simple Bayesian rule. The kernel is usually a standard probability distribution function such as a Gaussian function, and a spread parameter is used as a global parameter that determines the width of the kernel. The Bayesian rule in the pattern layer estimates the conditional probability of each class given an input vector without considering any local densities or heterogeneity in the training data. In this paper, an enhanced and generalized PNN (EPNN) is presented using local decision circles (LDCs) to overcome this shortcoming and improve robustness to noise in the data. Local decision circles enable EPNN to incorporate local information and non-homogeneity existing in the training population; each circle has a radius that limits the contribution of the local decision. In the conventional PNN the spread parameter can be optimized for maximum classification accuracy; in the proposed EPNN two parameters, the spread parameter and the radius of the local decision circles, are optimized to maximize the performance of the model. Accuracy and robustness of EPNN are compared with PNN using three different benchmark classification problems (iris, diabetes, and breast cancer data) and five different ratios of training data to testing data: 90:10, 80:20, 70:30, 60:40, and 50:50. EPNN provided the most accurate results consistently for all ratios. Robustness of PNN and EPNN is investigated using different values of signal-to-noise ratio (SNR); the accuracy of EPNN is consistently higher than that of PNN at all levels of SNR and for all ratios of training data to testing data.
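The pattern-layer computation described above, a per-class Gaussian kernel density estimate followed by a maximum-likelihood decision, is compact enough to sketch. The code below is a plain conventional PNN for illustration, with made-up data; the paper's EPNN would additionally bound each kernel's contribution by the radius of a local decision circle:

```python
import numpy as np

def pnn_classify(x, train_X, train_y, spread=0.5):
    """Basic PNN: per-class Gaussian kernel density estimate at x;
    assign x to the class with the highest mean kernel response."""
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        Xc = train_X[train_y == c]                       # training samples of class c
        d2 = np.sum((Xc - x) ** 2, axis=1)               # squared distances to x
        scores.append(np.mean(np.exp(-d2 / (2 * spread ** 2))))
    return classes[int(np.argmax(scores))]

# Tiny illustrative dataset: two well-separated classes in 2D.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_classify(np.array([0.2, 0.1]), X, y))   # → 0
print(pnn_classify(np.array([0.8, 0.9]), X, y))   # → 1
```

The single `spread` parameter here is the global kernel width the abstract refers to; EPNN's second parameter, the LDC radius, would restrict which training samples contribute to each score.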

314 citations

Journal ArticleDOI
TL;DR: This work compares the temporal dynamics of percept alternations observed during auditory streaming with those observed for visual plaids, as well as the susceptibility of both modalities to volitional control, indicating that auditory and visual alternations share common principles of perceptual bistability.

292 citations

01 Jan 2007

281 citations