Journal ArticleDOI
Learning representations by back-propagating errors
TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which leads the network to represent important features of the task domain.
Abstract: We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure.
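The weight-adjustment loop described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's original formulation: it assumes a tiny 2-2-1 network of sigmoid units trained on a hypothetical XOR task, and adjusts each weight by gradient descent on the summed squared difference between actual and desired outputs.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)

# Hypothetical task: XOR, a problem a perceptron cannot solve
# but a network with hidden units can.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

# Weights: input -> hidden (2x2), hidden -> output (2), plus biases.
w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [0.0, 0.0]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = 0.0
lr = 0.5  # learning rate

def forward(x):
    h = [sigmoid(sum(w_h[j][i] * x[i] for i in range(2)) + b_h[j])
         for j in range(2)]
    y = sigmoid(sum(w_o[j] * h[j] for j in range(2)) + b_o)
    return h, y

def total_error():
    # Sum of squared differences between actual and desired outputs.
    return sum((t - forward(x)[1]) ** 2 for x, t in data)

e_before = total_error()
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Output delta: derivative of the error w.r.t. the output unit's net input.
        d_o = (y - t) * y * (1 - y)
        # Hidden deltas: the output delta propagated back through w_o.
        d_h = [d_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent weight updates.
        for j in range(2):
            w_o[j] -= lr * d_o * h[j]
            b_h[j] -= lr * d_h[j]
            for i in range(2):
                w_h[j][i] -= lr * d_h[j] * x[i]
        b_o -= lr * d_o
e_after = total_error()

print(e_before, e_after)
```

After training, the hidden units come to encode features of the inputs (here, combinations useful for XOR) that no input or output unit represents directly, which is the property the abstract highlights.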
Citations
Journal ArticleDOI
Computer-Aided Diagnosis Systems for Lung Cancer: Challenges and Methodologies
Ayman El-Baz, Garth M. Beache, Georgy Gimel'farb, Kenji Suzuki, Kazunori Okada, Ahmed Elnakib, Ahmed Soliman, Behnoush Abdollahi +7 more
TL;DR: The paper addresses several challenges that researchers face in each implementation step and outlines the strengths and drawbacks of the existing approaches for lung cancer CAD systems.
Journal ArticleDOI
Continuous-variable quantum neural networks
Nathan Killoran, Thomas R. Bromley, Juan Miguel Arrazola, Maria Schuld, Nicolás Quesada, Seth Lloyd +5 more
TL;DR: In this paper, the authors demonstrate that neural networks and quantum computers can be implemented on the same physical platform, based on photonics, providing a natural extension of classical machine learning algorithms into the quantum realm.
Journal ArticleDOI
Motor Learning with Unstable Neural Representations
TL;DR: This work proposes that motor cortex is a redundant neural network, i.e., any single behavior can be realized by multiple configurations of synaptic strengths. It further hypothesizes that the synaptic modifications underlying learning contain a random component, which causes wandering among synaptic configurations with equivalent behaviors but different neural representations.
Journal ArticleDOI
A new neural network approach including first guess for retrieval of atmospheric water vapor, cloud liquid water path, surface temperature, and emissivities over land from satellite microwave observations
TL;DR: In this article, a neural network approach is developed that uses a first-guess approach to retrieve the surface skin temperature, the integrated water vapor content, the cloud liquid water path and the microwave surface emissivities between 19 and 85 GHz over land from SSM/I observations.
Journal ArticleDOI
Online and offline handwritten Chinese character recognition: A comprehensive study and new benchmark
TL;DR: In this article, a new adaptation layer is proposed to reduce the mismatch between training and test data on a particular source layer, and the adaptation process can be efficiently and effectively implemented in an unsupervised manner.