Journal ArticleDOI

Learning representations by back-propagating errors

TLDR
Back-propagation repeatedly adjusts the weights of the connections in the network to minimize a measure of the difference between the actual output vector of the net and the desired output vector; in doing so, it leads internal hidden units to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
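The weight-adjustment procedure the abstract describes can be sketched in a few lines of NumPy. The XOR task, layer sizes, learning rate, and epoch count below are illustrative assumptions, not taken from the paper; the sketch only shows the general shape of minimizing a sum-of-squares error by propagating error derivatives backward through a layer of hidden units.

```python
import numpy as np

# Minimal back-propagation sketch: repeatedly adjust connection weights to
# minimize a sum-of-squares error between actual and desired outputs.
# Task and hyperparameters are illustrative choices, not from the paper.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: a mapping a single-layer perceptron cannot learn, but a network
# with hidden units can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))   # hidden -> output weights
b2 = np.zeros(1)
lr = 0.5                            # learning rate

for _ in range(10000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)        # hidden activations
    O = sigmoid(H @ W2 + b2)        # output activations
    E = 0.5 * np.sum((O - Y) ** 2)  # error measure being minimized

    # Backward pass: propagate error derivatives from output to hidden layer.
    dO = (O - Y) * O * (1.0 - O)    # dE/d(net input) at the output units
    dH = (dO @ W2.T) * H * (1.0 - H)

    # Gradient-descent weight updates.
    W2 -= lr * (H.T @ dO)
    b2 -= lr * dO.sum(axis=0)
    W1 -= lr * (X.T @ dH)
    b1 -= lr * dH.sum(axis=0)

print(E)  # error after training; it should be close to zero
```

The hidden units here end up encoding features of the input (such as combinations of the two bits) that neither input unit represents alone, which is the point the abstract makes about hidden representations.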


Citations

Designing neural networks through neuroevolution

TL;DR: This Review looks at several key aspects of modern neuroevolution, including large-scale computing, the benefits of novelty and diversity, the power of indirect encoding, and the field’s contributions to meta-learning and architecture search.

Performance evaluation of a sequential minimal radial basis function (RBF) neural network learning algorithm

TL;DR: The M-RAN algorithm is shown to realize networks with far fewer hidden neurons and the same or better approximation/classification accuracy; the time taken for learning (training) is also considerably shorter, as M-RAN does not require repeated presentation of the training data.

Complex-Valued Convolutional Neural Network and Its Application in Polarimetric SAR Image Classification

TL;DR: The overall classification accuracy of the proposed CV-CNN is comparable to that of existing state-of-the-art methods, and experiments show that the classification error can be further reduced by employing CV-CNN instead of a conventional real-valued CNN with the same degrees of freedom.

Deep Transfer Learning for Image-Based Structural Damage Recognition

TL;DR: This article applies state-of-the-art deep learning technologies to a civil engineering application, namely recognition of structural damage from images, with four naïve baseline recognition tasks: component type identification, spalling condition check, damage level evaluation, and damage type determination.

Beyond Sharing Weights for Deep Domain Adaptation

TL;DR: This work introduces a two-stream architecture, in which one stream operates on the source domain and the other on the target domain, and demonstrates that this both yields higher accuracy than state-of-the-art methods on several object recognition and detection tasks and consistently outperforms networks with shared weights in both supervised and unsupervised settings.