Journal ArticleDOI

Learning representations by back-propagating errors

TLDR
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
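The procedure the abstract describes can be sketched in a few lines: run an input forward through the network, measure the squared difference between actual and desired output, propagate that error backward through the hidden layer, and nudge each weight down its error gradient. The following is a minimal illustrative sketch, not the paper's original implementation; the 2-2-1 network, the XOR task, and all variable names are assumptions chosen for brevity.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative 2-2-1 network (2 inputs, 2 hidden units, 1 output) on XOR.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden weights
b1 = [0.0, 0.0]                                                     # hidden biases
W2 = [random.uniform(-1, 1) for _ in range(2)]                      # output weights
b2 = 0.0                                                            # output bias

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
lr = 0.5  # learning rate

def forward(x):
    # Hidden activations, then the single output unit.
    h = [sigmoid(sum(W1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(2)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(2)) + b2)
    return h, y

def total_error():
    # Sum of squared differences between actual and desired outputs.
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

before = total_error()
for epoch in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Output-layer delta: error derivative times sigmoid derivative.
        dy = (y - t) * y * (1 - y)
        # Back-propagate the delta to each hidden unit.
        dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent weight updates.
        for j in range(2):
            W2[j] -= lr * dy * h[j]
            for i in range(2):
                W1[j][i] -= lr * dh[j] * x[i]
            b1[j] -= lr * dh[j]
        b2 -= lr * dy
after = total_error()
print(before, after)
```

After training, the summed squared error is lower than at initialization, and the hidden units come to encode a feature (roughly, an AND/OR distinction) that no single input carries on its own.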


Citations
Proceedings ArticleDOI

Recurrent Human Pose Estimation

TL;DR: The result is a simple architecture that achieves performance on par with the state of the art, but without the complexity of a graphical model stage (or layers).
Proceedings ArticleDOI

Toward deep learning software repositories

TL;DR: This work motivates deep learning for software language modeling, highlights fundamental differences between state-of-the-practice software language models and connectionist models, and proposes avenues for future work in which deep learning can be brought to bear to support model-based testing, improve software lexicons, and conceptualize software artifacts.
Book ChapterDOI

Siamese Neural Networks: An Overview.

TL;DR: The Siamese neural network architecture is described and its main applications across computational fields since its appearance in 1994 are outlined, along with the programming languages, software packages, tutorials, and guides that readers can use in practice to implement this powerful machine learning model.
Journal ArticleDOI

A comparative analysis of training methods for artificial neural network rainfall-runoff models

TL;DR: It was found that the RGA-trained ANN model significantly outperformed the ANN model trained with BPA, and that it also overcame certain limitations of BPA-trained ANN rainfall-runoff models reported by many researchers in the past.
Journal ArticleDOI

Scalable High-Performance Image Registration Framework by Unsupervised Deep Feature Representations Learning

TL;DR: A learning-based image registration framework is proposed that uses deep learning to discover compact and highly discriminative features upon observed imaging data that scales well to new image modalities or new image applications with little to no human intervention.