Journal ArticleDOI

Learning representations by back-propagating errors

TLDR
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, internal ‘hidden’ units come to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
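The weight-update procedure the abstract describes can be illustrated with a minimal sketch: a tiny 2-2-1 sigmoid network trained by gradient descent on XOR, a task where hidden units must learn a feature no single input encodes. The network size, task, learning rate, and epoch count are illustrative assumptions, not details from the paper.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Illustrative 2-2-1 network (2 inputs, 2 hidden units, 1 output),
# each layer with a bias weight appended.
n_in, n_hid = 2, 2
w_hid = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
w_out = [random.uniform(-1, 1) for _ in range(n_hid + 1)]

# XOR: hidden units must come to represent a feature of the task domain.
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 1.0),
        ([1.0, 0.0], 1.0), ([1.0, 1.0], 0.0)]

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x + [1.0]))) for ws in w_hid]
    y = sigmoid(sum(w * hi for w, hi in zip(w_out, h + [1.0])))
    return h, y

def epoch(lr=0.5):
    total = 0.0
    for x, t in data:
        h, y = forward(x)
        total += 0.5 * (y - t) ** 2          # the error measure being minimized
        d_out = (y - t) * y * (1 - y)        # dE/dnet at the output unit
        # Back-propagate the error to the hidden layer *before* updating w_out.
        d_hid = [d_out * w_out[j] * h[j] * (1 - h[j]) for j in range(n_hid)]
        for j, hj in enumerate(h + [1.0]):
            w_out[j] -= lr * d_out * hj
        for j in range(n_hid):
            for i, xi in enumerate(x + [1.0]):
                w_hid[j][i] -= lr * d_hid[j] * xi
    return total

losses = [epoch() for _ in range(5000)]
print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Repeated application of the chain-rule deltas drives the error down; inspecting `w_hid` after training shows the hidden units specializing into the intermediate features the abstract refers to.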


Citations
Journal ArticleDOI

Convolutional Recurrent Deep Learning Model for Sentence Classification

TL;DR: This paper uses an unsupervised neural language model to train initial word embeddings, which are further tuned by the authors' deep learning network. The pre-trained parameters are then used to initialize the model, and a joint CNN and RNN framework is described to overcome the loss of detailed, local information.
Book ChapterDOI

Connectionist Temporal Modeling for Weakly Supervised Action Labeling

TL;DR: In this paper, the Extended Connectionist Temporal Classification (ECTC) framework is proposed to evaluate all possible alignments via dynamic programming and explicitly enforce their consistency with frame-to-frame visual similarities.
Journal ArticleDOI

Explore a deep learning multi-output neural network for regional multi-step-ahead air quality forecasts

TL;DR: Results demonstrated that the proposed DM-LSTM model, incorporating three deep learning algorithms, could significantly improve the spatio-temporal stability and accuracy of regional multi-step-ahead air quality forecasts.
Journal ArticleDOI

Training feedforward neural networks using multi-verse optimizer for binary classification problems

TL;DR: The comparative study demonstrates that MVO is very competitive and outperforms other training algorithms in the majority of datasets in terms of improved local optima avoidance and convergence speed.
Posted Content

Optimization Methods for Large-Scale Machine Learning

TL;DR: A major theme of this study is that large-scale machine learning represents a distinctive setting in which the stochastic gradient method has traditionally played a central role while conventional gradient-based nonlinear optimization techniques typically falter, leading to a discussion about the next generation of optimization methods for large-scale machine learning.
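The summary's point that the stochastic gradient method is central to large-scale learning can be made concrete with a minimal sketch: rather than computing the gradient over the full dataset, SGD takes one cheap gradient step per example. The synthetic line-fitting task, learning rate, and epoch count below are assumptions for illustration.

```python
import random

random.seed(1)

# Noiseless data from y = 3x + 1; SGD should recover w ~ 3, b ~ 1.
data = [(i / 100.0, 3 * (i / 100.0) + 1) for i in range(100)]

w, b = 0.0, 0.0
lr = 0.1
for _ in range(50):             # epochs
    random.shuffle(data)        # visit examples in random order
    for x, t in data:           # one gradient step per example
        err = (w * x + b) - t
        w -= lr * err * x       # dE/dw for E = 0.5 * (y - t)^2
        b -= lr * err           # dE/db
print(f"w = {w:.3f}, b = {b:.3f}")
```

Each step costs O(1) regardless of dataset size, which is why the method scales where full-batch gradient techniques falter.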