Journal ArticleDOI

Learning representations by back-propagating errors

TLDR
Back-propagation repeatedly adjusts the connection weights in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, internal 'hidden' units come to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
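The procedure the abstract describes can be sketched in a few lines: run a forward pass, measure half the summed squared difference between the actual and desired output vectors, propagate the error derivative back through each layer, and adjust the weights by gradient descent. The task (XOR), layer sizes, learning rate, and iteration count below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # inputs
y = np.array([[0.], [1.], [1.], [0.]])                  # desired outputs

W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)  # input -> hidden weights
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)  # hidden -> output weights
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)    # hidden-unit activations
    out = sigmoid(h @ W2 + b2)  # actual output vector

    # Error: half the summed squared difference from the desired output.
    losses.append(0.5 * ((out - y) ** 2).sum())

    # Backward pass: propagate the error derivative layer by layer,
    # using the sigmoid derivative s * (1 - s) at each layer.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Weight adjustments by gradient descent.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)
```

XOR is chosen here because it is not linearly separable, so the hidden units must develop useful internal features for the error to fall, which is exactly the point the abstract makes about hidden representations.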


Citations
Posted Content

Neural Additive Models: Interpretable Machine Learning with Neural Nets

TL;DR: Neural Additive Models (NAMs) are proposed which combine some of the expressivity of DNNs with the inherent intelligibility of generalized additive models and are more accurate than widely used intelligible models such as logistic regression and shallow decision trees.
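The additive structure this TL;DR describes can be sketched as follows: each input feature is fed to its own small neural network, and the per-feature scalar outputs are summed before a final link function, so each feature's learned effect can be inspected on its own. This is an illustrative forward pass only, with random untrained weights and arbitrary shapes, not the NAM authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def feature_net(x, Wa, ba, Wb, bb):
    # One small MLP applied to a single scalar feature (column vector).
    hidden = np.maximum(0.0, x @ Wa + ba)  # ReLU hidden layer
    return hidden @ Wb + bb                # scalar contribution f_i(x_i)

n_samples, n_features, n_hidden = 8, 3, 16
X = rng.normal(size=(n_samples, n_features))

# One independent sub-network per feature.
params = [
    (rng.normal(0, 0.5, (1, n_hidden)), np.zeros(n_hidden),
     rng.normal(0, 0.5, (n_hidden, 1)), np.zeros(1))
    for _ in range(n_features)
]

# Per-feature contributions; summing them gives the logit. The additive
# form is what makes the model intelligible: each column of `contribs`
# is one feature's isolated effect.
contribs = np.hstack([
    feature_net(X[:, i:i + 1], *params[i]) for i in range(n_features)
])
logits = contribs.sum(axis=1)
probs = 1.0 / (1.0 + np.exp(-logits))  # sigmoid link for classification
```

Training such a model (not shown) proceeds by back-propagation through all sub-networks jointly, just as for an ordinary deep net.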
Journal ArticleDOI

A Neurocomputational Model of the N400 and the P600 in Language Processing

TL;DR: This neurocomputational model is the first to successfully simulate N400 and P600 amplitudes in language comprehension, and simulations with this model provide a proof of concept of the single-stream RI account of semantically induced patterns of N400 and P600 modulations.
Posted Content

Deep Activity Recognition Models with Triaxial Accelerometers

TL;DR: This paper shows that deep activity recognition models provide better recognition accuracy of human activities, avoid the expensive design of handcrafted features required in existing systems, and utilize massive unlabeled acceleration samples for unsupervised feature extraction.
Journal ArticleDOI

Multistep speed prediction on traffic networks: A deep learning approach considering spatio-temporal dependencies

TL;DR: A novel deep learning framework named attention graph convolutional sequence-to-sequence model (AGC-Seq2Seq) is proposed to capture the complex non-stationary temporal dynamics and spatial correlations in multistep traffic-condition prediction, and to further capture the temporal heterogeneity of traffic patterns.
Journal ArticleDOI

A Survey on Deep Learning for Data-Driven Soft Sensors

TL;DR: The necessity and significance of deep learning for soft sensor applications are first demonstrated by analyzing the merits of deep learning and the trends of industrial processes; then mainstream deep learning models, tricks, and frameworks/toolkits are summarized and discussed to help designers advance the development of soft sensors.