Journal Article

Learning representations by back-propagating errors

TL;DR
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, so that internal ‘hidden’ units come to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
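The procedure the abstract summarizes amounts to a forward pass through layers of logistic units, a squared-error measure between the actual and desired output vectors, and a backward pass that propagates the error derivative by the chain rule. Below is a minimal NumPy sketch; the XOR task, layer sizes, learning rate, and iteration count are illustrative choices of ours, not the paper's own experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: a task no single-layer perceptron can represent, so the
# hidden units must learn useful intermediate features.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
d = np.array([[0.], [1.], [1.], [0.]])           # desired output vectors

# One hidden layer of three logistic units, small random initial weights.
W1 = rng.normal(scale=0.5, size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)
eta = 1.0                                        # learning rate

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)                     # hidden activations
    y = sigmoid(h @ W2 + b2)                     # actual output vector

    # Error measure E = 1/2 * sum((y - d)**2); back-propagate dE.
    delta_out = (y - d) * y * (1.0 - y)          # dE/d(net input), output layer
    delta_hid = (delta_out @ W2.T) * h * (1.0 - h)

    # Gradient-descent weight adjustments.
    W2 -= eta * h.T @ delta_out; b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid; b1 -= eta * delta_hid.sum(axis=0)

print(np.round(y, 2))   # should approach [[0], [1], [1], [0]]
```

With the hidden layer removed, no setting of the remaining weights can solve XOR, which is the distinction the abstract draws between back-propagation and the perceptron-convergence procedure.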


Citations
Journal Article

Artificial intelligence in COVID-19 drug repurposing.

TL;DR: This Review provides a strong rationale for using AI-based assistive tools to repurpose medications for human disease, including during the COVID-19 pandemic.
Journal Article

Deep learning-based remaining useful life estimation of bearings using multi-scale feature extraction

TL;DR: A novel intelligent remaining useful life (RUL) prediction method based on deep learning is proposed; it achieves high accuracy in RUL prediction and is promising for industrial applications.
Journal Article

An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems

TL;DR: It is shown that the global minimum of this nonparametric estimator for Renyi's entropy is the same as the actual entropy, and the performance of the error-entropy-minimization criterion is compared with mean-square-error minimization in the short-term prediction of a chaotic time series and in nonlinear system identification.
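The criterion this TL;DR summarizes (replace mean-square error with Renyi's quadratic entropy of the error, estimated nonparametrically with a Parzen window) can be sketched briefly. Below is a minimal NumPy illustration assuming a Gaussian kernel and a linear adaptive system; the function names, kernel width, learning rate, and toy data are our own illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(u, sigma):
    # Unnormalized Gaussian kernel; the dropped normalizing constant
    # is a positive scale factor and does not move the optimum.
    return np.exp(-u**2 / (4.0 * sigma**2))

def information_potential(e, sigma):
    # Parzen estimate of the "information potential" V(e).
    # Renyi's quadratic entropy is H2(e) = -log V(e), so minimizing
    # the error entropy is equivalent to maximizing V(e).
    diff = e[:, None] - e[None, :]
    return gaussian_kernel(diff, sigma).mean()

def mee_step(w, X, d, sigma, lr):
    # One gradient-ascent step on V(e) for a linear model y = X @ w.
    e = d - X @ w
    diff = e[:, None] - e[None, :]                  # e_i - e_j
    kprime = -diff / (2.0 * sigma**2) * gaussian_kernel(diff, sigma)
    # d(e_i - e_j)/dw = x_j - x_i
    grad = (kprime[:, :, None] * (X[None, :, :] - X[:, None, :])).mean(axis=(0, 1))
    return w + lr * grad

# Toy system-identification problem with zero-mean noise.
N, D = 200, 3
X = rng.normal(size=(N, D))
w_true = np.array([1.5, -2.0, 0.5])
d = X @ w_true + 0.1 * rng.normal(size=N)

w = np.zeros(D)
print("V before:", information_potential(d - X @ w, sigma=1.0))
for _ in range(2000):
    w = mee_step(w, X, d, sigma=1.0, lr=1.0)   # illustrative settings
print("V after: ", information_potential(d - X @ w, sigma=1.0))
print("w:", w)                                 # should move toward w_true

# Note: entropy ignores the mean of the error, so in general a bias
# must be fixed afterwards to center the errors; the zero-mean noise
# here makes that step unnecessary.
```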
Proceedings Article

Phone Recognition with the Mean-Covariance Restricted Boltzmann Machine

TL;DR: This work uses the mean-covariance restricted Boltzmann machine (mcRBM) to learn features of speech data that serve as input into a standard DBN, and achieves a phone error rate superior to all published results on speaker-independent TIMIT to date.
Journal Article

Flood prediction using machine learning models: Literature review

TL;DR: In this paper, the state-of-the-art machine learning models for both long-term and short-term floods are evaluated and compared using a qualitative analysis of robustness, accuracy, effectiveness and speed.