Journal ArticleDOI

Learning representations by back-propagating errors

TLDR
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, so that internal 'hidden' units come to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
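
The weight adjustment described above is gradient descent on a squared-error measure, with the error signal propagated backwards from the output units through the hidden units. As a rough illustration, here is a minimal NumPy sketch of a one-hidden-layer network trained by back-propagation on a toy XOR task; the task, network sizes, learning rate, and all variable names are illustrative assumptions, not details from the paper:

    # Minimal back-propagation sketch (illustrative; the XOR task, layer
    # sizes, and learning rate are assumptions, not from the paper).
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Toy task: inputs X and desired output vectors D (XOR).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    D = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(scale=0.5, size=(2, 4))  # input -> hidden weights
    b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1))  # hidden -> output weights
    b2 = np.zeros(1)
    lr = 0.5                                 # learning rate

    for _ in range(10000):
        # Forward pass: hidden and output unit states.
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)

        # Error measure E = 1/2 * sum((Y - D)^2); deltas are dE/d(net input).
        dY = (Y - D) * Y * (1 - Y)      # delta at output units
        dH = (dY @ W2.T) * H * (1 - H)  # delta propagated back to hidden units

        # Repeated weight adjustments: plain gradient descent.
        W2 -= lr * (H.T @ dY)
        b2 -= lr * dY.sum(axis=0)
        W1 -= lr * (X.T @ dH)
        b1 -= lr * dH.sum(axis=0)

    print(np.round(Y, 2))  # approaches the desired outputs [0, 1, 1, 0]

The hidden units H play the role of the internal 'hidden' units in the abstract: after training, their incoming weights encode features of the task (for XOR, roughly AND/OR-like detectors) that no direct input-to-output connection could express on its own.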


Citations
Proceedings ArticleDOI

Chinese Poetry Generation with Recurrent Neural Networks

TL;DR: A model for Chinese poem generation based on recurrent neural networks, which are well suited to capturing poetic content and form, is proposed; it outperforms competitive Chinese poetry generation systems under both automatic and manual evaluation.
Proceedings ArticleDOI

Concurrent learning for convergence in adaptive control without persistency of excitation

TL;DR: It is shown that, for an adaptive controller that uses recorded and instantaneous data concurrently for adaptation, a verifiable linear-independence condition on the recorded data is sufficient to guarantee exponential convergence of both the tracking error and the parameter error.
Journal ArticleDOI

Gene expression inference with deep learning

TL;DR: A deep learning method (abbreviated as D-GEX) is presented to infer the expression of target genes from the expression of landmark genes; results show that deep learning achieves lower error than linear regression (LR) on 99.97% of the target genes.
Book ChapterDOI

To recognize shapes, first learn to generate images.

TL;DR: This chapter describes several of the proposed algorithms and shows how they can be combined to produce hybrid methods that work efficiently in networks with many layers and millions of adaptive connections.
Proceedings ArticleDOI

Non-local Color Image Denoising with Convolutional Neural Networks

TL;DR: In this article, the authors proposed a non-local image denoising network based on variational methods that exploit the inherent non-local self-similarity property of natural images, and showed that the proposed network achieved state-of-the-art performance on the Berkeley segmentation dataset.