Journal ArticleDOI

Learning representations by back-propagating errors

TLDR
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; in doing so, internal 'hidden' units come to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
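The procedure described in the abstract can be sketched in a few lines of plain Python: a forward pass through the network, error deltas propagated backwards through the hidden layer, and gradient-descent weight updates. The 2-2-1 sigmoid network, the XOR task, and all hyperparameter values below are illustrative assumptions for the sketch, not the paper's exact experimental setup.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(params, x):
    # Forward pass for a tiny 2-input, 2-hidden, 1-output sigmoid network.
    w_ih, b_h, w_ho, b_o = params
    h = [sigmoid(w_ih[j][0] * x[0] + w_ih[j][1] * x[1] + b_h[j]) for j in range(2)]
    y = sigmoid(w_ho[0] * h[0] + w_ho[1] * h[1] + b_o)
    return h, y

def mse(params, data):
    # Mean squared error over the dataset, the measure being minimized.
    return sum((forward(params, x)[1] - t) ** 2 for x, t in data) / len(data)

def train(epochs=5000, lr=0.5, seed=0):
    rng = random.Random(seed)
    # Small random initial weights (illustrative initialization).
    w_ih = [[rng.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
    b_h = [rng.uniform(-1, 1) for _ in range(2)]
    w_ho = [rng.uniform(-1, 1) for _ in range(2)]
    b_o = rng.uniform(-1, 1)
    # XOR: a task the perceptron-convergence procedure cannot solve,
    # because it requires a useful hidden-layer feature.
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    initial_error = mse((w_ih, b_h, w_ho, b_o), data)
    for _ in range(epochs):
        for x, t in data:
            h, y = forward((w_ih, b_h, w_ho, b_o), x)
            # Backward pass: delta at the output unit for squared error
            # through a sigmoid, then deltas propagated back to hidden units.
            d_o = (y - t) * y * (1 - y)
            d_h = [d_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
            # Gradient-descent weight adjustments.
            for j in range(2):
                w_ho[j] -= lr * d_o * h[j]
                for i in range(2):
                    w_ih[j][i] -= lr * d_h[j] * x[i]
                b_h[j] -= lr * d_h[j]
            b_o -= lr * d_o
    return initial_error, mse((w_ih, b_h, w_ho, b_o), data)
```

The output delta `(y - t) * y * (1 - y)` is the derivative of the squared error with respect to the unit's net input, using the fact that the sigmoid's derivative is `y * (1 - y)`; the hidden deltas reuse it, weighted by each hidden unit's outgoing connection, which is the core of back-propagation.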


Citations
Proceedings Article

Tensorizing neural networks

TL;DR: In this paper, the authors convert the dense weight matrices of the fully-connected layers to the Tensor Train format, reducing the number of parameters by a large factor while preserving the expressive power of the layer.
Proceedings Article

Learning Representations and Generative Models for 3D Point Clouds.

TL;DR: In this article, a deep AutoEncoder (AE) network for 3D point clouds is proposed, achieving state-of-the-art reconstruction quality and generalization ability.
Posted Content

Gaussian Process Kernels for Pattern Discovery and Extrapolation

TL;DR: In this paper, simple closed-form kernels are derived by modelling a spectral density with a Gaussian mixture; used with Gaussian processes, these kernels can discover patterns and enable extrapolation. The kernels are demonstrated by discovering patterns and performing long-range extrapolation on synthetic examples, as well as on atmospheric CO2 trends and airline passenger data.
Proceedings ArticleDOI

Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks

TL;DR: In this paper, a comparative study of the computational requirements of particle swarm optimization and backpropagation for training neural networks is presented; the results show that the feed-forward network weights converge faster with the PSO than with the BP algorithm.
Journal ArticleDOI

State of charge estimation for Li-ion batteries using neural network modeling and unscented Kalman filter-based error cancellation

TL;DR: In this paper, an artificial neural network-based battery model is developed to estimate the battery's state of charge (SOC) from the measured current and voltage; the model is validated using LiFePO4 battery data collected from the Federal Driving Schedule and dynamic stress testing.