Journal ArticleDOI

Learning representations by back-propagating errors

TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network to minimize a measure of the difference between the actual output vector of the net and the desired output vector; in doing so, internal 'hidden' units come to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
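As a rough illustration of the procedure the abstract describes, the sketch below trains a tiny two-layer network by back-propagation on XOR, a mapping that is not linearly separable and so cannot be learned by a perceptron alone. The sigmoid units and squared-error measure follow the abstract's description; the layer sizes, learning rate, iteration count, and the XOR task itself are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR: not linearly separable, so hidden units must invent useful features.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([[0], [1], [1], [0]], dtype=float)  # desired output vectors

# Input->hidden and hidden->output weights (assumed sizes: 2-4-1).
W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)
lr = 0.5  # assumed learning rate

for _ in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)   # hidden activities
    y = sigmoid(h @ W2 + b2)   # actual output vector
    # Error measure: E = 1/2 * sum((y - D)**2).
    # Backward pass: propagate dE/dz from the output back to the hidden layer.
    d_out = (y - D) * y * (1.0 - y)
    d_hid = (d_out @ W2.T) * h * (1.0 - h)
    # Repeatedly adjust the weights in proportion to -dE/dW.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid;  b1 -= lr * d_hid.sum(axis=0)

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel(), 2))
# Should approach the desired outputs [0, 1, 1, 0].
```

After training, inspecting the hidden activities typically shows units that encode intermediate features of the inputs (for example, an OR-like and an AND-like detector), which is the sense in which back-propagation creates useful new features.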


Citations
Journal ArticleDOI

Machine learning at the energy and intensity frontiers of particle physics.

TL;DR: The application and development of machine-learning methods used in experiments at the frontiers of particle physics (such as the Large Hadron Collider) are reviewed, including recent advances based on deep learning.
Journal ArticleDOI

A Survey of Deep Learning Methods for Cyber Security

TL;DR: This survey reviews deep learning methods for cyber security applications, including deep autoencoders, restricted Boltzmann machines, recurrent neural networks, generative adversarial networks, and several others.
Journal ArticleDOI

Correction of intensity variations in MR images for computer-aided tissue classification

TL;DR: A new approach to the correction of intra-slice intensity variations is presented, and results demonstrate that the correction process enhances the performance of back-propagation neural network classifiers designed for the segmentation of the images.
Proceedings Article

Adaptive dropout for training deep neural networks

TL;DR: A method called 'standout' is described, in which a binary belief network is overlaid on a neural network and used to regularize its hidden units by selectively setting their activities to zero; the method achieves lower classification error rates than other feature-learning methods, including standard dropout, denoising auto-encoders, and restricted Boltzmann machines.
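A hedged sketch of the idea in this TL;DR: an overlaid network computes, from the same inputs, a keep-probability for each hidden unit, and a sampled binary mask selectively zeroes hidden activities. The shared-weight parameterisation (alpha, beta), the test-time scaling, and all sizes below are simplifying assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def standout_layer(x, W, b, alpha=1.0, beta=0.0, train=True):
    """Hidden layer whose units are dropped by an overlaid belief network
    that, in this sketch, shares the layer's own weights."""
    z = x @ W + b
    h = sigmoid(z)                      # hidden activities
    p_keep = sigmoid(alpha * z + beta)  # overlaid net's keep probabilities
    if train:
        m = rng.random(p_keep.shape) < p_keep  # sample binary mask
        return h * m                    # selectively set activities to zero
    return h * p_keep                   # at test time, scale by expectation

x = rng.normal(size=(5, 8))             # toy batch of 5 inputs
W = rng.normal(0.0, 0.3, (8, 16)); b = np.zeros(16)
print(standout_layer(x, W, b).shape)    # (5, 16)
```

Unlike standard dropout's fixed drop rate, the keep-probability here depends on each unit's input, so units that are more useful for a given example are dropped less often.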
Journal ArticleDOI

Learning approach to optical tomography

TL;DR: A method is described for imaging 3D phase objects in a tomographic configuration, implemented by training an artificial neural network to reproduce the complex amplitude of the experimentally measured scattered light.