Journal ArticleDOI

Learning representations by back-propagating errors

TLDR
Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector; as a result, internal 'hidden' units come to represent important features of the task domain.
Abstract
We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure [1].
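
The procedure the abstract describes is gradient descent on a squared-error measure of the output. Below is a minimal sketch of that idea in Python; the XOR task, the layer sizes, the learning rate eps, and all variable names are illustrative assumptions rather than details taken from the paper.

```python
# Minimal back-propagation sketch for a one-hidden-layer network of
# sigmoid ("neurone-like") units. Task, sizes, and learning rate are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# XOR: input vectors and desired output vectors.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
d = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)
eps = 1.0                                  # learning rate

for _ in range(5000):
    # Forward pass: states of hidden and output units.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Error measure E = 1/2 * sum((y - d)^2); propagate dE backwards.
    delta_out = (y - d) * y * (1 - y)              # at output units
    delta_hid = (delta_out @ W2.T) * h * (1 - h)   # at hidden units

    # Weight adjustment: Delta w = -eps * dE/dw.
    W2 -= eps * (h.T @ delta_out)
    b2 -= eps * delta_out.sum(axis=0)
    W1 -= eps * (X.T @ delta_hid)
    b1 -= eps * delta_hid.sum(axis=0)

print(np.round(y.ravel(), 2))  # should approach [0, 1, 1, 0]
```

The backward pass is the distinguishing step: error derivatives at the output units are passed back through the hidden-to-output weights, giving each hidden unit an error derivative of its own, which is what allows the hidden units to develop useful internal features.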


Citations
Journal ArticleDOI

Deep high dynamic range imaging of dynamic scenes

TL;DR: A convolutional neural network is used as the learning model, and three different system architectures for modelling the HDR merge process are compared; the system's performance is demonstrated by producing high-quality HDR images from a set of three LDR images.
Journal ArticleDOI

Accelerated discovery of stable lead-free hybrid organic-inorganic perovskites via machine learning

TL;DR: A target-driven method predicts undiscovered hybrid organic-inorganic perovskites (HOIPs) for photovoltaics based on bandgap; it achieves high accuracy quickly and is applicable to a broad class of functional-material design.
Journal ArticleDOI

Deep semantic segmentation of natural and medical images: a review

TL;DR: This review categorizes the leading deep learning-based medical and non-medical image segmentation solutions into six main groups: architectural, data synthesis-based, loss function-based, sequenced-model, weakly supervised, and multi-task methods.
Proceedings ArticleDOI

Understanding how image quality affects deep neural networks

TL;DR: An evaluation of four state-of-the-art deep neural network models for image classification shows that existing networks are susceptible to quality distortions, particularly blur and noise.
Journal ArticleDOI

Metaheuristic design of feedforward neural networks

TL;DR: A broad spectrum of FNN optimization methodologies, including conventional and metaheuristic approaches, is summarized, highlighting open challenges for future research in the present information-processing era.