Open Access · Journal Article · DOI

Recent advances in convolutional neural networks

TL;DR
This article provides a broad survey of recent advances in convolutional neural networks, discussing improvements in several aspects: layer design, activation functions, loss functions, regularization, optimization, and fast computation.
About
This article was published in Pattern Recognition on 2018-05-01 and is currently open access. It has received 3125 citations to date. The article focuses on the topics: Deep learning & Convolutional neural network.
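
As a quick illustration of the building blocks the survey covers (convolutional layers, activation functions, pooling, loss functions, regularization, and optimization), here is a minimal PyTorch sketch; all layer sizes and hyperparameters are illustrative and not taken from the paper.

import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # convolutional layer
            nn.ReLU(),                                     # activation function
            nn.MaxPool2d(2),                               # pooling
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Dropout(p=0.5),                             # regularization
            nn.Linear(32 * 8 * 8, num_classes),            # assumes 32x32 inputs
        )

    def forward(self, x):
        return self.classifier(torch.flatten(self.features(x), 1))

model = TinyCNN()
criterion = nn.CrossEntropyLoss()                          # loss function
optimizer = torch.optim.SGD(model.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=5e-4)  # optimization + L2 regularization

x = torch.randn(4, 3, 32, 32)                              # dummy batch
loss = criterion(model(x), torch.randint(0, 10, (4,)))
loss.backward()
optimizer.step()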


Citations
Journal Article · DOI

Improving the performance of lightweight CNNs for binary classification using quadratic mutual information regularization

TL;DR: The experimental evaluation indicates that hinge loss is the optimal choice for binary classification problems with lightweight deep models, and a novel regularization method motivated by Quadratic Mutual Information is proposed to improve the generalization ability of the models.
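
The hinge loss favored above is simple to write out directly; the sketch below assumes labels in {-1, +1} and raw scores from a model, and it does not reproduce the paper's QMI regularizer.

import torch

def hinge_loss(scores, labels, margin=1.0):
    # max(0, margin - y * f(x)), averaged over the batch
    return torch.clamp(margin - labels * scores, min=0).mean()

scores = torch.tensor([0.8, -0.3, 2.1, -1.5])   # placeholder model outputs
labels = torch.tensor([1.0, 1.0, -1.0, -1.0])   # binary labels in {-1, +1}
print(hinge_loss(scores, labels))               # penalizes misclassified and low-margin samples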
Journal Article · DOI

Determination of an infill well placement using a data-driven multi-modal convolutional neural network

TL;DR: The proposed CNN is applied to a channelized oil reservoir and its performance is compared with that of a feedforward neural network; both yield comparable predictability in a quad-modal case.
Journal Article · DOI

DCNR: deep cube CNN with random forest for hyperspectral image classification

TL;DR: DCNR, composed of a cube-neighbor HSI pixel strategy, a deep CNN, and a random forest classifier, is put forward; it achieves classification accuracies of 96.78%, 96.08%, and 94.85% on the KSC, IP, and SA datasets, respectively, with 20% of the samples as the training set, significantly outperforming random forest and cube CNN models.
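
The pipeline described above (a deep CNN as feature extractor feeding a random forest classifier) can be schematized as follows; the CNN, cube sizes, and data here are placeholders, and the actual cube-neighbor strategy and architecture are described in the paper.

import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

# Placeholder CNN over small hyperspectral cubes (channels = spectral bands).
feature_extractor = nn.Sequential(
    nn.Conv2d(30, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),        # 64-dim feature per cube
)

cubes = torch.randn(200, 30, 7, 7)                # 200 pixels, 30 bands, 7x7 neighborhood
labels = np.random.randint(0, 5, size=200)        # 5 placeholder classes

with torch.no_grad():
    feats = feature_extractor(cubes).numpy()

rf = RandomForestClassifier(n_estimators=100)
rf.fit(feats, labels)
print(rf.predict(feats[:5]))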
Journal Article · DOI

Multisignal VGG19 Network with Transposed Convolution for Rotating Machinery Fault Diagnosis Based on Deep Transfer Learning

TL;DR: A novel deep learning framework that combines transfer learning and transposed convolution is proposed for high-precision, high-efficiency machine fault diagnosis; it trains faster, requires fewer training samples, and achieves higher accuracy.
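
Two of the ingredients named above, transposed convolution and transfer learning from a pretrained VGG19, can be sketched as below; how the vibration signals are actually arranged into the network input is the paper's contribution and is not reproduced here, and all sizes and class counts are illustrative.

import torch
import torch.nn as nn
from torchvision.models import vgg19, VGG19_Weights

upsample = nn.ConvTranspose2d(in_channels=4, out_channels=3,
                              kernel_size=4, stride=4)       # 4 signal channels -> 3 image channels
signals = torch.randn(1, 4, 56, 56)                          # placeholder multisignal map
image_like = upsample(signals)                               # -> (1, 3, 224, 224)

backbone = vgg19(weights=VGG19_Weights.DEFAULT)              # ImageNet-pretrained weights
for p in backbone.features.parameters():
    p.requires_grad = False                                  # freeze the transferred features
backbone.classifier[6] = nn.Linear(4096, 5)                  # 5 placeholder fault classes

print(backbone(image_like).shape)                            # torch.Size([1, 5])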
Journal Article · DOI

DOE-based structured-light method for accurate 3D sensing

TL;DR: A compact and accurate three-dimensional (3D) sensing system that employs a diffractive optical element (DOE) as the projection device is presented; most of the grid points can be robustly detected, and complex surfaces such as human faces and bodies can be precisely reconstructed.
References
Proceedings Article · DOI

Deep Residual Learning for Image Recognition

TL;DR: In this article, the authors propose a residual learning framework to ease the training of networks that are substantially deeper than those used previously, which won 1st place in the ILSVRC 2015 classification task.
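
The core idea, a block that learns a residual F(x) and adds it back to its input through a shortcut connection, is easy to sketch; channel counts here are illustrative, and the full architectures with downsampling and projection shortcuts are in the paper.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        residual = self.bn2(self.conv2(self.relu(self.bn1(self.conv1(x)))))
        return self.relu(x + residual)               # identity shortcut: output = F(x) + x

block = ResidualBlock(64)
print(block(torch.randn(1, 64, 32, 32)).shape)       # spatial size and channel count preserved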
Proceedings Article

Adam: A Method for Stochastic Optimization

TL;DR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
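
In practice Adam is available as a stock optimizer; the sketch below uses PyTorch's implementation with the hyperparameters recommended in the paper (step size 1e-3, betas 0.9/0.999, epsilon 1e-8), applied to a placeholder model.

import torch
import torch.nn as nn

model = nn.Linear(10, 1)                              # placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3,
                             betas=(0.9, 0.999), eps=1e-8)

x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()     # per-parameter step scaled by estimates of the first and second moments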
Journal Article · DOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
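
A minimal use of an LSTM layer via PyTorch's built-in implementation is shown below; sizes and data are placeholders.

import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16, num_layers=1, batch_first=True)
sequence = torch.randn(4, 100, 8)          # batch of 4 sequences, 100 time steps, 8 features
outputs, (h_n, c_n) = lstm(sequence)       # c_n is the cell state that carries information across long lags
print(outputs.shape)                       # torch.Size([4, 100, 16])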
Proceedings Article

Very Deep Convolutional Networks for Large-Scale Image Recognition

TL;DR: In this paper, the authors investigated the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting and showed that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 layers.
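
The design principle, increasing depth by stacking many small 3x3 convolutions with occasional 2x2 max-pooling, can be sketched as below; this is a shortened, illustrative stack, not the full 16- or 19-layer configuration from the paper.

import torch
import torch.nn as nn

def vgg_stage(in_ch, out_ch, num_convs):
    layers = []
    for i in range(num_convs):
        layers += [nn.Conv2d(in_ch if i == 0 else out_ch, out_ch,
                             kernel_size=3, padding=1), nn.ReLU(inplace=True)]
    layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
    return nn.Sequential(*layers)

features = nn.Sequential(
    vgg_stage(3, 64, 2),
    vgg_stage(64, 128, 2),
    vgg_stage(128, 256, 3),
)
print(features(torch.randn(1, 3, 224, 224)).shape)   # torch.Size([1, 256, 28, 28])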
Journal Article · DOI

Gradient-based learning applied to document recognition

TL;DR: In this article, graph transformer networks (GTNs) are proposed for globally training multi-module document recognition systems, and it is shown that multilayer networks trained with gradient-based learning can synthesize a complex decision surface that classifies high-dimensional patterns such as handwritten characters.
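
A LeNet-style convolutional network in the spirit of this paper (alternating convolution and subsampling followed by fully connected layers) can be rendered in modern PyTorch as below; this is not the exact LeNet-5, and the GTN machinery from the paper is not reproduced.

import torch
import torch.nn as nn

lenet = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),   # 32x32 -> 28x28 -> 14x14
    nn.Conv2d(6, 16, kernel_size=5), nn.Tanh(), nn.AvgPool2d(2),  # -> 10x10 -> 5x5
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120), nn.Tanh(),
    nn.Linear(120, 84), nn.Tanh(),
    nn.Linear(84, 10),                                            # 10 digit classes
)
print(lenet(torch.randn(1, 1, 32, 32)).shape)   # torch.Size([1, 10])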