Open Access Journal ArticleDOI

Recent advances in convolutional neural networks

TLDR
This article provides a broad survey of recent advances in convolutional neural networks, discussing improvements to CNNs across several aspects: layer design, activation functions, loss functions, regularization, optimization, and fast computation.
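
As a purely illustrative companion to the survey's scope, the sketch below wires together a tiny PyTorch CNN that touches several of the surveyed aspects (layer design, activation function, regularization, loss function, and optimization). The architecture and hyper-parameters are arbitrary choices for illustration, not taken from the article.

```python
# Illustrative sketch (not from the surveyed paper): a small CNN touching
# layer design, activation, regularization, loss, and optimization.
import torch
import torch.nn as nn

model = nn.Sequential(                       # layer design: conv/pool stack
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.BatchNorm2d(32),                      # normalization / regularization
    nn.ReLU(),                               # activation function
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Dropout(p=0.5),                       # regularization
    nn.Linear(64, 10),
)
criterion = nn.CrossEntropyLoss()            # loss function
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # optimization

x = torch.randn(8, 3, 32, 32)                # dummy batch of images
y = torch.randint(0, 10, (8,))               # dummy labels
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```
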
About
This article was published in Pattern Recognition on 2018-05-01 and is currently open access. It has received 3,125 citations to date. The article focuses on the topics of Deep learning and Convolutional neural network.


Citations
Journal ArticleDOI

A new method for intelligent fault diagnosis of machines based on unsupervised domain adaptation

TL;DR: This paper introduces a domain adaptation strategy into deep neural networks, proposing a deep domain adaptation architecture that learns knowledge from the labeled source domain to facilitate classification in the target domain.
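
The paper's exact architecture is not reproduced on this page; the following is a hedged sketch of the general unsupervised deep domain adaptation recipe it builds on, using a made-up feature extractor and a simple linear-kernel MMD penalty between source and target features. All layer sizes and weights are invented for illustration.

```python
# Hedged sketch of deep domain adaptation (not the paper's exact network):
# a shared feature extractor trained with a source classification loss plus
# a discrepancy penalty (here a simple linear-kernel MMD) that aligns
# source and target feature distributions.
import torch
import torch.nn as nn

feature_extractor = nn.Sequential(nn.Linear(1024, 256), nn.ReLU(),
                                  nn.Linear(256, 64), nn.ReLU())
classifier = nn.Linear(64, 4)                 # e.g. 4 machine health states

def mmd_linear(fs, ft):
    """Squared distance between source and target feature means."""
    return (fs.mean(0) - ft.mean(0)).pow(2).sum()

xs = torch.randn(32, 1024)                    # labeled source signals (dummy)
ys = torch.randint(0, 4, (32,))
xt = torch.randn(32, 1024)                    # unlabeled target signals (dummy)

fs, ft = feature_extractor(xs), feature_extractor(xt)
loss = nn.functional.cross_entropy(classifier(fs), ys) + 1.0 * mmd_linear(fs, ft)
loss.backward()                               # update with any optimizer
```
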
Journal ArticleDOI

Multi-view Convolutional Neural Network for lung nodule false positive reduction

TL;DR: The work shows that the proposed multi-view 2D network is a simple, yet effective algorithm for the false positive reduction problem and can detect nodules that are isolated, linked to a vessel or attached to the lung wall.
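
A hedged sketch of the multi-view idea, with made-up layer sizes and only three orthogonal views: each 2D view of a 3D nodule candidate is scored by a shared 2D CNN and the per-view outputs are fused by averaging. The network in the paper differs in its choice of views, backbone, and fusion.

```python
# Hedged sketch of a multi-view 2D approach (details differ from the paper):
# slice a 3D candidate volume into a few 2D views, score each with a shared
# 2D CNN, and fuse the per-view outputs for the nodule / non-nodule decision.
import torch
import torch.nn as nn

view_cnn = nn.Sequential(                     # shared 2D backbone
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(32, 2))

volume = torch.randn(1, 64, 64, 64)           # dummy candidate patch (C, D, H, W)
views = [volume[:, 32, :, :],                 # axial slice
         volume[:, :, 32, :],                 # coronal slice
         volume[:, :, :, 32]]                 # sagittal slice

logits = torch.stack([view_cnn(v.unsqueeze(0)) for v in views]).mean(0)
prob_nodule = logits.softmax(dim=1)[0, 1]     # late fusion by averaging
```
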
Journal ArticleDOI

Trustworthy and Intelligent COVID-19 Diagnostic IoMT Through XR and Deep-Learning-Based Clinic Data Access

TL;DR: A new framework for COVID-19 diagnosis is proposed that outperforms existing perception techniques with significantly higher accuracy, and it opens new research on integrating XR and deep learning for IoMT implementation.
Journal ArticleDOI

Unsupervised pre-trained filter learning approach for efficient convolution neural network

TL;DR: A comprehensive survey of the relationship between ConvNets and different pre-trained learning methodologies and their optimization effects; experimental results on benchmark datasets highlight the merit of efficient pre-trained learning algorithms for an optimized ConvNet.
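
One classic unsupervised filter-learning recipe of the kind surveyed (an assumption here, not necessarily the paper's own pipeline) is to cluster random image patches with k-means and use the centroids as the first layer's convolution filters, as sketched below with dummy data.

```python
# Hedged sketch of unsupervised filter learning via k-means on image patches:
# the cluster centroids initialise the first convolutional layer's filters.
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

images = np.random.rand(100, 32, 32).astype(np.float32)   # dummy grayscale set
rng = np.random.default_rng(0)
patches = []
for _ in range(2000):                                      # sample 5x5 patches
    i = rng.integers(0, 100)
    r, c = rng.integers(0, 27, size=2)
    patches.append(images[i, r:r+5, c:c+5].ravel())
patches = np.stack(patches)
patches -= patches.mean(axis=1, keepdims=True)             # simple normalisation

centroids = KMeans(n_clusters=16, n_init=5).fit(patches).cluster_centers_

conv1 = nn.Conv2d(1, 16, kernel_size=5)
with torch.no_grad():                                      # copy centroids in
    conv1.weight.copy_(torch.from_numpy(centroids).float().reshape(16, 1, 5, 5))
```
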
Journal ArticleDOI

Look-behind fully convolutional neural network for computer-aided endoscopy

TL;DR: The proposed architecture, named Look-Behind FCN (LB-FCN), is capable of extracting multi-scale image features by using blocks of parallel convolutional layers with different filter sizes, and has a smaller number of free parameters than conventional Convolutional Neural Network (CNN) architectures, which makes it suitable for training with smaller datasets.
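
A hedged sketch of the multi-scale building block described above, with arbitrary channel counts and filter sizes; it is not the exact LB-FCN block, only the parallel-convolutions-then-concatenate pattern.

```python
# Hedged sketch of a multi-scale block: parallel convolutions with different
# filter sizes applied to the same input, feature maps concatenated.
import torch
import torch.nn as nn

class MultiScaleBlock(nn.Module):
    def __init__(self, in_ch, out_ch_per_branch):
        super().__init__()
        # matched padding keeps every branch at the same spatial size
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch_per_branch, kernel_size=k, padding=k // 2)
            for k in (3, 5, 7)])

    def forward(self, x):
        return torch.cat([torch.relu(b(x)) for b in self.branches], dim=1)

block = MultiScaleBlock(in_ch=3, out_ch_per_branch=8)
print(block(torch.randn(1, 3, 64, 64)).shape)   # torch.Size([1, 24, 64, 64])
```
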
References
Proceedings ArticleDOI

Deep Residual Learning for Image Recognition

TL;DR: In this article, the authors proposed a residual learning framework to ease the training of networks that are substantially deeper than those used previously, which won the 1st place on the ILSVRC 2015 classification task.
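
A minimal PyTorch rendering of a basic residual block in the spirit of the paper: the stacked layers learn a residual mapping F(x) and the block outputs F(x) + x through an identity shortcut (channel counts here are arbitrary).

```python
# Minimal residual block: two conv/batch-norm layers plus an identity shortcut.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)            # identity shortcut

y = ResidualBlock(64)(torch.randn(1, 64, 56, 56))
```
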
Proceedings Article

Adam: A Method for Stochastic Optimization

TL;DR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
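
The Adam update rule itself is short enough to write out; below is a NumPy transcription for a single parameter vector using the paper's default hyper-parameters, applied to a toy quadratic objective.

```python
# Adam update for one parameter vector, written out in NumPy.
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad              # first-moment estimate
    v = b2 * v + (1 - b2) * grad**2           # second-moment estimate
    m_hat = m / (1 - b1**t)                   # bias correction
    v_hat = v / (1 - b2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

theta, m, v = np.zeros(3), np.zeros(3), np.zeros(3)
for t in range(1, 101):                       # minimise f(theta) = ||theta - 1||^2
    grad = 2 * (theta - 1.0)
    theta, m, v = adam_step(theta, grad, m, v, t)
```
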
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
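
A hedged NumPy sketch of one LSTM cell step in the now-standard formulation (the forget gate was introduced after the original 1997 paper); the additive cell update is the constant error carousel that preserves gradient flow over long time lags. Weights are random and sizes are arbitrary.

```python
# One LSTM cell step: gated additive cell update, then gated output.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b, H):
    """One step; W: (4H, X), U: (4H, H), b: (4H,)."""
    z = W @ x + U @ h + b
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2*H]), sigmoid(z[2*H:3*H])
    g = np.tanh(z[3*H:])                      # candidate cell input
    c = f * c + i * g                         # constant error carousel
    h = o * np.tanh(c)
    return h, c

X, H = 8, 16
rng = np.random.default_rng(0)
h, c = np.zeros(H), np.zeros(H)
W, U, b = rng.normal(size=(4*H, X)), rng.normal(size=(4*H, H)), np.zeros(4*H)
for x in rng.normal(size=(5, X)):             # run over a short dummy sequence
    h, c = lstm_step(x, h, c, W, U, b, H)
```
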
Proceedings Article

Very Deep Convolutional Networks for Large-Scale Image Recognition

TL;DR: In this paper, the authors investigated the effect of the convolutional network depth on its accuracy in the large-scale image recognition setting and showed that a significant improvement on the prior-art configurations can be achieved by pushing the depth to 16-19 layers.
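
A hedged sketch of the VGG design principle rather than any exact configuration: stages of stacked 3x3 convolutions separated by 2x2 max-pooling, with depth (here far shallower than the 16-19 layer models) providing the receptive field instead of large filters.

```python
# VGG-style feature extractor: stacked 3x3 convolutions per stage, then pooling.
import torch
import torch.nn as nn

def vgg_stage(in_ch, out_ch, n_convs):
    layers = []
    for i in range(n_convs):
        layers += [nn.Conv2d(in_ch if i == 0 else out_ch, out_ch, 3, padding=1),
                   nn.ReLU(inplace=True)]
    layers.append(nn.MaxPool2d(2))
    return layers

features = nn.Sequential(*vgg_stage(3, 64, 2), *vgg_stage(64, 128, 2),
                         *vgg_stage(128, 256, 3))
print(features(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 256, 28, 28])
```
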
Journal ArticleDOI

Gradient-based learning applied to document recognition

TL;DR: In this article, a graph transformer network (GTN) is proposed for document recognition; gradient-based learning is used to synthesize a complex decision surface that can classify high-dimensional patterns such as handwritten characters.
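
A hedged sketch of a LeNet-5-style network for 32x32 handwritten character images, with ReLU substituted for the original squashing non-linearities; it illustrates the convolution, subsampling, and fully connected pattern trained end-to-end by gradient-based learning.

```python
# LeNet-5-style network (modernised activations), for 32x32 single-channel input.
import torch
import torch.nn as nn

lenet = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5), nn.ReLU(), nn.AvgPool2d(2),   # 32 -> 28 -> 14
    nn.Conv2d(6, 16, kernel_size=5), nn.ReLU(), nn.AvgPool2d(2),  # 14 -> 10 -> 5
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120), nn.ReLU(),
    nn.Linear(120, 84), nn.ReLU(),
    nn.Linear(84, 10))                                            # 10 digit classes

logits = lenet(torch.randn(1, 1, 32, 32))     # LeNet-5 expects 32x32 inputs
```
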