Open Access Journal Article DOI

A Comparative Analysis of Transfer Learning Architecture Performance on Convolutional Neural Network Models with Diverse Datasets

Aji Primajaya, +1 more
03 May 2023 - Vol. 12, Iss: 1, pp 1-11
TLDR
In this paper, the authors conducted three experiments on different datasets to train models with various transfer learning architectures and concluded that DenseNet-121 is the best-performing transfer learning architecture across the datasets tested.
Abstract
Deep learning is a branch of machine learning with many highly successful applications. One application of deep learning is image classification using the Convolutional Neural Network (CNN) algorithm. Classifying images with a CNN normally requires a large amount of image data to obtain satisfactory training results. However, this limitation can be overcome with transfer learning architectures, even when the available image data is small. With transfer learning, the success rate of a model is likely to be higher. Since there are many transfer learning architectures, it is necessary to compare the performance of each model to find the best-performing one. In this study, we conducted three experiments on different datasets to train models with various transfer learning architectures. We then performed a comprehensive comparative analysis for each experiment. The result is that DenseNet-121 is the best-performing transfer learning architecture across the various datasets.
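The training setup is not detailed in this abstract. As a rough illustration of the technique it describes, a minimal transfer learning sketch in Keras, with a frozen DenseNet-121 backbone pretrained on ImageNet and a new classification head, might look like the following; the input size, class count, optimizer settings, and the train_ds/val_ds datasets are placeholders rather than values from the paper.

from tensorflow import keras

NUM_CLASSES = 5            # placeholder class count
IMG_SIZE = (224, 224)      # placeholder input resolution

# Load DenseNet-121 pretrained on ImageNet, without its original classifier.
base = keras.applications.DenseNet121(
    include_top=False,
    weights="imagenet",
    input_shape=IMG_SIZE + (3,),
)
base.trainable = False  # freeze pretrained weights; only the new head is trained

inputs = keras.Input(shape=IMG_SIZE + (3,))
x = keras.applications.densenet.preprocess_input(inputs)
x = base(x, training=False)
x = keras.layers.GlobalAveragePooling2D()(x)
outputs = keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = keras.Model(inputs, outputs)
model.compile(
    optimizer=keras.optimizers.Adam(1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # supply your own tf.data datasets

Freezing the backbone is what makes a small dataset workable: only the new dense layer's weights are learned, while the convolutional features come from ImageNet pretraining.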



References
Proceedings Article

Very Deep Convolutional Networks for Large-Scale Image Recognition

TL;DR: This work investigates the effect of convolutional network depth on accuracy in the large-scale image recognition setting, using an architecture with very small convolution filters, and shows that a significant improvement over the prior-art configurations can be achieved by pushing the depth to 16-19 weight layers.
Proceedings Article DOI

Densely Connected Convolutional Networks

TL;DR: DenseNet, as proposed in this paper, connects each layer to every other layer in a feed-forward fashion, which alleviates the vanishing-gradient problem, strengthens feature propagation, encourages feature reuse, and substantially reduces the number of parameters.
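As an illustration of that connectivity pattern, a simplified dense block can be sketched in Keras in a few lines; this omits DenseNet's 1x1 bottleneck convolutions and transition layers, and the layer count and growth rate below are arbitrary placeholders.

from tensorflow.keras import layers

def dense_block(x, num_layers=4, growth_rate=32):
    # Each new layer sees the concatenation of all preceding feature maps,
    # which gives short gradient paths and encourages feature reuse.
    for _ in range(num_layers):
        y = layers.BatchNormalization()(x)
        y = layers.ReLU()(y)
        y = layers.Conv2D(growth_rate, 3, padding="same", use_bias=False)(y)
        x = layers.Concatenate()([x, y])
    return x

inputs = layers.Input(shape=(32, 32, 64))
outputs = dense_block(inputs)  # 64 + 4 * 32 = 192 output channels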
Posted Content

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications

TL;DR: This work introduces two simple global hyper-parameters that efficiently trade off between latency and accuracy and demonstrates the effectiveness of MobileNets across a wide range of applications and use cases, including object detection, fine-grained classification, face attributes, and large-scale geo-localization.
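For illustration, Keras exposes the width multiplier as the alpha argument of its MobileNet constructor, and the resolution multiplier corresponds to simply choosing a smaller input size; the values below are arbitrary examples, not settings taken from the paper.

from tensorflow import keras

# alpha=0.5 halves every layer's channel count; the 160x160 input stands in for the
# resolution multiplier. Both knobs trade accuracy for lower latency and fewer FLOPs.
small_mobilenet = keras.applications.MobileNet(
    input_shape=(160, 160, 3),
    alpha=0.5,
    weights=None,   # random initialization; the class count is a placeholder
    classes=10,
)
small_mobilenet.summary()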
Proceedings Article DOI

Understanding of a convolutional neural network

TL;DR: All the elements and important issues related to CNNs, and how these elements work, are explained and defined, and the parameters that affect CNN efficiency are stated.
Posted Content

Learning Transferable Architectures for Scalable Image Recognition

TL;DR: This paper proposes to search for an architectural building block on a small dataset and then transfer the block to a larger dataset, and it introduces a new regularization technique called ScheduledDropPath that significantly improves generalization in the NASNet models.