Transfer Learning Based Network Performance Comparison of the Pre-Trained Deep Neural Networks using MATLAB

Authors

  • Senthil Kumar Jayapalan
  • Syahid Anuar

DOI:

https://doi.org/10.11113/oiji2022.10nSpecial%20Issue%201.179

Keywords:

Deep Learning, Transfer Learning, Convolutional Neural Network, Image Classification, Computer Vision

Abstract

Deep learning has grown tremendously in recent years, with a substantial impact on practically every discipline. Increasing the depth of a neural network generally improves its performance, but this progress comes at the cost of training time and processing resources. Transfer learning allows us to transfer the knowledge of a model that has previously been trained for a particular task to a new model that is attempting to solve a related but not identical problem. To adapt a pre-trained model to a new task effectively, specific layers must be retrained while the others remain unmodified. This adaptation is commonly performed through fine-tuning, which raises the challenge of deciding which layers should be enabled for training and which should be frozen. Furthermore, as in conventional deep neural network training, setting the hyper-parameter values remains a typical issue. All of these concerns have a substantial effect on training capability as well as classification performance. In this study, we examined the performance of five pre-trained networks (SqueezeNet, GoogleNet, ShuffleNet, Darknet-53 and Inception-V3) with different epochs, learning rates and mini-batch sizes, and evaluated and compared the networks' performance using a confusion matrix. Based on the findings, Inception-V3 achieved the highest accuracy of 96.98%, along with a precision of 92.63%, sensitivity of 92.46%, specificity of 98.12%, and F1-score of 92.49%.
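The workflow the abstract describes (loading a pre-trained network, replacing its classification layers, choosing the epoch, learning-rate and mini-batch hyper-parameters, and evaluating with a confusion matrix) can be sketched in MATLAB with the Deep Learning Toolbox. This is a minimal illustrative sketch, not the authors' actual script: the dataset folder name, the number of classes, and the specific hyper-parameter values shown here are assumptions for demonstration.

```matlab
% Hedged sketch of the transfer-learning workflow described in the abstract.
% Folder name 'dataset', numClasses = 5, and the hyper-parameter values
% below are illustrative assumptions, not the paper's actual settings.

net    = googlenet;            % one of the five pre-trained networks studied
lgraph = layerGraph(net);

% Replace the final layers so the network outputs the new task's classes
numClasses = 5;
newFc = fullyConnectedLayer(numClasses, 'Name', 'new_fc', ...
    'WeightLearnRateFactor', 10, 'BiasLearnRateFactor', 10);
lgraph = replaceLayer(lgraph, 'loss3-classifier', newFc);
lgraph = replaceLayer(lgraph, 'output', classificationLayer('Name', 'new_out'));

% Load images from labelled subfolders and split into train/validation sets
imds = imageDatastore('dataset', ...
    'IncludeSubfolders', true, 'LabelSource', 'foldernames');
[imdsTrain, imdsVal] = splitEachLabel(imds, 0.8, 'randomized');

inputSize = net.Layers(1).InputSize(1:2);
augTrain  = augmentedImageDatastore(inputSize, imdsTrain);
augVal    = augmentedImageDatastore(inputSize, imdsVal);

% The three hyper-parameters varied in the study: epochs, learning rate,
% and mini-batch size
options = trainingOptions('sgdm', ...
    'MaxEpochs',        10, ...
    'InitialLearnRate', 1e-4, ...
    'MiniBatchSize',    32, ...
    'ValidationData',   augVal, ...
    'Verbose',          false);

trainedNet = trainNetwork(augTrain, lgraph, options);

% Evaluate with a confusion matrix, as in the paper; accuracy, precision,
% sensitivity, specificity and F1-score can all be derived from it
pred = classify(trainedNet, augVal);
confusionchart(imdsVal.Labels, pred);
```

Swapping `googlenet` for `squeezenet`, `shufflenet`, `darknet53` or `inceptionv3` (and updating the names of the layers being replaced, which differ per architecture) reproduces the comparison across the five networks.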

Published

2022-05-20

How to Cite

Jayapalan, S. K., & Anuar, S. (2022). Transfer Learning Based Network Performance Comparison of the Pre-Trained Deep Neural Networks using MATLAB. Open International Journal of Informatics, 10(Special Issue 1), 27–40. https://doi.org/10.11113/oiji2022.10nSpecial Issue 1.179