A Comparison Review of Optimizers and Activation Functions For Convolutional Neural Networks

Keywords: dog breed classification; convolutional neural networks (CNN); artificial neural network (ANN); transfer learning; optimization functions; activation functions; learning rate

Authors

  • Ahmad Awad, School of Computing, Asia Pacific University of Technology and Innovation (APU), Kuala Lumpur, Malaysia
  • Tharun Muthukumaran Umadevi, School of Computing, Asia Pacific University of Technology and Innovation (APU), Kuala Lumpur, Malaysia
  • Joey Ng Ceng Yi, School of Computing, Asia Pacific University of Technology and Innovation (APU), Kuala Lumpur, Malaysia
  • Toh Jian En, School of Computing, Asia Pacific University of Technology and Innovation (APU), Kuala Lumpur, Malaysia
  • Koo Susan, School of Computing, Asia Pacific University of Technology and Innovation (APU), Kuala Lumpur, Malaysia
  • Zailan Arabee Abdul Salam
    zailan@apu.edu.my
    School of Computing, Asia Pacific University of Technology and Innovation (APU), Kuala Lumpur, Malaysia
Vol. 7 No. 1 (2023)
Original Research
January 15, 2026

Convolutional Neural Networks (CNNs) are widely used in image classification and identification research. This study explores a transfer-learning approach that applies the DenseNet-161 model to classify 133 dog breeds in a dataset of 8,351 images split among training, validation, and testing sets; 7,515 of these images are used for training and validation at a ratio of 89:11. The aim is to compare the accuracy and performance of the Rectified Linear Unit (ReLU), Leaky ReLU, and Exponential Linear Unit (ELU) activation functions combined with the Adaptive Moment Estimation (Adam), Adaptive Gradient Algorithm (Adagrad), and Stochastic Gradient Descent (SGD) optimization functions at learning rates (lr) of 0.001, 0.01, and 0.1.
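The paper does not reproduce its training code here, but the comparison grid it describes can be illustrated with a minimal PyTorch sketch: a pretrained DenseNet-161 backbone whose classifier head is swapped for a new one using a chosen activation function, trained with a chosen optimizer and learning rate. The head layout (512 hidden units, dropout) and the frozen backbone are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch (assumed setup): DenseNet-161 transfer learning with a
# configurable activation function, optimizer, and learning rate.
import torch
import torch.nn as nn
from torchvision import models

ACTIVATIONS = {"relu": nn.ReLU(), "leaky_relu": nn.LeakyReLU(), "elu": nn.ELU()}
OPTIMIZERS = {"adam": torch.optim.Adam, "adagrad": torch.optim.Adagrad, "sgd": torch.optim.SGD}
LEARNING_RATES = [0.001, 0.01, 0.1]
NUM_BREEDS = 133  # number of dog-breed classes in the dataset


def build_model(activation_name: str) -> nn.Module:
    """DenseNet-161 pretrained on ImageNet with a new head for 133 breeds."""
    model = models.densenet161(weights="DEFAULT")
    for param in model.features.parameters():   # freeze the convolutional backbone (assumption)
        param.requires_grad = False
    in_features = model.classifier.in_features  # 2208 for DenseNet-161
    model.classifier = nn.Sequential(           # head architecture is an illustrative assumption
        nn.Linear(in_features, 512),
        ACTIVATIONS[activation_name],
        nn.Dropout(0.3),
        nn.Linear(512, NUM_BREEDS),
    )
    return model


def build_optimizer(model: nn.Module, optimizer_name: str, lr: float):
    """Optimizer over the trainable (head) parameters only."""
    trainable = (p for p in model.parameters() if p.requires_grad)
    return OPTIMIZERS[optimizer_name](trainable, lr=lr)


# Example: one cell of the comparison grid (Leaky ReLU + SGD, lr = 0.01)
model = build_model("leaky_relu")
optimizer = build_optimizer(model, "sgd", lr=0.01)
```

Iterating `build_model` and `build_optimizer` over the three activations, three optimizers, and three learning rates yields the 27 configurations that such a comparison would cover.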