Comparative Analysis of Neural Network Architectures: CNNs, RNNs, and Transformers in Real-World Applications
This study presents a systematic, experimentally validated comparison of three major deep learning architectures: Convolutional Neural Networks (CNNs), Recurrent Neural Networks (RNNs), and Transformer models. Standard benchmark datasets were used to evaluate their performance under controlled conditions. The models were assessed against multiple evaluation criteria, including classification accuracy, precision, recall, F1-score, computational cost, and execution time. The findings indicate that Transformer-based architectures achieve superior predictive performance, particularly on sequence-based tasks, whereas CNN models offer a more favorable balance between computational overhead and accuracy. The study emphasizes reproducibility by clearly specifying dataset partitions, model configurations, and training parameters. Statistical testing confirms the significance of the observed performance differences. These results provide practical guidance for selecting appropriate deep learning models across a range of real-world applications.
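The evaluation criteria named above (accuracy, precision, recall, F1-score) can be sketched for the binary case as follows. This is an illustrative computation only, not the study's actual code; the label lists in the example are hypothetical.

```python
def classification_metrics(y_true, y_pred):
    """Return (accuracy, precision, recall, f1) for binary labels 0/1."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives

    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F1 is the harmonic mean of precision and recall.
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return accuracy, precision, recall, f1

# Hypothetical labels for illustration:
acc, prec, rec, f1 = classification_metrics([1, 0, 1, 1, 0, 1],
                                            [1, 0, 0, 1, 1, 1])
# prec, rec, and f1 are each 0.75 here; acc is 4/6.
```

For multi-class tasks such as those in the study, these per-class values are typically aggregated by macro- or micro-averaging.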
Copyright (c) 2026 Journal of Applied Technology and Innovation

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
