Hybrid Optimization Based Hyperparameter Tuning for Enhanced Image Classification Using CNN
DOI: https://doi.org/10.62643/

Keywords: CIFAR-10, Convolutional Neural Network, Genetic Algorithm, Jellyfish Search Optimization, hyperparameter tuning, image classification, transfer learning, MobileNetV2, metaheuristic optimization, deep learning.

Abstract
Hyperparameter optimization of deep convolutional neural networks is among the most computationally demanding problems in applied machine learning, particularly for multi-class image classification on complex datasets. This paper presents a hybrid optimization framework that combines a Genetic Algorithm (GA) and Jellyfish Search Optimization (JSO) with a residual Convolutional Neural Network (CNN) to automatically tune hyperparameters on the CIFAR-10 benchmark. Four experimental configurations are evaluated: a handcrafted Custom CNN baseline, a fine-tuned MobileNetV2 transfer learning model, a GA-optimized residual CNN, and a JSO-optimized residual CNN. The JSO-CNN achieves the highest test accuracy of 91.26% and a macro F1-score of 91.23%, surpassing the Custom CNN (90.48%), the GA-CNN (89.58%), and MobileNetV2 (82.98%). These results suggest that bio-inspired metaheuristics, in particular the Jellyfish Search Optimizer with its ocean-current drift, Levy-flight passive motion, and active swarm behaviour operators, can substantially outperform manual and genetic search strategies. The proposed pipeline is complemented by a FastAPI-based web application that serves real-time image predictions, a step toward production-ready deployment of optimized image classifiers.
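To illustrate the search strategy the abstract describes, the sketch below implements the standard Jellyfish Search Optimizer (ocean-current drift, passive motion, and active swarm motion, gated by the time-control function) over a bounded hyperparameter space. It is a minimal illustration, not the paper's implementation: the objective here is a cheap surrogate standing in for the expensive "train a CNN and return validation loss" evaluation, the Levy-flight variant of the passive step mentioned in the abstract is replaced by the original uniform random walk, and all constants (`beta`, `gamma`, population size) are conventional defaults rather than values from the paper.

```python
import numpy as np

def jellyfish_search(f, lb, ub, n_pop=20, n_iter=100, seed=0):
    """Minimal Jellyfish Search Optimization (JSO) sketch.

    f      : objective to minimise (in the paper's setting this would be
             the validation loss of a CNN trained with the candidate
             hyperparameters; here any callable on a vector works)
    lb, ub : per-dimension lower/upper bounds of the search space
    """
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    X = lb + rng.random((n_pop, dim)) * (ub - lb)    # initial population
    fit = np.array([f(x) for x in X])
    beta, gamma = 3.0, 0.1                           # conventional JSO constants

    for t in range(1, n_iter + 1):
        best = X[fit.argmin()]
        # time-control function: decides ocean current vs swarm behaviour
        c = abs((1 - t / n_iter) * (2 * rng.random() - 1))
        for i in range(n_pop):
            if c >= 0.5:
                # ocean-current drift toward the best jellyfish
                trend = best - beta * rng.random() * X.mean(axis=0)
                x_new = X[i] + rng.random(dim) * trend
            elif rng.random() > 1 - c:
                # passive motion: small random walk within the bounds
                x_new = X[i] + gamma * rng.random(dim) * (ub - lb)
            else:
                # active motion: move toward (or away from) a random neighbour
                j = rng.integers(n_pop)
                step = (X[j] - X[i]) if fit[j] < fit[i] else (X[i] - X[j])
                x_new = X[i] + rng.random(dim) * step
            x_new = np.clip(x_new, lb, ub)           # keep inside the bounds
            f_new = f(x_new)
            if f_new < fit[i]:                       # greedy replacement
                X[i], fit[i] = x_new, f_new
    return X[fit.argmin()], fit.min()

# Surrogate objective: a quadratic bowl standing in for validation loss
# over two hypothetical hyperparameters (e.g. log learning rate, dropout).
best_x, best_f = jellyfish_search(lambda x: float(np.sum(x ** 2)),
                                  lb=[-5, -5], ub=[5, 5],
                                  n_pop=15, n_iter=60)
```

In the paper's setting each call to `f` would train and evaluate a candidate CNN, so the population size and iteration budget would be kept far smaller than in this toy run.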
License

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.