Eye Disease Classification Using EfficientNet-B0 Based on Transfer Learning
DOI: https://doi.org/10.30871/jaic.v9i4.9743

Keywords: CNN, Classification, EfficientNet-B0, Eye Disease, Fundus Image

Abstract
This study develops and evaluates a deep learning approach that applies transfer learning to EfficientNet-B0 to classify retinal fundus images into four categories: Cataract, Diabetic Retinopathy, Glaucoma, and Normal. The model was trained on a retinal image dataset and showed stable training behavior, with training and validation loss decreasing consistently and no signs of overfitting. Training accuracy reached 92%, while validation accuracy ranged between 94% and 95%. Evaluation with a confusion matrix and classification report showed excellent results, particularly for the Diabetic Retinopathy class, which achieved an F1-score of 0.98. The Cataract and Normal classes also performed well, with F1-scores of 0.94 and 0.92, respectively. Accuracy declined slightly for the Glaucoma class, which was occasionally misclassified as Normal. Overall, the model achieved 94% classification accuracy on the test set, indicating good generalization capability. These findings suggest that the model holds strong potential for automated, medical image-based diagnostic support systems. Nonetheless, performance on the classes with higher misclassification rates must still improve to ensure the model's reliability in clinical practice.
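The transfer-learning setup described above can be sketched as follows: an EfficientNet-B0 backbone with its ImageNet classifier head removed, frozen, and topped with a new four-class softmax head for Cataract, Diabetic Retinopathy, Glaucoma, and Normal. This is an illustrative Keras sketch, not the authors' exact configuration; the input size, pooling/dropout head, and optimizer are assumptions, and `weights=None` is used here only so the sketch runs offline (in practice `weights="imagenet"` would load the pretrained features that transfer learning relies on).

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import EfficientNetB0

NUM_CLASSES = 4   # Cataract, Diabetic Retinopathy, Glaucoma, Normal
IMG_SIZE = 224    # EfficientNet-B0's native input resolution

# Backbone without the ImageNet classification head.
# Use weights="imagenet" for actual transfer learning; None keeps
# this sketch self-contained (no weight download).
backbone = EfficientNetB0(
    include_top=False,
    weights=None,
    input_shape=(IMG_SIZE, IMG_SIZE, 3),
)
backbone.trainable = False  # freeze pretrained feature extractor

# New classification head trained from scratch on fundus images.
model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer="adam",
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```

With a dataset of labeled fundus images, training would proceed via `model.fit(...)`, after which a confusion matrix and per-class F1-scores (as reported in the abstract) can be computed from the test-set predictions, e.g. with scikit-learn's `classification_report`.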
License
Copyright (c) 2025 I Dewa Ayu Pradnya Pratiwi Tentriajaya, I Gusti Ngurah Lanang Wijayakusuma

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.