Detection of Malaria Diseases with Residual Attention Network




Keywords: Deep Learning, Residual Attention Network, Support Vector Machine, Malaria Disease


Building a model with classic machine learning techniques requires a specialist to extract feature vectors by hand, which consumes expert time; moreover, such methods cannot process raw data without preprocessing and expert assistance. Deep learning, building on many years of machine learning research, has made great progress on this problem: unlike traditional machine learning and image processing techniques, deep networks can learn directly from raw data. In this study, a deep learning approach for the classification and diagnosis of malaria is developed. For this purpose, a Residual Attention Network (RAN), a deep Convolutional Neural Network (CNN) architecture, was trained on a previously labeled dataset. The goal is to design computer-aided software that classifies blood cell images (blood samples) as “parasitized” or “uninfected”. The program implements a decision support system based on this deep learning approach. As a result, the RAN model produced the best results in processing and classifying the images compared to the other algorithms considered: its training simulations reached a classification accuracy of 95.79%, whereas a Support Vector Machine (SVM) achieved only 83.30%. These results indicate that deep learning methods can be used successfully for classifying blood cell images and diagnosing malaria. In addition, deep learning methods automatically learn features from the input data, so automated malaria diagnosis requires minimal input from specialists.
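The defining operation of a Residual Attention Network is combining a trunk branch T(x) with a soft attention mask M(x) in (0, 1) as H(x) = (1 + M(x)) * T(x), so the mask re-weights features without blocking gradient flow. The sketch below illustrates only that combination rule in NumPy; the branch "weights", shapes, and function names are hypothetical stand-ins, not the authors' actual network:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def residual_attention_block(x, trunk_w, mask_w):
    """Toy residual attention block: H(x) = (1 + M(x)) * T(x).

    x is a feature map of shape (H, W, C); trunk_w and mask_w are
    hypothetical (C, C) matrices standing in for the convolutional
    branches of a real RAN.
    """
    trunk = np.maximum(x @ trunk_w, 0.0)   # T(x): trunk features (ReLU)
    mask = sigmoid(x @ mask_w)             # M(x): soft mask in (0, 1)
    return (1.0 + mask) * trunk            # residual attention output

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8, 4))         # one 8x8 feature map, 4 channels
out = residual_attention_block(
    x,
    rng.standard_normal((4, 4)) * 0.1,
    rng.standard_normal((4, 4)) * 0.1,
)
```

Because the mask is strictly positive and added to 1, the block can only amplify (never zero out) trunk activations, which is what distinguishes residual attention from plain multiplicative masking.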






How to Cite

M. M. QANBAR and S. Tasdemir, “Detection of Malaria Diseases with Residual Attention Network”, Int J Intell Syst Appl Eng, vol. 7, no. 4, pp. 238–244, Dec. 2019.



Research Article