Classification of Vitiligo using Transfer Learning with New Activation Function Retan
Keywords:
Activation Function, CNN, Deep Learning, Transfer Learning, Vitiligo

Abstract
In neural networks, activation functions are used to introduce non-linearity: based on the input supplied to a neuron, the activation function determines that neuron's output. Numerous activation functions, including sigmoid, ReLU, and tanh, are utilised in neural networks. To improve model accuracy, the proposed study employs a novel activation function called retan. When retan was used in place of a standard activation function, model accuracy rose by about 4 to 5 percent. With the aid of transfer learning, the proposed activation function was utilised to classify vitiligo images. The model with the retan activation function achieves 92.59 percent validation accuracy and 90.80 percent training accuracy.
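The abstract does not give retan's formula, so the sketch below uses a hypothetical rectified-tanh placeholder purely to illustrate the general pattern the paper describes: swapping a custom activation into the classification head that sits on top of a frozen, pretrained feature extractor. The feature dimension (2048, as in a ResNet50 backbone) and the two-class output are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Hypothetical stand-in for the paper's "retan" activation: the abstract does
# not specify its formula, so this rectified-tanh placeholder (tanh on
# positive inputs, zero otherwise) only illustrates where a custom
# activation plugs into a transfer-learning head.
def retan_placeholder(x):
    return np.where(x > 0, np.tanh(x), 0.0)

def dense(x, w, b, activation):
    """One fully connected layer with a pluggable activation function."""
    return activation(x @ w + b)

def softmax(z):
    """Row-wise softmax for the final class probabilities."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy classification head on top of frozen backbone features
# (2048-d, as produced by e.g. a ResNet50 feature extractor).
rng = np.random.default_rng(0)
features = rng.standard_normal((4, 2048))            # features for 4 images
w1, b1 = rng.standard_normal((2048, 64)) * 0.01, np.zeros(64)
w2, b2 = rng.standard_normal((64, 2)) * 0.01, np.zeros(2)

hidden = dense(features, w1, b1, retan_placeholder)  # custom activation here
probs = softmax(hidden @ w2 + b2)                    # vitiligo vs. non-vitiligo
print(probs.shape)
```

In a real experiment the same substitution would be made in a deep-learning framework (e.g. by registering the custom function as a layer activation) while the pretrained convolutional layers stay frozen.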
License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
IJISAE open access articles are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. This license requires users to give appropriate credit, provide a link to the license, and indicate if changes were made; if they remix, transform, or build upon the material, they must distribute their contributions under the same license as the original.