ADVANCING DERMATOLOGIC ONCOLOGY USING PARAMETER-REFINED DEEP LEARNING-DRIVEN STRATEGY FOR ENHANCED PRECISION MEDICINE
Rana Alabdan,
Hanan Abdullah Mengash,
Mashael M. Asiri,
Faheed A. F. Alrslani,
Abdullah Mohamed and
Yazeed Alzahrani
Additional contact information
Rana Alabdan: Department of Information Systems, College of Computer and Information Science, Majmaah University, Al Majma’ah 11952, Saudi Arabia
Hanan Abdullah Mengash: Department of Information Systems, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh 11671, Saudi Arabia
Mashael M. Asiri: Department of Computer Science, Applied College at Mahayil, King Khalid University, Abha 62521, Saudi Arabia
Faheed A. F. Alrslani: Department of Information Technology, Faculty of Computing and Information Technology, Northern Border University, Rafha 76413, Saudi Arabia
Abdullah Mohamed: Research Center, Future University in Egypt, New Cairo 11845, Egypt
Yazeed Alzahrani: Department of Computer Engineering, College of Engineering in Wadi Addawasir, Prince Sattam bin Abdulaziz University, Al-Kharj 16278, Saudi Arabia
FRACTALS (fractals), 2025, vol. 33, issue 02, 1-16
Abstract:
Precision medicine in dermatologic oncology is revolutionizing skin cancer detection by integrating advanced technologies with personalized patient data. The field focuses on detecting and classifying several kinds of cutaneous malignancies using modern techniques and technologies. Leveraging medical knowledge and advanced imaging approaches such as dermoscopy and reflectance confocal microscopy, dermatologists examine skin lesions for subtle signs of malignancy. Furthermore, computer-aided diagnostic (CAD) systems driven by machine learning (ML) methods are increasingly deployed to boost diagnostic accuracy by analyzing massive datasets of dermoscopic images. This multi-disciplinary approach enables early recognition of skin lesions, precise prognostication, and tailored treatment, ultimately improving patient outcomes in dermatologic oncology. This paper presents the Fractals Snake Optimization with Deep Learning for Accurate Classification of Skin Cancer in Dermoscopy Images (SODL-ACSCDI) approach. The purpose of the SODL-ACSCDI approach is to identify and categorize skin cancer in dermoscopic images. The SODL-ACSCDI technique first applies a contrast enhancement process. Next, it employs the SE-ResNet+FPN model to derive intrinsic and complex feature patterns from the dermoscopic images, with the Snake Optimization (SO) algorithm used to tune the hyperparameters of the SE-ResNet+FPN model. Finally, skin cancer classification is performed by a convolutional autoencoder (CAE). The SODL-ACSCDI technique was evaluated on a dermoscopic image dataset, where it achieved a result of 99.61%, outperforming recent models across various metrics.
Keywords: Precision Medicine; Dermatologic Oncology; Skin Cancer; Dermoscopic Images; Fractals Snake Optimization; Deep Learning; Feature Pyramid Network
Date: 2025
Downloads:
http://www.worldscientific.com/doi/abs/10.1142/S0218348X25400092
Access to full text is restricted to subscribers
Persistent link: https://EconPapers.repec.org/RePEc:wsi:fracta:v:33:y:2025:i:02:n:s0218348x25400092
DOI: 10.1142/S0218348X25400092
FRACTALS (fractals) is currently edited by Tara Taylor
More articles in FRACTALS (fractals) from World Scientific Publishing Co. Pte. Ltd.
Bibliographic data for series maintained by Tai Tone Lim.