Abstract: Skin cancer is a growing public health concern; although some types, such as melanoma, are deadly, early detection is crucial for effective treatment and improved patient survival [1,2,3,4]. Malignant melanoma accounts for only 2.3% of all skin cancers yet is responsible for more than 75% of skin cancer-related deaths. If detected at an early stage, however, it is highly curable: the 10-year survival rate is between 90% and 97% when the tumour thickness is less than 1 mm, and treatment of an early detected cancerous mole can be as simple as excision of the lesion, which prevents metastasis to other organs. In this study, we introduce an approach to skin cancer classification using state-of-the-art deep learning architectures that have demonstrated exceptional performance across diverse image analysis tasks. We train and validate on two publicly available benchmark datasets, HAM10000 and ISIC2018, which consist of dermoscopic images captured with dermatoscopes and carefully annotated by expert dermatologists. Preprocessing techniques such as normalization and augmentation are applied to improve the robustness and generalization of the model. The proposed approach extracts relevant features for accurate classification by first using deep object detection models to locate the lesion, then applying the Segment Anything Model (SAM) and MedSAM to extract the lesion border, and finally classifying with various pre-trained state-of-the-art deep convolutional networks. Comprehensive experiments and evaluations demonstrate the effectiveness of zero-shot segmentation methods over traditional deep learning architectures for skin cancer classification.
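The detect-then-segment-then-classify pipeline above can be illustrated with a minimal, library-free sketch of its middle step: given a detector's bounding box and a SAM-style binary mask, crop the lesion region and zero out the background before passing it to a classifier. All names and values here are hypothetical, and the toy array stands in for a real dermoscopic image; the actual study uses SAM/MedSAM outputs rather than a hand-made mask.

```python
import numpy as np

def crop_and_mask(image, box, mask):
    """Crop the detected lesion box and zero out pixels outside the
    segmentation mask, isolating the lesion for the classifier."""
    x0, y0, x1, y1 = box
    crop = image[y0:y1, x0:x1].copy()
    crop[~mask[y0:y1, x0:x1]] = 0   # suppress background pixels
    return crop

# Hypothetical example: an 8x8 grayscale "image", a detector box,
# and a mask standing in for a SAM/MedSAM lesion segmentation.
image = np.arange(64, dtype=np.uint8).reshape(8, 8)
mask = np.zeros((8, 8), dtype=bool)
mask[3:6, 3:6] = True               # pretend this is the SAM lesion mask
lesion = crop_and_mask(image, (2, 2, 6, 6), mask)
print(lesion.shape)  # (4, 4)
```

In the full pipeline, `lesion` would then be resized and normalized before being fed to a pre-trained convolutional classifier.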

Keywords: Skin Cancer, Computer Vision, Cancer Classification, Image Processing, Vision Transformer, Image Classification, Cancer Cell Segmentation.


PDF | DOI: 10.17148/IJARCCE.2024.13835
