
Academic Editor: Christos Bouras
Received: 3 October 2025
Revised: 28 October 2025
Accepted: 29 October 2025
Published: 31 October 2025
Citation: Ali, M.D.; Iqbal, M.A.; Lee, S.; Duan, X.; Kim, S.K. Explainable AI Based Multi Class Skin Cancer Detection Enhanced by Meta Learning with Generative DDPM Data Augmentation. Appl. Sci. 2025, 15, 11689. https://doi.org/10.3390/app152111689
Copyright: © 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Article
Explainable AI Based Multi Class Skin Cancer Detection
Enhanced by Meta Learning with Generative DDPM
Data Augmentation
Muhammad Danish Ali 1, Muhammad Ali Iqbal 2, Sejong Lee 3, Xiaoyun Duan 4 and Soo Kyun Kim 2,*
1 Department of Electronic Engineering, Jeju National University, Jeju 63243, Republic of Korea
2 Department of Computer Engineering, Jeju National University, Jeju 63243, Republic of Korea
3 School of Computer Science and Engineering, Yeungnam University, 280 Daehak-ro,
4 School of Software, Anyang Normal University, Anyang 455002, China; [email protected]
Abstract
Despite the widespread success of convolutional deep learning frameworks in computer
vision, significant limitations persist in medical image analysis. These include low image
quality caused by noise and artifacts, limited data availability compromising robustness
on unseen data, class imbalance leading to biased predictions, and insufficient feature
representation, as conventional CNNs often fail to capture subtle patterns and complex
dependencies. To address these challenges, we propose DAME (Diffusion-Augmented
Meta-Learning Ensemble), a unified architecture that integrates hybrid modeling with
generative learning using the Denoising Diffusion Probabilistic Model (DDPM). The DDPM
component improves resolution, augments scarce data, and mitigates class imbalance. A
hybrid backbone combining a CNN with a Vision Transformer (ViT) captures both local
dependencies and long-range spatial relationships, while the Convolutional Block Attention
Module (CBAM) further enhances feature representation by adaptively emphasizing
informative regions. Predictions from multiple hybrid models are aggregated, and a
logistic-regression meta-classifier learns from these outputs to produce robust decisions.
The framework is evaluated on the HAM10000 dataset, a benchmark for multi-class skin
cancer classification. Explainable AI is incorporated through Grad-CAM, providing visual
insights into the decision-making process. This synergy
mitigates CNN limitations and demonstrates superior generalizability, achieving 98.6%
accuracy, 0.986 precision, 0.986 recall, and a 0.986 F1-score, significantly outperforming
existing approaches. Overall, the proposed framework enables accurate, interpretable, and
reliable medical image diagnosis through the joint optimization of contextual modeling,
feature discrimination, and data generation.
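The stacking step summarized above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the base-model probability outputs are random stand-ins for the hybrid CNN/ViT/CBAM backbones, the class count of 7 matches HAM10000's lesion categories, and scikit-learn's `LogisticRegression` plays the meta-classifier role.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the trained hybrid backbones: each base
# model emits a class-probability vector per image (7 HAM10000 classes).
n_samples, n_classes, n_models = 200, 7, 3
base_probs = [rng.dirichlet(np.ones(n_classes), size=n_samples)
              for _ in range(n_models)]
y = rng.integers(0, n_classes, size=n_samples)  # placeholder labels

# Aggregate the base-model outputs into meta-features, then fit the
# logistic-regression meta-classifier on them (stacking).
meta_features = np.hstack(base_probs)           # shape (200, 21)
meta_clf = LogisticRegression(max_iter=1000).fit(meta_features, y)
final_pred = meta_clf.predict(meta_features)    # final ensemble decisions
```

In practice the meta-features would come from held-out (cross-validated) predictions of the trained backbones rather than random draws, so the meta-classifier learns how to weight and combine the models' outputs without overfitting.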
Keywords: skin cancer; convolutional neural networks (CNN); deep learning; meta-learning;
Convolutional Block Attention Module (CBAM); data augmentation with denoising diffusion
probabilistic models (DDPMs)
1. Introduction
Skin cancer is one of the most common and aggressive cancers worldwide, leading to
significant health deterioration or even loss of life. In the United States alone, it is estimated
that over 9500 individuals are diagnosed with skin cancer every day, while more than