
(IJACSA) International Journal of Advanced Computer Science and Applications,
Vol. 16, No. 4, 2025
transfer to the cloud [9]. Optimizing resource allocation in
such distributed systems is a central challenge. Dynamic re-
source management—including bandwidth, computing power,
and storage—is crucial, especially when handling critical real-
time data streams, such as those generated by biometric
sensors and cameras in cardiac monitoring systems [10]. To
address these challenges, integrating Edge-Fog Computing
systems with artificial intelligence (AI) approaches and digital
twins paves the way for intelligent and scalable healthcare
systems that can adapt to the dynamic needs of patients and
infrastructure [11], [12]. In this context, our work proposes
an innovative approach to optimizing resource allocation in
Edge-Fog Computing environments, specifically designed to
enhance the prediction of cardiac events. It combines advanced
AI models, including a hybrid CNN-BiLSTM model for cardiac
event prediction and a Deep Q-Network (DQN) for dynamic
resource allocation. This system aims to establish a real-time
health monitoring framework capable of predicting patients’
cardiac health status, determining the optimal location for task
processing—whether at the edge or fog—and delivering rapid
responses in critical situations. Moreover, integrating digital
twins into this architecture enables comprehensive system
supervision, providing a platform for real-time monitoring
and predictive analysis [13]. These digital twins not only
simulate system behavior under varying conditions
but also continuously optimize resource allocation decisions
[14]. Preliminary results indicate that this approach effectively
handles workload variations, improves system performance,
and supports rapid response to critical situations. The main
contributions of our research study are as follows:
1) AI-Driven Heart Attack Risk Prediction at the
Edge: Development and implementation of a CNN-
BiLSTM deep learning model for heart attack predic-
tion, enabling real-time monitoring and accurate risk
assessment directly on edge devices.
2) Dynamic Resource Allocation Optimisation: Im-
plementation of a reinforcement learning Deep Q-
Network (DQN) model to optimise resource manage-
ment. This model dynamically determines whether
data, including video streams in critical situations,
should be processed locally on edge devices or of-
floaded to the fog layer in resource-intensive scenar-
ios.
3) Integrating Digital Twin Technology: Use of digital
twins for cardiac monitoring in healthcare to refine
the accuracy of heart attack predictions, optimise
resource allocation and improve system performance
through real-time monitoring, notification in critical
situations and continuous optimisation based on repli-
cated data.
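The offloading decision described in contribution 2 can be illustrated with a simplified tabular Q-learning sketch, a stand-in for the full DQN (which replaces the Q-table with a neural network). The load states, the two actions (process on edge vs. offload to fog), and the reward values below are illustrative assumptions, not the actual environment or reward design used in our system.

```python
import random

# Toy states: edge CPU load bucket (0 = low, 1 = medium, 2 = high).
# Actions: 0 = process on the edge device, 1 = offload to the fog layer.
STATES, ACTIONS = 3, 2
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount, exploration

def reward(state, action):
    """Illustrative reward: local processing is cheapest under low load,
    while offloading to the fog pays off under high edge load."""
    if action == 0:                        # process locally on the edge
        return [1.0, 0.2, -1.0][state]
    return [-0.5, 0.3, 1.0][state]         # offload to the fog layer

def train(episodes=5000, seed=42):
    rng = random.Random(seed)
    q = [[0.0] * ACTIONS for _ in range(STATES)]
    for _ in range(episodes):
        s = rng.randrange(STATES)          # observed edge-load level
        # Epsilon-greedy action selection over the Q-table.
        if rng.random() < EPSILON:
            a = rng.randrange(ACTIONS)
        else:
            a = max(range(ACTIONS), key=lambda x: q[s][x])
        r = reward(s, a)
        s2 = rng.randrange(STATES)         # next load level (toy dynamics)
        # Standard Q-learning update; a DQN would instead regress a
        # network toward the same target r + GAMMA * max_a' Q(s', a').
        q[s][a] += ALPHA * (r + GAMMA * max(q[s2]) - q[s][a])
    return q

if __name__ == "__main__":
    q = train()
    for s, name in enumerate(["low", "medium", "high"]):
        choice = "edge" if q[s][0] >= q[s][1] else "fog"
        print(f"edge load {name}: process on {choice}")
```

Under these assumed rewards, the learned policy keeps tasks on the edge when load is low and offloads to the fog as load rises; the full DQN generalizes this decision over a richer state (bandwidth, queue length, task criticality) rather than three discrete load buckets.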
The rest of this paper is organized as follows: Section 2
discusses related work. In Section 3, we present the proposed
framework. Section 4 presents the results and discussion. The
conclusion and the paper's potential future directions are
presented in Section 5.
II. RELATED WORK
Many authors have carried out studies relevant to our
research. Key studies are summarized below:
Talaat et al. [15] introduced EPRAM, a methodology for
optimizing resource allocation and prediction in Fog Com-
puting (FC) for healthcare. By combining Deep Reinforcement
Learning (DRL) and Probabilistic Neural Networks (PNN),
EPRAM improves real-time data processing for heart disease
prediction, aiming to reduce latency and enhance service
quality while maximizing Average Resource Utilization and
Load Balancing through three modules: Data Preprocessing,
Resource Allocation, and Effective Prediction. In a related
study, Aazam et al. [16] studied machine learning-based task
offloading in Edge Computing to enhance smart healthcare
applications. They deployed ML models, including k-nearest
neighbors (kNN), naive Bayes (NB), and support vector
classification (SVC), on edge nodes to optimize processing
efficiency for healthcare tasks, including COVID-19 scenar-
ios. While real healthcare data demonstrated the approach’s
benefits, specific quantitative performance metrics were not
provided. The work of Amzil et al. [17] proposed a Ma-
chine Learning-based Medical Data Segmentation (ML-MDS)
method, achieving 92% classification accuracy and reducing
latency by 56% compared to existing methods. This focus on
efficiency aligns well with Ullah et al. [18], who combined
an analytical model with fuzzy-based reinforcement learning
to minimize energy usage and latency in healthcare IoT,
although they did not provide specific accuracy metrics. Khan
et al. [19] proposed a dynamic resource allocation and load
balancing algorithm for Internet of Healthcare Things (IoHT)
applications to achieve maximum resource utilization among
various fog nodes. Experimental results show significant im-
provements, including a 45% decrease in delay, a 37% reduc-
tion in energy consumption, and a 25% decrease in network
bandwidth consumption compared to existing methods. The
approaches of Hanumantharaju et al. [20] and Scrugli
et al. [21] highlight the diversity in predictive modeling.
Hanumantharaju et al. focused on heart disease prediction
using Random Forest and Naive Bayes, and Scrugli et al.
achieved over 97% accuracy in detecting arrhythmia disorders
using a convolutional neural network (CNN). Rajapaksha et
al. [22] focused on generating and utilizing synthetic data
by harnessing the potential of synthetic data vaults. They
developed a predictive model using an LSTM structure to
identify the likelihood of cardiac arrest based on data extracted
from BHTs. The model achieved an accuracy of 96%. Tang
et al. [23] developed a model for diagnosing cardiovascular
diseases and diabetes using the Smart Healthcare-Crow Search
Optimization (SH-CSO) algorithm. The efficacy of the SH-
CSO algorithm was validated using medical records. The SH-
CSO model achieved a maximum accuracy of 97.26% for
diabetes diagnosis and 96.16% for heart disease diagnosis.
Dritsas et al. [24] evaluated and compared the performance
of five Deep Learning (DL) models, namely Multi-Layer
Perceptron (MLP), Convolutional Neural Network (CNN),
Recurrent Neural Network (RNN), Long Short-Term Memory
(LSTM), and Gated Recurrent Unit (GRU), together with a
Hybrid model, on a heart attack prediction dataset. The results
showed that the Hybrid
model achieved the highest performance across all metrics,
with an accuracy of 91%, precision of 89%, recall of 90%, and
an F1-score of 89%. Hossain et al. [25] proposed a hybrid model