International Journal of Intelligent Systems and Applications (IJISA)

IJISA Vol. 17, No. 3, Jun. 2025


Table of Contents

REGULAR PAPERS

Robust, Interactive, and Intelligent Captcha Model Based on Image Processing

By Remzi Gurfidan, Oguzhan Kilim, Tuncay Yigit

DOI: https://doi.org/10.5815/ijisa.2025.03.01, Pub. Date: 8 Jun. 2025

Ensuring online security against automated attacks remains a critical challenge, as traditional CAPTCHA mechanisms often struggle to balance robustness and usability. This study proposes a novel intelligent and interactive CAPTCHA system that integrates advanced image processing techniques with a convolutional neural network (CNN)-based evaluation model to enhance security and user engagement. The proposed CAPTCHA dynamically generates images with randomized object placement, adaptive noise layers, and geometric transformations, making them resistant to AI-based solvers. Unlike conventional CAPTCHAs, this approach requires users to interact with images by selecting and marking specific objects, creating a human-in-the-loop validation process. For evaluation, a CNN-based classifier processes user selections and determines their validity. A lightweight embedded software module tracks user interactions in real time, monitoring selection accuracy and response patterns to improve decision-making. The system was tested on 6,000 images across five categories (airplanes, cars, cats, motorcycles, and fish), with an 80% training and 20% testing split. Experimental results demonstrate a classification accuracy of 99.58%, a validation accuracy of 96.15%, and a loss value of 0.2078. The CAPTCHA evaluation time was measured at 47–53 milliseconds for initial validation and 17–23 milliseconds for subsequent evaluations. These results confirm that the proposed CAPTCHA model effectively differentiates human users from bots while maintaining usability, demonstrating superior resilience against automated solvers compared to traditional approaches.
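The human-in-the-loop validation step described above can be sketched in a few lines. This is a simplified, hypothetical illustration (the object-placement and click-validation helpers below are assumptions for illustration, not the paper's implementation, which uses a CNN to judge selections):

```python
import random

def place_objects(n, width=300, height=200, size=40, seed=None):
    """Randomly place n square target objects on the canvas; returns
    bounding boxes (x1, y1, x2, y2). A stand-in for the paper's CAPTCHA
    generator with randomized object placement."""
    rng = random.Random(seed)
    boxes = []
    for _ in range(n):
        x = rng.randint(0, width - size)
        y = rng.randint(0, height - size)
        boxes.append((x, y, x + size, y + size))
    return boxes

def validate_clicks(clicks, target_boxes, min_hits=None):
    """Count a click as a hit if it lands inside any target box; the
    challenge passes when at least min_hits hits were collected."""
    if min_hits is None:
        min_hits = len(target_boxes)
    hits = sum(
        any(x1 <= cx <= x2 and y1 <= cy <= y2 for (x1, y1, x2, y2) in target_boxes)
        for (cx, cy) in clicks
    )
    return hits >= min_hits
```

In the actual system the geometric check would be replaced by the CNN-based evaluation of the marked regions.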

Solving Traveling Salesman Problem Through Genetic Algorithm with Clustering

By Md. Azizur Rahman, Kazi Mohammad Nazib, Md. Rafsan Islam, Lasker Ershad Ali

DOI: https://doi.org/10.5815/ijisa.2025.03.02, Pub. Date: 8 Jun. 2025

The Traveling Salesman Problem (TSP) is a well-known NP-hard combinatorial optimization problem, commonly studied in computer science and operations research. Due to its complexity and broad applicability, various algorithms have been designed and developed from the viewpoint of intelligent search. In this paper, we propose a two-stage method based on the clustering concept integrated with an intelligent search technique. In the first stage, three clustering techniques, fuzzy c-means (FCM), k-means (KM), and k-medoids (KMD), are employed independently to generate feasible routes for the TSP. These routes are then optimized in the second stage using an improved Genetic Algorithm (IGA). Specifically, we enhance the traditional Genetic Algorithm (GA) through an advanced selection strategy, a new position-based heuristic crossover, and a supervised mutation mechanism (FIB). The IGA is then applied to the feasible routes generated in the clustering stage to search for an optimized route. The overall solution approach results in three distinct pathways: FCM+IGA, KM+IGA, and KMD+IGA. Simulation results on 47 benchmark TSP datasets demonstrate that the proposed FCM+IGA performs better than both KM+IGA and KMD+IGA. Moreover, FCM+IGA outperforms other clustering-based algorithms and several state-of-the-art methods in terms of solution quality.
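The position-based crossover idea can be illustrated with a minimal sketch. This is a generic position-based crossover for tour permutations, not the paper's exact heuristic operator:

```python
import random

def position_based_crossover(p1, p2, positions=None, rng=None):
    """Position-based crossover for TSP tours: the child inherits p1's
    cities at the chosen positions; the remaining cities fill the gaps
    in the order they appear in p2, so the result is a valid tour."""
    rng = rng or random.Random(0)
    n = len(p1)
    if positions is None:
        positions = rng.sample(range(n), n // 2)
    child = [None] * n
    kept = set()
    for i in positions:
        child[i] = p1[i]
        kept.add(p1[i])
    fill = iter(c for c in p2 if c not in kept)
    for i in range(n):
        if child[i] is None:
            child[i] = next(fill)
    return child
```

Because every city appears exactly once in the child, the operator preserves tour feasibility, which is the key requirement for permutation-encoded GAs.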

A Soil Nutrient Assessment for Crop Recommendation Using Ensemble Learning and Remote Sensing

By Sudianto Sudianto, Eko Fajar Cahyadi

DOI: https://doi.org/10.5815/ijisa.2025.03.03, Pub. Date: 8 Jun. 2025

Understanding the nutrient content of soils, such as nitrogen (N), phosphorus (P), potassium (K), pH, temperature, and moisture, is key to dealing with soil variation and climate uncertainty. Effective soil nutrient management can increase plant resilience to climate change as well as improve water use. In addition, soil nutrients affect the selection of suitable plant types, considering that each plant has different nutritional needs. However, the lack of integration of soil nutrient analysis into agricultural practices leads to inefficient use of inputs, impacting crop yields and environmental sustainability. This study proposes a soil nutrient assessment scheme that recommends plant types using ensemble learning and remote sensing. Remote sensing supports assessment over broad areas, while ensemble learning supports precision agriculture. The results show that nutrient assessment with remote sensing provides an opportunity to evaluate soil conditions and select suitable plants based on the extraction of N, P, K, pH, TCI, and NDTI values. Among the ensemble learning algorithms, Random Forest outperforms XGBoost, AdaBoost, and Gradient Boosting, with an accuracy of 0.977 and a precision of 0.980 in 0.895 seconds.
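The ensemble idea, many base models combined by majority vote, can be sketched as follows. The threshold rules on N, P, and pH below are invented purely for illustration, not agronomic guidance, and a hard vote over hand-written rules is only a stand-in for the tree ensembles the study compares:

```python
from collections import Counter

def hard_vote(classifiers, sample):
    """Hard-voting ensemble: each base classifier casts one label vote;
    the majority label wins (ties broken by first-seen label)."""
    votes = [clf(sample) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# Toy rule-based voters over a soil sample {"N": ..., "P": ..., "K": ..., "pH": ...}.
voters = [
    lambda s: "rice" if s["N"] > 80 else "maize",
    lambda s: "rice" if s["P"] > 40 else "maize",
    lambda s: "rice" if s["pH"] < 7.0 else "maize",
]
```

In a Random Forest, the same voting principle is applied across hundreds of decision trees trained on bootstrapped samples of the nutrient data.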

Energy and Deadline Aware Scheduling in Multi Cloud Environment Using Water Wave Optimization Algorithm

By Santhosh Kumar Medishetti, Rameshwaraiah Kurupati, Rakesh Kumar Donthi, Ganesh Reddy Karri

DOI: https://doi.org/10.5815/ijisa.2025.03.04, Pub. Date: 8 Jun. 2025

Scheduling is an NP-hard problem: exact algorithms cannot find optimal solutions within a feasible time frame, so heuristic algorithms are used to search for good approximate solutions. Efficient task scheduling in Cloud Computing (CC) remains a critical challenge due to the need to balance energy consumption and deadline adherence. Existing scheduling approaches often suffer from high energy consumption and inefficient resource utilization, failing to meet stringent deadline constraints, especially under dynamic workload variations. To address these limitations, this study proposes an Energy-Deadline Aware Task Scheduling using the Water Wave Optimization (EDATSWWO) algorithm. Inspired by the propagation and interaction of water waves, EDATSWWO optimally allocates tasks to available resources by dynamically balancing energy efficiency and deadline adherence. The algorithm evaluates tasks based on their energy requirements and deadlines, assigning them to virtual machines (VMs) in the multi-cloud environment to minimize overall energy consumption while ensuring timely execution. Google Cloud workloads were used as the benchmark dataset to simulate real-world scenarios and validate the algorithm's performance. Simulation results demonstrate that EDATSWWO significantly outperforms existing scheduling algorithms in terms of energy efficiency and deadline compliance. The algorithm reduced energy consumption by an average of 21.4%, improved task deadline adherence by 18.6%, and optimized resource utilization under varying workloads. This study highlights the potential of EDATSWWO to enhance the sustainability and efficiency of multi-cloud systems. Its robust design and adaptability to dynamic workloads make it a viable solution for modern cloud computing environments, where energy consumption and task deadlines are critical factors.
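The energy/deadline trade-off the abstract describes can be made concrete with a greedy baseline: schedule tasks in earliest-deadline-first order and place each on the feasible VM that adds the least energy. This is a simplified stand-in for comparison purposes, not the Water Wave Optimization procedure itself:

```python
def schedule(tasks, vms):
    """Greedy energy- and deadline-aware assignment.

    tasks: list of (length, deadline); vms: list of (speed, power).
    Tasks are taken in earliest-deadline-first order; each goes to the
    deadline-feasible VM with the lowest added energy (exec_time * power).
    Returns ([(task_index, vm_index_or_None), ...], total_energy).
    """
    finish = [0.0] * len(vms)  # current completion time per VM
    assignment, total_energy = [], 0.0
    for t_idx, (length, deadline) in sorted(enumerate(tasks), key=lambda kv: kv[1][1]):
        best = None
        for v_idx, (speed, power) in enumerate(vms):
            exec_time = length / speed
            if finish[v_idx] + exec_time <= deadline:
                energy = exec_time * power
                if best is None or energy < best[1]:
                    best = (v_idx, energy, exec_time)
        if best is None:
            assignment.append((t_idx, None))  # deadline cannot be met
            continue
        v_idx, energy, exec_time = best
        finish[v_idx] += exec_time
        total_energy += energy
        assignment.append((t_idx, v_idx))
    return assignment, total_energy
```

A metaheuristic such as EDATSWWO explores many such assignments rather than committing to one greedy choice, which is how it escapes the local optima this baseline can get stuck in.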

Data-driven Insights for Informed Decision-Making: Applying LSTM Networks for Robust Electricity Forecasting in Libya

By Asma Agaal, Mansour Essgaer, Hend M. Farkash, Zulaiha Ali Othman

DOI: https://doi.org/10.5815/ijisa.2025.03.05, Pub. Date: 8 Jun. 2025

Accurate electricity forecasting is vital for grid stability and effective energy management, particularly in regions like Benghazi, Libya, which face frequent load shedding, generation deficits, and aging infrastructure. This study introduces a data-driven framework to forecast electricity load, generation, and deficits for 2025 using historical data from two distinct years: 2019 (a year of instability) and 2023 (a year of stability). Various time series models were employed, including Autoregressive Integrated Moving Average (ARIMA), seasonal ARIMA, dynamic regression ARIMA, extreme gradient boosting, simple exponential smoothing, and Long Short-Term Memory (LSTM) neural networks. Data preprocessing steps, such as missing value imputation, outlier smoothing, and logarithmic transformation, were applied to enhance data quality. Model performance was evaluated using metrics such as mean squared error, root mean squared error, mean absolute error, and mean absolute percentage error. LSTM outperformed the other models, achieving the lowest values on all of these metrics for forecasting load, generation, and deficits, demonstrating its ability to handle non-stationarity, seasonality, and extreme events. The study's key contribution is the development of an optimized LSTM framework tailored to North Benghazi's electricity patterns, incorporating a rich dataset and exogenous factors like temperature and humidity. These findings offer actionable insights for energy policymakers and grid operators, enabling proactive resource allocation, demand-side management, and enhanced grid resilience. The research highlights the potential of advanced machine learning techniques to address energy-forecasting challenges in resource-constrained regions, paving the way for a more reliable and sustainable electricity system.
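The four evaluation metrics the study reports (MSE, RMSE, MAE, MAPE) can be computed with a few standard-library lines. This sketch assumes plain Python lists of actual and predicted values rather than the study's actual tooling:

```python
import math

def forecast_errors(actual, predicted):
    """Standard forecast-error metrics over paired actual/predicted series.

    MAPE assumes no actual value is zero (division by the actual value).
    """
    errs = [a - p for a, p in zip(actual, predicted)]
    mse = sum(e * e for e in errs) / len(errs)
    return {
        "MSE": mse,
        "RMSE": math.sqrt(mse),
        "MAE": sum(abs(e) for e in errs) / len(errs),
        "MAPE": 100 * sum(abs(e / a) for a, e in zip(actual, errs)) / len(errs),
    }
```

Comparing models on several of these metrics at once, as the study does, guards against a model that looks good only because one metric is insensitive to its particular failure mode (e.g. MAE hides occasional large errors that RMSE exposes).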

Intelligent Application for Predicting Diabetes Spread Risk in the World Based on Machine Learning

By Dmytro Uhryn, Victoria Vysotska, Daryna Zadorozhna, Mariia Spodaryk, Kateryna Hazdiuk, Zhengbing Hu

DOI: https://doi.org/10.5815/ijisa.2025.03.06, Pub. Date: 8 Jun. 2025

This paper presents the development and implementation of an intelligent system for predicting the risk of diabetes spread using machine learning techniques. The core of the system relies on the analysis of the Pima Indians Diabetes dataset through k-nearest neighbours (k-NN), Random Forest, Logistic Regression, Decision Tree, and XGBoost algorithms. After pre-processing the data, including normalization and handling missing values, the k-NN model achieved an accuracy of 77.2%, precision of 80.0%, recall of 85.0%, F1-score of 83.0%, and ROC AUC of 81.9%. The Random Forest model achieved an accuracy of 81.0%, precision of 87.0%, recall of 91.0%, F1-score of 89.0%, and ROC AUC of 90.0%. The Logistic Regression model achieved an accuracy of 60.0%, precision of 93.0%, recall of 61.0%, F1-score of 74.0%, and ROC AUC of 69.0%. The Decision Tree model achieved an accuracy of 79.0%, precision of 87.0%, recall of 89.0%, F1-score of 88.0%, and ROC AUC of 83.0%. In comparison, the XGBoost model outperformed the others with an accuracy of 83.0%, precision of 85.0%, recall of 96.0%, F1-score of 90.0%, and ROC AUC of 91.0%, indicating strong prediction capabilities. The proposed system integrates both hardware (continuous glucose monitors) and software (AI-based classifiers) components, ensuring real-time blood glucose level tracking and early-stage diabetes risk prediction. The novelty lies in the proposed architecture of a distributed intelligent monitoring system and the use of ensemble learning for risk assessment. The results demonstrate the system's potential for proactive healthcare delivery and patient-centred diabetes management.
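The core of the k-NN classifier evaluated above can be sketched in a few standard-library lines. This is a minimal illustration, not the study's tuned pipeline; it omits the normalization and missing-value handling the abstract mentions:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """k-nearest-neighbour majority vote with Euclidean distance.

    train: list of (feature_vector, label) pairs; query: feature vector.
    """
    neighbors = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]
```

Because k-NN votes over raw distances, feature normalization (as performed in the study's pre-processing) is essential in practice; otherwise large-scale features such as glucose level dominate the distance.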
