International Journal of Information Engineering and Electronic Business (IJIEEB)

IJIEEB Vol. 18, No. 1, Feb. 2026

Cover page and Table of Contents: PDF (size: 954KB)

Table Of Contents

REGULAR PAPERS

Multi-Criteria Decision-Making (MCDM) Approach for Software Architecture Selection in Cloud Computing Using Evidential Reasoning and Bayesian Inference Techniques

By Jide Ebenezer Taiwo Akinsola, Akinwale Olusolabomi Akinkunmi, Ifeoluwa Michael Olaniyi, John Edet Efiong, Emmanuel Ajayi Olajubu, Ganiyu Adesola Aderonmu

DOI: https://doi.org/10.5815/ijieeb.2026.01.01, Pub. Date: 8 Feb. 2026

Choosing the optimal software architecture for cloud-based systems is a critical and complex Multi-Criteria Decision Making (MCDM) problem, characterized by multiple, often conflicting, and interdependent criteria such as performance, cost, scalability, deployment speed, security, and maintainability. This research addresses this challenge by proposing and applying an integrated MCDM methodology that leverages Evidential Reasoning (ER) and Bayesian Inference (BI). The study's primary objective is to provide a robust and transparent framework for evaluating six common architecture styles: Monolithic, Microservices, Layered, Serverless, Event-Driven, and Service-Oriented Architecture (SOA). The methods employed involved a multi-stage process. First, criteria weights were derived using the Analytic Hierarchy Process (AHP) through expert pairwise comparisons. The techniques for handling uncertainty and dependencies were central. ER was utilized to aggregate subjective and objective assessments, representing them as belief distributions to explicitly account for imprecision and ignorance. Concurrently, BI was applied to model probabilistic interdependencies between criteria (Security influencing Performance, Performance influencing Scalability and Cost) within a Bayesian Network. The Intelligent Decision System (IDS) tool facilitated the operationalization of both ER aggregation and Bayesian inference. The results of the AHP weighting revealed the priorities: Performance (0.3930), Security (0.2355), Scalability (0.1420), Maintainability (0.1160), Deployment Speed (0.0568), and Cost (0.0568). The overall evaluation, integrating these weighted criteria with ER and BI, identified Monolithic architecture as the most suitable option, achieving a utility score of 0.81. This ranking was followed by Event-Driven (0.69), SOA (0.68), Serverless (0.68), Microservices (0.65), and Layered (0.47). 
A comprehensive sensitivity analysis was conducted to assess the robustness of this decision. Crucially, the analysis demonstrated that while the Monolithic architecture was initially optimal, significant shifts in criteria weights could alter the ranking. Specifically, when the weight of Security was substantially increased (to ~0.32) and Performance decreased (to ~0.25), the Serverless architecture emerged as the new top-ranked alternative (83% utility score), surpassing Monolithic (78%). This finding underscores the critical influence of strategic priorities on architecture selection. Future studies may also focus on developing data-driven, adaptive, and domain-specific decision frameworks to enhance the robustness, transparency, and real-world applicability of MCDM approaches for cloud-based software architecture selection.
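The final ranking step of such an evaluation reduces to a weighted aggregation of per-criterion assessments. The sketch below is only illustrative: the AHP weights are those reported above, but the per-criterion scores are hypothetical, and the full ER belief-distribution and Bayesian-network machinery is omitted.

```python
# Minimal weighted-utility aggregation sketch for MCDM architecture selection.
# Weights are the AHP priorities reported in the abstract; the per-criterion
# scores for each architecture below are HYPOTHETICAL, for illustration only.
weights = {
    "Performance": 0.3930, "Security": 0.2355, "Scalability": 0.1420,
    "Maintainability": 0.1160, "Deployment Speed": 0.0568, "Cost": 0.0568,
}

# Hypothetical 0-1 scores for two of the six architectures.
scores = {
    "Monolithic": {"Performance": 0.9, "Security": 0.8, "Scalability": 0.5,
                   "Maintainability": 0.7, "Deployment Speed": 0.6, "Cost": 0.9},
    "Serverless": {"Performance": 0.6, "Security": 0.9, "Scalability": 0.9,
                   "Maintainability": 0.6, "Deployment Speed": 0.9, "Cost": 0.7},
}

def overall_utility(arch: str) -> float:
    """Simple additive model: weighted sum of per-criterion scores."""
    return sum(weights[c] * scores[arch][c] for c in weights)

ranking = sorted(scores, key=overall_utility, reverse=True)
for arch in ranking:
    print(f"{arch}: {overall_utility(arch):.3f}")
```

A sensitivity analysis like the one described above amounts to re-running this aggregation with perturbed weights and checking whether the top of `ranking` changes.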

The Analysis of Livestream as an Intervening Variable in Gen Z

By Indra A. Irawan Iha H. Hatta Yani Dewanti Setiarini

DOI: https://doi.org/10.5815/ijieeb.2026.01.02, Pub. Date: 8 Feb. 2026

This research discusses consumer behavior toward unplanned purchasing, popularly referred to as impulsive buying, in an online setting. The variables considered are Fear of Missing Out (FoMO), Price Perception, and Hedonism, and their effect on purchasing habits, with live streaming as a mediating variable for Generation Z TikTok Shop users. FoMO is a psychological state of anxiety about missing out on experiences that others appreciate. Price Perception refers to the act of interpreting price options based on obtained information in order to develop a sense of a product's pricing. Hedonism, on the other hand, refers to the tendency to seek pleasure and enjoyment. Generation Z, with its technological and social media savviness, often exhibits spontaneous buying tendencies when shopping online, particularly on heavily used apps like TikTok Shop. This research is quantitative in nature, using survey techniques with 243 Generation Z respondents who are active on TikTok Shop in Jakarta, Bogor, Depok, Tangerang, and Bekasi, locations that contribute significantly to the population and economic growth of Indonesia. The findings show a positive correlation between the FoMO and Price Perception variables and online impulsive buying, while Hedonism did not have any effect. In addition, Generation Z's engagement with live streaming shows a substantial positive effect on all three independent variables associated with impulsive consumer behavior. The results offer essential insight for e-commerce professionals who aim to create more effective marketing strategies by utilizing drivers that can strengthen and stimulate impulsive buying behavior among Generation Z consumers.

Evaluation of Coalition Fault Tolerance

By Viktor Mashkov

DOI: https://doi.org/10.5815/ijieeb.2026.01.03, Pub. Date: 8 Feb. 2026

The paper deals with alliances and coalitions that can be formed by entities. In the paper, we consider unselfish (not self-interested) entities that do their best to achieve their common goal(s) without expecting any compensation or payoffs. The number of alliance members is assumed to be limited and fixed. To solve specific tasks, alliance members form coalitions. Generally, many coalitions can be formed by alliance members, and the problem arises of selecting the best of them. This can be done on the basis of some criteria, one of which could be coalition fault tolerance. Despite the great volume of research conducted, only a few metrics have been proposed that can be used to quantify coalition fault tolerance. The paper proposes new metrics for measuring coalition fault tolerance, and a simple example explains how to compute it using the proposed metrics. A bi-objective optimization problem, in which one of the objectives is coalition fault tolerance, is also solved in the paper, using compromise programming.
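Compromise programming, named above as the solution technique, selects the alternative closest to an ideal point in normalized objective space. A minimal sketch under invented data (the coalition fault-tolerance and cost values below are hypothetical, not taken from the paper):

```python
# Compromise-programming sketch for a bi-objective coalition choice.
# Hypothetical coalitions as (fault_tolerance, cost) pairs:
# fault tolerance is to be maximized, cost minimized.
coalitions = {
    "C1": (0.90, 7.0),
    "C2": (0.75, 4.0),
    "C3": (0.60, 2.0),
}

# Ideal point: best value of each objective over all coalitions.
ideal_ft = max(ft for ft, _ in coalitions.values())
ideal_cost = min(c for _, c in coalitions.values())
# Anti-ideal (worst) values, used to normalize each objective to [0, 1].
worst_ft = min(ft for ft, _ in coalitions.values())
worst_cost = max(c for _, c in coalitions.values())

def distance(name: str, p: int = 2) -> float:
    """L_p distance from the ideal point in normalized objective space."""
    ft, cost = coalitions[name]
    d_ft = (ideal_ft - ft) / (ideal_ft - worst_ft)
    d_cost = (cost - ideal_cost) / (worst_cost - ideal_cost)
    return (d_ft ** p + d_cost ** p) ** (1.0 / p)

best = min(coalitions, key=distance)
print("Compromise solution:", best)
```

Here C1 is best on fault tolerance and C3 best on cost, so the compromise solution is the middle alternative that is nearest to the ideal point on both axes at once.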

Mathematical Model of Strategic Diagnostics of Enterprise in the Circular Economy Adaptation Process

By Oleksandr Trukhan, Zarina Poberezhna, Maksym Zaliskyi, Yanina Goncharenko

DOI: https://doi.org/10.5815/ijieeb.2026.01.04, Pub. Date: 8 Feb. 2026

The article considers the development of strategic diagnostics of an enterprise under conditions of adaptation to a circular economy. In the context of global economic and environmental changes, strategic diagnostics is an important tool for assessing the effectiveness of implemented changes and adapting to new challenges. The authors emphasize the importance of implementing the principles of a circular economy, which involve minimizing waste and reusing resources. In the process of adapting an enterprise to the conditions of a circular economy, strategic diagnostics plays an important role: it allows assessing the enterprise's readiness to transition to a new economic model, determining its competitive advantages, and developing effective development strategies. A set of strategic diagnostics tools, adapted to the conditions of a circular economy, is systematized. It is substantiated that the proposed tools allow enterprises to effectively adapt to the conditions of a circular economy, reduce environmental impact, increase competitiveness, and find new opportunities for sustainable economic development. An algorithm for using strategic diagnostics of the enterprise in the context of adaptation to the circular economy is also developed; it enables enterprises to increase competitiveness and ensure long-term business sustainability.

The Persuasiveness of Digital Transformation in the Global Competitive Economies: The Gains, The Pains and The Balancing Strategy

By Omojokun Gabriel Aju, Mokgohloa Kgabo

DOI: https://doi.org/10.5815/ijieeb.2026.01.05, Pub. Date: 8 Feb. 2026

Digital transformation has emerged as a critical driver of economic growth and competitiveness in the global economy. By integrating advanced technologies such as artificial intelligence, blockchain, cloud computing, and the Internet of Things (IoT), organizations across industries are reshaping their operational models, enhancing productivity, and promoting innovation. This study examines the multifaceted impact of digital transformation on global economic competitiveness, focusing on how businesses, governments, and societies are adapting to the rapidly evolving digital landscape. Using a systematic literature review approach, it traces the evolution of digital transformation, highlighting the key drivers, gains, and challenges, as well as the relationship between digital transformation and the global economy, with the United States of America, the United Kingdom, China, and South Africa as case studies. The study underscores the necessity of a digital transformation implementation strategy for sustaining competitive advantage in an interconnected world. The findings contribute to a deeper understanding of how digital transformation not only disrupts traditional economic paradigms but also creates opportunities for sustainable growth and innovation in the 21st century, by presenting a comprehensive digital transformation implementation strategy for the globally competitive economy.

Potential Study of Parallel Dipoles Line Technology as Tiltmeter Sensor for Geotechnical Applications

By Indra Hartarto Tambunan, Andi Ray Hutauruk, Philippians Manurung, Amsal Sinambela, Febrian Cornellius Sidabutar

DOI: https://doi.org/10.5815/ijieeb.2026.01.06, Pub. Date: 8 Feb. 2026

Tiltmeters with high accuracy and sensitivity are indispensable for various geotechnical applications, including soil deformation monitoring, structural inclination analysis, and seismic activity assessment. This study proposes a novel tiltmeter system utilizing Parallel Dipole Line (PDL) technology, where a diamagnetic graphite cylinder is levitated within a camelback potential field generated by parallel magnetic dipoles. Variations in the vertical position of the graphite cylinder correspond to tilt angles, which are captured by a high-resolution imaging system and processed using a Jetson Nano microcomputer for real-time analysis. Experimental results show that shorter graphite lengths can increase the measurement range. One of the test results is that 6 mm graphite can measure inclination in the range of -1.00000° to +0.99999°. In contrast, longer graphite, such as 12 mm, only reaches a range of -0.60000° to +0.60434°. In addition, the increase in graphite length and the reduction in magnet dimensions significantly help reduce oscillations during measurement, which ultimately improves system stability. The optimized PDL-based tiltmeter is capable of detecting inclination with a high resolution of up to 10⁻⁵ degrees, with critical damping used to eliminate oscillatory interference. These findings confirm that the PDL tiltmeter system offers much better precision, stability, and durability than conventional methods, making it a potential innovative tool for high-resolution geotechnical and structural monitoring.

Exploring and Implementing Container Scheduling Methods: A Comparative Review and Practical Approach

By Kanika Sharma, Parul Khurana

DOI: https://doi.org/10.5815/ijieeb.2026.01.07, Pub. Date: 8 Feb. 2026

Container-based virtualization has become prominent as a lightweight form of virtualization due to its scalability, resource utilization, and portability, especially in microservices. The container scheduler plays an essential role in container services, optimizing performance and reducing overall cost by managing load balancing. Although containers offer many benefits, resource allocation is one of the major concerns associated with container technology. This paper systematically reviews the distinct scheduling techniques used for containers in a cloud environment, discussing existing scheduling techniques and their shortcomings in detail. In the review process, the various issues and challenges in container technology have been identified and discussed. Based on crucial elements, including performance metrics such as CPU utilization, memory utilization, and load balancing, it gives an in-depth comparison of existing scheduling techniques, including Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), Bee Colony Optimization (BCO), Chicken Swarm Optimization (CSO), and Genetic Algorithm (GA), outlining their benefits and drawbacks. This study also proposes a hybrid framework for secure and efficient container scheduling. This framework can be implemented in the future to provide better results compared to existing approaches.
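As a point of reference for the scheduling problem these metaheuristics address, the sketch below implements a simple least-loaded (greedy) placement baseline. It is not one of the reviewed techniques (ACO, PSO, BCO, CSO, GA), and the node and container names are invented for illustration.

```python
# Baseline least-loaded container scheduler (illustrative sketch only).
# The reviewed metaheuristics optimize placements like these globally;
# this greedy rule just picks the node with the most free CPU each time.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    cpu_capacity: float
    cpu_used: float = 0.0
    containers: list = field(default_factory=list)

def schedule(containers, nodes):
    """Place each (name, cpu_request) container on the least-loaded node."""
    for cname, cpu in containers:
        best = max(nodes, key=lambda n: n.cpu_capacity - n.cpu_used)
        if best.cpu_capacity - best.cpu_used < cpu:
            raise RuntimeError(f"no capacity left for {cname}")
        best.cpu_used += cpu
        best.containers.append(cname)
    return nodes

nodes = [Node("node-a", 4.0), Node("node-b", 4.0)]
placement = schedule([("web", 1.0), ("db", 2.0), ("cache", 1.0)], nodes)
for n in placement:
    print(n.name, n.containers, f"{n.cpu_used}/{n.cpu_capacity} CPU")
```

A metaheuristic scheduler would search over many candidate placements and score each one on metrics such as load balance and cost, rather than committing greedily container by container.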

Profit Forecasting for Daily Pharmaceutical Sales Using Traditional, Shallow, and Deep Neural Networks: A Case Study from Sabha City, Libya

By Mansour Essgaer, Asma Agaal, Amna Abbas, Rabia Al Mamlook

DOI: https://doi.org/10.5815/ijieeb.2026.01.08, Pub. Date: 8 Feb. 2026

Accurate profit forecasting is critical for small-scale pharmacies, particularly in resource-constrained environments where financial decisions must be both timely and data-informed. This study investigates the predictive performance of sixteen regression models for daily profit forecasting using transactional data collected from a single local pharmacy in Sabha, Libya, over a 14-month period. An exploratory data analysis revealed strongly right-skewed distributions in sales, cost, and profit, as well as pronounced temporal patterns, including seasonal peaks during spring and early summer and weekly profit clustering around weekends. After outlier treatment using the interquartile range method, the sixteen regression models were developed and evaluated, encompassing linear models (Linear, Ridge, Lasso, ElasticNet), tree-based models (Decision Tree, Random Forest, Extra Trees, Gradient Boosting, AdaBoost), proximity-based models (K-Nearest Neighbors), kernel-based models (Support Vector Regression), and neural architectures (Multi-Layer Perceptron, Convolutional Neural Network, Long Short-Term Memory, Gated Recurrent Unit). The models were assessed using Mean Absolute Error, Mean Squared Error, Root Mean Squared Error, and the R-squared score. The results consistently showed that tree-based ensemble models, particularly Extra Trees and LightGBM, achieved the highest accuracy, with R² values of 0.978 and 0.975 respectively, significantly outperforming neural and linear models. Learning curves and residual plots further confirmed the superior generalization and robustness of these models. We acknowledge that the dataset size (424 records) and the deterministic relationship between sales, costs, and profit influence these metrics.
The study highlights the importance of model selection tailored to domain-specific data characteristics and suggests that well-tuned ensemble methods may offer reliable, interpretable, and scalable solutions for profit forecasting in similar low-resource retail environments. However, broad claims of usefulness for all low-resource settings should be tempered by the limited scope of this dataset. Future work should consider longer-term data and external economic indicators to further improve model reliability, and should focus on operational deployment strategies, investigating how these models can be integrated into daily pharmacy workflows despite real-time data constraints.
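The comparison setup described above can be sketched as follows, using synthetic data in place of the Sabha pharmacy dataset; the model choices and hyperparameters here are illustrative, not the study's tuned configurations.

```python
# Sketch of the model-comparison setup: fit two of the paper's model
# families on SYNTHETIC daily-profit-like data and compare test-set R^2.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(400, 3))  # e.g. sales, cost, day-of-week proxy
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + np.sin(X[:, 2]) + rng.normal(0, 0.3, 400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "ExtraTrees": ExtraTreesRegressor(n_estimators=200, random_state=0),
    "Linear": LinearRegression(),
}
r2 = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    r2[name] = r2_score(y_te, model.predict(X_te))
    print(f"{name}: R^2 = {r2[name]:.3f}")
```

On real pharmacy data with skewed distributions and temporal structure, the gap between the ensemble and linear families would be larger than on this mostly linear synthetic example.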

Insan AI: Integrated Artificial Intelligence Learning Platform

By Winanti, Yoga Prihastomo, Yulius Denny Prabowo, Achmad Sidik, Penny Hendriyati

DOI: https://doi.org/10.5815/ijieeb.2026.01.09, Pub. Date: 8 Feb. 2026

The integration of artificial intelligence (AI) in education presents significant challenges, such as gaps in educators' digital adaptability, ethical considerations, and inconsistent infrastructure. This study details the development and validation of the Insan AI Platform, an integrated learning solution designed to address these obstacles through adaptive AI tools for both teachers and students. The platform was developed using a user-centered prototyping methodology, drawing on comprehensive literature analysis, Focus Group Discussions with 26 educational stakeholders, and expert interviews. Key features include an AI Content Generator, a Virtual Tutor, and a Learning Analytics dashboard, all intended to facilitate personalized learning experiences and enhance teaching efficiency. User Acceptance Testing with 20 teachers demonstrated the platform's functional robustness, with perfect pass rates on all core features and a high usability score (SUS: 84.0). The platform architecture integrates multiple AI application programming interfaces (APIs) while maintaining responsive performance under varied network conditions. These findings indicate that the Insan AI Platform effectively meets user requirements and provides a strong foundation for broader educational implementation. Future development will focus on incorporating multilingual support and advanced learning analytics capabilities. According to a questionnaire completed by 26 users, there was a score increase of 22.42 after using the Insan AI platform, indicating that the application has successfully met user requirements.
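For context on the usability figure reported above, the standard System Usability Scale scoring rule (odd-numbered items contribute their score minus one, even-numbered items five minus their score, summed and multiplied by 2.5) can be sketched as follows. The example responses are hypothetical, not the study's data.

```python
# Standard System Usability Scale (SUS) scoring; example answers are
# HYPOTHETICAL, not the study's questionnaire responses.
def sus_score(responses):
    """responses: list of 10 Likert answers (1-5) in questionnaire order."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (index 0, 2, ...) are positively worded;
        # even-numbered items are negatively worded and reverse-scored.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # scale the 0-40 raw total to 0-100

print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # → 90.0
```

A mean SUS of 84.0 across raters sits well above the commonly cited average of 68, consistent with the strong acceptance reported above.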

Information Engineering for Data-Driven Analysis of h-Index Formation Across Academic Career Stages Using Large-Scale Bibliometric Parameters, Statistical and Clustering Methods

By Yurii Ushenko, Victoria Vysotska, Serhii Vladov, Zhengbing Hu, Lyubomyr Chyrun

DOI: https://doi.org/10.5815/ijieeb.2026.01.10, Pub. Date: 8 Feb. 2026

In the context of the globalisation of the scientific space and the growing role of scientometric indicators, the Hirsch index (h-index) remains one of the key tools for assessing scientific performance. At the same time, the influence of individual factors on the h-index varies significantly across the stages of a scientist's academic career, necessitating their comparative analysis. The purpose of this work is to conduct a comparative study of the Hirsch index and the factors that influence its formation, considering both novice and experienced scientists. The study employed descriptive statistics, visual analysis, time-series smoothing (Kendall's method, Pollard's method, exponential and median smoothing), correlation analysis (Pearson's coefficients), and the k-means clustering method. The study was conducted on two large datasets representing novice and experienced scientists. It was found that the average h-index of experienced scientists is 37.78, approximately 2.6 times that of beginner scientists (14.59). Correlation analysis revealed a weak or negative relationship between the h-index and self-citation, with the strongest correlation observed between the h-index and co-authorship (r = 0.68–0.80). The clustering identified six clusters, including one that unites scientific leaders with extremely high h-index values. The study's results confirm that, in the early stages of a scientific career, geographical and institutional factors play a significant role. In contrast, for experienced scientists, the Hirsch index becomes more predictable and is determined by the quality of scientific publications, the level of citation, and practical cooperation within scientific teams.
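The correlation-plus-clustering analysis described above can be sketched on synthetic data. The simulated link between co-authorship and the h-index below is an assumption made to mirror the reported finding; it is not the study's datasets.

```python
# Sketch of Pearson correlation + k-means clustering on SYNTHETIC
# bibliometric data (the real study used two large scientist datasets).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n = 300
coauthors = rng.poisson(20, n).astype(float)   # co-authorship counts
self_cites = rng.poisson(30, n).astype(float)  # self-citations (independent)
# Simulate an h-index driven by co-authorship, mirroring the reported finding.
h_index = 5 + 0.7 * coauthors + rng.normal(0, 3, n)

# Pearson correlations with the h-index.
r_coauthor = np.corrcoef(h_index, coauthors)[0, 1]
r_selfcite = np.corrcoef(h_index, self_cites)[0, 1]
print(f"r(h, co-authorship) = {r_coauthor:.2f}")
print(f"r(h, self-citation) = {r_selfcite:.2f}")

# k-means clustering into six groups, as in the study.
features = np.column_stack([h_index, coauthors, self_cites])
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```

Because only co-authorship enters the simulated h-index, the first correlation is strong and the second is near zero, reproducing the qualitative pattern the abstract reports.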
