ISSN: 2310-9025 (Print)
ISSN: 2310-9033 (Online)
DOI: https://doi.org/10.5815/ijmsc
Website: https://www.mecs-press.org/ijmsc
Published By: MECS Press
Frequency: 4 issues per year
Number(s) Available: 43
IJMSC is committed to bridging the theory and practice of mathematical sciences and computing. IJMSC publishes original, peer-reviewed, high-quality articles in the areas of mathematical sciences and computing. IJMSC is a well-indexed scholarly journal and is indispensable reading and a key reference for people working at the cutting edge of mathematical sciences and computing applications.
IJMSC has been abstracted or indexed by several world-class databases: Google Scholar, Microsoft Academic Search, CrossRef, CNKI, Baidu Wenku, JournalTOCs, etc.
IJMSC Vol. 11, No. 2, Jun. 2025
REGULAR PAPERS
Various authors from around the world have extended the fuzzy concept to study uncertainty and to define its degree of certainty in various real-life experiments. At the same time, many authors have discussed the shortcomings of the existing definition of fuzzy sets. However, no author has properly highlighted the problem that the definition does not logically follow the two main classical set-theoretic laws. To address this issue, an imprecise set is introduced as an extended definition of the fuzzy set, where the new concept applies two parameters, namely membership and reference functions, instead of one, and defines the uncertainty problem in a more convenient manner than the existing approach. In our previous work, we studied imprecise subgroups using this new concept introduced by Baruah. In this paper, using the concept of the complement of an imprecise subgroup, we introduce the anti-imprecise subgroup and establish some of its properties with examples. The imprecise subgroup is an extended version of fuzzy group theory developed using Baruah's definition of an imprecise set. In addition, we expect that an application developed from the anti-imprecise subgroup can be used to resolve various networking problems.
Mathematical modeling plays a crucial role in epidemiology by helping us understand how an epidemic unfolds under different conditions. Respiratory infectious diseases have emerged repeatedly throughout history, and such viruses have significantly impacted all aspects of life. In the absence of a definitive treatment, vaccination and Non-Pharmaceutical Interventions (NPIs) such as social distancing, handwashing, wearing face masks, quarantine, isolation, and contact tracing have been essential in controlling their spread. This study develops a deterministic mathematical model to explore the dynamics of respiratory infectious diseases under key mitigation measures, including vaccination, face mask usage, quarantine, and isolation. The system of Ordinary Differential Equations (ODEs) is solved using Wolfram Mathematica, while the Next Generation Matrix (NGM) method is employed to determine the basic reproduction number. Stability analysis is conducted using the Jacobian matrix, and numerical simulations are carried out in Python using Jupyter Notebook. The analysis indicates that the model has a disease-free equilibrium (DFE), which is locally asymptotically stable when the basic reproduction number is less than one. This suggests that respiratory infectious diseases can be effectively controlled if vaccination and NPIs are implemented together. Sensitivity analysis highlights that the most critical factors for eradicating respiratory infectious diseases are the vaccine coverage rate (the proportion of susceptible individuals vaccinated) and vaccine efficacy.
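A minimal sketch of the kind of workflow this abstract describes, assuming a heavily simplified SVIR-type model with vaccination rather than the authors' full system; parameter values are illustrative placeholders, and the basic reproduction number follows the next-generation-matrix idea for this reduced model.

```python
# Simplified SVIR model with vaccination coverage (phi) and efficacy (eps),
# integrated with SciPy; R0 from the next-generation-matrix result for this
# reduced model. Not the paper's exact equations or parameters.
import numpy as np
from scipy.integrate import odeint

def svir(y, t, beta, phi, eps, gamma, mu):
    S, V, I, R = y
    N = S + V + I + R
    dS = mu * N - beta * S * I / N - phi * S - mu * S
    dV = phi * S - (1 - eps) * beta * V * I / N - mu * V
    dI = beta * S * I / N + (1 - eps) * beta * V * I / N - (gamma + mu) * I
    dR = gamma * I - mu * R
    return [dS, dV, dI, dR]

def basic_reproduction_number(beta, phi, eps, gamma, mu):
    # DFE fractions: S* = mu/(mu+phi), V* = phi/(mu+phi); the NGM here is 1x1.
    s_star = mu / (mu + phi)
    v_star = phi / (mu + phi)
    return beta * (s_star + (1 - eps) * v_star) / (gamma + mu)

if __name__ == "__main__":
    params = dict(beta=0.4, phi=0.05, eps=0.8, gamma=0.1, mu=1 / (70 * 365))
    print("R0 =", basic_reproduction_number(**params))
    t = np.linspace(0, 300, 301)
    sol = odeint(svir, [0.99, 0.0, 0.01, 0.0], t, args=tuple(params.values()))
```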
This research focuses on developing an automated framework for evaluating distress on flexible and rigid pavement surfaces through deep learning and measurement algorithms, enhancing infrastructure monitoring by efficiently identifying, assessing, and measuring road distresses. The methodology begins with identifying road stretches from ground-level images, followed by capturing photos of distresses and applying algorithms to measure their dimensions accurately. A YOLOv5 model is developed to evaluate the length and width of identified distresses, with an exploration of the relationship between camera position and measurement accuracy. Physical measurements using tape are employed for validation, ensuring that the automated results align with real-world dimensions. Results indicate average errors of 26.1% for length and 26.9% for width on flexible pavement, and average percentage errors of about 29% for length and about 1% for width on rigid pavement. This highlights the importance of precise measurements for effective road rehabilitation. The integration of computer vision in road maintenance, validated through physical measurements, promises significant improvements in the accuracy, efficiency, and resilience of road networks.
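A hedged illustration of the post-detection measurement step only: converting a detector's bounding box (here in normalized YOLO format) into physical length and width using a simple pixels-per-centimetre calibration factor. The paper's actual camera-position model is not reproduced, and the scale factor below is a placeholder assumption.

```python
# Convert a normalized (x, y, w, h) detection into physical dimensions
# using an assumed pixels-per-cm calibration of the camera setup.
def box_to_dimensions(box_xywh_norm, image_w_px, image_h_px, px_per_cm):
    """box_xywh_norm: (x_center, y_center, width, height), all in [0, 1]."""
    _, _, w_norm, h_norm = box_xywh_norm
    width_px = w_norm * image_w_px
    height_px = h_norm * image_h_px
    # Report the longer side as the distress length, the shorter as its width.
    length_cm = max(width_px, height_px) / px_per_cm
    width_cm = min(width_px, height_px) / px_per_cm
    return length_cm, width_cm

# Example: a detection covering 40% x 5% of a 1920x1080 frame at 8 px/cm.
print(box_to_dimensions((0.5, 0.5, 0.40, 0.05), 1920, 1080, px_per_cm=8.0))
```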
Emotions significantly influence human behaviour, decision-making, and communication, making their accurate recognition essential for various applications. This study introduces a novel approach for emotion extraction from electrocardiogram (ECG) and galvanic skin response (GSR) signals using Bidirectional Long Short-Term Memory (BiLSTM) networks. Unlike conventional emotion recognition methods that rely on facial expressions or self-reports, our model utilizes physiological signals to capture emotional states with high precision. ECG provides insights into cardiac activity, while GSR reflects changes in skin conductance, both serving as reliable indicators of emotional responses. By leveraging advanced signal processing techniques and deep learning algorithms, the model effectively identifies intricate patterns within these biosignals, enabling accurate emotion classification. Experimental validation demonstrates the model’s effectiveness in distinguishing between different emotional states, surpassing traditional methods. This research contributes to affective computing and human-computer interaction (HCI) by enhancing the capability of intelligent systems to recognize and respond to human emotions, paving the way for applications in mental health monitoring, driver assistance systems, and adaptive user interfaces.
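A minimal sketch, assuming PyTorch, of the kind of BiLSTM classifier described: each input is a window of two-channel physiological data (ECG + GSR) and the output is an emotion class. Layer sizes and the number of classes are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class BiLSTMEmotion(nn.Module):
    def __init__(self, n_channels=2, hidden=64, n_classes=4):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, channels)
        out, _ = self.lstm(x)             # out: (batch, time, 2*hidden)
        return self.head(out[:, -1, :])   # classify from the last time step

model = BiLSTMEmotion()
dummy = torch.randn(8, 256, 2)             # 8 windows of 256 samples, ECG + GSR
logits = model(dummy)                       # (8, 4) class scores
```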
People around the world use fresh water daily for drinking, sanitation, and washing. At the same time, they discharge wastewater into canals, which can be harmful to both human health and the ecosystem of surface water sources. A significant amount of water is consumed for washing purposes. However, it is possible to disinfect and purify this large volume of wastewater for reuse. The process of treating used wastewater is known as refinement. This study aims to develop a two-stage stochastic recourse model that refines wastewater before it is released into the environment. The goal is to ensure that the refined wastewater does not harm the ecosystem. The treated water can then be repurposed for various secondary uses. The proposed model will account for uncertainties related to the availability of water from the supplying authority. To evaluate the effectiveness of this model, we will compare the costs of the water supply system both with and without refinement. The advantages of the proposed model will be assessed through calculations of the expected value of perfect information (EVPI), the value of the stochastic solution (VSS), the recourse solution (RS), the wait-and-see solution (WS), and the expected solution based on first-stage decisions (EEV). Additionally, a risk-averse (RA) optimization model will be used to analyze the sensitivity of system costs.
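A small sketch of how the evaluation quantities named above relate for a cost-minimization recourse model; the numeric values are placeholders, since in the study they would come from solving the corresponding optimization problems.

```python
# Standard relations for a minimization problem: WS <= RS <= EEV.
def stochastic_measures(RS, WS, EEV):
    """RS: optimal expected cost of the recourse (here-and-now) problem,
    WS: wait-and-see expected cost, EEV: expected cost of using the
    expected-value (first-stage) solution."""
    EVPI = RS - WS   # what perfect information about the uncertainty would be worth
    VSS = EEV - RS   # what the stochastic model gains over the mean-value model
    return EVPI, VSS

print(stochastic_measures(RS=120.0, WS=110.0, EEV=135.0))  # -> (10.0, 15.0)
```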
With the reform of the Chinese economic system, the development of enterprises is facing many risks and challenges. In order to understand the operating state of enterprises, it is necessary to apply relevant methods to evaluate enterprise performance. Taking the Industrial and Commercial Bank of China as an example, this paper selects its financial data from 2018 to 2021. Firstly, DuPont analysis is applied to decompose the return on equity into the product of the profit margin on sales, the total assets turnover ratio, and the equity multiplier. The paper then analyzes the effect of changes in these three factors on the return on equity using the chain substitution method. The results show that the effect of the profit margin on sales on the return on equity decreases year by year and turns from negative to positive. The effect of the total assets turnover ratio on the return on equity changes from positive to negative and then back to positive, while the effect of the equity multiplier is the opposite. These results provide a direction for adjusting the return on equity of the Industrial and Commercial Bank of China. Finally, based on the results, some suggestions are put forward for the development of the Industrial and Commercial Bank of China.
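A hedged sketch of the chain substitution method applied to the DuPont identity ROE = profit margin × total assets turnover × equity multiplier. The figures below are placeholders, not ICBC's actual data; the point is how each factor's contribution to the change in ROE is isolated by substituting one factor at a time.

```python
def chain_substitution(base, current):
    pm0, at0, em0 = base       # prior-year profit margin, asset turnover, equity multiplier
    pm1, at1, em1 = current    # current-year values
    roe0 = pm0 * at0 * em0
    effect_pm = pm1 * at0 * em0 - roe0                  # substitute profit margin first
    effect_at = pm1 * at1 * em0 - pm1 * at0 * em0       # then asset turnover
    effect_em = pm1 * at1 * em1 - pm1 * at1 * em0       # finally equity multiplier
    total = effect_pm + effect_at + effect_em           # equals ROE1 - ROE0 exactly
    return effect_pm, effect_at, effect_em, total

print(chain_substitution(base=(0.30, 0.040, 11.5), current=(0.32, 0.038, 11.8)))
```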
In the software development industry, ensuring software quality holds immense significance due to its direct influence on user satisfaction, system reliability, and the overall end-user experience. Traditionally, the development process involved identifying and rectifying defects after the implementation phase, which could be time-consuming and costly. This study examines software development methodologies, with a specific emphasis on Test-Driven Development, to evaluate its effectiveness in improving software quality. The study employs a mixed-methods approach, combining quantitative surveys and qualitative interviews to comprehensively investigate the impact of Test-Driven Development on various facets of software quality. The survey findings reveal that Test-Driven Development offers substantial benefits in terms of early defect detection, leading to reduced costs and effort in rectifying issues during the development process. Moreover, Test-Driven Development encourages improved code design and maintainability, fostering the creation of modular and loosely coupled code structures. These results underscore the pivotal role of Test-Driven Development in elevating code quality and maintainability. Comparative analysis with traditional development methodologies highlights Test-Driven Development's effectiveness in enhancing software quality, as rated highly by respondents. Furthermore, it clarifies Test-Driven Development's positive impact on user satisfaction, overall product quality, and code maintainability. Challenges related to Test-Driven Development adoption are identified, such as the initial time investment in writing tests and difficulties adapting to changing requirements. Strategies to mitigate these challenges are proposed, contributing to the practical application of Test-Driven Development. The study offers valuable insights into the efficacy of Test-Driven Development in enhancing software quality. It not only highlights the benefits of Test-Driven Development but also provides a framework for addressing challenges and optimizing its utilization. This knowledge is invaluable for software development teams, project managers, and quality assurance professionals, facilitating informed decisions regarding adopting and implementing Test-Driven Development as a quality assurance technique in software development.
The process of making decisions on software architecture is of the greatest significance for the achievement of a software system's success. Software architecture establishes the framework of the system, specifies its characteristics, and has significant and major effects across the whole life cycle of the system. The complicated characteristics of the software development context and the significance of the problem have caused the research community to build various methodologies focused on supporting software architects in improving their decision-making abilities. Despite these efforts, the implementation of such systematic methodologies appears to be somewhat constrained in practical application. Moreover, decision-makers must overcome unexpected difficulties due to the varying software development processes that propose distinct approaches for architecture design. Understanding these design approaches helps to develop the architectural design framework. In the area of software architecture, a significant change has occurred wherein the focus has shifted from primarily identifying the result of the architecting process, which was primarily expressed through the representation of components and connectors, to documenting architectural design decisions and the underlying reasoning behind them. This shift culminates in the creation of an architectural design framework. A correct decision-making approach is therefore needed to design the software architecture. The present study analyzes design decisions and proposes a new design decision model for software architecture. This study introduces a new approach to the decision-making model, wherein software architecture design is viewed in terms of specific decisions.
Since the inception of Blockchain, computer databases have been evolving into innovative technologies. As new technologies emerge, the use of Blockchain also flourishes. All technologies built on Blockchain rely on a common algorithm to operate: the consensus algorithm, the process that assures mutual agreement and stores information in the decentralized database of the network. Blockchain's biggest drawback is its exposure to scalability limitations. However, using the correct consensus for the relevant work can ensure efficiency in data storage, transaction finality, and data integrity. In this paper, a comparison study has been made among the following consensus algorithms: Proof of Work (PoW), Proof of Stake (PoS), Proof of Authority (PoA), and Proof of Vote (PoV). This study aims to provide readers with elementary knowledge about blockchain, more specifically its consensus protocols. It covers their origins, how they operate, and their strengths and weaknesses. We have studied these consensus protocols in detail and uncovered some of their advantages and disadvantages with respect to characteristics such as security, energy efficiency, scalability, and IoT (Internet of Things) compatibility. This information will assist future researchers in understanding the characteristics of our selected consensus algorithms.
Cloud computing is a widely accepted computing environment, and its services are widely available. However, energy consumption is one of the major obstacles to cloud computing becoming a form of green computing. Many electronic resources, such as processing and storage devices at both the client and server sides, and network devices such as switches and routers, are the main sources of energy consumption in the cloud, and additional power is required during computation to cool the IT load. Due to this high consumption, cloud resources incur high energy costs during service activities and contribute more carbon emissions to the atmosphere. These two issues have inspired cloud companies to develop renewable cloud sustainability regulations to control energy costs and the rate of CO2 emission. The main purpose of this paper is to develop a green computing environment by saving the energy of cloud resources, using the specific approach of identifying the computing resources required during the computation of cloud services. Only the required computing resources remain ON (working state), and the rest are turned OFF (sleep/hibernate state) to reduce energy use in cloud data centers. This approach is expected to be more efficient than other available approaches based on cloud service scheduling or migration and virtualization of services in the cloud network. It reduces the energy usage of cloud data centers by applying a power management scheme (ON/OFF) to computing resources. The proposed approach helps convert cloud computing into green computing by identifying an appropriate number of cloud computing resources, such as processing nodes, servers, disks, and switches/routers, during any service computation on the cloud to handle the energy saving or environmental impact.
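A simplified sketch of the ON/OFF idea described above: given the resource demand of the current service load, keep only as many nodes powered on as are needed and put the rest to sleep. Node capacities and the workload are illustrative assumptions, not the paper's scheme.

```python
import math

def plan_power_states(required_units, capacity_per_node, total_nodes):
    # Number of nodes that must stay ON to cover the demand; the rest sleep.
    needed = min(total_nodes, math.ceil(required_units / capacity_per_node))
    return ["ON"] * needed + ["SLEEP"] * (total_nodes - needed)

# e.g. a load needing 23 capacity units, nodes of 4 units each, 10 nodes available
print(plan_power_states(23, 4, 10))   # 6 ON, 4 SLEEP
```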
Fog computing extends cloud computing by transferring computation to the edge of the network, such as mobile collaborative devices or fixed nodes with built-in data storage, computing, and communication capabilities. Fog offers the advantages of enhanced efficiency, better security, network bandwidth savings, and mobility. In order to present the essential details of Fog computing, we describe the characteristics of this area and distinguish it from cloud computing research. Cloud computing is a developing technology that provides computing resources for a specific task on a pay-per-use basis. Cloud computing offers services through three distinct models, and the cloud provides cheap, centrally managed resources for reliable computing of required tasks. This paper compares and characterizes Fog and cloud computing, which differ in design, deployment, services, and devices for organizations and users. The comparison shows that Fog provides a more flexible infrastructure and better data-processing service by consuming low network bandwidth instead of shifting the whole data to the cloud.
Predicting human emotion from speech is now an important research topic. One's mental state can be understood through emotion. The proposed research work addresses emotion recognition from human speech. The proposed system plays a significant role in recognizing emotion while someone is talking and has great use in a smart home environment: one can understand the emotion of another person who is at home or in another place. Universities, service centers, or hospitals can obtain a valuable decision support system with this emotion prediction system. Features such as MFCC (Mel-Frequency Cepstral Coefficients) and LPC are extracted from the audio signal samples. Audio samples are collected by recording speeches. A test was also applied by combining the self-collected dataset and the popular RAVDESS dataset. The self-collected dataset is named ABEG. MFCC and LPC features are used in this study for training and testing to predict emotion. The study covers the angry, happy, and neutral emotion classes. Different machine learning algorithms are applied, and their results are compared with each other. Logistic regression performs well compared to the other ML algorithms.
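A hedged sketch of the feature-extraction-plus-classifier pipeline described: MFCC features averaged over time (via librosa) feeding a scikit-learn logistic regression. The file paths and labels are placeholders; LPC extraction and the ABEG/RAVDESS data handling are omitted.

```python
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def mfcc_features(path, n_mfcc=13):
    y, sr = librosa.load(path, sr=None)                      # load the speech sample
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, frames)
    return mfcc.mean(axis=1)                                  # one vector per utterance

# Hypothetical training data: paths and emotion labels (angry / happy / neutral).
paths = ["angry_01.wav", "happy_01.wav", "neutral_01.wav"]
labels = ["angry", "happy", "neutral"]
X = np.vstack([mfcc_features(p) for p in paths])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print(clf.predict(X))
```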
Security in digital communication is becoming more important as the number of systems connected to the internet grows day by day. It is necessary to protect secret messages during transmission over insecure channels of the internet. Thus, data security becomes an important research issue. Steganography is a technique that embeds secret information into a carrier such as images, audio files, text files, and video files so that it cannot be observed. In this paper, a new spatial-domain image steganography method is proposed to ensure the privacy of digital data during transmission over the internet. In this method, least significant bit substitution is proposed, where the information is embedded in random bit positions of a random pixel location of the cover image using a Pseudo Random Number Generator (PRNG). The proposed method uses a 3-3-2 approach to hide a byte in a pixel of a 24-bit color image. The method uses a PRNG in two different stages of the embedding process: the first is used to select random pixels, and the second PRNG is used to select random bit positions within the R, G, and B values of a pixel to embed one byte of information. Due to this randomization, the security of the system is expected to increase, and the method achieves a very high maximum hiding capacity, which signifies the importance of the proposed method.
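A minimal sketch of the 3-3-2 idea: one secret byte is split as 3 bits into R, 3 bits into G, and 2 bits into B of a single 24-bit pixel. For brevity this version writes to the least significant bits; the paper's two-stage PRNG randomization of pixel and bit positions is omitted.

```python
def embed_byte(pixel, byte):
    r, g, b = pixel
    r = (r & ~0b111) | ((byte >> 5) & 0b111)   # top 3 bits of the secret byte
    g = (g & ~0b111) | ((byte >> 2) & 0b111)   # next 3 bits
    b = (b & ~0b11) | (byte & 0b11)            # last 2 bits
    return (r, g, b)

def extract_byte(pixel):
    r, g, b = pixel
    return ((r & 0b111) << 5) | ((g & 0b111) << 2) | (b & 0b11)

stego = embed_byte((120, 200, 33), ord("A"))
assert extract_byte(stego) == ord("A")
print(stego)
```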
To measure the difference between two fuzzy sets / intuitionistic fuzzy sets, we can use distance measures and dissimilarity measures between fuzzy sets. Characterizing the distance/dissimilarity measure between fuzzy sets / intuitionistic fuzzy sets is important, as it has applications in different areas: pattern recognition, image segmentation, and decision making. The picture fuzzy set (PFS) is a generalization of the fuzzy set and the intuitionistic fuzzy set, and therefore has many applications. In this paper, we introduce the concepts of the difference between PFS-sets and of distance and dissimilarity measures between picture fuzzy sets, and also provide formulas for determining these values. We also present an application of dissimilarity measures in multi-attribute decision making.
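One commonly used normalized-Hamming-type distance between two picture fuzzy sets, shown as a hedged illustration; the paper introduces its own formulas, which may differ. Each element carries a (membership, neutral, non-membership) triple whose degrees sum to at most 1.

```python
def pfs_distance(A, B):
    """A, B: lists of (mu, eta, nu) triples over the same universe."""
    n = len(A)
    total = sum(
        (abs(a[0] - b[0]) + abs(a[1] - b[1]) + abs(a[2] - b[2])) / 3
        for a, b in zip(A, B)
    )
    return total / n

A = [(0.6, 0.2, 0.1), (0.3, 0.4, 0.2)]
B = [(0.5, 0.3, 0.1), (0.4, 0.3, 0.2)]
print(pfs_distance(A, B))   # a value in [0, 1]; 0 means identical sets
```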
Fast development in ubiquitous computing and the growth of radio/wireless and mobile technologies have led to an extended application space for Radio Frequency Identification (RFID), wireless sensors, and the Internet of Things (IoT). Numerous applications are safety- and privacy-sensitive. The rise of new equipment has enabled intelligent methods of linking physical devices and the computing world through numerous network interfaces. Consequently, it is essential to take note of the risks arising from these communications. In wireless systems, RFID and sensor networks are widely deployed in military, commercial, and automotive applications. With the extensive use of wireless and mobile devices, security has therefore become a major concern. As a consequence, the need for highly secure encryption and decryption primitives in such devices is more important than ever before.
Many different methods have been applied and used in attempts to solve higher-order nonlinear boundary value problems (BVPs). The Galerkin weighted residual method (GWRM) is widely used to solve BVPs. The main aim of this paper is to find approximate solutions of fifth, seventh, and ninth order nonlinear boundary value problems using the GWRM. A trial function, namely Bezier polynomials, is assumed and made to satisfy the given essential boundary conditions. To investigate the effectiveness of the current method, some numerical examples were considered. The results are presented both graphically and numerically. The numerical solutions are in good agreement with the exact results and achieve high accuracy. The present method is quite efficient and yields better results when compared with existing methods. All problems are solved using the software MATLAB R2017a.
Quantum computing is a computational framework based on quantum mechanics that has received a great deal of attention in the past few decades. In comparison to traditional computers, it has achieved amazing performance on several specialized tasks. Quantum computing is the study of quantum computers that use quantum mechanical phenomena such as entanglement, superposition, annealing, and tunneling to solve problems that humans cannot solve in their lifetime. This article offers a brief outline of what is happening in the field of quantum computing, as well as the current state of the art. It also summarizes the features of quantum computing in terms of major elements such as qubit computation, quantum parallelism, and reverse computing. The study investigates the source of a quantum computer's great computing capability in its use of quantum entangled states. It also emphasizes that quantum computer research requires a combination of the most sophisticated sciences, such as computer technology, micro-physics, and advanced mathematics.
Among the factors affecting face recognition and verification, the aging of individuals is a particularly challenging one. Unlike other factors such as pose, expression, and illumination, aging is uncontrollable, personalized, and takes place throughout human life. Thus, while the effects of factors such as head pose, illumination, and facial expression on face recognition can be minimized by using images from controlled environments, the effect of aging cannot be so controlled. This work exploits the personalized nature of aging to reduce its effect on face recognition, so that an individual can be correctly recognized across his/her different age-separated face images. To achieve this, an individualized face pairing method was developed to pair faces against entire sets of faces grouped by individual; similarity score vectors are then obtained for both matching and non-matching image-individual pairs, and the vectors are used for age-invariant face recognition. This model has the advantage of being able to capture all possible face matchings (intra-class and inter-class) within a face dataset without having to compute all possible image-to-image pairs. This reduces the computational demand of the model without compromising the impact of the aging factor on the identity of the human face. The developed model was evaluated on the publicly available FG-NET dataset, two subsets of the CACD dataset, and a locally obtained FAGE dataset using leave-one-person-out (LOPO) cross-validation. The model achieved recognition accuracies of 97.01%, 99.89%, 99.92%, and 99.53%, respectively. The developed model can be used to improve face recognition models by making them robust to age variations in individuals in the dataset.
Currently, every company is concerned about the retention of its staff, yet companies are often unable to recognize the genuine reasons for employees' resignations due to various circumstances. Each business has its own approach to treating employees and ensuring their satisfaction; as a result, many employees abruptly terminate their employment for no apparent reason. Machine learning (ML) approaches have grown in popularity among researchers in recent decades and are capable of proposing answers to a wide range of issues. Machine learning can therefore be used to predict staff attrition. In this research, distinct methods are compared to identify which workers are most likely to leave their organization. Two approaches are used to divide the dataset into training and test data: a 70 percent train / 30 percent test split, and the K-Fold approach. CatBoost, LightGBM, and XGBoost are the three methods employed for accuracy comparison; all three are based on gradient boosting algorithms.
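A hedged sketch of the comparison described above: the same attrition-style dataset evaluated with a 70/30 split and with K-Fold cross-validation across CatBoost, LightGBM, and XGBoost. Synthetic data stands in for the real HR dataset, and the fold count is an illustrative choice.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "XGBoost": XGBClassifier(eval_metric="logloss"),
    "LightGBM": LGBMClassifier(),
    "CatBoost": CatBoostClassifier(verbose=0),
}
for name, model in models.items():
    holdout = accuracy_score(y_te, model.fit(X_tr, y_tr).predict(X_te))
    kfold = cross_val_score(model, X, y, cv=5).mean()       # 5-fold as an example
    print(f"{name}: 70/30 accuracy={holdout:.3f}, 5-fold accuracy={kfold:.3f}")
```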
This study improves on an earlier research project that converted ASCII codes into 2D Cartesian coordinates and then applied translation and rotation transformations to construct an encryption system. Here, we present a variation of the Cantor pairing function to convert ASCII values into distinctive 2D coordinates. Then, we apply some novel methods to jumble the ciphertext generated as a result of the transformations. We suggest numerous improvements to the earlier research via simple tweaks to the existing code and by introducing a novel key generation protocol that generates an infinite integral key space with no decryption failures. The only way to break this protocol with no prior information would be a brute force attack. With the help of elementary combinatorics and probability, we prove that this encryption protocol is practically infeasible for an unwelcome adversary to overcome.
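A minimal sketch of the classical Cantor pairing function and its inverse, which maps a pair of non-negative integers to a unique integer and back. The paper uses a variation of this idea to map ASCII values to distinctive 2D coordinates; the exact variation and key protocol are not reproduced here, and pairing with a key digit below is purely illustrative.

```python
import math

def cantor_pair(x, y):
    return (x + y) * (x + y + 1) // 2 + y

def cantor_unpair(z):
    w = (math.isqrt(8 * z + 1) - 1) // 2      # largest w with w(w+1)/2 <= z
    t = w * (w + 1) // 2
    y = z - t
    x = w - y
    return x, y

for ch in "Hi":
    z = cantor_pair(ord(ch), 7)               # pair the ASCII code with a key digit
    assert cantor_unpair(z) == (ord(ch), 7)   # the mapping is exactly invertible
    print(ch, "->", z)
```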
A fundamental principle and assumption of cosmology states that the universe is homogeneous and isotropic when viewed on a large scale. According to the cosmological principle, space in a cosmological model might be flat, or have a negative or positive curvature. A positively curved universe denotes a closed universe, and a negatively curved universe denotes an open universe. Our universe is of the flat type because it expands in every direction, curving neither positively nor negatively. We have observed that the progression of the universe is based on radiation and matter domination. In this paper we have also observed that a possible future upper limit on the age of the universe is 9.4203 × 10^10 years, which varies with density.
Forecasting is estimating the magnitude of uncertain future events, and it provides different results under different suppositions. In order to identify the core data pattern of jute bale requirements for yarn production, we examined 10 years' worth of data on Jute Yarn/Twine shipped by the member mills. Exponential smoothing and Holt's methods are commonly used for this kind of forecast because they provide adequate results. Selecting the right smoothing constant value is essential for reducing forecasting errors. In this work, we developed a method for choosing the optimal value of the smoothing constant to reduce errors measured by the mean square error (MSE), mean absolute deviation (MAD), and mean absolute percentage error (MAPE). Finally, we discuss the research findings and future possibilities so that Jute Mills Limited and similar companies may carry out forecasting smoothly and develop the expertise of their procurement systems to stay competitive in the worldwide market.
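A hedged sketch of the idea of choosing the smoothing constant: grid-search alpha for simple exponential smoothing and keep the value minimizing MSE (MAD and MAPE could be added the same way). The demand series is a placeholder, not the jute shipment data.

```python
import numpy as np

def ses_forecast(series, alpha):
    """One-step-ahead simple exponential smoothing forecasts."""
    f = [series[0]]                              # initialise with the first observation
    for x in series[:-1]:
        f.append(alpha * x + (1 - alpha) * f[-1])
    return np.array(f)

def best_alpha(series, grid=np.linspace(0.05, 0.95, 19)):
    errors = {a: np.mean((series - ses_forecast(series, a)) ** 2) for a in grid}
    return min(errors, key=errors.get), errors

demand = np.array([120, 132, 128, 140, 151, 149, 160, 158, 171, 169], dtype=float)
alpha_star, errs = best_alpha(demand)
print("alpha* =", round(alpha_star, 2), "MSE =", round(errs[alpha_star], 2))
```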
Information extraction is an essential task in Natural Language Processing. It is the process of extracting useful information from unstructured text. Information extraction helps in most NLP applications, such as sentiment analysis, named entity recognition, medical data extraction, feature extraction from research articles, feature extraction in agriculture, etc. Most applications of information extraction are performed by machine learning models. Many research works have been carried out on machine learning based information extraction from texts of various domains in English, such as biomedical, share market, weather, business, social media, agriculture, engineering, and tourism. However, domain-specific information extraction for a particular regional language is still a challenge. There are different types of classification algorithms, and selecting the appropriate classification algorithm for a given domain is very difficult. In this paper, three well-known classification algorithms are selected to perform information extraction by classifying gynecological domain data in the Tamil language. The main objective of this research work is to analyze which machine learning methods are suitable for Tamil domain-specific text documents. A total of 1635 documents are involved in the classification task to extract features with the three selected algorithms. By evaluating the classification task of each model, it has been found that the Naive Bayes classification model provides the highest accuracy (84%) for the gynecological domain data. The F1-score, error rate, and execution time are also evaluated for the selected machine learning models. The performance evaluation has shown that the Naive Bayes classification model gives optimal results, and it is concluded that the Naive Bayes classification model is the best model to classify gynecological domain text in the Tamil language.
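A minimal sketch of the winning configuration described above: a bag-of-words Naive Bayes text classifier. The documents and class labels below are placeholders standing in for the gynecological Tamil corpus of 1635 documents, which is not shown.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

docs = ["placeholder Tamil sentence one", "placeholder Tamil sentence two",
        "placeholder Tamil sentence three", "placeholder Tamil sentence four"]
labels = ["symptom", "treatment", "symptom", "treatment"]   # hypothetical classes

model = make_pipeline(CountVectorizer(), MultinomialNB())   # bag-of-words + Naive Bayes
model.fit(docs, labels)
print(model.predict(["placeholder Tamil sentence five"]))
```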
There exist numerous numerical methods for solving initial value problems of ordinary differential equations, and the accuracy level and computational time are not the same for all of these methods. In this article, the Modified Euler method is discussed for finding accurate solutions of ordinary differential equations using different step sizes. Approximate results obtained with different step sizes are shown in a result analysis table. Some problems are solved by the proposed method, and the approximate results are compared graphically with the exact solutions for a better understanding of the accuracy level of this method. Errors are estimated for each step and represented graphically using the MATLAB programming language and MS Excel, which reveals that a very small step size gives better accuracy with less computational error. It is observed that this method is suitable for obtaining accurate solutions of ODEs when the chosen step sizes are sufficiently small.
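A short sketch of the Modified Euler (Heun) method for y' = f(t, y), run with two step sizes to illustrate how a smaller step improves accuracy. The test problem y' = y, y(0) = 1 (exact solution e^t) is an illustrative choice, not one of the paper's examples.

```python
import math

def modified_euler(f, t0, y0, h, n_steps):
    t, y = t0, y0
    for _ in range(n_steps):
        k1 = f(t, y)                       # predictor slope at the current point
        k2 = f(t + h, y + h * k1)          # corrector slope at the predicted point
        y += h * (k1 + k2) / 2             # average the two slopes
        t += h
    return y

f = lambda t, y: y
for h in (0.1, 0.01):
    approx = modified_euler(f, 0.0, 1.0, h, int(round(1.0 / h)))
    print(f"h={h}: y(1) ~ {approx:.6f}, error = {abs(approx - math.e):.2e}")
```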