IJCNIS Vol. 18, No. 1, Feb. 2026
REGULAR PAPERS
Malware detection is a critical factor in establishing effective cybersecurity in the face of constantly increasing cyber threats. This article investigates machine learning (ML) techniques for malware detection, focusing on a Customized K-Nearest Neighbors (C-KNN) classifier and the Firefly Algorithm (FA). The work assesses the effectiveness of C-KNN, and of C-KNN combined with FA (C-KNN/FA), in malware identification on the MalMem-2022 dataset. The novelty of the proposed method lies in the synergistic integration of the C-KNN algorithm with FA-based metaheuristic optimization: using FA to select the most relevant features lets C-KNN train on a small, high-quality feature set, thereby improving detection performance. We compare the two methods to quantify the influence of KNN parameter tuning and feature selection on malware classification. Both C-KNN and C-KNN/FA produce remarkable results, reaching an accuracy of 99.98%, and both outperform the alternative methods in binary as well as multiclass classification.
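The wrapper-style feature selection described in this abstract can be sketched as follows. This is a minimal illustration of Firefly-Algorithm feature selection around a KNN classifier, not the authors' C-KNN/FA implementation; the synthetic dataset, population size, and FA constants (beta0, gamma, alpha) are illustrative assumptions.

```python
# Firefly-Algorithm feature selection wrapped around KNN (illustrative sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)

def fitness(position):
    mask = position > 0.5                      # threshold position to a feature mask
    if not mask.any():                         # require at least one feature
        return 0.0
    knn = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(knn, X[:, mask], y, cv=3).mean()

n_fireflies, n_iter, beta0, gamma, alpha = 8, 15, 1.0, 1.0, 0.1
pop = rng.random((n_fireflies, X.shape[1]))    # fireflies in [0, 1]^d
light = np.array([fitness(p) for p in pop])    # brightness = CV accuracy

for _ in range(n_iter):
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if light[j] > light[i]:            # move dimmer i toward brighter j
                r2 = np.sum((pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)   # distance-decayed attraction
                pop[i] += beta * (pop[j] - pop[i]) \
                          + alpha * (rng.random(X.shape[1]) - 0.5)
                pop[i] = np.clip(pop[i], 0, 1)
                light[i] = fitness(pop[i])

best = pop[light.argmax()] > 0.5
print(f"selected {best.sum()} features, CV accuracy {light.max():.3f}")
```

The KNN then trains only on the columns where `best` is true, which is the "small, high-quality feature set" idea the abstract describes.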
Detecting malware on the Internet is essential and unavoidable, as a wide range of online IT services are exposed to it. Portable Executable (PE) files are the platform most frequently targeted by malware. A deployable learning system is needed so that malware can be promptly identified and flagged in real-world environments. Prior researchers applied machine learning to malware datasets and reported strong performance metrics at high computational cost, but were unable to deploy their models in real-world settings. The proposed work delivers a deployable machine learning model based on Random Forest (RF), attaining an accuracy of 97.16%, a precision of 95.21%, and an F1 score of 95.24%, and is particularly adept at accurately identifying malware. We also develop a classification model that employs a Support Vector Machine (SVM) to classify the preprocessed data into malware and benign instances. Furthermore, the SHAP technique identifies significant features, including SizeOfStackReserve, DllCharacteristics, and MajorImageVersion; SHAP values clarify the contribution of each feature to the model's predictions. Applying SHAP-based feature reduction to the trained SVM model likewise attains an accuracy of 97.16%.
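The classify-then-attribute step can be sketched in a few lines. This sketch does not use the SHAP library itself; scikit-learn's permutation importance serves here as a lightweight stand-in for SHAP-style attribution, and the synthetic data plus the first three feature names (borrowed from the abstract) are purely illustrative.

```python
# SVM classification followed by feature attribution (permutation-importance
# stand-in for SHAP; dataset and feature names are illustrative).
from sklearn.datasets import make_classification
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=6, n_informative=3,
                           random_state=1)
names = ["SizeOfStackReserve", "DllCharacteristics", "MajorImageVersion",
         "f3", "f4", "f5"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

model = make_pipeline(StandardScaler(), SVC()).fit(X_tr, y_tr)
acc = model.score(X_te, y_te)

# Rank features by how much shuffling each one degrades test accuracy.
imp = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=1)
ranking = sorted(zip(names, imp.importances_mean), key=lambda t: -t[1])
print(f"accuracy={acc:.3f}; top feature: {ranking[0][0]}")
```

Low-ranked features are candidates for removal before retraining, which is the feature-reduction step the abstract reports.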
Wireless Sensor Networks (WSNs) play a crucial role in domains such as environmental monitoring, healthcare, and military applications, all of which require secure and efficient communication. These networks face a major problem: routing attacks and data tampering are highly prevalent because the decentralized architecture and limited computational resources leave them susceptible to a wide range of security threats. Existing routing techniques (WSN-Block, CEMT, TSRP, ORASWSN, POA-DL, AI-WSN, and EOSR) are widely used but optimize paths inefficiently, which drastically increases energy consumption and permits packet loss. In addition, existing blockchain models for WSN security are not scalable and carry high computational overhead. To address these limitations, we propose Secure Blockchain-Based Routing with Narwhal Optimization for WSNs (OpNa-SGCDN). Our approach employs an optimized narwhal-based metaheuristic to guarantee shortest-path communication and minimum energy consumption in routing. We further provide a Scalable Permissionless Blockchain Consensus Model (SP-BlockCM) that yields a tamper-proof, decentralized solution with improved scalability. Attack detection is performed by a Stacked Bi-Tier Convolutional Deep Network (SBT-CDN) optimized with the Snow Geese Evolutionary Algorithm (SGEA). Experimental results demonstrate that our method achieves 94.7% energy efficiency, 96.3% detection accuracy, and a 95.5% packet-loss reduction, improving both security and performance over the available methods. The proposed framework is thus a comprehensive and scalable basis for secure, energy-efficient WSN communication.
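The shortest-path, minimum-energy objective underlying such routing schemes can be illustrated with plain Dijkstra over a WSN graph whose edge weights model transmission energy. The topology and costs below are invented, and the narwhal metaheuristic itself is not reproduced; this only shows the baseline problem the optimizer targets.

```python
# Minimum-energy routing as a weighted shortest-path problem (toy sketch).
import heapq

def min_energy_path(graph, src, dst):
    """graph: {node: [(neighbor, energy_cost), ...]} -> (path, total energy)."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):      # stale heap entry
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], dst
    while node != src:                         # walk predecessors back to src
        path.append(node)
        node = prev[node]
    return [src] + path[::-1], dist[dst]

wsn = {"A": [("B", 1.0), ("C", 4.0)], "B": [("C", 1.5), ("D", 5.0)],
       "C": [("D", 1.0)], "D": []}
route, energy = min_energy_path(wsn, "A", "D")
print(route, energy)  # the multi-hop route A-B-C-D (cost 3.5) beats A-B-D (6.0)
```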
The global shift from 5G to 6G wireless communication networks presents immense challenges in managing resources for ultra-dense, heterogeneous, and latency-sensitive 6G applications such as holographic communications, autonomous systems, and the Internet of Everything (IoE). Traditional resource allocation methods struggle to meet the dynamic and complex demands of 6G, leading to inefficiencies, higher latency, and fairness issues. To address these challenges, we propose a novel framework called Proof-of-Resource enabled 6G Resource Management Using Quaternion-Attentive Cascaded Capsule Networks (Caps-PoR). Our approach integrates Quaternion-Attentive Cascaded Deep Capsule Networks (Q-AtCapsN) to improve the accuracy of predicting resource demands by capturing real-time multi-dimensional dependencies. Additionally, we optimize resource allocation dynamically through an Enhanced Collaborative Learning Algorithm (ECoLA), which supports decentralized decision-making across multiple nodes, significantly reducing latency. The Proof-of-Resource mechanism ensures transparency, fairness, and trust, preventing resource misallocation while ensuring equal access. Performance evaluations show that Caps-PoR outperforms traditional methods in 6G multi-access edge computing (MEC) scenarios, achieving over 98% resource utilization efficiency, a latency reduction exceeding 96%, and a user satisfaction rate of more than 97%. This demonstrates how Caps-PoR effectively enhances efficiency, security, and scalability in next-generation 6G networks, reshaping the future of resource management in decentralized systems.
Mobile Ad hoc Networks (MANETs) face challenges such as limited scalability and excessive power consumption, largely attributable to battery drain. Node mobility increases energy use and delay and shortens node lifetime. To address these problems, we propose a resource-efficient City Block K-Means (CB-K Means) clustering method that prioritizes resource optimization and network lifespan. The method comprises three phases: localization, Cluster Head (CH) selection, and mobile node clustering. Mobile Node (MN) localization is performed using a Crossover- and Mutation-based Jaya Optimization Algorithm (CM-JOA), followed by clustering through CB-K Means, with CHs selected via a Linear Scaling-based Satin Bowerbird Optimization Algorithm (LS-SBOA). Experimental results indicate a 3.48% improvement in Packet Delivery Ratio (PDR), a 7.3% reduction in packet loss, a reduced clustering time of 3412 ms, and an enhanced throughput of 889 bps.
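The City Block variant of K-Means can be sketched compactly. Under the L1 (city-block) distance, the coordinate-wise median is the natural centroid update, so this sketch is effectively k-medians; the two-blob data, k, and seeds are illustrative assumptions, not the paper's setup.

```python
# K-Means with city-block (Manhattan) distance: L1 assignment, median update.
import numpy as np

rng = np.random.default_rng(3)
pts = np.vstack([rng.normal(0, 0.5, (30, 2)),     # two synthetic node clusters
                 rng.normal(5, 0.5, (30, 2))])

def cb_kmeans(X, k, iters=20, seed=3):
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each point to the nearest center under L1 distance
        d = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)
        labels = d.argmin(axis=1)
        # update centers as the coordinate-wise median of each cluster
        centers = np.array([np.median(X[labels == j], axis=0)
                            if (labels == j).any() else centers[j]
                            for j in range(k)])
    return labels, centers

labels, centers = cb_kmeans(pts, k=2)
print(labels[:5], centers.round(2))
```

In the paper's pipeline the cluster heads would then be chosen per cluster by LS-SBOA rather than taken to be the centroids themselves.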
Precision agriculture relies on wireless sensor networks (WSNs) to support informed decision-making, thereby enhancing crop yields and resource management. A critical challenge in such networks is minimizing the energy consumption of sensor nodes while ensuring reliable data transmission. In the proposed scheme, sensor nodes are grouped using an optimal multi-objective clustering approach, which also chooses appropriate cluster heads (CHs) for effective communication. A hybrid optimization approach improves CH selection by combining the exploration power of the Osprey Optimization Algorithm with the exploitation power of the Parrot Optimizer. A hybrid deep learning framework, combining a convolutional autoencoder with a dual-key transformer network, is designed to monitor energy utilization and detect constraints affecting consumption. Its training and testing performance is further improved using a metaheuristic based on the cooperative feeding and locomotion behavior of gooseneck barnacles. Experimental evaluation demonstrates superior performance: 99.2% accuracy, 68 kbps throughput, 98% packet delivery ratio, and a network lifetime of 85 ms. With an average delay of 0.23 s, energy consumption is reduced to 39 J, demonstrating the effectiveness of the proposed strategy for dependable and sustainable precision agriculture applications.
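The multi-objective flavor of CH selection can be shown with a toy scoring rule: rank candidates by a weighted sum of residual energy (to maximize) and distance to the cluster centroid (to minimize). The node data, weights, and the weighted-sum form are all assumptions for illustration; the osprey/parrot hybrid optimizer is not reproduced here.

```python
# Toy multi-objective cluster-head scoring: energy up, centroid distance down.
import numpy as np

rng = np.random.default_rng(7)
pos = rng.random((10, 2)) * 100        # node positions in a 100 m field
energy = rng.random(10)                # normalized residual energy per node

centroid = pos.mean(axis=0)
dist = np.linalg.norm(pos - centroid, axis=1)
dist_n = dist / dist.max()             # normalize distances to [0, 1]

w_energy, w_dist = 0.6, 0.4            # illustrative trade-off weights
score = w_energy * energy - w_dist * dist_n
ch = int(score.argmax())               # best trade-off node becomes CH
print("cluster head:", ch, "score:", round(float(score[ch]), 3))
```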
In this research, we propose an integrated routing protocol termed Bacterial Foraging inspired Mamdani Fuzzy Inference based AODV (BF-MFI-AODV) for Vehicular Ad-hoc Networks (VANETs), which combines Mamdani Fuzzy Inference System (MFIS) and Bacterial Foraging Optimization (BFO) techniques. The protocol aims to address the challenges of dynamic and unpredictable network conditions in VANETs by leveraging fuzzy logic and bio-inspired optimization principles. BF-MFI-AODV enhances route discovery, maintenance, and optimization mechanisms, resulting in improved adaptability, reliability, and efficiency of communication. Through extensive simulations and real-world experiments, the performance of BF-MFI-AODV is evaluated in terms of packet delivery ratio, end-to-end delay, routing overhead, and network lifetime. Our results demonstrate the effectiveness of BF-MFI-AODV in enhancing the overall performance of VANETs compared to existing routing protocols. The proposed protocol shows promise in providing robust and efficient communication solutions for dynamic vehicular environments, thus contributing to the advancement of intelligent transportation systems.
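The Mamdani inference step at the heart of such a protocol can be sketched with one input and two rules: a link-quality value is fuzzified with ramp membership functions, each rule clips its output set at the rule's firing strength, the clipped sets are aggregated by max, and a centroid defuzzification yields a crisp route score. The membership shapes, rule base, and universes below are assumptions, not the paper's actual MFIS.

```python
# One-input Mamdani fuzzy inference with centroid defuzzification (toy sketch).
import numpy as np

def ramp_up(x, a, b):
    return np.clip((x - a) / (b - a), 0.0, 1.0)

def ramp_down(x, a, b):
    return np.clip((b - x) / (b - a), 0.0, 1.0)

def route_score(q):
    # Rule 1: IF link quality is LOW  THEN route score is POOR
    # Rule 2: IF link quality is HIGH THEN route score is GOOD
    low, high = ramp_down(q, 0.0, 0.6), ramp_up(q, 0.4, 1.0)
    u = np.linspace(0.0, 1.0, 101)                   # output universe
    poor = np.minimum(low,  ramp_down(u, 0.0, 0.5))  # clip POOR at rule strength
    good = np.minimum(high, ramp_up(u, 0.5, 1.0))    # clip GOOD at rule strength
    agg = np.maximum(poor, good)                     # Mamdani max aggregation
    return float((u * agg).sum() / agg.sum())        # centroid defuzzification

print(round(route_score(0.9), 2), round(route_score(0.1), 2))
```

A full MFIS for routing would use several inputs (e.g. mobility, residual energy) and a larger rule base, but the clip-aggregate-defuzzify mechanics are the same.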
The Internet of Things (IoT) and cloud computing are expanding rapidly, which poses a great challenge to maintaining data security and integrity, particularly during forensic investigations. Conventional logging mechanisms are prone to manipulation, unreliable, and make digital evidence difficult to verify. In response, a blockchain-based system is proposed to secure forensic data generated by IoT devices and stored in cloud environments. Decentralized storage is paired with smart contracts to form an immutable record of cloud communications, ensuring that evidence remains unaltered and verifiable. The system also includes a secure off-chain store that enables swift recording and retrieval of massive forensic records. Extensive experimentation demonstrates that the system keeps verification times to roughly 28 to 39 milliseconds, faster than methods currently in place, while maintaining high data integrity. The framework improves transaction throughput and provides a scalable solution for preserving forensic evidence, offering a feasible and reliable platform that enhances the security, visibility, and reliability of forensic data within intricate IoT and cloud environments. These characteristics help law enforcement agencies and forensic investigators conduct effective and credible investigations.
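The tamper-evidence property behind blockchain-backed logging can be shown with a plain hash chain: each record's hash covers the previous hash, so editing any record breaks every later link. The record fields below are illustrative, and a real deployment would add signatures and consensus on top.

```python
# Hash-chained forensic log: build a chain, verify it, then tamper and re-verify.
import hashlib
import json

def chain(records):
    prev, out = "0" * 64, []
    for rec in records:
        h = hashlib.sha256((prev + json.dumps(rec, sort_keys=True))
                           .encode()).hexdigest()
        out.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return out

def verify(blocks):
    prev = "0" * 64
    for b in blocks:
        h = hashlib.sha256((prev + json.dumps(b["record"], sort_keys=True))
                           .encode()).hexdigest()
        if b["prev"] != prev or b["hash"] != h:
            return False                 # broken link: evidence was altered
        prev = h
    return True

log = chain([{"dev": "cam-1", "evt": "login"}, {"dev": "cam-1", "evt": "upload"}])
ok_before = verify(log)                  # True on the untouched chain
log[0]["record"]["evt"] = "x"            # tamper with the first record
ok_after = verify(log)                   # now False: the chain detects it
print(ok_before, ok_after)
```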
Advanced intrusion detection systems (IDS) are required by the quick uptake of cloud computing and the growing complexity of cyber threats, especially Denial of Service (DoS) and Distributed Denial of Service (DDoS) attacks. Deep learning (DL) architectures are becoming more popular because traditional IDS techniques frequently falter in dynamic, large-scale settings. Using datasets including CICIDS2017, NSL-KDD, and UNSW-NB15, this paper assesses the effectiveness of well-known DL architectures for intrusion detection, including Convolutional Neural Networks, Recurrent Neural Networks, Long Short-Term Memory networks, and others. Key performance indicators such as accuracy, precision, and false positive rate are examined to compare the efficacy of these models. The findings show that some designs, such as ResNet and the Self-Organizing Map, perform well in structured environments but poorly on complicated datasets like KDDTest-21. Another important gap is that most models do not automatically adapt to unexpected threats, highlighting the need for further research in this area. By evaluating the efficacy of DL-based IDS solutions, this work aids in the creation of intelligent, scalable systems for changing network environments.
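For reference, the comparison metrics named above follow directly from a confusion matrix; the counts below are invented purely to show the formulas.

```python
# IDS comparison metrics from confusion-matrix counts (hypothetical numbers).
tp, fp, tn, fn = 950, 30, 900, 50   # true/false positives and negatives

accuracy  = (tp + tn) / (tp + fp + tn + fn)
precision = tp / (tp + fp)
fpr       = fp / (fp + tn)          # false positive rate: FP / (FP + TN)
print(f"acc={accuracy:.3f} prec={precision:.3f} fpr={fpr:.3f}")
```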
We propose a meta-learning-enhanced BiLSTM autoencoder architecture for robust one-bit error correction coding, designed to dynamically adapt to diverse channel conditions without requiring explicit retraining. The proposed method fuses a channel-aware meta-discriminator into an adversarial training framework, allowing the system to generalize across Rician, Rayleigh, and AWGN channels by adapting its decision boundaries based on temporal signal statistics. The meta-discriminator, realized as a lightweight Transformer-encoder with cross-attention, computes channel-specific embeddings from the received signal, which modulate the adversarial loss and guide the reconstruction process. Furthermore, the BiLSTM encoder-decoder utilizes bidirectional layers with residual connections to capture long-range dependencies, while a learnable one-bit quantizer with adaptive thresholds ensures efficient signal representation. The training objective combines reconstruction loss, adversarial loss, and a meta-regularization term, which stabilizes updates and refines adaptation. The meta-discriminator performs real-time parameter adjustments using a single gradient step during inference to make the system resilient to unseen channel impairments. The experiments demonstrate significant improvements in BER and MSE across various fading channels and data sizes. The Rician channel exhibits the lowest values of BER and MSE of 0.032 and 0.031, respectively, when considering a data size of 2500 symbols. The proposed work shows its dual capability to learn error-correcting codes through BiLSTMs, apart from exploiting meta-learning for channel adaptation.
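The BER/MSE evaluation setting can be sketched numerically: BPSK symbols pass through an AWGN channel and a hard one-bit quantizer makes the receive decision. The SNR, noise model, and block size are illustrative assumptions; the BiLSTM autoencoder and meta-discriminator themselves are not reproduced.

```python
# BER and MSE of one-bit (hard-decision) reception over AWGN (toy sketch).
import numpy as np

rng = np.random.default_rng(42)
bits = rng.integers(0, 2, 2500)          # a block of 2500 symbols
symbols = 2.0 * bits - 1.0               # BPSK mapping: 0 -> -1, 1 -> +1

snr_db = 8.0                             # illustrative operating point
noise_std = np.sqrt(1.0 / (2 * 10 ** (snr_db / 10)))
received = symbols + noise_std * rng.normal(size=bits.size)

decided = (received > 0).astype(int)     # one-bit quantizer (hard decision)
ber = np.mean(decided != bits)           # bit error rate
mse = np.mean((received - symbols) ** 2) # mean squared channel distortion
print(f"BER={ber:.4f}  MSE={mse:.4f}")
```

Fading channels (Rician, Rayleigh) would multiply `symbols` by a random channel gain before adding noise, which is where the adaptive components of the proposed architecture come into play.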