IJCNIS Vol. 13, No. 4, Aug. 2021
This article describes the concept of a unified information space and an algorithm for its formation using a dedicated information and computer system. It considers the process of searching for an incoming object in a unified information space so that the object can be uniquely identified by its characteristic features. One of the main requirements of a unified information space is that every information object in it be uniquely identified; for this, an identification method based on a step-by-step analysis of object characteristics is used. A method of parallel information object search is proposed, in which the search is conducted independently and in parallel across all unified information spaces. Experimental studies of this method were carried out, on the basis of which the efficiency and search time for incoming objects were analyzed. The experiments showed that the more parameters describe an information object, the less the identification time depends on the interval length. They also showed that the search efficiency for incoming objects tends toward a directly proportional relationship as the interval length decreases and the number of parameters increases, and vice versa.
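The parallel search described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the representation of a unified information space as a dictionary of object records, the `identify` matching rule, and all names (`parallel_search`, `uis_a`, etc.) are assumptions made for the example.

```python
from concurrent.futures import ThreadPoolExecutor

def identify(space, probe):
    """Step-by-step parameter analysis: an object matches only if every
    parameter of the incoming probe agrees with the stored record."""
    for obj_id, params in space.items():
        if all(params.get(k) == v for k, v in probe.items()):
            return obj_id
    return None  # no object in this space carries all probe parameters

def parallel_search(spaces, probe):
    """Search every unified information space independently and in parallel;
    report each space in which the incoming object was identified."""
    with ThreadPoolExecutor(max_workers=len(spaces)) as pool:
        futures = {name: pool.submit(identify, space, probe)
                   for name, space in spaces.items()}
    return {name: f.result() for name, f in futures.items()
            if f.result() is not None}

# Hypothetical spaces, each mapping object id -> parameter set:
spaces = {
    "uis_a": {"obj1": {"type": "doc", "size": 10}},
    "uis_b": {"obj7": {"type": "doc", "size": 10},
              "obj9": {"type": "img", "size": 3}},
}
print(parallel_search(spaces, {"type": "doc", "size": 10}))
# -> {'uis_a': 'obj1', 'uis_b': 'obj7'}
```

Because each space is scanned in its own worker, the overall search time is bounded by the slowest space rather than the sum of all of them.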
Randomness is an essential component of every cryptographic algorithm, guaranteeing that secret keys are unpredictable and secure against all forms of attack. The sequence generated by Speck is non-random: it falls below the acceptable success rate in statistical analysis. This study resolves that non-randomness by integrating a novel key derivation function built from elementary operators designed for lightweight applications, so that performance is not compromised in software or hardware implementations. As a result, the modified Speck passed the NIST SP 800-22 and Dieharder v3.31.0 statistical test suites, with no p-value flagged as failed during testing, making the modified Speck cryptographically secure. Despite a 1.06% decrease in figure of merit, the modified Speck remains the worthier choice for resource-constrained Internet of Things applications, since, unlike the original Speck, it is proven cryptographically secure.
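For context, the unmodified cipher being discussed can be sketched from its public specification. The following implements plain Speck32/64 (16-bit words, 22 rounds, rotation constants 7 and 2, per the published NSA specification); the paper's key-derivation modification is not reproduced here, and the function names are our own.

```python
MASK = 0xFFFF   # Speck32/64 operates on 16-bit words
ROUNDS = 22     # round count for the 32-bit block / 64-bit key variant

def rol(x, r):
    return ((x << r) | (x >> (16 - r))) & MASK

def ror(x, r):
    return ((x >> r) | (x << (16 - r))) & MASK

def key_schedule(k0, l0, l1, l2):
    """Expand the 64-bit key (four 16-bit words) into 22 round keys
    by iterating the round function over the key words."""
    k, l = [k0], [l0, l1, l2]
    for i in range(ROUNDS - 1):
        l.append((((k[i] + ror(l[i], 7)) & MASK) ^ i))
        k.append(rol(k[i], 2) ^ l[-1])
    return k

def encrypt(x, y, round_keys):
    for rk in round_keys:
        x = ((ror(x, 7) + y) & MASK) ^ rk
        y = rol(y, 2) ^ x
    return x, y

def decrypt(x, y, round_keys):
    # Exact inverse of each encryption round, applied in reverse order.
    for rk in reversed(round_keys):
        y = ror(y ^ x, 2)
        x = rol(((x ^ rk) - y) & MASK, 7)
    return x, y
```

The add-rotate-xor structure visible here is what makes Speck attractive for constrained devices, and it is this base design whose keystream statistics the study's key derivation function is meant to improve.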
Recently, Internet use for digital communication has grown, with large amounts of sensitive information shared between computers and mobile devices. For secure communication, data must be protected from adversaries. Common safeguards include encryption, firewalls, and access control. An intrusion detection system is mainly used to detect attacks inside an organization, and machine learning techniques are widely used to implement such systems. Ensemble methods, which combine moderately accurate classifiers, give high accuracy and also provide lower false positive rates.
In this paper, a novel ensemble classifier using a rule combination method is proposed for intrusion detection. The ensemble is built from three rule learners as base classifiers, which are trained separately and combined using the average-of-probabilities rule. Its benefits and feasibility are demonstrated on the KDD'98 dataset. The main novelty of the approach lies in combining the three rule learners through the ensemble's rule of combination together with a feature selector. The Best First search algorithm is used to select relevant features from the training dataset; this also reduces the dimensionality of the training and testing data, which in turn reduces training time. The base classifiers' accuracies are compared with that of the proposed ensemble. Several comparative experiments evaluate classifier performance in terms of accuracy and false positive rate. The results show that the proposed ensemble classifier significantly improves accuracy over the individual classifiers while yielding lower false positive rates.
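The average-of-probabilities combination rule mentioned above is simple to state: average the class-probability vectors produced by the base classifiers and predict the class with the highest mean. A minimal sketch (the probability values below are invented for illustration, not taken from the paper's experiments):

```python
def average_combine(prob_vectors):
    """Combine base classifiers by averaging their class-probability
    vectors; return (predicted class index, averaged vector)."""
    n = len(prob_vectors)
    n_classes = len(prob_vectors[0])
    avg = [sum(p[c] for p in prob_vectors) / n for c in range(n_classes)]
    return max(range(n_classes), key=avg.__getitem__), avg

# Hypothetical outputs of three rule learners for classes [normal, attack]:
c1 = [0.60, 0.40]
c2 = [0.30, 0.70]
c3 = [0.20, 0.80]
label, avg = average_combine([c1, c2, c3])
# Two of the three learners lean toward "attack", so the ensemble does too.
```

Averaging smooths out individual classifiers' errors: a single overconfident but wrong learner is outvoted by the other two, which is one reason ensembles tend to lower false positive rates.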
This paper proposes a new approach to estimating the contour of the coverage zone of a cellular base station that takes into account significant reflecting objects located outside the zone under consideration. Based on this approach, a procedure for modeling and designing the cellular system coverage area is developed. Unlike known methods, the procedure accounts for the reflection of electromagnetic waves from external relief features, in particular significant reflecting objects located outside the considered cell. The effect of these external objects on the resulting coverage contour is examined analytically, numerically, and experimentally. The proposed solution yields a more accurate coverage area design for each cell, opening the way to more effective engineering techniques for developing and deploying cellular communication systems in real situations and under various scenarios.
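The physical effect being modeled can be illustrated with a textbook two-ray superposition: the field at a point is the complex sum of the direct ray and a ray reflected from an external object. This is a generic sketch of that principle, not the paper's procedure; the frequency, distances, and the reflection coefficient `gamma` are assumed values.

```python
import cmath
import math

C = 3e8  # speed of light, m/s

def field_with_reflection(f_hz, d_direct, d_reflected, gamma=-0.7):
    """Magnitude of the combined field from a direct ray and one ray
    reflected off an external object (free-space amplitude ~ 1/d).
    gamma is an assumed complex reflection coefficient of the object."""
    lam = C / f_hz
    k = 2 * math.pi / lam              # wavenumber
    direct = cmath.exp(-1j * k * d_direct) / d_direct
    reflected = gamma * cmath.exp(-1j * k * d_reflected) / d_reflected
    return abs(direct + reflected)

# Depending on the path-length difference, the reflected ray adds
# constructively or destructively, shifting the coverage contour:
with_obj = field_with_reflection(2.1e9, 1000.0, 1200.0)
free_space = 1 / 1000.0
```

Because the two rays interfere, the received level at the nominal cell edge can be noticeably above or below the free-space value, which is why ignoring external reflectors distorts the estimated coverage contour.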
Newer mobile applications are increasingly built on the Internet Protocol, driving greater IP use and a subsequent upsurge in smartphones. However, many communication service provider core networks still rely on classical routing protocols or, where deployed, single-controller networks. Controller-based networks, built on the foundations of software-defined networking, centralize control and separate the control plane from the data plane, which can address the challenges of classical routing protocols; single controllers, however, tend to become overloaded with traffic. Whether a multi-controller network architecture can improve quality of service in the mobile IP core network remains an open issue. This paper presents a performance evaluation of a multi-controller network architecture running OpenFlow against the Open Shortest Path First (OSPF) protocol. The simulated long-term evolution network is created in the well-known Objective Modular Network Testbed simulator with the OpenFlow and simuLTE add-ons. We test and analyze data traffic for packet delivery ratio and jitter, and their associated effects, on a multi-controller network running OpenFlow versus OSPF in a mobile core network. Two topologies were created: a multi-controller network and an OSPF network. Video and ping traffic is generated from user equipment to a network-based server in the data center and back, with traffic metrics recorded in the built-in integrated development environment. The simulation setup consists of an OpenFlow controller, the HyperFlow algorithm, OpenFlow switches, and OSPF routers. The multi-controller network improved jitter by 10 ms. OSPF achieved a packet delivery ratio of 89%, while the controller-based network registered 86%; a standard deviation test yielded 0.7%, indicating that the difference in packet delivery ratio is not significant. These results provide insight into the performance of multi-controller architectures and OSPF in the communication service provider's core network.
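The two metrics compared above are straightforward to compute from captured traffic. A minimal sketch (the helper names and sample numbers are illustrative, not the simulator's output):

```python
def packet_delivery_ratio(sent, received):
    """Percentage of transmitted packets that arrived at the receiver."""
    return 100.0 * received / sent

def mean_jitter(delays_ms):
    """Mean variation in one-way delay between consecutive packets,
    a simplified form of the RFC 3550 interarrival-jitter notion."""
    diffs = [abs(b - a) for a, b in zip(delays_ms, delays_ms[1:])]
    return sum(diffs) / len(diffs)

# Illustrative values only (not the paper's measurements):
pdr = packet_delivery_ratio(sent=100, received=86)   # -> 86.0 (%)
jit = mean_jitter([10.0, 14.0, 12.0])                # -> 3.0 (ms)
```

Lower jitter matters most for the video traffic tested here, since playout buffers must absorb delay variation; packet delivery ratio reflects raw loss on the path.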
When a computer is involved in a crime, it is the mission of digital forensic experts to extract the binary artifacts left on the device. Among those artifacts there may be volume shadow copy files on the Windows operating system: snapshots of the volume recorded by the system in case a restore to a specific past date is needed. Before this study, it was not known whether the valuable forensic information held within those snapshot files could be exploited to locate suspicious timestamps in an NTFS-formatted partition. This study provides an inter-snapshot time analysis for detecting file system timestamp manipulation; in other words, we leverage the time information present across multiple volume shadow copies to detect suspicious tampering with file system timestamps. A detection algorithm for suspicious timestamps is contributed, whose main role is to help the digital investigator spot the manipulation if it has occurred. In addition, a virtual environment was set up to validate the use of the proposed algorithm for detection.
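The core of such an inter-snapshot check can be sketched as follows. This is a simplified illustration of the idea, not the paper's algorithm: each snapshot is reduced to a mapping of file path to recorded modification time, and a timestamp that moves *backwards* between an older and a newer snapshot is flagged, since normal system activity can only carry timestamps forward.

```python
def suspicious_timestamps(snapshots):
    """snapshots: list of {path: modified_time} dicts, ordered oldest
    to newest (e.g. parsed from successive volume shadow copies).
    Returns paths whose recorded timestamp decreased between two
    consecutive snapshots -- candidates for manual backdating."""
    flagged = []
    for older, newer in zip(snapshots, snapshots[1:]):
        for path, t_new in newer.items():
            t_old = older.get(path)
            if t_old is not None and t_new < t_old:
                flagged.append(path)
    return flagged

# Illustrative snapshots (epoch-like integers): a.txt was backdated
# between the two shadow copies, b.txt evolved normally.
snaps = [
    {"a.txt": 100, "b.txt": 50},
    {"a.txt": 90,  "b.txt": 60},
]
print(suspicious_timestamps(snaps))
# -> ['a.txt']
```

Real NTFS analysis would compare all four $STANDARD_INFORMATION timestamps and handle files absent from one snapshot, but the backwards-motion test above is the essence of spotting inter-snapshot inconsistency.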