IJCNIS Vol. 13, No. 3, Jun. 2021
The growing importance of IoT data puts a focus on extracting knowledge from sensors' raw data. Current solutions manage sensors' data inefficiently, as studies have generally focused either on cloud-based IoT solutions or on inefficient predefined rules. Cloud-based IoT solutions suffer from latency, availability, security and privacy, and power-consumption problems. Therefore, providing IoT gateways with relevant intelligence is essential for gaining knowledge from raw data in order to decide whether to actuate locally or offload tasks to the cloud. This work proposes a model that equips an IoT gateway with the intelligence needed to extract knowledge from sensors' data and make decisions locally, without sending all raw data to the cloud over the Internet. This speeds up decisions and actions on real-time data and overcomes the limitations of cloud-based IoT solutions. When the gateway is unable to process a task locally, the data and the task are offloaded to the cloud.
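The local-versus-offload decision described above can be sketched as a simple gateway rule. This is an illustrative sketch, not the paper's model: the temperature threshold, the `cpu_load` parameter, and the `local_cpu_limit` default are all hypothetical.

```python
# Illustrative sketch (not the paper's algorithm): a gateway decides locally
# on a latency-critical rule when it has capacity, and offloads the raw
# reading to the cloud when it is saturated. All thresholds are hypothetical.

def gateway_decide(temperature_c, cpu_load, local_cpu_limit=0.8):
    """Return ('actuate', action) for local decisions, or ('offload', data)."""
    if cpu_load > local_cpu_limit:
        # Gateway is saturated: ship the raw reading to the cloud instead.
        return ("offload", {"temperature_c": temperature_c})
    # Simple, real-time rule handled at the edge without an Internet round trip.
    if temperature_c > 30.0:
        return ("actuate", "cooling_on")
    return ("actuate", "cooling_off")
```

A reading of 35 °C on an idle gateway triggers local actuation, while the same reading on an overloaded gateway is offloaded.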
Software-Defined Networking is a new network architecture that separates the control and data planes. Its centralized network control and programmability improve manageability, scaling, and performance. However, it may suffer from a single point of failure at the controller, which implements the network control plane. Defending the controller against attacks such as a distributed denial-of-service (DDoS) attack is therefore a valuable and urgent issue. The contribution of this paper is an accurate and significant method for detecting this attack using machine-learning algorithms that exploit new advanced features obtained from traffic-flow information and statistics. The developed model is trained with a radial basis function (RBF) kernel. The technique uses advanced features such as unknown destination addresses, packet inter-arrival times, the transport-layer protocol header, and the type-of-service header. To the best of the authors' knowledge, the proposed approach has not been used before. The proposed work begins by generating both normal and attack traffic-flow packets through the network. When packets reach the controller, it extracts their headers and performs the flow calculations needed to obtain the features. The features form a dataset that is fed to a support vector machine (SVM) classifier, which is trained with the RBF kernel. Naive Bayes, K-Nearest Neighbor, Decision Tree, and Random Forest methods are also evaluated and compared with the SVM model to improve the detection operation. Suspicious senders are then blocked and their information is stored. The experimental results show that the proposed technique detects the attack with high accuracy and a low false-alarm rate compared to other related techniques.
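The detection step can be sketched with scikit-learn's SVM classifier and an RBF kernel. The two features below (mean packet inter-arrival time and the count of unknown destination addresses per flow window) follow the feature names in the abstract, but the toy values and the blocked-sender address are invented for illustration, not taken from the paper's dataset.

```python
# Minimal sketch of the detection step with an RBF-kernel SVM.
# Features per flow window: [mean inter-arrival time (ms), unknown dst count].
# The training values below are synthetic, not the paper's dataset.
from sklearn.svm import SVC

X = [[50.0, 1], [45.0, 0], [60.0, 2], [55.0, 1],   # normal flows
     [1.0, 40], [0.5, 55], [2.0, 35], [1.5, 60]]   # flood-like flows
y = [0, 0, 0, 0, 1, 1, 1, 1]                       # 1 = attack, 0 = normal

clf = SVC(kernel="rbf", gamma="scale")
clf.fit(X, y)

# A flow with tiny inter-arrival times and many unknown destinations
# is classified as an attack; the sender would then be blocked and logged.
label = clf.predict([[0.9, 48]])[0]
if label == 1:
    blocked_senders = {"10.0.0.99"}  # hypothetical sender address
```

The same pipeline swaps in `GaussianNB`, `KNeighborsClassifier`, `DecisionTreeClassifier`, or `RandomForestClassifier` for the comparison the abstract describes.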
Random numbers have many uses, but finding true randomness is incredibly difficult. Therefore, quantum mechanics is used, exploiting the essentially unpredictable behavior of a photon to generate truly random numbers that form the basis of many modern cryptographic protocols. It is essential to trust cryptographic random number generators to generate only true random numbers, which is why certification methods are needed to check both the performance of the device and the quality of the random bits it generates. The paper analyzes self-testing and device-independent quantum random number generation methods and identifies the advantages and disadvantages of each. It then offers a model of a novel semi-self-testing certification method for quantum random number generators. This method combines different types of certification approaches and is both secure and efficient. It is important for computer science because it combines the best features of the self-testing and device-independent methods, and it can be used whether or not the entropy of the random numbers depends on the device. In related research, these approaches are applied separately, depending on the random number generator. The proposed certification technology works properly even when the device is compromised or damaged: it can detect unintended irregularities, operational problems, and abnormalities in the randomization process, and it helps eliminate problems related to the physical devices. The proposed system provides stronger certification of randomness and is faster than self-testing approaches, because it runs the different certification approaches in parallel threads. These techniques make the proposed method much more efficient than other existing approaches.
The corresponding simulation is implemented programmatically.
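One building block such a certification pipeline can run in a parallel thread is a statistical quality check on the generated bits. As an illustrative example (not the paper's method), the monobit frequency test from NIST SP 800-22 flags bit sequences whose 0/1 balance is too improbable for a true random source:

```python
# Monobit (frequency) test from NIST SP 800-22: a sequence passes if its
# 0/1 balance is statistically plausible for a uniform random source.
import math

def monobit_test(bits, alpha=0.01):
    """Return (p_value, passed) for a sequence of 0/1 bits."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)          # +1 for each 1, -1 for each 0
    p_value = math.erfc(abs(s) / math.sqrt(2 * n))
    return p_value, p_value >= alpha

# A perfectly balanced sequence passes this particular test (it would fail
# others, e.g. the runs test); a constant sequence fails immediately.
balanced = [i % 2 for i in range(1000)]
constant = [1] * 1000
```

A real certification suite combines many such tests, which is exactly where running them in parallel threads pays off.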
With the increase in the number of e-services, online financial transactions have risen sharply. These services require a strong authentication scheme to validate their users and grant access to resources securely. Since two-factor authentication provides the required security strength, various organizations employ biometric-based, smart-card, or cryptographic-token methods to protect user accounts. However, most of these methods require a verifier table at the server to validate users, which exposes them to the stolen-verifier attack. There is therefore a strong need for e-service authentication schemes that do not require a verifier table at the server. This paper proposes the design of such an authentication scheme, resistant to various attacks including the stolen-verifier attack. The paper also presents: 1) a security analysis of the proposed scheme against known authentication attacks, and 2) a proof-of-concept implementation of the scheme.
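To see why removing the verifier table defeats the stolen-verifier attack, consider one well-known pattern (illustrative only, not the paper's scheme): the server keeps a single master secret and derives each user's verifier on demand with HMAC, so there is no per-user table to steal. All names and parameters here are hypothetical.

```python
# Illustrative sketch (NOT the paper's scheme): verifier-table-free
# authentication. The server stores only one master secret; each user's
# verifier is recomputed on demand, so no verifier table exists to steal.
import hashlib
import hmac
import os

SERVER_SECRET = os.urandom(32)   # held only in server memory / an HSM

def derive_verifier(user_id: str, password: str) -> bytes:
    """Recompute the user's verifier from the master secret; nothing per-user is stored."""
    key = hmac.new(SERVER_SECRET, user_id.encode(), hashlib.sha256).digest()
    return hmac.new(key, password.encode(), hashlib.sha256).digest()

def authenticate(user_id: str, password: str, presented: bytes) -> bool:
    # compare_digest gives a constant-time comparison against timing attacks.
    return hmac.compare_digest(derive_verifier(user_id, password), presented)

# Registration issues the token once; login recomputes and compares it.
token = derive_verifier("alice", "correct-horse")
```

The paper's actual scheme adds a second factor and resistance to further attacks; this sketch only demonstrates the table-free verification idea.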
Modern children are active Internet users. However, amid the abundance of information, they have little ability to tell useful information from harmful information. To make the Internet a safe place for children, various measures are taken at the international and national levels and by experts, and ways to protect children from harmful information are being sought. The article proposes an approach that uses a multi-criteria decision-making process to prevent children from encountering harmful content on the Internet and to make the Internet a more secure environment for them. The article takes children's age characteristics as the criteria, and considers Harmless information, Training information, Entertainment information, News, and Harmful information as the alternatives. A decision is made by comparing the alternatives against the given criteria; in the trials, harmful information was ranked last.
To date, the problem of protecting children on the Internet has not been addressed using the AHP method. This research is important for protecting children from harmful information in the virtual space, and the approach offers educational institutions, parents, and other parties concerned with child safety a reliable way to protect minor Internet users.
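The comparison step of AHP can be sketched as computing priority weights from a pairwise comparison matrix: normalize each column, then average each row. The matrix values below are hypothetical illustrations of the five alternatives under one criterion, not the paper's survey data.

```python
# AHP priority computation: normalize columns of the pairwise comparison
# matrix, then average each row to obtain the priority weight of each
# alternative. The matrix values below are hypothetical, not the paper's data.

def ahp_priorities(matrix):
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)]
                  for i in range(n)]
    return [sum(row) / n for row in normalized]

# Alternatives: [Harmless, Training, Entertainment, News, Harmful]
pairwise = [
    [1,     1 / 2, 2,     2,     9],
    [2,     1,     3,     3,     9],
    [1 / 2, 1 / 3, 1,     1,     7],
    [1 / 2, 1 / 3, 1,     1,     7],
    [1 / 9, 1 / 9, 1 / 7, 1 / 7, 1],
]
weights = ahp_priorities(pairwise)
# Harmful information receives the lowest weight, i.e. it ranks last.
```

Repeating this per criterion and aggregating by the criteria weights gives the final ranking the abstract reports.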
Given recent events, training during quarantine can take place only remotely. To ensure quality training, communication must be seamless, which requires the network to function smoothly. The solution to this problem is functionally stable networks, which allow uninterrupted transmission of information thanks to redundancy. An important issue is determining the required redundancy. To solve this problem, the article considers a method for synthesizing the structure of a distance learning system. A method is offered for synthesizing the network structure used to provide distance learning by the criterion of maximum functional stability, based on the introduction of correcting communication lines. With this method, tools can be developed for the self-recovery of distributed software that take into account the characteristics of disparate computer resources through the use of redundancy. This makes it possible to build functionally stable software systems and to significantly reduce their recovery time after, or in the event of, possible failures. To increase the efficiency of the developed method, the mathematical model of a hypernetwork based on two hypergraphs was improved, which allows different requirements for network quality to be taken into account.
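The functional-stability criterion behind the redundancy requirement can be illustrated with a connectivity check: a topology with correcting (redundant) communication lines stays connected after any single link failure, while a chain does not. The toy topologies below are hypothetical, not the hypernetwork model from the paper.

```python
# Illustration of functional stability via redundancy: a topology tolerates
# a single link failure if it remains connected after removing any one edge.
from collections import deque

def connected(nodes, edges):
    """BFS reachability: True if every node is reachable from one start node."""
    adj = {v: [] for v in nodes}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    start = next(iter(nodes))
    seen = {start}
    queue = deque([start])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == len(nodes)

def survives_single_failure(nodes, edges):
    """True if the network stays connected after removing any one edge."""
    return all(connected(nodes, [e for e in edges if e != failed])
               for failed in edges)

# A ring has a redundant path around every link; a chain does not.
ring = [(1, 2), (2, 3), (3, 4), (4, 1)]
chain = [(1, 2), (2, 3), (3, 4)]
```

Synthesizing the structure then amounts to adding the cheapest correcting lines until such a stability check (generalized to the hypernetwork model) is satisfied.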