IJCNIS Vol. 7, No. 10, Sep. 2015
Attacks on web servers are becoming increasingly prevalent, and the social and economic impact of successful attacks is exacerbated by our dependence on web-based applications. Many existing attack detection and prevention schemes must be carefully configured to ensure their efficacy. In this paper, we present a study of the challenges that arise in training network payload anomaly detection schemes that rely on collected network traffic for tuning and configuration. The advantage of anomaly-based intrusion detection lies in its potential for detecting zero-day attacks. These schemes, however, require extensive training to properly model the normal characteristics of the system being protected. Training is usually done with real data collected by monitoring the activity of the system. In practice, network operators or administrators may have limited availability of such data, either because the system is newly deployed (or heavily modified) or because the content or behavior that defines normality has changed. We show that artificially generated packet payloads can effectively augment training and tuning. We evaluate the method using real network traffic collected at a server site. We first illustrate the problem (highly variable and unsuitable training data resulting in false positive rates of 3.6–10%), then show improvements using the augmented training method (false positives as low as 0.2%). We also measure the impact on network performance and present a lookup-based optimization that improves latency and throughput.
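The abstract does not detail the payload model the paper trains; as a rough illustration of the general idea, a minimal 1-gram (byte-frequency) payload anomaly detector in the style of PAYL can be sketched as follows. All payloads, the incremental-mean training, and the L1 scoring are illustrative assumptions, not the scheme evaluated in the paper.

```python
# Minimal 1-gram payload anomaly detector (PAYL-style sketch).
# Trains a mean byte-frequency profile from "normal" payloads and
# scores new payloads by distance from that profile.

def byte_freq(payload: bytes):
    """Relative frequency of each of the 256 byte values."""
    counts = [0] * 256
    for b in payload:
        counts[b] += 1
    n = max(len(payload), 1)
    return [c / n for c in counts]

class PayloadModel:
    def __init__(self):
        self.mean = [0.0] * 256
        self.n = 0

    def train(self, payload: bytes):
        f = byte_freq(payload)
        self.n += 1
        # Incremental update of the mean profile over training payloads.
        self.mean = [m + (x - m) / self.n for m, x in zip(self.mean, f)]

    def score(self, payload: bytes) -> float:
        # L1 distance from the learned profile; higher means more anomalous.
        f = byte_freq(payload)
        return sum(abs(x - m) for x, m in zip(f, self.mean))

model = PayloadModel()
for p in [b"GET /index.html HTTP/1.1", b"GET /about.html HTTP/1.1"]:
    model.train(p)                     # normal (or augmented) training data

normal = model.score(b"GET /news.html HTTP/1.1")
attack = model.score(b"\x90" * 24)     # NOP-sled-like binary payload
assert attack > normal
```

Augmenting a sparse training set with artificial payloads, as the paper proposes, amounts to calling `train` on generated payloads so the mean profile covers content the live capture missed.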
Developing mathematical models that reliably approximate epidemic spread on a network is a challenging task, and it becomes even more difficult for wireless networks, where a number of inherent physical properties and processes are not directly observable. The aim of this paper is to explore the impact of several abstract features, including trust, selfishness, and collaborative behavior, on the course of a network epidemic, especially in the context of a wireless network. A five-component differential epidemic model is proposed in this work. The model includes a latency period, with the possibility of switching epidemic behavior, and bilinear incidence is assumed for epidemic contacts. An analysis of the long-term behavior of the system reveals the possibility of an endemic equilibrium point in addition to an infection-free equilibrium. The paper characterizes the endemic equilibrium in terms of its existence conditions. The system is also seen to have an epidemic threshold that marks a well-defined boundary between the two long-term epidemic states. An expression for this threshold is derived, and stability conditions for the equilibrium points are established in terms of it. Numerical simulations further illustrate the behavior of the system under four different experimental set-ups. The paper concludes with results that can help establish an interface between epidemic spread and collaborative behavior in wireless networks.
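The abstract does not name the five compartments of the model; as a generic illustration of a differential epidemic model with a latent class and bilinear incidence, a forward-Euler integration of a standard SEIR system can be sketched as below. All parameter values are illustrative, and the ratio beta/gamma plays the role of the epidemic threshold the paper derives for its own system.

```python
# Generic SEIR epidemic sketch with bilinear incidence (beta * S * I),
# integrated with forward Euler. Illustrative only: the paper's
# five-compartment model is not specified in the abstract.

def seir_step(s, e, i, r, beta, sigma, gamma, dt):
    new_inf = beta * s * i          # bilinear incidence term
    ds = -new_inf
    de = new_inf - sigma * e        # sigma: rate of leaving the latent class
    di = sigma * e - gamma * i      # gamma: recovery rate
    dr = gamma * i
    return s + ds * dt, e + de * dt, i + di * dt, r + dr * dt

s, e, i, r = 0.99, 0.0, 0.01, 0.0   # fractions of the population
beta, sigma, gamma, dt = 0.5, 0.2, 0.1, 0.1
for _ in range(2000):                # simulate up to t = 200
    s, e, i, r = seir_step(s, e, i, r, beta, sigma, gamma, dt)

# Basic reproduction number: above 1, the endemic outcome wins;
# below 1, the infection-free equilibrium is approached.
r0 = beta / gamma
assert r0 > 1 and s < 0.9           # the epidemic took off
```

The two long-term states the abstract mentions correspond to r0 crossing 1: rerunning with beta = 0.05 (r0 = 0.5) leaves s essentially at its initial value.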
The Universal Composability (UC) framework provides provable security guarantees in harsh application environments, where we want protocols that retain their security even when composed concurrently with an arbitrary number of arbitrary (even hostile) protocols. This is a very strong guarantee. The UC framework inherently supports modular design, allowing the secure composition of any number of UC-secure components with an arbitrary protocol. In contrast, traditional analysis and design is stand-alone: the security of a single instance is considered, i.e. an instance that is not in potential interaction with any concurrent instances. Furthermore, a typical traditional analysis is informal, i.e. carried out without a formal proof. Despite these facts, beyond the task of key exchange this technology has not really caught the attention of the applied cryptography community. From a practitioner's point of view, the UC world may seem little more than an academic interest of theoretical cryptographers.
Accordingly, we take a pragmatic approach, concentrating on meaningful compromises between the assumed adversarial strength, ideality wishes, and realization complexity, while keeping provable security guarantees within the UC framework. We believe that even modest but provable goals (especially if tunable to application scenarios) are interesting if wider penetration of UC technology into the daily practice of protocol applications is desired.
Femtocells have become the preferred solution for enhancing the throughput and Quality of Service (QoS) of indoor users. Placing a femtocell is always a challenging task because of its interference constraints with other cells; when a base station supports only a limited number of femtocells, these constraints demand particular attention. In densely populated countries such as India, many femtocells must be installed to obtain adequate throughput, but frequency and interference management remain the limiting factors. This paper explains the interference-management and handover issues, and proposes a method for the optimal placement of a femtocell to increase QoS in a dense environment where a macrocell hosts many femtocells. The solution assumes that the interference is considerably stronger than the noise, so the noise term is not considered in the analysis. Optimal placement was shown to yield better throughput than blind placement: two blind-placement cases were considered, and their throughput was analyzed against that of the optimal placement. The proposed method was tested in both single-room and multi-room buildings, and a throughput gain of about 50% was observed for large buildings with many rooms.
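The abstract does not give the paper's placement algorithm; as a sketch of the interference-limited setting it describes, the comparison between a blind and a user-aware femtocell placement can be illustrated with a simple path-loss model and Shannon capacity, where noise is dropped from the denominator (as in the paper) and all positions, powers, and the path-loss exponent are hypothetical.

```python
# Interference-limited throughput sketch: rate depends on the
# signal-to-interference ratio (SIR) only, with noise ignored.
# Positions, powers, and the path-loss exponent are illustrative.
import math

def rx_power(tx_power, dist, alpha=3.0):
    """Simple distance-based path-loss model."""
    return tx_power / max(dist, 0.1) ** alpha

def shannon_rate(sir, bandwidth_hz=10e6):
    """Shannon capacity in bits/s for a given SIR."""
    return bandwidth_hz * math.log2(1 + sir)

def throughput(user, femto, interferers, p_tx=1.0):
    signal = rx_power(p_tx, math.dist(user, femto))
    interference = sum(rx_power(p_tx, math.dist(user, f)) for f in interferers)
    return shannon_rate(signal / interference)

user = (5.0, 5.0)                         # indoor user position
interferers = [(0.0, 0.0), (10.0, 0.0)]   # neighboring cells
blind = throughput(user, (9.0, 1.0), interferers)    # arbitrary placement
optimal = throughput(user, (5.0, 6.0), interferers)  # placed near the user
assert optimal > blind
```

The gap between `optimal` and `blind` is the kind of gain the paper measures; its actual optimization over multi-room building layouts is more involved than this single-user sketch.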
Mobile ad hoc networks (MANETs) have characteristics, such as a shared medium, that make routing protocols vulnerable. AODV is a reactive routing protocol in which each intermediate node cooperates in the route-discovery process, and a malicious node can exploit a malfunction of this service: the black hole attack abuses the sequence number used to select the freshest route, attracting all exchanged data packets in order to destroy them. Many researchers have addressed this attack and many solutions have been proposed, but these solutions target the network layer only. In this paper, we present our approach, named CrossAODV, to counter the black hole attack; it is based on a verification and validation process. The key point of our approach is the use of cross-layer interaction between the network layer and the medium access control layer, within the distributed coordination function (DCF), to efficiently detect and isolate malicious nodes. During route discovery, the verification process uses the RTS/CTS frames, which contain information about the requested path; the validation process then compares the routing information with the result of the verification phase. Our approach has been implemented, simulated, and compared to two related studies using the well-known NS-2 simulator. The results show the efficacy of our proposal in terms of packet delivery, with a negligible additional delay.
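The sequence-number symptom the attack relies on can be illustrated with a much-simplified single-layer check. This is not the paper's CrossAODV cross-layer validation; it only shows why an attacker's route reply (RREP) stands out, with the threshold value being an arbitrary assumption.

```python
# Simplified black-hole heuristic: a malicious node advertises an
# implausibly fresh destination sequence number in its RREP so that
# the source selects it as the "freshest" route. This single-layer
# check is NOT the paper's CrossAODV scheme; it is illustrative only.

def is_suspicious_rrep(advertised_seq: int, last_known_seq: int,
                       threshold: int = 50) -> bool:
    """Flag an RREP whose sequence number jumps far beyond what we last saw."""
    return advertised_seq - last_known_seq > threshold

# A legitimate reply advances the sequence number modestly...
assert not is_suspicious_rrep(advertised_seq=105, last_known_seq=100)
# ...while a black hole advertises a huge jump to win route selection.
assert is_suspicious_rrep(advertised_seq=9999, last_known_seq=100)
```

CrossAODV strengthens this kind of check by cross-validating the claimed path against information observed in RTS/CTS exchanges at the MAC layer, rather than trusting the network-layer fields alone.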
This paper presents an encryption technique in which an independent random number is generated for every individual message, based on a pass key derived from a secured telephonic conversation and its starting time. A multiplier technique is then applied to the plain text to generate the cipher text. The world runs on ciphers today, and the generation of secure keys calls for cryptosystems that are simple yet effective, producing a cipher with minimal complexity. Vedic mathematics offers a wide variety of techniques for encrypting text, involving concepts such as elliptic curves and the Vedic multiplier. The Vedic multiplier system is used for encoding and decoding, and here we use it to encrypt plain text and generate a cipher based on a random sequence of character equivalents and partial products. The objective of this paper is the development of a system that ensures secrecy and authenticity for private communication between two entities. The proposed idea can be implemented for inter-office message communication.
This paper presents a study of the improvement in efficiency of Deep Packet Inspection based on the Rabin-Karp pattern-matching algorithm. An NVIDIA GPU is programmed with CUDA, NVIDIA's general-purpose parallel computing architecture, which leverages the parallel compute engine in NVIDIA GPUs to solve many complex computational problems more efficiently than a CPU. The proposed CUDA-based implementation on a multicore GPU outperforms an Intel quad-core processor, running up to 14 times faster by executing the pattern search over the text in parallel. The speedup may not sound exorbitant but is nonetheless significant, given that the experiments were conducted on real rather than synthetic data, and that the main goal was optimal performance under a large increase in traffic, not merely a speed improvement on a few test cases.
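For reference, the sequential Rabin-Karp search that the CUDA implementation parallelizes works by rolling a window hash over the text; a minimal version is sketched below. The base and modulus values are arbitrary illustrative choices, and the GPU version would split the text into overlapping chunks, one per thread.

```python
# Sequential Rabin-Karp matcher: hash the pattern once, then roll a
# window hash over the text and verify bytes only on a hash match.

def rabin_karp(text: bytes, pattern: bytes, base=256, mod=1_000_003):
    m, n = len(pattern), len(text)
    if m == 0 or n < m:
        return []
    high = pow(base, m - 1, mod)      # weight of the window's leading byte
    p_hash = t_hash = 0
    for k in range(m):                # initial hashes of pattern and window
        p_hash = (p_hash * base + pattern[k]) % mod
        t_hash = (t_hash * base + text[k]) % mod
    hits = []
    for i in range(n - m + 1):
        # Verify bytes on a hash match to rule out collisions.
        if t_hash == p_hash and text[i:i + m] == pattern:
            hits.append(i)
        if i < n - m:
            # Roll the window: drop text[i], append text[i + m].
            t_hash = ((t_hash - text[i] * high) * base + text[i + m]) % mod
    return hits

assert rabin_karp(b"abracadabra", b"abra") == [0, 7]
```

Because each window position depends only on a constant-time hash update, the positions can be partitioned across GPU threads with only m-1 bytes of overlap per chunk, which is what makes the algorithm a natural fit for CUDA-based DPI.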