IJCNIS Vol. 7, No. 5, Apr. 2015
This paper presents the architecture of an embedded real-time web server. Unlike existing web servers, in our approach requests are processed not in "first in, first out" order but according to their deadlines and the expected server load. For this purpose the Least Laxity First scheduling method is used. First, requests with hard real-time constraints are served. Then, requests with soft deadlines are processed. Finally, requests without time requirements are served in the order they arrived. We also present real-time extensions to the Hypertext Transfer Protocol. We propose headers that enable defining hard and soft deadlines, as well as responses containing time information that are sent to the client application. The experimental results show that, in the case of real-time applications, our server misses significantly fewer requests due to timeouts than existing solutions. The presented server may be very useful for implementing real-time services supported by embedded systems, e.g. in future real-time "Internet of Things" applications.
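The selection policy described above (hard deadlines first, then soft, then FIFO for the rest, with Least Laxity First breaking ties inside each deadline class) can be sketched as follows. This is a minimal illustration, not the paper's implementation; the request fields (`class`, `deadline`, `remaining`) are hypothetical names chosen for the example.

```python
def laxity(req, now):
    """Laxity = time remaining until the deadline minus the
    remaining service time; smaller laxity = more urgent."""
    return (req["deadline"] - now) - req["remaining"]

def pick_next(requests, now):
    """Serve hard-deadline requests first, then soft-deadline ones
    (Least Laxity First within each class), then best-effort
    requests in their arrival (FIFO) order."""
    hard = [r for r in requests if r["class"] == "hard"]
    soft = [r for r in requests if r["class"] == "soft"]
    rest = [r for r in requests if r["class"] == "none"]
    if hard:
        return min(hard, key=lambda r: laxity(r, now))
    if soft:
        return min(soft, key=lambda r: laxity(r, now))
    return rest[0] if rest else None  # FIFO for no-deadline requests

reqs = [
    {"id": 1, "class": "soft", "deadline": 50, "remaining": 10},
    {"id": 2, "class": "hard", "deadline": 30, "remaining": 5},
    {"id": 3, "class": "none", "deadline": None, "remaining": 2},
]
print(pick_next(reqs, now=0)["id"])  # the hard-deadline request wins
```

The hard-deadline request is chosen even though its laxity (25) is smaller than the soft request's (40); the class ordering dominates, matching the three-tier policy in the abstract.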
Today’s Grids include resources (referred to as Grid-sites) from different domains, including dedicated production resources, resources from university labs, and even P2P environments. High-level Grid services, such as schedulers and resource managers, need to know the reliability of the available Grid-sites in order to select the most suitable ones. Modeling the reliability of a Grid-site for successful execution of a job requires predicting the Grid-site’s availability for the given duration of job execution as well as the possibility of successful execution of the job. Predicting Grid-site availability is complex due to differing availability patterns, resource-sharing policies implemented by resource owners, the nature of the domain the resource belongs to (e.g. P2P), and its maintenance. As a solution, we model the reliability of a Grid-site in terms of a prediction of its availability and the possibility of job success. Our availability predictions incorporate past patterns of Grid-site availability using pattern recognition methods. To estimate the possibility of job success, we consider historical traces of job execution. Experiments conducted on a trace of a real Grid demonstrate the effectiveness of our approach for ranking Grid-sites based on their reliability for executing jobs successfully.
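The composition described above (reliability = predicted availability combined with the historical possibility of job success) can be sketched as below. This is a toy stand-in: the paper uses pattern-recognition methods for the availability predictor, whereas here a simple windowed frequency is substituted, and the data layout is invented for illustration.

```python
def availability(up_history, window=7):
    """Predict availability as the fraction of recent observation
    intervals in which the Grid-site was up (a crude stand-in for
    the paper's pattern-recognition predictor)."""
    recent = up_history[-window:]
    return sum(recent) / len(recent)

def success_rate(job_log):
    """Possibility of job success, estimated from historical
    job-execution traces (1 = job succeeded, 0 = job failed)."""
    return sum(job_log) / len(job_log) if job_log else 0.0

def reliability(up_history, job_log):
    """Site reliability = predicted availability x job-success rate."""
    return availability(up_history) * success_rate(job_log)

# Hypothetical per-site data: (uptime history, job outcome log).
sites = {
    "site_a": ([1, 1, 0, 1, 1, 1, 1], [1, 1, 1, 0]),
    "site_b": ([1, 0, 0, 1, 0, 1, 1], [1, 0, 1, 1]),
}
ranked = sorted(sites, key=lambda s: reliability(*sites[s]), reverse=True)
print(ranked)  # sites ordered from most to least reliable
```

Ranking sites by this score is exactly the use case the abstract targets: a scheduler picks the top-ranked site for the next job.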
Digital forensics can be defined as a field of study involving the use of technical and proven procedures for collecting, preserving, validating, analyzing, interpreting and presenting evidence extracted from digital sources in a court of law. Different process models have been proposed by researchers for the cybercrime investigation process, each with its own suitability to the environments where it is applicable, along with other pros and cons. This paper tailors existing process models to the particular domain of higher-education institutes. With growing access to computing resources and the Internet among students, employees and citizens at large, organizations urgently need to establish and maintain a cyber-forensics analysis policy, along with the whole process to be followed whenever a cybercrime scene is reported.
This paper puts forward a novel image encryption scheme based on an ordinary differential equation system. Firstly, a hyper-chaotic differential equation system is used to generate two hyper-chaotic orbit sequences. Introducing the idea of hybrid orbits, the two orbits are mixed to generate a hybrid hyper-chaotic sequence which serves as the initial chaotic key stream. Secondly, the final encryption key stream is generated through two rounds of diffusion operations that depend on both the initial chaotic key stream and the plain-image. Therefore, the algorithm’s key stream depends not only on the cipher keys but also on the plain-image. Finally, security and performance analyses have been performed, including key space analysis, histogram analysis, correlation analysis, information entropy analysis, peak signal-to-noise ratio analysis, key sensitivity analysis, and differential analysis. All the experimental results show that the proposed image encryption scheme is secure and suitable for practical image and video encryption.
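The key idea of a plain-image-dependent diffusion round can be sketched as below. For brevity this substitutes the 1-D logistic map for the paper's hyper-chaotic ODE system and hybrid-orbit mixing, and runs a single diffusion round; the chaining of each cipher byte on the previous one is what makes the effective key stream depend on the plain-image, as the abstract describes.

```python
def logistic_keystream(x0, r, n, skip=100):
    """Byte keystream from the logistic map x <- r*x*(1-x)
    (a simple substitute for the hyper-chaotic orbit sequences)."""
    x = x0
    for _ in range(skip):            # discard the transient
        x = r * x * (1 - x)
    stream = []
    for _ in range(n):
        x = r * x * (1 - x)
        stream.append(int(x * 256) % 256)
    return stream

def diffuse(pixels, seed, x0=0.3141, r=3.99):
    """One diffusion round: each cipher byte depends on the keystream,
    the plain byte, and the previous cipher byte."""
    ks = logistic_keystream(x0, r, len(pixels))
    out, prev = [], seed
    for p, k in zip(pixels, ks):
        c = (p + k + prev) % 256
        out.append(c)
        prev = c
    return out

def undiffuse(cipher, seed, x0=0.3141, r=3.99):
    """Inverse of diffuse(): recovers the plain bytes."""
    ks = logistic_keystream(x0, r, len(cipher))
    out, prev = [], seed
    for c, k in zip(cipher, ks):
        out.append((c - k - prev) % 256)
        prev = c
    return out

plain = [10, 200, 55, 0, 255]        # a toy row of pixel values
ct = diffuse(plain, seed=123)
print(undiffuse(ct, seed=123) == plain)
```

Because `prev` carries the last cipher byte forward, flipping one plain pixel changes every subsequent cipher byte, which is the property the differential analysis in the paper evaluates.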
Over the last three decades, many chaos-based cryptography algorithms have been proposed that are very fast in computation. Chaos is used for secure communication in two ways: analog secure communication and digital chaotic ciphers. This paper focuses mainly on digital chaotic cryptosystems. In symmetric cryptosystems, the same key is used for both encryption and decryption. In 1998, Baptista proposed the most widely used symmetric cryptosystem based on the ergodic property of the logistic map. Later, many refinements of Baptista’s algorithm were made. Reviewing these later refinements, some flaws can be observed. The proposed scheme uses a two-step logistic map, a feedback mechanism with an extra variable, to overcome these flaws. Finally, the proposed scheme is compared with other versions of Baptista-type cryptosystems, showing that it improves on the previous ones and is resistant to behavior analysis attacks and partial key recovery attacks.
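To make the ergodic idea concrete, here is a minimal sketch of the original Baptista construction (not the paper's two-step refinement): the attractor interval is split into cells, one per symbol, and the ciphertext for each byte is the number of logistic-map iterations needed for the orbit to land in that byte's cell. The constants (`r = 3.99`, `x0 = 0.4321`, the interval `[0.2, 0.8)`, the minimum iteration count) are illustrative choices.

```python
def cell(x, n_cells=256, lo=0.2, hi=0.8):
    """Map a trajectory point inside [lo, hi) to one of n_cells sites."""
    return int((x - lo) / (hi - lo) * n_cells)

def encrypt(data, x0=0.4321, r=3.99, min_iter=250):
    """Baptista-style encryption: each cipher unit is an iteration
    count; the orbit is continued across symbols (ergodic search)."""
    x, cipher = x0, []
    for byte in data:
        n = 0
        while True:
            x = r * x * (1 - x)
            n += 1
            if n >= min_iter and 0.2 <= x < 0.8 and cell(x) == byte:
                cipher.append(n)
                break
    return cipher

def decrypt(cipher, x0=0.4321, r=3.99):
    """Replay the same orbit and read off which cell it lands in."""
    x, plain = x0, []
    for n in cipher:
        for _ in range(n):
            x = r * x * (1 - x)
        plain.append(cell(x))
    return plain

msg = [72, 105]                      # bytes of "Hi"
print(decrypt(encrypt(msg)) == msg)
```

The flaws the abstract alludes to stem from exactly this structure (iteration counts leak statistical information about the orbit), which is what the paper's feedback variable is intended to mask.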
Cloud Computing refers to network-based services provided by a large number of computers sharing computing and storage resources, combined with on-demand provisioning mechanisms and a pay-per-use business model.
Cloud Computing offers the possibility to scale rapidly, to store data remotely and to share services in a dynamic environment. However, these benefits can be seen as weaknesses for assuring trust and providing confidence to the users of a service. In this case, some traditional mechanisms to guarantee reliable services are no longer suitable or dynamic enough, and new models need to be developed to fit this paradigm.
This study describes the assessment of trust in the context of Cloud Computing, proposes a new trust model adapted to Cloud environments, and reports some experiments regarding the proposed solution.
In this paper we present a new packet scheduling method for modern NGN networks based on the parallel use of multiple WRR schedulers, rate limiters and output bandwidth calculation. The main idea of the presented method is to provide queueing fairness within queues. The method yields the same output bandwidth allocation as the compared algorithms, while within one queue, flows with different packet sizes and arrival rates get the same output bandwidth. With this method we are able to achieve the same overall bandwidth assignment as algorithms like WRR, WFQ, WRRPQ and LLQ by only changing the mathematical model used to calculate the bandwidth assignment. We call this method Weighted Round Robin and Rate Limiter based Fair Queuing (WRRRLbFQ). We validate the model outcome with simulation results using the NS2 simulator and compare its behavior with the WRR scheduler.
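For reference, the baseline WRR behavior the method builds on (and is compared against) can be sketched as below: in each round, queue i may dequeue up to weights[i] packets. This is only the classic per-packet WRR; the paper's WRRRLbFQ adds rate limiters and an output-bandwidth calculation on top, which are not reproduced here.

```python
from collections import deque

def wrr_schedule(queues, weights, rounds=1):
    """Classic Weighted Round Robin: in every round, queue i is
    allowed to send up to weights[i] packets before the scheduler
    moves on to the next queue."""
    qs = [deque(q) for q in queues]
    order = []
    for _ in range(rounds):
        for q, w in zip(qs, weights):
            for _ in range(w):       # per-round quota for this queue
                if q:
                    order.append(q.popleft())
    return order

queues = [["a1", "a2", "a3"], ["b1", "b2"], ["c1"]]
print(wrr_schedule(queues, weights=[2, 1, 1], rounds=2))
# round 1 serves a1, a2, b1, c1; round 2 serves a3, b2
```

Note that plain WRR counts packets, not bytes, so a queue carrying large packets receives more bandwidth than its weight suggests; that per-flow unfairness inside a queue is precisely what the proposed method addresses.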
In day-to-day communications we may need to establish temporary (ad hoc) connections anytime, anywhere. Data transfer through such ad hoc wireless networks is required when it is hard to establish a large infrastructure. In MANETs there are many challenges in deploying security, especially when the confidentiality of the data is at stake. If the data is highly confidential, then providing security, particularly in a malicious environment, is a truly challenging task. Many researchers have proposed solutions for internal as well as external attacks, but unfortunately each has some tradeoffs. Some methods are designed only for specific attacks; others provide solutions for many attacks but suffer from factors like delay and high resource utilization. In this paper, we provide insight into the various security techniques that have accumulated over the years. We attempt to present the current approaches for developing secure systems. These methods use simple techniques to enhance security and to reduce complexity. Many surveys have been conducted before on security issues and methods; however, to our knowledge, no one has surveyed the current emerging secure methods, which may be more effective than the most widely used ones.