IJCNIS Vol. 9, No. 5, May. 2017
The privacy of biometric data must be protected, and cancellable biometrics has been proposed as an effective protection mechanism. In this paper a novel scheme for constructing cancellable fingerprint minutiae templates is proposed. Specifically, each real minutia point in an original template is mapped, using the k-nearest-neighbour method, to a neighbouring fake minutia in a user-specific, randomly generated synthetic template. The recognition template is then constructed by collecting the neighbouring fake minutiae of the real minutiae. This scheme has two advantages: (1) an attacker must capture both the original template and the synthetic template in order to reconstruct the recognition template; (2) a compromised recognition template can be cancelled easily by replacing the synthetic template. Single-neighbour experiments of self-matching, non-self-matching, and impostor matching are carried out on three databases: DB1B from FVC2000, DB1B from FVC2002, and DB1 from FVC2004. Double-neighbour tests are also conducted on DB1B from FVC2002. The results show that the constructed recognition templates perform more accurately than the original templates and that it is feasible to construct cancellable fingerprint templates with the proposed approach.
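As an illustration of the mapping step described above, here is a minimal sketch of the single-neighbour (k = 1) case. All names and parameters (`build_recognition_template`, the template size, the number of fake minutiae) are hypothetical choices for illustration, not taken from the paper:

```python
import math
import random

def nearest_fake(real, fakes):
    """Return the fake minutia closest to a real minutia (Euclidean distance on x, y)."""
    return min(fakes, key=lambda f: math.dist(real[:2], f[:2]))

def build_recognition_template(real_template, seed, n_fake=100, size=400):
    """Map each real minutia (x, y, angle) to its nearest neighbour in a
    user-specific, randomly generated synthetic template (k = 1 case)."""
    rng = random.Random(seed)  # the user-specific key seeds the synthetic template
    fakes = [(rng.uniform(0, size), rng.uniform(0, size), rng.uniform(0, 360))
             for _ in range(n_fake)]
    return [nearest_fake(m, fakes) for m in real_template]
```

Replacing the seed regenerates the synthetic template, which is what makes the recognition template cancellable: the same fingerprint yields a different template under a new key.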
Sharing data is always a crucial task in information technology, where data are a resource for generating significant information used in decision making and interpretation. Depending on the impact of that information, the sharing and storage of data demand security. Sharing a large amount of sensitive textual data contained in a file requires a way of hiding it from direct decipherment. Because of the size restrictions associated with steganography, cryptography is preferred for sharing large amounts of sensitive data. In this paper we propose an encryption technique with O(1) time complexity that exploits the dynamic node-fusion property of graph theory. The design is inspired by an analogy with the cosmos, in which billions of particles repeatedly merge to form new structures, then split and disperse, hiding and disclosing structure in a cyclic manner. Following this idea, we propose a dynamic layered encryption technique that resists the illicit actions of intruders with low computational effort and reduces the network load of packet transmission. As processing power and requirements grow, the capacity of the proposed technique can easily be extended.
Nowadays, software systems play remarkable roles in human life, and software has become an indispensable aspect of modern society. Given this significance, establishing and maintaining software reliability is essential so that errors, failures, and disasters can be prevented. The magnitude of errors in a program should therefore be detected and identified, and software reliability should be measured and investigated, to prevent the spread of errors. Many methods have been proposed in the software-reliability literature; however, most of them are inefficient and undesirable due to high overhead, vulnerability, excessive redundancy, and high data replication. The method introduced in this paper identifies the vulnerable data of a program using the class diagram and the proposed formula. By applying minimal redundancy and duplication to 70% of the program's critical data, the proposed method protects the program data. Evaluation of the proposed method shows that it can improve reliability while reducing performance overhead, redundancy, and complexity.
In recent years, wireless sensor networks (WSNs) have attracted considerable attention from the scientific and technical community. The distributed nature and dynamic topology of sensor networks impose very particular requirements that routing schemes must meet. The key features of an efficient routing protocol are low energy expenditure and extended network lifetime. Various routing algorithms have been presented for WSNs in the past few years. In this work, we focus on cluster-based routing algorithms and propose a new routing algorithm for WSNs. We compare the new cluster-based algorithm with existing algorithms on the basis of standard performance metrics. Simulation results show that the proposed algorithm outperforms the other existing algorithms in its category.
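The abstract does not detail the proposed algorithm. As background on the cluster-based routing family it belongs to, a minimal LEACH-style probabilistic cluster-head election can be sketched as follows (a simplification: real LEACH also excludes nodes that served as heads in recent rounds):

```python
import random

def elect_cluster_heads(nodes, p=0.05, rnd=0, rng=None):
    """LEACH-style election: each node becomes a cluster head for this round
    with probability T = p / (1 - p * (rnd mod 1/p)), spreading the
    energy-expensive head role across the network over time."""
    rng = rng or random.Random(0)
    period = int(1 / p)  # every node should serve once per 1/p rounds
    threshold = p / (1 - p * (rnd % period))
    return [n for n in nodes if rng.random() < threshold]
```

Non-head nodes then join the nearest head, which aggregates and forwards their data, reducing per-node energy expenditure and extending network lifetime.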
Verifiable secret image sharing has become an important field in modern cryptography. Security is the main concern, and verifiability has become a necessity of this era in order to prevent cheating. A secret image sharing scheme that identifies the presence of a cheater is analysed and described, and a method for ensuring the integrity of the secret image prior to its recovery is proposed. A secret image and a verification image are used to create shares through ARGB-to-CMYK conversions, which are then embedded in a cover image for transmission. Because the shares created are meaningful, the method can identify whether a cheater exists, preserving the integrity of the image.
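The share-construction details are not given in the abstract, but the standard per-pixel RGB-to-CMYK channel conversion it builds on (the alpha channel of ARGB is carried separately) is:

```python
def rgb_to_cmyk(r, g, b):
    """Standard RGB -> CMYK conversion for 8-bit channels (0-255).
    K is the black level; C, M, Y are the remaining colour components."""
    if (r, g, b) == (0, 0, 0):
        return 0.0, 0.0, 0.0, 1.0  # pure black: avoid division by zero
    rp, gp, bp = r / 255, g / 255, b / 255
    k = 1 - max(rp, gp, bp)
    c = (1 - rp - k) / (1 - k)
    m = (1 - gp - k) / (1 - k)
    y = (1 - bp - k) / (1 - k)
    return c, m, y, k
```

Splitting each pixel into four CMYK components gives the scheme four channels over which share data can be distributed.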
In 2009 Craig Gentry proved that fully homomorphic encryption can be realized in principle. Homomorphism allows arbitrary computations to be performed on encrypted data. RSA was the first cryptosystem shown to hold a homomorphic property, and other additively and multiplicatively homomorphic cryptosystems followed. Fully homomorphic encryption, however, proved to be the ultimate cryptographic solution for ensuring the security of data in the cloud: it enables arbitrary functions to be processed and computed over encrypted data, thereby reducing the probability of exposing the plaintext.
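The multiplicative homomorphism of RSA mentioned above can be demonstrated with textbook-RSA toy parameters (insecure key sizes, no padding, illustration only): multiplying two ciphertexts yields a ciphertext of the product of the plaintexts.

```python
def rsa_homomorphic_demo():
    # Toy textbook-RSA parameters (far too small for real use)
    p, q = 61, 53
    n = p * q                 # modulus 3233
    e, d = 17, 2753           # e*d ≡ 1 (mod (p-1)*(q-1))
    enc = lambda m: pow(m, e, n)
    dec = lambda c: pow(c, d, n)

    m1, m2 = 7, 6
    c_prod = (enc(m1) * enc(m2)) % n   # multiply ciphertexts only
    return dec(c_prod)                 # decrypts to m1 * m2
```

Because `enc(m1) * enc(m2) = (m1**e) * (m2**e) = (m1*m2)**e (mod n)`, the product of ciphertexts is a valid encryption of `m1 * m2`; a fully homomorphic scheme extends this from one operation to arbitrary circuits.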
Cloud computing is a new generation of computing environment that delivers applications as services to users over the internet. Users can select any service from a list offered by service providers, depending on their needs. The nature of this new computing environment leads to task-scheduling and load-balancing problems, which have become a booming research area. In this paper, we propose a Scheduling Cost Approach (SCA) that calculates the cost of the available CPU, RAM, bandwidth, and storage. In this approach, tasks are distributed among the VMs based on a priority given by the user, which depends on the user's budget satisfaction. The proposed SCA improves load balance by selecting a suitable VM for each task. The results of SCA are compared with those of the FCFS and SJF algorithms, showing that the proposed approach significantly reduces the cost of CPU, RAM, bandwidth, and storage.
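The abstract does not specify SCA's cost formula, so the following is only a hypothetical sketch of the general idea: a weighted resource-cost function over CPU, RAM, bandwidth, and storage, with tasks assigned in priority order to the currently cheapest VM. The names (`vm_cost`, `schedule`) and the linear cost model are assumptions, not the paper's definitions.

```python
def vm_cost(vm, weights):
    """Weighted resource cost of a VM (hypothetical linear cost model)."""
    return sum(weights[r] * vm[r] for r in ("cpu", "ram", "bw", "storage"))

def schedule(tasks, vms, weights):
    """Greedy sketch: serve tasks highest user-priority first,
    assigning each to the VM with the lowest current weighted cost."""
    plan = {}
    for task in sorted(tasks, key=lambda t: -t["priority"]):
        vm = min(vms, key=lambda v: vm_cost(v, weights))
        plan[task["id"]] = vm["id"]
        vm["cpu"] += task["load"]  # loading a VM raises its cost for later tasks
    return plan
```

Updating the chosen VM's load after each assignment is what spreads work across VMs, which is the load-balancing effect the abstract attributes to SCA.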