IJITCS Vol. 8, No. 5, May 2016
Cover page and Table of Contents
A parallel approach for solving the large-scale Traveling Salesman Problem (TSP) is presented. The problem is solved in four stages: decomposing the input set of points into two or more clusters, solving the TSP for each cluster to generate partial solutions, merging the partial solutions into a complete initial solution M0, and finally optimizing this solution. The Lin-Kernighan-Helsgaun (LKH) algorithm is used to generate the partial solutions. The main goal of this research is to achieve speedup and good solution quality through parallel computation. A clustering algorithm produces a set of small TSP instances that can be solved in parallel to generate partial solutions, which are then merged into a solution M0 by applying the "Ring" method. Several optimization algorithms are proposed to improve the quality of M0 and generate a final solution Mf. The loss of solution quality with the developed approach is negligible compared to the existing best-known solutions, while the runtime improves significantly. The minimum number of processors required to achieve the maximum speedup equals the number of clusters created.
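The four-stage pipeline can be sketched as follows. This is a minimal illustration, not the paper's implementation: the coordinate-split clustering, the nearest-neighbour solver (standing in for LKH), the thread pool (standing in for separate processors), and the naive concatenation merge (standing in for the "Ring" method) are all simplifying assumptions.

```python
import math
from concurrent.futures import ThreadPoolExecutor

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def cluster_points(points, k=2):
    """Stage 1: split the input points into k clusters (here: by x-coordinate,
    a stand-in for the paper's clustering algorithm)."""
    pts = sorted(points)
    size = math.ceil(len(pts) / k)
    return [pts[i * size:(i + 1) * size] for i in range(k)]

def nearest_neighbour_tour(points):
    """Stage 2: solve one small TSP (greedy heuristic standing in for LKH)."""
    tour = [points[0]]
    rest = set(points[1:])
    while rest:
        nxt = min(rest, key=lambda p: dist(tour[-1], p))
        tour.append(nxt)
        rest.remove(nxt)
    return tour

def solve_parallel(points, k=2):
    clusters = cluster_points(points, k)
    # One solver per cluster runs concurrently, mirroring the idea that the
    # number of processors needed equals the number of clusters.
    with ThreadPoolExecutor(max_workers=k) as ex:
        partials = list(ex.map(nearest_neighbour_tour, clusters))
    # Stage 3: merge partial tours into M0 (naive concatenation, not "Ring").
    return [p for part in partials for p in part]

pts = [(0, 0), (1, 0), (1, 1), (10, 0), (11, 0), (11, 1)]
tour = solve_parallel(pts, k=2)
print(len(tour))  # every input point appears exactly once in the merged tour
```

Stage 4 (optimizing M0 into Mf) is omitted; a 2-opt pass over the merged tour would be a natural drop-in at that point.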
We describe the programming language FOBS-X (Extensible FOBS). FOBS-X is interpreted and is intended as a universal scripting language. One of its more interesting features is its extensibility, allowing it to be adapted to new scripting environments. FOBS-X is structured as a core language that is parsed by the interpreter, and an extended language that is translated to the core by macro expansion. The syntax of the language can easily be modified by writing new macros. The FOBS-X library is reconfigurable, allowing the semantics of the language to be modified and adapted to facilitate interaction with interfaces to new scripting environments. This paper focuses on the tools used for the semantic extension of the language. A tool called FEDELE has been developed, allowing the user to add library modules to the FOBS-X library. In this way the semantics of the language can be enhanced, and the language can be adapted.
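The "extended language translated to the core by macro expansion" idea can be sketched generically. The macro patterns and the core forms below are invented for illustration; the abstract does not specify FOBS-X's actual macro syntax.

```python
import re

# Toy macro table: each entry rewrites an extended-syntax form into a
# hypothetical core form. Both the pattern and the target form are
# illustrative assumptions, not real FOBS-X syntax.
MACROS = [
    # extended: "unless COND do BODY"  ->  core: "if (not COND) BODY"
    (re.compile(r"unless (\w+) do (\w+)"), r"if (not \1) \2"),
]

def expand(src: str) -> str:
    """Apply macro rewrites repeatedly until the source reaches a fixed
    point, i.e. no macro matches any more."""
    changed = True
    while changed:
        changed = False
        for pat, repl in MACROS:
            new = pat.sub(repl, src)
            if new != src:
                src, changed = new, True
    return src

print(expand("unless ready do wait"))  # -> "if (not ready) wait"
```

Adding a new surface syntax then amounts to appending one entry to the macro table, which is the kind of syntactic extensibility the abstract describes.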
Cloud computing is considered one of the most exciting technologies because of its flexibility and scalability. The main problem in the cloud is security. To address these security issues, a new technique called fog computing has evolved. Since security issues remain in the fog even after receiving encrypted data from the cloud, we implemented encryption using the AES algorithm to examine how it works for the fog. In our analysis so far, AES is the most secure encryption algorithm for this purpose. Three datasets of different types were selected, and the analysed encryption technique was applied to them. On validation, the entire data across the datasets was accurately encrypted and decrypted back. We took an Android mobile phone as an edge device and deployed the dataset encryption onto it. The performance of the encryption over the selected datasets is then evaluated for accuracy (whether the entire data is correctly encrypted and decrypted), along with time, user load, response time, and memory utilization against file size. Finally, the best and worst cases among the datasets are analysed, thereby evaluating the suitability of AES in the fog.
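The validation loop described (encrypt, decrypt, check the round trip is exact, and time it) can be sketched as below. AES is not in the Python standard library, so a toy SHA-256-keystream XOR cipher stands in for it here purely to make the harness self-contained; a real deployment would substitute AES from a cryptographic library, and nothing about this toy cipher should be taken as secure.

```python
import hashlib
import time

def _keystream(key: bytes, n: int) -> bytes:
    """Toy keystream from iterated SHA-256 (NOT AES; illustration only)."""
    out, block = bytearray(), key
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out.extend(block)
    return bytes(out[:n])

def encrypt(key: bytes, data: bytes) -> bytes:
    ks = _keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

def validate(key: bytes, dataset: bytes):
    """Round-trip accuracy check plus wall-clock timing, mirroring the
    evaluation criteria named in the abstract (accuracy and time)."""
    t0 = time.perf_counter()
    ciphertext = encrypt(key, dataset)
    plaintext = decrypt(key, ciphertext)
    elapsed = time.perf_counter() - t0
    return plaintext == dataset, elapsed

ok, secs = validate(b"demo-key", b"sample dataset contents")
print(ok)  # True: the dataset decrypts back exactly as it was
```

The other metrics mentioned (user load, response time, memory utilization over file size) would be measured around the same `validate` call on the edge device.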
Money laundering is a criminal activity that disguises black money as white money: a process by which illegal funds and assets are converted into legitimate funds and assets. Money laundering occurs in three stages: placement, layering, and integration. It fuels various criminal activities such as political corruption, smuggling, and financial fraud. In India, no successful anti-money-laundering technique is currently available. The Reserve Bank of India (RBI) has issued guidelines for identifying suspicious transactions and reporting them to the Financial Intelligence Unit (FIU), which verifies whether a transaction is actually suspicious. This process is time consuming and ill-suited to identifying the illegal transactions occurring in the system. To overcome this problem, we propose an efficient anti-money-laundering technique that can identify the traversal path of laundered money using a hash-based association approach, and that succeeds in identifying the agent and the integrator in the layering stage of money laundering through a graph-theoretic approach.
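The graph-theoretic part of the idea, recovering the traversal path of funds from placement to integration, can be sketched with a plain breadth-first search. The account names and the transaction graph below are hypothetical, and the paper's hash-based association step (which would build this graph from transaction records) is not reproduced.

```python
from collections import deque

# Hypothetical transaction graph: each edge points from sender to receiver.
transfers = {
    "placement_acct": ["shell_A", "shell_B"],
    "shell_A": ["shell_C"],
    "shell_B": ["shell_C"],
    "shell_C": ["integration_acct"],
    "integration_acct": [],
}

def trace_path(graph, source, target):
    """BFS for one traversal path of the funds -- a simplified stand-in
    for the paper's combined hash-based association and graph-theoretic
    approach. Intermediate nodes on the path are layering accounts."""
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(trace_path(transfers, "placement_acct", "integration_acct"))
# -> ['placement_acct', 'shell_A', 'shell_C', 'integration_acct']
```

In this toy graph the first and last intermediaries on the recovered path play the roles of the "agent" and "integrator" the abstract refers to.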
In recent years, continuous progress in wireless communication has opened a new research field in computer networks. Wireless ad-hoc networking is now an emerging research technology that needs the attention of both industry and academia. A vehicular ad-hoc network (VANET) uses vehicles as mobile nodes to create mobility in a network. Simulation is the reproduction of real-world processes: a computer simulation runs on a single computer or a network of computers to model and reproduce the behaviour of a system, based on a conceptual model of that system. In this research paper, we discuss coupling the simulators VanetMobiSim and NS2 for vehicular ad-hoc networks. The output will be useful in implementing efficient tools on a realistic highway scenario, especially for four-wheeler traffic in VANETs.
This study examines individuals' participation intentions and behaviour on social networking sites (SNSs). For this purpose, the Decomposed Theory of Planned Behaviour is utilized. Data collected from a survey of 1100 participants and distilled to 657 usable sets was analysed via structural equation modelling to assess the predictive power of the Decomposed Theory of Planned Behaviour model. The results show that attitude and subjective norm have a significant effect on the participation intention of adopters, and that participation intention in turn has a significant effect on participation behaviour. However, the findings also show that perceived behavioural control has no significant effect on either the participation intention or the participation behaviour of adopters. The model adopted in this study explains 47% of the variance in participation intentions and 36% of the variance in participation behaviour. Behavioural intention contributed the most to the model's explanatory power among the constructs, explaining 14.6% of usage behaviour, while attitude explained around 9% of SNS usage behaviour.
ICT is driving all areas of the economy and is likely to dictate the future for all genders. The narrow definition of ICT has greatly affected whether women choose ICT as a career, and there are few women in ICT careers. The study sought to determine the nature of ICT career gender exclusion, the status and trend of ICT job opportunities, the sources of ICT gender career exclusion, and the contribution of the narrow definition to that exclusion. A mixed method of survey and desktop research was employed in this study. A structured questionnaire was used to identify the factors that influence ICT career choice among female Kenyan students. A purposive sample of Information Technology and Computer Science undergraduate university students (77 females, 56 males; age range 17 to 35 years) and 10 postgraduate students in Information Technology from two public universities participated in the study. The paper discusses the emerging unfilled ICT jobs. The study established that the narrow definition negatively influences ICT as a career of choice among girls. Broadening the definition of ICT to include related careers with a more social rather than technical aspect is likely to influence more women to join the field.
Systems carry sensitive data, and wherever users are involved there is a security concern for modern software applications; such users can be termed 'untrusted clients'. Internet usage has grown rapidly over the years and more organisations are opening their information systems to their clientele, so it is essential to understand which user data needs protecting and to control both system access and the rights of the system's users. Because of today's increasingly nomadic lifestyle, where users connect to information systems from anywhere with any of the devices on the market, users need to carry part of the information system out of the secure infrastructure. Insecurity in user interfaces arises when users ignore functionality in the system, some of which is not only a threat but can harm the system (e.g. leaving network services active even though the user does not need them), or when a user has little or no knowledge of the available security measures. This research paper aims to critically address, through a review of existing literature, the importance of the balance, or trade-off, between the usability and the security of a system. The systematic review method involved a physical exploration of selected conference proceedings and journals. Research questions relating to usability and security were asked, and the criteria for usability and security evaluations were identified. This systematic literature review is valuable in closing the gap between usability and security in the software development process, where usability engineering and security engineering need to be considered together for better-quality end-user software.
Biological sequence comparison is one of the most important and basic problems in computational biology. Due to its high demands for computational power and memory, it is a very challenging task. The well-known algorithm proposed by Smith and Waterman obtains the best local alignments at the expense of very high computing power and huge memory requirements. This paper introduces a new efficient algorithm to locate the longest common subsequences (LCS) in two different DNA sequences. It is based on the convolution between the two DNA sequences: the major sequence is represented as a linked list X while the minor one is represented as a circular linked list Y. An array of linked lists is established, where each linked list corresponds to an element of the linked list X, and a new node is added to it for each match between the two sequences. If two or more matches at different locations in string Y share the same location in string X, the corresponding nodes construct a unique linked list. Accordingly, by the end of processing we obtain a group of linked lists containing nodes that reflect all possible matches between the two sequences X and Y. The proposed algorithm has been implemented and tested using C#. The benchmark test shows very good speedups and indicates that impressive improvements have been achieved.
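A much-simplified version of the match-collection idea can be sketched as follows: every position pair (i, j) where the sequences agree is recorded, and diagonally adjacent matches are chained together, here into contiguous common runs. A plain dictionary stands in for the paper's array of linked lists, and this sketch recovers only the longest common contiguous run rather than the full set of match chains the C# implementation builds.

```python
def longest_common_run(x: str, y: str) -> str:
    """Collect every (i, j) with x[i] == y[j] and extend matches along
    the diagonal. run_end[(i, j)] holds the length of the match run
    ending at (i, j) -- a dict stand-in for the paper's linked lists."""
    run_end = {}
    best_len, best_i = 0, 0
    for i, cx in enumerate(x):
        for j, cy in enumerate(y):
            if cx == cy:
                run_end[(i, j)] = run_end.get((i - 1, j - 1), 0) + 1
                if run_end[(i, j)] > best_len:
                    best_len, best_i = run_end[(i, j)], i
    return x[best_i - best_len + 1: best_i + 1]

print(longest_common_run("GATTACA", "TTACGTA"))  # -> "TTAC"
```

This naive double loop is O(|X|·|Y|); the point of the paper's linked-list organisation is to touch only the actual matches rather than every position pair.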
The research activities in mobile computing and wireless networks strongly indicate that mobile computers and their wireless communication links will be an integral part of future internetworks. Communication over wireless links is characterised by limited bandwidth, high latency, high bit-error rates and temporary disconnections. Networks with wireless links and mobile hosts most often incur significant losses due to handoff, and TCP (Transmission Control Protocol) suffers degraded end-to-end performance in wireless environments. In this paper, we propose solutions to these problems. We adopt a new method to improve end-to-end reliable transport performance in a mobile wireless environment by incorporating changes to the network layer at the base station and the mobile host while preserving the end-to-end semantics of TCP. The methodology employs an NDG (normalised delay gradient) loss-predictor function to distinguish congestion losses from transmission losses. The sender window can adjust its size depending on the loss information: if the loss is due to congestion, the congestion control algorithm is invoked to decrease the flow rate; if the loss is due to handoff (a transmission loss), an immediate-recovery algorithm is invoked to recover the losses at the sender TCP. Stochastic equations are used to analyse (i) the arrival rate of handoff calls, (ii) the blocking probability of handoff requests, (iii) the distinction between packet loss due to congestion and due to handoff, (iv) the dynamics of the sender window, (v) the queue length at the ingress point of the BS router, (vi) throughput, and (vii) the losses due to congestion and handoff blocking. Our results provide a better way of understanding the problem of call drops due to handoff and give more accurate solutions for mobile wireless systems.
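The loss-classification step can be sketched as below. The abstract does not give the paper's exact NDG formula or threshold, so the difference-over-sum gradient and the 0.1 cutoff here are assumptions chosen only to illustrate the decision logic: rising delay suggests queue build-up (congestion), while loss without rising delay is attributed to the wireless link or handoff.

```python
def ndg(delay_prev: float, delay_curr: float) -> float:
    """Normalised delay gradient between two delay samples. This
    difference-over-sum form is an assumption; the paper's exact
    definition is not given in the abstract."""
    return (delay_curr - delay_prev) / (delay_curr + delay_prev)

def classify_loss(delay_prev: float, delay_curr: float,
                  threshold: float = 0.1) -> str:
    """Decide which recovery path the sender takes for a detected loss."""
    if ndg(delay_prev, delay_curr) > threshold:
        return "congestion"    # invoke congestion control: shrink window
    return "transmission"      # handoff loss: invoke immediate recovery

print(classify_loss(100.0, 180.0))  # delay climbing -> "congestion"
print(classify_loss(120.0, 118.0))  # delay flat     -> "transmission"
```

In the paper's scheme the sender window then reacts accordingly: a rate decrease for "congestion", and immediate retransmission without a rate cut for "transmission".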