IJITCS Vol. 8, No. 12, Dec. 2016
The Continuous Integration idea underlies almost all automation aspects of the software development process. Other practices are built on top of it and extend it: Continuous Delivery and Continuous Deployment, which automate even more aspects of software development. The purpose of this paper is to describe those practices, including debug-process automation, to emphasize the importance of automating more than just unit tests, and to provide an example of complex automation of the web application development process.[...]
The fields of usability, user experience (UX) design and human-computer interaction (HCI) arose in the realm of desktop computers and applications. Over recent years, computing has evolved radically into ubiquitous computing. Interactions these days take place on many kinds of devices: mobile phones, e-readers and smart TVs, among numerous smart devices, and using one service across multiple devices with different form factors is now common. Academic researchers are still working out the best design techniques for new devices and experiences. The Internet of Things (IoT) is growing, with an ever wider range of everyday objects acquiring connectivity, sensing ability and increased computing power. Designing for IoT raises many challenges, the most obvious being the much wider variety of device form factors. IoT is still a technically driven field, so the usability of many IoT products falls short of the level expected of mature consumer products. This study proposes usability evaluation criteria for the generic IoT architecture and its essential technological components.[...]
In this paper, human emotions are analyzed from EEG (electroencephalogram) signals recorded under different kinds of situations. Emotions are important in many activities and in decision making. Feature extraction techniques such as the discrete wavelet transform, higher-order crossings and independent component analysis are used to extract the relevant features. Those features are used to classify the emotions into groups, such as arousal and valence levels, using classification techniques such as neural networks and support vector machines. Based on these emotion groups, the performance and accuracy of the classification are determined.[...]
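To make the discrete wavelet transform step above concrete, here is a minimal sketch (not the paper's code) of a Haar DWT on an EEG epoch, using per-band detail energies as features; the number of levels and the use of Haar rather than another mother wavelet are illustrative assumptions.

```python
# Illustrative sketch: single-level Haar DWT applied recursively to an
# EEG epoch, with the energy of each detail band used as a feature.
def haar_dwt(signal):
    """One Haar DWT level: (approximation, detail) coefficient lists."""
    s = 2 ** -0.5
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def band_energies(signal, levels=3):
    """Decompose recursively; return the energy of each detail band."""
    energies = []
    current = list(signal)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        energies.append(sum(c * c for c in detail))
    return energies
```

A constant signal yields zero detail energy at every level, while sharp transients concentrate energy in the finest band; classifiers such as an SVM would then operate on these energy features.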
Web applications are challenged to develop methods and techniques for processing large data at optimum response time, and there are technical challenges in handling the ever-growing traffic on these websites. As the number of users increases, web servers face several problems such as bottlenecks, delayed response time, load imbalance and density of services. The whole traffic cannot reside on a single server, so there is a fundamental requirement to allocate this huge traffic across multiple load-balanced servers. Distributing requests among the servers in a web server cluster is the most important means of addressing this challenge, especially under intense workloads. In this paper, we propose a new request distribution algorithm for load balancing among web server clusters. Dynamic load balancing among the web servers takes place based on the user's request, dynamically estimating server workload from multiple parameters such as processing and memory requirements, expected execution time and various time intervals. Our simulation results show that the proposed method dynamically and efficiently balances the load to scale up the services, and we calculate average response time, average waiting time and server throughput on different web servers. At the end of the paper, we present an experimental run of the proposed system which shows that the proposed algorithm is efficient in terms of processing speed, response time, server utilization and cost efficiency.[...]
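The multi-parameter workload estimate described above can be sketched as a weighted score per server, with the request dispatched to the lowest-scoring one. The weights and field names below are illustrative assumptions, not the paper's values.

```python
# Hypothetical sketch of dynamic load balancing: combine several load
# parameters into one score and pick the least-loaded server.
def load_score(server, w_cpu=0.4, w_mem=0.3, w_time=0.3):
    """Weighted load estimate; lower means more spare capacity."""
    return (w_cpu * server["cpu_util"]
            + w_mem * server["mem_util"]
            + w_time * server["expected_exec_time"] / server["max_exec_time"])

def pick_server(servers):
    """Dispatch target: the server with the lowest estimated load."""
    return min(servers, key=load_score)

servers = [
    {"name": "a", "cpu_util": 0.9, "mem_util": 0.8,
     "expected_exec_time": 2.0, "max_exec_time": 4.0},
    {"name": "b", "cpu_util": 0.2, "mem_util": 0.3,
     "expected_exec_time": 1.0, "max_exec_time": 4.0},
]
```

Because the scores are recomputed from current utilization on every request, the dispatcher adapts as server workloads change, which is the "dynamic" aspect the abstract emphasizes.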
In the current scenario a huge amount of data is published on the web, and because it comes from various sources, that data is heterogeneous in nature. Data extraction is one of the major tasks in data mining. Various techniques for data extraction have been proposed in the past, such as Collaborative Adaptive Data Sharing (CADS) and pay-as-you-go. The drawback of these techniques is that they cannot provide a global solution for the user: to get accurate search results, the user needs to know all the details of whatever he wants to search. In this paper we propose a new searching technique, the Enhanced Collaborative Adaptive Data Sharing Platform (ECADS), in which predefined queries are provided to the user to search data. In this technique, keywords related to the domain are provided to the user for efficient data extraction; these keywords help the user write proper queries and search data efficiently. In this way it provides an accurate, time-efficient and global search technique. A comparative analysis of the existing and proposed techniques is presented in the results and analysis section, which shows that the proposed technique performs better than the existing technique.[...]
The past few decades have witnessed an information big bang in the form of the World Wide Web, leading to a gigantic repository of heterogeneous data. A humble journey that started with a network connection between a few computers in the ARPANET project has reached a level wherein almost all the computers and other communication devices of the world have joined together to form a huge global information network that makes available most of the information related to every possible heterogeneous domain. Not only is the managing and indexing of this repository a big concern, but providing quick answers to users' queries is also of critical importance. Amazingly, rather miraculously, the task is being done quite efficiently by current web search engines. This miracle has been possible due to a series of mathematical and technological innovations continuously being carried out in the area of search techniques. This paper gives an overview of search engine evolution from the primitive to the present.[...]
Two main revolutions in data management have occurred recently, namely Big Data analytics and NoSQL databases. Even though they have evolved with different purposes, their independent developments complement each other, and their convergence would benefit businesses tremendously in making real-time decisions using volumes of complex data sets that could be both structured and unstructured. While on one hand many software solutions have emerged to support Big Data analytics, on the other, many NoSQL database packages have arrived in the market. However, they lack an independent benchmarking and comparative evaluation. The aim of this paper is to provide an understanding of their contexts and an in-depth study comparing the features of the four main NoSQL data models that have evolved. The performance comparison of traditional SQL with NoSQL databases for Big Data analytics shows that NoSQL databases are the better option for business situations that require simplicity, adaptability, high-performance analytics and distributed scalability over large data. This paper concludes that the NoSQL movement should be leveraged for Big Data analytics and will coexist with relational (SQL) databases.[...]
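The schema flexibility that distinguishes the NoSQL document model from relational rows can be illustrated with a toy in-memory collection; the class and its API are an illustrative assumption, not any particular NoSQL product.

```python
# Minimal sketch of the document (NoSQL) data model: records in one
# collection need not share a schema, unlike rows of a SQL table.
class Collection:
    def __init__(self):
        self.docs = []

    def insert(self, doc):
        """Store any dict-shaped document; no fixed schema is enforced."""
        self.docs.append(doc)

    def find(self, **criteria):
        """Return documents whose fields match all given criteria."""
        return [d for d in self.docs
                if all(d.get(k) == v for k, v in criteria.items())]
```

Two documents with entirely different fields can live side by side, which is what makes the model adaptable to the unstructured data the abstract mentions; a SQL table would require a schema migration first.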
Nowadays the trend of the web is to become a collection of services that interoperate through the Internet. The first step towards this interoperation is finding services that meet the requester's requirements, which is called service discovery. Service discovery matches the functional and non-functional properties of the requester with those of the provider. In this paper, an enhanced matching algorithm for Web Service Security Policy (WS-SP) is proposed to perform requirement-capability matchmaking between a consumer and a provider. A web service security policy specifies the security requirements or capabilities of a web service participant (a provider or a consumer); a participant's security requirement or capability is one of the non-functional properties of a web service. The security aspects addressed in this paper are the integrity and confidentiality of web service SOAP messages transmitted between participants. The enhanced matching algorithm covers both the simple-policy and complex-policy cases of web service security as a non-functional attribute. A generalized matching algorithm is introduced to select the best-matched web service provider from a list of available providers to serve the consumer.[...]
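The simple-policy case of requirement-capability matchmaking can be sketched as set containment: a provider matches when it offers every security assertion the consumer requires. The assertion names, provider names, and tie-breaking rule below are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative sketch of simple-policy matchmaking: each participant's
# policy is a set of security assertions, e.g. {"integrity", "confidentiality"}.
def matches(consumer_reqs, provider_caps):
    """A provider matches when its capabilities cover all requirements."""
    return consumer_reqs <= provider_caps

def best_match(consumer_reqs, providers):
    """Among matching providers, prefer the closest match: the one with
    the fewest capabilities beyond what the consumer requires."""
    candidates = [(name, caps) for name, caps in providers.items()
                  if matches(consumer_reqs, caps)]
    if not candidates:
        return None
    return min(candidates, key=lambda nc: len(nc[1] - consumer_reqs))[0]

providers = {
    "p1": {"integrity"},
    "p2": {"integrity", "confidentiality"},
    "p3": {"integrity", "confidentiality", "authentication"},
}
```

A complex policy (nested alternatives of assertions) would require matching each alternative separately, which is where the abstract's enhanced algorithm goes beyond this containment check.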
Blur detection in a partially blurred image is challenging because the blur varies spatially. In this paper, we propose a blurred-image detection framework for automatically detecting blurred and non-blurred regions of an image. We propose a new feature vector that contains information about an image patch as well as the blur kernel, hence its name: the kernel-specific feature vector. The information extracted from an image patch is based on blurred-pixel behavior captured by the local power spectrum slope, gradient histogram span, and maximum saturation methods. To make the feature vector useful for real applications, kernels comprising motion-blur kernels, defocus-blur kernels, and their combinations are used. Gaussian filters are used to filter the extracted features and kernels. Construction of the kernel-specific feature vector is followed by the proposed Naïve Bayes classifier based on the Nearest Neighbor classification method (NBNN). The proposed algorithm outperforms state-of-the-art blur detection methods. Because blur detection is an initial step in the de-blurring of partially blurred images, our results also demonstrate the effectiveness of the proposed method in the deblurring process.[...]
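One of the cues named above, the gradient histogram span, can be illustrated with a toy computation: blur attenuates strong edges, so the range of gradient magnitudes in a blurred patch shrinks. The patch values and the threshold are illustrative assumptions, not the paper's parameters.

```python
# Toy sketch of the gradient-histogram-span cue for blur detection.
def gradient_span(patch):
    """Range of horizontal finite-difference gradient magnitudes
    over a 2-D patch given as a list of rows."""
    grads = [abs(row[i + 1] - row[i])
             for row in patch for i in range(len(row) - 1)]
    return max(grads) - min(grads)

def looks_blurred(patch, threshold=0.2):
    """A narrow gradient span suggests the patch is blurred."""
    return gradient_span(patch) < threshold
```

A sharp step edge mixes near-zero and large gradients (wide span), while a smooth ramp produces uniformly small gradients (narrow span); the full framework combines this with the power spectrum slope and maximum saturation before the NBNN step.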
Wireless Body Area Networks (WBANs) have attracted significant research interest in various applications due to their self-automation and advanced sensor technology. The most severe issue in WBAN is sustaining its Quality of Service (QoS) in dynamically changing environments such as healthcare and patient monitoring systems. Another critical issue in WBAN is handling heterogeneous packets in such a resource-constrained network. In this paper, a new hybrid classifier combining a Binary Decision Tree with Support Vector Machine classifiers is proposed to tackle these challenges. The proposed hybrid classifier decomposes the N-class classification problem into N-1 sub-problems, each separating a pair of sub-classes. The protocol dynamically updates the priority of packets and nodes, and adjusts the data rate, packet transmission order and timing, and resource distribution among the nodes based on node priority. The proposed protocol is implemented and simulated using the NS-2 network simulator. The results show that the new protocol outperforms in a dynamic environment, yielding better performance by leveraging the advantages of both the Binary Decision Tree (efficient computation) and the Support Vector Machine (high classification accuracy). The hybrid classifier significantly reduces the loss ratio and delay and increases the packet delivery ratio and throughput.[...]
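The N-1 decomposition described above can be sketched as a binary tree whose internal nodes each hold a two-way classifier (an SVM in the paper; a threshold stub here). The class names and thresholds are illustrative assumptions: three classes need N-1 = 2 internal nodes.

```python
# Schematic sketch of a binary decision tree of two-way classifiers.
class Node:
    def __init__(self, classes, splitter=None, left=None, right=None):
        self.classes = classes    # classes reachable below this node
        self.splitter = splitter  # callable x -> True (left) / False (right)
        self.left = left
        self.right = right

    def predict(self, x):
        if self.splitter is None:        # leaf: a single class remains
            return self.classes[0]
        branch = self.left if self.splitter(x) else self.right
        return branch.predict(x)

# Toy 3-class tree over a scalar feature: each internal node separates
# a pair of class groups, as in the N-1 decomposition described above.
tree = Node(["low", "mid", "high"],
            splitter=lambda x: x < 0.66,
            left=Node(["low", "mid"],
                      splitter=lambda x: x < 0.33,
                      left=Node(["low"]),
                      right=Node(["mid"])),
            right=Node(["high"]))
```

Classification follows a single root-to-leaf path, so at most N-1 classifier evaluations are needed, which is the computational-efficiency advantage the abstract attributes to the tree structure.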