IJIEEB Vol. 8, No. 5, Sep. 2016
To cope with the pace of digitalization worldwide, developing countries, like developed ones, are offering services to their citizens through online portals, web applications, and websites. Unfortunately, due to a lack of attention to vulnerability issues during the development phase, many of these web-based services suffer from serious security threats. For developing countries, vulnerability statistics are needed to gain insight into the current security status of the provided web services; such statistical data can help stakeholders take appropriate action against cyberattacks. In this work, we conduct a survey to observe the responses of web-based services to the four most commonly found web attacks: Man-in-the-Middle, SQL Injection, Cross-Site Scripting, and Denial of Service. We carry out the survey on 30 websites (applications) of Bangladesh, as the country has been focusing on digitalization of government services for the last few years and already offers various online services to its citizens. Among the 30 websites across several categories, the results show that approximately 77% of the sites are vulnerable to Man-in-the-Middle attacks, whereas 3% are vulnerable to SQL Injection and Cross-Site Scripting.
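The abstract does not state the survey's per-attack test procedure. As a hedged illustration only, the sketch below shows one plausible way a Man-in-the-Middle exposure figure could be tallied: a site is flagged when a plain-HTTP fetch is neither upgraded to HTTPS nor accompanied by an HSTS header. The criterion, function names, and example hosts are illustrative assumptions, not the authors' methodology.

```python
# Hedged sketch: flag a site as MITM-susceptible if, after fetching its
# plain-HTTP front page, the final URL is still http:// AND no
# Strict-Transport-Security (HSTS) header was sent. Illustrative criterion
# only -- not the survey's actual test procedure.
def classify_mitm_risk(final_url: str, headers: dict) -> bool:
    """True if the response suggests no protection against HTTP interception."""
    upgraded = final_url.startswith("https://")
    sends_hsts = "Strict-Transport-Security" in headers
    return not (upgraded or sends_hsts)

def vulnerability_rate(observations: list) -> float:
    """Fraction of surveyed sites flagged as susceptible."""
    flagged = sum(classify_mitm_risk(url, hdrs) for url, hdrs in observations)
    return flagged / len(observations)

# Two hypothetical observations: one unprotected, one with HTTPS + HSTS.
rate = vulnerability_rate([
    ("http://a.example/", {}),
    ("https://b.example/", {"Strict-Transport-Security": "max-age=31536000"}),
])
```

A survey over 30 sites would feed 30 such (final URL, response headers) observations into `vulnerability_rate` to obtain a percentage like the 77% reported.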
The conventional voting scheme employs paper-based ballots to verify votes. This scheme is insecure owing to well-known shortcomings, including ballot stuffing, ballot snatching, and voter impersonation. In this paper, we present the design and development of a secure e-voting system to ensure a free, fair, and credible election in which the preference of the electorate counts. The proposed system addresses the authentication, integrity, and confidentiality issues of e-voting in kiosk and poll-site scenarios using unimodal fingerprint biometrics and an Advanced Encryption Standard (AES)-based wavelet crypto-watermarking approach. The developed system mitigates the possibility of flawed voter authentication and protects the integrity and confidentiality of votes stored on the server. Qualitative evaluation of the system with anti-watermarking detectors revealed that the developed secure e-voting system could serve as a platform for delivering credible e-elections in developing countries with significant digital divides.
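The paper's full AES-plus-wavelet crypto-watermarking pipeline is beyond a short sketch. As a simplified stand-in, the snippet below illustrates only the integrity goal using a standard-library HMAC tag on each stored ballot, so server-side tampering becomes detectable; confidentiality would additionally require encrypting the ballot (e.g. with AES) before storage. All names and the key value are hypothetical.

```python
# Simplified stand-in for the paper's crypto-watermarking: each stored vote
# carries an HMAC-SHA256 tag so tampering is detectable. This demonstrates
# the integrity property only; confidentiality would need AES encryption of
# the payload before storage, which is not shown here.
import hmac
import hashlib
import json

def seal_vote(ballot: dict, key: bytes) -> dict:
    """Serialize a ballot deterministically and attach an HMAC-SHA256 tag."""
    payload = json.dumps(ballot, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def verify_vote(record: dict, key: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(key, record["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["tag"])

# Hypothetical usage: seal a vote under a server-held secret key.
record = seal_vote({"voter_id": "V-001", "choice": "A"}, key=b"server-secret")
```

Any modification of `record["payload"]` after sealing makes `verify_vote` return False, which is the property the stored-vote integrity requirement demands.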
In this digital world, research interest in e-governance is increasing, yet existing research does not adequately address key issues regarding the development, implementation, and integration of e-governance projects. Delivering e-services to citizens at their doorstep is a primary function of government. The state of Maharashtra is a pioneer in citizen-centric e-governance initiatives such as the Common Service Centre (Maha e-Seva, SETU), the Public Distribution System (PDS), and Land Records (Bhoomi Abhilekh). The present research seeks to identify and establish linkages between the factors responsible for creating a suitable environment for effective implementation of e-governance services in government offices. The objective of this paper is to establish the background for the development of a conceptual framework for e-governance initiatives. Based on a review of the existing literature, the paper analyzes empirical findings and conceptual perspectives related to e-governance initiatives in Satara District, Maharashtra, India.
In this correspondence, we present the application of iterative shrinkage (IS) operators to the DOA (direction-of-arrival) estimation task. In particular, we focus on the Stagewise Orthogonal Matching Pursuit (StOMP) algorithm. We compare StOMP against MUSIC, which is the state of the art in DOA estimation. StOMP belongs to the compressive sensing regime, whereas MUSIC is a parametric technique based on subspace processing. To the best of our knowledge, IS operators have not previously been analyzed for DOA estimation. The comparison is performed using extensive numerical simulations.
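For context on the MUSIC baseline the correspondence compares against, here is a minimal MUSIC pseudospectrum for a uniform linear array with half-wavelength spacing. The array size, snapshot count, noise level, and source angle below are illustrative choices, not parameters from the paper.

```python
# Minimal MUSIC sketch for a uniform linear array (half-wavelength spacing).
# Pseudospectrum P(theta) = 1 / ||E_n^H a(theta)||^2, where E_n spans the
# noise subspace of the sample covariance. Illustrative parameters only.
import numpy as np

def music_spectrum(X, n_sources, angles_deg):
    """MUSIC pseudospectrum from snapshot matrix X (sensors x snapshots)."""
    m = X.shape[0]
    R = X @ X.conj().T / X.shape[1]                  # sample covariance
    _, vecs = np.linalg.eigh(R)                      # eigenvalues ascending
    En = vecs[:, : m - n_sources]                    # noise-subspace eigenvectors
    n = np.arange(m)
    p = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(-1j * np.pi * n * np.sin(th))     # ULA steering vector, d = lambda/2
        p.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return np.array(p)

# Simulate one source at 20 degrees, 8 sensors, 200 snapshots, light noise.
rng = np.random.default_rng(0)
m, snapshots, theta = 8, 200, 20.0
a = np.exp(-1j * np.pi * np.arange(m) * np.sin(np.deg2rad(theta)))
s = rng.standard_normal(snapshots) + 1j * rng.standard_normal(snapshots)
noise = 0.1 * (rng.standard_normal((m, snapshots)) + 1j * rng.standard_normal((m, snapshots)))
X = np.outer(a, s) + noise
grid = np.arange(-90.0, 90.5, 0.5)
est = grid[np.argmax(music_spectrum(X, 1, grid))]    # peak near 20 degrees
```

A compressive-sensing method such as StOMP instead recovers a sparse angle-domain vector from the same snapshots using a dictionary of steering vectors over the grid; the pseudospectrum above is the subspace-processing counterpart it is benchmarked against.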
In recent times, with the rapid development of Web 2.0, the number of online user-generated product reviews has increased very rapidly. It is difficult for a user to read all reviews across many websites to make an informed decision at the feature level. Feature-level opinion mining becomes infeasible when people describe the same feature with different words or phrases, so to produce a relevant feature-based summary, domain synonym words and phrases need to be grouped into the same feature group. In this work, we focus on feature-based opinion mining and propose a dynamic system that generates a feature-based summary for a specific feature with a specific opinion polarity according to customer demand on a periodic basis, updating the summary after each period. First, we present a method for extracting features (frequent and infrequent) using a probabilistic approach at the word level. Second, we identify the corresponding opinion words and form feature-opinion pairs. Third, we design an algorithm for final polarity detection of opinions. Finally, each feature-opinion pair is assigned to the respective feature-based cluster (positive, negative, or neutral) to generate the summary of a specific feature with a specific opinion on a periodic basis, which is helpful for users. Experimental results show that our approach achieves 96% accuracy in feature extraction and 92% accuracy in final polarity detection of feature-opinion pairs in the feature-based summary generation task.
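The paper's probabilistic feature extractor and polarity algorithm are not reproduced here. As a simplified stand-in that shows the pipeline's shape, the sketch below finds (feature, opinion) pairs using a tiny hand-made lexicon and adjacency, assigns a polarity, and buckets each pair into a positive/negative/neutral cluster per feature; the feature set and lexicon are illustrative assumptions.

```python
# Simplified stand-in for the paper's pipeline: extract feature-opinion pairs
# by adjacency against a tiny hand-made lexicon, detect polarity, and cluster
# pairs per feature into positive/negative/neutral buckets. The real system
# uses probabilistic word-level extraction, which is not reproduced here.
from collections import defaultdict

FEATURES = {"battery", "screen", "camera"}                # illustrative feature set
LEXICON = {"great": 1, "good": 1, "poor": -1, "bad": -1, "average": 0}

def summarize(reviews):
    """Cluster opinion words per feature by polarity."""
    clusters = defaultdict(lambda: {"positive": [], "negative": [], "neutral": []})
    for review in reviews:
        words = review.lower().replace(".", "").split()
        for i, w in enumerate(words):
            if w in FEATURES:
                # pair the feature with an opinion word immediately beside it
                for j in (i - 1, i + 1):
                    if 0 <= j < len(words) and words[j] in LEXICON:
                        pol = LEXICON[words[j]]
                        bucket = ("positive" if pol > 0
                                  else "negative" if pol < 0 else "neutral")
                        clusters[w][bucket].append(words[j])
    return dict(clusters)

summary = summarize(["great battery but poor camera", "average screen today"])
```

A periodic summary as described in the abstract would rerun `summarize` over each period's reviews and report, for a requested feature and polarity, the corresponding bucket.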
Software quality is an important topic in software development, and it is always challenging to deliver high-quality software. The major challenges in completing software are time and cost without losing quality. Software quality has a significant impact on software performance, and the acceptability, success, or failure of software depends on its level of quality and number of defects. Software defects are one of the fundamental factors that determine the time of software delivery, and defects or errors need to be eliminated before delivery. Software companies spend heavily to reduce code defects; the aim is to detect defects early with cheaper methods. This paper proposes a code quality scanner to decrease code defects. The proposed solution is a combination of a code scanner and code review. The paper also presents a quantitative analysis showing the effectiveness of the proposed solution, and the results are encouraging.
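The abstract does not list the scanner's rules. To illustrate the kind of cheap, early check such a scanner automates before human code review, here is a hedged sketch with two illustrative rules on Python source: functions exceeding a line budget, and functions missing a docstring. Both rules and the budget are assumptions, not the paper's rule set.

```python
# Hedged sketch of a code quality scanner: two illustrative static checks
# implemented with the standard-library `ast` module. The paper's actual
# rule set is not specified in the abstract.
import ast

def scan(source: str, max_lines: int = 30):
    """Return a list of (line number, message) findings for the given source."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            length = node.end_lineno - node.lineno + 1
            if length > max_lines:
                findings.append((node.lineno, f"{node.name}: {length} lines exceeds budget"))
            if ast.get_docstring(node) is None:
                findings.append((node.lineno, f"{node.name}: missing docstring"))
    return findings

issues = scan("def f(x):\n    return x + 1\n")
```

In the combined workflow the abstract describes, findings like these would be cleared automatically first, letting human code review concentrate on defects a scanner cannot catch.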
This work focuses on the stationary behavior of a document processing system. This problem can be handled using workflow models, and the techniques used in workflow modeling rely heavily on constrained Petri nets. When using a document processing system, one wishes to know how the system behaves when a new document enters, in order to give precise support to the manager's decisions; this requires a good analysis of the system's performance. According to many authors, however, stochastic models, specifically waiting lines, should be used instead of Petri nets at a strategic level for such analysis. The need for a new model comes from our wish to provide tools for a decision maker to carry out accurate performance analysis of a document processing system. In this paper, a model for document management systems in an organization is studied. The model has a static and a dynamic component. The static component is a graph that represents transitions between processing units. The dynamic component is composed of a Markov process and a network of queues that model the set of waiting lines at each processing unit. Key performance indicators are defined and studied both point-wise and on average. Formulas are given for some example models.
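Each processing unit in such a model is a waiting line, and for a single M/M/1 station the classical steady-state indicators have closed forms (utilization ρ = λ/μ, mean number in system L = ρ/(1−ρ), mean sojourn time W = 1/(μ−λ)). The sketch below computes these; the M/M/1 choice and the rates are illustrative, not the paper's specific model or data.

```python
# Illustrative steady-state KPIs for one M/M/1 waiting line (one processing
# unit). The paper studies a network of queues; this sketch shows only the
# single-station closed forms, with made-up rates.
def mm1_kpis(lam: float, mu: float) -> dict:
    """KPIs of an M/M/1 queue with arrival rate lam and service rate mu."""
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = lam / mu                     # utilization
    L = rho / (1 - rho)                # mean number of documents in the system
    W = 1 / (mu - lam)                 # mean time in system (L = lam * W, Little's law)
    Wq = rho / (mu - lam)              # mean waiting time before service
    return {"utilization": rho, "L": L, "W": W, "Wq": Wq}

# Hypothetical unit: 2 documents/hour arrive, 5 documents/hour can be served.
kpis = mm1_kpis(lam=2.0, mu=5.0)
```

Chaining such stations along the static transition graph, with routing probabilities splitting the flows, gives the network-level averages the paper's formulas address.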
As the World Wide Web continues to grow rapidly in size and popularity, web traffic and network bottlenecks have become important issues in the networked world. The continued growth in demand for items on the Web causes severe overloading of many sites, network congestion, increased perceived latency, and network bottlenecks. Many users have no patience to wait more than a few seconds for a web page to download, which is why a web traffic reduction system is essential for accessing websites efficiently over existing networks. Web caching is an effective method for improving Web performance, but caching alone is no longer sufficient, because the Web has quickly grown from a simple information-sharing mechanism into a rich collection of dynamic objects and multimedia data. Web prefetching is used to improve the performance of the proxy server: it predicts web objects that are expected to be requested in the near future and stores them in advance, thus reducing the response time of user requests. To improve proxy server performance, this paper proposes a new framework that combines a caching system with a prefetching technique and optimizes prefetching with the help of probability. For our experiments, we use a dataset collected from the ircache.net proxy server and compare the results against other prefetching techniques.
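The abstract does not specify the probability model used to drive prefetching. As a hedged sketch of the combined cache-plus-prefetch idea, the class below pairs an LRU cache with a first-order transition count: after serving page A, it prefetches the page most often requested right after A in past traffic, if that page's observed conditional probability clears a threshold. The capacity, threshold, and class name are illustrative assumptions.

```python
# Hedged sketch: LRU cache combined with probability-driven prefetching.
# Transition counts between consecutive requests approximate the conditional
# probability P(next | current); the likeliest successor is fetched early
# when that probability clears a threshold. Not the paper's exact model.
from collections import OrderedDict, defaultdict

class PrefetchingCache:
    def __init__(self, capacity=100, threshold=0.5):
        self.cache = OrderedDict()                    # LRU order: oldest first
        self.counts = defaultdict(lambda: defaultdict(int))
        self.capacity, self.threshold = capacity, threshold
        self.last = None

    def _store(self, url):
        self.cache[url] = True
        self.cache.move_to_end(url)
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)            # evict least recently used

    def request(self, url):
        """Serve one request; returns True on a cache hit."""
        hit = url in self.cache
        if self.last is not None:
            self.counts[self.last][url] += 1          # learn the transition
        self._store(url)
        self._maybe_prefetch(url)
        self.last = url
        return hit

    def _maybe_prefetch(self, url):
        successors = self.counts[url]
        if successors:
            best = max(successors, key=successors.get)
            if successors[best] / sum(successors.values()) >= self.threshold:
                self._store(best)                     # fetch ahead of the request

c = PrefetchingCache()
hits = [c.request(u) for u in ["a", "b", "a", "b"]]   # later repeats hit the cache
```

In an evaluation like the paper's, a proxy trace (e.g. from ircache.net) would be replayed through such a cache and the hit ratio and perceived latency compared with and without the prefetcher.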