IJMECS Vol. 6, No. 8, Aug. 2014
Cover page and Table of Contents: PDF (size: 636KB)
Nowadays, hundreds of millions of people of almost all levels of education, from different countries, communicate with each other and perform their jobs over the internet and other communication media using various languages. Since no one knows every language, it is very difficult to communicate or work across languages. To address this, computer scientists have introduced various interlingual translation programs (machine translation), and UNL is one such interlingua. One major problem in UNL is identifying a name in a sentence. This is relatively simple in English, because such entities start with a capital letter, but Bangla has no concept of lowercase or uppercase letters, so it is difficult to determine whether a word is a proper noun. Here we propose analysis rules to identify proper nouns in a sentence and build a post-converter that translates named entities from Bangla to UNL. The goal is to enable conversion of Bangla sentences to UNL and vice versa. Theoretical analysis shows that our proposed system is able to identify proper nouns in Bangla sentences and produce the corresponding Universal Words for UNL.[...] Read more.
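The abstract's core observation is that Bangla lacks the capitalization cue English named-entity rules rely on, so lexicon- and suffix-based rules are needed instead. The sketch below illustrates that idea only; the lexicon and case-marker suffixes are hypothetical placeholders, not the paper's actual analysis rules.

```python
# Minimal sketch of rule-based proper-noun detection for Bangla,
# where capitalization cues are unavailable. The lexicon and suffix
# list below are illustrative placeholders, not the paper's rules.

COMMON_WORDS = {"আমি", "তুমি", "বই", "পড়ে"}   # hypothetical common-word lexicon
NAME_SUFFIXES = ("কে", "ের", "র")             # hypothetical case markers attached to names

def strip_case_marker(word):
    """Remove a trailing case marker, if one is present."""
    for suf in NAME_SUFFIXES:
        if word.endswith(suf) and len(word) > len(suf) + 1:
            return word[: -len(suf)]
    return word

def is_proper_noun_candidate(word):
    """Flag a word as a proper-noun candidate when neither it nor its
    case-marker-stripped stem appears in the common-word lexicon."""
    stem = strip_case_marker(word)
    return stem not in COMMON_WORDS and word not in COMMON_WORDS

sentence = ["রহিম", "বই", "পড়ে"]   # "Rahim reads a book"
candidates = [w for w in sentence if is_proper_noun_candidate(w)]
```

A real system would combine such lexicon checks with context rules (e.g. words preceding verbs of saying), as purely lexical filtering over-generates candidates.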
Current educational processes need various tools and technological support to attain the required level of knowledge. Learning processes have been simplified with the help of different resources, including internet resources, whose usage depends on the learner's requirements in the field of study. This research identified significant impacts of learners' specialization on internet resource usage, and identified specializations that have a major influence on the use of such resources in learning. The study was conducted among Omani undergraduate students in selected specializations. The impact of specialization on the frequency of internet resource use, the places of searching, and the purposes of use in learning were analyzed using conditional probabilities, and the impacts were identified with the help of a decision tree diagram. The results showed that students in the Information Technology specialization made greater use of internet resources in their learning processes than other undergraduates.[...] Read more.
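The conditional-probability analysis mentioned above can be sketched on a toy sample; the responses below are invented for illustration, not the study's survey data.

```python
# Sketch of the conditional-probability analysis: estimate
# P(daily internet use | specialization) from observed counts.
# The (specialization, uses_daily) pairs are illustrative only.

responses = [
    ("IT", True), ("IT", True), ("IT", True), ("IT", False),
    ("Business", True), ("Business", False), ("Business", False),
    ("English", True), ("English", False),
]

def p_daily_given(spec):
    """Conditional probability of daily use given a specialization."""
    subset = [daily for s, daily in responses if s == spec]
    return sum(subset) / len(subset)

print(p_daily_given("IT"))        # 3 of 4 IT students report daily use
```

A decision-tree learner would effectively compare such conditionals across attributes, splitting first on the attribute that best separates the usage outcomes.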
The Bat Algorithm is a recently developed method in the field of computational intelligence. This paper presents an improved version of the bat meta-heuristic algorithm, IBACH, for solving integer programming problems. The proposed algorithm uses chaotic behaviour to generate candidate solutions, in a manner similar to acoustic monophony. Numerical results show that IBACH obtains optimal results in comparison with traditional methods (branch and bound), the particle swarm optimization algorithm (PSO), the standard bat algorithm, and harmony search algorithms. The benefit of the proposed algorithm lies in its ability to obtain the optimal solution with less computation, saving time compared with the branch-and-bound algorithm (an exact solution method).[...] Read more.
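The chaos-driven bat update can be sketched as below: a logistic chaotic map replaces the uniform frequency draw, and candidates are rounded so the search stays on integers. The objective, parameters, and local-search step are illustrative, not those of IBACH.

```python
# Minimal sketch of a chaotic bat step for integer programming.
# A logistic map (r = 4) supplies the pulse frequency instead of a
# uniform random draw; positions are rounded to remain integral.
import random

def objective(x):                        # toy integer objective: minimize (x - 7)^2
    return (x - 7) ** 2

def chaotic_bat_minimize(n_bats=10, n_iter=100, seed=1):
    random.seed(seed)
    positions = [random.randint(-50, 50) for _ in range(n_bats)]
    velocities = [0.0] * n_bats
    best = min(positions, key=objective)
    chaos = 0.7                          # logistic-map state in (0, 1)
    for _ in range(n_iter):
        for i in range(n_bats):
            chaos = 4.0 * chaos * (1.0 - chaos)   # chaotic frequency draw
            velocities[i] += (best - positions[i]) * chaos
            candidate = round(positions[i] + velocities[i])
            if random.random() < 0.5:             # local search near the best bat
                candidate = best + random.choice([-1, 0, 1])
            if objective(candidate) <= objective(positions[i]):  # greedy acceptance
                positions[i] = candidate
                if objective(candidate) < objective(best):
                    best = candidate
    return best
```

The chaotic map gives a deterministic but non-repeating frequency sequence, which is the mechanism the abstract credits for better coverage of the search space.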
Significant research into the logarithmic analysis of complex networks has yielded solutions that help minimize virus spread and propagation over networks. Virus propagation has been a recurring subject, and the design of complex models yields modeling solutions applicable to events including, but not limited to, propagation, dataflow, network immunization, resource management, service distribution, and the adoption of viral marketing. Stochastic models are successfully used to predict virus propagation processes and their effects on networks. The study employs SI models for independent cascades and dynamic models with the Enron dataset (of e-mail addresses), and presents comparative results using varied machine models. The study samples 25,000 emails from the Enron dataset, computing entropy and information gain to address the issues of blocking targets and the extent of virus spread on graphs. It addresses the problems of expected-spread immunization and expected epidemic-spread minimization, but not the epidemic threshold (for space constraints).[...] Read more.
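The SI independent-cascade process the study runs over the Enron graph can be sketched on a toy contact graph; the edges and infection probability below are illustrative, not derived from the dataset.

```python
# Sketch of an SI (susceptible-infected) independent cascade: each
# newly infected node gets one chance to infect each susceptible
# neighbour with probability p_infect. Toy graph, not the Enron data.
import random

edges = {                     # adjacency list of a toy e-mail contact graph
    "a": ["b", "c"],
    "b": ["d"],
    "c": ["d"],
    "d": [],
}

def simulate_si(seed_node, p_infect=1.0, rng_seed=0):
    """Return the set of nodes eventually infected from seed_node."""
    random.seed(rng_seed)
    infected = {seed_node}
    frontier = [seed_node]
    while frontier:
        nxt = []
        for node in frontier:
            for nb in edges[node]:
                if nb not in infected and random.random() <= p_infect:
                    infected.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return infected
```

Averaging the cascade size over many runs estimates the expected spread, which is the quantity the immunization and minimization problems above optimize over.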
The popularity and demand of image processing is increasing due to its immense number of applications in various fields, most of them related to biometrics: face recognition, fingerprint recognition, iris scanning, and speech recognition. Among these, face detection is a very powerful tool for video surveillance, human-computer interfaces, face recognition, and image database management, and a large number of works address this subject. Face recognition is a rapidly evolving technology that has been widely used in forensics, such as criminal identification, secured access, and prison security. In this paper we survey the technical literature of this field and list the different techniques, such as linear discriminant analysis, Viola-Jones classification with AdaBoost learning, and curvature analysis, discussing their advantages and disadvantages. We also describe some detection and recognition algorithms, mention application domains, and outline the challenges in this field. We propose a classification of detection techniques and discuss the recognition methods as well.[...] Read more.
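One technique the survey covers, Viola-Jones detection, rests on the integral-image trick: any rectangular pixel sum (the building block of Haar-like features) costs only four table lookups. The sketch below shows that trick on a tiny invented image.

```python
# Integral-image sketch behind Viola-Jones Haar features: ii[y][x]
# holds the sum of all pixels above and to the left of (x, y), so any
# rectangle sum needs just four lookups.

def integral_image(img):
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the rectangle with top-left (x, y), size w by h."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

img = [[1, 2],
       [3, 4]]
ii = integral_image(img)
```

A Haar feature is then the difference of two or three such rectangle sums, and AdaBoost selects the most discriminative features as weak classifiers.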
Reuse refers to the common principle of using existing resources repeatedly, which is pervasively applicable everywhere. In software engineering, reuse refers to the development of software systems using already available artifacts or assets, partially or completely, with or without modification. Software reuse not only promises significant improvements in productivity and quality but also provides for the development of more reliable, cost-effective, dependable and less buggy software (considering that prior use and testing have removed errors) with reduced time and effort. In this paper we present an efficient and reliable automation model for reusability evaluation of procedure-based object-oriented software, predicting the reusability levels of components as low, medium or high. The presented model follows a reusability metric framework that targets the requisite reusability attributes, including maintainability (using the Maintainability Index), for functional analysis of the components. Further, a multilayer perceptron neural network (using back propagation) is applied to establish significant relationships among these attributes for reusability prediction. The proposed approach provides support for reusability evaluation at the functional level rather than at the structural level. Automation support for this approach is provided in the form of a tool named JRA2M2 (Java-based Reusability Assessment Automation Model using Multilayer Perceptron (MLP)), implemented in Java. The performance of JRA2M2 is recorded using parameters like accuracy, classification error, precision and recall. The results generated using JRA2M2 indicate that the proposed automation tool can be effectively used as a reliable and efficient solution for automated evaluation of reusability.[...] Read more.
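The Maintainability Index mentioned above is commonly computed from Halstead volume, cyclomatic complexity, and lines of code; the sketch below uses the classic Oman-Hagemeister variant, and the low/medium/high banding is an illustrative stand-in for the paper's trained MLP classifier, not its actual thresholds.

```python
# Sketch of the Maintainability Index (classic variant):
#   MI = 171 - 5.2*ln(HV) - 0.23*CC - 16.2*ln(LOC)
# The reusability banding below is hypothetical, for illustration.
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, loc):
    return (171
            - 5.2 * math.log(halstead_volume)
            - 0.23 * cyclomatic_complexity
            - 16.2 * math.log(loc))

def reusability_band(mi):
    """Illustrative low/medium/high banding of the MI score."""
    if mi >= 85:
        return "high"
    if mi >= 65:
        return "medium"
    return "low"

mi = maintainability_index(halstead_volume=1000, cyclomatic_complexity=5, loc=100)
```

In the paper's pipeline such attribute values become inputs to the MLP, which learns the banding from labeled components rather than from fixed cutoffs.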
The WWW is growing at a very fast rate, and the data and information present on the web change very frequently. Because the web is so dynamic, it becomes very difficult to obtain related and fresh information. In this paper we design and develop a web crawler that uses multiple HTTP connections to crawl the web, with multiple threads implementing the multiple connections. The overall download time can be reduced with the help of multiple threads. This paper presents a system based on a web crawler using .NET technology. The proposed approach is implemented in VB.NET with multithreading to crawl web pages in parallel, and the crawled data is stored in a central database (SQL Server). Duplicate records are checked through a stored procedure, which is precompiled and returns results very fast. The proposed architecture is very fast and allows many crawlers to crawl the data in parallel.[...] Read more.
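The parallel crawl loop described above can be sketched as a worker pool over a shared frontier. This sketch is in Python rather than VB.NET, and it crawls an in-memory page map instead of live HTTP and SQL Server, so the control flow is runnable as-is; the duplicate check is a locked set standing in for the paper's stored procedure.

```python
# Skeleton of a multithreaded crawler: workers pull URLs from a shared
# queue, a locked 'seen' set prevents re-crawling duplicates, and an
# in-memory page map stands in for live HTTP fetches.
import queue
import threading

PAGES = {                               # stand-in for the web: url -> outgoing links
    "/": ["/a", "/b"],
    "/a": ["/b", "/c"],
    "/b": [],
    "/c": ["/"],
}

def crawl(start, n_workers=4):
    todo = queue.Queue()
    todo.put(start)
    seen = {start}                      # duplicate check (stored procedure in the paper)
    lock = threading.Lock()

    def worker():
        while True:
            try:
                url = todo.get(timeout=0.2)   # exit once the frontier stays empty
            except queue.Empty:
                return
            for link in PAGES.get(url, []):   # "fetch" the page and extract links
                with lock:
                    if link not in seen:
                        seen.add(link)
                        todo.put(link)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return seen

crawled = crawl("/")
```

In a real crawler each worker would hold its own HTTP connection, which is where the parallel speedup the abstract claims comes from: network latency overlaps across threads.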