IJISA Vol. 8, No. 6, Jun. 2016
Cover page and Table of Contents: PDF (size: 213KB)
Gestures are a natural means of communication between humans, so their application would benefit many fields where typical input devices such as keyboards or joysticks are cumbersome or impractical (e.g., in noisy environments). Recently, with the emergence of new cameras that provide not only colour images of the observed scene but also rich information about the number of people in view and, most interestingly, the 3D positions of their body parts, practical applications using body gestures have become more popular. This information is presented in the form of skeletal data. In this paper, an approach to gesture recognition based on skeletal data, using a nearest-neighbour classifier with dynamic time warping, is presented. Since similar approaches are widely used in the literature, a few practical improvements that led to better recognition results are proposed. The approach is extensively evaluated on three publicly available gesture datasets and compared with state-of-the-art classifiers. For some gesture datasets, the proposed approach outperformed its competitors in terms of recognition rate and recognition time.
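A minimal sketch of the DTW-plus-nearest-neighbour scheme the abstract describes, shown here for 1-D sequences (the paper works on multi-dimensional skeletal joint trajectories; all function names are illustrative placeholders, not the authors' code):

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # repeat element of b
                                 cost[i][j - 1],      # repeat element of a
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

def nn_classify(query, templates):
    """1-NN: return the label of the template closest to the query under DTW.
    `templates` is a list of (label, sequence) pairs."""
    return min(templates, key=lambda t: dtw_distance(query, t[1]))[0]
```

Because DTW warps the time axis, two performances of the same gesture at different speeds still get a small distance, which is what makes the 1-NN template matching workable.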
Estimation and approximation play an important role in planning for the future. People who understand the significance of estimation, especially business leaders, practise it often. Estimation involves analyzing historical data pertaining to the domain, current trends, and the expectations of the people connected to it. It is complicated not only by technological change in the surrounding world but also by the complexity of the problems themselves. Traditional numerical techniques are not sufficient for solving ill-defined, non-linear real-world problems. Hence, there is a need for robust methodologies that can deal with dynamic environments, imprecise facts, and uncertainty in the available data to achieve practical applicability at low cost. Soft computing seeks to solve classes of problems not suited to traditional algorithmic approaches.
To address common problems of inexactness in business, some models are put forward for servicing, support, and monitoring by approximating and estimating important outcomes. This work illustrates some very general yet widespread problems of interest to ordinary people. The suggested approaches can overcome the fuzziness of traditional methods by predicting future events and giving better control over the business. This includes a study of various neuro-fuzzy architectures and their possible applications in areas where decision-making with classical methods fails.
Surveillance systems are useful for identifying patients who contract infections during their hospitalization. Despite still being in their infancy, electronic surveillance systems for Hospital-Acquired Infections (HAIs) are improving and becoming more commonplace as acceptance levels rise. There are crucial gaps in existing knowledge concerning the best ways to implement electronic surveillance systems, especially in the context of the Intensive Care Unit (ICU). To bridge this gap, this paper provides a comprehensive review of various electronic surveillance approaches, highlights the requisite data components, and offers guidelines. The review revealed denominator, numerator, and discrete data requirements and guidelines for the surveillance of four main ICU HAIs: Central Line-Associated Bloodstream Infection (CLABSI), Urinary Tract Infection (UTI), Surgical Site Infections (SSIs), and Ventilator-Associated Conditions/Events (VACs/VAEs).
Author attribution is the problem of assigning an author to a text of unknown authorship. We propose a new approach to this problem using an extended version of the probabilistic context-free grammar language model, supplemented with more informative lexical and syntactic features. In addition to the probabilities of the production rules in the generated model, we add probabilities for terminals, non-terminals, and punctuation marks. The new model is also augmented with a scoring function that assigns a score to each production rule. Since the model combines different features, optimum weights, found using a genetic algorithm, are added to govern how much each feature contributes to the classification. The advantage of using many features is that they capture the different writing styles of authors, while the scoring function identifies the most discriminative rules. Using optimum weights supports capturing different authors' styles, which increases the classifier's performance. The new model is tested on nine authors with 20 Arabic documents per author, where training and testing are done using the leave-one-out method. The initial error rate of the system is 20.6%; using the optimum feature weights reduces it to 12.8%.
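The weighted combination of feature scores described above can be sketched as follows. This is only a schematic: the feature names, the log-likelihood form, and the way the genetic-algorithm weights enter the score are assumptions, not the paper's exact formulation.

```python
def weighted_score(feature_logprobs, weights):
    """Combine per-feature log-likelihoods with per-feature weights.

    feature_logprobs: {feature_name: log P(document's features | author)}
    weights:          {feature_name: weight}, e.g. tuned by a genetic algorithm
    """
    return sum(weights[f] * lp for f, lp in feature_logprobs.items())

def attribute(scores_per_author, weights):
    """Return the candidate author with the highest weighted score."""
    return max(scores_per_author,
               key=lambda a: weighted_score(scores_per_author[a], weights))
```

The genetic algorithm would then search the weight vector to minimize leave-one-out error, which is how the abstract's 20.6% → 12.8% improvement is obtained.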
This paper elucidates a new approach for aligning multiple sequences using DNA operations. A new distance metric based on DNA hybridization melting temperature, which gives approximate solutions to the multiple sequence alignment (MSA) problem, is proposed, and a proof that it satisfies the distance-function properties is provided. With this metric, a distance matrix is constructed, which in turn generates a guide tree for the alignment. Providing an accurate solution in little computational time is a challenging task for the MSA problem: developing an MSA algorithm is essentially a trade-off between accuracy and running time. To reduce the time complexity, the bio-inspired technique of DNA computing is applied in calculating the distances between sequences. The main application of this MSA approach is to identify sub-sequences for the functional study of whole genome sequences. A detailed theoretical study of the approach is given in this paper.
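As an illustration of a melting-temperature-based distance, the sketch below uses the well-known Wallace rule for short oligonucleotides (Tm ≈ 2·(#A+#T) + 4·(#G+#C) °C). The paper's actual metric and its proof differ in detail; this only shows the overall pipeline of Tm → pairwise distance → distance matrix for the guide tree.

```python
def melting_temperature(seq):
    """Rough Tm estimate for a short DNA oligo (Wallace rule):
    Tm = 2*(#A + #T) + 4*(#G + #C), in degrees Celsius."""
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def tm_distance(s1, s2):
    """Illustrative Tm-based distance between two sequences."""
    return abs(melting_temperature(s1) - melting_temperature(s2))

def distance_matrix(seqs):
    """Pairwise distance matrix fed to guide-tree construction (e.g. UPGMA)."""
    return [[tm_distance(a, b) for b in seqs] for a in seqs]
```

Note that absolute difference of Tm values automatically satisfies non-negativity, symmetry, and the triangle inequality, which is the kind of property the paper proves for its own metric.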
The purpose of this paper is to survey stochastic differential equations (SDEs) and the Euler-Maruyama method for approximating their solutions in financial problems. Explicit analytical solutions are not available for many stochastic differential equations, although in the case of linear SDEs an explicit answer may be obtained. The solution can be approximated with standard numerical methods such as the Euler-Maruyama, Milstein, and Runge-Kutta methods. We use the Euler-Maruyama method to simulate SDEs for financial problems such as the asset pricing model, the square-root asset pricing model, the payoff of a European call option, and estimating the value of European call and Asian options to buy the asset at a future time. We discuss, with examples, how to find approximate solutions of SDEs for financial problems.
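The Euler-Maruyama scheme for the standard asset-pricing model dS = μS dt + σS dW, together with its use in a Monte-Carlo European call estimate, can be sketched as follows (parameter values, path counts, and function names are illustrative, not taken from the paper):

```python
import math
import random

def euler_maruyama_gbm(s0, mu, sigma, T, n_steps, rng=None):
    """One Euler-Maruyama path of geometric Brownian motion
    dS = mu*S dt + sigma*S dW, returned as a list of n_steps+1 values."""
    rng = rng or random.Random(0)
    dt = T / n_steps
    path = [s0]
    s = s0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))   # Brownian increment ~ N(0, dt)
        s += mu * s * dt + sigma * s * dw    # Euler-Maruyama update
        path.append(s)
    return path

def european_call_price_mc(s0, k, r, sigma, T, n_paths=5000, n_steps=50):
    """Monte-Carlo estimate of a European call: discounted mean of the
    terminal payoff max(S_T - K, 0) over simulated paths (risk-neutral drift r)."""
    rng = random.Random(42)
    total = 0.0
    for _ in range(n_paths):
        s_T = euler_maruyama_gbm(s0, r, sigma, T, n_steps, rng)[-1]
        total += max(s_T - k, 0.0)
    return math.exp(-r * T) * total / n_paths
```

For an at-the-money call with S0 = K = 100, r = 0.05, σ = 0.2, T = 1, the estimate should land near the Black-Scholes value of about 10.45, up to Monte-Carlo noise and Euler discretization bias.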
In this paper, a simple and optimal form of fractional-order feedback for the control and synchronization of a class of fractional-order chaotic systems is proposed. The proposed control law can be viewed as a distributed network of linear regulators in which each node is modeled by a PI controller with moderate gains. The multiobjective genetic algorithm with chaotic mutation adopted in this work can be visualized as a combination of the structural and parametric genes of a controller, orchestrated hierarchically. It is applied to select an optimal knowledge base that characterizes the developed controller and satisfies the various design specifications. The proposed design and optimization of the controller offer a simple yet powerful way to provide a reasonable trade-off between computational overhead, storage space, numerical accuracy, and stability in the control and synchronization of a class of fractional-order chaotic systems. Simulation results show the satisfactory performance of the proposed approach.
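A single node of the regulator network described above is, at its core, a PI controller. The sketch below shows only a plain integer-order, discrete-time PI regulator; the paper's fractional-order dynamics and the genetic-algorithm gain tuning are omitted, and the gains and plant used here are purely illustrative.

```python
class PIController:
    """Minimal discrete-time PI regulator: u = Kp*e + Ki * integral(e)."""

    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0  # accumulated error

    def update(self, error):
        """Return the control signal for the current tracking error."""
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral
```

Driving a simple integrator plant (x' = u) toward a setpoint shows the expected behaviour: the proportional term reacts to the instantaneous error, while the integral term removes steady-state offset.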
Since its inception, rough set theory has proved to be one of the most important models for capturing impreciseness in data. However, it was based on the notion of equivalence relations, which are relatively rare as far as applicability is concerned, so the basic rough set model has been extended in many directions. One of these extensions is the covering-based rough set, where a cover generalizes the concept of a partition (a notion equivalent to an equivalence relation). From the granular computing point of view, all these rough sets are unigranular in character; i.e., they consider only a single granular structure on the universe. This created the need for multigranular rough sets, and as a consequence two types, the optimistic and the pessimistic multigranular rough sets, have been introduced. Four types of covering-based optimistic multigranular rough sets have been introduced and their properties studied. The notion of equality of sets, which is too stringent for real-life applications, was extended by Novotny and Pawlak to define rough equalities, and this notion was further extended by Tripathy to define three more types of approximate equality. The covering-based optimistic versions of two of these four approximate equalities were recently studied by Nagaraju et al. In this article, we study the other two cases and provide a comparative analysis.
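Pawlak's lower and upper approximations, and the optimistic multigranular lower approximation built from several granulations, can be sketched as below. Partitions stand in for the granular structures (covers would allow overlapping blocks); the set names are illustrative.

```python
def lower_approximation(partition, x):
    """Union of the granules (blocks) entirely contained in x."""
    return {e for block in partition if block <= x for e in block}

def upper_approximation(partition, x):
    """Union of the granules that intersect x."""
    return {e for block in partition if block & x for e in block}

def optimistic_mg_lower(partitions, x):
    """Optimistic multigranular lower approximation: an element belongs if
    its granule under AT LEAST ONE granulation fits inside x."""
    result = set()
    for p in partitions:
        result |= lower_approximation(p, x)
    return result
```

Rough bottom-equality of two sets then amounts to comparing their lower approximations (and top-equality their upper approximations), which is the starting point for the approximate equalities studied in the article.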