IJISA Vol. 9, No. 1, Jan. 2017
Cover page and Table of Contents: PDF (size: 916KB)
A new approach to the structural identification of nonlinear dynamic systems under uncertainty is proposed. It is based on the analysis of virtual frameworks (VF), which reflect the state of the nonlinear part of a system. Construction of a VF relies on obtaining a special informational set describing the steady state of a nonlinear dynamic system. Introducing a VF requires an estimation of the structural identifiability of the system; this concept is associated with the nonlinearity of the system and the properties of the VF. A method for estimating structural identifiability is proposed, and the appearance of insignificant virtual frameworks that do not satisfy the structural identifiability condition is considered. Algorithms for estimating the nonlinearity class on the basis of the analysis of sector sets are proposed, together with methods and procedures for estimating frameworks of single-valued and multiple-valued nonlinearities. A structural-frequency analysis method is proposed and applied to validate the obtained solutions. The VF is also proposed for identifying the order and the eigenvalue spectrum of a linear dynamic system, and the applicability of the VF to the identification of static systems is shown.[...]
The purpose of this paper is to investigate the factors affecting electronic sharing (E-sharing) behaviour, with a particular focus on location-aware technology. Based on an extensive literature review, a structural model consisting of seven factors was proposed to model the E-sharing behaviour of location-based knowledge (LBK). The main constructs were: reward expectancy (WE), reputation expectancy (RE), perceived benefits (PB), perceived trust (PT), attitude toward LBK, attitude toward knowledge sharing incentives (KSI) and intention to share knowledge (ISK). The model was examined using empirical data gathered from four hundred and ninety (n=490) respondents. The results indicate that attitude toward KSI can be determined by RE and WE, while attitude toward LBK E-sharing can be predicted by PB and PT. The two attitude constructs (KSI and LBK) can determine the behavioural ISK. All of the proposed relationships within the model were statistically significant.[...]
In this research work, an improved active contour method called the Bat-Active Contour Method (BA-ACM), based on the bat algorithm, has been developed. The bat algorithm is incorporated to escape the local minima that trap the classical active contour method, to stabilize contour (snake) movement, and to accurately reach boundary concavities. The developed BA-ACM was then applied to a dataset of medical images of the human heart, knee bone and vertebra obtained from the Auckland MRI Research Group (Cardiac Atlas Website), University of Auckland. A set of similarity metrics, including the Jaccard index and the Dice similarity measure, was adopted to evaluate the performance of the developed algorithm. Jaccard index values of 0.9310, 0.9234 and 0.8947 and Dice similarity values of 0.8341, 0.8616 and 0.9138 were obtained for the human heart, vertebra and knee bone images respectively, showing high similarity between the BA-ACM segmentations and expert-segmented images. Moreover, the traditional ACM produced Jaccard index values of 0.5873, 0.5601 and 0.6009 and Dice similarity values of 0.5974, 0.6079 and 0.6102 on the same images, showing low similarity between it and the expert segmentations. It is evident from these results that the developed algorithm performs better than the traditional ACM.[...]
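The Jaccard index and Dice similarity used above compare a segmentation against an expert reference as overlapping pixel sets. A minimal sketch (the masks below are hypothetical, for illustration only):

```python
def jaccard_index(a, b):
    """Jaccard index |A ∩ B| / |A ∪ B| between two pixel sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def dice_similarity(a, b):
    """Dice coefficient 2|A ∩ B| / (|A| + |B|)."""
    a, b = set(a), set(b)
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical masks: pixel coordinates labelled "object" by the
# algorithm and by the expert (not data from the paper).
algorithm_mask = {(0, 0), (0, 1), (1, 0), (1, 1)}
expert_mask    = {(0, 1), (1, 0), (1, 1), (2, 1)}

print(jaccard_index(algorithm_mask, expert_mask))   # 3 / 5 = 0.6
print(dice_similarity(algorithm_mask, expert_mask)) # 6 / 8 = 0.75
```

Both metrics are 1.0 for identical masks and 0.0 for disjoint ones, which is why values above 0.89 indicate close agreement with the expert.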
The main objective of image fusion is to obtain an enhanced image with more relevant information by integrating complementary information from two source images. In this paper, a novel image fusion algorithm based on the discrete wavelet transform (DWT) and the cross bilateral filter (CBF) is proposed. In the proposed framework, the source images are decomposed into low- and high-frequency subbands using the DWT. The low-frequency subbands of the transformed images are combined by pixel averaging, while the high-frequency subbands are fused with a weighted-average fusion rule, where the weights are computed by applying the CBF to both images. Finally, the inverse DWT is performed over the fused coefficients to reconstruct the fused image. The proposed method has been extensively tested on several pairs of multi-focus and multisensor images. To compare the proposed method with different existing methods, a variety of image fusion quality metrics are employed. The analysis of the comparison results demonstrates that the proposed method outperforms many other fusion methods, both qualitatively and quantitatively.[...]
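The decompose–fuse–reconstruct pipeline can be sketched on 1-D signals with a single-level Haar wavelet. Note the assumptions: this uses a max-magnitude rule for the high-frequency subbands in place of the paper's CBF-derived weights, and 1-D Haar in place of a 2-D DWT, purely to make the flow concrete:

```python
import math

def haar_dwt(signal):
    # One-level Haar decomposition into low (approximation)
    # and high (detail) subbands.
    s = math.sqrt(2)
    low  = [(a + b) / s for a, b in zip(signal[::2], signal[1::2])]
    high = [(a - b) / s for a, b in zip(signal[::2], signal[1::2])]
    return low, high

def haar_idwt(low, high):
    # Inverse one-level Haar transform (perfect reconstruction).
    s = math.sqrt(2)
    out = []
    for l, h in zip(low, high):
        out += [(l + h) / s, (l - h) / s]
    return out

def fuse(img_a, img_b):
    la, ha = haar_dwt(img_a)
    lb, hb = haar_dwt(img_b)
    # Low-frequency subbands: pixel averaging, as in the paper.
    low = [(x + y) / 2 for x, y in zip(la, lb)]
    # High-frequency subbands: max-magnitude selection -- a
    # simplification of the paper's CBF-weighted average.
    high = [x if abs(x) >= abs(y) else y for x, y in zip(ha, hb)]
    return haar_idwt(low, high)

fused = fuse([4.0, 4.0, 0.0, 0.0], [2.0, 2.0, 2.0, 2.0])
print([round(v, 3) for v in fused])  # [3.0, 3.0, 1.0, 1.0]
```

The fused signal averages the smooth content of both inputs while keeping the stronger detail coefficients, which is the intuition behind subband-wise fusion rules.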
Nature has existed for millennia. Natural systems have withstood harsh conditions for years and have proved efficient at tackling them. This has inspired many researchers over the last couple of decades to design algorithms based on phenomena in the natural world, known as natural computing or nature-inspired algorithms. These algorithms have established their ability to solve a large number of real-world complex problems by providing optimal solutions within a reasonable time. This paper presents an investigation assessing the performance of some well-known natural computing algorithms and their variants: Genetic Algorithms, Ant Colony Optimization, River Formation Dynamics, the Firefly Algorithm and Cuckoo Search. The Traveling Salesman Problem (TSP) is used here as the test-bed problem for performance evaluation. It is a combinatorial optimization problem and one of the most famous NP-hard problems: simple and easy to understand, but at the same time very difficult to solve optimally in reasonable time, particularly as the number of cities increases. The source code for the above natural computing algorithms was developed in MATLAB R2015b and applied to several TSP instances from the TSPLIB library. The results obtained are analyzed with respect to criteria such as tour length, required iterations, convergence time and solution quality. The conclusions drawn from this analysis establish the comparative superiority of the Firefly Algorithm over the other algorithms.[...]
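Every metaheuristic in the comparison ultimately scores candidate tours by their length. A minimal Python sketch of the tour-length objective plus a greedy nearest-neighbour construction (a common baseline or seed tour for such metaheuristics, not the paper's MATLAB code; the 5-city instance is hypothetical):

```python
import math

def tour_length(tour, coords):
    """Total length of a closed tour over city coordinates."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbour(coords, start=0):
    """Greedy heuristic: always visit the closest unvisited city."""
    unvisited = set(range(len(coords))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited,
                  key=lambda c: math.dist(coords[tour[-1]], coords[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Hypothetical 5-city instance: unit square plus its centre.
cities = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]
tour = nearest_neighbour(cities)
print(tour, round(tour_length(tour, cities), 3))
```

Firefly, cuckoo search and the other algorithms compared in the paper differ only in how they generate and perturb candidate tours; the evaluation criterion stays this same tour-length function.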
In this article, a new approach is presented for checking the validity of nonlinear and nonsmooth inequalities on a compact domain using optimization. An optimization problem corresponding to the considered inequality is proposed, and by solving it the validity of the inequality is determined. The optimization problem, in both smooth and nonsmooth forms, is solved by a linearization approach. The efficiency of the presented approach is illustrated with examples.[...]
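The underlying idea is that f(x) ≥ g(x) holds on a compact domain exactly when the minimum of h = f − g over the domain is nonnegative. A crude sketch using a dense grid scan in place of the paper's linearization-based solver (an assumption for illustration; it is not the paper's method and can miss very narrow violations):

```python
def inequality_holds(f, g, a, b, steps=10000):
    """Check f(x) >= g(x) on [a, b] by minimising h = f - g
    on a dense grid -- a stand-in for solving the corresponding
    optimization problem."""
    h = lambda x: f(x) - g(x)
    xs = [a + (b - a) * i / steps for i in range(steps + 1)]
    return min(h(x) for x in xs) >= 0

# Example: x**2 + 1 >= 2*x on [0, 2] (equality at x = 1, so it holds).
print(inequality_holds(lambda x: x**2 + 1, lambda x: 2 * x, 0.0, 2.0))
# Counterexample: x**2 >= x fails on (0, 1).
print(inequality_holds(lambda x: x**2, lambda x: x, 0.0, 2.0))
```

A nonnegative minimum certifies the inequality; a negative minimum pinpoints a counterexample, which is what makes the optimization formulation useful.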
A fuzzy clustering algorithm for multidimensional data is proposed in this article. The data are described by vectors whose components are linguistic variables defined on an ordinal scale. The obtained results confirm the efficiency of the proposed approach.[...]
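The abstract gives no algorithmic detail, but the general setting can be sketched: encode each ordinal linguistic term as a rank, then assign fuzzy memberships to cluster prototypes. The sketch below uses the standard fuzzy c-means membership formula and an illustrative five-term scale; both the scale and the prototypes are assumptions, not the paper's own construction:

```python
# Illustrative ordinal encoding of linguistic terms (assumed scale).
SCALE = {"very low": 0, "low": 1, "medium": 2, "high": 3, "very high": 4}

def encode(record):
    return [SCALE[term] for term in record]

def memberships(x, centers, m=2.0):
    """Fuzzy c-means membership of point x in each cluster centre:
    u_k = 1 / sum_j (d_k / d_j)^(2 / (m - 1))."""
    dists = [sum((xi - ci) ** 2 for xi, ci in zip(x, c)) ** 0.5
             for c in centers]
    if any(d == 0 for d in dists):  # x coincides with a centre
        return [1.0 if d == 0 else 0.0 for d in dists]
    p = 2.0 / (m - 1.0)
    return [1.0 / sum((dk / dj) ** p for dj in dists) for dk in dists]

record = encode(["low", "medium", "high"])
centers = [[0.0, 1.0, 2.0], [4.0, 3.0, 4.0]]  # hypothetical prototypes
u = memberships(record, centers)
print([round(v, 3) for v in u])
```

The memberships sum to one per record, expressing graded rather than hard cluster assignment, which is the essence of fuzzy clustering.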
Huge amounts of Big Data arrive constantly with the rapid development of business organizations, which are interested in extracting knowledge from the collected data. Frequent itemset mining of Big Data supports business decisions and the provision of high-quality services. Traditional frequent itemset mining algorithms are ineffective on Big Data, leading to high computation times. Apache Hadoop MapReduce is the most popular data-intensive distributed computing framework for large-scale data applications such as data mining. In this paper, the author identifies the factors affecting the performance of frequent itemset mining algorithms based on Hadoop MapReduce technology and proposes an approach for optimizing the performance of large-scale frequent itemset mining. Experimental results show the potential of the proposed approach: performance is significantly improved for large-scale data mining with the MapReduce technique. The author believes this is a valuable contribution to the high-performance computing of Big Data.[...]
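The map/shuffle/reduce pattern behind MapReduce-based frequent item counting can be sketched in-process. This is a single-machine illustration of the pattern only, not the paper's optimized approach; on a cluster, Hadoop would partition the transactions across mappers and group the emitted pairs by key before reducing:

```python
from collections import defaultdict

def map_phase(transaction):
    """Map: emit an (item, 1) pair for every item in a transaction."""
    return [(item, 1) for item in transaction]

def reduce_phase(shuffled, min_support):
    """Reduce: sum the counts per item and keep the frequent ones."""
    return {item: sum(counts) for item, counts in shuffled.items()
            if sum(counts) >= min_support}

def frequent_items(transactions, min_support):
    shuffled = defaultdict(list)  # the "shuffle" step groups by key
    for t in transactions:
        for item, one in map_phase(t):
            shuffled[item].append(one)
    return reduce_phase(shuffled, min_support)

# Hypothetical transaction database for illustration.
db = [["bread", "milk"], ["bread", "beer"],
      ["milk", "bread", "beer"], ["milk"]]
print(frequent_items(db, min_support=3))  # {'bread': 3, 'milk': 3}
```

Because each map call touches one transaction and each reduce call one item, both phases parallelize naturally, which is what makes the framework attractive for large-scale mining.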