IJISA Vol. 10, No. 11, Nov. 2018
The Lyapunov exponent (LE) identification problem for dynamic systems with periodic coefficients is considered under uncertainty. LE identification is based on the analysis of a special class of frameworks describing the dynamics of their change. An upper bound for the smallest LE and a mobility limit for the largest LE are obtained, and the indicator set of the system is determined. Graphical criteria based on the analysis of the features of this special class of frameworks are proposed for assessing the adequacy of the obtained LE estimates. The histogram method is applied to check the obtained estimation set. We show that the dynamic system can have a set of LEs.
The random forest classifier is an efficient classification algorithm used recently in many big data applications. Medical big data comprises large, complex records such as patient records, medicine details, and staff data; such massive data is difficult to classify and handle efficiently. Traditional methods such as the linear-classifier k-nearest neighbor and random-clustering k-nearest neighbor suffer from low accuracy and carry a risk of data deletion and data loss. Hence we adopt random forest classification combined with the k-means clustering algorithm to overcome these complexity and accuracy issues. In this paper, the medical big data is first partitioned into clusters using the k-means algorithm along selected dimensions. Each cluster is then classified with the random forest algorithm, which generates decision trees and classifies the data according to the specified criteria. Compared to existing systems, the experimental results indicate that the proposed algorithm increases classification accuracy.
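The cluster-then-classify pipeline described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the dataset is synthetic, and the cluster count, forest size, and features are assumed values.

```python
# Sketch: partition records with k-means, then train one random
# forest per cluster and route each query through its cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))              # stand-in for medical records
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # synthetic class label

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
forests = {}
for c in range(3):
    mask = km.labels_ == c                 # records belonging to cluster c
    forests[c] = RandomForestClassifier(
        n_estimators=50, random_state=0).fit(X[mask], y[mask])

def predict(x):
    c = km.predict(x.reshape(1, -1))[0]    # route query to its cluster
    return forests[c].predict(x.reshape(1, -1))[0]

acc = np.mean([predict(X[i]) == y[i] for i in range(len(X))])
```

Routing each record to a per-cluster forest keeps each tree ensemble small, which is the scalability argument the abstract makes for large medical datasets.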
To date, the requirements for the quality of paper products have been increasing. At the same time, the most prominent trend of recent years is improving the resource and energy efficiency of all technological processes. From the point of view of specialists in the paper industry, within the production process on a paper machine the greatest attention must be paid to the drying of the paper web: this process is the most expensive and determines a large number of quality parameters of the finished product. To satisfy these requirements, it is necessary to implement a system of optimal control for this technological process.
The first and one of the most important steps in the development of such a system is the formulation of a criterion for optimal control and the calculation of the optimal mode of operation of the first stage of drying, the heating of the paper web. For this purpose, the problem of calculating the optimal temperature graph for heating the paper web in the drying section of a paper machine is considered. The proposed quality-control criterion ensures that the parameters of the finished product remain within the limits defined by the standard. Limitations are established on the dynamics of the temperature change on each drying cylinder and on the final values.
The optimal temperature schedule is calculated taking into account the characteristics of the material and the changes in the heat- and mass-transfer parameters, which are functional dependences on the temperature of the paper web. The formulas for calculating the temperature of the paper at the exit from each drying cylinder and from the free-movement sections are based on data on the partial pressure at the surface of the paper web and in the environment.
The results of the work are presented in the form of a step-by-step algorithm. Implementation of the developed algorithm ensures uniform heating of the paper web and reaching the optimum temperature for effective removal of moisture.
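The shape of such a schedule, rising cylinder by cylinder under a ramp-rate limit and a final-temperature cap, can be sketched as below. All numbers (initial temperature, per-cylinder ramp limit, target temperature, cylinder count) are illustrative assumptions, not values from the paper, and the real algorithm additionally accounts for heat- and mass-transfer dependences.

```python
# Sketch: a monotone heating schedule that never raises the
# temperature by more than max_step per cylinder and never
# exceeds the target final temperature.
def heating_schedule(t0, target, max_step, n_cylinders):
    """Return cylinder temperatures rising from t0 toward target."""
    temps, t = [], t0
    for _ in range(n_cylinders):
        t = min(t + max_step, target)   # obey ramp and final limits
        temps.append(t)
    return temps

schedule = heating_schedule(t0=20.0, target=110.0,
                            max_step=15.0, n_cylinders=8)
```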
In today’s computing era, the world is dealing with big data that has expanded enormously in terms of the 7 Vs (volume, velocity, veracity, variability, value, variety, visualization). Conventional data structures such as arrays, linked lists, trees and graphs cannot handle this big data effectively, so new and dynamic tools and techniques that can handle it effectively and efficiently are the need of the hour. This paper provides an enhancement to the recently proposed “dynamic” data structure r-Train for handling big data. With the emergence of Internet of Things (IoT) technology, real-time handling of requests and services is pivotal; it therefore becomes necessary to promptly fetch the required data from the enormous piles of big data that are generally located at different sites. An effective searching and retrieval mechanism must be provided to handle these challenging issues. The primary aim of the proposed refinement is to provide effective insertion, deletion and searching techniques for handling big data efficiently.
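The core r-Train idea is a linked list of fixed-size arrays (“coaches”) of capacity r, giving array-like locality with list-like growth. The following is a simplified sketch of that structure with insert, search and lazy delete; the class and method names are illustrative and this is not the paper's full design.

```python
# Sketch: linked list of fixed-size coaches of capacity r.
class Coach:
    def __init__(self, r):
        self.data = [None] * r   # fixed-size array of capacity r
        self.count = 0           # occupied slots
        self.next = None

class RTrain:
    def __init__(self, r):
        self.r = r
        self.head = self.tail = Coach(r)

    def insert(self, value):
        if self.tail.count == self.r:        # tail full: attach new coach
            self.tail.next = Coach(self.r)
            self.tail = self.tail.next
        self.tail.data[self.tail.count] = value
        self.tail.count += 1

    def search(self, value):
        coach, idx = self.head, 0
        while coach:
            for i in range(coach.count):
                if coach.data[i] == value:
                    return (idx, i)          # (coach number, slot)
            coach, idx = coach.next, idx + 1
        return None

    def delete(self, value):
        pos = self.search(value)
        if pos:
            coach = self.head
            for _ in range(pos[0]):
                coach = coach.next
            coach.data[pos[1]] = None        # lazy delete: mark slot empty
        return pos is not None

t = RTrain(r=4)
for v in [3, 1, 4, 1, 5, 9]:
    t.insert(v)
found = t.search(5)                          # lands in the second coach
```

Because each coach is a contiguous array, a search touches r elements per pointer hop instead of one, which is the locality advantage the structure aims at for large data.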
A UML state diagram represents the behavior of the system under test (SUT) when an event occurs. The state of the system is determined by events that occur randomly, and the system state changes when the transition relation between states is satisfied. Test cases are generated from the state-chart diagram to test the behavior of the system. When multiple decision nodes are present on the same path, path explosion occurs. To overcome this problem, a method is proposed that generates basis-path (BP) test cases with node coverage using a genetic algorithm (GA). Experiments are conducted on various Android applications, and the efficiency of the algorithm is evaluated through code coverage and mutation analysis. Using this approach, BP test cases and Robotium test scripts are generated for 10 Android applications, with an observed average reduction of 70% in the number of test cases relative to all-path test cases. The resulting average code coverage is 74%, and the defect removal efficiency (DRE) is 95%. The experimental results show that the proposed method is effective compared to other methods.
In this paper, a new robust and imperceptible digital image watermarking scheme that overcomes the limitations of traditional wavelet-based image watermarking schemes is proposed using hybrid transforms, namely the lifting wavelet transform (LWT), discrete cosine transform (DCT) and singular value decomposition (SVD). The scheme uses a Canny edge detector to select blocks with more edge pixels. Two reference sub-images, used as the point of reference for watermark embedding and extraction, are formed from the selected blocks based on the number of edges. To achieve a better trade-off between imperceptibility and robustness, multiple scaling factors (MSF) are employed to modulate different ranges of singular-value coefficients during the watermark embedding process, and the particle swarm optimization (PSO) algorithm is adopted to obtain optimized MSF. The performance of the proposed scheme has been assessed under different conditions, and the experimental results obtained from computer simulation verify that it achieves enhanced robustness against the various attacks performed. Moreover, comparison with existing schemes confirms that the proposed scheme outperforms them in terms of robustness and imperceptibility.
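The SVD step of such an embedding, modulating the host block's singular values with the watermark's singular values, can be sketched as follows. The LWT/DCT stages, edge-based block selection and PSO-optimised multiple scaling factors are omitted; the single scaling factor alpha and the block contents are assumptions for illustration.

```python
# Sketch: SVD-based singular-value modulation for watermark
# embedding, and its inverse for extraction.
import numpy as np

rng = np.random.default_rng(1)
host = rng.random((8, 8))                     # host image block
mark = rng.random((8, 8))                     # watermark block

U, S, Vt = np.linalg.svd(host)
Sw = np.linalg.svd(mark, compute_uv=False)
alpha = 0.05                                  # single scaling factor (assumed)
watermarked = U @ np.diag(S + alpha * Sw) @ Vt

# Extraction reverses the modulation given the original singular values.
S_rec = np.linalg.svd(watermarked, compute_uv=False)
Sw_rec = (S_rec - S) / alpha
```

A small alpha keeps the watermarked block close to the host (imperceptibility), while larger values make the embedded singular values survive stronger distortion (robustness); this is exactly the trade-off the MSF and PSO optimisation target.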
In this paper, a new quantum tunneling particle swarm optimization (QTPSO) algorithm is proposed and applied to the training of feedforward artificial neural networks (ANNs). In the classical particle swarm optimization (PSO) algorithm, the value of the cost function at the location of the personal best solution found by each particle cannot increase, which can significantly reduce the explorative ability of the swarm. A new PSO algorithm is therefore proposed in which the personal best solution of each particle is allowed to tunnel through hills in the cost function, analogous to the tunneling effect in quantum physics: a particle with insufficient energy to cross a potential barrier can still cross it with a small probability that decreases exponentially with the barrier length. The introduction of the tunneling effect allows particles to escape from local minima, increasing the explorative ability of the PSO algorithm and preventing premature convergence. The proposed algorithm significantly outperforms three state-of-the-art PSO variants on a majority of benchmark neural network training problems.
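The tunneling idea can be illustrated with a toy PSO on a multimodal test function: a particle's personal best may also move to a worse point with probability exp(-gamma * barrier), where the barrier is the cost increase. The acceptance rule, the parameter values, and the Rastrigin test function are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch: PSO where personal bests may "tunnel" to worse positions
# with a probability that decays exponentially with the cost barrier.
import math, random

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

random.seed(0)
dim, n, gamma = 2, 20, 5.0
pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
vel = [[0.0] * dim for _ in range(n)]
pbest = [p[:] for p in pos]
pcost = [rastrigin(p) for p in pos]
gbest = min(pbest, key=rastrigin)
best_seen = min(pcost)                       # best cost ever evaluated

for _ in range(200):
    for i in range(n):
        for d in range(dim):
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                         + 1.5 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        c = rastrigin(pos[i])
        best_seen = min(best_seen, c)
        barrier = c - pcost[i]
        # classical update, or tunnel through the barrier with small probability
        if barrier < 0 or random.random() < math.exp(-gamma * barrier):
            pbest[i], pcost[i] = pos[i][:], c
    gbest = min(pbest, key=rastrigin)
```

Unlike classical PSO, pcost[i] is no longer monotone, so best_seen must be tracked separately; that non-monotonicity is precisely what lets a trapped personal best climb out of a local basin.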
The basic rough set model introduced by Pawlak in 1982 has been extended in many directions to enhance its modeling power. One such attempt is the notion of rough sets on fuzzy approximation spaces by De et al. in 1999. The basic model uses an equivalence relation in its definition, which decomposes the universal set into disjoint equivalence classes; these equivalence classes are called granules of knowledge. From the granular computing point of view, the basic rough set model is unigranular in character. So, in order to handle more than one granular structure simultaneously, two types of multigranular rough sets, called the optimistic and pessimistic multigranular rough sets, were introduced by Qian et al. in 2006 and 2010, respectively. In this paper, we introduce two types of multigranular rough sets on fuzzy approximation spaces (optimistic and pessimistic), study several of their properties, and illustrate how this notion can be used for the prediction of rainfall. The introduced notions are explained through several examples.
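The optimistic/pessimistic distinction can be made concrete for lower approximations over two crisp granulations: optimistic requires an element's equivalence class under at least one relation to fit inside the target set, pessimistic requires it under every relation. The sketch below uses toy partitions of a small universe as assumed data, and covers only the crisp (non-fuzzy) case.

```python
# Sketch: optimistic vs pessimistic multigranular lower approximations
# over a list of partitions (granulations) of the universe.
def eq_class(partition, x):
    """Block of the partition containing x."""
    return next(block for block in partition if x in block)

def multigranular_lower(universe, partitions, X, optimistic=True):
    combine = any if optimistic else all     # "at least one" vs "every"
    return {x for x in universe
            if combine(eq_class(p, x) <= X for p in partitions)}

U = set(range(8))
P1 = [{0, 1}, {2, 3}, {4, 5}, {6, 7}]        # granulation R1
P2 = [{0, 2}, {1, 3}, {4, 6}, {5, 7}]        # granulation R2
X = {0, 1, 2, 4}

opt = multigranular_lower(U, [P1, P2], X, optimistic=True)
pes = multigranular_lower(U, [P1, P2], X, optimistic=False)
```

Since "every relation" is a stronger requirement than "at least one", the pessimistic lower approximation is always contained in the optimistic one, as the toy example confirms.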