IJITCS Vol. 5, No. 9, Aug. 2013
This paper presents a new algorithm for the recursive least-squares (RLS) Wiener fixed-point smoother and filter based on observed values randomly delayed by one sampling time in linear discrete-time wide-sense stationary stochastic systems. The observed value y(k) consists of the observed value y¯(k-1) with probability p(k) and of y¯(k) with probability 1-p(k); the delayed measurements are assumed to be characterized by Bernoulli random variables. The observation y¯(k) is given as the sum of the signal z(k)=Hx(k) and the white observation noise v(k). The RLS Wiener estimators use the following information: (a) the system matrix for the state vector x(k); (b) the observation matrix H; (c) the variance of the state vector x(k); (d) the delay probability p(k); (e) the variance of the white observation noise v(k); (f) the input noise variance of the state equation for the augmented vector v¯(k) related to the observation noise.
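The Bernoulli-delayed observation model described above can be simulated directly. The sketch below is a minimal scalar illustration, not the paper's estimator itself: the scalar system matrix F, observation matrix H, and the noise variances q, r are assumed values chosen only to demonstrate how y(k) is drawn from ȳ(k) or the delayed ȳ(k-1).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar instance of the delayed-observation model:
# y(k) = ybar(k-1) with probability p, and ybar(k) with probability 1-p,
# where the choice is a Bernoulli random variable per step.
F, H = 0.9, 1.0          # assumed scalar system and observation matrices
q, r, p = 0.1, 0.5, 0.3  # process noise var, observation noise var, delay prob

n = 200
x = np.zeros(n)          # state x(k)
ybar = np.zeros(n)       # undelayed observation ybar(k)
y = np.zeros(n)          # received (possibly delayed) observation y(k)
for k in range(1, n):
    x[k] = F * x[k-1] + rng.normal(0.0, np.sqrt(q))    # state equation
    ybar[k] = H * x[k] + rng.normal(0.0, np.sqrt(r))   # ybar(k) = z(k) + v(k)
    delayed = rng.random() < p                         # Bernoulli indicator
    y[k] = ybar[k-1] if delayed else ybar[k]
```

Any estimator for this setting must account for the fact that each y(k) is one of two candidate measurements, weighted by p(k).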
Uplink planning in a WCDMA network consists of estimating the maximum capacity that a cell can support, using the quality-of-service ratio E_b/N_0. In this work we consider two different scenarios: an isolated cell and multiple cells. This capacity is adversely affected by interference from the cell's own mobile stations and from those belonging to neighboring cells. In order to enhance capacity and minimize the blocking probability of new requests in the cell, we propose a Freeing Resources algorithm, which consists of releasing some mobile stations in the handover area of the overloaded cell. The algorithm is based on freeing 1, 2 or 3 mobile stations at 12.2 kbps and 1 mobile station at 64 kbps.
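The E_b/N_0-based capacity estimate mentioned above can be illustrated with the standard WCDMA uplink load equation. This is a textbook sketch, not the paper's exact model: the chip rate, bit rate, activity factor v and other-cell interference ratio f are assumed typical values, with f = 0 corresponding to the isolated-cell scenario.

```python
def pole_capacity(W=3.84e6, R=12.2e3, ebno_db=5.0, v=0.67, f=0.0):
    """Rough uplink pole capacity (users per cell) from the load equation.
    W: chip rate [chips/s], R: user bit rate [bit/s],
    ebno_db: required Eb/N0 [dB], v: activity factor,
    f: other-to-own-cell interference ratio (f=0 -> isolated cell)."""
    ebno = 10 ** (ebno_db / 10)
    # Load contributed by one user of rate R:
    load_per_user = 1.0 / (1.0 + W / (ebno * R * v))
    # Number of users driving the total uplink load to 100%:
    return 1.0 / ((1.0 + f) * load_per_user)
```

Comparing `pole_capacity(f=0.0)` with `pole_capacity(f=0.55)` shows how neighboring-cell interference shrinks the supportable number of 12.2 kbps users, which is the effect the Freeing Resources algorithm tries to counteract.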
This paper exploits the feature extraction capabilities of the discrete cosine transform (DCT) together with an illumination normalization approach in the logarithm domain, which increases robustness to variations in facial geometry and illumination. Second, in the same domain, entropy measures are applied to the DCT coefficients so that the maximum-entropy-preserving pixels can be extracted as the feature vector; the informative features of a face are thus extracted in a low-dimensional space. Finally, kernel entropy component analysis (KECA) with an extension of arc-cosine kernels is applied to the extracted DCT coefficients that contribute most to the entropy estimate, to obtain only those kernel ECA eigenvectors associated with eigenvalues having a high positive entropy contribution. The resulting system was successfully tested on real image sequences and is robust to significant partial occlusion and illumination changes, as validated by experiments on the FERET, AR, FRAV2D and ORL face databases. Using specificity and sensitivity, we find that the best performance is achieved when the Renyi entropy is applied to the DCT coefficients. Extensive experimental comparison demonstrates the superiority of the proposed approach with respect to recognition accuracy. Moreover, the proposed approach is very simple, computationally fast, and can be implemented in any real-time face recognition system.
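The front end of the pipeline (log-domain normalization, 2-D DCT, entropy-based coefficient selection) can be sketched as follows. This is a minimal illustration under assumptions, not the paper's implementation: the Renyi order alpha = 2 and the ranking of coefficients by their entropy contribution |c|^alpha are illustrative choices, and the KECA stage is omitted.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix: row j is frequency j."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0] *= np.sqrt(1 / n)
    C[1:] *= np.sqrt(2 / n)
    return C

def features(img, keep=32, alpha=2.0):
    # 1) Illumination normalization in the logarithm domain.
    logimg = np.log1p(img.astype(float))
    logimg = (logimg - logimg.mean()) / (logimg.std() + 1e-9)
    # 2) 2-D DCT via separable 1-D transforms.
    C = dct_matrix(img.shape[0])
    D = dct_matrix(img.shape[1])
    coeffs = C @ logimg @ D.T
    # 3) Keep the coefficients with the largest contribution p^alpha
    #    to a Renyi-style entropy estimate (illustrative criterion).
    p = np.abs(coeffs).ravel()
    p = p / p.sum()
    idx = np.argsort(p ** alpha)[::-1][:keep]
    return coeffs.ravel()[idx]
```

The result is a low-dimensional feature vector of the `keep` most entropy-relevant DCT coefficients, which would then feed the kernel ECA stage.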
Software engineering is becoming less tied to from-scratch development and increasingly relies on component-based software. The software engineering community has proposed software reuse and offered several approaches to component-based software development. The basic difficulty encountered when designing component-based systems is searching for and selecting the appropriate set of existing software components; component selection is considered a hard task in Component-Based Software Engineering (CBSE), particularly as the number of developed components grows. Different approaches have been suggested to solve the software component selection problem. The solution proposed in this paper was validated by collecting responses to an electronic survey composed of 15 questions, distributed to software engineering specialists through social sites such as Twitter and Facebook and by email. The validation results support the use of an improved case-based reasoning (CBR) system to select the suitable component.
NLRP10 is one of the least characterized members of the NOD-like receptor (NLR) family. It is a protein that takes part in pathogen sensing and is responsible for the subsequent signaling propagation leading to an immunologic response. In this study, computational tools such as algorithms, web servers and databases were used to investigate the domain structure of the NLRP10 protein. The findings of this research may provide computational insights into the structure and functions of NLRP10, which in turn may foster a better understanding of the role of NLRP10 in immunologic defense.
This study presents an attempt to develop a reliable computerized algorithm that can classify images into predetermined classes. For this purpose, the histogram of the normalized distance between each two points of the image (algorithm I) and the histogram of normalized distances between three points together with the normalized angle of the image edge points (algorithm II) are analyzed. A probabilistic neural network (PNN) is implemented to perform the shape classification. Our proposed approach is tested on ten classes of the MPEG-7 image database. It is shown that feature extraction based on the distance histogram (algorithms I and II) is efficient due to its potential to preserve inter-class and intra-class variation. In addition, these algorithms ensure invariance to geometric transformations (e.g., translation, rotation and scaling). The best classification accuracy is achieved with eight classes, with total accuracies of 90% and 92.5% for algorithm I and algorithm II, respectively. The reported experiments reveal that the proposed classification algorithm could be useful in the study of MPEG-7 shapes.
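The pairwise-distance histogram of algorithm I can be sketched compactly; this is a generic illustration of the idea, not the paper's exact feature. Normalizing all pairwise distances by their maximum is what makes the descriptor invariant to translation, rotation and scaling.

```python
import numpy as np

def distance_histogram(points, bins=16):
    """Histogram of normalized pairwise distances between shape points.
    Translation/rotation leave distances unchanged; dividing by the
    maximum distance removes scale, so the histogram is invariant to
    all three transformations."""
    pts = np.asarray(points, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    d = d[np.triu_indices(len(pts), k=1)]      # unique point pairs only
    d = d / d.max()                            # scale normalization
    hist, _ = np.histogram(d, bins=bins, range=(0.0, 1.0))
    return hist / hist.sum()                   # probability histogram
```

A vector like this (per shape) is what would be fed to the PNN classifier; algorithm II would add a second histogram over point-triple angles.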
Wireless sensor networks have recently emerged as an important computing platform. These sensors are power-limited and have limited computing resources, so the sensor energy has to be managed wisely in order to maximize the lifetime of the network. Simply speaking, LEACH requires knowledge of the energy of every node in the network topology used. In LEACH, the threshold that selects the cluster head is fixed, so the protocol does not adapt to different network topology environments. We propose the IELP algorithm, which selects cluster heads using different thresholds; the new cluster-head selection probability is based on the initial energy and the number of neighbor nodes. On a rotation basis, a head-set member receives data from the neighboring nodes and transmits the aggregated results to the distant base station. For a given number of data-collecting sensor nodes, the number of control and management nodes can be systematically adjusted to reduce the energy consumption, which increases the network lifetime.
The simulation results show that IELP improves on LEACH by 39% and on SEP by 20% in an area of 100 m × 100 m for m = 0.1 and α = 2, where m is the fraction of advanced nodes and α is the additional energy factor between advanced and normal nodes.
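The threshold modification described above can be sketched against the standard LEACH rule. The LEACH formula below is the well-known one; the IELP-style weighting by initial energy and neighbor count is a hypothetical illustration of the idea (the abstract does not give the exact formula, so the weight names and their multiplicative combination are assumptions).

```python
def leach_threshold(p, r):
    """Standard LEACH threshold T(n): a node elects itself cluster head
    in round r with probability p on average, cycling every 1/p rounds."""
    return p / (1.0 - p * (r % int(round(1.0 / p))))

def ielp_threshold(p, r, e_init, e_max, n_neighbors, n_max):
    """Illustrative IELP-style threshold: scale the LEACH threshold by
    relative initial energy and relative neighbor count, so nodes with
    more energy and more neighbors are more likely to become heads.
    (Weighting scheme is an assumption, not the paper's exact rule.)"""
    return leach_threshold(p, r) * (e_init / e_max) * (n_neighbors / n_max)
```

With full energy and the maximum neighbor count the two thresholds coincide; weaker or more isolated nodes get proportionally smaller election probabilities.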
This paper presents a new technique for creating minimal spanning trees based on degree constraints of a simple, symmetric and connected graph G. We propose a new algorithm based on the average degree sequence factor of the nodes in the graph. Its time complexity is less than O(N log|E|), compared with the O(|E| log|E|) + C of Kruskal's algorithm, which is optimal. The goal is to design an algorithm that is simple, graceful, resourceful, easy to understand, and applicable in various fields, from constraint-based network design and mobile computing to other fields of science and engineering.
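To make the problem concrete, here is a minimal Kruskal-style heuristic for a degree-constrained spanning tree, given only as a baseline sketch: it is not the paper's average-degree-sequence algorithm, and the general degree-constrained MST problem is NP-hard, so greedily rejecting degree-violating edges is a heuristic rather than an exact method.

```python
def dc_mst(n, edges, max_degree):
    """Greedy degree-constrained spanning tree (heuristic baseline).
    edges: iterable of (weight, u, v); accepts an edge only if it joins
    two components AND keeps both endpoint degrees <= max_degree."""
    parent = list(range(n))

    def find(a):
        # Union-find root lookup with path compression.
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    degree = [0] * n
    tree = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv and degree[u] < max_degree and degree[v] < max_degree:
            parent[ru] = rv
            degree[u] += 1
            degree[v] += 1
            tree.append((u, v, w))
    return tree
```

For example, on a complete 4-node graph with `max_degree=2` the heuristic returns a 3-edge path-like tree instead of the unconstrained star that plain Kruskal might produce.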
The AODV protocol uses the minimum-delay path as its route selection criterion, regardless of the path's load. This leads to unbalanced load distribution in the network, and the energy of the nodes on the shortest path is depleted earlier than that of others. We propose an improved AODV protocol with a limited TTL (Time to Live) of the RREP packet, in which the route reply (RREP) packet of AODV is modified to limit the TTL information of nodes. Experiments have been carried out using the network simulator NS2. Simulation results show that our proposed routing protocol outperforms regular AODV in terms of packet delivery rate, goodput, throughput, and jitter.
This paper presents a constrained finite-state model to represent the morphotactic rules of Manipuri adjective word forms. There is no adjective word category in the Manipuri language; by rule, this category is derived from verb roots with the help of some selected affixes applicable only to verb roots. The affixes meant for this purpose and the different rules for adjective word formation are identified. Rules are composed to describe the simple agglutinative morphology of this category, and these rules are combined to describe more complex morphotactic structures. A finite-state machine is used to describe the concatenation rules, and the corresponding non-deterministic and deterministic automata are developed for ease of computerization. A root lexicon of verb-category words is used along with an affix dictionary in a database. The system is capable of analyzing and recognizing a given word as an adjective by checking the morpheme concatenation rules defined with the help of finite-state networks.
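A morphotactic rule of this kind can be encoded as a small deterministic automaton over morphemes. The sketch below is a generic illustration only: the root lexicon and the prefix/suffix sets are placeholders, not the actual Manipuri affix dictionary from the paper, and the single rule "prefix + verb root + suffix" stands in for the paper's full rule set.

```python
# Placeholder lexicons (illustrative, NOT the paper's Manipuri data).
VERB_ROOTS = {"chou", "lem"}     # hypothetical verb roots
ADJ_PREFIXES = {"a-"}            # hypothetical adjective-forming prefix
ADJ_SUFFIXES = {"-ba", "-bi"}    # hypothetical adjective-forming suffixes

def is_adjective(morphemes):
    """Walk a DFA encoding the rule  prefix . root . suffix :
    state 0 --prefix--> 1 --root--> 2 --suffix--> 3 (accepting).
    Any symbol not allowed in the current state rejects the word."""
    state = 0
    for m in morphemes:
        if state == 0 and m in ADJ_PREFIXES:
            state = 1
        elif state == 1 and m in VERB_ROOTS:
            state = 2
        elif state == 2 and m in ADJ_SUFFIXES:
            state = 3
        else:
            return False
    return state == 3
```

A segmented word such as `["a-", "chou", "-ba"]` is accepted as an adjective, while a bare root or a root with only a suffix is rejected, mirroring how the paper's automata validate morpheme concatenation.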
Software effort estimation involves estimating the effort required to develop software. Cost and schedule overruns occur in software development because of wrong estimates made during its initial stages, so proper estimation is essential for successful completion. Many estimation techniques are available, among which neural-network-based techniques play a prominent role; the back-propagation network is the most widely used architecture. The Elman neural network, a recurrent network, can be used on a par with the back-propagation network. For a good predictor system, the difference between estimated effort and actual effort should be as low as possible. Data from historic NASA projects are used for training and testing. The experimental results confirm that the back-propagation algorithm is more efficient than the Elman neural network.
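A back-propagation effort estimator of the kind compared here can be sketched in a few lines. This is a toy illustration under assumptions: the synthetic size-to-effort relation stands in for the NASA project data (which is not reproduced here), and the one-hidden-layer architecture and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for project data: effort grows with size (KLOC).
X = rng.uniform(1, 100, size=(60, 1))
y = 2.5 * X ** 0.9 + rng.normal(0, 2, size=(60, 1))   # hypothetical relation

# Normalize inputs and targets so gradient descent trains stably.
Xn = (X - X.mean()) / X.std()
yn = (y - y.mean()) / y.std()

# One-hidden-layer network trained by back-propagation (batch gradient descent).
W1 = rng.normal(0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    h = np.tanh(Xn @ W1 + b1)                  # forward pass, hidden layer
    pred = h @ W2 + b2                         # forward pass, output
    err = pred - yn                            # output error
    gW2 = h.T @ err / len(Xn); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)           # back-propagate through tanh
    gW1 = Xn.T @ dh / len(Xn); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2             # gradient-descent updates
    W1 -= lr * gW1; b1 -= lr * gb1

mae = np.mean(np.abs(pred - yn))               # mean abs error, normalized scale
```

An Elman variant would add a recurrent copy of the hidden state as extra input; the abstract's comparison is between that and the feed-forward network above.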
Software component adaptation is a crucial problem in component-based software engineering (CBSE). Components that are assembled or reused sometimes cannot perfectly fit one another because of incompatibility issues between them. The focus today is on finding adaptation techniques to solve mismatches between component interfaces and to guarantee that software components are able to interact in the right way. This paper focuses on detecting mismatch, which is considered an important step in the adaptation process. We propose a solution to detect mismatch by suggesting an improvement to the Symbolic Transition Systems used to represent component interfaces, together with a synchronous vector algorithm to deal with parameter data-type mismatches.
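The kind of mismatch the paper targets can be illustrated with a much-simplified check. This sketch compares flat required/provided operation signatures instead of full symbolic transition systems, so it only hints at the idea; the function and the mismatch labels are illustrative, not the paper's algorithm.

```python
def detect_mismatch(required, provided):
    """Simplified interface-mismatch detection.
    required/provided: dict mapping operation name -> tuple of parameter
    type names. Returns a list of (operation, kind-of-mismatch), covering
    the two cases a synchronous comparison would flag: a missing
    counterpart operation, and a parameter data-type mismatch."""
    issues = []
    for op, params in required.items():
        if op not in provided:
            issues.append((op, "missing operation"))
        elif provided[op] != params:
            issues.append((op, "parameter type mismatch"))
    return issues
```

A symbolic-transition-system version would additionally check that the *order* of interactions is compatible, by synchronously exploring both components' transition graphs rather than comparing flat signatures.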