IJIGSP Vol. 5, No. 12, Oct. 2013
This paper presents an efficient and simple adaptive method for removing high-density salt-and-pepper noise. A noise detector checks whether the selected pixel is noisy or noise-free. Noisy pixels are then subjected to the second stage of the filtering action, while noise-free pixels are left unaltered. Since not every pixel is filtered, undue distortion is avoided. Only the noise-free pixels within the window are considered when computing the value of the processed pixel. The window size is initially 3 × 3. If all pixels within the window are judged to be noise, the window is enlarged to 5 × 5; if all pixels within the 5 × 5 window are also judged to be noise, the processed pixel is replaced by the previous resultant pixel. The technique requires one noise-free original image as a training image. The key point of the filter operation is the solution of the linear system X = A⁻¹B on that noise-free original image, and an algorithm for extracting the data from the noise-free image and forming the linear equation system is presented. A comparison of the proposed filter with other existing filters is provided. The results demonstrate that the proposed technique outperforms other existing denoising techniques, and it works well for high-density salt-and-pepper noise even up to a noise density of 97%.[...]
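The two-stage detect-then-filter idea with an adaptive window can be sketched as below. This is a minimal illustration only: the extreme-value noise test (0/255) and the median replacement are common stand-ins, not the paper's trained X = A⁻¹B estimation, and the function name is hypothetical.

```python
import numpy as np

def adaptive_sp_filter(img):
    """Sketch: pixels at the intensity extremes (0 or 255) are flagged as
    salt-and-pepper noise; noise-free pixels pass through unchanged.
    A noisy pixel is replaced by the median of the noise-free neighbours
    in a 3x3 window, widening to 5x5 if every 3x3 neighbour is noisy;
    if even the 5x5 window is all noise, the previously processed pixel
    is reused, mirroring the fallback described in the abstract."""
    out = img.astype(np.float64).copy()
    noisy = (img == 0) | (img == 255)        # simple extreme-value detector
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            if not noisy[i, j]:
                continue                      # noise-free: leave unaltered
            replaced = False
            for r in (1, 2):                  # 3x3 window, then 5x5
                i0, i1 = max(i - r, 0), min(i + r + 1, H)
                j0, j1 = max(j - r, 0), min(j + r + 1, W)
                good = img[i0:i1, j0:j1][~noisy[i0:i1, j0:j1]]
                if good.size:
                    out[i, j] = np.median(good)
                    replaced = True
                    break
            if not replaced:                  # all-noise 5x5: reuse previous result
                out[i, j] = out[i, j - 1] if j > 0 else (out[i - 1, j] if i > 0 else out[i, j])
    return out.astype(img.dtype)
```

Because unflagged pixels are never rewritten, detail in noise-free regions survives untouched, which is the source of the "undue distortion can be avoided" claim.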
The second-generation bandelet transform is a new method for capturing the complex geometric content of an image. We use this transform to study medical and satellite images compressed with the bandelet transform coupled with an SPIHT coder. The goal of this paper is to examine the capacity of the proposed transform to offer an optimal representation of image geometry. Our interest is in compressed medical images; in order to develop the compression algorithm, we compared our results with those obtained by applying the bandelet transform to satellite images. We conclude that the results obtained are very satisfactory for the medical image domain.[...]
As the length of the filter, and consequently the number of filter coefficients, increases, the design of the filter becomes complex; the popular NLMS algorithm has therefore been replaced by the MMax NLMS algorithm. Although this makes the filter design much easier, its convergence performance degrades to some extent: convergence occurs at a later stage, consuming too much computational time in processing the signal. In this paper, a proposal for improving the convergence characteristics is made without compromising the performance of the design or affecting the tap-selection process of the MMax NLMS algorithm. By introducing a variable step size for the filter coefficients, the performance loss due to the MMax NLMS algorithm can be effectively reduced and better convergence is achieved in the filter design.[...]
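The combination of MMax tap selection with a variable step size can be sketched as follows. The smoothed-error step-size rule here is a hypothetical illustration, not the paper's specific rule, and the function name and parameters are assumptions.

```python
import numpy as np

def mmax_nlms_vss(x, d, L=8, M=4, mu_max=1.0, eps=1e-8):
    """Sketch of MMax NLMS with a variable step size: per iteration only
    the M taps whose regressor samples have the largest magnitude are
    updated (the MMax tap selection), and the step size is scaled by a
    smoothed estimate of the error power (an illustrative VSS rule)."""
    w = np.zeros(L)                       # adaptive filter coefficients
    e = np.zeros(len(x))                  # a-priori error signal
    err_pow = 1.0
    for n in range(L, len(x)):
        u = x[n - L + 1:n + 1][::-1]      # regressor, most recent sample first
        e[n] = d[n] - w @ u
        err_pow = 0.9 * err_pow + 0.1 * e[n] ** 2   # smoothed error power
        mu = mu_max * err_pow / (err_pow + 1.0)     # hypothetical variable step
        sel = np.argsort(np.abs(u))[-M:]            # MMax: M largest-|u| taps
        w[sel] += mu * e[n] * u[sel] / (eps + u @ u)
    return w, e
```

Updating only M of the L taps cuts the per-sample cost, while the error-dependent step keeps the update aggressive early in adaptation and gentle near convergence, which is the trade-off the abstract targets.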
Software dealing with large-scale signal processing takes a long time even on modern hardware. Cross-correlation applications are mostly algorithm-intensive rather than data-intensive (that is, they are more CPU-bound than I/O-bound). Parallel execution of the cross-correlation over the local network, or in some cases over a Wide Area Network (WAN), helps reduce the processing time. The aim of this paper is to discuss the possibility of distributing the cross-correlation computation over the available PCs in the local network. Moreover, the portion of the algorithm sent to a remote PC within the LAN is further redistributed over the available CPU cores on that computer, yielding maximum utilization of all available cores in the local area network. The load-balancing problem is addressed as well.[...]
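The core partitioning idea, splitting the lag range of the cross-correlation across workers, can be sketched in a few lines. Threads on one machine stand in here for the paper's remote PCs and per-machine cores; the function names and chunking scheme are illustrative assumptions.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def xcorr_lags(x, y, lags):
    # cross-correlation at the requested lags: r[k] = sum_n x[n] * y[n + k]
    return [float(np.dot(x[:len(x) - k], y[k:])) for k in lags]

def parallel_xcorr(x, y, max_lag, workers=4):
    """Sketch: partition the lag range into independent chunks and compute
    each chunk on a separate worker, mirroring the paper's idea of
    spreading the CPU-bound cross-correlation over available machines
    and cores. Interleaved chunking gives a crude form of load balancing,
    since large and small lags are mixed in every chunk."""
    lags = list(range(max_lag + 1))
    chunks = [lags[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(workers) as ex:
        parts = list(ex.map(lambda c: xcorr_lags(x, y, c), chunks))
    r = np.empty(max_lag + 1)
    for c, vals in zip(chunks, parts):
        r[np.array(c, dtype=int)] = vals
    return r
```

Because each lag's dot product is independent, the same partitioning works unchanged whether the workers are threads, processes, or remote machines on a LAN.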
In this paper, two novel first-order current-mode all-pass filters are proposed, each using a resistor and a grounded capacitor along with a multi-output dual-X second-generation current conveyor (MO-DXCCII). There is no element-matching restriction. Both circuits exhibit low input and high output impedance, a desirable feature for current-mode circuits. The proposed circuits are simulated in SPICE to confirm the theory.[...]
A novel approach for detecting texture orientation by computer is presented in this research work. Many complex real-time problems, for example detection of the size and shape of cancer cells, classification of brain image signals, classification of broken bone structures, detection and classification of remote-sensing images, identification of foreign particles in the universe, detection of material failure in construction design, and detection and classification of textures, particularly in fabrics, require edge detection together with both vertical and horizontal line detection. Researchers therefore need to develop different algorithms for these complex problems. It is seen from the literature that conventional algorithms such as the DCT and FFT carry a high computational load and are hence impractical to implement in hardware. These difficulties are addressed in this research work by applying the DWT and the Radon transform. The simulation results show that the algorithm takes very little CPU time, demonstrating its robustness.[...]
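The DWT stage of such an orientation detector can be illustrated with a one-level Haar split, whose two detail subbands respond to vertical and horizontal structure respectively. This is a minimal stand-in for the paper's DWT-plus-Radon pipeline; the function name and the energy-ratio decision are assumptions.

```python
import numpy as np

def haar_detail_energies(img):
    """One-level 2-D Haar decomposition (sketch of the DWT step).
    Returns (lh_energy, hl_energy): the LH subband (low-pass rows,
    high-pass columns) responds to vertical edges/stripes, while the
    HL subband (high-pass rows, low-pass columns) responds to
    horizontal ones; comparing the two energies indicates the
    dominant texture orientation."""
    s = img[0::2, :] + img[1::2, :]      # vertical low-pass (row-pair sums)
    d = img[0::2, :] - img[1::2, :]      # vertical high-pass (row-pair diffs)
    lh = s[:, 0::2] - s[:, 1::2]         # horizontal high-pass of the sums
    hl = d[:, 0::2] + d[:, 1::2]         # horizontal low-pass of the diffs
    return float(np.sum(lh ** 2)), float(np.sum(hl ** 2))
```

Because the transform is just paired additions and subtractions, it is far cheaper than a DCT or FFT of the same image, which is the hardware-friendliness argument the abstract makes.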
Scaling behavior is an indicator of the lack of a characteristic time scale and of the existence of long-range correlations related to the preservation of physiological constancy. To investigate the fluctuations of the sleep electroencephalogram (EEG) over various time scales during different sleep stages, detrended fluctuation analysis (DFA) is studied. The sleep EEG signals for analysis were obtained from the Sleep-EDF Database, available online at PhysioBank. The DFA computations were performed for different sleep stages, and the scaling behavior of these time series was investigated with window sizes from 50 to 500. The results show that the mean value of the scaling exponent was lower in stage 4, while its standard deviation in stage 4 was larger than in the other stages. In contrast, the mean scaling exponent in stage 2 was larger, with only a small variation observed at this stage. Therefore, DFA shows more stable behavior in stage 2, whereas random variability and unpredictable behavior of DFA are observed in stage 4. In conclusion, scaling exponent indices are efficacious in quantifying EEG signals in different sleep stages.[...]
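The DFA procedure, integrate the mean-removed signal, detrend each window with a least-squares line, and fit the log-log slope of the fluctuation function, can be sketched as follows; the window range matches the 50-500 range used in the abstract, while the function name and defaults are illustrative.

```python
import numpy as np

def dfa_exponent(signal, windows=range(50, 501, 50)):
    """Minimal DFA sketch: the scaling exponent alpha is the slope of
    log F(n) versus log n, where F(n) is the RMS residual of the
    integrated signal after linear detrending in windows of size n."""
    y = np.cumsum(signal - np.mean(signal))      # integrated profile
    F = []
    for n in windows:
        segs = len(y) // n
        sq = []
        for s in range(segs):
            seg = y[s * n:(s + 1) * n]
            t = np.arange(n)
            a, b = np.polyfit(t, seg, 1)         # local linear trend
            sq.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    alpha, _ = np.polyfit(np.log(list(windows)), np.log(F), 1)
    return alpha
```

For reference, uncorrelated white noise gives alpha near 0.5, 1/f noise near 1.0, and a random walk near 1.5, which is why the exponent discriminates between sleep stages with different correlation structure.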
Image fusion is one of the recent trends in image registration, which is an essential field of image processing. The basic principle of this paper is to fuse multi-focus images using the simple statistical standard deviation. Firstly, the standard deviation of each k × k window inside each of the multi-focus images is computed. The contribution of this paper comes from the idea that the focused part of an image has more detail than the unfocused part; hence, the dispersion between pixels inside the focused part is higher than inside the unfocused part. Secondly, a simple comparison is made between the standard deviations of corresponding k × k windows in the multi-focus images. The window with the highest standard deviation among all the computed standard deviations is treated as the optimal one to be placed in the fused image. The experimental visual results show that the proposed method produces very satisfactory results in spite of its simplicity.[...]
An iterative method of individual nameplate detection using color images acquired from a high position is proposed for the guidance of nighttime vehicles and other similar purposes. Segmentation is a very critical and difficult stage to accomplish in computer-aided detection systems. Fundamentally, the method consists of iterative automatic thresholding and selection of the best threshold value, which is applied to the original or enhanced dark night images. The main focus of the iteration-based threshold is to distinguish the background of the image from the foreground. The method was tested on actual outdoor vehicle images, and the results obtained from automatic thresholding of the experimental images show the validity of the method.[...]
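Iterative automatic thresholding of the kind described is commonly done in the Ridler-Calvard style sketched below: start from the global mean, split pixels into foreground and background, and move the threshold to the midpoint of the two class means until it stabilizes. The function name and tolerance are assumptions, and this is a generic stand-in rather than the paper's exact selection criterion.

```python
import numpy as np

def iterative_threshold(img, tol=0.5):
    """Sketch of iterative automatic thresholding: the threshold is
    repeatedly moved to the midpoint of the foreground and background
    mean intensities until it changes by less than tol, separating
    bright foreground (e.g. a lit nameplate) from the dark background."""
    t = img.mean()                       # initial guess: global mean
    while True:
        fg = img[img > t]
        bg = img[img <= t]
        if fg.size == 0 or bg.size == 0:
            return t                     # degenerate split: stop
        t_new = 0.5 * (fg.mean() + bg.mean())
        if abs(t_new - t) < tol:
            return t_new
        t = t_new
```

On a night image with a bright plate against a dark scene, the converged threshold falls between the two intensity clusters, which is exactly the background/foreground separation the abstract emphasizes.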
The subject of the research in this paper is the text on Web pages, with special emphasis on how the fonts chosen by the Web designer, along with their typographic features, are interpreted on the computers of various users. Users may have different operating systems, different browsers, and different preferences in their computer settings. Overall guidance on the choice of fonts and their characteristics when designing Web pages, together with some advice and opinions on the topic, is presented here. Several problems arising from the interpretation of text on Web pages on users' computers are then analyzed; a few solutions to these problems, along with recommendations on which solution to choose in which situation, are also given. The problem of the fonts chosen by the designer being absent from the user's computer is studied as well. The possibility for users to change the default font set by the designer on their computers, and to change its typographic features, is also analyzed. Finally, the incompatibility of different operating systems and Web browsers in rendering fonts is considered.[...]