Workplace: Department of Electronics and Communication, University of Allahabad, Allahabad, 211002, India
Research Interests: Randomized Algorithms, Analysis of Algorithms, Data Structures and Algorithms, Medical Image Computing, Image Manipulation, Computer systems and computational processes, Medical Informatics
Richa Srivastava received the M.Sc. degree in computer science from the University of Allahabad, Allahabad, India, in 2005. Since 2010, she has been involved in the development of image fusion algorithms for medical, remote sensing and multifocus images. Presently she is conducting research at the University of Allahabad on image fusion using new-generation wavelet transforms, including the complex wavelet transform, curvelet transform, contourlet transform and shearlet transform.
DOI: https://doi.org/10.5815/ijigsp.2016.10.08, Pub. Date: 8 Oct. 2016
Image fusion is a popular application of image processing that merges two or more images into one. The merged image has improved visual quality and carries more information content. The present work introduces a new image fusion method in the complex wavelet domain. The proposed fusion rule is based on a level-dependent threshold, where the absolute difference of a wavelet coefficient from the threshold value is taken as the fusion criterion. This absolute difference represents variation in image intensity, which corresponds to the salient features of the image. Hence, for fusion, the coefficients that are farthest from the threshold value are selected. The motivation for using the dual-tree complex wavelet transform is the failure of real-valued wavelet transforms in several respects. Good directional selectivity, the availability of phase information and the approximately shift-invariant nature of the dual-tree complex wavelet transform make it suitable for image fusion and help produce a high-quality fused image. To demonstrate the strength of the proposed method, it has been compared with several spatial, pyramidal, wavelet and new-generation wavelet based fusion methods. The experimental results show that the proposed method outperforms the other state-of-the-art methods visually as well as in terms of standard deviation, mutual information, edge strength, fusion factor, sharpness and average gradient.
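The selection rule described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the function name, the use of coefficient magnitudes (DT-CWT coefficients are complex-valued), and the assumption that the level-dependent threshold is supplied externally are all assumptions made here for demonstration.

```python
import numpy as np

def fuse_subband(c1, c2, threshold):
    """Fuse one wavelet subband from two source images.

    For each position, keep the coefficient whose absolute
    difference from the (level-dependent) threshold is larger,
    i.e. the coefficient farther from the threshold, which the
    abstract treats as carrying the more salient feature.
    How the threshold itself is computed is not shown here.
    """
    d1 = np.abs(np.abs(c1) - threshold)  # distance of |coef| from threshold
    d2 = np.abs(np.abs(c2) - threshold)
    return np.where(d1 >= d2, c1, c2)    # pick the farther coefficient

# Tiny usage example with hypothetical subband values
a = np.array([0.1, 5.0])
b = np.array([3.0, 0.2])
fused = fuse_subband(a, b, threshold=1.0)
```

In a full pipeline this rule would be applied subband-by-subband to the dual-tree complex wavelet decompositions of both source images, after which the inverse transform yields the fused image.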