Object tracking via a Novel Parametric Decisions based RGB-Thermal Fusion



Satbir Singh 1,*, Arun Khosla 1, Rajiv Kapoor 2

1. Dr B R Ambedkar National Institute of Technology/ Centre for Artificial Intelligence, Jalandhar, 144088, India

2. Delhi Technological University, Delhi, 110042, India

* Corresponding author.

DOI: https://doi.org/10.5815/ijigsp.2023.04.01

Received: 8 Sep. 2022 / Revised: 2 Nov. 2022 / Accepted: 25 Jan. 2023 / Published: 8 Aug. 2023

Index Terms

Particle filter, Object tracking, Decision-level fusion, Visible-thermal amalgamation.


Thermo-visual fusion based tracking has been deployed to overcome the shortcomings of vision-only object tracking. The assistance from both domains should be merged wisely so that it results in a useful practice for object tracking. Several techniques have been developed recently to implement an effective fusion, but this young field still presents many unsolved challenges. The proposed method aims to increase the effectiveness of tracking through bi-modal fusion, introducing a new set of rules based upon parameters generated from the decisions of the individual modality trackers. This practice achieves the output with only a single run of the fusion process in every frame. The method also uses minimal information from the individual trackers under normal conditions and incorporates supplementary information from the imagery only under adverse working conditions. This procedure, in turn, lessens the computation and hence reduces processing time. Experiments performed on well-known publicly available datasets show the advantages of the proposed method over individual visual-domain tracking and other existing state-of-the-art fusion techniques.
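The decision-level scheme outlined above can be sketched in code. This is only an illustrative sketch of the general idea, not the paper's actual parametric rules: the confidence scores, the agreement threshold, and the weighted-average fallback are all assumptions introduced here to show how two single-modality decisions might be combined in one pass per frame.

```python
# Illustrative sketch of decision-level RGB-thermal fusion (NOT the
# paper's exact rule set): each modality tracker reports a bounding-box
# estimate plus a confidence, and a simple parametric rule either blends
# the two decisions or falls back to the stronger one.

from dataclasses import dataclass


@dataclass
class TrackerDecision:
    box: tuple          # (x, y, w, h) estimate from one modality
    confidence: float   # similarity score in [0, 1] (assumed scale)


def fuse_decisions(rgb: TrackerDecision, thermal: TrackerDecision,
                   agree_thresh: float = 0.3) -> tuple:
    """Fuse two single-modality decisions in a single pass per frame."""
    # Normal conditions: both trackers are confident, so use only their
    # minimal decision parameters (a confidence-weighted average).
    if min(rgb.confidence, thermal.confidence) >= agree_thresh:
        total = rgb.confidence + thermal.confidence
        return tuple(
            (rgb.confidence * r + thermal.confidence * t) / total
            for r, t in zip(rgb.box, thermal.box)
        )
    # Adverse conditions: one modality has degraded (e.g. darkness for
    # RGB, thermal crossover for IR); rely on the stronger decision and
    # draw supplementary cues only on this branch.
    return rgb.box if rgb.confidence > thermal.confidence else thermal.box


# Example: confident RGB, weak thermal -> fused output follows RGB.
rgb = TrackerDecision((100, 50, 40, 80), 0.9)
ir = TrackerDecision((104, 52, 40, 80), 0.1)
print(fuse_decisions(rgb, ir))
```

Because the rule inspects only the two decisions (boxes and confidences), the fusion runs once per frame regardless of how many particles or features each underlying tracker uses, which is the computational saving the abstract claims.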

Cite This Paper

Satbir Singh, Arun Khosla, Rajiv Kapoor, "Object tracking via a Novel Parametric Decisions based RGB-Thermal Fusion", International Journal of Image, Graphics and Signal Processing (IJIGSP), Vol.15, No.4, pp. 1-18, 2023. DOI: 10.5815/ijigsp.2023.04.01

