Application of Sparse Coded SIFT Features for Classification of Plant Images



Suchit Purohit 1,*, Savita R. Gandhi 1

1. Department of Computer Science, Gujarat University, Ahmedabad, India

* Corresponding author.


Received: 26 May 2017 / Revised: 4 Jun. 2017 / Accepted: 15 Jun. 2017 / Published: 8 Oct. 2017

Index Terms

SIFT, Sparse Coding, Plant Species, Content-based Retrieval, Spatial Pyramid Matching, HSV Color Space, Texture Feature Extraction


An automated system for plant species recognition is needed today, since manual taxonomy is cumbersome, tedious, time consuming, expensive, and suffers from perceptual bias as well as the taxonomic impediment. The availability of digitized databases of high-resolution plant images annotated with metadata such as date, time, and latitude-longitude information has increased interest in the development of automated systems for plant taxonomy. Most existing approaches work only on a particular organ of the plant, such as the leaf, bark or flower, and utilize only the contextual information stored in the image, which is time dependent, whereas the other metadata associated with the image should also be considered. Motivated by the need for automated plant species recognition and the availability of digital plant databases, we propose an image-based identification of plant species where the image may belong to different plant parts such as the leaf, stem, flower, fruit, scanned leaf, branch, or the entire plant. Besides image content, our system also uses the metadata associated with images, such as latitude, longitude and date of capture, to ease the identification process and obtain more accurate results. For a given plant image and its associated metadata, the system recognizes the species and produces an output containing the family, genus, and species name. Different recognition methods are used according to the part of the plant to which the image belongs. For the flower category, a fusion of shape, color and texture features is used. For the other categories (stem, fruit, leaf and leaf scan), sparsely coded SIFT features pooled with the spatial pyramid matching approach are used. The proposed framework is implemented and tested on ImageCLEF data with 50 different species classes. A maximum accuracy of 98% is attained in the leaf-scan sub-category, whereas the minimum accuracy, 67.3%, is achieved in the fruit sub-category.
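The sparse-coded SIFT pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dictionary size, sparsity level, and pyramid depth are assumed values, and random vectors stand in for real SIFT descriptors (which would normally come from a detector such as OpenCV's `cv2.SIFT_create()`). Each descriptor is sparsely encoded over a learned codebook, and the codes are max-pooled over a 1x1, 2x2 and 4x4 spatial pyramid to form one image-level feature vector.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)

# Stand-in for SIFT output: 500 keypoints with (x, y) positions in a
# 256x256 image and 128-d descriptors (real descriptors would come
# from a SIFT detector; these are random for illustration only).
n_kp, img_w, img_h = 500, 256, 256
xy = rng.uniform(0, [img_w, img_h], size=(n_kp, 2))
desc = rng.normal(size=(n_kp, 128))

# Step 1: learn a codebook (dictionary) and sparsely encode each
# descriptor over it.  K = 64 atoms and 5 nonzero coefficients per
# code are assumed parameters, chosen small so the sketch runs fast.
K = 64
dico = MiniBatchDictionaryLearning(n_components=K,
                                   transform_algorithm="omp",
                                   transform_n_nonzero_coefs=5,
                                   random_state=0)
codes = dico.fit(desc).transform(desc)   # shape (n_kp, K), sparse rows

# Step 2: spatial pyramid max pooling.  For each grid level, assign
# every keypoint to a cell and keep the per-atom maximum absolute
# code within that cell; concatenating all cells gives the feature.
pooled = []
for grid in (1, 2, 4):
    cx = np.minimum((xy[:, 0] * grid / img_w).astype(int), grid - 1)
    cy = np.minimum((xy[:, 1] * grid / img_h).astype(int), grid - 1)
    cell = cy * grid + cx
    for c in range(grid * grid):
        mask = cell == c
        pooled.append(np.abs(codes[mask]).max(axis=0) if mask.any()
                      else np.zeros(K))
feature = np.concatenate(pooled)  # length K * (1 + 4 + 16) = 1344

print(feature.shape)
```

The pooled vector would then feed a standard classifier (e.g. a linear SVM) trained per plant-part category.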

Cite This Paper

Suchit Purohit, Savita R. Gandhi, "Application of Sparse Coded SIFT Features for Classification of Plant Images", International Journal of Image, Graphics and Signal Processing (IJIGSP), Vol. 9, No. 10, pp. 50-59, 2017. DOI: 10.5815/ijigsp.2017.10.06

