Work place: Dr Vishwanath Karad MIT World Peace University, Pune, India
E-mail: vaidehi.deshmukh@mitwpu.edu.in
Website: https://orcid.org/0000-0002-6664-8593
Research Interests:
Biography
Vaidehi Deshmukh, a co-author of this article, is an experienced Electronics and Communication Engineering faculty member who teaches subjects such as Machine Learning, Python, and Java programming. Her doctoral research focused on image fusion and the development of an image-processing-based fusion algorithm, and she has published a book on image fusion. She has handled research projects in image fusion, emotion detection using convolutional neural networks, disease detection, and related areas. She is proficient in MS Excel, Python programming, and MATLAB, and has collaborated with industry to help solve applied problems. She recently completed a certificate course in Data Science.
By Abhinav Chandra, Anuradha Chetan Phadke, Vaidehi Deshmukh
DOI: https://doi.org/10.5815/ijigsp.2026.02.02, Pub. Date: 8 Apr. 2026
Satellite imagery is widely used to study spatial geographies, identifying water bodies, residential areas, farmland, and forest land; such maps support township development and planning, landscape detection, and similar applications. Semantic segmentation and image classification are the two crucial procedures for determining these spatial geographies. To improve the generalization ability of semantic segmentation algorithms, this paper uses a combined UNet-ResNet model. The engineered model is a convolutional neural network, augmented with GeoGANs, that detects small semantic patches with regional characteristics within a given spatial and pixel scale. However, it still faces the semantic segmentation challenge of identifying roadways in metropolitan areas. The model achieves accuracy scores from 93% to 97.3% for image classification and segmentation, outperforming several existing architectures.
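The abstract's accuracy figures refer to pixel-wise evaluation of segmentation masks. As a minimal sketch of how such scores are typically computed (the class names, array shapes, and metric choices here are illustrative assumptions, not the paper's exact evaluation protocol):

```python
import numpy as np

# Illustrative land-cover classes (assumption; the paper's exact label set is not stated)
CLASSES = ["water", "residential", "farmland", "forest"]

def pixel_accuracy(pred, target):
    """Fraction of pixels whose predicted class matches the ground truth."""
    return float((pred == target).mean())

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union, averaged over classes present in either mask."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:  # skip classes absent from both masks
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy 4x4 masks with two classes; one pixel is misclassified
target = np.array([[0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [0, 0, 1, 1],
                   [0, 0, 1, 1]])
pred = target.copy()
pred[0, 0] = 1

print(pixel_accuracy(pred, target))   # 15 of 16 pixels correct -> 0.9375
print(mean_iou(pred, target, len(CLASSES)))
```

Pixel accuracy alone can be misleading when classes are imbalanced (e.g. a thin road network in a large scene), which is why mean IoU is often reported alongside it for segmentation tasks.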