Adaptive Superpixel Generation and Segmentation in Large-Scale Urban Imagery Using Deep Learning

Authors

  • Ms J Kavitha
  • YEDURUPAKALA REVANTH KUMAR
  • SURADA DINESH KUMAR
  • PALLA KISHAN SAI
  • PALAROUTHU SRINIVAS
  • THURANGI MANIKANTA

DOI:

https://doi.org/10.62643/ijerst.2026.v22.n1(2).2089

Abstract

Precise image segmentation and depth estimation in large-scale urban settings remain challenging for computer vision systems, largely because of uniform, homogeneous, and textureless planar objects such as roads, walls, and building facades. Gradient-based and photometric methods often produce over-segmented results and broken depth maps under such conditions. In this paper, we propose a hybrid scheme that combines the Simple Linear Iterative Clustering (SLIC) superpixel algorithm with an Artificial Neural Network (ANN) to make urban image segmentation adaptive and learning-based. The system produces small, edge-constrained superpixels, derives a multi-dimensional feature space consisting of color statistics, texture descriptors, edge features, and spatial information, and uses a trained ANN to classify each region as either planar or non-planar. Experimental results indicate that the framework attains a training accuracy of about 85 percent with decreasing loss per epoch, producing smoother depth maps and significantly reducing over-segmentation in textureless urban areas. The modular pipeline is written in Python using OpenCV, scikit-image, and TensorFlow/Keras; it is scalable and can easily be integrated into applications such as 3D city reconstruction, autonomous vehicle navigation, and smart city infrastructure monitoring.


Published

22-03-2026

How to Cite

Adaptive Superpixel Generation and Segmentation in Large-Scale Urban Imagery Using Deep Learning. (2026). International Journal of Engineering Research and Science & Technology, 22(1(2)), 89-94. https://doi.org/10.62643/ijerst.2026.v22.n1(2).2089