Advanced object detection for smart accessibility: a YOLOv10 with marine predator algorithm to aid visually challenged people
Journal:
Scientific Reports
Published Date:
Jul 1, 2025
Abstract
A significant challenge for many visually impaired people is that they cannot be entirely independent and are restricted by their vision. Because they face difficulties with everyday actions, object detection is an essential capability they can rely on daily. Object detection is applied to locate real-world objects in an image, such as chairs, bicycles, tables, or doors, that commonly appear in the surroundings of a blind person, according to their positions in the scene. Computer vision (CV) involves the automated extraction, understanding, and analysis of valuable information from a single image or a sequence of images. Machine learning (ML) and deep learning (DL) are powerful learning architectures that are broadly established, especially for CV applications. This study proposes a novel Advanced Object Detection for Smart Accessibility using the Marine Predator Algorithm to aid visually challenged people (AODSA-MPAVCP) model. The main intention of the AODSA-MPAVCP model is to enhance object detection for visually challenged people using advanced models. Initially, the image pre-processing stage applies adaptive bilateral filtering (ABF) to remove unwanted noise from the input image data. The proposed AODSA-MPAVCP model then utilizes the YOLOv10 model for object detection. Next, the feature extraction process employs the VGG19 network to transform raw data into meaningful and informative features, and the deep belief network (DBN) technique is used for classification. Finally, marine predator algorithm (MPA)-based hyperparameter selection is performed to optimize the classification results of the DBN technique. The AODSA-MPAVCP approach is experimentally evaluated on the Indoor Object Detection dataset, where performance validation showed a superior accuracy of 99.63% over existing models.
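
For concreteness, below is a minimal Python sketch of the first three pipeline stages described in the abstract. All library choices (OpenCV, PyTorch/torchvision, Ultralytics), the plain bilateral filter standing in for ABF, the weight file "yolov10n.pt", and the input file "indoor_scene.jpg" are illustrative assumptions; the abstract does not specify the paper's exact implementations, and the DBN classifier and MPA hyperparameter search have no standard off-the-shelf library implementation, so they are only indicated in comments.

# Illustrative sketch of the AODSA-MPAVCP pipeline stages. Library and
# model choices here are assumptions for illustration, not the paper's code.
import cv2
import torch
from torchvision import models, transforms
from ultralytics import YOLO

def preprocess(image_bgr):
    # Stage 1: noise removal. OpenCV's plain bilateral filter stands in
    # for the adaptive bilateral filtering (ABF) described in the paper.
    return cv2.bilateralFilter(image_bgr, d=9, sigmaColor=75, sigmaSpace=75)

def detect_objects(image_bgr):
    # Stage 2: YOLOv10 object detection (Ultralytics weights assumed).
    detector = YOLO("yolov10n.pt")
    return detector(image_bgr)[0].boxes  # bounding boxes of detected objects

def extract_features(image_bgr):
    # Stage 3: VGG19 feature extraction on the filtered image.
    vgg = models.vgg19(weights=models.VGG19_Weights.DEFAULT).features.eval()
    to_tensor = transforms.Compose([
        transforms.ToTensor(),          # HxWxC uint8 -> CxHxW float in [0, 1]
        transforms.Resize((224, 224)),  # VGG19's expected input size
    ])
    x = to_tensor(cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB)).unsqueeze(0)
    with torch.no_grad():
        return vgg(x).flatten(1)  # feature vector for a downstream classifier

if __name__ == "__main__":
    img = preprocess(cv2.imread("indoor_scene.jpg"))  # hypothetical input
    boxes = detect_objects(img)
    feats = extract_features(img)
    # Stages 4-5 (DBN classification with MPA-tuned hyperparameters) are
    # omitted: no standard library ships a DBN or the MPA metaheuristic.
    print(len(boxes), feats.shape)

In a full implementation, the VGG19 feature vectors (e.g., computed per detected region) would feed the DBN classifier, and the MPA, a population-based metaheuristic, would search the DBN's hyperparameter space (learning rate, layer sizes, and the like) to maximize validation accuracy.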