Intelligent Autonomous Robotic Car for Real-Time Disaster Area Analysis and Navigation

by E.A. Wanigasekara, Y.A.A. Kumarayapa

Published: January 17, 2026 • DOI: 10.51584/IJRIAS.2025.10120082

Abstract

Efficient victim detection and reliable navigation remain major challenges for robotic search-and-rescue operations in disaster-affected regions. This research describes the design and implementation of an AI-driven autonomous robot car capable of making real-time decisions in complex, hazardous environments. The proposed system employs a sensor-fusion approach that combines visual human detection using YOLOv5, thermal-based classification through a convolutional neural network, and audio-based human-voice detection. These AI modules are supported by additional sensors, including ultrasonic sensors, an INMP441 microphone, an MPU6050 inertial measurement unit, and gas sensors (MQ2 and MQ135), all coordinated by a Raspberry Pi 3B+, an ESP32, and ESP32-CAM modules. Precise localization and remote communication are achieved with a NEO-6M GPS receiver and a SIM800L GSM module. A web-based monitoring platform displays real-time sensor readings, survivor locations, and environmental hazard warnings at a base station. The system is validated with a physical prototype designed for low cost, rapid deployment, and ease of use. Experimental observations indicate that the robot can autonomously navigate, identify potential survivors, and transmit critical information, highlighting its suitability for disaster-response applications.
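To make the multi-modal idea concrete, the fusion step could be sketched as a weighted vote over the three detector outputs (visual, thermal, audio). The abstract does not specify the fusion rule, so the weights, threshold, and function names below are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of the sensor-fusion decision described in the
# abstract: YOLOv5 visual detection, CNN thermal classification, and
# audio voice detection are combined into one survivor-present decision.
# All weights and thresholds here are assumed for illustration only.
from dataclasses import dataclass

@dataclass
class ModalityReading:
    confidence: float  # detector confidence in [0, 1]
    active: bool       # whether the sensor produced a valid reading

def fuse_detections(visual: ModalityReading,
                    thermal: ModalityReading,
                    audio: ModalityReading,
                    threshold: float = 0.5) -> bool:
    """Weighted average over the modalities that reported valid data."""
    weights = {"visual": 0.5, "thermal": 0.3, "audio": 0.2}  # assumed
    readings = {"visual": visual, "thermal": thermal, "audio": audio}
    score = total_weight = 0.0
    for name, reading in readings.items():
        if reading.active:
            score += weights[name] * reading.confidence
            total_weight += weights[name]
    # Normalize by the weight of the modalities that actually reported,
    # so a dead sensor does not suppress a strong detection elsewhere.
    return total_weight > 0 and (score / total_weight) >= threshold

# Example: strong visual and thermal agreement flags a survivor even
# when the microphone returns no usable signal.
hit = fuse_detections(ModalityReading(0.9, True),
                      ModalityReading(0.8, True),
                      ModalityReading(0.0, False))
```

A weighted-average rule of this kind is one simple way to let corroborating modalities reinforce each other while tolerating a failed sensor; the actual system may use a different combination strategy.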