What it does
ÆGIS autonomously explores disaster zones to locate victims, identify environmental threats such as toxic gas and extreme heat, and assess danger levels with a dual-layer AI. It improves safety, speeds up search-and-rescue efforts, and reduces the risk to human responders.
Your inspiration
The idea stemmed from news coverage of earthquakes and building collapses, where precious time was lost searching manually through unstable rubble. I was moved by the need for autonomous systems that could act faster and more safely than human searchers. I envisioned a robotic assistant capable of detecting victims, classifying danger levels, and reporting data in real time. Combining my passion for robotics and AI, I aimed to build a low-cost yet high-impact solution. The name "ÆGIS" reflects the protective, watchful nature of the system: a robotic shield to guard and assist those in need during emergencies.
How it works
ÆGIS (also written AEGIS) is powered by an ESP32-S3 microcontroller with onboard Wi-Fi for real-time communication. It integrates a thermal imaging sensor (MLX90640), an ESP32-S3 camera module, a gas sensor (MQ2), a temperature sensor (DHT11), and a time-of-flight laser-ranging sensor (VL53L0X). These collect data on heat signatures, distances, human posture, and environmental threats. YOLOv11 and MediaPipe process the visual data to detect victims and assess their condition, while a PyTorch-trained feedforward neural network computes a severity level. The robot operates in three modes (manual, autonomous, and "stay-with-person") and avoids obstacles using real-time sensor fusion. Data is sent to a Python GUI for monitoring, control, and logging. All components are housed in a 3D-printed shell with mecanum wheels for omnidirectional mobility.
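To make the severity computation concrete, below is a minimal sketch of the kind of feedforward network described above, written in PyTorch. The class name, layer sizes, feature order, and three-level output are assumptions for illustration rather than the project's actual trained model.

# Minimal sketch: a small feedforward network that maps fused sensor
# readings to a severity class (architecture and features are assumed).
import torch
import torch.nn as nn

class SeverityNet(nn.Module):  # hypothetical name
    def __init__(self, n_features: int = 5, n_classes: int = 3):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_features, 32),
            nn.ReLU(),
            nn.Linear(32, 16),
            nn.ReLU(),
            nn.Linear(16, n_classes),  # e.g. low / medium / high severity
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)

# Example input: [thermal max temp, ambient temp, gas reading, ToF distance, person-detected flag]
model = SeverityNet()
features = torch.tensor([[38.5, 31.0, 420.0, 0.8, 1.0]])
severity = model(features).softmax(dim=-1)  # class probabilities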
Design process
The project started with identifying gaps in existing disaster robots: poor autonomy and limited sensor fusion. I built an early prototype using an ESP32-CAM for basic vision tasks. As the system evolved, I upgraded to the ESP32-S3 for better processing power and sensor compatibility. The robot chassis was designed in Fusion 360 and 3D-printed to fit all components tightly, including fans for active cooling. A custom PCB was fabricated and soldered to minimise wiring complexity. I trained a neural network in PyTorch to process multi-sensor data and calculate a threat level. YOLOv11 and MediaPipe were implemented for visual object and posture detection, while a red-colour contrast check flags possible bleeding (sketched below). The system was tested in simulated disaster conditions using cluttered layouts, obstacle paths, and heat sources. The GUI was developed in Python using Tkinter for live monitoring, motor control, and threat visualisation. Several iterations improved detection reliability, battery life, and navigation smoothness under varying environmental constraints.
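As an illustration of the bleeding check mentioned above, the sketch below shows one common way to measure red-colour contrast with OpenCV. The HSV thresholds and the red-fraction cutoff are assumed values, not the project's tuned parameters.

# Illustrative red-colour check for possible bleeding (thresholds are assumptions).
import cv2
import numpy as np

def red_fraction(frame_bgr: np.ndarray) -> float:
    """Return the fraction of pixels that fall in the red hue range."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine the two ends of the range.
    lower = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    mask = cv2.bitwise_or(lower, upper)
    return float(np.count_nonzero(mask)) / mask.size

# Flag a frame as possible bleeding when the red fraction exceeds a tuned cutoff,
# e.g.  if red_fraction(frame) > 0.02: report_possible_bleeding()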
How it is different
Unlike many disaster robots that rely on either remote control or a single mode of detection, ÆGIS integrates thermal vision, ToF ranging, gas and temperature sensors, and computer vision into one cohesive system. It is capable of autonomous operation, threat scoring, and decision-making with minimal human intervention. Its unique “stay-with-person” mode halts the robot near a detected victim and keeps tracking them, allowing first responders to locate the victim faster. It also processes environmental data through a neural network for real-time danger classification, which few systems in its category offer. Most importantly, ÆGIS runs on affordable, low-power components, making it scalable for mass deployment in large-scale disasters where many zones must be scanned simultaneously.
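The “stay-with-person” behaviour can be pictured as a simple control loop like the one below. The helper names, standoff distance, and detection object are hypothetical stand-ins for the robot's real interfaces; this is a sketch of the idea, not the actual firmware.

# Hedged sketch of a "stay-with-person" control step (interfaces are hypothetical).
STOP_DISTANCE_M = 0.6   # assumed standoff distance from the victim
CENTER_TOLERANCE = 0.1  # fraction of frame width

def stay_with_person_step(detection, tof_distance_m, frame_width):
    """One control step: hold position near the victim and keep them in frame."""
    if detection is None:
        return "search"                       # no victim in view: resume scanning
    x_center = (detection.x_min + detection.x_max) / 2
    offset = (x_center / frame_width) - 0.5   # -0.5 .. 0.5, 0 means centred
    if abs(offset) > CENTER_TOLERANCE:
        return "rotate_left" if offset < 0 else "rotate_right"
    if tof_distance_m > STOP_DISTANCE_M:
        return "forward"                      # close the gap to the standoff distance
    return "hold"                             # stay put and keep reporting the victim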
Future plans
The next steps include expanding the robot’s terrain adaptability using SLAM with 2D mapping, enhancing accuracy with better AI training data, and field testing in simulated debris fields. I plan to upgrade the mechanical chassis for better mobility and integrate GPS for geolocation reporting. I also hope to collaborate with agencies or emergency response teams to deploy ÆGIS in real-world drills. Eventually, I envision a swarm of ÆGIS units working together for large-scale disaster zones, supported by a central dashboard or AR interface.
Awards
First runner-up in the MMU-Infineon Innovative Project Competition 2025 (Engineering Category) and showcased at the Faculty of Engineering and Technology. Received positive evaluations for innovation, real-world impact, and technical integration from faculty and industry judges.