Wearable-based SLAM with Sensor Fusion in Firefighting Operations
Renjie Wu,
Boon Giin Lee,
Matthew Pike,
Liang Huang,
Wan-Young Chung,
Gen Xu
Intelligent Human Computer Interaction | 2024
Abstract
In indoor fire rescue scenarios characterized by heavy smoke and dust, conventional cameras struggle to capture high-quality images, and frames with so little visual information are insufficient for SLAM (Simultaneous Localization and Mapping) systems to remain accurate. This research introduces a wearable SLAM system integrated into a firefighter protective boot. The system combines Pedestrian Dead Reckoning (PDR) and ultrasound sensing to autonomously generate the user’s trajectory and a map of the building’s internal structure. The ultrasound module is mounted on the outer side of the calf, where it scans the surrounding boundaries, while a 9-axis inertial measurement unit placed atop the forefoot detects walking motions and computes successive step positions to form the trajectory. A Map Point Calculation (MPC) algorithm then combines the ultrasound range data with the computed trajectory to construct the map model. To validate the system’s performance, experiments were conducted in a smoke-filled environment set up by firefighters at the local fire station. The results demonstrate that the system provides accurate trajectory estimates and generates precise map points.
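For illustration only, not the authors' implementation: the sketch below shows how a PDR trajectory and ultrasound ranges could be fused into map points in the spirit of the abstract. The fixed step length, the heading source, and the assumption that the ultrasound beam points perpendicular to the walking direction are all placeholders; the paper's actual step-length model, MPC algorithm, and sensor calibration are not described here.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # metres, east
    y: float        # metres, north
    heading: float  # radians, walking direction

def pdr_trajectory(step_headings, step_length=0.7, start=(0.0, 0.0)):
    """Accumulate 2D step positions from detected steps.

    `step_headings` holds one heading (radians) per detected step, e.g.
    from the 9-axis IMU on the forefoot. The constant step length is an
    assumption made only for this sketch.
    """
    x, y = start
    trajectory = []
    for heading in step_headings:
        x += step_length * math.cos(heading)
        y += step_length * math.sin(heading)
        trajectory.append(Pose(x, y, heading))
    return trajectory

def map_points(trajectory, ranges, mount_angle=math.pi / 2):
    """Project ultrasound ranges into world coordinates (MPC-style sketch).

    Each range is assumed to be measured perpendicular to the walking
    direction (ultrasound module on the outer side of the calf);
    `mount_angle` encodes that assumption and is not taken from the paper.
    """
    points = []
    for pose, r in zip(trajectory, ranges):
        if r is None:  # no echo / out of range
            continue
        beam = pose.heading + mount_angle
        points.append((pose.x + r * math.cos(beam),
                       pose.y + r * math.sin(beam)))
    return points

if __name__ == "__main__":
    headings = [0.0, 0.0, math.pi / 2, math.pi / 2]  # two steps east, two north
    ranges = [1.2, 1.1, None, 0.9]                   # metres to the nearest boundary
    traj = pdr_trajectory(headings)
    print(map_points(traj, ranges))
```

In a real system the per-step headings would come from orientation estimation on the 9-axis IMU and the ranges from the calf-mounted ultrasound module; the projected points would then be accumulated into the structural map.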