IoT-FAR: A multi-sensor fusion approach for IoT-based firefighting activity recognition
Information Fusion, January 2025
Xiaoqing Chai, Boon Giin Lee, Chenhang Hu, Matthew Pike, David Chieng, Renjie Wu, Wan-Young Chung. 2025. IoT-FAR: A multi-sensor fusion approach for IoT-based firefighting activity recognition. Information Fusion. DOI: https://doi.org/10.1016/j.inffus.2024.102650
@article{info-fusion-2025,
title={IoT-FAR: A multi-sensor fusion approach for IoT-based firefighting activity recognition},
author={Xiaoqing Chai and Boon Giin Lee and Chenhang Hu and Matthew Pike and David Chieng and Renjie Wu and Wan-Young Chung},
journal={Information Fusion},
year={2025},
doi={10.1016/j.inffus.2024.102650}
}
Human activity recognition, Sensor fusion, Machine learning, Wearable computing, Internet of Things
Abstract
Inadequate training poses a significant risk of injury to young firefighters. Although Human Activity Recognition (HAR) algorithms have shown potential for monitoring and evaluating performance, most existing studies focus on daily activities and struggle to distinguish complex firefighting tasks. This study introduces the Internet of Things (IoT)-based wearable firefighting activity recognition (IoT-FAR) system, which employs a multi-modal sensor fusion approach to achieve comprehensive activity recognition during firefighting training. The IoT-FAR system comprises five wearable body sensor nodes and a coordinator node. The study examines the significance of features extracted from surface electromyography, heart rate, and inertial measurement units for recognizing firefighting training activities. A hybrid machine learning (HML)-based network is proposed that integrates three models: one trained on all features (MA), one on upper-body features (MU), and one on lower-body features (ML). The proposed HML-SVM-RBF1-RF2 network achieves superior performance, with a mean recall of 93.94%, a mean precision of 90.94%, and a mean accuracy of 98.29%. In addition, the study introduces the specialized firefighting training associated activities (SFTAA) dataset, comprising endurance training activities performed with a self-contained breathing apparatus (SCBA) by eighteen firefighters. This dataset represents preliminary work toward a comprehensive dataset covering varied events and scenarios for tracking firefighter activities. The IoT-FAR system also demonstrates that misclassified activities can serve as evaluation metrics for assessing firefighter training performance.
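The hybrid ML idea described in the abstract can be sketched as three classifiers trained on different feature subsets (all, upper-body, lower-body) whose predictions are fused. The abstract does not specify the fusion rule, the feature dimensions, or the exact model hyperparameters, so everything below (synthetic features, feature split, majority-vote fusion, model settings) is an illustrative assumption rather than the paper's HML-SVM-RBF1-RF2 configuration.

```python
# Hedged sketch of the HML approach: three classifiers on different
# feature subsets, fused by per-sample majority vote (assumed rule).
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for fused sEMG/heart-rate/IMU features: 12
# upper-body and 8 lower-body features per window (assumed dimensions).
X = rng.normal(size=(300, 20))
y = rng.integers(0, 4, size=300)          # 4 placeholder activity classes
upper, lower = slice(0, 12), slice(12, 20)

ma = SVC(kernel="rbf").fit(X, y)           # MA: all features (RBF SVM)
mu = SVC(kernel="rbf").fit(X[:, upper], y) # MU: upper-body features
ml = RandomForestClassifier(n_estimators=50, random_state=0).fit(
    X[:, lower], y)                        # ML: lower-body features

def hml_predict(X):
    """Fuse the three models' predictions by majority vote per sample."""
    votes = np.stack([ma.predict(X),
                      mu.predict(X[:, upper]),
                      ml.predict(X[:, lower])])
    fused = []
    for col in votes.T:
        vals, counts = np.unique(col, return_counts=True)
        # Three-way ties fall back to the all-feature model MA.
        fused.append(vals[np.argmax(counts)] if counts.max() > 1 else col[0])
    return np.array(fused)

preds = hml_predict(X)
```

In a real deployment the feature subsets would come from the sensor nodes worn on the upper and lower body, and the fusion rule would be whatever the paper's network actually uses; this sketch only shows the subset-then-fuse structure.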