Automated recognition of worker activity using machine learning and multi-sensor fusion

Abstract

Activity recognition and worker safety are critical aspects of future industrial environments. Ongoing transitions toward large-scale automation and workforce reductions could lead to more focused monitoring of the remaining workers. Effective recognition often relies on several sensors, such as electromyography leads, accelerometers, magnetometers, or gyroscopes, with each modality affecting system cost. This study focuses on automatic worker activity recognition using a multilayer perceptron and multi-sensor fusion. We present a testing methodology and analyze the influence of individual sensors on system accuracy. The experiment aims to systematically quantify the trade-off between the type and number of sensors used and the resulting accuracy, providing a quantified basis for designing future systems with an optimal price/performance ratio. The presented results demonstrate that different sensor fusion strategies lead to vastly different accuracies, ranging from 71 % to over 99 %. These findings highlight the crucial role of optimal sensor selection in the design of cost-effective worker monitoring systems. The study showed that combining the accelerometer, magnetometer, gyroscope, and electromyography yielded the highest test accuracy of 98.04 %. Among these, the gyroscope and electromyography contributed less significantly: when only the accelerometer, magnetometer, and gyroscope were used, the accuracy dropped by approximately 6 %.

Description

Delayed publication

Available after

Subject(s)

activity recognition, artificial neural network, classification, Industry 4.0, machine learning, wearable sensors

Citation

Applied Soft Computing. 2026, vol. 189, art. no. 114473.