Automated recognition of worker activity using machine learning and multi-sensor fusion

dc.contributor.author: Jaroš, René
dc.contributor.author: Majidzadeh Gorjani, Ojan
dc.contributor.author: Barnová, Kateřina
dc.contributor.author: Lajdolf, Martin
dc.contributor.author: Martinek, Radek
dc.contributor.author: Bilík, Petr
dc.contributor.author: Danys, Lukáš
dc.date.accessioned: 2026-04-27T06:40:13Z
dc.date.available: 2026-04-27T06:40:13Z
dc.date.issued: 2026
dc.description.abstract: Activity recognition and worker safety are critical parts of future industrial environments. The ongoing transition to large-scale automation and workforce reduction could lead to more focused monitoring of the remaining workers. Effective recognition often relies on a number of sensors, such as electromyography leads, accelerometers, magnetometers, or gyroscopes, with each sensor potentially adding to system cost. This study focuses on automatic worker activity recognition using a multilayer perceptron and multi-sensor fusion. We present a testing methodology and analyze the influence of individual sensors on system accuracy. The experiment therefore aims to systematically quantify the trade-off between the type and number of sensors used and the resulting accuracy, with the goal of providing a quantified basis for designing future systems with an optimal price/performance ratio. The presented results demonstrate that different sensor fusion strategies lead to vastly different accuracies, ranging from 71 % to over 99 %. These early findings highlight the crucial role of optimal sensor selection in the design of cost-effective worker monitoring systems. The study demonstrated that the combination of accelerometer, magnetometer, gyroscope, and electromyography resulted in the highest test accuracy of 98.04 %. Among these, the gyroscope and electromyography contributed less significantly. When only the accelerometer, magnetometer, and gyroscope were used, the accuracy dropped by approximately 6 %.
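The feature-level sensor fusion described in the abstract can be illustrated with a minimal sketch: features from several sensor channels are concatenated into one vector before classification. This is an illustrative assumption only, not the authors' actual pipeline — the sensor names, features, and data below are invented, and a toy nearest-centroid rule stands in for the paper's multilayer perceptron.

```python
import math
from statistics import mean, stdev

def window_features(samples):
    """Simple statistical features (mean, std) of one sensor window."""
    return [mean(samples), stdev(samples)]

def fuse(windows_by_sensor, sensors):
    """Feature-level fusion: concatenate per-sensor features into one vector."""
    vec = []
    for s in sensors:
        vec.extend(window_features(windows_by_sensor[s]))
    return vec

def nearest_centroid(train, labels, query):
    """Toy classifier standing in for the paper's multilayer perceptron."""
    groups = {}
    for vec, lab in zip(train, labels):
        groups.setdefault(lab, []).append(vec)
    best, best_d = None, float("inf")
    for lab, vecs in groups.items():
        centroid = [mean(col) for col in zip(*vecs)]
        d = math.dist(centroid, query)
        if d < best_d:
            best, best_d = lab, d
    return best

# Two hypothetical activities, two hypothetical channels (accelerometer, EMG).
train_windows = [
    {"acc": [0.1, 0.2, 0.1], "emg": [5.0, 5.5, 5.2]},    # "idle"
    {"acc": [1.9, 2.1, 2.0], "emg": [40.0, 42.0, 41.0]}, # "lifting"
]
labels = ["idle", "lifting"]
sensors = ["acc", "emg"]
train = [fuse(w, sensors) for w in train_windows]

query = fuse({"acc": [2.0, 1.8, 2.1], "emg": [39.0, 41.5, 40.0]}, sensors)
print(nearest_centroid(train, labels, query))  # -> lifting
```

Dropping a channel from the `sensors` list shrinks the fused vector, which mirrors the study's question of how accuracy degrades as sensors are removed.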
dc.description.firstpage: art. no. 114473
dc.description.source: Web of Science
dc.description.volume: 189
dc.identifier.citation: Applied Soft Computing. 2026, vol. 189, art. no. 114473.
dc.identifier.doi: 10.1016/j.asoc.2025.114473
dc.identifier.issn: 1568-4946
dc.identifier.issn: 1872-9681
dc.identifier.uri: http://hdl.handle.net/10084/158490
dc.identifier.wos: 001656079200001
dc.language.iso: en
dc.publisher: Elsevier
dc.relation.ispartofseries: Applied Soft Computing
dc.relation.uri: https://doi.org/10.1016/j.asoc.2025.114473
dc.rights: © 2025 The Authors. Published by Elsevier B.V.
dc.rights.access: openAccess
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: activity recognition
dc.subject: artificial neural network
dc.subject: classification
dc.subject: Industry 4.0
dc.subject: machine learning
dc.subject: wearable sensors
dc.title: Automated recognition of worker activity using machine learning and multi-sensor fusion
dc.type: article
dc.type.status: Peer-reviewed
dc.type.version: publishedVersion
local.files.count: 1
local.files.size: 2246340
local.has.files: yes

Files

Original bundle

Name: 1568-4946-2026v189an114473.pdf
Size: 2.14 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 718 B
Description: Item-specific license agreed to upon submission