
New FEMM journal paper published in the Journal of Manufacturing Systems

Image: an aircraft wing with multiple people carrying out work on it.

Many high-value manufacturing systems still require intensive manual activities that can expose workers to ergonomically awkward postures, which in turn can lead to musculoskeletal conditions. These conditions reduce productivity through lost working days.

In this work, researchers in the Department of Automatic Control and Systems Engineering at the University of Sheffield propose a cognitive architecture for wearable sensors (CAWES): a wearable sensor system and cognitive architecture that fuses data streams from multiple wearable sensors on a worker's body, enabling digitisation, tracking and analysis of human ergonomics in real time on the shop floor.
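The paper's full pipeline is not reproduced here, but the general idea of fusing wearable sensor streams can be illustrated with a short sketch. The Python example below uses a complementary filter to combine gyroscope and accelerometer readings from a single body-worn IMU into a trunk flexion angle; all names, fields and values are illustrative assumptions, not the CAWES implementation.

```python
import math
from dataclasses import dataclass


@dataclass
class ImuSample:
    """One reading from a body-worn IMU (illustrative fields only)."""
    accel_pitch_deg: float   # pitch angle derived from the accelerometer
    gyro_rate_dps: float     # angular rate about the pitch axis, deg/s
    dt_s: float              # time since the previous sample, seconds


def fuse_pitch(samples, alpha=0.98):
    """Complementary-filter fusion of gyroscope and accelerometer streams.

    The gyro integral tracks fast motion; the accelerometer corrects its
    slow drift. This is a generic fusion step, not the CAWES pipeline.
    """
    pitch = 0.0
    fused = []
    for s in samples:
        gyro_estimate = pitch + s.gyro_rate_dps * s.dt_s
        pitch = alpha * gyro_estimate + (1.0 - alpha) * s.accel_pitch_deg
        fused.append(pitch)
    return fused


if __name__ == "__main__":
    # Synthetic stream: the worker slowly bends forward to roughly 70 degrees.
    stream = [
        ImuSample(accel_pitch_deg=i * 0.7 + 0.5 * math.sin(i / 3.0),
                  gyro_rate_dps=7.0, dt_s=0.1)
        for i in range(100)
    ]
    trunk_pitch = fuse_pitch(stream)
    print(f"final fused trunk flexion: {trunk_pitch[-1]:.1f} degrees")
```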

Furthermore, through tactile feedback, the architecture can inform workers in real time when ergonomic rules are broken. The architecture is validated on an aerospace case study carried out under laboratory conditions.
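As a rough illustration of this real-time feedback loop, the sketch below checks a fused posture angle against an ergonomic threshold and triggers a simulated vibrotactile pulse when the rule stays broken for a sustained period. The threshold and timing values are loosely inspired by RULA-style trunk posture bands and are illustrative assumptions only, not the rules used in the paper.

```python
# Illustrative posture rule; the rules actually used in the paper are not
# reproduced here.
TRUNK_FLEXION_LIMIT_DEG = 60.0
MAX_SUSTAINED_SAMPLES = 20      # roughly 2 s at a 10 Hz sample rate


class HapticMotor:
    """Stand-in for a vibrotactile actuator worn on the body."""

    def pulse(self) -> None:
        print("BUZZ: ergonomic rule broken, adjust posture")


def monitor(trunk_flexion_stream, motor: HapticMotor) -> None:
    """Fire tactile feedback when a posture rule stays broken too long."""
    violated_for = 0
    for angle_deg in trunk_flexion_stream:
        if angle_deg > TRUNK_FLEXION_LIMIT_DEG:
            violated_for += 1
            if violated_for == MAX_SUSTAINED_SAMPLES:
                motor.pulse()       # alert once per sustained violation
        else:
            violated_for = 0        # posture recovered, reset the counter


if __name__ == "__main__":
    # Any angle stream works here, e.g. the fused angles from the sketch above.
    angles = [i * 0.7 for i in range(120)]   # worker bends past 60 degrees
    monitor(angles, HapticMotor())
```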

Diagram: the cognitive architecture for wearable sensors (CAWES), which applies the ACT-R cognitive architecture to build a data fusion pipeline for data streams from multiple wearable sensors. The numbers beside each module refer to the sections of the paper that describe them in detail.

Reference: Oyekan, J., Chen, Y., Turner, C. and Tiwari, A. (2021) Applying a Fusion of Wearable Sensors and a Cognitive Inspired Architecture to Real-time Ergonomics Analysis of Manual Assembly Tasks, Journal of Manufacturing Systems, Vol. 61, pp. 391–405. DOI: 10.1016/j.jmsy.2021.09.015