Activity recognition
Activities of daily living
Computer science
Transformer
Wearable computer
Artificial intelligence
Machine learning
Assisted living
Embedded system
Engineering
Medicine
Psychology
Nursing
Voltage
Psychiatry
Electrical engineering
Authors
Gabriela Augustinov,Muhammad Adeel Nisar,Frédéric Li,Amir Tabatabaei,Marcin Grzegorzek,Keywan Sohrabi,Sebastian Fudickar
Identifier
DOI:10.1145/3558884.3558895
Abstract
Smart support systems for the recognition of Activities of Daily Living (ADLs) can help elderly people live independently for longer, improving their standard of living. Many machine learning approaches have been proposed for Human Activity Recognition (HAR), including elaborate networks that combine convolutional, recurrent, and attention layers. The ubiquity of wearable devices provides an increasing amount of time-series data that can be used for such applications in an unobtrusive manner. However, few studies have examined the performance of the attention-based Transformer model in HAR, especially for complex activities such as ADLs. This work implements and evaluates a self-attention Transformer model for the classification of ADLs and compares it to the well-established approach of recurrent Long Short-Term Memory (LSTM) networks. The proposed method is a two-level hierarchical model: atomic activities are recognized in the first step, and their probability scores are extracted and used for the Transformer-based classification of seven more complex ADLs in the second step. Our results show that the Transformer model matches and even clearly outperforms LSTM networks in the subject-dependent configuration (73.36 % vs. 69.09 %), while relying only on the attention mechanism to model global dependencies between input and output, without any recurrence. The proposed model was tested with two different segment lengths, indicating its effectiveness in learning long-range dependencies of shorter actions within complex activities.
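The abstract's central claim is that attention alone, without recurrence, can model global dependencies across a sequence of first-stage atomic-activity scores. The following minimal NumPy sketch illustrates scaled dot-product self-attention, the core mechanism of the Transformer; all dimensions and weight matrices are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape
    (seq_len, d_in). Every output step attends to every input step,
    so long-range dependencies are captured without any recurrence."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (seq_len, seq_len) attention logits
    return softmax(scores) @ V               # attention-weighted mixture of values

rng = np.random.default_rng(0)
# Hypothetical setup: 30 time windows, each holding 10 atomic-activity
# probability scores from a first-stage recognizer.
seq_len, d_in, d_k = 30, 10, 16
X = rng.random((seq_len, d_in))
Wq, Wk, Wv = (rng.random((d_in, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (30, 16)
```

In the paper's two-level design, such attention layers sit inside a Transformer encoder whose pooled output is fed to a classifier over the seven ADL classes; the sketch above shows only why each output position can depend on every input position in a single step.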