Human daily short-time activity recognition method driven by single sensor data (单传感器数据驱动的人体日常短时行为识别方法)
  • English title: Human daily short-time activity recognition method driven by single sensor data
  • Authors: Su Benyue; Zheng Dandan; Tang Qingfeng; Sheng Min
  • Affiliations: School of Computer and Information, Anqing Normal University; The Key Laboratory of Intelligent Perception and Computing of Anhui Province; Medical College, Hangzhou Normal University
  • Keywords: short-time activity; template matching; activity template library; similarity histogram; single sensor
  • Journal: Infrared and Laser Engineering (红外与激光工程)
  • Publication date: 2019-02-21
  • Year: 2019
  • Issue: 2019(02) (v.48; No.292)
  • Funding: National Natural Science Foundation of China (61603003, 11471093); Ministry of Education "Cloud-Data Fusion for Science and Education Innovation" Fund (2017A09116); Anhui Provincial Program for Cultivating Outstanding Top-notch Talents in Universities (gxbjZD26)
  • Language: Chinese
  • Pages: 282-290 (9 pages)
  • CN: 12-1261/TN
  • CNKI article ID: HWYJ201902040
Abstract
In inertial-sensor-based human activity recognition (HAR), feature extraction is one of the key steps, and the stability of statistical features computed from discrete data depends on the window size used for extraction. In general, the window length of the training data must exceed one motion cycle, which makes it difficult to recognize test samples far shorter than one motion cycle. To address this short-sequence recognition problem, a new template-matching solution is proposed. First, an over-complete short-time activity template library is constructed by appropriately segmenting the long training sequences; the short-time sample under test is then normalized against the library samples and matched. Second, the matching algorithm uses the sum of the Frobenius-norm distance between samples and the 2-norm distance of their global gradient vectors as the matching metric, producing a similarity histogram. Finally, the classification result is obtained from the similarity histogram by a voting strategy. Experiments show that, when a single sensor is used to recognize short-time activities, the new algorithm outperforms traditional algorithms in both accuracy and stability, and adapts to short-time activity classification under different window sizes.
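The matching-and-voting pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's published implementation: the function names, the k-nearest voting cutoff, and the exact definition of the "global gradient vector" (here, first differences summed over time) are assumptions made for the sketch.

```python
import numpy as np

def match_score(sample, template):
    """Similarity between a short-time test window and a library template
    of the same aligned shape (time steps x sensor channels): the
    Frobenius-norm distance between the raw windows plus the 2-norm
    distance of their global gradient vectors, as the abstract describes.
    Smaller score means more similar."""
    d_f = np.linalg.norm(sample - template)        # Frobenius-norm term
    g_s = np.diff(sample, axis=0).sum(axis=0)      # global gradient of sample (assumed form)
    g_t = np.diff(template, axis=0).sum(axis=0)    # global gradient of template
    d_g = np.linalg.norm(g_s - g_t)                # 2-norm of gradient difference
    return d_f + d_g

def classify(sample, library, k=5):
    """Rank templates by match_score (the 'similarity histogram'),
    then vote among the k best-matching templates' labels."""
    ranked = sorted(library, key=lambda entry: match_score(sample, entry[0]))
    votes = {}
    for template, label in ranked[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```

A toy usage: with a library of two constant templates labeled "sit" and "walk", a test window close to the "sit" template is assigned that label by the vote.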
References
[1] Poppe R. A survey on vision-based human action recognition[J]. Image and Vision Computing, 2010, 28(6): 976-990.
[2] Wang J, Liu Z, Wu Y, et al. Learning actionlet ensemble for 3D human action recognition[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2014, 36(5): 914-927.
[3] Bulling A, Blanke U, Schiele B. A tutorial on human activity recognition using body-worn inertial sensors[J]. ACM Computing Surveys, 2014, 46(3): 1-33.
[4] Lara O D, Labrador M A. A survey on human activity recognition using wearable sensors[J]. IEEE Communications Surveys & Tutorials, 2013, 15(3): 1192-1209.
[5] Yang J, Wang S, Chen N, et al. Wearable accelerometer based extendable activity recognition system[C]//IEEE International Conference on Robotics and Automation. Piscataway, 2010: 3641-3647.
[6] Davis K, Owusu E, Bastani V, et al. Activity recognition based on inertial sensors for ambient assisted living[C]//International Conference on Information Fusion, 2016: 371-378.
[7] Su B, Tang Q F, Jiang J, et al. A novel method for short-time human activity recognition based on improved template matching technique[C]//ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry, 2016: 233-242.
[8] Li L J, Su H, Lim Y, et al. Object bank: An object-level image representation for high-level visual recognition[J]. International Journal of Computer Vision, 2014, 107(1): 20-39.
[9] Sadanand S, Corso J J. Action bank: A high-level representation of activity in video[C]//Computer Vision and Pattern Recognition (CVPR), 2012: 1234-1241.
[10] Liu Z J, Lin W, Geng Y L, et al. Intent pattern recognition of lower-limb motion based on mechanical sensors[J]. Journal of Automatica Sinica, 2017, 4(4): 651-660.
[11] Young A J, Simon A M, Hargrove L J. A training method for locomotion mode prediction using powered lower limb prostheses[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2014, 22(3): 671-677.
[12] Young A J, Simon A M, Fey N P, et al. Intent recognition in a powered lower limb prosthesis using time history information[J]. Annals of Biomedical Engineering, 2014, 42(3): 631-641.
[13] Yuan K B, Wang Q N, Wang L. Fuzzy-logic-based terrain identification with multisensor fusion for transtibial amputees[J]. IEEE/ASME Transactions on Mechatronics, 2015, 20(2): 618-630.
[14] Zheng E H, Wang Q N. Noncontact capacitive sensing-based locomotion transition recognition for amputees with robotic transtibial prostheses[J]. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2016, 25(2): 161-170.
[15] Yang A Y, Jafari R, Sastry S S, et al. Distributed recognition of human actions using wearable motion sensor networks[J]. Journal of Ambient Intelligence and Smart Environments, 2009, 1(2): 103-115.
[16] Su B, Tang Q, Wang G, et al. Transactions on Edutainment XII: The Recognition of Human Daily Actions with Wearable Motion Sensor System[M]. Germany: Springer, 2016: 68-77.
[17] He W, Guo Y, Gao C, et al. Recognition of human activities with wearable sensors[J]. EURASIP Journal on Advances in Signal Processing, 2012, 2012(1): 1-13.
[18] Xiao L, Li R F, Luo J. Recognition on human activity based on compressed sensing in body sensor networks[J]. Journal of Electronics & Information Technology, 2013, 35(1): 119-125.
[19] Li F, Pan J K. Human motion recognition based on triaxial accelerometer[J]. Journal of Computer Research and Development, 2016, 53(3): 621-631.
[20] Sheng M, Jiang J, Su B, et al. Short-time activity recognition with wearable sensors using convolutional neural network[C]//Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry, 2016: 413-416.