Human Body Gesture Recognition and Interaction Based on Surface Electromyographic Signals
Abstract
As computing devices become embedded in every aspect of the human living environment, the real world, the digital world, and the people at its center form an organic whole. The demand for seamless communication and free interaction between people and their environment has made human body gesture recognition a focus of research on future multi-channel, multi-modal human-computer interaction (HCI). Human body gesture recognition is the process by which a computer automatically detects, analyzes, and understands various human movements and behaviors, such as postures or motion patterns of the fingers, wrist, arm, head, face, or body, in order to infer the user's intention and provide the corresponding services.
     Every human movement is produced by groups of muscles working in coordination under the control of the nervous system. The electromyographic (EMG) signal that accompanies muscular activity is an important bioelectric signal. Surface EMG (SEMG) sensors attached to the skin over the relevant muscles capture muscular activity that reflects not only the flexion and extension state and strength of the joints but also the posture and position of the limbs, which makes SEMG an important means of sensing human body gestures.
     Taking hand gestures, neck and shoulder gestures, and leg motions as its research objects, this dissertation investigates the detection and recognition of various kinds of human body gestures from SEMG signals, designs and implements real-time gesture-based interactive systems, conducts user testing experiments of moderate scale, and provides several practical solutions for natural and harmonious HCI. The research promotes the further development, application, and adoption of multi-modal intelligent HCI techniques, and its results are also of value to human behavior understanding, rehabilitation medicine, context awareness, pervasive computing, and navigation. The main work and contributions of the dissertation are as follows:
     1. Hand gesture recognition based on multi-channel SEMG. The goal of this study is to provide effective recognition algorithms for hand gesture-based HCI systems and a theoretical basis for selecting the hand gesture command set and placing the SEMG sensors. On the one hand, SEMG processing and recognition methods covering signal acquisition, active-segment detection, feature extraction, and classification are studied on 8 commonly used hand gestures; an algorithm optimized for real-time interaction is proposed, and on this basis an SEMG-based real-time hand gesture recognition system is built. On the other hand, guided by anatomical knowledge, the definition of the hand gesture command set and the placement of SEMG sensors are optimized over 20 hand gestures, including several subtle finger movements, and a practical interactive control scheme is proposed. User-specific, multi-user, and user-independent experiments on the real-time system demonstrate the robustness of the proposed scheme. These results provide a reference for selecting input hand gestures and placing SEMG sensors in SEMG-based HCI applications.
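     To make the active-segment detection step above concrete, here is a minimal sketch in Python assuming a simple amplitude-threshold scheme over the channel-averaged rectified signal; the window length, threshold rule, and minimum segment length are illustrative assumptions, not the parameters used in the dissertation.

```python
import numpy as np

def detect_active_segments(semg, fs, win_ms=64, on_thresh=3.0, min_len_ms=128):
    """Locate active (gesture) segments in multi-channel SEMG.

    semg: array of shape (n_samples, n_channels)
    fs: sampling rate in Hz
    Returns a list of (start_sample, end_sample) index pairs.
    The threshold is a multiple of a crude resting-level estimate,
    which is an illustrative choice rather than the dissertation's rule.
    """
    win = max(1, int(fs * win_ms / 1000))
    # Short-time energy: rectify, average over channels, smooth with a moving window.
    energy = np.convolve(np.mean(np.abs(semg), axis=1),
                         np.ones(win) / win, mode="same")
    rest = np.median(energy)                  # crude resting-level estimate
    active = energy > on_thresh * rest        # boolean activity mask

    segments, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if (i - start) >= int(fs * min_len_ms / 1000):
                segments.append((start, i))
            start = None
    if start is not None:
        segments.append((start, len(active)))
    return segments
```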
     2. Sign language recognition based on the information fusion of acceleration (ACC) and SEMG. This study is a first exploration of hand gesture recognition based on multi-sensor measurement and fusion. An accelerometer worn on the forearm is well suited to distinguishing arm movements with different trajectories, whereas SEMG is better suited to capturing the distinct muscle-activity patterns produced by subtle finger and wrist movements. Exploiting the complementary characteristics of the two sensing modalities, a framework for hand gesture recognition based on the fusion of a 3-axis accelerometer and multi-channel SEMG is presented. The framework uses multi-stream hidden Markov models (HMMs) and a decision tree to fuse the two heterogeneous sensor streams, and is applied to the classification of 30 one-handed Chinese sign language (CSL) words and 16 CSL dialog sentences. Furthermore, a real-time interactive system is built in which 18 hand gestures control a virtual Rubik's cube.
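     As a rough illustration of decision-level fusion of the two streams, the following sketch combines per-class log-likelihoods from separate ACC and SEMG HMMs with fixed stream weights. The `score` interface (as in hmmlearn's GaussianHMM), the equal weights, and the dictionary layout are assumptions made for illustration; the dissertation's actual multi-stream HMM and decision-tree fusion is more elaborate.

```python
def multistream_classify(acc_obs, semg_obs, acc_models, semg_models,
                         w_acc=0.5, w_semg=0.5):
    """Decision-level fusion of per-class ACC and SEMG HMM scores.

    acc_models / semg_models: dicts mapping gesture label -> a trained HMM
    exposing a log-likelihood method `score(obs)` (hmmlearn's GaussianHMM is
    one such implementation, assumed here only as an interface).
    acc_obs / semg_obs: observation sequences, each of shape (T, n_features).
    Returns the best label and the per-label fused scores.
    """
    scores = {}
    for label in acc_models:
        ll_acc = acc_models[label].score(acc_obs)
        ll_semg = semg_models[label].score(semg_obs)
        scores[label] = w_acc * ll_acc + w_semg * ll_semg   # weighted stream fusion
    return max(scores, key=scores.get), scores
```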
     3. Neck and shoulder gesture recognition based on SEMG. Neck and shoulder gestures can serve as a supplementary means of natural and harmonious HCI. The feasibility and practicability of building muscle-computer interfaces on SEMG-based neck and shoulder gesture recognition is investigated: multi-channel SEMG signals measured from the relevant back, shoulder, and neck muscles are used to classify 7 neck and shoulder gestures. A real-time SEMG-based neck and shoulder gesture recognition and interaction system is then obtained by adapting the real-time hand gesture recognition system.
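     This study builds on the real-time hand gesture system of Study 1. As an assumed, simplified stand-in for its feature-extraction stage, the sketch below computes two standard SEMG features, mean absolute value and waveform length, per channel of a detected active segment; the dissertation's actual feature set is not restated here.

```python
import numpy as np

def segment_features(segment):
    """Per-channel SEMG features for one active segment.

    segment: samples of one detected active segment, shape (n_samples, n_channels).
    Mean absolute value (MAV) and waveform length (WL) are common SEMG features
    used here purely as placeholders. Returns a vector of length 2 * n_channels.
    """
    mav = np.mean(np.abs(segment), axis=0)                   # amplitude level per channel
    wl = np.sum(np.abs(np.diff(segment, axis=0)), axis=0)    # cumulative waveform length
    return np.concatenate([mav, wl])
```

     A vector like this, computed for each detected segment, could then be fed to any multi-class classifier trained on the 7 neck and shoulder gestures.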
     4. Personal navigation fused with SEMG information. During normal walking the left and right legs step alternately, and the corresponding gastrocnemius muscles contract in turn to drive the body forward, so the SEMG signal measured on the calf surface shows a pronounced rhythm that follows the force exerted at each step. Exploiting this physiological characteristic, an SEMG-based technique for analyzing pedestrian motion is proposed and combined with a digital compass to form a novel pedestrian dead reckoning (PDR) method. The method applies overlapped windowing and sample entropy feature extraction to the gastrocnemius SEMG of both legs, uses an HMM classifier to distinguish normal walking from standing still, detects step cycles and estimates step length from the walking-phase SEMG, and combines these with the heading of each step measured by the digital compass to determine the pedestrian's position and trajectory. Field tests with an integrated GPS receiver demonstrate that the method is a feasible and effective basis for seamless outdoor/indoor pedestrian navigation.
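     Two of the building blocks named above admit compact sketches: the sample-entropy feature computed on each SEMG frame, and the per-step dead-reckoning position update. The embedding dimension m, tolerance r, and heading convention (azimuth clockwise from north, position in east/north metres) are illustrative assumptions, not the dissertation's settings.

```python
import numpy as np

def sample_entropy(frame, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of one SEMG frame (1-D array).

    r is taken as r_factor times the frame's standard deviation, a common
    convention; m and r_factor are illustrative values.
    """
    x = np.asarray(frame, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)

    def match_count(dim):
        # Count template pairs whose Chebyshev distance is within tolerance r.
        templates = np.array([x[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    b = match_count(m)        # matches at template length m
    a = match_count(m + 1)    # matches at template length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def pdr_update(position, step_length, heading_rad):
    """Advance a 2-D (east, north) position by one detected step."""
    east, north = position
    east += step_length * np.sin(heading_rad)   # heading measured clockwise from north
    north += step_length * np.cos(heading_rad)
    return (east, north)
```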
     This research was supported by the National High Technology Research and Development Program of China (863 Program) project "Research on the Gesture Input Devices Based on the Accelerometers and Surface EMG sensors" (2009AA01Z322), the National Natural Science Foundation of China project "Chinese Sign Language Recognition based on Surface Electromyogram" (60703069), cooperation projects with the NOKIA Research Center (Helsinki and Beijing), and the Graduate Innovation Foundation of USTC.
