物理学习空间中学习者情绪感知研究综述
  • 英文篇名:A Review of Physical Learning Spaces Oriented Learners' Emotion Perception Research
  • 作者:刘智 ; 方常丽 ; 刘三女牙 ; 孙建文
  • 英文作者:Liu Zhi;Fang Changli;Liu Sanya;Sun Jianwen;National Engineering Laboratory for Educational Big Data, Central China Normal University;Collaborative Innovation Center for Educational Technology, Central China Normal University;National Engineering Research Center for E-Learning, Central China Normal University;
  • 关键词:情绪感知 ; 传感器 ; 可穿戴技术 ; 生理信号 ; 学习分析 ; 智能教育
  • 英文关键词:Emotion Perception;;Sensor;;Wearable Technology;;Physiological Signals;;Learning Analytics;;Intelligent Education
  • 中文刊名:远程教育杂志
  • 英文刊名:Journal of Distance Education
  • 机构:华中师范大学教育大数据应用技术国家工程实验室;华中师范大学教育信息技术协同创新中心;华中师范大学国家数字化学习工程技术研究中心;
  • 出版日期:2019-03-20
  • 出版单位:远程教育杂志
  • 年:2019
  • 期:v.37;No.251
  • 基金:国家自然科学基金项目“多场景网络学习中基于‘行为—情感—主题’联合建模的学习者兴趣挖掘关键技术研究”(项目编号:61702207);; 国家重点研发计划课题“数据驱动的数字教育个性化服务支撑技术研究”(项目编号:2017YFB1401303);; 教育部人文社会科学研究项目“高校慕课环境下的互动话语行为及其对学习效果的影响机理研究”(项目编号:16YJC880052)资助
  • 语种:中文;
  • 文件名:YCJY201902005
  • 页:35-46
  • 页数:12
  • CN:33-1304/G4
摘要
在"人—技"协同进化的教育发展态势下,学习者的学习方式和交互环境正面临深刻变革,物理学习空间内的学习支持服务亟待重塑。近年来,研究者们致力于采用传感器获取学习者的生理行为数据,结合学习分析技术推测其情绪状态,并以适当的干预机制来提高积极情绪唤醒度,进而助力于个体学业成功。当前,在物理学习空间中,针对学习者情绪感知的主要手段有人工观察法、自我报告法、基于生理信号、语音信号、面部表情信号以及眼动信号的感知方法;应用研究案例包含智能导师系统、虚拟学习同伴、情绪互动支持、自我调节能力评估、学情分析监控等主题。对物理学习空间中学习者情绪感知的研究,可为未来学习空间的重塑带来新的研究视角和参照。
        Under the trend of "human-technology" co-evolution in education, learners' learning styles and interactive environments are undergoing profound transformations, and the learning support services in physical learning spaces urgently need to be reshaped. In recent years, researchers have focused on using sensors to collect learners' physiological and behavioral data, applying learning analytics to infer their emotional states, and adopting appropriate intervention mechanisms to raise positive emotional arousal and thereby support academic success. Currently, the main methods for perceiving learners' emotions in physical learning spaces include manual observation, self-report, and perception methods based on physiological signals, speech signals, facial expression signals and eye-movement signals; typical application cases cover intelligent tutoring systems, virtual learning companions, emotional interaction support, self-regulation capability assessment, and learning-situation analysis and monitoring. Research on learner emotion perception in physical learning spaces can provide a new research perspective and reference for reshaping future learning spaces.
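To make the "sensor data, learning analytics, intervention" pipeline described in the abstract concrete, the following is a minimal, hypothetical Python sketch: it trains a classifier to infer a coarse arousal state from wearable-style physiological features and flags windows that might trigger an intervention. The feature set (heart rate, heart-rate variability, electrodermal activity, skin temperature), the synthetic labels, the random-forest model, and the thresholds are illustrative assumptions, not methods reported in the surveyed studies.

# Minimal sketch (hypothetical): inferring a learner's emotional-arousal state
# from wearable physiological features, in the spirit of the
# "sensor data -> learning analytics -> intervention" pipeline in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Synthetic stand-in for windowed wearable signals:
# columns = [mean heart rate (bpm), heart-rate variability (ms),
#            electrodermal activity (microsiemens), skin temperature (C)]
n = 600
X = np.column_stack([
    rng.normal(75, 10, n),    # HR
    rng.normal(50, 15, n),    # HRV
    rng.normal(2.0, 0.8, n),  # EDA
    rng.normal(33.0, 0.5, n)  # skin temperature
])

# Hypothetical labels: 1 = high arousal, 0 = low arousal, simulated here as a
# noisy function of HR and EDA purely so the example runs end to end.
score = 0.03 * (X[:, 0] - 75) + 0.8 * (X[:, 2] - 2.0) + rng.normal(0, 0.5, n)
y = (score > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)
print(classification_report(y_test, pred, target_names=["low arousal", "high arousal"]))

# A simple intervention rule of the kind the abstract alludes to: if the model
# is confident a window reflects low arousal, the learning environment could
# prompt a break or an encouraging message.
proba_low = clf.predict_proba(X_test)[:, 0]
needs_support = proba_low > 0.8
print(f"windows flagged for intervention: {needs_support.sum()} / {len(needs_support)}")

In a real deployment, the synthetic arrays would be replaced by windowed features streamed from wearable sensors, and the flagged windows would feed whatever intervention policy the learning environment supports.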
引文
[1]Domanska M.HU Computer Scientist Develops Sensors that Help People with Anxiety Disorders[EB/OL].[2018-10-11].https://www.adlershof.de/en/news/measuring-emotions/.
    [2]Becker S A,Cummins M,Davis A,et al.NMC Horizon Report:2017 Higher Education Edition[R].The New Media Consortium,2017.
    [3]孙波,刘永娜,陈玖冰,等.智慧学习环境中基于面部表情的情感分析[J].现代远程教育研究,2015(2):96-103.
    [4]Picard R,Kort B,Reilly R.Exploring the Role of Emotion in Propelling the SMET Learning Process[J].NSF Project Summary and Description,2007.
    [5]刘智,刘三女牙,康令云.物理空间中的智能学伴系统:感知数据驱动的学习分析技术---访柏林洪堡大学教育技术专家NielsPinkwart教授[J].中国电化教育,2018(7):67-72.
    [6]吴永和,曹盼,邢万里,等.学习分析技术的发展和挑战---第四届学习分析与知识国际会议评析[J].开放教育研究,2014(6):72-80.
    [7]李香勇,左明章,王志锋.学习分析的研究现状与未来展望---2016年学习分析和知识国际会议述评[J].开放教育研究,2017(1):46-55.
    [8]吴永和,李若晨,王浩楠.学习分析研究的现状与未来发展---2017年学习分析与知识国际会议评析[J].开放教育研究,2017(5):42-56.
    [9]王建伟.基于深度学习的情绪感知系统的研究与设计[D].成都:电子科技大学,2017.
    [10]黄泳锐,杨健豪,廖鹏凯,等.结合人脸图像和脑电的情绪识别技术[J].计算机系统应用,2018(2):9-15.
    [11]李寿山,黄磊,周国栋.一种微博文本情绪识别方法及系统:中国,CN104809104A[P].2015.
    [12]孙继民,姚温青,胡学平.声音情绪的跨文化识别[J].心理科学进展,2014(5):802-809.
    [13]戴逸翔,王雪,戴鹏,等.面向可穿戴多模生物信息传感网络的栈式自编码器优化情绪识别[J].计算机学报,2017(8):1750-1763.
    [14]Becker S A,Freeman A,Hall C G,et al.NMC/CoSN Horizon Report:2016 K-12 Edition[M].The New Media Consortium,2016:1-52.
    [15]Not Just a Meditation Tool:The Muse Brain Sensing Headband in Neuroscience Research[EB/OL].[2018-11-22].https://medtechboston.medstro.com/blog/2015/10/14/not-just-a-meditation-toolthe-muse-brain-sensing-headband-in-neuroscience-research/.
    [16]李丽霞,包汉宗.基于情感计算的E-learning系统建模[J].电脑知识与技术,2010(6):9783-9784.
    [17]张永皋,马青玉,孙青.基于MFCC和CHMM技术的语音情感分析及其在教育中的应用研究[J].南京师范大学学报(工程技术版),2009(2):89-92.
    [18]吴彦文,刘伟,张昆明.基于情感识别的智能教学系统研究[J].计算机工程与设计,2008(9):2350-2352.
    [19]牟智佳.“人工智能+”时代的个性化学习理论重思与开解[J].远程教育杂志,2017(3):22-30.
    [20]许亚锋,高红英.面向人工智能时代的学习空间变革研究[J].远程教育杂志,2018(1):48-60.
    [21]Cooper D G,Arroyo I,Woolf B P,et al.Sensors Model Student Self Concept in the Classroom[C]//International Conference on User Modeling,Adaptation,and Personalization.Springer,Berlin,Heidelberg,2009:30-41.
    [22]Kiesler S,Powers A,Fussell S R,et al.Anthropomorphic Interactions with a Robot and Robot-like Agent[J].Social Cognition,2008,26(2):169-181.
    [23]The Teachable Agents Group at Vanderbilt University.Betty’s Brain[EB/OL].[2018-11-13].http://teachableagents.org/research/bettysbrain.php.
    [24]Loos R.CoWriter:Children Using Robots to Learn Writing[EB/OL].[2018-11-03].http://www.roboticstoday.com/news/cowriter-childrenusing-robots-to-learn-writing-3113.
    [25]Oshima J.Collaborative Reading Comprehension with Communication Robots as Learning Partners[J].Proceedings of ICLS2012,2012,2:182-186.
    [26]Timms M J.Letting Artificial Intelligence in Education out of the Box:Educational Cobots and Smart Classrooms[J].International Journal of Artificial Intelligence in Education,2016,26(2):701-712.
    [27]齐继东.论情绪情感在大学生自主学习中的作用及实施策略[J].吉林广播电视大学学报,2018(1):62-63.
    [28]Spikol D,Ruffaldi E,Dabisias G,et al.Supervised Machine Learning in Multimodal Learning Analytics for Estimating Success in Project-based Learning[J].Journal of Computer Assisted Learning,2018,34(4):366-377.
    [29]Shen L,Wang M,Shen R.Affective e-Learning:Using“Emotional”Data to Improve Learning in Pervasive Learning Environment[J].Journal of Educational Technology&Society,2009,12(2).
    [30]Petrovica S.Adaptation of Tutoring to Students’Emotions in Emotionally Intelligent Tutoring Systems[C]//e-Learning and e-Technologies in Education(ICEEE),2013 Second International Conference on.IEEE,2013:131-136.
    [31]陶雷,罗朝雄.基于ARM处理器的智慧教室电器开关控制系统硬件设计[J].信息与电脑,2016(19):128-129.
    [32]Temkar R,Gupte M,Kalgaonkar S.Internet of Things for Smart Classrooms[J].International Research Journal of Engineering and Technology,2016:203-206.
    [33]程萌萌,林茂松,王中飞.应用表情识别与视线跟踪的智能教学系统研究[J].中国远程教育,2013(3):59-64.
    [34]Luis-Ferreira F,Artifice A,McManus G,et al.An Architecture to Support Wearables in Education and Wellbeing[J].International Association for Development of the Information Society,2017:233-235.
    [35]Boulton H,Brown D,Standen P,et al.Multi-modalities in Classroom Learning Environments[R].12th International Technology,Education and Development Conference,2018.
    [36]郭漩.基于人工神经网络的多生理信号情绪识别系统设计与实现[D].上海:华东师范大学,2014.
    [37]周肖肖.基于多模态融合的情感计算研究[D].西安:西安邮电大学,2018.
    [38]Hsieh G,Li I,Dey A,et al.Using Visualizations to Increase Compliance in Experience Sampling[C]//Proceedings of the 10th International Conference on Ubiquitous Computing.ACM,2008:164-167.
    [39]Consolvo S,Mcdonald D W,Toscos T,Chen M Y,Froehlich J,Harrison B,Landay J A.Activity Sensing in the Wild:A Field Trial of UbiFit Garden[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.ACM,2008:1797-1806.
    [40]Froehlich J,Dillahunt T,Klasnja P,et al.UbiGreen:Investigating a Mobile Tool for Tracking and Supporting Green Transportation Habits[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.ACM,2009:1043-1052.
    [41]Heggen S.Integrating Participatory Sensing and Informal Science Education[C]//Proceedings of the 2012 ACM Conference on Ubiquitous Computing.ACM,2012:552-555.
    [42]Carroll E A,Czerwinski M,Roseway A,et al.Food and Mood:Just-in-time Support for Emotional Eating[C]//Affective Computing and Intelligent Interaction(ACII),2013 Humaine Association Conference on.IEEE,2013:252-257.
    [43]Kim J,Ko H.Reconfigurable Multiparameter Biosignal Acquisition SoC for Low Power Wearable Platform[J].Sensors,2016,16(12):2002.
    [44]Popescu R,Ponescu D,Roibu H,et al.Smart Classroom-Affective Computing in Present-Day Classroom[C]//2018 28th EAEEIE Annual Conference(EAEEIE).IEEE,2018:1-9.
    [45]徐振国,陈秋惠,张冠文.新一代人机交互:自然用户界面的现状、类型与教育应用探究---兼对脑机接口技术的初步展望[J].远程教育杂志,2018(4):41-50.
    [46]苏逸飞,张永魁.用于情绪识别的无线生理信号采集传输系统[J].电气电子教学学报,2009(5):61-63.
    [47]李英春,尤磊,贺靖康,等.基于生理信号的情绪识别腕戴设备[J].电子技术应用,2017(2):69-72+76.
    [48]Schneider J,Börner D,Van Rosmalen P,et al.Augmenting the Senses:A Review on Sensor-based Learning Support[J].Sensors,2015,15(2):4097-4133.
    [49]Yun H,Domanska M,Fortenbacher A,et al.Sensor Data for Learning Support:Achievements,Open Questions&Opportunities[C]//R Zender,ed.Proceedings der Pre-Conference-Workshops der,2016:28-36.
    [50]Fortenbacher A,Pinkwart N,Yun H.[LISA]Learning Analytics for Sensor-based Adaptive Learning[C]//Proceedings of the Seventh International Learning Analytics&Knowledge Conference.ACM,2017:592-593.
    [51]Lu Y,Zhang S,Zhang Z,et al.A Framework for Learning Analytics Using Commodity Wearable Devices[J].Sensors,2017,17(6):1382.
    [52]冯满堂,马青玉,王瑞杰.基于人脸表情识别的智能网络教学系统研究[J].计算机技术与发展,2011(6):193-196.
    [53]Yun H,Fortenbacher A,Pinkwart N,et al.A Pilot Study of Emotion Detection using Sensors in a Learning Context:Towards an Affective Learning Companion[M].DeLFI and GMW Workshops,2017:1-11.
    [54]Bosch N,D’mello S K,Ocumpaugh J,et al.Using Video to Automatically Detect Learner Affect in Computer-enabled Classrooms[J].ACM Transactions on Interactive Intelligent Systems(TiiS),2016,6(2):17.
    [55]刘三女牙,李小文,刘智.面向网络行为的情感识别关键技术研究[M].武汉:科学出版社,2018:29.
    [56]Mehrabian A.Communication without Words[J].Communication Theory,2008:193-200.
    [57]Praetorius A K,McIntyre N A,Klassen R M.Reactivity Effects in Video-based Classroom Research:An Investigation Using Teacher and Student Questionnaires as well as Teacher Eye-tracking[J].Zeitschrift Für Erziehungswissenschaft,2017,20(1):49-74.
    [58]Sandanayake T C,Madurapperuma A P.Affective e-learning Model for Recognising Learner Emotions in Online Learning Environment[C]//Advances in ICT for Emerging Regions(ICTer),2013 International Conference on.IEEE,2013:266-271.
    [59]Dragon T,Arroyo I,Woolf B P,et al.Viewing Student Affect and Learning through Classroom Observation and Physical Sensors[C]//International Conference on Intelligent Tutoring Systems.Springer,Berlin,Heidelberg,2008:29-39.
    [60]Bahreini K,Nadolski R,Westera W.Towards Real-time Speech Emotion Recognition for Affective e-learning[J].Education and Information Technologies,2016,21(5):1367-1386.
    [61]Bosch N,D’Mello S K,Baker R S,et al.Detecting Student Emotions in Computer-Enabled Classrooms[C]//IJCAI.2016:4125-4129.
    [62]D’Mello S,Olney A,Williams C,et al.Gaze Tutor:A Gaze-reactive Intelligent Tutoring System[J].International Journal of Human-Computer Studies,2012,70(5):377-398.
    [63]Woolf B P,Burleson W,Arroyo I,et al.Affect-aware Tutors:Recognizing and Responding to Student Affect[J].International Journal of Learning Technology,2009,4(3/4):129-164.
    [64]Yun H,Israel J H,Fortenbacher A,et al.User-Centric Approach to the Design of a Mobile Learning Companion[C]//Fachgruppe Begreifbare Interaktion Workshop,Mensch und Computer,2017:401-406.
    [65]Shen L,Callaghan V,Shen R.Affective e-Learning in Residential and Pervasive Computing Environments[J].Information Systems Frontiers,2008,10(4):461-472.
    [66]Burleson W,Picard R W,Perlin K,et al.A Platform for Affective Agent Research[C]//Workshop on Empathetic Agents,International Conference on Autonomous Agents and Multiagent Systems,Columbia University,New York,NY.2004:108-115.
    [67]Balaam M,Fitzpatrick G,Good J,et al.Exploring Affective Technologies for the Classroom with the Subtle Stone[C]//Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.ACM,2010:1623-1632.
    [68]Moffitt T E,Arseneault L,Belsky D,et al.A Gradient of Childhood Self-control Predicts Health,Wealth,and Public Safety[J].Proceedings of the National Academy of Sciences,2011,108(7):2693-2698.
    [69]Spann C A,Schaeffer J,Siemens G.Expanding the Scope of Learning Analytics Data:Preliminary Findings on Attention and Self-regulation Using Wearable Technology[C]//Proceedings of the Seventh International Learning Analytics&Knowledge Conference.ACM,2017:203-207.
    [70]Wen M,Yang D,Rose C.Sentiment Analysis in MOOC Discussion Forums:What does It Tell Us?[C]//Proceedings of Educational Data Mining,2014:130-137.
    [71]Alyuz N,Okur E,Oktay E,et al.Towards an Emotional Engagement Model:Can Affective States of a Learner be Automatically Detected in a 1:1 Learning Scenario?[C]//ACM Conference on User Modeling,Adaptation and Personalization.ACM,2016.
    [72]Kadar M,Ferreira F,Calado J,et al.Affective Computing to Enhance Emotional Sustainability of Students in Dropout Prevention[C]//Proceedings of the 7th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion.ACM,2016:85-91.
    [73]Zaletelj J,Košir A.Predicting Students’Attention in the Classroom from Kinect Facial and Body Features[J].EURASIP Journal on Image and Video Processing,2017(1):80.
    [74]Immordino-Yang M H,Faeth M.The Role of Emotion and Skilled Intuition in Learning[J].Mind,Brain,and Education:Neuroscience Implications for the Classroom,2010:69-83.
    [75]Hill V,Lee H J.Libraries and Immersive Learning Environments Unite in Second Life[J].Library Hi Tech,2009,27(3):338-356.
    (1)Subtle Stone是一种供学生在学习互动过程中以自我报告的形式,实时向教师反馈情绪体验的工具。
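As a companion to this note, here is a minimal, hypothetical Python sketch of a self-report channel of the kind described above: students submit an emotion label during learning interactions, and a teacher view reads the latest label per student in real time. The class names, emotion labels, and in-memory storage are illustrative assumptions and do not reproduce the actual Subtle Stone implementation.

# Hypothetical sketch of a student-to-teacher emotion self-report channel.
from dataclasses import dataclass
from datetime import datetime
from typing import Dict

@dataclass
class EmotionReport:
    student_id: str
    emotion: str          # e.g., "interested", "confused", "bored" (illustrative labels)
    timestamp: datetime

class SelfReportChannel:
    def __init__(self) -> None:
        self._latest: Dict[str, EmotionReport] = {}

    def submit(self, student_id: str, emotion: str) -> None:
        # Student-side call: record the most recent self-reported emotion.
        self._latest[student_id] = EmotionReport(student_id, emotion, datetime.now())

    def teacher_view(self) -> Dict[str, str]:
        # Teacher-side call: latest self-reported emotion label per student.
        return {sid: report.emotion for sid, report in self._latest.items()}

channel = SelfReportChannel()
channel.submit("s01", "confused")
channel.submit("s02", "interested")
print(channel.teacher_view())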
