Affective Computing in Games
  • Keywords: Affective Computing; Serious Game; Emotion; Affect detection; Sensors; Physiological data; Facial expressions; Speech
  • Series: Lecture Notes in Computer Science
  • Year: 2016
  • Volume: 9970
  • Issue: 1
  • Pages: 402–441
  • Full text size: 436 KB
  • References: 1.Ambadar, Z., Schooler, J.W., Cohn, J.F.: Deciphering the enigmatic face: the importance of facial dynamics in interpreting subtle facial expressions. Psychol. Sci. 16(5), 403–410 (2005)CrossRef
    2.Anderson, N.H.: Likableness ratings of 555 personality-trait words. J. Pers. Soc. Psychol. 9(3), 272 (1968)CrossRef
    3.Anliker, U., Ward, J.A., Lukowicz, P., Tröster, G., Dolveck, F., Baer, M., Keita, F., Schenker, E.B., Catarsi, F., Coluccini, L., et al.: AMON: a wearable multiparameter medical monitoring and alert system. IEEE Trans. Inf. Technol. Biomed. 8(4), 415–427 (2004)CrossRef
    4.Arroyo, I., Cooper, D.G., Burleson, W., Woolf, B.P., Muldner, K., Christopherson, R.: Emotion sensors go to school. In: Proceedings of AIED, vol. 200, pp. 17–24 (2009)
    5.Aviezer, H., Hassin, R.R., Ryan, J., Grady, C., Susskind, J., Anderson, A., Moscovitch, M., Bentin, S.: Angry, disgusted, or afraid? Studies on the malleability of emotion perception. Psychol. Sci. 19(7), 724–732 (2008)CrossRef
    6.Ayaz, H., Shewokis, P.A., Bunce, S., Onaral, B.: An optical brain computer interface for environmental control. In: International Conference on Engineering in Medicine and Biology Society (EMBC), pp. 6327–6330 (2011)
    7.Bartlett, M.S., Littlewort, G., Fasel, I., Movellan, J.R.: Real time face detection and facial expression recognition: development and applications to human computer interaction. In: Proceedings of Computer Vision and Pattern Recognition Workshop, vol. 5, p. 53 (2003)
    8.Baveye, Y., Dellandrea, E., Chamaret, C., Chen, L.: Liris-accede: a video database for affective content analysis. IEEE Trans. Affect. Comput. 6(1), 43–55 (2015)CrossRef
    9.Bernays, R., Mone, J., Yau, P., Murcia, M., Gonzalez-Sanchez, J., Chavez-Echeagaray, M.E., Christopherson, R., Atkinson, R.: Lost in the dark: emotion adaption. In: Adjunct Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology (UIST), pp. 79–80 (2012). doi:10.1145/2380296.2380331, ISBN 978-1-4503-1582-1
    10.Biel, J.-I., Teijeiro-Mosquera, L., Gatica-Perez, D.: Facetube: predicting personality from facial expressions of emotion in online conversational video. In: Proceedings of the 14th ACM International Conference on Multimodal Interaction, pp. 53–56 (2012)
    11.Bojko, A.: Eye Tracking the User Experience. Rosenfeld Media, Brooklyn (2013)
    12.Bollen, J., Pepe, A., Mao, H.: Modeling public mood and emotion: twitter sentiment and socio-economic phenomena. In: Proceedings of ICWSM, vol. 11, pp. 450–453 (2009)
    13.Boucsein, W.: Electrodermal Activity. Springer Science & Business Media, Berlin (2012)CrossRef
    14.Bradley, M.M., Miccoli, L., Escrig, M.A., Lang, P.J.: The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology 45(4), 602–607 (2008)CrossRef
    15.Brave, S., Nass, C.: Emotion in human-computer interaction. In: Jacko, J.A., Sears, A. (eds.) Human-Computer Interaction, pp. 53–67. CRC Press, Boca Raton (2003)
    16.Breazeal, C., Aryananda, L.: Recognition of affective communicative intent in robot-directed speech. Auton. Robots 12(1), 83–104 (2002)MATH CrossRef
    17.Brouwer, A.-M., Van Wouwe, N., Muehl, C., Van Erp, J., Toet, A.: Perceiving blocks of emotional pictures and sounds: effects on physiological variables. Front. Hum. Neurosci. 7, 1–10 (2013). Article 295, ISSN 1662-5161
    18.Cacioppo, J.T., Tassinary, L.G., Berntson, G.G.: Handbook of Psychophysiology. Cambridge University Press, Cambridge (2007)CrossRef
    19.Calvo, R.A., D’Mello, S.: Affect detection: an interdisciplinary review of models, methods, and their applications. IEEE Trans. Affect. Comput. 1(1), 18–37 (2010)CrossRef
    20.Carrera, P., Oceja, L.: Drawing mixed emotions: sequential or simultaneous experiences? Cogn. Emot. 21(2), 422–441 (2007)CrossRef
    21.Castiglioni, P., Faini, A., Parati, G., Di Rienzo, M.: Wearable seismocardiography. In: 2007 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 3954–3957, August 2007. doi:10.1109/IEMBS.2007.4353199
    22.Cattell, R.B., Eber, H.W., Tatsuoka, M.M.: Handbook for the Sixteen Personality Factor Questionnaire (16 PF), in Clinical, Educational, Industrial, and Research Psychology, for use with all forms of the Test. Institute for Personality and Ability Testing, Champaign (1970)
    23.Cavazza, M., Pizzi, D., Charles, F., Vogt, T., André, E.: Emotional input for character-based interactive storytelling. In: Proceedings of the International Conference on Autonomous Agents and Multiagent Systems, vol. 1, pp. 313–320 (2009)
    24.Chaffar, S., Inkpen, D.: Using a heterogeneous dataset for emotion analysis in text. In: Butz, C., Lingras, P. (eds.) AI 2011. LNCS (LNAI), vol. 6657, pp. 62–67. Springer, Heidelberg (2011). doi:10.1007/978-3-642-21043-3_8 CrossRef
    25.Chanel, G., Kronegg, J., Grandjean, D., Pun, T.: Emotion assessment: arousal evaluation using EEG’s and peripheral physiological signals. In: Gunsel, B., Jain, A.K., Tekalp, A.M., Sankur, B. (eds.) MRCS 2006. LNCS, vol. 4105, pp. 530–537. Springer, Heidelberg (2006). doi:10.1007/11848035_70 CrossRef
    26.Chanel, G., Rebetez, C., Bétrancourt, M., Pun, T.: Emotion assessment from physiological signals for adaptation of game difficulty. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 41(6), 1052–1063 (2011)CrossRef
    27.Childers, D.G., Skinner, D.P., Kemerait, R.C.: The cepstrum: a guide to processing. Proc. IEEE 65(10), 1428–1443 (1977)CrossRef
    28.Christie, I.C., Friedman, B.H.: Autonomic specificity of discrete emotion and dimensions of affective space: a multivariate approach. Int. J. Psychophysiol. 51(2), 143–153 (2004)CrossRef
    29.Cohn, J.F., Schmidt, K.L.: The timing of facial motion in posed and spontaneous smiles. Int. J. Wavelets Multiresolut. Inf. Process. 2(02), 121–132 (2004)CrossRef
    30.Costa Jr., P.T., McCrae, R.R.: Set like plaster? Evidence for the stability of adult personality. In: Heatherton, T., Weinberger, J. (eds.) Can Personality Change?, pp. 21–40. American Psychological Association, Washington, D.C (1994)CrossRef
    31.Cowie, R., Douglas-Cowie, E., Tsapatsoulis, N., Votsis, G., Kollias, S., Fellenz, W., Taylor, J.G.: Emotion recognition in human-computer interaction. IEEE Sig. Process. Mag. 18(1), 32–80 (2001)CrossRef
    32.Dalgleish, T., Dunn, B.D., Mobbs, D.: Affective neuroscience: past, present, and future. Emot. Rev. 1(4), 355–368 (2009)CrossRef
    33.Davidson, R.J., Scherer, K.R., Goldsmith, H.: Handbook of Affective Sciences. Oxford University Press, Oxford (2003)
    34.Davis, S.B., Mermelstein, P.: Comparison of parametric representations for monosyllabic word recognition in continuously spoken sentences. IEEE Trans. Acoust. Speech Sig. Process. 28(4), 357–366 (1980)CrossRef
    35.De Choudhury, M., Counts, S., Gamon, M.: Not all moods are created equal! Exploring human emotional states in social media. In: Proceedings of the ICWSM (2012)
    36.Dekker, A., Champion, E.: Please biofeed the zombies: enhancing the gameplay and display of a horror game using biofeedback. In: Proceedings of DiGRA, pp. 550–558 (2007)
    37.Dhall, A., Goecke, R., Lucey, S., Gedeon, T.: Static facial expression analysis in tough conditions: data, evaluation protocol and benchmark. In: IEEE International Conference on Computer Vision Workshops (ICCV Workshops), pp. 2106–2112 (2011)
    38.D’Mello, S., Graesser, A.: Autotutor and affective autotutor: learning by talking with cognitively and emotionally intelligent computers that talk back. ACM Trans. Interact. Intell. Syst. (TiiS) 2(4), 23 (2012)
    39.D’Mello, S.K., Kory, J.: A review and meta-analysis of multimodal affect detection systems. ACM Comput. Surv. 47(3), February 2015. doi:10.1145/2682899, ISSN 0360-0300
    40.Drachen, A., Nacke, L.E., Yannakakis, G., Lee Pedersen, A.: Correlation between heart rate, electrodermal activity and player experience in first-person shooter games. In: Proceedings of the 5th ACM SIGGRAPH Symposium on Video Games, pp. 49–54 (2010)
    41.Egges, A., Kshirsagar, S., Magnenat-Thalmann, N.: A model for personality and emotion simulation. In: Palade, V., Howlett, R.J., Jain, L. (eds.) KES 2003. LNCS (LNAI), vol. 2773, pp. 453–461. Springer, Heidelberg (2003). doi:10.1007/978-3-540-45224-9_63 CrossRef
    42.Ekman, P.: An argument for basic emotions. Cogn. Emot. 6(3–4), 169–200 (1992a)CrossRef
    43.Ekman, P.: Are there basic emotions? Psychol. Rev. 99(3), 550–553 (1992b)CrossRef
    44.Ekman, P.: Facial expression and emotion. Am. Psychol. 48(4), 384 (1993)CrossRef
    45.Ekman, P., Friesen, W.V.: Facial Action Coding System: A Technique for the Measurement of Facial Movement. Consulting Psychologists Press, Stanford University, Palo Alto (1978)
    46.Ekman, P., Rosenberg, E.L.: What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS). Oxford University Press, Oxford (1997)
    47.El Ayadi, M., Kamel, M.S., Karray, F.: Survey on speech emotion recognition: features, classification schemes, and databases. Pattern Recogn. 44(3), 572–587 (2011)MATH CrossRef
    48.Emotiv: Emotiv (2016). http://emotiv.com. Accessed 26 May 2016
    49.Fazli, S., Mehnert, J., Steinbrink, J., Curio, G., Villringer, A., Müller, K.-R., Blankertz, B.: Enhanced performance by a hybrid NIRS-EEG brain computer interface. Neuroimage 59(1), 519–529 (2012)CrossRef
    50.Fernández-Aranda, F., Jiménez-Murcia, S., Santamaría, J.J., Gunnard, K., Soto, A., Kalapanidas, E., Bults, R.G.A., Davarakis, C., Ganchev, T., Granero, R.: Video games as a complementary therapy tool in mental disorders: PlayMancer, a European multicentre study. J. Ment. Health 21(4), 364–374 (2012)CrossRef
    51.Fontaine, J.R.J., Scherer, K.R., Roesch, E.B., Ellsworth, P.C.: The world of emotions is not two-dimensional. Psychol. Sci. 18(12), 1050–1057 (2007)CrossRef
    52.France, D.J., Shiavi, R.G., Silverman, S., Silverman, M., Wilkes, M.: Acoustical properties of speech as indicators of depression and suicidal risk. IEEE Trans. Biomed. Eng. 47(7), 829–837 (2000)CrossRef
    53.Frijda, N.H.: Varieties of affect: emotions and episodes, moods, and sentiments. In: Ekman, P., Davison, R. (eds.) The Nature of Emotions: Fundamental Questions, pp. 197–202. Oxford University Press, Oxford (1994)
    54.García-García, C., Larios-Rosillo, V., Luga, H.: Agent behaviour modeling using personality profile characterization for emergency evacuation serious games. In: Plemenos, D., Miaoulis, G. (eds.) Intelligent Computer Graphics 2012. Studies in Computational Intelligence, vol. 441, pp. 107–128. Springer, Heidelberg (2013)CrossRef
    55.Gebhard, P., Kipp, K.H.: Are computer-generated emotions and moods plausible to humans? In: Gratch, J., Young, M., Aylett, R., Ballin, D., Olivier, P. (eds.) IVA 2006. LNCS (LNAI), vol. 4133, pp. 343–356. Springer, Heidelberg (2006). doi:10.1007/11821830_28 CrossRef
    56.Golbeck, J., Robles, C., Turner, K.: Predicting personality with social media. In: Proceedings of CHI 2011 Extended Abstracts on Human Factors in Computing Systems, pp. 253–262 (2011)
    57.Gunes, H., Piccardi, M.: A bimodal face and body gesture database for automatic analysis of human nonverbal affective behavior. In: International Conference on Pattern Recognition (ICPR), vol. 1, pp. 1148–1153 (2006)
    58.Gunes, H., Schuller, B.: Categorical and dimensional affect analysis in continuous input: current trends and future directions. Image Vis. Comput. 31(2), 120–136 (2013)CrossRef
    59.Gunes, H., Schuller, B., Pantic, M., Cowie, R.: Emotion representation, analysis and synthesis in continuous space: a survey. In: IEEE International Conference on Automatic Face & Gesture Recognition and Workshops, pp. 827–834 (2011)
    60.Guthier, B., Alharthi, R., Abaalkhail, R., El Saddik, A.: Detection and visualization of emotions in an affect-aware city. In: Proceedings of the 1st International Workshop on Emerging Multimedia Applications and Services for Smart Cities, pp. 23–28 (2014)
    61.Hamann, S.: Mapping discrete and dimensional emotions onto the brain: controversies and consensus. Trends Cogn. Sci. 16(9), 458–466 (2012)CrossRef
    62.Hansen, J.H.L., Cairns, D.A.: ICARUS: source generator based real-time recognition of speech in noisy stressful and Lombard effect environments. Speech Commun. 16(4), 391–422 (1995)CrossRef
    63.Homma, I., Masaoka, Y.: Breathing rhythms and emotions. Exp. Physiol. 93(9), 1011–1021 (2008)CrossRef
    64.Hoover, A., Singh, A., Fishel-Brown, S., Muth, E.: Real-time detection of workload changes using heart rate variability. Biomed. Sig. Process. Control 7(4), 333–341 (2012)CrossRef
    65.Horlings, R., Datcu, D., Rothkrantz, L.J.M.: Emotion recognition using brain activity. In: Proceedings of the 9th International Conference on Computer Systems and Technologies and Workshop for PhD Students in Computing, p. 6 (2008)
    66.Ikehara, C.S., Crosby, M.E.: Assessing cognitive load with physiological sensors. In: Proceedings of the Hawaii International Conference on System Sciences (HICSS), p. 295a (2005)
    67.Izard, C.E., et al.: Special section: on defining emotion. Emot. Rev. 2(4), 363–385 (2010)CrossRef
    68.Jerritta, S., Murugappan, M., Nagarajan, R., Wan, K.: Physiological signals based human emotion recognition: a review. In: IEEE International Colloquium on Signal Processing and its Applications (CSPA), pp. 410–415 (2011)
    69.Johnstone, T., van Reekum, C.M., Hird, K., Kirsner, K., Scherer, K.R.: Affective speech elicited with a computer game. Emotion 5(4), 513 (2005)CrossRef
    70.Kao, E.C.-C., Liu, C.-C., Yang, T.-H., Hsieh, C.-T., Soo, V.-W.: Towards text-based emotion detection: a survey and possible improvements. In: International Conference on Information Management and Engineering, ICIME 2009, pp. 70–74 (2009)
    71.Kapoor, A., Picard, R.W.: Multimodal affect recognition in learning environments. In: Proceedings of the 13th Annual ACM International Conference on Multimedia, pp. 677–682 (2005)
    72.Kirk, M.: Thoughtful Machine Learning: A Test-Driven Approach. O’Reilly Media Inc., California (2014)
    73.Kleinginna Jr., P.R., Kleinginna, A.M.: A categorized list of emotion definitions, with suggestions for a consensual definition. Motiv. Emot. 5(4), 345–379 (1981)CrossRef
    74.Kleinsmith, A., Bianchi-Berthouze, N., Steed, A.: Automatic recognition of non-acted affective postures. IEEE Trans. Syst. Man Cybern. Part B Cybern. 41(4), 1027–1038 (2011)CrossRef
    75.Knutson, B.: Facial expressions of emotion influence interpersonal trait inferences. J. Nonverbal Behav. 20(3), 165–182 (1996)CrossRef
    76.Koelstra, S., Mühl, C., Soleymani, M., Lee, J.-S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., Patras, I.: Deap: a database for emotion analysis; using physiological signals. IEEE Trans. Affect. Comput. 3(1), 18–31 (2012)CrossRef
    77.Koolagudi, S.G., Rao, K.S.: Emotion recognition from speech: a review. Int. J. Speech Technol. 15(2), 99–117 (2012)CrossRef
    78.Kundu, S.K., Kumagai, S., Sasaki, M.: A wearable capacitive sensor for monitoring human respiratory rate. Japan. J. Appl. Phys. 52(4S), 04CL05 (2013)CrossRef
    79.Lang, P.J., Bradley, M.M., Cuthbert, B.N.: International affective picture system (IAPS): Affective ratings of pictures and instruction manual. Technical report A-8 (2008)
    80.Lankes, M., Riegler, S., Weiss, A., Mirlacher, T., Pirker, M., Tscheligi, M.: Facial expressions as game input with different emotional feedback conditions. In: Proceedings of the 2008 International Conference on Advances in Computer Entertainment Technology, pp. 253–256 (2008)
    81.Laukka, P., Juslin, P., Bresin, R.: A dimensional approach to vocal expression of emotion. Cogn. Emot. 19(5), 633–653 (2005)CrossRef
    82.Lee, C.M., Narayanan, S.S.: Toward detecting emotions in spoken dialogs. IEEE Trans. Speech Audio Process. 13(2), 293–303 (2005)CrossRef
    83.Lee, C.M., Narayanan, S.S., Pieraccini, R.: Combining acoustic and language information for emotion recognition. In: Proceedings of INTERSPEECH (2002)
    84.Lee, C.M., Yildirim, S., Bulut, M., Kazemzadeh, A., Busso, C., Deng, Z., Lee, S., Narayanan, S.: Emotion recognition based on phoneme classes. In: Proceedings of Interspeech, pp. 205–211 (2004)
    85.Leichtenstern, K., Bee, N., André, E., Berkmüller, U., Wagner, J.: Physiological measurement of trust-related behavior in trust-neutral and trust-critical situations. In: Wakeman, I., Gudes, E., Jensen, C.D., Crampton, J. (eds.) IFIPTM 2011. IAICT, vol. 358, pp. 165–172. Springer, Heidelberg (2011). doi:10.1007/978-3-642-22200-9_14 CrossRef
    86.Leshed, G., Kaye, J.J.: Understanding how bloggers feel: recognizing affect in blog posts. In: Proceedings of CHI 2006 Extended Abstracts on Human Factors in Computing Systems, pp. 1019–1024 (2006)
    87.Lewis, M., Haviland-Jones, J.M., Barrett, L.F.: Handbook of Emotions. Guilford Press, New York City (2010)
    88.Liapis, A., Katsanos, C., Sotiropoulos, D., Xenos, M., Karousos, N.: Recognizing emotions in human computer interaction: studying stress using skin conductance. In: Abascal, J., Barbosa, S., Fetter, M., Gross, T., Palanque, P., Winckler, M. (eds.) INTERACT 2015. LNCS, vol. 9296, pp. 255–262. Springer, Heidelberg (2015). doi:10.1007/978-3-319-22701-6_18 CrossRef
    89.Lisetti, C.L., Nasoz, F.: Using noninvasive wearable computers to recognize human emotions from physiological signals. EURASIP J. Adv. Sig. Process. 2004(11), 1–16 (2004)
    90.Litman, D.J., Forbes-Riley, K.: Predicting student emotions in computer-human tutoring dialogues. In: Proceedings of the 42nd Annual Meeting on Association for Computational Linguistics, p. 351 (2004)
    91.Littlewort, G., Whitehill, J., Wu, T., Fasel, I., Frank, M., Movellan, J., Bartlett, M.: The computer expression recognition toolbox (CERT). In: IEEE International Conference on Automatic Face & Gesture Recognition and Workshops, pp. 298–305 (2011)
    92.Littlewort, G.C., Bartlett, M.S., Lee, K.: Automatic coding of facial expressions displayed during posed and genuine pain. Image Vis. Comput. 27(12), 1797–1803 (2009)CrossRef
    93.Liu, C., Rani, P., Sarkar, N.: An empirical study of machine learning techniques for affect recognition in human-robot interaction. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2662–2667 (2005)
    94.Liu, X., Zheng, Y., Phyu, M.W., Zhao, B., Je, M., Yuan, X.: Multiple functional ECG signal is processing for wearable applications of long-term cardiac monitoring. IEEE Trans. Biomed. Eng. 58(2), 380–389 (2011)CrossRef
    95.Liu, Y., Sourina, O., Nguyen, M.K.: Real-time EEG-based human emotion recognition and visualization. In: 2010 International Conference on Cyberworlds (CW), pp. 262–269 (2010)
    96.López, G., Custodio, V., Moreno, J.I.: Lobin: E-textile and wireless-sensor-network-based platform for healthcare monitoring in future hospital environments. IEEE Trans. Inf. Technol. Biomed. 14(6), 1446–1458 (2010)CrossRef
    97.Lucey, P., Cohn, J.F., Kanade, T., Saragih, J., Ambadar, Z., Matthews, I.: The extended Cohn-Kanade dataset (CK+): a complete dataset for action unit and emotion-specified expression. In: IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 94–101 (2010)
    98.Lugger, M., Janoir, M.-E., et al.: Combining classifiers with diverse feature sets for robust speaker independent emotion recognition. In: 2009 17th European Signal Processing Conference, pp. 1225–1229 (2009)
    99.Mandryk, R.L.: Physiological measures for game evaluation. In: Lazzaro, M. (ed.) Game Usability: Advice from the Experts for Advancing the Player Experience, pp. 207–235. Morgan Kaufmann, Burlington (2008)
    100.Mandryk, R.L., Atkins, M.S.: A fuzzy physiological approach for continuously modeling emotion during interaction with play technologies. Int. J. Hum. Comput. Stud. 65(4), 329–347 (2007)CrossRef
    101.Mandryk, R.L., Atkins, M.S., Inkpen, K.M.: A continuous and objective evaluation of emotional experience with interactive play environments. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1027–1036 (2006)
    102.Mandryk, R.L., Inkpen, K.M., Calvert, T.W.: Using psychophysiological techniques to measure user experience with entertainment technologies. Behav. Inf. Technol. 25(2), 141–158 (2006)CrossRef
    103.Marwick, A.E., et al.: I tweet honestly, I tweet passionately: twitter users, context collapse, and the imagined audience. New Media Soc. 13(1), 114–133 (2011)CrossRef
    104.McDuff, D., El Kaliouby, R., Senechal, T., Amr, M., Cohn, J.F., Picard, R.: Affectiva-MIT facial expression dataset (AM-FED): naturalistic and spontaneous facial expressions collected “in-the-wild”. In: IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 881–888 (2013)
    105.Mehrabian, A.: Analysis of the big-five personality factors in terms of the PAD temperament model. Aust. J. Psychol. 48(2), 86–92 (1996a)CrossRef
    106.Mehrabian, A.: Pleasure-arousal-dominance: a general framework for describing and measuring individual differences in temperament. Curr. Psychol. 14(4), 261–292 (1996b)MathSciNet CrossRef
    107.Mehrabian, A.: Comparison of the PAD and PANAS as models for describing emotions and for differentiating anxiety from depression. J. Psychopathol. Behav. Assess. 19(4), 331–357 (1997)CrossRef
    108.Miyamoto, Y., Uchida, Y., Ellsworth, P.C.: Culture and mixed emotions: co-occurrence of positive and negative emotions in Japan and the United States. Emotion 10(3), 404 (2010)CrossRef
    109.Mohammad, S.M.: #Emotional tweets. In: Proceedings of the Sixth International Workshop on Semantic Evaluation, pp. 246–255 (2012)
    110.Mower, E., Matarić, M.J., Narayanan, S.: A framework for automatic human emotion classification using emotion profiles. IEEE Trans. Audio Speech Lang. Process. 19(5), 1057–1070 (2011)CrossRef
    111.Mundt, C.W., Montgomery, K.N., Udoh, U.E., Barker, V.N., Thonier, G.C., Tellier, A.M., Ricks, R.D., Darling, R.B., Cagle, Y.D., Cabrol, N.A., et al.: A multiparameter wearable physiologic monitoring system for space and terrestrial applications. IEEE Trans. Inf. Technol. Biomed. 9(3), 382–391 (2005)CrossRef
    112.Murugappan, M., Ramachandran, N., Sazali, Y., et al.: Classification of human emotion from EEG using discrete wavelet transform. J. Biomed. Sci. Eng. 3(04), 390 (2010)CrossRef
    113.Myers, C.S., Rabiner, L.R.: A comparative study of several dynamic time-warping algorithms for connected-word recognition. Bell Syst. Tech. J. 60(7), 1389–1409 (1981)CrossRef
    114.Nacke, L., Lindley, C.A.: Flow and immersion in first-person shooters: measuring the player’s gameplay experience. In: Proceedings of the 2008 Conference on Future Play: Research, Play, Share (Future Play 2008), pp. 81–88. ACM, New York (2008). http://dx.doi.org/10.1145/1496984.1496998
    115.Naqvi, N., Shiv, B., Bechara, A.: The role of emotion in decision making: a cognitive neuroscience perspective. Curr. Dir. Psychol. Sci. 15(5), 260–264 (2006)CrossRef
    116.Neumann, S.A., Waldstein, S.R.: Similar patterns of cardiovascular response during emotional activation as a function of affective valence and arousal and gender. J. Psychosom. Res. 50(5), 245–253 (2001)CrossRef
    117.Neviarouskaya, A., Prendinger, H., Ishizuka, M.: Compositionality principle in recognition of fine-grained emotions from text. In: Proceedings of ICWSM (2009)
    118.Newberg, L.A.: Error statistics of hidden Markov model and hidden Boltzmann model results. BMC Bioinform. 10(1), 1 (2009)MathSciNet CrossRef
    119.Nicholson, J., Takahashi, K., Nakatsu, R.: Emotion recognition in speech using neural networks. Neural Comput. Appl. 9(4), 290–296 (2000)MATH CrossRef
    120.Nicolaou, M.A., Gunes, H., Pantic, M.: Continuous prediction of spontaneous affect from multiple cues and modalities in valence-arousal space. IEEE Trans. Affect. Comput. 2(2), 92–105 (2011)CrossRef
    121.Nogueira, P.A., Rodrigues, R.A., Oliveira, E.C., Nacke, L.E.: Guided emotional state regulation: understanding and shaping players’ affective experiences in digital games. In: Proceedings of AIIDE (2013)
    122.Norman, W.T.: Toward an adequate taxonomy of personality attributes: replicated factor structure in peer nomination personality ratings. J. Abnorm. Soc. Psychol. 66(6), 574 (1963)CrossRef
    123.Nwe, T.L., Foo, S.W., De Silva, L.C.: Speech emotion recognition using hidden Markov models. Speech Commun. 41(4), 603–623 (2003)CrossRef
    124.Oppenheim, A.V., Schafer, R.W.: From frequency to quefrency: a history of the cepstrum. IEEE Sig. Process. Mag. 21(5), 95–106 (2004)CrossRef
    125.Ortigosa, A., Carro, R.M., Quiroga, J.I.: Predicting user personality by mining social interactions in facebook. Journal of Computer and System Sciences 80(1), 57–71 (2014)MathSciNet CrossRef
    126.Ortony, A., Clore, G.L., Collins, A.: The Cognitive Structure of Emotions. Cambridge University Press, Cambridge (1990)
    127.Paas, F.G.W.C., Van Merriënboer, J.J.G.: Instructional control of cognitive load in the training of complex cognitive tasks. Educ. Psychol. Rev. 6(4), 351–371 (1994)CrossRef
    128.Pantic, M., Bartlett, M.S.: Machine Analysis of Facial Expressions. I-Tech Education and Publishing, Vienna (2007)CrossRef
    129.Parikh, R., Movassate, M.: Sentiment analysis of user-generated twitter updates using various classification techniques. CS224N Final Report, pp. 1–18 (2009)
    130.Paunonen, S.V., Haddock, G., Forsterling, F., Keinonen, M.: Broad versus narrow personality measures and the prediction of behaviour across cultures. Eur. J. Pers. 17(6), 413–433 (2003)CrossRef
    131.Pavlidis, I., Dowdall, J., Sun, N., Puri, C., Fei, J., Garbey, M.: Interacting with human physiology. Comput. Vis. Image Underst. 108(1), 150–170 (2007)CrossRef
    132.Pekrun, R., Stephens, E.J.: Achievement emotions: a control-value approach. Soc. Pers. Psychol. Compass 4(4), 238–255 (2010)CrossRef
    133.Pennebaker, J.W., Francis, M.E., Booth, R.J.: Linguistic Inquiry and Word Count: LIWC 2001. Lawrence Erlbaum Associates, Mahwah (2001)
    134.Perrinet, J., Olivier, A.-H., Pettré, J.: Walk with me: interactions in emotional walking situations, a pilot study. In: Proceedings of the ACM Symposium on Applied Perception, pp. 59–66 (2013)
    135.Peter, C., Herbon, A.: Emotion representation and physiology assignments in digital systems. Interact. Comput. 18(2), 139–170 (2006)CrossRef
    136.Petrantonakis, P.C., Hadjileontiadis, L.J.: Emotion recognition from EEG using higher order crossings. IEEE Trans. Inf. Technol. Biomed. 14(2), 186–197 (2010)CrossRef
    137.Picard, R.W.: Affective Computing. MIT press, Cambridge (1997)CrossRef
    138.Picard, R.W., Vyzas, E., Healey, J.: Toward machine emotional intelligence: analysis of affective physiological state. IEEE Trans. Pattern Anal. Mach. Intell. 23(10), 1175–1191 (2001)CrossRef
    139.Plutchik, R.: A general psychoevolutionary theory of emotion. In: Plutchik, R., Kellerman, H. (eds.) Theories of Emotion, vol. 1, pp. 3–31. Academic press, Cambridge (1980)CrossRef
    140.Porter, M.F.: An algorithm for suffix stripping. Program 14(3), 130–137 (1980)CrossRef
    141.Prkachin, K.M., Solomon, P.E.: The structure, reliability and validity of pain expression: evidence from patients with shoulder pain. Pain 139(2), 267–274 (2008)CrossRef
    142.Quercia, D., Kosinski, M., Stillwell, D., Crowcroft, J.: Our twitter profiles, our selves: predicting personality with twitter. In: IEEE International Conference on Privacy, Security, Risk and Trust (PASSAT) and Social Computing (SocialCom), pp. 180–185 (2011)
    143.Quigley, K.S., Barrett, L.F.: Is there consistency and specificity of autonomic changes during emotional episodes? Guidance from the conceptual act theory and psychophysiology. Biol. Psychol. 98, 82–94 (2014)CrossRef
    144.Rainville, P., Bechara, A., Naqvi, N., Damasio, A.R.: Basic emotions are associated with distinct patterns of cardiorespiratory activity. Int. J. Psychophysiol. 61(1), 5–18 (2006)CrossRef
    145.Ramirez, G.A., Baltrušaitis, T., Morency, L.-P.: Modeling latent discriminative dynamic of multi-dimensional affective signals. In: D’Mello, S., Graesser, A., Schuller, B., Martin, J.-C. (eds.) ACII 2011. LNCS, vol. 6975, pp. 396–406. Springer, Heidelberg (2011). doi:10.1007/978-3-642-24571-8_51 CrossRef
    146.Rani, P., Sarkar, N., Liu, C.: Maintaining optimal challenge in computer games through real-time physiological feedback. In: Proceedings of the 11th International Conference on Human Computer Interaction, pp. 184–192 (2005)
    147.Ravaja, N.: Contributions of psychophysiology to media research: review and recommendations. Media Psychol. 6(2), 193–235 (2004)CrossRef
    148.Ravaja, N., Turpeinen, M., Saari, T., Puttonen, S., Keltikangas-Järvinen, L.: The psychophysiology of James Bond: phasic emotional responses to violent video game events. Emotion 8(1), 114 (2008)CrossRef
    149.Revelle, W., Scherer, K.R.: Personality and emotion. In: Scherer, K., Sander, D. (eds.) Oxford Companion to Emotion and the Affective Sciences, pp. 304–306. Oxford University Press, Oxford (2009)
    150.Ruan, S., Chen, L., Sun, J., Chen, G.: Study on the change of physiological signals during playing body-controlled games. In: Proceedings of the International Conference on Advances in Computer Entertainment Technology, pp. 349–352 (2009)
    151.Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161 (1980)CrossRef
    152.Russell, J.A.: Core affect and the psychological construction of emotion. Psychol. Rev. 110(1), 145 (2003)CrossRef
    153.Sandbach, G., Zafeiriou, S., Pantic, M., Yin, L.: Static and dynamic 3D facial expression recognition: a comprehensive survey. Image Vis. Comput. 30(10), 683–697 (2012)CrossRef
    154.Schafer, R.W., Rabiner, L.R.: Digital representations of speech signals. Proc. IEEE 63(4), 662–667 (1975)CrossRef
    155.Scherer, K.R.: What are emotions? And how can they be measured? Soc. Sci. Inf. 44(4), 695–729 (2005)CrossRef
    156.Schuller, B., Rigoll, G., Lang, M.: Speech emotion recognition combining acoustic features and linguistic information in a hybrid support vector machine-belief network architecture. In: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 1, p. I-577 (2004)
    157.Schuller, B., Lang, M., Rigoll, G.: Robust acoustic speech emotion recognition by ensembles of classifiers. Fortschritte der Akustik 31(1), 329 (2005)
    158.Schuller, B., Valster, M., Eyben, F., Cowie, R., Pantic, M.: AVEC 2012: the continuous audio/visual emotion challenge. In: Proceedings of the 14th ACM International Conference on Multimodal Interaction, pp. 449–456 (2012)
    159.Setz, C., Arnrich, B., Schumm, J., La Marca, R., Troster, G., Ehlert, U.: Discriminating stress from cognitive load using a wearable EDA device. IEEE Trans. Inf. Technol. Biomed. 14(2), 410–417 (2010)CrossRef
    160.Shen, L., Wang, M., Shen, R.: Affective e-learning: using emotional data to improve learning in pervasive learning environment. J. Educ. Technol. Soc. 12(2), 176–189 (2009)
    161.Shergill, G.S., Sarrafzadeh, A., Diegel, O., Shekar, A.: Computerized sales assistants: the application of computer technology to measure consumer interest-a conceptual framework. J. Electron. Commer. Res. 9(2), 176–191 (2008)
    162.Shi, Y., Ruiz, N., Taib, R., Choi, E., Chen, F.: Galvanic skin response (GSR) as an index of cognitive load. In: Proceedings of CHI 2007 Extended Abstracts on Human Factors in Computing Systems, pp. 2651–2656 (2007)
    163.Shivhare, S.N., Khethawat, S.: Emotion detection from text. Comput. Sci. Inf. Technol. 5, 371–377 (2012)
    164.Strapparava, C., Valitutti, A.: WordNet-Affect: an affective extension of WordNet. In: Proceedings of LREC, vol. 4, pp. 1083–1086 (2004)
    165.Sun, F.-T., Kuo, C., Cheng, H.-T., Buthpitiya, S., Collins, P., Griss, M.: Activity-aware mental stress detection using physiological sensors. In: Gris, M., Yang, G. (eds.) MobiCASE 2010. LNICSSITE, vol. 76, pp. 211–230. Springer, Heidelberg (2012). doi:10.1007/978-3-642-29336-8_12 CrossRef
    166.Sun, Y., Hu, S., Azorin-Peris, V., Kalawsky, R., Greenwald, S.: Noncontact imaging photoplethysmography to effectively access pulse rate variability. J. Biomed. Optics 18(6), 1–9 (2013). Article 061205
    167.Teixeira, T., Wedel, M., Pieters, R.: Emotion-induced engagement in internet video advertisements. J. Mark. Res. 49(2), 144–159 (2012)CrossRef
    168.Thought Technology Ltd.: ProComp Infiniti system (2016). http://thoughttechnology.com/index.php/hardware.html. Accessed 26 May 2016
    169.Tian, Y., Kanade, T., Cohn, J.F.: Facial expression recognition. In: Li, S.Z., Jain, A.K. (eds.) Handbook of Face Recognition, pp. 487–519. Springer, London (2011)CrossRef
    170.Tiller, W.A., McCraty, R., Atkinson, M.: Cardiac coherence: a new, noninvasive measure of autonomic nervous system order. Altern. Ther. Health Med. 2(1), 52–65 (1996)
    171.O’Toole, A.J., Harms, J., Snow, S.L., Hurst, D.R., Pappas, M.R., Ayyad, J.H., Abdi, H.: A video database of moving faces and people. IEEE Trans. Pattern Anal. Mach. Intell. 27(5), 812–816 (2005)CrossRef
    172.Trejo, L.J., Knuth, K., Prado, R., Rosipal, R., Kubitz, K., Kochavi, R., Matthews, B., Zhang, Y.: EEG-based estimation of mental fatigue: convergent evidence for a three-state model. In: Schmorrow, D.D., Reeves, L.M. (eds.) FAC 2007. LNCS (LNAI), vol. 4565, pp. 201–211. Springer, Heidelberg (2007). doi:10.1007/978-3-540-73216-7_23 CrossRef
    173.Valstar, M., Pantic, M.: Induced disgust, happiness, surprise: an addition to the MMI facial expression database. In: Proceedings of International Workshop on EMOTION (satellite of LREC): Corpora for Research on Emotion and Affect, p. 65 (2010)
    174.Valstar, M.F., Gunes, H., Pantic, M.: How to distinguish posed from spontaneous smiles using geometric features. In: Proceedings of the International Conference on Multimodal Interfaces, pp. 38–45 (2007)
    175.Vasu, V., Heneghan, C., Arumugam, T., Sezer, S.: Signal processing methods for non-contact cardiac detection using doppler radar. In: 2010 IEEE Workshop on Signal Processing Systems (SIPS), pp. 368–373 (2010)
    176.Vi, C., Subramanian, S.: Detecting error-related negativity for interaction design. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 493–502 (2012)
    177.Wache, J.: The secret language of our body: affect and personality recognition using physiological signals. In: Proceedings of the 16th International Conference on Multimodal Interaction, pp. 389–393 (2014)
    178.Watson, D., Tellegen, A.: Toward a consensual structure of mood. Psychol. Bull. 98(2), 219 (1985)CrossRef
    179.Weigert, A.J.: Mixed Emotions: Certain Steps Toward Understanding Ambivalence. SUNY Press, New York (1991)
    180.Westerink, J.H.D.M., Van Den Broek, E.L., Schut, M.H., Van Herk, J., Tuinenbreijer, K.: Computing emotion awareness through galvanic skin response and facial electromyography. In: Probing Experience, pp. 149–162. Springer (2008)
    181.Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann, Burlington (2005)MATH
    182.Xu, J., Wang, Y., Chen, F., Choi, H., Li, G., Chen, S., Hussain, S.: Pupillary response based cognitive workload index under luminance and emotional changes. In: Proceedings of CHI 2011 Extended Abstracts on Human Factors in Computing Systems, pp. 1627–1632 (2011)
    183.Yik, M., Russell, J.A., Steiger, J.H.: A 12-point circumplex structure of core affect. Emotion 11(4), 705 (2011)CrossRef
    184.Zeng, Z., Pantic, M., Roisman, G., Huang, T.S., et al.: A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell. 31(1), 39–58 (2009)CrossRef
    185.Zhou, F., Qu, X., Jiao, J.R., Helander, M.G.: Emotion prediction from physiological signals: a comparison study between visual and auditory elicitors. Interact. Comput. 26(3), 285–302 (2014)CrossRef
  • Author affiliations: Benjamin Guthier (18)
    Ralf Dörner (19)
    Hector P. Martinez (20)

    18. Department of Computer Science IV, University of Mannheim, A5, 6, 68131, Mannheim, Germany
    19. Department Design, Computer Science, Media, RheinMain University of Applied Sciences, Unter den Eichen 5, 65195, Wiesbaden, Germany
    20. Center for Computer Games Research, IT University of Copenhagen, Rued Langgaards Vej 7, 2300, Copenhagen S, Denmark
  • Series title: Entertainment Computing and Serious Games
  • ISBN:978-3-319-46152-6
  • Category: Computer Science
  • Subject areas: Artificial Intelligence and Robotics
    Computer Communication Networks
    Software Engineering
    Data Encryption
    Database Management
    Computation by Abstract Devices
    Algorithm Analysis and Problem Complexity
  • Publisher: Springer Berlin / Heidelberg
  • ISSN:1611-3349
  • Volume: 9970
Abstract
Being able to automatically recognize and interpret the affective state of the player can have various benefits in a Serious Game. The difficulty and pace of a learning game could be adapted, or the quality of the interaction between the player and the game could be improved, to name just two examples. This Chapter aims to give an introduction to Affective Computing with the goal of helping developers incorporate the player’s affective data into their games. Suitable psychological models of emotion and personality are described, and a multitude of sensors as well as methods to recognize affect are discussed in detail. The Chapter ends with a number of examples where human affect is utilized in Serious Games.
