The processing of speech, gesture, and action during language comprehension
  • Authors: Spencer Kelly; Meghan Healey; Asli Özyürek; Judith Holler
  • Keywords: Language comprehension; Embodied cognition; Gesture; Action
  • Journal: Psychonomic Bulletin & Review
  • Published: April 2015
  • Volume: 22
  • Issue: 2
  • Pages: 517-523
  • Full text size: 435 KB
Abstract
Hand gestures and speech form a single integrated system of meaning during language comprehension, but is gesture processed with speech in a unique fashion? We had subjects watch multimodal videos that presented auditory (words) and visual (gestures and actions on objects) information. Half of the subjects related the audio information to a written prime presented before the video, and the other half related the visual information to the written prime. For half of the multimodal video stimuli, the audio and visual information contents were congruent, and for the other half, they were incongruent. For all subjects, stimuli in which the gestures and actions were incongruent with the speech produced more errors and longer response times than did stimuli that were congruent, but this effect was less prominent for speech–action stimuli than for speech–gesture stimuli. However, subjects focusing on visual targets were more accurate when processing actions than gestures. These results suggest that although actions may be easier to process than gestures, gestures may be more tightly tied to the processing of accompanying speech.
