Time-compressed spoken word primes crossmodally enhance processing of semantically congruent visual targets
  • Authors: Angela Mahr (1)
    Dirk Wentura (2)
  • Keywords: Crossmodal; Multisensory processing; Semantic priming; Response priming; Stroop; Time compression
  • Journal: Attention, Perception, & Psychophysics
  • Published: February 2014
  • Volume: 76
  • Issue: 2
  • Pages: 575–590
  • Full-text size: 535 KB
  • References: 1. Alvarez, G. A., & Cavanagh, P. (2004). The capacity of visual short-term memory is set both by visual information load and by number of objects. Psychological Science, 15, 106–111. doi:10.1111/j.0963-7214.2004.01502006.x
    2. Boersma, P., & Weenink, D. (2011). Praat: Doing phonetics by computer [Computer program]. Retrieved from www.praat.org/
    3. Brown, T. L., Joneleit, K., Robinson, C. S., & Brown, C. R. (2002). Automaticity in reading and the Stroop task: Testing the limits of involuntary word processing. American Journal of Psychology, 115, 515–543.
    4. Calvert, G., Spence, C., & Stein, B. E. (2004). The handbook of multisensory processes. Cambridge, MA: MIT Press.
    5. Chen, Y.-C., & Spence, C. (2010). When hearing the bark helps to identify the dog: Semantically-congruent sounds modulate the identification of masked pictures. Cognition, 114, 389–404.
    6. Chen, Y.-C., & Spence, C. (2011). Crossmodal semantic priming by naturalistic sounds and spoken words enhances visual sensitivity. Journal of Experimental Psychology: Human Perception and Performance, 37, 1554–1568. doi:10.1037/a0024329
    7. Cowan, N. (2000). The magical number 4 in short-term memory: A reconsideration of mental storage capacity. Behavioral and Brain Sciences, 24, 87–114; discussion 114–185. doi:10.1017/S0140525X01003922
    8. Cowan, N., & Barron, A. (1987). Cross-modal, auditory–visual Stroop interference and possible implications for speech memory. Perception & Psychophysics, 41, 393–401.
    9. De Houwer, J. (2003). On the role of stimulus–response and stimulus–stimulus compatibility in the Stroop effect. Memory & Cognition, 31, 353–359.
    10. Desimone, R., & Duncan, J. (1995). Neural mechanisms of selective visual attention. Annual Review of Neuroscience, 18, 193–222. doi:10.1146/annurev.ne.18.030195.001205
    11. Dien, J., & Santuzzi, A. M. (2005). Application of repeated measures ANOVA to high-density ERP datasets: A review and tutorial. In T. C. Handy (Ed.), Event-related potentials: A methods handbook (pp. 57–84). Cambridge, MA: MIT Press.
    12. Driver, J., & Spence, C. (1998). Crossmodal attention. Current Opinion in Neurobiology, 8, 245–253. doi:10.1016/S0959-4388(98)80147-5
    13. Duncan-Johnson, C., & Kopell, B. (1981). The Stroop effect: Brain potentials localize the source of interference. Science, 214, 938–940.
    14. Elliott, E. M., Cowan, N., & Valle-Inclan, F. (1998). The nature of cross-modal color–word interference effects. Perception & Psychophysics, 60, 761–767. doi:10.3758/BF03206061
    15. Glaser, W. R., & Glaser, M. O. (1989). Context effects in Stroop-like word and picture processing. Journal of Experimental Psychology: General, 118, 13–42.
    16. Ho, C., & Spence, C. (2005). Assessing the effectiveness of various auditory cues in capturing a driver’s visual attention. Journal of Experimental Psychology: Applied, 11, 157–174.
    17. Holm, S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, 6, 65–70.
    18. Iordanescu, L., Grabowecky, M., Franconeri, S., Theeuwes, J., & Suzuki, S. (2010). Characteristic sounds make you look at target objects more quickly. Attention, Perception, & Psychophysics, 72, 1736–1741. doi:10.3758/APP.72.7.1736
    19. Iordanescu, L., Guzman-Martinez, E., Grabowecky, M., & Suzuki, S. (2008). Characteristic sounds facilitate visual search. Psychonomic Bulletin & Review, 15, 548–554. doi:10.3758/PBR.15.3.548
    20. Jackson, C. V. (1953). Visual factors in auditory localization. Quarterly Journal of Experimental Psychology, 5, 52–65.
    21. Kahneman, D., & Chajczyk, D. (1983). Tests of the automaticity of reading: Dilution of Stroop effects by color-irrelevant stimuli. Journal of Experimental Psychology: Human Perception and Performance, 9, 497–509. doi:10.1037/0096-1523.9.4.497
    22. Keetels, M., & Vroomen, J. (2011). Sound affects the speed of visual processing. Journal of Experimental Psychology: Human Perception and Performance, 37, 699–708.
    23. Lavie, N. (1995). Perceptual load as a necessary condition for selective attention. Journal of Experimental Psychology: Human Perception and Performance, 21, 451–468. doi:10.1037/0096-1523.21.3.451
    24. Lavie, N., & Cox, S. (1997). On the efficiency of visual selective attention: Efficient visual search leads to inefficient distractor rejection. Psychological Science, 8, 395–398. doi:10.1111/j.1467-9280.1997.tb00432.x
    25. Luck, S. J., & Vogel, E. K. (1997). The capacity of visual working memory for features and conjunctions. Nature, 390, 279–281. doi:10.1038/36846
    26. Lupyan, G., & Spivey, M. J. (2010). Making the invisible visible: Verbal but not visual cues enhance visual detection. PLoS ONE, 5, e11452.
    27. Lupyan, G., & Thompson-Schill, S. L. (2012). The evocative power of words: Activation of concepts by verbal and nonverbal means. Journal of Experimental Psychology: General, 141, 170–186.
    28. MacLeod, C. M. (1991). Half a century of research on the Stroop effect: An integrative review. Psychological Bulletin, 109, 163–203. doi:10.1037/0033-2909.109.2.163
    29. Macmillan, N. A., & Creelman, C. D. (2005). Detection theory: A user’s guide (2nd ed.). Mahwah, NJ: Erlbaum.
    30. Mazza, V., Turatto, M., Rossi, M., & Umiltà, C. (2007). How automatic are audiovisual links in exogenous spatial attention? Neuropsychologia, 45, 514–522.
    31. Melara, R. D., & Algom, D. (2003). Driven by information: A tectonic theory of Stroop effects. Psychological Review, 110, 422–471. doi:10.1037/0033-295X.110.3.422
    32. O’Brien, R. G., & Kaiser, M. K. (1985). MANOVA method for analyzing repeated measures designs: An extensive primer. Psychological Bulletin, 97, 316–333.
    33. Olivers, C. N. L., Meijer, F., & Theeuwes, J. (2006). Feature-based memory-driven attentional capture: Visual working memory content affects visual attention. Journal of Experimental Psychology: Human Perception and Performance, 32, 1243–1265. doi:10.1037/0096-1523.32.5.1243
    34. Phillips, W. A. (1974). On the distinction between sensory storage and short-term visual memory. Perception & Psychophysics, 16, 283–290. doi:10.3758/BF03203943
    35. Potter, M. C. (1975). Meaning in visual search. Science, 187, 965–966. doi:10.1126/science.1145183
    36. Roelofs, A. (2005). The visual–auditory color–word Stroop asymmetry and its time course. Memory & Cognition, 33, 1325–1336. doi:10.3758/BF03193365
    37. Salverda, A. P., & Altmann, G. T. M. (2011). Attentional capture of objects referred to by spoken language. Journal of Experimental Psychology: Human Perception and Performance, 37, 1122–1133.
    38. Schneider, T. R., Debener, S., Oostenveld, R., & Engel, A. K. (2008a). Enhanced EEG gamma-band activity reflects multisensory semantic matching in visual-to-auditory object priming. NeuroImage, 42, 1244–1254.
    39. Schneider, T. R., Engel, A. K., & Debener, S. (2008b). Multisensory identification of natural objects in a two-way crossmodal priming paradigm. Experimental Psychology, 55, 121–132.
    40. Shimada, H. (1990). Effect of auditory presentation of words on color naming: The intermodal Stroop effect. Perceptual and Motor Skills, 70, 1155–1161.
    41. Spence, C. (2007). Audiovisual multisensory integration. Acoustical Science and Technology, 28, 61–70.
    42. Spence, C., & Driver, J. (1997). Audiovisual links in exogenous covert spatial orienting. Perception & Psychophysics, 59, 1–22. doi:10.3758/BF03206843
    43. Spence, C., Senkowski, D., & Röder, B. (2009). Crossmodal processing. Experimental Brain Research, 198, 107–111.
    44. Tellinghuisen, D. J., & Nowak, E. J. (2003). The inability to ignore auditory distractors as a function of visual task perceptual load. Perception & Psychophysics, 65, 817–828.
    45. Thorpe, S., Fize, D., & Marlot, C. (1996). Speed of processing in the human visual system. Nature, 381, 520–522. doi:10.1038/381520a0
    46. Tukey, J. W. (1977). Exploratory data analysis. Reading, MA: Addison-Wesley.
    47. Wentura, D., & Degner, J. (2010). A practical guide to sequential priming and related tasks. In B. Gawronski & B. K. Payne (Eds.), Handbook of implicit social cognition: Measurement, theory, and applications (pp. 95–116). New York, NY: Guilford.
  • Author affiliations:
    1. German Research Center for Artificial Intelligence (DFKI), Campus D 3_2, 66123 Saarbrücken, Germany
    2. Institut für Psychologie, Universität des Saarlandes, Saarbrücken, Germany
  • ISSN:1943-393X
Abstract
Findings from three experiments support the conclusion that auditory primes facilitate the processing of related targets. In Experiments 1 and 2, we employed a crossmodal Stroop color identification task with auditory color words (as primes) and visual color patches (as targets). Responses were faster for congruent priming, in comparison to neutral or incongruent priming. This effect also emerged for different levels of time compression of the auditory primes (to 30 % and 10 % of the original length; i.e., 120 and 40 ms) and turned out to be even more pronounced under high-perceptual-load conditions (Exps. 1 and 2). In Experiment 3, target-present or -absent decisions for brief target displays had to be made, thereby ruling out response-priming processes as a cause of the congruency effects. Nevertheless, target detection (d′) was increased by congruent primes (30 % compression) in comparison to incongruent or neutral primes. Our results suggest semantic object-based auditory–visual interactions, which rapidly increase the denoted target object’s salience. This would apply, in particular, to complex visual scenes.
