Lawrence D. Rosenblum

My research concerns speech, face, and auditory perception, approached from an ecological perspective. Ecological Psychology strives to understand how humans perceive and act in the natural environment. Emphasis is placed on establishing a thorough description of the information available to our senses, rather than on the mental processes usually thought to embellish impoverished input.

I am interested in speech perception as a multi-sensory function. While speech is usually considered something to hear, we can also perceive speech by watching someone speak (lipreading) or even by touching someone's face. Regardless of our level of hearing, we all lipread to some degree, and research shows that audiovisual speech integration occurs automatically and unconsciously. Our own research has shown that even pre-linguistic infants integrate audiovisual speech.

Our research also suggests that the brain treats auditory and visual speech information similarly. For example, the most useful portions of both auditory and visual speech signals are those that change over time. This has long been known for auditory speech, and our laboratory has now shown that it is also true of visual speech. For these purposes, we have developed a 'point-light' image technique in which all facial features are removed except for small reflective dots placed on the articulators. We have found that facial displays reduced to these moving dots can be lipread and can be integrated with auditory speech. Examples of these and other lab stimuli can be seen at http://www.faculty.ucr.edu/~rosenblu/lab-index.html.

I am also interested in face recognition and its relation to visual speech perception. Historically, visual speech and face recognition have been considered functionally and neuropsychologically separate. However, we have found that familiarity with a speaker's face can facilitate lipreading, and that image manipulations thought to affect only face recognition also influence visual speech perception. This link between visual speech and face functions might be based on their use of common information. In fact, we have found that the isolated visual speech information in our point-light displays can be used for recognizing faces. Interestingly, these relations between visual speech and face perception parallel those observed between auditory speech and voice recognition. Thus, this connection between speech and speaker information constitutes another way in which the brain treats auditory and visual speech similarly.

Regarding general auditory perception, I am interested in how humans use sound to help guide their everyday behaviors. We have found that listeners can anticipate the arrival of a looming sound source based on hearing just a short portion of the approach. We have also examined how humans use reflected sound to 'echolocate' their environment (similar to the skill of bats). In general, our research has helped reveal the exquisite sensitivity of the human auditory system to naturally occurring acoustic information.

Selected Publications

Pastore, R.E., Schmuckler, M.A., Rosenblum, L.D., and Szczesuil, R. (1983). Duplex perception with musical stimuli. Perception & Psychophysics, 33, 469-474.

Pastore, R.E., Szczesuil, R., and Rosenblum, L. (1984). Does silence simply separate speech components? Journal of the Acoustical Society of America, 75, 1904-1907.

Carello, C., Rosenblum, L.D., and Grosofsky, A. (1986). Static depiction of movement. Perception, 15, 41-58.

Turvey, M.T., Rosenblum, L.D., Schmidt, R.C., and Kugler, P.N. (1986). Fluctuations and phase symmetry in coordinated rhythmic movements. Journal of Experimental Psychology: Human Perception and Performance, 12, 564-583.

Rosenblum, L.D., Carello, C., and Pastore, R.E. (1987). Relative effectiveness of three stimulus variables for locating a moving sound source. Perception, 16, 175-186.

Turvey, M.T., Schmidt, R.C., Rosenblum, L.D., and Kugler, P.N. (1988). On the time allometry of coordinated rhythmic movements. Journal of Theoretical Biology, 130, 285-325.

Rosenblum, L.D. and Turvey, M.T. (1988). Maintenance tendency in coordinated rhythmic movements: Relative fluctuations and phase. Neuroscience, 27, 289-300.

Turvey, M.T., Schmidt, R.C., and Rosenblum, L.D. (1989). 'Clock' and 'Motor' components in absolute coordination of rhythmic movements. Neuroscience, 33, 1-10.

Kugler, P.N., Turvey, M.T., Schmidt, R.C. and Rosenblum, L.D. (1990). Investigating a nonconservative invariant of motion in coordinated rhythmic movements. Ecological Psychology, 2(2), 151-189.

Bingham, G.P., Schmidt, R.C., and Rosenblum, L.D. (1990). Hefting for maximum distance thrown: A smart perceptual mechanism. Journal of Experimental Psychology: Human Perception and Performance, 15, 507-528.

Fowler, C.A. and Rosenblum, L.D. (1991). Perception of the phonetic gesture. In I.G. Mattingly and M. Studdert-Kennedy (Eds.), Modularity and the Motor Theory. Hillsdale, NJ: Lawrence Erlbaum.

Fowler, C.A. and Rosenblum, L.D. (1991). Duplex perception: A comparison of monosyllables and slamming doors. Journal of Experimental Psychology: Human Perception and Performance, 16, 742-754.

Bingham, G.P., Schmidt, R.C., Turvey, M.T., and Rosenblum, L.D. (1991). Task dynamics and resource dynamics in the assembly of a coordinated rhythmic activity. Journal of Experimental Psychology: Human Perception and Performance, 17, 359-381.

Rosenblum, L.D. and Fowler, C.A. (1991). Audio-visual investigation of the loudness-effort effect for speech and nonspeech stimuli. Journal of Experimental Psychology: Human Perception and Performance, 17(4), 976-985.

Rosenblum, L.D., and Saldaña, H.M. (1992). Discrimination tests of visually-influenced syllables. Perception & Psychophysics, 52(4), 461-473.

Rosenblum, L.D., Saldaña, H.M., and Carello, C. (1993). Dynamical constraints on pictorial action lines. Journal of Experimental Psychology: Human Perception and Performance, 19(2), 381-396.

Saldaña, H.M. and Rosenblum, L.D. (1993). Visual influences on auditory pluck and bow judgments. Perception & Psychophysics, 54(3), 406-416.

Rosenblum, L.D. (1993). Acoustical information for controlled collisions. In A. Schick (Ed.), Contributions to Psychological Acoustics. Oldenburg, Germany: Bibliotheks- und Informationssystem der Carl von Ossietzky Universität Oldenburg.

Rosenblum, L.D., Wuestefeld, A.P., and Saldaña, H.M. (1993). Auditory looming perception: Influences on anticipatory judgments. Perception, 22, 1467-1482.

Rosenblum, L.D. (1994). How special is audiovisual speech integration? Current Psychology of Cognition, 13(1), 110-116.

Saldaña, H.M. and Rosenblum, L.D. (1994). Selective adaptation in speech perception using a compelling audiovisual adaptor. Journal of the Acoustical Society of America, 95(6), 3658-3661.

Bingham, G.P., Rosenblum, L.D., and Schmidt, R.C. (1995). Dynamics and the orientation of kinematic forms in visual event recognition. Journal of Experimental Psychology: Human Perception and Performance, 21(4), 1473-1493.

Rosenblum, L.D. and Saldaña, H.M. (1996). An audiovisual test of kinematic primitives for visual speech perception. Journal of Experimental Psychology: Human Perception and Performance, 22(2), 318-331.

Rosenblum, L.D., Wuestefeld, A.P., and Anderson, K.L. (1996). Auditory reachability: An affordance approach to the perception of sound source distance. Ecological Psychology, 8(3), 1-24.

Rosenblum, L.D., Johnson, J.A., and Saldaña, H.M. (1996). Visual kinematic information for embellishing speech in noise. Journal of Speech and Hearing Research, 39(6), 1159-1170.

Rosenblum, L.D., Schmuckler, M.A., and Johnson, J.A. (1997). The McGurk effect in infants. Perception & Psychophysics, 59(3), 347-357.

Rosenblum, L.D. & Saldaña, H.M. (1998). Time-varying information for visual speech perception. In R. Campbell, B. Dodd, & D. Burnham (Eds.), Hearing by Eye: Part 2, The Psychology of Speechreading and Audiovisual Speech (pp. 61-81). Hillsdale, NJ: Erlbaum.

Rosenblum, L.D., Yakel, D.A., & Greene, K.G. (2000). Face and mouth inversion effects on visual and audiovisual speech perception. Journal of Experimental Psychology: Human Perception and Performance, 26(3), 806-819.

Rosenblum, L.D., Gordon, M.S., & Jarquin, L. (2000). Echolocation by moving and stationary listeners. Ecological Psychology, 12(3), 181-206.

Rosenblum, L.D., Gordon, M.S. & Wuestefeld, A.P. (2000). Effects of performance feedback and feedback withdrawal on auditory looming perception. Ecological Psychology, 12(4), 273-291.

Yakel, D.A., Rosenblum, L.D., & Fortier, M.A. (2000). Effects of talker variability on speechreading. Perception & Psychophysics, 62, 1405-1412.

Rosenblum, L.D. & Yakel, D.A. (2001). The McGurk effect from single and mixed speaker stimuli. Acoustic Research Letters Online, 2, 67-72.

Gordon, M.S. & Rosenblum, L.D. (2001). Audiovisual Speech Web-Lab: An Internet teaching and research laboratory. Behavior Research Methods, Instruments, & Computers, 33(2), 267-269.

Rosenblum, L.D., Yakel, D.A., Baseer, N., Panchal, A., Nordarse, B.C. & Niehus, R.P. (2002). Visual speech information for face recognition. Perception & Psychophysics, 64(2), 220-229.

Rosenblum, L.D. (2002). The perceptual basis of audiovisual speech integration. Proceedings of the 7th International Conference on Spoken Language Processing, 3, 1461-1464.

Rosenblum, L.D. (2004). Perceiving articulatory events: Lessons for an ecological psychoacoustics. In J.G. Neuhoff (Ed.), Ecological Psychoacoustics (pp. 219-248). San Diego, CA: Elsevier.

Gordon, M.S. & Rosenblum, L.D. (2004). Perception of acoustic sound-obstructing surfaces using body-scaled judgments. Ecological Psychology, 16, 87-113.

Rosenblum, L.D. (2005). The primacy of multimodal speech perception. In D. Pisoni & R. Remez (Eds.), Handbook of Speech Perception (pp. 51-78). Malden, MA: Blackwell.

Gordon, M.S. & Rosenblum, L.D. (2005). Effects of intra-stimulus modality change on audiovisual time-to-arrival judgments. Perception & Psychophysics, 67, 580-594.

Rosenblum, L.D., Smith, N.M., Nichols, S., Lee, J. & Hale, S. (2006). Hearing a face: Cross-modal speaker matching using isolated visible speech. Perception & Psychophysics, 68, 84-93.

Rosenblum, L.D., Smith, N.M. & Niehus, R.P. (2007). Look who's talking: Recognizing friends from visible articulation. Perception, 36, 157-159.

Rosenblum, L.D., Miller, R.M. & S.K. (2007). Lipread me now, hear me better later: Crossmodal transfer of talker familiarity effects. Psychological Science, 18, 392-396.

Rosenblum, L.D. & Robart, R.L. (2007). Hearing silent shapes: Identifying the shape of a sound-obstructing surface. Ecological Psychology, 19, 351-366.