Jeremy Skipper

Assistant Professor of Psychology Jeremy I. Skipper gave an invited talk at a workshop sponsored by the Experimental Psychology Society (EPS), held Jan. 7-8 at University College London. In "Hearing lips and… hands, smiles and print too: How listening to words in the wild is not all that auditory to the brain," he discussed the role of visual contextual cues in the processing of auditory information.

In his talk, part of the workshop "WHAT IF... the study of language started from the investigation of signed, rather than spoken, languages?", Skipper said that "the brain uses available context to predict forthcoming sounds or words resulting in less need to process incoming auditory information." He demonstrated that "speech-associated mouth movements, co-speech gestures, and valenced facial expressions are all used by the brain to generate predictions of sounds or words associated with those forms of context, resulting in a dramatic reduction of processing in auditory areas."

He concluded that “real-world language comprehension is not always strongly reliant on auditory brain areas because (visual) contextual information can be used to supplant processing in those areas. This implies that the de-contextualized picture of language and the brain that has emerged from studying isolated sounds or words is incomplete if not misleading.”