
Bejjanki Co-Authors Article on Multisensory Cue Integration

Vikranth Bejjanki

Assistant Professor of Psychology Vikranth Bejjanki co-authored a paper published in Scientific Reports, an online, open access journal from the publishers of Nature. “Sensory cue-combination in the context of newly learned categories” presents the results of research by Bejjanki and co-authors Kaitlyn Bankieris and Richard Aslin of the University of Rochester.

Objects and events in the natural environment provide human observers with multiple sources of information (or cues), within and across sensory modalities. According to Bejjanki, “A key challenge in psychology and neuroscience has been to understand the mechanisms that allow the human brain to integrate these multisensory cues into a coherent form.

“Prior research has shown that human observers combine multiple cues pertaining to stimuli drawn from continuous dimensions, such as distance or size, in a statistically efficient manner. However, most of our interactions with the world involve stimuli drawn from categorical dimensions, such as objects or words, and the mechanisms that underlie multisensory cue integration in such situations have been less clear,” he said.
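The "statistically efficient" combination Bejjanki describes is commonly modeled as maximum-likelihood integration, in which each cue's estimate is weighted by its reliability (inverse variance). A minimal sketch of that standard model, with made-up cue values purely for illustration:

```python
# Illustrative maximum-likelihood cue combination: independent Gaussian
# cue estimates are averaged with weights proportional to their
# reliability (1 / variance). All names and values here are hypothetical.

def combine_cues(estimates, variances):
    """Return the inverse-variance-weighted mean and combined variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined_mean = sum(w * x for w, x in zip(weights, estimates)) / total
    combined_variance = 1.0 / total  # always <= the smallest single-cue variance
    return combined_mean, combined_variance

# Example: a visual cue (estimate 10.0, variance 1.0) and an auditory cue
# (estimate 12.0, variance 4.0). The more reliable visual cue dominates.
mean, var = combine_cues([10.0, 12.0], [1.0, 4.0])
# mean = (10/1 + 12/4) / (1 + 0.25) = 10.4
# var  = 1 / 1.25 = 0.8
```

Note that the combined variance is lower than either cue's variance alone, which is the sense in which integrating cues is statistically optimal.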

Bejjanki and his colleagues have previously examined multisensory cue integration with natural categories, providing qualitative evidence that human observers utilize information about the distributional properties of task-relevant categories, in addition to sensory information, in such categorical cue integration tasks.

In the current study, the researchers expanded upon this prior work by creating and teaching human participants novel audiovisual categories, thereby allowing them to quantitatively evaluate participants’ multisensory integration of sensory and categorical information.

Their results showed that human observers utilize both sensory and categorical information during multisensory integration, and that they do so in a statistically optimal fashion.

Bejjanki said the findings “help elucidate the computational underpinnings of multisensory integration in real-world situations.”
