A Massively Parallel Multisensory Switchboard

Nearly everything we do involves combining information we get from the various senses.

When you listen to someone talk, you watch the speaker's mouth and combine what you hear with what you see. When you read, you decode the letter shapes you see into the sounds that make up the words you know.

But think about the sorts of information your senses represent: comparing how something sounds with how it looks is like comparing apples to oranges. And yet the brain communicates between, and even combines, all the senses all the time. How does it do it so well? Can it be improved? And what happens if something goes wrong?

The Computational Cognitive Neuroscience Laboratory uses behavioral experiments, functional magnetic resonance imaging (fMRI), and computational modeling with artificial neural networks to test ideas about how the brain can so flexibly translate between different sensorimotor representations.
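As a toy illustration of this kind of translation model (a sketch under assumed details, not the lab's actual architecture), the example below trains a small feedforward network to map a hypothetical 64-dimensional "auditory" feature vector onto a 32-dimensional "visual" one. The dimensions, architecture, and synthetic data are all assumptions chosen for the example.

```python
import torch
import torch.nn as nn

# Hypothetical feature sizes for two sensory codes (chosen arbitrarily).
AUDITORY_DIM, VISUAL_DIM, HIDDEN_DIM = 64, 32, 128

# A small feedforward "translator" from one modality's code to the other's.
translator = nn.Sequential(
    nn.Linear(AUDITORY_DIM, HIDDEN_DIM),
    nn.ReLU(),
    nn.Linear(HIDDEN_DIM, VISUAL_DIM),
)

# Synthetic placeholder data; real work would use feature vectors derived
# from paired audiovisual stimuli (e.g., speech sounds and mouth shapes).
auditory = torch.randn(256, AUDITORY_DIM)
visual = torch.randn(256, VISUAL_DIM)

optimizer = torch.optim.Adam(translator.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Standard supervised training loop: predict the visual code from the
# auditory code and minimize the mean squared error between them.
for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(translator(auditory), visual)
    loss.backward()
    optimizer.step()
```

With random placeholder data the network learns nothing meaningful; the point is only the shape of the setup: one sensory representation in, another out.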


Exploring the Human Connectome

The brain is wired to integrate an enormous amount of information.
In real time.

Each of your senses has its own specialized center in the brain. How these centers communicate with one another depends on the pathways of neural connections between them. Mapping the connections between all the brain's neurons is a huge challenge, and a complete map is still a long way off. In the meantime, our work uses machine learning to discover connections between brain regions from patterns of activity in fMRI data.
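To make the idea concrete, here is a minimal sketch of one common, simpler stand-in for that approach: correlation-based functional connectivity, which treats the pairwise correlation between regional fMRI time series as evidence of a connection. The synthetic signals, region count, and threshold below are assumptions for the example, not the lab's pipeline.

```python
import numpy as np

# Synthetic stand-in for preprocessed fMRI data: time points x regions.
rng = np.random.default_rng(0)
n_timepoints, n_regions = 200, 10
bold = rng.standard_normal((n_timepoints, n_regions))

# Functional connectivity: correlate every region's time series with
# every other region's. np.corrcoef expects variables in rows.
connectivity = np.corrcoef(bold.T)  # shape (n_regions, n_regions)

# Keep only strong correlations to sketch a binary connection graph,
# ignoring each region's trivial correlation with itself.
threshold = 0.3  # arbitrary cutoff for the example
adjacency = (np.abs(connectivity) > threshold) & ~np.eye(n_regions, dtype=bool)
```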


Building Better Brain Models

Modeling normal and atypical multisensory integration helps us understand cognitive deficits and what we might do to treat them.

A complete understanding of how various systems communicate with one another lets us predict or better identify the sorts of problems that might arise from a particular pattern of brain connectivity (e.g., after a stroke), and could lead to novel therapies designed to target these connections.
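As a rough sketch of how such a prediction might work (illustrative assumptions throughout: random connectivity, and networkx's global efficiency as a stand-in for "communication capacity"), one can simulate a focal lesion by deleting a region from a connectivity graph and measuring how whole-network communication degrades.

```python
import networkx as nx
import numpy as np

# Hypothetical binary connectivity among 10 brain regions.
rng = np.random.default_rng(1)
adj = (rng.random((10, 10)) > 0.7).astype(int)
adj = np.triu(adj, 1)                     # keep one triangle...
graph = nx.from_numpy_array(adj + adj.T)  # ...then symmetrize (undirected)

baseline = nx.global_efficiency(graph)

# Simulate a focal lesion (e.g., a stroke) by removing one region and
# all of its connections, then re-measure network-wide efficiency.
lesioned = graph.copy()
lesioned.remove_node(3)  # which region is lesioned is arbitrary here
print(f"global efficiency: {baseline:.3f} -> {nx.global_efficiency(lesioned):.3f}")
```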
