Scientists at the University of Rochester have shown for the first time that our brains automatically consider many possible words and their meanings before we’ve even heard the final sound of the word.
Previous theories have proposed that listeners can only keep pace with the rapid rate of spoken language—up to 5 syllables per second—by anticipating a small subset of the words they know, much as Google search anticipates words and phrases as you type. This subset consists of all words that begin with the same sounds, such as “candle,” “candy,” and “cantaloupe,” and it makes identifying the specific word more efficient than waiting until all of the word’s sounds have been presented.
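As a rough illustration of this idea (a toy sketch, not the researchers’ actual model), the candidate set can be thought of as a prefix filter over the listener’s vocabulary: with each incoming sound, only words consistent with what has been heard so far remain. The lexicon and function below are hypothetical.

```python
# Toy lexicon standing in for the listener's vocabulary (hypothetical).
LEXICON = ["candle", "candy", "cantaloupe", "camera", "table"]

def cohort(prefix, lexicon=LEXICON):
    """Return all words consistent with the sounds heard so far."""
    return [w for w in lexicon if w.startswith(prefix)]

# As more of the word is heard, the candidate set narrows:
print(cohort("ca"))    # ['candle', 'candy', 'cantaloupe', 'camera']
print(cohort("can"))   # ['candle', 'candy', 'cantaloupe']
print(cohort("cand"))  # ['candle', 'candy']
```

The point of the new findings is that, on this view, the brain would be entertaining not just these candidate word forms but their meanings as well, all before the word is complete.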
But until now, researchers had no way to know whether the brain also considers the meanings of these possible words. The new findings mark the first time that scientists, using an MRI scanner, have been able to actually see this split-second brain activity…