Neuroscientists from Trinity have identified a brain signal specifically associated with the conversion of speech into understanding.
The group discovered that the signal is present when listeners have understood what they are hearing, but absent when they did not understand it or were not paying attention.
Ussher Assistant Professor in Trinity’s School of Engineering, Ed Lalor, led the research team, composed of scientists from Trinity and the University of Rochester. The discovery has a number of potential applications, including tracking language development in infants, assessing brain function and identifying the early onset of dementia in older people.
Lalor noted in a press statement that, though there is more work to be done in this field, “we have already begun searching for other ways that our brains might compute meaning, and how those computations differ from those performed by computers”.
The findings, published today in the journal Current Biology, underline the remarkable nature of the human brain’s capacity for understanding speech, particularly words that have different meanings in different contexts. Until now, however, how the brain computes the meaning of words in context has remained unclear.
The researchers used state-of-the-art techniques of the kind computers and smartphones employ to “understand” speech. To test whether human brains recognise the similarities and differences between spoken words, the researchers recorded electrical brainwave signals from their subjects’ scalps as the participants listened to audiobooks. Analysing this data, they found a specific brain response that reflected how similar a given word was to the words that had preceded it in the story.
This signal disappeared entirely when participants could not understand the speech because of background noise, or when they were not paying attention.
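The core idea, measuring how similar each new word is to the words that preceded it, can be sketched with word embeddings and cosine similarity. The snippet below is a minimal illustration, not the researchers’ actual pipeline: the tiny hand-picked vectors and the `semantic_dissimilarity` helper are hypothetical, standing in for embeddings learned from large text corpora.

```python
import math

# Toy 3-dimensional word embeddings (hypothetical values for illustration;
# real studies derive vectors from large text corpora).
EMBEDDINGS = {
    "dog":  [0.9, 0.1, 0.0],
    "cat":  [0.8, 0.2, 0.1],
    "bark": [0.7, 0.3, 0.0],
    "bank": [0.1, 0.9, 0.2],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1 = same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def semantic_dissimilarity(word, context):
    """1 minus the cosine similarity between a word's vector and the
    average vector of the words that preceded it in the story."""
    avg = [sum(vals) / len(context)
           for vals in zip(*(EMBEDDINGS[w] for w in context))]
    return 1.0 - cosine_similarity(EMBEDDINGS[word], avg)

# In a dog/cat context, "bank" is a bigger semantic jump than "bark".
d_bark = semantic_dissimilarity("bark", ["dog", "cat"])
d_bank = semantic_dissimilarity("bank", ["dog", "cat"])
```

In the study’s framing, a per-word score like this is what the recorded brain response tracked: larger semantic jumps produced a measurable signal, provided the listener understood and attended to the speech.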
“We hope the new approach will make a real difference when applied in some of the ways we envision”, Lalor said.