
AI systems process data 'startlingly like' human brain, finds UC Berkeley study

The newly documented similarities between brain waves and AI signals offer a benchmark of how close researchers are to building mathematical models that resemble humans as closely as possible

A new study has revealed that artificial intelligence (AI) systems are capable of processing data in a manner that is remarkably comparable to how the brain decodes speech.

Researchers from the University of California, Berkeley, tracked individuals' brain activity as they listened to a single syllable, "bah". They then matched that brain activity against the signals produced by an AI system that had been trained to understand English.


A side-by-side graph of the two signals revealed a startling likeness. The researchers said the data was raw and unaltered.

"Understanding how different architectures are similar or different from humans is important," said Gasper Begus, assistant professor of linguistics at UC Berkeley and lead author of the study published recently in the journal Scientific Reports.

That is because, he said, understanding how those signals compare to the brain activity of human beings is an important benchmark in the race to build increasingly powerful systems.

For example, Begus said, having that understanding could help put guardrails on increasingly powerful AI models. It could also improve our understanding of how errors and bias are baked into the learning processes.

To do so, Begus turned to his training in linguistics.

He said that the sound of spoken words enters our ears and gets converted into electrical signals, which then travel through the brainstem and to the outer parts of our brain.

Using electrodes, researchers traced that path in response to 3,000 repetitions of a single sound and found that the brain waves for speech closely followed the actual sounds of language.

The researchers transmitted the same recording of the "bah" sound through an unsupervised neural network - an AI system - that could interpret sound. They then measured the coinciding waves and documented them as they occurred.
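The comparison described above - overlaying the brain's measured response with the network's signal for the same sound - can be sketched as a simple waveform-similarity check. This is an illustrative assumption only, not the study's actual analysis pipeline: the synthetic signals and the choice of Pearson correlation as a similarity metric are stand-ins for the researchers' real recordings and methods.

```python
import numpy as np

# Illustrative only: two synthetic, decaying oscillations standing in for
# the averaged brain response and the AI network's signal for one sound.
t = np.linspace(0, 0.05, 500)                                # 50 ms window
brain_wave = np.sin(2 * np.pi * 100 * t) * np.exp(-t / 0.02)
ai_wave = 0.8 * np.sin(2 * np.pi * 100 * t + 0.1) * np.exp(-t / 0.02)

# Pearson correlation as a crude score of how closely the two
# waveforms track each other (1.0 = identical shape).
r = np.corrcoef(brain_wave, ai_wave)[0, 1]
print(f"waveform correlation: {r:.3f}")
```

In practice, a high correlation between two such traces is what a "startling likeness" in a side-by-side graph would look like numerically.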

Begus said he and his colleagues are collaborating with other researchers using brain imaging techniques to measure how these signals might compare. They're also studying how other languages, like Mandarin, are decoded in the brain differently and what that might indicate about knowledge.

Many models are trained on visual cues, like colours or written text - both of which have thousands of variations at the granular level. Language, however, opens the door for a more solid understanding, Begus said.

The English language, for example, has just a few dozen sounds.

"If you want to understand these models, you have to start with simple things. And speech is way easier to understand," Begus said.

In cognitive science, the researchers said, one of the primary goals is to build mathematical models that resemble humans as closely as possible.

The newly documented similarities between brain waves and AI signals are a benchmark of how close researchers are to meeting that goal, they said.

(PTI)
