Neuroscientists at UC Berkeley are able to reconstruct what a person is listening to by directly reading their brainwaves.
Neuroscientists recorded electrical activity from many areas of the brain (yellow and red dots) as patients listened to the Pink Floyd song, “Another Brick in the Wall, Part 1.” Using artificial intelligence software, they were able to reconstruct the song from the brain recordings. This is the first time a song has been reconstructed from intracranial electroencephalography recordings.
Their goal? To capture the electrical activity of brain regions specialized in attributes of music — tone, rhythm, harmony and words — to see if they could reconstruct what the patient was hearing.
Here is a recording of the original music:
And now, here is that same music, but the version the neuroscientists at UC Berkeley were able to retrieve by decoding the test subject's brain waves:
The researchers did NOT train the neural net on the phrase you can hear in the example in the link. They were looking for similar patterns, and found them. Again, these are brainwaves being analyzed and translated; there are no audio signals in the data. The music comes from "decoding" the brainwaves, which are then played back as audio. Listen for the phrase "All in all, it was just a brick in the wall," which sounds like it's underwater.
The music patterns were found after detailed AI analysis of data from 29 patients undergoing preparation for epilepsy surgery. Across those patients, 2,668 electrodes spaced a few millimeters apart sat on the surface of the brain, giving detailed measurements. Because these intracranial electroencephalography (iEEG) recordings can be made only from the surface of the brain (as close as you can get to our auditory centers), it may be a while before someone can eavesdrop on your thoughts from a distance. However, last year an Australian team was able to get similar results using magnetic sensors outside the head.
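The decoding step can be pictured as a regression problem: learn a mapping from electrode activity to an audio spectrogram, then invert new brain recordings through that mapping. The following is a minimal toy sketch on synthetic data; the shapes, the ridge-regression model, and all variable names here are illustrative assumptions, not the team's actual pipeline.

```python
import numpy as np

# Toy stand-in for iEEG decoding: synthetic data, not real recordings.
rng = np.random.default_rng(0)

n_samples, n_electrodes, n_freq_bins = 500, 40, 32

# Fabricate a hidden linear mapping from neural features to spectrogram bins
W_true = rng.normal(size=(n_electrodes, n_freq_bins))
X = rng.normal(size=(n_samples, n_electrodes))               # neural features
Y = X @ W_true + 0.1 * rng.normal(size=(n_samples, n_freq_bins))  # spectrogram

# Simple train/test split
X_train, X_test = X[:400], X[400:]
Y_train, Y_test = Y[:400], Y[400:]

# Ridge regression in closed form: W = (X'X + lam*I)^-1 X'Y
lam = 1.0
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_electrodes),
                    X_train.T @ Y_train)

# "Decode" held-out brain activity into a predicted spectrogram
Y_pred = X_test @ W

# Score: correlation between predicted and actual values per frequency bin
corr = [np.corrcoef(Y_pred[:, k], Y_test[:, k])[0, 1]
        for k in range(n_freq_bins)]
print(f"mean decoding correlation: {np.mean(corr):.3f}")
```

In the real study the predicted spectrogram is then converted back into a waveform, which is why the reconstructed song sounds muffled, like it is underwater: only what the model captures survives the round trip.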
Ludovic Bellier and Dr. Robert Knight, a neurologist and UC Berkeley professor of psychology, and their team were also the first to translate the words a person was hearing from their brainwaves, back in 2012. It has long been understood that speech functions reside mostly on the left side of the brain. Music processing is more evenly distributed, but weighted toward the right side. The group also identified a possible center for perceiving rhythm in the right superior temporal gyrus, another major discovery if confirmed. Using the most closely spaced electrodes, they even found neurons that pulsed at the same frequency as the Pink Floyd rhythm guitar.
There is actually a point to all this; there are practical uses. All languages contain melodic nuances, including tempo, stress, accents and intonation. “These elements, which we call prosody, carry meaning that we can’t communicate with words alone,” Bellier says. He hopes the model will improve brain-computer interfaces, assistive devices that record speech-associated brain waves and use algorithms to reconstruct intended messages.
We also offer you this opinion piece on the future of AI. Every day it becomes more important to control and regulate AI technology. The field is still young, but already artificial intelligence is powerful enough to actually read minds.
David Raiklen wrote, directed and scored his first film at age 9. He began studying keyboard and composing at age 5. He attended, then taught at UCLA, USC and CalArts. Among his teachers are John Williams and Mel Powell.
He has worked for Fox, Disney and Sprint. David has received numerous awards for his work, including the 2004 American Music Center Award. Dr. Raiklen has composed music and sound design for theater (Death and the Maiden), dance (Russian Ballet), television (Sing Me a Story), cell phone (Spacey Movie), museums (Museum of Tolerance), concert (Violin Sonata), and film (Appalachian Trail).
His compositions have been performed at the Hollywood Bowl and the first Disney Hall. David Raiklen is also the host of a successful radio program, Classical Fan Club.