
A head X-ray of one participant in the experiment shows the placement of electrodes over the frontal (top) and temporal (bottom) regions of the brain. These electrodes were placed on the surface of the brain to locate the origin points of epileptic seizures. Credit: UC Berkeley
Key points:
- Neuroscientists successfully decoded the words to a Pink Floyd song from the recorded electrical activity of the brain.
- The reconstruction shows the feasibility of recording and translating brain waves to capture the musical elements of speech.
- The discovery has implications for people who have trouble communicating, whether because of stroke or paralysis.
After 10 years of analysis, neuroscientists at the University of California, Berkeley, have shown it is possible to use recordings of brain waves to capture and reproduce the musical elements of speech, as well as its syllables.
For the research, neuroscientists at Albany Medical Center recorded the activity of electrodes placed on the brains of 29 patients undergoing epilepsy surgery as the chords of Pink Floyd's "Another Brick in the Wall, Part 1" filled the surgery suite.
In this new study, postdoctoral fellow Ludovic Bellier reanalyzed those brain recordings, obtained in 2012 and 2013, hoping to go beyond previous studies, which had tested whether decoding models could identify different musical pieces and genres, and actually reconstruct musical phrases through regression-based decoding models and artificial intelligence.
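Regression-based decoding of this kind treats reconstruction as a mapping problem: learn weights that predict each frequency band of the song's spectrogram from the recorded electrode activity, then invert the predicted spectrogram back into sound. The sketch below illustrates only the general idea on synthetic data; it is not the study's actual pipeline, and the electrode features, time lags, and model settings are placeholder assumptions.

```python
# A minimal sketch of regression-based stimulus reconstruction, assuming
# synthetic stand-ins for neural features and the song spectrogram.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical dimensions: time frames, electrodes, time lags, spectrogram bins.
n_samples, n_electrodes, n_lags, n_freq_bins = 2000, 29, 5, 32

# Hypothetical neural features: per-electrode activity stacked over several lags.
X = rng.standard_normal((n_samples, n_electrodes * n_lags))

# Hypothetical target: spectrogram frames of the song the patients were hearing,
# simulated here as a noisy linear function of the features.
true_weights = 0.1 * rng.standard_normal((n_electrodes * n_lags, n_freq_bins))
Y = X @ true_weights + 0.5 * rng.standard_normal((n_samples, n_freq_bins))

X_train, X_test, Y_train, Y_test = train_test_split(
    X, Y, test_size=0.2, random_state=0
)

# One regularized linear model predicts every spectrogram frequency bin at once.
model = Ridge(alpha=10.0)
model.fit(X_train, Y_train)
Y_pred = model.predict(X_test)

# Per-bin correlation between predicted and actual spectrogram frames,
# a common way to score decoding accuracy in this kind of work.
corrs = [
    np.corrcoef(Y_test[:, f], Y_pred[:, f])[0, 1] for f in range(n_freq_bins)
]
print(f"Mean spectrogram reconstruction correlation: {np.mean(corrs):.2f}")
```

In a real analysis, the predicted spectrogram would then be converted back into an audible waveform, which is how the reconstructed song described below was produced.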
According to the results, the phrase "All in all it was just a brick in the wall" comes through recognizably in the reconstructed song, its rhythms intact and its words muddy but decipherable. This is the first time researchers have reconstructed a recognizable song from brain recordings.
For people who have trouble communicating, whether because of stroke or paralysis, such recordings from electrodes on the brain surface could help reproduce the musicality of speech that's missing from today's robot-like reconstructions.
As brain recording techniques continue to improve, it may be possible to make such recordings without opening the brain, perhaps using sensitive electrodes attached to the scalp. Currently, scalp EEG can measure brain activity to detect an individual letter from a stream of letters, but the approach takes at least 20 seconds to identify a single letter, making communication effortful and difficult.
"Noninvasive techniques are just not accurate enough today. Let's hope, for patients, that in the future we could, from just electrodes placed outside on the skull, read activity from deeper regions of the brain with a good signal quality. But we are far from there," said Bellier.
Study co-author Robert Knight, a neurologist and UC Berkeley professor of psychology, is now embarking on new research to understand the brain circuits that allow some people with aphasia due to stroke or brain damage to communicate by singing when they cannot otherwise find the words to express themselves.