ARTIFICIAL intelligence has learned to identify the songs someone is listening to from their brain readings.
Derek Lomas at Delft University of Technology in the Netherlands and his colleagues asked 20 people to listen to 12 songs through headphones. The volunteers did this blindfolded and in a dimly lit room to minimise the effect of their other senses on the results. Each person’s brainwaves were recorded using an electroencephalography (EEG) cap that detects electrical activity.
The EEG readings from each person were cut into short segments and paired with the matching music clips to train an AI to learn patterns linking the two. The AI was then tested on unseen portions of the data, identifying songs with an accuracy of 85 per cent.
But the software struggles if it is trained on EEG data from one person and then attempts to identify a song when someone else listens to it. Accuracy in such tests dropped below 10 per cent (CODS COMAD 2021, doi.org/frks).
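A toy simulation can illustrate why this pattern of results is plausible. Everything below is an assumption for illustration only: the synthetic "EEG" data, the nearest-centroid classifier and all the numbers are invented, not the researchers' actual pipeline. The idea is that each song has a characteristic signal, but each listener transforms that signal in their own idiosyncratic way, so a model trained on one person's recordings transfers poorly to another's:

```python
import numpy as np

rng = np.random.default_rng(0)
n_songs, segs, dim = 12, 30, 64          # 12 songs, 30 segments each (invented)
song_sig = rng.normal(size=(n_songs, dim))  # one "signature" per song

def person_segments(mixing, noise=0.4):
    # Each listener applies their own transformation to the song signal,
    # standing in for an individual aesthetic response; noise per segment.
    X = np.concatenate([song_sig[s] @ mixing + noise * rng.normal(size=(segs, dim))
                        for s in range(n_songs)])
    y = np.repeat(np.arange(n_songs), segs)
    return X, y

# Two listeners with different idiosyncratic transformations.
mix_a = rng.normal(size=(dim, dim)) / np.sqrt(dim)
mix_b = rng.normal(size=(dim, dim)) / np.sqrt(dim)
X_a, y_a = person_segments(mix_a)
X_b, y_b = person_segments(mix_b)

def nearest_centroid_acc(X_tr, y_tr, X_te, y_te):
    # Simple stand-in classifier: label each test segment with the song
    # whose mean training segment it sits closest to.
    centroids = np.stack([X_tr[y_tr == s].mean(0) for s in range(n_songs)])
    pred = np.argmin(((X_te[:, None] - centroids) ** 2).sum(-1), axis=1)
    return (pred == y_te).mean()

# Within-subject: train on most of person A's segments, test on the held-out rest.
train = np.arange(len(y_a)) % segs < 24
within = nearest_centroid_acc(X_a[train], y_a[train], X_a[~train], y_a[~train])

# Cross-subject: centroids learned from person A, tested on person B.
cross = nearest_centroid_acc(X_a[train], y_a[train], X_b, y_b)
print(round(within, 2), round(cross, 2))
```

Under these assumptions the within-subject accuracy is high while the cross-subject accuracy collapses towards chance (1 in 12, roughly 8 per cent), echoing the gap the study reports.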
The researchers believe this is because each person’s aesthetic response to a song is unique, and because people tend to focus on different elements of the music while listening. Ultimately, however, they aim to identify aspects of EEG responses to music that are common to all humans.
Lomas hopes that this will further our understanding of the brain, as well as boost knowledge of how and why humans consume music.
“I think it’s really provocative to think about how the combination of machine learning and high-density data from EEG can be combined to bring insights into moving emotional experiences, but also to figure out what’s going on inside your head,” he says.
Music is ultimately “just voltage fluctuations”, he says. “And it’s the same with the EEG.”