Have you got the AI Factor? Robots can now identify a hit… so Simon Cowell may have to watch out! 

  • AI can now identify a hit with near 100% accuracy, a study found

Music magnates like Simon Cowell may find that their days are numbered.

Artificial intelligence can now identify a surefire hit with nearly 100 percent accuracy, a study found.

Simply asking people which songs they liked best is a terrible way to identify a hit, researchers found after recruiting 33 volunteers to listen to 24 recent songs.

But by measuring their brain signals and letting AI interpret the results, a hit can be distinguished from a flop with 97.2 percent accuracy.

The AI can recognize a hit with an impressive 82 percent accuracy after hearing just one minute of the song.

Music magnates like Simon Cowell will find their days numbered as artificial intelligence can now identify a surefire hit with nearly 100 percent accuracy, a study finds

The results were tested against the modern definition of a hit: whether a song has been played more than 700,000 times on internet streaming services.

That’s incredibly useful when 168,000 songs are released worldwide each week, and less than four percent of them become hits.

The researchers think the technology could be used to predict which movies, TV shows and even social media posts will take off.

Professor Paul Zak, senior author of the study, from Claremont Graduate University, said: ‘Applying machine learning to neurophysiological data allowed us to identify hit songs almost perfectly.

‘That the neural activity of 33 people can predict whether millions of others are listening to new songs is quite astonishing.

‘Nothing close to this accuracy has ever been shown before.’

The volunteers in the study listened to songs from a variety of genres, including rock music and hip-hop, that had been released in the previous six months.

They wore a simple fitness tracker that recorded their heart rate.

Years of previous research suggest that subtle patterns in a person’s heart rate, such as fluctuations in its speed or in the gap between beats, can show whether they are producing the brain chemical dopamine while focusing on a song, and the hormone oxytocin when they emotionally connect with it.

The researchers used this to find out when people were in a state of “immersion” and invested in the song, and when they entered a state of “retreat” where they were less interested and only half listened.
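The article does not spell out how raw heart-rate data becomes an ‘immersion’ or ‘retreat’ label. As a rough illustration only, the sketch below derives two common heart-rate-variability measures (average beat rate and the variability of the gaps between beats) and applies an invented threshold; the score and threshold are assumptions for illustration, not the study’s actual method.

```python
import numpy as np

def label_listening_state(beat_times, immersion_threshold=0.6):
    """Toy illustration: turn heartbeat timestamps (in seconds) into an
    'immersion' or 'retreat' label.

    NOTE: the score and threshold below are invented for illustration;
    the study's real measure is not described in the article.
    """
    beat_times = np.asarray(beat_times, dtype=float)
    ibis = np.diff(beat_times)                    # inter-beat intervals (the "gap between beats")
    mean_rate = 60.0 / ibis.mean()                # average beats per minute ("fluctuations in speed")
    rmssd = np.sqrt(np.mean(np.diff(ibis) ** 2))  # a standard heart-rate-variability measure (RMSSD)

    # Invented score combining the two features; purely illustrative.
    score = (mean_rate / 100.0) * (1.0 - min(rmssd, 0.2) / 0.2)
    return "immersion" if score > immersion_threshold else "retreat"

# Example: beats roughly every 0.8 seconds across one minute of listening
beats = np.cumsum(np.full(75, 0.8))
print(label_listening_state(beats))
```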

The volunteers in the study listened to songs from a variety of genres, including rock music and hip-hop, that had been released in the past six months (file photo)

This information, analyzed using old-fashioned statistics, was able to identify hits and flops with 69 percent accuracy.

But that jumped to 97.2 percent when using AI, which could calculate immersion and retreat in nearly 800 different ways.
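For readers curious what ‘old-fashioned statistics’ versus machine learning might look like in practice, the sketch below compares a plain linear model with a non-linear ensemble on made-up immersion/retreat features. The data, feature names and models are assumptions for illustration; the paper’s exact pipeline is not given in the article, and the toy accuracies here will not reproduce the 69 or 97.2 percent figures.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per song, columns are aggregate
# immersion/retreat measurements across listeners (invented for illustration).
n_songs, n_features = 24, 8
X = rng.normal(size=(n_songs, n_features))
is_hit = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_songs)) > 0  # toy "hit" label

# 'Old-fashioned statistics': a single linear model on the raw features.
linear = LogisticRegression(max_iter=1000)
print("linear model accuracy:", cross_val_score(linear, X, is_hit, cv=4).mean())

# Machine-learning approach: an ensemble that can combine the features in
# many non-linear ways (loosely analogous to the '800 different ways').
forest = RandomForestClassifier(n_estimators=200, random_state=0)
print("random forest accuracy:", cross_val_score(forest, X, is_hit, cv=4).mean())
```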

The people in the study were also asked a series of questions about how much they liked each song.

But how much people said they liked songs they hadn’t heard before was unrelated to whether those songs became hits, the researchers found.

Professor Zak said: ‘You don’t necessarily know if a song is going to be a hit, but your brain does, which is crazy cool.’

Although the study used a small number of songs, Professor Zak said: ‘In the future, the right entertainment could be directed to audiences based on their neurophysiology. Instead of getting hundreds of choices, they might only get two or three, making it easier and faster for them to choose music they’ll enjoy.’

The study is published in the journal Frontiers in Artificial Intelligence.