Introduction to the Issue
AI-generated songs are increasingly sneaking into the catalogs of real artists on streaming platforms, according to Dougie Brown of UK Music, the body representing the British music industry. These fraudulent tracks are uploaded under actual artists' names so that the uploaders can collect the associated royalties.
Case Studies: Emily Portman and Paul Bender
Emily Portman, a British folk musician, was surprised to find an unreleased album, “Orca,” listed in her Spotify and Apple Music catalogs in July. The song titles resembled her earlier work, but on closer listening she realized it was an AI-generated album modeled on her previous records.
Portman worries that fans might believe she created the album, or even enjoy it, despite the voice's artificial polish and the meaningless lyrics. She has been unable to identify the fraudsters but suspects they impersonated her to a music distribution company.
Paul Bender, an Australian musician, discovered four poorly composed AI-generated songs on the profiles of his band, The Sweet Enoughs. Bender criticized the streaming industry's weak authentication for uploading music online, which makes it easy for anyone to falsely claim authorship.
Bender shared his concerns on Instagram and received hundreds of messages from artists and fans. He found numerous suspicious albums, including some falsely attributed to deceased artists such as Sophie. He then launched a petition on change.org, which gathered over 24,000 signatures urging platforms to strengthen their safeguards.
“Musical Plundering”
The motive, Brown explains, is financial: although royalties per stream are low, they can accumulate quickly, especially when play counts are amplified by bots.
Portman and Bender asked for the fraudulent tracks to be removed, which took anywhere from 24 hours to eight weeks. Neither, however, filed a formal complaint.
Some jurisdictions, notably California, have laws protecting artists from imitation, but copyright law in other countries, including the UK, has yet to address the risks posed by AI.
As AI music generators such as Suno and Udio become more sophisticated, most listeners struggle to tell AI-created songs from human performances, according to a November 2024 Ipsos study for the French platform Deezer.
AI-generated acts such as The Velvet Sundown have amassed a million followers on Spotify. The streaming giants acknowledge that AI is exacerbating existing industry problems such as spam, fraud, and deceptive content.
Criticized for a lack of transparency around AI, the platforms have announced measures to improve reliability and disclosure. Both Spotify and Apple Music say they work proactively with distributors to detect such fraud.
Key Questions and Answers
- What is the issue? AI-generated songs are infiltrating real artists’ playlists without authorization, with fraudsters claiming associated royalties.
- Who are the affected artists? Emily Portman, a British folk musician, and Paul Bender, an Australian musician, are among those impacted.
- How is this happening? The streaming industry lacks robust authentication for uploads, allowing anyone to falsely claim authorship of music they did not create.
- What are the consequences? AI-generated music can deceive fans into believing it’s created by human artists, potentially damaging genuine artists’ reputations.
- What are the platforms doing about it? Streaming services such as Spotify and Apple Music say they work preventively with distributors to detect and address these fraudulent uploads.