AI-generated songs are appearing on artists’ official streaming profiles, with scammers exploiting weak verification to upload fraudulent tracks and claim royalties.
LONDON: Musicians are raising the alarm over a surge in AI-generated tracks fraudulently uploaded to their official profiles on major streaming platforms.
British folk artist Emily Portman discovered a fake album titled “Orca” on her Spotify and Apple Music pages in July, despite not having released new music since 2022.
She described the AI-produced music as having “pristine perfection” in the vocals but “vacuous lyrics”, and said it was clearly trained on her previous work.
Portman believes scammers pose as artists to distribution companies, which then upload the music without carrying out proper identity checks.
Australian musician Paul Bender also found four “bizarrely bad” AI songs added to his band The Sweet Enoughs’ profiles earlier this year.
He criticised the streaming industry’s lax security, calling the process “the easiest scam in the world”.
Bender compiled a list of suspect albums, including some on the profiles of deceased artists, and a petition he launched garnered 24,000 signatures.
A November study for Deezer found most listeners now cannot distinguish AI-generated tracks from genuine artist recordings.
UK Music’s Dougie Brown explained that fraudsters upload music under real artists’ names to claim streaming royalties.
Fraudsters then amplify those revenues by using bots to artificially inflate play counts, he added.
Both Portman and Bender successfully requested that platforms remove the fake tracks, a process that took anywhere from 24 hours to eight weeks.
Philip Morris of the Musicians’ Union said limited copyright laws in places like the UK leave artists vulnerable to such AI impersonation.
Spotify recently announced new measures to improve reliability and transparency, acknowledging AI accelerates problems like fraud.
The platform said it is working with distributors to better detect deceptive content upstream.