Spotify goes after AI-generated content

By Nat Rubio-Licht

Sep 26, 2025, 12:00pm UTC


Spotify wants to sift out the real from the fake.

The streaming company announced changes on Thursday aimed at the surge of AI-generated music on its platform, as generative models make it easier than ever to mass-produce tracks.

Spotify’s payouts to artists have skyrocketed from $1 billion in 2014 to $10 billion in 2024, the company said in its press release. That growing pool of money, in turn, has attracted “bad actors.”

“In the past 12 months alone, a period marked by the explosion of generative AI tools, we’ve removed over 75 million spammy tracks from Spotify,” the company said.

Spotify is implementing three changes:

  • It debuted a new impersonation policy to clarify rules around AI voice clones and impersonation, giving artists “stronger protections and clearer recourse.”
  • This fall, it’s releasing a music spam filter to identify “slop,” such as mass uploads, duplicates, or artificially short tracks of just over 30 seconds, the point at which a play typically starts counting toward royalties. These tactics dilute the royalty pool “as AI tools make it simpler for anyone to generate large volumes of music.” (A rough illustration of such heuristics follows this list.)
  • In partnership with DDEX, a standard-setting organization for the music business, it’s developing a new metadata standard for disclosing when AI is used in the song creation process.
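
To make the spam-filter idea concrete, here is a minimal, purely illustrative sketch of the kinds of heuristics described above: flagging artists with implausibly large upload counts, duplicate audio, and tracks padded just past the 30-second streaming threshold. The class names, thresholds, and fingerprinting approach are assumptions for the example, not Spotify's actual system.

```python
# Toy heuristic spam filter in the spirit of the signals Spotify describes.
# All names and thresholds here are illustrative assumptions.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Track:
    uploader: str
    audio_fingerprint: str   # stand-in for a perceptual hash of the audio
    duration_seconds: float

def flag_spam(tracks: list[Track],
              max_uploads_per_artist: int = 500,
              min_honest_duration: float = 40.0) -> list[Track]:
    """Return tracks that trip at least one simple spam heuristic."""
    uploads_per_artist = Counter(t.uploader for t in tracks)
    fingerprint_counts = Counter(t.audio_fingerprint for t in tracks)

    flagged = []
    for t in tracks:
        mass_upload = uploads_per_artist[t.uploader] > max_uploads_per_artist
        duplicate = fingerprint_counts[t.audio_fingerprint] > 1
        # Just long enough to register as a stream, but suspiciously short.
        barely_a_stream = 30.0 <= t.duration_seconds < min_honest_duration
        if mass_upload or duplicate or barely_a_stream:
            flagged.append(t)
    return flagged
```

In practice, a production system would rely on far richer signals (audio similarity models, upload patterns over time, account metadata), but the sketch shows why the three behaviors Spotify calls out are easy to spot at scale.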

Despite rumors that Spotify itself is creating these AI-generated tracks as a scheme to avoid paying artists royalties for their music, Sam Duboff, head of marketing and policy, told The Verge that there is “no truth to the conspiracy theories,” noting that all music on Spotify is licensed by third parties. 

As legal battles between publishers and major model developers rage on, changes like these suggest Spotify is reading the tea leaves and putting protections in place before it faces heat of its own.