Folk musician Murphy Campbell got hit twice: first by AI-generated songs uploaded to streaming services under her name, then by copyright claims that targeted her own public-domain recordings. Her case shows how easily music platforms, distributors, and automated rights systems can be gamed, even when the tracks in question are old enough to have outlived several formats and a few bad industry ideas.

In January, Campbell found songs on her Spotify profile that she had performed but never uploaded to the service herself. The vocals sounded wrong, and her suspicion was straightforward: someone appears to have taken performances she posted on YouTube, used them to make AI covers, and pushed those tracks to streaming services as if they were hers. She eventually got most of them removed, but one copy still surfaced on Spotify under another artist profile with the same name. So yes, the catalog now includes multiple Murphy Campbells. The real one was not thrilled.

AI covers are now easy enough to pass for the real thing

The scary part is not that this happened, but that it worked. Campbell says Spotify is testing a system that would let artists approve songs before they appear on their profile, which sounds sensible until you remember how long it took to reach that idea. Streaming services have spent years building scale; now they are discovering that scale is also a convenient disguise for fraud.

There is also a broader industry problem here: AI voice tools are getting better, music distribution pipelines are wide open, and the burden of proof still tends to fall on the artist who has already been copied. That is a charming business model if you enjoy paperwork and humiliation.

Public-domain recordings triggered a copyright claim anyway

Then came the second headache. On the day a Rolling Stone piece about Campbell’s AI imitators ran, videos were uploaded through distributor Vydia and used to claim ownership of material in some of her YouTube clips. One notice told Campbell: “You are now sharing revenues with the copyright owners of the music detected in your video, Darling Corey.” The head-scratcher is that the songs involved are public domain, including “In the Pines,” a traditional song dating back to at least the 1870s and later covered by everyone from Lead Belly to Nirvana.

Vydia later dropped the claims and said the uploader had been banned. It also said fewer than 0.02 percent of the more than 6,000,000 claims filed through YouTube’s Content ID system were found invalid. That sounds impressive until you do the math: 0.02 percent of 6,000,000 is still about 1,200 bad claims, which is a lot of mistakes if you are the person on the receiving end. YouTube declined to comment.

The problem is bigger than one distributor

Campbell does not think Vydia alone deserves the blame, and she’s right to push wider than one company. Generative AI, music distribution, and copyright enforcement now overlap in ways that create too many chances for abuse and too few fast remedies. Spotify, YouTube, and distributors all want the same thing until something goes wrong: friction for bad actors, not for everyone else.

The most likely next step is more platform-side verification, more manual review, and more artists being forced to prove they are themselves. That is not elegant, but it may be the only path left if streaming services want to stop fake Murphy Campbells from multiplying faster than the real one can file complaints.
