
Robin Millar comments on the misconception of AI




Yet again, a small group of extremely wealthy, high-profile artists are climbing on each other’s shoulders to declare that their oeuvres must be "protected" from AI. It’s a rallying cry that echoes the panic around Napster in the late 1990s: Ban it! Shut it down! Back then, the argument was that digital music distribution would destroy the industry. Instead, common sense prevailed. Streaming became not just the great democratizer of music but also a way to reduce the environmental cost of plastic, vinyl, and global distribution.

Let me be clear: I am for copyright. I spent nine years on the board of PPL, working to ensure musicians get paid when their work is performed or sampled.

As a creator myself, I have no interest in anything that would harm artists’ rights or income. But that’s not what’s happening here.

The misconception being pushed by some in the industry is that allowing AI models to study and learn from a broad range of music is equivalent to theft or plagiarism. That is simply not the case. If these critics spent any time on AI music platforms, they would see that the vast majority already implement strict safeguards. Most refuse outright to allow copyrighted material to be directly used or replicated.

AI Is Not Plagiarism
The demand to stop AI models from analyzing music, images, films, television, books, medical research, and architecture is neither practical nor necessary.

Reference libraries have existed for centuries—long before copyright laws were formalized. Academics, musicologists, novelists, and painters have always studied past works, making notes, drawing inspiration, and producing something new. Artists have imitated each other for generations—some more successfully than others.

The Beatles and the Rolling Stones, for example, built their early careers trying to copy the blues greats. Stravinsky famously said, “A good composer does not imitate; he steals.” Picasso put it more bluntly: “Bad artists copy. Great artists steal.” Artistic evolution has always depended on reinterpreting the past.

The outrage over AI fails to acknowledge this basic reality. No AI system is "stealing" Kate Bush’s voice or Elton John’s melodies. Their revenue streams will not dry up because someone, somewhere, has used AI to generate a chord progression that resembles something from the past. Copyright law is already well-equipped to handle genuine cases of misrepresentation, plagiarism, and fraud. The same legal principles that prevent counterfeit Gucci handbags and fake iPhones apply to music. If someone tries to pass off an AI-generated track as an Enya song, the law is there to stop them.

The Misguided Public Appeal
The 1,000 signatories calling on the public to back their campaign assume that ordinary listeners share their fears. But why would they? The majority of people do not see AI-generated content as a threat to music, any more than drum machines were in the 1980s or digital sampling was in the 1990s. Both were met with similar outrage from some corners of the industry. Both became tools that empowered new generations of artists.

Yes, there are risks. There must be clear guidelines to ensure artists are compensated fairly and that AI-generated works do not mislead consumers. But calling for blanket bans and fueling hysteria is not the answer. AI is not the death of creativity; it is another tool in the artist's arsenal, just as synthesizers, samplers, and Pro Tools once were.

The real conversation should be about how to integrate AI responsibly into the creative landscape. That requires nuance, not fearmongering.

Robin Millar

 
