The United States Attorney’s Office for the Southern District of New York and the Federal Bureau of Investigation’s (FBI) New York Field Office have indicted a man who allegedly used artificial intelligence (AI) to defraud music streaming platforms out of millions of dollars in royalty payments.
According to the indictment, Michael Smith, 52, faces three counts after reportedly using AI and bot technologies to generate billions of streams of fake songs, netting $10m in royalties.
Authorities arrested Smith on Wednesday last week; he will appear before a Magistrate Judge in North Carolina, the statement read.
The US Attorney’s Office’s Complex Frauds and Cybercrime Unit will lead the prosecution via Assistant US Attorneys Nicholas W Chiuchiolo and Kevin Mead.
If convicted, he faces three charges — wire fraud, wire fraud conspiracy, and money laundering conspiracy — each carrying a maximum sentence of 20 years, for a combined maximum of 60 years in jail.
The Blueprint Behind the Crime
According to the indictment, Smith, from Cornelius, North Carolina, developed a network of accounts across streaming platforms including Spotify, Apple Music, Amazon Music, and YouTube Music, and used bots to stream the songs around 661,440 times per day.
He then automated the process across thousands of songs, evading streaming platform checks and earning annual royalties of over $1,207,000.
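The figures above hang together with some back-of-the-envelope arithmetic. The per-stream payout below is an assumption (platforms generally pay a fraction of a cent per stream), not a number from the indictment:

```python
# Rough arithmetic on the scheme described in the indictment.
# ROYALTY_PER_STREAM is an assumed average payout in USD; actual
# platform rates vary by service and listener region.
STREAMS_PER_DAY = 661_440      # daily bot streams, per the indictment
ROYALTY_PER_STREAM = 0.005     # assumed average payout per stream

annual_streams = STREAMS_PER_DAY * 365
annual_royalties = annual_streams * ROYALTY_PER_STREAM

print(f"Annual streams:   {annual_streams:,}")
print(f"Annual royalties: ${annual_royalties:,.0f}")
```

At roughly half a cent per stream, 661,440 daily streams works out to about $1.2m a year — consistent with the annual royalty figure cited in the indictment.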
Authorities noted an email he had sent to two accomplices in December 2018, which read:
“We need to get a TON of songs fast to make this work around the anti-fraud policies these guys are all using now.”
The co-conspirators, an AI music company CEO and a music promoter, facilitated the creation of hundreds of thousands of songs using AI.
A 2019 email from the promoter cited in the indictment read: “Keep in mind what we’re doing musically here… this is not ‘music,’ it’s ‘instant music’ ;).”
Comments on ‘AI Stream Scheme’
US Attorney Damian Williams accused Smith of “fraudulently” streaming songs he created with AI “billions of times in order to steal royalties.”
He added that Smith had used a “brazen fraud scheme” to steal the royalties from musicians.
“Today, thanks to the work of the FBI and the career prosecutors of this Office, it’s time for Smith to face the music,” Williams concluded.
FBI Acting Assistant Director Christie M Curtis stated that Smith had created “hundreds of thousands of songs” using AI and bots to “generate unlawful royalties to the tune of $10 million.”
She added,
“The defendant’s alleged scheme played upon the integrity of the music industry by a concerted attempt to circumvent the streaming platforms’ policies. The FBI remains dedicated to plucking out those who manipulate advanced technology to receive illicit profits and infringe on the genuine artistic talent of others.”
Thoughts on AI-Based Cybercrime
AI has become yet another tool in the arsenal of cybercriminals to defraud musicians of their hard-earned money. As with metaverse-based social platforms, AI is set to become another heavily regulated and monitored component of the technology stack.
For context, music streaming rose to prominence amid a longstanding battle between those sharing music files and the Recording Industry Association of America (RIAA), ultimately leading to a compromise between record labels, their artists, and the public.
With most listeners streaming music on the world’s biggest platforms — YouTube, Spotify, Amazon Music, Deezer, and others — streaming analytics play a central role in how artists are compensated for their work, sometimes at much lower rates than with physical copies.
According to the Richmond Journal of Law and Technology, the advent of Napster in 1999 kicked off an era of peer-to-peer file sharing that cost the music industry billions in lost revenue.
The RIAA trade organisation launched hundreds of mass “John Doe” lawsuits against people accused of music piracy.
At the time, many people had newly upgraded computers — complete with CD burners, higher-speed internet, and access to torrent websites like The Pirate Bay and others — and the RIAA ultimately directed over 30,000 lawsuits at individuals accused of violating copyright law.
RIAA Chief Mitch Glazier slammed The Pirate Bay’s efforts to escape US law in a press statement, calling the platform “one of the worst of the worst,” Torrent Freak reported in 2012.
Of course, this triggered scoffs from The Pirate Bay, which called the RIAA “delusional” for claiming The Pirate Bay had stolen copyrights.
The tit-for-tat began to wind down due to difficulties in enforcing such lawsuits and the rise of streaming platforms like Spotify and YouTube. However, in the age of AI, watchdogs may come back to the forefront as issues with cyberfraud and cybercrime expand rapidly worldwide.
Countries and their respective governments have understandably upgraded their investigative units to tackle such issues. With the inauguration of the Council of Europe’s AI Convention, I certainly hope the body can transnationally coordinate efforts to enforce legally binding frameworks against such issues.
Hopefully, these frameworks will take an “as above, so below” approach to AI-based cybercrime, ethics, and best practices, applying impartial scrutiny, notwithstanding current headaches over EU tech regulation.
Individuals such as Michael Smith will inevitably trigger regulatory radars, and such cases will multiply over the next few years.
Large language model (LLM) creators training on consumer data, transnational data flows across borders with incongruent policies, and concerns over open-source versus proprietary models are set to inflame tensions across the tech landscape. This will also extend to every industry vertical that incorporates emerging technologies.
I have advocated for a blockchain-based system for disseminating royalties to music artists and have questioned industry members on the feasibility of such business models. This could potentially hedge against fraudulent music streams by leveraging verifiable, real-time handshakes with servers, backed by machine learning (ML) tools, to protect the creative community — similar to the approaches of those working with non-fungible tokens (NFTs).
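To make the idea concrete, here is a minimal sketch of what a verifiable stream ledger might look like: an append-only, hash-chained log of stream events, each signed by the platform so royalty payouts can be audited after the fact. Everything here — the key, the field names, the functions — is a hypothetical illustration, not any platform’s actual system:

```python
import hashlib
import hmac
import json
import time

# Hypothetical stand-in for a platform's real signing key.
PLATFORM_KEY = b"demo-platform-secret"

def record_stream(ledger, track_id, listener_id, ts):
    """Append a signed stream event, chained to the previous event's hash."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    event = {"track": track_id, "listener": listener_id,
             "ts": ts, "prev": prev_hash}
    payload = json.dumps(event, sort_keys=True).encode()
    event["sig"] = hmac.new(PLATFORM_KEY, payload, hashlib.sha256).hexdigest()
    event["hash"] = hashlib.sha256(payload).hexdigest()
    ledger.append(event)
    return event

def verify_chain(ledger):
    """Check that every event is correctly signed and chained in order."""
    prev = "0" * 64
    for event in ledger:
        core = {k: event[k] for k in ("track", "listener", "ts", "prev")}
        payload = json.dumps(core, sort_keys=True).encode()
        expected = hmac.new(PLATFORM_KEY, payload, hashlib.sha256).hexdigest()
        if event["prev"] != prev or not hmac.compare_digest(event["sig"], expected):
            return False
        prev = event["hash"]
    return True

ledger = []
record_stream(ledger, "track-123", "listener-9", time.time())
record_stream(ledger, "track-456", "listener-9", time.time())
print(verify_chain(ledger))       # intact, correctly signed chain
ledger[0]["track"] = "track-999"  # tampering with any event...
print(verify_chain(ledger))       # ...breaks verification
```

A real deployment would use asymmetric signatures and a shared ledger rather than a single HMAC key, but the principle is the same: if each stream event must carry a verifiable signature and chain position, bulk-fabricated streams become far easier to detect and dispute.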
This will be another rabbit hole to explore in the near future.
Like this article? Be sure to like, share, and subscribe for all the latest updates from D×M!