Allegro

It’s time to reexamine our dysfunctional relationship with tech

Volume 125, No. 9, October 2025

Ken Hatfield

This month’s A.I./tech article is by Ken Hatfield, a member of Local 802 since 1977.



It’s not just a lingering Luddite sensibility that’s led an increasing number of inhabitants of our little blue planet to question our relationship with tech. Once you investigate tech’s deceptive behavior, their surveillance business model, their predatory business practices, their negative impact on our environment and our politics, and how they use their enormous wealth to influence legislation, litigation, and public perception, it’s difficult not to conclude that they need far more regulation than they currently receive.

At the very least, we need to reexamine our reliance on products and services offered by an industry that has used musicians as canaries in the coal mine for a quarter century to see what it can get away with.

It’s in this context that so many are alarmed about the recent presidential executive order reinstating the ten-year “safe harbor” moratorium on state regulation of AI, which Local 802 and the AFM, along with governors and attorneys general from 40 states, fought so hard to defeat 99 to 1 in the Senate. Will the courts really allow an executive order to undo by pen what was defeated by vote, overturning more than 235 years of U.S. copyright law, with over 110 revisions passed by various sessions of Congress and signed into law by presidents of every political persuasion this country has ever known? The Constitution gives this power exclusively to Congress, not to a president:

“Congress shall have the Power . . . To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries.”

— United States Constitution, Article I, Section 8

Copyright was among the first laws passed by the first U.S. Congress in 1790, yet the tech industry wants to eliminate all intellectual property laws, like copyright, in the service of some ambiguous national security agenda that supposedly requires unregulated development of AI. If our government and country really need our music for national defense against China (as administration comments on the executive order referenced above indicate), then pay us for it like the government would pay Boeing for designing a new fighter jet for the U.S. Air Force.

If the courts and legislators won’t stand up for their own rights, how can we count on them to defend ours? When money meets politics, all bets are off. Many tech companies measure their market value in the trillions. History has shown they will use that enormous wealth to ensure they achieve their goals and maintain a status quo that profits them above all others.

According to current market capitalization rankings of the largest global corporations, five of the top ten are big tech companies: Alphabet/Google is #2, Apple is #3, NVIDIA is #5, Amazon is #6, and Meta is #8. Yet tech is among the least regulated industries in the world, despite being so ubiquitous and indispensable that these companies are essentially utilities!

Tech has a history of exploiting the work of musicians, and now they want to devour all the recorded music available online for ingestion by their servers to train AI to create music that digital service providers (including streamers) will use to replace us. They falsely claim they cannot even identify what they have “acquired” and ingested (a claim their own use of fingerprinting for “de-duplication” proves is a lie), all the while insisting that identification is not cost-effective for them.

We refuse to tolerate destruction of our industry. We demand:

(1) consent (we have to opt in before tech can use our music for AI training)

(2) credit (we want credit for our ingested work)

(3) compensation (pay us for using our music to train AI, both at the input and output stages).

Most musicians have a vague sense of the inequities perpetrated by the predatory business practices of digital service providers, especially streaming services. Yet we are so addicted to our digital devices that few of us take the time to investigate how tech’s business models exploit our work with little or no remuneration. We just sense something is terribly wrong. Most of us don’t even read tech terms of use agreements before accepting them, especially when we’re in a hurry to get a new app or access a website requiring us to accept cookies. We need to question and change that behavior.

Streaming

According to a 2022 study by the British Parliament, 85 percent of musicians with music on streaming services generate less than $250 annually from streaming, and it’s only gotten worse since then. Here’s a quote from that study:

“… It is widely acknowledged that consumers have benefited from streaming through access to full catalogues of music and innovative services for free or at a fixed monthly price…. However, there are real questions as to whether creators — those who write and perform the songs — have benefitted to the same extent…. Many creators struggle to make a decent living from their profession….”

When you investigate why this is so, you discover multiple contributing factors. In 2023, the U.S. three-judge rate court increased streaming remuneration rates to $0.0026 per commercial subscription stream and $0.0021 per commercial non-subscription stream. Even though many streaming services pay a bit more, their rates are still fractions of a cent. Many DSPs use questionable methods to avoid paying even those paltry sums. For example, streaming services don’t permit direct submission of your song catalogue; you must go through one of their “accepted” digital distributors (like TuneCore, DistroKid, or CD Baby), which either charge an annual fee or take 30 percent off the top of the $0.0026 mandated by the rate court.

Spotify divides subscription fees three ways: one third goes to music, one third to podcasts, and one third to audiobooks. This policy applies even to subscribers who never listen to podcasts or audiobooks. Spotify then takes what’s left and, rather than employ a “user-centric” model (where your subscription fees go to the artists you actually listen to), uses a “pro-rata” formula based solely on market share. In any other market this would be deemed a deceptive business practice, because many subscribers’ fees DO NOT go to the artists those subscribers listen to! In fact, the one third of their subscription fees that does go to music often goes to artists they never listen to and may not even like. To make matters worse, in April 2024 Spotify unilaterally declared they would no longer pay any fees for tracks with fewer than one thousand annual streams, despite laws requiring payment for ALL streams.
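The difference between pro-rata and user-centric payouts can be sketched with toy numbers. Everything here is hypothetical (the fee, the play counts, the artist names); it only illustrates why, under pro-rata accounting, a dedicated fan’s subscription money can end up with artists they never play.

```python
from collections import Counter

# Toy comparison of the two payout models. All figures are invented
# for illustration; real DSP accounting has many more deductions.

FEE = 10.0  # hypothetical monthly subscription fee, in dollars

# Each subscriber's monthly listening history: {artist: play count}
subscribers = [
    {"indie_artist": 100},                  # a dedicated indie fan
    {"megastar": 500},
    {"megastar": 400, "indie_artist": 10},
]

def pro_rata(subs, fee):
    """Pool ALL fees, then split by each artist's share of TOTAL plays."""
    pool = fee * len(subs)
    totals = Counter()
    for history in subs:
        totals.update(history)
    grand_total = sum(totals.values())
    return {artist: pool * n / grand_total for artist, n in totals.items()}

def user_centric(subs, fee):
    """Split each subscriber's OWN fee by their OWN play counts."""
    payouts = Counter()
    for history in subs:
        plays = sum(history.values())
        for artist, n in history.items():
            payouts[artist] += fee * n / plays
    return dict(payouts)

print(pro_rata(subscribers, FEE))      # indie_artist gets ~$3.27
print(user_centric(subscribers, FEE))  # indie_artist gets ~$10.24
```

Under pro-rata, the indie fan’s entire $10 is pooled and distributed by market share, so most of it flows to the megastar; under user-centric, it stays with the artist that fan actually streamed.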

The outcome of all this chicanery is that it is often as impossible for fans to support the artists they dig via streaming as it is for artists (especially indie artists) to recoup the cost of documenting their work via recording. This has altered, or virtually destroyed, our relationships with our fans, especially those who rely solely on streaming to access our music. Despite all the blathering to the contrary on social media, only the top-earning artists profit from streaming. I need 11,538 fully paid streams (i.e., without all the questionable deductions) to generate what I receive from the sale of one vinyl record.

You may well ask how streamers get away with disregarding the law, or when and how our rights became unenforceable. Safe harbors, discussed below, are part of the problem. Following the money and crunching the numbers reveals more. At $0.0026 per stream, divided three ways with an additional 30 percent taken off the top, even recovering 100 percent of what you may have been ripped off for (assuming you win a lawsuit against a DSP) doesn’t generate enough money to pay a lawyer, even one who would take your case on a contingency basis. You could possibly seek redress via a class action lawsuit, but the courts haven’t been friendly to class action copyright claims.

Safe harbors

There are provisions in two laws from the 1990s intended to govern internet commerce that tech exploits for their benefit, to the detriment of most professional musicians: Section 230 of the Communications Decency Act (1996) and Section 512 of the Digital Millennium Copyright Act (1998). These provisions, called “safe harbors,” legally permit qualifying individuals or entities (like corporations) to sidestep or eliminate regulatory liability in certain situations, provided preestablished conditions are met.

Section 512 of the DMCA is directly responsible for the loss of much of our market for recorded music because it limits DSP liability for facilitating copyright infringement (e.g., posting others’ music on YouTube without a license).

Here’s an excerpt from the 2020 United States Copyright Office (USCO) report on the DMCA:

“The Copyright Office concludes that the balance Congress intended when it established the section 512 safe harbor system is askew.… While OSPs [online service providers], supported in many aspects by user advocacy groups, report satisfaction with the current operation of the safe harbors, that view is not shared by the other intended beneficiaries of the section 512 system, including authors, creators, and rightsholders of all sorts and sizes.”

Tech’s safe harbor abuses are not limited to the music world. Here’s an example of how safe harbors are used to shield DSPs outside the music world:

On August 5, 2025, a three-judge U.S. appellate court panel partially reversed the dismissal of a lawsuit filed by the mother of a 10-year-old Pennsylvania girl who died attempting a viral challenge she saw on TikTok. This “game,” called “the blackout challenge,” dared people to choke themselves to the point of passing out.

A district judge initially dismissed the mother’s lawsuit, citing Section 230 of the Communications Decency Act. Like other safe harbors, Section 230 is often used to protect internet companies from liability for egregious and unlawful activities DSPs tolerate, facilitate, and profit from on their sites, but don’t post themselves.

Due to the appellate court’s partial reversal, some form of the lawsuit against TikTok can now proceed. But you can be sure TikTok will invoke Section 230 of the CDA to claim it has no liability for the ten-year-old girl’s death. Stay tuned to see how the matter is decided.

So, how is TikTok still doing business in the U.S. eight months after its ban took effect? How can TikTok continue facilitating reprehensible behavior like the “blackout challenge” on its site despite multiple warnings? Didn’t President Biden sign a law passed by Congress on April 24, 2024, requiring TikTok’s sale to an American business concern? Why does the current president keep issuing executive delays of that law’s implementation? How do executive orders issued by a president who was sworn into office the day after the TikTok ban took effect override a law passed by Congress and signed by his predecessor?

One problem with granting tech its current anointed status, elevated above the laws the rest of us must adhere to, is that they’re essentially unaccountable!

Environmental costs

Big tech’s abuses also impact our environment. Their unquenchable need for electricity to run their data centers, and our reliance on their services, both contribute to global warming.

According to Timothy Snyder, “… by Google’s own count we put about 2,000 tons of carbon dioxide into the atmosphere every day with our Web searches alone.” It’s estimated that artificial intelligence uses will be tenfold that amount!

Tech’s AI agenda is driving soaring demand for electricity to run data centers. These large buildings packed with servers, concentrated in states like Virginia and Ohio, consumed more than 4 percent of the nation’s electricity in 2023, and U.S. government analysts estimate that figure could reach 12 percent within three years. That means your electric bills will climb precipitously.

Three Mile Island, the nuclear power plant near Middletown, PA, and the scene of the worst commercial nuclear accident in U.S. history, is slated to reopen in 2028 to power Microsoft’s data centers, which in turn will power the tech giant’s cloud computing and AI programs.

Let’s face it: implementation of technological innovations, often driven by the desire to cut labor costs, generally means replacing human workers with tech of some kind. Many young workers entering the workforce have already noticed diminished entry-level opportunities attributable to AI. While it’s unclear what AI’s impact on higher-skilled workers will be as the technology evolves, we do know human displacement matters little to the crowd who proudly call themselves “disruptors” and move fast and break things.

So, what’s to be done? While advocating for and passing legislation protecting our rights is helpful, the actual wording of such legislation (often containing loopholes like safe harbors) has historically provided tech with easy and legal ways to exploit our work with little or no remuneration. Now we face the most disruptive technology developed so far in a legal environment that forbids its regulation for the next ten years, if the current president’s executive order on AI withstands the inevitable legal challenges.

We are limited in the actions we can take. In the current climate, traditional union actions like withholding our labor are either likely to prove ineffective (especially if AI can generate “acceptable” facsimiles of our recorded work to replace us) or are illegal due to provisions in laws like the secondary boycott prohibition in the Taft-Hartley Act, which forbids economic actions against anyone who is not your primary employer and prohibits other unions from joining in solidarity by also withholding their labor.

Of course, we can individually remove our recordings from streamers that treat us unfairly, as many northwest coast musicians have recently done. But what impact will that have on our audience? Will a generation raised on one-click access to all the recorded music in history still follow us if it is harder to find our music online?

At a minimum we can and should be suspicious of all the new toys tech provides us, and not use them without at least investigating why we get them for free. Remember Jaron Lanier’s famous quote: “If you’re not paying for the product, then you are the product”? We need to read every user agreement before accepting the terms of use.

Tech’s history of more than a quarter century of predatory business practices leads me to conclude that they are not our friends. Tech is not a benevolent or benign force for good in our world. They are an industry of separate companies competing for things to monetize… things like our attention and our data. They use content that doesn’t belong to them (which they provide access to for less than it costs to produce) to get our attention and then mine data pertaining to our online activities and proclivities.

Most of us kind of know all this. So why do we continue to support and rely so heavily on tech provided by our exploiters? That rhetorical question is key to changing our behavior in relation to tech.

There are other ways for tech to monetize their innovations without exploiting creators of the content that drives the entire enterprise. But there is little incentive for tech companies to be good global citizens. We may have to give them that incentive by boycotting their products and services long enough to get their attention. I learned a long time ago that when someone demonstrates all they care about is money, you need to cost them money to get their attention.

All of this leads me to conclude that only an organized global consumer action to limit or even eliminate our reliance on tech owned and operated by our exploiters will get their attention and precipitate change on their part.

If we musicians (who are both creators and consumers of music) cannot sever the umbilical cord with our tech devices and services, then there’s little chance the wider public will change how it interacts with tech, and the status quo will remain. However, if we demonstrate how the inequities that tech has exploited to enrich themselves have been damaging to both creators and consumers, we may be able to build a coalition of those demanding change and implement actions that will move us toward a more equitable, safer internet marketplace for all.

If you are interested in the Local 802 A.I. committee, please send an e-mail to Local 802 In-House Counsel Harvey Mars at hmars@local802afm.org and A.I. Committee Chair Jerome Harris at jeromeharr@aol.com. Ken Hatfield has been a member of Local 802 since 1977. Send feedback on Local 802’s A.I. series to Allegro@Local802afm.org.
