The Strike That Changed Everything (And Some Things It Didn't)
The 2023 WGA strike lasted 148 days. Writers picketed over residuals, but AI protections were the issue studios most feared conceding. When the dust settled, the WGA secured groundbreaking contract language: AI could not be used to write or rewrite literary material, and any AI-generated content given to a writer had to be disclosed as such.
It felt like a win. And in some ways, it was.
But here we are in 2026, and the reality is messier than the headlines suggested. Studios found workarounds. New tools emerged. And the writers who thought the contract would protect them are discovering that legal language and technological capability are very different things.
What the 2023 WGA Contract Actually Said
Before we get to 2026, it's worth being precise about what writers actually won in 2023. The contract established that:
- AI-generated text does not constitute "literary material" under the MBA (Minimum Basic Agreement)
- Studios cannot use AI output as a first draft and then pay writers to polish it at a lower rate
- Writers can choose whether to use AI tools themselves, but studios can't require it
- Any AI material given to a writer must be disclosed as such
What the contract didn't do was ban AI from production pipelines entirely. Development, marketing, research, scheduling, and pre-production work remained largely unprotected territory. Studios moved fast.
How Studios Used the Gaps: 2024 to 2026
Within six months of the contract signing, several major studios had quietly shifted their development processes. Executives were using tools like Jasper AI and Copy.ai to generate pitch summaries, beat sheets, and loglines. They'd then bring those to writers as "development notes" rather than scripts, technically staying within contract bounds.
Is that ethical? Most writers we've spoken to say no. Is it legal under the current MBA? Studio lawyers say yes.
The situation with AI video generation got more complicated after tools like Sora matured. If you want to understand just how powerful these video generation tools have become, our Sora 2 review shows exactly why studios started taking synthetic content seriously as a production option.
Meanwhile, tools like Synthesia, HeyGen, and ElevenLabs made it increasingly cheap to produce promotional content, trailers, and even supplementary scenes without traditional crew. Descript and Pictory let small studios edit and repurpose footage at a fraction of the previous cost.
None of this technically violated the WGA contract. All of it absolutely changed how work got distributed.
The 2026 Contract Negotiations: What's Different This Time
With the current MBA expiring, the WGA's 2026 negotiations opened with AI even more central than in 2023. Writers arrived at the table with three years of data showing how studios had exploited contract gaps. Studios arrived with three years of cost savings showing exactly why they'd resist closing them.
Among the WGA's key demands in 2026:
- AI audit rights. Writers want the ability to verify whether studios used AI tools in development before assigning them material.
- Compensation for AI training data. If a studio's AI model was trained on a writer's past work, the WGA argues that writer deserves residuals.
- Broader definition of "literary material." Closing the beat-sheet and logline loophole explicitly.
- AI transparency across all departments. Not just writing, but casting research, scheduling, and post-production decisions that affect writers downstream.
Studios are pushing back hard on the training data residuals. The legal argument is genuinely complex, and both sides have strong positions. No resolution as of this writing.
SAG-AFTRA and the Synthetic Actor Problem
Writers aren't alone. SAG-AFTRA reached its own AI agreement in 2023, but the digital likeness provisions have proven nearly impossible to enforce at scale.
Background actors who signed consent forms for "limited digital use" have found their likenesses appearing in far more material than expected. The tools involved, including Leonardo AI for image generation and HeyGen for synthetic video, have made it trivially easy to generate crowd scenes, minor characters, and background elements without additional payments.
The deepfake problem is serious enough that detection tools have become a genuine industry. We covered the best options in our review of AI deepfake detection tools for 2026. Studios are now required under some state laws to use these tools to verify content before broadcast, but enforcement is spotty.
What Actually Changed for Working Writers
We talked to working writers across TV, film, and streaming about their day-to-day experience in 2026. The picture varies a lot by career stage.
Staff writers and showrunner assistants report the most disruption. Traditional writers' room structures have shrunk: where a drama might have had 8 to 10 staff writers in 2021, rooms of 4 to 6 are now common. The work junior writers used to do (breaking stories, writing outlines, drafting scenes for punch-up) is increasingly where studios test AI tools first.
Mid-level writers with one or two credits have seen their pilot deals dry up. Studios are greenlighting fewer projects overall, partly due to the streaming correction, but also partly because development costs have dropped when AI handles early-stage work.
Established showrunners are, somewhat ironically, doing fine. Their value is in taste, relationships, and the ability to run a production, none of which AI can replicate yet. Some have privately admitted to using tools like Notion AI and Perplexity AI for research and organizational work, while carefully keeping the actual writing human.
The Tools Writers Are Actually Using
Here's something the conversation often misses: many writers are using AI tools themselves, strategically, to stay competitive.
Perplexity AI has become a genuine research staple. Writers use it to quickly verify historical facts, medical details, and technical accuracy during drafts. It's faster than Google and more reliable than asking the writers' room intern.
Grammarly's advanced features help with line-level editing. Some writers treat it as a first-pass copy editor before sending drafts to showrunners.
Otter.ai is widely used in writers' rooms for transcribing pitch sessions and story meetings, saving hours of manual note-taking per week.
The WGA contract allows writers to choose whether to use these tools. Most working writers have landed somewhere pragmatic: use AI for research, organization, and editing; write the actual material yourself. That seems like a reasonable line for now.
International Productions and the Regulatory Patchwork
One major 2026 development is that AI content rules now vary dramatically by country. EU AI Act provisions have created specific disclosure requirements for synthetic content in broadcast media. UK broadcasters have their own emerging standards. Several US states, California most aggressively, have passed laws governing AI use in entertainment contracts.
Studios producing internationally have started using tools to manage this complexity. The compliance burden is real, and it's created its own niche of legal and technical work.
This regulatory complexity isn't unique to entertainment. We've seen similar fragmentation in financial AI tools, as our overview of best AI tools for tax compliance in 2026 covers in detail.
The Argument Studios Are Making (And Why It Has Some Merit)
It's easy to frame this as studios being villainous, and some of their behavior has been in genuinely bad faith. But the economic argument for AI tools isn't purely cynical.
Streaming economics are brutal right now. Content costs ballooned during the 2019 to 2022 peak, and the correction has been severe. A streamer that can produce a competent limited series for 40% less, even if that means a smaller writers' room and more AI-assisted development, may be the one that's still operating in five years.
The studios' position is essentially: we need these tools to survive, and survival means there are jobs at all. That's not entirely wrong, even if the way they've implemented it has often been exploitative.
What "AI-Written" Actually Means Now
The public debate often treats AI writing as binary. Either a human wrote it or a machine did. The reality in 2026 is a spectrum.
Consider a typical AI-assisted script development process:
- An executive uses a tool to summarize 200 reader reports on submitted manuscripts
- A story editor uses AI to generate 10 possible episode structures based on that summary
- A writer is assigned to develop one of those structures into a full script
- The finished script is 100% written by the human writer
Is that script AI-written? Technically no. Did AI shape what that writer was hired to write? Absolutely yes. The contract language hasn't caught up to this reality, which is exactly why 2026 negotiations are so contentious.
Predictions: Where This Goes Next
We're not going to pretend we know exactly how the 2026 negotiations resolve. But based on what we've seen, a few things seem likely.
First, training data residuals will become a significant revenue source for some writers, if the WGA can establish the legal foundation. The precedent from music licensing suggests this is possible but will take years of litigation.
Second, the definition of "literary material" will expand in the next contract. Studios have gotten too aggressive with the loopholes, and there's enough documented evidence to force the issue.
Third, the gap between well-established and emerging writers will widen further before any protections kick in. Junior writers entering the industry in 2026 face a genuinely difficult market. The 2023 strike helped working writers. It didn't do much for people who hadn't broken in yet.
The broader pattern here mirrors what's happening across creative industries. AI tools have created extraordinary capabilities for content generation, as anyone who's looked seriously at Midjourney V7 can attest. The question was never whether these tools would get used. It was always about who benefits and who bears the cost.
The Bottom Line for 2026
The Hollywood writers strike put AI protections into a major union contract for the first time. That mattered. But contracts are only as strong as their enforcement, and technological capability moves faster than collective bargaining.
Three years on, working writers have more legal protections than they did in 2022. They also have smaller writers' rooms, fewer development deals, and colleagues who are using AI tools because they feel they have no choice. The protections won in 2023 preserved the principle that humans should write scripts. They didn't preserve the jobs that surround and support that writing.
The 2026 negotiations are the real test. If the WGA can close the development loopholes and establish training data compensation, it'll be a genuine landmark. If studios hold the line, the 2023 agreement may ultimately be remembered as protecting the center while the edges got carved away.
We're watching closely. So should anyone who cares about what the next decade of storytelling looks like.