As part of the WGA’s negotiating committee, I’ve done a lot of work (and press) on behalf of the Guild’s proposal regarding AI, which seeks to regulate the use of AI on MBA-covered projects.
Specifically, the proposal would ensure that AI-generated material is not considered “literary material” or “source material” — two terms with specific and important meaning in our contract. Getting this language into our contract protects writers from having AI write or rewrite us, and ensures that the things we write aren’t fed into the algorithm to generate “new” versions of our work.1
Amid the list of other urgent concerns about compensation and working conditions, the Guild’s AI proposal might seem like an outlier. Indeed, when it was first proposed as an addition to our pattern of demands back in November, some members of the committee wondered if it was too early. ChatGPT had only just been announced, and very few people had experimented with it.
But I’d had an early preview of similar technologies, and a sense that we needed to be thinking about AI issues now.
In addition to my day job writing movies, I run a tiny software company. We make Weekend Read and Highland, which is maybe the third most popular screenwriting app. One cool thing about working in software is that you meet other people in tech. In August 2021, a friend introduced me to Amit Gupta, who was starting a company called Sudowrite. He described it as “Photoshop for text.” That’s a cool pitch.
A tool or a weapon?
I met with Amit and saw a demo. The app was just a web page with a text box. You could click buttons and have it rewrite the text you provided, or expand upon it. It was primitive, but it felt like magic. This was more than a year before ChatGPT, but it provoked a similar level of “wait, is this actually possible?” intrigue. And right after that, a corollary feeling of “wait, this could be really bad if used for evil.”
How do you make sure this is a tool used by writers, like spellcheck and Wikipedia, and not a tool used to replace writers?
I wasn’t ensorcelled; I recognized that if Sudowrite could do this, other companies could as well, including companies with no qualms about replacing writers. (Indeed, most of the early competitors focused on writing ad copy and SEO-optimized websites.)
My company made a small investment in Amit’s company and started talking about ways actual writers might use this technology. They were focused on prose fiction, which made sense. Screenwriting is weird; as someone who sells an app that formats it, I can tell you it’s a small and specialized market. The world is full of folks who write fiction, especially fan fiction. They would be a better fit for the tool as it stood.
I’m listed as an advisor to Sudowrite, but that overstates my involvement. I haven’t hyped it up or used it beyond those initial few weeks. As far as I know, none of my books or scripts have been used in any of its training. Neither I nor my company ever made a cent from our investment in Sudowrite, and never incorporated any of their stuff into the apps we sell.
My interactions with Sudowrite gave me an early preview of what was compelling and troubling about the intersection of AI and writing, which helped put it on the Guild’s radar.
GPT-2-point-something
I wasn’t the only person who was intrigued by these AI writing tools. Stephen Follows, a data scientist I’d worked with on figuring out the truth behind the “one page per minute” rule and other screenwriting esoterica, reached out in June 2022 to say that he and a friend were working on a project to write a screenplay using AI.
He invited me to come on their podcast to discuss it. They shared their screen to show how they were using an early version of OpenAI’s GPT tool to generate screenplay scenes. Bad scenes, it must be noted. Like, not even first year of film school scenes. But it was the first time I saw people using this kind of tool to do the kind of work we are hired to do in film and TV. And when they said they were working with a guild-signatory producer, that set off alarm bells.
How do you determine credit? Who is the writer of record? Would a script created this way even be copyrightable?
I raised these concerns on the podcast, but then reached out to the Guild to put this on their radar. In early July, I had my first phone calls and conversations with Guild staff about how situations like this could be handled under existing rules and definitions.
This was all before ChatGPT, which debuted in November 2022.
The WGA West assembled a board committee to study the issue, ultimately recommending the proposal that was included in our pattern of demands. It was a quirk of the calendar that our three-year contract was coming up before the use of these AI systems had become widespread. As the GPT-3.5 model behind ChatGPT gave way to the much more capable GPT-4, it became clear that waiting another three years to address the issue was not an option.
Studio silence
As noted in the two-page summary of where our issues stood as the WGA went out on strike, the studios rejected our proposal and refused to make a counteroffer. In the room, they said the technology was new and that they were not inclined to limit their ability to use it in the future.
That’s ominous and unacceptable. I believe we have to address this issue in our contract now. We can’t spend three years stalling with committees.
Along with our other urgent needs for compensation and working conditions, I believe we’ll ultimately win these necessary gains in regulating AI. As evidenced by the 2007 strike over the internet, our members understand how important it is to grapple with new technology before it becomes entrenched.
But even when we win this battle, the issues surrounding material generated by AI won’t be over.
As writers and a society, we have to grapple with the implications of this technology. How do we ensure the material used to train these models is provided with consent, credit and compensation? How do we deal with bias? How do we treat the material output by these systems in terms of copyright? There are myriad concerns that go well beyond our contract and will require developing both ethical and legal frameworks. I discussed some of those at this week’s listening session with the U.S. Copyright Office.
My early look at Sudowrite and related systems gave me a brief preview of a few of these issues, but none of the answers. We’re going to be grappling with the implications of these technologies for years.
1. We sometimes call this the “Nora Ephron Problem,” the idea that you could feed Nora Ephron’s screenplays into a system and generate a “new” script in her voice.