Valerie Veatch’s work will make you feel all kinds of things: dismayed, icky, but also empowered with the knowledge that we are not, in fact, actually doomed to a future of AI overlords. Across her career, the Seattle-born filmmaker has made numerous documentaries about the impact of technology on our personal and collective lives. Me at the Zoo and Love Child are tales of a brave new world coming of age in an online universe. But those films, made in 2012 and 2014 respectively, retain an aura of dreaminess around their chilling stories.
In Ghost in the Machine, the dream is crushed. The film takes us back to the 19th-century roots of statistical mathematics, from which modern algorithmic computer science has sprung, to reveal how the logic of compute is tethered to a group of eugenicists intent on producing a scientific rationale for racial genocide.
Veatch has the receipts. For nine months, she interviewed 40 scientists, historians, and philosophers to get to the bottom of this sordid story. I sat down with her to chat about the making of this exposé.
What made you decide to make this film?
There’s an urgent, lo-fi grittiness to this film because, in a lot of ways, I didn’t want to make it, but the story had to be told. In my previous films, I approached technology’s impact on humans, but I was still kind of optimistic about technology. Ghost in the Machine was a culmination of a lot of the things that I was tugging on [in] my work.
What were those things?
A friend signed me up for [OpenAI’s] Sora early artists access program. It was all so secretive, how they were secretly training this thing. I was trying to be open-hearted, and part of me was like, Wow, I can type “mountain” and there’s a mountain. As a filmmaker, that’s like gold dust. But then, the clouds gather instantly: hyper-sexualized depictions of women as a default, and socioeconomic, racist stereotypes. It was jarring. I tried talking to OpenAI about it. Long story short, I walked away like, Oh my god. Everybody is drinking the Kool-Aid. I started reading white papers and researching, trying to get context for why this gave me such ick and heartache. At the time, I didn’t know how to articulate what I was so angry about.
Were the connections to eugenics obvious from the start?
In the beginning, I just knew I had something to say about the misogynistic culture around big tech. Dan McQuillan, author of Resisting AI, is among the experts who contributed to the documentary. In one of our first interviews, he offhandedly mentioned that, of course, all of these things go back to eugenics. I was like, what? I started researching and was blown away: there are these two tracks that stem from the late Victorian moment. One is the idea that intelligence is an externalizable feature around which you can categorize, rate, and create hierarchy. And then the other is the very structure of statistics, the algorithms of machine learning. People argue that “math is math,” but when you have causally agnostic mathematics, which is what statistics is, there’s no causal relation between two data points. There’s no y-axis.
The eugenicists of New York State kept amazing records of all their letters: so many fundraising letters. They were obsessed with proving that poverty and degradation are genetic, and they were also obsessed with raising money. They were convinced that poverty was race-based, and they were using statistics to prove it. That is now the math we have for machine learning. There’s something about the eugenics story that feels really important to highlight, because the rhetoric we’re encountering with these so-called artificial intelligences is really just a consolidation of wealth and power and compute.
Is there an identity forming around AI resistance?
More and more. Since releasing this film, I’m seeing a massive crystallization of AI resistance. This movie creates a space where we can openly reject the crazy idea that machines can think. Because they can’t. As soon as we stop anthropomorphizing compute, we’ll have a clearer picture of how power operates, and what’s actually happening with our social institutions.
I feel like I’m seeing more awareness in the comment sections, people pushing back against the narrative that we’re all doomed.
McQuillan’s concept of decomputing looks at ways individuals and companies can utilize local compute: you don’t need to process everything through an LLM. We need a different kind of understanding of what compute is.
But AI resistance is also actively being co-opted by big tech. Some AI doomers might look like they have things in common with AI resistance, but their whole point of view is that AI is going to become a super-intelligent godlike entity. It distracts from where the actual power sits and accommodates the actual harm being done, like horrifically racist predictive policing algorithms in justice systems. This isn’t existential; it is actual harm. These doomer groups get all of the funding and media attention, then issue vague statements about phoning your senator and regulating AI. It’s a waste of political will and energy. I think it will become increasingly clear where and how real AI resistance will be centralized and what it will look like. I think that by next year, we’ll look back at our 2024 and 2025 obsession with AI, and we’ll giggle. The limitations of these systems are already becoming clear. And in 10 years, certainly, it will be hilarious.
It’s like a singularity becoming dumber instead of becoming a god.
The statistical phenomenon is called “regression to the mean.” It’s partially the fault of its own one-dimensional math, always aggressively finding the middle, then reframing itself on that. That’s model collapse.
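[Ed. note: the dynamic Veatch describes is easy to see in a toy simulation. This sketch, mine rather than anything from the film, fits a simple Gaussian model to a small sample, generates new data from the fit, refits on that synthetic data, and repeats. Each generation re-centers on the last one’s average, and the spread of the data steadily collapses toward the mean.]

```python
# Toy illustration of "model collapse" via repeated self-training:
# each generation's model is fit only on the previous model's output.
import random
import statistics

random.seed(42)

mu, sigma = 0.0, 10.0           # the "real" distribution we start from
n_samples, n_generations = 10, 500

stdevs = []
for _ in range(n_generations):
    # Draw a small sample from the current model.
    sample = [random.gauss(mu, sigma) for _ in range(n_samples)]
    # "Train" the next model on that synthetic sample alone.
    mu = statistics.fmean(sample)
    sigma = statistics.stdev(sample)
    stdevs.append(sigma)

# The spread decays across generations: each refit regresses toward
# its own mean, and the diversity of the original data is lost.
print(f"spread at generation 1:   {stdevs[0]:.3f}")
print(f"spread at generation 500: {stdevs[-1]:.6f}")
```

With small samples, the estimated spread drifts downward generation after generation, which is the regression-to-the-mean feedback loop in miniature.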
You probably get this all the time, but what can we do?
I’m personally trying to reach the communities and engineers at the software companies I use and tell them that, as a user, I want to be able to say that at no time in the creation of this piece of media was the token engaged. The default now is to be opted in. It should be the other way around.
I’m also trying to articulate what it means to have a workflow free of AI. That will look different for different industries and media, but I think that overall, not engaging a hyperscale data center in the production of your work and relying on local compute, and insisting on software that allows you to do that, is something we can all do.
I realize when I talk, I sound like, take responsibility for yourself, don’t use AI, recycle your plastic bottles. And that is true. But we also need to challenge the power structures, the companies that make the plastic bottles. We can all work towards that. Creating space for conversations around AI refusal is what I can do. And I think it’s working. People tell me stories about successfully moving toward being AI-free companies. It’s important to give people confidence that not using this technology does not set you behind professionally. I think five years from now, not using AI will be what puts you at the top of your field.
That’s a relief.
But it’s scary how much the idea of AI superintelligence has become embedded. When the film was screening at Sundance, I made a bunch of “NOT AI” buttons to hand out. When I picked them up from the manufacturer, the person working there was like, “I’m worried about what AI will think of this button.” This is the degree to which the narrative has become embedded.
Do you see anything positive about AI… anything?
Not at all. It’s a whole-hog ideology to be rejected, mocked, and seen for what it is: a fabrication. Unfortunately, the idea itself is the most harmful thing about it, harmful to the way we understand and value each other in this world. It’s the thought crime of the century. Which is why it’s important to state, outright, that there’s nothing good that comes with this technology. Because it’s not real. Engaging in the idea at all, or in the language around it, just perpetuates the narrative.
What do you want people to take away from the film?
Confidence that by refusing to use AI, you will get ahead professionally, instead of the big tech narrative that you lose out, and the narrative that all of this is inevitable. This consolidation of power and erosion of our political space is not inevitable. We can exercise our ability to have community and democracy. We can center care and mutual aid. We can ground ourselves in our lived experience. And that is something that they can’t take away from us. I hope people walk out of the film knowing, Of course, machines can’t think. AI is an invented concept. It’s not real. This is not inevitable.
Ghost in the Machine screens at SIFF May 10 at PACCAR IMAX Theater at Pacific Science Center and May 11 at SIFF Cinema Uptown.
