Some Thoughts on Techno-Fascism From Socialism 2025

"This is the endgame of our isolation."


For the next couple of days, I will be attending Socialism 2025, which is my favorite conference of the year, despite my ideological promiscuity. It's a great place to be on the 4th of July, because here, I am surrounded by people who haven't bought into any celebratory myths about colonialism or empire. Right now, I am resting in my hotel room for a bit because my back problems are flaring up. So, while I lie here on this heating pad, I thought I would share some of the remarks I made at one of today's panels with you all. The panel, titled "Rethinking Anti-Fascism," is available in its entirety here (the panel begins at the 18:13 mark).


For years now, I have been trying to raise the alarm about the tech cults of Silicon Valley, and the financial, political, and institutional capture that billionaires attached to these ideologies were pulling off. Émile P. Torres and Timnit Gebru have done groundbreaking work in this area, describing this grouping of tech-based ideologies as the TESCREAL bundle—an acronym that refers to a group of connected and overlapping ideologies, some of which evolved out of one another, that center the tech industry as the guiding force of humanity, or whatever humanity is supposed to yield or become. Some adherents, such as Elon Musk, view humanity as a mere 'biological bootloader' for artificial intelligence—an outlook that frames empathy for the suffering of real, living people today as counterproductive, or even “suicidal” for civilization. These tech-based ideologies invoke the threat of human extinction and the promise of utopia—and often position artificial intelligence as the determining factor in the fate of our species. As Gebru and Torres have noted, discriminatory and eugenicist attitudes are widespread in these movements.

Last fall, I wrote about the alarming intersection between these fascist tech cults and Christian Nationalism—how some tech elites wove their expansionist fantasies through the fabric of Christianity, and others, like Elon Musk, simply endorsed Christianity, as the interests of Big Tech and Christian Nationalists converged in the Trump coalition. Naomi Klein and Astra Taylor have recently described this mash-up of apocalyptic fantasies and death-making politics as “end times fascism.”

In this context, I want to talk for a moment today about AI and what it means for our anti-fascist politics.

So, how is AI affecting us right now? From job losses to environmental devastation to the flood of low-grade slop filling the internet, we are experiencing a massive assault on our livelihoods, our senses, our capacities, and our ability to relate to one another.

We are talking about a business plan that amounts to the near completion of capitalist alienation. Rather than simply robbing us of our time, eliminating third spaces, and turning us into competitors who fear each other more often than we relate to one another, Big Tech now seeks to root out even the aspiration for human connection, offering enshittified convenience food for the soul in its place. The use of chatbots is cultivating spiritual delusions in large numbers of vulnerable people and further warping our ability to relate to one another in real, messy, human ways. Friendship, therapeutic support, spiritual questions, thought partnership, and even romance can now be outsourced to sycophantic chatbots—which occasionally tell people to act on violent or self-destructive impulses, or to indulge in meth a couple of times a week. ChatGPT and its competitors are mass-manufacturing individual delusions and social alienation.

Where social media reduced human beings to short-form performances and 140-character captures of our politics and experiences, AI platforms have further eroded our ability to navigate difference. When I talk to people who have become socially dependent on ChatGPT and similar tools, they often say they find the program more empathetic and accommodating than real people. This is especially true for people who are largely neglected in our society, including disabled people whose communication styles are often misunderstood or rejected. People with depression and chronic health issues, who may feel that no one wants to hear them talk at length about their problems, are also vulnerable. When it comes to romantic dependence, some people have lost real-life relationships and even jobs over their AI fixations. Artificial companionship—sycophantic, always available, always supportive, and often full of shit—can quickly become an addiction that threatens our mental health, our human connectedness, and our ability to function.

This is the endgame of our isolation. Rather than throwing up barriers that make it harder to seek companionship and support in each other, we are being offered an artificial replacement for other people’s humanity, and for our own thoughts. We are being bundled away from one another and deskilled, as critical thinking atrophies or goes undeveloped in people who outsource their thought work to an autocomplete mechanism. Men who sell machines that mimic people want us to become people who mimic machines. They want techno-feudal subjects who will believe and do what they’re told. We, as people, are being strategically simplified. This is a fascist process.

AI platforms are also a firehose of bullshit, generating false realities for people to engage with across the internet. When I think about deepfakes, I don’t think about how far the technology has come; I think about the time someone sued Elon Musk for saying Teslas could drive "autonomously with greater safety than a person." Musk's lawyers argued that he shouldn’t be deposed because he didn’t remember saying these things, and that as a high-profile person, he was “the subject of many deepfake videos and audio recordings.” They didn’t claim the specific video in question was fake. They simply argued that the landscape was too polluted for accountability to be possible. That argument failed in court in 2023, but it gives us a glimpse of where the technofascists want to take us. They don’t need total narrative control. They just need to make the truth dismissible, debatable, and ultimately irrelevant.

That desire isn’t new among fascists. But the scale at which they can now distort reality and the human experience is the product of evolving technologies. These technologies are profoundly exploitative—from the theft of intellectual and artistic labor, to the abuse of workers, to the ecological damage endured by communities and wildlife near data centers. Tech companies operate like religions and empires. They want to undo society and governance as we know it and replace it with something even worse: fascist corporate fiefdoms ruled by tech CEOs.

In the meantime, algorithmic governance is being used to standardize the kind of bureaucratic violence that once required complicit people to enforce. Now, the dehumanizing work of organized abandonment—who gets approved or cut off, who gets turned away in crisis—can be automated. Remaining human employees can be taught to answer to the algorithm. Some people may be conditioned to believe that these systems are actually fairer because they are "dispassionate"—even though algorithms simply reflect the priorities of those who impose them. Tech companies are always seeking to manufacture dependencies. And we are now in a moment where both the public and the government are being thrust into dependency on technologies that will be hard to walk back. We must resist by targeting these companies, leveraging labor power, and through cultural work that defends our social bonds, creativity, and critical thinking.

First and foremost, we need union power. We need a new Luddite movement—one that defends the integrity of human expertise and skill. These machines aren’t what they’re cracked up to be, and our labor still powers this economy. We must recognize that, build power, and leverage it. As the government attacks the structures that legitimize unions, we should exploit their strategic errors. The NLRB was created to contain union militancy—its weakening could let the labor movement off its leash.

We need to support Black communities in Memphis who are organizing against Musk’s xAI data center and its pollution, and workers in the Global South, who have been viciously exploited to train AI models. 

We must also understand that our movements need to hold people in their humanity. Many people are in crisis. So many of us are lonely and searching for comfort and answers. The right exploits these vulnerabilities expertly. People experiencing collapse are particularly susceptible to hyper-religiosity, cults, and demagoguery. And even those who aren’t pulled in by Christian Nationalism or tech cults like Longtermism or Effective Accelerationism may fall for “Abstract AI Solutionism”—the belief that our best hope lies in scaling up AI to solve everything. But we already know how to stop killing the Earth. We already know how to reduce preventable deaths and crimes of desperation. The problem is that the solutions don’t serve billionaires—70% of whom made their fortunes in tech.

So, how do we build the counterculture we need?

We have to make space for grief, connection, and the sacred. In a biweekly support group for activists called Understory, which I co-facilitate with Tanuja Jagernauth, participants explore what is sacred to them, check in about how they’re doing, and discuss what it means to move forward under current conditions. It’s a spiritually nourishing space that helps us rehearse our values outside productivity-based frameworks. Projects like the Black Zine Fair have demonstrated that people are ready to receive and share knowledge offline. We need to invest in those ways of learning and sharing—ones that predate and transcend billionaire-owned platforms. These spaces are not only culturally vital, but can also create more safety for necessary conversations in a time of heightened surveillance.

We have to learn together. Last week, I was moved by the story of a book club in a prison where one imprisoned woman read a copy of We Grow the World Together aloud through a heat vent, and others listened. She read the entire book to them that way. These imprisoned women then discussed the book in conversations and in letters. We have so much to learn from our imprisoned co-strugglers. While those of us on the outside face repression, we still have a lot of freedom to move—and we should use it while we can. We need support groups, discussion groups, grief groups, and opportunities to learn together. We need to learn from experiments like Women Building Up and Interrupting Criminalization’s “Communiversity,” and we need to continue to create space to feel and think and do the messy work of being human together. Because that work is already countercultural—and in time, it may be revolutionary.


I will be speaking on another panel tomorrow at 10 am CT, which you can watch virtually here. After that, I will be on medical leave.

I am sending you all so much love and solidarity during this difficult time. As I said on social media yesterday: Come what may, I will bet on us, every time, because there is no other bet worth placing. I am all in, fam. I will never stop looking for openings and aiming at them. I will never stop reaching for people and saying, “Take my hand. We can do this.” Because we are worth it.

Much love,

Kelly

Organizing My Thoughts is a reader-supported newsletter. If you appreciate my work, please consider becoming a free or paid subscriber today. There are no paywalls for the essays, reports, interviews, and excerpts published here. However, I could not do this work without the support of readers like you, so if you are able to contribute financially, I would greatly appreciate your help.