You won’t find his name etched into the logos of OpenAI, Google DeepMind, or Anthropic. Curtis Yarvin doesn’t pitch at Demo Day or court mainstream press. But if you want to understand the ideological current tugging at the roots of modern tech—especially AI policy—you have to find the thread that leads back to him.
Because behind the language of “efficiency,” “meritocracy,” and “optimization” lies something colder. Something older. Something that reeks of monarchy.
—
The Philosopher King of the Right-Click Elite
Curtis Yarvin, writing under the pseudonym Mencius Moldbug, is the father of neoreaction. He champions an unapologetically anti-democratic ideology that sees liberal democracy as a failed system—bloated, decadent, and doomed. His vision? Replace elected governance with corporate-style CEO rule. Efficient. Unaccountable. Final.
And Silicon Valley listened.
Not publicly, not en masse. But in the same way power listens to power. In private group chats. At invite-only dinners. In Substack comment threads and at Peter Thiel-funded retreats, where phrases like “the cathedral” and “governance tech” pass for common speech.
Yarvin didn’t crash the gates of tech. He whispered through them. And what he offered was irresistible to men drunk on code and capital: a justification for ruling without interference.
—
The Tyranny of “Optimization”
In theory, AI is neutral. But the people training it aren’t. They are shaping models with assumptions—about governance, about value, about whose freedom matters.
The neoreactionary thread weaves through this quietly. In algorithmic design choices that reward control over consent. In corporate policies that prioritize surveillance in the name of “user experience.” In data regimes that hoard power under the guise of scale.
What Yarvin offers isn’t a direct blueprint. It’s the ideological permission to believe that democracy is inefficient—and that inefficiency is a sin. That expertise should override consensus. That tech leaders, by virtue of intelligence and vision, should rule like kings.
It sounds absurd in daylight. But in the fluorescent buzz of a venture-backed war room, it starts to sound… reasonable.
—
Techno-Libertarianism Was the Bait. Autocracy Is the Switch.
Silicon Valley has long postured as libertarian: move fast, break things, stay out of our way. But what happens when you scale that attitude to a billion users? When your tools rewrite how elections are won, how truth is filtered, how laws are enforced?
You don’t get freedom. You get private governance.
And that’s the trap Yarvin laid. The “exit” from liberal democracy he proposed always led not to freedom but to feudalism. A system where “benevolent dictators” run their fiefdoms like apps. Where the user is not a citizen but a subject.
AI, with its opacity and scale, is the perfect tool for that system. It allows a handful of engineers and executives to encode decisions into products with no democratic oversight—and call it innovation.
—
The Real Threat Isn’t Bias. It’s Ideology.
Critics of AI love to talk about bias. Racial, gender, socioeconomic—it’s all real. But bias is a surface problem. A symptom. The deeper issue is ideological: who decides what the machine learns? Whose values shape the neural net?
The answers aren’t neutral. They’re being written by people who admire China’s efficiency, distrust democracy’s messiness, and see consent as an obstacle to progress.
People who, in quiet agreement with Yarvin, believe that civilization needs an upgrade—and that governance is too important to be left to the governed.
—
A Call to Awareness
Curtis Yarvin is not the disease. He is a symptom. A signal. He articulated what many in Silicon Valley already felt: that the smartest should rule, and the rest should obey or get out of the way.
But ideas don’t stay in walled gardens. They infect culture. They shape the way code is written, platforms are built, and policies are set.
If we do not confront the ideologies shaping AI, we will build a future that reflects them. Not just in what machines do—but in who they serve.
So ask yourself: Who holds the pen behind the algorithm? Whose vision of order is being carved into the silicon?
And who gets erased in the process?
Because the future isn’t just being built.
It’s being chosen.

Alt Text:
Surreal digital painting of a faceless Silicon Valley tech executive on a throne made of circuit boards, with a shadowy figure whispering in their ear. Behind them, glowing neural networks branch upward while the roots morph into barbed wire and surveillance cameras. A dystopian city skyline looms beneath a sky filled with code, evoking themes of authoritarian influence in AI and tech culture.
