The Leverage Arbitrage: Why Everything Feels Broken

Something fundamental has shifted in how power works, and most of our institutions haven't noticed. We're living through what might be called "leverage arbitrage divergence"—a growing gap between how fast some actors can change the world and how fast others can respond to those changes.

Think about it: A small team at TikTok can alter the attention patterns of a billion teenagers in months. Meanwhile, educational institutions need years just to update their curricula, and democratic governments require decades to develop coherent responses to technological change. This isn't just about technology moving fast—it's about different types of power operating at completely different speeds.

The framework begins with a simple observation made by entrepreneur Naval Ravikant: there are three types of leverage. Labor leverage scales through human coordination—think committees, voting, traditional management. Capital leverage scales through resource deployment—investment, market creation, financial engineering. Code leverage scales through systematic automation—algorithms, platforms, network effects. Each scales at a different mathematical order: labor roughly linearly with headcount, capital exponentially through compounding returns, and code at near-zero marginal cost, so its reach can multiply rather than merely add.
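As a rough illustration of why these orders diverge so quickly, here is a toy model (the growth parameters are illustrative assumptions, not figures from the article): linear hiring, compounding capital, and replication-driven code reach, compared over the same number of periods.

```python
# Toy comparison of scaling regimes (illustrative numbers only):
# - labor leverage: output grows linearly with headcount
# - capital leverage: output compounds at a fixed rate per period
# - code leverage: near-zero marginal cost, so reach can multiply per period

def labor_output(periods, output_per_person=1, hires_per_period=10):
    """Linear: each period adds a fixed number of contributors."""
    return output_per_person * hires_per_period * periods

def capital_output(periods, principal=100, rate=0.2):
    """Exponential: compounding returns on deployed capital."""
    return principal * (1 + rate) ** periods

def code_output(periods, seed_users=100, multiplier=2):
    """Multiplicative: replication at near-zero marginal cost."""
    return seed_users * multiplier ** periods

if __name__ == "__main__":
    for t in (1, 5, 10, 20):
        print(t, labor_output(t), round(capital_output(t)), code_output(t))
```

Even with these modest made-up parameters, the linear curve is dwarfed within a handful of periods, which is the arbitrage gap the essay describes.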

Here's where it gets interesting: we're experiencing massive "leverage arbitrage" where actors with higher-order leverage can extract value from systems faster than actors with lower-order leverage can maintain those systems. Google's compensation revolution didn't just raise tech salaries—it systematically destroyed entrepreneurial ecosystems globally by making employment more attractive than company-building. Social media platforms don't just connect people—they reshape democratic discourse faster than democratic institutions can adapt. AI systems aren't just tools—they're deployed faster than we can develop frameworks for understanding their social implications.

The result is what we might call the "modern tragedy of the commons." In Garrett Hardin's original formulation, everyone overused shared resources until they collapsed. Today, higher-leverage actors are strip-mining institutional commons—democratic norms, social trust, educational relevance, economic mobility—faster than lower-leverage actors can regenerate them.

This explains seemingly disconnected modern phenomena. Why do individuals feel powerless despite having unprecedented access to global platforms and financial markets? Because they're measuring their agency using frameworks designed for labor leverage while actually wielding code and capital leverage daily. Why do tech companies keep creating "unintended consequences" despite hiring brilliant people? Because engineers build complex adaptive systems while executives manage them using business metrics designed for predictable, linear growth.

Why does government regulation always feel like it's fighting the last war? Because bureaucratic processes designed for industrial-age problems can't keep pace with algorithmic systems that reshape society in real time. Why do young people feel anxious about their futures despite living in the most opportunity-rich era in history? Because educational institutions prepare them for career models that economic change has already rendered obsolete.

The leverage arbitrage creates a vicious cycle. As the gap between different leverage types widens, traditional coordination mechanisms become increasingly ineffective. This drives more actors toward higher-leverage approaches, accelerating the divergence. Meanwhile, the institutional commons that make civilization possible—shared truth, democratic discourse, economic mobility, social cohesion—continue degrading because they depend on lower-leverage maintenance that can't compete with higher-leverage extraction.

But understanding this dynamic also suggests solutions. Instead of trying to slow down technological change or somehow make institutions faster, we need "leverage literacy"—helping people recognize the type of power they're actually wielding and use it more consciously. We need organizations designed to operate across leverage levels simultaneously, with decision-making processes that account for different mathematical orders of impact.

Most importantly, we need to redesign how we measure success and allocate resources. Current systems reward short-term optimization within single leverage types while ignoring long-term effects across leverage types. The result is systematic underinvestment in the institutional commons that higher-leverage systems depend on to function sustainably.

The stakes couldn't be higher. If we don't develop conscious approaches to managing leverage arbitrage, we risk a future where technological capability advances exponentially while social coordination capacity deteriorates linearly—a recipe for civilizational breakdown. But if we can build leverage literacy into how we design institutions, raise children, and structure careers, we might create coordination mechanisms sophisticated enough to harness exponential technological power for genuinely beneficial outcomes.

The choice isn't between technology and tradition—it's between conscious coordination across leverage levels and unconscious optimization within leverage silos. The former could create unprecedented human flourishing. The latter is already creating unprecedented institutional dysfunction.

The question is: which future will we choose to build?

12 responses
I'm glad to have read this article — it puts into words a thought I've been mulling for a while. I've been thinking about encouraging people who want to do good to instead go and start businesses that can indirectly do good, since businesses are much more scalable than equivalent charities (and can fundraise a lot more easily). They're (imo) one of the easiest ways to upgrade one's leverage class. Thanks for the leverage layering framing. The em-dashes (—) in the above message are human-inserted, and not the product of an AI model.
I think outcome-linked prediction markets could be a way to make it so advocating and implementing smart public policy becomes much higher leverage.
Very nice article. But in your penultimate paragraph you premise the good outcome on "if we can design institutions, raise children, and structure careers ..." But if your argument holds, that will take too long, and will be rendered obsolete by the higher leverage actors you refer to. I think we have already passed the stage of any meaningful restraint on technology; the developers won't do it, the government can't do it; and expecting millions of people to suddenly abandon free games for responsible software isn't going to happen. From here on, we're just along for the ride, we're not "designing" any future. It's just emerging.
Largely agree with everything except the premise that most societies once held these values on a similar plane. What is happening has been happening all along; technology is just accelerating it while making it visible.
I mostly don't disagree, but I've seen a lot of articles like this where the author tries to take a new perspective, and honestly, perspectives that compare actors ordinally by power dynamics, even indirectly, always end up being divisive and useless. The article also doesn't actually dig into the meat of what drives the leverage, which is the main connection to reality; so while it sounds nice, it has some serious deficits if the point of the piece was to educate or move toward the stated goal. With indirect or ambiguous things like this, the very first thing you need is a definition, and while you have indirect statements for the various things, they lack the properties required of a definition. A definition requires what is known in philosophy as metaphysical objectivity, or identity, as in Descartes' "I think, therefore I am" argument. It is tied to an objective measure that can be determined. The framework described lacks this property and focuses on social standing and power dynamics, which are circular, rather than on incentives, economics, or human action. That's problematic: with a circular definition you'll always have a malcontent with no solution; Hegel knows this best. The most obvious measure you could potentially tie to this, and meet all the requirements, would be the presence of non-reserve money-printing in whichever form it takes, be it debt, money-substitutes (frequent flyer miles, points, membership perks), paper commodity warrants that exceed the underlying materials in the vault for delivery, options contracts, etc. As it stands, though, the post has a fairly weak call to action.
Interesting read and fairly spot on, but at some point the average slob just becomes a victim. I'm still trying to figure out who broke the internet. For example, if I'm searching for a product on Amazon, Walmart, etc., and I sort by lowest price or turn on filters to narrow down my choices, none of it works. Even taking the sponsored choices into account, it just doesn't. Try a Google search for something specific, and that doesn't seem to work much of the time either. AI searches? Forget it; they're just scraping data from websites and presenting us with a distorted view. The bigger problem is that the world now relies on garbage information and pairs it with a low moral compass; we are doomed. A 13-year-old addicted to video games in 2001 used to be thought of as bad; now those folks are running the world. What about a 13-year-old today who is addicted to porn and getting all their political and moral information from TikTok, distorting their view of what a family is, what a relationship is, what marriage is, or even what gender they are? They get to run the world soon... God help us.