Thoughts on Empire of AI by Karen Hao

Every empire looks inevitable — until it isn’t. Rome, the British Empire, Big Oil. Today’s AI titans — OpenAI, Anthropic, Google, Microsoft, and yes, Apple — carry themselves like they’re writing the next chapter of destiny.

But as Karen Hao reminds us in her powerful book Empire of AI, inevitability is an illusion. What looks like technological progress is often empire-building in disguise — extraction of data, energy, and human labour, wrapped in glossy narratives about innovation.

The Empire Lens

Hao’s work forces us to strip back the marketing and see AI as empire:

  • Extraction: annotation work, content moderation, vast amounts of energy and water.
  • Concealment: hidden supply chains, silent externalities.
  • Concentration: decision-making and governance locked in the hands of a few firms.

It’s not just about the brilliance of the models. It’s about the costs — human, environmental, societal — that the empire doesn’t want you to see.

Earn the Right Perspective

In Earn the Right, I argue that technology only works when it serves outcomes that matter — not vanity metrics, not scale for its own sake. Empires collapse because they forget that legitimacy isn’t built on dominance. It’s earned, repeatedly, through competence, accountability, and contribution to human flourishing.

That principle applies just as much to AI. The question isn’t “who can build the biggest model?” It’s: “who can earn the right to shape the future responsibly?”

Judging the Players with Hao’s Risk Matrix

If we take Hao’s empire criteria and apply them to today’s major AI players, the picture gets interesting:

  • OpenAI – Innovative, bold, but opaque. Moving faster than its governance can keep up.
  • Anthropic – Safety-framed, but still empire logic: massive models, massive compute, centralised power.
  • Google – The old empire, adapting. Deep resources but high secrecy and external costs.
  • Microsoft – The empire by proxy, using its cloud and capital to extend reach via OpenAI.
  • Apple – The outlier. Risk of outsourcing its AI “brain,” but also a chance to chart an anti-empire path: local AI, privacy-first, user-centric. If it chooses courage over convenience, Apple could show another way.

The Human + Technology Interface

This is where my work and Hao’s lens intersect. Empires fail when they forget the human. Strategy execution, with or without AI, is about balancing the interface:

  • Humans define the purpose.
  • Technology amplifies intelligence.
  • Outcomes create legitimacy.

Lose that balance, and you drift into empire. Keep it, and you build systems that endure.

The Future Isn’t Pre-Ordained

The pursuit of superintelligence isn’t destiny. Neither is empire. The future of AI will be written not by GPUs alone but by the leaders and organisations who earn the right to wield this power responsibly.

The critical question is not when we reach superintelligence, but who gets to define what “super” means.

Perhaps now is the time to revisit what we believe our Winning Aspirations should be. Beyond the narrow and predictable targets of quarterly earnings and macro-economic growth. When we align behind different outcomes — human flourishing, resilience, environmental sustainability, fairness — we change the brief. We ask technology to deliver outputs that serve these goals, not the empire’s.

Empires fall. But great systems endure — when they respect the balance between human agency and technological capability. That’s where legitimacy lives. That’s where the future should be.
