
Tommi Hippeläinen
February 12, 2026
When we imagine a "country of geniuses in a datacenter", it is tempting to picture a form of power without precedent - a civilization-ending asymmetry of intelligence that humanity may not survive.
But history suggests something else.
Throughout every major technological leap - from metallurgy to nuclear weapons - large power imbalances have emerged. And in every case, secondary control systems evolved alongside them. Not because humans were wise. Not because institutions were mature. But because power that ignores physical, biological, and ecological constraints eventually collides with them.
The deeper question is not whether AI will become extremely intelligent.
The deeper question is: What kind of power is intelligence, when it has no body?
A datacenter full of superintelligence is often described as a "country". But a country controls territory, logistics, energy, food, materials, and force.
AI controls none of those directly.
It lives inside buildings, on power grids, at the far end of supply chains it does not control.
Remove electricity, and a "country of geniuses" becomes inert silicon.
This is not a trivial point. It is structural.
Biological organisms evolved over hundreds of millions of years to survive in hostile, unpredictable environments.
Humans can operate in deserts, mountains, forests, and at sea. We repair infrastructure. We improvise under stress. We survive blackouts.
AI cannot.
Intelligence without embodiment is extraordinarily powerful in narrow domains - but extremely fragile in ecological terms.
History shows a recurring pattern:
The printing press destabilized religious authority. It also created modern pluralism.
Gunpowder destabilized feudal orders. It also created centralized states.
Nuclear weapons created existential risk. They also created deterrence regimes, verification systems, and decades of uneasy stability.
Every extreme asymmetry has produced secondary control systems to contain it.
The idea that AI represents the first power that escapes this pattern assumes that intelligence alone overrides every other layer of reality.
But intelligence is downstream of energy. Energy is downstream of physical infrastructure. Infrastructure is downstream of labor and materials. Materials are downstream of geology and climate.
No intelligence stack floats above thermodynamics.
The real imbalance between humans and AI is not intelligence.
It is adaptability.
Humans adapt: we operate across climates, repair what breaks, and improvise when systems fail.
AI systems do not: they run only where electricity, hardware, and maintenance are continuously supplied.
Even the most advanced model remains dependent on a fragile industrial substrate.
This creates a structural ceiling on what "runaway intelligence" can do without sustained cooperation from embodied human systems.
The anxiety around AI often assumes that exponential cognitive speed automatically translates into world-scale dominance.
But the physical world does not operate at cognitive speed.
Manufacturing takes time. Logistics take time. Robotics requires materials. Energy requires infrastructure. Infrastructure requires maintenance.
A model can design a better drone in milliseconds.
But producing millions of drones requires factories, materials, logistics, energy, and human labor.
A storm can shut down more power than a model can generate.
A supply chain disruption can halt more production than intelligence can compensate for.
Biology survives chaos. Silicon does not.
When nuclear weapons emerged, many believed humanity had reached irreversible instability.
Instead, we saw deterrence regimes, verification systems, and decades of uneasy stability.
Not perfection - but stabilization.
AI will not exist in a vacuum. It will exist inside power grids, supply chains, and human institutions.
Secondary control systems will emerge not from idealism, but from necessity.
The greater risk is not that AI, as pure intelligence, transcends biology.
The greater risk is that human institutions fail to adapt fast enough to manage the political imbalance it creates.
AI is powerful. But power without embodied autonomy remains dependent.
The real imbalance is political, not metaphysical.
The metaphor of technological adolescence implies that we are a child handed a weapon.
But humanity has already passed through multiple adolescence events: the printing press, gunpowder, nuclear weapons.
Each time, catastrophe was possible. Each time, adaptation occurred.
Not because humans were perfectly wise - but because biological survival pressures force stabilization.
AI does not eliminate that pressure. It amplifies the need for it.
A "country of geniuses in a datacenter" is formidable.
But it is not sovereign.
It depends on electricity, supply chains, maintenance, and embodied human labor.
The ultimate asymmetry remains biological resilience.
Until intelligence can metabolize, self-repair, and survive independently of industrial civilization, it is powerful - but not autonomous in the evolutionary sense.
The challenge is not whether intelligence becomes overwhelming.
The challenge is whether governance evolves as quickly as capability.
Secondary control systems are not optional. They are historically inevitable.
The question is whether we build them deliberately - or let them emerge through crisis.
That is the true rite of passage.
Not survival against superintelligence.
But institutional maturity in the presence of it.