A Phone Call at the Edge of the Compute Bubble
A reflection on enterprise AI adoption, risk ownership, and decision paralysis under accelerating technological change
Why AI adoption is stalling in the real world
The “Quick Call” That Lasts Ninety Minutes
A few days ago, a leadership team from a top-tier global electronics company got in touch for a “quick sync” on AI. For ninety minutes, we stayed on the line.
The difference between a fifteen-minute check-in and a ninety-minute deep dive is the loudest signal in the market right now. It is the sound of a group of people realizing that things are changing.
This wasn’t a company stalled by a failed pilot or a lack of budget. They manage safety-critical systems at global scale and face heavy board pressure to “do something.” Yet they couldn’t move. The problem isn’t the technology; it’s that no internal structure exists to carry a single high-stakes decision. In that ninety minutes, it became evident that they couldn’t identify one person able to justify a capital commitment to the CEO in terms that would actually hold up.
They weren’t questioning whether AI was important. They were discovering that they couldn’t plug it in anywhere without blowing their internal fuses. Worse, they can’t keep up with the new risks that entered the AI landscape yesterday; they don’t even know which fuse they should be watching.
December’s fuses aren’t November’s. Worse still, they can’t even say which fuse would have blown in November had they decided to adopt something then.
The Compute Bubble vs. The Reality Mismatch
In my previous piece, Breakfast at the Edge of the Compute Bubble, I argued that we are in a supply-led frenzy: capital is building the grid before anyone knows what will be plugged into it. Keep digging, and the effects become personal.
This is the “oh, man” moment: the Valley treats enterprise adoption as a discovery problem. It is really a de-risking problem.
People often argue that once we find the right “killer app” or agent framework, everything will fall into place. It won’t. That view ignores the fact that a C-suite is made up of people. Most companies don’t need new use cases; they need a way to fit a probabilistic “black box” inside a corporate culture built on determinism, audits, and “the buck stops here” accountability.
You can’t ask an experienced COO to put their career on the line for a system that works “most of the time.”
Why Progress is Creating Friction, Not Flow
We are seeing a clear divide that no amount of computing power can fix.
- Model development is self-contained, fast, and rewards “move fast and break things.”
- Integration is path-dependent, socio-technical, and punishes surprises.
As models improve, the black box gets harder to see through, and the more autonomous models become, the more a failure costs. That opens a wide cognitive gap. This isn’t an argument against progress; it’s an observation that institutional ownership isn’t keeping up with it.
From the supply side, this looks like unstoppable momentum in the Valley. From the enterprise’s side, it looks like mounting unknown exposure. This is why clients retreat into “perpetual pilots.” The technology has simply moved faster than their capacity to own it.
They are looking at a Ferrari, but they know they don’t have a driver’s license, insurance, or even a road.
The Absorption Wall
We call legacy systems “technical debt.” That’s wrong, and frankly a little insulting to the people who built them. Those systems aren’t just old code; they’re the company’s hardwired survival instincts.
They encode every painful lesson the company has learned: every safety margin, every regulatory fine, every labor deal. They exist because someone was held accountable.
Putting AI into an organization is not an engineering upgrade; it’s an organ transplant. And for now, the body is rejecting the donor. The backlog isn’t caused by a lack of “intelligence” or “ambition.”
- The Valley optimizes for Capability.
- The Enterprise optimizes for Accountability.
The groundwork for a huge AI revolution is already in place. But if we keep ignoring the hard, unglamorous work of “absorption” (figuring out how a system integrates, where it may fail, and who answers when it does), we aren’t making a revolution. We are building a graveyard of high-performance models that no one has the nerve to deploy.
In 2026, the question will shift from what the models can do to what the organizations can absorb.
Closing: Making Space for the Unseen
AI adoption is no longer held back by a lack of intelligence, compute, or ambition. It is held back by the capacity to absorb.
People who work with AI know that these systems will behave in ways that cannot be fully foreseen. The open question is not whether something will go wrong, but where uncertainty is allowed to live, how its effects are contained, and who owns the outcome when it happens.
But it would be foolish to assume that today’s difficulties are permanent. This field is changing too quickly and along too many axes. Breakthroughs often come from people or approaches nobody was watching. History shows that when systems feel most constrained, novelty doesn’t disappear; it simply arrives from unexpected directions.
The hard part isn’t predicting the future; it’s recognizing it when it arrives. That capacity doesn’t come from more demos or more pilots. It comes from the quieter work: making sure everyone knows what they own, making room for doubt, and letting institutions evolve without demanding certainty right away.
Until companies can have that conversation honestly, with themselves and with their people, AI will keep feeling both powerful and strangely hollow. Not because progress has failed, but because the conditions needed to absorb it are still being built.