The Number That Should Worry You
Eighty-eight percent of organizations now use AI in at least one business function. That number, from McKinsey’s State of AI research (surveying nearly 2,000 executives across 105 countries), sounds like progress. And it is.
But here is the number that matters more: only 6% of those organizations qualify as “high performers,” meaning AI contributes more than 5% of EBIT in a measurable, sustained way. Nearly two-thirds have not begun scaling AI across the enterprise at all. They are stuck in what the industry has started calling “pilot purgatory.”
In the first article in this series, I described the difference between AI optimization and AI transformation. Most companies are doing the former and calling it the latter. This article is about what the latter actually looks like when it is real.
Transformation Is Not a Technology Problem
The instinct when companies fail to get value from AI is to blame the tools. The model was not good enough. The data was not clean. The vendor oversold the capability.
Sometimes those things are true. But McKinsey’s research points to something more fundamental: the organizations getting real value from AI are not using better technology. They are organized differently.
High performers in McKinsey’s data are nearly three times more likely than other organizations to fundamentally redesign their workflows when deploying AI. They do not layer AI on top of existing processes. They rebuild the process around what AI makes possible.
This is the operational distinction that separates optimization from transformation. Optimization asks: how do we do this existing thing faster? Transformation asks: now that this capability exists, what should the process actually look like?
Those are different questions, and they produce different organizations.
What the 6% Do Differently
McKinsey’s research identifies five consistent patterns among AI high performers. None of them is about which model or vendor they chose.
They redesign workflows, not just tasks. Only 21% of organizations using generative AI have redesigned even some of their workflows. The high performers do not automate a step in an existing process. They look at the entire workflow and ask which steps should exist at all now that AI is in the picture. The metaphor McKinsey’s analysts use is installing a jet engine on a horse-drawn cart. The engine works fine. The vehicle is not designed for it.
Senior leadership owns the initiative. This is not delegated to IT or to an innovation team. In high-performing organizations, senior leaders are actively engaged in driving AI adoption, including modeling the use of AI themselves. McKinsey’s data shows that having a dedicated transformation office or AI leadership structure is one of the strongest predictors of value creation. When AI lives in a silo, it produces siloed results.
They invest at a different scale. More than one-third of high performers allocate over 20% of their digital budgets to AI, roughly five times the rate at which other organizations do. This is not about spending recklessly. It is about making a strategic bet rather than hedging with small pilots that never reach the scale required to change the business.
They define what humans validate and what machines decide. One of the strongest predictors of AI value in the research is having defined processes for when and how human judgment validates AI output. This sounds like a governance detail. It is actually an operating model decision. It determines the speed, quality, and trustworthiness of every AI-assisted workflow in the company.
They measure outcomes, not activity. High performers track competitive differentiation, market share, customer satisfaction, and revenue signals. They do not measure success by the number of AI tools deployed or the number of use cases piloted. The metric is business impact, not technology adoption.
The Operational Layer Most Companies Are Missing
There is a pattern underneath all five of those behaviors. It is not a technology pattern. It is an operational leadership pattern.
The companies getting real value from AI have someone, or a team, sitting above the execution layer who is accountable for connecting AI investments to business outcomes. Someone who can look across functions, see where workflows need to be redesigned, determine where human judgment is essential and where it is overhead, and make the call on what to scale and what to stop.
This is operational infrastructure. It is the kind of work that sits between the CEO’s vision and the team’s execution. Without it, AI adoption becomes a collection of disconnected experiments that each team runs according to its own logic. The tools get deployed. The workflows do not change. The org chart stays the same. And a year later, the leadership team is asking why the AI investment has not produced the results they expected.
McKinsey’s research supports this directly. Their “Rewired” framework, based on more than 200 at-scale AI transformations, identifies six dimensions essential to capturing value: strategy, talent, operating model, technology, data, and adoption at scale. Four of those six are operational and organizational, not technical.
The technology works. The question is whether the organization is designed to use it.
The Honest Version
Not every company needs to be in the 6%. Transformation is not the right move for every organization at every stage. Some companies genuinely need optimization right now: reduce costs, improve efficiency, stabilize operations. That is a legitimate strategy.
The problem is when leadership believes it is transforming when it is actually optimizing. Because the two require different investments, different organizational structures, different timelines, and different definitions of success. Confusing them does not just waste money. It wastes the strategic window where real transformation was possible.
The companies that will look back on this period as the moment they pulled ahead are the ones that understood the difference, made a deliberate choice, and built the operational infrastructure to execute on it.
The next article in this series examines where the project management professional fits in this picture, and why the PM skillset is more relevant to AI initiatives than most organizations realize.
Travis Cox is a Fractional COO working with founders and operators at 10 to 100 person companies. If your AI investments are producing activity but not outcomes, the Operational Readiness Diagnostic can help you figure out why.