AI amplifies whatever you point it at. Strong teams with clear strategies get faster and more focused on what matters. Weak teams with vague strategies get noisier and more distracted. The gap does not close. It widens. Clear thinking compounds. Confused thinking unravels.
This is Kahneman’s overconfidence finding applied to AI-accelerated work: we confuse fluency with accuracy and mistake activity for progress. In a world where AI makes everyone more productive, the illusion of progress becomes even more seductive — you ship more, build more, deploy more, and mistake the volume for value. The vanity metrics that emerge (percentage of code written by AI, number of agents deployed, tokens consumed, features shipped per sprint) feel like progress but say nothing about whether you are building something that wins.
The antidote is not to slow down but to redirect what you measure: optimize for feedback loops that sharpen judgment, not metrics that count activity. This pattern mirrors "Every optimization has a shadow regression — guard commands make the shadow visible": amplifying speed without tracking judgment quality creates a shadow regression in strategic direction. It also extends "Commodity work's terminal value is zero but structured expert judgment compounds indefinitely" from markets to individual practice: commodity output has zero terminal value, but structured judgment about what to build compounds indefinitely.