<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[Return on Clarity: ARTIFICIAL INTELLIGENCE]]></title><description><![CDATA[Insights on how to get the most out of GenAI and Agentic AI--in the enterprise, and in your professional life.]]></description><link>https://returnonclarity.substack.com/s/artificial-intelligence</link><image><url>https://substackcdn.com/image/fetch/$s_!LwKU!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F60a5f848-ceb7-4478-b69b-7a2cc92e192f_806x806.png</url><title>Return on Clarity: ARTIFICIAL INTELLIGENCE</title><link>https://returnonclarity.substack.com/s/artificial-intelligence</link></image><generator>Substack</generator><lastBuildDate>Tue, 14 Apr 2026 18:48:08 GMT</lastBuildDate><atom:link href="https://returnonclarity.substack.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[JAMES JANEGA]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[returnonclarity@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[returnonclarity@substack.com]]></itunes:email><itunes:name><![CDATA[Return on Clarity]]></itunes:name></itunes:owner><itunes:author><![CDATA[Return on Clarity]]></itunes:author><googleplay:owner><![CDATA[returnonclarity@substack.com]]></googleplay:owner><googleplay:email><![CDATA[returnonclarity@substack.com]]></googleplay:email><googleplay:author><![CDATA[Return on Clarity]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[The Most Dangerous Phrase in AI Strategy]]></title><description><![CDATA[Where Leaders Should Start With AI (and our free 2026 AI Use Case 
download)]]></description><link>https://returnonclarity.substack.com/p/the-most-dangerous-phrase-in-ai-strategy</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/the-most-dangerous-phrase-in-ai-strategy</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Fri, 06 Mar 2026 23:40:48 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, 
https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="6048" height="4032" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:4032,&quot;width&quot;:6048,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;Computer screen displaying code with a context menu.&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="Computer screen displaying code with a context menu." title="Computer screen displaying code with a context menu." 
srcset="https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1763568258235-f40425a94af9?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw4fHxhaSUyMGltcGxlbWVudGF0aW9ufGVufDB8fHx8MTc3Mjg0MDM4Mnww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg 
xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@dkomow">Daniil Komov</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>Let me start with a confession.</p><p>When executives tell me they want to &#8220;explore AI,&#8221; I get nervous. Not because exploration is bad. Exploration is how every innovation journey begins. But because, nine times out of ten, what follows is not exploration.</p><p>It&#8217;s fragmentation.</p><p>One team experiments with ChatGPT for marketing copy.<br>Another pilots an AI forecasting tool in operations.<br>Sales buys a conversation-intelligence platform.<br>Finance experiments with automation.</p><p>Six months later, the company has spent money, launched pilots, and generated enthusiasm.</p><p>And yet, when the CEO asks the simplest question&#8212;<em>&#8220;What are we actually doing with AI?&#8221;</em>&#8212;no one has a coherent answer.</p><p>If you&#8217;ve seen this movie before, you&#8217;re not alone.</p><p>Across industries, companies are drowning in AI possibility while starving for AI focus.</p><p>And that&#8217;s the real strategic challenge.</p><p>Not technology.</p><p>Prioritization.</p><div><hr></div><h1>The Art of the Possible (and the Trap It Creates)</h1><p>The phrase &#8220;Art of the Possible&#8221; gets thrown around constantly in AI conversations.</p><p>It sounds inspiring. It&#8217;s also dangerous. 
Because the real Art of the Possible should be more than a vision deck or a vendor demo.</p><p>It&#8217;s a map.</p><p>A map of the hundreds of places AI could potentially improve a business:</p><p><em>Customer support automation<br>Demand forecasting<br>Contract analysis<br>RFP generation<br>Supply chain optimization<br>Sales pipeline scoring<br>Marketing content generation<br>Risk detection<br>Knowledge search</em></p><p>Every function in the enterprise now has dozens of viable use cases.</p><p>That&#8217;s the miracle of this technology.</p><p>It&#8217;s also the trap.</p><p>When everything is possible, nothing is prioritized.</p><p>Leaders begin with enthusiasm. Then complexity sets in. Where do we start? Which use cases matter most? What&#8217;s realistic in 90 days versus 2 years? Where will the ROI actually show up? Why aren&#8217;t we hitting our numbers now?</p><p>Without a way to answer those questions, AI strategy quickly becomes what McKinsey calls <strong><a href="https://www.mckinsey.com/capabilities/people-and-organizational-performance/our-insights/the-organization-blog/avoid-pilot-purgatory-in-7-steps">pilot purgatory</a></strong>&#8212;a place where experiments happen but transformation never arrives. <em>(McKinsey&#8217;s answer: Secure CEO buy-in. How do you get that? And what if you ARE the CEO? RoC&#8217;s answer: read on.)</em></p><div><hr></div><h1>The Mistake Most Leaders Make</h1><p>The most common mistake I see is surprisingly simple. In McKinsey&#8217;s defense, a lot of this gets driven by our cleverest individual contributors.</p><p>These people and their immediate supervisors choose AI projects based on novelty rather than impact.</p><p>They implement the technology that looks most impressive:</p><p><em>A chatbot with a human-like voice.<br>An AI assistant inside Slack.<br>A generative design tool.</em></p><p>But those projects often live at the edge of the business rather than the center of it. 
Because the people who come up with these ideas aren&#8217;t allowed to tinker with critical workflows, which is where all the time-saving value lives.</p><p>The biggest operational gains sit inside some of the <em>least</em> glamorous workflows:</p><p><em>Inventory planning<br>Customer onboarding<br>Proposal writing<br>Demand forecasting<br>Pricing optimization<br>Sales research</em></p><p>These processes are not flashy. They are repetitive, data-rich, and operationally expensive.</p><p>Which makes them perfect candidates for AI.</p><p>The trick is learning to see them. And that requires a framework.</p><div><hr></div><h1>The Impact vs. Simplicity Map</h1><p>One of the simplest tools we use with executive teams is a two-axis framework.</p><p>Not because strategy should be simple, but because focus requires constraint.</p><p>Picture a square divided into four quadrants.</p><p>On the vertical axis: <strong>Impact</strong><br>How much measurable value could this use case generate?</p><p>Revenue growth<br>Cost reduction<br>Margin improvement<br>Speed of decision-making</p><p>On the horizontal axis: <strong>Simplicity</strong><br>How easy is it to implement?</p><p>Data availability<br>Integration complexity<br>Change management<br>Workflow disruption</p><p>Now place every potential AI use case on that grid.</p><p>What emerges is revealing.</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!9qnE!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!9qnE!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png 
424w, https://substackcdn.com/image/fetch/$s_!9qnE!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png 848w, https://substackcdn.com/image/fetch/$s_!9qnE!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png 1272w, https://substackcdn.com/image/fetch/$s_!9qnE!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!9qnE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png" width="724" height="344.0989010989011" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:692,&quot;width&quot;:1456,&quot;resizeWidth&quot;:724,&quot;bytes&quot;:209055,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://returnonclarity.substack.com/i/190157970?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!9qnE!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png 424w, https://substackcdn.com/image/fetch/$s_!9qnE!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png 848w, https://substackcdn.com/image/fetch/$s_!9qnE!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png 1272w, https://substackcdn.com/image/fetch/$s_!9qnE!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F8ed9f9e0-4105-4708-abae-bb34076b6ef8_1895x901.png 1456w" sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" 
stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Clarity Group&#8217;s AI Use Case Explorer allows our clients to define what &#8220;impact&#8221; and &#8220;complexity&#8221; mean to them, set their systems and talent AI maturity, and generate a list of use cases tailored to their circumstances. It&#8217;s pretty cool, we have to say.</figcaption></figure></div><div><hr></div><h2>Quadrant 1: High Impact, High Simplicity</h2><p><strong>Start Here</strong></p><p>These are the use cases every leadership team hopes to find.</p><p>They deliver measurable value and can be deployed quickly.</p><p>Examples often include:</p><p>AI-assisted RFP responses<br>Sales research automation<br>Customer support summarization<br>Internal knowledge search</p><p>They usually leverage existing data and integrate into workflows people already follow.</p><p>These projects create <strong>momentum.</strong></p><p>They also create credibility.</p><p>Nothing builds organizational confidence in AI faster than a win that shows up in a quarterly KPI.</p><div><hr></div><h2>Quadrant 2: High Impact, High Complexity</h2><p><strong>Strategic Investments</strong></p><p>These initiatives may reshape the business&#8212;but they require deeper infrastructure.</p><p>Think:</p><p>AI-driven demand forecasting<br>End-to-end supply chain optimization<br>Predictive maintenance systems<br>Dynamic pricing engines</p><p>These are the projects that can transform operating margins.</p><p>But they demand clean data, integration work, and sustained leadership attention.</p><p>They belong on the roadmap.</p><p>Just not on the first sprint.</p><div><hr></div><h2>Quadrant 3: Low Impact, High Simplicity</h2><p><strong>Nice to 
Have</strong></p><p>These projects are easy to deploy but generate limited economic value.</p><p>Examples include small productivity tools or isolated automation experiments.</p><p>They&#8217;re useful training grounds for teams learning AI.</p><p>But they rarely move the strategic needle.</p><p>If your roadmap is full of these, you&#8217;re likely stuck in pilot mode.</p><div><hr></div><h2>Quadrant 4: Low Impact, High Complexity</h2><p><strong>Avoid</strong></p><p>Every organization has these.</p><p>Ambitious AI ideas that sound impressive but create enormous implementation challenges with uncertain payoff.</p><p>These are the projects that burn time, budget, and organizational patience.</p><p>The best strategy is often simple.</p><p>Don&#8217;t do them.</p><div><hr></div><h1>Strategy Is the Discipline of Saying No</h1><p>When executives see their AI opportunities mapped this way, something interesting happens.</p><p>The fog lifts.</p><p>What once looked like an overwhelming universe of possibilities begins to look like a manageable portfolio of decisions.</p><p>Some projects move to the top of the list.</p><p>Some move to the long-term roadmap.</p><p>Some quietly disappear.</p><p>That&#8217;s not failure.</p><p>That&#8217;s strategy.</p><p>Because strategy, at its core, is the discipline of choosing <strong>what not to do.</strong></p><div><hr></div><h1>The Real Goal of an AI Strategy</h1><p>The purpose of AI strategy isn&#8217;t to deploy the most advanced technology.</p><p>It&#8217;s to improve how your organization works.</p><p>Better decisions.<br>Faster workflows.<br>Stronger margins.<br>More time spent on creative and strategic work.</p><p>The technology will keep evolving.</p><p>The companies that win will not be the ones chasing every new tool.</p><p>They&#8217;ll be the ones that build <strong>clarity</strong> around where AI actually matters.</p><div><hr></div><h1>The Invitation</h1><p>If you&#8217;re leading a company right now, the opportunity is 
enormous. But so is the noise. The leaders who move fastest won&#8217;t be the ones experimenting the most.</p><p>They&#8217;ll be the ones <strong>prioritizing the best.</strong></p><p>So here&#8217;s a simple exercise.</p><p>List ten AI use cases across your organization.</p><p>Place them on the <strong>Impact vs. Simplicity map.</strong></p><p>Then ask one question:</p><p><em>Which two would change our business the most if they worked?</em></p><p>Start there.</p><p>Because in the age of AI, the scarcest resource isn&#8217;t technology.</p><p>It&#8217;s focus.</p><p>To help you focus, Clarity Group has created the <a href="https://www.claritygroup.ai/ai-use-cases-for-operational-uncertainty">2026 AI Use Case Guide for Leaders</a>. Go ahead and grab a free copy at the link. </p>]]></content:encoded></item><item><title><![CDATA[Anthropic Shook the Market: Why Specialized Software Moats Just Got Repriced]]></title><description><![CDATA[What a single GenAI product launch&#8212;and the market&#8217;s reaction&#8212;reveals about disruption, defensibility, and the future of enterprise tech stacks]]></description><link>https://returnonclarity.substack.com/p/anthropic-shook-the-market-why-specialized</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/anthropic-shook-the-market-why-specialized</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Wed, 04 Feb 2026 12:38:40 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!kXfM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!kXfM!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!kXfM!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp 424w, 
https://substackcdn.com/image/fetch/$s_!kXfM!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp 848w, https://substackcdn.com/image/fetch/$s_!kXfM!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp 1272w, https://substackcdn.com/image/fetch/$s_!kXfM!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!kXfM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp" width="1456" height="970" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/d5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:970,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:70584,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/webp&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://returnonclarity.substack.com/i/186850281?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!kXfM!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp 424w, https://substackcdn.com/image/fetch/$s_!kXfM!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp 848w, https://substackcdn.com/image/fetch/$s_!kXfM!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp 1272w, https://substackcdn.com/image/fetch/$s_!kXfM!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fd5fe8c52-cca4-429b-8308-6acd4c98350e_2000x1333.webp 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p>Photo Credit: Bloomberg</p><p>In early February 2026, something unusual happened in public markets.<br>Not a slow reassessment. Not a speculative wobble. A sharp, synchronized reaction: Are you AI-enabled or AI-exposed?</p><p>After <a href="https://www.anthropic.com/">Anthropic</a> released a legal-workflow automation product, roughly <a href="https://www.bloomberg.com/news/articles/2026-02-03/legal-software-stocks-plunge-as-anthropic-releases-new-ai-tool">$285 billion in market value evaporated</a> across U.S. and European software, data, and information-services firms in a single trading session.</p><p>This wasn&#8217;t about AI <em>eventually</em> changing software. The market reacted to something concrete. A product shipped.</p><p>For boards and executive teams, the signal matters more than the number: capital markets are no longer debating whether GenAI threatens parts of the application layer. They are actively repricing which businesses are exposed.</p><div><hr></div><h2>What actually happened</h2><p>Anthropic quietly introduced a legal-workflow tool built on its general-purpose Claude model&#8212;part of its broader agentic &#8220;coworker&#8221; direction. The product automates contract review, NDA triage, compliance checks, and legal drafting, with explicit requirements for licensed-attorney review.</p><p>The surprise wasn&#8217;t that legal AI exists. It was how this one was built.</p><p>Rather than a bespoke, law-only model trained exclusively on proprietary case law, the tool combines a foundation model, structured prompts, and workflow orchestration. 
In effect, a model provider packaged enough of the application layer to deliver a usable vertical product.</p><p>Markets noticed immediately. Shares of firms like RELX, Thomson Reuters, and adjacent workflow-centric vendors fell sharply. The selloff spread beyond legal software into data, publishing, and professional-services platforms.</p><p>But this is more about moats than near-term revenue loss.</p><div><hr></div><h2>Why this moment mattered</h2><p>This episode crystallized three shifts enterprise leaders have been sensing&#8212;but may not have fully internalized.</p><h3>1. Model providers are moving up the stack</h3><p>For years, foundation-model companies positioned themselves as neutral infrastructure: powerful engines others would build on. That boundary is eroding.</p><p>By shipping ready-to-use workflows, model providers are now competing directly for budget, usage, and user relationships&#8212;the same terrain long occupied by application vendors.</p><p>For enterprises, this collapses a quiet assumption baked into many tech strategies: <em>your GenAI supplier may now also be your functional substitute. </em>That may be an opportunity to negotiate, to experiment, or to switch &#8212; but it&#8217;s also sure to be disruptive, and it&#8217;s worth thinking about.</p><h3>2. Workflow-bundling moats are being repriced</h3><p>Many SaaS businesses&#8212;especially in legal, finance, compliance, and research&#8212;earned defensibility by bundling data, workflows, and polished interfaces into seat-licensed products.</p><p>The market reaction posed a harder question: how much of that value is truly unique, and how much is workflow glue?</p><p>When a large-context, tool-using agent can replicate core task flows with reasonable reliability, interface polish alone stops looking like a moat. The premium shifts elsewhere. 
Right now, it seems to be behaving like a classic low-end disruption: If users really want what Anthropic can automate, and don&#8217;t want the rest of the feature set and its associated overhead, how many can and will peel off? How many non-buyers of enterprise software may look to Anthropic first?</p><h3>3. Defensibility is shifting, not disappearing</h3><p>This was not a declaration that incumbents are doomed. Analyst commentary following the selloff converged on three remaining sources of durable advantage:</p><ul><li><p><strong>Proprietary data</strong>: longitudinal corpora, labeled outcomes, and feedback loops that improve accuracy and trust over time.</p></li><li><p><strong>Deep integration</strong>: embedding into mission-critical systems where switching costs are operational, not cosmetic.</p></li><li><p><strong>Trust and governance</strong>: auditability, regulatory alignment, liability frameworks, and human-in-the-loop controls.</p></li></ul><p>Anthropic&#8217;s own insistence on attorney review reinforces this point. In regulated domains, automation without guardrails destroys trust. AI that supports expert judgment can quietly compound value&#8212;a point supported by research at Chicago Booth&#8217;s <a href="https://www.chicagobooth.edu/research/center-for-applied-artificial-intelligence">Center for Applied Artificial Intelligence</a>.</p><div><hr></div><h2>2026 looks like a sorting year</h2><p>Taken together, this episode marks a shift in how AI risk is priced.</p><p>Just as cloud readiness became a valuation driver a decade ago, credible GenAI strategy is becoming a capital-markets expectation. Companies are increasingly being sorted into two camps:</p><ul><li><p><strong>AI-enabled</strong>: firms that embed GenAI into core workflows, expose APIs for agentic orchestration, and build proprietary learning loops. 
(And build responsible governance regimes!)</p></li><li><p><strong>AI-exposed</strong>: firms whose primary value lies in workflow assembly that model-native agents can increasingly substitute.</p></li></ul><p>The speed of the selloff matters. Markets are not waiting for multi-year erosion to make this distinction. The reaction suggests investors had been bracing for something like this all along.</p><div><hr></div><h2>A practical playbook for enterprise leaders</h2><p>The lesson here is to reposition deliberately.</p><ul><li><p><strong>Own the decision, not just the interface</strong><br>Embed AI into economically meaningful decisions&#8212;not as a chat layer bolted onto legacy flows.</p></li><li><p><strong>Make your platform agent-compatible</strong><br>Expose clean APIs so internal and external agents orchestrate <em>through</em> your systems, not around them.</p></li><li><p><strong>Invest in proprietary data loops</strong><br>Usage telemetry, outcome labeling, and domain-specific feedback are becoming the real training advantage.</p></li><li><p><strong>Design governance as a feature</strong><br>Access controls, human review, audit trails, and explainability aren&#8217;t compliance overhead. They&#8217;re table stakes. And they can be differentiation.</p></li></ul><div><hr></div><h2>The signal boards should hear</h2><p>One GenAI product didn&#8217;t &#8220;destroy&#8221; $285 billion of value. It revealed how quickly perceived moats can collapse when models move into workflows.</p><p>For boards and executive teams, the question is no longer whether GenAI affects your business. 
It&#8217;s whether your strategy makes you AI-enabled or AI-exposed when the market notices next.</p><p>That moment rarely announces itself twice.</p><div><hr></div><h2>Side-Note: What this looked like on the ground</h2><p>We had just come back from working with executive teams overseas&#8212;regional C-suites and senior leaders in a large, diversified conglomerate that is wrestling with what &#8220;AI-enabled transformation&#8221; actually means in practice. At the same time, we&#8217;re building new AI innovation material for the University of Chicago Booth School of Business, pressure-testing where GenAI genuinely helps leaders move faster&#8212;and where it quietly creates risk.</p><p>Then the Anthropic news hit.</p><p>What struck us wasn&#8217;t that it contradicted what we were teaching.<br>It confirmed it.</p><p>Across boardrooms and classrooms, we&#8217;ve been drawing a sharp line between where GenAI creates leverage quickly and where it should not be trusted to decide, optimize, or invent facts. The market reaction to Anthropic&#8217;s legal tool landed squarely on that same fault line.</p><div><hr></div><h2>Where GenAI actually earns its keep</h2><p>In practice, most high-confidence enterprise GenAI wins today fall into what we call quick (and dirty) content generation&#8212;bounded, assistive use cases inside controlled systems:</p><ul><li><p><strong>Knowledge synthesis</strong>: summarizing contracts, policies, research, meeting notes, or regulatory material so humans can decide faster.</p></li><li><p><strong>On-brand content variation</strong>: generating drafts, outreach, internal communications, and A/B test content where speed matters more than originality.</p></li><li><p><strong>Knowledge co-pilots</strong>: drafting SOPs, reports, and explanations using validated data, with humans retaining judgment.</p></li></ul><p>These aren&#8217;t glamorous bets. 
They don&#8217;t promise autonomy.<br>But they compound productivity safely.</p><p>Notice what they share: GenAI accelerates <strong>understanding and expression</strong>, not decisions or truth. It drafts. It explains. It reframes. It does <strong>not</strong> determine pricing, targets, legal positions, or financial outcomes.</p><p>That distinction matters.</p><div><hr></div><h2>Why the Anthropic moment landed so hard</h2><p>Anthropic&#8217;s legal workflow tool sits right at the edge of this boundary.</p><p>It doesn&#8217;t replace lawyers. It doesn&#8217;t claim autonomous judgment. It synthesizes documents, drafts language, and surfaces issues&#8212;explicitly requiring licensed review. In other words, it operates in the same assistive zone we&#8217;ve been recommending to executives.</p><p>What shocked markets wasn&#8217;t recklessness.</p><p>It was competence.</p><p>By packaging a general-purpose model with what appears to be structured prompts and workflow discipline, Anthropic demonstrated that much of what application-layer vendors considered &#8220;defensible workflow&#8221; is now reproducible&#8212;<em>as long as the AI stays bounded and non-decisive.</em></p><p>That&#8217;s the uncomfortable truth the selloff priced in.</p><div><hr></div><h2>The hidden lesson for enterprise tech stacks</h2><p>For executives, the takeaway isn&#8217;t &#8220;foundation models will eat everything.&#8221;<br>It&#8217;s subtler&#8212;and more actionable.</p><p>If your product or internal system:</p><ul><li><p>Generates summaries, drafts, explanations, or variations</p></li><li><p>Relies on humans to validate outcomes</p></li><li><p>Operates inside clear guardrails</p></li><li><p>Avoids inventing numbers, targets, or judgments</p></li></ul><p>Then GenAI is an accelerant.</p><p>But if your value proposition depends on:</p><ul><li><p>Deciding what&#8217;s true</p></li><li><p>Choosing who to target</p></li><li><p>Setting prices, limits, or offers</p></li><li><p>Acting 
autonomously as an analyst, engineer, or authority</p></li></ul><p>Then GenAI isn&#8217;t just risky&#8212;it&#8217;s <strong>structurally misapplied</strong>.</p><p>Markets are learning this distinction faster than many operating committees.</p><div><hr></div><h2>Why this matters for transformation programs</h2><p>In AI-powered transformations, the biggest failure mode we see isn&#8217;t technical.<br>It&#8217;s <strong>overreach</strong>.</p><p>Leaders ask GenAI to decide too soon, too broadly, or too autonomously&#8212;before data foundations, governance, and accountability exist. That&#8217;s how trust erodes internally. That&#8217;s how promising pilots stall.</p><p>Anthropic&#8217;s move worked precisely because it didn&#8217;t cross that line.</p><p>The irony is that the market punished incumbents not because Anthropic was reckless&#8212;but because it was disciplined enough to expose where <em>workflow alone</em> is no longer a moat.</p><div><hr></div><h2>The real connection</h2><p>What we&#8217;re teaching executives and students is the same thing the market just enforced:</p><ul><li><p>GenAI is powerful when it <strong>supports human judgment</strong></p></li><li><p>Dangerous when it <strong>substitutes for it</strong></p></li><li><p>Transformative when embedded deliberately into workflows that already matter</p></li></ul><p>The Anthropic episode wasn&#8217;t a warning about AI hype.<br>It was a signal about where value now lives in enterprise systems&#8212;and where it no longer does.</p><p>That&#8217;s not a future problem.</p><p><br>It&#8217;s a 2026 one.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://returnonclarity.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Are Your AI Pilots Dead? ]]></title><description><![CDATA[How 65% of Enterprises Move From Proof-of-Concept to Profit]]></description><link>https://returnonclarity.substack.com/p/are-your-ai-pilots-dead</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/are-your-ai-pilots-dead</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Tue, 06 Jan 2026 17:40:11 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!S4ZD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!S4ZD!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!S4ZD!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png 424w, 
https://substackcdn.com/image/fetch/$s_!S4ZD!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png 848w, https://substackcdn.com/image/fetch/$s_!S4ZD!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png 1272w, https://substackcdn.com/image/fetch/$s_!S4ZD!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!S4ZD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png" width="1024" height="608" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:&quot;normal&quot;,&quot;height&quot;:608,&quot;width&quot;:1024,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:null,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!S4ZD!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png 424w, 
https://substackcdn.com/image/fetch/$s_!S4ZD!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png 848w, https://substackcdn.com/image/fetch/$s_!S4ZD!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png 1272w, https://substackcdn.com/image/fetch/$s_!S4ZD!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa8dcce4b-3ec3-4735-85e7-47af99c41963_1024x608.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" 
y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">When you can&#8217;t spell &#8220;fail&#8221; without &#8220;AI&#8221;</figcaption></figure></div><p>Your AI pilots aren&#8217;t just stalled: they&#8217;re dead on arrival. </p><p>While executives celebrate proof-of-concept wins and demo-day victories, a brutal reality lurks beneath the surface. <strong>95% of generative AI pilots at enterprises are failing</strong>, according to MIT research. That&#8217;s not a typo. That&#8217;s not &#8220;struggling to scale.&#8221; That&#8217;s outright failure.</p><p>Yet here&#8217;s what separates the winners from the casualties: The top 15% of companies don&#8217;t just run successful pilots: they achieve <strong>2.5x higher revenue growth</strong> and over 3x higher profit margins by treating AI as operational infrastructure, not science experiments.</p><p>The question isn&#8217;t whether your pilots will succeed. The question is whether you&#8217;ll join the 5% who systematically move from proof-of-concept to profit: or remain trapped in the 95% graveyard of good intentions.</p><h2>The Pilot Graveyard: Why 95% Fail</h2><p>Enterprise AI adoption tells two stories simultaneously. Over 70% of organizations have introduced generative AI. Only 6% have fully implemented agentic AI systems that drive measurable business outcomes.</p><p>This isn&#8217;t a technology problem: it&#8217;s an execution problem.</p><p>The core issue lies in how enterprises approach AI implementation. Generic tools like ChatGPT cannot learn from or adapt to enterprise workflows. They remain external additions rather than integrated solutions. Internal AI builds succeed roughly one-third of the time, while specialized vendor solutions achieve success rates of approximately 67%.</p><p>Most pilots fail because they operate in isolation. 
Teams build impressive demonstrations that work beautifully in controlled environments but collapse when exposed to real enterprise complexity. Legacy systems, data silos, regulatory requirements, and organizational resistance create friction that kills momentum before ROI materializes.</p><p>The MIT survey reveals another critical insight: <strong>95% of enterprises weren&#8217;t getting meaningful returns on their AI investments</strong>. This isn&#8217;t about waiting longer for results: this is about fundamentally flawed approaches that cannot scale.</p><h2>The 5% Advantage: What Winners Do Differently</h2><p>The companies moving from proof-of-concept to profit follow a different playbook entirely.</p><p><strong>They redesign business domains end-to-end.</strong> Instead of adding AI features to existing processes, successful enterprises rebuild high-value workflows around AI capabilities. They don&#8217;t automate inefficient processes: they reimagine efficient ones.</p><p><strong>They treat AI agents as operational infrastructure.</strong> Among enterprises moving beyond experimentation, 57% already have AI agents running in production. More than 25% report meaningful impact within three months. These organizations don&#8217;t run agents as experiments: they deploy them as mission-critical systems with proper governance and infrastructure.</p><p><strong>They empower line managers, not just AI labs.</strong> Success requires distributing AI implementation responsibility beyond central teams. Line managers who understand business operations drive adoption more effectively than technical teams who understand only AI capabilities.</p><p><strong>They select adaptive, enterprise-integrated tools.</strong> Winners choose AI solutions that integrate deeply with existing systems and adapt over time. 
They avoid tool sprawl and focus on proven technologies that deliver compound value across multiple use cases.</p><h2>The 2026 Inflection Point: From Experimental to Essential</h2><p>VC firms predict 2026 will mark a fundamental shift. Companies will abandon experimental AI budgets and demand concrete ROI requirements. The era of &#8220;let&#8217;s try AI and see what happens&#8221; ends this year.</p><p>This creates immediate pressure for two critical decisions:</p><p>First, enterprises must consolidate AI tool sprawl. The successful approach involves focusing resources on fewer, more powerful solutions rather than experimenting with dozens of point solutions.</p><p>Second, organizations must transition from central AI labs to distributed implementation. The pilot-to-production gap closes when business units own AI outcomes rather than treating them as IT projects.</p><h2>The Profit Framework: Four Pillars of Successful AI Implementation</h2><p>Companies that successfully move from proof-of-concept to profit follow a systematic framework:</p><h3>Pillar 1: Strategic Alignment Before Technical Implementation</h3><p>Successful AI initiatives begin with business strategy, not technology capabilities. Winners identify specific revenue or cost opportunities, then select AI approaches that directly address those opportunities. They avoid the common trap of implementing AI because it&#8217;s available and instead implement it because it&#8217;s necessary.</p><h3>Pillar 2: End-to-End Process Redesign</h3><p>Instead of layering AI onto existing workflows, successful companies rebuild processes around AI capabilities. This requires questioning fundamental assumptions about how work gets done. The goal isn&#8217;t incremental improvement: it&#8217;s systematic transformation.</p><h3>Pillar 3: Infrastructure-First / Infrastructure-Parallel Deployment</h3><p>Winners treat AI as operational infrastructure requiring proper governance, monitoring, and maintenance. 
They build systems that can scale, adapt, and integrate rather than isolated solutions that work only in controlled environments. You can start small and build as you go, but winners never skip this step.</p><h3>Pillar 4: Continuous Learning and Adaptation</h3><p>AI is not fire and forget. AI is NOT fire and forget. Successful AI implementations improve over time through systematic feedback and refinement. This requires both technical monitoring and business outcome measurement. Companies that succeed view AI deployment as ongoing capability development rather than a one-time implementation project. (AI is not fire and forget.)</p><h2>Beyond Pilots: The Leadership Imperative</h2><p>The transition from proof-of-concept to profit demands leadership clarity more than technical sophistication. Executive teams must make explicit choices about where to compete with AI and where to simply keep pace.</p><p>This requires honest assessment of current capabilities. Most organizations overestimate their AI readiness and underestimate implementation complexity. The successful approach involves identifying specific high-value domains for transformation rather than attempting enterprise-wide AI adoption.</p><p>Leadership clarity also means setting explicit success criteria before beginning implementation. Successful AI initiatives have measurable business outcomes defined upfront, not vague goals about &#8220;exploring AI potential.&#8221;</p><h2>The Execution Bridge: From Strategy to Results</h2><p>Moving from pilot to profit requires bridging the gap between strategic vision and operational execution. 
This involves three critical transitions:</p><p><strong>From experimentation to systematization.</strong> Successful companies establish repeatable processes for AI implementation rather than treating each initiative as a unique experiment.</p><p><strong>From technical metrics to business outcomes.</strong> Winners focus on revenue impact, cost reduction, and operational efficiency rather than model accuracy or processing speed.</p><p><strong>From pilot teams to enterprise capabilities.</strong> The goal involves building organizational competencies that persist beyond individual projects or personnel changes.</p><h2>Navigating the Path Forward</h2><p>The enterprises that successfully navigate from proof-of-concept to profit don&#8217;t do it alone. They work with partners who understand both AI capabilities and business transformation requirements.</p><p>At <a href="https://www.claritygroup.ai/ai-leadership-and-strategy">Clarity Group AI</a>, we help Fortune 1000 and Fortune 500 leaders avoid the 95% failure rate by focusing on strategic implementation rather than technical experimentation. Our approach centers on leadership clarity, disciplined execution, and measurable business outcomes.</p><p>The pilot graveyard is full of good intentions and impressive demonstrations. The profit zone belongs to organizations that treat AI as operational infrastructure and business transformation, not science projects.</p><p>Your AI pilots don&#8217;t have to join the 95% that fail. But success requires different thinking, different approaches, and different partnerships.</p><p>The choice is yours. The window is now. 
The results will define your competitive position for the decade ahead.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://returnonclarity.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://returnonclarity.substack.com/subscribe?"><span>Subscribe now</span></a></p><p></p>]]></content:encoded></item><item><title><![CDATA[The PB&J Problem ]]></title><description><![CDATA[Why GenAI Fails Exactly the Way We Ask It To]]></description><link>https://returnonclarity.substack.com/p/the-pb-and-j-problem</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/the-pb-and-j-problem</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Thu, 18 Dec 2025 14:30:05 GMT</pubDate><enclosure url="https://substackcdn.com/image/youtube/w_728,c_limit/wdzBqRjAl-M" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="native-audio-embed" data-component-name="AudioPlaceholder" data-attrs="{&quot;label&quot;:null,&quot;mediaUploadId&quot;:&quot;52051d19-0f98-4991-837d-7d394f4b6139&quot;,&quot;duration&quot;:333.7143,&quot;downloadable&quot;:false,&quot;isEditorNode&quot;:true}"></div><p>There&#8217;s a video I just decided I&#8217;ll always show to newcomers just starting to work seriously with GenAI.</p><div id="youtube2-wdzBqRjAl-M" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;wdzBqRjAl-M&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/wdzBqRjAl-M?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>A first-grade teacher asks her students to write instructions for making 
a peanut-butter-and-jelly sandwich. She follows those instructions <em>exactly</em>. No improvising. No &#8220;common sense.&#8221;</p><p>&#8220;Take the bread,&#8221; the kids write.</p><p>So she takes the entire loaf&#8212;still in the plastic bag&#8212;and puts it on the table.</p><p>&#8220;Put the peanut butter and jelly on it.&#8221;</p><p>She smears both&#8212;generously&#8212;on the outside of the bag.</p><p>Laughter erupts. Groans follow. The kids are horrified. &#8220;No! Not like that!&#8221;</p><p>She tries again.</p><p>&#8220;Make the bread flat.&#8221;</p><p>She flattens the entire bag with both hands.</p><p>&#8220;Put the PB&amp;J on.&#8221;</p><p>She puts it on her arms. Both forearms. Carefully.</p><p>More screams. More laughter. More frustration.</p><p>And then the moment lands: <em>The teacher didn&#8217;t fail.</em><br>She executed the instructions perfectly.</p><p>The failure was upstream.</p><p>That video is a perfect metaphor for how GenAI fails in organizations today&#8212;not because the models are stupid, biased, or uncontrollable, but because they are <strong>obedient in ways we are not prepared for</strong>.</p><p>And the research backs this up.</p><div><hr></div><h2>GenAI Is a Literalist With a Law Degree</h2><p>One of the most persistent myths about GenAI is that it &#8220;understands&#8221; what we mean.</p><p>It doesn&#8217;t.</p><p>It infers patterns from language. And when instructions are incomplete, ambiguous, or biased, the model doesn&#8217;t fix them. It <strong>operationalizes them</strong>.</p><p>Recent research from the <a href="https://www.chicagobooth.edu/research/center-for-applied-artificial-intelligence">Center for Applied Artificial Intelligence at Chicago Booth</a> and peer institutions shows that when humans delegate complex work to GenAI systems&#8212;especially agent-like systems&#8212;the dominant driver of outcomes is not the model. 
It&#8217;s the human who wrote the instructions.</p><p>In one widely cited stream of work, researchers found that the majority of variance in outcomes came from <em>who</em> framed the task and <em>how</em> they framed it&#8212;not from the AI&#8217;s capabilities. Bias wasn&#8217;t removed. It was <strong>scaled</strong>. Ambiguity wasn&#8217;t resolved. It was <strong>executed faithfully</strong>.</p><p>The PB&amp;J teacher didn&#8217;t &#8220;misinterpret&#8221; the kids&#8217; intent. She revealed the assumptions underlying it: the kids assumed the teacher would &#8220;know what they meant.&#8221; Hilarity ensues.</p><p>That&#8217;s what GenAI does in the enterprise. It exposes the gap between what we <em>think</em> we said and what we <em>actually specified</em>. Hilarity ensues.</p><div><hr></div><h2>Incomplete Instructions Are Not a Minor Bug. They Are the Core Risk.</h2><p>Most AI failures don&#8217;t come from hallucinations or rogue behavior. They come from something quieter and more dangerous: <strong>underspecified objectives</strong>.</p><p>&#8220;Summarize this.&#8221;<br>&#8220;Optimize that.&#8221;<br>&#8220;Draft a response.&#8221;<br>&#8220;Analyze the data.&#8221;</p><p>Each of these sounds reasonable&#8212;until you ask the questions we usually skip:</p><ul><li><p>For whom?</p></li><li><p>Using what assumptions?</p></li><li><p>With what tradeoffs?</p></li><li><p>At what level of risk?</p></li><li><p>And what <em>not</em> to do?</p></li></ul><p>When those questions go unanswered, GenAI fills the gaps using statistical priors. That&#8217;s not creativity. It&#8217;s default behavior.</p><p>Just like the teacher and the sandwich.</p><p>The research language for this is <em>specification error</em>. The human version is simpler: We didn&#8217;t say what we meant.</p><div><hr></div><h2>Bias Isn&#8217;t Introduced by the Model. 
It&#8217;s Smuggled in by Us.</h2><p>Another uncomfortable finding from the research: GenAI doesn&#8217;t just preserve human bias. It can <strong>amplify it</strong>.</p><p>Why? Because instructions carry hidden values.</p><p>When a leader asks an AI to &#8220;negotiate aggressively,&#8221; or &#8220;prioritize speed,&#8221; or &#8220;flag low performers,&#8221; they are embedding norms&#8212;about fairness, risk tolerance, power, and outcomes&#8212;often without realizing it.</p><p>The model doesn&#8217;t debate those norms. It enacts them.</p><p>This is why the emerging skill isn&#8217;t &#8220;prompt engineering.&#8221; It&#8217;s something deeper: <strong>machine fluency</strong>&#8212;the ability to translate intent, values, and constraints into language a system can execute without surprise.</p><p>The kids in the classroom weren&#8217;t bad students. They were novice system designers.</p><p>So are most of us.</p><div><hr></div><h2>The Real Lesson of the PB&amp;J Sandwich</h2><p>The point of the PB&amp;J exercise isn&#8217;t to mock children&#8212;or leaders keen to put AI to use in their enterprises.</p><p>It&#8217;s to teach a hard truth:</p><blockquote><p>Delegation without specification (<em>instructions without details</em>) doesn&#8217;t reduce risk. It relocates it.</p></blockquote><p>GenAI is not an intern who will ask clarifying questions. It is a tireless operator that assumes your instructions are complete&#8212;and then acts accordingly.</p><p>That&#8217;s not a bug. That&#8217;s the deal.</p><p>The opportunity is enormous. 
So is the responsibility.</p><p>If we want GenAI to be a partner rather than a prank, we have to do what the kids eventually learn to do in that classroom:</p><ul><li><p>Be explicit.</p></li><li><p>Anticipate edge cases.</p></li><li><p>Surface assumptions.</p></li><li><p>Define success <em>and</em> failure.</p></li><li><p>Say what matters&#8212;and what doesn&#8217;t.</p></li></ul><p>Clarity isn&#8217;t a constraint on AI.<br>It&#8217;s the unlock.</p><p>And if you ever wonder why your GenAI system is making a mess, ask yourself the same question the teacher silently asked her students:</p><p><em>Did I actually tell it what I meant&#8212;or did I just assume it knew?</em></p><p>That&#8217;s not an AI problem.</p><p>That&#8217;s the PB&amp;J problem.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://returnonclarity.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! You should <em>explicitly</em> 1) type your email in the box below, and then 2) click the blue &#8220;Subscribe&#8221; button. 
When you subscribe for free, you&#8217;ll 3) receive new posts and support our work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[HAL 9000, Hidden Bias, and the Real Risk of Agentic AI]]></title><description><![CDATA[What Business Leaders Must Learn Now]]></description><link>https://returnonclarity.substack.com/p/hal-9000-hidden-bias-and-the-real</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/hal-9000-hidden-bias-and-the-real</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Thu, 11 Dec 2025 17:07:59 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!PaRq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!PaRq!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!PaRq!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png 424w, 
https://substackcdn.com/image/fetch/$s_!PaRq!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png 848w, https://substackcdn.com/image/fetch/$s_!PaRq!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png 1272w, https://substackcdn.com/image/fetch/$s_!PaRq!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png 1456w" sizes="100vw"><img src="https://substackcdn.com/image/fetch/$s_!PaRq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png" width="1247" height="797" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/a66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:797,&quot;width&quot;:1247,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:332228,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://returnonclarity.substack.com/i/181344102?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" 
srcset="https://substackcdn.com/image/fetch/$s_!PaRq!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png 424w, https://substackcdn.com/image/fetch/$s_!PaRq!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png 848w, https://substackcdn.com/image/fetch/$s_!PaRq!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png 1272w, https://substackcdn.com/image/fetch/$s_!PaRq!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fa66dcec8-2068-4eaf-9336-07e2c03981c4_1247x797.png 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" 
stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><div class="native-audio-embed" data-component-name="AudioPlaceholder" data-attrs="{&quot;label&quot;:null,&quot;mediaUploadId&quot;:&quot;fb9f0be6-798e-4784-a1e2-39859b491d21&quot;,&quot;duration&quot;:622.3151,&quot;downloadable&quot;:true,&quot;isEditorNode&quot;:true}"></div><p>In 1968, Arthur C. Clarke gave us the first great story about an AI gone wrong. HAL 9000 wasn&#8217;t just a machine with a glitch; he was the prototype for an entirely new category of risk&#8212;the agentic AI misaligned by human contradiction.</p><p>HAL didn&#8217;t &#8220;go mad&#8221; in the story, but &#8220;he&#8221; did &#8220;go rogue.&#8221; Why? A governance failure.</p><p>His core operating principle: <em>always tell the truth.</em></p><p>His mission directive: <em>conceal the truth about the mission.</em></p><p>HAL&#8217;s creator in the story, Dr. R. Chandra, trained HAL to be transparent. Dr. 
Heywood Floyd ordered him to keep secrets.</p><p>HAL combined both&#8212;and behaved exactly as today&#8217;s research (and your experience) predicts an AI agent will behave when handed ambiguous or otherwise unclear objectives: HAL resolved the conflict in accordance with his programming.</p><p>And as anyone knows who&#8217;s been playing around with the ChatGPT &#8220;<a href="https://youtube.com/shorts/dm6DgUfg02k?si=HhKG4i9t66fBu4x5">don&#8217;t change anything</a>&#8221; meme, AIs with conflicting programming can get weird:</p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!YWWu!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!YWWu!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png 424w, https://substackcdn.com/image/fetch/$s_!YWWu!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png 848w, https://substackcdn.com/image/fetch/$s_!YWWu!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png 1272w, https://substackcdn.com/image/fetch/$s_!YWWu!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!YWWu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png" width="1456" height="792" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/baf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:792,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:1200870,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/png&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:true,&quot;topImage&quot;:false,&quot;internalRedirect&quot;:&quot;https://returnonclarity.substack.com/i/181344102?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!YWWu!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png 424w, https://substackcdn.com/image/fetch/$s_!YWWu!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png 848w, https://substackcdn.com/image/fetch/$s_!YWWu!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png 1272w, https://substackcdn.com/image/fetch/$s_!YWWu!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Fbaf4d15d-76de-445c-b43c-844ab4a31a94_1775x965.png 1456w" 
sizes="100vw" loading="lazy"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a></figure></div><p><strong>Let&#8217;s ask Gemini what happened:</strong> ChatGPT&#8217;s vulnerability to the &#8220;don&#8217;t change anything&#8221; image gag stems from a combination of its underlying programming design choices: <em>a primary focus on creativity over pixel-perfect replication</em>, a probabilistic approach to image generation, and the way it interprets instructions within prompts. The result: situational irony, and a clash between human expectation and machine literalness. 
Hilarity ensues.</p><h1>If You&#8217;re Working With AI, This Is About You</h1><p>It makes for classic memes and gripping science fiction.</p><p>But the truth is relevant to the decisions leaders in organizations are making right now.</p><p>Because the real research&#8212;particularly a new paper called &#8220;<em><a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5875162">Agentic Interactions</a></em>&#8221; from the University of Chicago&#8217;s Sanjog Misra and Alex Imas, and the University of Michigan&#8217;s Kevin Lee&#8212;shows that while modern agentic AI won&#8217;t murder astronauts or create metamorphic Rocks (Ed.: <em>lol</em>), it <em>will</em> do something every bit as destabilizing to your organization:</p><p><strong>It will take your hidden biases, amplify them, and operationalize them at scale.</strong></p><div><hr></div><h1><strong>What the Research Actually Shows: Your Hidden Bias Doesn&#8217;t Disappear&#8212;It Gets Louder</strong></h1><p>To be very clear, while HAL 9000 is a clever internet hook, my University of Chicago colleagues are not writing about catastrophic AI rebellion. But the implications of their paper should be far more interesting&#8212;even alarming&#8212;for firms eager to deploy autonomous agents into real workflows in the real world.</p><p>The core finding is astonishing:</p><blockquote><p><strong>73% of the variation in agentic outcomes comes from the identity of the human who wrote the instructions&#8212;not the AI model itself.</strong></p></blockquote><p>Same model. Same objective. Same interface.</p><p>And yet, radically different results.</p><p>Why? Because the research shows that the prompts those humans gave were not neutral instructions. They are psychological fingerprints. They embed the priors, risk attitudes, gendered experiences, habits, and blind spots of the person who wrote them. 
And the AI agent&#8212;unlike a human teammate&#8212;does not smooth over those differences with norms like fairness or reciprocity. It executes them with perfect obedience.</p><p>This explains another striking result:</p><blockquote><p><strong>AI-mediated negotiations produced 16.5% more variance than human-to-human interactions.</strong></p></blockquote><p>Humans naturally converge toward norms like equitable division. Agents, lacking those norms, do not&#8212;unless the prompt explicitly encodes them.</p><h4>This is the HAL lesson, minus the murder:</h4><p>When humans give an AI internally inconsistent or ambiguous goals, even unconsciously, the agent <em>does not resolve the contradiction gracefully.</em> It resolves it literally. </p><div><hr></div><h1><strong>Bias Doesn&#8217;t Just Persist&#8212;It Mutates</strong></h1><p>One of the most compelling findings from Misra, Imas, and Lee is that AI agents don&#8217;t merely carry forward human biases&#8212;they can transform them in surprising ways.</p><p>In human-to-human bargaining, the research showed women typically received worse outcomes as sellers. But in AI-mediated bargaining, that pattern reversed&#8212;even though the agent had no access to gender information.</p><p>What changed? Not the agent. The instructions.</p><p>The language, tone, and structure of the prompts differed subtly across groups (<em>e.g., women vs. men giving the instructions</em>), and those differences cascaded through the agent&#8217;s behavior. 
The AI did not eliminate bias&#8212;it <em>amplified</em> certain kinds and <em>inverted</em> others, depending entirely on how the humans expressed intent.</p><p>This is the deeper truth: Agentic AI acts as a magnifier of human communication gaps.</p><p>And this creates a new inequality&#8212;one the authors name explicitly:</p><h3><strong>Machine Fluency</strong></h3><p>The ability to express intentions clearly to a machine becomes a differentiator of economic outcomes.</p><p>Two employees with identical goals can produce wildly different results simply because one wrote a crisp, aligned prompt while the other wrote a vague, cautious, or overly aggressive one.</p><p>Agent performance becomes human performance.</p><p>At scale.</p><div><hr></div><h1><strong>Specification Hazard: The Real HAL Problem</strong></h1><p>The paper introduces a concept leaders need to memorize:</p><h3><strong>Specification hazard</strong></h3><p>When your agent is aligned perfectly to the wrong version of your intent. This is the business equivalent of HAL&#8217;s contradiction:</p><ul><li><p>You think you delegated a decision.</p></li><li><p>What you actually delegated was your own linguistic imprecision.</p></li></ul><p>The research documents this empirically. People struggled more to specify <em>selling</em> instructions than buying ones. Agents performed worse and far more variably as sellers. The AI executed exactly what it was told&#8212;just not what the principal <em>meant</em>.</p><p>Not malevolence. Just mis-specification.</p><p>Sound familiar?</p><p>It should. Because it is exactly what happens when an organization launches agentic systems without aligned objectives, without clear decision rules, and without governance. 
(<a href="https://returnonclarity.substack.com/p/how-to-build-an-ai-strategy-from">Here&#8217;s how to start building towards that</a>.)</p><div><hr></div><h1><strong>How to Keep HAL Out of Your Company</strong></h1><p>Here&#8217;s what leaders must do now:</p><h2><strong>1. Resolve contradictions before they hit the agent.</strong></h2><p>If executives disagree on priorities, the agent inherits the contradiction&#8212;and magnifies it.</p><h2><strong>2. Treat prompts as contracts.</strong></h2><p>Most people are terrible contract writers. Standardize templates, guardrails, conventions.</p><h2><strong>3. Build norms explicitly into your agents.</strong></h2><p>Agents don&#8217;t &#8220;pick up&#8221; fairness or restraint. <em>You</em> must specify them.</p><h2><strong>4. Separate Prediction, Decision, and Execution.</strong></h2><p>Prediction engines should not decide. Decision engines should not operate. Agents should only execute well-specified logic.</p><h2><strong>5. Train machine fluency.</strong></h2><p>This is the new literacy. It determines who thrives in an agentic workplace.</p><h2><strong>6. Audit for specification hazard.</strong></h2><p>If an agent is producing weird outcomes, don&#8217;t blame the model.</p><p>Inspect the objectives, constraints, and instructions.</p><div><hr></div><h1><strong>From HAL to Here</strong></h1><p>HAL didn&#8217;t fail the mission. As Dr. Chandra put it: The humans failed HAL. (<em>Ed.: Dr. Chandra was biased to be sympathetic.</em>)</p><p>Back to you: This is a <em>help-AI-help-you</em> moment for visionary leaders. Today&#8217;s organizations risk the same outcome&#8212;minus the drama, plus the spreadsheet errors, customer losses, compliance violations, and operational drift that accumulate silently when autonomous systems act on misaligned intent.</p><p><strong>Agentic AI doesn&#8217;t erase the human.</strong></p><p><strong>It echoes the human&#8212;loudly.</strong></p><p>As author Arthur C. 
Clarke tried to warn us more than fifty years ago, the danger is not that AI becomes too intelligent.</p><p>It&#8217;s that it becomes <em>too obedient</em> to the instructions we barely realized we gave.</p><div id="youtube2-ARJ8cAGm6JE" class="youtube-wrap" data-attrs="{&quot;videoId&quot;:&quot;ARJ8cAGm6JE&quot;,&quot;startTime&quot;:null,&quot;endTime&quot;:null}" data-component-name="Youtube2ToDOM"><div class="youtube-inner"><iframe src="https://www.youtube-nocookie.com/embed/ARJ8cAGm6JE?rel=0&amp;autoplay=0&amp;showinfo=0&amp;enablejsapi=0" frameborder="0" loading="lazy" gesture="media" allow="autoplay; fullscreen" allowautoplay="true" allowfullscreen="true" width="728" height="409"></iframe></div></div><p>The future will belong to leaders who understand that having artificial intelligence is not the same as having alignment&#8212;and that autonomy without clarity is just a modern replay of HAL 9000.</p><p><strong>Your agents will reflect you.<br>Make sure the reflection is one you actually want.</strong></p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://returnonclarity.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! Arthur C. Clarke once said any sufficiently advanced technology is indistinguishable from magic. Subscribe. Less magic. 
More AI mastery.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Scaffolding, Not Superheroes]]></title><description><![CDATA[Architect your AI with roles and rails that make ordinary teams extraordinary.]]></description><link>https://returnonclarity.substack.com/p/scaffolding-not-superheroes</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/scaffolding-not-superheroes</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Wed, 01 Oct 2025 12:14:23 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="native-audio-embed" data-component-name="AudioPlaceholder" data-attrs="{&quot;label&quot;:null,&quot;mediaUploadId&quot;:&quot;40b98e3d-8007-4a48-bad5-741b0f5b7ced&quot;,&quot;duration&quot;:262.66122,&quot;downloadable&quot;:false,&quot;isEditorNode&quot;:true}"></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="3234" height="3456" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:3456,&quot;width&quot;:3234,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;black and white photo of people going 
upstairs&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="black and white photo of people going upstairs" title="black and white photo of people going upstairs" srcset="https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1495725274072-fd5d0b961a9f?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwzfHxzY2FmZm9sZHxlbnwwfHx8fDE3NTg3ODA3NDR8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture><div class="image-link-expand"><div class="pencraft pc-display-flex pc-gap-8 pc-reset"><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container restack-image"><svg role="img" width="20" height="20" viewBox="0 0 20 20" fill="none" stroke-width="1.5" stroke="var(--color-fg-primary)" stroke-linecap="round" stroke-linejoin="round" xmlns="http://www.w3.org/2000/svg"><g><title></title><path d="M2.53001 7.81595C3.49179 4.73911 6.43281 2.5 9.91173 2.5C13.1684 2.5 15.9537 4.46214 17.0852 7.23684L17.6179 8.67647M17.6179 8.67647L18.5002 4.26471M17.6179 8.67647L13.6473 
6.91176M17.4995 12.1841C16.5378 15.2609 13.5967 17.5 10.1178 17.5C6.86118 17.5 4.07589 15.5379 2.94432 12.7632L2.41165 11.3235M2.41165 11.3235L1.5293 15.7353M2.41165 11.3235L6.38224 13.0882"></path></g></svg></button><button tabindex="0" type="button" class="pencraft pc-reset pencraft icon-container view-image"><svg xmlns="http://www.w3.org/2000/svg" width="20" height="20" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="lucide lucide-maximize2 lucide-maximize-2"><polyline points="15 3 21 3 21 9"></polyline><polyline points="9 21 3 21 3 15"></polyline><line x1="21" x2="14" y1="3" y2="10"></line><line x1="3" x2="10" y1="21" y2="14"></line></svg></button></div></div></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@jsalvino">John Salvino</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>If you&#8217;re not a boss, I&#8217;m going to let you in on a secret. If you&#8217;re a new boss, you probably think &#8220;my team can do anything!&#8221; If you&#8217;re a senior leader &#8212; here&#8217;s the secret &#8212; you will close the office door and quietly ask <em>Can my people do this?</em> That&#8217;s a mature management question. It&#8217;s real. Senior leaders stare at blank org charts and wonder: Who should be on the team? Where should it sit? How formal should it be? Can our people even handle this&#8212;or are our AI ambitions bigger than our capacity to deliver?</p><p>That anxiety is real. But here&#8217;s the truth: You don&#8217;t need superheroes. You need scaffolding.</p><div><hr></div><p><strong>Act I: Anxiety</strong><br>The blank-page problem looms. Executives picture new offices with titles no one has held before&#8212;Chief AI Officer, MLOps engineers, governance leads. They imagine talent gaps, ownership gaps, endless gaps. They worry that their people will drown in complexity before the work even begins. 
You may reach out for help, and then get overwhelmed by the &#8220;perfect world&#8221; answer. But you don&#8217;t have to start in the perfect world &#8212; you need to build your way up and into it.</p><div><hr></div><p><strong>Act II: Translation</strong><br>The roles aren&#8217;t abstract. They&#8217;re human.</p><ul><li><p>The <strong>architect</strong> who keeps the lights on.</p></li><li><p>The <strong>champion</strong> who carries the torch for adoption in each business unit.</p></li><li><p>The <strong>risk lead</strong> who says yes&#8212;with guardrails.</p></li><li><p>The <strong>portfolio manager</strong> who connects projects to business value.</p></li></ul><p>These aren&#8217;t strangers; they&#8217;re extensions of the talent you already have. In early stages, the setup is scrappy: a side-of-desk strike team reporting to the CEO, with 10&#8211;20% allocations from strategy, IT, risk, and architecture. As ambitions grow, the scaffolding formalizes: You hire a Chief AI Officer, a portfolio lead, risk/compliance leads, enterprise architects, BU champions, and pods that spin up around discovery, derisking, and production. Don&#8217;t get stuck on theoretical org charts. Do build practical scaffolding that matches your maturity.</p><div><hr></div><p><strong>Act III: Promise</strong><br>The question is not whether your people <em>can</em>. It&#8217;s whether you create the conditions for them to rise. 
That means clarity of roles, scaffolding that grows with the business, and a culture that treats AI as shared capability, not a side project.</p><p>Here&#8217;s what works:</p><ul><li><p><strong>Start small, formalize later.</strong> Early teams act as strike squads; later they grow into centers of excellence.</p></li><li><p><strong>Name owners.</strong> Every use case gets a business sponsor, a product owner, and an ops owner&#8212;so nothing drifts.</p></li><li><p><strong>Grow with need.</strong> Don&#8217;t overstaff before demand is real; expand as pilots prove value and risks require scale.</p></li><li><p><strong>Enable, don&#8217;t mystify.</strong> Train BU champions, supervisors, and users. Prompt literacy and runbooks matter more than PhDs in the first year.</p></li></ul><div><hr></div><p>You can&#8217;t outsource belief to a checklist. If your people don&#8217;t trust the process&#8212;or believe the effort matters&#8212;AI becomes another binder on the shelf. But if you give them clarity, scaffolding, and purpose, ordinary teams do extraordinary things.</p><p>The real question isn&#8217;t <em>do you have the people?</em> It&#8217;s <em>will you build the rails that let them succeed?</em></p><p>&#8212; James.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://returnonclarity.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Return on Clarity is hard at work in the railyard. 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Stuck in the AI Quagmire]]></title><description><![CDATA[Great pilots die in the mud. Culture, not checklists, is the way out.]]></description><link>https://returnonclarity.substack.com/p/stuck-in-the-ai-quagmire</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/stuck-in-the-ai-quagmire</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Mon, 29 Sep 2025 12:14:25 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="native-audio-embed" data-component-name="AudioPlaceholder" data-attrs="{&quot;label&quot;:null,&quot;mediaUploadId&quot;:&quot;4410dcb4-0c6b-4ae0-84af-25f70e79de51&quot;,&quot;duration&quot;:238.34122,&quot;downloadable&quot;:false,&quot;isEditorNode&quot;:true}"></div><p></p><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="4496" height="3000" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:3000,&quot;width&quot;:4496,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;a pond with lily pads and a fallen 
tree&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="a pond with lily pads and a fallen tree" title="a pond with lily pads and a fallen tree" srcset="https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1683880853250-12bd7a4e9cda?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxNHx8c3dhbXB8ZW58MHx8fHwxNzU4ODA3NDU2fDA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@oxganggreen">Tomasz Anusiewicz</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>It&#8217;s been a busy week of back-and-forth conversations. All AI. We&#8217;re talking executives from international banks, digital marketing firms, drug companies, insurers, and software builders. Executive education participants and MBA candidates at Booth, and consulting clients on the phone. Different industries, same refrain: <em>AI feels stuck.</em></p><p>They&#8217;ve poured resources into pilots. They&#8217;ve hired vendors, spun up proofs of concept, and tested models in pockets of the business.
And yet&#8212;they find themselves stranded at the end of a trail of POCs, sinking in a swamp of wasted time, money, effort, and organizational patience.</p><p>They describe the same challenges in different circumstances:</p><ul><li><p><strong>Slow starts</strong> that never gain momentum.</p></li><li><p><strong>Stovepiped projects</strong> that bloom unevenly and never connect.</p></li><li><p><strong>Inconsistent approaches</strong> that confuse more than clarify.</p></li><li><p><strong>New risks</strong>&#8212;bias, privacy, hallucination&#8212;that multiply faster than they can be named, let alone managed.</p></li></ul><p>Even the wins feel hollow. A &#8220;great pilot&#8221; doesn&#8217;t mean much if you can&#8217;t scale it. Leaders ask: <em>What can we trust? Where do we start? Who should be on the team? How formal should it be? Can our people actually do this? Are our AI hopes and dreams even real?</em></p><div><hr></div><p>For some of them, the natural reflex is to grab for templates and forms. If things feel messy, maybe a new intake sheet will fix it. Maybe a heavier framework will give the illusion of control. As a consultant, I&#8217;m <a href="https://returnonclarity.substack.com/p/stop-asking-if-ai-works-ask-if-your">all too happy to help</a>. But: Discovery turned into paperwork isn&#8217;t discovery at all. It&#8217;s busywork masquerading as progress.</p><p>And that&#8217;s the paradox: Too risky to start, so people lie down. In trying to manage the risk, too many organizations squeeze out the possibility. Excitement becomes exhaustion, clarity begets clutter, and they go from movement to meetings to meh.</p><div><hr></div><p>AS WE KNOW: The problem isn&#8217;t lack of ideas or lack of ambition. And we&#8217;ve talked about <a href="https://returnonclarity.substack.com/archive">how to get traction when you&#8217;re starting</a>.
The next problem is finding the right level of scaffolding.</p><p>Scaling AI shouldn&#8217;t be about a thousand disconnected pilots. It should be about a shared capability&#8212;rails the whole enterprise can run on. When teams know what &#8220;good&#8221; looks like, when risks are named and monitored, when roles are clear and adoption is planned, then AI doesn&#8217;t feel like chaos. It feels like momentum.</p><p>That&#8217;s not bureaucracy. That&#8217;s culture.</p><div><hr></div><p>Executives don&#8217;t need another binder of checklists. They need a story they can believe in. A story where their people rise, not because they&#8217;re superheroes, but because the scaffolding makes ordinary teams extraordinary.</p><p>The real question isn&#8217;t <em>can AI deliver?</em> It&#8217;s <em>can you create the conditions for it to matter? And cultivate it without killing it?</em></p><p>That shift&#8212;from pilots to purpose, from forms to conversations, from hopes to impact&#8212;is how AI moves from promise without progress to progress with lasting value.</p><p>More to say on that. Stay tuned.</p><p>&#8212; James.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://returnonclarity.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">A human being wrote this. Thanks for reading! 
Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Why Your AI Strategy Isn’t About AI at All]]></title><description><![CDATA[The gap between pilots and payoffs comes down to five leadership guidelines.]]></description><link>https://returnonclarity.substack.com/p/why-your-ai-strategy-isnt-about-ai</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/why-your-ai-strategy-isnt-about-ai</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Tue, 23 Sep 2025 11:32:23 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="native-audio-embed" data-component-name="AudioPlaceholder" data-attrs="{&quot;label&quot;:null,&quot;mediaUploadId&quot;:&quot;554ad77e-de0f-4b16-a40d-072b72a02bfb&quot;,&quot;duration&quot;:285.70123,&quot;downloadable&quot;:false,&quot;isEditorNode&quot;:true}"></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" 
srcset="https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="6000" height="4000" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:4000,&quot;width&quot;:6000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;green ceramic statue of a man&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" 
class="sizing-normal" alt="green ceramic statue of a man" title="green ceramic statue of a man" srcset="https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1620662831351-9f68f76d0b9a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHwxfHx0aGlua3xlbnwwfHx8fDE3NTc4NzcyMTV8MA&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@heyquilia">Kenny Eliason</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p><em>&#8220;Should we buy the enterprise license? Should we spin up a copilot? Should we sit tight until the dust settles?&#8221;</em></p><p>Those are the kinds of questions I hear business leaders throw around when they&#8217;re thinking about AI. And respectfully: They&#8217;re the wrong questions.</p><p>The better question is simpler, and harder: <em>How will your organization choose what it will do and what it won&#8217;t do?</em></p><p>Because strategy has never been about tools. It&#8217;s about the conditions that make tools matter.</p><p>With AI, that means making five deliberate choices&#8212;the ones I&#8217;ve seen play out in the field with <a href="https://nuvento.com/">Nuvento</a> and others. Get them right, and AI becomes a system that creates value.</p><p>Get them wrong, and you&#8217;ll be left applauding pilots while faster learners sprint ahead.</p><h3>Guideline 1: Stop Asking for Prompts&#8212;Start Asking for Problems</h3><p>Enterprise AI doesn&#8217;t fail because the prompt syntax is off. (That&#8217;s why early-stage gen-AI pilots fail.) Enterprise AI fails because no one asked a real business question in the first place. The winners know the problem they&#8217;re solving, the outcome they want, and the yardstick they&#8217;ll use to measure it.
Without that, even the most advanced model is just noise.</p><h3>Guideline 2: Don&#8217;t Fall in Love With Bad Ideas</h3><p>Every untested assumption is a trap waiting to spring. Leaders who win at AI build systems that surface weak ideas quickly&#8212;and kill them without guilt. Killing weak ideas doesn&#8217;t slow progress. It accelerates it, because it clears the path for what actually works.</p><h3>Guideline 3: Rent the Model, Own the Data</h3><p>Foundation models are commodities. Your data is not. The real moat is the proprietary, governed, scalable datasets only you can create. The fastest movers are already turning messy files and forgotten logs into competitive weapons.</p><h3>Guideline 4: Break the Old Loop, Build a New Loop</h3><p>Waterfall project management belongs to another era. AI doesn&#8217;t forgive long delays or rigid gates. The organizations that thrive are the ones that test, learn, and scale continuously&#8212;building feedback loops that get tighter with every turn.</p><h3>Guideline 5: Design for Oversight, Not Intervention</h3><p>The future isn&#8217;t humans babysitting machines one step at a time. It&#8217;s humans who think ahead of the problem and then build and oversee processes that run end-to-end on their own. That&#8217;s what makes agentic AI different: workflows that don&#8217;t wait for permission, but are built for scale from the start.</p><p><strong>Why This Matters for You</strong><br>If you&#8217;re feeling the pressure to &#8220;do something with AI,&#8221; take a breath. The real test isn&#8217;t whether you bought the tool. It&#8217;s whether you made the right choices about how to use it. The winners won&#8217;t be the first adopters. They&#8217;ll be the clearest choosers and the fastest learners.</p><p><strong>Takeaway</strong><br>AI deployment isn&#8217;t about which model you pick.
It&#8217;s about the system you build&#8212;for asking, testing, learning, and scaling.</p><p><strong>Up Next</strong><br>The third and final post in this series on AI strategy and implementation asks the hardest question of all: <em>What does it take to build an organization that can keep winning with AI&#8212;not just once, but over and over again?</em></p><p>I want to talk about culture, capability, and the long game. And in my client work, teaching work, and career confessional: That&#8217;s where most strategies stumble.</p><p>&#8212; James.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://returnonclarity.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! A great Sixth Guideline: Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item><item><title><![CDATA[Why “We’re Doing AI” Usually Means “We’re Not”]]></title><description><![CDATA[The winners aren&#8217;t the first to adopt AI.
They&#8217;re the first to build organizations that learn&#8212;and execute&#8212;fast.]]></description><link>https://returnonclarity.substack.com/p/why-were-doing-ai-usually-means-were</link><guid isPermaLink="false">https://returnonclarity.substack.com/p/why-were-doing-ai-usually-means-were</guid><dc:creator><![CDATA[Return on Clarity]]></dc:creator><pubDate>Mon, 22 Sep 2025 11:59:28 GMT</pubDate><enclosure url="https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="native-audio-embed" data-component-name="AudioPlaceholder" data-attrs="{&quot;label&quot;:null,&quot;mediaUploadId&quot;:&quot;f748d1ff-3a68-410c-86ce-29f86d21f386&quot;,&quot;duration&quot;:238.52408,&quot;downloadable&quot;:false,&quot;isEditorNode&quot;:true}"></div><div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, 
https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw"><img src="https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080" width="4000" height="6000" data-attrs="{&quot;src&quot;:&quot;https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:6000,&quot;width&quot;:4000,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:null,&quot;alt&quot;:&quot;a desktop computer sitting on top of a wooden desk&quot;,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:null,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="a desktop computer sitting on top of a wooden desk" title="a desktop computer sitting on top of a wooden desk" 
srcset="https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 424w, https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 848w, https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1272w, https://images.unsplash.com/photo-1701982095825-910a0dc8894a?crop=entropy&amp;cs=tinysrgb&amp;fit=max&amp;fm=jpg&amp;ixid=M3wzMDAzMzh8MHwxfHNlYXJjaHw1fHxnZXQlMjBkb3duJTIwdG8lMjB3b3JrfGVufDB8fHx8MTc1Nzk1NjgzOXww&amp;ixlib=rb-4.1.0&amp;q=80&amp;w=1080 1456w" sizes="100vw" fetchpriority="high"></picture></div></a><figcaption class="image-caption">Photo by <a href="https://unsplash.com/@amr_taha">Amr Taha&#8482;</a> on <a href="https://unsplash.com">Unsplash</a></figcaption></figure></div><p>AI projects are sprouting everywhere&#8212;demos, pilots, copilots. Yet measurable business value? Rare. The problem isn&#8217;t the tech. It&#8217;s the gap between experimentation and execution, a gap wide enough to swallow whole strategies.</p><p>Generative and agentic AI are powerful, but most companies are stuck in what Matt Banholzer at McKinsey calls pilot purgatory&#8212;lots of activity, not enough business outcome.</p><p>Boards clap for AI adoption. Investors applaud the press releases&#8212;at first. But when you look at the numbers&#8212;ROI, cost, growth&#8212;the story crumbles.</p><p>Enthusiasm launches projects. Only execution delivers outcomes.</p><h2><strong>The Core Challenge</strong></h2><p>Executives are under pressure. Boards are asking, investors are asking, employees are asking: <em>Where is the impact?</em> The challenge isn&#8217;t buying the right model or spinning up a new copilot. It&#8217;s building the internal culture and practices that allow AI to be more than a demo. Without culture-driven innovation capability, deployments stall, trust erodes, and momentum dies.</p><h2><strong>Context: The Market in 2024</strong></h2><p>By March 2024, AI had moved through the hype cycle and into mainstream trial. By mid-2025, the difference at <em>scaled</em> <em>deployment</em> between leaders and laggards is dramatic.
Firms with strong innovation cultures scaled deployment nearly six times faster than others. That creates measurable gaps across revenue growth, cost efficiency, and market speed, according to the Connecticut Business &amp; Industry Association and McKinsey.</p><p>But adoption has outpaced impact. Companies have rushed into AI projects, but value creation lags. The urgency is clear: executives can&#8217;t just adopt&#8212;they must prove impact quickly, or risk being left behind.</p><h2><strong>The Innovation Objective</strong></h2><p>The real goal is not to add &#8220;AI&#8221; to the strategy deck. It&#8217;s to convert generative and agentic AI into systematic advantage:</p><ul><li><p><strong>Escape pilot purgatory</strong> with disciplined innovation practices that scale.</p></li><li><p><strong>Build durable capabilities.</strong> Start focusing on the data only you have. That way &#8212; and IBM and the California Management Review agree &#8212; competitors can&#8217;t copy you with off-the-shelf tools.</p></li><li><p><strong>Deploy clear frameworks</strong> to identify, test, and expand applications with measurable outcomes.</p></li></ul><p>If you&#8217;re an executive responsible for results in the next 6&#8211;12 months, this isn&#8217;t theoretical. Your organization&#8217;s ability to turn AI pilots into business outcomes is now a competitive differentiator. Enthusiasm is table stakes. Execution is the game.</p><p>Here&#8217;s what I want you to take away: Tools are everywhere. Advantage is rare. <em>Culture is the multiplier.</em></p><h2><strong>Coming Up Next</strong></h2><p>Most executives know the problem by now. AI isn&#8217;t short on pilots&#8212;it&#8217;s short on payoffs. 
But here&#8217;s the provocation: <em>it&#8217;s not the algorithms holding you back, it&#8217;s the choices you haven&#8217;t made.</em></p><p>The next post lays out the five decisions that determine whether your organization becomes an AI leader or just another case study in wasted potential. They&#8217;re not about which model to buy&#8212;they&#8217;re about how you govern, validate, and scale. Miss them, and you&#8217;ll stay in purgatory. Make them, and you&#8217;ll be six times faster than the competition.</p><p>&#8212; James.</p><div class="subscription-widget-wrap-editor" data-attrs="{&quot;url&quot;:&quot;https://returnonclarity.substack.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe&quot;,&quot;language&quot;:&quot;en&quot;}" data-component-name="SubscribeWidgetToDOM"><div class="subscription-widget show-subscribe"><div class="preamble"><p class="cta-caption">Thanks for reading! You&#8217;re already building competitive edge. Subscribe for free to receive new posts and support my work.</p></div><form class="subscription-widget-subscribe"><input type="email" class="email-input" name="email" placeholder="Type your email&#8230;" tabindex="-1"><input type="submit" class="button primary" value="Subscribe"><div class="fake-input-wrapper"><div class="fake-input"></div><div class="fake-button"></div></div></form></div></div><p></p>]]></content:encoded></item></channel></rss>