The Hardest Part Right Now Is Convincing Managers AI Isn't Needed Everywhere
June 30, 2025

Lately, one of the biggest challenges in the software industry isn’t technical; it’s ideological. We’re in the middle of a tidal wave of hype where every other meeting, pitch deck, or press release includes the phrase “AI-powered” as if it’s a golden stamp of innovation. And the hardest part? Convincing managers that AI doesn’t need to be jammed into every crevice of our tech stack.
The AI Overdose
Let me be clear: AI is incredible. Tools like GitHub Copilot, ChatGPT, and CodeWhisperer have dramatically improved how we write and review code. They accelerate mundane tasks, unblock early scaffolding, and help explain obscure APIs. But we’ve swung too far in the other direction.
Every product now must have an AI angle. Every team is tasked with “leveraging AI,” even when it makes no practical sense. This has led to bloated roadmaps, confused priorities, and half-baked features nobody asked for—all in the name of chasing a trend.
Marketing ≠ Reality
You’ve probably seen the claims: “30% of our code is written by AI.” Sounds impressive, right? But peel back the layers and what this really means is:
100% of the code is written by human developers using AI tools.
That’s not semantics; it’s an important distinction. AI doesn’t autonomously decide architecture, design robust systems, or handle context-specific trade-offs. It augments, it assists, it proposes. It doesn’t own responsibility. That still lies squarely with us, human developers.
The Misguided Push Away from Search Engines
Another troubling trend is the move to replace traditional search engines with AI chat interfaces. Some companies are trying to rebuild the search experience entirely around large language models.
Here’s the problem: search is not broken.
People use search engines because they want reliable sources, links, context, and comparison, not just a summarized answer with no citations and no way to dig deeper. AI tools often hallucinate, paraphrase incorrectly, and present fiction with complete confidence. That’s not better; it’s dangerous.
Replacing search with a chatbot creates:
- Lack of verifiability: Where is this information coming from?
- Loss of depth: No ability to explore multiple perspectives.
- Overconfidence bias: Users believe the AI “must” be right.
- Missed learning opportunities: Search teaches you how to evaluate sources; AI can spoon-feed answers without teaching anything.
The result? A generation of users being trained to trust black boxes over actual understanding.
The Balance We Actually Need
We don’t need to scrap AI tools. We just need balance and pragmatism. Before introducing an “AI module” or launching a new “AI feature,” ask:
- Does this genuinely improve the user experience?
- Is the AI solving a real problem, or just generating noise?
- Do we understand the risks (technical debt, bias, opacity) that come with the tech?
- Would a simple rules-based system do the job better and more reliably?
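To make that last question concrete, here’s a minimal sketch of what a rules-based alternative can look like. The scenario (routing support tickets to teams by keyword) and every name in it are hypothetical, but the point is real: a deterministic rules table is auditable, testable, and explainable in a way a model isn’t.

```python
# Hypothetical example: routing support tickets by keyword rules
# instead of an LLM classifier. Deterministic, fast, and every
# routing decision can be traced back to an explicit rule.

ROUTING_RULES = [
    ({"refund", "charge", "invoice"}, "billing"),
    ({"crash", "error", "bug"}, "engineering"),
    ({"password", "login", "2fa"}, "account-security"),
]

def route_ticket(text: str) -> str:
    """Return the team a ticket should go to; first matching rule wins."""
    words = set(text.lower().split())
    for keywords, team in ROUTING_RULES:
        if words & keywords:  # any rule keyword present in the ticket?
            return team
    return "general-triage"
```

If your categories are this crisp, a dozen lines like these beat an AI pipeline on cost, latency, and debuggability; the AI option only starts to earn its complexity when the rules stop fitting in a table.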
Sometimes the best answer is to not use AI. And that’s okay.
AI is not magic. It’s not a universal fix. And it’s definitely not a substitute for clear thinking, solid engineering, or good product sense. The hard part isn’t convincing engineers of this; they already know. The hard part is helping non-technical decision-makers see past the marketing fluff and understand where AI actually fits.
So yes, AI is the future. But not every problem needs a neural network, and not every app needs a chatbot.
Let’s keep our tools sharp, our goals clear, and our hype meters calibrated.