WEST STACK AI

The AI Adoption Catch-22: You Need People to Implement the Thing That Threatens to Replace Them

There's a paradox at the center of every AI implementation I've been part of.

To implement AI at a company, you need the help of the people whose jobs AI might change. You need their institutional knowledge, their workflow expertise, their willingness to show you where the inefficiencies are, and their patience as you build systems that automate parts of what they do.

And the entire time, the news cycle is telling them: AI will replace you.

How's that for a pitch? "Hey, could you spend the next three months helping us build something that might make your role obsolete? Great, thanks."

This is the catch-22 of enterprise AI adoption, and I believe it's one of the biggest reasons we're not seeing the level of adoption the technology warrants. The tools are capable. The use cases are clear. The ROI is there. But the humans you need to make it work have every rational reason to resist.

The Resistance Is Rational

I see this firsthand working with financial services firms. The technology works. The pilots succeed. But somewhere between "successful proof of concept" and "company-wide rollout," things stall. Not because of technical limitations. Because of people.

And I don't blame them. If every time you opened LinkedIn, someone was telling you that your profession has 75% AI task exposure and your job is going to be automated, would you enthusiastically help your company adopt AI? Or would you quietly slow-walk the implementation, protect your turf, and hope the whole thing blows over?

The resistance isn't irrational. It's self-preservation.

The Messaging Problem

Here's what makes it worse: the people shaping the narrative about AI and jobs are the people who benefit most from AI adoption.

Anthropic just published a report mapping which jobs are most "exposed" to AI. But Anthropic is also the company that sells that AI. When the company selling the technology publishes a study saying 75% of programming tasks and 67% of data entry work can be handled by AI, there's an inherent credibility problem. The heads of AI companies have a financial incentive to make AI sound as capable and inevitable as possible. That doesn't mean they're wrong. But it does mean they can't be the only voices in the conversation.

And when employees at companies beginning to adopt AI see these reports — from the companies building the tools — the message they hear isn't "transformation." It's "you're next."

The Leadership Failure

I haven't seen enough leaders navigate this humanely. Not even close.

What would humane look like? It would look like honesty. "Your job will likely change. We don't know exactly how, but there will be opportunity for you on the other side of that change, if you help. Your knowledge of how this business actually works is the thing AI doesn't have. And we value that." And then back it up. Reward them. With promotions, with pay raises, with praise.

The Electricity Analogy

The doomsday perspective on AI is short-sighted.

If AI becomes a utility, with the major LLMs essentially interchangeable (which is mostly true already), the closest historical parallel is electricity. And look at what electricity spawned: entire industries that didn't exist before. Refrigeration. Telecommunications. Broadcasting. Computing. The internet. AI. 🤔

Did electricity eliminate jobs? Yes. Did it create vastly more than it destroyed? Overwhelmingly yes. But that transition wasn't instant, and it wasn't painless. People lost livelihoods. Industries collapsed. New ones emerged. The transition took decades and required massive investment in infrastructure, education, and new institutions.

AI will likely follow a similar path. But we're in the early phase right now — the phase where the disruption is visible and the new opportunities are still forming. If you're a worker watching this unfold, the anxiety makes sense. If you're a leader, your job is to help people through the transition, not pretend it isn't happening.

What I'd Tell Executives

If you're leading an AI initiative at your company, here's what I've seen work:

Start with problems, not technology. Don't announce "we're adopting AI." Announce "we're solving this specific problem that everyone hates." When AI is the solution to something painful, people welcome it instead of fearing it.

Make the implementers the heroes. The people who help you build and deploy AI should be visibly rewarded — promoted, recognized, given new titles that reflect their expanded role. If helping implement AI is career-advancing rather than career-ending, resistance evaporates.

Be honest about uncertainty. Nobody knows exactly how this plays out. Saying "I don't know, but I want us to figure it out together" is more trustworthy than a polished narrative about augmentation that everyone suspects is corporate spin.

Invest in your people, not just your tools. For every dollar you spend on AI tooling, spend something on helping your team learn to work alongside it. The companies that treat AI adoption as a technology procurement exercise will fail. The ones that treat it as a workforce development exercise will win.

Stop letting AI companies control the narrative. Anthropic, OpenAI, Google — they all have a financial interest in making AI sound inevitable. Their research is useful, but it's not neutral. Your employees know this. Acknowledge it. Build your own perspective on what AI means for your specific business, and communicate that instead of forwarding industry reports written by people who profit from the conclusions.

The Opportunity

I see enormous opportunity in AI. Not in spite of the disruption, but because of the gap between what the technology can do and how organizations are actually adopting it. That gap exists largely because of the human dynamics I've described. The companies and leaders who figure out how to navigate the catch-22 — who earn their people's trust through the transition rather than bulldozing through it — will have a massive advantage.

AI is a useful, transformative technology. But more will be revealed.

Thanks, Franklin.
