Agents should help, not replace
Every day, I open up my AI tools and get to work. I use AI constantly: for writing, coding, research, brainstorming, and a dozen other things. But here's the thing: AI doesn't do my work for me. It helps me do my work better. That distinction matters more than most people realize. And if you're using AI the other way around, letting it drive while you sit in the passenger seat, you might be setting yourself up for a problem.
The intern who knows everything and nothing
I like to think of AI as hiring an incredibly well-read intern. This intern has consumed virtually every public text on the internet. They can write fluently, summarize complex topics, and produce polished output on demand. On paper, they look amazing. But here's the catch: they don't know your business. They don't know your customers. They don't understand the subtle reasons you made a particular design decision last quarter, or why your team chose one architecture over another. They have world knowledge, but they lack your knowledge. Ethan Mollick, a professor at Wharton, popularized this "AI as intern" framing back in 2023. The analogy still holds. An intern can be incredibly useful, but only when someone with experience is guiding them, reviewing their work, and pointing them in the right direction.
Steering is the skill that matters
When I use AI, I'm not passively accepting what it gives me. I'm steering it. I bring context, judgment, and taste to every interaction. I know what a good output looks like for my specific situation, so I can tell when the AI is off track and correct course. This is where domain expertise becomes a multiplier. Research from MIT has shown that financial analysts using AI tools make better forecasts than either humans or AI working alone, and their effectiveness increases over time as they learn to combine the technology with their own knowledge. The same pattern shows up in programming, writing, and design. People who already understand a domain get dramatically more value from AI than those who don't. It's like the difference between a copilot and an autopilot. A copilot assists you while you remain in control, making you faster and more capable. An autopilot takes over entirely. For most knowledge work right now, the copilot model wins. As a tech lead at one Microsoft customer put it: "Our mantra is 'Copilot, not autopilot.'"
Why this means you can't be replaced
Here's what makes this exciting rather than scary: if you're the one doing the steering, AI makes you more valuable, not less. PwC's 2025 Global AI Jobs Barometer found exactly this. Analyzing close to a billion job ads across six continents, they concluded that AI can make people more valuable, not less, even in the most highly automatable jobs. The key factor? Whether the human brings expertise that shapes how AI is used. BCG's research tells a similar story. Their analysis projects that over the next two to three years, 50% to 55% of jobs in the US will be reshaped by AI, but reshaped is not the same as replaced. Most roles will remain. They'll just change substantially. The people who thrive will be those who learn to work with AI effectively. A Harvard Business School study categorized over 19,000 job tasks across more than 900 occupations and found that occupations with the highest "augmentation potential" tend to involve social skills, hands-on technical expertise, and human judgment. In other words, exactly the kind of work where a human needs to steer the AI rather than hand over the keys.
The real warning sign
If your AI agent is doing your entire job without you adding meaningful input, that should concern you. Not because AI is too powerful, but because it suggests the work itself might not require the kind of judgment, creativity, or domain knowledge that makes human involvement essential. That's not a reason to panic. It's a signal to reflect. What unique value do you bring? What do you know that the AI doesn't? If the answers feel thin, it might be time to either deepen your expertise or pivot toward work where your human judgment is the critical ingredient. The companies getting this right are the ones choosing augmentation over automation. A recent Harvard Business Review article argues that organizations focused on growing the top line through AI-augmented innovation will outperform those simply cutting headcount through automation. The difference isn't just ethical, it's strategic.
How I think about it
My approach is simple. I treat AI as the most capable assistant I've ever had, but an assistant nonetheless. I set the direction. I make the calls. I apply the context that only I have. The AI handles the heavy lifting, the first drafts, the research summaries, the boilerplate, so I can focus on the parts that actually require me. This is the model that works. Not AI replacing humans, but AI amplifying what humans can do. The intern doesn't replace the senior employee. The intern makes the senior employee more productive. And that's why building your own expertise still matters more than ever. The better you are at your craft, the more effectively you can steer AI, and the harder you become to replace.
References
- "On-boarding your AI Intern" by Ethan Mollick, One Useful Thing
- "Enhance or Eliminate? How AI Will Likely Change These Jobs," Harvard Business School Working Knowledge
- "Why Companies That Choose AI Augmentation Over Automation May Win in the Long Run," Harvard Business Review