How to lead when AI has your team on edge
There is a particular kind of unease settling across workplaces right now. It's not the usual pressure of deadlines or quarterly targets. It's something more existential, a creeping question that sits beneath the surface of every conversation about productivity, efficiency, and the future: What does AI mean for me? For my team? For my career?
As a leader, you're expected to have answers. You're meant to be the one steering the ship through this fog, projecting confidence about a destination you can't quite see. Meanwhile, half your team is quietly catastrophising, some are in denial, and others are frantically up-skilling, unsure whether they're preparing for opportunity or obsolescence.
This is AI anxiety, and it's becoming the defining leadership challenge of our time.
The weight of collective uncertainty
What makes AI anxiety particularly corrosive is its pervasiveness. Unlike a restructure or a difficult quarter, this isn't a discrete event with a beginning and end. It's ambient. It infiltrates coffee conversations, organisational strategy and change initiatives, performance reviews, and leadership training alike.
I spoke recently with a COO who described the atmosphere in her team as "perpetually unsettled." "Everyone is watching what AI can do," she said, "and mentally calculating whether their role will exist in five years. The irony is that the anxiety itself is making them less effective, the very thing that might make them vulnerable."
This is the cruel paradox of AI anxiety: the cognitive resources we need in order to adapt, to learn new tools and think creatively about working alongside AI, are precisely the resources that chronic anxiety depletes.
What's actually happening in your team's brains
When people face uncertainty about their fundamental relevance, their nervous systems respond as if facing a physical threat. Cortisol floods the system. The prefrontal cortex, responsible for complex reasoning, creativity, and learning new skills, gets hijacked by more primitive survival responses.
This means that at the exact moment your team needs to be learning, adapting, and thinking innovatively about how to integrate AI into their work, their brains are increasingly geared toward threat detection and self-protection.
You'll see this play out in predictable ways: people becoming territorial about their expertise, resistant to experimentation, or paralysed by perfectionism. Some will disengage entirely, mentally checking out while physically present. Others will overwork, trying to outrun an invisible competitor. Still others will outsource crucial relationship-building activities, such as important conversations that require genuine care and concern, to a quick AI prompt and response.
The temptation to jump to fixing it – and why that backfires
When you see your team struggling, the instinct is to solve the problem. Roll out training programs. Share optimistic articles about AI as a tool, not a replacement. Deliver reassuring messages about the organisation's commitment to its people.
These responses aren't wrong, but they often miss the mark because they treat AI anxiety as an information deficit rather than an emotional and existential challenge.
I've watched well-meaning leaders make this mistake repeatedly. They provide facts and frameworks while their teams nod along, then return to their desks still carrying the same gnawing dread. The anxiety isn't resolved because it was never really about lacking information in the first place.
What people are grappling with is something deeper: uncertainty about their identity, their value, their place in a world that's shifting beneath them. You can't PowerPoint your way through that.
A different approach: leading with presence, not promises
The most effective leaders I've observed navigating this terrain share a common quality: they're not trying to eliminate uncertainty. They're modelling how to move through it.
A managing partner at a consulting firm described her approach: "I stopped pretending I knew what the next five years would look like. Instead, I started being honest that I was learning alongside everyone else. That seemed to reduce everyone's anxiety more than any reassurance I'd previously offered."
This tracks with what we know about psychological safety and leadership. When leaders acknowledge complexity rather than oversimplifying it, they create space for others to do the same. The message becomes: We can hold uncertainty without being undone by it.
Five approaches that actually help
Over the past couple of years, I've worked with leaders across professional services, consulting, and academia on this challenge. Here are some things that consistently make a difference:
Normalise the discomfort. Name what's happening. "There's a lot of uncertainty about AI right now, and it's natural for that to feel unsettling" is a more useful starting point than pretending everything is fine. Acknowledging reality doesn't amplify anxiety, it creates psychological relief by removing the additional burden of having to pretend.
Shift the frame from threat to exploration. AI anxiety often comes from a passive stance, waiting to see what happens to us. Leaders can help teams reclaim agency by framing AI as something to actively explore rather than passively await. This might mean protected time for experimentation, permission to fail, or simply asking: "What would you want to try if you weren't worried about getting it wrong?"
Watch for the uncharacteristically quiet ones. The team members making the most noise about AI aren't necessarily the ones struggling the most. Pay attention to those who've gone silent, who've stopped contributing ideas, or whose work quality has subtly declined. These withdrawal behaviours may signal someone who's privately struggling.
Model your own learning. Be visible about your own process of figuring out AI. Share what you're experimenting with, what's confusing you, what you've gotten wrong. This does two things: it normalises the learning curve, and it demonstrates that seniority doesn't exempt anyone from the adaptation required. It also reinforces psychological safety, which is integral to lowering anxiety and unlocking creativity, innovation, and a "can do" spirit.
Create containers for honest conversation. Teams need space to voice concerns without those concerns being immediately solved, dismissed, or turned into action items. Sometimes people need to articulate their fears before they can move past them. A well-facilitated team conversation about AI, one that allows for genuine expression rather than corporate talking points, can be remarkably cathartic.
The opportunity in the anxiety
Here's what I find genuinely hopeful: the teams that learn to navigate AI anxiety well are developing capabilities that extend far beyond this particular challenge.
They're building tolerance for discomfort and ambiguity. They're learning to act without certainty. They're developing the emotional resilience to stay creative under pressure. These capacities will serve them regardless of where AI goes from here.
The goal isn't to eliminate AI anxiety – some level of adaptive concern is appropriate given the magnitude of change underway. The goal is to keep your team's response in the zone where it mobilises rather than paralyses, where it prompts learning rather than withdrawal.
The ethical dimension to uncertainty
This isn't just about team performance. If you're leading people through a period of genuine vocational uncertainty, you have an ethical responsibility to do so with care.
AI anxiety, left unaddressed, doesn't just affect productivity metrics. It affects people's sleep, their relationships, their sense of self-worth. It can tip into clinical anxiety or depression. The leaders who treat this as merely a change management challenge are missing something important about their duty of care.
How you lead through this moment will be remembered. Your team is watching not just what you say about AI, but how you hold space for their very human responses to it.
Final thought
Your team doesn't need you to predict the future of AI. They need you to help them stay present, curious, and connected while that future unfolds.
The strongest teams won't be those that avoided AI anxiety altogether. They'll be the ones whose leaders helped them feel it, name it, and move through it together.