Why 95% of AI pilots fail: The 3 leadership skills managers need to drive AI adoption

- Employees are significantly more likely to use and trust AI when managers actively support adoption, yet only 28% of employees say their manager encourages AI use.
- Successful leaders must combine technical understanding, emotional intelligence, and coaching capability to guide teams through AI-driven change.
- Companies that involve managers early, balance governance with experimentation, and build hands-on AI leadership capabilities are far more likely to move beyond failed pilots and achieve scalable transformation.
The leadership gap behind failed AI transformation
The headline number is hard to ignore: 95% of AI pilots failed to deliver returns this year. The simplest reading is a technology problem. The data tells a different story.
When managers actively champion and drive AI adoption, employees are:
- 2.1x more likely to use AI frequently
- 6.5x more likely to find AI tools useful for their work
- 8.8x more likely to say AI helps them do their best work
The pattern is consistent: AI ROI tracks managerial behaviour, not tool sophistication. The companies whose pilots succeed are not the ones with the best vendors. They are the ones whose managers know how to lead AI adoption.
The challenge is that most managers are not equipped for that role. Nearly half of executives cite leadership effectiveness as the single biggest driver of AI ROI, but only 28% of employees say their manager actively supports their team's AI use.
That gap between what executives know matters and what managers are actually doing is where AI transformation breaks.
What AI transformation demands from leaders
AI adoption requires manager capabilities that most leadership development programs were not built to deliver. Three shifts define what the role now requires.
1. Blending analytical and human skills
Leaders need to scope AI initiatives, understand outcomes, and assess risks. That requires analytical business acumen. They also need influencing skills, empathy, and the ability to speak with authority about something most of their team is uncertain about.
Traditional leadership development tends to specialise. Programs focus on either technical capability or soft skills. AI adoption demands both, at once, from the same person.
The manager rolling out AI to their team needs to be fluent in what the tool actually does, what its limits are, and which workflows it can credibly improve. They also need to read team anxiety, coach individuals through change, and create the psychological safety that makes experimentation possible. Either skill alone is insufficient.
2. Balancing governance with experimentation
Most organizations are heavily focused on risk management, compliance, and guardrails. This focus is necessary. It also creates a problem.
When leaders only communicate what can go wrong, teams become anxious, and the experimental mindset that AI innovation requires dies. Risk avoidance becomes the dominant operating mode. People stop trying things. Pilots stagnate.
The leadership move is to hold both at once. Compliance is real. Caution is appropriate. So is permission to experiment, share what works, and surface what doesn't. Managers need to push both buttons simultaneously, not pick one.
3. Measuring success at the right maturity stage
Early-stage AI adoption requires different metrics than mature implementation. Organizations just starting out should measure leading indicators: confidence levels, anxiety levels, willingness to experiment, frequency of AI use in real workflows. Organizations further along should track whether initiatives move beyond pilots to deliver business value.
Leaders using the wrong metrics for their maturity stage produce a predictable failure mode: teams that are actually progressing get measured as failures, and momentum dies before it has a chance to compound.
Expert take on manager AI development
Pascal Struijk, Product Lead at Lepaya, has spent the last year working with HR and L&D leaders on what leadership development for AI actually looks like in practice. His view on where most organizations miss the point:
"We're looking at a triangle: IT, HR, and managers. IT has the critical knowledge and systems. HR bridges executive leadership and people. But the only one who has access to teams is the leader. Remove one from the equation, and you won't see the results you're hoping for. If you don't involve middle management, you don't know what the most valuable investment is when it comes to AI and automation."
This is the structural insight most AI strategies miss. AI adoption gets owned by IT, sometimes supported by HR, and routinely skips the layer that actually determines whether teams use the tools: the managers.
The "work with your team" principle
Pascal's main advice to leaders driving AI adoption:
"Work with your team, don't talk to your team. We can help employees see that it's much safer to co-shape the future than to wait and see. It's very tempting to push the compliance button hard and focus on risk. But that doesn't help with anxiety. Whenever we push the compliance button, we also have to push the motivation button."
The shift is from top-down communication ("here is the AI policy") to genuine collaboration ("how should we redesign how we work?"). Teams that co-design AI adoption stick with it. Teams that have it pushed at them quietly avoid it.
The three buckets of leadership upskilling
Pascal's framework for the manager AI development path has three components:
- Functional upskilling. What AI can and can't do. The actual risks and opportunities. This is the part that reduces anxiety because most fear comes from not knowing.
- Psychological skills. Understanding the manager's own thoughts and feelings about AI. Most leadership development skips this layer entirely, but it is where authentic communication starts.
- Coaching capability. Working alongside teams on real AI initiatives. Experimenting. Prototyping. Asking the harder question: what is the manager's role when we redefine how work gets done?
"There are three buckets. First, functional upskilling: understanding what AI can and can't do, what the risks and opportunities are. That helps decrease anxiety. Then the psychological aspect: understanding your own thoughts and feelings about AI. Finally, the coaching element: working on initiatives, experimenting, and prototyping. What is your role when we're going to redefine work?"
— Pascal Struijk, Product Lead, Lepaya
Most manager AI training programs cover only the first bucket. That's why most programs don't move adoption metrics.
What HR and L&D leaders should do now
If 95% of AI pilots fail and the difference is leadership, the highest-leverage HR investment of 2026 is manager AI development. Not AI tool deployment. Not policy frameworks. Manager skill building.
Three concrete moves:
1. Make middle management central to AI strategy, not peripheral
Build manager representation into every AI initiative from scoping onward. The team closest to the work decides whether the tool gets used. They need a voice in choosing it, deploying it, and measuring it.
2. Design the development path across all three buckets
Cover all three: functional knowledge, psychological awareness, and coaching capability, in sequence and ideally with hands-on practice. Theory-only modules underperform predictably. Programs that have managers redesigning real workflows with AI tools produce different outcomes.
3. Match metrics to maturity
In the first 6-12 months of an AI rollout, track manager support of team experimentation, employee confidence and anxiety levels, and frequency of AI use in real work. Track business ROI later, when adoption is real enough to measure. Reversing this order makes teams feel like failures when they're actually doing the right things.
The strategic point
The 95% AI pilot failure rate is misleading on its own. It sounds like a technology indictment. It is actually a leadership indictment.
Tools don't adopt themselves. AI ROI doesn't appear because procurement signed the contract. The pattern across successful AI transformations is the same: managers who can hold technical clarity, emotional steadiness, and team coaching at the same time. The 5% of pilots that succeed have those managers. The 95% that fail don't.
Strategic L&D in 2026 is the function that closes that gap.
Want the full discussion? Pascal and the OpenUp team's recent webinar on training managers for AI scale goes deeper into the three-bucket framework and the leadership skills HR should prioritise.

Frequently Asked Questions
Why doesn't IT-led AI adoption work?
IT-led adoption misses the layer that actually drives team behaviour. AI strategy commonly involves IT (the systems) and HR (the policy and the people) but skips middle managers (the people with direct access to teams). Without manager involvement in scoping, deployment, and measurement, organizations don't know what the most valuable AI investments actually are.
How do managers drive AI ROI?
Managers drive AI ROI by championing AI use within their teams, modelling experimentation, addressing anxiety alongside policy, co-designing workflows with employees rather than imposing AI top-down, and balancing risk management with active encouragement to try new tools. Research shows employees are dramatically more likely to use AI productively when their direct manager actively supports it.
Why are 95% of AI pilots failing?
The failure rate is largely about leadership, not technology. When managers actively support AI adoption, employees are 2.1x more likely to use AI frequently and 8.8x more likely to say it helps them do their best work. When managers don't, adoption stalls regardless of how good the underlying tool is. Most failed pilots trace back to managers who weren't equipped to lead the change.
What leadership skills are needed for AI transformation?
Three skills define the new requirements: blending analytical and human skills (technical understanding plus empathy and influence), balancing governance with experimentation (compliance plus permission to try things), and measuring success with metrics matched to AI maturity stage (early-stage leading indicators vs mature-stage business outcomes).
What does manager AI training actually cover?
Effective manager AI training covers three areas: functional understanding (what AI can and can't do, risks and opportunities), psychological awareness (the manager's own attitudes and feelings about AI), and coaching capability (how to lead teams through experimentation and workflow redesign). Most programs cover only the first area, which is why they don't move adoption metrics.
How should HR measure AI adoption progress?
Match metrics to maturity stage. In early AI adoption, measure leading indicators: confidence levels, anxiety levels, frequency of experimentation, manager support behaviours. In mature stages, measure business outcomes: time saved, decisions improved, revenue or cost impact. Using mature-stage metrics on early-stage programs produces false negatives and kills momentum.
What's the relationship between psychological safety and AI adoption?
Without psychological safety, employees hide their AI use, avoid asking for help, and stop experimenting. Surveys show nearly half of workers already hide their AI use to avoid judgment. Managers who balance compliance messaging with motivation messaging, pushing both buttons rather than only the risk button, create the conditions where genuine adoption can happen.