When Your Training Gets Smarter, But You Don't
What AI Is Doing to How We Learn
That's my Zwift avatar. Training indoors whilst the British winter does its thing outside.
Every session uploads to Strava. Their "Athlete Intelligence" AI analyses my efforts, spots patterns I'd miss, tells me when I'm training too hard or not hard enough. It's like having a coach who never sleeps and never misses a detail.
It's brilliant. But if I think about it too much, it can make me slightly uncomfortable.
The danger is that I start to think less about what I'm doing and the progress I'm making. The AI tells me I'm fatigued before I feel it. It recommends recovery when I might have pushed through. It suggests intervals when I was planning endurance.
I'm getting fitter. More efficient. Better results in less time.
But am I developing better judgment as a cyclist? Or am I just following better instructions?
AI & Re-skilling
87% of CEOs expect that at least 25% of their workforce will need reskilling due to AI within the next few years. That statistic's everywhere right now. Everyone's talking about reskilling.
But Lynda Gratton at London Business School just published something that asks the harder question: how do we reskill people when AI is fundamentally changing how humans actually learn?
She starts her piece in Harvard Business Review with a senior executive saying, "None of us knows how people will learn in this new era." I read it last week, and it's been nagging at me ever since.
Because it names something we're all feeling but not many want to say out loud: we're implementing AI tools at pace, celebrating the productivity gains, and quietly hoping we're not breaking something fundamental about human development.
The Pathways We're Losing
Gratton invites executives to describe their own pathways to mastery. The countless hours of practice. The moments of intense insight. The wisdom gained from failure. The mentors who gave feedback at exactly the right moment.
Then she asks: what disappears when AI does this early work?
One senior banker told her: "If my young analysts never struggle and put in those long hours that I did, will they ever learn to think?"
It's not about glorifying suffering. It's about recognising that expertise isn't just accumulated knowledge. It's identity formation. The repetition, the frustration, the honing of craft - these aren't obstacles to mastery. They're the pathway to it.
Here's what landed hardest for me: "AI will undoubtedly accelerate learning - but accelerated learning is not the same as development. Acceleration increases output; development transforms identity."
Think about that in the context of my Zwift training. Strava's AI is accelerating my fitness gains. But is it developing me as a cyclist? Am I learning to read my body, understand fatigue, make judgment calls? Or am I just getting better at following instructions?
What Pro Cycling's Learning
Professional teams are already deep into this territory. UAE Team Emirates - Tadej Pogačar's team - has an AI bot called Ana. She analyses complex datasets and translates them into bespoke strategies: recovery protocols, training plans, even which tyres each rider should use.
Lotto's working with an AI platform that monitors fatigue and stress in real time. A few days before Victor Campenaerts' emotional stage win at last year's Tour, the dashboard flagged high fatigue levels. The team told him to ease off. He arrived fresher. He won.
It's extraordinary technology. The kind of marginal gains cycling's always chased, now turbocharged by machine learning.
But here's what's interesting: the best teams understand that AI augments judgment. It doesn't replace the need to develop it.
Dan Bigham - former hour record holder, now head of engineering at Red Bull-Bora-hansgrohe - talks about race performance modelling, aerodynamic testing, pacing strategies. But when you listen to him, it's clear the tech exists to support decision-making, not to make the decisions.
The riders still have to do the suffering. Still have to learn to read their bodies, their teammates, the race dynamics. AI sharpens that judgment; it doesn't replace it.
The Four Uncomfortable Questions
Gratton suggests four provocations for leadership teams. These aren't rhetorical. They're the questions that separate organisations thoughtfully integrating AI from those sleepwalking into consequences they didn't intend.
1. What happens when AI makes the pathways to mastery disappear?
If AI drafts the strategy memo, analyses the data, generates the first twenty ideas... what happens to the slow, demanding process through which people learn to think for themselves?
Are we preserving the experiences that build expertise? Or are we falling prey to what Gratton calls "the tyranny of productivity"?
2. Are we drowning out calm?
Remember pandemic tech? Meeting volumes exploded by 50%. Workloads intensified. Time for deep, focused work contracted.
With AI, we risk repeating that pattern at scale. If pandemic tech expanded the volume of meetings, AI expands the volume of content. One executive told Gratton: "We're generating more but thinking less."
AI-generated slide decks multiply faster than teams can interpret them. The hope was that automating knowledge work would create space for reflection. The reality is that proliferation creates noise that crowds out the very conditions essential for learning: calm, reflection, and space to think deeply.
3. Are we dulling what makes us human?
The capabilities senior leaders value most - discernment, intuition, moral reasoning, empathy - are also the hardest to develop. They're learned through repeated exposure to emotional nuance, navigating tension, managing conflict.
What worries executives isn't that AI will replace empathy. It's that AI will replace the contexts in which empathy develops.
"If AI handles the difficult conversations," one exec asked Gratton, "how will people learn to have them?"
When AI intermediates human interaction, it reduces exposure to the friction points that test and strengthen emotional capability. The convenience strips away the very conditions through which empathy and judgment are formed.
4. Are we eroding choice and identity?
As AI nudges behaviour, proposes next steps, and automates decisions, it's also stripping people of the capacity to reflect, choose, and take ownership.
"If the system always knows the next step," one executive asked Gratton, "when do my people learn to choose for themselves?"
Agency - the engine of human growth and development - requires space for independent judgment. Are we designing for that? Or are we accidentally engineering it away?
What This Means for Your Team
Here's my uncomfortable truth: I love Strava's Athlete Intelligence. I'm faster because of it. But I'm also aware that I'm outsourcing judgment I used to develop myself. The question isn't whether to use AI. That ship's sailed. The question is how we use it without accidentally breaking the developmental pathways that make people capable.
Professional cycling's figured something out: technology serves the rider. It doesn't replace the rider's development. The AI makes elite coaching more accessible, but it doesn't eliminate the need for riders to develop their own feel for the race.
For those of us leading teams, that distinction matters.
When you're implementing AI tools - and you are, whether consciously or not - are you asking:
What pathways to mastery are we preserving?
Are we creating more output or more noise?
Are we protecting the friction points that build capability?
Are we designing for human choice and agency?
Or are you just celebrating the efficiency gains and hoping the rest sorts itself out?
The Development Challenge Nobody Wants
Gratton's piece ends with a question that should keep every senior leader awake: "In an age of intelligent machines, how do we ensure that people continue to develop into their most capable selves?"
I don't have tidy answers. Neither does Gratton. Neither, I suspect, do you.
What I do know is this: the best leaders I work with aren't the ones with all the answers. They're the ones willing to ask uncomfortable questions and sit with the uncertainty.
AI's transforming work. That much is certain. But whether AI transforms learning - and how - that's still up to us.
My Zwift avatar might be getting faster. But I'm the one who has to decide whether I'm developing as a cyclist or just becoming really good at following instructions from an algorithm.
Your team's facing the same choice. The difference is, they're looking to you to navigate it.
Worth thinking about. Especially if you're the one other people look to for clarity when everything feels uncertain.
Lynda Gratton's full piece is worth your time: AI Is Changing How We Learn at Work