L&D QUESTIONS

AI Native L&D Teams: How They Will Think, Work, and Build Differently

LAVINIA MEHEDINTU
June 23, 2025

In April I attended the Learning Technologies Conference in London, and I swear—there wasn’t a single session without AI in the title. Anamaria Dorgo, my partner-in-crime for the event, said something that stuck with me: “AI won’t be a stand-alone thing. It’ll be embedded in everything we do in L&D.”

She’s right. And in many ways, we’re already wired for this. L&D folks are good at following the shiny, new trend. That might actually be to our advantage this time.

But I also believe that the future isn’t just about integrating AI into what we’re already doing. It’s about reinventing how L&D teams operate from the ground up.

Because like Anamaria said, AI is not an add-on. It’s the infrastructure, the assistant, the lens, and the fuel. To truly make an impact, L&D teams will need to become AI-native, while also looking beyond AI, to unlock both performance and potential for organizations and individuals.

This guide is a deep dive into what we imagine AI-native L&D teams might look like in practice: how they make decisions, what roles they evolve into, how they build with AI, and the mindset shifts that help them do it.

It’s not a futuristic fantasy. It’s a shift already underway in the most forward-thinking organizations.

But the truth is, most of us aren’t there yet. And that’s okay. This isn’t about getting it perfect. It’s about getting started.

Why L&D Needs a New Operating Model

I don’t think we can outrun AI. As controversial as this technology is, it’s here to stay, and we might as well get on board in a smart way.

Dr. Philippa Hardman mentioned in her talk at LTC that L&D teams started using AI massively just last year, largely because many companies finally allowed controlled access. The most common use case? More efficiency and productivity in the design process.

At first, that felt like a red flag to me. I couldn’t shake the feeling that we might be scaling something that’s not really working. And to be honest, I still think that’s a risk. But Myles Runham offered a good counterpoint. AI arrived in a tough economic moment, when most organizations were under pressure to cut costs. Efficiency became the low-hanging fruit.

So yes, it’s understandable that we’re starting with “faster and cheaper.” But if we stop there, we’re missing the point. Because AI isn’t just about helping us do the same things faster. It’s already changing the very assumptions behind how learning and learning design happens.

Think about how we’ve approached online learning for years: static content, end-of-module quizzes, completion rates. It’s a model that prioritizes recall over real-world performance. What agentic AI makes possible is something much more powerful: practice in the flow of work, feedback in real time, learning that’s embedded in doing. 

In this context, the core questions for L&D are changing. We're no longer asking, “What training should we build?” Instead, we’re asking:

  • What is the real problem we're solving?
  • What performance data are we missing?
  • What can be automated or augmented to increase value?
  • What knowledge already exists in the organization that can be unlocked and scaled?
  • Why and how are we bringing people together to learn?

AI-native L&D teams aren’t just adding AI to existing processes. They’re redesigning those processes for a world of continuous, adaptive, and embedded learning.

This requires:

  • Moving from courses to systems
  • Moving from delivery to enablement
  • Moving from evaluation after the fact to real-time intelligence
  • Moving from training to experience design

The implication is profound: L&D must become embedded in the flow of work, embedded in decisions, and embedded in performance management. Learning is no longer a break from work, it is work.

And as AI takes over many predictable, repeatable aspects of learning design and delivery, the value of human connection increases. In this context, L&D teams must also reimagine how they help people connect with each other, with ideas, and with purpose. Experience design becomes a strategic lever: it's about crafting moments that invite reflection, spark conversation, foster psychological safety, and build communities of practice. When learning is everywhere, connection becomes the glue that makes it meaningful.

New Thinking: What Sets AI-Native L&D Apart

Okay, so what exactly are the principles that set AI-native L&D teams apart? I’d argue that most of them aren’t new. We’ve been talking about them for years. But now, some are becoming impossible to ignore, while others are showing up for the first time.

In traditional L&D, content is king. The measure of success is often completion rates or course feedback. But in an AI-native world, that mindset no longer holds. These teams approach learning differently, grounded in four core pillars: 

1. Performance-first orientation. AI-native teams start with business problems and performance gaps. They don't begin with learning objectives, they begin with outcomes. They reverse-engineer from real-world success metrics: sales improvement, process adherence, faster onboarding, and customer satisfaction.

2. Systemic thinking. They understand that learning is part of a broader system that includes culture, incentives, feedback loops, and workflows. An AI-native L&D team thinks like system designers, constantly tuning inputs and outputs to improve how people learn and perform.

3. Data as design input. AI-native teams don’t wait for post-training surveys to evaluate effectiveness. They continuously feed data into design decisions: behavioral signals, usage stats, feedback simulations, and more. This enables a design process that is iterative, evidence-based, and grounded in what works, not what’s assumed.

4. Design for human connection. And critically, they don't just design content, they design for human connection. Whether through peer feedback rituals, shared reflection prompts, or live collaboration spaces, these teams use experience design to create the conditions where people learn not just from content, but from each other.

This mindset shift is what distinguishes a team that uses AI from one that builds with AI.

What’s Holding L&D Teams Back

Yet, most of us aren’t there yet.

We’ve spoken with dozens of L&D leaders and practitioners over the past few months. Many are curious about AI. Some are experimenting. But very few feel like they’re truly ahead.

Here’s what we’ve heard is getting in the way:

1. Lack of clarity on purpose. Many L&D teams still operate reactively. They respond to training requests rather than leading performance conversations. Without a clear north star, it’s hard to decide how or why to adopt AI.

2. Limited access to tools (or too many tools). Some teams don’t have access to basic AI tools. Others are overwhelmed by choice and don’t know where to start. The result? Either paralysis, or scattered, disconnected experiments.

3. A perception problem. In some organizations, L&D is still seen as an admin function, not a strategic partner. This makes it harder to get buy-in, resources, or even a seat at the AI table.

4. Fear of “doing it wrong”. Let’s be honest, AI can be intimidating. From hallucinations to bias to ethical concerns, it’s easy to feel like you need everything figured out before taking action. But perfectionism holds us back more than it protects us.

5. Forgetting the human. Ironically, in our rush to explore AI, some teams lose sight of what learning is really about: connection, growth, and meaning. If AI doesn’t help us do those better, then what’s the point?

This isn’t about shaming anyone. We’re all figuring it out. But if we want to move forward, we have to be honest about what’s holding us in place.

Ways of Working: Building With, Not Just Using, AI

But while some teams feel paralyzed, others are leaning in, rebuilding how they work from the ground up. What are they doing differently?

1. Front-loaded simulation and prototyping. Before writing content or launching a pilot, teams simulate learner interviews using GPT-4o, predict engagement levels with different formats, generate learner personas, and model skill transfer likelihood.
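As a rough sketch of what that front-loading can look like in practice: the prompt-assembly logic below is plain Python, and the model call (here, the OpenAI `gpt-4o` chat endpoint) is one assumption about your stack — the persona and questions are purely illustrative.

```python
# Sketch: simulate a learner interview before building anything.
# Running the actual model call requires the `openai` package and an API key;
# the prompt-building step works offline.

def build_interview_messages(persona: str, questions: list[str]) -> list[dict]:
    """Assemble a chat prompt asking the model to role-play a learner persona."""
    system = (
        f"You are role-playing this learner persona: {persona}. "
        "Answer interview questions in the first person, honestly and briefly."
    )
    user = "Interview questions:\n" + "\n".join(f"- {q}" for q in questions)
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

def run_interview(messages: list[dict]) -> str:
    """Send the simulated interview to a chat model (network call)."""
    from openai import OpenAI  # imported here so the sketch loads without the package
    client = OpenAI()
    response = client.chat.completions.create(model="gpt-4o", messages=messages)
    return response.choices[0].message.content

if __name__ == "__main__":
    msgs = build_interview_messages(
        persona="a new sales hire, three weeks in, overwhelmed by product docs",
        questions=["What slows you down most?", "When do you actually open the LMS?"],
    )
    print(run_interview(msgs))
```

The point of the pattern isn't the code — it's that a simulated interview costs minutes, so you can run ten of them before committing to a single design decision.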

2. Lean experimentation mindset. Borrowing from agile, these teams launch MVPs of learning products, run small pilots, gather rapid feedback, and iterate weekly. AI tools are used to analyze open-text feedback, simulate reactions, and compare performance data across cohorts.

3. Content creation as a system, not a one-off. Instead of building content from scratch every time, AI-native teams build reusable prompt libraries, generate adaptable templates, and create modular micro-content that is recombined and customized by AI.
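One hedged way to picture "content as a system": a tiny prompt library where reusable templates get filled per audience, topic, or format. The template names and wording here are illustrative, not a real product.

```python
from string import Template

# Sketch of a reusable prompt library: each entry is a named template
# with placeholders filled per audience, topic, or format.
PROMPT_LIBRARY = {
    "micro_lesson": Template(
        "Write a 150-word micro-lesson on $topic for $audience. "
        "End with one reflection question tied to their daily work."
    ),
    "scenario": Template(
        "Create a short branching scenario where $audience must apply "
        "$topic under time pressure. Offer three decision points."
    ),
}

def render_prompt(name: str, **fields: str) -> str:
    """Fill a library template; raises KeyError if a placeholder is missing."""
    return PROMPT_LIBRARY[name].substitute(**fields)

# The same template serves two audiences without rewriting anything:
p1 = render_prompt("micro_lesson", topic="giving feedback", audience="new managers")
p2 = render_prompt("micro_lesson", topic="giving feedback", audience="senior engineers")
```

The design choice is the library itself: quality lives in a small set of reviewed templates rather than in hundreds of ad-hoc prompts scattered across the team.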

4. Multi-tool orchestration. AI-native teams use a toolbox, not a monolith. Tools are chosen for specific stages:

  • Perplexity for sourcing evidence-based insights
  • Claude for structured writing and standards compliance
  • Synthesia for video creation
  • Sana, HowNow, or Copilot for automated skills management
  • Epiphany for learning design that’s rooted in science
  • Pika, Imagen, and ElevenLabs for media generation
  • DeepL for smart translation
  • Whimsical/Napkin for diagrams

A well-orchestrated stack enables speed, consistency, and quality, without locking into a single vendor ecosystem.

5. Gaining time for higher-order thinking. Instead of spending time on manual, often tedious tasks, AI-native teams discuss performance issues, review insights to spot patterns across teams, regions, or roles, and shift their focus from content to connection in trainings and workshops.

This is where AI doesn’t just make us faster, it makes us better. Because when you take away the grunt work, you give people space to be more strategic, reflective, and human.


Potential New & Redesigned Roles in L&D

Will all of this change L&D roles? I hope so. I’m not saying that our roles aren’t good enough already, but like all other functions, we should be evolving. 

We can’t predict the future, but we can make some assumptions about the categories L&D roles might fall into in the future:

Learning experience design

These roles focus on the human side of learning: not just what people need to know, but how they feel while learning. This includes digital content, yes—but also real-life spaces where people connect, reflect, and grow. It’s about crafting intentional experiences that spark transformation, not just information transfer. These folks bring storytelling, facilitation, design thinking, and community-building into the L&D toolkit.

Typical focus areas:

  • Designing blended or cohort-based learning journeys
  • Facilitating peer learning and reflection rituals
  • Creating emotionally engaging learning experiences
  • Applying learning science and UX principles
  • Shaping how people connect across functions and geographies

These are the roles that make learning feel like more than just a checkbox.

Organizational design & learning architecture

These roles zoom out to the full system. They think less about content and more about how learning actually happens inside an organization: through feedback, incentives, culture, performance management, mobility, peer learning, and more. People in this space often work closely with People Ops, HRBPs, or transformation teams. They ask: what does our org need to be capable of—and how do we build for that?

Typical focus areas:

  • Mapping capability models and skills architectures
  • Linking learning initiatives to business priorities and workforce planning
  • Designing internal talent mobility systems
  • Building learning culture through rituals and role modeling
  • Collaborating on systems-level interventions (not just training)

These roles turn L&D from a service function into a strategic enabler of business change.

AI learning enablement

These roles focus on making AI tools usable, safe, and valuable within the context of learning. People in this space are not just prompting ChatGPT, they’re building systems, curating prompt libraries, tuning models, and applying governance practices that make AI tools sustainable for the long term. They’re also the bridge between tech and learning, partnering with IT and data teams while translating business needs into intelligent automation.

Typical focus areas:

  • Managing AI copilots and content engines
  • Building automation flows for personalized learning
  • Monitoring quality and ethics in AI-generated outputs
  • Training the team on AI fluency and prompting best practices
  • Experimenting with AI use cases and scaling what works

This is where AI stops being a one-off tool and starts becoming part of your L&D operating system.

Will this happen? We don’t know. Will this happen for everyone at the same time? No. That’s one thing we know for sure. Some L&D teams might be slower than others in adapting, but we’re here to help everyone, no matter where they are right now.

Whether you’re rethinking roles now or just planting seeds, the next horizon is clear: L&D must evolve from efficiency to impact.

From Efficiency to Strategic Impact

In the end, this is the shift we need to aim for.

Yes, efficiency is a great starting point. But the real opportunity with AI isn’t just doing the same things faster. It’s doing different things altogether. Once teams free up time and mental space, they can move from order-takers to strategic partners. From reactive to proactive. From delivery mode to performance mode.

And when that happens, we start to see real strategic value:

  • Real-time business alignment: When business goals shift, learning interventions adapt immediately. For example, AI updates training based on new compliance laws or product features overnight.
  • Dynamic knowledge capture: Experts can contribute in conversation rather than content creation. AI then structures and stores the insights in a searchable format.
  • Performance integration: AI helps link learning data to KPIs. If a team misses a sales target, the system recommends micro-learnings or mentorships that address the root gap.
  • Elevated collaboration and connection: With AI handling information-heavy tasks, L&D can focus on designing spaces where people connect, share insights, and solve real problems together. It’s not just about learning from content, it’s about learning with each other.
  • Revenue enablement: In some cases, learning products are monetized, especially in consulting or SaaS firms. Internal knowledge becomes an external asset.
  • Brand elevation: High-quality, AI-curated internal academies position the company as a learning organization. This helps with recruitment, retention, and thought leadership.

These changes reposition L&D as a lever for value creation, not just cost control.

What AI-Native Teams Will Build

AI-native L&D teams aren’t just building tools. They’re enabling a new kind of learning: agentic, embedded, and adaptive. This goes far beyond the LMS era (although, let’s be honest, there’s a running joke in the industry that the LMS will outlive us all).

The old model kept learning in a separate lane: log in, complete your course, check the box. But the new wave of tools isn’t just smarter; they’re more human. They blur the lines between learning and doing. They live inside the flow of work, provide real-time support, and adapt to the learner’s context, confidence, and needs.

These systems don’t just track clicks. They coach, prompt, assess, and connect, delivering feedback, surfacing practice opportunities, and offering nudges when it matters most. And they’re not limited to screens or dashboards. In the best cases, they’re embedded into everyday conversations, team rituals, and real-life collaboration.

Here’s what AI-native teams are already building:

  • Skill navigation platforms: Not static competency models, but live maps powered by real-time role data, market benchmarks, and internal mobility patterns.
  • Performance support agents: Virtual copilots that surface help, documentation, tips, or next steps in the flow of work, contextually, without a click.
  • Conversational simulations: AI-generated, branching scenarios where learners practice conversations, make decisions, and receive feedback based on their style.
  • Self-service learning ecosystems: Searchable libraries, curated learning paths, smart recommendations, and instant Q&A powered by RAG (retrieval-augmented generation).
  • Embedded learning agents: Tools that live inside Slack, Teams, or CRMs, offering practice, nudges, and feedback exactly when people need them—no extra logins, no artificial separation from real work.
  • Automated onboarding systems: Personalized journeys based on role, manager, location, and previous experience. AI tracks confidence, not just progress.
  • Knowledge marketplaces: Anyone in the org can contribute, tag, and promote content. AI handles formatting, summaries, translation, and knowledge routing.
  • Real-life learning experiences: Designed using experience design principles, these aren’t just trainings or workshops. They are spaces built for connection, reflection, and transformation, where people learn through shared dialogue, peer collaboration, and purposeful interaction.
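The RAG pattern behind those self-service ecosystems can be sketched minimally: retrieve the most relevant internal snippets for a question, then hand them to a model as context. The word-overlap scoring below is deliberately naive for illustration; real systems use embeddings and a vector store, and the knowledge snippets are invented examples.

```python
# Minimal RAG retrieval sketch: score internal knowledge snippets against a
# question by word overlap, then build the context an LLM would answer from.
# Real deployments replace the scorer with embeddings + a vector database.

KNOWLEDGE_BASE = [
    "Expense reports are submitted in Workday by the 5th of each month.",
    "New hires complete security training within their first two weeks.",
    "Sales discounts above 15% require director approval.",
]

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k snippets sharing the most words with the question."""
    q_words = set(question.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_context(question: str) -> str:
    """Assemble the augmented prompt a model would receive."""
    snippets = retrieve(question, KNOWLEDGE_BASE)
    return (
        "Answer using only this context:\n"
        + "\n".join(snippets)
        + f"\n\nQuestion: {question}"
    )
```

The learner never sees any of this plumbing — they just ask a question in plain language and get an answer grounded in the organization's own knowledge.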

New Measures of Success

Metrics must also evolve. AI-native teams move beyond vanity metrics and start measuring:

  • Skill acquisition speed: How fast learners move from unskilled to capable.
  • Behavior change: Not intent, but actual observable change.
  • Task performance delta: Before and after training, how much more effective are people?
  • Time to impact: How quickly does learning show up in business metrics?
  • Agent engagement: Are people using learning copilots, and do they trust them?
  • Learning equity: Is personalization working for everyone, or just the already engaged?
  • Competency in context: Instead of using proxy metrics like quiz scores, AI-native teams assess real-world work. That might mean analyzing how a message is written, how a sales pitch unfolds, or how a decision is made, bringing us closer than ever to true competency-based assessment.

L&D teams use these metrics not just to justify their work, but to design better experiences, faster.
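Two of these measures are concrete enough to sketch (field names and figures below are illustrative): task performance delta as a before/after comparison on real task assessments, and skill acquisition speed as days from first practice to demonstrated competence.

```python
from datetime import date
from statistics import mean

# Illustrative learner records: scores come from real task assessments,
# not quiz results; dates mark the practice-to-competence window.
learners = [
    {"before": 0.52, "after": 0.78,
     "first_practice": date(2025, 3, 1), "competent_on": date(2025, 3, 19)},
    {"before": 0.61, "after": 0.74,
     "first_practice": date(2025, 3, 3), "competent_on": date(2025, 3, 27)},
]

def performance_delta(records: list[dict]) -> float:
    """Average improvement in task performance, before vs. after."""
    return mean(r["after"] - r["before"] for r in records)

def acquisition_speed_days(records: list[dict]) -> float:
    """Average days from first practice to demonstrated competence."""
    return mean((r["competent_on"] - r["first_practice"]).days for r in records)
```

Even a toy calculation like this changes the conversation: instead of "87% completion," you can report "performance up 20 points, competence reached in three weeks."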

The Real Enablers: Mindset, Mandate, and Maturity

But to get there, we need to remember something Egle Vinauskaite mentioned in her talk at LTC: “An AI tool is not an AI strategy”. 

We can’t start using a tool and think that’s it. AI-native transformation depends on:

  1. L&D leadership mindset. You need leaders who don’t just react to AI hype but understand its transformative potential. They champion bold experiments, even when ROI isn’t instant.
  2. Executive mandate. L&D must be seen as a strategic partner, not an order taker. This means budget autonomy, visibility in strategic planning, and access to cross-functional teams.
  3. Infrastructure maturity. Without clean, accessible data or IT support, AI efforts stall. AI-native L&D teams work with data scientists and system architects to ensure integrations are secure, scalable, and privacy-compliant.
  4. Cultural readiness. The broader org culture must allow for experimentation, intelligent failure, and decentralized innovation. Otherwise, even the best AI tools won’t be used to their full potential.

AI-native teams don’t just adopt a tool in isolation, they help raise the AI maturity of the whole organization.

Your Next Step: From Adoption to Reimagination

Becoming AI-native is not a one-time upgrade. It’s a fundamental reinvention of how learning teams operate.

Start here:

  • Map your current roles. Where is human time being spent on low-impact work?
  • Build a pilot team. Start small with a high-impact use case (e.g., onboarding or sales enablement).
  • Develop AI fluency across the team. Not just tools, but critical thinking, prompting, and governance.
  • Design with feedback in mind. Use learner simulation, small releases, and rapid iteration.
  • Measure what matters. Don’t wait six months, start measuring after two weeks.

Most importantly, stay human. AI will not make learning better on its own. That still requires care, creativity, and courage.

Acknowledgements

This article was inspired by the ongoing work of brilliant thinkers and practitioners in the L&D and AI space. A warm thank you to Dr. Philippa Hardman, Egle Vinauskaite, Ross Stevenson, Rita Azevedo, and Josh Bersin, whose research, posts, and frameworks have helped spark and shape many of the ideas in this piece.

So here we are, back to that buzzing conference hall in London, where AI was on every stage and in every conversation. Anamaria’s words still ring true: AI won’t be a stand-alone thing, it will be embedded in everything we do in L&D.

But embedding AI isn’t the end goal. Reinventing how we think, work, and build is.

The future of L&D isn’t just about faster content. It’s about deeper connection. It’s about designing systems that serve performance and people. It’s about embracing new tools with the same human-centered curiosity that’s always guided our work.

We don’t need to have it all figured out. But we do need to begin, with courage, clarity, and a commitment to doing things differently.

LAVINIA MEHEDINTU

CO-FOUNDER & LEARNING ARCHITECT @OFFBEAT

Lavinia Mehedintu has been designing learning experiences and career development programs for the past 11 years both in the corporate world and in higher education. As a Co-Founder and Learning Architect @Offbeat she’s applying adult learning principles so that learning & people professionals can connect, collaborate, and grow. She’s passionate about social learning, behavior change, and technology and constantly puts in the work to bring these three together to drive innovation in the learning & development space.
