Last updated: March 2026

The L&D world isn’t slowing down, especially with AI, skills data, and employee expectations all colliding.

So what’s actually going to shake things up in 2026?

Below are the 11 biggest Learning & Development trends for 2026, starting with the challenges shaping them.

Challenges L&D teams will face in 2026

Understanding the trends matters little if L&D teams cannot first address the structural pressures that make them difficult to act on.

Three challenges in particular are shaping how organisations approach learning investment this year. Thirst’s State of L&D for SMBs 2026 report, a survey of 3,000+ SMB L&D professionals, found that 75.6% of teams expect their learning budgets to remain flat in 2026.

With limited room to increase spend, the pressure to demonstrate value with existing resources is higher than it has been for some time.


Skills gaps are widening faster than training can close them

According to the World Economic Forum’s Future of Jobs Report, 44% of workers’ core skills are expected to change by 2028. McKinsey research puts the problem in even sharper relief: 87% of companies say they either face a skills gap today or expect to within a few years.

For L&D teams, this creates a structural tension between the scale of reskilling required and the capacity available to deliver it. The answer, for most, is not more courses.

Using skills data to identify which gaps matter most for business performance, and prioritising those, tends to produce better returns than attempting comprehensive coverage. Partnering with external specialists for targeted needs and building internal knowledge-sharing programmes can stretch capacity without proportionally growing headcount.

Demonstrating the business impact of learning investment

Thirst’s State of L&D for SMBs 2026 report found that 64% of L&D professionals say leadership now expects proof of learning impact, and that proving ROI has overtaken learner engagement as the number one L&D challenge for the first time.

Most organisations still measure learning success through completion rates and training hours: metrics that describe activity rather than outcomes. When budgets are flat and scrutiny is higher, programmes that cannot demonstrate a connection to business performance are the most vulnerable.

Fixing this is less a technical problem than one of habit.

Agree on KPIs before a programme launches rather than after it ends, build reporting that connects learning data to business data, and frame results in terms that budget-holders recognise. Modest, well-evidenced outcomes carry more weight in a budget conversation than a thick activity report.

Keeping learners engaged in a time-poor environment

Time is the barrier employees most consistently cite when asked why they do not engage with development.

Competing priorities, back-to-back schedules, and the perception that learning happens separately from work all contribute to low engagement with traditional training formats. Thirst’s 2026 research found that 58% of SMB L&D professionals say they are too busy delivering programmes to think strategically about learning, and 39% update their learning strategy only once a year. When the people responsible for building a learning culture are themselves time-poor, the problem compounds at every level.

Improving content quality rarely solves this.

What tends to move the needle is reducing friction: shorter formats, learning surfaced inside tools people already use, and visible endorsement from direct managers so development does not feel like an optional extra.

State of L&D for SMBs 2026

The challenges and trends in this article draw on data from Thirst’s State of L&D for SMBs 2026 report, a survey of 3,000+ SMB L&D professionals. The full report covers budgets, AI adoption rates, skills frameworks, and how leadership is now measuring learning impact.

Download the full report

1. AI agents that learn and teach

AI agents have moved beyond answering questions. They are now running complete training workflows: onboarding new employees, tracking compliance deadlines, identifying skill gaps, and adjusting how content is delivered based on how learners respond.

Unlike earlier chatbot tools, these agents adapt over time, changing their approach based on where comprehension consistently falters or where learners disengage.

Why it matters: For smaller L&D teams, this has a practical consequence. Thirst’s State of L&D for SMBs 2026 report found that 83% of SMB L&D teams increased their use of AI in 2026, up from 59% in 2025, and 71% of those teams report AI saves at least four hours per week. An AI agent handling repetitive tasks such as onboarding FAQs, reminder nudges, or compliance tracking frees up the team to focus on higher-value work: programme design, stakeholder engagement, and measurement.

Where to start:

  • Test an AI copilot within your existing LMS or inside tools like Slack or Microsoft Teams before considering a new platform.
  • Begin with a narrow, low-risk use case. Onboarding FAQs and compliance reminders tend to produce early results without complex configuration.
  • Review outcomes after 60 days before expanding into more complex use cases such as skills coaching.

2. Learning embedded in workflows

The original framing of learning in the flow of work centred on making resources accessible during the working day.

That framing is now too narrow: training is embedded into the workflows themselves. Guidance surfaces inside CRMs as a sales rep begins a pitch, inside development environments as a developer writes code, inside project tools as a team scopes a brief. The separate course is gone. Learning happens where the work happens.

Why it matters: Every time an employee leaves a tool to find training, there is a cost in time, attention, and motivation. Removing that friction makes learning a natural part of how work gets done rather than something that competes with it.

Where to start:

  • Identify two or three workflows where employees regularly pause to look for guidance or repeat common errors.
  • Check whether your LMS has integrations with tools like Salesforce, HubSpot, or Jira that your teams already use daily.
  • Pilot short, contextual resources inside one workflow: a 60-second video, a checklist, or a prompt that surfaces at the relevant moment.

3. Internal skills marketplaces

Rather than defaulting to external hiring every time a skill gap appears, organisations are building internal marketplaces that match employees to stretch assignments, short-term projects, and cross-functional roles based on their existing and developing skills.

AI handles the matching, surfacing employees whose capabilities and development goals align with available opportunities. The outcome is faster talent mobility, lower recruitment costs, and a direct signal to employees that internal progression is genuinely on the table.

Why it matters: Internal mobility reduces hiring costs, keeps experienced employees engaged with new challenges, and addresses skill gaps that would otherwise require external recruitment. LinkedIn Learning data shows that 94% of employees say they would stay longer at a company that invests in their career development, and internal marketplaces are one of the clearest signals an organisation can send that it does. It also gives L&D a direct connection to the talent decisions happening across the business.

Where to start:

  • Start with a skills audit. Surveys or self-assessments give you a baseline picture without requiring complex tooling.
  • Use your learning platform to surface internal candidates for projects rather than defaulting to open roles.
  • Pilot within one function before scaling. Teams with enough project volume, such as engineering or marketing, tend to generate early results.

4. Knowledge graphs

Knowledge graphs have been discussed in L&D for years without much practical adoption. The tooling is now accessible to teams without specialist data expertise, which is changing that.

A knowledge graph maps the relationships between skills, roles, content, and learning outcomes, giving organisations a structured view of what their workforce knows and what it still needs. Rather than a flat course catalogue, employees navigate a connected structure: complete this, and the next step becomes clear.

Why it matters: Most content libraries have become difficult to navigate. Thirst’s 2026 research found that only 24% of SMBs have a live skills framework, 46% are actively building one, and 28% have none at all. Without a structured view of which skills matter for each role and how they connect, even well-resourced content libraries are hard to navigate purposefully. A knowledge graph provides that structure: it gives employees a clear map rather than a catalogue to search through.

Where to start:

  • Audit your existing content and tag each item by skill or competency as a first step.
  • Look for learning platforms with built-in skill-mapping tools rather than attempting to build from scratch.
  • Create visible learning paths by role. Even a manually built path is a meaningful improvement on an unstructured catalogue.

5. Synthetic peers for practice

Synthetic peers are AI-driven characters that replicate the dynamics of real workplace conversations: a sceptical client, a manager pushing back, a team member resisting change.

Learners practise against them in a simulated environment that responds to what is actually said rather than following a fixed script. The character asks follow-up questions, raises objections, and reacts in ways that feel closer to real interaction than a video module or a classroom exercise can replicate.

Why it matters: Safe practice environments matter most for high-stakes skills: handling a difficult negotiation, giving challenging feedback, and managing conflict. These are precisely the skills that transfer poorly from traditional eLearning, and where the gap between knowing and doing tends to be widest.

Where to start:

  • Begin with one scenario that maps to a genuine performance gap. Sales objection handling and leadership conversations are common starting points with clear business cases.
  • Use off-the-shelf AI role-play tools before investing in custom builds.
  • Collect structured learner feedback after each session and ask specifically whether the simulation felt realistic enough to influence real behaviour.

6. Dynamic credentials

A certificate earned three years ago says very little about what an employee can do today.

Dynamic credentials address this by attaching a decay mechanism to skills recognition: a credential weakens when a skill goes unused and strengthens when it is applied regularly. Organisations end up with a more accurate picture of actual capability, and employees have a built-in reason to keep skills current rather than treating a completed course as a permanent achievement.

Why it matters: For L&D teams, dynamic credentials shift recognition from an administrative record to a live signal of workforce capability. For employees, they create an ongoing relationship with development rather than a one-off transaction at the end of a course.

Where to start:

  • Introduce digital badges for specific, measurable skills rather than broad course completions.
  • Add a refresh rule: a skills badge might require renewal within 12 months if the skill has not been demonstrably applied.
  • Where possible, connect badges to observable performance indicators: project completions, tool usage data, or manager-validated assessments.

7. AI-powered mentorship

Traditional mentorship programmes are constrained by the time senior people can commit and by the matching process that determines who gets access. AI addresses both.

It can match learners to mentors based on goals, skill gaps, and working preferences at a scale that a manual process cannot reach. It can also provide a layer of guidance independently, answering career questions, surfacing relevant content, and prompting reflection, for employees who would otherwise have no formal mentorship in place at all.

Why it matters: Access to mentorship has historically been unevenly distributed. Employees with strong informal networks or visible roles tend to benefit most. AI-assisted mentorship extends access more consistently across an organisation, including to employees who would typically fall outside traditional mentorship programmes.

Where to start:

  • Survey employees on the skills and career areas where they want guidance before building any matching system.
  • Use an AI matching tool to pair mentors and mentees based on goals rather than seniority or department proximity.
  • If mentor availability is genuinely limited, pilot an AI guidance tool for early-career employees as a complement to human mentorship rather than a replacement for it.

8. Learning tied to business outcomes

This trend is less about new technology than about what L&D chooses to measure. Completion rates and training hours describe activity but not impact.

The L&D teams gaining credibility inside their organisations are those connecting programmes directly to the outcomes the business already tracks: reduced onboarding time, lower attrition in critical roles, improved customer satisfaction scores following service training, or sales productivity after enablement programmes.

Why it matters: L&D teams that can demonstrate a connection to revenue or cost reduction are better positioned in budget discussions. Those that can only report engagement data remain difficult to defend when spending is scrutinised.

Thirst’s State of L&D for SMBs 2026 report found that 64% of L&D professionals say leadership now expects proof of learning impact, with productivity as the metric leadership tracks most closely, ahead of cost savings and retention. Research from Deloitte found that companies with strong learning cultures are 46% more likely to be first to market and 92% more likely to innovate. Capturing that connection requires measurement frameworks that go well beyond completion tracking.

Where to start:

  • Identify one or two business KPIs that a specific training programme should affect. Define them before the programme launches, not after.
  • Build reporting around those KPIs from the start rather than retrofitting measurement later.
  • Begin with straightforward metrics: time to productivity for new hires, or retention rates for teams with and without structured development, before moving to more complex attribution models.


9. AI-curated learning journeys

A large content catalogue creates its own problem: the cognitive load of deciding what to learn next. Employees who are motivated to develop often disengage, not because they lack interest but because they cannot find what is relevant to them quickly enough.

AI is now handling this by building personalised journeys for each employee based on role, stated goals, skill gaps, and past learning behaviour. Content surfaces when it is relevant, in an order that makes sense for that individual’s development rather than a generic sequence.

Why it matters: Relevance drives completion. A learning platform that surfaces the right content at the right time removes the friction that sits between motivation and action.

Where to start:

  • Reduce the visible catalogue and surface content through recommendations rather than requiring employees to search.
  • Test AI recommendations within your current LMS if the feature already exists before evaluating new platforms.
  • Build manual learning playlists for key roles as an interim step while AI personalisation is being configured.

10. Cohort-based learning

The value of cohort-based learning (peer accountability, shared context, social reinforcement) has not changed. What has changed is how cohorts are formed and sustained.

AI can group learners based on shared skill gaps rather than shared job titles, creating cohorts that are more purposefully matched. It can also generate nudges, reflection prompts, and peer feedback mechanisms throughout the programme, keeping cohorts engaged well beyond the opening sessions.

Why it matters: Cohort-based learning produces stronger outcomes than self-paced learning for most interpersonal and complex skills. The challenge has always been the overhead of running it at scale manually. AI reduces that overhead without removing the peer element that makes it effective.

Where to start:

  • Run one pilot cohort around a high-demand skill. Leadership fundamentals and data literacy are common choices with clear business cases.
  • Use existing collaboration tools for peer discussion rather than adding new platforms.
  • Review what worked after the first cohort and add AI-generated prompts and peer feedback mechanisms for the next iteration.

11. L&D as a retention strategy

Pay and benefits still matter, but employees, particularly those in the early and middle stages of their careers, weight development opportunities heavily when deciding whether to stay.

Thirst’s State of L&D for SMBs 2026 report found that leadership development has become the top priority for SMB L&D teams in 2026, overtaking employee engagement and compliance training for the first time.

That shift reflects how directly organisations are now connecting structured development to retention rather than treating it as a separate concern. Organisations that offer clear learning pathways communicate that they are invested in their employees’ futures. Those that do not are competing for the same talent while removing a retention lever that is also among the most cost-effective available.

Why it matters: Attrition is expensive. Replacing a mid-level employee typically costs between 50% and 200% of their annual salary once recruitment, onboarding, and lost productivity are included. The same LinkedIn Learning research that found 94% of employees would stay longer with better development investment also shows that lack of career growth is consistently one of the top three reasons people leave. A credible learning programme built around retention costs a fraction of that, and the people it serves are still there to benefit from it.

Where to start:

  • Connect career pathways explicitly to skills development in internal communications, so employees can see where learning leads.
  • Formalise a learning budget per employee and communicate it clearly, rather than leaving development feeling contingent on manager discretion.
  • Document and share internal growth stories: employees who developed a skill and moved into a new role are more credible to their peers than any policy statement.

FAQ

What are the biggest challenges facing L&D teams in 2026?

The most commonly cited challenges are skills gaps widening faster than training can address them, difficulty demonstrating business impact beyond completion rates, and keeping learners engaged when time is the primary barrier.

According to Thirst’s State of L&D for SMBs 2026, 64% of teams say leadership now expects proof of learning impact, 58% say they are too busy delivering programmes to think strategically, and 75.6% expect budgets to remain flat. The World Economic Forum projects that 44% of workers’ core skills will change by 2028, which adds urgency to all three challenges.

What are the biggest L&D trends for 2026?

The most significant L&D trends for 2026 include AI agents running training workflows, learning embedded directly into work tools, internal skills marketplaces, dynamic credentials that reflect actual skill use, and a stronger focus on connecting learning programmes to measurable business outcomes rather than completion rates.

How is AI changing learning and development in 2026?

AI is affecting L&D across several areas: intelligent agents handle repetitive training tasks such as onboarding, FAQs and compliance tracking; AI curates personalised learning journeys rather than leaving employees to search large catalogues; AI-powered matching tools extend mentorship to employees who would otherwise not have access; and synthetic peer technology creates realistic practice environments for high-stakes skills.

What is an internal skills marketplace?

An internal skills marketplace is a system that matches employees to projects, stretch assignments, and internal roles based on their skills and development goals.

Rather than defaulting to external recruitment, organisations use AI-powered matching to surface internal talent. This supports mobility, reduces hiring costs, and gives employees access to development opportunities aligned with where they want to grow.

How should L&D teams measure business impact in 2026?

The shift is away from activity metrics (completions, hours logged) and towards outcome metrics.

L&D teams are connecting programmes to KPIs the business already tracks: time to productivity for new hires, attrition in teams with and without structured development, customer satisfaction scores after service training, and revenue influenced by sales enablement. The practical starting point is identifying one or two KPIs before a programme launches and building reporting around them from the outset.

What does learning in the flow of work mean?

Learning in the flow of work means delivering guidance at the point of need, inside the tools employees are already using, rather than requiring them to access a separate learning system.

In 2026, this has evolved beyond embedding links to resources: training is being integrated into CRMs, project management tools, and development environments so that guidance surfaces where and when it is needed.

How can small L&D teams act on 2026 trends without large budgets?

Most of the trends shaping L&D in 2026 can be approached through narrow pilots rather than large-scale programmes.

Small teams tend to see the fastest returns from testing one AI agent for onboarding FAQs, building one role-specific learning playlist, connecting one programme to a single business KPI, or running a single cohort around a high-demand skill. Starting small and measuring something real builds the internal evidence needed to expand investment over time.

Final thoughts

Running through these 11 trends is a consistent direction: learning is moving closer to the work being done, closer to the individual doing it, and closer to the business outcomes that justify the investment.

Not all of them will be equally relevant to every team. A smaller L&D function will find more immediate traction in embedding learning into existing workflows or improving how credentials track actual skill use than in building a full internal skills marketplace from the ground up.

The most practical starting point is to pick one area, run a contained pilot, and measure something real. That evidence, however modest, builds the internal case for the next step.

Got 2 Minutes?

Thirst is the #1 learning platform for SMBs, built to help L&D teams do more with less.

It boosts learner engagement, speeds up onboarding, keeps compliance on track and brings all your learning into one place.

And it does it without adding to your admin load.

Take a quick guided tour today and see how Thirst could support your organisation.


For more e-learning insights, resources and information, discover the Thirst blog.

You may also enjoy:

Top L&D Events to Attend in 2026 | Best 10 LMS for Remote Teams | LMS Business Case: The Ultimate Guide [+ Free Template]
