Making Sense of Data: Why L&D’s 2026 Challenge Isn’t Measurement—It’s Meaning

When “Data-Driven” Stops Driving Progress

Walk into any learning or HR department today, and you’ll hear the same confident refrain: We’re data-driven.
We say it in boardrooms, we print it in strategy decks, we use it to justify budget requests.
And in some ways, it’s true—we’ve never had more data at our fingertips. Every LMS, survey tool, and AI platform boasts another dashboard promising to “unlock performance insights.”

But here’s the question that should stop us cold: Are we actually learning anything from all of this?

Because for all our analytics and reports, many leaders still can’t answer the simplest, most essential question:
Do we understand what drives performance—and what gets in its way?

That’s not a data problem. That’s a sense-making problem.

The False Comfort of More Data

There’s a certain satisfaction in being able to say, “We’re measuring everything.”
Completion rates. Test scores. Clicks. Time spent. Engagement heatmaps. It all looks impressive—clean numbers marching neatly across a screen.

The trouble is, these metrics often serve as comfort blankets more than catalysts for insight. They make us feel productive. They fill the void when the real story is too complex to capture in a chart.

The truth is, data abundance often disguises the absence of understanding.
We’ve built beautiful dashboards that describe what happened, but rarely explain why it happened.

It’s like standing in front of a mirror and mistaking the reflection for the person.
We’ve confused the activity of measurement with the art of meaning.

For learning leaders, this creates a dangerous illusion. We believe we’re making progress because the numbers are moving, but those numbers often track surface activity, not performance change.
And when “success” is defined by completions instead of capability, we shouldn’t be surprised when behavior doesn’t shift.

The Confidence Crisis: When More Data Creates More Doubt

Ironically, our growing sophistication with data has produced a quiet crisis of confidence.

Talk to any senior leader and you’ll hear it: “We have all this information, but I still don’t know what it’s telling me.”
The more reports we generate, the murkier the picture seems to become.

Even the most data-savvy organizations—those with teams fluent in Power BI or Tableau—struggle to connect the dots between learning activity and real performance outcomes.
We’re left with beautiful visualizations and shallow insights.

And when data stops clarifying and starts confusing, leaders retreat into instinct. They trust their gut instead of their graphs. They look to AI tools or consultants to make sense of the noise.

It’s not that our data is wrong; it’s that it’s incomplete. It tells us what people did, but not what they became capable of doing because of it.

And so, confidence erodes. Not because we don’t believe in data, but because deep down, we know it’s not yet telling us the truth that matters.

The Reflection Reset: Listening Before Measuring

When I first stepped into my role at Kirkpatrick Partners, I was surrounded by information. Reports, certifications, surveys: a sea of data points.
But despite all that visibility, I couldn’t see what mattered most. Were our certifications actually changing anything for our participants? Were they helping people perform better?

It was humbling to realize we didn’t have those answers.

So I did what data couldn’t: I started talking to people.
To facilitators. To affiliates. To customers—past and present.
And through those conversations, I began to notice patterns that no spreadsheet could reveal.

Stories of transformation. Barriers that never showed up in survey results. Insights about what truly made learning stick, or why it didn’t.

That process took months. It was slow, sometimes uncomfortable, but completely necessary.
Because what we discovered through reflection reshaped everything: our programs, our technology, even how we defined success.

Reflection, not reporting, gave us the clarity that metrics alone couldn’t.

If data tells us what’s visible, reflection helps us understand what’s valuable.

The Real Purpose of Evaluation: Learning, Not Proving

The Kirkpatrick Model was never meant to be an administrative exercise. Don Kirkpatrick didn’t invent it to fill dashboards; he designed it to answer a deeply practical question: What’s working, what’s not, and why?

Somewhere along the way, we turned evaluation into a compliance task.
We started counting attendance, surveying satisfaction, and calling it impact.

But true evaluation is not about proving. It’s about improving.
It’s a conversation between intention and outcome.

When we approach evaluation as learning, we give ourselves permission to ask better questions—ones that explore the messy, interconnected ecosystem that drives performance.

  • What factors truly influence success in this environment?
  • How do our systems and culture support or undermine what people learn?
  • What would we change next time based on what we’ve discovered?

That’s the real heart of evaluation, and it’s the kind of thinking our organizations desperately need right now.

💡 Quick Tip:
Before you build another dashboard, ask yourself:
What decision will this data help me make?
If you can’t answer that question clearly, you don’t need more data; you need a sharper purpose.

The New Skill for 2026: Sense-Making

Here’s the hard truth: the data wave isn’t slowing down. AI will only accelerate it.
Soon, we’ll have real-time analytics for everything—microlearning engagement, peer collaboration, even emotional sentiment.

But more data won’t save us from confusion. The leaders who thrive in this next era won’t be those who collect the most information. They’ll be those who can make meaning from what they have.

Sense-making is the strategic superpower of 2026.
It requires curiosity. Courage. And a willingness to pause long enough to connect the dots before chasing the next shiny metric.

Instead of measuring for measurement’s sake, we must learn to measure for movement—for the story of growth behind the numbers.

From Metrics to Meaning

The future of L&D doesn’t belong to those with the biggest dashboards—it belongs to those who can see the story behind them.

As we step into 2026, the invitation is clear:
Stop measuring more. Start measuring better.

Ask better questions. Reflect more deeply.
And remember that the goal of evaluation was never to count—it was to learn.

When we embrace that, data becomes what it was always meant to be: a mirror that helps us see ourselves, and our organizations, more clearly.

Ready to Redefine Measurement?

If this message resonates with you, there are two ways to take it further:

🎧 Listen to the full conversation:
Catch this episode of The Kirkpatrick Podcast on YouTube or wherever you get your audio podcasts.

🤝 Join the Kirkpatrick Collective:
Be part of a community of leaders transforming how organizations evaluate, learn, and grow.