The analytics side of L&D is definitely tricky. Avoiding it altogether is not an option. Focusing too much on it? I wouldn’t advise that either. The truth is that as you read about and work with data in learning, one thing becomes clear: we work with humans. And one thing we can rely on when it comes to humans is unpredictability. We got used to living with it in training and e-learning; now it’s time to get used to it in learning analytics.
Still, unpredictability is not a reason to stop trying.
Let’s state the why. As I see it, there are three main stakeholders of your learning analytics:
There’s a lot of history when it comes to L&D data and measurement processes, and I recommend you read Learning Analytics: Measurement Innovations to Support Employee Development if you want to explore it in depth.
As much as I’ve studied the frameworks mentioned there, I will let the more experienced people do the talking and focus on what I’ve come across so far.
One of the programs we run as an L&D team is the Onboarding of new employees. When we think about assumptions, we should start with the end in mind. What are we trying to accomplish through this program?
The answer? Retention and productivity, which in the end means lower costs for our organization.
Let’s examine both.
First, you need access to churn rates by employee tenure. If you already have the data, great; if not, start working on those dashboards!
Don’t stop there. Let’s say you discover that your Technology department has a high churn rate for employees with under 6 months of tenure. You still want to know why. To find out, use Exit Interviews and Onboarding Feedback Forms, and gather data from both new hires and hiring managers. Try to ignore your “gut feeling” at this point: that instinct to say “I know what works and what doesn’t”. I’m not saying you don’t know, but data can always prove you wrong.
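To make that first step concrete, here’s a minimal sketch of how early churn by department could be computed. All departments, records, and headcounts below are invented for illustration; your own data will come from your HRIS or dashboards.

```python
# Hypothetical exit records: (department, tenure in months at exit)
exits = [
    ("Technology", 4), ("Technology", 5), ("Technology", 18),
    ("Sales", 3), ("Sales", 24), ("Marketing", 12),
]
# Hypothetical headcount per department
headcount = {"Technology": 40, "Sales": 30, "Marketing": 20}

def early_churn_rate(department, months=6):
    """Share of a department's headcount that left before `months` of tenure."""
    early_exits = sum(
        1 for dept, tenure in exits if dept == department and tenure < months
    )
    return early_exits / headcount[department]

for dept in headcount:
    print(f"{dept}: {early_churn_rate(dept):.1%} early churn")
```

A spreadsheet pivot table does the same job; the point is to slice churn by tenure bucket and department before jumping to conclusions.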
When looking at Exit Interviews, make sure that technical training, understanding the organization, understanding their team’s role, or, even worse, understanding their own role are never among the reasons mentioned. If they are, you’ve got work to do. If both new hires and managers say there was no cultural fit, that’s a recruitment problem and, luckily, not your direct problem.
When examining Onboarding Feedback Results, look for what people need, not what people say would be cool to have. It’s really important how you frame your questions, and I totally recommend The Mom Test for some guidance.
If you run in-class training or webinars for your new hires, that’s another opportunity to dig for some insights. Is the experience appropriate? Are people using what they have learned? Do they feel more connected? Do they better understand your culture? Of course, your questions should be framed according to the end purpose of your training.
In order to calculate this, you first need some strong data on employee productivity. To test the waters, you can start with an easier use case: your Sales or Customer Service department.
First, talk to people to understand the fastest realistic timeline for a new customer service or sales rep to get up to speed. How does that translate into their KPIs? That should be your benchmark.
Start tracking new hires’ KPIs and see if you’re delivering on your promise or not. If not, don’t worry, it’s just feedback to integrate into your onboarding strategy and make it better.
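As a sketch, tracking that promise can be as simple as comparing each new hire’s ramp-up time against the agreed benchmark. The rep names, ramp-up times, and the 4-month benchmark below are all hypothetical:

```python
# Hypothetical data: months each new sales rep took to reach full quota
months_to_full_quota = {"rep_a": 3, "rep_b": 5, "rep_c": 4, "rep_d": 7}

BENCHMARK_MONTHS = 4  # fastest realistic ramp-up, agreed with stakeholders

def ramp_up_report(actuals, benchmark):
    """Return the share of new hires who ramped up within the benchmark,
    plus the list of those who did."""
    on_track = [rep for rep, months in actuals.items() if months <= benchmark]
    return len(on_track) / len(actuals), on_track

share, reps = ramp_up_report(months_to_full_quota, BENCHMARK_MONTHS)
print(f"{share:.0%} of new hires hit quota within {BENCHMARK_MONTHS} months: {reps}")
```

Run quarterly, a number like this tells you whether the onboarding program is delivering or needs rework.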
Starting from small to big, some Onboarding KPIs might be:
The assumption here is that your learning strategy helps employees easily transition into other roles. Why is this important?
I have yet to go through the experience of correlating engagement with productivity, so I will skip the first reason this time.
The first step in this journey is to actually calculate a recruitment cost. There are many articles talking about this or you can think about your own formula if you don’t already have one in place.
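One widely used starting point is the cost-per-hire formula popularized by SHRM: total internal plus external recruiting costs, divided by the number of hires in the period. A minimal sketch, with invented figures:

```python
def cost_per_hire(internal_costs, external_costs, hires):
    """Average cost of one hire over a given period.
    internal_costs: recruiter time, referral bonuses, etc.
    external_costs: agency fees, job ads, assessment tools, etc."""
    return (internal_costs + external_costs) / hires

# Hypothetical quarter: 30k internal + 12k external spend for 14 hires
print(cost_per_hire(internal_costs=30_000, external_costs=12_000, hires=14))
```

What you put into each bucket matters more than the arithmetic, so agree on the inputs with your recruitment team before comparing numbers over time.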
As you gain experience looking at the KPIs behind a recruitment cost, you’ll start seeing that:
So if there’s a talent pipeline in your organization, don’t be shy about exploring it. And beyond exploring it, track the number of internal transfers and recruitment costs as KPIs. Proving that people moving from one role to another lowers costs can give you leverage to build all kinds of interesting programs.
Still, be aware: while people move to the next steps in their careers inside your company, they leave open spots behind. And that’s a whole other recruitment cost. But they also leave a trail of what-ifs behind. What if they had gotten bored staying in the same role and left? What if they had felt unchallenged and unproductive?
If you want to go full data geek, you can even measure those things by looking at the average time spent in a role, or peak productivity in a role, based on historical data.
But if you are just getting started, I definitely recommend starting with the number of internal transfers and recruitment costs. You can complicate your life later.
The easiest way to test this assumption is to look at people who started in the same role in your company. How long did those with no career move stay with you, versus those with one or more career moves?
Of course, in order to squeeze the truth from the data for this use case you need a lot of data.
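With that caveat, the comparison itself is straightforward. Here’s a sketch over hypothetical tenure records (real data would need far more employees, and ideally people who started in the same role and period):

```python
# Hypothetical records: (employee, number of career moves, tenure in months)
employees = [
    ("a", 0, 14), ("b", 0, 9), ("c", 1, 30),
    ("d", 2, 44), ("e", 0, 12), ("f", 1, 26),
]

def avg_tenure(min_moves, max_moves):
    """Average tenure for employees whose career-move count is in the range."""
    tenures = [t for _, moves, t in employees if min_moves <= moves <= max_moves]
    return sum(tenures) / len(tenures)

no_moves = avg_tenure(0, 0)     # stayed in their first role
with_moves = avg_tenure(1, 99)  # one or more internal transfers
print(f"No moves: {no_moves:.1f} months; with moves: {with_moves:.1f} months")
```

A gap between the two averages is a hint, not proof; people who move internally may differ from those who don’t in other ways too.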
Ok, so you now know some KPIs to look at when it comes to Career Development. But we are just getting started. There are other questions we can ask:
There are more, but I’ll stop at these for now.
Well, start tracking! Look at historical data, put a process in place that captures those transitions to gather even more data.
Of course, you don’t want to retain unproductive employees. But for the sake of argument, we’ll consider you have a top-notch performance management system in place which doesn’t allow that to happen.
The follow-up question to ask is: after how long did employees voluntarily leave the company, citing a lack of career development opportunities as a top reason?
Start with the end in mind. Look at those who mentioned it. For how long were they employed within your organization?
Can you spot a pattern? Good. Go back and track those with similar profiles (same role, department, team, tenure) and even ask them if they’re thinking about switching jobs to confirm your theory. No matter what they tell you, don’t forget this is just their perception at the moment. It doesn’t mean you can’t start suggesting available roles.
The first step in discovering this is having a track record of the resources each of your employees accessed. In-class training, webinars, conferences, e-learning, mentorship, coaching, 360 feedback, book clubs, and whatnot.
Then match people, learning resources, and career paths. Look again for patterns and write down assumptions.
This is one of the easiest assumptions you can test since you are the one in charge of the learning resources. Of course, the testing phase will not take one day or one week, it will probably take at least a year, a year and a half. But even if you leave the company, that’s a hell of a legacy to leave behind.
In Romania, we call monsters hiding under beds “bau-bau”. This is L&D’s “bau-bau”. The main reason is the complexity in finding a definite correlation between learning programs and business results.
Because you have to:
This is where the unpredictability I was talking about comes most into play. I have to admit I’m still learning to cope with it. The sanest way to do so was experimenting.
Let’s say there’s one business KPI you know should be improved. As I mentioned before, your first step is to understand what drives performance on that specific KPI. After talking to most of your stakeholders, list 5-6 performance drivers and identify one you could impact with a learning program (which doesn’t necessarily have to be in-class training).
While you’re preparing the learning experience, there’s one other thing to add to your list: target profiling. Look into as much data as you have available (demographics, engagement scores, manager scores, KPI results, past performance reviews, 360 feedback, team composition), and find people who are as similar as two peas in a pod. I suggest putting all the data you gather in one place.
Split the audience into two groups, A and B. Group A will benefit from the experience you’re designing; group B won’t. Your assumptions are that the program is well designed and that, by participating in it, group A’s performance will go up.
After the program ends, add some new data to the one already gathered – program feedback and performance results.
Right now, there are three possible scenarios for group A’s performance: it went up, it stayed flat, or it went down.
We will assume that scenarios two and three prove that the learning program is either poorly designed or doesn’t serve a real need.
But let’s look closer at scenario 1 – the performance of group A did actually go up. Before testing your assumption again with group B, I would add one more step. Ask the participants which were the factors they think improved their performance. Let them choose between various options, your program being one of them.
If they confirm it, the chances of your learning program being a success are even higher. If there’s another factor coming up as a reason for success, investigate other sources. A good one would be talking to their manager, who should already be on board.
Still, it’s not yet the time to pull the plug and go home. Repeat the same experiment with group B and look at the results to see if your program is actually the one moving the needle. Don’t be afraid to shut it down when data proves it’s not.
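A minimal sketch of that before/after comparison, assuming hypothetical KPI scores for both groups (a real analysis would add a significance test rather than eyeballing the averages):

```python
from statistics import mean

# Hypothetical KPI scores before and after the program window
group_a = {"before": [62, 70, 58, 66], "after": [71, 78, 64, 74]}  # participated
group_b = {"before": [64, 61, 69, 60], "after": [65, 62, 70, 60]}  # did not

def avg_lift(group):
    """Average per-person KPI change from before to after the program window."""
    return mean(a - b for b, a in zip(group["before"], group["after"]))

# A clearly larger lift in group A than in group B is the signal you're after.
print(f"Group A lift: {avg_lift(group_a):+.2f}, Group B lift: {avg_lift(group_b):+.2f}")
```

Measuring per-person change, rather than comparing raw group averages, helps filter out pre-existing differences between the two groups.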
A PA team or a Learning Detective can bring three things to the table: the mindset, the skills, and the time, in this particular order. If you don’t care about your program’s impact or don’t know why learning analytics is important, there’s no point in having the skills. If you understand the “why” and have the skills but don’t have the time to put in the work, you’ll be frustrated.
The truth is that lack of skills and time is often an excuse not to dig deeper into learning analytics. We have so much on our plates already: needs to analyze, people to talk to, programs to design. If I were you, I would still squeeze in some measurement, probably by automating some of my work.
Truth be told, we can’t and should not know everything. So if you want some high-quality learning analytics, you should involve your PA team. If your organization doesn’t employ one, maybe next on your payroll should be a Learning Detective.
Before setting your own KPIs, you should start by understanding your business KPIs. What’s your organization’s strategy? Are you trying to scale fast and gain market share? Is it time to become profitable? Are you adding new revenue streams or cutting some costs?
The answers to these questions actually turn into performance KPIs for each individual in your company. And your KPIs should be aligned with them, so you can actually drive business impact.
As in many other industries, manual measurement is like walking, while using tech is like taking a plane. Manual learning analytics can get you a few kilometers down the road, but tech is what actually lets you fly.
If you’re just starting, you will probably spend some time manually cleaning your data, which is one of the most important steps: if garbage goes in, garbage will come out on the other end. Even some simple dashboards can be built manually. Starting to prove results will probably help you build a business case for the adults’ league: investing in tech.
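For illustration, even a few lines of code can handle the most common cleaning steps, such as normalizing labels and dropping duplicate learners. The records below are invented; a real LMS export will have its own quirks:

```python
# Hypothetical raw LMS export with a duplicate learner and messy labels
raw = [
    {"email": "ana@acme.com", "dept": "technology "},
    {"email": "ANA@acme.com", "dept": "Technology"},
    {"email": "bo@acme.com", "dept": "sales"},
]

def clean(records):
    """Normalize emails and department labels, dropping duplicate learners."""
    seen, out = set(), []
    for r in records:
        email = r["email"].strip().lower()
        if email in seen:  # same person exported twice
            continue
        seen.add(email)
        out.append({"email": email, "dept": r["dept"].strip().title()})
    return out

print(clean(raw))
```

Whether you do this in Python, a spreadsheet, or your BI tool matters less than doing it consistently before any dashboard is built.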
Because if you want to get real and scale, there’s no other way than using strong tech.
I probably tackled a small percentage of the stories about learning analytics, mostly because I still have a lot to learn as well. But if by the end of this article I have convinced you to start working with some of the various KPIs I mentioned, I’m quite happy. Although this might not be the reason why we joined the industry, being aware of all the data that surrounds us can only help us, our learners, and the business.
There’s a continuous conversation about learning analytics and moving from learning to performance in the global L&D community. It brings me a lot of joy. Hopefully, more of us will be experimenting and together we’ll find a way to measure what right now seems unmeasurable. This will make me even happier.
Lavinia Mehedintu has been designing learning experiences and career development programs for the past 9 years both in the corporate world and in higher education. As a Co-Founder and Learning Architect @Offbeat she’s applying adult learning principles so that learning & people professionals can connect, collaborate, and grow. She’s passionate about social learning, behavior change, and technology and constantly puts in the work to bring these three together to drive innovation in the learning & development space.