Pulling up Aces (or 5 Key Areas that Learning Metrics Must Track)


Introducing two exhibits

 

Exhibit 1:

 

You are playing poker.
You’ve been dealt a decent hand and you are pretty good at counting cards. But you are up against a player you’re unfamiliar with.

 

The game progresses. You are keeping tabs on the cards played. It looks like you’re winning. It’s time for a showdown. You have a full house, which is good. But your opponent has a straight flush, which is better. You lose.

 

That’s strange. You thought you had counted the cards, but something didn’t quite add up.

 

 

Exhibit 2:

 

You have a business.
You need a group of people to perform well to keep that business going.

 

You figure out a strategy. As part of it, you devise some training and administer it to your team. They take the training. They pass the training. Nothing improves, or something improves slightly, but only for a small group of people. Either way, the story doesn’t change much.

 

That’s strange. The metrics tell you that everyone completed the training and everyone passed. But something doesn’t add up.

 

So, what do these two exhibits have in common? Training isn’t a card game. But it can be as much a matter of chance and risk. Sometimes you can calculate the probabilities accurately and win. And sometimes, someone may just call your bluff. The 100% completion rates and 90+% pass percentages are not aligning with your performance indicators. And you don’t know why.

 

Areas that an LMS should track

 

It’s said of good poker players that they don’t play the odds, they play the person. Something similar can be said of setting up good metrics for learning: they measure the learner’s progress, not the training program.

 

Typically, in a work environment, it may help to track 5 areas of a learner’s journey from not knowing to knowing to applying well when needed.

 

The first four areas are based on the conscious competence model of learning (the four stages of competence), and the fifth area is the glue that holds it all together.

 

 

 

 

1) Unconscious Incompetence

 

When your team members are in this group, they do not understand or know how to do something. Importantly, they may not see the need to. This could be your experienced salespeople not knowing they need to brush up their writing skills. Or your technical professionals not learning negotiation skills. Or personnel not being aware of non-compliance with policies.

 

Here are some metrics that your LMS can help you with:

 

Self-assessment quizzes: These could be administered over time and across roles and skill areas. With substantial data, you will be able to locate areas where people do not see the relevance of training, or may not even understand the full requirements of their role. If a Design Head sees no point in understanding budgeting and capacity planning, this could be a business problem.

 

Course attendance and average time to completion: These metrics don’t measure awareness directly, but they can be investigated to understand why people do not take the time to complete a course. It could be the way the course is set up (maybe it is a mammoth 60-minute simulation or a 2-day VILT), or people simply don’t see the relevance.
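To make these two metrics concrete, here is a minimal sketch of how they could be computed from an LMS export. The field names (learner, course, completed, minutes) and the sample rows are assumptions for illustration; map them to whatever your LMS actually exports.

```python
# Sketch: completion rate and average time-to-completion per course,
# from a hypothetical LMS export. All field names and data are illustrative.
from collections import defaultdict

records = [
    {"learner": "A", "course": "writing-101", "completed": True,  "minutes": 42},
    {"learner": "B", "course": "writing-101", "completed": False, "minutes": 10},
    {"learner": "C", "course": "writing-101", "completed": True,  "minutes": 55},
    {"learner": "A", "course": "negotiation", "completed": False, "minutes": 5},
    {"learner": "B", "course": "negotiation", "completed": False, "minutes": 8},
]

# Group attendance records by course.
by_course = defaultdict(list)
for r in records:
    by_course[r["course"]].append(r)

for course, rows in by_course.items():
    done = [r for r in rows if r["completed"]]
    rate = len(done) / len(rows)
    avg = sum(r["minutes"] for r in done) / len(done) if done else None
    # A low completion rate flags the course; conversations explain the 'why'
    # (too long, badly set up, or perceived as irrelevant).
    print(f"{course}: completion {rate:.0%}, avg minutes to complete: {avg}")
```

The numbers only tell you which courses to look into; they cannot tell you whether the cause is course design or perceived irrelevance.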

 

2) Conscious Incompetence

 

When your team members are in this stage, they are aware that they need upskilling or reskilling. They know that they need training. We can reasonably assume that this is the group most receptive to training. Of course, the training has to be very targeted and relevant.

 

These metrics can help assess whether your team is getting the training they need:

 

Course attendance and completion rates: High numbers here will generally indicate that your team is getting the training they perceive as relevant.

 

Engagement with the course: To check whether your team is actually participating in the training, look at whether assignments are completed and whether the team engages in peer discussions or reviews. Data on how many people did this, and over what time span, will help you gauge real participation.

 

Course completion rate and average time to completion: If the team is not completing the training, or is taking a long time to finish it, something may be amiss in the learning material itself or in the ecosystem in which the training is administered.

 

Pass/fail data: If the training has robust assessments, this metric is a clear indicator of whether the skill gap is closing.
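As a quick illustration, pass/fail results per assessment cycle can be turned into a trend. The quarterly grouping and the result values below are hypothetical, not a real LMS schema:

```python
# Sketch: is the skill gap closing? Compute pass rate per assessment cycle.
# The quarters and results below are made-up illustrative data.
attempts = {
    "Q1": ["fail", "fail", "pass", "fail"],
    "Q2": ["pass", "fail", "pass", "pass"],
    "Q3": ["pass", "pass", "pass", "pass"],
}

pass_rates = {
    quarter: results.count("pass") / len(results)
    for quarter, results in attempts.items()
}

for quarter, rate in pass_rates.items():
    print(f"{quarter}: pass rate {rate:.0%}")

# A steadily rising pass rate suggests the gap is closing; a flat or falling
# one suggests the training (or the assessment) needs a second look.
rates = list(pass_rates.values())
rising = all(a <= b for a, b in zip(rates, rates[1:]))
```

The useful signal is the direction of the trend across cycles, not any single pass percentage.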

 

3) Conscious Competence

 

Team members in this stage know how to do something. However, applying the knowledge to a task and demonstrating a skill takes time. For example, a Customer Service representative knows how to use the new diagnostic software to troubleshoot a customer issue, but it takes her 15 minutes to close a query. If a company is looking to improve the productivity of its team, it will usually find its members in this stage.

 

Here, the team will benefit if the training program and practice sessions are broken down into smaller skill sets. This way, the team can build up to ease and proficiency in the larger skill set by mastering the smaller ones.

 

These metrics can help assess whether team proficiency is being enhanced:

 

Course access and completion rates: If practice segments or simulations are part of the training, details on how the team has fared on these sessions are good indicators.

 

Access to job support materials: Details on how often performance support materials have been accessed or updated indicate how relevant these artefacts are.

 

Assessment pass/fail rates: This is perhaps the most direct indicator of whether evolving levels of expected performance are being realized.

 

4) Unconscious Competence

 

This can be considered the final stage of skill competence. In this stage, the team members have fully internalized the skill. It is ‘in their bones’, so to speak. They can perform the task easily, correctly, and with minimal effort. At this point, the team is ready to teach it to others.

 

Team members in this category really do not need training on that particular skill set. So standard LMS metrics of course completion, access, etc. may not apply here. You may consider other factors such as whether the team member actually contributed to the learning ecosystem. Was any training or job support material developed by the expert team members? Did they create any learning assets?

 

5) Business and Learning Ecosystem

 

No training is created, delivered, or applied in a vacuum. Although an LMS can capture much of the analytics related to what, when, how, and how much, it will stop short of explaining ‘why’. Do employees resist training because they don’t want to be regarded as incompetent? Was a game testing Excel-based skills successful because it helped people do their job 15% faster?

 

These are issues that can be explored through conversation, stakeholder inclusion, and frequent revisiting of the design of learning experiences. An exit interview could tell you whether the person received enough training to feel supported in the job. Water-cooler conversations could indicate whether you want to supplement digital learning with a workshop. And, of course, any number of bottom-line-centric conversations can point to what the next piece of training should be.

 

The progression from analytics to performance is seldom straight and simple. But it exists if we know where to look and what to look for.
