So far in 2023, organisations of all sizes have spent, quite literally, millions of hours and pounds producing gigabytes' worth of learning 'data'. In this post, I'm going to shine a light on how you, or your team, could be wasting so much time and money on reports specifically related to learning. As an added bonus, I'll even share some ways you can recoup some of this waste, quickly!
Learning reporting - what's it all about?
The most common reason given for reporting in a Learning function, or by learning suppliers, is the holy grail that is 'Return on Investment'. That is, with Learning and Development functions (or whatever name it has in your organisation) often being mistakenly viewed as a 'cost' by the business, there's a need to demonstrate that as a result of X work by the L&D team, Y has happened, which has had Z impact on revenue.
One of the most common frameworks people use to conduct, or simply justify, their learning reporting activity is the Kirkpatrick model of evaluation. If you're unfamiliar, the Kirkpatrick model outlines an effective framework for understanding the Reaction people have to their learning experience, the Learning people have achieved as a result of the experience, the impact on learner Behaviour, and the Results of the training for your organisation.
The sad reality is that the vast majority of learning reporting focuses on Reaction and Learning, and a significant proportion of reporting on Behaviour and Results falls foul of a few common pitfalls. So let's take a look.
The biggest pitfalls of learning reporting...
Number one - Gathering data for the sake of gathering data
I would bet my house on the fact that most reporting done in Learning teams across the world is conducted 'just because'. Maybe someone has set an arbitrary 'target' for learning attendance or completion that doesn't actually mean anything, and now we're stuck in the perpetual loop of reporting and tracking against this target, spending time and money on a pointless task.
I'd take an educated guess that most of the reports completed 'just because' are done so because 'Hey! You're learning. And there's the Kirkpatrick Model' so you should report on things. Right? So, erm. Yeah. Let's report...
My point here is this - if, beyond reporting on 'things', you're not actually doing anything with the data - like changing and improving the 'things', then you are literally just compiling data and producing reports for the sake of it. You're being rather foolish.
Number two - Reporting in isolation
Another very common pitfall of Learning reporting is that people will either just do a single report and leave it there - no analysis, nothing. Or sometimes there will be different people running reports using similar data, to answer similar questions, and they're not connecting the dots - it's all done in silos.
This kind of isolation is a massive waste of time and money. Not only are you duplicating tasks, you're also missing some great opportunities to collaborate, analyse, and make improvements.
If you don't know whether you're reporting in isolation, here's a quick test:
Check whether there are multiple people running reports on your Learning data. If they are, but they don't collaborate, or you don't see anything beyond a dashboard with no analysis, then you're reporting in isolation.
Number three - Poor data
Not all data is good data. Or useful data. And this can be dangerous!
To glean meaningful insights from any reporting you need to have the right data. And you can only get the right data if you know what questions you're trying to answer.
The reality is that if you don't even know what questions you're trying to answer, or know why you're trying to answer them, then you're probably just grabbing anything and everything from the available data and trying to draw conclusions from things that may not even be related, or useful.
Why is it dangerous? Imagine spending £50,000 delivering a 'Leadership' course to your frontline managers, and running attendance/completion reports. Or maybe checking 'assessment' scores, and 'learner surveys'. In this data you might find that 100% of managers attended the training, and they all scored 100% on the 'assessment' and all said the course was 'engaging'.
This might give the impression of £50,000 well spent. So let's do it for all new managers? Right?
But maybe the attendance wasn't tracked on the day, and it was assumed that everyone turned up? Or what if the 'assessment' was a three-question knowledge check about the final part of the course? And maybe, on the 'course feedback' form, the only options were 'Boring' or 'Engaging'?
Okay, I know this is an extreme example. I've used it to make a point (not all data is good data, or useful data, and this is dangerous!). In this case, it could have led to a recurring waste of £50,000!
Number four - Starting too late
All projects start with the best intentions to follow 'best practice', but so often they're derailed rather quickly, with the design and deployment of learning becoming the sole priority. Then, a week or two before (or after) launch, someone asks how you're going to track 'X'.
By this point you're already too late to start any meaningful reporting. Again, I'll refer back to the Kirkpatrick model, for ease:
What is your desired 'Reaction' to the learning? How successful was your learning at delivering this?
What do you want people to have learned as a result of the experience? How successful was your learning at delivering this?
What behaviours would you like to see as a result of the learning? Have learners begun to exhibit these behaviours after completing the learning?
What business results would you like to see as a result of the learning?
The answers to most of these questions should be clear if you've conducted your Learning/Training Needs Analysis properly - and if they aren't, you didn't!
So, what if you don't have these questions? Well, what on earth are you using to guide the development of your learning content/experience? You can't do it. Not properly. And if you're looking to understand the impact of your learning, then how do you know your baseline? You can't measure it after the fact. Not accurately.
How can you stop wasting time and money? And even recoup some of it?
I'd be doing you a disservice if I didn't share some techniques I've used to radically reduce the amount of time and money my clients spend on pointless learning reports. So let's get to it:
First - know why you're capturing the data and producing reports.
Okay. You know, and I know, there will be times when your boss, or your boss's boss, tells you to capture specific data 'because'. And this means there will be times you consciously capture data for no reason other than 'just because'. In those cases, knowing the 'why' isn't necessarily important. But...
If you're doing a 'just because' report, this is an opportunity to ask 'why'. And if you know that the data you're being asked to use in your report won't fairly answer the questions someone is hoping to answer, then you're in a position to recommend better data.
If, on the other hand, you're not doing it 'just because', then you must know why you're capturing it. Getting to the why is pretty easy. Just ask yourself, 'What will the impact be of not delivering this data/report?'. If the answer is 'Nothing', then you're quite literally wasting time and money, so just stop. Now.
Second - start connecting the dots
This will single-handedly level up the impact of your data and reporting in a matter of minutes. You could even do it before you finish reading this post, with data you already have.
Meaningful learning reporting is all about analysis. Don't just collate data and share it. Analyse it. Ask it questions. Find the patterns. Here's a live example of how I'm doing this, today.
Recently I ran a report for my global client, who wanted to compare learning consumption on their LXP for 2021 and 2022, covering all content and all employees. It was to be added to the annual People Report. That was it. No other reason. Just 'because'. I felt it was another one of those 'Just F-in Do It' requests.
When I captured the data I was able to see who had used the platform. Who had then gone on to access and complete online courses. Who had achieved badges and certifications. Who were repeat learners. Etc.
Being someone who practices what he preaches, I then started to ask questions and connect the dots. I split the data based on content topic, modality, business unit, and geography.
I noticed that there had been a significant shift in the topics people were accessing, and that people had shifted from things like videos and podcasts towards books and toolkits. I also noticed a larger volume of usage in certain business units and locations.
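(By the way, if you fancy trying this kind of dot-connecting yourself, here's a minimal sketch of what it can look like. It assumes a hypothetical CSV export from your LXP with columns like learner_id, business_unit, region, topic, modality and completed_at - your platform's export will almost certainly look different, so treat it as an illustration rather than a recipe.)

```python
import pandas as pd

# Hypothetical LXP export - file name and column names are illustrative only
df = pd.read_csv("lxp_consumption_export.csv", parse_dates=["completed_at"])
df["year"] = df["completed_at"].dt.year

# Compare consumption by topic across 2021 and 2022 to spot shifts
topic_shift = df.pivot_table(
    index="topic", columns="year", values="learner_id", aggfunc="count"
).fillna(0)
topic_shift["change"] = topic_shift[2022] - topic_shift[2021]
print(topic_shift.sort_values("change", ascending=False))

# The same split by modality (videos, podcasts, books, toolkits...)
modality_shift = df.pivot_table(
    index="modality", columns="year", values="learner_id", aggfunc="count"
).fillna(0)
print(modality_shift)

# And by business unit and region, to see where the usage actually sits
usage_by_unit = (
    df.groupby(["business_unit", "region", "year"])["learner_id"]
    .nunique()
    .unstack("year")
    .fillna(0)
)
print(usage_by_unit)
```

I've used pandas here purely because it's the quickest way I know to slice the same data several different ways; a pivot table in Excel would get you to exactly the same place.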
This didn't tell me anything about the success, or value, of the learning. But it did tell me about my client's employee behaviour. The next step was to ask why. And what does this mean?
As we speak, I'm in the midst of defining a new data set to cross-reference, so I can dig further into the employee behaviour and answer questions like the ones below (there's a rough sketch of that cross-referencing just after them):
Have learners focused on new topics because it's what they need, what they want, or because they've been told to?
Have learners moved away from audio/video because of the content quality? Or do they want practical tools they can use when they need them?
Have different business units/locations promoted specific content? Or do they have a strong learning culture?
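For what it's worth, here's an equally rough, hypothetical sketch of that cross-referencing step. The second data set - a made-up log of which content was mandated or promoted, and where - is purely illustrative; the point is simply that joining the two lets you start separating 'told to' from 'chose to'.

```python
import pandas as pd

# Hypothetical second data set: which content was mandated or actively promoted, and where.
# The file and its columns (topic, business_unit, promotion_type) are made up for illustration.
promotions = pd.read_csv("content_promotion_log.csv")

# The consumption data from the earlier sketch
df = pd.read_csv("lxp_consumption_export.csv", parse_dates=["completed_at"])
df["year"] = df["completed_at"].dt.year

# Join consumption against the promotion log; anything unmatched was self-directed
merged = df.merge(promotions, on=["topic", "business_unit"], how="left")
merged["driver"] = merged["promotion_type"].fillna("self-directed")

# How much of the 2022 uptake in each topic was mandated, promoted, or self-directed?
uptake_by_driver = (
    merged[merged["year"] == 2022]
    .groupby(["topic", "driver"])["learner_id"]
    .nunique()
    .unstack("driver")
    .fillna(0)
)
print(uptake_by_driver)
```

None of this answers the 'why' on its own, but it does narrow down the questions worth taking back to the business.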
I'm doing this because, as I outlined to my client, understanding it will help build an objective view of what learners need and want from their digital learning experiences, and can help inform the evolution of the Learning Strategy I'm working on for them.
Ultimately, right now, I may be connecting dots that are unrelated, hence the deeper dive. But hey - if I hadn't asked the questions and connected the dots, my client may never have realised that, despite having high uptake of some expensive content, it was predominantly in only one business unit, and the rest of the business was just ignoring it...
Third - Learning/Training Needs Analysis
I need to drop a mention to Matt Ash (Founder & Consultant at Changeably).
Literally, as I was writing this post, I happened across a LinkedIn post of his that touched on this very point. To steal a line from him: "Focus on identifying the gap, defined by behaviours & metrics. 'We want to be here, but this is where we are right now.'"
Matt's post was about ROI - a common 'reason' for reporting, so similar in theme to my post here.
My key point is this: if you're just grabbing data and trying to do something with it, you can easily come to conclusions that are incorrect - see my example above - and it may amount to nothing (although I already know it won't, because my client was lucky enough to have someone looking at their data who knew what they were doing!). More importantly, if you're reporting on anything to do with Learning, the 'Gap' is essential. Get your baseline. Understand today. And determine the problem your learning was trying to solve. THEN you can define, using data, how things have shifted. This means you need to do your Learning/Training Needs Analysis. No excuses, get it done. And then use this as your baseline.
In summary
Don't waste time and money generating learning reports 'just because'. Know what you're doing, and why. And when you finally start using learning data correctly, you'll find that you not only save time and money, but you can also make better decisions, improving future outcomes for you, your employees, and your organisation!
About the Author - And Apples Performance and Learning
This is where you usually get the obligatory profile, and pitch - feel free to skip it, but I'd rather you didn't. If you've read one of my posts before, this next bit will be familiar :).
Hi. I'm Andy. I'd rather not bore you with a profile, when you can find out what you need here. So take a look, or don't - it's quite literally up to you.
Also - I could give you a really slick 'sales pitch' about how Apples Performance and Learning can help you navigate through the minefield that is Performance Improvement. But I won't. To be frank, this post is already too long, and I'm bored of my own 'voice'. If you want to know how we can help you - check this out, or this.
Beyond that, if you've made it this far, thank you for sticking with me. Hopefully you agree with my points, and find some value in them. If not, please feel free to correct my perspectives - I won't learn anything new otherwise! andy@aplconsulting.co.uk