At number three in this year’s Global Sentiment Survey, learning analytics remains a key focus for learning and development in 2021. The importance of data is nothing new or cutting-edge. Data has long played a fundamental role in demonstrating to business leaders the return on investment of workplace learning. Without data or evidence, L&D would struggle to prove the impact learning has had, making it difficult to gain the necessary buy-in for future initiatives.
However, while L&D has traditionally excelled at creating and delivering resources, a recent LPI survey revealed that many learning professionals still lack the skills needed to measure how those resources are performing. What’s more, a study by Towards Maturity found that 51% of L&D professionals say they cannot use data effectively because their teams lack in-house data skills.
Does L&D have a data problem? If so, what are the common challenges?
The challenges of data analysis in L&D
Are people learning the right skills to stay relevant in the changing world of work? Are they able to apply these skills effectively on the job and perform to the expected standard? Do you know how to link learning data to performance in line with organisational goals?
Knowing what data to collect, how to collect it, and what to do with these data insights can seem like a daunting task without the confidence or skills needed to do it effectively.
“How do I know what learning data I need?”
Gathering the right data is key to informing your decisions on what the most appropriate actions are when it comes to learning, or indeed clarifying whether learning is really the right solution at all. Krystyna Gadd, author of ‘How Not to Waste Your Money On Training’, suggests that you start by asking yourself: ‘Why bother collecting or analysing data?’
Is it to improve skills? Communicate value? Validate decisions? Or is it simply to check in and make sure things are going to plan?
We need to begin with the organisation itself. By moving closer to the business, L&D becomes more integrated with its objectives and can develop solutions that are in alignment from the outset, as opposed to taking a reactive approach. Remember, different stakeholders will have different priorities. This is where stakeholder analysis is key.
What are the primary goals and objectives of the organisation? Have you ensured stakeholder involvement and visibility? What sort of data will help in demonstrating progress associated with these particular goals? In order to make sure you’re answering the right questions, it’s worth considering the four types of data that will help improve decision-making. These are:
- Descriptive analytics – what has happened?
- Diagnostic analytics – why has it happened?
- Predictive analytics – what is likely to happen?
- Prescriptive analytics – what action should be taken next?
These four categories paint the full picture of how learning has – or has not – delivered on achieving its core aims. Too often we fall into the trap of simply describing the data (descriptive analytics) without the context afforded by the various other types.
“How can I gather the right data?”
L&D professionals are increasingly seeing the value beyond ‘happy sheets’. Positive learner feedback is meaningless if it cannot be linked to actual goals. For example, it’s of little value to know that learners found a training session fun if the goal is to evidence performance improvement.
There’s no ‘one-size-fits-all’ approach, meaning the metrics you choose will vary depending on your goals. If you wanted to find out whether training really was the answer to increasing customer satisfaction scores, for example, you could compare an overall CSAT score before and after staff training has taken place. If the goal is to identify the key areas of knowledge learners take away from a course, asking them to complete an assessment before and after training can help identify any areas that need further attention.
That’s not to say learner satisfaction data is obsolete. If your goal is to identify areas for improvement with regards to learner experience, the qualitative data gathered through interviews and surveys will provide valuable insight.
The key is to ensure the data you are collecting, whether qualitative or quantitative, feeds back to the initial goal. This will point you towards the best way to collect your data: whether that’s through surveys, assessments scores, employee retention metrics or observation data.
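To make the pre- and post-training comparison described above concrete, here is a minimal sketch in Python. The topic names, scores and 10-point threshold are all invented for illustration; your own assessment data and definition of ‘needs attention’ will differ.

```python
# Hypothetical pre- and post-training assessment scores (%) per topic.
pre_scores = {"Product knowledge": 55, "Complaint handling": 62, "Refund policy": 48}
post_scores = {"Product knowledge": 81, "Complaint handling": 70, "Refund policy": 52}

ATTENTION_THRESHOLD = 10  # flag topics that improved by fewer than 10 points


def improvement_report(pre, post, threshold=ATTENTION_THRESHOLD):
    """Return per-topic score gains and the topics needing further attention."""
    gains = {topic: post[topic] - pre[topic] for topic in pre}
    needs_attention = [topic for topic, gain in gains.items() if gain < threshold]
    return gains, needs_attention


gains, needs_attention = improvement_report(pre_scores, post_scores)
for topic, gain in gains.items():
    print(f"{topic}: {gain:+d} points")
print("Needs further attention:", needs_attention)
```

Even a simple summary like this moves the conversation beyond ‘the training happened’ towards which areas improved and which still need work.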
“What do I do with the learning data once I’ve collected it?”
To prove that your strategy has met its intended goal, your stakeholders are going to want to see solid evidence. However, just sliding a few bar charts across the table and citing a few statistics won’t mean anything if you don’t demonstrate how the data supports your message – and this, perhaps, is where L&D lacks confidence.
In order to demonstrate real insight from analytics, you need to become comfortable with telling the story behind the data. If the concept of data storytelling is new or unfamiliar to you, don’t worry – we’ve got a blog post on data storytelling to help.
James Richardson, Senior Director Analyst at Gartner, notes that “data and analytics teams have always created dashboards and visualisations, but many are unfamiliar with wrapping those artefacts into a narrative.” Even then, this narrative needs to be made accessible and relevant to the target audience. If key stakeholders are to make important, data-driven decisions, they’ll need to be able to connect the dots between data and outcomes.
Use the relevant visuals to display your data and remember there is no one type of data visualisation that works for all situations. Bar charts might be best in one context, whereas a pie chart or infographic might be better in another. You could start by looking at how others have displayed similar information, or you could try presenting the same data in a number of different formats to identify which type people engage with the most.
Don’t be afraid to employ a range of different formats depending on the context of the data and the audience you’re presenting it to. Remember to tell the full story, too – point out any anomalous results and inconsistencies, but be sure to explain how and why these results have occurred. Knowing what doesn’t work is just as important for informing your future strategies as knowing what worked.
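As a sketch of presenting the same data in more than one format, the snippet below uses Python’s matplotlib to show one dataset as both a bar chart and a pie chart. The departments and completion counts are invented for the example.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Hypothetical course-completion counts per department.
departments = ["Sales", "Marketing", "Support"]
completions = [42, 35, 23]

fig, (bar_ax, pie_ax) = plt.subplots(1, 2, figsize=(10, 4))

# Format 1: a bar chart makes direct comparison between departments easy.
bar_ax.bar(departments, completions, color="steelblue")
bar_ax.set_title("Completions by department")
bar_ax.set_ylabel("Completed courses")

# Format 2: a pie chart emphasises each department's share of the total.
pie_ax.pie(completions, labels=departments, autopct="%1.0f%%")
pie_ax.set_title("Share of completions")

fig.savefig("completions.png")
```

The bar chart answers ‘who completed the most?’ at a glance, while the pie chart answers ‘what share of the total did each contribute?’ – the same numbers, but a different story for a different audience.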
The goal of an effective learning programme is not only to demonstrate its impact on organisational aims, but also to use these insights to inform the design and delivery of future strategies. There’s no getting around it: while not everyone on the L&D team needs to be a full-time analyst, there’s a need for at least a fundamental understanding of how to evaluate the efficiency and effectiveness of training initiatives using analytics.
The rise in dedicated analytics roles within the L&D team is a sign that the significance of these skills will only increase over time. Just as other departments such as sales, marketing and production all use data and analytics to answer questions and inform decisions, so the L&D team must ensure it’s equipped with the right skills to do the same.
Over the next few weeks we’ll be delving further into data and performance in L&D with content to help you get it right.
Data doesn’t have to evoke dread if you go back to basics and break it down. Kept simple and relevant, data analysis is one of the most powerful tools in your L&D toolkit.
Get the full story behind learner progress in your organisation with Thinqi's cutting-edge reporting tools. Contact us to arrange a free demo.