eBook

Kineo Guide Outlines Seven Smart Data Approaches for Learning

London (UK), June 2022 - Are you wondering what data you should be collecting, how it will inform design decisions, and how it can help prove learning impact? In a new eBook, Kineo proposes seven approaches to learning analytics. The irony that data itself can feel almost immeasurable when it comes to proving the impact of your L&D is not lost on Kineo. The secret to getting the most out of your data is being able to measure and prove what value has been added.

Kirkpatrick's Model of Evaluation, pioneered by Dr Don Kirkpatrick in the 1950s, is still one of the most popular models used in evaluating training programmes today. It consists of four evaluation levels, with each level building on the previous one: reaction, learning, behaviour, and results.

The levels outlined in the Kirkpatrick Model of Evaluation encourage you to set clearly defined objectives so that your metrics are productive. This is one of the golden rules for getting the most effective results and understanding your business outcomes. You need to know how the data you're collecting will help you: will it inform design decisions, help you learn something about the training or audience, or help prove impact?

"Half of today’s L&D leaders are being asked to prove that they're adding more value. That's a pretty strong stat, with only 6% saying they were under less pressure to prove more value," according to David Wilson.

In its new guide, Kineo outlines the following seven approaches to data gathering for smarter measurement:

  1. Business Impact - Business impact is about trying to establish a direct correlation between training and a business metric. A common example is making a connection between a new sales training programme and the success of your salespeople. It can be one of the trickiest data points to measure, but it can be done by thinking creatively about how learning maps to business outcomes, or by conducting, for example, a control-group test.
  2. Behaviour Change - This approach starts with building a behavioural model as part of your needs assessment. The behavioural model identifies both positive behaviours, those we want our audience to continue or increase, and negative ones, those we want to decrease or eliminate. The measurement strategy revolves around different methods to quantify the frequency of these behaviours before and after the training.
  3. Application - Assessing application is about creating experiences in which the skills and knowledge covered in the training need to be used. The most common approach is a scenario-based assessment, which makes it possible to evaluate learners in hypothetical or expected work situations. This approach is especially useful for assessing learners' grasp of information, skills, or tasks in situations where mistakes or failure on the job can have serious consequences.
  4. Knowledge Retention - Knowledge assessments are ubiquitous in corporate learning events and courses, measuring a learner's ability to recall facts and terminology. Most often, these assessments appear within a course or module as knowledge checks or quizzes, and at the end of the course as final assessments.
  5. Confidence - Confidence ratings are metacognitive tasks that require learners to report about their awareness of their own thinking. They reflect learners’ self-assessment of their own confidence about a choice or decision, usually given retrospectively - after the choice has been made. Learners answer a question and then rate their confidence in their answer.
  6. Engagement - Unlike the categories above, engagement data isn't about the content being taught. Instead, it's about measuring activity. The most common data in this category includes registrations or starts to a course, completions, and time spent.
  7. Reaction - Reaction data is typically collected via "smile sheets". This data reflects the learners’ opinions about or reactions to the learning. Questions can range from generic satisfaction - "did you like it?" - to gauging helpfulness in the learners’ jobs, including the anticipated likelihood of improving performance.