


Ashleigh Hull

The Kirkpatrick Model For Dummies

Updated: Dec 3, 2020

Kirkpatrick’s Learning Evaluation Model is a framework for evaluating the impact of your training.

Have a look at our microlearning pieces on Instagram, or read on to find out more.

Why evaluate training impact?

In short, so that you can improve the learner experience and achieve the goals of your training. Good instructional design never really comes to an end; your content can always be refined and improved. If you simply finish a project and move on to the next one, you will never learn; if there are problems with the content you’re producing, they will persist through the content you produce in the future. As designers and developers, it is our duty to be learning ourselves, and to be continually improving what we create. We owe it to our clients and to the people learning from our courses.

As we go through these levels, ask yourself where you would place the current evaluation of your training. And what could you do to move it on to the next level?

The levels of Kirkpatrick:


Level 1: Reaction

This is evaluation on the level of ‘Which face best represents how you feel about this?’. It’s about how participants respond to your training: did they enjoy it? Did they think it was valuable? Did they feel good about the instructor, the venue, or the design? It tells you nothing about whether your course actually fulfilled its objective and taught something (the later levels do that), but it does give you an idea of how your learning was received, and how you could improve the user experience.

How do I move up to this level?

If you aren’t currently evaluating your training at all, this is the place to start. Why not implement a post-course questionnaire, asking participants how they found the course? Gather the responses and see if there are any recurring themes. Do people struggle with the navigation? Find the voiceover irritating? Do you need to rethink accessibility? There’s a potential wealth of information here, and a questionnaire is a simple thing to set up.
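As an illustration, spotting recurring themes in questionnaire responses can be as simple as counting them up. This sketch assumes each free-text response has already been hand-tagged with theme labels; the tags and data below are hypothetical.

```python
from collections import Counter

# Hypothetical post-course responses, each hand-tagged with theme labels
tagged_responses = [
    ["navigation", "voiceover"],
    ["navigation"],
    ["accessibility", "navigation"],
    ["voiceover"],
]

# Count how often each theme is raised across all responses
theme_counts = Counter(tag for tags in tagged_responses for tag in tags)

for theme, count in theme_counts.most_common():
    print(f"{theme}: raised in {count} of {len(tagged_responses)} responses")
```

Even a rough tally like this makes the loudest complaints obvious at a glance, which tells you where to focus your next revision.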


Level 2: Learning

The point of learning is - unsurprisingly - that people learn! Did the participants actually learn from your material? How much has their knowledge increased?

How do I move up to this level?

A good way of assessing this is with two quizzes - one at the beginning and one at the end of the course. Ask questions about the same topics, and see if learners are answering more questions correctly after their training. If they are, it would suggest that they did learn; if not, then something about your learning material is clearly not doing its job.

This can be a helpful method of evaluation, as it can give you specific information. If all your learners are getting questions about a specific topic wrong, it might be time to look again at how you’re teaching that topic. What about it is unclear? How could you better present it so your learners take the knowledge on board?
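A minimal sketch of that per-topic pre/post comparison, assuming average quiz scores are stored as fractions of correct answers per topic (the topic names, scores, and the 10-point improvement threshold are all made up for illustration):

```python
# Hypothetical average scores per topic, before and after the course (0.0–1.0)
pre_scores = {"fire safety": 0.45, "first aid": 0.60, "evacuation": 0.70}
post_scores = {"fire safety": 0.85, "first aid": 0.65, "evacuation": 0.72}

# Assumed threshold: flag any topic that improved by less than 10 points
THRESHOLD = 0.10

for topic in pre_scores:
    gain = post_scores[topic] - pre_scores[topic]
    flag = "" if gain >= THRESHOLD else "  <- revisit how this topic is taught"
    print(f"{topic}: {pre_scores[topic]:.0%} -> {post_scores[topic]:.0%}{flag}")
```

The point of the threshold is simply to surface the topics where learners barely improved, so you know which parts of the material to look at again.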

Another way of assessing this is, again, with a post-course questionnaire. But in addition to the basic questions of level 1, you could ask the learners to tell you what they learned on this course. In some cases this might even be more insightful than a quiz. Asking people to articulate something in their own words shows how much they truly understand about it.

One final suggestion: you could send out a follow-up quiz, say a week after the training. Quizzing learners straight after they’ve taken the course, when it’s all fresh in their minds, doesn’t tell you what they’ll remember over the long term. What do they recall a week later? A month later? And if the key points aren’t being remembered, how can you improve your learning to ensure that they are? Should you be providing refresher training or job aids in addition to your training course? Would it help to build a mobile app that sends daily tips and reminders to your learners after the course?


Level 3: Behaviour

We’ve all been on those compliance courses where we learn about the correct procedure for doing something, and then go back into the office the next day and continue doing things exactly how we used to. The issue here is not lack of knowledge; we know the correct procedure. It’s that the knowledge is not being applied. Reaching Kirkpatrick’s level 3 means asking: are participants using what they learned?

How do I move up to this level?

This is usually something you have to assess a little while after the course has been taken; you need to leave time for the new behaviours to settle in.

The best way of gaining this kind of insight is through 360° feedback.

360° feedback comes from the participant themselves, their colleagues, and their superiors. Asking the participant and everyone around them if their behaviour has changed after taking the learning will give you a 360° view of the situation! If your training has had the desired effect, it will be noticeable to everyone involved.

Sometimes, the feedback will say that no changes have occurred. In those instances it’s important to ask why people think this is. Behaviour can only change if the conditions for it are favourable. Will the boss actually let your participant apply their new knowledge? Is there a tool or a system that has not been put in place? Does the learner have any desire or incentive to apply the learning? And what can be done to remedy these situations?

Level 4: Results

Kirkpatrick’s final level of evaluation looks at whether training positively impacted the organisation.

This relies on goals being set before development of the training. What changes were managers looking for? How is success defined? Without those answers, you won’t know what results you’re looking to see.

How do I move up to this level?

The way you evaluate this will be determined by the results you’re looking to see. Typically, it will involve analysing data. If it’s improvement in ROI you’re aiming for, you need to be assessing financial statements. If it’s a reduction in health and safety incidents in the office, you need the data on how many incidents there have been.
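For the ROI case, the usual calculation is net benefit divided by cost. A quick sketch of that arithmetic, with entirely made-up figures:

```python
def training_roi(benefit: float, cost: float) -> float:
    """Return ROI as a fraction: (benefit - cost) / cost."""
    return (benefit - cost) / cost

# Hypothetical figures: £40,000 of measured benefit from a £25,000 programme
roi = training_roi(benefit=40_000, cost=25_000)
print(f"ROI: {roi:.0%}")  # prints "ROI: 60%"
```

The hard part, of course, is not the formula but attributing a credible monetary benefit to the training - which is exactly why the goals need defining before development starts.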


When evaluating the impact of your training, it’s important to know about all 4 of these levels. For example, if behaviour hasn’t changed, you need to look at the previous levels to understand why - did people actually learn what they needed to? And if not, is that because the design was so confusing and unhelpful that they mentally checked out? Plan your training evaluations to cover all 4 levels; that way, you’ll have the best perspective on how effective your training is, and how it can be improved.
