Level 1 Evaluation for eLearning: Reaction
In this mini-series about eLearning evaluation, I’m drilling down into each one of the levels of Kirkpatrick’s Four-Level Model, which is a classic approach that remains both solid and relevant in the digital age. In Level 1 evaluation, you’re tapping into and measuring the reaction of the audience that just finished the eLearning module or course. This article tells you how to do it.
Because this is essentially a kind of customer satisfaction survey (See how eLeaP does feedback surveys), I want to draw your attention to an approach that I find very compelling. If you want your students to give ratings on various aspects of the course, use a 10-point scale where 1 is low/bad and 10 is high/good. I think people intuitively understand the 10-point scale because of its ubiquity in recent decades.
I also want you to realize that any score of 6 or below means that person was genuinely unhappy with that aspect of the course, and you probably need to fix something. The exception is when only one or two people out of a whole bunch give those ratings; we all have our cranky days.
Rather than go into a lot of detail about this particular rating approach, I'll just point you to where it comes from: a 2003 Harvard Business Review article by Frederick Reichheld called The One Number You Need to Grow, which introduced the Net Promoter Score.
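If you collect these ratings digitally, a quick way to apply the 6-or-below rule is to tally the low scores for each question and only flag an aspect when more than a couple of respondents are unhappy. The Python sketch below is purely illustrative; the function name, thresholds, and data layout are my assumptions, not an eLeaP feature or part of the Kirkpatrick model.

```python
# A minimal sketch for flagging course aspects that drew low ratings on the
# 1-10 scale described above. Assumes each question's responses have already
# been exported from your survey tool as plain integers.

from collections import Counter

def flag_low_ratings(responses_by_question, low_threshold=6, min_complaints=3):
    """Return questions where several respondents scored at or below low_threshold."""
    flagged = {}
    for question, scores in responses_by_question.items():
        low_scores = [s for s in scores if s <= low_threshold]
        # One or two low scores in a large group may just be someone's cranky day;
        # only flag the question when the count reaches min_complaints.
        if len(low_scores) >= min_complaints:
            flagged[question] = {
                "low_count": len(low_scores),
                "total": len(scores),
                "distribution": Counter(scores),
            }
    return flagged

# Example usage with made-up data:
responses = {
    "Clarity of course objectives": [9, 8, 10, 7, 9, 8],
    "Relevance of content to your role": [4, 5, 9, 6, 3, 8],
}
for question, summary in flag_low_ratings(responses).items():
    print(f"{question}: {summary['low_count']} of {summary['total']} rated 6 or below")
```

With the sample data, only the second question is flagged, which is exactly the behavior you want: a single grumpy rating shouldn't trigger a course revision, but a cluster of low scores should.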
Below are example questions that are relevant for measuring people's reactions to your eLearning module or course. I've identified which ones are rating questions (quantitative) and which are open-ended (qualitative).
Learning Objectives
- Rate the clarity of the course objectives on a scale of 1-10, where 1 is not at all clear and 10 is extremely clear.
Content
- Was the content what you were expecting? Why or why not? (open-ended)
- Were there topics not included that you wanted to see? (open-ended)
- Rate the quality of the course structure (1-10, 1 is low, 10 is high).
- Rate the consistency of course content with the learning objectives (1-10, 1 is low, 10 is high).
- Rate the relevancy of the content to your role in the organization (1-10, 1 is low, 10 is high).
- Rate the logical/clear arrangement of the content (1-10, 1 is low, 10 is high).
- Rate how well the content explained the knowledge, skills and concepts presented (1-10, 1 is low, 10 is high).
- Rate your confidence level for applying the knowledge or skill presented (1-10, 1 is low, 10 is high).
- Rate the amount of material covered by selecting one of the following: too much, too little, just right.
- Rate the usefulness of any links to external websites (1-10, 1 is low, 10 is high).
- Rate how well the activities helped you gain a clearer understanding of the subject (1-10, 1 is low, 10 is high).
- Rate how well the case studies and hypotheticals helped you gain a clearer understanding of the content (1-10, 1 is low, 10 is high).
- Rate how well the quizzes helped you learn/retain the content (1-10, 1 is low, 10 is high).
- Rate how well the exams helped you learn/retain the content (1-10, 1 is low, 10 is high).
- Rate how well the hypothetical scenarios helped you learn/retain the content (1-10, 1 is low, 10 is high).
- Rate how well the games helped you learn/retain the content (1-10, 1 is low, 10 is high).
- Rate the quality of the examples used in the content (1-10, 1 is low, 10 is high).
- What part of the course/module did you find most useful or interesting and why? (open-ended)
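If you manage these questions in code, for instance to load them into a survey tool or LMS, a simple structure that keeps the quantitative/qualitative distinction explicit can help. The layout below is a hypothetical Python sketch; the field names are illustrative, not a real eLeaP or Kirkpatrick schema.

```python
# A minimal sketch of how the questions above could be organized before loading
# them into a survey tool or LMS. Field names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class SurveyQuestion:
    section: str   # e.g., "Learning Objectives" or "Content"
    text: str
    kind: str      # "rating" (quantitative, 1-10) or "open" (qualitative)

QUESTIONS = [
    SurveyQuestion("Learning Objectives",
                   "Rate the clarity of the course objectives.", "rating"),
    SurveyQuestion("Content",
                   "Was the content what you were expecting? Why or why not?", "open"),
    SurveyQuestion("Content",
                   "Rate the relevancy of the content to your role in the organization.", "rating"),
    # ...the remaining questions from the list above follow the same pattern.
]

# Every rating question gets a follow-up comment box so respondents can
# explain their score (see the final note below).
for q in QUESTIONS:
    if q.kind == "rating":
        print(f"[{q.section}] {q.text} (1 = low, 10 = high)")
        print("    Why did you choose that rating?")
    else:
        print(f"[{q.section}] {q.text} (open-ended)")
```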
As a final note, for any question where you use the 10-point rating scale, make sure you give people space to explain why they chose that rating, especially when the score is 6 or lower. The 19 examples above give you a solid set of potential questions to ask in an eLearning evaluation for Kirkpatrick's Level 1 to measure audience Reaction. However, there are several other categories for which you should develop questions in a Level 1 evaluation; those will be covered in the next article.