Level 2 – Learning

Level 2 of the Kirkpatrick training evaluation model involves evaluating the extent to which training programme participants have improved their knowledge and skills as a result of the training.

What are the key questions?

The key questions that evaluations at this level can seek to answer include:

  • Did the participants learn what was intended to be taught?
  • What is the extent of advancement or change in the participants after the training?
  • Were there any particular barriers to or promoters of learning?

When combined with evaluations at the other Kirkpatrick levels, measuring impact at level 2 can help you to make judgments and recommendations about the relevance and quality of the training programme and the suitability of the assessments, tests and/or qualifications used as part of that programme. It can also provide key diagnostic information where there has been a breakdown in the process of transferring learning to the workplace. For example, in the case where there has been no observable impact of a programme on workplace performance (level 3), data from level 2 evaluations can help you to track whether, and to what extent, this is due to the amount and type of learning that actually occurred.

What data collection methods can be used?

Evaluation at this level is typically carried out using assessments or tests before and after the training. Often a relevant person (eg the trainer, a union learning representative or a learning and development officer) will provide results data from assessments, tests and/or qualifications to the evaluator.

Other data collection methods which you might consider using include:

  • interviews with and observation of training participants
  • participant self-assessments
  • group and peer assessments.

Note: You can use TrainingCheck to create simple training assessments and tests using the core evaluation creation tools, but please be aware that dedicated assessment tools provide more comprehensive options and in some cases will be more appropriate to use.

To help ensure the relevance and validity of the evaluation results, it is important to make certain that any assessments, tests or qualifications which are to be used as part of the evaluation process are aligned as closely as possible to the original training objectives. Among other things, this will avoid the issue of the success of the training being judged in terms that were not defined at the outset. In addition, reliable, clear scoring and measurements should be established in order to reduce the risk of assessment inconsistency.

Calculating the 'Learning Gain'

As part of the data collection and analysis process at this level, you may wish to calculate the ‘learning gain’ from training. This expresses the improvement from the pre-learning to the post-learning assessment score as a percentage of the maximum possible improvement. It can be calculated using the following formula:

((Post-learning Score minus Pre-learning Score) / (Maximum Score minus Pre-learning Score)) X 100

For example, if the pre-learning score was 50, the post-learning score was 80, and the maximum score was 100, then you get the following:

((80-50) / (100-50)) x 100 = (30 / 50) x 100 = 60%

This shows that there was a 60% learning gain.
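The calculation above can be sketched as a short function. This is an illustrative helper, not part of TrainingCheck itself; the function name and guard against a pre-learning score at the maximum are my own additions.

```python
def learning_gain(pre_score, post_score, max_score):
    """Return the learning gain as a percentage.

    Gain = (post - pre) / (max - pre) * 100
    """
    if max_score <= pre_score:
        # No room for improvement, so the gain is undefined.
        raise ValueError("max_score must exceed pre_score")
    return (post_score - pre_score) / (max_score - pre_score) * 100

# Worked example from the text: pre 50, post 80, maximum 100
print(learning_gain(50, 80, 100))  # → 60.0
```

Note that the denominator is the *available* improvement (maximum minus pre-score), so a participant who started near the maximum can still show a high gain from a small absolute improvement.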

Evaluating Learning using TrainingCheck

Creating your evaluation

When you are ready to create your evaluation within TrainingCheck you will be able to:

  • copy an existing evaluation (eg you can choose to copy the example evaluations provided)
  • create a new evaluation from scratch
  • choose from the questions within the ‘Learning’ sections of the Question Library
  • copy individual questions from existing evaluations
  • create your own questions. (See also the guidance on creating effective questions)

The evaluations you create should generally be short (ie between 5 and 15 questions) as longer evaluations tend to have much lower response/completion rates. Therefore question choice is very important.

Once you have created an evaluation it is advisable to pilot test it before deploying it with the target group.

Deploying your evaluation

Your evaluation can be completed by participants themselves (eg assessments/tests or self-evaluation) and anyone else who has access to information about learning assessments, tests and/or qualifications outcomes, eg the trainer, a union learning representative, or a learning and development officer.

At this level evaluations can be deployed at any time during the evaluation timeframe. However, it is important to bear in mind that in order to reliably attribute the outcomes of assessments, tests and qualifications directly to the training programme they must be undertaken and recorded soon after the end of the training programme.

The deployment options (via the 'Collect Responses' page) are to:

  • send your evaluation to contacts in your Address Book
  • place a link to your evaluation in an email using your usual email program (eg Outlook)
  • place a link to your evaluation on a web page
  • launch the evaluation immediately so that you can manually add data directly into it (‘Add Data Manually’ button) - useful, for example, if you have collected evaluation data through paper based evaluations, interviews, or focus groups

Please note: Evaluations can also be printed so that they can be completed manually. Responses can then be uploaded to TrainingCheck via the 'Add Data Manually' function.

You can deploy the evaluation as many times and using as many of the different methods as you wish.

It will be important to consider the timing to ensure a good response rate. For example, does your evaluation coincide with other surveys, or is it a particularly busy time for respondents?

Data analysis and reporting

Once you have collected the data from respondents or manually entered data, you can view the responses by clicking on the 'Analyse' icon. You will be able to filter the responses according to criteria you choose, and download responses as CSV (Excel) or XML files. You will also be able to create custom reports (via the 'My Reports' page) and share these with key stakeholders.
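If you download responses as a CSV file, you can also compute learning gains outside TrainingCheck. The sketch below assumes a hypothetical export with `pre_score`, `post_score` and `max_score` columns; match these names to the columns in your actual export.

```python
import csv
from statistics import mean

def summarise_gains(path):
    """Compute each participant's learning gain from a pre/post score CSV.

    Assumes columns named pre_score, post_score and max_score
    (hypothetical names -- adjust to your actual export).
    """
    gains = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pre = float(row["pre_score"])
            post = float(row["post_score"])
            max_score = float(row["max_score"])
            gains.append((post - pre) / (max_score - pre) * 100)
    return {"mean_gain": mean(gains), "n": len(gains)}
```

A summary like this (mean gain across the group, number of respondents) can feed directly into the custom reports you share with stakeholders.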

You may want to discuss the results of the evaluation with the learners, the trainer(s) and learners’ managers. This can be an effective way of engaging others in, and identifying potential barriers to, ensuring the transfer of learning to the workplace.

As with all other levels of evaluation, it is vital that the outcomes of the evaluation at this level are acted on. Not doing so will, at a minimum, undermine the credibility of the evaluation process.
