Level 1 - Participant Reaction

Level 1 of the Kirkpatrick training evaluation model attempts to establish whether the conditions were right for learning to take place. This involves capturing participants’ reactions to the training programme, including reactions to its relevance, training methods, trainers, qualification and assessment methods, and facilities and administration.

What are the key questions?

The kind of questions that evaluations at this level can seek to answer include:

  • Did participants like and enjoy the training?
  • Did they consider the training relevant to their own needs and/or the needs of the organisation?
  • How did participants perceive the practicability and potential for application of the learning?
  • Did they consider it an effective use of their time?
  • Were the style, pace, content, delivery methods and materials appropriate?
  • Has the training acted as a motivator towards further learning?
  • Would participants recommend the training to colleagues?

Which particular evaluation questions you choose will depend on the overall objectives of the evaluation (as mentioned elsewhere, it is vital that the evaluation objectives are closely aligned to both the original training objectives and key business stakeholders’ expectations of the training).

Is evaluating at Level 1 worthwhile?

While evaluations at this level are carried out widely, some evaluation experts have questioned their worth. They argue that feedback on, for example, whether learners enjoyed the programme yields little useful data about whether the programme was actually effective.

While it is true that participant reaction evaluations cannot provide an objective measure of the effectiveness of the various elements of the programme, this does not mean that they are not worthwhile. Capturing participants’ views on the training can provide valuable information which can be used, among other things, to:

  • identify popular courses (ie those likely to be well attended) and trainers
  • identify any unmet learning needs
  • provide clues as to how a training programme may be improved further
  • diagnose barriers to learning.

These last two points are especially true when feedback from participant reaction evaluations is viewed alongside evaluation data from the other Kirkpatrick levels. For example, if a level 2 (learning) evaluation shows that learning is not taking place and your level 1 evaluation reveals that participants score all elements of the programme highly except for the training materials, it would be a reasonable assumption that the training materials need to be improved.

What data collection methods can be used?

Evaluation of participant reaction is usually carried out through a questionnaire/survey which participants complete either at the end of the training programme or at specific points during it. However, you might also consider using the following to gather evaluation data:

  • interviews or focus groups
  • capturing other formal/informal verbal reactions to the training (eg through meetings, performance appraisals etc)
  • written reports from the participants.

Evaluating Participant Reaction using TrainingCheck

Creating your evaluation

Once you have decided on your key evaluation objectives/questions, you can begin to create your evaluation within TrainingCheck (just click on the ‘Create Evaluation’ button on the ‘Manage Evaluations’ page). You will be able to:

  • copy an existing evaluation (eg you can choose to copy the example evaluations provided)
  • create a new evaluation from scratch
  • choose from the questions within the ‘Participant Reaction’ sections of the Question Library
  • copy individual questions from existing evaluations
  • create your own questions. (See also the guidance on creating effective questions)

Despite the large number of questions available, the evaluations you create should generally be short (ie between 5 and 15 questions), as longer evaluations tend to have much lower response/completion rates. Question choice is therefore very important.

Once you have created your evaluation it is advisable to pilot test it before deploying it with the target group.

Deploying your evaluation

Evaluation respondents should, where possible, include all of the training programme participants. Where there are a very large number of learners you may consider using sampling techniques.
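Where sampling is appropriate, a simple random sample is usually sufficient. As an illustrative sketch only (the cohort list, sample size and email addresses below are all hypothetical, not part of TrainingCheck):

```python
import random

def sample_participants(participants, sample_size, seed=42):
    """Draw a simple random sample of participants to invite.

    A fixed seed makes the draw reproducible, eg for audit purposes.
    All inputs here are hypothetical examples.
    """
    rng = random.Random(seed)
    if sample_size >= len(participants):
        # Small cohorts: just invite everyone.
        return list(participants)
    return rng.sample(participants, sample_size)

# Hypothetical cohort of 200 learners; invite a sample of 50.
cohort = [f"learner{i:03d}@example.com" for i in range(200)]
invitees = sample_participants(cohort, 50)
```

The invitees list could then be used with whichever deployment method you choose.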

At this level, evaluations should usually be deployed within two weeks of the completion of the training programme so that respondents’ recollection of the training is still relatively fresh.

The deployment options (via the ‘Collect Responses’ page) are to:

  • send your evaluation to contacts in your Address Book
  • place a link to your evaluation in an email using your usual email program (eg Outlook)
  • place a link to your evaluation on a web page
  • launch the evaluation immediately so that you can manually add data directly into it (‘Add Data Manually’ button) - useful, for example, if you have collected evaluation data through paper based evaluations, interviews, or focus groups

Note: Evaluations can also be printed so that they can be completed manually. Responses can then be uploaded to TrainingCheck via the 'Add Data Manually' function.

You can deploy the evaluation as many times and using as many of the different methods as you wish.

When deploying the evaluation it will be important to consider the timing to ensure a good response rate. For example, does your evaluation coincide with other surveys, or is it a particularly busy time for respondents? Offering respondents the possibility of winning a prize of some kind (eg a gift voucher) or some other incentive for completing the evaluation can often significantly increase response rates.

Confidentiality

Learners are unlikely to give full and honest feedback if they feel that there may be a risk that the information they provide could be used against them in any way.

For this reason it is recommended that participant reaction evaluations are not deployed using the Address Book method, as this links the learner’s personal information (email address, name etc) with their response.

Instead, evaluations should be deployed to learners using the ‘Email Link’ or ‘Web Page Link’ options. Learners should also not be asked to enter information on evaluations which could identify them.

Where it is essential to link learners with their responses (eg if you ask learners to list future training requests) you may consider using a coded system, eg allocating a number to each learner, which only you are aware of.
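One minimal way to set up such a coded system is sketched below. This is purely illustrative (the learner names and code format are assumptions, not a TrainingCheck feature); the point is that the mapping from learner to code is held only by the evaluator, while the evaluation itself records just the code.

```python
import secrets

def allocate_codes(learners):
    """Assign each learner a random, non-identifying code.

    Only the evaluator keeps this mapping; responses carry just
    the code. Learner names here are hypothetical.
    """
    mapping = {}
    for name in learners:
        code = secrets.token_hex(4)  # eg 'a3f29c1b'
        while code in mapping.values():
            code = secrets.token_hex(4)  # avoid the (rare) collision
        mapping[name] = code
    return mapping

codes = allocate_codes(["A. Learner", "B. Learner", "C. Learner"])
```

The mapping should be stored securely and separately from the evaluation responses.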

It is important to ensure that learners are aware of any measures you have taken to ensure confidentiality. These can be described in your evaluation introduction email.

Data analysis and reporting

Once you have collected the data from respondents or manually entered data, you can view the responses by clicking on the ‘Analyse’ icon. You will be able to filter the responses according to criteria you choose, and download responses as CSV (Excel) or XML files. You will also be able to create custom reports (via the ‘My Reports’ page) and share these with key stakeholders.
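If you prefer to analyse a downloaded CSV outside TrainingCheck, a minimal sketch might look like the following. The column names and 1–5 rating scale are assumptions for illustration only; an actual export’s columns will depend on the questions you chose.

```python
import csv
import io
import statistics

# Hypothetical extract of a CSV download: one row per respondent,
# ratings on an assumed 1-5 scale (column names are illustrative).
raw = """respondent,relevance,materials,trainer
r1,5,2,4
r2,4,1,5
r3,5,2,4
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Average rating per programme element.
averages = {
    question: statistics.mean(int(row[question]) for row in rows)
    for question in ("relevance", "materials", "trainer")
}
# A low average (here 'materials') flags an element that may need improving.
```

This mirrors the reasoning described earlier: if most elements score highly but one scores low, that element is a reasonable candidate for improvement.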

You may want to discuss the results of the evaluation with the participants, the trainer(s) and participants’ managers. This can be an effective way of engaging others in, and identifying potential barriers to, ensuring the transfer of learning to the workplace.

As with all other levels of evaluation, it is vital that the outcomes of the evaluation at this level are acted on. Not doing so will, at a minimum, undermine the credibility of the evaluation process.
