In order to maximise the influence of your evaluation, it is important to report relevant outcomes to the right people in a way that they can understand and act on. For this reason it is advisable to plan out the reporting process.

Among the things you will need to consider in your plan are:

·         Who will the report be sent to? If creating reports that include financial data, who else might want this information?

·         What information do they need? For example, do they need a full report including all evaluation data, or just data relating to a few key questions?

·         How credible is the data you have collected? There is no point in including data that is inaccurate, unrealistic or from a source that is unlikely to be taken seriously.

·         What additional information will you want to include? For example, do you want to include background information on the training programme, participants, training provider etc?

·         How realistic are your recommendations? Will they motivate those who have the responsibility to take them forward?

·         How will you structure the information in the report? (see the Example Evaluation Report Template (Word) [LINK] provided within the TrainingCheck Help Centre.)

What kinds of report are available?

Evaluators can view and monitor responses in real time, including the latest evaluation responses, via the ‘Analyse’ button on the ‘Manage Evaluations’ page.

Clicking the ‘Download’ button on the Analyse page enables you to download the collected response data as CSV (Excel) or XML files and, if required, in SPSS format. This can be useful if, for example, you want to incorporate the data into management information systems, or you wish to use statistical programmes to carry out trend analysis or cross tabulations.

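As a minimal sketch of the kind of analysis you might run on a downloaded CSV file, the following uses Python’s pandas library to build a cross tabulation and a simple monthly trend. The file name and column headings here are assumptions for illustration; substitute the headings used in your own export.

```python
import pandas as pd

# Load the response data downloaded from the Analyse page.
# 'responses.csv' and the column names below are illustrative assumptions.
df = pd.read_csv("responses.csv")

# Cross tabulation: counts of each satisfaction rating by training location.
table = pd.crosstab(df["location"], df["overall_satisfaction"])
print(table)

# Simple trend analysis: average satisfaction by month of response
# (assumes 'overall_satisfaction' is a numeric rating, eg 1 to 5).
df["response_date"] = pd.to_datetime(df["response_date"])
monthly = df.groupby(df["response_date"].dt.to_period("M"))["overall_satisfaction"].mean()
print(monthly)
```
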
In addition to this, there are three basic types of custom report available within TrainingCheck:

·         Evaluation reports - based on single or multiple evaluations

·         Category reports - based on project details such as training area, method, provider, trainer etc

·         Return on Training Investment (ROTI) reports.

The following information applies only to Evaluation and Category reports. For guidance on creating ROTI reports please see the ‘Calculating Return on Training Investment’ guidance. [LINK]

Via the ‘My Reports’ page, evaluators can automatically generate and share reports on response data from any combination of evaluations and questions across one or multiple projects. For example, you may choose to generate reports on:

·         selected questions within one evaluation

·         all questions within one evaluation

·         selected questions within multiple evaluations

·         all questions within multiple evaluations.

You can also choose to generate ‘Category’ reports based on a particular training area, training method, provider, trainer, location, date range and number of participants.

When creating your report you will have a number of options, including applying filters (single evaluation reports only), combining response data from the same question, adding notes to the report header, and selecting the question/response display format (bar chart, pie chart, area chart or text only).

In addition, the Example Evaluation Report Template (Word) [LINK] provided within the TrainingCheck Help Centre helps evaluators to create more detailed reports for specific purposes and audiences. The template prompts you to enter background information on the training programme and the evaluation. You will need to copy and paste data from the web-based reports you have created into the template.

For more details on analysing data and creating reports, see the following tutorials in the TrainingCheck Help Centre:

·         Analysing Responses [LINK to Tutorial]

·         Custom Reporting [LINK to Tutorial]

What makes an effective evaluation report?

Keeping to the following simple guidelines will help you to create evaluation reports that make the right impact on your target audience.

Your reports should:

·         contain only information relevant to the needs of the target audience

·         be clear and easily understood (eg by explaining any technical terms)

·         be balanced in perspective – all relevant impact data should be reported on, not just the data that favours a particular point of view

·         present sufficient data to be able to draw reasonable conclusions

·         contain a summary where appropriate

·         provide recommendations around planning, implementation, management and resourcing of training

·         point out the strengths and limitations of the evaluation design and implementation.

How can confidentiality be maintained?

Training participants may be unwilling to give full and honest feedback if they feel there is a risk that the information they provide could be used against them in any way.

For this reason it is recommended that reports do not contain identifying information, and that evaluations aimed at learners (eg Participant Reaction and Job Impact evaluations) are not deployed using the Address Book method, as this links the learner’s personal information (email address, name etc) with their response.

Instead, evaluations should be deployed to learners using the ‘Email Link’ or ‘Web Page Link’ options. Learners should also not be asked to enter information on evaluations which could identify them.

Where it is essential to link learners with their responses (eg if you ask learners to list future training requests), you may consider using a coded system, eg allocating to each learner a number which only you are aware of.

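As an illustration of such a coded system, the following is a minimal sketch (hypothetical, not a TrainingCheck feature): it allocates a short random code to each learner and stores the name-to-code mapping in a file that only the evaluator keeps. The names and file name are invented for the example.

```python
import csv
import secrets

# Hypothetical sketch: allocate each learner a short random code and keep
# the name-to-code mapping private. Names and file name are illustrative.
learners = ["A. Ahmed", "B. Brown", "C. Chen"]
codes = {name: secrets.token_hex(3) for name in learners}  # eg '9f2c1a'

# Store the mapping somewhere only the evaluator can access it.
with open("learner_codes_PRIVATE.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["learner", "code"])
    writer.writerows(codes.items())

# Each learner enters only their code on the evaluation, so responses
# cannot be linked back to individuals without the private mapping file.
for name, code in codes.items():
    print(f"{name} -> use code {code}")
```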

It is important to ensure that learners are aware of any measures you have taken to protect confidentiality; these can be described in the evaluation introduction email.

Data Protection

Please note that it is the responsibility of evaluators to ensure that learners and respondents are aware that the information they provide during the evaluation process will be stored online and may be used in reports.

What is benchmarking and how can it be used?

In simple terms, benchmarking is where one process is compared with a better process with the aim of improving the first. In the case of training programmes this may mean, for example, comparing outcomes in terms of learner satisfaction, learning, impact on job performance, impact on organisational performance, Return on Training Investment (ROTI), or any aspect or combination of these.

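For instance, when comparing programmes on ROTI, a conventional ROI-style calculation can be used. The sketch below uses invented figures and assumes the common formula (benefits − costs) ÷ costs × 100; for the method used within TrainingCheck, see the ‘Calculating Return on Training Investment’ guidance.

```python
# Illustrative ROTI comparison using the conventional ROI-style formula.
# All figures are invented for the example.
def roti_percent(benefits: float, costs: float) -> float:
    """Return on Training Investment as a percentage of costs."""
    return (benefits - costs) / costs * 100

programme_a = roti_percent(benefits=15_000, costs=10_000)  # 50.0
programme_b = roti_percent(benefits=12_000, costs=6_000)   # 100.0

# Programme B returns more per unit of cost, so it would serve as the
# benchmark for programme A in this comparison.
print(f"Programme A ROTI: {programme_a:.0f}%")
print(f"Programme B ROTI: {programme_b:.0f}%")
```
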
Benchmarking may be a one-off event or a continuous process. It is usually carried out by individual companies (internal benchmarking), but it may also be carried out by groups of organisations (collaborative benchmarking).

You can increasingly find collaborative benchmarking data on performance indicators (PIs) through local industry, sector and government bodies. However, there are some weaknesses in using publicly available data. In particular, it may not be clear whether the data sample used is valid and reflects your organisation’s profile, for example:

·         There can be gaps or overlaps in the official datasets.

·         There can be differences in the way organisations interpret and record PIs.

·         Expenditure information can depend on individual accounting practices, eg sometimes ‘overheads’ are allocated, sometimes not.

·         There can be difficulties in selecting the most representative unit costs.

There is no single benchmarking methodology. The following is a shortened version of a 12-stage methodology developed by Robert Camp. It applies to internal benchmarking of training programmes and therefore relies on a number of evaluations having already been carried out within an organisation.

1)    Identify training programme problem areas. This can be done through analysis of the data you collect.

2)    Identify similar training programmes. These could be programmes with, for example, similar learner profiles, content, training methods etc.

3)    Identify ‘best practice’ training programmes. Using comparable evaluation methods, look for the best results achieved from the training programmes you have identified (see the sketch at the end of this section).

4)    Establish the differences. Identify which of the best practice processes could be adopted by future programmes.

5)    Develop plans and targets for future performance. Enable inclusion of best practices into future training development and delivery, and set targets for performance based on best practice outcomes. 

6)    Communicate. Ensure everyone involved in the relevant training development and delivery processes is aware of the new implementation plans and targets.

7)    Implement. Put the implementation plans into practice when developing and delivering new training programmes.

8)    Review and recalibrate. Assess how successful the new processes have been and whether targets have been met. If the results fall short of expectations, review processes, identify problem areas and potential causes, and set actions and targets for addressing these.

(Adapted and abridged from Camp, R. C., Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance, Productivity Press, 1989.)

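As an illustration of steps 1 to 4 above, the following sketch compares mean evaluation scores across comparable programmes, identifies the best performer as the benchmark, and reports each programme’s gap to it. The programme names and scores are invented for the example.

```python
from statistics import mean

# Invented example data: numeric evaluation scores (eg 1-5 ratings)
# for comparable training programmes within one organisation.
scores = {
    "Induction 2023": [3.2, 3.8, 3.5, 3.1],
    "Induction 2024": [4.1, 4.4, 4.0, 4.3],
    "Refresher 2024": [3.6, 3.9, 3.7, 3.8],
}

averages = {name: mean(vals) for name, vals in scores.items()}
best = max(averages, key=averages.get)  # step 3: 'best practice' programme

print(f"Best practice benchmark: {best} ({averages[best]:.2f})")
for name, avg in averages.items():
    gap = averages[best] - avg  # step 4: establish the differences
    print(f"{name}: {avg:.2f} (gap to benchmark: {gap:.2f})")
```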