Perhaps one of these situations is familiar to you.

Situation 1

Client: “We just completed this training program, and I’m glad you’ve collected all this data using different multiple-choice and open-ended questions, but it’s overwhelming. I just want overall numbers that I can show my bosses.”

Situation 2

Client: “We need to do a training evaluation, but I want something as simple as possible. Just use a smile sheet. We just need to know if the trainees liked the training enough to continue this program.”

Situation 3

Client: “I’m really concerned about the training program, and I’d love to look at the qualitative and quantitative data we collected, but I don’t know how to make sense of it. I’m not sure what to do with it or if it’s even usable.”

These are some situations I have encountered when dealing with clients (e.g., HR directors, business owners, line managers). I hope I have captured some element of the post-intervention challenges (in these examples, following a training program) that many consultants face when dealing with evaluation.

One of the most frequent challenges of post-training data analysis is that you might collect an enormous amount of data and then not be sure what to do with it. The data you have starts to look like vacuum cleaner attachments: you look at it and all you see are extra attachments for a device you are not sure how to use.

Can we use all of these attachments? Of course!

Do we always use them? No!

We’re not certain what the attachments are for. All we know is that we have them. 

In many situations, I find that HR managers, generalists, training managers, directors, etc. often have the same reaction when faced with the survey data they collected from trainees: they feel overwhelmed. This is especially true if the clients did not participate in developing the evaluation process. Clients are a necessary component of any evaluation because they know what they want to get out of their training program.

One of the most common sources of overlooked data is qualitative comments from trainees. In many cases, this data is not overlooked because clients feel it lacks value; in general, my clients have found the qualitative comments invaluable. The hesitation is typically about how to analyze the comments trainees make and how to use them to develop next steps.
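To make this concrete, here is a minimal sketch, in Python, of one way to start analyzing open-ended comments: tag each comment against a small set of starter themes and count the mentions. Everything here, the sample comments, the theme names, and the keywords, is hypothetical; in a real analysis you would build the theme list from a first read-through of the actual comments.

```python
from collections import Counter

# Hypothetical open-ended trainee comments (illustrative, not real data).
comments = [
    "The pacing was too fast and the examples felt dated.",
    "Great instructor, but the room was cramped and noisy.",
    "More hands-on practice would help; everything felt rushed.",
    "Loved the examples, and the instructor kept us engaged.",
]

# Assumed starter themes and their signal keywords; a real analysis
# would refine these after reading a sample of the comments.
themes = {
    "pacing": ["pacing", "fast", "rushed", "slow"],
    "instructor": ["instructor", "trainer", "facilitator"],
    "materials": ["examples", "slides", "handouts", "dated"],
    "environment": ["room", "cramped", "noisy"],
}

theme_counts = Counter()
for comment in comments:
    text = comment.lower()
    for theme, keywords in themes.items():
        if any(word in text for word in keywords):
            theme_counts[theme] += 1

# Report how often each theme came up across all comments.
for theme, n in theme_counts.most_common():
    print(f"{theme}: mentioned in {n} of {len(comments)} comments")
```

Even a simple tally like this turns a pile of comments into a ranked list of issues you can discuss with a client, and it scales to hundreds of comments with no extra work.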

A recent study by Harman, Ellington, Surface, and Thompson (2015) illustrates the importance of comments to evaluating training program effectiveness. The researchers conducted three field studies across a series of simultaneous training classes, assessing the commenting behavior of the classes in each study. I strongly recommend that any learning, HR, OD, or IO practitioner read the entire study. However, I want to highlight some of my major takeaways.

1) Classroom experiences affected the likelihood of commenting. As people who have experienced training in a variety of contexts, we intuitively know this. However, it’s great to have evidence that the comments you receive in your training evaluations reflect real differences in classroom experience. Pay attention to the comments, because they will tell you what happened in those training programs.

2) As class-level learning decreased, commenting increased. In other words, there was a negative correlation between learning and commenting. This is a very powerful finding as it relates to the first takeaway: if trainees do not feel they are learning in the classroom, they will say something about it. That kind of data is important to pay attention to.

3) Trainee reactions are multidimensional. Many times we want to reduce the data to a single number or a single value, e.g., “What percent of the trainees liked the program? 60%.” However, there is a lot more going on in any training program than what the trainees felt about it overall. Trainee reactions can point us toward important changes to make in subsequent training classes. If we look deeper at the data, we can learn about components of the training program, the trainer, the class environment, and the relevance of the material (see the sketch after this list).

4) If there is no expectation of change, you will receive no comments. In other words, if your employees feel that your organization won’t change anything based on what they have to say, they won’t say anything at all. If you are conducting a training evaluation (or any evaluation), your organization needs to be committed to making the necessary changes, and that commitment should be communicated to your employees. You depend on data from your employees, and by communicating your commitment to change you will elicit comments from those who have experienced your training program.
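Because the third takeaway is easy to illustrate with numbers, here is a minimal sketch of the difference between one overall score and dimension-level scores. The ratings and dimension names are hypothetical, not drawn from Harman et al. (2015); they simply show how an overall average can mask a problem area such as the class environment.

```python
from statistics import mean

# Hypothetical reaction ratings on a 1-5 scale; the dimension names
# are illustrative, not taken from the study.
responses = [
    {"content": 4, "trainer": 5, "environment": 2, "relevance": 4},
    {"content": 3, "trainer": 5, "environment": 2, "relevance": 3},
    {"content": 4, "trainer": 4, "environment": 3, "relevance": 5},
]

# A single overall number hides the story...
overall = mean(mean(r.values()) for r in responses)
print(f"Overall reaction: {overall:.1f} / 5")

# ...while dimension-level means show where to focus: here the low
# environment scores would flag the classroom, not the trainer.
for dimension in responses[0]:
    scores = [r[dimension] for r in responses]
    print(f"{dimension}: {mean(scores):.1f} / 5")
```

An overall 3.7 out of 5 looks acceptable, but the environment average of about 2.3 tells you exactly where to act.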

The message from these research findings is clear: your employees want great training programs and want your programs to improve. It’s up to us to leverage the comments, in addition to the quantitative ratings, to get the most benefit from the metrics we employ.

If you feel like you’re missing out on some data, or you have training evaluation data you don’t know what to do with, what are your options? Here are some suggestions:

1) Reach out to colleagues and discuss your options. Some of my best ideas have come from discussions at ATD meetings or Metro Applied Psychology meetings. As IO practitioners, we live for these discussions.

2) Talk to your vendor. If you are using a vendor, ask them about options they offer for data analysis. Much like the vacuum cleaner, they probably have options you have not fully investigated.

3) Reach out to a consultant. If you are truly lost in the evaluation process, reach out to a consultant who specializes in training evaluation and can guide you through using this data.

Can you think of other situations where you may be missing out on this hidden data? Is there data that you collect that you feel you do not get the most out of? Feel free to list these in the comments below. I would love to hear your thoughts. Make sure to read the full article. The reference is below!

References

Harman, R. P., Ellington, J. K., Surface, E. A., & Thompson, L. F. (2015). Exploring qualitative training reactions: Individual and contextual influences on trainee commenting. Journal of Applied Psychology, 100(3), 894.

Sy Islam has over 10 years of experience in a variety of corporate, academic, and applied settings. He has served in management, consulting, and research roles in a variety of organizations. He is currently an Assistant Professor of Industrial Organizational Psychology at Farmingdale State College. In addition to his role as a professor, he is a co-founder and a Principal Consultant with Talent Metrics. In his role at Talent Metrics, he collaborates with organizations through consulting engagements in his areas of expertise (training and development, selection, survey design, performance management, and team building).