Qualitatively Different Measurement for Training Reactions
BY REANNA P. HARMAN, Ph.D. | Originally Published in Training Industry Magazine
Training evaluation should provide insights not only about the effectiveness of training but also about how it can be improved for learners and organizations. In this context, the term “insights” implies a deep understanding of learning, of the training process and its outcomes, and of evaluation procedures – designing, measuring, collecting, integrating and analyzing data from both a formative and a summative perspective.
A recent ATD study on evaluation practices found that 88 percent of organizations measure reactions. Although the study does not report a specific breakdown, it is safe to assume that most reactions data are quantitative, gathered from evaluation surveys, and that a smaller share comes from qualitative methods. Quantitative measurement is a critical part of effective training evaluation, but there is untapped value in qualitative measurement – especially when it comes to developing a deep enough understanding of the training experience to generate actionable insights.
What Qualitative Methods Can Be Used for Training Evaluation?
Open-ended questions, focus groups and interviews are three qualitative methods that can be used to dig deeper and generate actionable insights. As training professionals, most of us are familiar with including open-ended questions on training reaction surveys to elicit qualitative feedback. Comments from trainees add richness and context to the more structured, standardized closed-ended items, but open-ended items are also skipped more often and tend to elicit feedback from only the most dissatisfied trainees. And while the comments provide an opportunity to dig a little deeper into trainee reactions than the quantitative responses do, the feedback flows only one way (from trainee to evaluator), with no opportunity for follow-up or further probing to gain deeper understanding.
Focus groups and interviews provide an even better opportunity to engage in a dialogue with trainees about their experiences. These sessions can be conducted while training is still ongoing; at the end of training; or even retrospectively, after training is complete. These sessions are particularly useful if you are piloting a new program, if changes are being or have been made to training, if other training evaluation data suggest a deeper investigation may be beneficial, or if you are investigating learning transfer and the alignment of training with work requirements.
How to Maximize Your Investment in Qualitative Measurement
Qualitative measurement has a reputation for being costlier and more time-consuming – in terms of both data collection and analysis – than quantitative measurement. But when qualitative measurement is done well, the return is worth the investment. Here are some tips for adding or integrating qualitative tools into your training reactions measurement plan.
For collecting open-ended items on reactions surveys:
- Write open-ended items that are specific and directly target the information you are seeking rather than vague, catch-all items.
- Consider changing open-ended items more frequently than closed-ended items to target hot-button issues or to focus on specific trainee groups or situations (e.g., trainees participating in a pilot of a new course).
For conducting focus groups and interviews:
- Develop and implement a sampling plan from the population of trainees to ensure participation from all relevant sub-groups (e.g., representation across different types of classes and trainees with different backgrounds); a minimal sampling sketch follows this list.
- Develop protocols and scripts with specific questions and follow-up probes.
- Train moderators and scribes to execute the protocols and scripts as designed.
- Record (via either audio or detailed notes) focus groups and interviews. Consider transcribing audio recordings for further analysis.
- Create an atmosphere that supports open and honest feedback. Trainees must feel comfortable providing feedback, so it is often helpful to have a third-party organization conduct the sessions and analyze the feedback.
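To make the sampling plan concrete, here is a minimal sketch of how focus-group invitations might be drawn proportionally from each sub-group. It assumes a trainee roster held in a pandas DataFrame with hypothetical columns (trainee_id, class_type, background) that you would replace with your own fields.

```python
# A minimal sketch of a stratified sampling plan for focus-group invitations.
# Column names ("trainee_id", "class_type", "background") are hypothetical;
# substitute the fields available in your own roster or LMS export.
import pandas as pd

# Hypothetical roster -- in practice, load this from registration or LMS data.
roster = pd.DataFrame({
    "trainee_id": range(1, 21),
    "class_type": ["instructor-led"] * 10 + ["virtual"] * 10,
    "background": ["new hire", "experienced"] * 10,
})

# Invite roughly 30 percent of trainees from every class type / background
# combination so that each relevant sub-group is represented.
invitees = (
    roster
    .groupby(["class_type", "background"])
    .sample(frac=0.3, random_state=42)
)

print(invitees.sort_values(["class_type", "background"]))
```

Swapping `frac` for a fixed `n` per group yields equal-sized quotas instead of proportional ones; either way, invite more trainees than you ultimately need to allow for no-shows.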
For analyzing and reporting qualitative data from open-ended items, focus groups or interviews:
- Consider a combination of text analytics tools and human coding to develop an analysis strategy that aligns with your goals; a brief illustration follows this list.
- Report comments or feedback from focus groups and interviews along with quantitative survey items to show how the comments allow for further exploration of the quantitative trends. Highlight areas of consistency (i.e., agreement with the quantitative data) as well as edge cases where the qualitative feedback provides an alternative perspective.
- Protect the privacy of respondents and follow best practices when reporting verbatim comments.
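As one illustration of combining automated text analysis with human coding, the sketch below tags comments against a small, hypothetical keyword codebook and routes anything it cannot tag to a human coder. The themes, keywords and comments are placeholders, and a dedicated text-analytics tool could replace the keyword matching.

```python
# A minimal sketch of pairing simple keyword-based tagging with human coding.
# The codebook and comments below are hypothetical placeholders.
from collections import Counter

THEMES = {
    "pacing":    ["too fast", "rushed", "slow"],
    "relevance": ["my job", "relevant", "real work"],
    "materials": ["slides", "handout", "workbook"],
}

comments = [
    "The afternoon felt rushed and the slides were hard to read.",
    "Great examples that map directly to my job.",
    "Loved the facilitator's energy!",
]

theme_counts = Counter()
needs_human_coding = []

for comment in comments:
    text = comment.lower()
    matched = [theme for theme, keywords in THEMES.items()
               if any(keyword in text for keyword in keywords)]
    theme_counts.update(matched)
    if not matched:
        # Route unmatched comments to a human coder instead of discarding them.
        needs_human_coding.append(comment)

print("Theme counts:", dict(theme_counts))
print("Send to human coder:", needs_human_coding)
```

The resulting theme counts can then be reported next to the related closed-ended item averages, making it easy to show where the qualitative feedback confirms the quantitative trends and where it offers an alternative perspective.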
Adding qualitative methods to the training evaluation toolbox provides an opportunity for evaluators to gain a deeper understanding of the training experience, provides a rich source of data, and communicates to trainees that their input is valued and their voices are being heard.
Dr. Reanna Poncheri Harman is the vice president for practice at ALPS Insights, where she focuses on training evaluation, learning transfer, capability development and needs assessment, including using quantitative and qualitative measurement to help her clients gain insights and achieve their objectives.
RECENT INSIGHTS
Impact Evaluation: From Employee Training to Leadership Development
SIOP Annual Event: Sat, April 25, 12:30PM-1:20PM [Cancelled due to COVID-19 policies]
Drawing on the combined experience of a diverse panel of learning and development experts, this session will examine and discuss current practices and future opportunities in impact evaluation for a wide range of interventions, from employee training to leadership development programs. Panelists will share insights to help build value using evaluation data.
Create More Value With Your Learning Evaluation, Analytics, and Feedback (LEAF) Practice
TK2020 Event: Wed, February 5, 11:30AM – 12:00PM
Optimizing your LEAF practice is your best opportunity to improve learning and its impact. Less than half of organizations indicate that evaluation helps them meet their learning and business goals. Data alone doesn’t create value. People acting on data create value. Our ALPS Ibex™ platform drives effective, purpose-driven evaluation, empowering L&D stakeholders with insights and creating a culture of continuous improvement in the workplace. Examples demonstrate how using ALPS Ibex helps L&D stakeholders Act on Insights™ to drive improvement and impact.
Want More Value from Evaluation? AIM to Answer Two Questions
TK2020 Event: Thurs, February 6, 9:00AM – 10:00AM
While almost all learning is evaluated, less than half of organizations report that evaluation helps meet their learning and business goals. Data create no value. People acting on meaningful data within the L&D process create value. The Alignment and Impact Model (AIM) focuses evaluation on helping all stakeholders create value. AIM incorporates purpose, process, stakeholder roles, and two questions to guide evaluation design and focuses on maximizing learning, transfer, and impact. Examples demonstrate the fundamentals of AIM and how it can be implemented and used.
Create More Value With Your Learning Evaluation, Analytics, and Feedback (LEAF) Practice
TK2020 Event: Thurs, February 6, 10:15AM-10:45AM
(Repeat of the February 5 session; see the description above.)