Training Effectiveness – a Quality by Design Approach
- Posted by: LSTI-Editor
- Category: Training
In the role of a consultant, you have the benefit of seeing what works well across clients and what the common pain points are. Because of this perspective, I’m often brought onto a project to provide guidance on best practices and practical advice on how to avoid those pain points. Case in point: recently a client asked me, “What’s the best practice for Training Effectiveness?” My response, in typical consultant fashion, was, “Well, how are you defining training effectiveness?” Their response was, “We’re not sure, but we were cited for it in a number of client audits.”
What Is Training Effectiveness?
So let’s begin by breaking the term down and defining it. Training is effective when the employee demonstrates the desired behavior change (a new skill, new knowledge, etc.) learned during the training.
Training effectiveness, then, is a measure of the degree to which training improved the employee’s knowledge, skills, and behavior within the organization. Simply put: Did the training do what it was supposed to do? Did employees learn what they were supposed to learn? Were the employees who attended training able to do what they should be able to do once they left the “classroom”?
Training effectiveness seems to be a topic that is percolating, especially in GMP environments. Yet, it’s shrouded in mystery. No one is clear on what it is, how to implement it, and what the regulators are looking for when it comes to effective training. Summed up, most of the regulations essentially state: an organization should have a program in place to periodically evaluate the effectiveness of training. This definition leaves a lot up for interpretation. In addition, there is limited information out there on the practical aspect of implementing this concept.
Training Effectiveness: The Trends From A Regulatory Standpoint
Regulatory authorities are paying more attention to training effectiveness. However, the FDA does not commonly cite the effectiveness of training except where an organization has repeatedly demonstrated that retraining (typically performed as the result of a finding and associated with a CAPA) did not work. You probably know the cycle all too well:
- An audit finding cites human error as the root cause of an issue.
- The employees are retrained.
- The deviation reoccurs.
- The root cause is determined to be human error.
- Retraining reoccurs.
- The deviation reoccurs…and so the cycle continues.
We could speculate that a proper root cause analysis has not been performed in these cases, but that’s a topic for another article.
I recently asked a former FDA employee this question: “What would you look for, and be satisfied with, as evidence that a pharmaceutical company has effective training?” His response was, “Reduction in human error as a root cause to a very low and stable rate.” What this tells me is that their focus is outcomes-based, not on determining whether you have a superb set of processes around training.
The Trends From A Quality Standpoint
Earlier versions of ISO 9000 focused on identifying training needs, delivering training, and keeping records (as part of employee qualifications). However, the most recent version adds wording regarding competence and effectiveness of training. Section 6.2.2, Competence, Training, and Awareness, states (among other things), “The organization shall:
- determine the necessary competence for personnel performing work affecting conformity to product requirements,
- where applicable, provide training or take other actions to achieve the necessary competence,
- evaluate the effectiveness of the actions taken…”
This language suggests that, like the regulatory authorities, ISO 9000 has shifted its focus from process to outcomes.
It is no longer acceptable to show that you have the processes and documentation in place that ensure the right training for the right employees took place. You are expected to “periodically assess the effectiveness” of your training. But what does that really mean in practical terms?
What type of guidance is out there to help you interpret these regulatory requirements? There isn’t any. Of course, we can use the standard Kirkpatrick Model to measure whether our training was effective. That’s a good model, but how do we translate it into an environment where we need to be consistent, not only in process but in outcomes, and to document that consistency and readily provide evidence of it? How do we look across training at a site, and across large organizations, and say: yes, we consistently produce high-quality training, which results in skilled, qualified employees, which in turn results in a higher-quality outcome (i.e., clinical data, safer patients, medicinal product) for our business?
If you’re someone who is not sure where to begin, I’d like to present you with an approach that represents quality and process excellence best practices that can easily apply to training effectiveness. In fact, you might be doing some of these already.
A Two-Pronged Approach To Training Effectiveness
1. Ensure Training Effectiveness – An organization can ensure training effectiveness through key activities in its best practices training design, development, and delivery methodology. This is truly a Quality by Design approach, and it happens before any employee participates in the training.
- How is it done? An organization embeds best practices in adult learning and in training design, development, and delivery into its standard training methodology (yes – that’s right, you need a methodology), enabling the organization to achieve the goal of “Right Training delivered by the Right Trainers to the Right Persons at the Right Time to achieve the Right Outcomes.”
- What does it look like? This Quality by Design approach embeds quality checks into the design, development, and delivery of training, although this is not a comprehensive look at all training processes.
2. Assess Training Effectiveness – An organization can assess training effectiveness through periodic review of current trainings after the trainings have been completed.
- How is it done? Periodic assessment is likely to happen at an individual site level, but can be a coordinated effort to look across a larger global organization that has multiple sites. Assessing the effectiveness of training occurs after employees participate in the training and should be part of an identified periodic review.
- What does it look like? Conducting a periodic review might look like this:
- Audit a select number of training sessions
- Document the review results
- Continuously improve and feed into Ensure Training Effectiveness
Key Input: Defining Minimum Measures – Prior to conducting a periodic review, your organization, especially if it has more than one location, should identify minimum measures. For example, at a minimum, sites/locations/organizations that want to determine the effectiveness of training may decide to use the following measures:
| Critical Success Factor | Measurement | Target |
| --- | --- | --- |
| Qualified Trainer Delivers Training | Percentage of trainings delivered by a qualified trainer | 100 percent |
| Employee Successfully Completes Assessment | Percentage of assessments successfully completed by employees for the identified subject matter | Varies, typically determined by subject matter; a minimum of 90 percent is recommended |
| Training CAPA Results In Improved Performance | For sites cited for some aspect of training effectiveness during an audit, measure the behavior or performance the CAPA should have improved | Varies, typically determined based on the CAPA outcome |
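As an illustration, checking training records against minimum measures like these could be sketched as follows. The record fields, example data, and function names are hypothetical assumptions (in practice the data would come from your LMS or training records); the 100 percent and 90 percent targets are the example values from the table above.

```python
# Hypothetical training records; real data would come from an LMS export.
records = [
    {"training": "Aseptic Technique",  "qualified_trainer": True,  "assessment_scores": [95, 92, 88]},
    {"training": "Deviation Handling", "qualified_trainer": True,  "assessment_scores": [91, 97, 85]},
    {"training": "Data Integrity",     "qualified_trainer": False, "assessment_scores": [93, 90, 96]},
]

def pct_qualified_trainer(records):
    """Measure 1: percentage of trainings delivered by a qualified trainer (target: 100)."""
    qualified = sum(1 for r in records if r["qualified_trainer"])
    return 100.0 * qualified / len(records)

def pct_assessments_passed(records, passing_score=90):
    """Measure 2: percentage of assessments successfully completed (recommended target: 90).
    The passing score itself varies by subject matter."""
    scores = [s for r in records for s in r["assessment_scores"]]
    passed = sum(1 for s in scores if s >= passing_score)
    return 100.0 * passed / len(scores)

trainer_rate = pct_qualified_trainer(records)   # 2 of 3 trainings -> misses the 100 percent target
pass_rate = pct_assessments_passed(records)     # 7 of 9 scores at or above 90 -> misses the 90 percent target
```

A gap between a computed rate and its target is exactly the kind of finding a periodic review would document and feed into continuous improvement.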
From my perspective, these minimum measures would give you a very good feel for whether the training was effective. However, there are many additional measurements (follow-up evaluations, employee demonstrations of skills, annual performance reviews, etc.) you could use to provide more data on training effectiveness. But a caution: don’t over-measure!
Audit A Select Number Of Trainings
Once you’ve identified your measures and are ready to conduct a periodic review, which trainings will you assess (e.g., one training, all trainings)? I recommend taking an audit approach. An audit approach ensures you get a fairly accurate feel for the effectiveness of your training without expending a high amount of effort during your periodic review. For example, during your periodic review you might:
- Select a percentage or minimum number of trainings that makes sense for your organization to evaluate for effectiveness. The trainings should be a mix (at least one instructor-led training for a high-risk or high-impact topic, at least one self-study*, etc.).
Footnote: * Read and Understand (AKA Read & Sign) “Training” is not considered a self-study.
- Measure the effectiveness of the trainings by assessing your minimum measurements. Let’s take the second minimum measurement of “Employee Successfully Completes Assessment.” Here is an example approach to take:
- Gather the assessment scores for 25 percent of the total number of participants across all sessions of training for each identified training or a minimum of 10 participants — whichever number is greater.
- Review the assessment scores.
- Ask yourself the following questions:
- What percentage of the reviewed employee assessment scores did not meet the minimum passing requirement? (This assumes your organization keeps failed scores.)
- Is this percentage 25 percent or higher?
- If the percentage of failed assessments for an identified training is 25 percent or higher, you need to determine why.
- Conduct a root cause analysis and document the root cause, indicating how the site will correct this, who will drive correcting it, and by when.
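The sampling and threshold logic in the steps above can be sketched as a minimal script. The 25 percent sample, the minimum of 10 participants, and the 25 percent failure threshold come from the example in this article; the passing score and the function names are illustrative assumptions.

```python
import math

def sample_size(total_participants, fraction=0.25, minimum=10):
    """Number of participants to review: 25 percent of the total or 10,
    whichever is greater (capped at the total available)."""
    return min(total_participants, max(math.ceil(total_participants * fraction), minimum))

def review_assessments(scores, passing_score=90, failure_threshold=0.25):
    """Flag a training for root cause analysis if 25 percent or more of the
    reviewed assessment scores fall below the passing score."""
    failed = sum(1 for s in scores if s < passing_score)
    failure_rate = failed / len(scores)
    return {
        "reviewed": len(scores),
        "failed": failed,
        "failure_rate": failure_rate,
        "needs_root_cause_analysis": failure_rate >= failure_threshold,
    }

# Example: 12 reviewed scores, 4 below the passing score of 90 -> flagged for review
result = review_assessments([95, 88, 92, 70, 96, 91, 85, 97, 93, 90, 89, 94])
```

The point of the sketch is the decision rule, not the numbers: your organization would substitute its own sample fraction, passing scores, and escalation threshold.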
You don’t have to follow this exactly. It’s just an example, but this example can give you a better feel for what a part of an assessment of training effectiveness might look like. Next steps include documenting the audit and any results and/or actions and using these as input into your continuous improvement process.
Training Effectiveness Requires Training Maturity
The reality is that training effectiveness takes training maturity. Remember that back in Ensure Training Effectiveness, I mentioned having a standard training methodology. The methodology doesn’t have to be perfect; it can be flexible, but it needs to be consistent and carry standards (e.g., a retention standard limiting the number of read-and-understand procedures you can expect an employee to take, or limiting the number of training hours). An organization needs maturity in its training processes. What I mean is this: if you do not have a mature training function that follows best practices, why bother putting loads of effort into creating a periodic review of training? The result will always be the same: the training is not effective.
Invest the time upfront in ensuring that the design, development, and delivery of your training set your employees up to achieve the desired outcomes. One of the best demonstrations of training effectiveness is a reduction in deviations, which, in the long run, pays huge dividends in reducing the cost of quality.
Without an outcomes-based approach in which you ensure training effectiveness, your employees’ performance will continue to suffer, and so will your quality.
ABOUT THE AUTHOR
Holly Deiaco-Smith brings over twenty years of management consulting experience to her clients. Holly’s tenure in large consulting firms, including Accenture and IBM Global Services, grounded her with a foundation of best methodologies, leading practices, and client experience. Holly has an MS in Educational Technology, is a Six Sigma black belt, and is certified to administer and interpret the Myers-Briggs Type Indicator Steps I & II.
Holly’s experience includes strategic planning, process improvement, benchmarking for leading practices, organizational improvement, adult learning design and development, and change management. She may be reached at 610-871-2316 or email@example.com. Visit her website at www.hdsconsulting.com.