Clinical Trial Root Cause Analysis: Can’t We Do Better Than Five Whys?
- Posted by: LSTI-Editor
- Category: Clinical
Many people in our industry have had root cause analysis (RCA) training. It is aimed at helping people understand an issue and the underlying reasons it happened. Once you have those reasons (the “root causes”), you can act on them. This is the most effective way of trying to stop the same issue from recurring. And RCA is now a requirement for serious issues per ICH E6 (R2).
If you’ve ever had training on root cause analysis, you are likely familiar with the most common RCA training approach — Five Whys. Keep asking “why?” — typically five times — until you get to the root cause. It’s a technique originally developed by Sakichi Toyoda and used at the Toyota Motor Corporation. It aims to help avoid assumptions, and it couldn’t be simpler. After all, toddlers ask “why?” all the time — so just do the same and keep asking “why?”
But something that should give you pause is how your physician responds when you tell her you have a pain in your arm. She doesn’t use the Five Whys technique. She asks questions such as:
- Where on your arm is the pain located?
- Can you move your arm this way? Is it painful then?
- When did the pain start?
- Is it intermittent? Worse at certain times?
- Is it only that arm?
Your physician is gathering as much information as possible in order to diagnose — i.e., get to the root cause — and then treat you properly.
Let’s continue with the Five Whys technique, however, and try it out in an example from a clinical trial. A site audit reveals a satellite site, with no evidence that the satellite site was assessed by the CRA who performed the pre-study site visit (PSSV). You’re the project manager, and you’ve been told you need to get to root cause as part of the corrective and preventive action (CAPA) plan. Let’s use Five Whys:
- Why #1: Why is there no evidence that the satellite site has been assessed by the CRA?
Because the satellite site was not assessed at the pre-study site visit.
- Why #2: Why was the satellite site not assessed at the pre-study site visit?
Because the CRA was not aware there was a satellite site.
- Why #3: Why was the CRA not aware there was a satellite site?
Because no one told the CRA there was a satellite site.
- Why #4: Why did no one tell the CRA there was a satellite site?
Because the CRA didn’t ask if there was a satellite site.
- Why #5: Why didn’t the CRA ask if there was a satellite site?
Because there was no reminder to the CRA to check if there was a satellite site.
So that’s our root cause. Now we need to put an action in place. Maybe we retrain the CRA to check whether there is a satellite site when they conduct a pre-study site visit. CAPA closed!
Do you think our root cause analysis got us to the right root cause? By implementing the action, will we reduce the likelihood of recurrence of the issue? Maybe. But maybe not. Five Whys is simple but suffers from at least two significant flaws:
- It is not repeatable. Different people will answer the Why questions differently, and their responses will take them to a different conclusion. For example, to Why #3 — “Why was the CRA not aware there was a satellite site?” — the response might be, “Because the CRA did not review previous PSSV reports on the site.” This answer will lead to a completely different “root cause.” The approach is very dependent on the individuals involved and is not repeatable.
- It does not use all available information. The focus is on Why? But what about When? Where? What? How? Why is definitely an important question in root cause analysis, but it’s not the only question. To quote Edward Hodnett, “If you don’t ask the right questions, you don’t get the right answers. A question asked in the right way often points to its own answer. Asking questions is the ABC of diagnosis. Only the inquiring mind solves problems.”
Maybe before getting into Five Whys, we should have asked other questions, like the physician did, such as:
- Are there other sites with satellite sites in the study? Is there evidence that they were assessed during the pre-study site visit?
- Is this something that has only occurred in one country or one type of site, or is there evidence it has occurred elsewhere?
- If there are other occurrences of the issue, when were they? Is this something that has only occurred recently, or has it been happening for a long time?
- What is the company policy/guidance on assessing satellite sites?
Asking questions like these will likely point us to areas we should investigate further. These are questions about timing, scope, and scale. If the issue only appears to happen in one country, why might that be? Or if it only appears to be happening recently, what might have changed?
Root cause analysis should be a chance to take a step back, think critically, understand the process, and determine how the issue came about. Our Five Whys approach has led us to a simple “root cause” and action, but have we really understood the issue?
In the complex world of clinical trials, we need to bring together tried and tested techniques of root cause analysis to help develop skills and tools that can be used for better RCA. Such powerful techniques and training will enable us to truly understand issues that arise and implement fixes that can help stop those issues from recurring. As we become more attuned to the skills of critical thinking, we can even focus on minimizing risks — reducing the likelihood of issues ever occurring.
About the Authors:
Sandra “SAM” Sather is an industry-leading consultant whose mission is to promote clinical quality systems for Sponsors/CROs and Investigators/Research Institutions. She has over 25 years of clinical experience, with a Bachelor of Science in Nursing and a Master of Science in Education with a Specialization in Training and Performance Improvement. SAM is the vice president of Clinical Pathways, a consulting firm located in the Research Triangle Park area of North Carolina, USA. SAM has been dual certified by the Association of Clinical Research Professionals (ACRP) for over 10 years (CCRA and CCRC) and is a current member of the ACRP Academy Board of Trustees and Regulatory Affairs Committee (RAC). She is a frequent speaker at industry conferences and has authored dozens of courses for clinical research in various functional areas (e.g., monitoring, safety, HIPAA, and vendor management).
Keith Dorricott is the Director of Dorricott Metrics &amp; Process Improvement Ltd., formed in 2016. Over a 12-year period, he worked in R&amp;D, process development, and manufacturing whilst at Kodak Ltd. In 2005 he moved into the clinical trials industry and worked for two major CROs leading corporate process improvement. He qualified as a Black Belt in Six Sigma in 2002 and a Master Black Belt in Lean Sigma in 2007. He is an Ambassador for the Metrics Champion Consortium (MCC) and assists by facilitating work groups on centralized monitoring, central lab metrics, vendor oversight, and the Trial Master File, and by developing tools to assist with risk assessment &amp; control and metrics selection &amp; use. He received the MCC Champion Award in 2016 for making significant contributions to advancing the MCC mission to improve the efficiency, quality, and effectiveness of clinical trials. Keith regularly presents at conferences such as DIA and SCOPE on the practicalities of quality risk management and risk-based monitoring and on the definition and use of metrics. His passion is to help bring clinical research to the proficiency of manufacturing with root cause analysis skills — a fundamental requirement for effective process improvement.