The Psychology of Failing Fixes



After reviewing tens of thousands of corrective and preventive actions over the past thirty-plus years, I began to see a definite pattern. It wasn’t the obvious ‘weak fix’ pattern most people see in their organizations. You know the favorite fixes – retraining, procedure expansion, and punishment-focused discipline. Instead, it was a less visible pattern: a pattern of psychology in the failing fixes I was seeing over and over again.

What is the Psychology Behind Our Failing Fixes?

Here is what I saw in a nutshell. When a person attempts to write a corrective action to address a human error, they tend to gravitate toward recommending a relatively weak fix. For example, if we want to write a corrective action to address the problem of people not wearing the right work gloves, what do we often recommend? Most responses focus on making sure gloves are available and reminding the employee of the requirement for wearing gloves. How effective are those fixes? What is the probability of that problem happening again?

By using a work systems gap or weakness as our initial corrective action reference point, we increase the likelihood of recommending a much more systemic fix. If gloves are handed out as part of the job briefing each day, does this increase the chances that people will wear them? If supervisors are routinely measured on the degree to which they consistently support personal protective equipment use, does this increase the likelihood that gloves will be worn?

Such changes may sound simple. Before you toss my observation aside, look at the collective nature of the last fifty or so corrective actions you have written or reviewed. How relatively strong, or weak, are they? Where would they fall on the hierarchy of controls? What percentage of the time have you focused on trying to change people instead of systems?

How Often Do You Rely on Failing Fixes?

Too many organizations rely on failing fixes – reminders, discipline, and retraining, for example. What is the case in your company? This often happens because of the root cause analysis approach we use. Traditional approaches such as the 5 Whys technique or fishbone analysis allow human error to be viewed as a root cause. My experience has taught me that this is a bad thing to do. Treating human error as a root cause is a process error in its own right. The better option is to use a root cause analysis approach that looks for the systemic reasons a person makes mistakes.

All people make mistakes. If we want them to produce error-free work, we have to design our work systems so they discourage, rather than encourage, human error. Our space programs and nuclear power generating companies get this. They rely heavily on well-designed checklists and effective job preparation to guide people in doing their jobs well. They don’t rely primarily, if not solely, on memory. What percentage of the time do you count on memory to help minimize errors? Is it possible that the use of well-designed checklists could significantly improve performance?

Human Error Should Not be Accepted as a Root Cause

For years, I made the mistake of treating human error as a root cause – until I started teaching the TapRooT® root cause analysis approach as a contract trainer. This approach is designed to force the user to look for the systemic causes of human error. In other words, human error is rarely, if ever, a root cause with this process. What percentage of your root causes are human errors? Is it possible that a different root cause analysis approach could lead you to better, work-system-focused fixes?

If we continue to try to write corrective and preventive actions that address human error directly, we will continue to write relatively weak fixes. That is the psychology of failing fixes, as shown in the graphic. The fix in this case is to reject human error as a root cause. Instead, always search for the systemic reasons humans do things they themselves really don’t intend to do. How often do your fixes fail? Is it possible that a root cause analysis process shift, along with a psychological shift, could lead you toward a more error-free workplace?

Keep improving!

Kevin McManus, Chief Excellence Officer, Great Systems

NOTE: If you found value in this article, you might also benefit from reading my new book, “Error Proof – How to Stop Daily Goofs for Good,” which is now for sale on Amazon.com.

March 31st, 2017 | Process Improvement, Root Cause Analysis

About the Author:

Kevin McManus serves as Chief Excellence Officer for Great Systems! and as an international trainer for the TapRooT® root cause analysis process. During his thirty-five-plus years in the business world, he has served as an Industrial Engineer, Training Manager, Production Manager, Plant Manager, and Director of Quality. He holds an undergraduate degree in Industrial Engineering and an MBA. He has served as an Examiner and Senior Examiner for the Malcolm Baldrige National Performance Excellence Award for eighteen years. Kevin also writes the monthly performance improvement column for Industrial and Systems Engineering magazine, and he has published a new book entitled “Vital Signs, Scorecards, and Goals – the Power of Meaningful Measurement.”