
As maintenance and reliability professionals, we love rules of thumb and other easy-to-implement, tried-and-true solutions to problems. Experts who specialize in the human factors of failure refer to rules of thumb as “heuristics.” We like heuristics because they don’t require us to exert much mental energy: the framework for a solution’s implementation was established when we solved a similar problem in the past.

If, however, the heuristic applied to solve a problem in the plant isn’t on point, it can short-circuit root cause analysis (RCA) and continuous improvement. It also can create or perpetuate what’s sometimes called the “cycle of despair,” in which the organization and its people come to accept failure and suboptimal solutions as “business as usual.” When this occurs, heuristics are counterproductive: they sacrifice effectiveness in the pursuit of efficiency.

Psychologists and social psychologists have defined numerous heuristics, including, among others: the “affect heuristic,” where emotion is allowed to drive a decision when the facts don’t support it; the “anchoring and adjustment heuristic,” where people anchor on the initial information they receive, and its associated solution framework, while discounting newly discovered facts and details; and the “familiarity heuristic,” where past solution frameworks that were effective in solving superficially identical problems are ineffectively applied to the current one. While there are many types of heuristics, they all boil down to mental shortcuts that can be fraught with risk.

In the interest of cognitive expediency when solving problems with heuristics, our decision processes are often biased. No human is purely objective when making decisions. We’re all susceptible to biases that reinforce our tendency to reach for heuristics. The availability bias is one such influence that compromises plant-reliability decisions and outcomes.

As its name suggests, the “availability bias” causes us to employ the solution frameworks that come to mind most readily, either because they are commonly applied or because they were applied very recently. The frequency-based form of the bias is reaching for the “go-to” solution we employ for common and usually simple problems. The recency-based form, where the most recent fix crowds out better alternatives, is particularly risky. Consider the following troubleshooting example.

A troubleshooter may have solved two superficially similar problems in the past, one addressed a month before and the other 10 years prior. The 10-year-old solution may have been a better fit for the current problem, but last month’s solution was more “cognitively available.”

There are plenty of other biases, such as the “anchoring bias,” i.e., if you’re a hammer man, the world looks like a nail (a problem when the situation demands a screwdriver). Add the “hindsight bias,” “sunk-cost bias,” “belief bias,” and “selective-perception bias,” and the list goes on. In all cases, biases represent the risk that cognitive shortcuts will lead to suboptimal decisions and undermine our reliability and safety goals.

So how do we address the biases that cause us to employ heuristics inappropriately, to the detriment of our reliability goals? First, use a defined process to address issues, and ensure that it prevents knee-jerk reactions and abrupt decisions regarding corrective actions. Second, start with a clean slate when evaluating problems. Third, challenge the status quo and seek multiple perspectives on problems. Fourth, search for objective data and avoid the temptation to jump to conclusions. Fifth, don’t allow persuasive people to “sell” conclusions and recommendations that are based on beliefs, not facts. Finally, always play devil’s advocate when analyzing problems. Through this approach, you’ll prevent individual and organizational biases and heuristics from undermining reliability-growth objectives in the plant.






ABOUT THE AUTHOR
Drew Troyer has 30 years of experience in the RAM arena. Currently a Principal with T.A. Cook Consultants, he was a Co-founder and former CEO of Noria Corporation. A trusted advisor to a global blue-chip client base, this industry veteran has authored or co-authored more than 250 books, chapters, course books, articles, and technical papers, and is a popular keynote and technical speaker at conferences around the world. Drew is a Certified Reliability Engineer (CRE) and Certified Maintenance & Reliability Professional (CMRP), holds B.S. and M.B.A. degrees, and is a Master’s degree candidate in Environmental Sustainability at Harvard University. Contact him directly at 512-800-6031 or [email protected].



Tags: reliability, availability, maintenance, RAM, heuristics, asset management, workforce issues, plant operations, safety