Let’s Forget the Perfect World


As I write this, someone, somewhere, is designing a system on the erroneous assumption that things will run perfectly. So many things, from products to complex processes, ignore the simple fact that no system is perfect, and because they ignore this fact, they fail. Why do we develop systems for a perfect world when we all know that not only do people make mistakes, so do computers, products, and even robots? Ideally, we would allow for this imperfection, and in fact many systems do. Unfortunately, leaving the perfect world takes time and foresight, and these days both are in scarce supply.

The Curse of Variability


Too often we create “perfect” systems that are corrupted by unforeseen factors. These serpents sneak into our processes and wreak havoc while we sit helplessly nearby, wondering how we could ever have prevented such a disaster. I call this the “Eden Effect”. Whether we call these process disruptions gremlins, ghosts in the machine, SNAFUs, or viruses, things nobody counted on enter our systems and make us shake our heads. Take, for instance, the American black bear that wandered into the parking lot of a customer of mine. The customer jokingly asked me how to record this hazard in our database. Clearly this was a safety issue (you can’t have a bear wandering around the parking lot), and yet there was nothing in the safety process (or the security process, for that matter) that dealt with how to remove a bear from the premises.

Not all process failures are quite as far-fetched. In fact, many of the most destructive things in our processes aren’t statistical outliers at all. They are simply commonplace things that we didn’t foresee, and our completely understandable lack of foresight leads to disaster and even death. We describe these things as “freak accidents” or “acts of God” and excuse ourselves because there was no way we could have seen them coming. The reality is that we often can predict things and still take no measures to prevent them; there is nothing wrong with that. In many cases the likelihood of a failure is so incredibly remote that it doesn’t warrant any preventive measure, or any countermeasure to reduce its severity. Take our bear example: there had been reports of bears wandering into populated areas, and certainly the safety professionals could have had some inkling that a bear might come calling, and yet they did nothing. An encounter with a bear is highly likely to cause a severe injury or even a fatality. Should we judge the safety professionals’ behavior as reckless? Were they negligent? No. Most would agree that the very remote chance of a bear coming into the parking lot did not merit a countermeasure, even though the consequences could be fatal. Any measure to protect workers from bear attacks (likely a once-in-a-couple-of-lifetimes occurrence) would be judged financially irresponsible and ridiculously overprotective.
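To make that balancing act concrete, here is a minimal sketch of the likelihood-times-severity scoring that many risk assessments use. The 1–5 scales, the action threshold, and the bear entry are my own illustrative assumptions, not a standard:

```python
# Minimal risk-matrix sketch: score = likelihood x severity.
# The 1-5 scales and the action threshold are illustrative
# assumptions, not an industry standard.

LIKELIHOOD = {"frequent": 5, "occasional": 3, "remote": 2, "improbable": 1}
SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "fatal": 5}

ACTION_THRESHOLD = 8  # below this, a countermeasure may not be warranted


def risk_score(likelihood: str, severity: str) -> int:
    """Return a simple likelihood-times-severity risk score."""
    return LIKELIHOOD[likelihood] * SEVERITY[severity]


def warrants_countermeasure(likelihood: str, severity: str) -> bool:
    return risk_score(likelihood, severity) >= ACTION_THRESHOLD


# The bear in the parking lot: potentially fatal, but so improbable
# that its score falls below the action threshold.
print(warrants_countermeasure("improbable", "fatal"))  # False (score 5)

# A commonplace hazard of moderate severity can outrank it.
print(warrants_countermeasure("frequent", "minor"))    # True (score 10)
```

Note where the commonplace hazard lands: on a scale like this, the everyday failure we didn’t foresee outranks the bear, which is exactly the point.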

How can a safety professional strike the balance between improving the safety system and being overprotective?

  • Stop trying to do the impossible. People make mistakes; that is as close to a universal truth as you will ever get in this life. We have to make our peace with the fact that smart, highly skilled, cautious people will make mistakes and there is nothing in this world we can do to prevent that. We CAN, however, reduce the likelihood of mistakes and the severity of their consequences to the point where mistakes don’t kill people, by managing the things that increase the likelihood of mistake making:
    • Stress. People under stress think differently than people under less stress. Some brain research has even shown that excessive, prolonged stress can change our brain chemistry. When we are stressed, our subconscious gets the signal that we need to adapt, and the brain starts to experiment with the safety of our environment by letting us make mistakes. Mistakes are the subconscious mind’s way of probing for the safest route to a quick exit; unfortunately, it tends to find out that something isn’t safe by falling victim to an accident.
    • Incompetence. People who are physically or intellectually unable to do their jobs correctly are going to make more mistakes than those who are better suited to the job requirements.
      We do no one a service by putting them in a position where doing the job carries a real possibility of serious injury. Training can eliminate some incompetence, but it can only take us so far. We also need to beef up post-offer screening and our overall recruiting and hiring process if we are going to drive incompetence out of the workplace.
    • Fatigue. As we get fatigued, we make poor choices and more mistakes. Safety professionals should take a hard look at the fatigue levels of workers in the areas with the most frequent near misses and injuries, and modify work schedules to reduce fatigue.
  • Recognize that Systems Also Produce Unexpected Results. For decades business has worshipped automation, yet anyone who works in automation will tell you that you can’t always predict, or count on, what an automated system will produce. An aggressive Total Productive Maintenance (TPM) system will go a long way toward improving equipment reliability, but even TPM can’t tighten your process to the point where everything it produces is perfect.
  • Build Systems that Can Tolerate Drift. Not only will people (and machines) make mistakes, they will also slowly (even imperceptibly) drift away from the design standard until they ultimately move outside the process’s tolerance for drift. Saw blades dull, drill bits grow brittle, and people take shortcuts, until the saw won’t make a clean cut, drill bits snap like pretzels, and people get hurt. The key to building a system with a high tolerance for variability is to study the factors that must hold true for the process to perform and compare them to the likely amount of drift; a minimal sketch of that comparison follows this list. This sounds hard, and it is more difficult than it sounds, but until we build better systems that can tolerate variability in materials, environment, machinery, and, most importantly, human behavior, we will still be counting stitches and bemoaning the fact that we don’t live in a perfect world.
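To make that drift comparison concrete, here is a minimal sketch of monitoring one process factor against its design standard and flagging drift before it leaves the tolerance band. The saw-cut numbers, warning fraction, and moving-average window are illustrative assumptions, not values from any real process:

```python
# Sketch: watch a process factor drift away from its design standard
# and raise a warning before it exits the tolerance band.
# All numbers below (target, tolerance, warning fraction, window)
# are illustrative assumptions.

from collections import deque

DESIGN_TARGET = 3.00   # e.g., saw-cut width in mm
TOLERANCE = 0.25       # acceptable +/- deviation before the cut fails
WARN_FRACTION = 0.8    # warn once drift consumes 80% of the tolerance
WINDOW = 5             # smooth out noise with a short moving average


def check_drift(readings):
    """Yield (reading, status) pairs: OK, WARN (drifting), or FAIL."""
    history = deque(maxlen=WINDOW)
    for reading in readings:
        history.append(reading)
        drift = abs(sum(history) / len(history) - DESIGN_TARGET)
        if drift > TOLERANCE:
            status = "FAIL"  # outside the process's tolerance for drift
        elif drift > WARN_FRACTION * TOLERANCE:
            status = "WARN"  # still in tolerance, but time to intervene
        else:
            status = "OK"
        yield reading, status


# A dulling blade: each cut drifts a little further from the standard.
cuts = [3.02, 3.05, 3.10, 3.16, 3.23, 3.31, 3.40, 3.50]
for reading, status in check_drift(cuts):
    print(f"{reading:.2f} mm -> {status}")
```

The warning band is the point of the sketch: a system that tolerates drift is one that notices it while the saw still cuts cleanly, not after the bit snaps.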

#influencing-human-error, #mistake-proofing, #process-control, #variability, #worker-safety