Learning from mistakes

A human behaviour expert says the Pike River mining tragedy shows we still haven't learned why smart people repeatedly make dumb mistakes - with disastrous results. Andrew Laxon, of the NZ Herald, reports.

It was an accident waiting to happen, Dr Kathleen Callaghan told the Pike River Coal mining tragedy inquiry.

The Auckland University human factors expert did not mince her words as she told the Royal Commission last November what she thought of the company, the inspectors and the whole legal, commercial and political environment which made some kind of disaster highly likely.

The Pike River Coal mine explosion, which killed 29 miners in November 2010, was an organisational accident, she said, in the same category as the 1986 Chernobyl nuclear reactor meltdown, the BP oil spill in the Gulf of Mexico in 2010 and - New Zealand's own benchmark - the Mt Erebus tragedy which killed all 257 crew and passengers on an Air New Zealand DC-10 in 1979.

It stemmed from failures at Pike River Coal, but even more importantly, from failures in the Department of Labour, which was supposed to ensure safety in the mine, and by the Government, whose decisions had weakened the inspectors' effectiveness.

Callaghan's evidence was overshadowed at the time by a staggering series of revelations to the commission, which has finished its hearings and is due to report in September.

But her comments briefly drew attention to the fact the tragedy was not just a mining accident. As she argued at the inquiry, it was caused by repeated human error on a large scale, which has disturbing implications for all New Zealanders.

Callaghan is the director of Auckland University's human factors group, which starts from the premise that human beings - and their tendency to make mistakes - are at the centre of everything we do, especially in the workplace.

Sitting in a former ward room near her office in the old Auckland Hospital building, she explains it is the study of why apparently smart people do stupid things, often time and time again and even after they have been told not to.

The answers tend to involve uncomfortable truths about how we really behave, often for hidden reasons we may not want to admit.

The 46-year-old was sidetracked into human factors and accident investigation through a love of flying as a young doctor.

Posted to Dubbo in the New South Wales outback in her first year as a would-be neurologist, she became hooked on gliding and promptly signed up for an aviation medicine career with the Royal New Zealand Air Force.

Her master's thesis examined how fear of crashing affected the decisions of air force fighter jet pilots to hit the ejector-seat button.

Using a simulator, she discovered that pilots generally made the right decision above the 10,000-foot ejection safety threshold but made increasingly over-cautious choices to abandon their aircraft the closer they got to the ground.

As a result of her work, the air force decided not to lower the threshold after all.

She later became principal medical officer of the Civil Aviation Authority, and in a joint PhD in medicine and psychology examined the unofficial reasons behind doctors' decisions about their patients.

"They're the ones we all know like: 'I'm short of time', `I'm worried that they might take me to the Health and Disability Commissioner', 'Mrs X won't get her operation unless I say she fell over'.

"If we don't acknowledge them and try to deal with them, how do we expect diagnostic decisions to be any better?"

The most common kneejerk reaction to an accident, she says, is finding someone to blame.

If a patient dies after a nurse accidentally gives the wrong drug, the easy answer is to blame the nurse.

But she may have been distracted or tired, or may have misread the doctor's poor handwriting - and each underlying reason could lead to a different solution.

Callaghan says this does not mean letting people off.

A good company has a "just culture" which strikes a balance between encouraging workers to report safety failures without fear of reprisals and reserving the right to take disciplinary action against those who consciously disregard the rules.

Suppose we both drink and drive tonight, she says. "I could make it home scot-free, you kill somebody.

"And there's an element of chance to that ... But the conscious disregard is your decision to drink and drive."

Apparently small problems can also have huge consequences.

Everyone is familiar with getting into a different car and, in busy traffic, accidentally switching the windscreen wipers on when you meant to indicate.

Callaghan says a fatal 1995 plane crash near Hamilton occurred partly because the pilots were flying an aircraft identical to the one they normally used, except for the fuel management system.

"Some people died there but the underlying action is the same as you [mistakenly] flicking the windscreen wipers."

Another common kneejerk response is improved training, which she says has become a catch-all corporate response to failure, even though most of us already know when we're doing something wrong.

An apparently crazy decision by a factory worker who removes a safety guard and loses his arm is the same kind of choice we make each time we jaywalk across a busy street.

"The choice I make is not 'Wait at the pedestrian lights or die'. It's normally something like 'I'm late for a meeting and it's my boss' - so I dodge through traffic."

Callaghan says it's also unrealistic to say staff should speak up if everyone knows they will be punished, openly or otherwise, for doing so.

This culture of saying one thing but doing another frequently leads to dangerous shortcuts.

"We call them 'routine violations', where the rule says 'X' but everyone does it another way. Everybody's aware that they're doing it another way, including supervisors, and you just get on.

"When the s... hits the fan, that's when somebody invokes the rule again and decides to take out the individual rule-breaker." Erebus had a strong element of that, she says.

On the face of it, Captain Jim Collins went below the minimum descent altitude and crashed into the mountain.

But several pilots - supported by a company brochure - said Air New Zealand routinely ignored the rule to give passengers a better view.

Even the Costa Concordia sinking off the coast of Italy in January may stem from more than just the actions of its notoriously cowardly commander, Captain Francesco Schettino.

Subsequent reports have revealed the company had approved an even closer "sail by" in August, many lifeboats could not be launched because the ship was tilting too sharply and new passengers had not been given a safety drill.

Callaghan's final tip is to avoid making new rules for the sake of it.

"We often see solutions implemented before the problem's been defined and then people run around going: 'But it's not working'."

Callaghan and her colleague Bridget Mintoft say that, increasingly, some businesses understand their ideas and can see benefits beyond safety.

For instance, Mintoft is researching how long personal investors are willing to stick with losing stocks under stress, which many firms could find directly useful.

It is not ivory tower science, says Callaghan, passionately. It's about understanding how to get the best out of human beings, "which is actually really sexy".

She laughs at her own enthusiasm.

"If you do it right, it can have an immediate positive effect on people.

"Whereas exhortations to pay more attention or just do it better or let's get rid of the bad bastards ... [she lowers her voice to a stage whisper] ... it doesn't actually get you anywhere."

Quick fixes - and why they often don't work:
Fire someone: A new person might make the same mistake.
More training: Useless if staff were already trained but knowingly did the wrong thing.
New rules: Unlikely to work if the old rules were ignored, often with the tacit approval of managers.
