Rig blast a recipe for disaster

July 28th, 2010

My “Open Range” column from the July 2010 issue of Alberta Venture magazine:

Over-reliance on processes, systems and modelling trumps human judgment

One reads with a sickening sense of impending doom the recent second-by-second accounts of the Deepwater Horizon’s final minutes and the lives that were torn apart and snuffed out. There’s more than enough tragedy and folly – panicked rig workers reportedly crawling over injured comrades in the rush to get aboard life rafts. Also disturbing are suggestions that the rig’s ultra-modern computerized systems and BP’s inches-thick safety manuals stifled quick thinking and decisive action. If workers on the drilling floor noticed a natural gas build-up – a potentially fatal occurrence – they needed approval from two supervisors before declaring an emergency or taking action. Think about that. Every second counts, and the rules demand you waste minutes hunting down not one, but two bosses. One of them, the Wall Street Journal recently reported, was in the shower.

What strikes me most about this is the company’s reliance on systems, processes, manuals, models, computers – to the exclusion of sound human judgment. You may snigger at the image of a supervisor showering at the moment disaster strikes. But don’t we all need to sleep, eat and care for ourselves? What kind of planning for an essential decision-point – one where delay equals death – ignores normal human functions? Whoever came up with this stuff crafted a procedure for every contingency but overlooked the obvious. Human error triggered the chain of events – but rigid process may have prevented individual ingenuity from averting disaster.

Similar examples abound, disturbingly so. One was the shutdown of air travel throughout Europe due to Iceland’s volcanic dust cloud. Most of us assumed those ominous TV cloud maps stemmed from direct observation using weather balloons or satellites or whatnot. Forget it. It was all computer models. Here at home, last fall’s slow distribution of H1N1 vaccine, combined with the politically correct insistence on indiscriminate vaccination, reflected a similar logic. In both cases, the systems and processes planned for everything except what actually happened, while leaving little scope to adjust through clear-eyed observation and clear-headed judgment.

This mentality may also explain why security planning for major events like the Olympics and G8 is so over the top. If everyone’s been told exactly what to do under dozens of foreseen scenarios, they’re unlikely to have any idea what to do under the unforeseen one. A similar mindset also underlay the multi-trillion-dollar financial meltdown of 2008. The bubble had rested on the conceit that every financial risk was identified and quantified. From this flowed the mad scheme to insure the entire economy against systemic risk. But the entire structure was based on exactly five years of real-world market data. It excluded 98 per cent of stock-market history – including every previous boom and bust. Its risk probabilities were ruinously off-base, so every comforting belief that followed was mere delusion.

When things go badly wrong, many point to government cutbacks. In its stampede to cut costs, government hands the regulatory keys to industry. Companies file reports and updates without independent scrutiny or verification. Not only international finance but provincial energy royalties, forestry quotas and environmental remediation all tick along with little or no due diligence, the bureaucrats rubber-stamping industry’s submissions.

There’s something to the captive regulator thesis, but not so much from cost-cutting, since government spending has been exploding nearly everywhere. Bureaucrats are among the most reliant on systems, protocols, manuals and procedures, and few are more obsessed with process and paralyzed by risk management (call it the “precautionary principle” or plain old butt-covering). The no-fly order in Europe came from amply funded regulators who were hardly captive to their industry. It was left to private aviators like former race-car driver Niki Lauda to go up and hunt for the phantom cloud, shaming the regulators into relenting. When governments do cut, it’s usually the front-line inspections and human inspectors. The very people in a position to observe and report are replaced by buffoonish computer models and Kafkaesque procedure manuals.

The urge to eliminate risk, utopian in its conceit and usually inept in execution, creates gigantic unknown risks threatening virtually every complex system, institution or facility. I think part of the notorious Syncrude duck kill was due to an over-reliance on process and procedure. There was a “system” and a “plan” to keep ducks away from the toxic tailings. They just didn’t cover a late-season snowfall, and when one came, company staff didn’t or couldn’t act. But it could be anything. The disaster that comes is precisely the unimagined, unknown, unplanned. That’s when human flexibility and ingenuity are all that remain.

Yet today’s methodology increasingly circumscribes individual freedom of action. We’ve created enormous moral hazard: paralysis in the face of unforeseen events. Initiative, judgment, quick thinking, decisiveness and courage – the very ingredients of heroism – have been pushed outside the realm of acceptable behaviour. Not only can they not be described in procedure manuals, they appal the process-obsessed functionary because they entail risk and shame those whose careers consist of inaction. We can’t know what will happen next time. Something will. But our entire civilization operates under the delusion that we’ve managed these unknown unknowns out of existence.

By George Koch