Putting diversity in the picture

Today I was putting together a presentation that I’m delivering tomorrow (at way too early in the morning – who schedules 7:30am meetings? wind turbine technicians, that’s who).

Anyway, I was poaching photos off the internet for my slides, and here was my shopping list:

  • Someone driving a car
  • Someone working on machinery while ticking things off on a clipboard
  • Someone pulling electronics to pieces or maybe doing electrical testing

Here’s the sort of screen which comes up in your search results if you search for “someone driving a car”:

[Image: search results for “someone driving a car”]

My other search results were similar. Where the women at? I was scared to see what kind of misogynistic nonsense might turn up with a search like “woman driving a car”, but it turned out ok.

[Image: search results for “woman driving a car”]

In the end I got the reasonable photo diversity I was looking for by just going a bit further with my search terms to get beyond the default “white guy” filter. The search terms that I used in the end went something like this:

  • “woman driving with a cup of coffee”
  • “working on machinery with a clipboard”
  • “woman electrical fault finding”

Putting a bit of diversity and inclusion into your content isn’t difficult; you just have to recognise that it’s something worth doing, and be aware of your own biases. Here’s that slide:

[Slide: work mode vs error rate]

Learn your lessons (or crash and burn)

(this article was first posted at eSocSci.org.nz)

On 31 October 2014, the world’s first commercial passenger space ship crashed, killing the co-pilot and seriously injuring the pilot. SpaceShipTwo (also known as ‘VSS Enterprise’) was undergoing tests after being reconfigured into what was intended to be its final state for taking passengers on suborbital joyrides to the edge of space. The US National Transportation Safety Board (NTSB) released its report into the crash on 28 July 2015, determining that a ‘root cause’ of the crash was human error by the co-pilot, who prematurely unlocked the tail-swivel mechanism while travelling at Mach 0.8, a speed too slow for stability in that configuration.

It is a natural reaction to look for somewhere to assign blame after a major accident or incident, and we often hear that plane or train crashes happened because of pilot error. However, if we dig a bit deeper we find a different story. SpaceShipTwo was designed to have simple controls – sticks and pedals rather than computers. The tail-swivel ‘feathering’ re-entry system was supposed to be a foolproof mechanical solution – the ship could re-enter the atmosphere at any angle and right itself like a shuttlecock. But it relied on skilled operators making no mistakes, like unlocking the mechanism at Mach 0.8 instead of within the safe zone of Mach 1.2–1.8. Manned spaceflight was being reinvented from scratch, with funding from none other than Richard Branson. In doing so, however, all the old mistakes were simply being made all over again. The NTSB report notes that the design, safety assessment, and operation of the craft did not meet any of the US Federal Aviation Administration (FAA) guidelines for human factors (i.e. consideration of human error and the like). So perhaps it would make much more sense to blame the designers and company management rather than the pilot. For that matter, the FAA had for some reason granted special exemptions from those design requirements, and had given its blessing for the dangerous practices to continue.

Not long beforehand, Richard Branson had been taking photos with the crew of VSS (Virgin Space Ship) Enterprise and referring to them as part of the Virgin Galactic family. After the incident, Virgin press releases described the incident as involving Scaled Composites employees (Scaled Composites being the subcontractor that developed the ship and was preparing it for commercial spaceflight operations).

But on another level, does it make sense for us to blame anyone? Industrial incidents are often the result of a large number of factors converging in a complex system, leading to results that nobody wanted or expected. In that case, whose interests are served by assigning blame to individuals, destroying reputations, and possibly locking up one or two people while others are let off the hook to continue unsafe practices?

When workers or members of the public die in a tragic incident, there is sometimes a cry for vengeance. Journalists, politicians, or public prosecutors sometimes lead the call to find out whose fault it is, so that they can be held accountable. However, the victims and their families almost always eventually come to the conclusion that what they really want is for lessons to be learned so that such an event never happens again (see Berlinger’s book ‘After Harm: Medical Error and the Ethics of Forgiveness’, 2005). Generally speaking, people sue to ‘get the truth out’ more than to ‘punish those responsible’. Sometimes rogue operators do need to be shut down. Sometimes folks simply aren’t going to get the message or change their ways without being forced to do so. Sometimes the public needs to be protected from true psychopaths (or psychopathic corporations) who seem hell bent on repeating tragedy. Other times the players involved really did have the best intentions but simply did not know another way to operate, or did not appreciate the true risks. The ultimate goal should be for all players in industry to learn from the mistakes of the past so that they are not needlessly repeated.

Scaled Composites won the Ansari X-Prize with SpaceShipOne (SpaceShipTwo’s precursor), putting together the first ever successful commercial suborbital spaceflight operation. They ushered in a new Second Age of Space, driven by agile business rather than ponderous government. Space could be reached more easily and more cheaply by throwing off the yoke of bureaucracy. Was there any need to operate under the crushing constraints of NASA space flight? Well, it turns out that (in some cases at least) the answer is: yes. Scaled Composites and Virgin Galactic (re)discovered the hard way that there were good reasons for all of those FAA safety regulations and human factors requirements; they were put in place by incident investigators of the past, in the hope that ‘this never happens again’.

Today’s leading thinkers in safety and human error tell us, from evidence-based research, that what you really want is not a Blame Culture, nor a No-Blame Culture, but a ‘Just Culture’ which sits somewhere in between (see Sidney Dekker’s book ‘Just Culture’, 2012, and Erik Hollnagel’s book ‘Safety-I and Safety-II’, 2014). That’s a culture in which everyone takes accountability for their part in an incident by making amends, which means owning up to their responsibility and doing what they can to make things right. Blaming people and doling out punishment or retribution actually reduces accountability by forcing people into defensive or adversarial positions. Yes, you need to have systems in place to deal with the true psychopaths, but if you can get past that then true progress is made when everyone involved is willing to talk with each other and make the necessary changes to ensure that ‘this never happens again’.

With New Zealand’s new Health and Safety laws coming out soon, we are in a unique position to get things right. New Zealand’s past experiment of firing all the Department of Labour safety inspectors and applying an ideology that ‘business performs best when left well alone’ was a failure. It turned out (to nobody’s real surprise) that there needs to be a referee on the pitch reminding everyone to play by the rules, along with a system that encourages people to discuss hazards and best practices for dealing with them, and some sort of system for people to refer to the body of knowledge collected from past incidents.

One aspect of the new legislation really bothers me. Penalties in the new Health and Safety at Work Act won’t discriminate between actual events (which injured or killed people) and near misses (dangerous situations in which someone could have been killed or injured). This may sound reasonable on the face of it – after all, by paying attention to near misses we could potentially avert far more incidents before they ever occur, and it’s true that a system isn’t safe just because there hasn’t yet been an incident. Win-win, surely?

Well, maybe. It all comes down to implementation. If WorkSafeNZ were to begin prosecuting people who report their own mistakes in good faith, then what do you think will happen to reporting? When people have good reason to believe that they or their company will be targeted for speaking up or pointing out unsafe practices, the safety conversation shuts down – quickly and totally. Again, this has been proven by trial and error in the past. A key element of a ‘Just Culture’ is: who gets to make the final call about who is responsible and where accountability lies? Does that arbiter have the trust of the community to make those decisions? If there is trust, there will be positive feedback as more and more people join the safety conversation. If, on the other hand, the WorkSafeNZ regulators come out with all guns blazing, issuing judgements and assigning blame before seeking any sort of community buy-in, then there will be no ongoing safety conversation and New Zealanders will suffer another generation of downward-spiralling safety statistics. In that case, the lessons of the past will continue to go unheard.

Some more references:

  • NTSB findings on the crash of SpaceShipTwo
  • New Zealand’s new Health and Safety at Work Act (see especially Part 2, Subpart 3, Sections 42–44, ‘Offences’)
  • More info about Human Factors and Just Culture by Sidney Dekker