[Update1: This was my submission to Executive Rockstar’s Best Career Advice competition ]
[Update2: This book looks promising: “Know What You Don’t Know: How Great Companies Fix Problems Before They Happen” by Michael A. Roberto]
[Update3: Why Systems Fail and Problems Sprout Anew]
The best career advice that I ever received was from Steve O’Donnell, currently SVP IT Infrastructure & Operations at First Data International, celebrity blogger and former Global Head of Data Centre & Customer Experience Management at BT (where we worked together):
One day he said:
“Jonathan, I will never fire you for an honest mistake but if you lie to me, ever, you will be out the door in a minute. There is no mistake that you can make that I cannot figure out how to fix IF you tell me about it immediately. Be honest with me and you are safe, lie to me and you are gone.”
This is a golden rule in effective technical operations. It creates a culture of honesty and safety – not being afraid of reporting errors or lapses – that leads to true Kaizen: genuine self-correction and organisational self-improvement, because errors can be dealt with systematically (i.e. by tweaking systems) and without the damage of the blame game and deferred responsibility.
His advice is particularly important in environments where errors are rare but extremely serious when they do occur – executive boardrooms, aircraft maintenance hangars or hospitals. The practice of telling the truth about minor errors is central to a precursor-based error detection system (i.e. spotting the warning signs early), which is in turn at the centre of truly effective operations management (and every other system).
Kevin Kelly explains the issue in a brilliant post about “Looking for Ugly”:
How do you prevent major errors in a system built to successfully keep major errors to a minimum? You look for the ugly.
The safety of aircraft is so essential it is regulated in hopes that regulation can decrease errors. Error prevention enforced by legal penalties presents a problem, though: severe penalties discourage disclosure of problems early enough to be remedied. To counter that human tendency, the US FAA has generally allowed airlines to admit errors they find without punishing them. These smaller infractions are the “ugly.” By themselves they aren’t significant, but they can compound with other small “uglies.” Oftentimes they are so minimal — perhaps a worn valve, or discolored pipe — that one can hardly call them errors. They are just precursors to something breaking down the road. Other times they are things that break without causing harm.
The general agreement in the industry is that a policy of unpunished infractions encourages quicker repairs and reduces the chances of major failures. Of course not punishing companies for safety violations rubs some people the wrong way. A recent Times article reports on the Congressional investigation into whether this policy of unpunished disclosure should continue, which issued the quote above. The Times says:
“We live in an era right now where we’re blessed with extremely safe systems,” said one panel member, William McCabe, a veteran of business aviation companies. “You can’t use forensics,” he said, because there are not enough accidents to analyze.
“You’re looking for ugly,” Mr. McCabe said. “You ask your people to look for ugly.” A successful safety system, he said, “acknowledges, recognizes and rewards people for coming forward and saying, ‘That might be one of your precursors.’ “
Looking for ugly is a great way to describe a precursor-based error detection system. You are not really searching for failure as much as signs failure will begin. These are less like errors and more like deviations. Offcenter in an unhealthy way. For some very large systems — like airplanes, human health, ecosystems — detection of deviations is more art than science, more a matter of beauty or the lack of it.
Come to think of it, looking for ugly is how we assess our own health. I suspect looking for ugly is how we will be assessing complex systems like robots, AIs and virtual realities.
So, in short: Create a professional environment that enables and encourages your team to detect, report and deal with the “ugly”.
[Update: I mailed Steve my submission and I was delighted to see he blogged about it on his Hot Aisle blog]