In a recent article for Future Changes, Bill Arconati – Confluence Product Marketing Manager at Atlassian – argues that enterprise wikis are much better than contemporary e-mail culture, creating what he calls an opt-in culture:
“In an opt-in culture, employees contribute to conversations where they gain the most satisfaction and have the largest impact. They look beyond their tiny fiefdoms and seek out situations where they can add value and offer their expertise.” – Wikis, “Opt-in Culture” Contribute to a Healthy Organization
He contrasts the opt-in culture with its opposite – the opt-out e-mail culture – which completely dominates the business world:
Perhaps the best way to understand and appreciate an opt-in culture is by contrasting it to an opt-OUT culture like email. Have you ever left work at the end of the day and thought to yourself, “All I did today was respond to emails?” In email-based companies you frequently spend your days knocking down emails like a bad game of Whac-A-Mole.
The main problem with email is that you have little control over what lands in your inbox. Most emails are either (i) people asking you to do something or (ii) conversations between two or three people (frequently executives) with a dozen innocent bystanders in the cc line. The only way to shut out the noise in an email culture is to opt-out and say “Take me off this thread!”
Even if you successfully filter out mail you don’t want, there’s little you can do about the email you’re NOT receiving. Important management decisions are made every day on your corporate email server without the input of your company’s most interested and qualified employees. For example, I’m in marketing but I’ve worked in product development and corporate finance in past roles. I’d like to think I have something to offer to conversations about product development and financial analysis even though they’re technically outside of my designated role. But in an email culture, I wouldn’t be cc’d on those emails and hence not part of the conversation simply because I’m a marketing guy. Much of the knowledge and experience that I bring to the organization would be completely wasted in an email-based culture.
He is right: there is terrible waste in the fire-and-forget e-mail culture, with massive numbers of hours lost simply to checking that mails can be safely discarded.
Bill ends by explaining how to use wikis to develop an opt-in culture:
- Communities of interest – deploy a wiki that lets you create a separate space for every area of interest.
- Comments and Discussions – deploy a wiki where conversations can naturally evolve out of content.
- Subscriptions – deploy a wiki where users can opt-in to conversations happening in the wiki either by subscribing via email or via RSS. With email and RSS notifications, users can actually monitor and participate in conversations happening all across the company.
- Openness – deploy a wiki where openness is the default.
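As an illustration of the RSS half of that subscription model, here is a minimal Python sketch using only the standard library. The feed content is invented for the example – real wiki products expose their own activity-feed URLs – but the parsing pattern is the same:

```python
import xml.etree.ElementTree as ET

# Invented RSS 2.0 snippet standing in for a wiki space's activity feed.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Product Development Space</title>
    <item>
      <title>Draft: Q3 roadmap</title>
      <link>http://wiki.example.com/display/PD/q3-roadmap</link>
    </item>
    <item>
      <title>Comment on pricing model</title>
      <link>http://wiki.example.com/display/PD/pricing</link>
    </item>
  </channel>
</rss>"""

def list_updates(feed_xml):
    """Return (title, link) pairs for each item in an RSS feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in list_updates(SAMPLE_FEED):
    print(f"- {title}: {link}")
```

Polling a feed like this (or routing it into a reader) is what lets an employee opt in to a conversation in another team's space without being on anyone's cc line.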
[Update1: This was my submission to Executive Rockstar’s Best Career Advice competition ]
[Update2: This book looks promising: “Know What You Don’t Know: How Great Companies Fix Problems Before They Happen” by Michael A. Roberto]
[Update3: Why Systems Fail and Problems Sprout Anew]
The best career advice that I ever received was from Steve O’Donnell, currently SVP IT Infrastructure & Operations at First Data International, celebrity blogger and former Global Head of Data Centre & Customer Experience Management at BT (where we worked together):
One day he said:
“Jonathan, I will never fire you for an honest mistake but if you lie to me, ever, you will be out the door in a minute. There is no mistake that you can make that I cannot figure out how to fix IF you tell me about it immediately. Be honest with me and you are safe, lie to me and you are gone.”
This is a golden rule in effective technical operations. It creates a culture of honesty and safety – not being afraid of reporting errors or lapses – that leads to true Kaizen: genuine self-correction and organisational self-improvement, because you are able to deal with errors systematically (i.e. by tweaking systems) and without the damage of the blame game and deferred responsibility.
His advice is particularly important in environments where errors are rare but extremely serious when they do occur – like executive boardrooms, aircraft maintenance hangars or hospitals. The practice of telling the truth about minor errors is central to a precursor-based error detection system (i.e. spotting the warning signs early), which is in turn at the centre of truly effective operations management (and every other system).
Kevin Kelly explains the issue in a brilliant post about “Looking for Ugly“:
How do you prevent major errors in a system built to successfully keep major errors to a minimum? You look for the ugly.
The safety of aircraft is so essential it is regulated in hopes that regulation can decrease errors. Error prevention enforced by legal penalties presents a problem, though: severe penalties discourage disclosure of problems early enough to be remedied. To counter that human tendency, the US FAA has generally allowed airlines to admit errors they find without punishing them. These smaller infractions are the “ugly.” By themselves they aren’t significant, but they can compound with other small “uglies.” Often times they are so minimal — perhaps a worn valve, or discolored pipe — that one can hardly call them errors. They are just precursors to something breaking down the road. Other times they are things that break without causing harm.
The general agreement in the industry is that a policy of unpunished infractions encourages quicker repairs and reduces the chances of major failures. Of course not punishing companies for safety violations rubs some people the wrong way. A recent Times article reports on the Congressional investigation into whether this policy of unpunished disclosure should continue, which issued the quote above. The Times says:
“We live in an era right now where we’re blessed with extremely safe systems,” said one panel member, William McCabe, a veteran of business aviation companies. “You can’t use forensics,” he said, because there are not enough accidents to analyze.
“You’re looking for ugly,” Mr. McCabe said. “You ask your people to look for ugly.” A successful safety system, he said, “acknowledges, recognizes and rewards people for coming forward and saying, ‘That might be one of your precursors.’ “
Looking for ugly is a great way to describe a precursor-based error detection system. You are not really searching for failure as much as signs failure will begin. These are less like errors and more like deviations. Offcenter in an unhealthy way. For some very large systems — like airplanes, human health, ecosystems — detection of deviations is more art than science, more a matter of beauty or the lack of it.
Come to think of it, looking for ugly is how we assess our own health. I suspect looking for ugly is how we will be assessing complex systems like robots, AIs and virtual realities.
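The core of a precursor-based detector – watching for small deviations from a baseline rather than outright failures – can be sketched in a few lines of code. The metric names, readings and threshold below are all invented for illustration; this is a sketch of the idea, not an operations tool:

```python
from statistics import mean, stdev

def find_uglies(history, latest, z_threshold=2.0):
    """Flag metrics whose latest reading deviates notably from their history.

    history: dict of metric name -> list of past readings
    latest:  dict of metric name -> most recent reading
    Returns the names of metrics whose latest value lies more than
    z_threshold standard deviations from the historical mean.
    """
    uglies = []
    for name, readings in history.items():
        mu, sigma = mean(readings), stdev(readings)
        if sigma and abs(latest[name] - mu) / sigma > z_threshold:
            uglies.append(name)
    return uglies

# Invented example: the valve's wear reading drifts; the pipe stays normal.
history = {
    "valve_wear_mm": [0.10, 0.11, 0.10, 0.12, 0.11],
    "pipe_discoloration": [1.0, 1.1, 0.9, 1.0, 1.0],
}
latest = {"valve_wear_mm": 0.25, "pipe_discoloration": 1.05}
print(find_uglies(history, latest))  # the valve is a precursor, not yet a failure
```

Nothing here has failed yet – the point is that the flagged reading is “offcenter in an unhealthy way,” which is exactly the kind of signal an honesty-first culture encourages people to report.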
So, in short: Create a professional environment that enables and encourages your team to detect, report and deal with the “ugly”.
[Update: I mailed Steve my submission and I was delighted to see he blogged about it on his Hot Aisle blog]