Four degrees of separation that help simplify work

From: http://changingminds.org/blog/0902blog/090225blog.htm

Much work these days is packaged up as projects, with plans, resources and time-bound deliverables. Managing projects is a skill in itself, as the various risks and issues can easily trip you up. In particular, sheer complexity can create much extra work and conceal important issues.

Here, then, are four ways of making things simpler by separating out things that need your attention in different ways.

1. Separate rapidly changing things from slowly changing things. This makes changes (and communication about them) easier. For example, a strategic plan, which changes little, is separated from a rapidly-changing tactical plan.

2. Separate things that require attention now from points of information. This allows a sharper focus on action. For example, items that require decisions may be covered first in a meeting, with informational discussions continuing in the remaining time.

3. Separate planned action from unexpected action. This allows both to be clearly managed and for plans to be revised as needed. For example, issues are managed separately from standard project plans, so that each gets proper attention.

4. Separate internal project communications from external communications. Internal communications can be detailed, technical, textual and full of jargon. External communications should be focused, brief, visual and use Plain English.

You can also use the principle of separation to create clarity in documents and presentations by:

* Using colour, bold fonts, and other visual contrasts.
* Using lines and physical separation.
* Visual/physical separation into sections, pages, documents.

Declaration of Interdependence (Kanban)

I enjoyed this 2005 declaration from David Anderson:

Declaration of Interdependence

We are a community of project leaders that are highly successful at delivering results. To achieve these results:

  • We increase return on investment by making continuous flow of value our focus.
  • We deliver reliable results by engaging customers in frequent interactions and shared ownership.
  • We expect uncertainty and manage for it through iterations, anticipation, and adaptation.
  • We unleash creativity and innovation by recognizing that individuals are the ultimate source of value, and creating an environment where they can make a difference.
  • We boost performance through group accountability for results and shared responsibility for team effectiveness.
  • We improve effectiveness and reliability through situationally specific strategies, processes and practices.

From: David J. Anderson and Associates

Jerry Weinberg’s ten laws of trust

Jerry Weinberg is a legend in project management and consulting circles. Here are his ten Laws of Trust:

1. Nobody but you cares about the reason you let another person down.
2. Trust takes years to win, moments to lose.
3. People don’t tell you when they stop trusting you.
4. The trick of earning trust is to avoid all tricks.
5. People are never liars—in their own eyes.
6. Always trust your client—and cut the cards.
7. Never be dishonest, even if the client requests it.
8. Never promise anything.
9. Always keep your promise.
10. Get it in writing, but depend on trust.

From: Conferences That Work | Jerry Weinberg’s ten laws of trust

Elliott Jaques and Requisite Organisation

From the Economist’s Guru section article on Elliott Jaques:

Jaques (1917-2003) decided that jobs could be defined in terms of their time horizon. For example, a director of marketing might be worried about marketing campaigns for next year, while a salesman on the road is worried about reaching his targets for the week. Jaques also believed that people had a “boss” and a “real boss”. The boss was the person to whom they were nominally responsible, while the real boss was the person to whom they turned to get decisions crucial to the continuation of their work.

The sales manager in charge of a salesforce would not have a longer time horizon than the people in his salesforce. So when a salesman wanted a decision on something affecting his ability to deliver to his clients, he would go over the head of the sales manager for that decision. Jaques called this “level skipping”, and identified it as a dangerous pathology in any hierarchy.

He then looked at the time horizons of people, their bosses and their real bosses, and he found that people with a time horizon of less than three months treated those with a horizon of 3–12 months as their real bosses, and so on up the scale. He identified seven different time horizons, from three months to 20 years, and argued that organisations, no matter how complex, should have seven levels of hierarchy, each corresponding to a different managerial time horizon. Jaques’s theory has come to be known as RO (requisite organisation).
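Jaques’s mapping from time horizon to hierarchy level can be sketched in code. Note that only the first boundary (three months) and the top of the scale (20 years) appear in the text above; the intermediate cut-offs below are the strata commonly attributed to Requisite Organisation, so treat them as illustrative assumptions rather than Jaques’s exact figures.

```python
# A minimal sketch of the Requisite Organisation idea that managerial
# level tracks time horizon. The intermediate cut-offs are assumptions.

# Upper bound of each stratum's time horizon, in months:
# 3 months, 1 year, 2 years, 5 years, 10 years, 20 years.
STRATA_UPPER_BOUNDS = [3, 12, 24, 60, 120, 240]

def requisite_level(horizon_months: float) -> int:
    """Map a role's time horizon to a hierarchy level (1..7)."""
    for level, upper in enumerate(STRATA_UPPER_BOUNDS, start=1):
        if horizon_months <= upper:
            return level
    return 7  # horizons beyond 20 years

# A salesman planning week to week, and the "real boss" he turns to,
# who plans roughly a year out, sit one level apart:
assert requisite_level(0.25) == 1
assert requisite_level(12) == 2
```

On this sketch, “level skipping” is simply a subordinate seeking decisions from someone two or more levels up, because the nominal boss shares the subordinate’s horizon.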

This reminds me of the Tolstoy quotation from C.S. Lewis’s “The Inner Ring”:

“When Boris entered the room, Prince Andrey was listening to an old general, wearing his decorations, who was reporting something to Prince Andrey, with an expression of soldierly servility on his purple face. ‘Alright. Please wait!’ he said to the general, speaking in Russian with the French accent which he used when he spoke with contempt. The moment he noticed Boris he stopped listening to the general, who trotted imploringly after him and begged to be heard, while Prince Andrey turned to Boris with a cheerful smile and a nod of the head. Boris now clearly understood, what he had already guessed, that side by side with the system of discipline and subordination which were laid down in the Army Regulations, there existed a different and a more real system: the system which compelled a tightly laced general with a purple face to wait respectfully for his turn while a mere captain like Prince Andrey chatted with a mere second lieutenant like Boris. Boris decided at once that he would be guided not by the official system but by this other unwritten system.”

How wikis can foster an “Opt-in Culture”

In a recent article for Future Changes, Bill Arconati – Confluence Product Marketing Manager at Atlassian – argues that enterprise wikis are much better than contemporary e-mail culture, creating what he calls an opt-in culture:

“In an opt-in culture, employees contribute to conversations where they gain the most satisfaction and have the largest impact. They look beyond their tiny fiefdoms and seek out situations where they can add value and offer their expertise.” – Wikis, “Opt-in Culture” Contribute to a Healthy Organization

He contrasts the opt-in culture with its opposite – the opt-out e-mail culture – that completely dominates the business world:

Perhaps the best way to understand and appreciate an opt-in culture is by contrasting it to an opt-OUT culture like email. Have you ever left work at the end of the day and thought to yourself, “All I did today was respond to emails?” In email-based companies you frequently spend your days knocking down emails like a bad game of Whac-A-Mole.

The main problem with email is that you have little control over what lands in your inbox. Most emails are either (i) people asking you to do something or (ii) conversations between two or three people (frequently executives) with a dozen innocent bystanders in the cc line. The only way to shut out the noise in an email culture is to opt-out and say “Take me off this thread!”

Even if you successfully filter out mail you don’t want, there’s little you can do about the email you’re NOT receiving. Important management decisions are made every day on your corporate email server without the input of your company’s most interested and qualified employees. For example, I’m in marketing but I’ve worked in product development and corporate finance in past roles. I’d like to think I have something to offer to conversations about product development and financial analysis even though they’re technically outside of my designated role. But in an email culture, I wouldn’t be cc’d on those emails and hence not part of the conversation simply because I’m a marketing guy. Much of the knowledge and experience that I bring to the organization would be completely wasted in an email-based culture.

He is right: there is terrible waste in the fire-and-forget e-mail culture, with massive numbers of hours lost simply to checking that mails can be safely discarded.

Bill ends by explaining how to use wikis to develop an opt-in culture:

  1. Communities of interest – deploy a wiki that lets you create a separate space for every area of interest.
  2. Comments and Discussions – deploy a wiki where conversations can naturally evolve out of content.
  3. Subscriptions – deploy a wiki where users can opt-in to conversations happening in the wiki either by subscribing via email or via RSS. With email and RSS notifications, users can actually monitor and participate in conversations happening all across the company.
  4. Openness – consider a wiki where openness is the default.
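The opt-in mechanics of points 1 to 3 can be captured in a toy model: users subscribe to the spaces they care about, and a post notifies only those who opted in, while everyone else stays undisturbed. The class and method names below are invented for illustration and are not any real wiki product’s API.

```python
# A toy model of opt-in notification, as contrasted with opt-out email
# where everything lands in your inbox unless you filter it out.
from collections import defaultdict

class Wiki:
    def __init__(self):
        self.subscribers = defaultdict(set)   # space -> set of users

    def subscribe(self, user, space):
        # Opt-in: users choose which conversations to follow.
        self.subscribers[space].add(user)

    def post(self, space, message):
        # Only subscribers are notified; nobody else sees noise.
        return {user: f"[{space}] {message}" for user in self.subscribers[space]}

wiki = Wiki()
wiki.subscribe("bill", "product-development")  # the marketing guy opting in
wiki.subscribe("ana", "finance")

notices = wiki.post("product-development", "Draft roadmap up for comment")
assert "bill" in notices and "ana" not in notices
```

The design point is that the routing decision sits with the reader, not the sender, which is exactly the inversion Arconati describes.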

Read on: http://www.ikiw.org/2009/03/04/wikis-opt-in-culture-contribute-to-a-healthy-organization/

“Looking for Ugly” in the honest workplace

[Update1: This was my submission to Executive Rockstar’s Best Career Advice competition ]

[Update2: This book looks promising: “Know What You Don’t Know: How Great Companies Fix Problems Before They Happen” by Michael A. Roberto]

[Update3: Why Systems Fail and Problems Sprout Anew]

The best career advice that I ever received was from Steve O’Donnell, currently SVP IT Infrastructure & Operations at First Data International, celebrity blogger and former Global Head of Data Centre & Customer Experience Management at BT (where we worked together):

One day he said:

“Jonathan, I will never fire you for an honest mistake but if you lie to me, ever, you will be out the door in a minute. There is no mistake that you can make that I cannot figure out how to fix IF you tell me about it immediately. Be honest with me and you are safe, lie to me and you are gone.”

This is a golden rule in effective technical operations. It creates a culture of honesty and safety – not being afraid of reporting errors or lapses – that leads to true kaizen: genuine self-correction and organisational self-improvement, because you are able to deal with errors systematically (i.e. by tweaking systems) and without the damage of the blame game and deferred responsibility.

His advice is particularly important in environments where errors are rare but extremely serious when they do occur – like executive boardrooms, aircraft maintenance hangars or hospitals. The practice of telling the truth about minor errors is central to precursor-based error detection (i.e. spotting the warning signs early), which is in turn at the centre of truly effective operations management (and every other system).

Kevin Kelly explains the issue in a brilliant post about “Looking for Ugly”:

How do you prevent major errors in a system built to successfully keep major errors to a minimum?  You look for the ugly.

The safety of aircraft is so essential it is regulated in hopes that regulation can decrease errors. Error prevention enforced by legal penalties presents a problem, though: severe penalties discourage disclosure of problems early enough to be remedied. To counter that human tendency, the US FAA has generally allowed airlines to admit errors they find without punishing them. These smaller infractions are the “ugly.” By themselves they aren’t significant, but they can compound with other small “uglies.” Oftentimes they are so minimal — perhaps a worn valve, or discolored pipe — that one can hardly call them errors. They are just precursors to something breaking down the road. Other times they are things that break without causing harm.

The general agreement in the industry is that a policy of unpunished infractions encourages quicker repairs and reduces the chances of major failures. Of course not punishing companies for safety violations rubs some people the wrong way. A recent Times article reports on the Congressional investigation into whether this policy of unpunished disclosure should continue, which issued the quote above. The Times says:

“We live in an era right now where we’re blessed with extremely safe systems,” said one panel member, William McCabe, a veteran of business aviation companies. “You can’t use forensics,” he said, because there are not enough accidents to analyze.

“You’re looking for ugly,” Mr. McCabe said. “You ask your people to look for ugly.” A successful safety system, he said, “acknowledges, recognizes and rewards people for coming forward and saying, ‘That might be one of your precursors.’ “

Looking for ugly is a great way to describe a precursor-based error detection system. You are not really searching for failure as much as signs failure will begin. These are less like errors and more like deviations. Offcenter in an unhealthy way.  For some very large systems — like airplanes, human health, ecosystems — detection of deviations is more art than science, more a matter of beauty or the lack of it.

Come to think of it, looking for ugly is how we assess our own health. I suspect looking for ugly is how we will be assessing complex systems like robots, AIs and virtual realities.

So, in short: create a professional environment that enables and encourages your team to detect, report and deal with the “ugly”.
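As a rough sketch, a precursor-based detection system of this kind does two things: it records minor deviations without blame, and it flags a component for review once small uglies start to compound. The threshold, window and names below are illustrative assumptions, not anything from Kelly’s post or from aviation practice.

```python
# A minimal sketch of "looking for ugly": log minor deviations freely,
# flag a component once several accumulate in the same place.
from collections import Counter

class UglyLog:
    def __init__(self, review_threshold=3):
        self.counts = Counter()
        self.review_threshold = review_threshold

    def report(self, component, note):
        """Record a minor deviation; reporting is never punished."""
        self.counts[component] += 1
        return self.needs_review(component)

    def needs_review(self, component):
        # One ugly is noise; several in one place look like a precursor.
        return self.counts[component] >= self.review_threshold

log = UglyLog()
log.report("hydraulics", "worn valve")
log.report("hydraulics", "discoloured pipe")
flagged = log.report("hydraulics", "slow pressure recovery")
assert flagged and not log.needs_review("avionics")
```

The honesty rule from Steve’s advice is what makes the input to such a system trustworthy: if reporting an ugly carries a penalty, the counts are wrong before any analysis starts.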

[Update: I mailed Steve my submission and I was delighted to see he blogged about it on his Hot Aisle blog]