Rise of the Expert Generalist

Enjoyed this profile of Charlie Munger on Medium, especially the description of the Expert Generalist, a rival to the 10,000-hour specialist:

The Rise Of The Expert-Generalist

The rival argument to the 10,000-hour rule is the expert-generalist approach. Orit Gadiesh, chairman of Bain & Co, who coined the term, describes the expert-generalist as:

“Someone who has the ability and curiosity to master and collect expertise in many different disciplines, industries, skills, capabilities, countries, and topics, etc. He or she can then, without necessarily even realizing it, but often by design:

  1. Draw on that palette of diverse knowledge to recognize patterns and connect the dots across multiple areas.
  2. Drill deep to focus and perfect the thinking.”

The concept is commonly represented by this model of the “T-shaped individual”:


Rules of Thumb

Steve Roesler of All Things Workplace, one of the most dependable book recommenders I know of, gives “Rules of Thumb” a rave review in a recent post on his blog.

The subtitle lives up to its promise: “52 Truths For Winning At Business Without Losing Your Self”.

You don’t see many book reviews here even though we receive many promotional copies. I do look hard at each one but, given my own business and personal priorities, I only write a review when it’s a raving recommendation, like: Rules of Thumb: 52 Truths for Winning at Business Without Losing Your Self

Get it over at: Amazon.com

“Looking for Ugly” in the honest workplace

[Update1: This was my submission to Executive Rockstar’s Best Career Advice competition]

[Update2: This book looks promising: “Know What You Don’t Know: How Great Companies Fix Problems Before They Happen” by Michael A. Roberto]

[Update3: Why Systems Fail and Problems Sprout Anew]

The best career advice I ever received came from Steve O’Donnell, currently SVP of IT Infrastructure & Operations at First Data International, celebrity blogger, and former Global Head of Data Centre & Customer Experience Management at BT, where we worked together.

One day he said:

“Jonathan, I will never fire you for an honest mistake but if you lie to me, ever, you will be out the door in a minute. There is no mistake that you can make that I cannot figure out how to fix IF you tell me about it immediately. Be honest with me and you are safe, lie to me and you are gone.”

This is a golden rule of effective technical operations. It creates a culture of honesty and safety – one in which people are not afraid to report errors or lapses – that leads to true Kaizen: genuine self-correction and organisational self-improvement, because you can deal with errors systematically (i.e. by tweaking systems) and without the damage of the blame game and deferred responsibility.

His advice is particularly important in environments where errors are rare but extremely serious when they do occur, like executive boardrooms, aircraft maintenance hangars, or hospitals. The practice of telling the truth about minor errors is central to precursor-based error detection (i.e. spotting the warning signs early), which is in turn at the centre of truly effective operations management (and every other system).

Kevin Kelly explains the issue in a brilliant post about “Looking for Ugly”:

How do you prevent major errors in a system built to successfully keep major errors to a minimum?  You look for the ugly.

The safety of aircraft is so essential it is regulated in hopes that regulation can decrease errors. Error prevention enforced by legal penalties presents a problem, though: severe penalties discourage disclosure of problems early enough to be remedied. To counter that human tendency, the US FAA has generally allowed airlines to admit errors they find without punishing them. These smaller infractions are the “ugly.” By themselves they aren’t significant, but they can compound with other small “uglies.” Oftentimes they are so minimal — perhaps a worn valve, or discolored pipe — that one can hardly call them errors. They are just precursors to something breaking down the road. Other times they are things that break without causing harm.

The general agreement in the industry is that a policy of unpunished infractions encourages quicker repairs and reduces the chances of major failures. Of course not punishing companies for safety violations rubs some people the wrong way. A recent Times article reports on the Congressional investigation into whether this policy of unpunished disclosure should continue, which issued the quote above. The Times says:

“We live in an era right now where we’re blessed with extremely safe systems,” said one panel member, William McCabe, a veteran of business aviation companies. “You can’t use forensics,” he said, because there are not enough accidents to analyze.

“You’re looking for ugly,” Mr. McCabe said. “You ask your people to look for ugly.” A successful safety system, he said, “acknowledges, recognizes and rewards people for coming forward and saying, ‘That might be one of your precursors.’”

Looking for ugly is a great way to describe a precursor-based error detection system. You are not really searching for failure as much as signs failure will begin. These are less like errors and more like deviations. Off-center in an unhealthy way. For some very large systems — like airplanes, human health, ecosystems — detection of deviations is more art than science, more a matter of beauty or the lack of it.

Come to think of it, looking for ugly is how we assess our own health. I suspect looking for ugly is how we will be assessing complex systems like robots, AIs and virtual realities.

So, in short: create a professional environment that enables and encourages your team to detect, report, and deal with the “ugly”.
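To make the idea concrete, here is a minimal sketch of a precursor log in Python. The class name, categories, window, and threshold are all invented for illustration; this is not anyone’s real system. It only demonstrates the shape of a precursor-based detector: accept every report of an “ugly” without blame, and flag categories where minor deviations are starting to compound so someone can investigate before anything actually fails.

```python
from collections import defaultdict
from dataclasses import dataclass, field
import time


@dataclass
class PrecursorLog:
    """Toy 'looking for ugly' tracker: minor deviations are logged
    per category, and categories where they cluster get flagged."""
    window_seconds: float = 30 * 24 * 3600  # look back 30 days (assumed)
    threshold: int = 3                      # 3 uglies in the window triggers review (assumed)
    events: dict = field(default_factory=lambda: defaultdict(list))

    def report(self, category: str, note: str) -> None:
        # No-blame intake: every report is accepted and timestamped.
        self.events[category].append((time.time(), note))

    def compounding(self) -> list[str]:
        # Return categories where minor deviations are clustering in the window.
        cutoff = time.time() - self.window_seconds
        return [
            cat for cat, evs in self.events.items()
            if sum(1 for t, _ in evs if t >= cutoff) >= self.threshold
        ]


log = PrecursorLog()
log.report("hydraulics", "worn valve on pump 2")
log.report("hydraulics", "discoloured return pipe")
log.report("hydraulics", "pressure drift, still within tolerance")
print(log.compounding())  # ['hydraulics'] -> inspect now, before anything fails
```

The property that matters mirrors the FAA policy above: the intake is unconditional, so the cost of reporting stays near zero, and the value comes from spotting clusters of deviations rather than waiting for a failure to analyse forensically.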

[Update: I mailed Steve my submission and I was delighted to see he blogged about it on his Hot Aisle blog]