Clear Choice: Semantic Structure or Systemic Crises
June 10, 2010
I was recently approached by a consultant who came out of “big oil”. This person was writing a book on credibility and making the ancient pitch that perception is reality; in this view, all that mattered was the perception of the oil industry.
We exchanged about a dozen emails on the Gulf disaster, innovation, and crisis prevention, until I honestly admitted that the exchange was a distraction from my work: these were archaic arguments, denial and arrogance had yet again proven resilient, and the oil industry culture, and by extension government regulators, were very poorly informed, to the point of being self-destructive.
Among other serious problems, the institutional cultures in government and industry suffer from the delusion that experts in specific disciplines hold the keys to the innovation door. From a business perspective, I see this crisis as a major opportunity for alternative energy and for smaller, more innovative oil companies. From a macro global economics perspective, I see the argument for placing more engineers and entrepreneurs at decision-making levels, as in China, rather than lawyers, as in the U.S. (one of the few lessons from China that the U.S. would be well served to follow).
Of course the individual didn’t take it well: “for someone who claims they know a lot about innovation—you sure have a closed mind”. I do have a closed mind on some issues, or rather no time for them: conflicted debates attempting to persuade me that the sun didn’t rise this morning when I witnessed it with my own eyes. The live cam of the oil spill would seem to negate any attempt at denial, but no; we have simply watched the blame game move from oil partners to industry/government partners.
My view is that all of the actors involved lost credibility based on the evidence, not perception, and should no longer enjoy the privilege of a place in the decision chain. Credible leaders would step aside and seek wisdom outside of their failed system. I suspect that the attempt to persuade me was part of a larger damage-control effort by other giants in the oil industry, who will no doubt be harmed by the reaction to this crisis. It reflects a macro problem of our time: not knowing who is working for whom, particularly in consulting and, increasingly, in academia.
Enter the polymath
Whether dealing with systemic risk on Wall Street, the FRB, MMS, or an oil rig, one of my favorite quotes is often ignored, but should be kept front and center:
“We can’t solve problems by using the same kind of thinking we used when we created them.” – Attributed to Albert Einstein
Einstein is not normally thought of as a polymath but rather as a genius physicist; his actual writings, however, demonstrate an ability to apply lessons from one discipline to another, which suggests otherwise to me. While we hear a great deal of talk and hype about interdisciplinary approaches today, we see little evidence of the polymath philosophy in our institutions, including at the highest levels of government and industry. That is a very serious problem. (For more on polymaths, I recommend Abbie Lundberg’s review of the book ‘The New Polymath’.)
The vast majority of difficult problems today are complex enough to require a broad perspective, one sufficiently deep in multiple disciplines to combine the fragments into a coherent, functional system of solutions. Most of the decision makers in large organizations like BP and the U.S. Government, even those in engineering, have long been bureaucrats by necessity; otherwise they wouldn’t have survived.
“Look at the oil spill problem. Everyone thinks it’s a technological problem. It’s not. It’s a management problem.” – UC Berkeley engineering professor Robert Bea.
I first came across Robert Bea’s work about a decade ago when I dove deep into the study of systemic crisis and prevention. (The final report on the Deepwater Horizon disaster can be viewed here.)
While Professor Bea is an engineer with significant experience in the oil industry, his breadth of knowledge becomes obvious when reading his reports, including those on Katrina, an event I studied closely.
“It’s an attitude of independence, an attitude of being willing to be very, very deeply immersed in data, an attitude of healthy skepticism, an attitude of being able to question other people’s findings.” – Kathleen Tierney, Natural Hazards Center at the University of Colorado at Boulder.
“He’s a giant in our industry,” said J. David Rogers, Missouri University of Science and Technology. “Bob has the big-picture view that you just don’t see much anymore. We’ve become a culture of specialists. And those aren’t the kind of people who can figure out failures.” (Quotes from a profile on Bea at SFGate)
He may not represent a classic polymath, but Bea does provide a breadth of experience and knowledge that is rarely found. Unfortunately, like every other well-conceived report on crises, his reports have been largely ignored, a symptom of the same phenomenon that enabled the crises to occur. The result is an ever-larger negative spiral of systemic crises, which taken together look more like a series, or perhaps even a single event in hindsight.
The collapse of the Deepwater Horizon and the continuing oil gusher from the Gulf seafloor are the latest preventable disaster, and they call for structured multi-organizational systems. Until we hardwire structured data into a logical, adaptive organizational architecture, accountability is not possible in large organizations; and without accountability, all manner of human manipulation and error can and will occur.