An article in The New York Times reminds us once again that without a carefully crafted and highly disciplined governance architecture in place, misalignment of interests between individuals and organizations across cultural ecosystems can lead to catastrophic decisions and outcomes. The article, by Martin Fackler, is titled "Nuclear Disaster in Japan Was Avoidable, Critics Contend."
To those who study crises this is not unexpected; it is yet another case in which brave individuals raised red flags only to be shouted down by the crowd. Still, the article provides instructive granularity that should guide senior executives, directors, and policy makers in planning organizational models and enterprise systems. In a rare statement for a leading publication, Fackler reports that insiders within "Japan's tightly knit nuclear industry" attributed the Fukushima plant meltdown to a "culture of collusion" among "powerful regulators and compliant academic experts". A very similar dynamic is found in other preventable crises, from the broad systemic financial crisis to narrow product-defect cases.
One of the individuals who warned regulators of just such an event was professor Kunihiko Shimazaki, a seismologist on the committee created specifically to manage risk associated with Japan's offshore earthquakes. Shimazaki's conservative warnings were not only ignored; his comments were removed from the final report "pending further research". Shimazaki is reported to believe that "fault lay not in outright corruption, but rather complicity among like-minded insiders who prospered for decades by scratching one another's backs." This dynamic closely mirrors events in the U.S., where multi-organizational cultures evolved slowly over time to become among the highest systemic risks to life, property, and the economy.
In another commonly found pattern, the plant operator Tepco failed to act on multiple internal warnings from its own engineers, who calculated that a tsunami could reach up to 50 feet in height. This critical information was withheld from regulators for three years, finally disclosed just four days before the magnitude 9.0 quake that caused a 45-foot tsunami and the meltdown of three reactors at Fukushima.
Three questions for consideration
1) Given that the root cause of the Fukushima meltdown was not the accurately predicted earthquake or tsunami, but rather dysfunctional organizational governance, are leaders not then compelled by moral imperative to seek out and implement organizational systems specifically designed to prevent crises in the future?
2) Given that peer pressure and social dynamics within the academic culture, and its relationships with regulators and industry, are cited as the cause by the most credible witness, a member of that very community who predicted the event, would not prudence demand that responsible decision makers consider solutions external to the afflicted cultures?
3) With the not-invented-here syndrome near the core of every major crisis in recent history, crises that have seriously degraded economic capacity, can anyone afford not to?
Steps that must be taken to prevent the next Fukushima
1) Do not return for solutions to the same poisoned well that caused or enabled the crisis
The not-invented-here syndrome, combined with a bias for institutional solutions, perpetuates the myth that humans are incapable of anything but repeating the same errors.
This phenomenon is evident in the ongoing financial crisis, which suffers from similar cultural dynamics between academics, regulators, and industry.
Researchers have only recently begun to understand the problems associated with deep expertise in isolated disciplines and their cultural dynamics. 'Expertisis', the tendency of deep specialization to blind researchers to transdisciplinary patterns and discovery, is a serious problem within disciplines, severely limiting consideration of possible solutions.
Systemic crises overlap too many disciplines for the academic model to execute functional solutions, as evidenced in this case by the committee that sidelined its own seismologist's warnings for further study, a classic enabler of systemic crises.
2) Understand that in the current digital era and for the foreseeable future, organizational governance challenges are also data governance challenges, requiring the execution of data governance solutions
Traditional organizational governance is rapidly breaking down with the rise of the neural network economy, yet governance solutions are comparatively slow to be adopted.
Many organizational leaders, policy makers, risk managers, and public safety engineers are not functionally literate in state-of-the-art technologies such as semantic, predictive, and human-alignment methodologies.
Functional enterprise architecture that has the capacity to prevent the next Fukushima-like event, regardless of location, industry, or sector, will require a holistic design encapsulating a philosophy that proactively considers all variables that have enabled previous events.
Any functional architecture for this task cannot be constrained by the not-invented-here syndrome, defense of guilds, proprietary standards, protection of business models, national pride, institutional pride, branding, culture, or any other factor.
3) Adopt a Finely Woven Decision Tapestry with Carefully Crafted Strands of Human, Sensory, and Business Intelligence
Data provenance is foundational to any functioning critical system in the modern organization, providing:
Carefully managed transparency
Far more functional automation
The possibility of accurate real-time auditing
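One way to make those three properties concrete is an append-only, hash-chained provenance log: every data event records who produced it and links to the hash of the previous entry, so transparency is built in, automation can consume the log directly, and an audit pass can verify the whole history in real time. The sketch below is purely illustrative; the class and field names are hypothetical, not drawn from any particular product.

```python
import hashlib
import json
import time

class ProvenanceLog:
    """Append-only, hash-chained record of data events (illustrative sketch).

    Each entry stores the hash of the previous entry, so any later
    tampering with history breaks the chain and is detectable by a
    real-time audit pass.
    """

    def __init__(self):
        self.entries = []

    def record(self, source, action, payload):
        """Append one event; returns its hash."""
        prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
        body = {
            "source": source,        # who produced the data
            "action": action,        # what was done with it
            "payload": payload,      # the data, or a digest of it
            "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        # Hash is computed over the body before the "hash" key is added.
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)
        return body["hash"]

    def audit(self):
        """Re-verify the entire chain; True only if untampered."""
        prev = "genesis"
        for e in self.entries:
            expected = dict(e)
            stored_hash = expected.pop("hash")
            if expected["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(expected, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != stored_hash:
                return False
            prev = stored_hash
        return True
```

In this design, silently deleting or editing an inconvenient warning, as happened with the committee report, would be immediately visible to any auditor replaying the chain.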
4) Extend advanced analytics to the entire human workforce
Provide incentives for pre-emptive problem solving and innovation
Automate information delivery
Plug in multiple predictive models
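The last two items above can be sketched together: a hub that accepts plug-in predictive models and automatically delivers any warning they raise, so escalation does not depend on social dynamics or a human gatekeeper. This is a minimal illustration under assumed interfaces; the names (`RiskHub`, `plug_in`, the 0-to-1 scoring convention) are hypothetical.

```python
from typing import Callable, Dict, List

class RiskHub:
    """Minimal sketch: pluggable predictive models + automated delivery."""

    def __init__(self):
        # Each model maps an event dict to a risk score in [0, 1].
        self.models: Dict[str, Callable[[dict], float]] = {}
        # Each subscriber is an automated delivery channel (e.g. a
        # regulator feed) that receives (model_name, score).
        self.subscribers: List[Callable[[str, float], None]] = []

    def plug_in(self, name: str, model: Callable[[dict], float]) -> None:
        """Register a predictive model under a name."""
        self.models[name] = model

    def subscribe(self, handler: Callable[[str, float], None]) -> None:
        """Register an automated delivery channel."""
        self.subscribers.append(handler)

    def evaluate(self, event: dict, threshold: float = 0.5) -> dict:
        """Run every plugged-in model; deliver any score over threshold."""
        scores = {name: model(event) for name, model in self.models.items()}
        for name, score in scores.items():
            if score >= threshold:
                for deliver in self.subscribers:
                    deliver(name, score)  # no human gatekeeper in the loop
        return scores
```

For example, a tsunami-height model plugged into such a hub would have pushed Tepco's 50-foot internal estimate to every subscribed channel the day it was computed, rather than three years later.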
5) Include sensory, financial, and supply chain data in real-time enterprise architecture and reporting
Until this year, extending advanced analytics to the entire human workforce was considered futuristic (see the 1/10/2012 Forrester Research report, Future of BI), in part due to scaling limitations in high-performance computing. While always evolving, the design has existed for a decade.
Automated data generated by sensors should be carefully combined in models with human and financial data for predictive applications in risk management, planning, regulatory oversight, and operations.
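A minimal sketch of that idea: sensor, human, and financial signals feeding a single predictive risk score rather than living in separate silos. The signal names, weights, and escalation threshold below are entirely hypothetical, chosen only to show the blending pattern.

```python
from dataclasses import dataclass

@dataclass
class RiskInputs:
    sensor_anomaly: float        # 0..1, e.g. seismic or wave-sensor deviation
    human_warnings: int          # count of unresolved internal red flags
    deferred_capex_ratio: float  # 0..1, safety spending deferred vs planned

def combined_risk(inputs: RiskInputs) -> float:
    """Blend the three signal families into one 0..1 score.

    Weights are illustrative assumptions, not calibrated values.
    """
    human_signal = min(inputs.human_warnings / 5, 1.0)
    score = (0.4 * inputs.sensor_anomaly
             + 0.35 * human_signal
             + 0.25 * inputs.deferred_capex_ratio)
    return round(score, 3)

def needs_escalation(score: float, threshold: float = 0.6) -> bool:
    """Illustrative cut-off for automatic regulatory escalation."""
    return score >= threshold
```

The point of the design is that no single community, engineers, accountants, or field sensors, can quietly absorb a warning: any one family of signals can push the combined score over the escalation line.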
While obviously not informed by a first-person audit and review, if reports and quotes from witnesses surrounding the Fukushima crisis are accurate, and they are generally consistent with those from dozens of other human-caused crises, we can conclude the following:
The dysfunctional socio-economic relationships in this case created an extremely toxic cultural dynamic across academia, regulators, and industry, which shared a tacit intent to protect the nuclear industry. Their collective actions, however, produced an outcome that idled the entire industry in Japan, with potentially very serious long-term implications for the national economy.
Whether psychological, social, technical, economic, or some combination thereof, no justification for failing to deploy the most advanced crisis prevention systems can be left standing. Indeed, we all share a moral imperative that demands we rise above our biases, personal and institutional conflicts, and defensive nature to explore and embrace the most appropriate solutions, regardless of origin, institutional labeling, media branding, or any other factor. Some crises are simply too severe not to prevent.
Founder & CEO