Legacy of the Tōhoku Earthquake: Moral Imperative to Prevent a Future Fukushima Crisis


An article in The New York Times, written by Martin Fackler and titled "Nuclear Disaster in Japan Was Avoidable, Critics Contend," reminds us once again that without a carefully crafted and highly disciplined governance architecture in place, perceived misalignment of personal interests between individuals and organizations across cultural ecosystems can lead to catastrophic decisions and outcomes.

While not unexpected by those who study crises (it is yet another case in which brave individuals raised red flags only to be shouted down by the crowd), the article does provide instructive granularity that should guide senior executives, directors, and policy makers in planning organizational models and enterprise systems. In a rare statement by a leading publication, Fackler reports that insiders within "Japan's tightly knit nuclear industry" attributed the Fukushima plant meltdown to a "culture of collusion" between "powerful regulators and compliant academic experts." A very similar dynamic is found in other preventable crises, from the broad systemic financial crisis to narrow product-defect cases.

One of the individuals who warned regulators of just such an event was Professor Kunihiko Shimazaki, a seismologist on the committee created specifically to manage risk associated with Japan's offshore earthquakes. Shimazaki's conservative warnings were not only ignored; his comments were removed from the final report "pending further research." Shimazaki is reported to believe that "fault lay not in outright corruption, but rather complicity among like-minded insiders who prospered for decades by scratching one another's backs." This is nearly identical to events in the U.S., where multi-organizational cultures evolved slowly over time to become among the highest systemic risks to life, property, and the economy.

In another commonly found pattern, the plant operator Tepco failed to act on multiple internal warnings from its own engineers, who calculated that a tsunami could reach up to 50 feet in height. This critical information was withheld from regulators for three years and finally reported just four days before the 9.0 quake, which caused a 45-foot tsunami and the meltdown of three reactors at Fukushima.

Three questions for consideration

1) Given that the root cause of the Fukushima meltdown was not the accurately predicted earthquake or tsunami, but rather dysfunctional organizational governance, are leaders not then compelled by moral imperative to seek out and implement organizational systems specifically designed to prevent crises in the future?

2) Given that peer pressure and social dynamics within the academic culture, and its relationships with regulators and industry, are cited as the cause by the most credible witness (a member of that same community who predicted the event), would not prudence demand that responsible decision makers consider solutions external to the afflicted cultures?

3) With the not-invented-here syndrome near the core of every major crisis in recent history, each of which has seriously degraded economic capacity, can anyone afford not to do so?

Steps that must be taken to prevent the next Fukushima

1) Do not return for solutions to the same poisoned well that caused or enabled the crisis

  • The not-invented-here syndrome, combined with a bias for institutional solutions, perpetuates the myth that humans are incapable of anything but repeating the same errors over and over.

  • This phenomenon is evident in the ongoing financial crisis, which suffers from similar cultural dynamics among academics, regulators, and industry.

  • Researchers have only recently begun to understand the problems associated with deep expertise in isolated disciplines and their cultural dynamics. 'Expertisis' is a serious problem within disciplines; it tends to blind researchers to transdisciplinary patterns and discovery, severely limiting consideration of possible solutions.

  • Systemic crises overlap too many disciplines for the academic model to execute functional solutions, as evidenced by the committee in this case that sidelined its own seismologist's warnings for further study, a classic enabler of systemic crises.

2) Understand that in the current digital era and for the foreseeable future, organizational governance challenges are also data governance challenges, which therefore require the execution of data governance solutions

    • Traditional organizational governance is rapidly breaking down with the rise of the neural network economy, yet governance solutions are comparatively slow to be adopted.

    • Many organizational leaders, policy makers, risk managers, and public safety engineers are not functionally literate in state-of-the-art technology, such as semantic, predictive, and human-alignment methodologies.

    • Functional enterprise architecture with the capacity to prevent the next Fukushima-like event, regardless of location, industry, or sector, will require a holistic design encapsulating a philosophy that proactively considers all variables that enabled previous events.

      • Any functional architecture for this task cannot be constrained by the not-invented-here syndrome, defense of guilds, proprietary standards, protection of business models, national pride, institutional pride, branding, culture, or any other factor.

3) Adopt a Finely Woven Decision Tapestry with Carefully Crafted Strands of Human, Sensory, and Business Intelligence

Data provenance is foundational to any functioning critical system in the modern organization (a minimal illustrative sketch appears after the list below), providing:

      • Increased accountability

      • Increased security

      • Carefully managed transparency

      • Far more functional automation

      • The possibility of accurate real-time auditing
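To make the provenance idea above concrete, here is a minimal Python sketch of a hash-chained provenance record supporting accountability and real-time auditing. It is a hypothetical illustration only; the field names, actions, and chaining approach are assumptions, not a description of any specific product.

```python
# Hypothetical sketch: each event that touches a data item is recorded, and
# records are chained by hash so tampering or suppression is detectable in audit.
import hashlib
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class ProvenanceRecord:
    data_id: str          # identifier of the data item (e.g., a model result or report)
    actor: str            # person or system that created or modified the item
    action: str           # "created", "reviewed", "revised", "suppressed", ...
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())
    previous_hash: str = ""   # hash of the prior record for this data item

    def record_hash(self) -> str:
        payload = json.dumps(asdict(self), sort_keys=True).encode("utf-8")
        return hashlib.sha256(payload).hexdigest()

# Usage: chain two records and verify the link during an audit.
first = ProvenanceRecord(data_id="tsunami-model-v1", actor="eng.team", action="created")
second = ProvenanceRecord(data_id="tsunami-model-v1", actor="risk.office",
                          action="reviewed", previous_hash=first.record_hash())
assert second.previous_hash == first.record_hash()   # audit check: chain is intact
```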

4) Extend advanced analytics to the entire human workforce

      • Incentives for pre-emptive problem solving and innovation

      • Automate information delivery (see the sketch following this list):

        • Record notification

        • Track and verify resolution

        • Extend network to unbiased regulators of regulators

      • Plug in multiple predictive models:

        • Establish resolution of conflicts with unbiased review

        • Automatically include results in reporting to prevent the obstacles to essential targeted transparency that occurred in the Fukushima incident
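As a rough illustration of the record-track-escalate workflow sketched in the list above, the following Python snippet shows one hypothetical way a red flag could be recorded, resolved, and automatically escalated to an independent reviewer if ignored. Class names, statuses, and the escalation rule are assumptions made for the example.

```python
# Hypothetical sketch of an automated red-flag workflow: every warning is recorded,
# tracked to resolution, and escalated to an independent reviewer if it goes stale,
# so a critical prediction cannot simply be dropped from the final report.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import List, Optional

@dataclass
class RedFlag:
    raised_by: str
    summary: str
    severity: int                       # e.g., 1 (low) to 5 (catastrophic)
    raised_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    resolution: Optional[str] = None
    history: List[str] = field(default_factory=list)   # audit trail of every state change

    def resolve(self, reviewer: str, outcome: str) -> None:
        self.resolution = outcome
        self.history.append(f"resolved by {reviewer}: {outcome}")

    def escalate_if_stale(self, max_age: timedelta, independent_reviewer: str) -> bool:
        """Escalate any unresolved flag older than max_age to an external reviewer."""
        if self.resolution is None and datetime.now(timezone.utc) - self.raised_at > max_age:
            self.history.append(f"escalated to {independent_reviewer}")
            return True
        return False

# Usage: a warning left unresolved past its deadline is escalated automatically.
flag = RedFlag(raised_by="seismology.committee",
               summary="tsunami may exceed design basis", severity=5)
flag.escalate_if_stale(max_age=timedelta(days=30),
                       independent_reviewer="regulator-of-regulators")
```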

5) Include sensory, financial, and supply chain data in real-time enterprise architecture and reporting

      • Until this year, extending advanced analytics to the entire human workforce was considered futuristic (see the 1/10/2012 Forrester Research report Future of BI), in part due to scaling limitations in high-performance computing. While always evolving, the design has existed for a decade.

      • Automated data generated by sensors should be carefully combined in models with human and financial data for predictive applications in risk management, planning, regulatory oversight, and operations (a brief sketch follows this list).

        • Near real-time reporting is now possible, so governance structures and enterprise architectural design should reflect that functionality.
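The sketch below illustrates the idea in item 5 under stated assumptions: a stream of sensor readings is checked against a design-basis threshold and combined with a financial exposure figure to produce near-real-time alerts. The metric names, threshold, and exposure value are purely hypothetical examples, not real plant data.

```python
# Hypothetical sketch: fuse sensor readings with a financial exposure figure and
# flag any reading that exceeds a modeled design basis in near real time.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Reading:
    sensor_id: str
    metric: str          # e.g., "wave_height_m"
    value: float

def risk_alerts(readings: Iterable[Reading], exposure_usd: float,
                wave_design_basis_m: float = 10.0) -> List[str]:
    """Return human-readable alerts for readings that exceed the design basis."""
    alerts = []
    for r in readings:
        if r.metric == "wave_height_m" and r.value > wave_design_basis_m:
            alerts.append(
                f"{r.sensor_id}: modeled wave {r.value:.1f} m exceeds design basis "
                f"{wave_design_basis_m:.1f} m; exposure at risk ~${exposure_usd:,.0f}"
            )
    return alerts

# Usage: an engineering model predicting a 15 m wave surfaces immediately in
# reporting rather than sitting unreported for years.
print(risk_alerts([Reading("offshore-buoy-7", "wave_height_m", 15.2)],
                  exposure_usd=2.5e9))
```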

 

Conclusion

While obviously not informed by a first-person audit and review, if the reports and quotes from witnesses surrounding the Fukushima crisis are accurate, and they are generally consistent with dozens of other human-caused crises, we can conclude the following:

The dysfunctional socio-economic relationships in this case produced an extremely toxic cultural dynamic across academia, regulators, and industry, which shared a tacit intent to protect the nuclear industry. Their collective actions, however, resulted in an outcome that idled the entire industry in Japan, with potentially very serious long-term implications for the national economy.

Whether psychological, social, technical, economic, or some combination thereof, it would seem that no justification for not deploying the most advanced crisis prevention systems can be left standing. Indeed, we all share a moral imperative that demands we rise above our biases, personal and institutional conflicts, and defensive nature to explore and embrace the most appropriate solutions, regardless of origin, institutional labeling, media branding, or any other factor. Some crises are indeed too severe not to prevent.

Mark Montgomery
Founder & CEO
Kyield
http://www.kyield.com


New Video: Extending Analytics to the Information Workplace


Consolidated thoughts and comments 2-2012


I thought it might be of interest to consolidate a few thoughts and comments over the past couple of weeks (typo edits included).

~~~~~~~~~~~~~~

Tweeted thoughts @Kyield

As distasteful as self-promotion can be, value entering the knowledge stream would be quite rare or postmortem otherwise, as king bias rules

~~

Raw data is just a mess of yarn until interlaced by master craftspeople with a quality loom, providing context, utility, and probability

~~

As with any predatory culture sculpted by inherent naivete in nature, IT carnivores feast on Leporidae and become frustrated by Testudines

~~

Super organizational intelligence through networked computing has already surpassed the capacity of the physical organization

~~

Superhuman intelligence with machines won’t surpass the human brain until organic computing matures

~~~~~~~~~~~~~~

At Data Science Central in response to an article and posting by Dan Woods at Forbes: IBM’s Anjul Bhambhri on What Is a Data Scientist?

Until very recently (and perhaps still), very few outside of a minority in research seem to appreciate the changes for data management in networks, and most of those scientists were engaged in physics or trans-disciplinary research. The ability to structure data employing AI techniques is beginning to manifest in ways intended by the few in CS who have been working the problem for almost 20 years now. The result is a system like ours that was designed from inception, over time and in conjunction with a new generation of related technologies, specifically for the neural network economy, with what we'll call dumb data silos (extremely destructive from a macro perspective) in the bulls-eye of replacement strategies. Of course the need for privacy, security, and commerce is even more pressing in the transparent world outside of database silos, which presents a challenge especially in consumer and public networks, or at the hand-off of data between enterprise and public networks. The new generation of technologies requires a new way of thinking, of course, especially for those who were brought up working with relational databases. Like any progress in mature economies, next-generation technologies will not come without some level of disruption and displacement, so there are fair amounts of defensive tactics out there. Evolution continues anyway.

~~~~~~~~~~~~~~

At Semanticweb.com in response to an article by Paul Miller: Tread Softly

Yes, I suppose this does need a bit more attention. We shouldn't have a consumer web and an enterprise web in technology, languages, and standards (agreed), although I do think we'll continue to see different functionality, and to some extent different technology, that provides value relative to their much different needs. Cray, for example, announced a new division last week based on semantics with several partners close to this blog, but it's very unlikely we'll see a small-business web site developer on the sales target list of Cray's DOS anytime soon. There is a reason for that.

While I agree that it can be (and often is) a misleading distinction—reason for commenting—it is precisely misleading due to the confusion over the very different economic drivers, models, and needs of the two markets. In this regard I am in total agreement with Hendler in the interview just before this article—the problem is that the semantic web community doesn’t understand the economics of the technology they are researching, developing, and in many cases selling, and this is in part due to lots of misleading information in their learning curve—not so much here but before they leave the university. This is in part due to the level of maturity of the technology but also in part due to the nature of R&D and tech transfer when we have folks deciding what to invest in who have little experience or knowledge about product development and applied economics, and they also have conflicting internal needs that quite often influence or dictate where and how the dollars are invested. (Note: I would add here on my own blog a certain level of ideological bias that doesn’t belong in science but is nonetheless prevalent).

The free web, supported in part by advertising, in part by e-commerce, but still mainly by user-generated content, should, like all economic models, heavily influence product and technology development for products targeted at that specific use case. We are beginning to see a bit of divergence with tools reflective of the vastly different needs (RDFa was a good standards example), but for the sake of example it's not terribly challenging for a Fortune 50 company to form a team of experts to create a custom ontology and operate the complex set of tools that are the norm in semantics; indeed they usually prefer the complexity, as would-be competitors often cannot afford the same luxury, so the high bar is a competitive advantage. However, the average small-business web designer is already tapped out keeping up with just the changing landscape of the existing web (not to mention mobile) and cannot even consider taking on that same level of complexity or investment.

The technical issues are understood, but the economics are not, especially the high cost of talent for semantic web technologies; in some cases, as cited above, even universities and government understand well that complexity in the enterprise provides strong career security.

So if I am providing a free web site service that is supported by ad revenue, it's unlikely that I will adopt the same technology and tools developed for intelligence agencies, the military, and the Fortune 500. The former lives in the most hyper-competitive environment in human history, and the latter quite often has very little competition and essentially unlimited resources. Vastly different worlds, needs, markets, tactics, and tools. But we still must use the same languages to communicate between the two; therein lies the challenge in the neural network economy, or for developing, adopting, and maintaining network standards.

The economic needs and abilities are radically different, and that should dominate the models, even though it's true that both are part of the same neural network economy and must interact with as little integration friction as possible. I do believe that, as we've seen in other technologies, scale will eventually converge the two (so-called consumerization, although in this case it's somewhat in reverse). So you can see that I agree with your wise advice to walk softly on hype with the consumer semantic web, but I am quite certain that it's more misleading not to point out the dramatically different economic drivers and needs of the two markets. Indeed it's quite common for policy makers and academics to create large numbers of martyrs (often with good intentions) by promoting and even funding models they have no experience with beyond the subsidized walls of research. What is great for society is not always great for the individual entrepreneur or investor, and quite often isn't.

I've recently decided that, in an attempt to prevent unnecessary martyrdom by very inexperienced and often ideological entrepreneurs of the type common in semantics, we should just take the heat of incoming arrows and point out the sharp misalignment of interests between the two markets and their stakeholders. The super majority of revenue in the semantic web reflects the jobs posted here recently: defense contractors, super computing, universities, governments, and the Fortune 500.

It's not by accident that venture capital tends toward individual specialists; very few are good at both enterprise and consumer markets, especially in tech transfer during, and graduating from, the valley of death. While exceptions exist, and the companies they invest in can serve both markets as they mature, the norm is that the understanding, modeling, tactics, and tools required to succeed in the two markets are usually conflicting, even contradictory, in the early stages.

So it's not only possible to experience an inflection point for the same technology in one market and not the other; it's actually more common than not (although I agree that it is not preferable in this case). A decade ago that may have been surprising to most; today it's not surprising to me. A web that provides deep meaning and functionality is beyond the economic ability of the advertising model that dominates the majority of the web, and beyond the majority of end consumers' ability to pay for it. The gap is closing and I expect it to continue to close over time, but to date I still see primarily subsidies in one form or another on the free semantic web for that functionality, whether capital, talent, infrastructure, or some combination thereof. It's been a long day already so I'm groggy, but I hope this helps a bit to clarify the point I was attempting to make. Thanks, MM

Press release on our enterprise pilot program


We decided to expand the reach of our enterprise pilot program through the electronic PR system, so I issued the following BusinessWire release today:

Kyield Announces Pilot Program for Advanced Analytics and Big Data with New Revolutionary BI Platform 

“We are inviting well-matched organizations to collaborate with us in piloting our break-through system to bring a higher level of performance to the information workplace,” stated Mark Montgomery, Founder and CEO of Kyield. “In addition to the significant competitive advantage exclusive to our pilot program, we are offering attractive long-term incentives free from lock-in, maintenance fees, and high service costs traditionally associated with the enterprise software industry.”

Regards, MM

Future of BI in the Organization


We enjoyed a pleasant surprise this week in the form of a new Forrester report that named Kyield among those interviewed. The topic and content were certainly appropriate for introducing Kyield to Forrester clients:

Future of BI: Top Ten Business Intelligence Predictions for 2012, by Boris Evelson and Anjali Yakkundi with Stephen Powers and Shannon Coyne.

I have reviewed the brief paper and find that I am in substantial agreement with its direction and predictions, so I recommend the product, and I wanted to share a few additional thoughts.

One of the main themes in the report concerns expanding the use of BI throughout the organization, which essentially follows a proven management philosophy and strategy I have advocated my entire career, due to a combination of logic, experience, and testing in many real-life cases. In a broad sense this issue speaks to organizational awareness, or, as has too often been the case, the lack thereof.

Whether the issue is missed opportunity or crisis prevention, the problem is not that organizations do not contain the essential knowledge for good decision making, or even that accurate predictions are not transferred to digital form; indeed, experience shows that digital red flags are commonly found that were appropriate, accurate, and in many cases followed rational policy. Rather, in the super majority of cases a combination of obstacles prevented the critical issues from reaching and/or influencing decision makers. It required more than a dozen years for related technologies to evolve and mature to the point of allowing our system to resolve this and other critical issues in near real time, but that day has finally come.

I found it interesting that the paper acknowledges that BI has serious limitations, and that the next generation of technologies may turn traditional BI relationships "upside down", including business alignment. The Forrester team also warns clients about excessive control and the need for balance. Indeed, our patented system provides fully adaptable modules tailored to the specific mission and needs of the individual, group, and organization. Each organization requires the ability to tailor differentiation in easy-to-use modules based on standards that are cost-efficient and adaptable for continual improvement.

Other highlights in the paper point to the benefits of a semantic layer based on standards, and a well-deserved warning to avoid consultants and integrators who "love the old-fashioned BI," as it can result in "unending billable hours." Of course this problem is not limited to BI; it is a serious problem across the enterprise ecosystem that has resulted in a unified defense against any innovation that reduces costs and improves efficiency, leading to 'lock-in' and low levels of innovation in the enterprise.

Unfortunately for those who rely on inefficient systems and misalignment of interests, technology advances have come together with economic necessity at roughly the same time to form an inflection point at the confluence of organizational management and neural networks. That is to say, our internal R&D as well as considerable external evidence is in strong agreement with the advice the Forrester team provides clients in this report, including to "start embracing some of the latest BI trends — or risk falling behind". That risk is currently significant, is expanding quickly in 2012, and will no doubt impact outcomes for the foreseeable future. -MM

Strategic IT Alignment in 2012: Leverage Semantics and Avoid the Caretaker


A very interesting development occurred on the way to the neural network economy: The interests of the software vendor and the customer diverged, circled back and then collided, leaving many executives stunned and confused.

The business model in the early years of software was relatively simple. Whether an individual or an enterprise, if the customer didn't adopt the proprietary standard that provided interoperability, the customer was left behind and couldn't compete; it was a no-brainer, and we all adopted. By winning the proprietary standard in any given software segment, market leaders were able to deliver amazing improvements in productivity at relatively low cost while maintaining some of the highest profit margins in the history of business. This model worked remarkably well for a generation, but as is often the case, technology evolved more rapidly than business models and incumbent cultures could adapt, so incumbents relied on lock-in tactics to protect the corporation, profits, jobs, and perhaps in some cases national trade.

Imagine the challenge of a CEO today in a mature, publicly traded software company with a suite of products generating many billions of dollars in profits annually. In order to continue to grow and keep the job, the CEO would need to either rediscover the level of innovation of the early years (as very few have been able to do), play favorites by providing some customers with competitive advantage and others with commodities (which does occur in the enterprise market, but is risky), or focus on milking commoditized market power in developed nations while pushing for growth in developing countries. The latter has been the strategy of choice for most mature companies, of course.

Doing all of the above simultaneously is nearly impossible. Killer apps by definition must cannibalize cash cows, and public company CEOs have a fiduciary responsibility to optimize profits while mitigating risk, so most CEOs in this position choose to remain 'dairy farmers' until retirement or until forced to change by emergent competition. In discussing one such incumbent recently with one of the leading veterans in IT, he described such a CEO as "the caretaker". For enterprise customers, this type of caretaker can be similar to the one we hired a few years ago to protect our interests when we moved to the Bay Area, returning to a property that was uninhabitable after being messaged 'all is fine' (beware of the caretaker wolf in sheep's clothing).

Now consider that software exports generate large, efficient import engines for currency in headquarters countries, thus placing those national governments in strategic alignment with the incumbents in the short term (often dictated by short-term politics), and another entire dimension appears that is rarely discussed, yet very strongly influences organizations worldwide. This situation can lead governments to protect and reinforce the perceived short-term benefits of commoditized market leaders over the critical long-term needs of organizations, markets, and economies. It is not inaccurate to suggest that national security is occasionally misunderstood and/or misused in the decision process on related policy.

Monopoly cultures think and act alike, whether in the public or private sector, which is often (eventually) their undoing, unless of course they adopt intentional continuous improvement. This is why creative destruction is so essential, has been embraced internally by most progressive organizations in some form, and why customers should proactively support innovators and farm markets towards sustainable diversity. Despite what may appear to be the case, the interests of incumbents in enterprise software are often directly conflicting with the interests of the customer.

While the theory of creative destruction has roots in Marxism, the practice is a necessity for capitalism (or any other ism) today due to the natural migration of cultures and economies to seek security and protection, which in turn takes us away from the discipline required for continual rejuvenation. We embrace creative destruction in what has become modern global socialism simply because very little innovation would emerge otherwise. Competitive advantage for organizations cannot exist in rigid commoditization of organizational systems as we see in software. Simply put—whether at the individual, organizational, or societal level, we should embrace creative destruction for long-term survival, especially in light of our current unsustainable trajectory.

Which brings us to the present-day emergent neural network economy. In our modern network economy we simply must have interoperable software and communications systems. The global economy cannot function properly otherwise, so this is in everyone's interest, as I have been saying for 15 years now. The overpowering force of the network effect would place any proprietary standard in an extortion position over the entire global economy in short order. The current danger is that functional global standards still do not exist, while national interests can align perfectly in the short term with proprietary standards. That is not to say, however, that proprietary languages and applications should not be encouraged and adopted; quite the contrary, as open source suffers challenges similar to those of standards in terms of competitive differentiation. Rather, it only means that proprietary technologies cannot become the de facto standard in a network economy.

In peering into the future from my perch in our small private lab and incubator in the wilds of northern Arizona more than 15 years ago, the need for standardized structured data became evident, as did the need for easily adaptable software systems that manage relationships between entities. Combined with a data explosion that seems infinite, it was also obvious that filters would be required to manage the quality and quantity of data based on the profiles of entities. The platform would need to be secure, not trackable for many applications, reflect the formal relationships between entities, and set the foundation for accountability, the rule of law, and sustainable economics. In addition, in order to allow and incentivize differentiation beyond the software programmer community, thus permitting market economics to function, the neural network economy would require adaptability similar to that which takes place in the natural, physical world.
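As a rough, hypothetical illustration of the entity profiles and data filters described in the preceding paragraph, the sketch below shows one simple way a profile could govern the quality and quantity of data reaching an entity. The class names, thresholds, and relevance scores are assumptions made for the example, not a description of Kyield's implementation.

```python
# Hypothetical sketch of profile-based filtering: each entity (person, workgroup,
# organization) has a profile that governs which data items reach it, managing
# both the quality and the quantity of information delivered.
from dataclasses import dataclass
from typing import List, Set

@dataclass
class DataItem:
    topic: str
    relevance: float      # 0.0 to 1.0, e.g., from a semantic or predictive model
    source: str

@dataclass
class EntityProfile:
    name: str
    topics: Set[str]              # topics this entity is accountable for
    min_relevance: float = 0.5    # quality threshold
    daily_limit: int = 20         # quantity threshold

    def filter(self, items: List[DataItem]) -> List[DataItem]:
        matched = [i for i in items
                   if i.topic in self.topics and i.relevance >= self.min_relevance]
        # highest-relevance items first, capped to keep volume manageable
        return sorted(matched, key=lambda i: i.relevance, reverse=True)[: self.daily_limit]

# Usage: a risk officer's profile surfaces the high-relevance modeling result
# ahead of routine operational items.
officer = EntityProfile(name="risk.officer", topics={"seismic-risk", "plant-operations"})
items = [DataItem("seismic-risk", 0.95, "engineering"),
         DataItem("plant-operations", 0.40, "ops")]
print([i.topic for i in officer.filter(items)])
```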

I suggest, then, that while nascent and imperfect, semantics is the preferred method to achieve alignment of interests in the emergent neural network economy, for it represents the building blocks in structured data for meaning in the digital age, resting at the confluence of human and universal languages, and serving as the functional portal to the neural network economy.

Finally, as the humble founder and inventor, permit me to suggest that Kyield is the optimal system to manage semantics, as it intentionally achieves the necessary elements for organizations to align and optimize their digital assets with the mission of the organization, containing adaptable tools to manage the relationships between entities, including with and between each individual and workgroup.

May 2012 deliver more meaning to you, your organization, and by extension our collective future.

Best of Kyield Blog Index


I created an index page containing links to the best articles in our blog with personal ratings:

Best of Kyield Blog Index.