Hidden costs of complexity in the enterprise
April 7, 2010
Often, when consuming fresh evidence, a truism is reconfirmed: what is not visible is far greater than what is, in terms of opportunity, risk, and cost alike. Few concepts are so profoundly transformative once they take shape in the mind's eye.
Unlike biological evolution, where adaptation occurs over many generations, the dominant complex systems on earth today are driven by technology, evolve very rapidly, and are of course subject to human manipulation.
From competitive advantage to utility tax
As in healthcare, education, and finance, where complexity creep has been manipulated for so long that it has surpassed sustainability, enterprise software has become primarily a game of ‘heads I win; tails you lose’ for those paying the bills. While the first generation of enterprise software provided a strong competitive advantage to early adopters, the bulk of systems have long since resembled a tax on admission.
Today most leading enterprise systems offer a competitive advantage to very few organizations; innovation is focused on optimizing commoditization, scale, and capitalization, not on products, service, or improvement. The result is a breakdown in product design, which tragically includes a negative influence on innovation for customers worldwide.
The tipping point for market dysfunction in the enterprise occurred more than a decade ago when most of the innovation began to be outsourced to consumer ventures in Silicon Valley that serve a very different purpose for a different customer. Even if the majority of SV VC firms and young social networking entrepreneurs understood the needs of organizations, globally adopted consumer technology cannot provide a competitive advantage to organizations. It should be obvious that if everyone is using the same rigid programs with little if any adaptation or customization, then the advantage falls only to those with scale.
Another sign of market dysfunction in IT more generally can be seen by comparing the cash generation of industry leaders with that of their competitors. A recent article in the WSJ revealed that nine tech companies generated $68.5 billion in new cash during the Great Recession, while the other 65 competitors in the S&P 500 generated $13.5 billion combined.
The war chests at the IT leaders tell a different story for each company: recent product success at Apple, the power of the network effect with Google on the Web, effectively managed monopolies, and an economy dominated for many years by investment banks, weak regulation and judicial enforcement, and, last but not least, apathy among enterprise customers. I argue that this level of consolidation of market power isn’t good for anyone, including the stockholders of those with the largest war chests: their dividends are either very small or non-existent, and the majority have performed poorly in stock appreciation.
Need for ‘market farming’
It’s essential to understand the difference between proprietary applications and proprietary standards, particularly as it relates to the network effect, market power, and systemic risk. IT architecture in the network era is of course global and largely unregulated outside of minimal anti-trust enforcement and voluntary standards bodies, so it should not be surprising that proprietary standards still dominate enterprise computing, or that proprietary systems are manipulated for self-gain. That is, after all, the fiduciary duty of public companies whose bonus incentives, stock options, and careers depend upon it.
Of course it’s also true that, given government’s terrible track record in regulating technology, combined with nationalist conflicts that too often suffocate innovation with protectionism, very few of the vastly divergent stakeholders in the ecosystem favor increased regulation. That leaves us with the heavy lifting of systems design and market education to attract intelligent decisions by early adopters in an environment that is predatory to new entrants. Markets ultimately depend upon well-informed customers acting in their own best interest, even in expert professional roles, particularly in markets dominated by lock-in.
From my perspective as a consultant and venture capitalist who has coached hundreds of businesses of all sizes on defending against predation, it is an understatement to say that enterprise customers need to take a far more proactive role as experts in market farming. The financial incentive to do so is certainly sufficient, as is the risk of not doing so. Until more effective global governance of standards is enforced, however, national governments, non-profits, consumers, and especially CIOs have a special responsibility to be proactive as ‘market farmers’ within the IT ecosystem.
Ownership of data
In the modern world, data represents knowledge, security, identity, health, and wealth. Those who control proprietary standards within the network effect in an unregulated environment can not only extort a high price of entry to the utility, but can also greatly influence, if not control, the ownership of data. We see this often today on the public web (see LinkedIn’s license agreement for an example).
Our position with Kyield is that while we have every right, and indeed a responsibility, to protect our proprietary intellectual capital within our platform, we do not have the right to extend the power enabled by the platform over others. I would even go so far as to say that we have a responsibility to defend customers, to the best of our ability, from others doing so. Our power should extend only to the borders of our internal systems, beyond which we should embrace universal standards. And so we do, even though they are far from perfect and extremely slow to evolve relative to need.
Networks are not new; they are just much larger, contain data representing far more than ever before, and are of course global. Nations created standards for electric grids presumably to prevent the electric companies from owning every appliance maker, if not their entire economies. The ‘extend and conquer’ strategy is predatory, obviously very dangerous to the global economy, and should therefore be regulated and aggressively enforced. Nationalist conflicts and protectionism have prevented proper regulation.
It’s my personal position that no one should have power over the languages used for public transactions, nor should any company or group of companies (or any entity, including governments) have influence over, power over, or ownership of data other than the original owner of that data or those who have been granted rights of use. Ownership and control of data is, I believe, a fundamental right for individuals and legal entities. It would surprise me if the U.S. Supreme Court failed to clarify data standards and ownership at some point in the near future.
Of course standards bodies have a higher calling: not only to manage the standards process properly within meaningful time constraints, but also to consider the influence of conflicting interests on their financial model, as well as the sustainable economics of adoption. I recently suggested to the W3C, for example, that their wiki was insufficient for communicating the benefits of the languages the body created and approved. The response was that they didn’t have the bandwidth, and that communications were up to vendors. That response fell well short of what leaders in science have been teaching for many years, suggesting to me that significant change was needed in both the model and the culture.
With very few exceptions, during the 15 years I have been observing the evolution of computing standards, it has been confirmed often that the IT industry is no more able to self-regulate than the finance or healthcare industries. I wish they could, but wishing doesn’t make it so.
Beauty of simplicity
Those who doubt the need for reducing complexity in our society probably haven’t spent much time working pro bono for a worthy cause that cannot be saved due to entrenched interests owing their existence to manipulated complexity. I have seen this dilemma manifest many times during my career in legislation where the individual interests of the few manipulate the outcome for the many, thereby threatening the whole, themselves eventually included, with little or no apparent understanding of the impact of their self-centered activism. The current political situation in the U.S. is obviously not what the Founding Fathers (and Mothers) intended, and indeed what they feared most.
A very similar dynamic has evolved in computing over the past few decades, which I began to recognize in the mid-1990s when we converted our consulting firm into one of the first commercial Internet labs and incubators. It was only when I trained to become a network engineer and object-oriented programmer that the degree of manipulation and systemic risk became apparent.
“Complicated systems and their combinations should be considered only if there exist physical-empirical reasons to do so.” –Albert Einstein
There have been reasons for the increasing complexity in enterprise software, but they have been primarily limited to protecting cash cows and extending market share in the global network ecosystem. While economies of scale and Moore’s law pushed hardware costs lower, software costs escalated while arguably delivering less value, despite a massive reaction by open source. The parallel evolution in healthcare, education, and government in the U.S. was very similar; we were all paying more for less.
The problem today, of course, is that our collective resources can no longer afford even the status quo, much less the trajectory, of the costs imposed by manipulated complexity. In enterprise software, manipulated complexity also drives very high maintenance costs, now exceeding 20% of product cost per year in some cases, and of course a much higher level of security risk. This is a dangerous game, and largely unnecessary.
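To make the 20%-per-year figure concrete, here is a minimal sketch of the arithmetic using entirely hypothetical numbers (the license cost and time horizons are illustrative assumptions, not vendor data):

```python
# Hypothetical illustration of cumulative maintenance spend at a flat
# annual rate of 20% of the product (license) cost. All figures below
# are assumptions for illustration only.

def cumulative_maintenance(license_cost: float, rate: float, years: int) -> float:
    """Total maintenance paid over `years` at a flat annual rate."""
    return license_cost * rate * years

license_cost = 1_000_000  # hypothetical enterprise license, in dollars
rate = 0.20               # 20% of product cost per year, per the text

for years in (1, 5, 10):
    total = cumulative_maintenance(license_cost, rate, years)
    print(f"{years:>2} yr: ${total:,.0f} ({total / license_cost:.0%} of license)")
```

Under these assumptions, maintenance fees alone equal the entire original license cost by year five, and double it by year ten, before counting integration, customization, or upgrade costs.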
One of the elements I liked best when reviewing the papers of the Google founders and later their early beta was the elegance of simplicity. Google was among the first substantial efforts in the Internet era to reduce a highly complex environment to a very simple interface designed for humans in their own natural language, and it worked very well indeed.
Ironically, I found Google while searching for raw IP in an effort to achieve something similar for the enterprise. But enterprise data had not been converted to simple HTML like the public web; rather, it was locked in incompatible languages and data silos, and of course HTML lacked the ability to provide the embedded intelligence we were seeking. So we had a long wait for standards to catch up to the vision often associated with the semantic web. I like to think we used the time wisely.
Simplicity for the enterprise has not yet arrived at the station, but the train is full and fast approaching.