Future of BI in the Organization


We enjoyed a pleasant surprise this week in the form of a new Forrester report that named Kyield as one of the companies interviewed. The topic and content were certainly appropriate for introducing Kyield to Forrester clients:

Future of BI: Top Ten Business Intelligence Predictions for 2012, by Boris Evelson and Anjali Yakkundi with Stephen Powers and Shannon Coyne.

I have reviewed the brief paper and find myself in substantial agreement with its direction and predictions, so I recommend it, and I wanted to share a few additional thoughts.

One of the main themes in the report concerns expanding the use of BI throughout the organization, which essentially follows a proven management philosophy and strategy I have advocated my entire career, based on a combination of logic, experience, and testing in many real-life cases. In a broad sense this issue speaks to organizational awareness, or, as has too often been the case, the lack thereof.

Whether the concern is a missed opportunity or crisis prevention, the problem is not that organizations lack the knowledge essential for good decision making, or even that accurate predictions are never captured in digital form; experience shows that digital red flags are commonly found that were appropriate, accurate, and in many cases followed rational policy. Rather, in the vast majority of cases a combination of obstacles prevented the critical issues from reaching and/or influencing decision makers. It required more than a dozen years for the related technologies to evolve and mature to the point where our system can resolve this and other critical issues in near-real time, but that day has finally come.

I found it interesting that the paper acknowledges that BI has serious limitations, and that the next generation of technologies may turn traditional BI relationships “upside down”, including business alignment. The Forrester team also warns clients about excessive control and the need for balance. Indeed, our patented system provides fully adaptable modules that can be tailored to the specific mission and needs of the individual, group, and organization. Each organization requires the ability to tailor differentiation in easy-to-use modules based on standards that are cost-efficient and adaptable for continual improvement.

Other highlights in the paper point to the benefits of a semantic layer based on standards, along with a well-deserved warning to avoid consultants and integrators who “love the old-fashioned BI”, as it can result in “unending billable hours”. Of course this problem is not limited to BI; it is a serious problem across the enterprise ecosystem, one that has resulted in a unified defense against any innovation that reduces costs and improves efficiency, leading to lock-in and low levels of innovation in the enterprise.

Unfortunately for those who rely on inefficient systems and misalignment of interests, technology advances and economic necessity have converged to form an inflection point at the confluence of organizational management and neural networks. That is to say, our internal R&D as well as considerable external evidence is in strong agreement with the advice the Forrester team provides clients in this report, including to “start embracing some of the latest BI trends — or risk falling behind”. That risk is already significant, is expanding quickly in 2012, and will no doubt affect outcomes for the foreseeable future. -MM

Strategic IT Alignment in 2012: Leverage Semantics and Avoid the Caretaker


A very interesting development occurred on the way to the neural network economy: The interests of the software vendor and the customer diverged, circled back and then collided, leaving many executives stunned and confused.

The business model in the early years of software was relatively simple. Whether an individual or an enterprise, if the customer didn’t adopt the proprietary standard that provided interoperability, the customer was left behind and couldn’t compete. It was a no-brainer; we all adopted. By winning the proprietary standard in any given software segment, market leaders were able to deliver amazing improvements in productivity at relatively low cost while maintaining some of the highest profit margins in the history of business. This model worked remarkably well for a generation, but, as is often the case, technology evolved more rapidly than business models and incumbent cultures could adapt, so incumbents relied on lock-in tactics to protect the corporation, profits, jobs, and perhaps in some cases national trade.

Imagine the challenge of a CEO today in a mature, publicly traded software company with a suite of products generating many billions of dollars in profits annually. In order to continue to grow and keep the job, the CEO would need to do one of three things: rediscover the level of innovation of the early years, which very few have been able to do; play favorites by providing some customers with competitive advantage and others with commodities, which does occur in the enterprise market but is risky; or focus on milking commoditized market power in developed nations while pushing for growth in developing countries. The latter has been the strategy of choice for most mature companies, of course.

Doing all of the above simultaneously is nearly impossible. Killer apps by definition must cannibalize cash cows, and public company CEOs have a fiduciary responsibility to optimize profits while mitigating risk, so most CEOs in this position choose to remain ‘dairy farmers’ either until retirement or until emergent competition forces a change. When I recently discussed one such incumbent with one of the leading veterans in IT, he described such a CEO as “the caretaker”. For enterprise customers, this type of caretaker can be similar to the one we hired a few years ago to protect our interests when we moved to the Bay Area: he kept messaging ‘all is fine’, and we returned to a property that was uninhabitable (beware of the caretaker wolf in sheep’s clothing).

Now consider that software exports generate large, efficient engines for importing currency into headquarters countries, placing those national governments in strategic alignment with the incumbents in the short term (often dictated by short-term politics), and another entire dimension appears that is rarely discussed yet very strongly influences organizations worldwide. This situation can lead governments to protect and reinforce the perceived short-term benefits of commoditized market leaders over the critical long-term needs of organizations, markets, and economies. It is not inaccurate to suggest that national security is occasionally misunderstood and/or misused in the decision process on related policy.

Monopoly cultures think and act alike, whether in the public or private sector, which is often (eventually) their undoing, unless of course they adopt intentional continuous improvement. This is why creative destruction is so essential, why it has been embraced internally in some form by most progressive organizations, and why customers should proactively support innovators and farm markets toward sustainable diversity. Despite what may appear to be the case, the interests of incumbents in enterprise software are often in direct conflict with the interests of the customer.

While the theory of creative destruction has roots in Marxism, the practice is a necessity for capitalism (or any other ism) today because cultures and economies naturally migrate toward security and protection, which in turn takes us away from the discipline required for continual rejuvenation. We embrace creative destruction in what has become modern global socialism simply because very little innovation would emerge otherwise. Competitive advantage for organizations cannot exist in the rigid commoditization of organizational systems that we see in software. Simply put, whether at the individual, organizational, or societal level, we should embrace creative destruction for long-term survival, especially in light of our current unsustainable trajectory.

Which brings us to the present-day emergent neural network economy. In a modern network economy we simply must have interoperable software and communications systems; the global economy cannot function properly otherwise, so this is in everyone’s interest, as I have been saying for 15 years now. The overpowering force of the network effect would place any proprietary standard in a position to extort the entire global economy in short order. The current danger is that functional global standards still do not exist, while national interests can align perfectly in the short term with proprietary standards. That is not to say, however, that proprietary languages and applications should not be encouraged and adopted; quite the contrary, since open source suffers challenges similar to those of standards in terms of competitive differentiation. Rather, it only means that proprietary technologies cannot become the de facto standard in a network economy.

When I peered into the future from my perch in our small private lab and incubator in the wilds of northern Arizona more than 15 years ago, the need for standardized structured data was evident, as was the need for easily adaptable software systems that manage relationships between entities. Given a data explosion that seemed infinite, it was also obvious that filters would be required to manage the quality and quantity of data based on the profiles of entities. The platform would need to be secure, not trackable for many applications, reflect the formal relationships between entities, and set the foundation for accountability, the rule of law, and sustainable economics. In addition, in order to allow and incentivize differentiation beyond the software programmer community, thus permitting market economics to function, the neural network economy would require adaptability similar to that which takes place in the natural, physical world.
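To make the idea of profile-based filtering concrete, here is a minimal sketch of how a data stream might be filtered against the quality and quantity thresholds of an entity profile. It is illustrative only, not a description of Kyield’s implementation; the EntityProfile fields, the tag-overlap relevance score, and the thresholds are all hypothetical simplifications.

```python
from dataclasses import dataclass, field

@dataclass
class EntityProfile:
    """Hypothetical profile for an individual, workgroup, or organization."""
    name: str
    interests: set = field(default_factory=set)  # topics relevant to the entity's mission
    max_items_per_day: int = 25                  # quantity limit
    min_relevance: float = 0.6                   # quality threshold

def relevance(item_tags: set, profile: EntityProfile) -> float:
    """Score an item as the fraction of its tags that match the profile's interests."""
    if not item_tags:
        return 0.0
    return len(item_tags & profile.interests) / len(item_tags)

def filter_stream(items: list, profile: EntityProfile) -> list:
    """Keep only items relevant enough for this entity, capped by daily quantity."""
    scored = [(relevance(set(item["tags"]), profile), item) for item in items]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    kept = [item for score, item in scored if score >= profile.min_relevance]
    return kept[:profile.max_items_per_day]

# Example: a hypothetical risk workgroup sees only items matching its interests.
group = EntityProfile("risk-group", interests={"credit-risk", "compliance"})
stream = [
    {"id": 1, "tags": ["credit-risk", "compliance"]},
    {"id": 2, "tags": ["marketing"]},
]
print(filter_stream(stream, group))  # item 1 passes; item 2 is filtered out
```

The point of the sketch is simply that once profiles are explicit and data carries structure, relevance and volume can be managed per entity rather than forcing every individual to drink from the same firehose.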

I suggest, then, that while nascent and imperfect, semantics is the preferred method for achieving alignment of interests in the emergent neural network economy, for it represents the building blocks of meaning in structured data for the digital age, resting at the confluence of human and universal languages and serving as the functional portal to the neural network economy.
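As a simple illustration of what “building blocks of meaning in structured data” can look like, here is a minimal sketch of subject-predicate-object triples and a pattern query. The entities and relations are hypothetical, and the representation is deliberately simplified; a real deployment would use standards such as RDF and OWL rather than bare tuples.

```python
# Minimal illustration of semantic triples: (subject, predicate, object).
# Entities and relations are hypothetical; a real system would use RDF/OWL standards.
triples = {
    ("alice",      "memberOf",   "risk-group"),
    ("risk-group", "partOf",     "acme-corp"),
    ("report-42",  "authoredBy", "alice"),
    ("report-42",  "topic",      "credit-risk"),
}

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None acts as a wildcard)."""
    return [
        (s, p, o) for (s, p, o) in triples
        if (subject is None or s == subject)
        and (predicate is None or p == predicate)
        and (obj is None or o == obj)
    ]

# Who authored report-42, and to which group does that person belong?
author = query("report-42", "authoredBy")[0][2]
print(author, "->", query(author, "memberOf"))
```

Even this toy graph shows why structured meaning matters: the relationships between individuals, workgroups, and digital assets become explicit data that systems can reason over, rather than context locked inside documents and inboxes.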

Finally, as the humble founder and inventor, permit me to suggest that Kyield is the optimal system to manage semantics, as it intentionally provides the elements necessary for organizations to align and optimize their digital assets with the mission of the organization, including adaptable tools to manage the relationships between entities, with and between each individual and workgroup.

May 2012 deliver more meaning to you, your organization, and by extension our collective future.