Press release on our enterprise pilot program

We decided to expand the reach of our enterprise pilot program through the electronic PR system, so I issued the following BusinessWire release today:

Kyield Announces Pilot Program for Advanced Analytics and Big Data with New Revolutionary BI Platform 

“We are inviting well-matched organizations to collaborate with us in piloting our breakthrough system to bring a higher level of performance to the information workplace,” stated Mark Montgomery, Founder and CEO of Kyield. “In addition to the significant competitive advantage exclusive to our pilot program, we are offering attractive long-term incentives free from lock-in, maintenance fees, and high service costs traditionally associated with the enterprise software industry.”

Regards, MM


Future of BI in the Organization

We enjoyed a pleasant surprise this week in the form of a new Forrester report that included Kyield among the interviews. The topic and content were certainly appropriate for introducing Kyield to Forrester clients:

Future of BI: Top Ten Business Intelligence Predictions for 2012, by Boris Evelson and Anjali Yakkundi with Stephen Powers and Shannon Coyne.

I have reviewed the brief paper and find myself in substantial agreement with its direction and predictions, so I recommend the report, and I wanted to share a few additional thoughts.

One of the main themes in the report concerns expanding the use of BI throughout the organization, which essentially follows a proven management philosophy and strategy I have advocated my entire career, based on a combination of logic, experience, and testing in many real-life cases. In a broad sense this issue speaks to organizational awareness, or, as has too often been the case, the lack thereof.

Whether the scenario is a missed opportunity or a failure of crisis prevention, the problem is not that organizations lack the knowledge essential for good decision making, or even that accurate predictions never reach digital form; indeed, experience shows that digital red flags are commonly found that were appropriate, accurate, and in many cases consistent with rational policy. Rather, in the great majority of cases a combination of obstacles prevented the critical issues from reaching and influencing decision makers. It took more than a dozen years for the related technologies to evolve and mature to the point where our system can resolve this and other critical issues in near-real time, but that day has finally come.

I found it interesting that the paper acknowledges that BI has serious limitations, and that the next generation of technologies may turn traditional BI relationships “upside down”, including business alignment. The Forrester team also warns clients about excessive control and the need for balance. Indeed, our patented system provides fully adaptable modules that can be tailored to the specific mission and needs of the individual, group, and organization. Each organization requires the ability to tailor differentiation in easy-to-use modules based on standards that are cost-efficient and adaptable for continual improvement.

Other highlights in the paper point to the benefits of a semantic layer based on standards, along with a well-deserved warning to avoid consultants and integrators who “love the old-fashioned BI” as it can result in “unending billable hours”. Of course this problem is not limited to BI; it is a serious problem across the enterprise ecosystem that has resulted in a unified defense against any innovation that reduces costs and improves efficiency, leading to ‘lock-in’ and low levels of innovation in the enterprise.

Unfortunately for those who rely on inefficient systems and misaligned interests, technology advances and economic necessity have come together at roughly the same time to form an inflection point at the confluence of organizational management and neural networks. That is to say, our internal R&D as well as considerable external evidence strongly agrees with the advice the Forrester team provides clients in this report, including to “start embracing some of the latest BI trends — or risk falling behind”. That risk is currently significant, is expanding quickly in 2012, and will no doubt impact outcomes for the foreseeable future. -MM

Strategic IT Alignment in 2012: Leverage Semantics and Avoid the Caretaker

A very interesting development occurred on the way to the neural network economy: The interests of the software vendor and the customer diverged, circled back and then collided, leaving many executives stunned and confused.

The business model in the early years of software was relatively simple. Whether an individual or enterprise, if the customer didn’t adopt the proprietary standard that provided interoperability, the customer was left behind and couldn’t compete—a no-brainer—we all adopted. By winning the proprietary standard in any given software segment, market leaders were able to deliver amazing improvements in productivity at relatively low cost while maintaining some of the highest profit margins in the history of business. This model worked remarkably well for a generation, but as is often the case, technology evolved more rapidly than business models and incumbent cultures could adapt, so incumbents relied on lock-in tactics to protect the corporation, profit, jobs, and perhaps in some cases national trade.

Imagine the challenge of a CEO today in a mature, publicly traded software company with a suite of products generating many billions of dollars in profits annually. In order to continue to grow and keep the job, the CEO would need to do one of three things: rediscover the level of innovation of the early years, as very few have been able to do; play favorites by providing some customers with competitive advantage and others with commodities, which occurs in the enterprise market but is risky; or focus on milking commoditized market power in developed nations while pushing for growth in developing countries. The latter has been the strategy of choice for most mature companies, of course.

Doing all of the above simultaneously is nearly impossible. Killer apps by definition must cannibalize cash cows, and public company CEOs have a fiduciary responsibility to optimize profits while mitigating risk, so most CEOs in this position choose to remain ‘dairy farmers’ either until retirement or until emergent competition forces them to change. When I recently discussed one such incumbent with one of the leading veterans in IT, he described such a CEO as “the caretaker”. For enterprise customers this type of caretaker can resemble the one we hired a few years ago to protect our interests when we moved to the Bay Area: we returned to a property that was uninhabitable after receiving assurances that ‘all is fine’ (beware of the caretaker wolf in sheep’s clothing).

Now consider that software exports generate large, efficient engines of currency imports for headquarters countries, placing those national governments in short-term strategic alignment with the incumbents (often dictated by short-term politics), and an entire additional dimension appears that is rarely discussed yet very strongly influences organizations worldwide. This dynamic can lead governments to protect and reinforce the perceived short-term benefits of commoditized market leaders over the critical long-term needs of organizations, markets, and economies. It is not inaccurate to suggest that national security is occasionally misunderstood and/or misused in the decision process on related policy.

Monopoly cultures think and act alike, whether in the public or private sector, which is often (eventually) their undoing, unless of course they adopt intentional continuous improvement. This is why creative destruction is so essential, has been embraced internally by most progressive organizations in some form, and why customers should proactively support innovators and farm markets towards sustainable diversity. Despite what may appear to be the case, the interests of incumbents in enterprise software are often directly conflicting with the interests of the customer.

While the theory of creative destruction has roots in Marxism, the practice is a necessity for capitalism (or any other ism) today due to the natural migration of cultures and economies to seek security and protection, which in turn takes us away from the discipline required for continual rejuvenation. We embrace creative destruction in what has become modern global socialism simply because very little innovation would emerge otherwise. Competitive advantage for organizations cannot exist in rigid commoditization of organizational systems as we see in software. Simply put—whether at the individual, organizational, or societal level, we should embrace creative destruction for long-term survival, especially in light of our current unsustainable trajectory.

Which brings us to the present-day emergent neural network economy. In a modern network economy we simply must have interoperable software and communications systems; the global economy cannot function properly otherwise, so this is in everyone’s interest, as I have been saying for 15 years now. The overpowering force of the network effect would place any proprietary standard in an extortion position over the entire global economy in short order. The current danger is that functional global standards still do not exist, while national interests can align perfectly in the short term with proprietary standards. That is not to say, however, that proprietary languages and applications should not be encouraged and adopted—quite the contrary—open source suffers challenges similar to those of standards in terms of competitive differentiation. Rather, it only means that proprietary technologies cannot become the de facto standard in a network economy.

Peering into the future from my perch in our small private lab and incubator in the wilds of northern Arizona more than 15 years ago, the need for standardized structured data became evident, as did the need for easily adaptable software systems that manage relationships between entities. Combined with a data explosion that seemed infinite, it was also obvious that filters would be required to manage the quality and quantity of data based on the profiles of entities. The platform would need to be secure, untrackable for many applications, reflective of the formal relationships between entities, and capable of setting the foundation for accountability, the rule of law, and sustainable economics. In addition, in order to allow and incentivize differentiation beyond the software programmer community, thus permitting market economics to function, the neural network economy would require adaptability similar to that which takes place in the natural, physical world.
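To make the filtering idea above concrete, here is a minimal sketch of managing the quality and quantity of a data stream against an entity's profile. All names and fields are hypothetical illustrations, not drawn from Kyield's actual system:

```python
from dataclasses import dataclass, field

@dataclass
class EntityProfile:
    """Hypothetical profile describing what one entity should receive."""
    entity_id: str
    topics: set = field(default_factory=set)  # subjects of interest
    min_quality: float = 0.0                  # quality floor for incoming items
    max_items: int = 100                      # quantity cap per delivery cycle

def filter_stream(items, profile):
    """Keep items matching the profile's topics and quality floor,
    then trim to the allowed quantity, highest quality first."""
    matched = [
        item for item in items
        if item["topic"] in profile.topics
        and item["quality"] >= profile.min_quality
    ]
    matched.sort(key=lambda item: item["quality"], reverse=True)
    return matched[:profile.max_items]

profile = EntityProfile("analyst-7", topics={"risk", "markets"},
                        min_quality=0.6, max_items=2)
stream = [
    {"topic": "risk",    "quality": 0.9},
    {"topic": "sports",  "quality": 0.99},  # off-topic, filtered out
    {"topic": "markets", "quality": 0.7},
    {"topic": "risk",    "quality": 0.4},   # below quality floor
]
print(filter_stream(stream, profile))
```

In practice a profile would reflect the individual's role, workgroup, and mission rather than a static topic list, but the principle of filtering by quality and quantity per entity is the same.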

I suggest, then, that while nascent and imperfect, semantics is the preferred method to achieve alignment of interests in the emergent neural network economy, for it represents the building blocks in structured data for meaning in the digital age, resting at the confluence of human and universal languages, and serving as the functional portal to the neural network economy.
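One common way those building blocks are expressed, and only an illustrative one, not specific to Kyield, is the subject–predicate–object triple, which encodes relationships between entities as structured data:

```python
# Relationships between entities as subject-predicate-object triples,
# the basic unit of structured semantic data (illustrative names only).
triples = [
    ("alice",       "memberOf", "workgroup-A"),
    ("workgroup-A", "partOf",   "acme-corp"),
    ("alice",       "authored", "report-42"),
]

def objects(facts, subject, predicate):
    """Return all objects linked to `subject` by `predicate`."""
    return [o for s, p, o in facts if s == subject and p == predicate]

# Which workgroups does alice belong to?
print(objects(triples, "alice", "memberOf"))
```

Because every fact shares the same three-part shape, relationships between individuals, workgroups, and organizations can be queried and recombined uniformly, which is precisely the alignment property the passage above argues for.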

Finally, as the humble founder and inventor, permit me then to suggest that Kyield is the optimal system to manage semantics as it intentionally achieves the necessary elements for organizations to align and optimize their digital assets with the mission of the organization, containing adaptable tools to manage the relationships between entities, including with and between each individual and workgroup.

May 2012 deliver more meaning to you, your organization, and by extension our collective future.

Best of Kyield Blog Index

I created an index page containing links to the best articles in our blog with personal ratings:

Best of Kyield Blog Index.

Kyield is seeking select organizations to partner on enterprise prototype


We are now seeking clients to work in a collaborative manner to develop and test a fully functional prototype of our patented enterprise platform during 2012.

For a small group of well-matched organizations, we are prepared to offer exceptional benefits:

  • Very attractive license terms of extended duration
  • Extraordinary consulting in tailoring and optimizing the system
  • Priority terms for future innovation, providing on-going competitive advantage

General target criteria for prototype / client partnership

Must be interested in improving:

  • Crisis prevention
  • Innovation
  • Productivity
  • Decision making

Preferred entity size of 500 to 10,000 knowledge workers

  • Flexible depending on work type & intensity
  • Can go much higher but not much lower
  • No direct competitors
  • Strategic partner organizations possible

Above average technical environment

  • Create original digital work products
  • Distributed, remote workers
  • Place high value on invention & innovation
  • High value on prevention of crises & litigation
  • High priority on competitive advantage & differentiation
  • Thought leader more important than market leader

English language work environment

    • To interact efficiently with Kyield, not for internal files

The project will require CEO/COO buy-in for unit or organization

    • The nature of the organization platform will likely require CEO leadership prior to engagement

If your organization or a client matches these general criteria for this opportunity, or you are aware of one that might, please contact Kyield’s CEO Mark Montgomery ( ) to explore in more detail. 

Kyield will not disclose identities of either the individuals or organizations until both parties agree.

Thank you!

New paper: Optimizing Knowledge Yield in the Digital Workplace

I am pleased to share a new paper that may be of interest:

Optimizing Knowledge Yield in the Digital Workplace
A new system design for thriving in the data-intensive universe

From the abstract:

The purpose of this paper is threefold. First, it briefly describes how the digital
workplace evolved in an incremental manner. Second, it discusses related structural
technical and economic challenges for individuals and organizations in the digital
workplace. Lastly, it summarizes how Kyield’s novel approach can serve to provide
exponential performance improvement.

Key patent issued

My key patent for Kyield was issued today by the USPTO as scheduled earlier this month.

Title: Modular system for optimizing knowledge yield in the digital workplace

Abstract: A networked computer system, architecture, and method are provided for optimizing human and intellectual capital in the digital workplace environment.

To view our press release, go here.

To view the actual patent, go here.

I will post an article when time allows on the importance of this technology and IP, and perhaps one on the experience with the patent system. Thanks, MM

Too Big to Fail or Too Primitive to Succeed?

Our economy very nearly experienced a financial version of Armageddon due to the gap between a primitive governance structure and the highly sophisticated tools employed by a few whose interests were deeply misaligned with the needs of sustaining our national and global economy. We have all since unwillingly experienced the negative impacts of untamed technology while experiencing few of the benefits of the tamed, whether for resolution of the current crisis or prevention of the next.

Given the systemic nature and scale of the financial crisis, and in consideration of the poor ongoing economic conditions, it’s clear that the financial industry, political process, and regulators have all fallen short of achieving the individual mission of each, particularly in consideration of current technological capabilities.

For the past several months financial institutions have been attempting to convince regulators that they should not be labeled a Systemically Important Financial Institution (SIFI). The process of implementing the 2010 Dodd-Frank law in the U.S. has resulted in spin-offs in an attempt to avoid increased U.S. regulation, while the new global rules for multinational banks on top of Basel III, including surcharges and increased capital ratios, are resulting in a comprehensive rethink of the fundamental assumptions surrounding the global banking model.

Observing this dynamic invites mental imagery of bureaucrats, politicians, and academics in team competition, each applying favored remedies such as duct tape over economic journals in a futile attempt to plug giant leaks in the hull of a Nimitz-class aircraft carrier.

When basic human greed clashed with globalization, networking, and technology, the combination introduced a complexity far beyond the organizational structures and tools available to regulators or corporations. Indeed, the reaction we’ve observed suggests that the remedies employed to manage this crisis were designed for a war fought over seven decades ago during the Great Depression, an era when state-of-the-art technology was represented by the IBM Type 285 Numeric Printing Tabulator, capable of tabulating 150 cards per minute. The hourly sales of IBM today approach its annual sales of 1933, and billions of records are now processed in seconds, yet our archaic regulatory system is employing printing presses in response to the largest financial crisis in 75 years.

A great deal has been learned in recent years beyond traditional economic theory about the systemic nature of networks, social behavior, contagion, and the global economy, with considerable investment in basic and applied research focused on technologies specifically designed to prevent systemic crises.

In the era of high performance computing on an increasingly interconnected planet with ever expanding pipes, economic tipping points can be reached very quickly that can bankrupt even the previously most wealthy nation on earth, particularly in a weakened economic condition suffering from structural problems. Focusing on SIFIs is of course essential, even if tardy by decades, but the emphasis should be on managing real systemic risk, which requires a very specific data structure that ensures data integrity, enhanced security, system-wide automation, modernized organizational structures, and continual, real-time improvement.

Without deep intelligence on the constantly changing relationships in a carefully constructed semantic layer, managed automatically by pre-configured data valves, systemic risk is impossible to manage well, or even, I argue, at a minimally acceptable level.
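The "data valve" notion can be sketched as a pre-configured rule that passes a metric while it stays within limits and automatically escalates it the moment it crosses a threshold. The names, metric, and threshold below are hypothetical illustrations, not drawn from any real regulatory system:

```python
def data_valve(metric_name, value, threshold, escalate):
    """Pass values below the threshold; escalate those at or above it."""
    if value >= threshold:
        escalate(metric_name, value)
        return "escalated"
    return "passed"

# Collect escalations so a human (or downstream system) can act on them.
alerts = []
def escalate(name, value):
    alerts.append((name, value))

data_valve("counterparty_exposure", 0.30, 0.80, escalate)  # within limits
data_valve("counterparty_exposure", 0.95, 0.80, escalate)  # crosses threshold
print(alerts)
```

The point of pre-configuring such valves is that escalation happens in near-real time as data flows, rather than waiting for a quarterly report or a manual review to surface the breach.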

Sophisticated new multi-disciplinary systems have been designed specifically to address the modern challenges in systemic risk management, but have yet to be built out and deployed. Policy makers should insist on the new generation of technologies to better protect citizens and the economy; regulators should embrace and promote the technology for it’s impossible to meet their mission otherwise; and financial institutions should adopt the technology due to rare ROI and sharply reduced levels of risk.

Is the customer’s customer a tipping point for enterprise IT?

In early 1996 we spun out a radical concept from my consulting firm on the newly commercialized web that attempted to level the playing field between small business and large. The vision was grander than the technical capabilities at the time, but despite our many weaknesses it became a niche market leader.

Even though we had recent experience representing clients who were competing with market leaders, I was still surprised by the response in some sectors. In our attempt to partner with multinationals, we encountered primarily fear and defensiveness, including in finance. It was quite clear that the majority of leaders in the corporate world were not terribly thrilled with our efforts. From my perspective, however, given the advantages of incumbents, the long-term risk to most of their companies was far greater if such efforts did not succeed. Having been on all sides of this issue, I was closer to the challenges than they were (and had deeper intel).

Fast forward to 2008. During the initial wave of the global financial crisis I had a private email exchange with one of the leading economic editors, a respected centrist thought leader I had known for over a decade. While we have very different backgrounds and experiences, we agreed that the initial reaction to the crisis, even though understandable, was misguided. Due to a myriad of factors, including consolidation, centralization, internal financial conflicts, expediency, scale, political activism favoring large institutions, and technology, the small business engine was already in trouble in the West, buoyed primarily by easy credit and the housing bubble in previous years. Based on the evidence from previous recessions, we had very little confidence that the existing financial infrastructure could serve the needs of small business, particularly in its current form. One need only travel in the rural U.S. or observe a few SME P&Ls to conclude in hindsight that we were correct.

Fast forward to the present day. On Meet The Press last Sunday, David Brooks sent a warning that I fear will go unheard in the very ivory towers that need to heed the message: “I was up on Wall Street the other day. I know political risk better than they do; they are vastly underestimating the source of political risk out there. We could have a massive problem in the next couple of years.” The source Brooks is referring to, of course, is the American citizen and consumer.

A headline on Wednesday (6/1) at CNBC echoes the disconnect: Wall Street Baffled by Slowing Economy, Low Yields. These are not isolated cases, but rather symptoms of a greater problem at work in the decision process found in every crisis over the past 15 years. I don’t know what data these analysts are consuming, or what tools they are using, but their systems and methods continue to fail them if these and other reports are true.

A glance at the quarterly reports of even the largest consumer companies would reveal a combination of inflation and weak spending that is beginning to negatively impact earnings. Given the massive scale and tight margins facing most of these companies, it should serve as a long-overdue wake-up call that it’s time for the IT industry cluster to execute competitive, cost-effective solutions to help the customer’s customers compete. This is not an immediate crisis begging for knee-jerk reactions, but rather a trend long in the making, rooted in underlying structural problems in the economy that are essential to overcome.

One does not need to search far and wide to discover a variety of profitable methods and models to extend high-end functionality to the SME market, provided of course one is looking, not consumed with protectionism, and obstacles are removed. Regardless of what sector of the economy each serves, ultimately there is no escaping the impact of macroeconomic conditions, including the impact of technology on customers of customers.

New Mexico Analytics Cluster project

I wanted to share a project I initiated earlier this year that we have temporarily named the New Mexico Analytics Cluster.

Essentially the project relates to organizing the broad analytics cluster here in NM into a functional global leader worthy of the science and technology. Initial feedback from quite a few of the related companies, labs, and individuals has been useful and generally positive, so we thought it was time to initiate a modest public exposure with a blog. A brief slide presentation is linked that provides more detail.

Thanks, MM