Kyield Partner Program: An Ecosystem for the Present Generation


We've been in various discussions for the past couple of years with potential partners, and have found many interests in the industry to be in direct conflict with the mission of customer organizations. Most of these discussions reflect a generational change in technology now underway, one being met with sharp internal divisions at incumbent market leaders as they attempt to protect the past from the present and future. The situation is not new; it is very difficult for incumbent market leaders to voluntarily cannibalize highly profitable business models, particularly when doing so would threaten jobs, which is why disruptive technology has historically required new ecosystems (Microsoft and Intel in the early 1980s, Google in the late 1990s, RIM and the Apple app stores in the 2000s, etc.).

Because of the platform approach of Kyield, the disruptive nature of our technology to incumbent business models, and the resistance to change among industry leaders despite pressure from even their largest customers, we determined quite some time ago that it may require building a new ecosystem based on the Kyield platform, data standards, and interoperability. The driving need is not just the enormous sums being charged for unproductive functionality such as lock-in, maintenance fees, and unnecessary software integration (an ever-increasing problem for customers), or even commoditization and the lack of competitive advantage (also a very serious problem). Rather, as is often the case, it comes down to a combination of complexity, math, and physics.

Not only is it economically unviable to optimize network computing in the neural network economy on legacy business models, but without data standards and seamless interoperability we cannot physically prevent systemic crises, dramatically improve productivity, or expedite discovery in a manner that doesn't bankrupt a good portion of the economy. In addition, we do need deep intelligence on each entity, as envisioned in the semantic web, in order to overcome many of our greatest challenges, execute advanced analytics, and manage 'big data' in a rational manner; but we also need to protect privacy, data assets, knowledge capital, and property rights while improving security. Standards may be voluntary, but overcoming these challenges isn't.

So, in conjunction with our pilot phase, we've been working on a new partner program for Kyield to reach prospective partners who may not yet be on our radar but who would make great partners and work with us to build a new ecosystem, one based not on protecting the past or the current management at incumbent firms, but on the future, by optimizing the present capabilities and opportunities surrounding our system. In the process we hope to collectively create far more new jobs than we could possibly do on our own, not just in our ecosystem but, importantly, in customer ecosystems.

We’ve decided for now on five different types of partner relationships that are intended to best meet the needs of customers, partners, and Kyield:

  • Consulting
  • Technology
  • Platform
  • Strategic
  • Master

For more information on our partner program, please visit:

http://www.kyield.com/partners.html

Thank you,

Mark Montgomery
Founder, Kyield

Five Essential Steps For Strategic (adaptive) Enterprise Computing


Given the spin surrounding big data, duopoly deflection campaigns by incumbents, and a culture of entitlement across the enterprise software ecosystem, the following 5 briefs are offered to provide clarity for improving strategic computing outcomes.

1)  Close the Data Competency Gap

Much has been written in recent months about the expanding need for data scientists, which is true at this early stage of automation, yet very little is whispered in public on the prerequisite learning curve for senior executives, boards, and policy makers.

Data increasingly represents all of the assets of the organization, including intellectual capital, intellectual property, physical property, financials, supply chain, inventory, distribution network, customers, communications, legal, creative, and all relationships between entities. It is therefore imperative to understand how data is structured, created, consumed, analyzed, interpreted, stored, and secured. Data management will substantially impact the organization’s ability to achieve and manage the strategic mission.

Fortunately, many options exist for rapid advancement in understanding data management ranging from off-the-shelf published reports to tailored consulting and strategic advisory from individuals, regional firms, and global institutions. A word of caution, however—technology in this area is changing rapidly, and very few analysts have proven able to predict what to expect within 24-48 months.

Understanding Data Competency

    • Data scientists are just as human as computer scientists or any other type of scientist
    • A need exists to avoid exchanging software-enabled silos for ontology-enabled silos
    • Data structure requires linguistics, analytics requires mathematics, human performance requires psychology, predictive requires modeling—success requires a mega-disciplinary perspective

2)  Adopt Adaptive Enterprise Computing

A networked computing workplace environment that continually adapts to changing conditions based on the specific needs of each entity – MM 6.7.12

While computing has achieved a great deal for the world during the previous half-century, the short-term gain became a long-term challenge as ubiquitous computing was largely a one-time, must-have competitive advantage that everyone needed to adopt or be left behind.  It turns out that creating and maintaining a competitive advantage through ubiquitous computing within a global network economy is a much greater challenge than initial adoption.

A deep misalignment of interests now exists between customer entities that need differentiation in the marketplace to survive and much of the IT industry, which needs to maintain scale by replicating the precise same hardware and software at massive scale worldwide.

When competitors all over the world are using the same computing tools for communications, operations, transactions, and learning, yet have a dramatically different cost basis for everything else, the region or organization with a higher cost basis will indeed be flattened with economic consequences that can be catastrophic.

This places an especially high burden on companies located in developed countries like the U.S. that are engaged in hyper-competitive industries globally while paying the highest prices for talent, education and healthcare—highlighting the critical need to achieve a sustainable competitive advantage.

Understanding adaptive enterprise computing:

    • Adaptive computing for strategic advantage must encompass the entire enterprise architecture, which requires a holistic perspective
    • Adaptive computing is strategic; commoditized computing isn't and should instead be viewed as entry-level infrastructure
    • The goal should be to optimize intellectual and creative capital while tailoring product differentiation for a durable and sustainable competitive advantage
    • Agile computing is largely a software development methodology while adaptive computing is largely a business strategy that employs technology for managing the entire digital work environment
    • The transition to adaptive enterprise computing must be step-by-step to avoid operational disruption, yet bold to escape incumbent lock-in

3)  Extend Analytics to Entire Workforce

Humans represent the largest expense and risk to most organizations, so technologists have had a mandate for decades to automate processes and systems that either reduce or replace humans. This is a greatly misunderstood economics theory, however. The idea is to free up resources for re-investment in more important endeavors, which has historically employed the majority of people, but in practice the theory is dependent upon long-term, disciplined, monetary and fiscal policy that favors investment in new technologies, products, companies and industries. When global automation is combined with an environment that doesn’t favor re-investment in new areas, as we’ve seen in recent decades, capital will sit on the sidelines or be employed in speculation that creates destructive bubbles, the combination of which results in uncertainty with high levels of chronic unemployment.

However, while strategic computing must consider all areas of cost competitiveness, it's also true that most organizations have become more skilled at cost containment than at human systems and innovation. As we've observed consistently in recent years, the result is that many organizations have failed to prevent serious or fatal crises, have missed important opportunities, and have failed to remain innovative at competitive levels.

While macroeconomic conditions will hopefully improve over time, the important message for decision makers is that the untapped potential in human performance analytics that can be captured with state-of-the-art systems today is several orders of magnitude greater than what traditional supply chain or marketing analytics alone can deliver.

Understanding Human Performance Systems:

    • Improving human performance systems improves everything else
    • The highest potential ROI to organizations today hasn’t changed in a millennium: engaging humans in a more competitive manner than the competition
    • The most valuable humans tend to be fiercely protective of their most valuable intellectual capital, which is precisely what organizations need, requiring deep knowledge and experience for system design
    • Loyalty and morale are low in many organizations due to poor compensation incentives, frequent job change, and misaligned motivation with employer products, cultures and business models
    • Motivation can be fickle and fluid, varying a great deal between individuals, groups, places, and times
    • For those who may have been otherwise engaged—the world went mobile

4)  Employ Predictive Analytics

In the current environment, an organization need not grow much beyond its founders before our increasingly data-rich world requires effective data management designed to achieve a strategic advantage with enterprise computing. Indeed, it has often been the case that success or failure depended upon converting an early agile advantage into a more mature adaptive environment and culture. Among organizations that survive beyond the average life expectancy, many cultures finally change only after a near-death experience triggered by becoming complacent, rigid, or simply feeling entitled to something the customer disagreed with: reasons enough for almost any company to adopt analytics.

While the need for more accurate predictive abilities is obvious for marketers, it is no less important for risk management, investment, science, medicine, government, and most other areas of society.

Key elements that impact predictive outcomes (a minimal sketch follows this list):

    • Quality of data, including integrity, scale, timeliness, access, and interoperability
    • Quality of algorithms, including design, efficiency, and execution
    • Ease of use and interpretation, including visuals, delivery, and devices
    • How predictions are managed, including verification, feedback loops, accountability, and the decision chain
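
To make the list above concrete, here is a minimal, hypothetical Python sketch (the libraries, field names, and thresholds are illustrative choices, not part of Kyield or any particular product) showing how these elements might be wired together: a data-quality gate, a simple model, and a prediction log that supports verification and the feedback loop.

```python
# Hypothetical sketch only: a data-quality gate, a simple model, and a
# prediction log that supports verification and feedback. Requires numpy
# and scikit-learn; all names and thresholds are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

def quality_gate(X, max_missing_ratio=0.05):
    """Reject a batch whose share of missing values exceeds the agreed threshold."""
    missing = np.isnan(X).mean()
    if missing > max_missing_ratio:
        raise ValueError(f"quality gate failed: {missing:.1%} of values missing")
    return np.nan_to_num(X)  # crude imputation, good enough for the sketch

# Synthetic history stands in for a real, timely, interoperable data source.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 4))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)

model = LogisticRegression().fit(quality_gate(X_train), y_train)

# Score new observations and log each prediction so the outcome can later be
# recorded against it (verification, accountability, and the decision chain).
X_new = rng.normal(size=(5, 4))
scores = model.predict_proba(quality_gate(X_new))[:, 1]
prediction_log = [
    {"features": row.tolist(), "score": float(s), "observed_outcome": None}
    for row, s in zip(X_new, scores)
]
for entry in prediction_log:
    print(entry)
```

In practice the log would be joined with observed outcomes so accuracy can be verified and fed back into both the data-quality thresholds and the model itself.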

5)  Embrace Independent Standards

Among the most important decisions impacting the future ability of an organization to adapt its enterprise computing to fast-changing external forces, which increasingly determine whether the organization succeeds or fails, is whether to embrace independent standards for software development, communications, and data structure.

Key issues to understand about independent standards:

    • Organizational sovereignty—it has proven extremely difficult and often impossible to maintain control of one’s destiny in an economically sustainable manner over the long-term with proprietary computing standards dominating enterprise architecture
    • Trade secrets, IP, IC, and differentiation are very difficult to secure when relying on consultants who represent competitors in large proprietary ecosystems
    • Lock-in and high maintenance fees are enabled primarily by proprietary standards and lack of interoperability
    • Open source is not at all the same as independent standards, nor does it necessarily improve adaptive computing or TCO
    • Independent standards bodies are voluntary in most of the world, slow to mature, and influenced by ideology and interests within governments, academia, industry, and IT incumbents
    • The commoditization challenge and the need for adaptive computing are similar under ubiquitous computing, regardless of the type of standard

Press release on our enterprise pilot program


We decided to expand the reach of our enterprise pilot program through the electronic PR system, so I issued the following BusinessWire release today:

Kyield Announces Pilot Program for Advanced Analytics and Big Data with New Revolutionary BI Platform 

“We are inviting well-matched organizations to collaborate with us in piloting our break-through system to bring a higher level of performance to the information workplace,” stated Mark Montgomery, Founder and CEO of Kyield. “In addition to the significant competitive advantage exclusive to our pilot program, we are offering attractive long-term incentives free from lock-in, maintenance fees, and high service costs traditionally associated with the enterprise software industry.”

Regards, MM

Strategic IT Alignment in 2012: Leverage Semantics and Avoid the Caretaker


A very interesting development occurred on the way to the neural network economy: The interests of the software vendor and the customer diverged, circled back and then collided, leaving many executives stunned and confused.

The business model in the early years of software was relatively simple. Whether an individual or enterprise, if the customer didn’t adopt the proprietary standard that provided interoperability, the customer was left behind and couldn’t compete—a no brainer—we all adopted. By winning the proprietary standard in any given software segment, market leaders were able to deliver amazing improvements in productivity at relatively low cost while maintaining some of the highest profit margins in the history of business. This model worked remarkably well for a generation, but as is often the case technology evolved more rapidly than business models and incumbent cultures could adapt, so incumbents relied on lock-in tactics to protect the corporation, profit, jobs, and perhaps in some cases national trade.

Imagine the challenge of a CEO today in a mature, publicly traded software company with a suite of products generating many billions of dollars in profits annually. In order to continue to grow and keep the job, the CEO would need to either rediscover the level of innovation of the early years (as very few have been able to do), play favorites by providing some customers with competitive advantage and others with commodities (which does occur in the enterprise market, but is risky), or focus on milking the commoditized market power in developed nations while pushing for growth in developing countries. The latter has been the strategy of choice for most mature companies, of course.

Doing all of the above simultaneously is nearly impossible. Killer apps by definition must cannibalize cash cows, and public company CEOs have a fiduciary responsibility to optimize profits while mitigating risk, so most CEOs in this position choose to remain 'dairy farmers' either until retirement or until forced to change by emergent competition. In discussing one such incumbent recently with one of the leading veterans in IT, he described such a CEO as "the caretaker". For enterprise customers, this type of caretaker can be similar to the one we hired a few years ago to protect our interests when we moved to the Bay Area: we returned to a property that was uninhabitable after being assured that 'all is fine' (beware of the caretaker wolf in sheep's clothing).

Now consider that software exports generate large, efficient engines for importing currency into headquarters countries, placing those national governments in short-term strategic alignment with the incumbents (often dictated by short-term politics), and another dimension appears that is rarely discussed yet strongly influences organizations worldwide. This situation can lead governments to protect and reinforce the perceived short-term benefits of commoditized market leaders over the critical long-term needs of organizations, markets, and economies. It is not inaccurate to suggest that national security is occasionally misunderstood and/or misused in the related policy decisions.

Monopoly cultures think and act alike, whether in the public or private sector, which is often (eventually) their undoing, unless of course they adopt intentional continuous improvement. This is why creative destruction is so essential, has been embraced internally by most progressive organizations in some form, and why customers should proactively support innovators and farm markets towards sustainable diversity. Despite what may appear to be the case, the interests of incumbents in enterprise software are often directly conflicting with the interests of the customer.

While the theory of creative destruction has roots in Marxism, the practice is a necessity for capitalism (or any other ism) today due to the natural migration of cultures and economies to seek security and protection, which in turn takes us away from the discipline required for continual rejuvenation. We embrace creative destruction in what has become modern global socialism simply because very little innovation would emerge otherwise. Competitive advantage for organizations cannot exist in rigid commoditization of organizational systems as we see in software. Simply put—whether at the individual, organizational, or societal level, we should embrace creative destruction for long-term survival, especially in light of our current unsustainable trajectory.

Which brings us to the present day emergent neural network economy. In our modern network economy we simply must have interoperable software and communications systems. The global economy cannot function properly otherwise, so this is in everyone’s interest, as I have been saying for 15 years now. The overpowering force of the network effect would place any proprietary standard in an extortion position to the entire global economy in short order. The current danger is that functional global standards still do not exist while national interests can align perfectly in the short-term with proprietary standards. That is not to say, however, that proprietary languages and applications should not be encouraged and adopted—quite the contrary—open source suffers similar challenges as standards in terms of competitive differentiation. Rather, it only means that proprietary technologies cannot become the de facto standard in a network economy.

Peering into the future from my perch in our small private lab and incubator in the wilds of northern Arizona more than 15 years ago, the need for standardized structured data became evident, as did the need for easily adaptable software systems that manage relationships between entities. Combined with a data explosion that seemed infinite, it was also obvious that filters would be required to manage the quality and quantity of data based on the profiles of entities. The platform would need to be secure, not trackable for many applications, reflect the formal relationships between entities, and set the foundation for accountability, the rule of law, and sustainable economics. In addition, in order to allow and incentivize differentiation beyond the software programmer community, thus permitting market economics to function, the neural network economy would require adaptability similar to that found in the natural, physical world.
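
As a purely illustrative sketch of the profile-driven filtering described above (the data structures, field names, and thresholds below are hypothetical, not the Kyield design), each entity carries a profile that bounds both the quality and the quantity of data it receives:

```python
# Hypothetical sketch of profile-driven filtering: the quality and quantity of
# data each entity receives is governed by that entity's own profile.
# All class names, fields, and thresholds are illustrative.
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class EntityProfile:
    name: str
    topics: Set[str] = field(default_factory=set)  # what the entity cares about
    min_quality: float = 0.5                       # quality floor (0..1)
    daily_limit: int = 20                          # quantity cap

@dataclass
class DataItem:
    topic: str
    quality: float   # e.g., source reputation combined with structural checks
    payload: str

def filter_for_entity(profile: EntityProfile, items: List[DataItem]) -> List[DataItem]:
    """Return at most daily_limit items matching the profile's topics and quality floor."""
    relevant = [i for i in items
                if i.topic in profile.topics and i.quality >= profile.min_quality]
    relevant.sort(key=lambda i: i.quality, reverse=True)
    return relevant[:profile.daily_limit]

if __name__ == "__main__":
    group = EntityProfile("research-group", topics={"diabetes", "ALS"},
                          min_quality=0.7, daily_limit=3)
    stream = [
        DataItem("diabetes", 0.9, "peer-reviewed study"),
        DataItem("diabetes", 0.4, "unverified forum post"),
        DataItem("ALS", 0.8, "clinical trial update"),
        DataItem("sports", 0.95, "high quality but off-topic"),
    ]
    for item in filter_for_entity(group, stream):
        print(item.topic, item.quality, item.payload)
```

The point of the sketch is only that quality and quantity are governed by the entity's own profile rather than by the source or the vendor.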

I suggest, then, that while nascent and imperfect, semantics is the preferred method to achieve alignment of interests in the emergent neural network economy, for it represents the building blocks in structured data for meaning in the digital age, resting at the confluence of human and universal languages and serving as the functional portal to the neural network economy.

Finally, as the humble founder and inventor, permit me to suggest that Kyield is the optimal system for managing semantics, as it intentionally provides the elements necessary for organizations to align and optimize their digital assets with the mission of the organization, including adaptable tools to manage the relationships between entities, with and between each individual and workgroup.

May 2012 deliver more meaning to you, your organization, and by extension our collective future.

Unacceptably High Costs of Data Silos


Above is a screen capture of an internal Kyield document illustrating the high costs of data silos to individual organizations, regions, and society, based on actual cases we have studied; in some cases drawing on public information and in others on private, confidential information. It is intended for a slide-show style presentation, so it does not go into great detail. Suffice it to say that human suffering, lives lost (human and otherwise), and wars that could have been prevented but were not are inseparably intertwined with economics and ecology, which is why I have viewed this issue for 15 years as ultimately one of sustainability, particularly when considering the obstacles silos pose to scientific discovery, innovation, and learning, as well as to crisis prevention.

 
Mark Montgomery
Founder & CEO
Kyield
http://www.kyield.com

Is social networking for the enterprise a revolution or ruse?


A slightly edited comment in response to a good article from Knowledge@Wharton: Is Business-centric Social Networking a Revolution — or a Ruse? (via http://eponymouspickle.blogspot.com/).

****~~~~****

Most professors I know are using Twitter, not for the consumer noise most are familiar with, but to share information easily and quickly. Similarly, on Facebook the most intelligent people I know have the most friends. It's a more functional replacement for the email that bogged us all down, a point that isn't discussed here and should have been.

The author also misses several other points in this otherwise good piece, not least of which is the need to share information with customers and partners, monitor feedback, and, importantly, introduce advanced analytics that wouldn't be available otherwise.

I am not kind to the hype in software, and consumer social is absolutely in bubble territory, but we need to remind ourselves that collaboration (indeed, computing itself) was originally designed for the workplace. That the enterprise market relies primarily on consumer innovation (not just in collaboration or social; look at Apple) speaks to the dysfunction of organizational cultures more than to how humans work best together, and with which tools.

Truth be known, and few if any have studied this issue as long or in as much detail as I have:

  • boards are often the last to know that their companies are failing
  • true meritocracy doesn't exist in most organizations due to lack of structured information (meaning false assumptions and/or missing information dominate)
  • the interests of the knowledge workers and organizations are often misaligned, which can be improved substantially with properly structured data
  • virtually all human-caused crises can be prevented with properly structured and designed networks, including many multi-billion dollar cases we've identified in private companies in the past five years
  • large bureaucratic enterprises are among the least innovative organizations on the planet, due primarily to cultures that protect and defend instead of working together to improve, smothering the creativity of individuals in the process

Yes, consumer social software is primitive for enterprise use. Yes, it is largely a waste of time to use tools designed for different purposes, but it’s equally true that most don’t understand how to use the tools. No, the functionality needed in the enterprise is not simply an incremental evolution of existing platforms.

It’s best for leaders of large organizations to seek revolutionary improvement given the extremely poor performance of many in crisis prevention and innovation.

Here is a generic short video on structured data for the enterprise that provides a summary of why revolutionary change is needed. Those silos exist in large part due to incompatible languages and the high costs of integration in legacy systems. If you want to learn about hype in software, and the damage done to customers and society, take a look at lock-in and its relationship to voluntary data standards, comparing it, say, to interoperability in the electric grid or plumbing infrastructure.
http://semanticweb.com/semantic-web-elevator-pitch-for-enterprise-decision-makers_b17651

This article clearly supports the status quo. How’s that working; and for whom?

Either organizations are about people or they aren’t. If they are, as I am absolutely certain, then we’ve just scratched the surface of what is already possible, and those who refuse to adopt technologies that provide a tailored competitive advantage will continue to suffer the consequences. That has been true since the invention of the wheel, if not before. And all along the way some Neanderthals have successfully convinced their tribes that starting fires is a waste of time, or learning how to use a microscope was too laborious given the obvious fact that their current tools (eyes) worked just fine.

Now excuse me while I click the Twitter button above and share this article with chief influencers and on LinkedIn, which includes all manner of Wharton's customers.

Mark Montgomery
Founder & CEO
Kyield

Semantic enterprise elevator pitch (2 min video)


How can information technology improve health care?


I recall first asking this question in leadership forums in our online network in 1997, hoping that a Nobel laureate or Turing Award winner might have a quick answer.  A few weeks earlier I had escorted my brother Brett and his wife from Phoenix Sky Harbor airport to the Mayo Clinic in Scottsdale, seeking a better diagnosis than the three-year death sentence he had just received from a physician in Washington. Unfortunately, Mayo Clinic could only confirm the initial diagnosis for Amyotrophic lateral sclerosis (ALS).

In my brother’s case, the health care system functioned much better than did the family; it was the dastardly disease that required a cure, along with perhaps my own remnant hubris, but since his employer covered health care costs we were protected from most of the economic impact. I then immersed myself in life science while continuing the experiential learning curve in our tech incubator. It soon became apparent that solving related challenges in research would take considerably longer than the three years available to my brother, his wife, and their new son. Close observation of health care has since revealed that research was only part of the challenge.

Symptoms of an impending crisis

During my years in early stage venture capital, symptoms of future economic crisis in health care appeared in several forms, including:

  • R&D failed to consider macro economics
  • Technology that increased costs was most likely to be funded and to succeed
  • Technology that decreased costs was often unfunded and/or not adopted
  • Cultural silos in scientific disciplines were entrenched as effective guilds
  • Professional compensation packages were growing rapidly
  • Regulatory bureaucracy was devolving
  • The valley of death was expanding rapidly
  • The trajectories of health care costs and customer means were crossing in an opposing X formation

Over the course of the following decade, while observing my father's experience with diabetes (including the billing), it became obvious that few stakeholders in the life science and health care ecosystem had a financial incentive to preserve the overall system, meaning the challenge was classically systemic. Clayton Christensen sums up the situation in health care succinctly: "clearly, systemic problems require systemic solutions."

12 years to design an answer

When looking at the challenges within information technology and health care first as individual systems, and then combined as a dynamic integrated system, we came to several conclusions that eventually led to the design of our semantic healthcare platform.

Ten essentials:

  1. Patients must manage their own health, including data
  2. Universal computing standards; likely regulated in health care
  3. A trustworthy organization and architecture
  4. Simple to use for entry level skills
  5. Unbiased, evidence-based information and learning tools
  6. Highly structured data from inception
  7. High volume data meticulously synthesized from inception
  8. Integrated professional and patient social networking
  9. Mobile to include automation, analytics, and predictive
  10. Anonymous data should be made available to researchers (see the sketch after this list)
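
As a small, purely hypothetical illustration of essentials 6 and 10 (the field names, salt, and hashing scheme below are invented for the example and fall well short of a full de-identification standard such as HIPAA Safe Harbor), a patient record can be structured from inception and exposed to researchers only as a pseudonymized view:

```python
# Hypothetical sketch for essentials 6 and 10: a patient record structured from
# inception, with a pseudonymized view for researchers. Field names and the salt
# are invented; real de-identification requires much more than this.
import hashlib
import json

def research_view(record: dict, salt: str = "replace-with-a-secret-salt") -> dict:
    """Drop direct identifiers and replace the patient ID with a salted pseudonym."""
    pseudonym = hashlib.sha256((salt + record["patient_id"]).encode()).hexdigest()[:16]
    shared = {k: v for k, v in record.items()
              if k not in ("patient_id", "name", "address")}
    shared["pseudonym"] = pseudonym
    return shared

record = {
    "patient_id": "P-1001",
    "name": "Jane Doe",
    "address": "123 Main St",
    "condition": "type 2 diabetes",
    "hba1c": 7.2,                # structured, typed value instead of free text
    "measured_on": "2012-05-01",
}

print(json.dumps(research_view(record), indent=2))
```

Real anonymization requires far more care (dates, rare conditions, and locations can re-identify a patient), but the structural point stands: when the record is structured from inception, a shareable research view is a projection rather than a re-engineering project.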

Probable benefits

While we must build, scale, and evaluate our system to confirm and measure our predictions, we anticipate measurable benefits to multiple stakeholders, including:

  • Higher levels of participation in preventative medicine
  • More timely and accurate use of diagnostics
  • Reduced levels of unnecessary hospital visitation
  • Higher levels of nutrition
  • Lower levels of over-prescription
  • Reduced levels of human error
  • Improved physician productivity
  • Reduced overall cost of health care
  • Much greater use of analytics and predictive algorithms
  • Expedited paths to discovery for life science researchers

In May of 2010, we unveiled our semantic health care platform with a companion diabetes use case scenario, which is written in a storytelling format and is freely available as a PDF.

Realizing the Theory: Yield Management of Knowledge


Since I have now responded to a related USPTO elections/restriction letter, I feel a bit more freedom in sharing additional thoughts on the underlying theory that has served as the foundation of Kyield:

Yield Management of Knowledge: The process by which individuals and organizations manage the quality and quantity of information consumption, storage, and retrieval in order to optimize knowledge yield.–(me) 

(Please see this related post prior to attempting to categorize)

Background: The cyber lab experiment

The theory emerged gradually over several years of hyper-intensive information management in my small lab on our property in northern Arizona (we are currently based in Santa Fe after a year in the Bay Area). The experimental network, called GWIN (Global Web Interactive Network), was designed after several years of high-intensity work in our incubator, which in turn followed a decade in consulting that I also drew from. The GWIN product was entirely unique and intentionally designed to test the bleeding edge of what was then possible in computer and social sciences. We were particularly interested in filtering various forms of digitized intelligence worldwide as quality sources came online, converting it to useful knowledge, weaving it through academic disciplines, and then mixing it with professional networking.

The network was open to anyone, but it soon became a sort of online version of the World Economic Forum, with quite a few of the same institutions and people, although our humble network, even in nascent form, was broader, deeper, and larger, with less elitism, and therefore more effective in some ways.

Our first computer lab and office

I was quite proud that membership was based primarily on interest, effort, and intellectual contributions, not on social status, guilds, political views, market power, or wealth, even if those were the norm among our membership.

My late partner and friend Russell Borland and I learned a great deal from GWIN, as did many of our members and those who followed our work closely. The first thing one should understand is that while we worked with various teams of remote programmers to build products, and served thousands of people worldwide daily who provided about half of the content, I operated the lab solo onsite. Given the volume, work hours, lab efficiencies, and short commute, I was likely consuming as much data personally as any other human, which is fundamental to the construct of the theory: how the brain functions when dealing with overload, how humans interact with computers, and what tools, languages, and architectures are needed in order to optimize knowledge yield.

Need to emphasize data quality, not quantity

The vast majority of solutions for improved decision making in the networked era have been computing versions of HazMat crews attempting to clean up the toxic waste resulting from information overload. Reliance on the advertising model for the consumer Web created a system design essentially requiring lower quality in a populist manner, aided and abetted by search and then social networking.  While the advertising model is certainly appropriate for many forms of low cost entertainment, for serious learning and important decision making envisioned in yield management of knowledge, an ounce of prevention in the form of logically structured data is worth potentially far more than a ton of cure.

It became obvious very early in our lab (1996) that the world needed a much more intelligently structured Web and Internet, for consumers as well as the enterprise. Studying search engines closely at the earliest stages, we saw that they were by necessity applying brute computing force and clever algorithms, and exploiting content providers, in an attempt to deal with the unprecedented explosion of data and noise while providing the returns investors needed for such risk. What we really needed, of course, was logically structured data controlled by data owners and providers, which would then (and only then) provide the opportunity for knowledge yield. Further, the languages used for the structure must be non-proprietary, due to the overwhelming global market power the network effect would otherwise hand to the winner.
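
As one small, hypothetical illustration of logically structured data in a non-proprietary language (the vocabulary, organization names, and URIs below are invented for the example), a few facts about an entity can be expressed in RDF, an open W3C model, and serialized with the rdflib library:

```python
# Hypothetical sketch: facts about an entity expressed in RDF, an open W3C
# standard, instead of a proprietary format. Requires `pip install rdflib`;
# the URIs and the ex: vocabulary are invented for the example.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

FOAF = Namespace("http://xmlns.com/foaf/0.1/")
EX = Namespace("http://example.org/vocab/")

g = Graph()
g.bind("foaf", FOAF)
g.bind("ex", EX)

org = URIRef("http://example.org/org/acme-research")
g.add((org, RDF.type, FOAF.Organization))
g.add((org, FOAF.name, Literal("Acme Research")))
g.add((org, EX.partnerOf, URIRef("http://example.org/org/another-org")))

# Any RDF-aware tool can read this serialization without vendor-specific
# integration work, and the data owner keeps control of the graph.
print(g.serialize(format="turtle"))
```

Data locked in a proprietary binary format would require the vendor's own tools to read; here, the same triples remain usable by the owner regardless of which applications come and go.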

Need for independent standards

In the enterprise market, proprietary languages can and do thrive internally, but the integration required to share data with essential external partners is similar to the brute force applied in search: crisis clean-up rather than prevention, complete with disempowerment of the customers who create and own the data. Most organizations increasingly rely on shared data, whether for regulatory purposes or partnerships, even if private and encrypted, so proprietary data languages are not well aligned to the enterprise in the increasingly networked global economy.

Finally, there are fundamental and unavoidable conflicts between large public companies that dominate markets with proprietary languages, their fiduciary duty, and the minimal sustainable requirements of our globally networked economy. A few examples of these conflicts can be clearly observed today in the failure to deal effectively with network security, personal privacy, protection of intellectual property, and information overload. Evidence of the challenge can also be observed (and felt by millions of people) in economics, where the policies of multinationals favor the largest emerging markets due to market access. Of course, the lack of functioning governance of what has become an essential global medium empowers these phenomena.

It is my personal position that the intellectual competition should be intentionally focused on optimal use of the data to achieve the mission of the customer (whether individual consumer or enterprise), not on protectionism, and that vendors should be caretakers of data on behalf of data owners, which requires a different economic model than the free, ad-supported model of the consumer Web.

So in order to realize the goal of the theory, we really needed a much more intelligent and highly structured Internet and Web based on independent languages, in a similar manner to the underlying protocols (TCP/IP and HTTP), and not supported by advertising alone.

I am speaking here of data communication in global networks, not individual applications. If we had the former we need not worry about the latter, at least in the context of network dynamics.

A word of caution on standards

A ‘Semantic Web’, which should make this possible, has yet to emerge, but when it does, the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines.– Tim Berners-Lee, 1999

One of the most significant risks with independent universal standards is unintended consequences. While the stated vision of most involved is more efficiency, transparency, empowerment of individuals and organizations, less bureaucracy, and lower costs, the nature of universal languages favors organizations that control data. One of the primary challenges in Web 1.0 and 2.0 has been data silos and private sector exploitation of data owned by others, driven largely by the revenue model. The primary challenge of Web 3.0 and 4.0 could be increased government control at the expense of individual liberty and private sector jobs, or perhaps worse: a public/private duopoly or oligopoly. From the perspective of an entrepreneur attempting to create jobs, I see such risk increasing daily.

Introducing Mawthos

Louis V. Gerstner, Jr. was perhaps most responsible for moving software towards a service model in his turnaround of IBM in the mid-1990s, which was a brilliant business strategy for IBM at that time (we exchanged letters on the topic in that era), but it has not been terribly successful as a model for creative destruction; rather, it has primarily exchanged one extortion model (proprietary code) for another (a combination of proprietary code, consulting, and open source). Unlike a giant turnaround, we were focused on more revolutionary models that provided value where none existed previously, so our first effort was an ASP (Application Service Provider), a model that emerged in the mid-1990s. In 2001, this paper by the SIIA is credited with defining and popularizing SaaS (Software as a Service), which has more recently evolved into an 'on demand' subscription model that often consists of bare-bones software apps like those developed for smartphones.

While I have been a big proponent of a cultural shift in software towards service, I have rarely been a proponent of the results sold under the banner of service in the software industry, recognizing a shift in promotion and revenue modeling rather than in culture. In reviewing this article I recalled many public and private discussions through the years debating the misalignment of interests between vendors and the missions of customers, so I thought I would introduce yet another acronym: Mawthos (Mission accomplished with the help of software). It is a slight jab at our acronym-fatigued environment, while attempting to describe a more appropriate posture and role for enterprise software, and the philosophy necessary to realize the theory of Yield Management of Knowledge.

Diabetes and the American Healthcare System


I am pleased to share our just-completed healthcare use case scenario, written in storytelling format.

We selected diabetes mellitus (type 2) as a scenario to demonstrate the value of the Kyield platform to healthcare. Given the very high current cost of healthcare in the U.S. and its unsustainable economic trajectory, it's essential that costs be driven lower while care improves. Type 2 diabetes has direct costs exceeding $200 billion annually in the U.S. alone, the majority of which are entirely preventable.

The most obvious way to overcome this significant challenge is with far more intelligent HIT systems. It is not surprising that the legislation appears to be well matched to the Kyield PaaS, nor is it entirely accidental, as our mission aligns well with the needs of healthcare; it is the result of an R&D process that began more than a decade ago.

This was a challenging scenario to develop and write due to the complexity of the disease, the large body of regulations, incomplete standards, and the varying interests of the partners in the ecosystem. A bit of extra personal motivation for me was that my father died a few years ago from complications of diabetes, which was diagnosed shortly after my brother died of ALS. Ever since the shocking phone call from my brother informing me of his "death sentence" in the summer of 1997, I have followed ALS research; it is among the most complex and brutal of diseases.

Type 2 diabetes is also complex, but unlike ALS and many other diseases, it is largely preventable with a relatively modest change in behavior and lifestyle; modest indeed, particularly compared to the late-stage effects of the disease in the absence of prevention, which we highlight in this use case. After watching my father's disease progress for a decade, it is difficult to understand why anyone would not want to prevent diabetes; it literally destroys the human body.

I hope you find the case interesting and valuable. I am confident that if followed in a similar path as outlined in this scenario, the platform will contribute to significantly more effective prevention and healthcare delivery at a lower cost.

Diabetes Use Case Scenario (PDF)

Mark Montgomery