Kyield Partner Program: An Ecosystem for the Present Generation


We’ve been in various discussions with potential partners for the past couple of years, finding many interests in the industry to be in direct conflict with the mission of customer organizations. Most of these discussions reflect a generational change in technology underway that is being met with sharp internal divisions at incumbent market leaders, divisions that result in attempts to protect the past from the present and future. The situation is not new: it is very difficult for incumbent market leaders to voluntarily cannibalize highly profitable business models, particularly when doing so would threaten jobs, so disruptive technology has historically required new ecosystems (Microsoft and Intel in the early 1980s, Google in the late 1990s, RIM and the Apple app store in the 2000s, etc.).

Given the platform approach of Kyield, the disruptive nature of our technology to incumbent business models, and the resistance to change among industry leaders despite pressure from even their largest customers, we determined quite some time ago that it may require building a new ecosystem based on the Kyield platform, data standards, and interoperability. The driving need is not just the enormous sums charged for unproductive functionality such as lock-in, maintenance fees, and unnecessary software integration (an ever-increasing problem for customers), nor even commoditization and the lack of competitive advantage (also a very serious problem); rather, as is often the case, it comes down to a combination of complexity, math, and physics.

Not only is it economically unviable to optimize network computing in the neural network economy on legacy business models, but without data standards and seamless interoperability we cannot prevent systemic crises, dramatically improve productivity, or expedite discovery in a manner that doesn’t bankrupt a good portion of the economy. We do need deep intelligence on each entity, as envisioned in the semantic web, in order to overcome many of our greatest challenges, execute advanced analytics, and manage ‘big data’ in a rational manner; but we also need to protect privacy, data assets, knowledge capital, and property rights while improving security. Standards may be voluntary, but overcoming these challenges isn’t.

So we’ve been working on a new partner program for Kyield, in conjunction with our pilot phase, in an attempt to reach prospective partners who may not be on our radar but who would make great partners and work with us to build a new ecosystem based not on protecting the past or the current management of incumbent firms, but on the future, by optimizing the present capabilities and opportunities surrounding our system. In the process, we hope to collectively create a great many more new jobs than we possibly could on our own, not just in our ecosystem but, importantly, in customer ecosystems.

We’ve decided for now on five different types of partner relationships that are intended to best meet the needs of customers, partners, and Kyield:

  • Consulting
  • Technology
  • Platform
  • Strategic
  • Master

For more information on our partner program, please visit:

http://www.kyield.com/partners.html

Thank you,

Mark Montgomery
Founder, Kyield

Five Essential Steps For Strategic (adaptive) Enterprise Computing


Given the spin surrounding big data, duopoly deflection campaigns by incumbents, and a culture of entitlement across the enterprise software ecosystem, the following five briefs are offered to provide clarity for improving strategic computing outcomes.

1)  Close the Data Competency Gap

Much has been written in recent months about the expanding need for data scientists, which is true at this early stage of automation, yet very little is said in public about the prerequisite learning curve for senior executives, boards, and policy makers.

Data increasingly represents all of the assets of the organization, including intellectual capital, intellectual property, physical property, financials, supply chain, inventory, distribution network, customers, communications, legal, creative, and all relationships between entities. It is therefore imperative to understand how data is structured, created, consumed, analyzed, interpreted, stored, and secured. Data management will substantially impact the organization’s ability to achieve and manage the strategic mission.
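To make this concrete, below is a minimal sketch in Python of how an organizational asset might be modeled as a structured, auditable record. All names, fields, and values here are hypothetical illustrations for the sketch, not a description of Kyield’s data model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical illustration of an organizational asset as a structured,
# auditable record; fields and values are invented for the sketch.

@dataclass
class AssetRecord:
    asset_id: str                       # unique identifier
    category: str                       # e.g., "intellectual_property"
    owner: str                          # accountable person or workgroup
    classification: str = "internal"    # security/privacy label
    relationships: list[str] = field(default_factory=list)  # linked entities
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

ip_filing = AssetRecord(
    asset_id="ip-2012-0042",
    category="intellectual_property",
    owner="workgroup:r-and-d",
    classification="confidential",
    relationships=["person:j.doe", "legal:patent-counsel"],
)
print(ip_filing)
```

Even a toy record like this makes the questions above concrete: who creates it, who consumes it, how it is secured, and how it relates to other entities.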

Fortunately, many options exist for rapid advancement in understanding data management ranging from off-the-shelf published reports to tailored consulting and strategic advisory from individuals, regional firms, and global institutions. A word of caution, however—technology in this area is changing rapidly, and very few analysts have proven able to predict what to expect within 24-48 months.

Understanding Data Competency:

    • Data scientists are just as human as computer scientists or any other type of scientist
    • A need exists to avoid exchanging software-enabled silos for ontology-enabled silos
    • Data structure requires linguistics, analytics requires mathematics, human performance requires psychology, and prediction requires modeling: success requires a mega-disciplinary perspective

2)  Adopt Adaptive Enterprise Computing

A networked computing workplace environment that continually adapts to changing conditions based on the specific needs of each entity – MM 6.7.12

While computing has achieved a great deal for the world during the previous half-century, the short-term gain became a long-term challenge as ubiquitous computing was largely a one-time, must-have competitive advantage that everyone needed to adopt or be left behind.  It turns out that creating and maintaining a competitive advantage through ubiquitous computing within a global network economy is a much greater challenge than initial adoption.

A deep misalignment of interests now exists between customer entities, which need differentiation in the marketplace to survive, and much of the IT industry, which needs to maintain scale by replicating precisely the same hardware and software worldwide.

When competitors all over the world are using the same computing tools for communications, operations, transactions, and learning, yet have a dramatically different cost basis for everything else, the region or organization with a higher cost basis will indeed be flattened with economic consequences that can be catastrophic.

This places an especially high burden on companies located in developed countries like the U.S. that are engaged in hyper-competitive industries globally while paying the highest prices for talent, education and healthcare—highlighting the critical need to achieve a sustainable competitive advantage.

Understanding adaptive enterprise computing:

    • Adaptive computing for strategic advantage must encompass the entire enterprise architecture, which requires a holistic perspective
    • Adaptive computing is strategic; commoditized computing isn’t, and should instead be viewed as entry-level infrastructure
    • The goal should be to optimize intellectual and creative capital while tailoring product differentiation for a durable and sustainable competitive advantage
    • Agile computing is largely a software development methodology while adaptive computing is largely a business strategy that employs technology for managing the entire digital work environment
    • The transition to adaptive enterprise computing must be step-by-step to avoid operational disruption, yet bold to escape incumbent lock-in

3)  Extend Analytics to Entire Workforce

Humans represent the largest expense and risk to most organizations, so technologists have had a mandate for decades to automate processes and systems that either reduce or replace humans. This economic theory is greatly misunderstood, however. The idea is to free up resources for re-investment in more important endeavors, which historically has employed the majority of people; but in practice the theory depends on long-term, disciplined monetary and fiscal policy that favors investment in new technologies, products, companies, and industries. When global automation is combined with an environment that doesn’t favor re-investment in new areas, as we’ve seen in recent decades, capital sits on the sidelines or is employed in speculation that creates destructive bubbles, a combination that results in uncertainty and chronically high unemployment.

However, while strategic computing must consider all areas of cost competitiveness, it’s also true that most organizations have become more skilled at cost containment than at human systems and innovation. As we’ve observed consistently in recent years, the result is that many organizations have failed to prevent serious or fatal crises, failed to seize opportunities, and failed to remain innovative at competitive levels.

While macroeconomic conditions will hopefully improve with time, the important message for decision makers is that the untapped potential in human performance analytics that can be captured with state-of-the-art systems today is several orders of magnitude greater than what traditional supply chain or marketing analytics alone can deliver.

Understanding Human Performance Systems:

    • Improving human performance systems improves everything else
    • The highest potential ROI to organizations today hasn’t changed in a millennium: engaging humans in a more competitive manner than the competition
    • The most valuable humans tend to be fiercely protective of their most valuable intellectual capital, which is precisely what organizations need, requiring deep knowledge and experience for system design
    • Loyalty and morale are low in many organizations due to poor compensation incentives, frequent job change, and misaligned motivation with employer products, cultures and business models
    • Motivation can be fickle and fluid, varying a great deal between individuals, groups, places, and times
    • For those who may have been otherwise engaged—the world went mobile

4)  Employ Predictive Analytics

In our increasingly data-rich world, an organization need not grow much beyond its founders before it requires effective data management designed to achieve a strategic advantage with enterprise computing. Indeed, success or failure has often depended on converting an early agile advantage into a more mature, adaptive environment and culture. Among organizations that survive beyond the average life expectancy, many cultures change only after a near-death experience triggered by becoming complacent, rigid, or simply entitled to something the customer no longer wanted: reasons enough for almost any company to adopt analytics.

While the need for more accurate predictive abilities is obvious for marketers, it is no less important for risk management, investment, science, medicine, government, and most other areas of society.

Key elements that impact predictive outcomes (a minimal sketch follows the list):

    • Quality of data, including integrity, scale, timeliness, access, and interoperability
    • Quality of algorithms, including design, efficiency, and execution
    • Ease of use and interpretation, including visuals, delivery, and devices
    • How predictions are managed, including verification, feedback loops, accountability, and the decision chain
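To ground these four elements, here is a deliberately toy sketch in Python that maps each one onto a step of a pipeline: a data quality gate, a simple algorithm, an interpretable output, and a managed feedback loop. Every function name, threshold, and number is a hypothetical illustration, not a description of any particular product.

```python
import statistics

# A toy pipeline mapping the four elements above onto code.
# All names, thresholds, and numbers are hypothetical illustrations.

def quality_check(series):
    """Element 1: data quality. Drop records that fail integrity checks."""
    return [x for x in series if x is not None and x >= 0]

def predict_next(series, window=3):
    """Element 2: algorithm quality. A deliberately simple moving average."""
    recent = series[-window:]
    return sum(recent) / len(recent)

def interpret(prediction, threshold=100.0):
    """Element 3: ease of interpretation. A decision-ready signal."""
    return "ALERT: above threshold" if prediction > threshold else "normal"

def verify(prediction, actual, log):
    """Element 4: managed predictions. Verification and a feedback loop."""
    log.append({"predicted": prediction, "actual": actual,
                "error": abs(prediction - actual)})
    return statistics.mean(entry["error"] for entry in log)

history = [90.0, None, 95.0, 110.0, 120.0]        # raw data with a bad record
clean = quality_check(history)
prediction = predict_next(clean)
print(interpret(prediction))                       # -> ALERT: above threshold
print(verify(prediction, actual=115.0, log=[]))    # mean absolute error so far
```

The point of the sketch is that the four elements are a chain: a weak link in data quality or feedback accountability degrades the outcome regardless of how good the algorithm is.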

5)  Embrace Independent Standards

Among the most important decisions impacting an organization’s future ability to adapt its enterprise computing to fast-changing external forces, which increasingly determine whether the organization succeeds or fails, is whether to embrace independent standards for software development, communications, and data structure.

Key issues to understand about independent standards:

    • Organizational sovereignty: it has proven extremely difficult and often impossible to maintain control of one’s destiny in an economically sustainable manner over the long term when proprietary computing standards dominate enterprise architecture
    • Trade secrets, IP, IC, and differentiation are very difficult to secure when relying on consultants who represent competitors in large proprietary ecosystems
    • Lock-in and high maintenance fees are enabled primarily by proprietary standards and lack of interoperability
    • Open source is not at all the same as independent standards, nor does it necessarily improve adaptive computing or TCO
    • Independent standards bodies are voluntary in most of the world, slow to mature, and influenced by ideology and interests within governments, academia, industry, and IT incumbents
    • The commoditization challenge and the need for adaptive computing apply to ubiquitous computing regardless of the type of standard

Press release on our enterprise pilot program


We decided to expand the reach of our enterprise pilot program through the electronic PR system, so I issued the following BusinessWire release today:

Kyield Announces Pilot Program for Advanced Analytics and Big Data with New Revolutionary BI Platform 

“We are inviting well-matched organizations to collaborate with us in piloting our break-through system to bring a higher level of performance to the information workplace,” stated Mark Montgomery, Founder and CEO of Kyield. “In addition to the significant competitive advantage exclusive to our pilot program, we are offering attractive long-term incentives free from lock-in, maintenance fees, and high service costs traditionally associated with the enterprise software industry.”

Regards, MM

Strategic IT Alignment in 2012: Leverage Semantics and Avoid the Caretaker


A very interesting development occurred on the way to the neural network economy: The interests of the software vendor and the customer diverged, circled back and then collided, leaving many executives stunned and confused.

The business model in the early years of software was relatively simple. Whether an individual or an enterprise, if the customer didn’t adopt the proprietary standard that provided interoperability, the customer was left behind and couldn’t compete: a no-brainer, so we all adopted. By winning the proprietary standard in any given software segment, market leaders were able to deliver amazing improvements in productivity at relatively low cost while maintaining some of the highest profit margins in the history of business. This model worked remarkably well for a generation, but, as is often the case, technology evolved more rapidly than business models and incumbent cultures could adapt, so incumbents relied on lock-in tactics to protect the corporation, profit, jobs, and perhaps in some cases national trade.

Imagine the challenge of a CEO today in a mature, publicly traded software company with a suite of products generating many billions of dollars in profits annually. In order to continue to grow and keep the job, the CEO would need to either rediscover the level of innovation of the early years (as very few have been able to do), play favorites by providing some customers with competitive advantage and others with commodities (which occurs in the enterprise market but is risky), or focus on milking commoditized market power in developed nations while pushing for growth in developing countries. The latter has been the strategy of choice for most mature companies, of course.

Doing all of the above simultaneously is nearly impossible. Killer apps by definition must cannibalize cash cows, and public company CEOs have a fiduciary responsibility to optimize profits while mitigating risk, so most CEOs in this position choose to remain ‘dairy farmers’ either until retirement or until forced to change by emergent competition. In discussing one such incumbent recently with one of the leading veterans in IT, he described its CEO as “the caretaker”. For enterprise customers, this type of caretaker can be similar to the one we hired a few years ago to protect our interests when we moved to the Bay Area, only to return to an uninhabitable property after messages that ‘all is fine’ (beware the caretaker wolf in sheep’s clothing).

Now consider that software exports generate large, efficient engines for importing currency into headquarters countries, placing those national governments in short-term strategic alignment with the incumbents (often dictated by short-term politics), and an entire additional dimension appears that is rarely discussed yet very strongly influences organizations worldwide. This situation can lead governments to protect and reinforce the perceived short-term benefits of commoditized market leaders over the critical long-term needs of organizations, markets, and economies. It is not inaccurate to suggest that national security is occasionally misunderstood and/or misused in related policy decisions.

Monopoly cultures think and act alike, whether in the public or private sector, which is often (eventually) their undoing, unless of course they adopt intentional continuous improvement. This is why creative destruction is so essential, has been embraced internally by most progressive organizations in some form, and why customers should proactively support innovators and farm markets towards sustainable diversity. Despite what may appear to be the case, the interests of incumbents in enterprise software are often directly conflicting with the interests of the customer.

While the theory of creative destruction has roots in Marxism, the practice is a necessity for capitalism (or any other ism) today due to the natural migration of cultures and economies to seek security and protection, which in turn takes us away from the discipline required for continual rejuvenation. We embrace creative destruction in what has become modern global socialism simply because very little innovation would emerge otherwise. Competitive advantage for organizations cannot exist in rigid commoditization of organizational systems as we see in software. Simply put—whether at the individual, organizational, or societal level, we should embrace creative destruction for long-term survival, especially in light of our current unsustainable trajectory.

Which brings us to the present-day emergent neural network economy. In our modern network economy we simply must have interoperable software and communications systems. The global economy cannot function properly otherwise, so this is in everyone’s interest, as I have been saying for 15 years now. The overpowering force of the network effect would place any proprietary standard in an extortion position over the entire global economy in short order. The current danger is that functional global standards still do not exist, while national interests can align perfectly in the short term with proprietary standards. That is not to say that proprietary languages and applications should not be encouraged and adopted (quite the contrary; open source suffers challenges similar to standards in terms of competitive differentiation), only that proprietary technologies cannot become the de facto standard in a network economy.

Peering into the future from my perch in our small private lab and incubator in the wilds of northern Arizona more than 15 years ago, the need for standardized structured data became evident, as did the need for easily adaptable software systems that manage relationships between entities. Combined with a data explosion that seems infinite, it was also obvious that filters would be required to manage the quality and quantity of data based on the profiles of entities. The platform would need to be secure, not trackable for many applications, reflect the formal relationships between entities, and set the foundation for accountability, the rule of law, and sustainable economics. In addition, in order to allow and incentivize differentiation beyond the software programmer community, thus permitting market economics to function, the neural network economy would require adaptability similar to that which takes place in the natural, physical world.
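As a rough illustration of profile-based filtering (the mechanism only; the entity names, topics, and thresholds below are invented for the sketch and say nothing about Kyield’s internals), consider a stream of data items admitted per entity according to each entity’s profile:

```python
# Invented profiles and data items; the mechanism, not the content, is the point.
ENTITY_PROFILES = {
    "analyst.supply_chain": {"topics": {"logistics", "inventory"}, "min_quality": 0.7},
    "exec.strategy":        {"topics": {"markets", "risk"},        "min_quality": 0.9},
}

def admit(item, profile):
    """Admit an item only if it matches the entity's topics and quality bar."""
    return item["topic"] in profile["topics"] and item["quality"] >= profile["min_quality"]

stream = [
    {"topic": "logistics", "quality": 0.8, "body": "port delays update"},
    {"topic": "markets",   "quality": 0.6, "body": "unverified rumor"},
]

for entity, profile in ENTITY_PROFILES.items():
    inbox = [item["body"] for item in stream if admit(item, profile)]
    print(entity, "->", inbox)
```

Each entity receives only what its profile admits: the analyst sees the logistics update, while the low-quality rumor never reaches the executive.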

I suggest, then, that while nascent and imperfect, semantics is the preferred method for achieving alignment of interests in the emergent neural network economy, for it provides the building blocks of meaning in structured data for the digital age, resting at the confluence of human and universal languages and serving as the functional portal to the neural network economy.

Finally, as the humble founder and inventor, permit me to suggest that Kyield is the optimal system for managing semantics, as it intentionally achieves the necessary elements for organizations to align and optimize their digital assets with the mission of the organization, with adaptable tools to manage the relationships between entities, including with and between each individual and workgroup.

May 2012 deliver more meaning to you, your organization, and by extension our collective future.

Unacceptably High Costs of Data Silos


Above is a screen capture of an internal Kyield document illustrating the high costs of data silos to individual organizations, regions, and society, based on actual cases we have studied; in some cases drawing on public information and in others on private, confidential information. It is intended for a slide-show style of presentation, so it does not go into great detail. Suffice it to say that human suffering, lives lost (human and otherwise), and wars that could have been prevented but were not are inseparably intertwined with economics and ecology, which is why I have viewed this issue for 15 years as ultimately one of sustainability, particularly when considering the obstacles silos pose to scientific discovery, innovation, and learning, as well as to crisis prevention.

 
Mark Montgomery
Founder & CEO
Kyield
http://www.kyield.com

Is social networking for the enterprise a revolution or ruse?


A slightly edited comment in response to a good article from Knowledge@Wharton: Is Business-centric Social Networking a Revolution — or a Ruse? (via http://eponymouspickle.blogspot.com/).

****~~~~****

Most professors I know are using Twitter, not for the consumer noise most are familiar with, but rather to share information easily and quickly. Similarly, on Facebook the most intelligent people I know have the most friends. It’s a more functional replacement for the email that bogged us all down, a point that isn’t discussed here and should have been.

The author also misses several other points in this otherwise good piece, not least of which is the need to share information with customers and partners, monitor feedback, and, importantly, introduce advanced analytics that wouldn’t be available otherwise.

I am not kind to the hype in software, and consumer social is absolutely in bubble territory, but we need to remind ourselves that collaboration (indeed, computing) was originally designed for the workplace. That the enterprise market is relying primarily on consumer innovation (not just in collaboration or social; look at Apple) speaks to the dysfunction of organizational cultures more than to how humans work best together, and with which tools.

Truth be known, and few if any have studied this issue as long or in as much detail as I have:

  • boards are often the last to know that their companies are failing
  • true meritocracy doesn’t exist in most organizations due to lack of structured information (meaning false assumptions and/or missing information dominates)
  • the interests of the knowledge workers and organizations are often misaligned, which can be improved substantially with properly structured data
  • virtually all human-caused crises can be prevented with properly structured and designed networks, including many multi-billion dollar cases we’ve identified in private companies in the past five years
  • large bureaucratic enterprises are among the least innovative organizations on the planet, due primarily to cultures that protect and defend instead of working together to improve, smothering the creativity of individuals in the process

Yes, consumer social software is primitive for enterprise use. Yes, it is largely a waste of time to use tools designed for different purposes, but it’s equally true that most don’t understand how to use the tools. No, the functionality needed in the enterprise is not simply an incremental evolution of existing platforms.

It’s best for leaders of large organizations to seek revolutionary improvement given the extremely poor performance of many in crisis prevention and innovation.

Here is a generic short video on structured data for the enterprise that provides a summary of why revolutionary change is needed. Those silos exist in large part due to incompatible languages and the high costs of integration in legacy systems. If you want to learn about hype in software, and the damage done to customers and society, take a look at lock-in and its relationship to voluntary data standards, compared, say, to interoperability in the electric grid or plumbing infrastructure.
http://semanticweb.com/semantic-web-elevator-pitch-for-enterprise-decision-makers_b17651
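For readers who want “structured data” made concrete, here is a minimal sketch using the rdflib Python library (my choice for illustration; any RDF toolkit would do). Two departmental “silos” describe the same entity with shared identifiers and vocabulary, so merging them requires no integration code; the example.org URIs and names are hypothetical.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF, RDF

# Hypothetical shared vocabulary and identifiers for the illustration.
EX = Namespace("http://example.org/enterprise/")
alice = EX.alice

# Two departmental "silos" describe the same entity independently.
hr, sales = Graph(), Graph()
hr.add((alice, RDF.type, FOAF.Person))
hr.add((alice, FOAF.name, Literal("Alice Example")))
sales.add((alice, EX.managesAccount, EX.acme))

# Shared identifiers and vocabulary make merging trivial;
# no custom integration code is required.
merged = hr + sales
for subject, predicate, obj in merged:
    print(subject, predicate, obj)
```

Contrast this with legacy systems, where merging two departmental databases typically means a bespoke, costly integration project for each pair of silos.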

This article clearly supports the status quo. How’s that working, and for whom?

Either organizations are about people or they aren’t. If they are, as I am absolutely certain, then we’ve just scratched the surface of what is already possible, and those who refuse to adopt technologies that provide a tailored competitive advantage will continue to suffer the consequences. That has been true since the invention of the wheel, if not before. And all along the way, some Neanderthals have successfully convinced their tribes that starting fires is a waste of time, or that learning how to use a microscope is too laborious given the obvious fact that their current tools (eyes) work just fine.

Now excuse me while I click the Twitter button above and share this article with chief influencers and on LinkedIn, which includes all manner of Wharton customers.

Mark Montgomery
Founder & CEO
Kyield

Semantic enterprise elevator pitch (2 min video)