The ‘Sweet Spot of Big Data’ May Contain a Sour Surprise for Some


Over the past 18 months I’ve been observing a rather distasteful trend in big data for the enterprise market, and it has reached the point where I want to share some thoughts despite a growing mountain of other priorities.

As the big data hype grew over the past few years, enabled largely by Hadoop and other FOSS stacks, internal and external teams serving large companies have perfected a ‘sweet spot’ model tailored to that environment and tech stack. Many vendors have tailored their offerings to the same model, backed by arguably too much VC funding, dozens too many analyst reports, and a half-dozen too many CIO publications attempting to extend reach and increase ad spend.

The ‘sweet spot’ goes something like this:

  • Small teams consisting of data scientists and business analysts.
  • Employing exclusively low-cost commodity IT and free and open source software (FOSS).
  • Targeting $1 million to $5 million projects within units of mid- to large-sized enterprises.
  • Expensed in the current budget cycle (opex), with low risk and a high probability of quick, demonstrable ROI.

So what could possibly be wrong with this dessert—it looks perfect, right? Well, not necessarily—at least not for those of us who have seen a similar movie many times before, with similar foreshadowing, plot, cast of characters, and story line.

While this service model and the related technology have generally been a good thing, producing new efficiencies, improved manufacturing processes, reduced inventories, and perhaps even a few lives saved, not to mention a lot of demand for data centers, cloud service providers, and consultants, we need to look at the bigger picture, in the context of historical trends in tech adoption and how such trends evolve, to see where this trail will lead. Highly paid data scientists, for example, may find that they have been frantically jumping from one project to the next inside a large, thin-skinned bubble rising high over an ecosystem with no safety net, only to suddenly become a target of arrows from the very CFO who has been their client, and for good fundamental reasons.

As we’ve seen many times before at the confluence of technology and services, the beginning of an over-hyped trend creates demand for high-end talent that is often unsustainable even in the mid-term. Everyone from the largest vendors to emerging companies like Kyield to leading consulting firms and many independents is in general agreement that while service talent in big data analytics (and closely related fields) is capturing up to 90% of the budget in this ‘sweet spot’ model today, the trend is expected to reverse quickly as automated systems and tools mature. The reaction to such trends is often an attempt to create silos of various sorts, but even for those in global incumbents, or in models protected by unions and laws such as K-12 education in the U.S., it’s probably wise to seek a more sustainable model and ecosystem tailored for the future. Otherwise, I fear a great many talented people working with data will find in hindsight that they were exploited for very short-term gain in a model that no longer has demand, and may well find themselves competing with a global flood of bubble chasers willing to work for less than is even possible given the cost of living in their own location.

What everyone should realize about the big data trend

While there will likely be strong demand for the very best mathematicians, data modelers, and statisticians far beyond the horizon, the supermajority of organizations today are at some point in the journey of developing mid- to long-term strategic plans for optimizing advanced analytics, including investments not just in small projects but across the entire organization. This is not occurring in a vacuum, but rather in conjunction with consultants, vendors, labs, and emerging companies like ours that intentionally provide platforms that automate many of the redundant processes, enable plug and play, and make advanced analytics available to the workforce in a simple-to-use, substantially automated manner. While it took many years of R&D for all of the pieces to come together, the day has arrived when physics allows such systems to be deployed, so this trend is inevitable and indeed already underway.

The current and future environment is not like the past, when earning a PhD in one decade would reliably provide job demand in the next; like everyone else in society, one must continue to grow, evolve, and find ways to add value in a hyper-competitive world. The challenges we face collectively in the future are great, so we cannot afford apathy or widespread cultures that protect the (unsustainable) past rather than attempt to optimize the future.

In our system design we embrace the independent app, data scientist, and algorithm, and recommend that customers do so as well—there is no substitute for individual creativity, and we simply must have systems that reflect this reality—but it needs to occur in a rationally planned manner that attacks the biggest problems facing organizations and, more broadly, society and the global economy.

The majority seem frankly misguided about the direction we are headed: the combination of data physics, hardware, organizational dynamics, and economics requires us to automate much of this process in order to prevent the most dangerous systemic crises and to optimize discovery. It’s the right thing to do. I highly recommend that everyone in related fields plan accordingly, as these types of changes are occurring in half the time they did a generation ago and the pace of change is still accelerating. At the end of the day, the job of analytics is to unleash the potential of customers and clients so that they can create a sustainable economy and ecology.


Five Essential Steps For Strategic (adaptive) Enterprise Computing


Given the spin surrounding big data, duopoly deflection campaigns by incumbents, and a culture of entitlement across the enterprise software ecosystem, the following five briefs are offered to provide clarity on improving strategic computing outcomes.

1)  Close the Data Competency Gap

Much has been written in recent months about the expanding need for data scientists, a need that is real at this early stage of automation, yet very little is said in public about the prerequisite learning curve for senior executives, boards, and policy makers.

Data increasingly represents all of the assets of the organization, including intellectual capital, intellectual property, physical property, financials, supply chain, inventory, distribution network, customers, communications, legal, creative, and all relationships between entities. It is therefore imperative to understand how data is structured, created, consumed, analyzed, interpreted, stored, and secured. Data management will substantially impact the organization’s ability to achieve and manage the strategic mission.

Fortunately, many options exist for rapid advancement in understanding data management ranging from off-the-shelf published reports to tailored consulting and strategic advisory from individuals, regional firms, and global institutions. A word of caution, however—technology in this area is changing rapidly, and very few analysts have proven able to predict what to expect within 24-48 months.

Understanding Data Competency:

    • Data scientists are just as human as computer scientists or any other type of scientist
    • A need exists to avoid exchanging software-enabled silos for ontology-enabled silos
    • Data structure requires linguistics, analytics requires mathematics, human performance requires psychology, and prediction requires modeling—success requires a mega-disciplinary perspective

2)  Adopt Adaptive Enterprise Computing

A networked computing workplace environment that continually adapts to changing conditions based on the specific needs of each entity – MM 6.7.12

While computing has achieved a great deal for the world over the previous half-century, the short-term gain became a long-term challenge: ubiquitous computing was largely a one-time, must-have competitive advantage that everyone needed to adopt or be left behind. It turns out that creating and maintaining a competitive advantage through ubiquitous computing within a global network economy is a much greater challenge than initial adoption.

A deep misalignment of interests now exists between customer entities, which need differentiation in the marketplace to survive, and much of the IT industry, which needs to preserve economies of scale by replicating precisely the same hardware and software worldwide.

When competitors all over the world are using the same computing tools for communications, operations, transactions, and learning, yet have a dramatically different cost basis for everything else, the region or organization with the higher cost basis will indeed be flattened, with economic consequences that can be catastrophic.

This places an especially high burden on companies located in developed countries like the U.S. that are engaged in hyper-competitive industries globally while paying the highest prices for talent, education and healthcare—highlighting the critical need to achieve a sustainable competitive advantage.

Understanding adaptive enterprise computing:

    • Adaptive computing for strategic advantage must encompass the entire enterprise architecture, which requires a holistic perspective
    • Adaptive computing is strategic; commoditized computing isn’t—rather, it should be viewed as entry-level infrastructure
    • The goal should be to optimize intellectual and creative capital while tailoring product differentiation for a durable and sustainable competitive advantage
    • Agile computing is largely a software development methodology, while adaptive computing is largely a business strategy that employs technology to manage the entire digital work environment
    • The transition to adaptive enterprise computing must be step-by-step to avoid operational disruption, yet bold to escape incumbent lock-in

3)  Extend Analytics to Entire Workforce

Humans represent the largest expense and risk to most organizations, so technologists have had a mandate for decades to automate processes and systems that either reduce or replace humans. This is a greatly misunderstood economic theory, however. The idea is to free up resources for re-investment in more important endeavors, a cycle that has historically gone on to employ the majority of people, but in practice the theory depends on long-term, disciplined monetary and fiscal policy that favors investment in new technologies, products, companies, and industries. When global automation is combined with an environment that doesn’t favor re-investment in new areas, as we’ve seen in recent decades, capital will sit on the sidelines or be deployed in speculation that creates destructive bubbles, a combination that results in uncertainty and high levels of chronic unemployment.

However, while strategic computing must consider all areas of cost competitiveness, it’s also true that most organizations have become more skilled at cost containment than at human systems and innovation. As we’ve observed consistently in recent years, the result has been that many organizations have failed to prevent serious or fatal crises, missed important opportunities, and failed to remain innovative at competitive levels.

While macroeconomic conditions will hopefully improve broadly with time, the important message for decision makers is that the untapped potential in human performance analytics that can be captured with state-of-the-art systems today is several orders of magnitude greater than what traditional supply chain or marketing analytics alone can deliver.

Understanding Human Performance Systems:

    • Improving human performance systems improves everything else
    • The highest potential ROI to organizations today hasn’t changed in a millennium: engaging humans in a more competitive manner than the competition
    • The most valuable humans tend to be fiercely protective of their most valuable intellectual capital, which is precisely what organizations need, requiring deep knowledge and experience for system design
    • Loyalty and morale are low in many organizations due to poor compensation incentives, frequent job changes, and misalignment between employee motivation and employer products, cultures, and business models
    • Motivation can be fickle and fluid, varying a great deal between individuals, groups, places, and times
    • For those who may have been otherwise engaged—the world went mobile

4)  Employ Predictive Analytics

In our increasingly data-rich world, an organization need not grow much beyond its founders before it requires effective data management designed to achieve a strategic advantage with enterprise computing. Indeed, it has often been the case that success or failure depended upon converting an early agile advantage into a more mature adaptive environment and culture. Among organizations that survive beyond the average life expectancy, many cultures finally change only after a near-death experience triggered by becoming complacent, rigid, or simply entitled to something with which the customer disagreed—reasons enough for almost any company to adopt analytics.

While the need for more accurate predictive abilities is obvious for marketers, it is no less important for risk management, investment, science, medicine, government, and most other areas of society.

Key elements that impact predictive outcomes:

    • Quality of data, including integrity, scale, timeliness, access, and interoperability
    • Quality of algorithms, including design, efficiency, and execution
    • Ease of use and interpretation, including visuals, delivery, and devices
    • How predictions are managed, including verification, feedback loops, accountability, and the decision chain (see the sketch following this list)
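
To make these elements concrete, below is a minimal, hypothetical sketch in Python, not a description of Kyield’s platform or any vendor’s product, showing how data quality, algorithm quality, interpretable output, and a feedback loop fit together in a simple predictive workflow. All names and the toy model are illustrative assumptions.

```python
# Minimal illustrative sketch only; names, fields, and the toy "model" are
# hypothetical and stand in for whatever tooling an organization actually uses.
from dataclasses import dataclass
from statistics import mean
from typing import Callable, List, Optional, Tuple


@dataclass
class PredictionRecord:
    predicted: float
    actual: Optional[float] = None  # filled in later by the feedback loop


def check_data_quality(rows: List[dict], required: Tuple[str, ...]) -> List[dict]:
    """Quality of data: reject rows with missing required fields before modeling."""
    return [r for r in rows if all(r.get(k) is not None for k in required)]


def toy_model(row: dict) -> float:
    """Quality of algorithms: a trivial linear rule stands in for a real model."""
    return 0.5 * row["feature_a"] + 0.2 * row["feature_b"]


def run_predictions(rows: List[dict], model: Callable[[dict], float]) -> List[PredictionRecord]:
    """Ease of use and interpretation: keep outputs in a simple, inspectable record."""
    return [PredictionRecord(predicted=model(r)) for r in rows]


def feedback_loop(records: List[PredictionRecord]) -> float:
    """How predictions are managed: verify against actuals and report mean absolute error."""
    scored = [r for r in records if r.actual is not None]
    return mean(abs(r.predicted - r.actual) for r in scored) if scored else float("nan")


if __name__ == "__main__":
    raw = [
        {"feature_a": 10.0, "feature_b": 5.0},
        {"feature_a": None, "feature_b": 3.0},  # rejected by the quality gate
    ]
    clean = check_data_quality(raw, required=("feature_a", "feature_b"))
    records = run_predictions(clean, toy_model)
    records[0].actual = 6.2  # the actual outcome arrives later
    print(f"Mean absolute error so far: {feedback_loop(records):.2f}")
```

The point is not the code but the dependency chain it makes visible: weak data quality or a missing feedback loop degrades predictive outcomes regardless of how good the algorithm is.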

5)  Embrace Independent Standards

Among the most important decisions affecting an organization’s future ability to adapt its enterprise computing to fast-changing external forces, an ability that increasingly determines whether the organization succeeds or fails, is whether to embrace independent standards for software development, communications, and data structure.

Key issues to understand about independent standards:

    • Organizational sovereignty—it has proven extremely difficult, and often impossible, to maintain control of one’s destiny in an economically sustainable manner over the long term when proprietary computing standards dominate the enterprise architecture
    • Trade secrets, IP, IC, and differentiation are very difficult to secure when relying on consultants who represent competitors in large proprietary ecosystems
    • Lock-in and high maintenance fees are enabled primarily by proprietary standards and lack of interoperability
    • Open source is not at all the same as independent standards, nor does it necessarily improve adaptive computing or TCO
    • Independent standards bodies are voluntary in most of the world, slow to mature, and influenced by ideology and interests within governments, academia, industry, and IT incumbents
    • The commoditization challenge and the need for adaptive computing accompany ubiquitous computing regardless of standards type