Results

Digital Business Transformation means Digital Project Delivery Transformation The terms and concepts are all too familiar but what really needs to be done, and how?


We live in the age of surveys. Whilst the results of any single survey may be questionable, taken collectively they make trends easy to identify. One of the most prominent is the accelerating shift in what, and where, Enterprises are choosing to invest in technology. Directly related, but much less remarked upon, is the aligned shift in how these new investments are delivered.

That moving from monolithic Enterprise Applications to Apps dramatically shrinks the size of a project is generally well appreciated, but shifting the requirement from relatively standardized business activities to unique competitive differentiators is a wholly different proposition. It is much less about writing software and much more about creating truly innovative code and capabilities.

Small projects delivered by new providers are notoriously hard for industry watchers to track. An alternative measure is the ‘missing’ sum in the gap between the reported total expenditure on Enterprise Technology and the amount recorded as spent on ‘traditional IT’. The table below shows the priorities and makes it clear that there is a big shift towards ‘new’ enterprise business requirements that relate more closely to the ‘Digital Business’ agenda. It is instructive to look beyond the obvious values of the relative importance of each heading and consider which of the three groups is driving the activity.

The three colored bars shown for each heading give an impression of who is driving and funding the action. So while it might be interesting to argue about relative importance and order, it is the make-up of the support for each initiative that tells the story about the type of project, and of course the delivery skills required.

The Green bar for ‘Strategic’ importance almost certainly relates to Business Management as the sponsor, whilst the ‘Functional’ description indicated by the Blue bar suggests it is more likely to be part of the traditional role of IT. The uncertainty that many managers, both Business and IT, feel over what to do about Digital Business seems to be indicated by the Red of ‘Transformation’ falling halfway between the other two choices. There are, however, two exceptions: as might be expected, when aligned to the actual issue of ‘Transformation of existing Business Processes’ the answer is positive.

More noteworthy is that the ‘Ability to find and retain Talent’ is also seen as a ‘Transformation’ factor, rather than either of the other two. This illustrates that people with the right skills are recognized as a critical success factor in transformation, whether at a project or an enterprise level.

Listing the Business Strategic and IT Functional investments in the manner below both highlights the differences and provides an alignment to the business and technology deployment architecture: Systems of Engagement, Systems of Intelligence and Systems of Record.

The IT-delivered actions can be seen as part of the continued battle for improvement in existing activities, largely within Systems of Record, the traditional role of IT. In contrast, the Business-deployed initiatives relate to Systems of Engagement and Systems of Intelligence, and the delivery of direct competitive capabilities for Digital Business.

Business Deployed Strategic Initiatives

- Introduce new Digital revenue streams
- Improve organizational agility
- Improve the customer experience

Skill Requirements: Systems of Engagement and Systems of Intelligence; new, sometimes hybrid, skill sets relating to new technologies and business models.

IT Delivered Actions

- Meet compliance requirements
- Optimize worker productivity
- Increase cyber security protection
- Increase operational effectiveness

Skill Requirements: Systems of Record; traditional skills and role of IT.

These Business-led actions to grow Digital revenue streams and improve the customer experience are readily recognizable as being based on increasing interaction, or ‘engagement’, but what exactly does that mean when launching a project to actually define and deliver them? Who will take the lead, and where do the boundaries lie with the existing installed IT systems?

The Roles for ‘Systems’ in the Enterprise

For the IT department it is neither easy nor practical to lead, as their core role is, and must remain, to protect the operational integrity of the Enterprise processes and systems; acting as a ‘disruptor’ by deploying solutions that directly challenge that status quo is counterintuitive. Additionally, these new initiatives require a challenging new mix of business, technology and sector skills, working in partnership with often younger, tech-savvy business managers whose own tech skills may be disruptive to established IT staffers.

Creating the truly innovative solutions called for by the competitive Transformation of Digital Business in the Enterprise’s markets equally calls for a Transformation in design and delivery processes.

Solutions for Systems of Engagement and Systems of Intelligence have little in common with those developed for Systems of Record. The new technologies, and more importantly the new business practices, are built differently and measure the business value of their deployment outcomes in ways other than the cost centricity of IT Systems of Record.

Small teams working collaboratively to quickly define new possibilities in terms of business value, delivery risk, and cost require equal innovation in staffing and methods. The intellectual contribution of each individual member, and their ability to solve challenges successfully and actually deliver, are likely to count for more than strict adherence to methodology. This is not to suggest abandoning tested principles, merely to recognize that methodologies may ensure quality delivery but, in so doing, can constrain innovation.

All of which may be recognized as a statement of principles that, given the very real shortages of skilled staff, seems an unrealizable ideal. But where there is a market requirement, there will be new entrants offering their own go-to-market innovation.

It would be pointless to draw attention to these issues if there were no solution. In conclusion, here are outline profiles of two companies that have recently briefed Constellation Research on their abilities to answer these challenges.

SoftServe https://www.softserveinc.com/

SoftServe has 25 years of experience in the design and build of unique bespoke business solutions, always maintaining a position at the leading edge of software creation. SoftServe both supports and drives its clients through the entire process, from an initial business workshop, through development of the solution design, into creation and deployment. A major additional differentiator comes from its mathematical skills in creating the unique algorithms that can truly competitively differentiate clients’ solutions.

At a time when digital business calls for genuine innovative competitive differentiation to be rapidly defined, delivered and deployed by small teams with hybrid skills, SoftServe has built a strongly referenced position in a crowded market.

SoftServe describes itself as: ‘A digital authority operating at the cutting edge of technology to deliver the innovation, quality, and speed that its clients’ users expect. Fully aligned to four specific journey states of business maturity, SoftServe reveals, transforms, accelerates, and optimizes the way Fortune 500 and independent software vendors do business across healthcare, retail, media, and financial services industries’.

‘Focused on open innovation – from assessing compelling new ideas, to developing and implementing transformational products and services. Provided as a cohesive and comprehensive approach built on a foundation of empathetic, human-focused experience design talent nurtured and developed by their own SoftServe University.’

SoftServe is an excellent example of a company large enough to bring together the range of skills required through deep technology and business-model experience, yet nimble enough to deploy through small teams using design expertise. It is a well-focused client delivery model to suit the very different conditions that digital business solutions demand.

TopCoder https://www.topcoder.com/

Imagine being able to draw on the talents of over one million technologists to find and put together a perfect team to develop your requirement. Founded in 2001, Topcoder has created a unique marketplace that establishes relationships with and develops the software skills of developers through a competition-based model.

 

This remarkable pool of talent is applied to requirements through Topcoder’s secure, seamless project management platform, on which both the work and the delivery are managed. With quite literally 24/7 access to a truly global network of designers, developers, and data scientists, Topcoder can deliver innovative, creative solutions, ranging from unique apps and complex websites to secure, ultra-reliable enterprise-grade software, and beyond.

 

References range from small to large across every vertical sector: developing an app to keep astronauts aboard the International Space Station fit; optimizing an algorithm for DNA sequencing; building 17 solar energy application MVPs in parallel in just 60 days.

 

Through Topcoder Challenges, developers have the opportunity to compete in software challenges for real businesses — all on a platform that allows up-and-coming coders to step into the spotlight. This enables them to break into the industry in front of peers, experts in the field, and real-world Fortune 500 clients. Members can compete in all types of development challenges to win prize money and establish their credibility in the market. They can hone their skills and learn new technology while working on real-world projects for some of the biggest businesses in the world.

 

The unique crowdsourcing model enables Topcoder both to build contact with each member and to individually rank their skills, which establishes its unique differentiator. Businesses and the larger marketplace can benefit greatly from this first-of-its-kind combination of open source and crowdsourcing across a massive talent pool. Topcoder describes its approach and value proposition as follows:

 

“At Topcoder, we don’t sell services. We sell outcomes; elegant, intuitive, functional digital solutions — no matter the complexity of your challenge. You pay only for a finished product, not the hours it takes to create it.”

 

Footnote:

Constellation Research is drawing attention to these companies and their innovative additions to the technology marketplace as examples and is not making specific recommendations. Buyers should carry out their own due diligence on prospective technology services partners and technology product vendors.


Digital Transformation Digest: Robots on the Rise, Twitter's New Enterprise API, Micro Focus Expands COBOL Business


Constellation Insights

Rise of the robots: An influential robotics trade association has released its latest industry growth numbers for North America, and the results are impressive. Here are the key details from the Association for Advancing Automation:

For the first nine months of 2017, 27,294 robots valued at approximately $1.473 billion were ordered in North America, the highest level ever recorded for that period in any year. These figures represent growth of 14% in units and 10% in dollars over the first nine months of 2016. Automotive-related orders are up 11% in units and 10% in dollars, while non-automotive orders are up 20% and 11%, respectively.

On the shipments side, 25,936 robots valued at $1.496 billion were shipped in North America during the first nine months. These record quantities represent growth of 18% in units and 13% in dollars over the same period in 2016. Automotive-related shipments grew 12% in units and 9% in dollars, with non-automotive shipments increasing by 32% and 22% for units and dollars, respectively.

POV: A3 also reports that the North American machine vision market grew 14 percent to $1.94 billion. Overall, the numbers paint an incomplete picture of the global robotics market due to their geographic limitation, but they do provide leading indicators of where and in which ways industries are investing in robotics. It's notable to see CPG companies ranking third in investment after metals and automotive parts, as is the 20 percent uptick in non-automotive orders. Constellation tracks robotics from both a hardware and software perspective. Go here to read our ShortList of robotic process automation vendors, which concerns the latter.

Twitter releases new enterprise API: Companies looking to engage with customers through Twitter chatbots have a new tool in the form of an enterprise API the social media platform released this week. The new API builds on Twitter's existing Account Activity API, which delivers activities such as tweets, mentions, blocks, likes, direct messages sent or received, and follows in real time. The standard version supports up to 35 accounts, whereas the enterprise edition allows for large numbers of accounts—Twitter didn't specify how many in its announcement—and managed support.
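Integrating with the Account Activity API means hosting a webhook, and Twitter's documented registration flow includes a CRC (challenge-response check): Twitter periodically sends a `crc_token`, and the app must reply with an HMAC-SHA256 of that token keyed by its consumer secret, base64-encoded and prefixed with `sha256=`. A minimal sketch in Python (the helper name and values are illustrative; the response format follows Twitter's documented scheme):

```python
import base64
import hashlib
import hmac

def crc_response(crc_token: str, consumer_secret: str) -> str:
    """Build the response_token expected by the Account Activity API CRC check."""
    digest = hmac.new(
        consumer_secret.encode("utf-8"),
        msg=crc_token.encode("utf-8"),
        digestmod=hashlib.sha256,
    ).digest()
    # Twitter expects: {"response_token": "sha256=<base64 HMAC digest>"}
    return "sha256=" + base64.b64encode(digest).decode("ascii")

# Example with dummy values (a real app would use its own consumer secret):
print(crc_response("challenge-123", "not-a-real-secret"))
```

A webhook endpoint would return this value in a JSON body as `response_token` when it receives a GET request carrying `crc_token`.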

Twitter is also removing the beta label from a series of direct messages features, such as quick replies, welcome messages and customer feedback cards. Other key additions to direct messages include read receipts and indicators when a chatbot is typing out a response. Finally, Twitter released details of when some older DM features will be deprecated next year.

POV: The announcement shows Twitter can deliver on its developer roadmap and is continuing to figure out what kind of features enterprise developers need. Earlier this year, Twitter released a set of Premium APIs, which addressed a particular pain point: Many developers had been running up against the limits of its free APIs but weren't willing or able to pay for the enterprise versions. Premium APIs offer more features but are priced lower than the enterprise API.

 


Micro Focus adds open-source COBOL tools: There remains a vast amount of legacy COBOL code out there, running in production systems. Micro Focus has built a substantial business around COBOL support and application rehosting, and just became a bit bigger of a player with the acquisition of COBOL-IT, a Paris-based company that develops open-source COBOL technology. Here's how Micro Focus describes the value proposition:

It is the first COBOL vendor to develop and deliver an open source-based COBOL compiler and run time environment, enabling enterprises to run compiled objects in all Open Systems Unix, Linux, and Windows platforms.

"COBOL-IT adds a unique and exciting dimension to our COBOL product portfolio, and enables us to offer technology support that fully spans the evolving needs of customers as they extend, integrate and modernize their core business applications and data," said Chris Livesey, SVP and GM, Application Modernization and Connectivity at Micro Focus.

POV: It's the first acquisition for Micro Focus since it completed the purchase of software assets from Hewlett-Packard Enterprise in September. While the deal appears to be on the small side, it's fairly strategic for Micro Focus. COBOL-IT had previously positioned itself as the "best alternative" to Micro Focus and mainframe COBOL, claiming savings of up to 80 percent. It's not quite clear how COBOL-IT's tools will fit into Micro Focus's broader application modernization strategy, but the deal does give it a lower-cost option for its price list.


Digital Transformation Digest: Where Net Neutrality Goes From Here, Ellison Bangs the Database Drum, IoT Botnet Creators Plead Guilty


Constellation Insights

Net neutrality—where we go from here: The Federal Communications Commission voted as expected to overturn net neutrality rules, prompting a blizzard of media coverage and chatter on social media. Supporters of net neutrality, which compelled ISPs to treat all legal Internet traffic the same, say the rules' absence will lead to higher costs for customers, data throttling and content discrimination. Opponents of the rules contend that net neutrality inhibited free market competition and getting rid of the rules will be better for customers. There are many moving pieces to the net neutrality debate with valid arguments to be made from both sides. Here's a look at some key points to absorb as it enters its next chapter.

  • The rules change will reclassify Internet traffic as an information service, putting it under the purview of the Federal Trade Commission. This is a good thing, as the FTC has a good track record of defending consumer rights and is a powerful agency in its own right. Consumers will not be left in the wind and rightfully so. The Internet is one area that needs regulations that define the playing field for competition and innovation.
  • It's possible the rules change won't stand. There are ample lawsuits underway from both consumer advocate groups and state and local lawmakers seeking to get them overturned. A key point of contention lies in federal law stating that agencies such as the FCC cannot make major decisions like this one in an "arbitrary" or "capricious" manner. Given that net neutrality rules only went into effect in 2015, a court may be sympathetic to the notion of the FCC's action being in violation of that law. Meanwhile, even if the lawsuits fail, Congress could pass a bill restoring the rules or putting a modified set in place; with the midterm elections coming in 2018, a shift toward a Democratic majority would make this more likely.
  • In many markets, Internet customers pay according to connection speed. Data caps are becoming increasingly common as well. But not every customer is a streaming video fanatic. Why should the customer who wants a fast connection to read websites, but comes nowhere near their data cap, pay the same amount of money per month as a chronic Netflix binge-watcher? It's a valid question and market competition could lead to more diverse pricing models. It could also lead to ISPs passing more costs onto subscription content providers, who would then likely pass them onto consumers in the form of higher pricing. (Which leads to the last point.)
  • ISPs have clout and influence over the Internet, that is for sure. But they're increasingly not the only game in town. Google, Microsoft and other large tech vendors are building out their own networks. OneWeb, a startup building a global network powered by more than 600 satellites, looks like it could shake up the game. That being said, there needs to be much greater diversity in the choices consumers have in an ISP. The free market competition envisioned by the FCC commissioners who voted down the rules can't really exist if the current landscape, where many U.S. citizens have only one choice of ISP in their home market, remains the status quo.

Ellison, Hurd bang the database drum: Oracle reported second-quarter earnings this week, with revenue up 6 percent to $9.6 billion and net income up 10 percent to $2.2 billion. The company showed continued progress in cloud sales, with IaaS and PaaS revenue growth lagging that of SaaS. The full numbers are here for those inclined to go through them; as usual, we will focus on key commentary from Oracle executives during the earnings call. The most heated topic regarded Oracle's database and the competitive landscape. Here are the highlights.

  • Oracle's autonomous database, announced earlier this year, will arrive in January, CTO and executive chairman Larry Ellison confirmed. Taking aim once again at Amazon Web Services, Ellison said that when customers move a workload from AWS's Redshift service to Oracle's database and IaaS, their costs will drop by 80 percent. Oracle will also offer SLAs guaranteeing customers who move from AWS to Oracle that their database bills will be cut at least in half. What wasn't clear from Ellison's remarks is whether the cost savings refer just to the underlying IaaS resources or to the cost of database subscriptions as well.
  • Oracle's database competitors are sticking with its platform with no sign of moving off, Ellison said: "A company you’ve heard of just gave us another $50 million this quarter to buy Oracle database and other Oracle technology. That company is Amazon. They’re not moving off of Oracle. Salesforce isn’t moving off of Oracle. ... Let me tell you someone else who’s not moving off of Oracle: SAP. They had that database called HANA they’d like to move to. SuccessFactors, they’ve been trying to move off of Oracle for five or six years. SAP is running on Oracle. Ariba runs on Oracle. All SAP large customers run on Oracle."
  • Speaking of customers, many are waiting to upgrade until the autonomous database arrives, Ellison said. Those who do so will be required to purchase certain database options, including Real Application Clusters and multitenant, he added. Oracle is in a good position to retain its dominant market share in database, Ellison contended: "There’s been no big migration anyplace of Oracle databases into anyone’s cloud, including ours. There’s been some, but it’s a relatively very, very small business. This all begins to happen starting in January, where the capabilities of cloud are so much better. The economics in the cloud are so much better than what’s available on premise, that we think our customers are going to move very, very rapidly to the cloud."

"Mirai" IoT botnet co-creators plead guilty: Two men who created the "Mirai" botnet, which used malware to infect thousands of Internet-connected devices to launch massive distributed denial-of-service attacks, have pleaded guilty in U.S. federal court. Paras Jha and Josiah White face up to five years in prison and a $250,000 fine under the plea deals.

Jha, White and a third man, Dalton Norman, also pleaded guilty to involvement in a "clickfraud" scheme wherein the botnet was used to generate website advertising revenue. They face similar sentences in that case.

Mirai's creators released the botnet's source code in 2016, leading to a series of large-scale DDoS attacks, including one on DNS provider Dyn that brought down the likes of Netflix and Twitter.

POV: It's fitting in a way that the DOJ unsealed the plea deals now, just over a week before the Christmas holiday, when untold millions of new consumer IoT devices are opened up and logged into the Internet. Come December 25th, the attack surface for IoT botnets like Mirai will get a whole lot bigger, and there's absolutely no indication that product manufacturers have improved IoT device security to an appreciable degree. The new year will undoubtedly bring more major DDoS attacks via IoT, but the question is whether they will finally be damaging enough to force crucial improvements in IoT device security.


AI - Enterprise Scale Intelligence with C3 IoT


Business expectations for the deployment of AI, and the associated Digital Business technologies, may be higher than their technology staff's current experience can deliver. Pilots, even commercially successful deployments, are by necessity focused on targeted and contained deliveries. The technologies, methods and experience will all have been chosen to reflect the requirements of a single project.

The likely outcome, without further effort, will be a dysfunctional Enterprise unable to integrate all resources, assets and intelligence at an Enterprise level in a cohesive manner that can optimize responses to markets and operating conditions. Sadly, as has too often been the case, Business Management will be frustrated by the failure of technology investments to deliver on expectations.

Why is this happening again? From the start of Enterprise IT in the mid-eighties, through the succession of Internet, Web, Mobility, Social Tools, and Apps, localized project successes have had a nasty tendency to turn into Enterprise-level barriers. Yet eager business managers and technology leaders are once again embarking on project-level deployments, often supported by C-level executives. Why is the apparently accepted Enterprise Business Strategy of Digital Business transformation failing to set an appropriate Enterprise Digital Technology strategy?

Once again there is a Business and IT alignment issue, but this time it is not occurring in the usual manner. Digital Business operating models focus on ‘unbundling’, or de-centralization, of activities to support flexibility in what they do and how they do business. Not surprisingly, Technology managers believe they should align to this business model by moving to individual project deployment optimization.

It is perfectly understandable, even a necessity at one level, given the new diversity of requirements for ‘Systems of Engagement’ (as distinct from the ‘Systems of Record’ provided by current IT systems). However, to make this work there has to be a parallel, and conscious, Digital Business Technology strategy for the use of data from these new systems to develop Enterprise ‘Systems of Intelligence’ (business insights from Machine Learning and AI).

The words ‘use of data’ are the key to understanding what is required. This is not a call for an Enterprise Data strategy, or even Big Data analytics, as commonly understood in support of current IT systems and their associated Business Intelligence analytics. Nor, for that matter, will current methods of Enterprise Architecture deliver integrity and integration in this new environment of Digital Business and technologies. A Digital Business and Technology strategy requires three individual elements, combined and integrated at the Enterprise level, to be successful:

Enterprise IT Systems of Record (existing Enterprise Applications) create centralized recording of transactions as ‘stateful’ historic data in defined formats, arising from an architecture of closely coupled Enterprise Applications. Enterprise-scale challenges here usually refer to the size of the resulting databases, together with their management and analytics.

Digital Business Systems of Engagement (Apps from IoT, Mobility, Social tools, etc.) refer to loosely coupled integration of sets of activities at the Edge of the Business, producing huge flows of ‘stateless’ data on real events as they happen. This data is the foundation of the new era of Digital Business and has little in common with Systems of Record. The challenge of scale here refers to data volumes flowing from connected devices and systems at levels that overwhelm traditional methods of storage and analytics.

Enterprise Systems of Intelligence use ‘stateless’ data to provide the fundamental properties that enable Digital Business insights and read-and-react capabilities, and to support the introduction of Machine Learning and AI. Digital Business stateless data has the power to corrupt Enterprise IT stateful data; the role of Systems of Intelligence is to provide an integration and processing capability that delivers the genuine innovation Machine Learning and AI bring to Enterprise operations.
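The distinction between the three systems can be made concrete with a toy sketch: stateless event readings from the Edge are evaluated against stateful historical records to produce an actionable insight, without the raw events ever being written back into the system of record. All names and values here are illustrative, not any vendor's API:

```python
# Stateful "System of Record" data: historical average demand per site.
historical_avg = {"site-a": 100.0, "site-b": 250.0}

# Stateless "System of Engagement" events: raw readings as they occur.
events = [
    {"site": "site-a", "reading": 180.0},
    {"site": "site-b", "reading": 240.0},
]

def flag_anomalies(events, historical_avg, threshold=1.5):
    """Flag sites whose live reading exceeds threshold x their historical norm."""
    flags = []
    for event in events:
        baseline = historical_avg.get(event["site"])
        if baseline is not None and event["reading"] > threshold * baseline:
            flags.append(event["site"])
    return flags

# site-a reads 180 against a norm of 100, well above the 1.5x threshold.
print(flag_anomalies(events, historical_avg))  # → ['site-a']
```

The "System of Intelligence" is the function in the middle: it consumes the event stream, consults the historic record, and emits a decision, leaving the stateful data untouched.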

The Constellation Research Digital Transformation Survey of October 2017 identified that whilst Enterprise CxO management had a clear commitment to enacting a Digital Transformation, there was a distinct lack of clarity as to what this actually meant. The reality is that many Enterprises are still struggling to come to terms with successful deployment of IoT-driven projects, modernization of their existing IT systems to run on the Cloud, or integration of Social CRM.

These ‘Digital Strategies’, driven by the very real sense of the speed with which market and competitive change is occurring, are creating intense activity and investment. Unfortunately, they are also a breeding ground for potential internal Digital Disruption of operations that will see Enterprise management struggling to reconnect, and rebuild, internal operational coherence in order to face the all too real external Digital Disruption of their markets.

A genuine joint Enterprise Business and Technology strategy is required to ensure Systems of Engagement develop to empower the Enterprise Systems of Intelligence that are the key to Digital Transformation of Enterprise capabilities.

In this, the Digital Technology era, a Cloud Platform would appear to be the immediate answer, but as in so many other aspects of Digital Transformation, some care needs to be taken over the casual use of terminology. The traditional definition of a ‘Platform’ refers to the capability to provide a set of common technology functions that simplify the tasks of integration and processing; as such, it is the common route chosen by Technology management. Though inherently a technology choice, many platforms offer some additional business value, but technology simplification is usually offset by constraints on business requirements.

N.B. Internal Integration Platforms should not be confused with external Business Ecosystems Platforms, though similar care must be exercised in choices. See Constellation blog Open Business Ecosystems or Closed Technology Platforms.

Systems of Intelligence constantly discover and create innovative new business insights by continually refining dynamic, unpredictable event inputs against historic data. Much of this new insight comes from finding previously unforeseen, or unplanned, relationships in the data flows, a very different processing activity from traditional Systems of Record deployments using data integration and analytical tools.

This new requirement, combining simplified abstraction of core technology elements with complex integration, processing, and automation of responses, is more closely aligned to the definition of a Software Engine than an Integration Platform. With the simplification of software development over recent years, often through the use of a platform in one form or another, the thoughtful, in-depth functional specification that evokes the requirement for an ‘Engine’ to power a series of complex software elements has become much rarer.

The following description of a Software Engine fits the requirement for Systems of Intelligence more closely, whereas the requirements for Systems of Engagement integration and connectivity are well suited to Platform technology.

An ‘Engine’ is defined as an application program that coordinates the overall operation of other programs. Systems of Intelligence are the conglomeration of the dynamic environment of business-defined values and insights, integrated, regardless of technology definitions and formats, into cohesive, optimized Enterprise and/or Edge outputs.
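The coordinating role described above can be sketched in a few lines: one program registers a series of independent components and runs them against a shared payload, regardless of how each is implemented internally. The class and stage names here are purely illustrative, not a real product API:

```python
class DigitalBusinessEngine:
    """Toy 'engine': coordinates the overall operation of other programs."""

    def __init__(self):
        self._stages = []

    def register(self, name, fn):
        """Add a named stage; returns self so registrations can be chained."""
        self._stages.append((name, fn))
        return self

    def run(self, payload):
        """Run every registered stage in order, threading the payload through."""
        for name, fn in self._stages:
            payload = fn(payload)
        return payload

# Hypothetical stages: ingest edge events, derive an insight, decide an action.
engine = (
    DigitalBusinessEngine()
    .register("ingest", lambda p: {**p, "events": [3, 7, 5]})
    .register("enrich", lambda p: {**p, "peak": max(p["events"])})
    .register("decide", lambda p: {**p, "action": "scale" if p["peak"] > 6 else "hold"})
)

print(engine.run({}))  # → {'events': [3, 7, 5], 'peak': 7, 'action': 'scale'}
```

The engine itself knows nothing about events or actions; it only guarantees that the registered components run in a coherent, ordered whole, which is the essence of the definition quoted above.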

Perhaps it would be better to think of an Enterprise as requiring a Digital Business Engine to operate successfully as an intelligent Digital Enterprise, providing the necessary continuous business ‘agility’ from Systems of Intelligence.

The first wave of Digital Pioneers, or early adopters, has already reached the stage where these issues have become clear and the need to find a solution has become pressing. Interestingly, many of those that serve as the standard business-case references for Digital Transformation have started deployments with C3 IoT.

The following is based on a briefing by C3 IoT to Constellation Research

The C3 IoT Platform™ is a platform as a service (PaaS) for the design, development, deployment, and operation of next-generation AI and IoT applications and business processes. The applications apply advanced machine learning to recommend actions based on real-time analysis of petabyte-scale data sets, dozens of enterprise and extraprise data sources, and telemetry data from tens of millions of endpoints.

C3 IoT provides a suite of pre-built, cross-industry applications, developed on its platform, that facilitate IoT business transformation for organizations in energy, manufacturing, aerospace, automotive, chemical, pharmaceutical, telecommunications, retail, insurance, healthcare, financial services, and the public sector. C3 IoT cross-industry applications are highly customizable and extensible. Pre-built applications are available for predictive maintenance, sensor health, enterprise energy management, capital asset planning, fraud detection, CRM, and supply network optimization. Customers can also use the C3 IoT Platform to build and deploy their own custom applications.

Operationalizing IoT is much harder than it looks. Many IoT platform development efforts to date – internal development projects as well as industry-giant development projects – are attempts to develop a solution via acquisition of multiple piece parts or from the many independent software components that are collectively known as the open-source Apache Hadoop stack. Despite the marketing claims surrounding these projects, a close examination suggests that there are few examples, if any, of enterprise production-scale, elastic cloud, big data, artificial intelligence, and machine learning IoT applications that have been successfully deployed in any vertical market except for applications addressed with the C3 IoT Platform.

Companies often expect that installing the Apache Hadoop open-source stack will enable them to establish a “data lake” and build from there. However, the investment and skill level required to deliver business value from this approach quickly escalates when developers face hundreds of disparate software components in various stages of maturity, designed and developed by more than 350 different contributors using different programming languages while providing incompatible software interfaces. A loose collection of independent, open-source projects is not a true platform, but rather a set of independent technologies that need to be somehow integrated and maintained by developers. Instead, companies need a comprehensive AI and IoT application development platform. To avoid this increasingly common pitfall, C3 IoT leverages a model-driven architecture approach.

Model-Driven Architecture

The architecture requirements for the Systems of Intelligence that are the key to the digital transformation of enterprises are uniquely addressed through a Model-driven architecture. Model-driven architectures define software systems using platform independent models, which are translated to one or more platform specific implementations. The C3 IoT Platform is a proven model-driven architecture.

This architecture abstracts application and machine learning code from the underlying platform services and provides a domain-specific language (annotations and expressions) to support highly declarative, low-code application development.

The model-driven approach provides an abstraction from the underlying technical services (for example, queuing services, streaming services, ETL services, data encryption, data persistence, authorization, authentication) and simplifies the programming interface required to develop AI and IoT applications to a Type System interface.

The model is used to represent all layers of an application including the data interchange with source systems, application objects and their methods, data aggregates on those objects, complex features representing business and application logic, AI-machine learning algorithms that use these features, and the application user interface. Each of these layers are also accessible as microservices.

The C3 IoT Platform is an example of a proven model-driven AI platform. The C3 IoT Platform allows small teams of five to ten application developers and data scientists to collaboratively develop, test and deploy large-scale production AI applications in one to three months. The platform is proven across 30 large-scale deployments across industries including energy, manufacturing, aerospace and defense, healthcare, financial services, and more. A representative large-scale C3 IoT Platform deployment processes AI inferences at a rate of a million messages per second against a petabyte-sized unified federated cloud data image aggregated from 15 disparate corporate systems and a 40-million sensor network. Global 1000 organizations have successfully used the platform to deploy full-scale production deployments in six months and enterprise-wide digital transformations with more than 20 AI applications in 24- to 36-month timeframes.

Time to market is critical as next-generation computing platforms emerge. C3 IoT is a proven, scalable, production and development environment with dozens of large-scale IoT applications deployed in the market, managing tens of millions of smart, sensor-enabled devices. The time-to-market advantage of a proven, scalable architecture can be leveraged to gain early network effects and competitive differentiation in the next big wave of computing and industrial automation.

C3 Type System – a detailed review provided by C3 IoT

The C3 Type System is a data object-centric abstraction layer that binds the various C3 IoT Platform components, including infrastructure and services. It is both sufficient and necessary for developing and operating complex predictive analytics and IoT applications in the cloud.

The C3 Type System is the medium through which application developers and data scientists access the C3 IoT Platform, C3 Data Lake, C3 Applications, and applications and microservices. Examples of C3 Types include data objects (e.g., customer, product, supplier, contract, or sales opportunity) and their methods, application logic, and machine learning classifiers.

The C3 Type System allows programs, algorithms, and data structures – written in different programming languages, with different computational models, making different assumptions about the underlying infrastructure – to interoperate without knowledge of the underlying physical data models, data federation and storage models, interrelationships, dependencies, or the bindings between the various structural platform or cloud infrastructure services and components (e.g., RDBMS, NoSQL, ETL, Spark, Kafka, SQS, Kinesis, object models, classifiers, data science tools, etc.). The C3 Type System provides RESTful interfaces and programming language bindings to ALL underlying data and functionality.

Leveraging the C3 Type System, application developers and data scientists can focus on delivering immediate value, without the need to learn, integrate, or understand the complexities of the underlying systems. The C3 Type System enables programmers and data scientists to develop and deploy production big data, predictive analytics, and IoT applications in one-tenth the time at one-tenth the cost of alternative technologies.

To improve manageability, Types support multiple object inheritance, allowing objects to inherit characteristics from one or more other objects. For example, a mixed-use building might have characteristics of both a residential and commercial use building.

The Type System, through inherent dataflow capabilities, automatically triggers the appropriate processing of data changes by tracing implicit dependencies between objects, aggregates, analytic features, and machine learning algorithms in a directed acyclic graph.

The Type System is accessible through multiple programming language bindings (i.e. Java, JavaScript, Python, Scala, and R) and Types are automatically accessible through RESTful interfaces allowing interoperability with external systems.

Model-Driven Architecture Abstracts Underlying Platform Services Through a Simple Type Systems Interface


Summary

The C3 IoT Platform (and the associated C3 Type System) is a unique high-productivity, low-code application PaaS for rapidly developing and deploying AI and IoT applications at scale across an enterprise. The C3 IoT Platform has been developed and hardened through numerous large-scale deployments over 9 years at an investment of $300 million. The C3 IoT Platform is proven to support Enterprise Digital Transformations.

Capitalizing on the potential of AI and IoT requires a new kind of technology stack that can handle the volume, velocity, and variety of big data and operationalize machine learning at scale. Existing attempts to build an IoT technology stack from open-source components have failed—due to the complexity of integrating hundreds of software components developed with disparate programming languages and incompatible software interfaces. C3 IoT has successfully developed a comprehensive technology stack from scratch for the design, development, deployment, and operation of next-generation applications and business processes.

Developing AI applications requires less code to be written, and less code to be debugged and maintained, significantly reducing delivery risk and total cost of ownership. Using the C3 IoT Platform, a company’s investment in application code is abstracted from underlying infrastructure and platform services and future proofed against rapidly evolving software technologies avoiding lock-in to those technologies. Further, the C3 Type System provides enterprise leverage – any published code is instantly available to the rest of the organization.

C3 believes its IoT Platform is the industry’s only IoT platform proven in full-scale production, with hundreds of millions of sensors under management and more than 20 enterprise customers reporting measurable ROI, including improved fraud detection, increased uptime as a result of predictive maintenance, improved energy efficiency, and stronger customer engagement.

 


Digital Transformation Digest: Target Buys Shipt for Same-Day Delivery, NVIDIA Trains AI At Construction Site Safety, Microsoft Pushes Azure Cost Envelope Again


Constellation Insights
 

Target buys same-day shipping startup: Hoping to both fend off Amazon's challenge and keep pace with rival Walmart, Target is buying same-day delivery platform provider Shipt for $550 million. Target says Shipt will give it same-day delivery at half of its stores by early in the new year, and in every major market by late next year.

Shipt will become a subsidiary of Target but will run independently and still seek deals with other retailers. At first, Target customers can get same-day deliveries of grocery items, home products, electronics and some other categories, with an expansion coming over the next couple of years.

The startup, based in Alabama and San Francisco, has a network of 20,000 personal shoppers who fulfill customer orders in 72 markets. As of right now, Shipt's coverage area is centered in the South, Mid-Atlantic, part of the Midwest and Texas. Memberships are $99 for an annual plan, or $14 for a month-to-month plan. Deliveries are free on orders over $35 and can be completed in less than one hour, according to Shipt's site. Delivery drivers can earn around $25 per hour.

POV: Target's move comes just a few months after it acquired Grand Junction, another delivery-related startup. Grand Junction's platform has a different focus, however. It can be used to coordinate and optimize deliveries performed by a retailer's own employees, as well as connect them to some 700 local contract carriers. Shipt's model has more of a gig economy makeup, but it will be interesting to see how Target blends the two companies' capabilities going forward.

Meanwhile, Target's ambitious-sounding plans for increased Shipt availability sound like a must, given its absence in major markets such as the Northeast and California. While there will be an awareness gap at first, Target's retail footprint, digital channels and loyalty programs could close it rather quickly. Overall, same-day delivery is becoming table stakes for many retailers, and Target is showing willingness to make serious investments in it.

But Target's competitors are hardly standing still. Walmart, which is experimenting with rapid delivery on multiple fronts, recently acquired Parcel, a New York delivery startup with a different twist than either Shipt or Grand Junction. Parcel uses its own employees and leased trucks to make deliveries from a warehouse in Brooklyn to locations around the city. For Walmart, buying Parcel was a way to test out same-day delivery in one of the most difficult markets to maneuver in from a logistics standpoint.

Meanwhile, Amazon unsurprisingly isn't letting off the gas either. On the same day as Target's announcement, Amazon said that free same-day and one-day shipping for its Prime members has been expanded to 8,000 U.S. cities and towns.

NVIDIA, Komatsu eye AI for safer jobsites: Chipmaker NVIDIA, which has moved deeper into software and artificial intelligence in recent years, is working with construction equipment manufacturer Komatsu on technology aimed at making work sites safer places. Here are the key details from their announcement:

The partnership – described at GTC Japan by NVIDIA founder and CEO Jensen Huang – will focus on Komatsu using NVIDIA GPUs to visualize and analyze entire construction sites. The NVIDIA® Jetson AI platform will serve as the brain of heavy machinery deployed on these sites, enabling improved safety and productivity.

NVIDIA GPUs will communicate with drones and cameras in the construction sites, acting as an AI platform for analysis and visualization. SkyCatch will provide drones to gather and map 3D images for visualizing the terrain at the edge. OPTiM, an IoT management-software company, will provide an application to identify individuals and machinery collected from surveillance cameras. Both of these Komatsu partners are also members of NVIDIA’s Inception program for AI startups.

At the center of the collaboration is NVIDIA Jetson, a credit-card sized platform that delivers AI computing at the edge. Working in tandem with NVIDIA cloud technology, Jetson will power cameras mounted on Komatsu’s construction equipment and enable 360-degree views to readily identify people and machines nearby to prevent collisions and other accidents.

NVIDIA and Komatsu will hit at other pain points in the construction industry beyond safety. In Japan, construction workers are in high demand because of its aging population; to that end, Komatsu and NVIDIA's objective is also to make job sites more productive.

POV: The project, which builds upon jobsite safety measures Komatsu has been working on since 2015, has many moving pieces. This reflects both its complexity and the breadth of NVIDIA and Komatsu's plans. Construction is just the latest industry move for NVIDIA, which has pushed strongly into autonomous vehicles, healthcare and robotics.

NVIDIA's Jetson developer kit for embedded applications uses the same Kepler GPU core found in supercomputers. GPUs contain thousands of smaller cores that are well-suited for massive parallel processing of deep learning workloads.

NVIDIA has been working on AI for nearly 10 years and has developed a vast set of related libraries and frameworks. This year, it introduced NVIDIA GPU Cloud, a unified software stack that runs on a PC, a more powerful DGX system, or in the cloud.

A related focus has been on education; NVIDIA has said it will train 100,000 developers on deep learning this year. The deal with Komatsu represents the type of broad industry use case for AI that can start putting those developers to work sooner rather than later.

Microsoft pushes Azure cost envelope again: If there is a perennial trend in the IaaS and PaaS industry, it's a steady stream of cost cuts as rivals seek to remain competitive. This week, Microsoft introduced four new capabilities and pricing changes for Azure, with the most prominent being Azure Policy.

The service, now in public preview, allows customers to apply governance over their Azure resources:

Azure Policy allows you to turn on built-in policies or build your own custom policies to enable company-wide governance. For example, you can set your security policy for your production subscription once and apply that policy to multiple subscriptions.

Microsoft is also expanding support for its Cost Management service to Azure Virtual Machine Reserved Instances; cutting prices on Dv3 VMs in some regions, and making its Azure Archive low-cost storage generally available.

POV: None of the announcements on their own are earth-shattering, but they all hold some importance to Azure customers. Microsoft wants to grow Azure workloads and corresponding spend as much as possible, and price cuts are one way to entice more consumption. But a steady focus on factors such as cost optimization and good governance also drives more value for customers and demands similar efforts by competitors.


Digital Transformation Digest: Box Eyes Bigger Enterprise Consulting Role, How Chaos Can Prevent Cloud Outages, IBM's Enterprise Chatbot Exchange


Constellation Insights

Box believes content management is a crucial component of digital transformation projects, and to that end it has launched a new consulting service called Transform. The service provides customers who buy in with a dedicated and long-term consultant who will work with them on issues much broader than implementing Box software. Here's how Box describes Transform:

Integrating digital initiatives to accelerate organization-wide transformation: Box Transform provides enterprises with a strategic IT advisor to help them go beyond traditional file sharing practices, including implementing paperless strategies, digitizing time consuming processes like HR onboarding, building out custom applications with Box Platform, and retiring costly legacy infrastructure like network file shares.

Deploying agile methodologies: Projects through Box Transform are comprised of iterative sprints that include phases of planning, execution and retrospection, with the goal to increase efficiency and speed to results, resulting in processes being reimagined in months, not years.

Developing long term content strategies: Box’s team of product and domain experts help you develop strategic and actionable content strategy roadmaps to centralize content layers and support ROI goals.

POV: Pricing wasn't disclosed for Transform services, which build upon the consulting practice Box formed in 2013. That was a telling development in Box's push toward enterprise business, and Transform represents the next natural step on that journey. The question is whether Box can make a compelling enough case to customers that Transform can truly deliver value outside the company's core domain expertise. Issues around identity management and compliance with respect to content are two areas where Box consultants could do so, notes Constellation VP and principal analyst Alan Lepofsky.

Box, founded in 2005, does have the benefit of having helped thousands of customers move technology and processes to the cloud and has been steadily notching up more large enterprise deals, including with AstraZeneca and General Electric. Still, it's a crowded market for digital transformation consulting services and Box has its work cut out for it.

Gremlin targets cloud outages with chaos engineering: Outages are a fact of life in the cloud, and they're not only inconvenient but incredibly costly to both providers and customers who rely on their services. Now a startup just out of stealth called Gremlin wants to make outages a thing of the past—or at least dramatically infrequent—through a technique called chaos engineering.

It's led by Kolton Andrus, a former engineer at Amazon and Netflix, the latter of which is known for its use of chaos engineering. Netflix built a tool called Chaos Monkey that randomly takes parts of its production system offline to see how the rest of it responds, giving engineers insight into how to build more resiliency. Chaos Monkey is now part of a larger family of Netflix tools known as Simian Army.

Gremlin's tool acts in a similar manner; the company refers to it as an "engineering flu shot," wherein companies can "safely inject failure into systems in order to proactively identify and fix unknown faults."

The distributed systems found in cloud services are inherently problematic, Gremlin contends in its announcement:

Previously, software ran in a controlled, bare metal environment that introduced few variables, making it possible for engineering teams to identify potential risk and failures before they occurred. Within the last decade, systems have shifted to the cloud and become distributed with microservices and serverless methodologies, which introduced new dependencies on services outside of one’s control - creating complexity for any team of engineers to fully understand. This makes failure and outages inevitable.

Gremlin’s tool allows engineers to see how the system will behave in the face of failure, validates that defenses will work to prevent outages, minimizes the blast radius to allow for safe experimentation in production, and saves time and resources for engineering teams.

POV: Gremlin has raised a total of $8.75 million in seed and Series A funding, so it's certainly early days for the company in that respect. But it has already managed to sign up some high-profile early customers, including Expedia and Twilio. Gremlin has clearly taken inspiration from companies that understand massively scalable systems, and its tool may quickly find a place in many enterprises' operations.

IBM launches Bot Asset Exchange: Big Blue is hoping to draw more chatbot developers to its Watson Conversation service through a new portal called the Bot Asset Exchange. Here are the key details from IBM's announcement:

The Bot Asset Exchange leverages open source development to help conversational interface developers, including bot, voicebot, IoT, and virtual reality developers easily discover, quickly configure, and simply deploy bots. Users can find domain-specific conversation logic ready for them to use, leverage the creativity of a community of bot builders to discover innovative ways others have built bots, or create and contribute their own bot conversation logic.

Using the platform tools and participating in the community is incentivized by a point system that offers rewards, recognition and prizes such as IBM-branded merchandise, tickets to IBM events like Index – San Francisco, one-on-one meetings with the Digital Business Group’s product team, and social media mentions.

POV: The portal is already stocked with a healthy number of industry-specific bots, with use cases including IT support, travel booking, online banking, equity trading, personal finance and property management. The bots are dependent on Watson Conversation for the back-end conversation processing but there are no restrictions on how they can be designed or where they can be published. The site is in its early stages but is well-designed; the challenge will be raising awareness, getting a critical mass of developers engaged, and ensuring good governance over the bot content.


Digital Transformation Digest: Microsoft Quantum Appeal to Developers, Google Cloud's Ecosystem Traction, Internet Pioneers' Net Neutrality Hail Mary


Constellation Insights

Microsoft looks to seed quantum developer base: In September, Microsoft laid out its quantum computing vision, which included the unveiling of Q#, a new programming language. Now Microsoft is hoping to get its large base of Visual Studio developers working on quantum projects with the release of a counterpart toolkit.

Classical computers are binary, storing bits as either a one or a zero. Quantum systems take advantage of the behavior of subatomic particles, which can hold multiple states simultaneously in a phenomenon known as superposition, which stands to give quantum systems vast amounts of processing power.

The new toolkit is "deeply integrated" with Microsoft's Visual Studio IDE, comes with a set of libraries and tutorials, and also includes a quantum simulator that runs on a laptop. For bigger quantum projects, Microsoft has a simulator that runs on Azure. Any quantum applications written with the kit will be future-proofed in a sense, as they'll work on general-purpose quantum hardware now under development at Microsoft.

POV: Microsoft has been working on quantum computing for more than 10 years. Still, it is playing catchup somewhat to rivals such as IBM, which has already pledged to bring commercial quantum systems to market within a few years. Google says it is getting close to reaching "quantum supremacy," referring to a quantum system that can complete a task faster than the world's most powerful classical supercomputers.

But giving millions of developers access to quantum tooling and educational resources now, consumable from within the familiar confines of Visual Studio, is a long and smart play from Microsoft. There are fundamental conceptual differences in programming a quantum system that the vast majority of developers will need to wrap their heads around. The toolkit and language, which will surely be continuously refined, provide an abstraction layer that gives developers a head start.

Google Cloud Platform plays catchup on MSP ecosystem: In a move that both suggests increased interest from enterprises and acknowledges their needs, Google Cloud Platform is steadily increasing the number of managed service providers in its orbit.

In March, Google announced that Rackspace would be GCP's first MSP; today, that number has grown to 12. While not a jaw-dropping total, it's nonetheless progress for GCP's ecosystem and a sign that partners are becoming more willing to make their own substantial investments in building out GCP practices. Here's how Google describes the MSP program:

From hands-on support to the ongoing operation of customer workloads, these partners offer proactive services to both large and small cloud adopters. With their staff of dedicated technical experts, MSPs can tackle high-touch projects, covering engagement to migration and execution, to post-planning and ongoing optimization. Specifically, Google Cloud MSPs offer at minimum:

Consulting, assessment, implementation, monitoring and optimization services 

24x7x365 support with enterprise-grade SLAs 

L1, L2, L3 tiered support models 

Certified support engineers

One big addition to the ranks is Accenture. The rest is a mix of smaller companies: Cascadeo, Claranet, Cloudreach, DoIT International, Go Reply, Pythian, RightScale, SADA Systems, Sutherland and Taos.

POV: A robust MSP ecosystem is a proof point that a platform has matured and has market traction. As for GCP, 12 MSPs on board is certainly better than one, but to compete for more enterprise business Google will need to grow the ecosystem significantly, both from an expertise and geographic availability perspective. Google says more MSP partner announcements are coming soon.

Internet pioneers throw a Net Neutrality hail Mary: Later this week, the U.S. Federal Communications Commission's board is expected to overturn net neutrality regulations along party lines. The vote seems inevitable (although net neutrality proponents have a number of options to pursue next), but a group of 21 well-known technologists are asking members of Congress to step in at the eleventh hour.

The group, which includes Internet pioneer Vint Cerf and Apple co-founder Steve Wozniak, has written a letter to members of the House and Senate committees for technology-related matters, with the rather tart title, "Internet Pioneers and Leaders Tell the FCC: You Don’t Understand How the Internet Works." Here is an excerpt from the letter:

This proposed Order would repeal key network neutrality protections that prevent Internet access providers from blocking content, websites and applications, slowing or speeding up services or classes of service, and charging online services for access or fast lanes to Internet access providers’ customers.

It is important to understand that the FCC’s proposed Order is based on a flawed and factually inaccurate understanding of Internet technology. These flaws and inaccuracies were documented in detail in a 43-page-long joint comment signed by over 200 of the most prominent Internet pioneers and engineers and submitted to the FCC on July 17, 2017.

Despite this comment, the FCC did not correct its misunderstandings, but instead premised the proposed Order on the very technical flaws the comment explained. The technically-incorrect proposed Order dismantles 15 years of targeted oversight from both Republican and Democratic FCC chairs, who understood the threats that Internet access providers could pose to open markets on the Internet.

POV: Net neutrality bars ISPs from slowing legal Internet traffic based on payments or other considerations. The rules passed in 2015 determined that Internet service should be governed under Title II of the Communications Act, a law that dates to 1934. Opponents argue that the rules overreach and are anticompetitive.

As for the group's letter, it's highly unlikely to have any effect on the vote but does serve to bring attention to the issue. It's a sure bet that once the FCC votes, net neutrality proponents will file a lawsuit and it's conceivable that the rule change will be stayed by the court pending an outcome. This debate is far from over.


3 Ways Employees Will Benefit From Digital Transformation in 2018


From Baby Boomers to Gen Z, today’s workplace contains a mixture of generations. Although each has grown up with very different technological and cultural experiences, all face similar challenges at work, like information overload and having to stay up-to-date with technology that’s constantly changing. But all is not lost! The future of work is an exciting one which will leverage new tools, technologies and techniques to help people get work done.

At Constellation Research, three of the top areas we’re tracking around employees in the digital workplace are:
1. using technology to augment how teams accomplish work,
2. using data to guide actions and prioritize projects and
3. using technology to encourage more creativity among teams. 

Here are some of the things we’re observing.

Augmenting our ability to get more done

No longer a thing of the future, AI is already all around us in a big way—powering the voice input on our phones or the content in our news streams.

While conversations about AI often turn to science fiction, the reality for knowledge workers is that AI is already enhancing how they work, and will continue to do so. We’re already seeing email clients that recommend replies, calendars that automate meeting scheduling, and video services that transcribe content.

The way we create, consume and interact with content is also changing. Legacy whiteboards in meeting rooms are being replaced by large, intelligent and interactive screens that allow people to collaborate whether they're in the same room or across the world. Augmented and virtual reality are moving beyond science fiction (and gaming) to mainstream use cases such as education, product design and retail. While today’s headsets may be cumbersome, soon augmented reality will be everywhere, turning any clear surface into a potential display.

In addition, new input methods including voice dictation and gesture recognition (hands and face) are allowing us to interact with our devices in new ways. I actually wrote a lot of this post by speaking out loud to my phone. 

Using data to derive insights and guide actions

How many miles have you flown this year? How many steps have you taken today? Our personal lives are filled with measurements of our accomplishments and actions. Everything is quantified. But can you say the same for work?

Imagine if you could understand which social media posts are most effective or which meetings lead to more customer wins. We don’t always have the information we need at work to help us be more effective employees. In order to provide employees with meaningful information, data needs to be collected and patterns need to be discovered. But the fragmentation of work across social networks, file sharing, web conferencing and business applications creates quite a challenge.

The solution requires charting the interactions between people, content and devices. These collections are called “graphs” in computer science, and they reveal things like who people work with and what content they interact with. This information can be used to discover patterns, leading to insights about the way people work. In turn, this data can help employees better determine what work should be prioritized and what can be postponed.
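The idea is easy to sketch in code. The following Python snippet (illustrative data and names only, not any vendor's actual product) builds a tiny "work graph" from interaction events and surfaces one simple insight from it:

```python
from collections import Counter, defaultdict

# Hypothetical interaction events harvested from meetings, document
# edits and chat threads: each tuple is a pair of people who interacted.
interactions = [
    ("ana", "ben"), ("ana", "ben"), ("ana", "chen"),
    ("ben", "chen"), ("ana", "dee"),
]

# Build an undirected graph: each person maps to a Counter of how
# often they interact with each neighbor.
graph = defaultdict(Counter)
for a, b in interactions:
    graph[a][b] += 1
    graph[b][a] += 1

# One pattern the graph reveals: each person's most frequent collaborator.
top = {person: nbrs.most_common(1)[0][0] for person, nbrs in graph.items()}
print(top["ana"])  # prints "ben" -- ana's most frequent collaborator
```

Real systems build these graphs from millions of events and mine far richer patterns, but the structure is the same: people and content as nodes, interactions as weighted edges.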

Everyone becomes a storyteller

Think about the types of content people use at work: email, chat, documents, spreadsheets, presentations. Compare that to your personal life which is probably dominated by photos and videos. Wouldn’t it be nice if we had a similar level of fun and creativity at work? 

In the past, creating compelling graphics or videos was limited to professionals. Today, almost anyone with a camera phone can start creating highly visual content. Most camera applications provide lenses, filters, stickers and other digital tricks to enhance pictures. Some take gorgeous panoramic images and some even create 360° content. Conversations in group messaging applications now include emojis and animated gifs. Photo-sharing sites can automatically create collages from our best images.

These advances in storytelling are starting to show up in the workplace as well, enabling marketers to create more effective presentations, financial workers to create visually informative spreadsheets and salespeople to pitch products with more engaging content. The days of boring content at work are coming to an end.

Delivering in the digital workplace

We’ve witnessed incredible advancements in the tools we use at work over the past 20 years. However, these pale in comparison to what the next decade will be like. The future of work is going to empower employees regardless of skillset or seniority.

If you're ready to embrace the changes and become a digital employee, have your holographic assistant connect with mine so we can discuss this further! ...Or at least take advantage of some of the auto-scheduling features cropping up in your Calendar app.


Digital Transformation Digest: Kubernetes Can't Be Contained, Cisco Eyes Cloud Cost Management, Oracle and MongoDB Set for Earnings Showdown

Constellation Insights

Kubernetes can't be contained: Even if you're not a DevOps type, it's likely you've heard of Kubernetes, the open-source container orchestration platform that has become the industry standard in just a couple of years. Kubernetes originated at Google, which had used it and previous incarnations of the idea to run its own operations. That's undoubtedly one reason for Kubernetes' rapid start out of the gate when Google open-sourced it in 2014.

Containers are lightweight packages that include everything an application needs to execute—binaries, config files and so forth—so they can run the same across different environments and systems. Kubernetes handles the job of deploying and managing armies of containers, which offer benefits for developers, IT operations staff as well as end-users in the form of stability and performance.
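To make that concrete, here is a minimal Kubernetes Deployment manifest (the names and image are illustrative); it declares a desired state, and Kubernetes does the deploying and managing:

```yaml
# Illustrative Deployment: asks Kubernetes to keep three replicas
# of one container image running, replacing any that fail.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend        # hypothetical application name
spec:
  replicas: 3               # desired number of identical containers
  selector:
    matchLabels:
      app: web-frontend
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
      - name: web
        image: nginx:1.25   # any container image works here
        ports:
        - containerPort: 80
```

Applying this with `kubectl apply -f deployment.yaml` hands the "armies of containers" problem to Kubernetes, which schedules the replicas across machines and restarts any that die.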

It's overseen by the Cloud Native Computing Foundation, which the Linux Foundation formed in 2015. The CNCF now has 13 other open-source projects under its purview, many of which are focused on container-related functions. The group now has 160 members representing every top enterprise technology vendor; this week, Salesforce joined the CNCF in another prominent addition.

Another momentum data point: This week's KubeCon event in Austin, Texas drew more than 4,000 attendees. That's well over three times the number who showed up a year ago.

It's not difficult to read the tea leaves, says Constellation VP and principal analyst Holger Mueller.

"Enterprises want portability and containers give them that," he says. "The more support for a container there is, the more they want it. So the flywheel is working for Kubernetes." Last week at its re:Invent conference, Amazon Web Services announced its own distribution of Kubernetes, even though it offers a homegrown container orchestration system. There's only one way to interpret that, Mueller says: "The war is over. Kubernetes has won."

Cisco buys Cmpute.io for managing cloud spend: This week, Cisco quietly struck a deal to buy Cmpute.io, a Bangalore company focused on helping enterprises manage their cloud spending. Cisco's Rob Salvagno gave the rationale for the acquisition in a blog post:

Cmpute.io’s software solution analyzes cloud-deployed workloads and consumption patterns, and identifies cost-optimization strategies. The solution helps customers right-size their cloud workload instances, minimize overprovisioning, and avoid paying for resources that don’t deliver business value.

With a multicloud strategy, customers need to budget, buy, and consume differently. Cmpute.io’s technology added to existing Cisco solutions will help our customers optimize their cloud consumption to ensure optimal business value.

Cmpute.io's team and technology will be rolled into Cisco's CloudCenter group.
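The general shape of such cost-optimization tooling can be sketched in a few lines. The following Python snippet is a hypothetical heuristic, not Cmpute.io's actual algorithm: compare observed peak utilization against an instance-size ladder and recommend the smallest size that still leaves headroom.

```python
# Hypothetical right-sizing sketch -- not Cmpute.io's actual algorithm.
# The size ladder and headroom threshold are illustrative assumptions.
SIZES = [("small", 2), ("medium", 4), ("large", 8), ("xlarge", 16)]  # name, vCPUs
HEADROOM = 0.80  # keep peak utilization below 80% of the recommended size

def right_size(current_vcpus: int, peak_utilization: float) -> str:
    """Return the smallest size whose capacity covers observed peak demand."""
    peak_vcpus = current_vcpus * peak_utilization  # vCPUs actually used at peak
    for name, vcpus in SIZES:
        if peak_vcpus <= vcpus * HEADROOM:
            return name
    return SIZES[-1][0]  # demand exceeds the ladder; keep the largest size

# A 16-vCPU instance peaking at 15% utilization (2.4 vCPUs of real demand)
# is overprovisioned; a 4-vCPU "medium" covers it with headroom to spare.
print(right_size(16, 0.15))  # prints "medium"
```

Production tools layer on memory, storage and network dimensions, per-cloud pricing, and workload seasonality, but the core idea is the same: measure consumption, then stop paying for capacity that delivers no value.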

POV: Terms of the deal weren't disclosed, so the price tag was likely on the smaller side. The more important thing to note is how Cisco's move ties into a broader cloud industry trend, where the market has become somewhat binary. Amazon Web Services and Microsoft Azure hold the lead in IaaS, with Google, IBM and Oracle trying to grow their market share. Cisco, HPE and other players that attempted to launch a public IaaS but were compelled to fold under too-stiff competition are now trying to make money by helping customers manage multi-cloud spend, which is arguably a roundabout way to compete with the IaaS leaders.

Oracle and MongoDB's dueling earnings: This is the time of year when enterprise tech industry news slows down for a bit, but there are still some notable items to watch. One is Oracle's Q2 earnings report, which is due out Dec. 14.

The quarter is typically one of Oracle's slower ones, yet the numbers, when they're released, could be telling: Q2 is the first quarter in which customers could take advantage of new programs geared toward convincing them to adopt Oracle's IaaS and PaaS.

The programs include a BYOL (bring your own license) option for Oracle's database and middleware, wherein customers can transfer their existing on-premises licenses to Oracle's IaaS. Those who move database licenses there can run them "at a fraction of the old PaaS price," Oracle said at the time.

Oracle also rolled out universal credits for PaaS and IaaS, which it described as follows:

Customers have one simple contract that provides unlimited access to all current and future Oracle PaaS and IaaS services, spanning Oracle Cloud and Oracle Cloud at Customer. Customers gain on-demand access to all services plus the benefit of the lower cost of pre-paid services. Additionally, they have the flexibility to upgrade, expand or move services across datacenters based on their requirements.

Go here for Constellation VP and principal analyst Holger Mueller's deep-dive on Oracle's new programs. Other metrics to watch in Oracle's Q2 include trend shifts in the sale of new on-premises licenses, how well Oracle SaaS is selling across different categories, and mentions of "all-in" customer wins, particularly for Oracle cloud services.

Oracle is a long-established giant with a market capitalization of more than $200 billion. As a counterpoint, NoSQL database vendor MongoDB will issue its first earnings report since going public in October. The tech unicorn has been posting heavy losses, as many hot startups do, but eyes will be closely watching MongoDB's numbers, particularly any guidance it provides on future quarters. MongoDB's leadership has long positioned the company as an alternative to Oracle's database; to that end, some of the closest observers of its numbers may be watching from a certain set of towers in Redwood Shores.


Digital Transformation Digest: EU Says Luxury Brands Can Block Amazon Sales, Inside Dell's Q3


EU Court rules that luxury brands can block Amazon sales: A landmark ruling has come down in the European Union Court of Justice, which said that luxury product makers can block their wares from being sold by third parties on marketplace sites such as Amazon and eBay.

"The quality of luxury goods is not simply the result of their material characteristics, but also of the allure and prestigious image which bestows on them an aura of luxury," the court said. "That aura is an essential aspect of those goods in that it thus enables consumers to distinguish them from other similar goods. Therefore, any impairment to that aura of luxury is likely to affect the actual quality of those goods."

German cosmetics and fashion manufacturer Coty, which owns brands such as Calvin Klein and CoverGirl, brought the action to the ECJ after an authorized distributor began selling its wares on Amazon. Coty and other luxury brands, such as LVMH, are loath to be associated with the likes of Amazon, with its emphasis on low prices and mass availability. To that end, LVMH has created 24 Sèvres, a glossy e-commerce site where shoppers can select products from more than 150 high-end designers and get them via 2-day shipping.

POV: "This is all about brand protection and a part of preventing counterfeit goods," says Constellation VP and principal analyst Cindy Zhou. Luxury brands have taken similar issues to court in the United States as well, such as when Tiffany sued warehouse club Costco over rings labeled "Tiffany" in stores. Costco argued that it was not selling counterfeit rings but rather using the word as a general description of the type of setting used in the rings. A court ruled against Costco, ordering it to pay Tiffany more than $19 million.

"Luxury brands are all about differentiation and curating their upscale image," Zhou says. "As Amazon and other sites have a third-party seller network, it is difficult for the brands to control their distribution."

The ECJ's ruling offers the LVMHs of the world protection from brand dilution in the EU's 28 countries, but it won't have any effect on the massive consumer goods market in the U.S. Nor is it any guarantee across the EU, as the notion of a luxury good can be a bit fluid. For example, Coty brands such as CoverGirl and Clairol are widely available at reasonable cost in drug and department stores everywhere, hardly confined to tony retail boutiques. But overall, the ruling provides interesting fodder in a time of rapid change for cross-border trade, e-commerce, marketing and customer engagement.

Dell Technologies Q3 results—the highlights: This week Dell Technologies reported its third-quarter results, posting revenue of $19.6 billion but a net loss of $941 million. The full numbers are available here, but in this post we'll mostly focus on comments executives made during a conference call, and how they reflect on the broader market.

One major aspect of the historic Dell-EMC merger that created Dell Technologies was the potential for supply chain synergies, which would not only lower the vendor's costs but also improve product availability and service. Here's how Jeff Clarke, VP of products and operations, described the state of play:

We've seen tremendous efficiency in the supply chain particularly through cycle time improvement, lead time improvement to our customer base and managing our working capital initiatives through our facilities most notably in the form of inventory. So I think we are well along the path of managing our other cost outside the commodity and the supply chain on the product side.

Memory prices, which affect so many tech products, companies and end-users, have been rising and with little end in sight, but Dell's supply chain footprint is helping it compete:

You've seen what we've gone through which is the longest inflationary period that I can recall in memory in a decade plus. And that's a byproduct of two things; one, there hasn’t been any new DRAM capacity been brought online and then the consumption of DRAM is at the highest rates we've seen.

We have DRAM, so as much as I have said DRAM is going up in cost we have it. And we're getting a value for having it. And whether that's in our PC business, on our server business I think that is something customers are coming to us for knowing we have supply and they're obviously paying for it.

Dell introduced flexible consumption models across its product line earlier this year. They allow companies to scale usage up or down as needed while avoiding major up-front costs. The number of flexible consumption deals in Q3 dropped compared to Q2, which could suggest that initial customer enthusiasm for the model is waning. That's not necessarily the case, Dell CFO Tom Sweet said:

I think that things are going to vary. They are complex, they're multi-year, they take time to negotiate with and typically the larger customers, the global customers that are negotiating these types of arrangements and so I think we are going to see some variability. I think all things being equal it's entirely conceivable that we'll see an uptick in a flexible consumption models in Q4 just given natural end of year sort of activity both from a customer and from a Dell Technologies perspective.

Dell has a massive array of products, some of which will likely be consolidated over time. Clarke acknowledged a need to lower complexity for customers:

[Y]ou know, complexity doesn't mean less products, complexity can mean the number of offers per product, how many countries we offer our product in. Interestingly we treat the 180th country the same way we treat the largest country in the world. It's not clear to me the 180th country in the world needs all of the entire storage breadth of our portfolio and we can make that less complex.

The next big news out of Dell will come in about a month at the Consumer Electronics Show in Las Vegas.
