IBM, ABB Tie Up for Industrial AI

Constellation Insights

IBM is teaming up with power systems and robotics giant ABB in a bid to apply AI (artificial intelligence) to the shop floor, smart grids and other industrial scenarios. The partnership will combine ABB's Ability platform with IBM's Watson IoT (Internet of Things) technology:

The solutions enable current connected systems that simply gather data to become cognitive industrial machines that use data to understand, sense, reason and take actions to support industrial workers.

ABB CEO Ulrich Spiesshofer said: “This powerful combination marks truly the next level of industrial technology, moving beyond current connected systems that simply gather data, to industrial operations and machines that use data to sense, analyze, optimize and take actions that drive greater uptime, speed and yield for industrial customers.”

“This important collaboration with ABB will take Watson even deeper into industrial applications — from manufacturing, to utilities, to transportation and more,” said Ginni Rometty, IBM chairman, president and CEO.

Analysis: ABB-IBM Partnership speaks to broader trend

The pairing of ABB and IBM's platforms represents a continued shift away from IT and toward operational technology, says Constellation Research VP and principal analyst Andy Mulholland. "IoT instrumentation creating massive new streams of data for AI to process is fast becoming the biggest driving force for the creation of digital business," he says. "The puzzlement of the IT community about the role and use of IBM Watson in IT should be rapidly being replaced by the desire to gain more understanding of this changing focus for the deployment of technology in enterprises."

First up, IBM and ABB will focus on the factory floor and smart grids. 

On the first, real-time production images collected by ABB systems will be analyzed through Watson IoT for Manufacturing, according to a statement. Watson will scan the images for defects, eliminating the previous manual inspection process. The result will be higher throughput for production lines along with better quality, the companies say.
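
The companies haven't detailed Watson's inspection pipeline, but the underlying idea of automated visual inspection can be sketched in a few lines: compare a captured image against a known-good reference and flag parts whose pixel deviations exceed a tolerance. Everything below (the grids, thresholds and function names) is hypothetical, not IBM's or ABB's API.

```python
def is_defective(image, reference, pixel_tol=10, max_bad_pixels=1):
    """Count pixels deviating from the golden reference by more than
    pixel_tol; flag the part if too many do."""
    bad = sum(
        1
        for img_row, ref_row in zip(image, reference)
        for px, ref in zip(img_row, ref_row)
        if abs(px - ref) > pixel_tol
    )
    return bad > max_bad_pixels

# Tiny 2x2 grayscale "images"; a real system would work on camera frames.
reference = [[100, 100], [100, 100]]
good_part = [[103, 98], [101, 100]]    # within tolerance everywhere
scratched = [[100, 100], [30, 35]]     # two pixels far off the reference
print(is_defective(good_part, reference))  # False
print(is_defective(scratched, reference))  # True
```

A production system would of course use a trained vision model rather than pixel differencing, but the accept/reject decision structure is the same.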

Meanwhile, Watson will also be used to predict demand for electric power based on historical and weather data, allowing utilities to fine-tune the upkeep and operation of smart grids. 
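
Neither company has described the forecasting model itself, but a toy version of demand prediction from historical and weather data illustrates the idea: blend the historical average load for each hour of day with a temperature adjustment fitted by one-dimensional least squares. All names and numbers below are illustrative, not Watson's method.

```python
from collections import defaultdict

def fit_demand_model(records):
    """records: list of (hour_of_day, temperature_c, load_mw)."""
    by_hour = defaultdict(list)
    for hour, _, load in records:
        by_hour[hour].append(load)
    hourly_avg = {h: sum(v) / len(v) for h, v in by_hour.items()}

    # Fit load residual = beta * (temperature deviation) by least squares.
    mean_temp = sum(t for _, t, _ in records) / len(records)
    num = den = 0.0
    for hour, temp, load in records:
        dt = temp - mean_temp
        num += dt * (load - hourly_avg[hour])
        den += dt * dt
    beta = num / den if den else 0.0
    return hourly_avg, mean_temp, beta

def predict(model, hour, temperature_c):
    hourly_avg, mean_temp, beta = model
    return hourly_avg[hour] + beta * (temperature_c - mean_temp)

history = [
    (9, 20.0, 500.0), (9, 30.0, 560.0),    # hotter days -> more A/C load
    (18, 20.0, 650.0), (18, 30.0, 710.0),  # evening peak
]
model = fit_demand_model(history)
print(predict(model, 9, 25.0))   # 530.0
```

Real utility forecasts fold in many more signals (day of week, holidays, forecast error bands), but the historical-baseline-plus-weather-adjustment structure is the common starting point.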

ABB and IBM made the announcement in conjunction with Hannover Messe, the large industrial trade fair held annually in Germany.

24/7 Access to Constellation Insights
Subscribe today for unrestricted access to expert analyst views on breaking news.


Infor Acquires Birst for Cloud Business Intelligence: What It Means

Constellation Insights

Infor has spent billions in recent years creating a next-generation cloud ERP suite, with most of the results coming from organic development. To strengthen its hand in analytics, however, it is looking outside, with plans to acquire Birst, a fairly small but mature player in cloud-based BI (business intelligence). Terms of the deal weren't disclosed.

In a statement, Infor CEO Charles Phillips provided the rationale for buying Birst:

"This is much of the same team that built Siebel Systems BI, which is now Oracle's BI stack. They put the band back together, pivoted to the cloud and built a modern BI platform with an understanding of future needs, experience with a wide variety of use cases, and commitment to the cloud."

Birst was founded in 2004 and has raised about $139 million in venture capital to date. Customers include American Express, Kellogg's, Schneider Electric and Citrix. Its capabilities include an ETL (extract, transform and load) engine, reports and dashboards, visualization, smart discovery and data blending.
 
While small, Birst has attracted customers with tactics such as concurrent-user pricing, a model that can save a lot of money, compared with named-user pricing, for companies that want to give many users occasional access to BI.
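
A back-of-the-envelope comparison (with entirely hypothetical prices, since Birst's actual rates aren't given here) shows why concurrent-user pricing can be attractive when access is occasional:

```python
def named_user_cost(total_users, price_per_named_user):
    # Every potential user needs a license, whether or not they log in.
    return total_users * price_per_named_user

def concurrent_cost(peak_concurrent_users, price_per_concurrent_seat):
    # Only pay for the seats in use at peak, shared across all users.
    return peak_concurrent_users * price_per_concurrent_seat

# 500 occasional users, but rarely more than 40 logged in at once.
named = named_user_cost(500, 50)        # 25000 per month
concurrent = concurrent_cost(40, 200)   # 8000 per month
print(named, concurrent)
```

Even at a much higher per-seat price, the concurrent model wins whenever the ratio of total users to peak simultaneous users is large.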


Meanwhile, Infor now has a "critical mass of cloud subscribers and petabytes of mission-critical data in the cloud," making Birst an ideal fit for deriving value from it, according to a statement.

It should be noted that Infor already has a BI offering, which includes an in-memory calculation engine, integration with Office and an application studio geared to the abilities of business users. Birst's feature set is more robust and the company has what Infor needs to meet customer demands. Infor and Birst also have complementary domain expertise:

Customers running multiple ERP systems have asked Infor to build the enterprise analytic layer across the reality of a federated environment. ERP application companies rarely have the expertise or interest to build this aggregation layer. BI companies provide the analytics platform but don't understand industry processes and potential insights.

"This is a natural move as enterprise applications customers move into the cloud," says Constellation Research VP and principal analyst Doug Henschen. SAP and Oracle have had cloud BI offerings for a number of years; Birst gives Infor a mature, customer-ready cloud business intelligence platform, he adds.

In turn, Infor's courtship represents a safe exit for Birst, which entered the cloud BI arena well before its time, Henschen notes. Infor's acquisition comes as the competition is getting formidable, with major vendors including AWS, Microsoft, Google, Oracle, SAP and IBM all pursuing cloud-based BI and analytic capabilities.

Meanwhile, "Tableau, Qlik and other heretofore on-premises-focused BI and analytics vendors have also been moving into the cloud, and it has all added up to increasing competition for Birst," Henschen says.

"Infor will have to integrate with Birst and create a migration path from its existing Infor XI BI capabilities, but it gives them a mature cloud platform and a better shot at retaining customers that might otherwise have chosen third-party, cloud-based BI and analytics options," Henschen says.


 


Microsoft to Launch IoT As A SaaS Service

Constellation Insights

While Microsoft has offered an IoT PaaS (platform as a service) for some time through the Azure cloud, it's betting that some customers are willing to trade customization for faster time-to-market. Microsoft IoT Central is a fully managed SaaS (software as a service) that "enables powerful IoT scenarios without requiring cloud solution expertise," as Redmond says in its announcement:

Built on the Azure cloud, Microsoft IoT Central simplifies the development process and makes it easy and fast for customers to get started, making digital transformation more accessible to everyone.

Microsoft IoT Central will be available along with our existing platform-as-a-service (PaaS) solution, Azure IoT Suite, which enables deep customization and full control. This new IoT SaaS offering has the potential to dramatically increase the speed at which manufacturers can innovate and bring new products to market.

Further details on IoT Central weren't disclosed, but the service will launch over the next few months. While the initial version will apparently focus on manufacturing scenarios, expect packages for other cases, such as logistics and retail, to emerge over time.

Microsoft made a number of other IoT announcements, including Connected Factory, a specialized version of Azure IoT Suite.

Microsoft Azure IoT Suite Connected Factory ... helps accelerate a customer’s journey to Industrie 4.0 and makes it easy to connect on-premises OPC UA and OPC Classic devices to the Microsoft cloud and get insights to help drive operational efficiencies. In addition, it enables customers to securely browse and configure factory devices from the cloud.

Meanwhile, Microsoft is introducing a new service called Azure Time Series Insights. It's supposed to automate the process of analyzing event data from IoT endpoints, which can easily consist of billions of signals:

It helps organizations discover hidden trends, spot anomalies, and conduct root-cause analysis in near real time, all without writing a single line of code through its simple and intuitive user experience. In addition, it provides rich APIs to enable companies to integrate its powerful capabilities into their existing workflows and applications.
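
Microsoft hasn't published how Time Series Insights flags anomalies; a generic rolling z-score check, of the sort such a service might apply to an IoT signal, looks roughly like the sketch below. The function and parameter names are our own, not Azure's API.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(values, window=10, threshold=3.0):
    """Yield (index, value) where a value deviates by more than
    threshold standard deviations from the preceding window."""
    recent = deque(maxlen=window)
    for i, v in enumerate(values):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(v - mu) / sigma > threshold:
                yield i, v
        recent.append(v)

signal = [10.0, 10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.1,
          25.0,  # a spike root-cause analysis would investigate
          10.0]
print(list(detect_anomalies(signal)))  # [(10, 25.0)]
```

At billions of events the real service distributes this work, but the per-signal logic, comparing each new reading against its recent history, is the essence of "spot anomalies in near real time."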

Yet another IoT announcement concerns security. Azure IoT will now support the hardware security standards Device Identity Composition Engine (DICE) and Hardware Security Module (HSM), according to a statement. Microsoft will discuss all of its IoT announcements during the Hannover Messe industrial conference in Germany this week.

Redmond's IoT strategy bears watching amid a crowded and highly competitive market. 

"Microsoft is really turning up its focus on IoT over the last few months, but it's been taking an interestingly different direction to most of the other technology vendors," says Constellation Research VP and principal analyst Andy Mulholland. "Their recent focus has been more toward adding intelligence to outcomes, and ignoring the task of the so called final-mile connectivity with the management of the sensors and devices."

"In practice there are no outcomes without good quality inputs from the IoT estate, added to which controlling and managing the data inputs was seen a year or two ago as a shrewd move to control marketplaces," he adds. "In practice, the diversity of sensors, devices and networks that are required to be integrated made this a difficult area for big technology vendors to productize. However perhaps Microsoft has the key to winning marketplace control by using its long experience in working with developers."



The Linux Foundation Hones In On IoT with EdgeX Foundry

Constellation Insights

Some 50 companies have joined a new open source project at the Linux Foundation focused on IoT (Internet of Things) edge computing. The effort could foster interoperability and faster maturation of enterprise and industrial IoT. Here are the key details from the Foundation's announcement:

IoT is delivering significant business value by improving efficiencies and increasing revenue through automation and analytics, but widespread fragmentation and the lack of a common IoT solution framework are hindering broad adoption and stalling market growth. ... EdgeX solves this by making it easy to quickly create IoT edge solutions that have the flexibility to adapt to changing business needs.

Designed to run on any hardware or operating system and with any combination of application environments, EdgeX can quickly and easily deliver interoperability between connected devices, applications, and services, across a wide range of use cases. Interoperability between community-developed software will be maintained through a certification program.

A key player in EdgeX is Dell. In October, Dell revealed Project FUSE, an IoT stack developed with dozens of partners that it intended to open-source. Those plans have apparently come to fruition through EdgeX Foundry. Dell is contributing the FUSE source code to the Linux Foundation project under the Apache 2.0 license, which is considered one of the most permissive open source licenses of its kind:

The contribution consists of more than a dozen microservices and over 125,000 lines of code and was architected with feedback from hundreds of technology providers and end users to facilitate interoperability between existing connectivity standards and commercial value-add such as edge analytics, security, system management and services.

EdgeX Foundry members include AMD, ForgeRock, VMware and dozens of other companies that play at different levels of the IoT hardware and software spectrum. The project is founded on the belief that edge computing—wherein sensors and devices send data to distributed gateways rather than a centralized data center, thereby speeding performance and mitigating network congestion—will drive the future of IoT.
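
The announcement doesn't specify EdgeX's microservice APIs, but the edge-computing pattern it builds on is easy to sketch: a gateway aggregates raw sensor readings locally and forwards only compact summaries upstream, cutting network traffic while keeping extremes visible for alerting. The class below is illustrative, not EdgeX code.

```python
class EdgeGateway:
    def __init__(self, batch_size=5):
        self.batch_size = batch_size
        self.buffer = []   # raw readings held at the edge
        self.uplink = []   # stands in for a cloud/data-center connection

    def ingest(self, sensor_id, value):
        self.buffer.append((sensor_id, value))
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        values = [v for _, v in self.buffer]
        # One summary record replaces batch_size raw readings upstream.
        self.uplink.append({
            "count": len(values),
            "min": min(values),
            "max": max(values),
            "mean": sum(values) / len(values),
        })
        self.buffer.clear()

gw = EdgeGateway(batch_size=5)
for reading in [21.0, 21.5, 22.0, 80.0, 21.2]:
    gw.ingest("temp-sensor-1", reading)
print(gw.uplink[0]["max"])  # 80.0 -- the outlier still reaches the cloud
```

The aggregation ratio (here 5:1) is the knob: it trades upstream bandwidth against how much detail the data center sees, which is exactly the interoperability surface a framework like EdgeX aims to standardize.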

IoT's potential depends on getting the right sources of data connected in the right way at the sensor and device level, says Constellation Research VP and principal analyst Andy Mulholland. "As the numbers of devices and sensors have started to proliferate so has the realisation that there are new complexities to master at the edge of the IoT network," he adds. "The sheer variation of activities means this can't be a market where one or two products will emerge as the winners. Instead adopting standards and open source at the edge is to every enterprise and technology vendors' benefit. It's good to see the Linux Foundation stepping up to the challenge with EdgeX Foundry, and to see that significant support is already in place to make this move work."



Digital Business Distributed Business and Technology Models Part 4: Augmented Intelligence and Machine Learning

Business has constantly pushed for better ‘intelligence’ to improve decision-making in the continued drive towards increasing competitiveness. The impact of Cloud Services in reducing cost and improving availability of capacity, together with the rise of Big Data from Web and Social activities, has taken analytics and Business Intelligence to new highs. But are these the capabilities to support Digital Business, with its massively increased data loads, constant new perspectives, and collapsed time frames required to achieve ‘immediate’ dynamic optimizations? Part 4 of this series explores the new requirement for ‘Intelligence’ in a Digital Business as defined in part 1 of this series.

At the heart of Digital Business is using IoT sensing technology to convert physical objects and events into digital representations, then applying Augmented Intelligence and Machine Learning to create the business benefit. The result is volumes and types of data flowing into the Enterprise, together with constantly changing market dynamics, that simply defy traditional BI reporting methods. The challenge is both to analyze these data flows and to make optimized decisions within the limited time frames required for optimized responses.

AI and IoT are the two new core technologies at the heart of the CAAST technology model of Clouds, Apps, AI, Services and Things that, when used in new integrated frameworks, creates the capabilities of Digital Business.

In the first blog in this series, part 1, the business model architecture of a Digital Enterprise is outlined; it is recommended reading before continuing here. A notable factor of the Business activities of a Digital Enterprise is a constant series of dynamic and innovative adjustments in response to the conditions of its Digital Ecosystem of partners. This market-led optimization is a startling reversal of current, traditional Business models, which are built on optimizing Enterprise assets through stability in the operating model.

This reversal in the Business model driver unsurprisingly also affects the current, traditional approach to implementing a Business-driven requirement. IT Enterprise architecture methods start with the business requirement, which is assumed to be an Enterprise Application, and proceed down the technology stack. At each layer the selection of a technology, or product, is made based on the requirements of the Enterprise Application. As many Enterprise Applications were written for particular operating systems and the like, the result is a custom implementation.

Increased awareness of standards, and a trend towards standardization, including the benefits of hypervisors with Cloud technology, has improved commonality in recent years. Fortunately the complications and cost of maintaining custom, individual technology deployments under Enterprise Applications have been low in the past, as the Enterprise Business model relied on stability and expected a Business application to have a life of many years.

Digital Business models are built on dynamic responsiveness to markets and opportunities, and are delivered through quickly built Apps, not monolithic applications; deployment accordingly relies on being able to deploy over a common set of enabling technologies. These capabilities span both internal and external infrastructure and are described in more detail in Parts 2, 3a and 3b, titled Dynamic Infrastructure, Distributed Services Technology Management and Distributed Services Business Management respectively, which relate to the two common, shared infrastructure layers.

It is in the final two layers, covered here in this part on Augmented Intelligence and Machine Learning and in the concluding part on Business Apps and Services, that a Digital Business differentiates itself competitively. Aligned to the fast-moving, lightweight nature of rapidly deployed Apps is an Enterprise organizational model that reflects the same shift away from centralized monolithic processes and departments. The dynamic, innovative Digital Enterprise has become a fast-moving, decentralized structure able to take decisions and act swiftly.

The structured, centralized Enterprise IT data environment, which uses historical analysis and reporting to deliver Business Intelligence, or BI, is not present in the activities and environment of Digital Business.

Enterprise IT incorporating BI reporting remains vitally important for those processes that support key commercial functions, including compliance, where stability and ongoing comparisons remain key. However where Digital Business models are implemented, the transformation in both Business and Technology models not surprisingly calls for an equal transformation in the approach to ‘Intelligence’.

The following diagram illustrates the two layers of Intelligence and Business Apps and Services, with the diversity of the Apps and Services layer producing a constantly changing demand for intelligent responses from the Intelligence layer below. Note: the term App is used to indicate a deployed business capability whose functionality is fully controlled by a single Enterprise. The term Service is used to indicate an orchestration of functional elements from different Enterprises, either created dynamically in response to an event or built as a Business offer to the market, and therefore not totally controlled by a single Enterprise.

One of the key traits that defines Digital Business is the ability to ‘Read and React’ to the data flow arising from the events and circumstances instrumented by IoT. The data volumes to be analyzed, and the time frames in which to do this, are one part of the challenge. The other is to use human experience and machine learning to automate the ‘react’ decision-making based on the analysis.

AI is used as a convenient term to cover the huge range of technologies and methods that are being developed to address these challenges. In the immediate future there is common agreement among the major technology vendors' approaches that Augmented Intelligence is the key. The goal is to augment human intelligence to work in this challenging environment, rather than trying to replace human experience with entirely computer-generated responses. Information Week published an excellent article defining this topic in response to the US Government's concerns.

Given time, and the right information, an experienced human mind can successfully work out a reasonable solution to the requirements for ‘read and react’ responses, but not at the frequency and volumes that Digital Business requires. It is comparable with the drivers that created Industrial Automation, where ever-increasing production volumes pushed speeds beyond human operators' capabilities. Some, perhaps many, repetitive office-based roles face similar pressures, and though improvements can be made to human interactions with computers, ultimately the answer is increasingly likely to be Office Automation.

Even focusing on Augmented Intelligence and Machine Learning technology introduces a big and complex subject that is not the topic of this blog. Here the focus remains on exploring the use of the technologies of CAAST (Clouds, Apps, AI, Services and Things) in building solutions for Digital Business. To learn more about AI and Augmented Intelligence, the following links provide access to good primers on the topic, starting with The Verge's explanation of common terms, followed by Wired's more in-depth explanation of Machine Learning, including the following useful paragraph:

AI is a branch of computer science attempting to build machines capable of intelligent behavior, while Stanford University defines machine learning as “the science of getting computers to act without being explicitly programmed”. You need AI researchers to build the smart machines, but you need machine-learning experts to make them truly intelligent. (Quote from Wired; see above link.)

Information Week published ‘Why AI should stand for Augmented Intelligence’, drawing on an interview with IBM about its approach. PC Mag considered how Salesforce approaches AI, noting how, and where, it fits in relation to the Apps above and the data-flow handling and infrastructure below. The importance and role of Machine Learning comes up in these articles, with ZDNet discussing SAP's views on Machine Learning as a closing view for the technology background reading.

The interviews all point to the major technology vendors sharing, and working on, similar definitions and capabilities for the addition of intelligence and automated processing. But it will be quite a while before enough maturity occurs to allow interworking. As Augmented Intelligence and Machine Learning call for in-depth experience, knowledge and focus on a particular element, an Enterprise will be faced with using different vendors for different business deployments.

The Digital Enterprise is made up of a series of high-business-value activity ‘pools’ operating in a semi-autonomous manner to make rapid, innovative competitive moves in response to changing events and operational activities. This is a complete reversal of current Business Models, which rely on centralized conformity around a set of optimized processes to reduce costs.

See the diagram below, which appears in Part 1, Understanding the Digital Business Operating Model, and illustrates this point. The independent enterprise activity ‘pools’ are shown, with the red lines illustrating orchestrations between the activity pools in response to an external event in a customer building. This same reversal of the Business Model applies equally to the Technology Model, shifting the architecture from close-coupled, stateful alignment with fixed enterprise processes to loose-coupled, stateless orchestrations in response to the events of Digital Business.

Augmented Intelligence starts with deployments to improve the operations of a particular activity; this allows a technology vendor to be selected based either on its specialist knowledge of the activity or on maximizing the impact of Augmented Intelligence/Machine Learning on the existing technology installation. In time, and with increasing maturity, a second phase will move Augmented Intelligence/Machine Learning to Enterprise-level optimization of the interconnections between the activity pools.

The ability to start business-beneficial deployments around specific activities, rather than wait for full enterprise-wide maturity, should do much to reduce risks and difficulties for early adopters.

A commercial decision on where, and why, to initially deploy Augmented Intelligence/Machine Learning should consider:
  1. The importance of the activity pool and the scope for its operational improvement;
  2. Whether IoT sensing can provide the necessary quality of digital data to operate Augmented Intelligence/Machine Learning;
  3. Whether the output can be channeled to ‘react’ Apps that can deliver the Business benefit as a ‘real time’ optimized operational improvement.

Summary: The Digital Enterprise is, by definition, a business that has created a full digital representation of its principal Business assets and activities in order to use computational facilities to optimize business operations. Though it may seem initially that IoT sensing is at the core, Augmented Intelligence/Machine Learning represents the other half of the transformation.

 

Links to Information on Augmented Intelligence/ Machine Learning

The following is not intended to be an exhaustive listing and is presented alphabetically. The list is provided for informative purposes; selection is based on Client and Press interest. Inclusion in, or absence from, the listing does not imply any significance.

Amazon AWS - https://aws.amazon.com/machine-learning/?tag=vglnk-c312-20

Google - https://research.google.com/pubs/MachineIntelligence.html

IBM Watson – https://www.ibm.com/watson/

Microsoft - https://news.microsoft.com/features/microsofts-ai-vision-rooted-in-research-conversations/#6lVIzKOAeOhXwa57.97

Salesforce - https://www.salesforce.com/uk/products/einstein/overview/

SAP - https://www.sap.com/uk/solution/machine-learning.html

 

Summary: Background to this series

This is the fourth part in a series on Digital Business and the Technology required to support the ability of an Enterprise to do Digital Business. An explanation is provided for adopting the simple classification of technology requirements shown in the diagram below, rather than any form of conventional detailed Architecture, together with a fuller explanation of the Business requirements.

 

 

 

 

Part One - Digital Business Distributed Business and Technology Models;

Understanding the Business Operating Model

Part Two - Digital Business Distributed Business and Technology Models;

The Dynamic Infrastructure

Part Three – Digital Business Distributed Business and Technology Models

  1. Distributed Services Technology Management
  2. Distributed Services Commercial Management

'Supercard' Payments Startup Plastc Goes Bankrupt, But So Was Its Core Concept

Constellation Insights
 

This week, payment card startup Plastc abruptly went belly-up, telling the more than 80,000 customers who had preordered its devices that it was filing for bankruptcy and would not ship a single item.

“We are disappointed and emotionally distraught, and while we know this is extremely disappointing for you, we want our backers to know that we did everything we could to make Plastc Card a reality,” the company told disappointed would-be customers, who apparently won't be reimbursed.  

The programmable card would have allowed users to store up to 20 cards on the unit itself, with access to an unlimited number of cards through Plastc's app.  

Plastc says it had landed $3.5 million in venture capital as recently as February, but the investors subsequently withdrew the funding. That money would have been enough to ship working cards. A second investor came forward with a $6.75 million offer but also withdrew at the eleventh hour, according to Plastc: "The round was a signature away from closing and we were extremely caught off guard when they notified us yesterday they were backing out." 

Plastc's fate isn't quite as dire as that which befell Coin, another payment card startup. Last year, Fitbit acquired Coin for its payments platform technology, but production and sale of its devices were immediately halted. Fitbit is expected to incorporate payment capabilities into its devices as early as this year.

Stratos, yet another card startup, went out of business abruptly in 2015 after just six months, but its assets were recently acquired by the Danish company CardLab, which is planning to revive the product. 

Plastc and other failed “super cards” supported magstripe and claimed to be upgradeable to chip-based card technology. Given that mag stripe is being phased out, this was a crucial promise. There was never any point in trying to squeeze any life out of obsolete mag stripes, says Constellation Research VP and principal analyst Steve Wilson.

"You had Coin and Plastc, and also Loop Pay, which was acquired by Samsung, that basically simulates a mag stripe card electromagnetically by blasting EM waves at a POS machine to trick it into thinking a real card has been run over the read head," Wilson says. "All these gizmos sought to keep the old card technology alive, while the US payments industry was slowly catching up with the rest of the world going to chip. Why would you try to keep mag stripe alive, when it was actually the cause of so much fraud?" 

Moreover, programmable third party cards play in a legally murky realm, Wilson notes. "Their operation was in violation of the payment scheme rules, which forbid cloning cards and actually forbid merchants accepting a payment card that is not properly branded," he says. "Coin and Plastc would have put merchants in a difficult position, of enticing them to accept a non-standard pseudocard, just so some customers could enjoy yet another gimmicky way of paying." 

The idea of a programmable mag stripe card seemed clever, but it ignored far more pressing problems, Wilson says. The most important innovation needed in the payments space is chip-grade security for online payments through various channels. "Card present" payments, those made in person with chip cards, have a robust approach to security that is woefully missing online.

Each Card Present payment instruction made with a chip card is signed with a unique cardholder key, which uniquely stamps each payment so it's tied to the cardholder, cannot be tampered with, and cannot be replayed, Wilson says: "You cannot clone a chip card because the key is held inside the chip and only ever activated on a transaction-by-transaction basis. No attacker can skim chip cards, and then clone the cards."
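
EMV's real cryptogram scheme is considerably more involved, but the property Wilson describes, a unique, non-replayable signature per transaction, can be sketched with an HMAC over a per-transaction counter. This is a simplified stand-in, not the EMV algorithm.

```python
import hmac, hashlib

class ChipCard:
    def __init__(self, card_key: bytes):
        self._key = card_key   # in a real chip, the key never leaves the card
        self._counter = 0      # transaction counter makes each payment unique

    def sign_payment(self, amount_cents: int, merchant_id: str) -> dict:
        self._counter += 1
        msg = f"{self._counter}|{amount_cents}|{merchant_id}".encode()
        return {
            "counter": self._counter,
            "amount_cents": amount_cents,
            "merchant_id": merchant_id,
            "mac": hmac.new(self._key, msg, hashlib.sha256).hexdigest(),
        }

def issuer_verify(card_key: bytes, txn: dict, last_seen_counter: int) -> bool:
    if txn["counter"] <= last_seen_counter:
        return False  # replayed or duplicated transaction
    msg = f"{txn['counter']}|{txn['amount_cents']}|{txn['merchant_id']}".encode()
    expected = hmac.new(card_key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(txn["mac"], expected)

key = b"issuer-provisioned-card-key"   # hypothetical shared secret
card = ChipCard(key)
txn = card.sign_payment(4999, "merchant-42")
print(issuer_verify(key, txn, last_seen_counter=0))  # True
print(issuer_verify(key, txn, last_seen_counter=1))  # False: replay rejected
```

Because the MAC covers a monotonically increasing counter, a skimmer who captures one transaction gains nothing: the issuer rejects any repeat, and without the in-chip key no new MAC can be forged.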

It’s for that reason Wilson said from the outset that Coin and Plastc could not promise an upgrade to chip cards. “It’s just not possible to take your chip cards and copy them into one super card, as Plastc and Coin did with mag stripe cards.” 

However, thieves can still steal card holder details and use them online, because "card not present" payments still use unsigned personal data, Wilson adds. 

While the payments industry has pushed the 3-D Secure protocol, which provides an additional authentication layer for online purchases, it has a poor user experience and isn't all that secure, Wilson says. 

"The industry has been avoiding the inevitable—transaction signing in the online environment just as we do offline," he says. The barrier has been figuring out how to get chip cards interfaced through commodity computers, but there's another way—replicating cardholder data within mobile phones. 

This is essentially how Apple Pay and Samsung Pay work: they store cardholder details inside special security chips in the phones, known as secure elements, Wilson says. The problem is that these solutions are proprietary. 

"How about we just replicate chip card functionality in phones, in an open manner?" Wilson says. "Let banks issue virtual cards by writing their card data and keys into the secure elements in an open, standards-based and non-discriminatory way. 

"Banks and innovators need access to the secure elements, but that's controlled by handset manufacturers and carriers, because they are wedded to a rent-based business model where the precious secure element storage is levied," he adds. "The tech innovation is pretty easy. The business models need changing."

24/7 Access to Constellation Insights
Subscribe today for unrestricted access to expert analyst views on breaking news.


Oracle Buys Moat for Digital Ad Measurement In Key Addition to Its Data Cloud


Constellation Insights

Oracle is continuing its acquisitive ways in 2017, with the latest purchase focusing on Moat, which makes a platform for measuring the effectiveness of digital advertising. It will become a key new feature of Oracle's Data Cloud service, which has expanded in functional scope and intent over the past couple of years.

Moat already partners with popular digital ad platforms such as Facebook, Google, Pinterest, Pandora and Snapchat, and it lists major marketers including Unilever, Kellogg's, Nestle and the New York Times as customers. 

"Oracle's investing heavily to build a comprehensive suite of solutions targeted at marketers and advertisers," says Constellation Research VP and principal analyst Cindy Zhou. "The convergence of adtech and martech is fueling the market consolidation. Moat brings additional digital advertising measurement and analytics capabilities to Oracle Data Cloud."

In a FAQ, Oracle said it would keep Moat "an open measurement and analytics platform, with deep integrations and partnerships across the entire digital publisher and adtech landscape."

Oracle built out Data Cloud beginning a few years ago with the acquisition of BlueKai, which provides a front end to hundreds of consumer data marts. BlueKai users can select and combine data sets from those sources based on their particular marketing and advertising needs.

Oracle touts Data Cloud's ability to provide consumer and B2B profile data that's updated in real time, rather than collected and aggregated on a monthly or quarterly basis. This means marketers can target at a more immediate level, such as by delivering messages when a prospect is actively shopping for a new product or service.

Other acquisitions Oracle made in support of Data Cloud include DataLogix and AddThis. Those moves expanded the number of user profiles and targeted audiences available through Data Cloud, particularly for B2B scenarios. 

But it's important to note that today, Data Cloud is far from just a data-as-a-service play, says Constellation Research VP and principal analyst Doug Henschen. "The real appeal is its combination of data and analytic services," he says. "Targeting and measurement are the lion’s share of the business, while straight-up data licensing is a smaller part."

Television is the next frontier for targeted and measurable advertising, and that's squarely where Moat fits into Oracle's strategy, Henschen adds. "Streaming digital video—seen on desktops and connected TVs as well as mobile devices—is the fastest growing form of TV consumption, but advertisers have struggled to measure the impact of digital video." 

Moat focuses on providing "attention analytics" with its Moat Score, which measures the combination of an ad's length, how long it was visible, how long it was audible and the size of the video viewer relative to the size of the device screen, Henschen notes. 
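Moat's scoring formula is proprietary, so the following is purely a hypothetical illustration of how the four signals Henschen lists could be normalized and blended into a single attention score:

```python
def attention_score(ad_seconds: float, visible_seconds: float,
                    audible_seconds: float, player_area: float,
                    screen_area: float) -> float:
    """Hypothetical blend of the four signals; not Moat's actual formula.
    Each component is clamped to [0, 1] and the four are averaged."""
    if ad_seconds <= 0 or screen_area <= 0:
        return 0.0
    visibility = min(visible_seconds / ad_seconds, 1.0)  # share of the ad on screen
    audibility = min(audible_seconds / ad_seconds, 1.0)  # share of the ad with sound
    coverage = min(player_area / screen_area, 1.0)       # player size vs. device screen
    length = min(ad_seconds / 30.0, 1.0)                 # 30 s treated as "full length"
    return round((visibility + audibility + coverage + length) / 4, 3)

# A 15-second ad, fully visible and audible, in a player covering half the screen:
score = attention_score(15, 15, 15, player_area=0.5, screen_area=1.0)
```

The interesting design point such metrics capture is that mere delivery counts for nothing: an ad that renders off-screen or muted scores near zero even if it technically "served."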

The company says it measures attention analytics more than 19 billion times each day, a sizable figure that Oracle will surely look to scale ever higher.

"Conventional linear television still accounts for more than 90 percent of advertising spend, but addressable advertising, through set-top-boxes and on-demand offerings, and digital video advertising, through connected TVs, Web browsers and in-app viewing, are quickly growing a share of television ad spending," Henschen says. 

Oracle's move could spark competitors such as Adobe and Salesforce to scoop up other ad-tech companies in response. Possible targets include Integral Ad Science, comScore and AdClarity. 


Oracle Looks to Court Developers to Its Cloud with Wercker Acquisition


Constellation Insights

Oracle executives have been touting the growth of the company's PaaS (platform as a service) and IaaS (infrastructure as a service) offerings, but with a new acquisition they have acknowledged that Oracle needs to do more to attract developers. This week, Oracle announced it had acquired Dutch startup Wercker, which makes a continuous integration and delivery platform centered on Docker container-based applications. Here are the key details from Oracle's announcement:

Oracle and Wercker share the view that developers greatly benefit from focusing on building great products and applications. Oracle is building a leading IaaS and PaaS platform as the foundation for a new generation of cloud computing. A leading cloud needs great tooling and adding Wercker’s container lifecycle management to Oracle’s Cloud provides engineering teams with the developer experience they deserve to build, launch and scale their applications. Together, Oracle and Wercker will democratize developer tooling for the modern cloud.

Terms of the deal were not disclosed. Formed in 2012, Wercker has around 20 employees and had raised only about $8 million in funding over three rounds, suggesting that the price tag was likely on the modest side. Wercker is integrated with Amazon Web Services, Google Cloud Platform, Slack and Kubernetes. 

Wercker has tens of thousands of users, who have developed millions of build and deployment pipelines, according to a statement. It competes with the likes of CircleCI, Codeship and Jenkins. 
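For flavor, the pipelines those users build are declared in a `wercker.yml` file checked into the repository. A minimal example for a Go project might look roughly like the following; the box tag and step details here are illustrative, not taken from Wercker's documentation:

```yaml
# wercker.yml: each pipeline runs inside the Docker image named by "box"
box: golang:1.8

build:
  steps:
    - script:
        name: compile
        code: go build ./...
    - script:
        name: unit tests
        code: go test ./...
```

Because the whole pipeline runs inside a declared container image, the same build behaves identically on a laptop and in the hosted service, which is the core appeal of Docker-based CI/CD platforms in this class.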

It's not clear whether Oracle will deprecate Wercker's integrations with rival cloud providers, particularly AWS, which has been a key target of competitive rhetoric from executive chairman Larry Ellison.

Oracle will continue offering a community edition of Wercker, according to an FAQ document. Support for GitHub and BitBucket will remain.

Wercker's founder and CEO Micha Hernández van Leuffen provided a rationale for the deal in a blog post:

The world of software is changing and so is the world of enterprise. More than ever, we see incumbents in every sector feeling the heat from much smaller competitors who demonstrate an ability to more quickly respond to customers armed with more information and choices than ever before.

Wercker’s Docker-based platform has a strong, rapidly growing user base as companies, large and small, transition to container-based workloads. Developers will now have access to a strong Docker-based portfolio as part of Oracle PaaS and IaaS. 

"It's a good acquisition as Oracle needs to become more attractive for deploying containers and microservices, with the necessary DevOps scaffold around it," says Constellation Research VP and principal analyst Holger Mueller. "Wercker brings this to them."


Research Summary: Artificial Intelligence Delivers Mass Personalization In Commerce


Why Mass Personalization Efforts Fail and Ten Simple Steps to Fix Them

Over the past four decades, valiant attempts at personalization have failed due to the lack of relevant and intelligent automation. Moreover, the expectations of consumers and prospects have only grown. The result is an expectations gap in personalization that manifests itself in more fickle consumers and greater unpredictability in revenues for brands and retailers. The inability to relevantly connect with and effectively engage consumers reflects some underlying truths:

  1. Stakeholders expect mass personalization. In an age of digital disruption, customers, partners, suppliers, and employees have grown accustomed to massive market choice, a plethora of pricing and policy options, and convenient delivery. The rise in expectations creates an insatiable cycle of satisfaction and disappointment that an omni-channel approach alone cannot address. Today, omni-channel plays only a transitional role, and organizations must progress beyond it.
  2. Lack of relevance leads to lack of engagement. Contextual relevancy has an immediate, measurable effect on the top line. Constellation estimates that a lack of content relevancy often results in 83 percent lower response rates in the average marketing campaign. Conversely, personalized contextual relevancy by time of day, geo-spatial location, weather, and identity improves commerce conversions by two to three times over non-personalized campaigns. Context provides brands and organizations with the relevancy to earn the permission to engage with customers.
  3. Manual management of personalization overwhelms most organizations. Legacy approaches are not designed for large-scale individualization and cannot be retrofitted. These systems classify individuals into forced-fit, binary segments. Individuals who belong to multiple segments and use cases are often frustrated with this approach. Sadly, existing systems fail to handle the management of rules engines, policies, complex event processing, and preferences at the segment level – never mind at the individual level. Those who attempt manual personalization ultimately fail because of the complexity of managing it without automation. Moreover, sales, marketing and distribution systems must scale from hundreds of thousands to billions of customers.
  4. Static systems miss emerging market shifts.  Technologies can no longer be static.  Legacy personalization systems deceptively start out easy and end up as cumbersome anchors years later.  In an era of dynamic markets, supporting technology must identify new demand signals; assess, analyze, and act on new demand signals; and apply cognitive and machine learning capabilities to adapt.

Designing AI-Driven Smart Services Starts with the Orchestration of Trust

Currently, the fashionable approach is predictive.  Prediction does a great job of using past history to foretell future patterns.  An intention-driven system tests for shifts in patterns by setting up hypotheses and awaiting the results.  If one knows a person always gets a specific type of coffee at the same time every day, that’s predictive.  An intention-driven system will test to see what type of coffee is purchased based on time of day, weather, relationships, location, and even sentiment gathered from heart rate or actions. The test comes from an offer or from studying shifts in patterns and behaviors.  This self-learning and adjusting capability is powered by cognitive computing approaches.  In fact, this algorithm-driven intelligence eventually will think on its own.
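The coffee example can be reduced to a toy comparison (names and data hypothetical): a predictive model simply forecasts the most frequent past order, while an intention-driven pass checks whether any context, such as weather, shifts that pattern and is therefore a hypothesis worth testing with an offer.

```python
from collections import Counter

def predict(history: list) -> str:
    """Predictive: the most frequent past order becomes the forecast."""
    return Counter(history).most_common(1)[0][0]

def intention_shifts(history_by_context: dict) -> dict:
    """Intention-driven: flag contexts whose modal order differs from the
    context-free baseline; each flag is a hypothesis to test with an offer."""
    all_orders = [o for orders in history_by_context.values() for o in orders]
    baseline = predict(all_orders)
    return {
        ctx: predict(orders)
        for ctx, orders in history_by_context.items()
        if predict(orders) != baseline
    }

history = {
    "sunny": ["iced latte", "iced latte", "iced latte"],
    "rainy": ["hot mocha", "hot mocha", "espresso"],
}
baseline = predict([o for orders in history.values() for o in orders])
shifts = intention_shifts(history)   # rainy days break the baseline pattern
```

A purely predictive system would keep recommending the baseline order; the intention-driven view surfaces the rainy-day deviation as something to probe with a targeted offer, which is the testing loop the passage above describes.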

Digital transformation describes a shift in business models and approaches to engagement with customers, prospects, partners, employees and suppliers.  AI-driven smart services provide the backbone behind these business model transformations.  Consequently, crafting AI-driven smart services requires a shift in thinking to atomic-driven smart services.  In fact, these new AI-driven smart services rely on five key components (see Figure 1):

Figure 1.  The Secret to Designing Atomic AI-Driven Services.

Source: Constellation Research

  1. Digital footprints and data exhaust use AI to build anonymous and explicit profiles. Every individual, device, or network provides some information. That digital footprint or exhaust could come from facial analysis, a network IP address, or even one’s walking gait. Using AI and cognitive reckoning, systems can start to analyze patterns and correlate identity. That means AI services will recognize and know individuals across different contexts and take an intention-driven approach.
  2. Immersive experiences go beyond omni-channel. The combination of context, content, collaboration, and channels creates immersive experiences that cater to what each individual or node requires. Context starts with attributes such as identity, relationship, roles, time, location, weather, and sentiment. Content includes all content types, from web pages, videos, product catalogs, community pages, and product listings to knowledge bases and documents. Collaboration refers to sense-and-respond feedback loops. Channels are any delivery mechanism a user can access, from mobile devices, social media, kiosks, gesture, conversations as a service, and augmented reality to other AI-driven UX experiences.
  3. Mass personalization at scale delivers intention-driven digital services. Anticipatory analytics, catalysts, and choices interact to power mass personalization at scale. Anticipatory analytics allow customers to “skate where the puck will be.” Catalysts provide offers or triggers for response. Choices allow customers to make their own decisions. Each individual or machine will have its own experience in context, depending on identity, historical preferences, and needs at the time. Through choose-your-own-adventure journeys, context-driven offers, and multi-variable testing on available choices, the AI systems offer statistically driven choices to incite action. With no real beginning or ending, expect these systems to work like a Choose Your Own Adventure book. Funnels fall aside as customers, partners, employees, and vendors jump in across processes, make their own decisions, and craft their own experiences on their own terms. Journey maps must account for infinite journeys and support customer-centric points of view.
  4. Value exchange completes the orchestration of trust. Once an action is taken, value exchange cements the transaction. Monetary, non-monetary, and consensus are three common forms of value exchange. While monetary value exchange might be the most obvious, non-monetary value exchange (including recognition, access, and influence) often provides a compelling form of value. Meanwhile, a simple consensus or agreement can also deliver value exchange, for instance, on the veracity of a land title or the terms of a patient treatment protocol.
  5. Cadence and feedback complete an AI-powered learning cycle. Powered by machine learning and other AI tools, smart services consider the cadence of delivery – one-time, ad hoc, repetitive, subscription-based, and threshold-driven. Using machine learning techniques, the system studies how the smart services are delivered and applies this to future interactions.
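In practice, the catalyst-and-choice loop in point 3 combined with the feedback cycle in point 5 is often realized with multi-armed bandit testing rather than fixed funnels. A minimal epsilon-greedy sketch (offer names hypothetical) illustrates the learn-from-feedback cycle:

```python
import random

class OfferBandit:
    """Epsilon-greedy selection over candidate offers: mostly exploit the
    best-converting offer so far, occasionally explore, always learn."""

    def __init__(self, offers, epsilon=0.1, seed=None):
        self.offers = list(offers)
        self.epsilon = epsilon
        self.shown = {o: 0 for o in self.offers}
        self.converted = {o: 0 for o in self.offers}
        self._rng = random.Random(seed)

    def rate(self, offer):
        """Observed conversion rate for one offer (0.0 until it has been shown)."""
        return self.converted[offer] / self.shown[offer] if self.shown[offer] else 0.0

    def choose(self):
        if self._rng.random() < self.epsilon:
            return self._rng.choice(self.offers)   # explore a random catalyst
        return max(self.offers, key=self.rate)     # exploit the best one so far

    def feedback(self, offer, converted):
        """Cadence-and-feedback step: record the outcome for future choices."""
        self.shown[offer] += 1
        self.converted[offer] += int(converted)

bandit = OfferBandit(["free shipping", "10% off"], epsilon=0.0, seed=1)
bandit.feedback("10% off", True)
bandit.feedback("free shipping", False)
best = bandit.choose()   # with epsilon=0 this exploits the better converter
```

Each customer interaction feeds the next selection, so the system adapts continuously instead of waiting for a campaign-end report, which is the "self-learning and adjusting capability" the section describes.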

Bottom Line:  Mass Personalization At Scale Requires A Strong AI Foundation

The market need for mass personalization at scale and the technology advances in artificial intelligence (AI) enable brands and enterprises to finally deliver on the promises of digital transformation.   As new algorithm-driven intelligence improves, these AI-driven smart services have the capacity to deliver immersive experiences, mass personalization, and value exchange across different modes and cadences.  Further, these systems can apply machine learning to improve their capabilities in future interactions.

This report shows how AI-driven smart services deliver on the promise of mass personalization at scale, explains how organizations and brands can design their own AI-driven smart services, and highlights 10 recommendations to accelerate personalization success.

Click here to purchase the report and get the 10 recommendations to accelerate personalization success

Your POV.

Are your commerce systems old and creaky?  Ready to modernize commerce but don’t know how?  Do you have a digital transformation strategy?   Looking to apply matrix commerce?  Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com.

Please let us know if you need help with your Digital Business transformation efforts. Here’s how we can assist:

  • Developing your digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

 


Hyperledger, Ethereum Blockchain Projects Move Toward Cooperation, Not Competition


Constellation Insights

As interest in blockchain and distributed ledger technologies has grown in recent years, a number of powerful industry consortia have emerged. Two of the most prominent are Hyperledger, which is hosted by the Linux Foundation, and the Enterprise Ethereum Alliance. (The latter was only formed in February, but Ethereum already had a strong developer community.)

Now an important milestone has been reached between the Ethereum community and Hyperledger, which counts more than 100 members and is a major player in blockchain. Hyperledger project head Brian Behlendorf made the announcement in a blog post:

[T]he Hyperledger Technical Steering Committee (TSC) approved a proposal submitted by engineers at Monax and Intel, to incubate the community’s first Ethereum derived project – Burrow, a permissionable smart contract machine.

The Burrow project originated with Monax as eris-db, and has been open source since December 2014. The project has been relicensed to Apache Software License 2.0, in accordance with the Hyperledger governance requirements.

First, and foremost, having an Ethereum derived project under the Hyperledger umbrella should send a strong message that any positioning of the Hyperledger and Ethereum communities as competitive is incorrect.

Apart from that, there remain many technical challenges to solve with blockchain, and it makes sense for the community at large to collaborate in the name of solving them with production-ready code faster, Behlendorf wrote. 

An Apache-licensed Ethereum Virtual Machine also means that the variety of distributed ledger frameworks falling under the Hyperledger banner, such as Sawtooth Lake and Fabric, can work on integrating the EVM, he added:

I know that many in the community have been looking forward to (and working towards!) this day. I think it will mark an important point in Hyperledger’s (and blockchain) history.

That isn't hyperbole, says Constellation Research VP and principal analyst Andy Mulholland.

"One aspect that is part of the definition of interactive digital markets is that they consist of an ecosystem of trading enterprises, hence the requirement for a commercial ‘settlement’ system, and the hope that blockchain technology will be able to provide this," he says. "Though it is early days in the progress towards this goal, the importance of Hyperledger and Ethereum as major initiatives is understood. Therefore, this announcement of the two working in tandem introduces an important new impetus."

The addition of Monax and its Burrow project to Hyperledger is a welcome measure, and it could spark even more collaboration between the communities. Earlier this year, a proposal to relicense the Ethereum C++ client under the ASF's Apache 2.0 license, away from the more restrictive GPLv3, failed after resistance from some members of the Ethereum community. Burrow could now serve as a test bed for collaboration and thus a potential motivator for broader integration between Hyperledger and Ethereum in the future.
