Salesforce Dreamforce 2017: 4 Next Steps for Einstein

Salesforce Einstein Prediction Builder, Bots, Data Insights and a new data-explorer feature stand out as the big AI and analytics announcements. Here’s what they’ll do for your business.

To Salesforce customer Bill Hoffman, Chief Analytics Officer at Minneapolis-based US Bank, the “A” in “AI” is about “augmented” intelligence because, as he said in a keynote at this week’s Dreamforce event in San Francisco, “there’s nothing artificial about it.”

US Bank has deployed Salesforce Einstein capabilities including Predictive Lead Scoring and Einstein Analytics (formerly known as Wave) for customer attrition analysis and retention efforts. It's also using Einstein Discovery (formerly BeyondCore) to better understand customer behavior and cross-sell opportunities. The bank expects to roll out Einstein capabilities to more than 2,000 of its customer-facing financial advisers across the firm in hopes of “personalizing service at scale” and “creating a differentiated customer experience,” Hoffman said.

Personalizing at scale is precisely the idea behind two “myEinstein” capabilities announced at Dreamforce. Also announced were two Einstein Analytics capabilities. All four capabilities are coming to the portfolio next year. Here’s what they promise to do for your business.

Einstein Prediction Builder: Plenty of Salesforce customers are using or considering machine-learning-based Einstein capabilities, most of which were detailed in my 19-page report published earlier this year. But at Dreamforce 2017 we heard the revealing stat that some 80% of the customer data in Salesforce is tied to custom (customer-defined) fields and objects. No surprise, then, that the number-one ask among Salesforce customers was for customizable, as well as pre-built, Einstein insights, predictions and recommendations.

Einstein Prediction Builder is a no-code capability designed to enable non-data-scientists to develop predictions using custom fields. Use cases are limitless, but popular use cases are likely to include cross-sell/up-sell, churn, CSAT and propensity-to-escalate analyses. Prediction Builder will be powered by the same machine-learning data pipeline that handles millions of Einstein predictions per day, but it will be opened up – starting with a February beta release and likely June general release – to custom fields and objects in Salesforce. Pricing has not been finalized.

Einstein Bots: Salesforce picked up strong natural language understanding and natural language translation capabilities through its 2016 acquisition of MetaMind. Einstein Bots, a second myEinstein capability, will couple these language capabilities with Salesforce data and the Salesforce workflow engine to power automated customer-service agents. The idea is to handle the bulk of the simple, frequent service cases, such as user password resets, while leaving the long tail of complex and infrequent service inquiries to human agents.

As with Prediction Builder, Einstein Bot development will be a no-code proposition. It will start with point-and-click selections, workflow setup and the uploading of spreadsheets of sample customer-service interaction text to train the language model. Beta release is expected in February, with general availability to follow in June. Pricing will be announced at general availability, but I expect it to be based on the volume of cases handled over a specified time. The Bots will start with text-based interaction, but voice-based interaction is likely to follow.

Einstein Data Insights: This new Einstein Analytics (formerly Wave) capability provides deeper insights into standard Salesforce reports from the Sales Cloud, Service Cloud and, eventually, other clouds. Powered by the same engine behind Einstein Discovery, Einstein Data Insights will automatically surface important trends, outliers, changes over time and even data-quality problems within standard reports, displaying a combination of visualizations and textual explanations. Users will press a button embedded on a standard report and the visualizations and textual explanations will appear on the right side of the screen (see image below). This capability is also expected to see beta launch in February with general availability next June. The pricing model has yet to be determined.

Einstein data explorer feature: This capability, which will be included with Einstein Analytics, will let you have "a conversation with your data," says Salesforce, by typing in questions in plain English. Behind the scenes, keyword-driven interpretation will help you drill down on dashboards and visualizations to better understand not just what happened but why it happened. You could drill down on a total figure, for example, by typing “amount by product.” Or you could analyze performance by typing in “lost deals by product.” This feature is expected to be generally available in February.
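Keyword-driven interpretation of this kind can be sketched in a few lines. The parsing rule, field names and records below are purely illustrative assumptions, not Salesforce's actual implementation or schema:

```python
from collections import defaultdict

def interpret(query, records):
    # Split on " by ": the left side is the measure to sum,
    # the right side is the field to group by.
    measure, _, dimension = (part.strip() for part in query.partition(" by "))
    totals = defaultdict(float)
    for record in records:
        totals[record[dimension]] += record[measure]
    return dict(totals)

deals = [
    {"product": "A", "amount": 100.0},
    {"product": "B", "amount": 50.0},
    {"product": "A", "amount": 25.0},
]
print(interpret("amount by product", deals))  # {'A': 125.0, 'B': 50.0}
```

Real natural-language query engines add synonym handling, date filters and ranking, but the core pattern of mapping keywords to aggregations is the same.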

My Perspective on Einstein’s Progress

As compelling as the coming Einstein capabilities are, the big question on the minds of many customers is “how much will it cost?” It seems we’re still in a chicken-and-egg phase in which both Salesforce and customers are trying to figure out how much Einstein capabilities are worth. Different sorts of predictions and recommendations have different values, depending on the cloud and the types of actions triggered. The size and nature of the customer adds another dimension of complexity, with large enterprises sometimes preferring all-you-can-eat enterprise deals. Salesforce, meanwhile, needs to establish clear revenue expectations to keep Wall Street happy. Innovation presents its challenges.

Picking up on current trends in big-data and open-source software pricing, one possible approach would be to provide free access to Einstein development tools and a limited number of predictions or recommendations so businesses can get a sense of what they can do. Once the capability is deployed, volume-based per-prediction or per-recommendation charges would kick in. In this way, charges would be tied to the value delivered to the customer, although different customers would surely have different perceptions of value, so it might be hard to come up with a one-size-fits-all pricing scheme.
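As a sketch, a freemium, volume-based scheme like the one described might look as follows; the free-tier size and per-prediction rate are invented numbers for illustration only, not Salesforce pricing:

```python
def monthly_charge(predictions, free_tier=10_000, per_prediction=0.01):
    # Only predictions beyond the free allowance are billed.
    billable = max(0, predictions - free_tier)
    return billable * per_prediction

print(monthly_charge(8_000))   # 0.0 -> still inside the free tier
print(monthly_charge(50_000))  # 400.0 -> 40,000 billable predictions
```

The appeal of such a model is that trial usage costs nothing, while charges scale with the predictions actually consumed.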

One thing that customers might have found confusing at Dreamforce was the distinction between Einstein and Einstein Analytics (formerly Wave Analytics). There were two separate keynotes at Dreamforce and there are two separate teams behind different sets of Einstein capabilities. But they are both part of one portfolio and a continuum from descriptive and diagnostic analytics to predictive analytics and prescriptive recommendations and actions (as well as advanced language and vision capabilities and APIs for human-interactive applications). Before you can get to the predictive and prescriptive part you need to have good data and reporting in place.

US Bank is using capabilities across the Einstein continuum, and Bill Hoffman, when asked for advice during a keynote, said you have to start with data quality and you have to bring in key stakeholders and risk management and compliance partners from the beginning. In short, don’t expect to get the sizzle of “AI” without addressing the meat-and-potatoes of data management and baseline reporting and analytics.

The unsung announcements that didn’t get as much attention at Dreamforce included a recent rewrite of the Einstein Analytics engine said to deliver a 30% reduction in data-ingestion and query times. Available data capacity was also more than doubled to 1 billion rows per customer. For easier data loading from external sources, Salesforce has added out-of-the-box data connectors for AWS Redshift, Google BigQuery and Microsoft Dynamics, and more than 20 additional pre-built connectors are to be added over the next six months. Finally, Smart Data Prep capabilities have been enhanced with data profiling, auto clustering, anomaly detection, filtering and transformation suggestions.

These upgrades aren’t the sexy stuff, but they are day-to-day productivity improvements that will help sell customers on handling analytics within Salesforce and advancing to Einstein predictions and recommendations.

Related Reading:
Tableau Conference 2017: What’s New, What’s Coming, What’s Missing
Oracle Open World 2017: 9 Announcements to Follow From Autonomous to AI
Microsoft Stresses Choice, From SQL Server 2017 to Azure Machine Learning

[Images: Einstein Observatory.jpg, Einstein Prediction Builder III.jpg, Einstein Data Insights.jpg]

Salesforce IoT: A Major Showcase at Dreamforce

It’s that time of the year when Salesforce persuades an almost unbelievable number of people (said to be 170,000) to come to San Francisco for its annual Dreamforce event. It’s no longer possible to classify attendees as customers, developers or even pure CRM professionals, as the pervasive presence of Salesforce technology at the center of enterprise capability has broadened the roles and interests of the attendees. The packed exhibition halls offer a scale and diversity of exhibitors that more closely resembles an industry trade show than a vendor event; yet everything on show runs on, or integrates through, Salesforce technology, including IoT!

Salesforce is certainly no newcomer to IoT, but what is new to the industry generally is the Salesforce view on what IoT can do for an enterprise, and how to get started. It’s a carefully crafted approach that matches and complements its other technology and the working practices of the ‘core’ Salesforce delivery staff in their various enterprises.

The IoT keynote introduced the Salesforce point of view and proposition for IoT, with a series of follow-on sessions building out the details, using deployment examples from several customers as proof points. Here are the key Salesforce IoT message statements presented:

  1. Behind Every Device There Is a Customer

Salesforce IoT products, tools and deployments all relate directly to activities that, in one way or another, will create a better customer experience.

A statement that positions Salesforce with a simple, clear and well-defined business focus in what is otherwise a very broad, often technology-defined marketplace. The focus allows Salesforce to align IoT with its core business proposition, customer base and industry experience, and contrasts with the majority of the IoT market, which aligns with machines and process improvements.

  2. Events and Data Make IoT Valuable to the Business

Salesforce defines IoT value as coming from any ‘real-time’ event trigger that creates a data input, rather than limiting IoT to the conventional definition referring to data from sensors.

While this may not necessarily fit the popular industrial IoT categorization, it does allow Salesforce to make use of a wider range of enterprise sources as ‘IoT’ inputs. Salesforce IoT is able to make effective use of inputs from all nine principal areas of ‘engagement’ found in ‘systems of engagement’. Some of these sources, such as social, are clearly of great value to the customer focus described in point 1.

  3. Deploying from the Salesforce Cloud Drives Business Value

IoT pilots and small-scale deployments have been hard to justify in business terms due to the need for unique, complex event processing.

Creating IoT event data from simple, low-cost deployments is not difficult, but making use of the data has often been a barrier to any real success. The premise that IoT will support ‘real-time’ read-and-react capabilities that deliver new business value has been difficult to realize without expensive investments in complex event engines. Salesforce IoT works in association with, and is fully integrated into, the Salesforce Clouds, offering a familiar development environment and full integration with other enterprise activities. The standard Salesforce benefits of starting small at low cost and scaling up enable practical IoT projects to be quickly tried and tuned.

  4. Salesforce MyIoT Personal Toolset Eases Delivery of Initial Value

A new product that facilitates rapid deployment of ideas as fully functional implementations through a highly intuitive, business-oriented user interface.

In designing the MyIoT product, Salesforce expects to enable any Salesforce professional to deliver a proof of concept as a genuine, fully functional solution, thus breaking down the barriers to widespread innovation. MyIoT implementations are fully functional and can be used to address smaller projects, or to prepare for a large-scale deployment.

  5. Updated Salesforce IoT Product Sets for Scale-Up

Updated versions of Salesforce IoT Explorer Edition and Salesforce IoT Enterprise Edition complete the Salesforce IoT product range.

Full product function listings for each can be found at IoT Explorer Edition and IoT Scale Edition.

  6. Significant Experience and Trained Advisors and Partners

Salesforce has grown its support and training capabilities to align with the increased interest that the above announcements are expected to create.

Dreamforce provided a significant number of case-study sessions on the different ways Salesforce has created business value for customers across a wide range of sectors. In addition, there was a dedicated Salesforce IoT Trail area showcasing partners covering business consulting, system implementation and supporting products that extend functionality.

 

Constellation Summary

Salesforce has been very active from the early days of the IoT market and clearly has gained a lot of practical experience that has been used in defining its IoT market positioning and products.

Cloud deployment models and the shift toward systems of engagement, including machine learning and augmented intelligence, combine to require robust IoT data and event management. Add digital-business market disruption driving an increased focus on customer-centric activity, and enterprises are going to need to look very carefully at their IoT strategy.

Given the clarified direction, the product updates and their existing investment in Salesforce, existing Salesforce customers should look carefully at the Salesforce IoT proposition.

 

Addendum

Salesforce IoT Overview

- https://www.salesforce.com/products/salesforce-iot/overview/

Salesforce ‘Behind every device is a Customer’

- https://www.salesforce.com/products/salesforce-iot/why-salesforce/

 

Digital Transformation Digest: IBM's Latest Quantum Computing Push, OpenStack Foundation Looks Beyond, Department Chains Still Wrestling with Online Shift

Constellation Insights

IBM pushes quantum computing envelope: Big Blue has fired the latest salvo in the competitive war between itself and the likes of Google and Microsoft over quantum computing, announcing it has created a working prototype of a 50-qubit processor.

Classically designed computers are binary in structure, storing bits as either a one or a zero. But quantum systems take advantage of the behavior of subatomic particles, which can hold multiple states simultaneously. This phenomenon, known as superposition, stands to give quantum systems vast amounts of processing power as they are developed and reach viability at scale. Qubits are the quantum counterpart of traditional bits.
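One way to see why roughly 50 qubits is a meaningful threshold: simulating n qubits on a classical machine requires tracking 2^n complex amplitudes, one per basis state, which becomes infeasible somewhere around n = 50. A quick back-of-the-envelope check:

```python
def amplitudes(n_qubits):
    # A full classical simulation stores one complex amplitude
    # per basis state, so the state vector has 2**n entries.
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(n, amplitudes(n))

# 50 qubits -> 2**50 ≈ 1.1e15 amplitudes; at 16 bytes per complex
# number, that is roughly 18 petabytes of state to hold in memory.
```

This exponential blow-up in classical simulation cost is exactly what the "quantum supremacy" benchmark discussed below is meant to capture.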

The near-term goal for IBM and its rivals is achieving "quantum supremacy," wherein a quantum computer can finish a task faster than any conventional computer. Google has characterized 50 qubits as the bar for quantum supremacy.

POV: While the 50-qubit system remains a lab creation, IBM will provide cloud-based access to Q quantum computers with 20 qubits of power by the end of this year. Early quantum computers such as these remain highly unstable. IBM says its initial Q systems will have a "coherence" time—the window available to run quantum computations—of 90 microseconds, a result it characterizes as leading the field.

While IBM and others have signaled that commercial quantum computing systems are on the horizon, for now IBM's goal is to create a critical mass of research and academic interest around its activities. Some 60,000 users have conducted nearly 2 million experiments on IBM's cloud-based quantum computing service, representing nearly two thousand universities, high schools and other institutions, according to a statement.

Desktop Metal hatches heavy 3-D printing plans: A Boston-area startup that has raised more than $200 million for its metal 3-D printing systems is now taking international pre-orders for its Studio System prototyping machine. BMW Group will be the first international company to get one of the systems, according to Desktop Metal's announcement. Interest among U.S. companies for Studio Systems has been strong as well.

But the real—albeit yet unproven—breakthrough for Desktop Metal is set to come in mid-2018, with the release of the Production version of its system. The systems, which use an inkjet printer's approach to creating complex metal parts, will be 100 times faster and 20 times cheaper than laser-based 3-D metal printing systems, according to the company. Desktop Metal Production is geared toward manufacturing at scale, with advantages that other methods can't match.

The systems' sweet spot is smaller batches of complicated parts that can be designed and printed through software, with no special casts or tooling required. Moreover, the parts can be made in non-factory settings, reducing overhead. Another advantage, as noted in The Register: since the parts would be software-driven, the files could be sent electronically to local machines, which on paper would avoid import tariffs associated with bringing goods across borders.

POV: Desktop Metal's inkjet-style approach is not unique, nor is its use of metal in the printing process. Still, the startup has attracted investments from the likes of Google and BMW, and is said to have a rich patent portfolio backing up its commercial ambitions. The challenge now is to deliver Production systems that live up to the speed and cost Desktop Metal is touting.

Macy's places new bets on tech for turnaround: Department store chain Macy's reported third-quarter results this week that saw profits beat estimates but revenue fall 6.1 percent to $5.28 billion. Those numbers reflect continued difficulty in the brick-and-mortar side of the business, but also more success in shifting sales online. Macy's recently brought on a new president, Hal Lawton, who has experience at eBay and Home Depot.

Lawton's background and experience working with technology is something Macy's is banking on big-time as it plots a defensive and offensive strategy against not only rivals such as Kohl's but especially Amazon, which has made a series of strategic moves, such as the acquisition of Whole Foods Market, to build out its brick-and-mortar presence.

Macy's has experienced 33 quarters of double-digit growth in online sales, but still has work to do on some fundamentals, CEO Jeff Gennette said during a conference call:

So some of the things that we're focused on with respect to technology, is really making sure that our ongoing site optimization is just really strong, and we learn every day. We do a good job here, but we have lots of opportunities to improve on this. We're looking at mobile and tablet app responsiveness and making sure that we get the conversion rates there where we want them.

One challenge retailers like Macy's face from Amazon is the latter's sheer scope of inventory and product availability. Macy's is looking to expand its direct-ship-from-vendor operations as a way to combat that, Gennette said. In addition, Macy's plans to leverage machine learning for personalized shopping experiences, he said. The latter is hardly a pace-setting move, so it remains to be seen how well Macy's executes.

Macy's is also hoping to lure new customers, particularly so-called Generation Z members, to the fold. Gennette gave a broad outline of the chain's plans here:

And then lastly, to the previous question about on-boarding of new customers and the idea about the Gen Z customer and using user-generated content, being in the social space, using our teams in a more relevant way to market in a more authentic way is all part of what's on our kind of technology playbook.

POV: Macy's recently overhauled its customer loyalty program, and while officials reported that initial feedback has been positive, statistical results weren't made available. Overall, the U.S. brick-and-mortar retail sector remains a boxer in late rounds, leaning on the ropes, with forecasts for 2018 not looking especially positive. Macy's is one chain talking a good game about innovation and transformation; as the busy holiday shopping season gets underway, the contest is already in crunch time.

Digital Transformation Digest: IBM Adds Privacy Measures in EU Cloud, Office-LinkedIn Integrations Continue, UPS Backs Blockchain for Trucking

Constellation Insights

IBM tightens privacy measures for EU cloud operations: In response to both regulatory and competitive pressures, IBM is adding a new layer of data access and control guidelines in its Frankfurt data center operations, which serves many cloud infrastructure customers across the EU. Here are the key details from its announcement:

IBM will roll out new controls to ensure access to client content (including client personal data and special personal data) is restricted to and controlled by EU-based IBM employees only. These employees will play a critical role in IBM incident and change management processes by reviewing and approving all changes from non-EU based employees that could affect client data.

In a move that is unique to only IBM Cloud’s dedicated environments in Frankfurt, clients will review and approve all non-EU access requests to their content if an instance requires support or access from a non-EU based employee. If granted, this access is temporary and the client will be notified when the temporary access is revoked. Logs that track access are made available to the client.

Big Blue is also adding to customer support teams in the EU, which will now have around-the-clock local staff. No price increases are anticipated. A third measure, coming next year, will give customers the ability to encrypt their data both at rest and in transit while keeping possession of master encryption keys at all times.

POV: IBM's moves are welcome and necessary in light of recent and upcoming developments in EU privacy laws, particularly the General Data Protection Regulation (GDPR), which takes effect next year. It is also playing catch-up to rivals such as Microsoft, which has already rolled out similar measures for Azure. Microsoft is working with Deutsche Telekom, which serves as a third-party steward overseeing customer data held in EU data centers.

IBM is pledging to add the data privacy improvements in other regions around the world, although it provided no timelines.

The LinkedIn-Office integration story continues with Resume Assistant: As time goes on since Microsoft's landmark $26.2 billion acquisition of LinkedIn, the application integration scenarios between the companies' software are becoming abundant and diverse. The latest is Resume Assistant, a feature that pulls LinkedIn data into Microsoft Word as job seekers are crafting or updating their curriculum vitae:

Leverage relevant examples—See how top people in a field represent their work experience and filter by industry and role for a personalized experience.

Identify top skills—Find the most prominent skills for the type of job you’re seeking so you can more easily increase your discoverability.

Customize a resume based on real job postings—People can see relevant job listings from LinkedIn’s 11 million open jobs and customize their resume to appeal to recruiters.

Resume Assistant also provides hooks into LinkedIn's ProFinder freelance help site, as well as the Open Candidates feature, which tells recruiters combing through LinkedIn that you're available and interested in new opportunities.

POV: This is a classic case of line-blurring between two platforms' unique capabilities that results in something more useful on the whole. While LinkedIn profiles over the past several years have become somewhat of a proxy for traditional resumes, they haven't replaced them by a long shot.

Meanwhile, resumes have long been a largely static art form; while there may be no need for a radical reinvention of the format, the elements that go into them can always be improved, and that's where LinkedIn's rich data set can help job seekers, recruiters and hiring managers alike.

UPS joins blockchain trucking alliance: If you're a group forming an industry consortium around the use of blockchain in the trucking industry, you could do worse than to land UPS, the world's biggest package delivery company, as a member. The Blockchain in Trucking Alliance has done just that, in a move that should provide the group's work with a major infusion of energy. Here's how UPS describes why it joined:

In particular, UPS is exploring blockchain applications in its customs brokerage business. UPS is one of the world’s largest customs brokers, and a key objective of its brokerage strategy is to digitize transactions. Blockchain technology would help by improving transaction accuracy and by replacing existing paper-heavy and manual processes.

UPS wants to leverage blockchain technology to facilitate execution and visibility of trusted transactions between UPS, its customers and government customs agencies. Blockchain, a digital database using blocks that are linked and secured by cryptography, can be used to keep record of any information or assets. This includes physical assets, like transportation containers, or virtual assets, like digital currencies.
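The "blocks that are linked and secured by cryptography" idea quoted above can be illustrated with a toy hash chain. This is a minimal sketch of the linking mechanism only, not BiTA's or UPS's actual design, and the record contents are invented:

```python
import hashlib
import json

def make_block(data, prev_hash):
    # Each block stores the hash of its predecessor, so altering
    # any earlier record breaks every later link in the chain.
    block = {"data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

genesis = make_block("container 001 cleared customs", "0" * 64)
second = make_block("container 001 loaded on truck", genesis["hash"])

# The chain is valid: the second block points at the genesis hash.
print(second["prev_hash"] == genesis["hash"])  # True

# Tampering with the first record produces a different hash,
# so the second block's stored pointer no longer matches.
tampered = make_block("container 999 cleared customs", "0" * 64)
print(second["prev_hash"] == tampered["hash"])  # False
```

Production blockchains add consensus, signatures and distribution across many parties, but this hash-pointer structure is what makes the shared record tamper-evident.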

POV: Some 300 companies have applied to join BiTA. This bodes well for the group's work, which of course remains nascent. Its emergence comes as the trucking industry is experiencing a renewed wave of consolidation.

Later this year, new U.S. rules will take effect requiring trucking companies to use electronic logging of drivers' hours. The logging is aimed at stopping logistics providers from circumventing laws governing how long drivers can work, but as a byproduct is expected to squeeze productivity and already thin profit margins. To that end, blockchain is a longer-term play but one the trucking industry is betting on as a route to more efficient operations.
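A minimal sketch of the kind of automated compliance check electronic logging enables, assuming the commonly cited 11-hour daily driving limit under U.S. hours-of-service rules; real ELD compliance logic covers many more conditions (duty windows, rest breaks, weekly caps):

```python
# Assumed simplification of FMCSA hours-of-service rules: flag any
# day on which recorded driving time exceeds the 11-hour limit.
DRIVING_LIMIT_HOURS = 11

def violations(daily_driving_hours):
    return [day for day, hours in daily_driving_hours.items()
            if hours > DRIVING_LIMIT_HOURS]

log = {"Mon": 10.5, "Tue": 11.5, "Wed": 9.0}
print(violations(log))  # ['Tue']
```

With paper logs this check depends on honest record-keeping; with device-recorded hours it becomes automatic, which is exactly the productivity squeeze the article describes.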

Digital Transformation Digest: Facebook Adds More Business Features to Messenger, QuickBooks Launches Direct Lending Program, CVS to Offer Next-Day Drug Delivery

Constellation Insights

Facebook Messenger adds business-friendly features: Version 2.2 of Facebook's wildly popular Messenger app has arrived, and with it a closed beta version of a new plugin that lets businesses embed Messenger in their websites and talk to customers across multiple channels.

The plugin doesn't have all the features found in the full-fledged Messenger app, but key capabilities such as payments support and rich media are part of the initial version. Facebook has lined up some top brands as beta testers, including Air France, KLM, Argos, Volaris and Zalando.

POV: The plugin reflects the B2C side of Facebook's emerging enterprise strategy. On the B2B end of the spectrum lies Workplace by Facebook, which Constellation VP and principal analyst Alan Lepofsky takes a detailed look at right here.

While there are untold numbers of chat clients available for enterprises to use with their websites, Messenger provides a series of advantages for brands, not the least of which is its ubiquity among customers, with more than 1 billion users around the world.

Customers are already using Messenger in their personal lives to communicate with friends and family; businesses can tap into that activity without having to ask customers to log in to a separate chat client or account on their websites. Along with Messenger's cross-platform continuity for message threads, the level of friction for customer service, marketing and sales-related conversations can be reduced dramatically.

QuickBooks enters the SMB direct-lending race: Following the lead of Square, Amazon and other tech companies catering to small business entrepreneurs, QuickBooks has introduced a direct lending service that provides loans of up to $35,000 for qualifying individuals.

Dubbed QuickBooks Capital, the service is embedded within the existing application and uses data about the customer that QuickBooks already has, along with machine-learning models that leverage its large corpus of user information, to make lending decisions quickly. Funds are transferred within a couple of business days, with the money coming directly from QuickBooks and not a third party, as has been the case in the past.

The loans are for between three and six months, with interest rates ranging from 1.75% to 4.74%, which works out to an APR of between 6 and 18 percent. Rates are set based on a customer's business history and personal credit score. On that basis, QuickBooks is being fairly liberal, requiring just a 580 FICO score as a minimum qualification. Indeed, by its own estimates, 60 percent of potential customers for the Capital service wouldn't be able to get a loan elsewhere.
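A rough annualization shows how flat per-term charges of 1.75% to 4.74% can land in the quoted APR range. This naive formula ignores amortization schedules and fees, so the figures are approximations rather than QuickBooks' actual APR math:

```python
def simple_apr(flat_rate, term_months):
    # Naive annualization of a flat charge over the loan term;
    # a true APR also reflects the payment schedule and any fees.
    return flat_rate * 12 / term_months

print(f"{simple_apr(0.0175, 3):.1%}")  # 7.0%  -> near the quoted low end
print(f"{simple_apr(0.0474, 3):.1%}")  # 19.0% -> near the quoted high end
```

The same flat rate annualizes to a lower APR on a six-month term, which is why the quoted range spans a factor of three.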

POV: Short-term infusions of cash are critical for small business, whether to bridge expenses during a traditionally slow time of year, or to hire a key new worker or workers. QuickBooks, with its intimate view of an applicant's financial picture—many not only enter debits and credits into the application, but connect their bank and other accounts—can make decisions on a loan much more quickly than a traditional bank, with far less documentation prep required on the part of borrowers.

While a small business could conceivably get better interest rates and loan terms with a bank, QuickBooks' lending parameters are far from usurious. There are also no prepayment penalties or underwriting fees involved. However, borrowers do face a key risk factor: the loans are personally guaranteed, rather than secured by collateral. That means entrepreneurs are on the hook if their business operations can't pay back the loans.

It is not clear how much money QuickBooks will lend, but a spokesperson told Bloomberg an initial pilot involving hundreds of businesses went exceedingly well. Overall, the program is a novel way for QuickBooks to generate more revenue from existing customers while taking advantage of the data it already collects.

CVS to offer next-day prescription delivery: Amid its attempt at a mega-merger with health insurer Aetna, CVS Health has announced plans for next-day delivery of pharmacy products, a move that comes as Amazon mulls an entry into prescription drug sales. Here are the key details from CVS's announcement:

CVS Pharmacy will offer free, same-day delivery service within hours from all locations in Manhattan beginning on December 4. Prescriptions and a curated selection of over-the-counter products will be delivered directly from CVS Pharmacy in secure tamper-proof packaging right to customers' doors to assure complete privacy.

Same-day delivery will expand to Miami, Boston, Philadelphia, Washington, DC and San Francisco in early 2018. These new delivery options will enhance CVS Pharmacy's national network of solutions designed to give customers flexibility in how they want to shop.

POV: CVS has already partnered with Instacart to deliver front-of-store items, including over-the-counter medications, but it's not clear how successful that effort has been so far. It has about 2,600 retail stores in the U.S., with stiff competition from the likes of Walgreens, Rite Aid and Walmart. In recent years, CVS has torn down or closed many smaller locations and opened larger-footprint stores with expanded grocery and household item sections. CVS says its Instacart deal will be expanded to 50 percent of U.S. households this year.

While adding rapid delivery for prescriptions can be seen as a defensive hedge against Amazon, CVS and its peers have a center of gravity that shouldn't be discounted. Even assuming that Amazon, should it enter the market, can provide lower prices and satisfactory service, many drug consumers have lengthy relationships with their retail pharmacies and pharmacists, ties that won't necessarily be broken quickly. CVS is nonetheless wise to get out in front of the Amazon threat, and you can expect rivals to follow suit quickly.

IBM Joins Hybrid, Multi-Cloud Data Science Chorus

IBM sings praises of build-anywhere, deploy-anywhere, open-source analytics. Here's a review of what's now a familiar refrain.

Lots of big tech vendors are now singing from the same hymn book when it comes to data platforms and data science. The message to customers is that they’re offering a range of deployment options, including hybrid-cloud and multi-cloud for agility. They’re also saying they’re open, supporting a range of open source languages, notebooks, frameworks and libraries. IBM hit on all these notes at its November 2 Cloud and Cognitive Summit in New York, but how does it stand out?

IBM’s Cloud and Cognitive Summit marked the introduction of two new tools and a new Hadoop and Spark service on the Watson Data Platform. Executives also revealed the Kubernetes-based containerization of IBM Data Science Experience, a move they said will enable organizations to build and deploy models wherever the data lives. Here’s a deeper look at the details.

It Starts with the Platform

“You can’t get to AI without IA.” That’s how Rob Thomas, general manager of IBM Analytics, explained the need for solid information architecture as an underpinning of artificial intelligence. Indeed, data management comes first, and IBM describes its Watson Data Platform as a kind of operating system for modern, data-driven applications. This cloud-based platform was launched last fall at the Strata NY conference in 2016. The Cloud & Cognitive Summit was the launching pad for two new platform capabilities: Data Catalog and Data Refinery.

Data Refinery checks the box for self-service data-prep capabilities, though my sense is that it’s a starting point (see analysis below). Data Catalog helps users, particularly business users, get their arms around available data by tagging or ingesting preexisting metadata and creating an index of all available assets. The catalog spans data wherever it lives – on-premises or in the cloud, structured and unstructured – but IBM says it’s not just about data: using an API, admins can also inventory assets including models, pipelines and even dashboards.

The Summit also marked the general availability of the IBM Analytics Engine, the company’s new Hadoop and Apache Spark service. IBM already offered Hadoop and Spark services, of course, but the Analytics Engine was hatched this summer after the company ended development of its own IBM BigInsights distribution and related cloud service in favor of a partnership with Hortonworks. The new service separates storage and compute decisions, with persistence options including a new IBM Db2 Event Store that uses the Parquet data format to deliver what IBM says is much better performance than ordinary object stores.

Constellation’s analysis: Access control, governance and a shared collaborative and community workspace are the key concepts behind Watson Data Platform. The platform gives large organizations with lots of data sources, data pipelines, models and data-driven applications a centralized, project-oriented home in which to prepare, store and analyze data and then deploy and manage models. The analyze, deploy and manage aspects are handled with the IBM Data Science Experience (detailed below).

With the new Data Catalog and Data Refinery capabilities, Watson Data Platform adds depth as a data-management and governance layer. Seeing the demos and talking to multiple executives at the Summit, I came away wanting more detail. I liked the cataloging vision of being able to inventory pipelines, models, dashboards and other assets as well as data. But there wasn’t a lot of nitty-gritty insight into the out-of-the-box capabilities versus what you can do with APIs. As you can read in my Constellation ShortList on Data Cataloging, there’s a lot to a state-of-the-art product in terms of crawling sources, automatically tagging, applying machine learning to track and understand access patterns, supporting collaboration around assets and offering intelligent recommendations to catalog users. I need to see more and talk to customers before I would add IBM Data Catalog to my ShortList.

A couple of executives I spoke to at the Summit described the Data Refinery as a work in progress. The current plan is for the Refinery to be an extra-cost option, but as a buyer, I’d want to see the list of out-of-the-box connectors and details on assisted data-prep and recommendation capabilities, as outlined in my Self-Service Data Prep ShortList. At this writing there’s a free beta available, so it’s possible to do some comparison shopping before paying extra for this feature of the Watson Data Platform.

Every modern data platform worth its salt now separates storage and compute decisions, and the IBM Analytics Engine was an obvious and inevitable update given the end of BigInsights development. IBM has joined Microsoft, Oracle and Pivotal, among others, in offering cloud services based on the ODPi standard. Adding the Event Store is a good step for performant object storage, though I have no idea why IBM has saddled it with “Db2” branding given that it has nothing to do with that commercial relational database.

Data Science Experience Goes Multi-Cloud

IBM introduced Watson Data Platform and Data Science Experience (DSX) back in 2016 with support for open-source options including Apache Spark, R, Python, Scala and Jupyter notebooks. At last week’s event it joined the chorus of notable vendors (also including Microsoft, SAS and SAP) talking up hybrid and multi-cloud freedom of choice for data science work. In the case of DSX, this multi-cloud support has been made possible by the recent Kubernetes-based containerization of the product, so it can be deployed in Docker or Cloud Foundry containers “wherever the data lives.” There was also mention of DSX integration with GitHub, although this is apparently in the formative stages (see analysis below).


IBM Data Science Experience provides a project-oriented management layer for unified, controlled access to data, models, pipelines, notebooks and collaborative work spaces.

DSX is both a part of and, optionally, independent from Watson Data Platform as DSX Local, which can run behind corporate firewalls or on desktops. DSX provides permission-controlled, collaborative access to projects, data, data science tools, services, and a community space. With its support for R, Python and Scala and Jupyter and (now on DSX Local) Apache Zeppelin notebooks, DSX users can tap popular open source libraries including Spark MLlib, TensorFlow, Caffe, Keras and MXNet.

IBM says DSX’s big differentiator is its ability to support “clickers as well as coders.” I covered the coders part above. Clickers, meaning non-data scientists, use DSX as a gateway to SPSS, which is IBM’s commercial offering supporting point-and-click and drag-and-drop modeling and statistical analysis. SPSS is also the source of IBM’s machine-learning-driven, automated model development, deployment and optimization capabilities, which were rebranded from IBM Predictive Analytics to Watson Machine Learning in October 2016.

Constellation’s analysis: IBM and other leading commercial vendors have gotten the message that data scientists want open source options and hybrid and multi-cloud deployment options through which they can avoid vendor lock-in. This year I’ve seen lots of analytics and data science platform progress announcements, from Cloudera, Databricks, IBM, and Microsoft to Oracle, SAP, SAS, Teradata and more. Common themes include support for Spark for processing; object stores for the separation of storage and compute; column stores for performance; R, Python and, in some cases, Scala, for language support; Jupyter and Zeppelin notebook support; and access to Spark ML, TensorFlow, Caffe, Keras, and other leading frameworks and libraries.

These data science platforms provide a centralized environment for securely sharing access to data, collaborating around models and then deploying, monitoring and maintaining models at scale. Cloudera is focused on doing this work on its own Hadoop/Spark platform whereas IBM, Oracle, Microsoft, SAP and SAS also integrate with their respective commercial data warehousing platforms, streaming capabilities, analytics tools and libraries, and public clouds.

Amazon Web Services and Google both have enviable data platform and data science portfolios as well, but their emphasis is on doing it all in their respective public clouds, which isn’t always possible for big enterprises with lots of systems and data still on premises. IBM, Microsoft and SAS have embraced containerization for hybrid and multi-cloud deployment, acknowledging that customers want to be able to analyze data and build, deploy and run models anywhere, including rival public clouds.
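Containerization is what makes the "build anywhere, deploy anywhere" pitch concrete: package the runtime, libraries and notebook server into an image, and the same environment runs on-premises, in a public cloud or on a laptop. A purely illustrative sketch (the base image and package list are my assumptions, not IBM's actual DSX image):

```dockerfile
# Hypothetical portable notebook environment; not IBM's actual DSX image.
FROM python:3.6-slim

# A typical open-source data science stack; pin versions for reproducibility.
RUN pip install --no-cache-dir jupyter pandas scikit-learn

WORKDIR /workspace
EXPOSE 8888

# Listen on all interfaces so the container is reachable from the host.
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--no-browser", "--allow-root"]
```

The same image can then be handed to a Kubernetes cluster on any cloud, which is the portability argument IBM, Microsoft and SAS are all making.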

IBM and SAS have had a lot to say about support for open source languages and libraries (and in IBM’s case, Apache Spark), but their commercial analytics software offerings are also part of their Data Science platforms. As a customer, I’d want to know exactly what commercial software I’m licensing or subscribing to along with the platform, the terms of that investment and whether there are options to consume that software in an elastic, services-oriented model on-demand.

I was heartened to hear that IBM is also pursuing GitHub integration with DSX, but few details were available on this push. Among the many data science platform announcements I’ve seen this fall, I’d have to say I was most impressed by Microsoft's next generation of Azure ML (currently in beta). Microsoft has integrated with GitHub to track the end-to-end lifecycle of code, configurations and data (as well as the lineage and provenance of data) used throughout the model development/deployment/optimization lifecycle.

Being able to track data lineage is crucial to satisfying regulatory requirements in the banking and insurance sectors. It’s also what’s needed to satisfy General Data Protection Regulation (GDPR) requirements looming in the European Union and to meet growing demand for explainable and interpretable predictions and recommendations. I suspect IBM is on the same track, bolstering its data-governance capabilities with GitHub.

In a separate announcement on November 2, IBM, Hortonworks and ING Group said they are working with the Linux Foundation to promote an open data governance ecosystem that will define interfaces for diverse metadata tools and catalogs to exchange information about data sources, including where they are located, their origin and lineage, owner, structure, meaning, classification and quality. This work stands to benefit both cataloging and, more importantly, data governance and GDPR compliance.

Related Reading:
Microsoft Stresses Choice, From SQL Server 2017 to Azure Machine Learning
Oracle Open World 2017: 9 Announcements to Follow From Autonomous to AI
SAP Machine Learning Plans: A Deeper Dive From Sapphire Now

Digital Transformation Digest: Dreamforce Focuses on Personalization, the IoT Implications of Broadcom-Qualcomm, and Amazon's Risky Discounting Program

Constellation Insights

Salesforce focuses on the personal touch at Dreamforce 2017: One of the busiest weeks San Francisco sees all year is underway, with the start of Salesforce's Dreamforce 2017 conference. As expected, the company has made a slew of product announcements spanning AI, IoT, low-code development and other areas, but there's a clear through-line that will be emphasized all week: Salesforce offers a collection of platforms that can be fine-tuned not just for an organization, but for individual workers' needs and desires. Here's a look at the highlights.

MyTrailhead is a revamped version of Trailhead, the online training service Salesforce first launched in 2014. Trailhead has always used gamification and other modern learning techniques, but the new version adds deep customization capabilities. Trail Maker is a guided setup toolset that companies can use to build out custom training content portfolios, drawing both on Salesforce's library and on content of their own creation. Trail Mixer gives employees and managers the means to pull together bundles of training material for specific roles, and then share them with others. Trail Tracker and Trail Checker focus on accountability, using rewards badges, quizzes and other tools to maintain a record of employees' progress on the training platform.

POV: MyTrailhead is set for a pilot program in the first half of next year, with general availability to follow later in 2018. Based on the descriptions Salesforce provided, the service is evolving significantly. However, unlike Trailhead to date, myTrailhead is a paid service. This is a significant but perhaps not unexpected change. While Salesforce already has a paid certification program, myTrailhead represents an additional revenue opportunity; companies that have already embraced the free version may find value in the additional capabilities. Pricing won't be disclosed until the GA date, however, and it is not clear whether a free version of Trailhead will remain in place.

MyIoT is Salesforce's attempt to bring IoT development capabilities to any worker. The initial product is IoT Explorer, which provides a point-and-click interface for developing IoT apps on the Salesforce platform. Salesforce cited use cases naturally attuned to its sales, marketing and service milieu, such as a car dealer creating a workflow app that automatically places service-appointment calls to connected cars when they reach a certain mileage.

POV: Salesforce has relied on partners such as Amazon Web Services for device connectivity, while focusing on providing an IoT application development environment and runtime. Nothing changes in that regard with myIoT; what remains to be seen is how much of its vision of LoB workers spinning up custom IoT apps comes true. Hopefully, Dreamforce will showcase early customers having success with it. IoT Explorer is generally available now with pricing starting at $6,000 per month for companies with enterprise licenses or above.

Salesforce is also announcing mySalesforce, another low-code service for building branded mobile applications; myLightning, which adds more customization and branding capabilities to its underlying Lightning development framework; and myEinstein, for creating AI-driven applications in a point-and-click manner.

Overall, there's a lot on offer at this year's Dreamforce, and we will be following it closely all week.

Broadcom's record bid for Qualcomm and the IoT implications: The semiconductor market was roiled Monday by the announcement of Broadcom's $130 billion takeover offer for Qualcomm, the world's dominant maker of the SoC (system-on-chip) integrated circuits that power the world's higher-end smartphones.

While the proposed deal instantly drew talk of severe antitrust hurdles, Broadcom took a step last week that could help its cause, announcing plans to move its legal headquarters from Singapore back to the United States. It had moved to Singapore for tax reasons, but cited proposed Republican changes to U.S. tax laws as the reason for the return.

Qualcomm, meanwhile, is in the middle of acquiring NXP. A combined Broadcom-Qualcomm would be the world's third-largest chipmaker after Intel and Samsung. (Intel made waves of its own on Monday, announcing a deal with AMD on new processors that combine Intel chips with AMD GPUs (graphics processing units), a move that will step up competition with NVIDIA.)

POV: Both the NXP purchase and Broadcom's bid for Qualcomm are far from done (and the latter, in particular, depends heavily on borrowed cash, which brings its own challenges), but in broad strokes they are to be expected. IoT market predictions vary, but all of them point to stratospheric rises in the number of connected devices, as well as in the average sophistication of those devices over time. That translates into a need for lots of increasingly powerful yet less expensive chips, and one clear path to that outcome is industry consolidation.

Amazon's new discounting program carries risks: In advance of the holiday shopping season, Amazon has made yet another bold move in a bid to maintain and grow online market share. A new "Discount by Amazon" program applies discounts to items sold on Amazon by third-party sellers, without those sellers needing to do a thing. Amazon gives the discount directly to buyers, while sellers receive their original asking price (and pay the original sales referral percentage fee).
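The settlement mechanics can be sketched numerically (the 15% referral fee below is an assumed typical rate for illustration, not a figure from Amazon's announcement):

```python
def settle(asking_price: float, discount: float, referral_rate: float = 0.15):
    """Sketch of Discount by Amazon settlement: Amazon funds the discount,
    while the seller nets the original asking price minus the original
    referral fee (computed on the undiscounted price)."""
    buyer_pays = asking_price - discount  # discount funded by Amazon
    seller_nets = round(asking_price * (1 - referral_rate), 2)
    return buyer_pays, seller_nets

# A $100 item discounted by $10: the buyer pays $90, while the
# seller still nets $85 on the original $100 asking price.
print(settle(100.0, 10.0))
```

In other words, the seller's economics are unchanged; the friction is about posted prices, not payouts.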

However, Amazon seemingly did next to nothing to publicize the program to third-party sellers, some of whom have bristled over the potential for it to conflict with agreements they have with product manufacturers over publicly posted prices. Such concerns are paramount for more exclusive brands, which in many cases use higher prices to preserve cachet. Many third-party sellers are themselves aiming for a boutique image, rather than catering to bargain hunters.

POV: Amazon does offer third-party sellers the ability to opt out of the program, but it should have done a better job of publicizing it in the first place. Overall, Amazon must strike a delicate balance between aiming for low-price parity with or victory over rivals such as Walmart (and its Jet.com subsidiary), and the concerns of its third-party sellers, which account for half of all sales and have profit needs that don't reflect Amazon's ability to spend profligately to scoop up market share. It will be of interest to see how the discounting program plays out over the next six weeks or so of holiday shopping.

Monday's Musing: Infinite Ambient Orchestration

The Design Point For All Future AI Driven Apps

The quest for mass personalization at scale in an era of artificial intelligence (AI) has led to new models of design for the future of applications. One design point for these new AI-driven smart apps is a concept called Infinite Ambient Orchestration. Its three components can be described as:

  1. Infinite. The design point should consider contextually relevant and relative journey design. These journeys have no beginning or end, and they deliver both stateful and stateless interactions.
  2. Ambient. Elements of artificial intelligence provide contextual relevancy. These capabilities make right-time recommendations to augment decision making and, in many cases, power situational awareness.
  3. Orchestration. In an age of access, not ownership, systems must orchestrate across insight, process, platforms, and ecosystems.

As new systems are created, organizations can expect this design point as a first principle for AI-driven systems.

Your POV.

So what will you automate first with AI? Do you have a digital transformation strategy? Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) org.

Please let us know if you need help with your Digital Business transformation efforts. Here’s how we can assist:

  • Developing your digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.

Digital Transformation Digest: Synopsys Grabs Black Duck for OSS Security, Tor Gets Big Makeover, Microsoft-Adobe's Partnership Still Blooming

Constellation Insights

Black Duck Software scooped up by security vendor Synopsys: It's perhaps a stretch to call a company a startup when it's been in business for 15 years, but that's the case with Black Duck, which offers tools and services for securing and managing open source software. Now Black Duck is making its belated exit in the form of a $565 million buyout by Synopsys. Here are the key details from the companies' announcement:

Software development is undergoing sweeping and rapid change, including the increasing use of open source software (OSS), which makes up 60% or more of the code in today's applications. While the use of open source code lowers development costs and speeds time to market, it has been accompanied by significant security and license-compliance challenges, because most organizations lack visibility into the OSS in use. Black Duck's industry-leading products automate the process of identifying and inventorying the open source code, detecting known security vulnerabilities and license compliance issues.

POV: Black Duck's focus has shifted over the years from open source license compliance toward cybersecurity concerns, which likely made the company a more attractive purchase for Synopsys. One of its chief competitors is Palamida, which was acquired by Flexera in 2016. Now Black Duck's portfolio will work in concert with other Synopsys products, such as Coverity, which provides code-scanning and validation. While Black Duck had reported a sharp uptick in business during the first half of this year, bringing it under the Synopsys umbrella will provide additional scale and visibility going into 2018.

Tor rolling out next-generation onion routing system: Developers of the Tor private communications software have delivered an update some four years in the making. Here's how the changes are summarized in an official blog post:

[T]he legacy onion system has been around for over 10 years and its age has started to show. So let's get a taste of the improvements these next generation onions provide us with:

On the cryptography side, we are looking at cutting-edge crypto algorithms and improved authentication schemes. On the protocol end, we redesigned the directory system to defend against info leaks and reduce the overall attack surface.

Now, from an engineer's perspective, the new protocol is way more extensible and features a cleaner codebase. And finally, from the casual user's PoV, the only thing that changes is that new onions are bigger, tastier and they now look like this: 7fa6xlti5joarlmkuhjaifa47ukgcwz6tfndgax45ocyn4rixm632jid.onion.

All in all, the new system is a well needed improvement that fixes many shortcomings of the old design, and builds a solid foundation for future onion work.
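
The "bigger" addresses the post shows are 56 base32 characters, which encode the service's ed25519 public key plus a two-byte checksum and a version byte. A superficial format check (a sketch only; it does not validate the embedded checksum):

```python
import re

# Next-generation (v3) onion addresses are 56 lowercase base32
# characters (a-z, 2-7) followed by the ".onion" suffix.
V3_ONION = re.compile(r"[a-z2-7]{56}\.onion")

def looks_like_v3_onion(addr: str) -> bool:
    """Superficial format check for a v3 onion address."""
    return bool(V3_ONION.fullmatch(addr))

print(looks_like_v3_onion(
    "7fa6xlti5joarlmkuhjaifa47ukgcwz6tfndgax45ocyn4rixm632jid.onion"))  # True
print(looks_like_v3_onion("expyuzz4wqqyqhjn.onion"))  # legacy 16-char address: False
```

Legacy (v2) addresses were only 16 characters, which is why the post jokes about the new onions being bigger.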

POV: The developers note that the new release remains in early days, still undergoing testing, and that many more features are to come as the code base stabilizes. The legacy Tor system will remain the default option for users for the time being, and will still be available for some years after the switchover is made to the new version, according to the blog.

It's important to underscore that the changes in the new version focus largely on improved security, and apparently very little on faster performance. Tor has always provided a sluggish user experience (albeit due to the nature of its architecture), and that has likely held down adoption numbers and overall awareness. Speeding up Tor is obviously crucial to work on over time, but for now its faithful users will no doubt appreciate a fresh approach to security in an age when both malicious attacks and the prying eyes of authorities are at an all-time high.

Adobe, Microsoft bring together CX and CRM: Microsoft and Adobe have integrated their Dynamics CRM and Experience Manager content management products, in the latest instance of progress on the companies' ongoing partnership. 

Teams from Microsoft and Adobe have been working together for more than a year on product integrations. The tie-in between Experience Manager and Dynamics CRM has a dual purpose: delivering personalized content to websites, while feeding back lead-generation information to Dynamics CRM. Together, the products comprise a gestalt for marketing and sales professionals, according to Adobe and Microsoft:

The tight integration of marketing with Dynamics 365 customer data provides joint customers with a complete view of their customers at every interaction. For example, if someone searches for a gym membership, the brand can intelligently customize its landing page, mobile app, chatbot and all other engagement to be focused on her activity of interest, such as yoga. This level of personalization helps increase the individual’s engagement through a more seamless interaction, with a high likelihood for her to convert to become a customer.

More than 150 trillion customer data transactions and 41 trillion rich media requests move through Adobe Experience Cloud each year, according to a statement.

POV: Unlike other tech partnerships, which can be high on sizzle and less so on substance, Microsoft and Adobe's pact has teeth. They are collaborating closely, with sales compensation implications on deals for reps at both companies, says Constellation VP and principal analyst Cindy Zhou. Moreover, Adobe and Microsoft's partnership is tackling a problem that truly needs solving, she adds.

"The problem is that with the dizzying array of marketing tech and sales tech out there, it is making it difficult to gain a unified view of customer data," Zhou says. This is leading to problems with customer personalization, ROI outcomes for marketing campaigns, and potentially, problems with the likes of the General Data Protection Regulation, a stringent new consumer privacy framework set to take effect next year. (You can download an excerpt of Zhou's new report, "A Guide to GDPR Compliance for Marketers," at this link.)

My Future of Work Coverage Areas

I often get asked about which areas/markets/companies/products I cover, so I thought I'd make a graphic and a video that discusses some of them.
