Results

Event Report: Dreamforce 2017 - Quip's Digital Canvas Brings Context to the Salesforce Platform and Beyond

From Monday, November 6 through Thursday, November 9, Salesforce held its annual Dreamforce conference in San Francisco. This massive gathering of customers and developers covers the full range of Salesforce offerings, from Sales to Marketing to Customer Service and much more. Constellation Research had a team of analysts in attendance, with my focus being on the collaboration components of the event. Below I will discuss three key areas: Quip, Community Cloud, and the Salesforce and Google partnership announcement.

For more information, please read the posts by my colleagues Cindy (Marketing and Sales), Doug (Data and Analytics), Holger (Infrastructure and application development), Andy (IoT) and Dion (leadership and business transformation).

Quick Summary

  • Quip: Introduced Live Apps, which enables people to embed a variety of content onto a Quip page, turning it into a Digital Canvas for collecting information in context on a single page
  • Community Cloud: Introduced Lightning Flow and Einstein Answers
  • Salesforce and Google partnership: Access Salesforce information in Gmail and Google Sheets. Salesforce customers can trial G Suite for free for a year. (* several restrictions apply)

Below is a short video where I discuss these items.

 

Quip - The Digital Canvas That Glues Salesforce (and many other things) Together

For those of you that are unfamiliar with Quip, it started as an online "word processor" that combined documents, spreadsheets and chat into a seamless experience. They were acquired by Salesforce in August 2016, and since then have been rapidly expanding the variety of content that can be embedded onto a page, culminating in the announcement at Dreamforce of what they are calling, Live Apps. Live Apps fall into three categories:

  • Native features (tables, images, kanban boards, countdown timers, progress bars, Salesforce records, etc)
  • Business Partner integrations (currently: Atlassian Jira, Workplace by Facebook, New Relic, Lucid Chart, Smartsheet, Docusign)
  • Customer applications using the API (an example was shown where 20th Century Fox embedded a custom video player)

MyPOV

As the number of applications and websites people use to get work done grows, so too does the complexity and lack of context. Enter the concept of a "digital canvas", where multiple sources of content can be embedded together in a single place. When conversations can take place in context around these combined canvases, and updates to the content occur in real time, it makes work more productive, effective and efficient. Constellation Research sees Quip as the way Salesforce can tie together not only content from their own services, but those of their partners as well. For example, a Quip canvas could show information from a Salesforce customer record along with all their open customer service tickets, tweets about their products, support documents and files, the marketing campaigns they are part of, product images and a task list, plus enable the account team to have conversations about all these objects in a single place.

Quip faces a few obstacles for mass adoption. First, the concept of a "digital canvas" is new to most employees, so explaining what it does and how it works can be difficult. Does Quip compete with products like Box Notes and Dropbox Paper, or Microsoft Word and Google Docs, or Evernote and Microsoft OneNote, or Confluence and Socialtext? Second, Salesforce needs to reconcile (and vocalize) the long-term roadmap between Quip and Salesforce Chatter. While there are differences, customers still need to know when and why to use each product. Third, licensing, purchasing and administration need to become a seamless part of the overall Salesforce experience in order for it to grow within large enterprises.

Example of a Quip document I used while working on this blog post.


 

Community Cloud - Enabling Conversations, Knowledge Sharing and Commerce

The Salesforce Community Cloud provides digital experiences for use both externally with customers and internally with employees. At Dreamforce 2017 they made several announcements which can be seen in this keynote.

With respect to personal productivity, collaboration and getting work done... the two most significant items are Lightning Flow and Einstein Answers.

  • Lightning Flow (MyPOV conspicuously named similarly to Microsoft Flow) enables people to easily create rules that automate common actions that occur inside communities. For example, if someone selects a certain topic, you could automatically move them into a certain marketing campaign. It will be interesting to see what templates Salesforce (and their partners) provide to help guide people in creating flows.
  • Einstein Answers leverages Salesforce's artificial intelligence engine to enable community members to ask questions and have the most relevant answers returned to them, as well as recommended subject matter experts, without the need for human intervention. 
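Lightning Flow is a no-code product, but the rule pattern it automates ("when a member selects a topic, enroll them in a matching campaign") can be sketched, purely for illustration, as a trigger plus a lookup table. All names below are invented; this is not Salesforce code or its API.

```python
# Hypothetical topic -> campaign rules, standing in for a configured flow.
TOPIC_TO_CAMPAIGN = {
    "cloud-migration": "Q4 Cloud Webinar Series",
    "analytics": "Einstein Analytics Nurture Track",
}

def on_topic_selected(member_campaigns, topic):
    """Trigger fired on a community event: enroll the member if a rule matches.

    member_campaigns: list of campaign names the member already belongs to.
    """
    campaign = TOPIC_TO_CAMPAIGN.get(topic)
    if campaign and campaign not in member_campaigns:
        member_campaigns.append(campaign)
    return member_campaigns

# Example: a member with no campaigns selects the "analytics" topic.
# on_topic_selected([], "analytics") -> ["Einstein Analytics Nurture Track"]
```

The interesting question for templates is exactly this mapping layer: how much of the trigger/condition/action vocabulary Salesforce pre-builds versus leaves to the community manager.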

Salesforce Community Cloud

MyPOV

With partner integrations for everything from ecommerce to video conferencing, custom branding, integration with the rest of the Salesforce portfolio, and new features like Lightning Flow and Einstein Answers, Salesforce Community Cloud provides a very robust platform for both internal and external communities. With one of their main competitors, Jive Software, struggling in 2017, the market is ripe for Community Cloud to gain traction with new customers and expand its usage within existing Salesforce accounts. I would like to see Salesforce push the internal intranet use case harder than they currently do, as I believe getting employees hooked can lead to expansion into other areas, especially around collaboration via Quip and Chatter.

Salesforce and Google - The Enemy of My Enemy is My Friend

One of the "surprise announcements" at Dreamforce was a tighter partnership with Google. The announcement involves several areas, so I will focus on the productivity and collaboration tools, while my colleague Cindy will cover Marketing and Sales Cloud integration with Google Analytics 360, and Holger will discuss Salesforce's use of Google's data-centres.

At Dreamforce 2015 Salesforce and Microsoft announced several strategic partnerships. However, since Microsoft acquired LinkedIn in 2016 the future of that partnership has been in question. The new announcements with Google hint at a new (but not exclusive) direction for Salesforce with Google's products. The new partnership includes:

  • Salesforce Lightning for Gmail: Users can surface relevant Salesforce CRM data in Gmail, and interactions from Gmail directly within Salesforce.
  • Salesforce Lightning for Google Sheets: Users will be able to auto-update data between Salesforce Records or Reports and Google Sheets.
  • Quip Live Apps for Google Drive and Google Calendar: Teams will be able to embed and access Drive files (e.g., Google Docs, Slides, Sheets) or their Google Calendar inside Quip.
  • Salesforce for Hangouts Meet: Within Hangouts Meet, users will be able to surface relevant Salesforce customer and account details, service case history, and more.

Also, Salesforce customers who are not currently Google G Suite customers will be able to trial G Suite for up to a year. There are several restrictions and details to this offer which can be viewed here.

MyPOV

Stronger integration between Google's and Salesforce's products will be very welcome to their joint customers. It also could influence customers making a decision between Microsoft Office 365 and Google G Suite, and similarly customers deciding between Salesforce CRM and Microsoft Dynamics. It will be interesting to see where this partnership is in a year, including how many customers have taken advantage of the trial and how far along the product integration has come.

Conclusion  

Salesforce Dreamforce is one of the "can't miss" tech events of the year. This year there was not a major product announcement like there has been in previous years (ex: introduction of Salesforce Lightning or Salesforce Einstein) but rather an overall "maturing" of the product lines. This includes new features, new integrations, new customers and new partners... which is a good thing. Sometimes incremental improvements mean more to customers than a big bang. For my coverage of personal productivity and collaboration, obviously the introduction of Quip's Live Apps is significant. The ability to bring context to content and conversations via a seamless "digital canvas" is going to play an important role in the Future of Work.

 


HPE, Rackspace to Debut Pay-As-You-Go Private Cloud

Constellation Insights

Hewlett-Packard Enterprise and Rackspace are teaming up on a new offering that will combine the OpenStack IaaS fabric with HPE servers run at a data center of a customer's choosing, with metered pricing. The companies claim customers can save at least 40 percent compared to using the "leading public cloud," also known as Amazon Web Services. Here's how HPE and Rackspace describe the value proposition:

Leveraging HPE Flexible Capacity, customers pay for what they use in an on-demand consumption model for infrastructure. This feature enables private cloud customers to more closely align resources to growth and handle burst capacity and traffic spikes without the need to pay for additional fixed capacity.

Enable enterprise-grade security and reliability: With a single-tenant model, customers can eliminate the performance and “noisy neighbor” issues commonly found in multi-tenant environments, and can more easily meet security, compliance and data sovereignty needs.

Rackspace will provide managed services for the systems, with a 99.99 percent uptime guarantee. The company terms itself the world's most seasoned OpenStack operation, with greater than a billion "server hours of OpenStack expertise."

The systems are set for general availability on November 28. Rackspace plans to offer similar managed private clouds based on VMware and Microsoft Azure Stack next year.

Analysis: A sensible partnership, but will customers follow?

On paper, the deal makes good sense for both HPE and Rackspace. The former gave up its public cloud ambitions entirely, ceding the market to AWS and others, but still has massive amounts of server and networking kit to sell. HPE also has useful software assets for private cloud management, with the acquisition this year of Cloud Cruiser. It had already been the largest customer of Cloud Cruiser's technology for monitoring and measuring IT infrastructure usage and spending, rebadging it as HPE Flexible Capacity.

Meanwhile, Rackspace has steadily moved away from its roots as a hosting provider and into specialized managed services, particularly for OpenStack, which it co-created with NASA in 2010. Today, OpenStack is a highly successful project managed by the OpenStack Foundation, with more than 500 participating companies. That means Rackspace can't claim to be the only player with the right OpenStack chops, but its experience with the technology obviously puts it in the lead.

Where HPE and Rackspace's announcement should prompt skepticism is in the cost-savings claims. Rackspace's website says its estimates are based on an "internal pricing analysis," the methodology of which hasn't been made public. Prospective buyers of the new private cloud service would do well to hold Rackspace's feet to the fire on its pricing claims.

However, even if the savings don't quite measure up to those lofty heights, the large enterprises HPE and Rackspace are targeting may find the service's key differentiator lies in its single-tenant model, although AWS and other public cloud providers may have a quibble with HPE and Rackspace's invocation of "noisy neighbor" problems on their services. The term refers to cases where, on a multitenant cloud service, a particular application begins to hog bandwidth and other shared resources, causing performance issues for other tenants. This problem can be avoided through bare-metal deployment options, which have been available for years but do incur fractionally higher costs.

Overall, HPE and Rackspace's announcement came without much ceremony, but the service's success will certainly be one to watch closely in the coming months.


Salesforce Dreamforce 2017: 4 Next Steps for Einstein

Salesforce Einstein Prediction Builder, Bots, Data Insights and a new data-explorer feature stand out as the big AI and analytics announcements. Here’s what they’ll do for your business.

To Salesforce customer Bill Hoffman, Chief Analytics Officer at Minneapolis-based US Bank, the “A” in “AI” is about “augmented” intelligence because, as he said in a keynote at this week’s Dreamforce event in San Francisco, “there’s nothing artificial about it.”

US Bank has deployed Salesforce Einstein capabilities including Predictive Lead Scoring and Einstein Analytics (formerly known as Wave) for customer attrition analysis and retention efforts. It's also using Einstein Discovery (formerly BeyondCore) to better understand customer behavior and cross-sell opportunities. The bank expects to roll out Einstein capabilities to more than 2,000 of its customer-facing financial advisers across the firm in hopes of “personalizing service at scale” and “creating a differentiated customer experience,” Hoffman said.

Personalizing at scale is precisely the idea behind two “myEinstein” capabilities announced at Dreamforce. Also announced were two Einstein Analytics capabilities. All four capabilities are coming to the portfolio next year. Here’s what they promise to do for your business.

Einstein Prediction Builder: Plenty of Salesforce customers are using or considering machine-learning-based Einstein capabilities, most of which were detailed in my 19-page report published earlier this year. But at Dreamforce 2017 we heard the revealing stat that some 80% of the customer data in Salesforce is tied to custom (customer-defined) fields and objects. No surprise, then, that the number-one ask among Salesforce customers was for customizable, as well as pre-built, Einstein insights, predictions and recommendations.

Einstein Prediction Builder is a no-code capability designed to enable non-data-scientists to develop predictions using custom fields. Use cases are limitless, but popular use cases are likely to include cross-sell/up-sell, churn, CSAT and propensity-to-escalate analyses. Prediction Builder will be powered by the same machine-learning data pipeline that handles millions of Einstein predictions per day, but it will be opened up – starting with a February beta release and likely June general release – to custom fields and objects in Salesforce. Pricing has not been finalized.

Einstein Bots: Salesforce picked up strong natural language understanding and natural language translation capabilities through its 2016 acquisition of MetaMind. Einstein Bots, a second My Einstein feature, will couple these language capabilities with Salesforce data and the Salesforce workflow engine to power automated customer-service agents. The idea is to handle the bulk of the simple, frequent service cases, such as user password resets, while leaving the long tail of complex and infrequent service inquiries to human agents.

As with Prediction Builder, Einstein Bot development will be a no-code proposition. It will start with point-and-click selections, workflow setup and uploading of spreadsheets of sample customer-service interaction text to train the language model. Beta release is expected in February with general availability to follow in June. Pricing will be announced at general availability, but I expect it to be based on the volume of cases handled over a specified time. The Bots will start with text-based interaction, but voice-based interaction is likely to follow.
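To make the train-from-samples idea concrete, here is a toy intent matcher standing in for the language model that would be trained from uploaded spreadsheets. The intents and utterances are invented, and this bag-of-words overlap scoring is far cruder than any real NLU engine; it only illustrates the shape of the workflow.

```python
# Invented sample interactions, as might come from an uploaded spreadsheet.
SAMPLES = {
    "password_reset": ["I forgot my password", "reset my password please"],
    "order_status":   ["where is my order", "track my order status"],
}

def classify(utterance):
    """Return the intent whose sample vocabulary best overlaps the utterance."""
    words = set(utterance.lower().split())
    scores = {
        intent: sum(len(words & set(s.lower().split())) for s in samples)
        for intent, samples in SAMPLES.items()
    }
    return max(scores, key=scores.get)

# classify("please reset my password") -> "password_reset"
```

In a real bot the matched intent would then route into a workflow (a password-reset flow, an order lookup), with unmatched or low-confidence cases escalated to a human agent, which is exactly the simple-versus-long-tail split described above.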

Einstein Data Insights: This new Einstein Analytics (formerly Wave) capability provides deeper insights into standard Salesforce reports from the Sales Cloud, Service Cloud and, eventually, other clouds. Powered by the same engine behind Einstein Discovery, Einstein Data Insights will automatically surface important trends, outliers, changes over time and even data-quality problems within standard reports, displaying a combination of visualizations and textual explanations. Users will press a button embedded on a standard report and the visualizations and textual explanations will appear on the right side of the screen (see image below). This capability is also expected to see beta launch in February with general availability next June. The pricing model has yet to be determined.
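Salesforce has not published the statistics behind "automatically surface outliers," so purely as an illustration, one simple way to picture it is a z-score test over a report column: flag values that sit far from the mean relative to the spread.

```python
from statistics import mean, stdev

def outliers(values, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the mean.

    A deliberately simple stand-in for whatever methods Einstein Data
    Insights actually uses; threshold=2.0 is an arbitrary choice.
    """
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) > threshold * s]

# outliers([10, 11, 9, 10, 12, 10, 50]) -> [50]
```

The value of the product feature is less the math than the packaging: running checks like this across every column of a standard report and translating the hits into plain-language callouts.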

Einstein data explorer feature: This capability, which will be included with Einstein Analytics, will let you have "a conversation with your data," says Salesforce, by typing in questions in plain English. Behind the scenes, keyword-driven interpretation will help you drill down on dashboards and visualizations to better understand not just what happened but why it happened. You could drill down on a total figure, for example, by typing “amount by product.” Or you could analyze performance by typing in “lost deals by product.” This feature is expected to be generally available in February.
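The keyword-driven pattern behind queries like “amount by product” can be sketched as a tiny "measure by dimension" interpreter. The field names and data below are invented for illustration; Salesforce's actual parser is certainly more sophisticated.

```python
def run_query(phrase, rows):
    """Interpret a '<measure> by <dimension>' phrase against dict rows,
    returning the measure summed per dimension value."""
    measure, _, dimension = phrase.partition(" by ")
    totals = {}
    for row in rows:
        key = row[dimension]
        totals[key] = totals.get(key, 0) + row[measure]
    return totals

# Invented sample report rows.
deals = [
    {"product": "A", "amount": 100},
    {"product": "B", "amount": 50},
    {"product": "A", "amount": 25},
]
# run_query("amount by product", deals) -> {"A": 125, "B": 50}
```

The hard part in practice is mapping free-text keywords onto real field names and chart types, which is where the "keyword-driven interpretation" layer earns its keep.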

My Perspective on Einstein’s Progress

As compelling as the coming Einstein capabilities are, the big question on the minds of many customers is “how much will it cost?” It seems we’re still in a chicken-and-egg phase in which both Salesforce and customers are trying to figure out how much Einstein capabilities are worth. Different sorts of predictions and recommendations have different values, depending on the cloud and the types of actions triggered. The size and nature of the customer adds another dimension of complexity, with large enterprises sometimes preferring all-you-can-eat enterprise deals. Salesforce, meanwhile, needs to establish clear revenue expectations to keep its investors and Wall Street happy. Innovation presents its challenges.

Picking up on trends in big data and open-source software pricing these days, one possible pricing approach would be to provide free access to Einstein development tools and a limited number of predictions or recommendations so businesses can get a sense of what they can do. Once the capability is deployed, volume-based per-prediction or per-recommendation charges could kick in. In this way, charges would be tied to the value delivered to the customer, although different customers would surely have different perceptions of value, so it might be hard to come up with a one-size-fits-all pricing scheme.
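As a back-of-the-envelope illustration of that freemium model: a free monthly allowance of predictions, then a flat per-prediction charge above it. The allowance and rate below are invented for the sketch, not Salesforce figures.

```python
def monthly_charge(predictions, free_allowance=10_000, per_prediction=0.01):
    """Hypothetical freemium billing: first `free_allowance` predictions
    in a month are free, the rest are billed at `per_prediction`."""
    billable = max(0, predictions - free_allowance)
    return billable * per_prediction

# monthly_charge(25_000) -> 150.0  (15,000 billable predictions at $0.01)
# monthly_charge(5_000)  -> 0.0   (under the free allowance)
```

The per-cloud and per-customer complexity described above would show up as different rates and allowances per prediction type, which is precisely what makes a one-size-fits-all scheme hard.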

One thing that customers might have found confusing at Dreamforce was the distinction between Einstein and Einstein Analytics (formerly Wave Analytics). There were two separate keynotes at Dreamforce and there are two separate teams behind different sets of Einstein capabilities. But they are both part of one portfolio and a continuum from descriptive and diagnostic analytics to predictive analytics and prescriptive recommendations and actions (as well as advanced language and vision capabilities and APIs for human-interactive applications). Before you can get to the predictive and prescriptive part you need to have good data and reporting in place.

US Bank is using capabilities across the Einstein continuum, and Bill Hoffman, when asked for advice during a keynote, said you have to start with data quality and you have to bring in key stakeholders and risk management and compliance partners from the beginning. In short, don’t expect to get the sizzle of “AI” without addressing the meat-and-potatoes of data management and baseline reporting and analytics.

The unsung announcements that didn’t get as much attention at Dreamforce included a recent rewrite of the Einstein Analytics engine said to deliver a 30% reduction in data-ingestion and query times. Available data capacity was also more than doubled to 1 billion rows per customer. For easier data loading from external sources, Salesforce has added out-of-the-box data connectors for AWS Redshift, Google BigQuery and Microsoft Dynamics, and more than 20 additional pre-built connectors are to be added over the next six months. Finally, Smart Data Prep capabilities have been enhanced with data profiling, auto clustering, anomaly detection, filtering and transformation suggestions.

These upgrades aren’t the sexy stuff, but they are day-to-day productivity improvements that will help sell customers on handling analytics within Salesforce and advancing to Einstein predictions and recommendations.

Related Reading:
Tableau Conference 2017: What’s New, What’s Coming, What’s Missing
Oracle Open World 2017: 9 Announcements to Follow From Autonomous to AI
Microsoft Stresses Choice, From SQL Server 2017 to Azure Machine Learning


Salesforce IoT: A Major Showcase at Dreamforce

It’s that time of the year when Salesforce persuades almost unbelievable numbers of people (said to be 170,000) to come to San Francisco for its annual Dreamforce event. It’s no longer possible to classify attendees as customers, developers, or even pure CRM professionals, as the pervasive nature of Salesforce technology at the center of Enterprise capability has broadened the roles and interests of the attendees. The packed exhibition halls offer a scale and diversity of exhibitors that more closely resembles an industry trade show than a vendor show; yet everything on show runs on, or integrates through, Salesforce technology, including IoT!

Salesforce is certainly no newcomer to IoT, but what is new to the industry generally is the Salesforce view on what IoT can do for an Enterprise, and how to get started. It’s a carefully crafted approach that matches and complements their other technology, and the working practices of the ‘core’ Salesforce delivery staff in their various Enterprises.

The IoT keynote introduced the Salesforce point of view and proposition for IoT, with a series of following sessions used to build out the details, with deployment examples from two, three or four customers as proof points. Here are the Salesforce IoT key message statements presented:

  1. Behind Every Device there is a Customer;

Salesforce IoT products, tools, and deployment all relate directly to those activities that in some way, or other, will create a better customer experience.

A statement that positions Salesforce with a simple, clear, and well-defined Business focus in what is otherwise a very broad, often technology defined, marketplace. The focus allows Salesforce to align IoT with their core Business proposition, customer base, and Industry experience and is a contrast with the majority of the IoT market which aligns with machines and process improvements.

  2. Events, and Data, make IoT Business Valuable

Salesforce defines IoT value as any ‘real-time’ event triggers that create data inputs rather than limiting IoT to the often-used conventional definition referring to data from Sensors.

Whilst this may not necessarily fit with the popular Industrial IoT categorization, it does allow Salesforce to make use of a wider range of Enterprise sources as ‘IoT’ inputs. Salesforce IoT capability is able to make effective use of inputs from all nine principal areas of ‘engagement’ to be found in ‘Systems of Engagement’. Some of these sources, such as Social, are clearly of great value to the Salesforce IoT customer focus in 1).

  3. Deploy from Salesforce Cloud Engines Business Value

IoT pilots, and small-scale deployments, have been hard to justify on business grounds due to the need for unique, complex event processing.

Creating IoT event data from simple low-cost deployments is not difficult, but making use of the data has often been a barrier to any real success. The premise that IoT will support ‘real-time’ read and react capabilities that deliver new business value has been difficult to deliver without expensive investments in complex Event Engines. Salesforce IoT works in association with, and fully integrated to, Salesforce Clouds offering a familiar development environment and full integration with other Enterprise activities. The standard Salesforce benefits of starting small at low cost and scaling up enable practical IoT projects to be quickly tried and tuned.

  4. Salesforce MyIoT Personal Toolset Eases Delivering Initial Value

A new product to facilitate rapid deployment of ideas as fully functional implementations, using a highly intuitive, business-oriented user interface.

In designing the MyIoT product, Salesforce expects to encourage any Salesforce professional to be able to deliver a proof of concept as a genuine, fully functional solution, thus breaking down the barriers to widespread innovation. MyIoT implementations are fully functional and can be used to address smaller projects, or to prepare for a large-scale deployment.

  5. Updated Salesforce IoT Product Sets for Scale Up

Updated versions of Salesforce IoT Explorer Edition and Salesforce IoT Enterprise Edition complete the Salesforce IoT product range.

Full product function listings for each can be found at IoT Explorer Edition and IoT Scale Edition.

  6. Significant Experience and Trained Advisors and Partners

Salesforce has grown its support and training capabilities to align with the expected increase in interest that the above will create.

Dreamforce provided a significant number of Case Study sessions on different ways that Salesforce had created business value for customers in a wide range of Sectors. In addition there was a dedicated Salesforce IoT Trail area to showcase partners covering Business Consulting, System Implementations and supporting Products to extend functionality.

 

Constellation Summary

Salesforce has been very active from the early days of the IoT market and clearly has gained a lot of practical experience that has been used in defining its IoT market positioning and products.

Cloud deployment models and the shift towards Systems of Engagement, including Machine Learning and Augmented Intelligence, all combine to require IoT data and event management. Add Digital Business market disruption driving increased focus on customer-centric activity, and Enterprises are going to need to look very carefully at their IoT strategy.

With the clarification of direction and product updates, plus their existing investment in Salesforce, existing Salesforce customers should look carefully at the Salesforce IoT proposition.

 

Addendum

Salesforce IoT Overview

- https://www.salesforce.com/products/salesforce-iot/overview/

Salesforce ‘Behind every device is a Customer’

- https://www.salesforce.com/products/salesforce-iot/why-salesforce/

 


Digital Transformation Digest: IBM's Latest Quantum Computing Push, OpenStack Foundation Looks Beyond, Department Chains Still Wrestling with Online Shift

Constellation Insights

IBM pushes quantum computing envelope: Big Blue has fired the latest salvo in the competitive war between itself and the likes of Google and Microsoft over quantum computing, announcing it has created a working prototype of a 50-qubit processor.

Classically designed computers are binary in structure, storing bits as either a one or a zero. But quantum systems take advantage of the behavior of subatomic particles, which can hold multiple states. This phenomenon, which is known as superposition, stands to give quantum systems vast amounts of processing power as they are developed and reach viability at scale. Qubits are the quantum counterpart of traditional bits.
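To make the qubit arithmetic concrete: an n-qubit register in superposition is described by 2^n amplitudes, so the memory needed to simulate one classically doubles with every qubit added. A toy sketch (real quantum states use complex amplitudes and unitary gates; this only shows the exponential bookkeeping):

```python
import math

def uniform_superposition(n_qubits):
    """Amplitudes of an n-qubit register in equal superposition:
    2**n basis states, each with amplitude 1/sqrt(2**n)."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)
    return [amp] * dim

state = uniform_superposition(3)   # 8 amplitudes for 3 qubits
# squared amplitudes are probabilities and must sum to 1
# 50 qubits would need 2**50 (~10**15) amplitudes -- why classical
# simulation breaks down around the "quantum supremacy" threshold
```

This exponential blow-up is exactly why Google has pegged roughly 50 qubits as the point where a quantum machine outruns any conventional computer.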

The near term goal among IBM and its rivals is achieving "quantum supremacy," wherein a quantum computer can finish a task faster than any conventional computer. Google has characterized 50 qubits as the bar for quantum supremacy.

POV: While the 50 qubit system remains a lab creation, IBM will provide cloud-based access to Q quantum computers with 20 qubits of power by the end of this year. Early quantum computers such as these remain highly unstable. IBM says its initial Q systems will have a "coherence" rating—the time available to run quantum computations—of 90 microseconds, although IBM characterizes this result as leading the field.

While IBM and others have signaled that commercial quantum computing systems are on the horizon, for now IBM's goal is to create a critical mass of research and academic interest around its activities. Some 60,000 users have conducted nearly 2 million experiments on IBM's cloud-based quantum computing service, representing nearly two thousand universities, high schools and other institutions, according to a statement.

Desktop Metal hatches heavy 3-D printing plans: A Boston-area startup that has raised more than $200 million for its metal 3-D printing systems is now taking international pre-orders for its Studio System prototyping machine. BMW Group will be the first international company to get one of the systems, according to Desktop Metal's announcement. Interest among U.S. companies for Studio Systems has been strong as well.

But the real—albeit yet unproven—breakthrough for Desktop Metal is set to come in mid-2018, with the release of the Production version of its system. The systems, which use an inkjet printer's approach to creating complex metal parts, will be 100 times faster and 20 times cheaper than laser-based 3-D metal printing systems, according to the company. Desktop Metal Production is geared toward manufacturing at scale, with advantages that other methods can't match.

The systems' sweet spots are smaller batches of complicated parts that can be designed and printed through software, with no special casts or tooling required. Moreover, the parts can be made in non-factory settings, reducing overhead. Another advantage, as noted in The Register: since the parts would be software-driven, the files could be sent electronically to local machines, which on paper would avoid import tariffs associated with bringing goods across borders.

POV: Desktop Metal's inkjet-style approach is not unique, nor is its use of metal in the printing process. Still, the startup has attracted investments from the likes of Google and BMW, and is said to have a rich patent portfolio backing up its commercial ambitions. The challenge now is to deliver Production systems that live up to the speed and cost Desktop Metal is touting.

Macy's places new bets on tech for turnaround: Department store chain Macy's reported third-quarter results this week that saw profits beat estimates but revenue fall 6.1 percent to $5.28 billion. Those numbers reflect continued difficulty in the brick-and-mortar side of the business, but also more success in shifting sales online. Macy's recently brought on a new president, Hal Lawton, who has experience at eBay and Home Depot.

Lawton's unique background and experience working with technology is something Macy's is banking on big-time as it plots a defensive and offensive strategy against not only rivals such as Kohl's but especially Amazon, which has made a series of strategic moves, such as the acquisition of Whole Foods Market, to build out its brick-and-mortar presence.

Macy's has experienced 33 quarters of double-digit growth in online sales, but still has work to do on some fundamentals, CEO Jeff Gennette said during a conference call:

So some of the things that we're focused on with respect to technology, is really making sure that our ongoing site optimization is just really strong, and we learn every day. We do a good job here, but we have lots of opportunities to improve on this. We're looking at mobile and tablet app responsiveness and making sure that we get the conversion rates there where we want them.

One challenge retailers like Macy's face from Amazon is the latter's sheer scope of inventory and product availability. Macy's is looking to expand its direct-ship-from-vendor operations as a way to combat that, Gennette said. In addition, Macy's plans to leverage machine learning for personalized shopping experiences, he said. The latter is hardly a pace-setting move, so it remains to be seen how well Macy's executes.

Macy's is also hoping to lure new customers, particularly so-called Generation Z members, to the fold. Gennette gave a broad outline of the chain's plans here:

And then lastly, to the previous question about on-boarding of new customers and the idea about the Gen Z customer and using user-generated content, being in the social space, using our teams in a more relevant way to market in a more authentic way is all part of what's on our kind of technology playbook.

POV: Macy's recently overhauled its customer loyalty program, and while officials reported that initial feedback has been positive, statistical results weren't made available. Overall, the U.S. brick-and-mortar retail sector remains a boxer in the late rounds, leaning on the ropes, and forecasts for 2018 aren't especially positive. Macy's is one chain talking a good game about innovation and transformation; as the busy holiday shopping season gets underway, it's already crunch time.


Digital Transformation Digest: IBM Adds Privacy Measures in EU Cloud, Office-LinkedIn Integrations Continue, UPS Backs Blockchain for Trucking

Constellation Insights

IBM tightens privacy measures for EU cloud operations: In response to both regulatory and competitive pressures, IBM is adding a new layer of data access and control guidelines at its Frankfurt data center, which serves many cloud infrastructure customers across the EU. Here are the key details from its announcement:

IBM will roll out new controls to ensure access to client content (including client personal data and special personal data) is restricted to and controlled by EU-based IBM employees only. These employees will play a critical role in IBM incident and change management processes by reviewing and approving all changes from non-EU based employees that could affect client data.

In a move that is unique to only IBM Cloud’s dedicated environments in Frankfurt, clients will review and approve all non-EU access requests to their content if an instance requires support or access from a non-EU based employee. If granted, this access is temporary and the client will be notified when the temporary access is revoked. Logs that track access are made available to the client.

Big Blue is also adding to customer support teams in the EU, which will now have around-the-clock local staff. No price increases are anticipated. A third measure, coming next year, will give customers the ability to encrypt their data both at rest and in transit while keeping possession of master encryption keys at all times.

POV: IBM's moves are welcome and necessary in light of recent and upcoming developments in EU privacy law, particularly the General Data Protection Regulation (GDPR), which takes effect next year. IBM is also playing catch-up to rivals such as Microsoft, which has already rolled out similar measures for Azure. Microsoft works with Deutsche Telekom, which serves as a third-party steward overseeing customer data held in EU data centers.

IBM is pledging to add the data privacy improvements in other regions around the world, although it provided no timelines.

The LinkedIn-Office integration story continues with Resume Assistant: As time goes on since Microsoft's landmark $26.2 billion acquisition of LinkedIn, the application integration scenarios between the companies' software are becoming abundant and diverse. The latest is Resume Assistant, a feature that pulls LinkedIn data into Microsoft Word as job seekers are crafting or updating their curriculum vitae:

Leverage relevant examples—See how top people in a field represent their work experience and filter by industry and role for a personalized experience.

Identify top skills—Find the most prominent skills for the type of job you’re seeking so you can more easily increase your discoverability.

Customize a resume based on real job postings—People can see relevant job listings from LinkedIn’s 11 million open jobs and customize their resume to appeal to recruiters.

Resume Assistant also provides hooks into LinkedIn's ProFinder freelance help site, as well as the Open Candidates feature, which tells recruiters combing through LinkedIn that you're available and interested in new opportunities.

POV: This is a classic case of line-blurring between two platforms' unique capabilities that results in something more useful on the whole. While LinkedIn profiles over the past several years have become somewhat of a proxy for traditional resumes, they haven't replaced them by a long shot.

Meanwhile, resumes have long been a largely static art form; while there may be no need for a radical reinvention of the format, the elements that go into them can always be improved, and that's where LinkedIn's rich data set can help job seekers, recruiters and hiring managers alike.

UPS joins blockchain trucking alliance: If you're a group forming an industry consortium around the use of blockchain in the trucking industry, you could do worse than to land UPS, the world's biggest package delivery company, as a member. The Blockchain in Trucking Alliance (BiTA) has done just that, in a move that should provide the group's work with a major infusion of energy. Here's how UPS describes why it joined:

In particular, UPS is exploring blockchain applications in its customs brokerage business. UPS is one of the world’s largest customs brokers, and a key objective of its brokerage strategy is to digitize transactions. Blockchain technology would help by improving transaction accuracy and by replacing existing paper-heavy and manual processes.

UPS wants to leverage blockchain technology to facilitate execution and visibility of trusted transactions between UPS, its customers and government customs agencies. Blockchain, a digital database using blocks that are linked and secured by cryptography, can be used to keep record of any information or assets. This includes physical assets, like transportation containers, or virtual assets, like digital currencies.
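The quoted description of blockchain as "blocks that are linked and secured by cryptography" can be illustrated with a minimal hash chain. This is a toy sketch of the linking concept only (in Python, with made-up shipment records), not a representation of UPS's or BiTA's actual systems:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, record):
    """Append a record, linking it to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})

def verify(chain):
    """Re-compute each link; any tampered block breaks the chain."""
    for i in range(1, len(chain)):
        if chain[i]["prev_hash"] != block_hash(chain[i - 1]):
            return False
    return True

chain = []
add_block(chain, {"container": "MSKU123", "event": "customs cleared"})
add_block(chain, {"container": "MSKU123", "event": "delivered"})
assert verify(chain)

# Altering an earlier record invalidates every later link.
chain[0]["record"]["event"] = "rejected"
assert not verify(chain)
```

Because each block embeds the hash of its predecessor, altering any historical record is immediately detectable, which is what makes a shared ledger of customs transactions tamper-evident.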

POV: Some 300 companies have applied to join BiTA. This bodes well for the group's work, which of course remains nascent. Its emergence comes as the trucking industry is experiencing a renewed wave of consolidation.

Later this year, new U.S. rules will take effect requiring trucking companies to use electronic logging of drivers' hours. The logging is aimed at stopping logistics providers from circumventing laws governing how long drivers can work, but as a byproduct is expected to squeeze productivity and already thin profit margins. To that end, blockchain is a longer-term play but one the trucking industry is betting on as a route to more efficient operations.


Digital Transformation Digest: Facebook Adds More Business Features to Messenger, QuickBooks Launches Direct Lending Program, CVS to Offer Next-Day Drug Delivery

Constellation Insights

Facebook Messenger adds business-friendly features: Version 2.2 of Facebook's wildly popular Messenger app has arrived, and with it a closed beta version of a new plugin that lets businesses embed Messenger in their websites and talk to customers across multiple channels.

The plugin doesn't have all the features found in the full-fledged Messenger app, but key capabilities such as payments support and rich media are part of the initial version. Facebook has lined up some top brands as beta testers, including Air France, KLM, Argos, Volaris and Zalando.

POV: The plugin reflects the B2C side of Facebook's emerging enterprise strategy. On the B2B end of the spectrum lies Workplace by Facebook, which Constellation VP and principal analyst Alan Lepofsky takes a detailed look at right here.

While there are untold numbers of chat clients available for enterprises to use with their websites, Messenger provides a series of advantages for brands, not the least of which is its ubiquity among customers, with more than 1 billion users around the world.

Customers are already using Messenger in their personal lives to communicate with friends and family; businesses can tap into that activity without having to ask customers to log in to a separate chat client or account on their websites. Along with Messenger's cross-platform continuity for message threads, the level of friction for customer service, marketing and sales-related conversations can be reduced dramatically.

QuickBooks enters the SMB direct-lending race: Following the lead of Square, Amazon and other tech companies catering to small business entrepreneurs, QuickBooks has introduced a direct lending service that provides loans of up to $35,000 to qualifying customers.

Dubbed QuickBooks Capital, the service is embedded within the existing application and uses the data QuickBooks already has about the customer, along with machine learning models that leverage its large corpus of user information, to make lending decisions quickly. Funds are transferred within a couple of business days, with the money coming directly from QuickBooks rather than a third party, as has been the case in the past.

The loans run between three and six months, with interest rates ranging from 1.75% to 4.74%, which works out to an APR of between 6 and 18 percent. Rates are set based on a customer's business history and personal credit score. On that basis, QuickBooks is being fairly liberal, requiring just a 580 FICO score as a minimum qualification. Indeed, by the company's own estimates, 60 percent of potential customers for the Capital service wouldn't be able to get a loan elsewhere.
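As a rough sanity check on those figures, a rate quoted for a short term can be annualized by simple scaling. This is an illustrative approximation only; a true APR depends on the amortization schedule and any fees, which the announcement doesn't detail:

```python
def simple_apr(total_rate, term_months):
    """Annualize a total interest rate quoted for a short loan term,
    using simple (non-compounding) scaling to 12 months."""
    return total_rate * 12 / term_months

# The quoted extremes, assuming three-month terms:
print(f"{simple_apr(0.0175, 3):.1%}")  # 7.0%
print(f"{simple_apr(0.0474, 3):.1%}")  # 19.0%
```

Scaled this way, the quoted 1.75%–4.74% term rates land in the same ballpark as the 6 to 18 percent APR range cited above.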

POV: Short-term infusions of cash are critical for small business, whether to bridge expenses during a traditionally slow time of year, or to hire a key new worker or workers. QuickBooks, with its intimate view of an applicant's financial picture—many not only enter debits and credits into the application, but connect their bank and other accounts—can make decisions on a loan much more quickly than a traditional bank, with far less documentation prep required on the part of borrowers.

While a small business could conceivably get better interest rates and loan terms with a bank, QuickBooks' lending parameters are far from usurious. There are also no prepayment penalties or underwriting fees involved. However, borrowers do face a key risk factor: the loans are personally guaranteed, rather than secured by collateral. That means entrepreneurs are on the hook if their business operations can't pay back the loans.

It is not clear how much money QuickBooks will lend, but a spokesperson told Bloomberg an initial pilot involving hundreds of businesses went exceedingly well. Overall, the program is a novel way for QuickBooks to generate more revenue from existing customers while taking advantage of the data it already collects.

CVS to offer next-day prescription delivery: Amid its attempt at a mega-merger with health insurer Aetna, CVS Health has announced plans for next-day delivery of pharmacy products, a move that comes as Amazon mulls an entry into prescription drug sales. Here are the key details from CVS's announcement:

CVS Pharmacy will offer free, same-day delivery service within hours from all locations in Manhattan beginning on December 4. Prescriptions and a curated selection of over-the-counter products will be delivered directly from CVS Pharmacy in secure tamper-proof packaging right to customers' doors to assure complete privacy.

Same-day delivery will expand to Miami, Boston, Philadelphia, Washington, DC and San Francisco in early 2018. These new delivery options will enhance CVS Pharmacy's national network of solutions designed to give customers flexibility in how they want to shop.

POV: CVS has already partnered with Instacart to deliver front-of-store items, including over-the-counter medications, but it's not clear how successful that effort has been so far. It has about 2,600 retail stores in the U.S., with stiff competition from the likes of Walgreens, Rite Aid and Walmart. In recent years, CVS has torn down or closed many smaller locations and opened larger-footprint stores with expanded grocery and household item sections. CVS says its Instacart deal will be expanded to 50 percent of U.S. households this year.

While adding rapid delivery for prescriptions can be seen as a defensive hedge against Amazon, CVS and its peers have a center of gravity that shouldn't be discounted. Even assuming that Amazon, should it enter the market, can provide lower prices and satisfactory service, many drug consumers have lengthy relationships with their retail pharmacies and pharmacists, ties that won't necessarily be broken quickly. CVS is nonetheless wise to get out in front of the Amazon threat, and you can expect rivals to follow suit quickly.


IBM Joins Hybrid, Multi-Cloud Data Science Chorus

IBM sings praises of build-anywhere, deploy-anywhere, open-source analytics. Here's a review of what's now a familiar refrain.

Lots of big tech vendors are now singing from the same hymn book when it comes to data platforms and data science. The message to customers is that they’re offering a range of deployment options, including hybrid-cloud and multi-cloud for agility. They’re also saying they’re open, supporting a range of open source languages, notebooks, frameworks and libraries. IBM hit on all these notes at its November 2 Cloud and Cognitive Summit in New York, but how does it stand out?

IBM’s Cloud and Cognitive Summit marked the introduction of two new tools and a new Hadoop and Spark service on the Watson Data Platform. Executives also revealed the Kubernetes-based containerization of IBM Data Science Experience, a move they said will enable organizations to build and deploy models wherever the data lives. Here’s a deeper look at the details.

It Starts with the Platform

“You can’t get to AI without IA.” That’s how Rob Thomas, general manager of IBM Analytics, explained the need for solid information architecture as an underpinning of artificial intelligence. Indeed, data management comes first, and IBM describes its Watson Data Platform as a kind of operating system for modern, data-driven applications. The cloud-based platform was launched last fall at the 2016 Strata NY conference. The Cloud & Cognitive Summit was the launching pad for two new platform capabilities: Data Catalog and Data Refinery.

Data Refinery checks the box for self-service data-prep capabilities, though my sense is that it’s a starting point (see analysis below). Data Catalog helps users, particularly business users, get their arms around available data by tagging or ingesting preexisting metadata and creating an index of all available assets. IBM says the catalog covers data wherever it lives – on-premises or in the cloud, structured or unstructured – and goes beyond data: using an API, admins can also inventory assets including models, pipelines and even dashboards.
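The cataloging idea – indexing heterogeneous assets by tags so users can discover them – can be sketched in a few lines. This is a generic illustration of the concept, not IBM Data Catalog's actual API; all names here are invented:

```python
from collections import defaultdict

class AssetCatalog:
    """Toy index of heterogeneous assets (datasets, models, dashboards),
    searchable by tag. An illustration of the cataloging concept only."""

    def __init__(self):
        self._assets = {}                 # name -> metadata
        self._by_tag = defaultdict(set)   # tag -> names

    def register(self, name, kind, tags):
        """Record an asset and index it under each of its tags."""
        self._assets[name] = {"kind": kind, "tags": set(tags)}
        for tag in tags:
            self._by_tag[tag].add(name)

    def find(self, tag):
        """Return all asset names carrying a tag, in sorted order."""
        return sorted(self._by_tag.get(tag, ()))

catalog = AssetCatalog()
catalog.register("churn_model", "model", {"customer", "ml"})
catalog.register("sales_2017", "dataset", {"customer", "on-prem"})
catalog.register("exec_dashboard", "dashboard", {"reporting"})
print(catalog.find("customer"))  # ['churn_model', 'sales_2017']
```

A production catalog layers much more on top of this – source crawling, automatic tagging, access tracking and recommendations – but the core is a metadata index spanning asset types, not just tables.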

The Summit also marked the general availability of the IBM Analytics Engine, the company’s new Hadoop and Apache Spark service. IBM already offered Hadoop and Spark services, of course, but the Analytics Engine was hatched this summer after the company ended development of its own IBM BigInsights distribution and related cloud service in favor of a partnership with Hortonworks. The new service separates storage and compute decisions, with persistence options including a new IBM Db2 Event Store that uses the Parquet data format to deliver what IBM says is much better performance than ordinary object stores.

Constellation’s analysis: Access control, governance and a shared collaborative and community workspace are the key concepts behind Watson Data Platform. The platform gives large organizations with lots of data sources, data pipelines, models and data-driven applications a centralized, project-oriented home in which to prepare, store and analyze data and then deploy and manage models. The analyze, deploy and manage aspects are handled with the IBM Data Science Experience (detailed below).

With the new Data Catalog and Data Refinery capabilities, Watson Data Platform adds depth as a data-management and governance layer. Seeing the demos and talking to multiple executives at the Summit, I came away wanting more detail. I liked the cataloging vision of being able to inventory pipelines, models, dashboards and other assets as well as data. But there wasn’t a lot of nitty-gritty insight into the out-of-the-box capabilities versus what you can do with APIs. As you can read in my Constellation ShortList on Data Cataloging, there’s a lot to a state-of-the-art product in terms of crawling sources, automatically tagging, applying machine learning to track and understand access patterns, supporting collaboration around assets and offering intelligent recommendations to catalog users. I need to see more and talk to customers before I would add IBM Data Catalog to my ShortList.

A couple of executives I spoke to at the Summit described the Data Refinery as a work in progress. The current plan is for the Refinery to be an extra-cost option, but as a buyer, I’d want to see the list of out-of-the-box connectors and details on assisted data-prep and recommendation capabilities, as outlined in my Self-Service Data Prep ShortList. At this writing there’s a free beta available, so it’s possible to do some comparison shopping before paying extra for this feature of the Watson Data Platform.

Every modern data platform worth its salt now separates storage and compute decisions, and the IBM Analytics Engine was an obvious and inevitable update given the end of BigInsights development. IBM has joined Microsoft, Oracle and Pivotal, among others, in offering cloud services based on the ODPi standard. Adding the Event Store is a good step for performant object storage, though I have no idea why IBM has saddled it with “Db2” branding given that it has nothing to do with that commercial relational database.

Data Science Experience Goes Multi-Cloud

IBM introduced Watson Data Platform and Data Science Experience (DSX) back in 2016 with support for open-source options including Apache Spark, R, Python, Scala and Jupyter notebooks. At last week’s event it joined the chorus of notable vendors (also including Microsoft, SAS and SAP) talking up hybrid and multi-cloud freedom of choice for data science work. In the case of DSX this multi-cloud support has been made possible by the recent containerization of the product by way of Kubernetes, so it can be deployed in Docker or CloudFoundry containers “wherever the data lives.” There was also mention of DSX integration with GitHub, although this is apparently in the formative stages (see analysis below).


IBM Data Science Experience provides a project-oriented management layer for unified, controlled access to data, models, pipelines, notebooks and collaborative work spaces.

DSX is both a part of and, optionally, independent from Watson Data Platform as DSX Local, which can run behind corporate firewalls or on desktops. DSX provides permission-controlled, collaborative access to projects, data, data science tools, services, and a community space. With its support for R, Python and Scala and Jupyter and (now on DSX Local) Apache Zeppelin notebooks, DSX users can tap popular open source libraries including Spark MLlib, TensorFlow, Caffe, Keras and MXNet.

IBM says DSX’s big differentiator is its ability to support “clickers as well as coders.” I covered the coders part above. Clickers, meaning non-data scientists, use DSX as a gateway to SPSS, which is IBM’s commercial offering supporting point-and-click and drag-and-drop modeling and statistical analysis. SPSS is also the source of IBM’s machine-learning-driven, automated model development, deployment and optimization capabilities, which were rebranded from IBM Predictive Analytics to Watson Machine Learning in October 2016.

Constellation’s analysis: IBM and other leading commercial vendors have gotten the message that data scientists want open source options and hybrid and multi-cloud deployment options through which they can avoid vendor lock-in. This year I’ve seen lots of analytics and data science platform progress announcements, from Cloudera, Databricks, IBM, and Microsoft to Oracle, SAP, SAS, Teradata and more. Common themes include support for Spark for processing; object stores for the separation of storage and compute; column stores for performance; R, Python and, in some cases, Scala, for language support; Jupyter and Zeppelin notebook support; and access to Spark ML, TensorFlow, Caffe, Keras, and other leading frameworks and libraries.

These data science platforms provide a centralized environment for securely sharing access to data, collaborating around models and then deploying, monitoring and maintaining models at scale. Cloudera is focused on doing this work on its own Hadoop/Spark platform whereas IBM, Oracle, Microsoft, SAP and SAS also integrate with their respective commercial data warehousing platforms, streaming capabilities, analytics tools and libraries, and public clouds.

Amazon Web Services and Google both have enviable data platform and data science portfolios as well, but their emphasis is on doing it all in their respective public clouds, which isn’t always possible for big enterprises with lots of systems and data still on premises. IBM, Microsoft and SAS have embraced containerization for hybrid and multi-cloud deployment, acknowledging that customers want to be able to analyze data and build, deploy and run models anywhere, including rival public clouds.

IBM and SAS have had a lot to say about support for open source languages and libraries (and in IBM’s case, Apache Spark), but their commercial analytics software offerings are also part of their Data Science platforms. As a customer, I’d want to know exactly what commercial software I’m licensing or subscribing to along with the platform, the terms of that investment and whether there are options to consume that software in an elastic, services-oriented model on-demand.

I was heartened to hear that IBM is also pursuing GitHub integration with DSX, but few details were available on this push. Among the many data science platform announcements I’ve seen this fall, I’d have to say I was most impressed by Microsoft's next generation of Azure ML (currently in beta). Microsoft has integrated with GitHub to track the end-to-end lifecycle of code, configurations and data (as well as the lineage and provenance of data) used throughout the model development/deployment/optimization lifecycle.

Being able to track data lineage is crucial to satisfying regulatory requirements in the banking and insurance sectors. It’s also what’s needed to satisfy General Data Protection Regulation (GDPR) requirements looming in the European Union and to meet growing demand for explainable and interpretable predictions and recommendations. I suspect IBM is on the same track to bolstering data-governance capabilities with GitHub.

In a separate announcement on November 2, IBM, Hortonworks and ING Group said they are working with the Linux Foundation to promote an open data governance ecosystem that will define interfaces for diverse metadata tools and catalogs to exchange information about data sources, including where they are located, their origin and lineage, owner, structure, meaning, classification and quality. This work stands to benefit both cataloging and, more importantly, data governance and GDPR compliance.

Related Reading:
Microsoft Stresses Choice, From SQL Server 2017 to Azure Machine Learning
Oracle Open World 2017: 9 Announcements to Follow From Autonomous to AI
SAP Machine Learning Plans: A Deeper Dive From Sapphire Now


Digital Transformation Digest: Dreamforce Focuses on Personalization, the IoT Implications of Broadcom-Qualcomm, and Amazon's Risky Discounting Program

Constellation Insights

Salesforce focuses on the personal touch at Dreamforce 2017: One of the busiest weeks San Francisco sees all year is underway, with the start of Salesforce's Dreamforce 2017 conference. As expected, the company has made a slew of product announcements spanning AI, IoT, low-code development and other areas, but there's a clear through-line that will be emphasized all week: Salesforce offers a collection of platforms that can be fine-tuned not just for an organization, but for individual workers' needs and desires. Here's a look at the highlights.

myTrailhead is a revamped version of Trailhead, the online training service Salesforce first launched in 2014. Trailhead has always used gamification and other modern learning techniques, but the new version adds deep customization capabilities. Trail Maker is a guided setup toolset that companies can use to build out custom training content portfolios, drawing both from Salesforce's library and from content of their own creation. Trail Mixer gives employees and managers the means to pull together bundles of training material for specific roles, and then share them with others. Trail Tracker and Trail Checker focus on accountability, using reward badges, quizzes and other tools to maintain a record of employees' progress on the training platform.

POV: myTrailhead is set for a pilot program in the first half of next year, with general availability to follow later in 2018. Based on the descriptions Salesforce provided, the service is evolving significantly. However, unlike Trailhead to date, myTrailhead is a paid service. This is a significant but perhaps not unexpected change. While Salesforce already has a paid certification program, myTrailhead represents an additional revenue opportunity; companies that have already embraced the free version may find value in the additional capabilities. Pricing won't be disclosed until the GA date, however, and it is not clear whether a free version of Trailhead will remain in place.

myIoT is Salesforce's attempt to bring IoT development capabilities to any worker. The initial product is IoT Explorer, which provides a point-and-click interface for developing IoT apps on the Salesforce platform. Salesforce cited use cases naturally attuned to its sales, marketing and service milieu, such as a car dealer creating a workflow app that automatically places service-appointment phone calls to connected cars when they reach a certain mileage marker.

POV: Salesforce has relied on partners such as Amazon Web Services for device connectivity, while focusing on providing an IoT application development environment and runtime. Nothing changes in that regard with myIoT; what remains to be seen is how much of its vision of LoB workers spinning up custom IoT apps comes true. Hopefully, Dreamforce will showcase early customers having success with it. IoT Explorer is generally available now with pricing starting at $6,000 per month for companies with enterprise licenses or above.

Salesforce is also announcing mySalesforce, another low-code service for building branded mobile applications; myLightning, which adds more customization and branding capabilities to its underlying Lightning development framework; and myEinstein, for creating AI-driven applications in a point-and-click manner.

Overall, there's a lot on offer at this year's Dreamforce, and we will be following it closely all week.

Broadcom's record bid for Qualcomm and the IoT implications: The semiconductor market was roiled Monday with the announcement of Broadcom's $130 billion takeover offer for Qualcomm, the world's dominant manufacturer of SoC (system on chip) integrated circuits that power the world's higher-end smartphones.

While the proposed deal instantly drew talk of severe antitrust hurdles, Broadcom took a step last week that could tilt those waters in its favor, announcing plans to move its legal headquarters from Singapore back to the United States. It had moved to Singapore for tax reasons, but cited proposed Republican changes to U.S. tax laws as the reason for the return.

Qualcomm, meanwhile, is in the middle of acquiring NXP. A combined Broadcom-Qualcomm would be the world's third-largest chipmaker after Intel and Samsung. (Intel made waves of its own on Monday, announcing a deal with AMD on new processors that combine Intel chips with AMD GPUs (graphical processing units), a move that will step up competition with NVIDIA.)

POV: Both the NXP and Qualcomm deals are far from done (and the latter, in particular, depends heavily on Broadcom borrowing cash, which brings its own challenges), but in broad strokes they are to be expected. IoT market predictions vary, but all of them point to stratospheric rises in the number of connected devices, as well as the average sophistication of those devices over time. That translates into a need for lots of increasingly powerful, yet less expensive chips, and one clear path to that outcome is industry consolidation.

Amazon's new discounting program carries risks: In advance of the holiday shopping season, Amazon has made yet another bold move in a bid to maintain and grow online market share. A new "Discount by Amazon" program applies discounts to items sold on Amazon by third-party sellers, without those sellers needing to do a thing. Amazon gives the discount directly to buyers, while sellers receive their original asking price (and pay the original sales referral percentage fee).

However, Amazon seemingly did next to nothing to publicize the program to third-party sellers, some of whom have bristled over the potential for it to conflict with agreements they have with product manufacturers over publicly posted prices. Such concerns are paramount for more exclusive brands, which in many cases use higher prices to signal cachet. Many third-party sellers are themselves aiming for a boutique image rather than catering to bargain hunters.

POV: Amazon does offer third-party sellers the ability to opt out of the program, but it should have done a better job of publicizing it in the first place. Overall, Amazon must strike a delicate balance between aiming for low-price parity with or victory over rivals such as Walmart (and its Jet.com subsidiary), and the concerns of its third-party sellers, which account for half of all sales and have profit needs that don't reflect Amazon's ability to spend profligately to scoop up market share. It will be interesting to see how the discounting program plays out over the next six weeks or so of holiday shopping.


Monday's Musing: Infinite Ambient Orchestration

The Design Point For All Future AI Driven Apps

The quest for mass personalization at scale in an era of artificial intelligence (AI) has led to new models of design for the future of applications. One design point for these new AI-driven smart apps is a concept called Infinite Ambient Orchestration. Its three components can be described as:

  1. Infinite.  The design point should consider contextually relevant and relative journey design.  These journeys have no beginning or end.  Journeys deliver both stateful and stateless interactions.
  2. Ambient. Elements of artificial intelligence provide contextual relevancy. These capabilities make right-time recommendations that augment decision making and, in many cases, power situational awareness.
  3. Orchestration.  In an age of access not ownership, systems must orchestrate across insight, process, platforms, and ecosystems.

As new systems are created, organizations can expect this design point to serve as a first principle for AI-driven systems.

Your POV.

So what will you automate first with AI?  Do you have a digital transformation strategy?  Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) org.

Please let us know if you need help with your Digital Business transformation efforts. Here’s how we can assist:

  • Developing your digital business strategy
  • Connecting with other pioneers
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.
