Event Report: Customer Data Platform Takes Center Stage At Salesforce Connections 2019 #CNX19

Customer 360 Mantra Manifests In The Customer Data Platform

Over 12,000 customers from 40+ countries convened at Chicago's McCormick Place for the annual Salesforce Connections conference. Salesforce CMO Stephanie Buscemi welcomed the audience and brought key customers such as Conagra, the Indiana Pacers, and State Farm to the stage. Key highlights from the event include:


  • Marketing Cloud receives new Einstein AI email marketing enhancements. AI-powered capabilities include engagement frequency, send-time optimization, and content tagging. Additional enhancements include transactional messaging.
  • Service Cloud gains a channels menu and Einstein Bots. The "Contact Us" page goes bye-bye as chat takes hold through Facebook Messenger, SMS, WeChat, WhatsApp, web chat, and voice. New bots build intent models to match incidents to resolutions.
  • Commerce Cloud adds a low-code designer and a MuleSoft Accelerator. The new Commerce Page Designer gives users more control to create, preview, schedule, and reuse content. Einstein Product Recommendations optimize content acceleration. The new MuleSoft Accelerator adds integration and best practices for product configuration, social marketplaces, and chatbots.
  • Customer Data Platform ushers in next generation of Customer 360. Salesforce moves beyond the traditional CDP to include data management, identity, consent, and activation across marketing, sales, service, and commerce. Einstein Insights will use AI to help power next best action.

Figure 1. Event Report: #CNX19 Returns To the Customer 360 Theme

 

Figure 2. Q&A With The Key Product Leaders


Figure 4. Twitter Moments for #CNX19

The Bottom Line: Customer 360 Remains The Elusive Goal

A holistic view of the customer remains an elusive goal despite more than 25 years of CRM innovation. The objective hasn’t changed, but today’s technology hopefully provides a higher likelihood of success. Despite the ongoing challenge, Salesforce continues to make key product investments on the way to supporting an AI-driven approach to marketing, sales, service, and commerce.

When it comes to Customer 360, however, it’s still difficult to separate the current reality from the vision. Customer 360 was the big announcement at Connections 2018. This year’s event brought some concrete advances, including the planned general release of the Customer Resolution Engine in November 2019. The engine provides the unique key that links customer identity across all Salesforce clouds, routing data and events as needed.
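To illustrate the basic pattern behind such a resolution key, here is a minimal sketch of identity stitching on a shared attribute. It is purely illustrative: the record shapes, the email-based matching rule, and the key format are invented for this example and say nothing about how Salesforce's actual engine works.

```python
# Illustrative sketch only: NOT Salesforce's Customer Resolution Engine.
# It shows the general idea of one canonical key linking the per-cloud
# record IDs that belong to the same customer.

# Per-cloud records, each carrying a matchable attribute (here, email).
records = [
    ("marketing", "mkt-001", "pat@example.com"),
    ("service",   "case-77", "pat@example.com"),
    ("commerce",  "order-9", "pat@example.com"),
    ("service",   "case-12", "lee@example.com"),
]

canonical: dict[str, str] = {}  # match attribute -> canonical key
links: dict[str, list] = {}     # canonical key -> linked source records

for cloud, record_id, email in records:
    key = canonical.setdefault(email, f"cust-{len(canonical) + 1:04d}")
    links.setdefault(key, []).append((cloud, record_id))

print(links)
# {'cust-0001': [('marketing', 'mkt-001'), ('service', 'case-77'),
#                ('commerce', 'order-9')],
#  'cust-0002': [('service', 'case-12')]}
```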

The biggest announcement of this year’s event was…also Customer 360. This time in the form of a customer data platform. Is this a tacit admission that CRM hasn’t turned out to be everything it promised to be? Perhaps. In any case, it’s further confirmation that CRM isn’t the only source of customer data and that pulling everything together, at least functionally, is a growing problem customers want solved.

Salesforce has the vision right, but it’s not alone. Microsoft, Oracle, Adobe, and SAP are all working on their own approaches to the problem of building a holistic view of the customer, and Salesforce is late to this particular announcement. That said, the company has many of the building blocks required to execute on its vision and the customer base to make it a top priority.

The message that came through loud and clear at Connections 2019 is that Salesforce has a solid understanding of what its customers want and need to do. Things are looking good on the front end. Now the company has to execute on making everything on the back end work—easily and effectively—for customers.


DisrupTV: Rewards Surge for Leaders Who Go with the Flow

“Slow is smooth, smooth is fast, fast is deadly.” In the fast-paced business world, leaders need to remember to stay in touch with their overall businesses and with themselves.

Recently on DisrupTV, our hosts Vala Afshar and R “Ray” Wang interviewed Jerry Colonna, author of REBOOT: Leadership and the Art of Growing Up, Jay Ferro, CIO at Quikrete, and Constellation Research’s own Dion Hinchcliffe, VP & Principal Analyst, to learn some of the key qualities of effective leadership. Here are some quick takeaways from the episode:

Maximize Leadership Growth Through Radical Self-Inquiry

Jerry Colonna, hailed as the “CEO whisperer,” sat down to discuss his introspective leadership philosophy, which he champions in his new book, REBOOT: Leadership and the Art of Growing Up. Jerry is an advocate of “radical self-inquiry,” which he believes can help people confront the various aspects of themselves that hold them back from reaching their fullest potential. To be truly effective, leaders must learn to slow down and look inward. It is only after this process of self-evaluation that a course of action materializes for becoming a stronger leader.

When the discussion turned to the topic of busyness, Jerry suggested striving for a smooth workflow rather than having spikes in productivity. Many people want the appearance of busyness because being busy can be associated with output and personal value. Furthermore, people might be afraid to slow down and face negative emotions, preferring instead to brush them under the rug and focus on their work. Jerry discourages these tendencies and emphasizes the importance of slowing down to a smooth pace of work. As he says, “Slow is smooth, smooth is fast, fast is deadly.”

Enhance Business Strategy through CIO Integration

In the age of digital, the majority of organizations have some sort of digital presence, according to Jay Ferro. Given this shift, the CIO must have their finger on the pulse of all aspects of the business now more than ever. A bidirectional relationship should exist in which CIOs learn the business functions from the other officers while catching them up on the technical side of things. When faced with a multitude of intricate technical problems, CIOs may forget to focus on their development as leaders. Jay emphasizes that the best CIOs are business leaders who lead people. In fact, with their specialized understanding of all things digital, we may see more CIOs taking the role of CEO in the future.

Navigate and Adapt to Pressures of Digital Transformation

Continuing the discussion of executive leadership in the digital transformation age, Dion Hinchcliffe emphasized the learning that CEOs are having to undergo to understand and develop new business models. The demands on the modern workforce are advancing rapidly, and many firms are finding themselves in dire need of education, hiring, and mentoring.

Though many businesses face uncharted terrain, there are plenty of digital transformation success stories to be found across industries. To illustrate this point, Dion gave us a sneak peek at this year’s BT-150 winners, emphasizing the diversity of the group across sectors and industries. The BT-150 are leaders whom Constellation Research has identified for their successes in digital transformation leadership.

As leaders brave a challenging and shifting business landscape, introspection, awareness, and adaptability are three pivotal qualities that can help them reach a far more productive flow state.


This is just a small glimpse at the great advice shared during the show. Please check out the full discussions in the video replay here or the podcast.

DisrupTV is a weekly Web series with hosts R “Ray” Wang and Vala Afshar. The show airs live at 11:00 a.m. PT/ 2:00 p.m. ET every Friday.

New C-Suite Chief Executive Officer Chief People Officer Chief Information Officer Chief Marketing Officer

Oracle Reasserts Itself In BI and Analytics

Oracle has simplified its analytics product lineup and pricing and gone public with its roadmap. Here’s what’s coming and why Oracle customers will take a second look.

The Oracle Analytics Summit, held June 24-25, gave the company a chance to introduce what executives called “a new beginning” for Oracle Analytics. The themes of the event, held as both a webcast (attended by more than 6,800) and an in-person gathering (attended by roughly two hundred customers, partners and analysts), were “simple, transparent” and “trustworthy.”

The company needed a new beginning, as Senior VP of Analytics T.K. Anand put it, because Oracle had previously “inflicted a lot of change on the community” and “wasn’t transparent” about product plans and roadmaps. In recent years Oracle has been pushing its cloud-based analytics options while doing little to update software deployed on-premises by tens of thousands of customers.

T.K. Anand, a Microsoft veteran who joined Oracle one year ago as senior VP of Oracle Analytics, announces "a new beginning" at the Oracle Analytics Summit.

On Simplification

Oracle announced a consolidated lineup with just three products: Oracle Analytics Cloud (OAC), Oracle Analytics Server, and Oracle Analytics for Applications. OAC has been the company’s cloud path forward for analytics for more than three years. It’s a modern, self-service-oriented cloud service emphasizing data visualization and offering extensive augmented analytics capabilities (including data prep recommendations, Web and mobile natural language (NL) query, NL generation, automated insights and, soon, personalized recommendations). Recent integrations extend the NL capabilities to the Oracle Digital Assistant and third-party products including Slack, Microsoft Teams, Skype, Amazon Alexa and Google Assistant.

Oracle Analytics Server, due in Q4, is an on-premises-oriented product that consolidates governed, self-service and augmented capabilities and replaces the myriad products that came before it. The promise is that Oracle Analytics Server will be aligned, feature- and function-wise, with OAC and will see annual updates based on new capabilities first delivered through OAC. It’s also Oracle’s option for multi-cloud deployment, as it will be software that can be deployed anywhere.

Oracle Analytics for Applications, as the name suggests, is essentially OAC pre-integrated with Oracle cloud applications. The integrations will make it easier to analyze data from Oracle applications, starting with Oracle Fusion ERP by Q4 2019 and Fusion HCM by Q1 2020. Executives promised that other Oracle cloud applications and third-party applications will follow on a quarterly cadence. Importantly, Oracle Analytics for Applications includes a managed data pipeline and data warehouse instance based on the Oracle Autonomous Database. Also included will be supporting content, including ready-to-use semantic models, reports and dashboards.  

Another key element of the simplification is the pricing for OAC, which is straightforward and aggressive at $20 per user, per month for the workgroup- and departmental-oriented Professional Edition and $2,000 per Oracle Compute Unit (OCPU) per month for the Enterprise Edition, including unlimited numbers of users.

On Transparency and Trust

To be more transparent, Oracle has published its Oracle Analytics roadmap online for the first time. This is something Tableau, Qlik and others have done for many years, so Oracle is merely catching up with customer expectations, but it’s a sign that the company is listening.

In another sign that the company is listening and working on building trust, Anand said that Oracle has doubled the number of analytics customer-support engineers and improved processes to cut service-request resolution times by as much as tenfold. It’s also investing in auto-remediation capabilities to reduce time to recovery. Service improvements are a constant area of investment for most vendors, but a doubling of support staff is a real commitment.

Oracle's newly published analytics roadmap, available at oracle.com/solutions/business-analytics/roadmap.html

MyPOV on the New Oracle Analytics

From Oracle’s perspective, any customer of its applications or data-management software should naturally consider Oracle’s analytics options, but defections have mounted in recent years. The Summit announcements were designed to get customers to give Oracle Analytics products a first (or second) look. The attention getter was clearly the simplified, aggressive list pricing, and as Oracle customers all know (and count on), the more you spend, the more likely the company is to offer discounts.

To put the Enterprise pricing in context, I talked to one customer with 1,000 users who said he’s using 12 OCPUs, while another customer with 600 users said he was “overprovisioned” and had room to grow with just six OCPUs. Capacity requirements will vary based on the complexity of the data and of query, reporting and analytical demands.
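To make the break-even arithmetic concrete, here is a minimal sketch using only the list prices quoted above and those two customer data points. It is illustrative only; real costs will vary with discounts and actual OCPU sizing.

```python
# Back-of-the-envelope comparison of the two OAC list prices cited above.

PRO_PER_USER = 20     # USD per user per month, Professional Edition
ENT_PER_OCPU = 2_000  # USD per OCPU per month, Enterprise Edition (unlimited users)

def monthly_list_cost(users: int, ocpus: int) -> tuple[int, int]:
    """Return (professional, enterprise) monthly list cost in USD."""
    return users * PRO_PER_USER, ocpus * ENT_PER_OCPU

# The two customer configurations mentioned in the paragraph above.
for users, ocpus in [(1_000, 12), (600, 6)]:
    pro, ent = monthly_list_cost(users, ocpus)
    print(f"{users:>5} users / {ocpus:>2} OCPUs: "
          f"Professional ${pro:,}/mo vs. Enterprise ${ent:,}/mo "
          f"(Enterprise works out to ${ent / users:.2f}/user)")

# At list price, Enterprise undercuts Professional once users > 100 * OCPUs.
```

At these list prices the 600-user customer sits exactly at the break-even point, while the 1,000-user customer on 12 OCPUs would pay slightly less per month on the Professional Edition.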

The genuinely new news among the announcements is the pending release of Oracle Analytics Server, aimed at on-premises deployment. Oracle’s prior emphasis on OAC and only OAC left the impression that cloud was the only way forward, yet plenty of customers aren’t ready to go there. Oracle Analytics Server is billed as an all-inclusive product that will get annual updates. In another carrot for current customers, any licensee of Oracle BI Enterprise Edition will be automatically licensed and entitled to download and deploy Oracle Analytics Server, which will include the latest self-service and augmented capabilities.

In short, the newly promised terms, transparency, support improvements and feature upgrades across Oracle Analytics are attractive and worth considering. We’ll have to wait and see whether the finished Oracle Analytics Server product is as simplified, integrated and all-inclusive as promised. Will it be as hybrid- and multi-cloud-friendly as the options from independent vendors? Oracle executives demurred on whether and when vendor-supported options for container-based deployment would emerge (something some rivals have delivered or announced). We also heard that there would be some differences in functionality between OAC and Oracle Analytics Server (in the area of augmented analytics, for example), and licensing terms for on-premises deployment are another area where details have yet to emerge.

Customers I talk to that have already gone to the cloud don’t hesitate to choose cloud-based options. But tell on-premises customers that cloud is the only way forward and they will be less likely to try your cloud option. The way forward for vendors that didn’t start in the cloud is to give customers clear, transparent choices and the freedom to move when they are ready. This is the new beginning that Oracle Analytics promises, and we’ll surely hear more details about each option at this year’s Oracle OpenWorld.

Related Reading:
Qlik Advances Cloud, AI & Embedded Options, Extends Data Platform
Constellation ShortList™ Cloud-Based Business Intelligence and Analytics Platforms
Tableau Advances the Era of Smart Analytics
MicroStrategy Embeds Analytics Into Any Web Interface


New Offering Launch - Oracle Exadata X8 - How a 10-Year-Old Product Delivers the Next-Generation Compute Platform

Few products in cloud software and platforms are around for 10 or more years; Oracle Exadata just turned ten years old and joined that illustrious club. Let's look at its underlying importance and relevance for enterprises.
 

What are the key trends?

For a product to make it to 10 years, it must have moved with the trends of its time, in this case staying with the computing trends that power enterprise workloads. A number of trends are changing the enterprise computing landscape; let's look at the most pertinent ones:
 
  • Heterogeneous Computing Demands. CxOs are confronted with rapidly changing computing demands. Having barely satisfied the business need for big data, CIOs must now answer computing requirements that stretch from support for machine learning to speech recognition for internal and external digital assistant / chatbot solutions, all the way to the edge of the enterprise. New computing platforms, such as large GPU racks to run machine learning, have entered the data center. A never-before-seen platform diversity manifests itself at the edge of the enterprise to support the Internet of Things (IoT). And the pace of change is not slowing down, as shown by new demands for additional workforce support (e.g., augmented/mixed/virtual reality) and new user experience support (e.g., holographic displays).
     
  • The Need for a Single Control Pane. The era of CxOs simply accepting that new products bring a new control pane is history. CxOs operating next-generation applications[i] must run them as efficiently as possible, via a single control pane. This not only allows for more efficiency to manage infrastructure but also is the best way to manage a heterogeneous landscape effectively. Ramping down and ramping up resources as demand requires cannot be done from a "zoo" of instrumentation. At the same time, the automation of resource scaling is essential, so humans can focus on oversight instead of spending time and energy on operational tasks.

  • Degrees of Cloud Skepticism. Although many next-generation application use cases are best (and sometimes only) operated in the cloud, there is still a degree of skepticism over computing in the public cloud. It ranges from rational challenges (such as whether IaaS vendor data instances are available inside of a necessary jurisdiction) to reasonable challenges (hardware write-downs and connections to existing on-premises computing resources, such as mainframes) to less rational concerns (for instance, regarding data safety). Nonetheless, it means that CIOs need to implement and operate workloads in local data centers for at least the next decade.
Figure 1 – The six Next Generation Computing Platform Trends
Source – Holger Mueller, Constellation Research
 
Other relevant trends are the pressure to achieve high data center utilization, the rising complexity of IT organizations, and compliance pressure.
 

What is it?

Oracle Exadata X8 is an engineered appliance/server system built to run Oracle workloads best: first and foremost the Oracle Database, but also Oracle's portfolio of SaaS applications.
Oracle has continued to upgrade Exadata over the decade of its existence. Key recent innovations on the hardware side include flash memory to support in-memory columnar storage, hot-swappable flash storage and 25 GigE client networking. On the software side, Oracle has ensured that Exadata works best with the innovations of the Oracle Autonomous Database, including support for automatic indexing; on the services side, it has added support for Exadata Cloud Service and Exadata Cloud at Customer.
 
Figure 2 – The capability growth of Oracle Exadata
 
Source: Oracle
 
What sets Oracle Exadata X8 apart from other next-generation compute platforms is the 100% identicality between running Exadata on premises and running Exadata in the Oracle Cloud. No other vendor has the same physical hardware on both sides of the computing equation between on premises and the public cloud.
 
High identicality gives CxOs the confidence that they can move compute loads across architectures, between on premises and the cloud, without having to make any changes. Identicality on the hardware side ensures that there is no residual risk of hardware-related incompatibility, a risk that remains in purely software-based abstraction solutions. This matters for key next-generation computing best practices like bursting workloads and achieving cross-platform high availability.
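To see what identicality buys operationally, consider the bursting pattern just mentioned. The sketch below is purely illustrative (the capacity number and the greedy placement rule are invented): with identical stacks on both sides, placement becomes the only decision, because the workload itself needs no changes.

```python
# Minimal sketch of the "burst to the cloud" pattern that identicality
# enables. Illustrative only; numbers and policy are invented.

ON_PREM_CAPACITY = 100  # units of compute available locally

def place_workloads(demands: list[int]) -> list[str]:
    """Greedily keep workloads on premises until capacity runs out,
    then burst the remainder to the cloud unchanged."""
    used, placements = 0, []
    for demand in demands:
        if used + demand <= ON_PREM_CAPACITY:
            used += demand
            placements.append("on-prem")
        else:
            placements.append("cloud")  # identical stack: no rework needed
    return placements

print(place_workloads([40, 30, 50, 20]))
# ['on-prem', 'on-prem', 'cloud', 'on-prem']
```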
 

Why does it matter?

There are a number of reasons why CxOs care about viable next-generation computing platforms:
 
Old-Guard Vendors Are No Longer Viable
Humans are driven by habits, and CxOs are no exception. If they could still procure all of their computing needs from the vendors they dealt with in the 1990s, the majority of CxOs would likely do so. The problem with these "old-guard" vendors is that they have failed to innovate, are no longer viable from a cost perspective and often have switched to business models that are perceived as extortionate. Therefore, innovation and commercial necessity require CxOs to deal with a new set of computing vendors.
 
Cost Pressure
For decades now, CxOs have been asked to do more with less, especially on the IT side. For a long time, the benefits of Moore's Law have bailed out CIOs because they were able to offer better computing power at the same costs or equal computing at lower costs. But Moore's Law is running out of runway, and at the same time new next-generation application use cases require innovative new platforms that charge a premium.
 
The Innovation Imperative
While software is eating the world, enterprises are turning into software companies, and, as such, they need to innovate faster than ever. This makes CxOs look for winning platforms, ideally ones that let them move workloads across platforms as seamlessly as possible. As enterprises flock to platform-as-a-service (PaaS)[ii] products to help them build these next-generation applications,[iii] workload portability is a key acquisition criterion and overall success factor for the selection of a PaaS[iv] platform.
Additionally, CxOs face a shortage of skilled workers and contractual constraints that tie them to outdated platforms.
 
Figure 3 – The Five Buyer Challenges
Source: Holger Mueller, Constellation Research

Advice for CxOs

The following recommendations can be made for CxOs looking at their computing architecture:
 
Enable enterprise acceleration. Enterprises need to move faster than ever before, and IT/computing infrastructures cannot remain the shackles on agility that they have been in the past. This is why CxOs look for next-generation computing platforms that allow them to transfer workloads from on-premises to the cloud and vice versa without having to make changes. This is a key strategy to help the technical side of an enterprise contribute to the overall objective and necessity of enterprise acceleration.
 
Select vendors that offer the highest degree of identicality. Identicality is the key to workload portability: the higher the identicality between an on-premises architecture and a cloud architecture, the better the chances of moving workloads seamlessly. This argument is intuitively clear to CxOs leading transformation, and platforms with high identicality are therefore clearly preferred. It is even better when vendors state that they designed for identicality and intend to keep it as high as technically feasible. As stated in this report, Oracle excels at identicality across Exadata on-premises, Oracle Exadata Cloud Service, Oracle Autonomous Database and the Oracle Exadata Cloud at Customer platforms.
 
Evaluate Oracle Exadata as an existing Oracle customer. Because most customers run the Oracle Database in one way or another, it is important that they familiarize themselves with the most prominent member of the Oracle Cloud at Customer product family, Oracle Exadata Cloud at Customer. Being able to lower TCO, reduce support and maintenance, fit sizing to the average load of the machine, burst to the cloud for peaks and transfer loads between the Oracle Cloud and on-premises are substantial benefit drivers that CxOs cannot ignore. Experienced Oracle customers know that the best deals are usually available in Q4.
 
Consider Oracle's option as a prospect. Database and tech-stack migrations are challenging, so non-Oracle customers will look at Oracle Cloud at Customer with some distance. The benefits of Oracle Exadata on premises are substantial, though, and CxOs need to talk with their respective cloud and technology-stack vendors about what they can do in this regard. Should the projected roadmap gap become too large, and the potential cost savings with Oracle Exadata substantial enough, it is time to pay attention and consider a potential migration.
 
Take a stance on commercial prudence. No matter the vendor, enterprises need to make sure they pay for value. For Oracle Exadata, CxOs need to verify that licenses and services (for instance, the cost of bursting to the cloud) still provide their enterprise with an attractive TCO. As with all services-related offerings, prices will fluctuate; they need to be contractually locked in for as long as desired and constantly monitored to avoid negative commercial surprises.
 
Oracle has invested for a long time, and practically gave up short-term, incremental growth in the marketplace, to get its systems engineered from the silicon all the way to the SaaS application suite in one technology stack. Oracle has always kept the ability to deploy the same infrastructure on-premises, likely anticipating customer demand as well as knowing that Oracle's IaaS offering was the last of the Oracle "as-a-service" products to reach maturity. This puts Oracle Exadata in a favorable position versus the competition for next-generation computing architectures, because it gives CxOs the highest flexibility to fluidly deploy workloads across the cloud and on-premises.[v]
 

MyPOV

It is good to see enterprise IT vendors pursuing diverse strategies, and we can see the major players following distinct approaches. Diverse strategies mean different value propositions for enterprises, and that means more choice, which consequently gives CxOs more options to differentiate and accelerate their enterprises with information technology.
The current three approaches are:
 
  1. The software-only approach that Google Cloud (with Anthos) and IBM (with IBM Cloud Private) pursue.
  2. The partner hardware strategy that Microsoft is using with Azure Stack. (It is too early to know where AWS will end up with Outposts.)
  3. Oracle's approach of building a vertically integrated product stack from the silicon, across all ISO/OSI layers, to the user click in a SaaS application.

Oracle Exadata X8 is the manifestation of the merits of that strategy: Oracle has designed Exadata X8 for the highest identicality, so that it can run the Oracle Autonomous Database in the best and most efficient way from on-premises to the Oracle Cloud. It is unlikely the competition will even attempt to claim it can run the Oracle Database better than Oracle. Effectively this means that Oracle Database customers will have compelling reasons to remain … Oracle customers.

So for now, congratulations to Oracle on Oracle Exadata X8 – we will soon see how well the market receives this new offering.
 
 
[i] Holger Mueller, "The Era of Infinite Computing Triggers Next-Generation Applications," Constellation Research, June 1, 2018. https://www.constellationr.com/research/era-infinite-computing-triggers-next-generation-applications
[ii] For more best-practice considerations for PaaS offerings, see: Holger Mueller, "As PaaS Turns Strategic, So Do Implementation Considerations," May 9, 2018. https://www.constellationr.com/research/paas-turns-strategic-so-do-implementation-considerations
[iii] For more on next-gen applications and PaaS offerings, see: Holger Mueller, "Why Next-Gen Apps Start with a Next-Gen Platform as a Service," April 5, 2018. https://www.constellationr.com/research/why-next-gen-apps-start-next-gen-platform-service
[iv] For a Constellation ShortList™ of PaaS vendors, see: Holger Mueller, "Constellation ShortList™ PaaS Tool Suites for Next-Gen Apps," August 22, 2018. https://www.constellationr.com/research/constellation-shortlist-paas-tool-suites-next-gen-apps
[v] For more details, see: Holger Mueller, "Constellation ShortList™ Next-Generation Computing Platforms," February 12, 2019. https://www.constellationr.com/research/constellation-shortlist-next-generation-computing-platforms

Salesforce to Acquire Tableau: Why Now and What’s the Path Forward?

Salesforce's $15.7 billion mega acquisition will add revenue and blunt a Microsoft competitive threat, but long-term benefits will depend on deeper integration and additive innovation.

Salesforce is spinning its mega acquisition of Tableau Software as the number-one CRM vendor buying the number-one business intelligence (BI) and analytics vendor. It’s a big deal that was likely hastened by last week’s acquisition of Looker by Google. In the short term, it will give Salesforce more revenue, but in my view, the success and ultimate value of the proposed $15.7 billion deal will depend on what Salesforce and Tableau can do together and whether Tableau can accelerate its move into the cloud.

Tableau fills a competitive gap that Einstein Analytics hasn’t closed for Salesforce. Einstein Analytics (which originated as Salesforce Wave Analytics in 2014) is still very new, and it’s not widely adopted by Salesforce customers. What’s more, Einstein Analytics has been largely aimed at CRM-centric analytic needs, whereas Tableau brings broad, multi-purpose analytical capabilities that are already widely adopted and highly regarded.

A key challenge, however, is that only one third of Tableau customers, at best, are running in the cloud. So either Tableau has to accelerate its move into the cloud or Salesforce has to develop more of a hybrid strategy. The latter would go against Salesforce’s longstanding “no software” ethos, although even cloud player Amazon Web Services (AWS) has made accommodations for on-premises deployments in recent years.

 

One thing that Salesforce and Tableau have in common (other than tens of thousands of customers) is Microsoft as a formidable rival. Microsoft goes after Salesforce primarily with Microsoft Dynamics 365 and it goes up against Tableau primarily with Power BI. In both cases, Microsoft stresses its broader platform, including Office 365, Azure, the LinkedIn graph, and its broad data-management portfolio, but the real weapon on both fronts is the blunt instrument of competitive pricing. Microsoft effectively discounts its CRM and analytics offerings knowing it can count on long-term benefits, stickiness and profits from each customer and byte of data that ends up on Azure.

Competing against Microsoft Power BI is one thing, but cloud competition is about to get tougher with Google’s acquisition of Looker, announced last week. And with both Google and Microsoft now strongly pursuing the BI and analytics market, it likely won’t be long before AWS steps up its game from its current, less-than-competitive QuickSight offering.

Tableau needed a deep-pocketed parent to help it compete against these new competitors. A key area of investment important to both Salesforce and Tableau is augmented analytics and artificial intelligence (AI). Microsoft has been adding augmented capabilities to Power BI, and it highlights the connection to the rest of its AI portfolio. Leveraging one set of AI and augmented analytics investments across Salesforce and Tableau should provide economies of scale that will help both parties innovate.

MyPOV on How to Better Serve Customers Together

I appreciate that Salesforce is promising to maintain Tableau as an independent business, just as it did when it acquired MuleSoft last year. Salesforce is far better than most companies at retaining the leadership, talent and values of the companies it acquires. A big part of Tableau’s strength has been its culture, and I see Salesforce as more likely than any other suitor to retain that energy.

As I noted above, investments in AI and augmented analytics are an obvious place to start on future innovation. But with trends moving toward low-latency demands and predictive and prescriptive recommendations, I see analytics as destined to be more frequently embedded into applications. Not just OEM apps, but software apps that customers build themselves. Salesforce and the Force.com platform are both good fits for accelerating Tableau’s embedding strategy. Microsoft is pursuing these trends with its Power Apps, Flow and Power BI Embedded capabilities, and Salesforce and Tableau would do well to exploit their strengths.

As for how Salesforce and Tableau could improve and take advantage of integration, a few areas should be addressed to better serve customers. For starters, Tableau must evolve its self-service strengths and provide more tools and controls for centralized governance. The company started down this path a few years ago with data-certification capabilities, and it’s expected to add a data catalog this year.  Salesforce and Tableau together could do more to address centralized data modeling, ensuring reusability and a single version of the truth. Here’s where Looker has strengths, offering an old-school semantic modeling environment built for modern cloud data architectures.

The addition of Tableau also raises questions anew for Salesforce as to how deeply it will invest in data-management capabilities. Last year’s MuleSoft deal upped Salesforce’s API-oriented integration capabilities, but AWS, Google and Microsoft offer end-to-end database, data warehouse, data integration and high-scale data platform capabilities that give customers one-stop-shop opportunities while also fueling AI capabilities. Salesforce has to decide whether to take a Switzerland approach -- working with all the major clouds and third-party vendors -- or whether it will also offer its own data platforms and services. Perhaps it could choose a middle ground by focusing exclusively on analytics, acquiring, say, Snowflake, and perhaps a bit more in the way of big data and data integration capabilities.

These are interesting times, and I am hearing echoes of the BI and analytics consolidation that happened just over a decade ago. There is a danger that history could repeat itself, as when BusinessObjects, Cognos and Hyperion were acquired in 2007/2008 by SAP, IBM and Oracle, respectively. Back then, many predicted that these massive consolidators would push independents out of business, but that’s not what happened. That’s exactly when Tableau, Qlik, Spotfire and other innovators emerged and it was mostly downhill from there for the incumbents.

The lesson for Salesforce is that it can’t count on the power of its platform to retain and win new Tableau customers; the product must remain competitive on its own merits, and that will require investment and the spark of innovation that got Tableau where it is today.    

Related Reading:
Google to Acquire Looker: First Salvo in a New Round of BI and Analytics Competition
Tableau Advances the Era of Smart Analytics
MicroStrategy Embeds Analytics Into Any Web Interface

 


Google to Acquire Looker: First Salvo in a New Round of BI and Analytics Competition

Google announced June 6 its intention to acquire Looker in a $2.6 billion, all-cash deal that will see the business intelligence, data applications and embedded analytics vendor become, upon the close of the deal, part of Google Cloud. The move is not totally unexpected, as Looker was a close partner with Google, but it's the first shot in a round of consolidation that will likely see Google buy more and, most likely, a competitive response from Amazon Web Services.

Looker competes with BI and analytics vendors ranging from IBM, Tableau, Qlik and Microsoft Power BI to MicroStrategy, Oracle and SAP. Looker's strengths include its centralized data modeling and governance, which promote consistency and reuse. It runs on top of modern cloud databases, including Google BigQuery, AWS Redshift and Snowflake. There's speculation that Snowflake, which is currently independent, might be a next acquisition target for Google.

Looker has been a significant partner for Google, and with each deployment, customers bring significant amounts of data for analysis onto the Google Cloud. The model for Google going forward is likely to be similar to the way Microsoft promotes Power BI at competitive prices, knowing that there's a payoff in bringing more data onto the cloud platform, making it stickier and driving ongoing storage fees.

With Microsoft promoting Power BI and Google soon promoting Looker, watch for AWS to respond by building or buying an analytics and BI offering that's more attractive and comprehensive than QuickSight, which has thus far, in Constellation's estimation, failed to capture much market share.

Established BI and analytics vendors have already been responding to the competitive pressure of Power BI by diversifying and deepening their capabilities, variously adding data-management, data-prep, data catalog, and advanced analytics capabilities. Competitors to Looker will point out that LookML coding is not exactly business-user friendly. What's more, the company has not pursued much in the way of augmented or advanced analytic capabilities, though these are strong suits for Google that could advance Looker functionality. Finally, Looker is dependent on the underlying database for performance, and customers running on Redshift or other clouds may have concerns about the acquisition by Google.

Google's move will surely spark even more intense competition, and perhaps consolidation among BI and analytics vendors.


Identity is dead

First published April 2018. 

For at least five years there has been a distinct push within the identity management industry towards attributes: a steady shift from who someone is to what they are. It might have started at the Cloud Identity Summit in Napa Valley in 2013, where Google/PayPal/RSA veteran Andrew Nash, speaking on a panel of “iconoclasts”, announced that ‘attributes are more interesting than identity’. A few months earlier, the FIDO Alliance had been born. On a mission to streamline authentication, FIDO protocols modestly operate low down the technology stack and leave identification as a policy matter to be sorted out by implementers at the application level. Since 2013, we’ve also seen the Vectors of Trust initiative, which breaks out different dimensions of authentication decision-making, and a revamp of the US federal government authentication guide NIST SP 800-63, which decomposes the coarse old Levels of Assurance.

Across cyberspace more broadly, provenance is the hottest topic.  How do we know what’s real online? How can we pick fake accounts, fake news, even fake videos?

Provenance in identity management is breaking out all over, with intense interest in Zero Knowledge Proofs of attributes in many Self Sovereign Identity projects, and verified claims being standardised in a W3C standards working group. 

These efforts promise to reverse an inexorable complication. Identity has long been over-analysed and authentication over-engineered.  The more strongly we identify, the more we disclose, and the unintended consequences just keep mounting.  

Yet it doesn’t have to be so. Here’s what really matters:  

  • What do you need to know about someone or something in order to deal with them?
  • Where will you get that knowledge?
  • How will you know it’s true?

These should be the concerns of authentication.  It’s not identity per se that usually matters; instead it’s specific attributes or claims about the parties we're dealing with. Furthermore, attributes are just data, and their provenance lies in metadata.

The conventional wisdom in IDAM now is that few transactions really need your identity.  So why don’t we just kill it off?  Let’s instead focus on what it is that parties really need to know when they transact, and work out how to deliver that knowledge in our transaction systems.
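As a thought experiment, the three questions above collapse into a very small check on the Relying Party's side. The sketch below is invented for illustration; the claim structure, issuer list and attribute name are hypothetical and are not drawn from any real verified-claims API (such as the W3C work mentioned earlier).

```python
# Hypothetical sketch of attribute-based acceptance by a Relying Party.
from dataclasses import dataclass

@dataclass
class Claim:
    attribute: str      # what is asserted, e.g. "age_over_18"
    value: bool         # the asserted value
    issuer: str         # provenance: who attested it (metadata)
    signature_ok: bool  # stand-in for real cryptographic verification

# The RP decides which issuers it trusts for which attributes.
TRUSTED_ISSUERS = {"age_over_18": {"example-licensing-authority"}}

def rp_accepts(claim: Claim) -> bool:
    """Accept a transaction on one verified attribute.

    Note what is absent: no name, no account, no "identity";
    only the specific claim this transaction actually requires.
    """
    return (claim.signature_ok
            and claim.value
            and claim.issuer in TRUSTED_ISSUERS.get(claim.attribute, set()))

print(rp_accepts(Claim("age_over_18", True, "example-licensing-authority", True)))  # True
```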

IDAM has been framed for years around a number of misnomers. “Digital identity” for instance is nothing like identity in real life, and “digital signatures” are very strange signatures.  Despite the persistent cliché, there are no online “passports”.

But the worst misnomer of all is the Identity Provider, an abstraction invented over a decade ago to try and create a new order (dubbed at the time, the "Identity Metasystem").  Now, I agree in theory that bank accounts for example may be regarded as “identities”, and it follows that banks could be regarded as “identity providers” (IdPs). But these conceptual models have proved sterile. How many banks in fact see themselves as “identity providers”? No IdPs actually emerged from well-funded programs like Identrus or the Australian Trust Centre, and only one bank ever set up as an IdP in the GOV.UK Verify program. If Identity Providers are such a good idea, they should be widespread by now in all advanced digitizing economies!

The truth is that Identity Providers, as imagined, can’t deliver. Identity is in the eye of the Relying Party. The state of being identified is determined by a Relying Party (RP) once it is satisfied that enough is known about a data subject to manage the risk of transacting with them.

Identity is metaphorical shorthand for being in a particular relationship, defined by the RP (for it is the RP that carries most of the risk if an identification is faulty).  Identity is not the sort of good or service that can be provided; it is a state that is defined and conferred by RPs. The metaphor of identity provision is all wrong; canonical Digital Identity is a false idol.

We hardly ever need to know "who people are" online (or in real life for that matter); we just need to know certain specifics about them. So let’s get over identity, and devote our energies to critical infostructure to supply the reliable data and metadata so urgently needed for an orderly digital economy.


Recap - Telemedicine panel at So.Cal HIMSS

I had the opportunity to moderate the telemedicine panel at the Southern California HIMSS event on 5/23/19. We had an outstanding list of panelists:

Zia Agha, MD, Chief Medical Officer & EVP, Clinical Research, Medical Informatics & Telehealth, West Health

William Jih, MD, MBA, Medical Director, Population Health & Strategy, Loma Linda University Medical Center

Michael Pfeffer, MD, FACP, Assistant Vice Chancellor and Chief Information Officer for the UCLA Health Sciences

Omid Toloui, MBA, MPH, Vice President, Digital Health, CareMore

 

Virtual Care Trend

  • Five years ago, a virtual care service offering would have been a differentiator; now it is an expectation from the patient community.
  • Telehealth adoption is growing at roughly 14%.
  • A 2018 MGMA poll found that 39% of physicians do not offer a telehealth option. There are still a lot of doubters in the physician community.
  • Technology is the easy part of the telemedicine program design.

Challenges with Virtual Health programs

  • Health systems do not have a consistent workflow designed for telemedicine; the departments and specialties using telemedicine are all unique, which makes it difficult to create a single telemedicine technology solution.
  • Physicians are still skeptical about using a computer screen to diagnose the patient.
  • Physicians feel that the loss of touch and senses hinders their ability to make the best clinical decision. 

Chou’s Overview:

  • Telemedicine should be used to augment care. The medical community has to understand that technology is not going to be a replacement for clinicians.
  • There is a lack of telemedicine education and training. The use of technology must be incorporated into the medical school curriculum, since we live in a digital world. The next generations of caregivers are digital natives, and they expect the use of technology to be pervasive in providing care. Currently, the medical school curriculum does not focus on how physicians should use technology.
  • Key concerns remain around reimbursement uncertainty and physician skepticism, neither of which helps adoption.

SAP #SAPPHIRENOW - Healthcare CIO POV

 

 

SAP SAPPHIRE NOW was a great event, with an emphasis on customer experience and the integration of Qualtrics. That integration will be the key. Where is Qualtrics used in healthcare currently? The healthcare clients I have spoken to are utilizing Qualtrics mostly for HR employee engagement and for surveys in the research department. The next phase for Qualtrics is to persuade health systems to use the platform as a patient/customer engagement tool.

What’s next for SAP and the Healthcare CIO?

  1. ERP optimization is a big theme for healthcare provider systems. Every hospital system in North America is evaluating its ERP system, looking either to upgrade the current system or to consider a change to a next-generation cloud platform offering. This is an opportunity for SAP to capture the healthcare ERP industry, but the challenge is that SAP is not currently on the shortlist for North American healthcare provider CIOs.
  2. SAP Intelligent Enterprise. SAP BusinessObjects has a great presence throughout the healthcare industry. The key is to convert these customers to the intelligent enterprise platform; the open question is whether North American healthcare providers will trust SAP as their data platform of choice.

  3. Global healthcare market. SAP has done a great job globally in healthcare. Great customers in the UK, EU, and China have allowed SAP to grow incrementally worldwide. I have seen a few healthcare organizations utilize SAP as the main clinical and back-office system, and that momentum may continue to grow as SAP focuses on building out the platform.

 

 

 


Qlik #Qonnections – CIO Point of View

Big Announcements

  • Qlik SaaS and Multi-cloud offering. 
  • Attunity acquisition and platform integration.

Healthcare CIO Point of View

  • When we think of Qlik, it is still a BI platform, while the company’s strategy is to transform Qlik into a data platform.

  • I like the Attunity acquisition, because the challenge in every organization’s data transformation is the data integration effort. Hopefully, Attunity will deliver the results as promised. The key challenge for healthcare providers is data mapping and integration with enterprise healthcare applications. Will Attunity figure this out quickly, and will existing healthcare clients look to Qlik as an enterprise solution rather than a visualization product?

  • The multi-cloud and SaaS support is a great announcement. Enterprise clients that are currently on-premises will have to rely on the expertise of Qlik resources or partners for the migration, which CIOs should explore in depth. It is not an easy migration, and I believe we are in the early stages of having a migration playbook.
  • Qlik product sponsorship typically starts at the departmental level for the majority of healthcare clients. Qlik must work toward getting more executive sponsorship at the CIO level in order to align with its strategy of transitioning to a data platform.