Results

Lead times for generative AI systems extending into 2024

Lead times for generative AI infrastructure are now so long that systems ordered today may not be installed until spring 2024.

That's the takeaway from a bevy of earnings conference calls from infrastructure providers. Dell Technologies reported better-than-expected second quarter results and noted that its PowerEdge XE9680 GPU-enabled server is the fastest ramping product it has launched.

Dell's problem: Orders are off the charts, but parts--namely GPUs--are hard to come by even though the company has a strong partnership with Nvidia. Dell Chief Operating Officer Jeff Clarke said the company has about $2 billion of XE9680 orders in backlog with a higher sales pipeline.

Yes, generative AI investment is taking off. But the revolution--and the enterprise use cases on premises and locally--will be delayed due to supply-demand imbalances. Clarke later put a number on the backlog issue:

"Maybe the easiest measure to determine where we are with supply is demand is way ahead of supply. If you order a product today, it's a 39-week lead time, which would be delivered the last week of May of next year. So, we are certainly asking for more parts working to get more parts. It's what we do. I'm not the allocator, I'm the allocatee. We’re advocating our position on our demand. Again, we are winning business signaled by the $2 billion in backlog today with a pipeline that's significantly bigger I was in the discussion yesterday with two different customers about AI, the day before about AI. It is constantly something that's coming into our business that we're fielding the opportunities. From different cloud providers to folks building AI as a service to enterprises now beginning to do proof of concept and trying to figure out how they do exactly what I just said earlier, use their data on-premises to actually drive AI to improve their business."
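Clarke's arithmetic is easy to verify: 39 weeks from an order placed around the time of the late-August call lands in the last week of May 2024. A quick sketch (the order date below is a hypothetical stand-in for "today"):

```python
from datetime import date, timedelta

def delivery_date(order_date: date, lead_time_weeks: int) -> date:
    """Estimated delivery date for an order with a given lead time."""
    return order_date + timedelta(weeks=lead_time_weeks)

# Hypothetical order placed the day of Dell's late-August earnings call
order = date(2023, 8, 31)
print(delivery_date(order, 39))  # 2024-05-30 -- the last week of May, as Clarke said
```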

Clarke added that supply will eventually catch up:

"We're tracking at least 30 different accelerator chips that are in the pipeline in development that are coming. So, there are many people that see the opportunity. Some of these new technologies are fairly exciting from neuromorphic type of processors to types of accelerators there's a series of new technologies and quite frankly, new algorithms that we think open up the marketplace and we'll obviously be watching that and driving that across our businesses and helping customers."

AMD is the likely beneficiary with its fourth quarter ramp of next-generation GPUs.

This GPU supply issue is also affecting other infrastructure players.

HPE CEO Antonio Neri said on the company's latest earnings call:

"We start now shipping some of those orders, those wins, but it's a long way to go. And remember that there's two components related to that. Number one is availability of supply, which obviously in the AI space is constrained. Number two is the fact that when you deploy these deals, you have to install it and then drive acceptances, which means elongated times for revenue recognition. And then maybe in a specific win or two, there are other conditions related to the contractual agreements."

Neri added that he's pleased with the quality of AI deals the company is making. HPE, like Dell, is building a lineup that aims for a portfolio of gear for AI training to tuning to inferencing for enterprises looking at domain-specific models.

Broadcom CEO Hock Tan said AI systems take time and supply issues are going to impact a bevy of infrastructure players. "These products for generative AI take long lead times," said Tan. "We're trying to supply, like everybody else wants to have, within lead times. You have constraints. And we're trying to work through the constraints, but it's a lot of constraints. And you'll never change as long as demand, orders flow in shorter than the lead time needed for production."

According to Intel CEO Pat Gelsinger, who spoke at an investor conference last week, competitors will start grabbing some of that GPU demand. Intel's entry in the GPU and AI accelerator race is its Gaudi lineup. Gaudi is a line of AI accelerators built off the acquisition of Habana in 2019.

Gelsinger said Nvidia "has a great leadership position," but "there's a bit of false economy there." He said there's huge demand, high prices and supply chain constraints for Nvidia GPUs but that won't last forever. "Lots of people are showing up, including us, to compete," said Gelsinger. "We've seen a rapid expansion of our Gaudi pipeline. We're building our supply chains to get much larger for our footprint there as we start competing as well as others will."

Intel's argument is that GPUs and CPUs will be used for AI models and enterprises will follow pricing vs returns. "There is a bit of euphoria so overall I expect to see more moderation," he said. "We're going to be competing more for the GPU and accelerator, but we also see workloads driving energy that will create opportunities for our CPU offerings as well."


Zoom sets AI Companion roadmap across platform

Zoom Video Communications' generative AI digital assistant is now available for no additional cost to paid Zoom accounts.

The Zoom AI Companion, formerly known as Zoom IQ, will be rolled out across various services on the Zoom platform, including Meetings, Team Chat, Phone, Email and Whiteboard, and will be available in a side panel.

Separately, Zoom said that Zoom IQ for Sales will be renamed Zoom Revenue Accelerator.

Zoom launched its generative AI efforts in June with free trials of Team Chat compose and Meeting Summary. The launch of Zoom AI Companion marks a more significant rollout across the company's platform. Zoom said it will incorporate its own large language models as well as third-party models such as Meta's Llama 2 and models from OpenAI and Anthropic.

The collaboration software space is rapidly adopting generative AI. For instance, Google Cloud is adding a bevy of AI-driven features to Workspace and its platform, and Microsoft is bringing Copilot to Teams.

Another thread to consider is whether vendors charge for generative AI capabilities. Some vendors are adding them to the platform at no additional cost while others are creating new SKUs.

Zoom emphasized that it won't use customer audio, video, chat, screen-sharing, attachments or other customer content to train models. In addition, AI Companion is turned off by default.

Here's a look at AI Companion and the roadmap ahead.

  • In Zoom Meetings, AI Companion will enable highlights, smart chapters, review summaries and next steps. Attendees can catch up on meetings using an AI Companion side panel. Hosts can get an automated meeting summary. In spring 2024, Zoom said it will add real-time feedback on participants' presence in meetings and coaching on presentation skills.
  • Zoom Team Chat will use AI Companion to draft messages based on chat thread context and adjust tone and length. Thread summaries are coming shortly, and Zoom said users will be able to auto-complete sentences and schedule meetings from chat in early 2024.
  • In the fall, Zoom Whiteboard will get AI Companion to help generate ideas. In spring 2024, users will be able to use whiteboard content to generate images and populate templates.
  • Zoom Mail will have the option for draft email suggestions from AI Companion in the fall. In spring 2024, Zoom Phone will be able to summarize SMS threads and calls.
  • The company added that in spring 2024, AI Companion will gain a conversational interface for pre-meeting preparation, in-meeting support and post-meeting summaries, stakeholder recaps and action items.

As for Zoom Revenue Accelerator, the company said it will expand capabilities in the fall with Virtual Coach, Deal Risk Signals and Discover Monthly for sales teams. Virtual Coach aims to dynamically train sales teams with conversation simulations. Deal Risk Signals will send alerts for stalled deals. And Discover Monthly looks at competitor trends from sales calls. 

Constellation Research's take

Constellation Research analyst Liz Miller handicapped the Zoom moves. She said:

"Zoom, like many vendors in the CX space, have realized that the high-fidelity signal of customer voice at the moment of engagement, be it in a sales call or in a customer service call, can no longer sit in a silo of communication channel. That voice, thanks to talk to text transcription and AI models trained to extract key signals including sentiment and intent, can now lead to powerful workflows that benefit the buyer and the business. And, thanks to Zoom’s pre-built CRM integrations, this intelligence and conversation doesn’t just sit in the conversation channel but can now be included with CRM data.

The real question for me is where Zoom will pivot next. It isn’t enough to apply AI to voice and engagement. It needs to connect across the silos of sales, service and marketing--something that CRM has proven to do poorly in previous incarnations. Will Zoom become that next platform? Will Zoom be the connection point for all conversations and points of collaboration, including those being held in and at events and marketing-led experiences? While Zoom is keeping pace with the CX platforms out there (and by that I specifically mean those solutions that span Sales, Marketing and Service not just CCaaS and UCaaS tools) is there enough in the innovation engine to think beyond and deliver something more than what the rest of the market has? Right now, it feels like a bigger game of keeping up rather than surging ahead."


Get ready for a parade of domain specific LLMs

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly.

Generative AI and large language models (LLMs) have received plenty of buzz, but enterprises need to stay focused on how domain-specific models develop. Why? That's where the returns will be.

At Constellation Research, we've kicked around small language models, safeguarding corporate data while tuning LLMs, and how the real wins will come from specific use cases. Generative AI is nice for search and consumer use cases, but the magic happens when enterprises drive returns.

Fortunately, domain-specific models are developing quickly. For instance, Google Cloud outlined Med-PaLM 2, a medically tuned LLM aimed at healthcare. Google Cloud launched a bevy of AI tools at Google Cloud Next for CIOs to digest.

Bayer is using Google Cloud's Vertex AI Search and exploring Med-PaLM 2 use cases. Hospital giant HCA is working with Google Cloud to leverage Med-PaLM to support doctors treating patients. Google Cloud fleshed out the HCA collaboration a bit more in a release, but the gist is that the two parties, along with Augmedix, are tuning models to support doctors and nurses. HCA is targeting:

  • Generative AI to improve patient handoffs between nurses.
  • Speech to text with ambient medical documentation.
  • Google AI to document key medical information more easily from conversations during patient visits.

According to Google Cloud, prompts were designed to guide the LLM toward topics such as medication changes, lab results, vital sign fluctuations and patient concerns. HCA is collecting nurse feedback to refine the tool. Ultimately, HCA wants to use Med-PaLM 2 LLM to support caregivers.

More: 8 takeaways from Constellation Research's Healthcare Transformation Summit

My bet is that these domain-specific LLMs are going to be the real win for enterprises since they can leverage models and refine them with their own data. Large models such as PaLM, ChatGPT and Llama will have versions for various industries and use cases.

And this trend is going to go beyond healthcare. See:

Aneel Bhusri, CEO of Workday, said during the company's second quarter earnings call that it is using anonymized customer data to train LLMs. "We can then do domain-specific large language models, and those are smaller and less expensive. And we turn around and use those models to either make our products more competitive or they're the basis of new SKUs like the Skills Cloud," explained Bhusri. The real takeaway is that Workday isn't going to go add-on happy with charges. He said:

"I think you see us more in the mode of new SKUs like Skills Cloud rather than actually charging for any insight from the data -- that it's the customer's data. They allow us to use it in an anonymized way and we give them the results back. But I think what it allows us to do is train these large language models and then domain specific ones that will create new SKUs."

Intuit CEO Sasan Goodarzi was also bullish on domain-specific LLMs. Goodarzi said that Intuit "has incredibly rich longitudinal, transactional and behavioral data for 100 million customers."

Goodarzi added:

"For small businesses, we have a 360-degree view of their business and customers. We have 500,000 customer and financial attributes per small business on our platform and this data gives us insights into behaviors, income streams, expenses, profitability, and cash flows, enabling us to provide personalized experiences and recommendations to help them prosper.

Additionally, we have 60,000 financial and tax attributes per consumer on our platform. We are using our data to fine-tune our own financial large language models that specialize in solving tax, accounting, cash flow, marketing, and personal finance challenges."

Another key thought about domain-specific LLM and AI use cases is that workloads will be spread across multiple industries.

Dell Technologies' Jeff Clarke, Chief Operating Officer, said the domain-specific use of LLMs and AI models will touch every industry. He said:

"What we think really happens on the enterprise level and in business is sort of the notion of domain-specific process-specific or field of study type of AI, where we actually use customers' data business will use their data they will tune the model and then run inference at site on edge, whether that be in a smart factory, smart hospital in a transportation network. So when you think about the vertical nature of this and how it will actually work in the real world, we think that technology makes its way all the way out to the edge, AI follows where the data is going to be created."

MongoDB sees Q2 surge, ups fiscal 2024 outlook

MongoDB raised its fiscal 2024 outlook and issued strong third-quarter guidance after delivering better-than-expected second-quarter results. The earnings land after a bevy of MongoDB launches focused on generative AI workloads.

The company projected third quarter revenue of $400 million to $404 million with non-GAAP earnings of 47 cents a share to 50 cents a share. For fiscal 2024, MongoDB projected revenue of $1.596 billion to $1.61 billion with non-GAAP earnings of $2.27 a share to $2.35 a share.

MongoDB previously projected fiscal 2024 revenue between $1.52 billion and $1.54 billion with non-GAAP earnings of $1.42 to $1.56 a share.
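Taken at the midpoints, that works out to roughly a $73 million revenue raise and an 82-cent non-GAAP EPS raise for fiscal 2024. A quick back-of-the-envelope check using the guidance ranges above:

```python
def midpoint(low: float, high: float) -> float:
    """Midpoint of a guidance range."""
    return (low + high) / 2

# Fiscal 2024 revenue guidance in millions of dollars (from the article)
prior_revenue = midpoint(1520, 1540)    # prior: $1.52B to $1.54B
new_revenue = midpoint(1596, 1610)      # new: $1.596B to $1.61B
print(new_revenue - prior_revenue)      # 73.0 -> ~$73M raise at the midpoint

# Fiscal 2024 non-GAAP EPS guidance in dollars
prior_eps = midpoint(1.42, 1.56)
new_eps = midpoint(2.27, 2.35)
print(round(new_eps - prior_eps, 2))    # 0.82
```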

That optimistic outlook is partly fueled by the strong second quarter. MongoDB said it is capturing workloads and bolstering margins as developers seek a unified platform for AI workloads. CEO Dev Ittycheria said MongoDB's Atlas platform was gaining enterprise traction.

At MongoDB's local developer conference in New York, the company launched a bevy of features including Atlas Vector Search, Atlas Stream Processing and Atlas for Industries. 

The company reported second-quarter revenue of $423.8 million, up 40% from a year ago, with non-GAAP earnings of 93 cents a share. MongoDB's net loss on a GAAP basis was 53 cents a share.

Wall Street was looking for non-GAAP earnings of 45 cents a share on revenue of $390.87 million.

Dell Q2 strong as AI-optimized servers, workstations see demand surge

Dell Technologies reported better-than-expected second quarter results and said it saw strength in AI optimized servers, storage and workstations that can run AI workloads locally.

The company reported second-quarter earnings of 63 cents a share on revenue of $22.9 billion, down 13% from a year ago, but 10% higher than the first quarter. Non-GAAP earnings for the quarter were $1.74 a share.

Wall Street was expecting Dell Technologies to report earnings of $1.14 a share on revenue of $20.85 billion.

Dell Technologies, like HPE, said it was seeing strength as cloud providers and enterprises build infrastructure for AI workloads. Dell has a partnership with Nvidia for AI-optimized infrastructure.

Jeff Clarke, Dell's chief operating officer, said the company saw "a better demand environment" and AI "is already showing it's a long-term tailwind, with continued demand growth across our portfolio."

In prepared remarks, Clarke said Dell was cautious about the quarter, but demand "improved at a faster rate than we anticipated, particularly as we moved into June and July."

Clarke added that Dell executed well and was more selective on deals and pricing.

Dell has built out its lineup for generative AI gear and has validated designs with Nvidia. Other generative AI optimized products include Dell PowerEdge XE9680 servers and Dell Precision workstations with up to four Nvidia RTX 6000 Ada generation GPUs. 

Clarke said:

"From a solutions perspective, we saw significant strength in AI enabled servers. PowerFlex and PowerStore demand grew within our storage portfolio. PowerFlex, our proprietary software defined storage solution, has now grown for eight consecutive quarters, with demand in Q2 more than doubling year-over-year. Workstation demand grew and was another bright spot that will continue to benefit from the rise of AI. Developers and data scientists can now fine-tune Gen AI models locally before deploying them at scale."

The company's Infrastructure Solutions Group had second quarter revenue of $8.5 billion, down 11% from a year ago. Storage revenue was $4.2 billion with servers and networking revenue of $4.3 billion. Operating income was $1 billion.

"In Q2 alone, we saw unprecedented strength from our PowerEdge XE9680. It's the fastest ramping new solution in Dell history and builds on the success of other GPU enabled servers we have been selling for years," said Clarke.

Although Dell didn't provide an outlook, the demand surge for AI-optimized servers appears to be ongoing. AI servers were 20% of server order revenue in the first half of the year and Dell has $2 billion of XE9680 orders in backlog with a strong sales pipeline.

Dell's Client Solutions Group had second quarter revenue of $12.9 billion, down 16% from a year ago. Commercial client revenue was $10.6 billion and consumer revenue was $2.4 billion. Operating income was $969 million.

Clarke added that many AI workloads will be on-premises or at the edge due to latency, data security and costs. Customers are focused on using generative AI for customer operations, content creation, software development and sales.

Cybersecurity platforms spar over data, generative AI, wallet share

Cybersecurity platforms are consolidating, and enterprise buyers are evaluating datasets, intelligence, integration and generative AI capabilities. What's unclear is how many cybersecurity platforms can win.

The three next-gen cybersecurity platforms--Palo Alto Networks, CrowdStrike and Zscaler--all have AI capabilities, strong platforms, data signals and heady revenue growth. The elephant in the cybersecurity room is also clear: Microsoft.

CrowdStrike's second quarter earnings highlighted the moving parts well. CrowdStrike hasn't shied away from taking a few jabs at Microsoft. CEO George Kurtz noted:

"A major auto manufacturer tried but failed to consolidate their security on Microsoft E5. This company's security team quickly realized Microsoft's complexity, multiple consoles, lack of integration, miss detections and complex deployments hampered their ability to defend themselves and consolidate. This customer is now consolidating on the Falcon platform with Falcon Complete for Endpoint, Identity and Cloud. Now with a single agent, single user interface and single platform, they have complete visibility across their end points, cloud and identities and the ability to stop threats in real time. By moving from expensive Microsoft E5 to CrowdStrike, organizations can save 50% plus per user per year on Microsoft licensing costs, adding up to millions of dollars of savings."

That quote landed just a few months after CrowdStrike's investor day, where a section of the presentation was devoted to Microsoft and how CrowdStrike wins 8 out of 10 times when an enterprise customer tests the two platforms.

Microsoft's Charlie Bell, Executive Vice President of Microsoft Security, was speaking at an investor conference the same day as CrowdStrike's earnings. "I think we're one of the major beneficiaries of the consolidation move. We see healthy growth. We're now a million organizations protected, and that number grew by 26% last year," said Bell. "The number of customers who are using more than four workloads, that number has gone up by 33%. I think there's a lot of optimization that people were doing."

Bell added that Microsoft's AI efforts go beyond ChatGPT. "We often say security is a team sport. Well, within the AI world, building a copilot is this team sport. It's not just the LLM, it's specially trained models," explained Bell. "One of the beauties of being a cloud provider is you don't just get to see one environment, you get to see lots of environment. And so there's a data asymmetry that works to our advantage. We do 65 trillion signals a day processed within our products. And the fact that we have all that data, I think, is a huge advantage."

CrowdStrike has a generative AI assistant called Charlotte AI that promises to create virtual security analysts and help enterprises respond to threats faster. Charlotte AI, which will be priced in the weeks ahead, leverages CrowdStrike's data.

More: Palo Alto Networks: Takeaways from a Friday afternoon treatise

These cybersecurity platforms are arguing that data is the differentiator since it can be used to train models to read and react to incidents faster.

Palo Alto Networks will focus on "precision AI" that can't be wrong. "We have to build a lot of our own models. We have to train them. We have to collect first-party data. We have to understand the data. Today, we collect approximately 5 petabytes of data. Yes, 5 petabytes of data on behalf of our customers and analyze it for them to make sure we can separate signal from noise and take that signal and go create security outcomes for our customers," said CEO Nikesh Arora.

CrowdStrike's Kurtz was asked how much of a generative AI and data moat the company has when the big guns are all talking the same game. Kurtz said curation of the dataset matters as much as the petabytes involved. He said:

"It isn't just about the most data. You'll hear that from a lot of vendors. It's really about sort of the curated data set because when we think about generative AI, it actually has to be trained. We have a very well-defined training set that's annotated based upon all the threat hunting that we've done over the last 10 years. So we believe our 10-year head start in terms of having a data set that's actually curated is going to give us a distinct advantage of helping our customers. Then it's a foundational platform component, which is made available to every other service on the platform, which is different than others. We'll see how it all unfolds, but initial customer reaction has been very positive."

Bottom line: Four major security players are looking to blend data, signals and platforms. These four can take business from smaller players for multiple quarters. What will be interesting to see is how these cybersecurity giants take business from each other.

Constellation Research's take

Liz Miller, Constellation Research analyst, said:

"AI has long been touted as a potential savior for security, especially security operations centers that have long been overwhelmed by lackluster signals setting off an avalanche of alerts that are time consuming and tedious. In this regard the big players like Palo Alto and especially Microsoft are particularly well positioned with an expansive and comprehensive portfolio to train and fine tune models. However, where the training needs to focus is on automating the workflows around the work of security.

It may not be time to count out IBM in this AI for cyber mix. IBM is looking at everything from protecting the data that is now being randomly splashed into enterprise business and customer graphs along with powering risk analysis for incident summaries that are based on fine tuned, high-fidelity reports. IBM's managed services solutions including MDR and IDPS solutions are turnkey and most include their “X-Force” response team that now has AI added as an army of support.

But there is an even greater threat to security platforms and their wallet share given this AI evolution. Organizations have started to admit that they are diverting budget away from security transformation initiatives and shifting those dollars into AI initiatives that are driving revenue or saving money. Yes, AI has the potential to shift the cybersecurity posture and preparedness discussion completely and dramatically, but it also has the potential to sideline security initiatives. That is not where we should be today.”

Salesforce's Dream(force) is about Data Cloud, being your single source of truth

Here's all you need to know about Salesforce's strategy. On the company's second quarter earnings conference call, Data Cloud was mentioned 34 times, Dreamforce 31 times and generative AI 21 times.

Put the three together and you pretty much know what's coming at Dreamforce Sept. 12-14.

The bigger picture here is that Salesforce will double down on Data Cloud, which the company says is its fastest organic growing cloud and use it to expand wallet share. For a second there, I thought Salesforce CEO Marc Benioff was channeling Snowflake CEO Frank Slootman, who has said "enterprises and institutions alike are increasingly aware they cannot have an AI strategy without a data strategy."

Benioff took a more roundabout way to get to the data strategy part. Salesforce's take is that enterprise data strategy should be to unify on Data Cloud and then leverage the integration and generative AI tools with its other clouds.

Research:

  • How Data Catalogs Will Benefit From and Accelerate Generative AI
  • Constellation ShortList™ Embedded Analytics Platforms for Cloud Applications
  • Analytics and Business Intelligence Evolve for Cloud, Embedding, and Generative AI

Benioff said:

"What you can see with Data Cloud is that customers must get their data together if they want to achieve success with AI. This is the critical first step for every single customer.

We're going to get this Data Cloud turned on as fast as we can and as easily as we can for every single one of our customers.

It not only has AI built in, but it's real time, it's automated, it's integrated with the core platform. It's not some separate Data Cloud. It's an integrated part of our platform in our metadata, in our core code, like our Sales Cloud, like our Service Cloud and, as you're about to assume seeing our new Marketing Cloud and Commerce Cloud and of course, our core application development capabilities all inside our Data Cloud.

And our data cloud is so deeply integrated as part of this core metadata architecture. It's allowing our customers to quickly action all of their data from any source without the costly integration project necessary with stand-alone data warehouses and data lakes, they've been forced to buy and create more islands of information and all of these independent systems and independent teams versus having one integrated data architecture.

We're moving our customers from having islands of data to having a single source of truth for all of their data. This is our greatest dream."

According to Salesforce, Data Cloud ingested 6 trillion records in the second quarter. Data Cloud was in five of Salesforce's top 10 deals in the quarter with FedEx being the biggest win.

Clearly for Salesforce, the game is stacking multiple clouds with Data Cloud being the linchpin that drives returns. Benioff said:

"As these clouds get stacked with these customers', attrition falls, customers become more successful, they develop a single source of truth. And our job is to get all of these things running on our core and getting all of these things ignited with artificial intelligence."

Will customers buy Salesforce's Data Cloud dream?

It's not exactly a new phenomenon that an enterprise vendor wants to be your single source of truth. The more data that is housed on a platform the more lock-in is created. Oracle, SAP, Salesforce, Microsoft and a bevy of others play a similar game. Toss in Snowflake and Databricks and there's big money in being the data platform of choice.

For Salesforce, it's critical that Data Cloud integrates its various services. Data Cloud is the avenue for Salesforce to sell more clouds. And the built-in integration with multiple Salesforce clouds may be well timed given enterprises are consolidating vendors. That native Data Cloud integration with Salesforce applications is likely to drive demand regardless.

However, the idea that Salesforce will be the single source of data truth is a bit farfetched. After all, data resides in multiple lakes, warehouses, databases and repositories across clouds. The game is being able to tap into those data stores seamlessly. Salesforce knows this reality already and has partnered with Snowflake as well as the hyperscale cloud providers so customers can bring their own data.

My bet: Data Cloud will be the big Salesforce theme, but MuleSoft, which connects various systems and data, will be the secret sauce.

Previously:


Salesforce Q2 better than expected, outlook raised ahead of price increases


Salesforce reported better-than-expected second quarter results and raised its outlook ahead of its August price increases.

The company reported second quarter earnings of $1.28 a share on revenue of $8.6 billion, up 11% from a year ago. Non-GAAP earnings in the quarter were $2.12 a share. Wall Street expected Salesforce to report non-GAAP earnings of $1.90 a share on revenue of $8.53 billion.

In July, Salesforce said it would raise list prices across its clouds by about 9%. Those price increases started in August and weren't captured for the second quarter, which ended July 31.

Salesforce CEO Marc Benioff said the company is seeing improving demand in the second half of the fiscal year and raised its fiscal 2024 outlook for operating margins and cash flow.

On a conference call with analysts, Benioff said the company is committed to improving margins while investing in the future. Naturally, Benioff talked about AI. “We are at the dawn of a new AI innovation cycle. Every company will undergo an AI transformation with the customer at the center,” said Benioff.

He added that Dreamforce in September will feature Data Cloud advances that make it easier to access data with one architecture. Benioff took aim at data warehouses and silos and said they’re hard for customers to integrate. “Our goal is to make it easy for every customer to turn (Data Cloud) on,” said Benioff. “Customers must get their data correct if they want to move forward with AI.”

Benioff added that Salesforce is using its own AI internally and “trying to augment ourselves using Einstein.”

The company projected third quarter revenue of $8.7 billion to $8.72 billion, up 11% from a year ago. Non-GAAP earnings for the third quarter will be $2.05 a share to $2.06 a share. For fiscal 2024, Salesforce said revenue will be $34.7 billion to $34.8 billion, up 11%, with non-GAAP earnings $8.04 a share to $8.06 a share.

When Benioff was asked about whether Salesforce could grow wallet share, he said growth will start with AI even as sales cycles are longer than usual. “I really think Dreamforce will be a catalyst for customers to grow with us as they reignite their IT budgets,” said Benioff. International expansion and industries are also key growth avenues.

In the second quarter, Salesforce's revenue by cloud showed growth of roughly 10% to 17%. Here's the breakdown.

  • Sales cloud revenue was $1.89 billion, up from $1.69 billion a year ago.
  • Service cloud sales were $2.05 billion, up from $1.83 billion a year ago.
  • Platform and other revenue were $1.64 billion, up from $1.48 billion.
  • Marketing and commerce cloud revenue was $1.24 billion, up from $1.12 billion a year ago.
  • Data revenue was $1.19 billion, up from $1.02 billion. Data revenue includes Tableau and MuleSoft.
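As a quick sanity check, the year-over-year growth implied by several of the segment figures above can be computed directly (the inputs are the rounded figures as reported, in billions of dollars):

```python
# Reported segment revenue in $B: (current quarter, year-ago quarter),
# using the rounded figures listed above.
segments = {
    "Sales Cloud": (1.89, 1.69),
    "Service Cloud": (2.05, 1.83),
    "Platform & Other": (1.64, 1.48),
    "Data (Tableau + MuleSoft)": (1.19, 1.02),
}

# Year-over-year growth in percent, rounded to one decimal place.
growth = {
    name: round((current / year_ago - 1) * 100, 1)
    for name, (current, year_ago) in segments.items()
}

for name, pct in growth.items():
    print(f"{name}: +{pct}%")  # e.g. Sales Cloud: +11.8%
```

The rounding of the reported figures means these percentages are approximate, but they line up with the overall 11% revenue growth in the quarter.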

By geographic region, Asia Pacific had the fastest revenue growth at 24% from a smaller base, with growth of 11% in Europe and 10% in the Americas.


UserTesting launches AI Insight Summary, marries generative AI with experience research


UserTesting launched UserTesting AI, a set of tools that surface customer experience insights and pain points, and AI Insight Summary, which uses GPT to put verbal and behavioral data in plain English.

AI Insight Summary, outlined at UserTesting's customer conference, is interesting because it enables UserTesting to use generative AI to feel your pain with new designs and experiences. UserTesting provides tools and research to improve product and customer experiences.

According to UserTesting, AI Insight Summary will provide summaries as well as evidence, since each insight points back to source video and data.

Constellation Research analyst Liz Miller, checking in from UserTesting's THiS23 conference, said UserTesting's approach with AI Insight Summary can save companies money by limiting rework as designs are tweaked to improve experiences.

"One of the things CEO Andy MacMillan emphasized in his keynote here at THiS23 is something that a lot of us in digital experience and design know all too well: the most costly part of the design process is the cost of reworking what didn’t quite land as expected," said Miller. "AI Insight Summary is a tool that can help address that issue."

Miller said AI Insight Summary's ability to synthesize multiple streams of data and bring evidence will enable a researcher, marketer, CX or UX team to double click into video and clips from research panels. These drilldowns can provide a lot of feedback and nuances to improve design.

"While lots of 'conversation summary' tools exist today--and many of them have been supercharged thanks to generative AI--this AI tool has also been trained in research," she said. "It is intentionally looking for those patterns, connections and actions. Imagine connecting a summary that outlines what a customer did, WHY they did it and how they FELT while they were doing it."

Indeed, AI Insight Summary can show how many users completed a purchase, became frustrated and interacted with images and guides. AI Insight Summary processes verbal, design and behavioral data and then converts it to transcripts to identify anomalies and various findings.

Other key points about UserTesting AI and AI Insight Summary:

  • UserTesting AI highlights areas for improvement and friction points in workflows and processes in digital experiences.
  • AI will be embedded in UserTesting AI throughout the stages of research processes with a focus on sentiment, intent analysis, interactive path flows and friction.
  • Research teams can leverage generative AI to focus on more strategic work.
  • AI Insight Summary is in beta.

Miller added that AI Insight Summary is a good indication of where experience research can go. "These are the complex processes in research and UX research that are ripe for AI innovation," said Miller. "These innovations are allowing designers and product owners to ask very different questions and establish processes where repeatable, trusted and verifiable research insights can just be built into the design lifecycle."


The Driver Controls the Radio: GenerativeAI, Microsoft’s CoPilot and How AI is Learning the Rules of the Road


There is a fundamental road-trip rule: The driver controls the music.

The driver, you see, controls the car. The audio system is part of the car. Ergo, the driver controls the audio system. And she who controls the audio system controls the music. Can a passenger make requests? Yes. Can the passenger occupying that passenger seat be asked to manipulate settings or get us to the right playlist? Sure. But, in the end, the driver controls the music.

Today’s cars have everything from an endless variety of curated playlists to digital whoopie cushions that can deploy a range of gas-based melodies, turning the once simple choice of suitable driving music into a potentially overwhelming extravaganza. I rely on my co-pilot’s capacity to listen, synthesize requests from around the entire vehicle, make suggestions, but in the end to execute, not distract me from the job at hand.

As a marketer, a co-pilot for this wild ride known as customer engagement is exactly what I and so many of my CMO colleagues have expressed needing for our teams and for ourselves. I’m not looking for someone to control the radio…just someone to help recommend some new playlists and time-saving options. What I absolutely don’t need is more complexity…more buttons to push, more apps to sift through and more Teams messages distracting me from the road ahead. I don’t need the promise of perfection, but I would welcome a shortcut or two.

This is where too many early applications of AI (and especially generative AI) have failed the average marketer. At a time when we didn’t think being a marketer armed with data and technology could be any more complex, we added AI that can spit out commands about what time emails are opened and a dozen new subject lines we should be testing. Far too many AI tools sound like a mystical “easy button,” entering this world fully formed and matured, but in reality they are barely trained to understand language, let alone business or marketing.

When the headlines started screaming how AI could take a marketer’s job, some marketers just laughed. Can any of you remember the last time we asked for something—an email delivery report, a segment analysis, an image for the home page—and we got the exact right thing we wanted the first time we asked for it? While marketers have been curious about the world of possible, many are also weary of all the promises unfulfilled and the implications of applications gone wrong, especially as early examples of AI missteps have begun to emerge.

This is why I was a bit skeptical when taking a spin through the ongoing advancements Microsoft has been making with its Dynamics 365 CoPilots, now available for Sales, Customer Service and Marketing, with a preview recently announced for Field Service. Did we really need another tool? Was AI going to make a marketer’s drive that much more enjoyable? Or could this new AI implementation take our eyes off of growth?

Almost immediately I came to an understanding that the CoPilots infused into the Microsoft universe, from Dynamics to Office, were truly there to sit in the passenger seat and work with the driver…not to distract. First things first…CoPilot is not something that a Microsoft Dynamics user HAS to use. Instead, CoPilot is something that can be toggled on. CoPilot also isn’t isolated to a single functional tool, which is exceedingly important to Chief Marketing Officers in this age of driving enterprise-wide growth and engagement strategies. CoPilot thrives when cross-functional data is unleashed and true insights about the customer, about opportunities and about the work at hand can be accessed.

Let’s take the act of identifying opportunities for growth, a strategy that typically starts with identifying a segment or cohort ripe for profitable engagement. While this sounds simple, and like something systems have been able to do since before generative AI, the reality is that the questions marketers learned to ask were limited by their knowledge of the data structures underlying their analytics or engagement systems. Questions were structured around what was known and were actually rather difficult to craft. If we pull back the curtain on the realities of posing these types of queries before large language models and generative AI innovations, marketers would either need to rely on data and analytics teams or craft basic queries from pre-seeded templates. Questions that felt as if they should be easy to answer demanded query-building expertise and the ability to craft SQL.

Segment building with CoPilot allows a marketer to just ask a question, in their natural pattern of speech, to identify a new segment. But CoPilot can actually take an additional step by including deeper insights in the response, delivering a look-alike segment that also includes details on customer lifetime value or known engagement or contact preferences. It takes "can I see a segment of prospects in the Los Angeles area" to "can I see a segment of prospects within the retail industry with headquarters in the Los Angeles area who have attended webinars and also have an average deal size of over $1 million year to date."
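To make that contrast concrete, here is a purely illustrative sketch of the kind of hand-built filter that second question implies. The field names and records below are hypothetical, not the Dynamics 365 schema or CoPilot's actual output; the point is how many structured conditions hide inside one natural-language sentence:

```python
# Hypothetical prospect records — illustrative only, not real Dynamics 365 data.
prospects = [
    {"name": "Acme Retail", "industry": "Retail", "hq_city": "Los Angeles",
     "attended_webinar": True, "avg_deal_size_ytd": 1_250_000},
    {"name": "Globex", "industry": "Finance", "hq_city": "Los Angeles",
     "attended_webinar": True, "avg_deal_size_ytd": 2_000_000},
]

# The natural-language question "prospects in retail, headquartered in Los
# Angeles, who attended webinars, with average deal size over $1M YTD"
# becomes four explicit conditions a marketer would once have had to encode.
segment = [
    p for p in prospects
    if p["industry"] == "Retail"
    and p["hq_city"] == "Los Angeles"
    and p["attended_webinar"]
    and p["avg_deal_size_ytd"] > 1_000_000
]

print([p["name"] for p in segment])  # only Acme Retail matches every condition
```

Each added clause in the question maps to another predicate the marketer previously had to know the schema for; a natural-language interface moves that translation work into the tool.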

While CoPilot has easily stepped into the spotlight in recent Microsoft Dynamics-centric announcements, what caught my eye even more was Microsoft’s move to center both the data AND the engagement layers marketing relies on by bringing Dynamics 365 Marketing and Dynamics 365 Customer Insights into one solution, now dubbed Dynamics 365 Customer Insights. This shift brings all of the engagement and experience tools that had existed across the Marketing offerings into a natively integrated, robust customer data platform (CDP) that sits at the ready to ingest, harmonize and normalize customer data and then accelerate the shift to execution and engagement. It intentionally and profoundly connects the stores of data that represent knowledge of the customer with the systems of engagement that power experience delivery.

Yes…in the analogy of who controls the music, what Microsoft envisions here is better integration of car and fuel so that the AI-powered CoPilot has an even greater opportunity to connect to, synthesize and deliver recommendations and actionable intelligence that make the entire driving experience that much better. While the driver is still in total control, there is more information readily at hand, along with recommendations and ambient experiences that make driving easier and perhaps more rewarding.

But for any driver…I mean marketer…looking out at all these AI innovations, including Microsoft’s CoPilot features in Dynamics 365 Customer Insights, now is the time to raise the question of strategy and deployment. AI, especially generative AI, still requires guardrails to ensure commercial viability and enterprise readiness. It demands a massive corpus of data to train and refine direction and outcomes. It will still require the brilliance, creativity and judgment of the marketer to turn questions and ideas into profitable strategies and executions. The goal of AI for modern marketing is to have a new breed of amazing interns/assistants/copilots, capable of a decision, content and insight velocity that can move us well beyond the limitations of human scale.

My recommendation to marketers is to start driving with AI the same way you started driving a car…you learn, you practice. You don’t jump into a Ferrari and speed onto the highway on your first ever drive. Instead, you drive mom’s old car in circles in a massive parking lot until you build confidence, knowledge and skill. Start small, but keep scaling, and know that not every suggestion will be a win…and not every AI-derived recommendation needs to be followed. These are the early days…the days when the training wheels most definitely stay on.

But these are also the days when you just roll down the window, ask your CoPilot to find something new that you can sing along to, stick your hand out into the wind and just enjoy the ride.
