Results

A nuanced view of AI data center buildout concerns

Microsoft CEO Satya Nadella and Meta CEO Mark Zuckerberg riffed on AI, productivity and building out infrastructure this week at the LlamaCon developer conference and inadvertently touched on concerns about the great AI infrastructure buildout.

Nadella, who was on stage with Zuckerberg, talked about AI as a productivity enhancer as great as electricity was. Nadella said:

"AI has the promise to deliver real change in productivity and that requires software and management change. People have to work with it differently. What happened with electricity? It was there for 50 years before people figured out they had to really change the factories to use electricity differently. It was a famous Ford case study. To me, we are somewhere in between. I hope we don't take 50 years. Tech has to progress and you have to put that into systems to actually deliver the new workflow."

Zuckerberg, who is doubling down on AI spending, quipped:

"We're all investing as if it's not going to take 50 years so I hope it doesn't take 50 years."

That exchange largely sums up concerns about the AI data center buildout as Google, Meta, Microsoft and Amazon report results, followed by Nvidia a few weeks from now. Most of the consternation about data center overcapacity revolves around whether the insatiable demand for Nvidia GPUs will continue. Spoiler alert: That demand won't continue. All of Nvidia's big buyers are also developing custom silicon in addition to Jensen Huang's stack. Supply and demand will balance out.

However, the data center buildout is a far more nuanced topic. GPU demand could stagnate and even fall, but the big picture is that spending on newfangled AI factories will continue. Why? Time frames are vastly different. The AI-optimized system is just one part of the equation. You could stop buying AI gear tomorrow and still need to invest in data center shells, power and cooling and building permits for the next three years. Once those other parts of the AI factory are in place, you can simply buy Nvidia gear three or four generations from now.

Simply put, buying Nvidia's latest and greatest is an annual decision. Building a functioning data center is a three- to five-year decision.

With that backdrop in mind, I took the pulse of the data center buildout and aggregated comments from CxOs at every level of the data center stack. Here's a look at the nuance.

CenterPoint Energy CEO Jason Wells:

“Since we submitted our forecast to ERCOT at the end of January, our load interconnection queue has grown by another seven gigawatts through 2031. This represents a nearly 20% increase in load interconnection requests in a little more than two months.

This significant increase is driven by a diverse set of load growth factors, including industrial customer demand, data centers and transportation electrification projects.

You had a seven-gigawatt increase in our interconnection queue that I mentioned. So, on the fourth quarter call, a little over two months ago, we said that we had a 40-gigawatt interconnection queue. Now it's 47 gigawatts; six of that is related to incremental data center demand, and our data center queue is now roughly 20 gigs.

We’re starting to grow a larger ecosystem here in the Greater Houston area. We've had some really high-profile high-tech manufacturing announcements, with Foxconn, Apple and NVIDIA all looking at rapidly expanding their production of server racks, effectively everything but the chips.

And I think that significant investment in that kind of data center ecosystem is also continuing to attract data center demand. And so, it has really been an explosive level of growth for us, really starting back to last summer.”

Alphabet CFO Anat Ashkenazi:

"With respect to CapEx, our reported CapEx in the first quarter was $17.2 billion primarily reflecting investment in our technical infrastructure, with the largest component being investment in servers, followed by data centers to support the growth of our business across Google Services, Google Cloud, and Google DeepMind.

In Cloud, we're in a tight demand-supply environment and, given that revenues are correlated with the timing of deployment of new capacity, we could see variability in Cloud revenue growth rates depending on capacity deployment each quarter.

We expect relatively higher capacity deployment towards the end of 2025. Moving to investments, starting with our expectation for CapEx for the full year 2025. We still expect to invest approximately $75 billion in CapEx this year. The expected CapEx investment level may fluctuate from quarter-to-quarter, due to the impact of changes in the timing of deliveries and construction schedules."

PG&E CEO Patti Poppe:

“We've updated our data center project pipeline from the year-end call. And as you can see, our pipeline has grown from 5.5 gigawatts to 8.7 gigawatts.

We are privileged to serve California, including the Bay Area, which has the fiber network enabling speed and reliability for data center customers and also the density of talent needed to maximize the potential of artificial intelligence.

We have 1.4 gigawatts in final engineering comprised of 18 projects. To-date, these have not been the mega data center project designed to power large language learning models. Rather, the demand in our service area has been mostly from customers looking to power inference models which are driving value for their businesses, true Goldilocks demand, big enough to matter, not so big that it's a problem.

And what's exciting about Northern California, and this is why we call it Goldilocks, is that these are inference model size data centers. The bulk of them are the 100-megawatt or so inference model data centers, imagine a data center that's designed to do -- to serve multiple tech companies who are using AI in their daily business, so they need more compute power. That's what we're able to serve.

So these aren't single silver shovel projects as we talk about. These are a variety of smaller projects that will go through because that demand for compute power is real, particularly here in the Bay Area where we have this density of technical talent who can leverage AI. So this trend is absolutely real for us."

Digital Realty Trust CEO Andy Power:

“Despite the headlines, demand for data center capacity remains strong and our value proposition continues to resonate, evidenced by nearly $400 million of new leasing completed in the quarter, or $242 million of new leasing at Digital's share, with healthy contributions from both our major product categories.

This year we announced our first U.S. Hyperscale Data Center Fund, continuing to evolve our funding model and further expanding the pool of capital available to support the growth of hyperscale data center capacity. The fund offers a unique opportunity for private institutional investors to invest directly in hyperscale data centers alongside the world's largest data center provider. It is dedicated to investing in high-quality hyperscale data centers located across top-tier US metros, including Northern Virginia, Dallas, Atlanta, Charlotte, New York Metro and Silicon Valley.

We've seeded the portfolio with five operating assets and four land sites for data center development and have received very strong interest and limited partner commitments from some of the world's savviest investors, including sovereign wealth funds, pension funds, insurance companies, endowments and other institutional investors. The fund will support approximately $10 billion of hyperscale data center investment, enabling us to serve the robust demand of our customers, while enhancing our returns through fees.”

Schneider Electric CFO Hilary Maxson:

“In data center, we continue to see strong double digit demand with continued strength in North America and East Asia. This strong growth trend already starts to include adjustments made by certain customers and is indicative, we believe, of the true underlying trend for data centers, which we expect to continue aligned with the expectations we shared at our Capital Markets Day and again in 2024.

We also expect continued strong demand for systems led by data center and infrastructure projects.”

Schlumberger Limited CEO Olivier Le Peuch:

“We also saw continued growth momentum in our data center infrastructure solutions business in this region.

Customers are accelerating the adoption of digital and AI solutions to extract further efficiencies and performance across the upstream lifecycle -- both in planning and in operations across development and production.

Over the past 2 years, we have engaged hyperscalers, whom we partner with in digital, to unlock new opportunities for our business through the development of data centers. This resulted in a significant contract award for the provision of manufacturing services and modular cooling unit, which we are currently fulfilling.

Based on our performance and unique capabilities, we are also gaining access to a new opportunity pipeline, and we are expanding our technological offerings with low-carbon solutions to serve new potential customers. Overall, this is a very exciting and fast-growing market driven by AI demand, and we expect it to contribute to our diversified exposure beyond oil and gas in the coming years.

By the end of this year, we will have supported more than 15 data center solutions across the US. That’s something we're proud of. You can see all of this as a sum of business that will grow at a higher rate than the oil and gas sector for the years to come.”

FirstEnergy CEO Brian Tierney:

“We remain excited about the data center development we are seeing across our footprint. Our plan through 2029 includes 2.6 gigawatts of data center demand that is active or contracted with more in the project pipeline that would be incremental to our base plan.

Earlier this month, Meta announced an investment of more than $800 million to build their new Bowling Green data center in our Toledo Edison service territory and is expected to come online by the end of the year. This data center will be optimized for Meta's AI workloads. The transmission CapEx associated with this facility is included in the current capital plan.

In the first quarter of this year, we received 15 large load study requests for data centers, representing approximately 9 gigawatts of load. 11 of these studies are for locations in Pennsylvania and Ohio. We have not experienced any slowdown of data center interest in our service territory. We are also excited about the significant growth opportunities for transmission investment.”

GE Vernova CEO Scott Strazik:

“We start with the markets. We continue to see very strong end markets in Power and Electrification. Put simply, the world is entering an era of accelerated electrification, driven by manufacturing growth, industrial electrification, EVs and emerging data center needs, which is driving an unprecedented need for investment in reliable baseload power, grid infrastructure and decarbonization solutions.

As supply chains become more decoupled with more redundancy built into the global system to manage trade complexities, this manufacturing build-out creates incremental demand for additional electrons. To put today's investment super cycle into perspective in terms of energy needs and decarbonization, the scale of load growth we're seeing in North America is the most significant since the post-World War II industrial build-out.”

CBRE CEO Bob Sulentic:

“As it relates to data centers is we're a service provider, not an owner. And we had a really good quarter with data centers. We have a data center services business that includes an M&A deal we did called Direct Line which provides some additional technical services that go beyond what we had done historically. Very good quarter for that business. That acquisition is meaningfully outperforming underwriting.

Turner & Townsend is a big project manager in the creation of new data centers. They have a lot of work going on. They have seen some pullback from some of the hyperscalers that we've all read about. But they're pretty much at capacity in terms of their ability to do that kind of work. So you're not seeing that impact us. Trammell Crow Company has kind of a unique role in the data center world. They are in a position to acquire land and create the improvements you need on land, both the kind of soft improvements in terms of entitlements and then the hard improvements in terms of gaining access to electricity that makes that land a lot more valuable than it was when they acquired it. We've seen a good start to the year and expect it to be a strong finish to the year.”


PayPal eyes agentic commerce platform, but first has to retire tech debt

PayPal has big plans to transform from a payments company to a commerce platform that can carry out buying decisions. But first, the company will have to retire tech debt from acquisitions and a siloed data architecture.

Speaking on PayPal's first quarter earnings call, CEO Alex Chriss laid out the strategy, which revolves around "agentic commerce":

"PayPal is transforming from a payments company to a commerce platform. This includes expanding to be available everywhere, whether it’s online, in store or agentic. This means moving from a one-size-fits-all experience to personalized experiences that leverage the vast data at our fingertips.

Imagine what a future would look like where AI agents could bring up the right products at the right time and complete your purchase. Now any business can create agentic experiences that allow customers to pay, track shipments, manage invoices and more, all powered by PayPal and all within an AI client."

Indeed, PayPal launched a remote Model Context Protocol (MCP) server so developers can integrate with PayPal APIs to create experiences. PayPal reported first quarter earnings of $1.29 a share on revenue of $7.8 billion, up 1% from a year ago. PayPal ended the quarter with 436 million active accounts and saw strength in its Venmo brand. PayPal maintained its earnings guidance.
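
For developers, here's a rough Python sketch of what talking to a remote MCP server can look like, using the open-source MCP Python SDK. The server URL and the tools it returns are placeholders for illustration, not PayPal's documented endpoint.

    # Minimal sketch using the open-source MCP Python SDK (pip install mcp).
    # The URL below is a placeholder, not PayPal's documented endpoint.
    import asyncio
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async def main() -> None:
        async with sse_client("https://example.com/paypal-mcp/sse") as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Discover the commerce tools the server exposes (pay, track shipments, invoices).
                tools = await session.list_tools()
                for tool in tools.tools:
                    print(tool.name, "-", tool.description)

    asyncio.run(main())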


To get there, PayPal has to get to one platform and it's a heavy lift.

One data pool, one customer profile

On PayPal's February investor day, CTO Srini Venkatesan noted that the company has 430 million consumer and merchant accounts across 200 markets and trillions of interactions. PayPal is sitting on "a valuable data vault that is more than 500 petabytes," said Venkatesan.

He said:

"PayPal’s vast and unmatched two-sided network creates a rich tapestry of insights across millions of merchants, unlocking a holistic view of the customer well beyond a merchant’s typical view. This level of intelligence helps the merchant personalize that delights our customers with rewarding shopping experiences, creating a powerful flywheel effect."

The catch? The data is scattered in disparate systems.

"Today our data is scattered across disparate systems. This slows us down. Unifying our technology is my number one priority. I’m proud to share we have already started with four initiatives: One Platform, One Profile, One Process, all Powered by AI," said Venkatesan.


Venkatesan said PayPal's pickle was the result of multiple acquired products. Each product had its own infrastructure for speed. That speed was needed at the time, but it's slowing PayPal down in an AI-driven world. PayPal needs its units to share capabilities and provide standard customer onboarding processes, features and preferences.

"Our North Star is One Platform: a common chassis that we orchestrate behind the scenes, unseen by the consumer. Our core capabilities move up to the chassis. We can build once and use for all," said Venkatesan. "This allows us to bring best-of-breed services to market faster for every customer, no matter which product they are using."

PayPal is unifying various products on one platform throughout 2025 and into 2026. The big plan is to get to one single view of the customer.


Venkatesan explained four experiences that One Profile can enable:

  • Seamless interoperability across products, say Venmo and PayPal.
  • Opportunities for customers can be extended across products.
  • Shopping experiences can be personalized.
  • Customer support interactions can be resolved faster since PayPal will have more visibility to be proactive.

"We are modernizing our systems, are on a journey to be cloud native to provide our customers and merchants with the fastest experience. This will be one of our most impactful modernization efforts to date," said Venkatesan.

Cloud native to enable AI

PayPal historically took a lift-and-shift approach to the cloud, but its applications must become cloud native. PayPal's Braintree unit is already cloud native and has seen a 20% reduction in service latency and 2x to 3x faster time to market. Braintree can go from concept to production in less than 30 days today vs. three to four months before.

Venkatesan said:

"As we unify our platform and modernize our infrastructure, we are also standardizing on one unified development process. Over the past six months, we have streamlined a complex development system to enable faster releases and accelerated speed to market.

Today our top applications have moved to this process. And here are the results we are seeing: 50% improvement in lead time, time from concept to delivery, 40% increase in speed of build time. This has enabled our check out app to move from weekly releases to daily, up to multiple times a day. In the next few months, all developers will be on the same process."


Venkatesan added that cloud native architecture not only enables AI in the future, but also speeds up PayPal's transformation. He said AI agents are helping developers create test use cases and update legacy code.

AI is also being used to power virtual assistants for employees, analyze compliance cases and, ultimately, enhance customer experiences.


Venkatesan said agentic AI will drive commerce.

"People are increasingly using Gen AI for shopping research. However, today there is no option to make a purchase. We have already started development on an agent within the PayPal app. Customers can prompt the agent to research what they need for a camping trip, or ask it to reorder a recent purchase. With PayPal’s profile and identity context, you can seamlessly place your order. Likewise, our strategic partners could leverage PayPal agent to augment their context and to complete the purchase."


UiPath launches next-gen platform: 'Agents think. Robots do. People lead.'

UiPath launched the latest version of its platform with a focus on "agentic automation," an approach that encompasses AI agents but acknowledges that technologies like robotic process automation and even people fit into automated processes.

Like offerings from other vendors, the next-gen UiPath Platform aims to be an orchestrator of agentic AI, but it can also leverage RPA, process mining and other tools. The thinking behind the approach echoes what BT150 members at Constellation Research have noted: in many cases, older technologies can be more effective automation tools with better returns.

According to UiPath, managing reliable AI agents, robots and people will be critical to enterprises. The company said UiPath Platform combines "decades of leadership in automation with a new, agentic architecture that is purpose-built for business-critical workflows."

The UiPath Platform launch also nods to observers who have argued that agentic AI is interesting, but really a tool that is in the larger automation mix.

CEO Daniel Dines said the launch of the next-gen UiPath Platform is the company's second act. "We’ve built a platform that unifies AI, RPA, and human decision making so companies can deliver smarter, more resilient workflows without added complexity. As models and chips commoditize, the value of AI moves up the stack to orchestration and intelligence," said Dines, who added that enterprises realize there are three actors delivering a process--robots, agents and people.

UiPath Platform is adding the following:

  • UiPath Maestro, an orchestration layer that automates, models and optimizes business processes with process intelligence and KPI monitoring.
  • A controlled agency model that provides guardrails for AI agents via governance, real-time vulnerability assessments and data access controls.
  • Tools for developers to build automation with low code to full code.
  • Integration with third party agent frameworks such as LangChain, Anthropic and Microsoft. UiPath supports Model Context Protocol (MCP) and Google Cloud's Agent2Agent (A2A).

In a blog post, Dines was realistic about agentic AI. He said:

"I don't remember the last time so many people were so energized about a technology that could substantially change the way we work—enter agentic AI. In part, this is due to people using LLMs on a daily basis, hence assuming that AI models will be able to autonomously execute all the tasks we want them to soon enough.

The reality is that enterprises have yet to go past PoCs, and AI adoption is present in the automation of isolated tasks. That is because enterprise workflows have become more complex, with the average large company using over 175 different applications and systems. A business process has deterministic and non-deterministic workflows, which require different models in order to successfully automate them."

UiPath's next-gen platform has been in private preview since January and has so far seen thousands of autonomous agents created, 450 partners engaged and hundreds of customer use cases identified.

 

 


Meta launches Llama API, Meta AI app

Meta has made a lot of headway with its open source Llama family of large language models, but with increasing competition from China's DeepSeek and Qwen models, the company is ramping up distribution.

For enterprises, the biggest news out of Meta's LlamaCon AI developer conference was the Llama API, which is in limited free preview. With the API, developers will be able to better experiment with Llama models and pair them with the company's software developer kits.

Pricing wasn't available for the Llama API, but it's clear that Meta is thinking about ways to monetize its flagship model.


Key takeaways about Llama API:

  • The API has one-click API key creation and playgrounds to explore models, including Llama 4 Scout and Llama 4 Maverick.
  • SDKs in both Python and TypeScript will be available for building Llama apps.
  • Llama API is compatible with the OpenAI SDK (see the sketch after this list).
  • The API will include tools to evaluate and tune custom Llama models starting with Llama 3.3 8B.
  • Meta said it is collaborating with Cerebras and Groq to speed up inference on Llama 4 models using Llama API.
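
Because of that OpenAI SDK compatibility, calling the Llama API should look familiar to most developers. Here's a sketch under assumptions: the base URL and model identifier below are illustrative placeholders, not values confirmed by Meta.

    # Sketch of an OpenAI-compatible call (pip install openai).
    # base_url and model are illustrative placeholders, not confirmed values.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://api.llama.example/v1",   # hypothetical Llama API endpoint
        api_key="YOUR_LLAMA_API_KEY",
    )

    response = client.chat.completions.create(
        model="llama-4-maverick",                  # illustrative model identifier
        messages=[{"role": "user", "content": "Give me one takeaway from LlamaCon."}],
    )
    print(response.choices[0].message.content)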


On the consumer side of the Llama equation, Meta launched a standalone Meta AI app. The previous strategy for Meta AI revolved around infusing it into the company's family of applications. If you wanted Meta AI, you'd have to use it through Facebook, Instagram or WhatsApp. The problem is that some of us don't like any of those apps. Now you'll be able to use Meta AI on its own.

The Meta AI app will run on Llama 4 and feature text and voice interfaces. The app will also have Meta AI features such as image generation and editing.

 


Freshworks CEO Woodside on strong Q1, outlook, resilience

Freshworks delivered a strong first quarter and upped its outlook for the second quarter and the year, and CEO Dennis Woodside said the company is well positioned despite economic uncertainty.

The company, which offers employee and customer experience software, broke even on a first quarter earnings per share basis on revenue of $196.3 million, up 19% from a year ago. Non-GAAP earnings were 18 cents a share.

Wall Street was expecting earnings of 13 cents a share on revenue of $191.9 million.

Freshworks said it had 23,275 customers with more than $5,000 in annual recurring revenue, up 13% from a year ago. Freshworks has 73,000 customers across multiple industries.

As for the outlook, Freshworks projected second quarter revenue of $197.3 million to $200.3 million, up 13% to 15%, with non-GAAP earnings of 10 cents a share to 12 cents a share. For 2025, Freshworks is projecting revenue of $815.3 million to $824.3 million, up 13% to 14%. Non-GAAP earnings for the year were projected to be 56 cents a share to 58 cents a share.

I caught up with Woodside to talk about the quarter and Freshworks strategy. Here are the takeaways.

The market. Woodside said the company is focused on employee experience (EX), which encompasses its IT service and IT asset management businesses. In EX, Woodside said Freshworks is focused on the midmarket--think of a 5,000-employee company--and often competes with ServiceNow as well as Atlassian. "Midmarket companies need to automate their IT department, but a ServiceNow implementation is too heavy and expensive," said Woodside.

On the customer experience (CX) side, Freshworks has its customer support offerings and often competes with Zendesk in a fragmented market. "CX skews more SMB, but we're taking that upmarket too," said Woodside, who noted that a big division of Airbus and S&P Global are Freshdesk customers.


The value proposition. Woodside said the value proposition for Freshworks is offering software that's not complicated and is half the cost of larger legacy competitors. The bet is that value proposition plays well in uncertain economies. "We haven't seen changes in customer behavior, but if we do see a recession we think we're well suited because customers are going to need to save money," said Woodside. "As contracts with legacy providers come up enterprises are going to look for alternatives. We also have the AI and automation."

Agentic AI is early; get AI returns now. Woodside said there is a lot of promise in agentic AI, but it's early days. Customers are seeing real business value in using AI to deflect that first ticket and in using Freddy Copilot to enhance productivity. Freshworks has 2,700 customers using its Freddy Copilot and AI features, so it's still early in the AI adoption game.

Being model agnostic. Woodside said Freshworks has built a layer to swap various large language models as they develop. Its Freddy AI is really a set of more than 70 capabilities "and we don't rely on any single model for these features," said Woodside. "When we look at the future, we're trying to use the best model for the best use case. Today we have introduced 40 different models." Freshworks, which leverages Amazon Web Services as a cloud base, uses OpenAI via Microsoft Azure for conversational use cases, Google Cloud for image capabilities, Meta's Llama for other things and its own proprietary model.
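
To make the model-agnostic idea concrete, here is a toy Python sketch of the routing pattern Woodside describes, where each use case maps to whichever model currently serves it best. The provider functions and routing table are illustrative, not Freshworks' actual implementation.

    # Toy sketch of a model-agnostic layer: route use cases to interchangeable models.
    # Provider functions and routes are illustrative, not Freshworks' implementation.
    from typing import Callable, Dict

    ModelFn = Callable[[str], str]

    def azure_openai_chat(prompt: str) -> str:     # e.g., conversational use cases
        return f"[azure-openai] {prompt}"

    def google_image_model(prompt: str) -> str:    # e.g., image capabilities
        return f"[google-cloud] {prompt}"

    def llama_model(prompt: str) -> str:           # e.g., other workloads
        return f"[llama] {prompt}"

    ROUTES: Dict[str, ModelFn] = {
        "conversation": azure_openai_chat,
        "image": google_image_model,
        "other": llama_model,
    }

    def run(use_case: str, prompt: str) -> str:
        # Swapping a model for a use case is a one-line change to the routing table.
        return ROUTES[use_case](prompt)

    print(run("conversation", "Help me reset my password"))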

Navigating an uncertain economy. Woodside previously noted that Freshworks is poised to do well in a recession, but it's worth noting that the company provided an annual outlook. Many companies are withholding guidance for 2025 or just providing a quarterly view. "We don't know what's going to happen, but we wouldn't have raised guidance if we didn't have confidence in the business and the trends we're seeing," said Woodside. "We're seeing pretty consistent demand."

 


Tableau Conference 25: Tableau Redefining BI/Analytics in the Agentic AI Era

The event centered on "data and analytics in the agentic era," with Tableau Next, built on the Salesforce platform, as its flagship. Amid the fanfare of the DataFam at the annual Tableau Conference, Tableau also announced a bevy of new features for Tableau Cloud, Tableau Server and Tableau Desktop, plus GA dates for components of Tableau Next, which was initially announced in 2024 at Dreamforce and last year's Tableau Conference.

Why this matters: Salesforce and Tableau are building an AI-native decision layer powered by semantics, embedded agents, and real-time workflows. This marks Tableau’s shift from business intelligence to business orchestration.

Highlights of What Tableau Announced

Key GA announcements and timelines include:

  • Tableau Semantics: Now generally available (GA). Integrated with Salesforce Data Cloud, Tableau Semantics provides centralized metrics, labels, and relationships to support natural language questions and semantic queries for both analysts and AI alike. 
  • Agentforce skills (Data Pro and Concierge): GA in June 2025. These assist in data modeling, prep, and conversational analytics using natural language.
  • Agentforce skills (Inspector): Beta in Q2 2025 to monitor key business metrics, get alerts, and actionable insights on KPIs and anomalies.
  • Tableau Agent is coming to Tableau Public to guide AI-powered dashboard creation.
  • Internal marketplace: GA in Q3 2025 for team-based content and agent sharing.

Tableau reassured its customer base by announcing continued investment in Tableau Cloud, Server, Public, and Desktop. Tableau’s CPO, Southard Jones, said he dedicated over half of his development resources to delivering over 130 features (see figure below) to those platforms in the last 12 months. He committed to the audience, and later shared with analysts, a continued roadmap of investments across the Tableau product lines. This included announcing the Tableau Blueprint to help its customers lead the change toward an agent-powered future.

The audience cheered new features on Tableau’s current platforms, such as Authoring Extensions API, which opens up a variety of automation tools and visualization extensions; VizQL Data Services, to allow customers, partners, and community members to integrate Tableau platform APIs into open-source AI frameworks; Tableau Pulse Research Assistant for AI-assisted analysis; dark mode (of course); enhanced accessibility features; and Google Sheets integration. Unsurprisingly, the data analysts’ quality of life features received the biggest applause, particularly items like Recycle Bin, which can restore deleted items.

Figure: Continued investment across Tableau’s family of products over the last 12 months. Source: Tableau.

 

My POV: Delivering the Operating Model For Data & AI

  1. Establishing a Data and AI platform edge: Tableau Next is built atop the Salesforce platform to deliver a vertically integrated stack that spans CRM, Agentforce, Data Cloud, Slack, and now, analytics. Tableau Next fills the critical gap between structured enterprise data and dynamic, embedded decision support to form a platform that standalone BI and SaaS application vendors would struggle to match. 
  2. Delivering value to all customers: Tableau Next means deeper integration, more value from existing data, and a clear path to embedding AI-powered insights directly into everyday workflows. For Salesforce customers, it makes it easier to get more from their CRM investment without switching tools, streamlining how decisions are made across sales, service, and operations rather than using platforms like Power BI or Looker. For non-Salesforce customers, it provides an analytics platform ready for agentic AI. 
  3. Sending a message on protected roadmaps: Tableau Next sends a clear message to Salesforce customers: Tableau Next is your go-forward platform for agentic analytics. Equally, for the non-Salesforce Tableau user base, there’s reassurance with a clear roadmap: you don’t need to migrate from its leading Tableau Server or Tableau Cloud offerings to provide a foundation for analytic agents or access AI capabilities (summaries, data exploration, and analytics generation agents), but the agentic-building future is clearly being built with Tableau Next on the Salesforce stack.
  4. … but with the need for clarity on which road to take: Tableau has built on-ramps for customers to use some of the new AI and agentic capabilities, such as leveraging Published Data Sources from Tableau Cloud and Server in Tableau Next, and Tableau Public getting Tableau Agent (see the figure below on the expanded Tableau family of products). Still, more communication will be needed over the next year on which platform a customer should use. Over half of the conference attendees interviewed were confused about which product lines had which features, what worked with what, and how to frame platform choices for a given use case. Coupled with usage-based pricing, those same customers were uncertain about how to move forward. At the same time, all expressed trust that Tableau would take care of them.

Figure: Tableau is now a multi-product line family with Tableau Next as the latest addition. Source: Tableau.

What Data and AI Leaders Should Do Next

As one Tableau partner said on the show floor: “[This is the] first year where it became clear you're going to need something different.”

For Salesforce customers: Evaluate Tableau Next as an obvious BI/analytics platform for a path to tighter integration, faster time-to-insight, and AI-native workflows. With Data Cloud, Agentforce, and Tableau Semantics now natively integrated, Salesforce customers should plan to shift focus from dashboards toward conversational analytics, smart alerting on anomalies and derived insights (e.g., using the Inspector pre-built analytics skill), or interactive insights embedded into CRM workflows. For sales, service, and marketing leaders, this means real-time insights without having to leave the context of their applications. The key is to pilot use cases like churn prediction or pipeline acceleration, where semantic metrics and CRM context already exist.

For Tableau customers not on Salesforce: The message from the conference is both reassuring and directional. Customers can continue to use Tableau Cloud and Server standalone. Tableau Semantics is available now, and in June, Tableau+ customers can also try the Tableau Next AI features: Concierge, Data Pro, and Inspector (Beta). By Fall 2025, Tableau Next will support connections to Tableau Cloud and Published Data Sources. However, there’s a strategic fork ahead: full access to agentic features and capabilities, such as Tableau Semantics and Agent skills like Data Pro and Inspector, depends on Salesforce infrastructure (e.g., Hyperforce, Data Cloud). As part of a long-term cloud data strategy, customers must evaluate data architecture and pricing plans to decide whether integration with Salesforce services or an alternative platform aligns better with their roadmap.

For organizations considering Tableau: Tableau is a leader in BI/analytics, delivering deep data visualization and storytelling capabilities with flexible deployment options spanning cloud and on-premises, a vibrant community, and a vision for agentic analytics in Tableau Next. While Tableau is committed to maintaining its standalone BI platforms, CIOs and CDOs should recognize that Tableau’s future foundational components are increasingly tied to the Salesforce Agentic Architecture, with greater reliance on components like Salesforce Data Cloud for data orchestration across data sources, the Einstein Trust Layer for secure and governed AI interactions, and Agentforce to drive action beyond dashboards. CIOs and CDOs should also pay attention to the emerging usage-based pricing models for Tableau Next to optimize their analytics investments.

Bottom Line: Tableau isn’t just evolving with better visualizations; it’s being redefined into a “question & answer” foundation supporting decisioning atop the Salesforce Agentic Architecture—both as a standalone BI platform and Salesforce’s AI-native analytics. The longer-term unlock won’t be dashboards: it will be agents and semantics driving real-time, context-aware decisions anywhere. More tactically, in the short and mid-term, Tableau provides multiple choices of platforms and onboarding points.

Based on your maturity and incumbent analytics solution, you have many combinations of solutions and potential future starting points. If you have trouble thinking this through, I would love to speak with you and help you work out your path. There are just too many considerations to put into a short blog.

What stood out to you most? Ping me, or drop your thoughts. 


Alibaba's Qwen picking up momentum

DeepSeek's family of large language models (LLMs) may have put the spotlight on China's AI ambitions, but Alibaba and its Qwen efforts may win out.

Alibaba's Qwen3 family launched, and its flagship model, Qwen3-235B-A22B, is competitive with DeepSeek-R1, OpenAI's o1 and o3-mini, Grok 3 and Gemini 2.5 Pro.

Two mixture-of-experts models, Qwen3-235B-A22B and Qwen3-30B-A3B, were released with open weights, and six other models are available under the Apache 2.0 license.

Qwen3 introduces a hybrid approach to problem solving, supporting a thinking mode, where the model takes time to reason step by step, and a non-thinking mode for quick answers to simple questions.

That hybrid approach means users control how much thinking the model applies, which also helps manage budgets and compute resources. The models support 119 languages and dialects. In addition, Qwen3 was trained on about 36 trillion tokens.
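
For a sense of how the user-controlled thinking toggle works, here is a Python sketch following the usage pattern published with Qwen3 on Hugging Face; the enable_thinking flag and model name come from that documentation but may change, so treat this as illustrative.

    # Sketch based on the usage pattern published with Qwen3 (pip install transformers).
    # The enable_thinking flag follows Qwen's documentation and may change.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "Qwen/Qwen3-30B-A3B"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto", device_map="auto")

    messages = [{"role": "user", "content": "How many primes are there below 50?"}]

    # Thinking mode: the model reasons step by step before answering.
    # Set enable_thinking=False for quick answers to simple questions.
    text = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True, enable_thinking=True
    )
    inputs = tokenizer(text, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=512)
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))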


In the short term, Qwen3 is the latest development in the LLM race, an almost daily game of leapfrog. In the long run, Qwen3 has a few built-in advantages. First, the LLMs are backed by Alibaba. Second, Qwen3 seems to be popular on Hugging Face and has some open source heft. And finally, Qwen is easily accessible on Alibaba Cloud, which is one of the dominant cloud providers to enterprises in Asia.

Add it up and Qwen has some Alibaba ground game advantages that may be more relevant to enterprises.


RSAC 2025: AI agents dominate new security features

AI agents are marching into the RSA Conference as CrowdStrike launched a set of agents for its Charlotte AI platform. Rest assured that more agentic AI layers will be added to security platforms. Google Cloud, IBM, SentinelOne, Cisco and others all made plays to make their security operations workflows and analysis more autonomous.

RSAC kicks off in San Francisco this week and cybersecurity vendors are outlining a bevy of agentic AI and automation tools on their respective platforms.

CrowdStrike announced Charlotte AI Agentic Response and Charlotte AI Agentic Workflows, two security operations tools that are designed to go with Charlotte AI Agentic Detection Triage.

With a portfolio of AI agents for the Charlotte AI platform, CrowdStrike is looking to offer autonomous reasoning over first- and third-party data. Just as cybersecurity vendors are looking to consolidate platforms, they are jockeying to be that automation layer.

CrowdStrike CEO George Kurtz said the goal is to "shift from reactive to proactive security." CrowdStrike announced the following:

  • Charlotte AI Agentic Response, which automatically asks and answers questions a security analyst would. The tool also analyzes root causes and guides next steps on investigations.
  • Charlotte AI Agentic Workflows, which use large language models in workflows to drop into automated playbooks based on policies.
  • Falcon Complete with Charlotte AI, which uses agents to triage alerts.
  • Charlotte AI Agentic Triage for Identity is added to Falcon Identity Protection.

What's unclear at this point is whether these AI agents are truly autonomous or simply speed up reasoning and response.


Other RSAC items of note:

  • Google Cloud launched Google Unified Security, which integrates threat intelligence, security operations, cloud security, secure enterprise browsing and Mandiant intelligence into one package. The company said the integrated stack will "simplify workflows, reduce toil, and empower analysts." In addition, Google Cloud outlined its vision for an agentic AI-powered security operations center. Mandiant's M-Trends report, based on 450,000 hours of incident investigations in 2024, was also released.


  • Cisco launched new threat detection and response tools for Cisco XDR and Splunk Security. Cisco XDR gets Instant Attack Verification, which integrates data from the Splunk platform and then uses AI agents to carry out plans and responses. The company also launched XDR Forensics for visibility into endpoint activity and XDR Storyboard to visualize attacks. The company also said Splunk Enterprise Security and Splunk SOAR 6.4 can be combined with Cisco XDR for enhanced network visibility and detection.
  • SentinelOne launched its Athena release of its Purple AI platform. The new release features agentic AI to offer orchestration, reasoning and analysis that a security analyst would. In addition, Purple AI Athena will open up the platform to third party security platforms and data lakes. Purple AI Auto Triage is also generally available. SentinelOne is looking to automate workflows across platforms by connecting to multiple data sources and embedding AI agents throughout. 
  • Minimus, an application security startup, launched its platform that's designed to eliminate 95% of CVEs from software supply chains. The company raised $51 million in a seed round from YL Ventures and Mayfield.
  • NetRise, which is also focused on software supply chain security, launched NetRise ZeroLens, which uses AI to summarize and remediate weaknesses in compiled code.
  • Bedrock Security announced its Model Context Protocol (MCP) Server, which will complement its Bedrock Metadata Lake Copilot. The idea is to secure AI agents and enterprise data as enterprises adopt the technology. Bedrock Security is providing a standardized gateway to data for AI agents with MCP Server.
  • IBM launched Autonomous Threat Operations Machine (ATOM), an AI agent system for threat triage, investigation and remediation. IBM also launched the new X-Force Predictive Threat Intelligence (PTI) agent for ATOM, which uses industry focused AI models for threat insights. 

Constellation Research's take

Chirag Mehta, a Constellation Research analyst at the event, said:

"RSAC 2025 marked a noticeable shift in the cybersecurity conversation from experimentation to execution. Across briefings and discussions, CISOs consistently highlighted the challenge of operational scale: too many tools, too much data, and not enough capacity to turn signals into timely decisions. As a result, the focus this year moved toward simplifying security operations, improving interoperability across platforms, and reducing friction caused by fragmented workflows.

There was also a clear maturation in how enterprises are approaching AI in security. The discussion has moved beyond curiosity and pilots toward practical questions around trust, governance, and measurable outcomes. Security leaders are prioritizing use cases that improve response time, reduce analyst fatigue, and strengthen foundational controls rather than chasing novelty. Identity security, exposure management, and data security emerged as core priorities, reinforcing that many security failures still originate from misconfigurations, unmanaged assets, and credential abuse.

Overall, RSAC 2025 reflected a more pragmatic market. Buyers are demanding accountability, clarity, and demonstrable impact. Vendors that can translate innovation into operational value, without adding complexity, are the ones most likely to earn long-term trust."


Palo Alto Networks acquires Protect AI, aims to secure AI ecosystems

Palo Alto Networks acquired Protect AI to expand its reach into securing AI and machine learning applications, launched Prisma AIRS to secure enterprise AI apps, agents and models, and updated its Cortex security operations platform.

The three-pack of announcements landed as the RSA Conference kicked off in San Francisco.

According to Palo Alto Networks, the purchase of Protect AI will broaden its reach into securing the multiple layers involved with AI-driven applications including models, agents, infrastructure, tools and APIs.

Terms of the deal weren't disclosed, but Geekwire put the price tag at $500 million or so. The deal is expected to close in Palo Alto Networks' fiscal first quarter.

Palo Alto Networks said the acquisition of Protect AI boosts its vision for Prisma AIRS. The company announced Prisma AIRS along with the Protect AI acquisition.

Prisma AIRS includes:

  • AI model scanning for vulnerabilities and risks including model tampering, malicious scripts and deserialization attacks.
  • Posture management to assess risks in enterprise AI ecosystems.
  • AI red teaming to discover exposure.
  • Runtime security to protect against threats such as prompt injection, malicious code and sensitive data leaks.
  • AI agent security to protect against emerging threats.

Separately, Palo Alto Networks outlined Cortex XSIAM 3.0, its next-gen security operations platform. Updates include:

  • Cortex Exposure Management, which surfaces vulnerabilities, prioritizes them and then remediates.
  • Cortex Advanced Email Security, which uses AI and analytics to detect advanced phishing and email threats.

The next generation of Cortex XSIAM will be available in Palo Alto Networks' fiscal fourth quarter.


Agentic AI: Everything that’s still missing to scale

Agentic AI is entering its next phase of hype as vendor conference season unfolds. Pick an enterprise vendor and you'll hear a keynote that revolves around AI agents.

It's AI agents everywhere. If we created a drinking game for every time "agent" was uttered, we'd be hammered. The big idea behind agentic AI will come to fruition, but more than a few missing elements need to fall into place.

Here's what is missing from the agentic AI stack today.

Standards. If AI agents live in one platform and are focused on one function, such as CRM, HR, service or sales, they can be useful. However, few workflows live in one data store under the control of one vendor. The reality is that these AI agents are going to have to communicate, negotiate, form workflows and execute tasks on their own.

That reality is why Anthropic created Model Context Protocol (MCP), an open standard for connecting AI assistants to the systems where data lives. OpenAI and a bevy of others are backing MCP. The need for standards is also why Google Cloud launched Agent2Agent, a communications standard with some big backers of its own. These efforts are a nice step toward connecting agents across the AI workflow chain, but more is needed.
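
To illustrate what MCP standardizes, here is a minimal Python sketch that exposes a business function as an agent-callable tool using the open-source MCP SDK; the order-status tool itself is hypothetical.

    # Minimal sketch of exposing a tool over MCP (pip install mcp).
    # The order-status tool is hypothetical; a real server would call a backend API.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("order-service")

    @mcp.tool()
    def get_order_status(order_id: str) -> str:
        """Return an order's status so any MCP-capable agent can query it."""
        return f"Order {order_id}: shipped"

    if __name__ == "__main__":
        mcp.run()  # serves the tool over stdio by default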

Constellation Research CEO R "Ray" Wang noted that AI agent interoperability needs to model itself after HL7 (Health Level 7), which is a set of international standards for exchanging electronic health information.

Another question worth asking via Constellation Research analyst Holger Mueller: Will these AI agent communication and automation standards be based on natural language or API calls or both?

Multi-agentic, cross-platform agents. That requirement is a mouthful, but with no standards in place and vendors focused on agentic AI within their own platforms, true agents working across platforms remain mostly theory. If agentic AI is going to work, agents have to be horizontal.

There's a reason why early agentic AI use cases and development have thrived at integrators relative to vendors. Integrators' core business is working across platforms; for vendors, not so much.

What we need to see is interoperability across MCP, Google Cloud's A2A and platforms from the likes of Boomi. In some ways agentic AI is like email back in the day--at first, email only worked within a company. Once multiple companies could email each other, new collaboration opened up.

The end of agent washing. Vendors hopped on the agentic AI bullet train in a hurry. Everything is now agentic. But there's a downside to this marketing bonanza, as surfaced by our BT150 CxOs. The downside? CxOs are glazing over at the AI agent claims. These BT150 veterans, who have seen cloud washing, followed by AI washing, followed by agentic AI washing, note that RPA may get the job done and that there are a lot of APIs masquerading as AI agents these days.


Horizontal use cases that can go vertical. While vendors have focused on AI agents that revolve around their go-to-market efforts, the natural progression for enterprises may be horizontal use cases that can form and then deconstruct based on what needs to be done.

Minimized agent sprawl and lock-in risk. Those realities point to a horizontal approach, and CxOs are likely to look to their hyperscale cloud providers as well as cross-system vendors like ServiceNow to orchestrate agents. The value will be in the orchestration layer, and neutral vendors are going to be valued.

Data products that are autonomous and can work with agents. To date, the prerequisite for agentic AI is to get your data all in one place--or just a few places. Perhaps that means you'll have just one or two data lakehouses instead of the six you have now.

Here's the problem: Enterprises have been chasing the one-data-store-to-rule-them-all strategy, and it hasn't worked. Constellation Research’s Mueller has said enterprises need to get down to a few primary data stores, and it's likely you'll still have a data estate that includes SAP, Databricks, Salesforce Data Cloud, Snowflake and hyperscale cloud platforms. If that lineup still sounds like a lot, consider that having four data platforms is way better than the 10 you have now. Congrats!

NextData, a startup led by Zhamak Dehghani, who created the data mesh concept, launched the NextData OS, a platform for building and operating autonomous data products. The idea is that these data products are decentralized and enable you to scale with agents without uprooting your infrastructure. "I hope to change this paradigm and frame of thinking that we have to have the data in one place because the moment you get there it's out of date," said Dehghani. "We want to have one way of getting access to data in a standard way."
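
Conceptually, a data product exposes a standard, discoverable interface at the source instead of copying data into a central store. Here is a rough Python sketch of what such a contract could look like; it illustrates the data mesh idea and is not NextData's actual API.

    # Conceptual sketch of a standard data product contract (data mesh style).
    # Not NextData's API; the fields and methods are hypothetical.
    from typing import Any, Iterable, Mapping, Protocol

    class DataProduct(Protocol):
        name: str

        def schema(self) -> Mapping[str, str]:
            """Describe available fields so agents and apps can discover them."""
            ...

        def query(self, where: Mapping[str, Any]) -> Iterable[Mapping[str, Any]]:
            """Serve fresh data at the source rather than from a central copy."""
            ...

    class OrdersDataProduct:
        """A hypothetical product owned by the team closest to the data."""
        name = "orders"

        def schema(self) -> Mapping[str, str]:
            return {"order_id": "str", "status": "str", "region": "str"}

        def query(self, where: Mapping[str, Any]) -> Iterable[Mapping[str, Any]]:
            rows = [{"order_id": "A-1", "status": "shipped", "region": "EMEA"}]
            return [r for r in rows if all(r.get(k) == v for k, v in where.items())]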

Dehghani's company is young, but an abstraction layer and operating system for data that works well with agents and multiple modes of access is on target. NextData's launch webinar featured Mars and Bristol Myers Squibb as customers.

An end to the agentic AI storyline that obsesses over humans. To date, agentic AI has been pitched as a way to scale labor. These digital labor pools will complement humans, but also restrain hiring.

Microsoft's annual report on the state of work envisions management roles that emerge to coordinate human and digital labor. Under these constructs, AI agents are mapped to human roles.

That AI agent-as-human thinking may be misplaced. Wang argues that the focus for agentic AI has to revolve around decision trees and automated decision-making. Mapping AI agents to human roles just scales what enterprises do today.


Process know-how. AI agents are a handy way to handle the process flows that drive ROI. Order-to-cash, procurement, supply chain and other core processes move the needle most on returns for enterprises.

What's lacking in a lot of these agentic AI pitches is process automation, process mining and intelligence. I've noted before that agentic AI is going to flop without a hefty dose of process.

The focus on process is needed because the biggest question facing enterprises is where to insert the human in agentic AI workflows. Answering that question is what unlocks ROI. If you see AI agents as a human labor replacement, you're missing the big process picture.

In the end, enterprises are a collection of human and digital processes that can be optimized and continually improved. Flashy agentic AI marketing isn't going to change that fact.
