Results

Hitachi Digital Services: A deep dive into what it does and its IT, OT, and AI strategy

Hitachi Digital Services is looking to drive growth by combining its ability to integrate operational technology (OT) and information technology (IT) with industry domain knowledge, data and artificial intelligence (AI) know-how, and cloud expertise.

In many ways, Hitachi Digital Services (HDS) is reintroducing itself to the technology industry following its late-2023 spin-off into an independent subsidiary of Hitachi. Here’s a look at HDS and everything you need to know from its first annual US Analyst & Advisor Day, May 20–21, 2025, in Frisco, Texas.

Background

HDS sits within the Japanese conglomerate Hitachi, which was founded in 1910 and has a long history in IT, R&D, and mission-critical systems across multiple industries. HDS was previously part of Hitachi Vantara before being spun off into a separate entity in November 2023. HDS focuses on cloud, data, Internet of Things (IoT), AI, and integration of OT and IT. Hitachi Vantara is focused on storage systems.

Within the parent company, HDS is part of Hitachi’s Digital Systems & Services group. Toshiaki Tokunaga, CEO of Hitachi, is betting on a new management plan, called Inspire 2027, to drive growth for the conglomerate.

In April, on Hitachi’s fourth-quarter earnings call, Tokunaga said the company can leverage its ability to offer IT, OT, and products together to “demonstrate our unwavering commitment to transforming Hitachi into a digital-centric company.” Hitachi’s strategy is to remain decentralized but leverage a digital core to create a more unified company across its Energy, Mobility, Connective Industries, and Digital Systems and Services units. Digital Systems and Services accounts for 28% of Hitachi’s revenue.

Hitachi’s operating model, called Lumada, has been in place since 2016 and has gone through several versions. Lumada 3.0 aims to combine Hitachi’s domain knowledge and capabilities and turbocharge them with AI.


Understanding where HDS sits in the conglomerate highlights how it can draw on R&D assets and capabilities across the parent to deliver cutting-edge systems. Even so, HDS isn’t exactly a household name in North America, which represents a small portion of Hitachi’s overall $65 billion in revenue.

Nevertheless, Hitachi’s technology is ubiquitous in global railways, hospitals, financial services firms, and other settings. HDS CEO Roger Lvin boils down the company’s mission: “If I distill our mission: We build, integrate, and operate mission-critical applications.”

Hitachi Digital Services CEO Lvin on AI transformation, operational technology, and use cases

“These mission-critical applications, often infused with what we refer to as the Japanese quality and Japanese process things, cannot go down and have real-life implications when they do not work,” says Lvin.

 


AI strategy

HDS offers multiple services such as advisory on processes, cloud migrations, and transformation roadmaps; smart enterprise technology for manufacturing, IoT, AI, and connected digital solutions; transformation services such as consolidation, migrations, process optimization, system integration, and automation; and managed services for applications, cloud, incident management, and other areas.

But the vendor conversation today starts with AI strategy. HDS executives briefed analysts on multiple parts of the business and key topics ranging from IoT and Industry 5.0 to ERP, but AI is the connective tissue across the company as well as parent Hitachi.

HDS’s AI strategy revolves around architecture and bringing proofs of concept to production for mission-critical applications. The strategy also aims to weave AI into operations throughout the Hitachi conglomerate.

What sticks out about HDS’s AI strategy is that it is decidedly practical and potentially future-proof, given its emphasis on architecture. AI isn’t about buzzwords but, rather, new tools for mission-critical applications.

Prem Balasubramanian, CTO of HDS, said the company didn’t want to set up a separate AI team, because it wanted to go with a holistic approach. “We wanted to establish a team that works with every facet of this company, integrating AI into the workforce and integrating AI into what we build,” he said.

Focused on taking proofs of concept into production, Balasubramanian said HDS didn’t want to chase frameworks for technologies that would be commoditized, such as retrieval-augmented generation (RAG), or even models. HDS’s strategy revolves around R2O2.ai, a set of prebuilt and reusable AI libraries and blueprints; responsible AI tenets; and HARC for AI, an end-to-end observability, security, and governance platform.


Hitachi Application Reliability Centers (HARC) for AI was announced in April as a service to help enterprises run AI and generative AI (GenAI) applications more reliably and efficiently. HARC for AI is designed to keep costs in check, limit performance degradation, and provide oversight of models.

Reliable, Responsible, Observable and Optimal AI, or R2O2.ai, was launched in late 2024 as a framework to bridge the gap between proof-of-concept projects and scalable AI deployments.

“We firmly believe that when you take an AI workload from a proof of concept and you really want to productionize it in an enterprise, you need to ensure that it’s responsible. You need to ensure it’s reliable. That the answers are correct consistently. You need to ensure your explainability and auditability, which is observability, and you need to ensure you’re spending the right amount of money on this,” Balasubramanian said.
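
HDS hasn't published the internals of R2O2.ai or HARC for AI, but the production checklist Balasubramanian describes (reliability, responsibility, observability, and cost) can be illustrated with a minimal sketch: a wrapper that records latency, estimates spend, and applies a simple release gate before an answer goes out. The model function, token prices, and required-terms check below are hypothetical placeholders, not HDS's implementation.

```python
import time
import logging
from dataclasses import dataclass
from typing import Callable

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ai-observability")

# Hypothetical per-token prices; real figures depend on the model provider.
PRICE_PER_1K_INPUT = 0.003
PRICE_PER_1K_OUTPUT = 0.015

@dataclass
class AuditedAnswer:
    text: str
    latency_s: float
    est_cost_usd: float
    passed_checks: bool

def audited_call(model_fn: Callable[[str], str], prompt: str,
                 required_terms: list[str]) -> AuditedAnswer:
    """Call a model function and record observability signals
    (latency, estimated cost) plus a simple reliability gate."""
    start = time.perf_counter()
    answer = model_fn(prompt)
    latency = time.perf_counter() - start

    # Rough cost estimate using word counts as a token proxy.
    est_cost = (len(prompt.split()) / 1000) * PRICE_PER_1K_INPUT + \
               (len(answer.split()) / 1000) * PRICE_PER_1K_OUTPUT

    # Placeholder "responsible/reliable" gate: the answer must mention
    # the terms a business rule requires before it is released.
    passed = all(term.lower() in answer.lower() for term in required_terms)

    log.info("latency=%.3fs est_cost=$%.5f passed=%s", latency, est_cost, passed)
    return AuditedAnswer(answer, latency, est_cost, passed)

if __name__ == "__main__":
    # Stand-in model: echoes a canned maintenance recommendation.
    fake_model = lambda p: "Schedule inspection of pump P-101 within 24 hours."
    result = audited_call(fake_model, "Summarize the sensor alert for pump P-101.",
                          required_terms=["P-101"])
    print(result)
```

In a production system, the gate would typically be a groundedness or policy check, and the audit record would flow into an observability platform rather than a local log.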

Architecture is also critical because it underpins how enterprises adopt AI agents. Balasubramanian argued that agents aren’t a separate category as much as a connector to existing systems. “The bulk of agentic AI is existing systems, and you have to integrate agents and AI into them to get more revenue, acquire customers, and retain them,” said Balasubramanian. “Our view of agents is that it’s not just about technology. We will use all the technology available, but we think of it in domains. One is industrial, vertical use cases. Another is applications and how they interact with the real world. Architecture is going to be the future.”

Balasubramanian said business value will be driven by delivery of applications that use AI agents seamlessly.

Chetan Gupta, a research and development leader at Hitachi, said the company’s research priorities revolve around simulation, reinforcement learning, and industrial and enterprise transformation.

“We believe AI essentially is a tool to transform enterprise operations and industrial operations, and that’s what we will enable,” said Gupta. “So the way we do things today will be different from the way things will be done tomorrow.”

The ultimate challenge is moving from proof of concept to production. For that, HDS is focused on training industrial models for specific use cases and reliability.

As for partnerships, HDS has worked with Nvidia on multiple use cases across logistics, manufacturing, energy, and transportation, but it remains vendor-neutral.


The cloud imperative

Balasubramanian walked analysts through the company’s cloud partnerships with hyperscale giants such as Amazon Web Services (AWS), Google Cloud, and Microsoft Azure; case studies; and use cases.

“Every use case that we’ve touched upon, we’ve had cloud data in it,” said Balasubramanian, who noted that HDS is running cloud services for multiple enterprises. HDS saved a large pharmaceutical company more than $20 million a year by optimizing its cloud infrastructure.

HDS will run cloud infrastructure, optimize it, and often add value with custom applications, said Balasubramanian. HDS has also moved multiple state and federal government agencies onto AWS FedRAMP cloud. In addition, HDS operates in the background for vendors servicing government customers.

Balasubramanian’s big takeaway is that cloud migrations are continuing. “We’ve got what we call a sprint to clouds. It’s essentially a way for us to quickly assess and help migrate to the cloud. We use some accelerators and some products that we work with,” said Balasubramanian. However, there are fewer lift-and-shift efforts and more modernization projects that move applications to the cloud to take advantage of AI agents.


Chris Ansert, executive manager of North American Quality Systems and Technology Platforms at Toyota North America, walked through the automobile manufacturer’s Quality1 program, which is a platform to ingest data about product quality issues and resolve them.

The project, which uses HDS services and AWS cloud infrastructure to modernize legacy systems, connects R&D, production engineering, purchasing, production, sales, and service functions in North America to create a feedback loop for quality issues.

Industry use cases

HDS’s secret sauce is integrating OT, IT, and industry use cases. With its domain expertise, HDS can apply AI and the latest technologies to manufacturing, supply chains, and transportation networks.

Ganesh Bukka, global head of HDS’s Industry 4.0 business, outlined the company’s thinking on industry use cases and how they failed to scale. Bukka’s talk revolved around whether Industry 4.0 was a brilliant failure that set up the next evolution of technology.

“For the last decade or so, we talked about Industry 4.0 and every manufacturer was going to revolutionize assets and manufacturing. And this is not just manufacturing. It could be anything in the asset-heavy or even some cases in the asset-light industry. A lot of these initiatives did not scale beyond the pilots,” said Bukka.

Why? Siloed processes proliferated due to digital initiatives, interoperability was challenging, and data and AI skills weren’t developed. Security and culture were other big issues, added Bukka.


“Industry 4.0 was all about technology, but the problem is that IT teams built OT and refused to acknowledge what OT teams wanted. IT built great dashboards, which could give you intelligence, but it could not act autonomously from that intelligence and put it into actions,” explained Bukka.

Bukka argued that HDS is in a unique position for Industry 5.0 applications, given that it’s a system integrator with expertise in melding OT, IT, and complex engineering.

For the next generation of industrial applications, Bukka said there are five success factors:

  • Human collaboration. Human/AI collaboration will be an important element, given that so many AI agents will come online.
  • Industrial AI. Hyperpersonalization driven by AI will be critical for creating digital operators.
  • Industrial edge computing.
  • Industrial metaverse via digital twins, data, and AI.
  • Sustainability.

Hitachi launched a digital factory in Hagerstown, Maryland, to build railcars. The institutional and process knowledge from other plants was incorporated into GenAI models, robotic models, and other models. The factory, built via a collaboration between Hitachi Rail, HDS, and Hitachi R&D, will be a showcase for the latest manufacturing technology and innovation.


Customers

HDS held multiple customer panels over the two days as well as breakout sessions for automotive, IT operations, cybersecurity, and enterprise applications. Some high-level takeaways:

  • Penske Transportation Solutions outlined a project with HDS to create a proactive diagnostics model that predicts vehicle failures. The project reduced downtime for a fleet of more than 400,000 vehicles.
  • Toyota Motor North America cited a collaboration on building a quality knowledge center using vector databases and AI models. The platform supports technical assistance for difficult repairs and has reduced cycle time and improved customer service (see the retrieval sketch after this list).
  • HDS and Verizon Business have a collaboration on healthcare IT and applications, with plans to expand it across other industries.
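
Neither Toyota nor HDS has detailed the knowledge center's architecture, but the vector-database pattern cited in the Toyota item above can be sketched in a few lines: historical quality reports are embedded as vectors, and a new issue description retrieves the closest past cases by cosine similarity. The toy hashing-based embedding and sample reports are illustrative stand-ins; a real system would use a trained embedding model and a managed vector store.

```python
import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy hashing-based embedding; a real system would use a trained model."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        idx = int(hashlib.md5(token.encode()).hexdigest(), 16) % dim
        vec[idx] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Illustrative historical quality reports (placeholders, not Toyota data).
reports = [
    "Brake pedal vibration reported at highway speed after rotor replacement",
    "Infotainment screen freezes intermittently in cold weather",
    "Coolant leak near water pump gasket on 4-cylinder engine",
]
index = np.stack([embed(r) for r in reports])

def search(query: str, k: int = 2):
    """Return the k most similar past reports by cosine similarity."""
    scores = index @ embed(query)
    top = np.argsort(scores)[::-1][:k]
    return [(reports[i], float(scores[i])) for i in top]

if __name__ == "__main__":
    for report, score in search("customer reports shaking brake pedal on the freeway"):
        print(f"{score:.2f}  {report}")
```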

Here’s a look at HDS’s reach across the automotive industry, followed by healthcare/life sciences.

[Chart: HDS reach across the automotive industry]

[Chart: HDS reach across healthcare/life sciences]

Takeaways

  • HDS’s expertise in mission-critical applications and IT/OT convergence could give the company a competitive edge as AI evolves from horizontal use cases to vertical ones.
  • As AI strategy and implementation become a boardroom issue in manufacturing, healthcare, life sciences, and energy, HDS’s approach could be valuable.
  • What remains to be seen is whether HDS can leverage its conglomerate roots seamlessly while remaining focused on its core mission to bring unique value to enterprises.

JPMorgan Chase’s IT, AI bets: Where the returns are

For JPMorgan Chase, the investment in technology and AI will never reach the finish line. Think of transformation as an ongoing project.

JPMorgan Chase has 84 million US customers and $4 trillion in assets under management. The bank's approach to data, modernization and artificial intelligence is worth watching given its scale and its sustained investment in technology.

Recent history:

Jamie Dimon, CEO of JPMorgan Chase, said during the bank’s Investor Day that the transformation work is never finished--and neither is the spending on technology. "It's table stakes. It will be for the rest of eternity. So our tech spend, I think, is, call it, 10% of revenues which is less than most other companies by the way," said Dimon. "In my experience, I think people make a mistake like somehow you're in one transformation and when you get through it, you're done. I've been doing this for a while and I've been through transformation after transformation after transformation, and we're learning. The hardest part is getting data into the form where it can be used properly and where these things belong."


As a result, JPMorgan Chase is spreading its tech bets. "We're building our own cloud-based data centers. We have our virtual servers. We are using all these other folks. We're going to be quite cautious on software-as-a-service, how we deal with cloud providers. I don't mind doing anything ourselves," said Dimon. "I think the mindset should be that whenever management teams meet, you're talking about what you need to do in technology writ large to do a good job for your client. We talk about AI all the time at every different level."

CFO Jeremy Barnum said JPMorgan Chase will spend $18 billion on technology in 2025, up $1 billion from 2024. Barnum did say that the company's modernization investment has peaked.

"We are now probably past the point of peak modernization spend, resulting in a tailwind this year that is funding some of our ongoing investments in products, platforms and futures. And we do continue to see volume growth across the company drive some increased hardware and infrastructure expense. This in turn is partially offset by efficiencies," he said. "The majority is spent on products, platforms and features."

Barnum said that retiring technical debt by moving to the cloud and modern infrastructure sets up the bank's AI strategy. JPMorgan Chase has about 65% of its workloads on the public or private cloud, up from 50% a year ago. "If we include the applications that run largely on virtual servers, that number increases to 80%. In addition, we have almost completed the application migrations for our largest legacy data centers and we are in the process of dismantling the physical infrastructure in those sites," said Barnum. "This progress in our modernization efforts continues to deliver significant engineering efficiencies, which we see through ongoing improvement in our speed and agility metrics, but we can't afford to fall behind."

Specifically, JPMorgan Chase can't afford to fall behind in AI, and the cloud is making it easier to deliver features faster. Here's a look at where JPMorgan Chase is getting the most bang for its AI dollar.

AI coding assistance for software engineers. Barnum said the accelerated adoption of AI coding is promising. "On a personal note here, I'll say that I've recently been indulging in what I've come to learn is known as “vibe coding,” a little bit, and it's actually pretty amazing," he said. "And from what certain of my colleagues tell me who are actually trained, professional computer scientists, it actually helps them quite a bit, too, with their efficiency. It's not just the amateurs who are helped by these tools. It's amazing stuff and we have high hopes for the efficiency gains we might get."

AI for operational efficiencies. Barnum said that a big AI use case is in the call center where algorithms can help agents anticipate and respond to questions faster. 

Democratized efficiency. JPMorgan Chase has a model-agnostic generative AI platform called LLM Suite. More than 200,000 employees globally have access and are saving several hours per week by spending less time on lower-value tasks. "We are starting to see a number of “citizen developer” use cases go into production. While we've made substantial progress over the last decade, we are still in the early stages of our AI journey. We are focused on modernizing data, investing in scalable platforms and being at the forefront of innovation as technology evolves, positioning the company for sustained future success," said Barnum.
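
JPMorgan Chase hasn't published how LLM Suite works internally; "model agnostic" generally implies an abstraction layer like the sketch below, in which application code targets one interface and a router chooses the backing model per request. The provider classes and routing policy here are hypothetical.

```python
from abc import ABC, abstractmethod

class ChatModel(ABC):
    """Provider-agnostic interface that application code targets."""
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

# Stub providers; a real deployment would wrap vendor SDKs here.
class ProviderA(ChatModel):
    def complete(self, prompt: str) -> str:
        return f"[provider-a] answer to: {prompt}"

class ProviderB(ChatModel):
    def complete(self, prompt: str) -> str:
        return f"[provider-b] answer to: {prompt}"

class ModelRouter:
    """Chooses a backing model per request, e.g. by task type or data sensitivity."""
    def __init__(self, providers: dict[str, ChatModel], default: str):
        self.providers = providers
        self.default = default

    def complete(self, prompt: str, task: str | None = None) -> str:
        name = task if task in self.providers else self.default
        return self.providers[name].complete(prompt)

if __name__ == "__main__":
    router = ModelRouter({"summarize": ProviderA(), "draft": ProviderB()},
                         default="summarize")
    print(router.complete("Summarize this meeting transcript.", task="summarize"))
```

The point of the design is that swapping or adding a model vendor touches only the router configuration, not the downstream use cases built on top of it.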

Digital engagement. Marianne Lake, CEO of Consumer & Community Banking at JPMorgan Chase, said the unit has been investing in tech, data, and AI to drive customer experience and productivity. "We estimate spend of about $9 billion on tech, product and design this year, moderating to a 6% growth rate year-on-year. $7.4 billion of this is in tech, about 10% of revenue," she said.

Lake added that JPMorgan Chase is also deploying AI to boost card servicing. The company has boosted its product velocity with AI. "We have increased code deployments by more than 70% over the last two years and improved the quality of product delivery over the same period with a 20% reduction in work being replanned. Our investments this year have more than two times return on investment and continue to pay back within five years, and our investments in AI/ML delivered a 35% increase in value last year," said Lake.

Here's a look at Consumer & Community Banking's tech spending plans.

"We are laser focused on providing the absolute best digital experience to every single client segment," said Umar Farooq, Co-Head of J.P. Morgan Global Payments. "We are really focused on building digital experiences that are targeted to specific segments like technology startups."

Process automation. Lake said improvements in operations have kept expenses flat in her unit over the last five years. Accounts served per ops headcount are up 25% due to improved self-service options for companies. Servicing call costs per account are down nearly 30%, and processing costs are down 15%. "While AI has definitely contributed here, a lot of this is good old-fashioned process automation and organizational efficiency," said Lake.

Fraud detection and deterrence. Lake noted that JPMorgan Chase is seeing a 12% compound annual growth rate in attacks, but the company has held the cost of fraud flat due to AI tools.

Traditional machine learning and AI. JPMorgan's Lake said the bank expects big productivity gains over the next five years, but noted that traditional models are a big reason for the value delivered so far. She said:

"We have a very rich and valuable tapestry of data. And despite the step change in productivity we expect from new AI capabilities over the next five years, we have been delivering significant value even with more traditional models and the value we're delivering is growing exponentially. I point that out for two reasons; one is that, not every opportunity requires Gen AI to deliver it, and we are “all systems go” already; and second, we are well on our way, modernizing our data to make it more efficiently consumable and machine readable."

Data improvements. Lake said that moving to the cloud has improved storage and compute efficiency, but the bank is spending on improving data management. "Our data needs to be in our target platforms. We're about halfway through that journey, and making data truly fit-for-purpose will include a subset that will need to be streamed real-time, and we've made significant progress here, in particular, for servicing and personalization. There's still a way to go, but we are delivering significant value," said Lake.

Here's a look at the data flywheel Lake highlighted.

Doug Petno, Co-CEO of the Commercial & Investment Bank at JPMorgan Chase, said the unit has more than 175 AI use cases in production as it looks to leverage its data to feed models.

Farooq added that his unit is leveraging its data assets. "We have been building and have completed building a cloud-native data infrastructure and are utilizing AI and machine learning models for everything, from prospect qualification to transaction screening and operations," said Farooq. "The operational efficiencies our data platform has allowed us to capture with AI models are truly impressive. In the last few years, our transaction volumes have gone up by more than 50%. At the same time, our AI models have allowed us to cut manual exceptions by more than 50%, delivering significant operating leverage."

Trading. Mary Callahan Erdoes, CEO of Asset & Wealth Management at JPMorgan Chase, said the company has been "fortifying and using AI on our trading desks for the past eight years." She added that JPMorgan Chase trades about $260 billion in volume daily and hit $500 billion in early April. "AI is not just a tool, it's reimagining workflows and it's changing the loading capacities for thousands of people on the frontline and in the back," said Erdoes.

She pointed to Smart Monitor, a tool that uses AI to find stocks, absorbing call reports, stock moves and ratios to highlight trades. Connect Coach is another feature that anticipates the next best action for trades.


Dell Technologies continues to ride AI infrastructure wave with strong Q1

Dell Technologies reported strong first quarter results and provided a solid outlook on the back of robust demand for AI infrastructure.

The company reported first quarter earnings of $1.37 a share on revenue of $23.4 billion, up 5% from a year ago. Non-GAAP earnings were $1.55 a share.

Dell was expected to report first quarter earnings of $1.69 a share on revenue of $23.2 billion.

CFO Yvonne McGill noted that Dell's non-GAAP earnings grew three times faster than revenue. Chief Operating Officer Jeff Clarke said the company saw strong AI demand. "We're experiencing unprecedented demand for our AI-optimized servers. We generated $12.1 billion in AI orders this quarter alone, surpassing the entirety of shipments in all of FY25 and leaving us with $14.4 billion in backlog," he said.

Indeed, AI infrastructure carried the quarter for Dell. The infrastructure solutions group (ISG) delivered operating income of $1 billion on revenue of $10.3 billion, up 12% from a year ago. Servers and networking revenue was a record $6.3 billion in the first quarter, up 16% from a year ago.

Clarke said:

"We experienced exceptionally strong demand for AI-optimized servers, building on the momentum discussed in February and further demonstrating that our differentiation is winning in the marketplace. Our pipeline continued to grow sequentially across both Tier 2 CSPs and private and public enterprise customers - and remains multiples of our backlog. Enterprise AI customers grew again sequentially with good representation across key industry verticals."

Clarke did note that demand and shipments are likely to be lumpy for the foreseeable future.

For the PC unit, also known as the client solutions group (CSG), revenue in the first quarter was $12.5 billion, up 5% from a year ago, with operating income of $653 million. Commercial client revenue was $11 billion, up 9% and consumer revenue was down 19% to $1.5 billion. Dell is primarily focused on business PCs.

As for the outlook, Dell projected second quarter revenue of $28.5 billion to $29.5 billion, up 16% from a year ago. Non-GAAP earnings will be $2.25 a share. For fiscal 2026, Dell projected non-GAAP earnings of $9.40 a share on revenue of $101 billion to $105 billion.

McGill hinted that the outlook could turn out to be conservative but the economy is the big unknown. "We’re optimistic on our portfolio and our ability to execute - however, we want to be thoughtful of how customers think through their IT spend relative to the macro environment," she said.

Holger Mueller, an analyst at Constellation Research, said:

“Dell had a good quarter, benefitting from the inference demand of AI. That's easy to see as servers and networking were up respectable 16%, storage barely beat inflation at 6%. If Dell customers were training AI models locally we should see more storage sales. Or Dell is not participating on the increased demand for data lakehouses. We will know more in the next quarter.”


C3.ai bets demonstration licenses deliver future growth

C3.ai is betting that it can land more enterprise customers with demo licenses issued to partners that highlight its AI applications and ultimately turn into long-term deals.

The company, which reported better-than-expected fourth quarter results, derived $33.8 million from demonstration licenses of C3 AI applications out of total revenue of $108.7 million, up 26% from a year ago. C3.ai reported a net loss of 60 cents a share and an adjusted net loss of 16 cents a share. Annual revenue was $389.1 million, up 25% from a year ago, with a loss of $2.24 a share (41 cents a share non-GAAP).

CFO Hitesh Lath noted that C3.ai sells those licenses to distribution partners to demonstrate the software possibilities to customers and "accelerate AI adoption."

C3.ai has invested heavily in partnerships with the top cloud providers--Amazon Web Services, Microsoft Azure and Google Cloud--as well as systems integrators. Professional services revenue was $21.4 million, and $17 million of that sum was Prioritized Engineering Services, which "are undertaken when a customer requests that we accelerate the design, development and delivery of software features and functions that are planned in our future product road map," said Lath.

These demonstration licenses are a twist on what Palantir does with its bootcamps. The idea is to get enterprises to try the AI applications, see the value and then accelerate adoption. Going forward, Lath noted that Prioritized Engineering Services and subscriptions will be about 90% of revenue.

CEO Tom Siebel said the demo licenses were an investment into scaling C3.ai's applications. Siebel, who said he is back in the fold after health issues, said on the company's earnings call that C3.ai is using hyperscale cloud providers as a sales force multiplier.

He said:

"We have tens of thousands of salespeople at Azure. I believe tens of thousands of salespeople at AWS. Thousands of salespeople at GCP. They have lots of products to sell in their bag, and it's very confusing, so we need to make it simple. So in order to make it simple for them, we invested in building demo applications that run and take advantage of the full utility of the Azure stack or the AWS stack or the GCP stack. So these people in Frankfurt and Munich and Detroit and Madrid and Moline can go into their customer and give a demo of a complex application to customers show them what the economic benefit is of supply chain optimization, of demand forecasting, of predictive maintenance."

"We've done good work at arming our partners with demonstration licenses. Think about that as an investment in future growth."

C3.ai is also providing demonstration licenses to customers. "We sold demonstration licenses to our customers. Why would we do that? Because Dow Chemical or Shell or Coke or Cargill or whoever or the United States Air Force, whoever it may be, they have a hugely successful application and they want to encourage others to use these applications. For example, the Air Force has 22 platforms today, and they want to deploy the application across 44 platforms," said Siebel.

For C3.ai, these demo licenses accelerate adoption, ease the change management and then convert to regular subscriptions. The bet for Siebel is simple: Turn those demo licenses into joint sales calls with much larger cloud vendors and then do deals quickly because enterprises already have master agreements in place. "It takes two to five months out of a contract negotiation process and accelerates the sales cycle," said Siebel.

Other notable items from Siebel on the C3.ai earnings call:

  • "We have, depending on how you count, someplace between 20 and 100 agentic AI solutions out there in production, in the hands of happy customers. And if we were to spin that business out, just that business out, and take it to a Andreessen Horowitz or a Bessemer or Nvidia or whatever it is, that business alone would be valued at multiples of where C3 AI trades today and we all know that's a true statement."
  • "One of the most notable achievements in Q4 was the renewal and expansion of our strategic partnership with Baker Hughes. This alliance, which began in 2019, has been a cornerstone of our success in the oil and gas sector, generating over $0.5 billion in revenue from this vertical and the chemical markets. The renewed agreement underscores the proven value we deliver through our joint efforts, enhancing efficiency, safety, reliability, and sustainability across upstream, midstream, and downstream operations."
  • "I did get slowed down for a little bit. There's no question about it. And I had to work from home for a little while and take it easy and recover, but I will catch a red eye to Washington, D.C. tonight. I will be in Washington, D.C. again for three days. I think 10 days from now after attending a wedding in Cabo. So, just when you thought it was safe, I'm back."

VMware still under pressure as customers plot escapes, rivals gain

The mass exodus from VMware following Broadcom's acquisition wasn't a sprint as much as a walk. Nevertheless, earnings results from Nutanix and Pure Storage highlight that VMware customers are taking a measured approach to leaving.

First, Nutanix reported better-than-expected fiscal third quarter results and raised its outlook. The company reported non-GAAP earnings of 42 cents a share on revenue of $639 million, up 22% from a year ago. Nutanix also raised its outlook and now projects fourth quarter revenue of $635 million to $645 million.

The big technology development in the quarter was Nutanix's partnership with Dell Technologies. Nutanix is supporting external storage in a move that will broaden its reach and give enterprises more options to move to its virtualization platform. Nutanix on Dell PowerFlex became available at the end of April. Typically, customers would consume Nutanix storage along with its hypervisor.

The Dell deal, coupled with a more mature Cisco partnership and a new effort with Pure Storage, gives Nutanix more enterprise heft and the ability to target AI workloads. Nutanix's cloud platform will also support Google Cloud.

"We continue to focus on helping customers build apps and run them anywhere," said Nutanix CEO Rajiv Ramaswami. "Our largest wins in the quarter demonstrated our ability to land and expand within some of the largest and most demanding organizations in the world as they look to modernize their IT footprints, including adopting hybrid multi-cloud operating models and modern applications, as well as those looking for alternatives in the wake of industry M&A."

Previous Nutanix earnings calls rarely named VMware directly, instead alluding to migrations from legacy providers or making other not-so-vague references. This conference call was more direct: analysts and Nutanix executives mentioned VMware 13 times. Nutanix is often coming into the enterprise as a second virtualization option and growing from there.

Ramaswami also said Nutanix is expanding its Kubernetes efforts. Speaking about Nutanix's recently held .NEXT conference, Ramaswami noted that the company is building up its base of customers that migrated from VMware.

"Our customers would like us to support every external storage array that's out there. They want to see how we can make migration as easy as possible for them. And there were many customers who talked about their migration experience moving from VMware to Nutanix at the conference," he said.

According to Ramaswami, VMware isn't leaning on pricing as much to keep accounts. Nutanix is taking a more a la carte approach, while VMware is selling a complete stack.

Nutanix stands to benefit as VMware customers that signed three-year deals prior to the Broadcom acquisition come up for renewal now.

"Some did three years, some did five years with VMware as soon as they heard about the acquisition or around the time the acquisition was announced or as it started getting to be closed," said Ramaswami. "All of those customers renewals are coming up now, let's say this year or next year."

He said one bucket of VMware customers is planning to actively migrate. Other customers will probably have to renew with VMware but are planning for it to be the last deal. Either way, Nutanix plans to play a long game.

VMware was also a topic on Pure Storage's earnings call. Pure Storage reported strong first quarter results with non-GAAP earnings of 29 cents a share compared to estimates of 25 cents a share. Revenue in the quarter was $778.5 million, which also topped estimates.

As for the outlook, Pure Storage projected second quarter revenue of $845 million and annual revenue of $3.51 billion.

CEO Charlie Giancarlo noted that Pure Storage is benefiting from its storage software, flash-based systems and subscription model to manage data for AI workloads. "Modern AI environments require a wide variety of performance levels consistently delivered across tens of thousands of GPUs," said Giancarlo, who noted that AI inference and retrieval-augmented generation are benefiting Pure Storage. "Q1 was a strong quarter in our breadth of AI wins."

As for the VMware hook, Giancarlo said AI workloads are forcing enterprises to revisit virtualization strategies. He cited the Nutanix partnership as an important milestone.

"This joint solution provides a modern, scalable, virtualized environment, which is purpose built for high demand data center scale workloads. Our partnership will deliver a high performance virtualized environment, providing Nutanix cloud infrastructure with Pure's enterprise data cloud using Pure FlashArray storage. We expect the solution to be generally available later this year," said Giancarlo.

Pure Storage landed multiple virtualization deals in the quarter with its Portworx offering that unifies container and virtual machine workloads.

The results from Nutanix and Pure Storage highlight the encroachment on VMware's customer base. Yes, Broadcom's VMware purchase was a financial win, but its moves on perpetual licenses have rankled customers.

And the VMware angst also benefits smaller companies, not just large vendors. Platform9 appears to be benefiting from private cloud migrations too.

Platform9 recently penned an open letter to VMware customers about "sweeping changes in licensing and strategy, often at odds with what was to their benefit."

The bottom line: A big chunk of VMware's customer base is up for grabs and competitors are playing the long game to win them over.


Nvidia Q1 strong, continues to ride data center demand

Nvidia reported strong first quarter results as data center demand remained robust. The company’s writedown for H20 chips designed for China was also lower than expected.

The company reported first quarter earnings of 76 cents a share on revenue of $44.1 billion, up 69% from a year ago. Of those sales, data center revenue was $39.1 billion, up 73% from a year ago. Excluding a writedown of $4.5 billion, non-GAAP first quarter earnings would have been 96 cents a share.

Wall Street was expecting Nvidia to report non-GAAP earnings of 75 cents a share on revenue of $43.25 billion. Nvidia previously said it would take a $5.5 billion inventory charge due to chips it can’t sell in China because of US export bans.

Going into the results, Nvidia’s quarter was expected to be messy due to charges stemming from US export controls on China. Sales of H20 products destined for China in the first quarter were $4.6 billion. Nvidia said it was unable to ship an additional $2.5 billion of H20 revenue in the first quarter.

Nvidia CEO Jensen Huang said the company’s Blackwell NVL72 AI supercomputer is now in full-scale production across systems and cloud computing providers. “Global demand for Nvidia’s AI infrastructure is incredibly strong. AI inference token generation has surged tenfold in just one year, and as AI agents become mainstream, the demand for AI computing will accelerate,” said Huang.

CFO Colette Kress said in prepared remarks:

“We saw our Blackwell architecture ramp expand to all customer categories, while large cloud service providers remained our largest at just under 50% of Data Center revenue. Data Center compute revenue was $34.2 billion, up 76% from a year ago and up 5% sequentially. Networking revenue was $5.0 billion, up 56% from a year ago and up 64% sequentially, driven by the growth of NVLink compute fabric in our GB200 systems and continued adoption of Ethernet for AI solutions at cloud service providers and consumer internet companies.”

As for the outlook, Nvidia projected second quarter revenue of $45 billion, a figure that reflects the loss of about $8 billion in H20 revenue due to export controls.

Here's what Huang had to say on the earnings call:

  • "On export control, China is one of the world's largest AI markets and a springboard to global success with half of the world's AI researchers based there, the platform that wins China is positioned to lead globally today. However, the $50 billion China market is effectively closed to us."
  • "China's AI moves on, with or without us. The question is not whether China will have it. It already does. The question is whether one of the world's largest AI markets will run on American platforms, shielding Chinese chip makers from us. Competition only strengthens them abroad and weakens America's position. Export restrictions have spurred China's innovation and scale. The AI race is not just about chips, it's about which stack the world runs on. The US has based its policy on the assumption that China cannot make AI chips. That assumption was always questionable, and now it's clearly wrong."
  • "It's very clear that every company will have AI factories, and very soon there'll be robotics companies and those companies will be also building AI to drive the robots. We're at the beginning of all of this build out."
  • "We're also increasing our supply chain and building out our supply chain. They're doing a fantastic job. We're building it here onshore, United States, but we're going to keep our supply chain quite busy for several, many more years coming."

Constellation Research analyst Holger Mueller said:

"Nvidia keeps firing on all cylinders, and beats expectations despite regulatory writedowns.  Nothing seems to be able to stop Jensen Huang and company. With its deals in the Middle East, Nvidia is planting the seeds for a few $100 billion in future revenue from sovereign cloud. We also know now what role China revenue could have been given the writedowns. With all the data center revenue, the other Nvidia business don't have to do well, but the question one can ask is why the automotive business is not taking off."


Salesforce Q1 strong, outlook raised for Q2

Salesforce delivered better than expected first quarter results and upped its outlook for the second quarter. The company said it saw strength in Data Cloud and AI annual recurring revenue.

Salesforce reported first quarter net income of $1.54 billion, or $1.59 a share, on revenue of $9.83 billion. Non-GAAP earnings were $2.58 a share. Wall Street was expecting Salesforce to report first quarter earnings of $2.55 a share on revenue of $9.75 billion.

The first quarter results landed a day after Salesforce announced its $8 billion acquisition of Informatica and its data integration and management platform.

CEO Marc Benioff said the company is seeing traction due to its “deeply unified enterprise AI platform.” Robin Washington, President, Chief Operating and Financial Officer, said the company delivered “solid execution” in the first quarter.

By the numbers for the first quarter:

  • Salesforce has closed more than 8,000 Agentforce deals. Half are paid.
  • Nearly 60% of Salesforce’s top 100 deals in the first quarter included Data Cloud and AI.
  • More than half of Salesforce’s top 100 deals in the quarter included more than six clouds.
  • Data Cloud ingested 22 trillion records in the first quarter.
  • Sales Cloud revenue in constant currency was up 7% from a year ago, as was Service Cloud revenue.
  • Platform and other revenue in constant currency was up 14% from a year ago.
  • Marketing and commerce revenue in constant currency was up 4% from a year ago.
  • Integration and analytics in constant currency was up 10%.

As for the outlook, Salesforce said it would see a currency tailwind due to a weaker US dollar. The company projected second quarter revenue of $10.11 billion to $10.16 billion, up 8% to 9%. Non-GAAP earnings in the second quarter will be between $2.76 a share to $2.78 a share. In constant currency, growth would be 7% to 8%. Salesforce projected fiscal 2026 revenue of $41 billion to $41.3 billion, up 8% to 9%. Non-GAAP earnings for fiscal 2026 are projected to be $11.27 a share to $11.33 a share.

Here are the takeaways from the Salesforce earnings call:

  • Informatica. Benioff said Salesforce sees Informatica as a transformational deal at a good price. "Every AI transformation is a data transformation," said Benioff. "You have to have your enterprise data together to get the results that you want. Informatica combined with Salesforce Data Cloud and Tableau will create this incredible data business."
  • Slack as UI. Benioff said that Slack is where you'll go to begin and end every Agentforce conversation. "You will really like AI taking place on Slack and agents just coming right into your channels to talk to you in real time," said Benioff.
  • AI agent washing. Benioff said that "every company does say they have agents, but without the agents, the data, the apps and metadata framework you're not able to deliver this complete experience for the enterprise including delivering digital labor."
  • Finding growth. Benioff said that Salesforce is finding growth pockets inside the company, notably in small and medium-sized businesses. Miguel Milano, chief revenue officer, said the company is known for structuring large deals, but Salesforce is also making it easier for companies to buy. The company booked $2 billion in business through AWS Marketplace.
  • Consumption models. Milano said the company is focused on its consumption motion and selling the overall platform. Thirty percent of Agentforce bookings in the first quarter were due to customers increasing consumption. 

Box's Q1, outlook highlight potential in AI agent ecosystem

Box reported better-than-expected first quarter results and raised its outlook as its content and unstructured data platform carves out a key role as enterprises move to AI agents.

Speaking on an earnings conference call, Box CEO Aaron Levie said customers are upgrading to the company’s Enterprise Advanced plan to leverage Box AI. Box also released a State of AI in the Enterprise survey, which found more than half of the 1,300 IT leaders surveyed expect transformation from AI in the next two years. Ninety percent of respondents are using AI agents in some capacity, with unstructured data and documents as a primary use case.

"Box AI Agents will enable enterprises to streamline a due diligence process on hundreds or thousands of documents in an M&A transaction, correlate customer trends amongst customer surveys and product research data, or analyze life sciences and medical research documents to generate reports on new drug discovery and development," said Levie. "None of this would have been possible even a year ago."

The Box earnings follow the company's rollout of Box AI Agent integrations for Microsoft 365 Copilot, IBM watsonx Orchestrate, Google Agentspace, Slack AI, ServiceNow AI Agent Fabric and Zoom AI Companion, as well as the Box Model Context Protocol (MCP) server. Box is also officially integrated with OpenAI ChatGPT's deep research agent. If there's a foundation model such as Meta's Llama or Grok, Box plans to integrate with it.

Levie added that Box is also benefiting from lower compute costs for AI inference and model improvements.

"On the AI inference side, we've just been very happy about the rate of like-for-like AI model improvements that we're seeing from a cost standpoint. And that can show up in 2 ways. The first is that you can take an existing use case and it might just on a one-to-one basis, be cheaper on a kind of pretty regular basis every kind of 6 to 12 months at a minimum," said Levie. "The alternative is that you get a new capability unlock because you can -- you either get the base model just is getting much better or you can use an existing model and do multiple passes through the model for better accuracy or more complex use cases."

AI use cases will ultimately be margin-neutral, said Levie, who added that Box's pricing is based on seats and credits rather than on individual use cases.

The numbers

Box reported first quarter earnings of $3.51 million, or 2 cents a share, on revenue of $276.27 million, up 4% from a year ago. Non-GAAP earnings were 30 cents a share.

Wall Street was expecting Box to report first quarter adjusted earnings of 26 cents a share on revenue of $274.77 million.

Billings in the first quarter were $242.3 million, up 27% from a year ago. Remaining performance obligations (RPO) were $1.47 billion, up 21%, or 17% on a constant currency basis.

As for the outlook, Box projected second quarter revenue of $290 million to $291 million, up 8% from a year ago. Non-GAAP earnings will be between 30 cents a share and 31 cents a share. Wall Street was looking for second quarter earnings of 28 cents a share on revenue of $284.1 million.

CFO Dylan Smith said economic uncertainty hasn't had an impact on Box's business, but it wanted to "remain prudent" with its outlook for fiscal 2026. Box is also navigating currency fluctuations given a big chunk of its business is in Japan.

Box projected annual revenue to be in the range of $1.165 billion to $1.17 billion, up $10 million from its previous guidance, with adjusted earnings of $1.22 a share to $1.26 a share. Wall Street was looking for adjusted earnings of $1.19 a share for fiscal 2026.

Where Box sits in the AI agent ecosystem

Given Box's content platform is a repository and management system for valuable unstructured data, Levie said the company is "sitting very naturally at the center of so much of the innovation happening in AI right now."

Levie added that Box is not competing with any of the AI model providers. Instead, Box is a meeting place where models can add value to customer data. Box also serves as a secure place for content that adds a layer of governance.

"We act as a very natural kind of convening point for these AI models when customers want to be able to use data with any of these leading platforms," said Levie. "You want to ensure that data access controls are actually maintaining the security of your information. And so you don't want to be in a position where you're trying to pack too much of that intelligence into the model layer, you want to pack that into the data plane and the architecture around that, which is what Box provides customers."

By sitting in the data plane, Box has more appeal to regulated customers as well as mission-critical AI use cases, he added. Levie said Box will occupy two layers of the AI stack: the software plane for end-user interaction and the platform layer via APIs. Monetization will be based on seats and on usage for agent queries going into Box.

"We want to execute on both of those as fast as possible and in tandem because no single company is going to decide where all of the user does their work. It's just not possible," said Levie. "We want to make sure that you can manage your content one place and ensure that it works everywhere."

Constellation Research analyst Holger Mueller said:

"Box keeps innovating, doing the right for customers, but cannot lift its revenue back into the teenage growth numbers, which is the least investors expect from an innovative AI company. The current quarter flirted with the inflation rate, meaning that Box was treading water…  The promise of AI changing how people will upgrade their future of work with documents is becoming clearer and clearer – and if Box can unleash the acceleration potential in the best practices shift, it may well grow again as it should. On the downside are the commoditization pressures that Box has been fighting since the pandemic."

Bottom line: It's early in AI agent use cases, but Box is seeing billings growth, interest for Enterprise Advanced and high-level CxO conversations. Based on those leading indicators, it's just a matter of time until Box sees accelerating revenue growth.

 


Salesforce acquires Informatica for $8 billion to bolster Agentforce

Salesforce said it will acquire Informatica for $8 billion, or $25 a share, in a deal that will give it a neutral data integration and management platform to connect Agentforce across systems.

The two companies were in talks a year ago, but couldn't agree on a price. Salesforce will fund the Informatica purchase with cash and new debt.

Salesforce said Informatica, which recently announced tighter integration with Microsoft Fabric, Databricks, Snowflake and others, will power its agentic AI vision across enterprises. Informatica brings a data catalog, integration, governance and metadata management.

What Informatica also brings to Salesforce is a neutral platform. CxOs have said that Agentforce is viewed as more of a Salesforce-specific AI agent play instead of a horizontal solution across third-party systems.

Informatica will also bring revenue to Salesforce. Informatica recently projected 2025 revenue of $1.67 billion to $1.72 billion, or growth of 4.6% at the midpoint. Salesforce said Informatica will boost non-GAAP earnings and free cash flow in the second year after the deal closes.


Salesforce CEO Marc Benioff said the combination of Data Cloud, MuleSoft and Tableau with Informatica will "enable autonomous agents to deliver smarter, safer, and more scalable outcomes for every company."

Here's what Salesforce bought. According to Salesforce, Informatica will do the following:

  • Strengthen Data Cloud as a customer data platform (CDP).
  • Give Agentforce the ability to interpret and act on a wide range of enterprise data.
  • Augment Customer 360.
  • Bring data quality, integration and governance for the data used by MuleSoft APIs.
  • Provide context for Tableau insights.
  • Bolster Salesforce's industry offerings.

Once the deal closes, Salesforce said it will "rapidly integrate Informatica’s technology stack — including data integration, quality, governance, and unified metadata for Agentforce, and a single data pipeline with MDM on Data Cloud." When integrated, Informatica will be embedded into Salesforce's system of understanding.


In addition, Salesforce said it will support Informatica's ecosystem and data management products.

Salesforce will talk about the deal more on its first quarter earnings call on Wednesday. In the meantime, here are some questions to ponder.

  • How will Data Cloud and Informatica be sold individually? Will there be a conflict?
  • Will Informatica's mojo as a neutral party in data management erode as part of Salesforce? A neutral vendor in your stack is great, but it's also often a pipe dream.
  • How quickly can Informatica be integrated into Salesforce's platform?
  • Will customers look for alternatives to Informatica as they are already digesting Agentforce and pricing changes?
  • How will Informatica's CLAIRE AI agent efforts be affected in the deal?

Constellation Research’s take

Constellation Research CEO R “Ray” Wang said:

“The bottom line is that MuleSoft was not enough. Salesforce showed why Data Cloud was important for AI. But how do you get the data into Data Cloud? A data integration company and iPaaS vendor would have to do the trick. But which one? Informatica has been on the block before and it’s old, legacy, but it has a ton of customers and some great data management tools. But it’s not the future and this is what makes the difference between a $10B acquisition vs. an $8B acquisition. However, Boomi would have been the smarter buy - brand new tech, an agentic framework ready to go, fast growing company, and a rock star CEO.”

The bigger question in the long run is whether the Informatica deal positions Salesforce to manage and orchestrate agents beyond its platform. 

Liz Miller, an analyst at Constellation Research, said:

"Top of mind for everyone will be the question: "Is this the $8 billion missing piece to shift Agentforce from a dominant promise to a dominant reality for Salesforce customers?" What Informatica brings to the table is data integration capacity, metadata integration and serious data provenance, lineage and governance to stitch all of Salesforce's recent data innovations together.

For Salesforce customers the question will always come down to what additional value will Informatica bring to my current business goals? Does this seismically change the trajectory that my data and my processes and my outcomes are on? 

The moves to watch will revolve, as they often do with Salesforce, around price. In recent weeks Salesforce has worked to simplify and streamline Agentforce pricing and there has been a long history of fine-tuning Data Cloud's consumption pricing. Time will tell if Salesforce's customers can shoulder another layer of data related costs." 


Accenture's Karthik Narain on human-AI collaboration and trust

Karthik Narain, Group Chief Executive, Technology and CTO at Accenture, said enterprises should think about trust as the primary gatekeeper to AI adoption, eye collaboration between AI agents and humans, and architect companies to take advantage of a time when there's "a personalized digital brain for every employee."

Narain, who appeared on DisrupTV Episode 399 and is an author of Accenture's Tech Vision, riffed on the future.

AI as the New Digital Foundation: Narain emphasized that artificial intelligence (AI) is not just another tech trend, but a foundational shift similar to what we saw with the rise of digital technologies in the 2000s. "The theme we believe is going to be very foundational for years to come, and that is all about the role of AI and how it's going to impact societies," said Narain. "All of that is going to be taken to the next level with AI."

Cognitive Digital Brain: Accenture envisions every enterprise creating a "cognitive digital brain," an intelligent system that continuously learns, makes decisions, and collaborates. "We call that concept that an enterprise will create as a cognitive digital brain," said Narain. "Over time, you will have a personalized digital brain for every employee, a digital brain for an enterprise, and at an industry level."

Trust is the Gatekeeper to AI Autonomy: Trust will be the defining factor in determining how much autonomy AI systems are granted. "We believe that the only thing that's going to be a limiting or an accelerant is going to be trust. And it's trust and autonomy that go hand in hand," said Narain. "Organizations need to inject trust in the system for AI to be able to be used. It's a combination of confidence, our own intuition, explainability."

The Binary Big Bang – Architectural Disruption: The "Binary Big Bang" describes how large language models are fundamentally changing how technology is developed and deployed. "This explosion is going to upend all technologies and create a new architecture that can come together to drive new experiences and increase the digitization index of enterprises," said Narain.

AI Refinery and Trusted Agent Huddle: Accenture is operationalizing these concepts through platforms like the AI Refinery and Trusted Agent Huddle, focusing on collaborative agentic AI and responsible deployment. "The whole idea of this Trusted Agent Huddle platform that we created is the fact that there needs to be collaboration… and the humans play a very, very important role," said Narain. "Trusted Agent Huddle is not just a communication protocol, it is basically a trust protocol."

Investing in People: Cognitive Upgrade vs. Transfer: Accenture is focusing not on replacing humans but enhancing them—what Narain calls a "cognitive upgrade."

“We are upgrading our talent 80,000 to 100,000 of our employees to become data and AI proficient," said Narain. "There is all this conversation of gloom and doom, but when you reimagine a workflow or a process using AI agents, the human comes first."

 
