
Snowflake Summit 2025: Everything you need to know


Snowflake launched a bevy of data and AI enhancements that position the company as "an agent of transformation."

The combination of launches at Summit 2025 highlights how Snowflake is expanding the scope of its platform across analytics, data and AI. CEO Sridhar Ramaswamy said the company's revised mission statement is to "empower every enterprise to achieve its full potential through data and AI."

The big theme at Snowflake Summit 2025 is stitching together data sets, tools and workflows to enable AI use cases. "Snowflake is where data does more," said Ramaswamy, noting the conference's mantra.

Ramaswamy noted that Snowflake is in a different place than it was a year ago due to strong results and a faster product cadence. "Over the past few years, we've been expanding our lens. Obviously, we started as an analytics platform, but we've been adding substantial functionality to that platform, extending it so that we can work with customers. We can help customers through more of their data life cycle," said Ramaswamy.

According to Ramaswamy, Snowflake is in a place where it can bring together AI and unstructured and structured data. Here's a look at the major announcements from Snowflake Summit 2025.

Snowflake Adaptive Compute, platform enhancements

Snowflake announced the addition of Adaptive Compute to its data warehouse platform. The company also outlined Standard Warehouse-Generation 2, which delivers 2.1x faster analytics performance than the previous generation, along with interoperability features and AI security and governance features in Snowflake Horizon Catalog.

Here's the breakdown:

  • Snowflake Gen2 warehouses are retooled with upgraded hardware and enhanced software, and are generally available.
  • Snowflake Adaptive Compute, which is in private beta, automatically scales resources and routes queries efficiently behind the scenes. Customers won't have to manually manage warehouse sizes, concurrency settings and multi-cluster configurations. Adaptive Compute will power Adaptive Warehouses.
  • Snowflake Chief Product Officer Christian Kleinerman said this adaptive compute approach eliminates complexity from the platform. "We're bringing increased ease of use to the platform," said Kleinerman. "We are materially improving the price performance of the platform of Snowflake."
  • Gen2 warehouses feature simplified conversion to adaptive warehouses without downtime.
  • Interoperability across catalogs and engines with Catalog-Linked Databases, which will enable customers to automatically sync Snowflake Horizon Catalog with Apache Iceberg objects managed by any Iceberg REST Catalog including Apache Polaris and AWS Glue.
  • Expanded data discovery with Universal Search, which is in private preview. Customers will be able to discover data in external relational databases, including PostgreSQL and MySQL, without leaving Snowflake.
  • Pfizer was cited as an early adopter of Adaptive Compute.
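The bullets above describe behavior rather than mechanics, but the core idea of adaptive compute (size and route each query automatically instead of asking users to pre-provision warehouse sizes and multi-cluster settings) can be sketched with a toy router. Everything below, thresholds included, is invented for illustration; it is not Snowflake's implementation.

```python
# Illustrative toy: pick a compute tier per query based on estimated work
# and current load, rather than a user-managed fixed warehouse size.
def route_query(estimated_bytes_scanned: int, concurrent_queries: int) -> str:
    """Return a compute tier for a query using simple invented heuristics."""
    if estimated_bytes_scanned < 100_000_000:        # small scans
        tier = "small"
    elif estimated_bytes_scanned < 10_000_000_000:   # mid-size scans
        tier = "medium"
    else:
        tier = "large"
    # Under heavy concurrency, scale up rather than queueing queries.
    if concurrent_queries > 8 and tier != "large":
        tier = "medium" if tier == "small" else "large"
    return tier

tier = route_query(50_000_000, 12)  # same scan routes higher under load
```

The point of the sketch is only that routing decisions move from the user's configuration into the platform's per-query logic.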

The company also outlined AI-driven data governance, security and compliance capabilities, including the ability to ask security questions via Snowflake Cortex AI, as well as a set of AI observability tools.

Snowflake Openflow

The company launched Snowflake Openflow, which simplifies the movement of data to where it can be used. Openflow is the evolution of Snowflake's Datavolo acquisition.

Snowflake said Openflow is a multi-modal data ingestion service that will give users the ability to connect any data source and architecture.

Openflow, which is powered by Apache NiFi, will enable customers to meld data engineering practices into their Snowflake workflows.

With Openflow, Snowflake is pursuing what it calls limitless interoperability to create data that's ready for AI uses. Openflow has multiple integrations with Box, Google Ads, ServiceNow, Workday and Zendesk to name a few.

Snowflake Intelligence, Data Science Agent

At Snowflake Summit 2025, executives also moved to outline how the company is leveraging AI for customer use cases and democratizing data for business users.

The company launched Snowflake Intelligence, which will be in public preview soon, to offer users a conversational experience with data agents that can traverse dashboards, structured and unstructured data stores and analytics tools to deliver answers.

In a demo, Snowflake showed how users in different functions (finance, marketing, manufacturing) could ask questions and run SQL queries, without analysts or technical expertise, in a compliant manner.

Snowflake Intelligence leverages the platform's connectors, Openflow and Cortex Knowledge Extensions, which can bring in third-party content from Snowflake partners.

According to Snowflake, Snowflake Intelligence uses large language models from Anthropic and OpenAI running inside Snowflake's security perimeter with Cortex Agents.

Snowflake also launched Data Science Agent, which will be in private preview. Data Science Agent automates machine learning model development tasks and AI workflows to boost productivity.

Data Science Agent uses Anthropic's Claude model to break down problems into steps including data analysis, data prep, feature engineering and training.
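Snowflake hasn't published how the agent decomposes problems, but the step sequence it describes (analysis, prep, feature engineering, training) can be sketched as a simple pipeline runner. The step functions below are stand-ins, not the agent's actual logic.

```python
# Hypothetical sketch of an agent-style ML pipeline runner. The step names
# come from Snowflake's description; the implementations are toy stand-ins.
def analyze(data):  return {"rows": len(data)}                 # inspect only
def prep(data):     return [x for x in data if x is not None]  # drop nulls
def features(data): return [(x, x * x) for x in data]          # derive features
def train(feats):   return {"model": "stub", "n_features": len(feats)}

PIPELINE = [("analysis", analyze), ("prep", prep),
            ("feature_engineering", features), ("training", train)]

def run_pipeline(raw):
    """Run each step in order, recording its output in a report."""
    report, data = {}, raw
    for name, step in PIPELINE:
        out = step(data)
        report[name] = out
        # Transforming steps feed the next step; analysis only inspects.
        if name != "analysis":
            data = out
    return report

result = run_pipeline([1, None, 2, 3])
```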

Cortex AISQL, SnowConvert AI

Snowflake launched Cortex AISQL, which embeds generative AI into customer queries to analyze data and build pipelines with SQL syntax. Snowflake claimed that Cortex AISQL can deliver up to 60% cost savings when filtering or joining data.

According to Snowflake, Cortex AISQL can "effectively turn every data analyst into an AI engineer." Cortex AISQL uses models from Anthropic, Meta, Mistral and OpenAI and combines them with Snowflake's SQL Engine.

The secret sauce for Cortex AISQL is that analysts can query both structured and unstructured data to analyze multi-modal data, eliminate silos, consolidate tools and enrich customer tables.
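As a rough illustration of that idea (not Cortex AISQL itself, whose AI functions run inside Snowflake's SQL engine), the toy below combines an ordinary column filter with an "AI" predicate over free text in a single pass. The keyword matcher stands in for a language model call.

```python
# Illustrative only: mixing a structured filter with a semantic predicate
# over unstructured text, in the spirit of an AISQL query such as
#   SELECT ... WHERE region = 'AMER' AND AI_FILTER(...)
# (AI_FILTER named here as an assumption about Snowflake's function set).
ROWS = [
    {"region": "EMEA", "ticket": "Billing error on invoice"},
    {"region": "AMER", "ticket": "App crashes on login"},
    {"region": "AMER", "ticket": "Question about pricing tiers"},
]

def ai_filter(text: str, topic: str) -> bool:
    """Stand-in for an LLM predicate like 'is this ticket about <topic>?'."""
    keywords = {"billing": {"billing", "invoice", "pricing"},
                "bugs": {"crash", "crashes", "error"}}
    return any(word in text.lower() for word in keywords[topic])

matches = [r for r in ROWS
           if r["region"] == "AMER" and ai_filter(r["ticket"], "billing")]
```

The structured predicate (`region`) and the semantic predicate (`ticket` content) are evaluated together, which is the tool-consolidation point the section makes.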

SnowConvert AI is designed to make data warehouse, business intelligence and extract, transform and load (ETL) migrations faster. Snowflake said SnowConvert AI makes the code conversion and testing phases of migrating from legacy systems 2x to 3x faster.

Snowflake Marketplace enhancements

Snowflake launched Cortex Knowledge Extensions on Snowflake Marketplace to add content and data from The Associated Press, CB Insights and Stack Overflow. Content from these third-party partners can be used to enrich AI apps with real-time news.

Via Snowflake Marketplace, customers can share Semantic Models to integrate AI-ready structured data within their Snowflake Cortex AI apps.

Snowflake also rolled out Agentic Snowflake Native Apps on Snowflake Marketplace. Agentic Native Apps are interoperable and can deliver standalone experiences on customer or provider data or be used as building blocks for apps created on Cortex Agents or in Snowflake Intelligence.


UKG acquires Shiftboard, eyes oil and gas, energy and manufacturing


UKG said it has acquired Shiftboard, which provides employee scheduling software for oil and gas, energy and manufacturing companies.

Terms of the deal weren't disclosed. Shiftboard brings customers such as BASF, Bridgestone, Daisy Brand and Shell to UKG.

Scheduling for Shiftboard's industries requires operational continuity and engagement with predominantly frontline workers. Shiftboard's software orchestrates scheduling, labor strategy, production demand and various union and regulatory requirements.

According to UKG, Shiftboard's industry employee scheduling platform will be integrated into UKG Pro Workforce Management suite and UKG's AI experiences and insights. Shiftboard already had integrations with UKG as well as Workday, SAP, ADP, Microsoft Dynamics and others.

The deal is notable considering UKG was formed in 2020 via the merger of Ultimate Software and Kronos. Since that combination, the company has made a series of cloud acquisitions including EverythingBenefits, Great Place to Work, Ascentis, SpotCues and Immedis.

UKG also recently forged a partnership with ServiceNow to integrate AI agents.

Last year, UKG named Jennifer Morgan CEO. Morgan had been co-CEO of SAP. Since Morgan joined in July 2024, UKG has revamped the leadership team. UKG has added a new CFO, Chief Product Officer, Chief Marketing Officer, Chief Communications Officer, CIO and Chief Partner Officer.

Holger Mueller, an analyst at Constellation Research, said UKG's acquisition counters ADP in the emerging category of fatigue management in human capital management. Mueller said:

"Less than 2 weeks ago at UKG's global industry analyst conference I asked what is the future of the many vertical scheduling engines that primarily includes Kronos, but also systems Ultimate had collected through the decades. Now we know. Acquire innovative startups that bridge more industries - in this case regulated industries, where there are additional than legal requirements, e.g. fatigue management and more. This is a smart move by the new UKG leadership team. Now comes the hard work of consolidating scheduling engines."

 


Introducing Ciroos: AI Startup Launching SRE Teammate to Transform Enterprise Operations


Ciroos raised $21 million to deliver an agentic AI teammate for SRE, DevOps and operations teams, automating incident response and cutting response times by 90%. What's interesting about Ciroos is that it is looking to address gaps in operations, and its approach wouldn't have been possible without agentic AI.

The company is looking to solve a big problem for site reliability engineers (SREs): it's almost impossible to keep up with operations across multiple applications, domains, architectures and tools, including static runbooks and dashboards. And as enterprises move to AI agents, keeping up with operations is even more challenging. Energy Impact Partners led the funding. Ronak Desai, co-founder and CEO of Ciroos, said the company built its AI SRE Teammate to "end the toil" for SREs by accelerating root cause identification, automating actions and giving back time and control to build reliable systems.


Snowflake makes its Postgres move, acquires Crunchy Data


Snowflake said it will acquire Crunchy Data, which provides open source Postgres technology, and launch Snowflake Postgres for its AI Data Cloud.

Terms of the deal weren't disclosed, but The Wall Street Journal put the price tag at about $250 million. For comparison, Databricks acquired Neon, a startup offering similar Postgres services, for $1 billion.

Snowflake's acquisition of Crunchy Data kicks off the company's Snowflake Summit 2025. Databricks' conference is next week as the fierce rivals battle for data workloads used for AI applications.

Crunchy Data will bring FedRAMP-compliant products directly into Snowflake Postgres and the Snowflake AI Data Cloud. PostgreSQL is a popular database among developers, and that popularity has extended to agentic AI. Vivek Raghunathan, SVP of Engineering at Snowflake, said in a statement that Crunchy Data addresses "a real need for our customers to bring Postgres to the Snowflake AI Data Cloud."

In a blog post, Snowflake noted that Crunchy Data has built a strong following making Postgres enterprise friendly and mission critical with offerings that "span managed cloud services, Kubernetes deployments and on-premise solutions." Crunchy Data also recently launched a Postgres-native data warehouse with Iceberg support.

According to Snowflake, the company will make a strong commitment to Postgres as well as existing Crunchy Data customers. Snowflake said Snowflake Postgres, which will be available in private preview, is part of a strategy to support transactional data along with efforts like Unistore.

Constellation Research's take

Michael Ni, an analyst at Constellation Research, said:

"Databricks bought Neon. Snowflake countered with Crunchy. This isn’t about big data analytics anymore—it’s about becoming the AI-native data foundation unifying analytics, operational storage, and machine learning. Crunchy gives Snowflake an enterprise-grade PostgreSQL engine to support AI agents, co-pilot apps, and context-aware workflows that demand structured, compliant, and low-latency operational data storage. This is about turning insights into action without leaving the Snowflake platform."

Constellation Research analyst Holger Mueller said: 

"If one had any doubt that PostgreSQL is the common denominator for accessing data outside of the large commercial databases, then this move by Snowflake removes those doubts. The acquisition further cements the position of PostgreSQL as the lingua franca for accessing data. It is good to see Snowflake and other vendors supporting the query language."


Pegasystems launches Pega Agentic Process Fabric, rides AI momentum


Pegasystems launched Pega Agentic Process Fabric, a service that aims to orchestrate AI agents and make them more reliable with process knowledge.

The launch, outlined at PegaWorld, builds on Pegasystems' strategy of combining agentic AI with its core workflow and process platform. For instance, Pega Agentic Process Fabric is an extension of the Pega Process Fabric, which enables agents, apps, systems and data to coordinate.

Pegasystems' applications are designed for everything from business process management to customer engagement and digital process automation. Those areas all intersect with what AI agents will do and have led to strong demand for Pegasystems' Pega GenAI Blueprint. Pegasystems' growth has surged as enterprises look to leverage AI tools without additional risk on a platform they know well.

Speaking on Pegasystems' first quarter earnings call, CEO Alan Trefler said Pega GenAI Blueprint has been in "pretty much every client conversation we have." He noted that Blueprint serves as an AI agent that takes Pega's best practices and melds them with customer data and workflows to build applications.

He added:

"Enterprises want the promise and power of automation that agents could offer, but we don't think they want thousands of agents running unchecked, producing unreliable, undesirable results. They need an agent to wherever possible follow a consistent process with full transparency on how it does work out. And this is where our unique approach is, combining the power of language model-driven agents with the predictability of workflows."

Trefler noted that PegaWorld featured more than 200 live demos of the company's agentic AI meets process automation approach. The company's headline products are Pega Infinity, Pegasystems' suite, and Pega Blueprint, which uses generative AI and best practices to create application workflows and automation.

Pega Agentic Process Fabric features the following:

  • The ability to analyze its library of discoverable agents, workflows and data across systems to find the AI agent best suited for a task.
  • Workflow tools that extend Pegasystems' Pega Predictable AI Agents to combine workflows and AI reasoning. When Pega Agentic Process Fabric invokes a workflow, Pega Predictable AI Agents interact with users on guided, compliant workflows.
  • Support for Model Context Protocol (MCP) and Agent-to-Agent (A2A).
  • Security controls to prevent misuse.

Pega Agentic Process Fabric will be available in the third quarter as part of its Pega Infinity suite. The company said several parts of Fabric are available now.

Here's a look at Pega Infinity.

Pegasystems delivered strong revenue growth in its most recent quarter, driven by its Pega GenAI applications. The company reported first quarter net income of $85.42 million on revenue of $475.63 million, up 44% from a year ago. Non-GAAP earnings of $1.53 a share were well ahead of expectations.

Constellation Research analyst Liz Miller said:

"While the market has been talking about Agentic AI and overloading buyers with a laundry list of AI agents, bots and orchestrators, Pega has been focused on the underlying processes and workflows that have long been their bread and butter. Their approach to AI has been rooted in predictability and responsibility, and this new step forward with agentic AI capabilities is no different."

The company also announced the following at PegaWorld.

  • Pega Infinity App Studio gets agentic AI tools for developers to speed up design, integration, user experience and testing. Pegasystems added an enhanced AI developer agent, integrations between Pega applications and third-party systems and a new design agent for robotic process automation. Pega Infinity App Studio leverages Pega Blueprint, an AI-driven workflow designer.
  • AI agents in Pega Blueprint can ingest, analyze and convert legacy system assets to aid modernization efforts. AI agents on Pega Blueprint will also deliver process insights on legacy systems and analyze and generate information needed to build a data model that can help move apps to the cloud.
  • Systems integrators will be able to use Pega Blueprints to integrate their own intellectual property with the Pegasystems knowledge base.

HCLSoftware launches Unica+, brings AI agents to its marketing stack


HCLSoftware launched HCL Unica+, the latest version of its marketing platform featuring a bevy of AI agent features.

For HCLSoftware, HCL Unica+ represents an effort to future-proof the marketing stack. HCL said it has reimagined Unica+ to be AI-first, melding intelligence, intention and data to add context and insight into what customers want.

Raj Iyer, Executive Vice President and Portfolio Manager, HCLSoftware, said the goal of HCL Unica+ is to create a "bridge to trust" where digital experiences are personalized and leverage intent to strengthen relationships. HCLSoftware bills Unica+ as the "MarTech Platform for the Intelligence Economy."

Constellation Research analyst Liz Miller said HCL Unica+ is an effort to make customer engagements more intentional. She said:

"The age of random applications of AI, automation and data is over as customers and marketers alike have heightened expectations for engagement rich with intentionality and notable outcomes. CMOs and their teams deserve marketing technologies that meet this new era of intelligence head on with platforms that deliver context and understanding of both the customer and the business, drawing from data and insights from across the organization and across the digital and physical reality of the customer."


Features of HCL Unica+ include:

  • A set of autonomous AI agents including Segmentation Agent, which produces personalized offers, Content Optimizer Agent, which automates content for contextual engagement, and Insights Agent, which manages campaign performance.
  • MaxAI Workbench, which gives teams the ability to build custom AI models for campaigns and audience scoring. MaxAI is an assistant designed to efficiently design, refine and execute marketing strategy.
  • Digital Body Language, which is a tool to read digital behavior and gauge intent.
  • Real-time personalization, which optimizes messaging.
  • Customer One View, which unifies customer data into one profile.
  • Guardrails for privacy, compliance, safety and responsible AI.

The launch of HCL Unica+ will be followed by a webinar on Tuesday, June 3, with Iyer and Chief Marketing Officer Dario Debarbieri outlining and demonstrating the new features.

Here are a few screenshots of HCL Unica+:


Hitachi Digital Services: A deep dive on what it does, IT, OT, AI strategy


Hitachi Digital Services is looking to leverage its ability to integrate operations technology (OT) and information technology (IT) combined with industry domain knowledge, data and artificial intelligence (AI) know-how, and cloud expertise to drive growth.

In many ways, Hitachi Digital Services (HDS) is reintroducing itself to the technology industry following its late-2023 spin-off into an independent subsidiary of Hitachi. Here’s a look at HDS and everything you need to know from its first annual US Analyst & Advisor Day, May 20–21, 2025, in Frisco, Texas.

Background

HDS sits within the Japanese conglomerate Hitachi, which was founded in 1910 and has a long history in IT, R&D, and mission-critical systems across multiple industries. HDS was previously part of Hitachi Vantara before being spun off into a separate entity in November 2023. HDS focuses on cloud, data, Internet of Things (IoT), AI, and integration of OT and IT. Hitachi Vantara is focused on storage systems.

For the parent company, HDS is part of Hitachi’s Digital Systems & Services group. Toshiaki Tokunaga, CEO of Hitachi, is betting on a new management plan called Inspire 2027 that will drive growth for the conglomerate.

In April, on Hitachi’s fourth-quarter earnings call, Tokunaga said the company can leverage its ability to offer IT, OT, and products together to “demonstrate our unwavering commitment to transforming Hitachi into a digital-centric company.” Hitachi’s strategy is to remain decentralized but leverage a digital core to create a more unified company across its Energy, Mobility, Connective Industries, and Digital Systems and Services units. Digital Systems and Services accounts for 28% of Hitachi’s revenue.

Hitachi’s operating model, called Lumada, has been in place since 2016 and now has had a few versions. Lumada 3.0 aims to combine Hitachi’s domain knowledge and capabilities and turbocharge them with AI.


Understanding where HDS sits in the company highlights how it has R&D assets and capabilities across the parent to deliver cutting-edge systems. Nevertheless, HDS isn’t exactly a household name in North America, which represents a small portion of Hitachi’s overall $65B revenue.

Still, Hitachi’s technology is ubiquitous in global railways, hospitals, financial services firms, and other places. HDS CEO Roger Lvin boils down the company’s mission: “If I distill our mission: We build, integrate, and operate mission-critical applications.”

Hitachi Digital Services CEO Lvin on AI transformation, operations technology, and use cases

“These mission-critical applications, often infused with what we refer to as the Japanese quality and Japanese process things, cannot go down and have real-life implications when they do not work,” says Lvin.

 


AI strategy

HDS offers multiple services such as advisory on processes, cloud migrations, and transformation roadmaps; smart enterprise technology for manufacturing, IoT, AI, and connected digital solutions; transformation services such as consolidation, migrations, process optimization, system integration, and automation; and managed services for applications, cloud, incident management, and other areas.

But the vendor conversation today starts with the AI strategy. HDS executives briefed analysts on multiple parts of the business and key topics ranging from IoT and Industry 5.0 to ERP, but AI is the connective tissue across the company as well as parent Hitachi.

HDS’s AI strategy revolves more around architecture and bringing proofs of concept to production for mission-critical applications. The strategy also aims to weave AI into operations throughout the Hitachi conglomerate.

What sticks out about HDS’s AI strategy is that it is decidedly practical and potentially future-proof, given its emphasis on architecture. AI isn’t about buzzwords but, rather, new tools for mission-critical applications.

Prem Balasubramanian, CTO of HDS, said the company didn’t want to set up a separate AI team, because it wanted to go with a holistic approach. “We wanted to establish a team that works with every facet of this company, integrating AI into the workforce and integrating AI into what we build,” he said.

Focused on taking proofs of concept into production, HDS didn’t want to chase frameworks for technologies that would be commoditized, such as retrieval-augmented generation (RAG) or even models, Balasubramanian said. HDS’s strategy revolves around R2O2.ai, a set of prebuilt and reusable AI libraries and blueprints; responsible AI tenets; and HARC for AI, an end-to-end observability, security, and governance platform.


Hitachi Application Reliability Centers (HARC) for AI was announced in April as a service to help enterprises run AI and generative AI (GenAI) applications more reliably and efficiently. HARC for AI is designed to keep costs in check, limit performance degradation, and provide oversight of models.
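HARC for AI's actual checks aren't public, but a minimal sketch of the kind of guardrail such an observability layer might apply, with invented thresholds, looks like this:

```python
# Toy guardrail in the spirit of an AI observability layer: track per-call
# token cost and latency, and flag runs that breach budgets. The budget
# values and the p95 heuristic are invented for illustration.
from dataclasses import dataclass

@dataclass
class CallRecord:
    tokens: int
    latency_ms: float

def check_budget(calls, max_tokens=10_000, max_p95_ms=2_000.0):
    """Return a list of alerts for a batch of model calls."""
    total = sum(c.tokens for c in calls)
    # Approximate p95 latency by index into the sorted latencies.
    worst = sorted(c.latency_ms for c in calls)[int(0.95 * (len(calls) - 1))]
    alerts = []
    if total > max_tokens:
        alerts.append("token budget exceeded")   # cost control
    if worst > max_p95_ms:
        alerts.append("latency degraded")        # performance oversight
    return alerts
```

The design point is that cost and reliability checks run continuously over production traffic rather than once at deployment.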

Reliable, Responsible, Observable and Optimal AI, or R2O2.ai, was launched in late 2024 as a framework to bridge the gap between proof-of-concept projects and scalable AI deployments.

“We firmly believe that when you take an AI workload from a proof of concept and you really want to productionize it in an enterprise, you need to ensure that it’s responsible. You need to ensure it’s reliable. That the answers are correct consistently. You need to ensure your explainability and auditability, which is observability, and you need to ensure you’re spending the right amount of money on this,” Balasubramanian said.

Architecture is also critical, because it’s essential when enterprises adopt AI agents. Balasubramanian argued that agents aren’t a separate category as much as a connector to existing systems. “The bulk of agentic AI is existing systems, and you have to integrate agents and AI into them to get more revenue, acquire customers, and retain them,” said Balasubramanian. “Our view of agents is that it’s just about technology. We will use all the technology available, but we think of it in domains. One is industrial, vertical use cases. Another is applications and how they interact with the real world. Architecture is going to be the future.”

Balasubramanian said business value will be driven by delivery of applications that use AI agents seamlessly.

Chetan Gupta, a research and development leader at Hitachi, said the company’s research priorities revolve around simulation, reinforcement learning, and industrial and enterprise transformation.

“We believe AI essentially is a tool to transform enterprise operations and industrial operations, and that’s what we will enable,” said Gupta. “So the way we do things today will be different from the way things will be done tomorrow.”

The ultimate challenge is moving from proof of concept to production. For that, HDS is focused on training industrial models for specific use cases and reliability.

As for partnerships, HDS has worked with Nvidia on multiple use cases across logistics, manufacturing, energy, and transportation and is vendor-neutral.


The cloud imperative

Balasubramanian walked analysts through the company’s cloud partnerships with hyperscale giants such as Amazon Web Services (AWS), Google Cloud, and Microsoft Azure; case studies; and use cases.

“Every use case that we’ve touched upon, we’ve had cloud data in it,” said Balasubramanian, who noted that HDS is running cloud services for multiple enterprises. HDS was able to save more than $20 million a year optimizing a large pharmaceutical company’s cloud infrastructure.

HDS will run cloud infrastructure, optimize it, and often add value with custom applications, said Balasubramanian. HDS has also moved multiple state and federal government agencies onto AWS FedRAMP cloud. In addition, HDS operates in the background for vendors servicing government customers.

Balasubramanian’s big takeaway is that cloud migrations are continuing. “We’ve got what we call a sprint to clouds. It’s essentially a way for us to quickly assess and help migrate to the cloud. We use some accelerators and some products that we work with,” said Balasubramanian. However, there’s less lift-and-shift and more modernization projects to move applications to the cloud to take advantage of AI agents.


Chris Ansert, executive manager of North American Quality Systems and Technology Platforms at Toyota North America, walked through the automobile manufacturer’s Quality1 program, which is a platform to ingest data about product quality issues and resolve them.

The project, which uses HDS services and AWS cloud infrastructure to modernize legacy systems, connects R&D, production engineering, purchasing, production, sales, and service functions in North America to create a feedback loop for quality issues.

Industry use cases

HDS’s secret sauce is integrating OT, IT, and industry use cases. With its domain expertise, HDS can apply AI and the latest technologies to manufacturing, supply chains, and transportation networks.

Ganesh Bukka, global head of HDS’s Industry 4.0 business, outlined the company’s thinking on industry use cases and how they failed to scale. Bukka’s talk revolved around whether Industry 4.0 was a brilliant failure that set up the next evolution of technology.

“For the last decade or so, we talked about Industry 4.0 and every manufacturer was going to revolutionize assets and manufacturing. And this is not just manufacturing. It could be anything in the asset-heavy or even some cases in the asset-light industry. A lot of these initiatives did not scale beyond the pilots,” said Bukka.

Why? Siloed processes proliferated due to digital initiatives, interoperability was challenging, and data and AI skills weren’t developed. Security and culture were other big issues, added Bukka.


“Industry 4.0 was all about technology, but the problem is that IT teams built OT and refused to acknowledge what OT teams wanted. IT built great dashboards, which could give you intelligence, but it could not act autonomously from that intelligence and put it into actions,” explained Bukka.

Bukka argued that HDS is in a unique position for Industry 5.0 applications, given that it’s a system integrator with expertise in melding OT, IT, and complex engineering.

For the next generation of industrial applications, Bukka said there are five success factors:

  • Human collaboration. Human/AI collaboration will be an important element, given that so many AI agents will come online.
  • Industrial AI. Hyperpersonalization driven by AI will be critical for creating digital operators.
  • Industrial edge computing.
  • Industrial metaverse via digital twins, data, and AI.
  • Sustainability.

Hitachi launched a digital factory in Hagerstown, Maryland, to build railcars. The institutional and process knowledge from other plants was incorporated into GenAI models, robotic models, and other models. The factory, built via a collaboration between Hitachi Rail, HDS, and Hitachi R&D, will be a showcase for the latest manufacturing technology and innovation.


Customers

HDS held multiple customer panels over the two days as well as breakout sessions for automotive, IT operations, cybersecurity, and enterprise applications. Some high-level takeaways:

  • Penske Transportation Solutions outlined a project with HDS to create a proactive diagnostics model that predicts vehicle failures. The project reduced downtime for a fleet of more than 400,000 vehicles.
  • Toyota Motor North America cited a collaboration on building a quality knowledge center using vector databases and AI models. The platform supports technical assistance for difficult repairs, reduced cycle time, and improved customer service.
  • HDS and Verizon Business have a collaboration on healthcare IT and applications, with plans to move across all industries.
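The quality knowledge center in the Toyota item above pairs vector databases with AI models to surface relevant repair knowledge. As a rough illustration of the retrieval piece only (this is not Toyota's or HDS's actual implementation; the repair notes and four-dimensional embeddings below are invented for the sketch, since real systems use embedding models and dedicated vector stores), a similarity lookup over repair records might look like:

```python
import math

# Invented example data: in production these vectors would come from an
# embedding model and live in a vector database, not a Python dict.
REPAIR_NOTES = {
    "Replace worn brake pads and resurface rotors": [0.9, 0.1, 0.0, 0.2],
    "Diagnose intermittent infotainment reboot":    [0.1, 0.8, 0.3, 0.0],
    "Fix coolant leak at water pump gasket":        [0.2, 0.1, 0.9, 0.1],
}

def cosine(a, b):
    # Cosine similarity: dot product of the vectors over the product of norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_match(query_vec):
    # Rank stored repair notes by similarity to the query embedding
    # and return the closest one.
    return max(REPAIR_NOTES, key=lambda note: cosine(query_vec, REPAIR_NOTES[note]))

# A query embedding close to the brake-repair note retrieves that record.
print(top_match([0.85, 0.2, 0.1, 0.1]))
```

The same nearest-neighbor idea, scaled up with a real embedding model and a vector store, is what lets a technician's free-text question pull back the most relevant past repair cases.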

Here’s a look at HDS’s reach across the automotive industry, followed by healthcare/life sciences.

[Chart: HDS’s reach across the automotive industry]

[Chart: HDS’s reach across healthcare/life sciences]

Takeaways

  • HDS’s expertise in mission-critical applications and IT/OT convergence could give the company a competitive edge as AI evolves from horizontal use cases to vertical ones.
  • As AI strategy and implementation become a boardroom issue in manufacturing, healthcare, life sciences, and energy, HDS’s approach could be valuable.
  • What remains to be seen is whether HDS can leverage its conglomerate roots seamlessly while remaining focused on its core mission to bring unique value to enterprises.

JPMorgan Chase’s IT, AI bets: Where the returns are


For JPMorgan Chase, the investment in technology and AI will never reach the finish line. Think of transformation as an ongoing project.

JPMorgan Chase has 84 million US customers and $4 trillion in assets under management. The company's approach to data, modernization and artificial intelligence has been worth watching due to the bank's scale, investment in technology and approach to AI.

Recent history:

Jamie Dimon, CEO of JPMorgan Chase, said during the bank’s Investor Day that the transformation work is never finished--and neither is the spending on technology. "It's table stakes. It will be for the rest of eternity. So our tech spend, I think, is, call it, 10% of revenues which is less than most other companies by the way," said Dimon. "In my experience, I think people make a mistake like somehow you're in one transformation and when you get through it, you're done. I've been doing this for a while and I've been through transformation after transformation after transformation, and we're learning. The hardest part is getting data into the form where it can be used properly and where these things belong."


As a result, JPMorgan Chase is spreading its tech bets. "We're building our own cloud-based data centers. We have our virtual servers. We are using all these other folks. We're going to be quite cautious on software-as-a-service, how we deal with cloud providers. I don't mind doing anything ourselves," said Dimon. "I think the mindset should be that whenever management teams meet, you're talking about what you need to do in technology writ large to do a good job for your client. We talk about AI all the time at every different level."

CFO Jeremy Barnum said JPMorgan Chase will spend $18 billion on technology in 2025, up $1 billion from 2024. Barnum did say that the company's modernization investment has peaked.

"We are now probably past the point of peak modernization spend, resulting in a tailwind this year that is funding some of our ongoing investments in products, platforms and futures. And we do continue to see volume growth across the company drive some increased hardware and infrastructure expense. This in turn is partially offset by efficiencies," he said. "The majority is spent on products, platforms and features."

Barnum said retiring technical debt by moving to the cloud and modern infrastructure sets up the company's AI strategy. JPMorgan Chase has about 65% of its workloads on the public or private cloud, up from 50% a year ago. "If we include the applications that run largely on virtual servers, that number increases to 80%. In addition, we have almost completed the application migrations for our largest legacy data centers and we are in the process of dismantling the physical infrastructure in those sites," said Barnum. "This progress in our modernization efforts continues to deliver significant engineering efficiencies, which we see through ongoing improvement in our speed and agility metrics, but we can't afford to fall behind."

Specifically, JPMorgan Chase can't fall behind in AI, and the cloud is making it easier to deliver features faster. Here's a look at where JPMorgan Chase is getting the most bang for its AI dollar.

AI coding assistance for software engineers. Barnum said the accelerated adoption of AI coding is promising. "On a personal note here, I'll say that I've recently been indulging in what I've come to learn is known as “vibe coding,” a little bit, and it's actually pretty amazing," he said. "And from what certain of my colleagues tell me who are actually trained, professional computer scientists, it actually helps them quite a bit, too, with their efficiency. It's not just the amateurs who are helped by these tools. It's amazing stuff and we have high hopes for the efficiency gains we might get."

AI for operational efficiencies. Barnum said that a big AI use case is in the call center where algorithms can help agents anticipate and respond to questions faster. 

Democratized efficiency. JPMorgan Chase has a generative AI platform that's model agnostic called LLM Suite. More than 200,000 employees globally have access and are saving several hours per week by spending less time on lower-value tasks. "We are starting to see a number of “citizen developer” use cases go into production. While we've made substantial progress over the last decade, we are still in the early stages of our AI journey. We are focused on modernizing data, investing in scalable platforms and being at the forefront of innovation as technology evolves, positioning the company for sustained future success," said Barnum.

Digital engagement. Marianne Lake, CEO of Consumer & Community Banking at JPMorgan Chase, said the unit has been investing in tech, data, and AI to drive customer experience and productivity. "We estimate spend of about $9 billion on tech, product and design this year, moderating to a 6% growth rate year-on-year. $7.4 billion of this is in tech, about 10% of revenue," she said.

Lake added that JPMorgan Chase is also deploying AI to boost card servicing. The company has boosted its product velocity with AI. "We have increased code deployments by more than 70% over the last two years and improved the quality of product delivery over the same period with a 20% reduction in work being replanned. Our investments this year have more than two times return on investment and continue to pay back within five years, and our investments in AI/ML delivered a 35% increase in value last year," said Lake.

Here's a look at Consumer & Community Banking's tech spending plans.

Umar Farooq, Co-Head of J.P. Morgan Global Payments, said the unit is "laser focused on providing the absolute best digital experience to every single client segment." He added: "We are really focused on building digital experiences that are targeted to specific segments like technology startups."

Process automation. Lake said improvements in operations have kept expenses flat in her unit over the last five years. Accounts per serviced ops headcount are up 25% due to improving self-service options for companies. Servicing costs are down nearly 30% per account and processing costs are down 15%. "While AI has definitely contributed here, a lot of this is good old-fashioned process automation and organizational efficiency," said Lake.

Fraud detection and deterrence. Lake noted that JPMorgan Chase is seeing a 12% compound annual growth rate in attacks, but the company has held the cost of fraud flat due to AI tools.

Traditional machine learning and AI. Lake said the bank expects big productivity gains over the next five years, but noted that traditional models are a big reason. She said:

"We have a very rich and valuable tapestry of data. And despite the step change in productivity we expect from new AI capabilities over the next five years, we have been delivering significant value even with more traditional models and the value we're delivering is growing exponentially. I point that out for two reasons; one is that, not every opportunity requires Gen AI to deliver it, and we are “all systems go” already; and second, we are well on our way, modernizing our data to make it more efficiently consumable and machine readable."

Data improvements. Lake said that moving to the cloud has improved storage and compute efficiency, but the bank is spending on improving data management. "Our data needs to be in our target platforms. We're about halfway through that journey, and making data truly fit-for-purpose will include a subset that will need to be streamed real-time, and we've made significant progress here, in particular, for servicing and personalization. There's still a way to go, but we are delivering significant value," said Lake.

Here's a look at the data flywheel Lake highlighted.

Doug Petno, Co-CEO of Commercial & Investment Bank at JPMorgan Chase, said the unit has more than 175 AI use cases in production and is looking to leverage its data to feed models.

Farooq added that his unit is leveraging its data assets. "We have been building and have completed building a cloud-native data infrastructure and are utilizing AI and machine learning models for everything, from prospect qualification to transaction screening and operations," said Farooq. "The operational efficiencies our data platform has allowed us to capture with AI models are truly impressive. In the last few years, our transaction volumes have gone up by more than 50%. At the same time, our AI models have allowed us to cut manual exceptions by more than 50%, delivering significant operating leverage."

Trading. Mary Callahan Erdoes, CEO of Asset & Wealth Management at JPMorgan Chase, said the company has been "fortifying and using AI on our trading desks for the past eight years." She added that JPMorgan Chase trades about $260 billion in volume daily and hit $500 billion in early April. "AI is not just a tool, it's reimagining workflows and it's changing the loading capacities for thousands of people on the frontline and in the back," said Erdoes.

She pointed to Smart Monitor, a tool that uses AI to find stocks, absorbing call reports, stock moves and ratios to highlight trades. Connect Coach is another feature that anticipates the next best action for trades.


Dell Technologies continues to ride AI infrastructure wave with strong Q1


Dell Technologies reported strong first quarter results and provided a solid outlook due to strong demand for AI infrastructure.

The company reported first quarter earnings of $1.37 a share on revenue of $23.4 billion, up 5% from a year ago. Non-GAAP earnings were $1.55 a share.

Dell was expected to report first quarter earnings of $1.69 a share on revenue of $23.2 billion.

CFO Yvonne McGill noted that Dell's non-GAAP earnings grew three times faster than revenue. Chief Operating Officer Jeff Clarke said the company saw strong AI demand. "We're experiencing unprecedented demand for our AI-optimized servers. We generated $12.1 billion in AI orders this quarter alone, surpassing the entirety of shipments in all of FY25 and leaving us with $14.4 billion in backlog," he said.

Indeed, AI infrastructure carried the quarter for Dell. The infrastructure solutions group (ISG) delivered operating income of $1 billion on revenue of $10.3 billion, up 12% from a year ago. Servers and networking revenue was a record $6.3 billion in the first quarter, up 16% from a year ago.

Clarke said:

"We experienced exceptionally strong demand for AI-optimized servers, building on the momentum discussed in February and further demonstrating that our differentiation is winning in the marketplace. Our pipeline continued to grow sequentially across both Tier 2 CSPs and private and public enterprise customers - and remains multiples of our backlog. Enterprise AI customers grew again sequentially with good representation across key industry verticals."

Clarke did note that demand and shipments are likely to be lumpy for the foreseeable future.

For the PC unit, also known as the client solutions group (CSG), revenue in the first quarter was $12.5 billion, up 5% from a year ago, with operating income of $653 million. Commercial client revenue was $11 billion, up 9% and consumer revenue was down 19% to $1.5 billion. Dell is primarily focused on business PCs.

As for the outlook, Dell projected second quarter revenue of between $28.5 billion and $29.5 billion, up 16% from a year ago. Non-GAAP earnings will be $2.25 a share. For fiscal 2026, Dell projected non-GAAP earnings of $9.40 a share on revenue of $101 billion to $105 billion.

McGill hinted that the outlook could turn out to be conservative but the economy is the big unknown. "We’re optimistic on our portfolio and our ability to execute - however, we want to be thoughtful of how customers think through their IT spend relative to the macro environment," she said.

Holger Mueller, an analyst at Constellation Research, said:

“Dell had a good quarter, benefitting from the inference demand of AI. That's easy to see as servers and networking were up respectable 16%, storage barely beat inflation at 6%. If Dell customers were training AI models locally we should see more storage sales. Or Dell is not participating on the increased demand for data lakehouses. We will know more in the next quarter.”


C3.ai bets demonstration licenses deliver future growth


C3.ai is betting that it can land more enterprise customers with demo licenses issued to partners that highlight its AI applications and ultimately turn into long-term deals.

The company, which reported better than expected fourth quarter results, derived $33.8 million from demonstration licenses of C3 AI applications out of total revenue of $108.7 million, up 26% from a year ago. C3.ai reported a net loss of 60 cents a share and an adjusted net loss of 16 cents a share. Annual revenue was $389.1 million, up 25% from a year ago, with a loss of $2.24 a share (41 cents a share non-GAAP).

CFO Hitesh Lath noted that C3.ai sells those licenses to distribution partners to demonstrate the software possibilities to customers and "accelerate AI adoption."

C3.ai has invested heavily in partnerships with the top cloud providers--Amazon Web Services, Microsoft Azure and Google Cloud--as well as systems integrators. Professional services revenue was $21.4 million, and $17 million of that sum was Prioritized Engineering Services, which "are undertaken when a customer requests that we accelerate the design, development and delivery of software features and functions that are planned in our future product road map," said Lath.

These demonstration licenses are a twist on what Palantir does with its bootcamps. The idea is to get enterprises to try the AI applications, see the value and then accelerate adoption. Going forward, Lath noted that Prioritized Engineering Services and subscriptions will be about 90% of revenue.

CEO Tom Siebel said the demo licenses were an investment into scaling C3.ai's applications. Siebel, who said he is back in the fold after health issues, said on the company's earnings call that C3.ai is using hyperscale cloud providers as a sales force multiplier.

He said:

"We have tens of thousands of salespeople at Azure. I believe tens of thousands of salespeople at AWS. Thousands of salespeople at GCP. They have lots of products to sell in their bag, and it's very confusing, so we need to make it simple. So in order to make it simple for them, we invested in building demo applications that run and take advantage of the full utility of the Azure stack or the AWS stack or the GCP stack. So these people in Frankfurt and Munich and Detroit and Madrid and Moline can go into their customer and give a demo of a complex application to customers show them what the economic benefit is of supply chain optimization, of demand forecasting, of predictive maintenance."

"We've done good work at arming our partners with demonstration licenses. Think about that as an investment in future growth."

C3.ai is also providing demonstration licenses to customers. "We sold demonstration licenses to our customers. Why would we do that? Because Dow Chemical or Shell or Coke or Cargill or whoever or the United States Air Force, whoever it may be, they have a hugely successful application and they want to encourage others to use these applications. For example, the Air Force has 22 platforms today, and they want to deploy the application across 44 platforms," said Siebel.

For C3.ai, these demo licenses accelerate adoption, ease the change management and then convert to regular subscriptions. The bet for Siebel is simple: Turn those demo licenses into joint sales calls with much larger cloud vendors and then do deals quickly because enterprises already have master agreements in place. "It takes two to five months out of a contract negotiation process and accelerates the sales cycle," said Siebel.

Other notable items from Siebel on the C3.ai earnings call:

  • "We have, depending on how you count, someplace between 20 and 100 agentic AI solutions out there in production, in the hands of happy customers. And if we were to spin that business out, just that business out, and take it to a Andreessen Horowitz or a Bessemer or Nvidia or whatever it is, that business alone would be valued at multiples of where C3 AI trades today and we all know that's a true statement."
  • "One of the most notable achievements in Q4 was the renewal and expansion of our strategic partnership with Baker Hughes. This alliance, which began in 2019, has been a cornerstone of our success in the oil and gas sector, generating over $0.5 billion in revenue from this vertical and the chemical markets. The renewed agreement underscores the proven value we deliver through our joint efforts, enhancing efficiency, safety, reliability, and sustainability across upstream, midstream, and downstream operations."
  • "I did get slowed down for a little bit. There's no question about it. And I had to work from home for a little while and take it easy and recover, but I will catch a red eye to Washington, D.C. tonight. I will be in Washington, D.C. again for three days. I think 10 days from now after attending a wedding in Cabo. So, just when you thought it was safe, I'm back."