Results

Google Cloud CEO Kurian on agentic AI, DeepSeek, solving big problems

Google Cloud CEO Thomas Kurian said AI agent interoperability is critical to linking processes and workflows, outlined the company's sovereign AI strategy and said the company is building an agentic AI ecosystem.

Kurian, speaking at an ask-me-anything session with analysts at the end of day 1 of Google Cloud Next, said the following.

Scaling the organization while looking around corners. "Our strength as an organization has always come from understanding deeply what customer problems we're trying to solve," said Kurian. "We are very careful, given the number of problems out there, to focus on a very specific set of problems. We focus on a few important ones."

Building the go-to-market ground game. Historically, Google has had great technology that was difficult to buy and deploy. That vibe has changed as Google Cloud has built out its ecosystem, and Kurian said Google Cloud can now appeal to multiple CxOs. Kurian said: "If you look at AI, much of the buy decision is not in IT. It's outside it. How you engage not just in IT, but outside, and how you can build a solution portfolio that our team will sort of take to market is constantly evolving."

Process flow and accuracy. Kurian was asked whether agents can succeed given hallucinations and enterprises' reluctance to hand off processes to AI. "We've taught people to decompose. Take a process and break it into a set of sub-steps. We need people to go through each step and not design one agent. You need to have that ability to view each step and verify the accuracy of that step," said Kurian. "Enterprises want predictability."
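Kurian's decomposition point maps to a simple pattern: break a process into sub-steps, measure the accuracy of each step, and route low-confidence steps to a human rather than trusting one monolithic agent. The sketch below is a hypothetical illustration of that idea; the Step structure, thresholds and invoice example are invented for clarity and are not Google Cloud code.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Step:
    """One sub-step of a decomposed process, with its own accuracy check."""
    name: str
    run: Callable[[dict], dict]      # transforms the working context
    check: Callable[[dict], float]   # returns an accuracy/confidence score between 0 and 1
    threshold: float = 0.95          # minimum acceptable score before moving on

def run_process(steps: list[Step], context: dict) -> dict:
    """Execute each sub-step and stop for human review when a step misses its accuracy bar."""
    for step in steps:
        context = step.run(context)
        score = step.check(context)
        print(f"{step.name}: accuracy={score:.2f}")
        if score < step.threshold:
            # Route to a human reviewer instead of letting errors compound downstream.
            raise RuntimeError(f"Step '{step.name}' below threshold; route to human review")
    return context

# Hypothetical invoice-handling flow: extraction and validation are separate,
# individually measured steps rather than one opaque agent.
steps = [
    Step("extract_fields", run=lambda c: {**c, "total": 120.0}, check=lambda c: 0.98),
    Step("validate_po_match", run=lambda c: {**c, "po_matched": True},
         check=lambda c: 1.0 if c.get("po_matched") else 0.0),
]
print(run_process(steps, {"invoice_id": "INV-001"}))
```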

Starting with agentic AI. Kurian said companies just starting out need to begin with the business problem they want to solve and measure the result. The quality of the data feeding the AI agent is also critical, along with industry considerations, workflow design and security. Kurian said: "There's a lot of specific guidance whenever we talk to customers. The best proof point is always to look at others in the industry and what they may have done so that you can gauge the timescale you think you need for the organizational change management."

Why Kurian seems happier. First, Chappell Roan called him back (even if she isn't performing at Next), a call that was noted during the keynote. Kurian added: "I've always been proud of our team. Many people thought we would not succeed in it. Many people thought that we would never be successful outside of the consumer domain. I'm proud of the team and the resilience we've shown."

Solving big showcase problems. Kurian said "The Wizard of Oz" project is an example of how Google Cloud likes to participate in big projects but let the customers tell the story. "I don't think people realize Dorothy's shoes only show up in eight frames, so tuning a model to improve it takes a lot of work. We chose to work on that because if you can do this here, you can do pretty much anything else," said Kurian. Google Cloud is also working on similarly tough projects with hedge funds and in healthcare, life sciences and manufacturing. "We like to change the way those industries work," he said.

DeepSeek. Kurian: "The industry tends to rotate every few months because of a bunch of articles that were written that may not be accurate. They've done some really good things. If you look at the cost of our models, I think people misinterpret why we didn't say anything. We're very confident that the actual training cost for a model from us is comparable to anybody out there."

"We always say there's a lot of debate on these topics. And if you actually sit down with the DeepMind guys who are working on it, I think you see the reality of some of the numbers. That's why, when we don't say something, it's not that we are surprised and it's obvious what they did. Credit to them for doing it."

 


Walmart's tech bets improve resilience as it reaffirms Q1 outlook

Walmart reaffirmed its first quarter sales growth outlook, said its e-commerce business is on track to be profitable, and highlighted technology initiatives that have made the company more resilient in a volatile economy.

At Walmart's investor meeting, CEO Doug McMillon said the company's investments in customer experience, technology and higher-margin businesses have enabled it to press its advantages. "While in the short term we are not immune to some of the effects businesses face in today’s operating environment, we are uniquely positioned to play offense," said McMillon.

Walmart said it expects first quarter sales to be in line with its 3% to 4% growth outlook. In a presentation to analysts, Walmart executives noted that the company is technology-powered and is leveraging AI on multiple fronts to be more efficient.


Some examples of how technology investments have paid off include:

  • Walmart has integrated physical and digital experiences to gain share and reach consumers via multiple channels.
  • Digital businesses such as Walmart's advertising and data units have improved margins.


  • CFO John David Rainey said the company has leveraged data and automation in its supply chain to manage inventory better and cut costs. In the US, half of e-commerce fulfillment center volume flows through next-gen fulfillment centers and more than half of stores receive freight from automation.
  • Digital commerce best practices from the US are being deployed internationally and vice versa.
  • Sam’s Club U.S. President and CEO Chris Nicholas said 40% of total transactions are digital, including Scan & Go and online sales.



An (Alternative) Method to the Tariff Madness

The new tariff regime instituted by the United States over the past couple of weeks is creating uncertainty in the executive world. Questions about how it affects each industry abound, and answers are not plentiful.

Two camps have clearly emerged, with little room in between for indecision: pro-tariff and anti-tariff. My friend and colleague Ray Wang wrote a convincing case for the tariffs and the downstream effects they could have on the American economy. Only time will tell whether the approach he proposes plays out, and I cannot say with certainty it will not.

I do have an alternative model, one that continues what has worked well over the past 65+ years. There are four reasons why continuing the existing model is a better bet:

  1. Globalization has not failed (and neither has the WTO). It is not a movement intended to level the playing field for everyone involved; it leverages what every country does best and promotes and rewards those that focus on that.

No single country in the world can ever do everything great: a rising middle class does not want to sit in a hot, dirty, dangerous coal mine or manufacturing facility to earn the same or less than it could make in a service job. They prefer to code a social-media app while sipping matcha smoothies and playing foosball.

This is an intrinsic part of the evolution of the United States over the past few decades. Trade imbalances are not only acceptable but also expected for the American economy. Countries that can manufacture better, cheaper and faster than we can should do so – and we will buy their output because our time is better spent generating better profits via service solutions.

  2. The United States is no longer a manufacturing economy. One hundred years ago, manufacturing was the key to American power in the world. Starting in the early 1900s, our innovations in manufacturing led the world to better and more precise tools and processes, culminating in the hegemony we displayed during WWII with a nearly impossible output of war machines and supplies. Post-WWII, manufacturing led the growth of the American economy and the creation of a middle class that remains to this day a model for developing countries to follow.

In the 1980s, this middle class shifted toward consumerism at the same time the manufacturing class was beginning to decay in quality and ability. New manufacturing prowess emerged (mostly) in Asian countries, with Taiwan and Japan leading the way. We conceded the role of leading the world in manufacturing and shifted toward a service economy that produces more than 100 times the revenue at 50 times better margins.

This is our new reality, and the reason we are seen as the leader in technology, software, services, and even the distribution of American non-durable goods and services around the world. Our economic power today is predicated on outsourcing the low-level, thin-margin, exhausting and uninteresting jobs to places better suited for them (developing countries that are where we were 100 years ago) in exchange for a focus on better jobs with better revenues.

  3. Consumer sentiment is the basis for the new economy. While I admit to a nostalgic bent, it is not for the foregone days when we could manufacture cars and refrigerators better than anyone. The interesting part of the economic shift over the last 40-50 years is that it is not unique to the United States. Consumerism is a global trend, and one that newly emerging middle classes in places like Nigeria, India, China, and other up-and-coming countries quickly embraced as they grew. I am very happy, as are most Americans, with the ability to buy a new, bigger, better TV for a low price every few years. This shift to consumerism must be fueled by cheaper manufacturing.

Any economist will tell you that consumer sentiment, and the accompanying spending in a consumerist model, is what powers economies today. A service economy produces better margins, more disposable income, and better living conditions, which in turn require an environment where those gains can be spent toward a better life. That is the power of the middle class.

  4. History has shown that tariffs do not work. The US tariff regime of the 1930s, implemented to help pull the US out of the Great Depression, took almost 80 years to fully unravel. It also did not have the intended effect: while it worked for a while, long-term growth was driven by a service economy and the outsourcing of manufacturing, not the return of manufacturing to replace imports.

Economic models shifted dramatically in a world where globalization drives economic growth, and lifts people out of poverty, around the world. The pandemic left us without a “normal” model of operation, but a return to nostalgic models that were proven wrong before is not the answer. The trade imbalance in durable goods manufacturing is replaced by a trade surplus in services that is orders of magnitude bigger than any potential increase in revenues from manufacturing. With better margins.

Trying to recreate the infrastructure we need to become a manufacturing nation again is prohibitive in cost, time, and resources. We won’t find the workers willing to take the jobs at the prevailing wages. We won’t find the machinery in time to replace what is intended to be replaced. We won’t find the know-how. We can do it, but it is not the way – not by following a model that has been proven not to work before.

As I said, only time will tell which model is proven right.  I don’t believe tariffs are the way, nor do most economists and experts. 

Using tariffs to bring trading partners to the negotiating table? Well, it seems to have worked – but as always, there are better methods grounded in diplomacy and communication rather than exerting muscle.

No?


The AI Race, CRM Composability, AI Security, Google Cloud Next | ConstellationTV Episode 102

ConstellationTV Episode 102 just dropped! 📺 Co-hosts Larry Dignan and Martin Schneider kick off with #tech news, including the AI race 🏎️ between Salesforce and ServiceNow and economic uncertainty around #tech spending. 

Next, Martin intros his latest #research on AI and composability in #CRMs, covering the benefits of composable apps and how they enable businesses to make cost-effective decisions and reduce overall costs. 📈 

Constellation analyst Chirag Mehta then unpacks the challenges of securing #AI workloads and understanding AI attack vectors.🔒 His new report on AI security covers mandatory compliance, frameworks for securing AI systems, and the role of marketplaces in supporting AI solutions. 

Larry concludes with his top 5️⃣ takeaways from Google Cloud Next 2025, including Google #Cloud's AI hypercomputer, #agenticAI developments and horizontal agent integration.

00:00 - Meet the Hosts
00:20 - Enterprise Technology News
09:33 - AI and Composability in CRM
16:43 - AI Security and Marketplace Shifts
21:07 - Google Cloud Next 2025 Highlights

Tune in for a new ConstellationTV episode every two weeks! Get the latest news, research updates, and case study interviews during the fastest 30 minutes in enterprise technology!

Watch the episode on ConstellationTV: https://www.youtube.com/watch?v=1hZLRKnOFDo

Google Cloud CTO Grannis on the confluence of scale, multimodal AI, agents

Google Cloud CTO Will Grannis said multimodal models including ones learning how to smell, AI agents at scale that will break down enterprise silos and continually optimized compute will usher in new experiences.

Grannis, speaking at an analyst summit at Google Cloud Next, said the company is pacing disruption and transformation based on a confluence of technologies: cloud, generative and agentic AI, and multimodal model advances with reasoning.

"A year ago, customers talked about getting started with generative AI. This year is about scale and agents," said Grannis. "The work we do is an early signal on where technology is headed. We don't do demos, we do engineering proofs. We are seeing a confluence of events."

More from Google Cloud Next:

Google, through Google Cloud, Google DeepMind and Google Labs, is leaving breadcrumbs on where the enterprise is headed through its research. Here's a look at where Google Cloud is placing its bets and why enterprises should pay attention as they cook up new experiences and use cases.

Multimodality. Grannis said models will continue to improve and are quickly becoming "able to understand and control things in the physical world." "Enterprises are dreaming up new scenarios and customer experiences that two to four years ago were impossible," said Grannis, who noted Google is focusing on bleeding-edge multimodal models including Veo 2, Lyria and Chirp.

Scale. Grannis said Google Cloud "thinks about scale up front and tries to bake it in at the beginning." By engineering well at the start, new use cases for something like AI agents can emerge faster because you won't have to retrofit the infrastructure when something scales.

"Agents are in early stages right now and starting to be proven out," said Grannis. "MCP works well but AI agents need to scale out with authentication, stability, reliability."

Google Cloud's Agent2Agent protocol is a step in scaling agents. "Interoperability is super important. We're doing what we did with Kubernetes and TensorFlow--drive the industry forward with partners so a single agent can turn into clouds of connected agents," said Grannis.

What does agentic AI at scale look like? "Think of a procurement agent that can go and negotiate contracts for a customer and do it well if game theory is baked in," said Grannis. Today, that procurement agent will require instrumentation and multiple parties. "Bots are actually customers and partners trying to interact," explained Grannis. "Rethink how to create digital experiences."

Optimization. Grannis said Google Cloud over time is continually refining and improving. For instance, Google is on its 7th version of TPUs in eight years. "We're still getting exponential increases every year," he said.

Grannis said his most underrated announcement at Google Cloud Next 2025 is Cloud WAN, a service that makes Google's high-speed, low-latency network available to enterprises. "Cloud WAN is a big deal because it allows you to transit over the high-performance network of Google and it's consistent over geographies. It's one of the hardest optimizations to run," said Grannis, noting that Google Distributed Cloud with Google AI and Nvidia baked in is critical too.

Silo busting. Grannis said the horizontal approach to AI agents is critical for breaking down silos--data and work streams. "Agentspace is profound because it can create agent AI for everyone. Employees are getting a boost in creativity and usage," said Grannis, who noted that there is a lot of creativity that is pent up and can be freed as agents break down corporate silos.

Already, Google Cloud cited multiple Agentspace integrations and customer use cases. Culturally, Grannis said this silo busting may be the biggest disruption.

 


Lloyds Banking Group bets on Google Cloud for AI-driven transformation

Lloyds Banking Group is building its next-generation machine learning platform on Google Cloud's Vertex AI in a move that will replace legacy systems.

Ranil Boteju, Chief Data and Analytics Officer at Lloyds Banking Group, said the company's previous machine learning and data science platform was on-premises and approaching a decade of use. "We realized we needed to modernize and wanted to move to the public cloud and Vertex AI," said Boteju. Lloyds completed the migration a year ago.

Boteju said the bank's hundreds of data scientists are building models at scale. According to Boteju, the move to Vertex AI is "quite a significant step up in capability," but is also enabling Lloyds to "pursue our agentic AI aspirations as well."

More from Google Cloud Next 2025

According to Boteju, Lloyds is leveraging Vertex AI to accomplish the following:

  • "Enable the whole bank with AI," he said.
  • Build the bank's GenAI Workbench on Vertex AI.
  • Use multiple models via Google Cloud's Model Garden, including Gemini and open-source models, based on their ability to handle specific use cases. "We can select the right LLM for the right task," he said. (A minimal sketch of this routing idea follows this list.)
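Boteju's "right LLM for the right task" approach boils down to a routing layer in front of the Model Garden. Below is a minimal sketch using the Vertex AI SDK; the project ID, region, model names and task-to-model mapping are assumptions for illustration, not Lloyds' actual configuration.

```python
# Minimal sketch of "right LLM for the right task" routing on Vertex AI.
# Project ID, region, model names and the task mapping are illustrative assumptions.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="europe-west2")

# Route high-volume, low-complexity tasks to a fast model and
# reasoning-heavy tasks to a stronger one.
TASK_MODELS = {
    "summarize_call_notes": "gemini-1.5-flash",
    "draft_credit_memo": "gemini-1.5-pro",
}

def run_task(task: str, prompt: str) -> str:
    model = GenerativeModel(TASK_MODELS[task])
    return model.generate_content(prompt).text

print(run_task("summarize_call_notes", "Summarize: customer asked about overdraft fees..."))
```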

Ultimately, Boteju said the plan is to leverage agentic AI. "For the last nine months, we've had a heavy focus on agentic AI capabilities," said Boteju. "There are many use cases in financial services and we think there are at least 50. What we're trying to build is a robust agentic AI architecture that we can deploy against multiple use cases from customer advice to software engineering to claims or underwriting."

Lloyds AI plans part of broader transformation

Lloyds's use of Vertex AI is part of a broader transformation that's focused on driving efficiencies and experiences throughout the company.

Speaking at a Morgan Stanley conference, Lloyds CEO Charlie Nunn said the company hit its targets and was feeling good about its position in the UK. Nunn said the company committed to driving market share and growth, operating leverage and change via technology.

"The most important part is what we call change, so grow-focus-change. And the change is really about building new capabilities, bringing in new technologies, and then driving underlying business and market share growth. We did that across every single part of the bank, and that's the stuff that gets me excited," said Nunn.

Lloyds said it started the first phase of its strategic transformation in 2022 with the aim of returning to growth and adding capabilities. Nunn said the bank focused on people, technology and data and specifically hired engineering talent.

"Our technology strategy over the next two years will have two distinct elements. Ongoing modernization and rationalization of our estate will continue to deliver savings and improve efficiency, providing the capacity for investment in new technologies," said Nunn, who added that the bank had more than 800 AI models live at the end of 2024 and "already launched a significant number of genAI use cases across the group."

From concept to production

Boteju said Lloyds Bank was viewing agentic AI as a concept that was just emerging as recently as last summer. "We went through a process with the Google team on a 12 week sprint to have an MVP (minimum viable product)," he said. "We started with a first step toward exposing an intelligent agent directly to customers to provide financial tips and guidance."

That use case was chosen because it carried less risk. Lloyds Bank then refined the agent with more advice elements that can be used for other use cases, Boteju added.

Boteju said the advice assistant MVP was complete at the end of 2024 and the plan is to have a production agent in place by August or September.

Now that Lloyds Bank engineers have seen what's possible, agentic AI capabilities are being used elsewhere.

"For example, we have automated our process to build data products. There's a lot of automation and our engineers have started building on top of that with an agentic approach to make the process more intuitive and easy to use," said Boteju. "Agentic AI has gone from literally PowerPoint slides 12 months ago to MVPs to engineers building agent systems."

Consistency matters

Lloyds Bank's experience with developing agent systems highlights how consistency matters.

Boteju said "it's really important that people use and deploy agents in a consistent way." He laid out the strategy to ensure consistency:

  • Lloyds Bank built a workbench that is consistent across the bank.
  • Give everyone the same set of capabilities and leverage Google Cloud advances to accelerate processes.
  • Deploy and operate guardrails centrally across the enterprise.
  • Create a center of excellence.
  • Invest in skilled users and AI literacy. "We've invested very heavily in AI literacy and data culture targeted at both technicians and business leaders," said Boteju. "On the technician side there's a lot of upskilling about how to use these capabilities. On the business side, the whole point is to reimagine your business."

The workbench needs to facilitate multiple models that are exposed to teams across the bank.

"We focused everyone on building things that scale to ensure reuse and that we focus on the highest value use cases," said Boteju. "We're excited about the next 12- to 18-months as these technologies really start to mature and add value across our bank."


Lowe's eyes AI agents as home improvement companion for customers

Lowe's is betting that AI agents powered by Google Cloud can become do-it-yourself home improvement companions and offer personalized customer experiences.

Neelima Sharma, SVP Omnichannel and Ecommerce Technology at Lowe's, will be a speaker at Google Cloud Next 2025. Sharma said tools like Google Cloud Agentspace can be the next evolution of customer experience.

"We continue to look at ways where we can build a stronger relationship with a customer," said Sharma. "If an agent is going to be part of that it will be taking machine learning, generative AI, and autonomy to become the home improvement companion for our customers."

Lowe's bet on Google Cloud to be a big part of its digital transformation, which has been chronicled at Constellation Insights. In 2020, Lowe's and Google Cloud expanded their partnership to focus on commerce, merchandising, supply chain and pricing as well as customer experience. The idea was to create a "channel-less" customer experience. See: Lowe's betting on AI to drive customer experience, optimize multiple processes (PDF) | Lowe's bets on AI, technology to navigate slowing demand

Since that expansion, Google Cloud has enabled Lowe's engineering platform as well as its Total Home Strategy.

Google Cloud envisions AI agents as a way to meld what are disparate functions inside a retailer. Agents will carry out specific tasks like marketing campaign creation and research, but the promise of agentic AI is working across functions.

The retail challenge

Carrie Tharp, vice president of solutions and industries at Google Cloud, said the retail industry is frequently under pressure due to shifting consumer expectations, market share shifts and uncontrollable developments like tariffs.

Given the pace of change in the industry it will be critical to have merchandising, advertising, supply chain and other functions connected by agents, said Tharp.

In addition, the customer journey in retail is getting longer. "AI has become a critical engagement point," said Tharp. "Five years ago, I would have told you the average number of touchpoints in the consumer journey was six. We now see up to 10. Even worse, we see consumers delaying decisions when they have too much information."

Sharma said Lowe's is looking to use AI, search and multimodal visualization capabilities to provide good experiences for the retailer's associates and customers.

"We started a journey with giving the best experience for our customers. We very quickly followed up with associates who are selling to our customers, and we modernized that. And now we've been focused on modernizing our corporate systems," explained Sharma. "That framework has been extended to AI as well. We have taken a very orchestrated approach towards AI to personalize our customer experience, help our associates while they sell and bring the power of all the data sources together in a conversational style."

The goal: Home improvement partner

Sharma said if Lowe's is successful with AI, it will be able to be "our customers' home improvement partner" and help them find exactly what products and services they need for complex projects.

To be that home improvement partner, Lowe's needs to leverage AI as well as its data and machine learning foundation via Google Cloud.

Lowe's has laid the foundation to set up for agentic AI and enhanced experiences. Sharma said the company will be closely watching technology developments to see how it can advance its cause.

"We have more than 50 models in production today and deep learning capabilities including search models, recommendation models, sourcing, demand planning, performance pricing, promo and so on so forth. With generative AI, we are actually bringing more generated content on top of all these models," said Sharma.


Healthcare leaders eye agentic AI as next frontier for clinicians, patients

AI agents are likely to be adopted in healthcare as a way to provide patient guidance in any modality and carry out tasks like transportation, appointment scheduling and follow-ups.

Those were a few takeaways from two healthcare leaders during an industry panel at Google Cloud Next. Richard Clarke, Chief Data and Analytics Officer at Highmark Health, and Sameer Sethi, Chief AI Officer at Hackensack Meridian Health, outlined what their organizations have done so far with generative AI and the groundwork in place for agentic AI.

"We're very excited for AI agents directly interacting with our members and patients with guidance that is always on in whatever modality they wish," said Clarke.

Sethi was also upbeat about the promise of agentic AI. "Imagine a patient calling to schedule an orthopedic appointment and also needing a wheelchair, a ride, a pharmacy visit—seven different actions handled by one AI agent," said Sethi.

AI in healthcare is a hot topic given that the industry is facing pressure on multiple fronts. Aashima Gupta, Global Director of Healthcare Strategy Solutions at Google Cloud, said the industry is using generative AI to alleviate administrative burdens such as paperwork and searching medical records.

Gupta said that AI in healthcare is also transitioning from simple chatbots to single purpose agents and ultimately multi-agent systems across multiple departments. The overarching goals are to reduce the burnout in the field and improve patient care.

"GenAI has really evolved from a buzzword to a business essential," said Gupta. "We're seeing a paradigm shift in how we interact with healthcare. Agents represent a strategic opportunity to reimagine care and journeys, underscoring that conversation is becoming the new interface. Our customers are able to reimagine patient care and how it could be personalized for everyone."

With that backdrop, here's a look at some of the takeaways from Highmark Health and Hackensack Meridian Health.

Highmark Health

Adoption across the enterprise. Clarke said Highmark Health already has more than 14,000 of its 40,000 employees regularly using internal genAI tools built on Vertex AI and Gemini. Highmark Health inked a 6-year strategic partnership with Google Cloud in 2020 and the two companies have worked together on multiple projects.

Ambient intelligence. Clarke said ambient listening by AI during patient visits is an underrated breakthrough because clinicians can focus on the person, not the notetaking. "Everything in the ambient listening space has been a true gift to bring joy back to practice for many of our clinicians," said Clarke. This point about ambient listening has surfaced before with healthcare leaders.

Multi-modality matters. Clarke said the ability of models to deliver in multiple ways is critical. "We were stuck on a particular use case and when Gemini 2.0 came out, it kind of made us get over the hump," said Clarke.

The vision for agentic AI. Clarke said the promise of agentic AI rhymes with his ambient intelligence points. The big idea is that AI agents can interact with members and patients to provide guidance in multiple formats on demand.

Cost concerns so far. "There was some concern that our cloud costs would be challenged, but we just haven’t seen that," said Clarke.

Guardrails. Clarke said Highmark Health puts AI projects into shadow mode followed by assisted audit before going into production with automation. "We need logging, monitoring, and governance," he said.

Hackensack Meridian Health

Sethi said his organization's genAI strategy aims at delivering personalized experiences, addressing efficiency, reducing burnout and offering disease prediction and precision treatments.

Hackensack Meridian Health started on BigQuery and Looker with Google Cloud before moving to Vertex AI and Gemini.

He added that there are a few use cases for AI agents that are appealing. "We want to make sure we were focusing on the patient and our workforce," said Sethi, who noted the following use cases:

  • A nurse agent. "The nurse can sift through large amounts of data and instead of going through binders or PDFs, the agent provides that insight directly," said Sethi.
  • Patient agent. This agent would be able to string together multiple tasks that are required during discharge processes or care that usually involve multiple people.

That use case for agents isn't ready just yet, but Sethi noted that the innovation is happening at a fast pace. Sethi said these agents would require guardrails and testing. "We built a whole test suite," he said. "Every time we receive feedback, we add that to our test library and test against it in future releases."

He added that implementing AI requires a heavy dose of process optimization and change management. "The biggest barrier in technology enabling is actually humans," said Sethi, adding that human-in-the-loop is critical to determining how processes can be automated. "Figuring out what should be agentic is the hardest part," he said. "It’s easy to create an agent but identifying the elements where a human has to come in—that's what takes time."

More on healthcare transformation

Google Cloud Next 2025: Agentic AI, Ironwood, customers and everything you need to know

Google Cloud announced a bevy of capabilities and services designed to enable multi-agent systems for its customer base, which is increasingly integrating the company’s models, data platform and agent development tools into their stacks.

Google Cloud Next has kicked off with more than 32,000 attendees and more than 500 customer stories. With that backdrop here's a look at the key themes from Google Cloud Next in Las Vegas.

Customers and maturity

Google Cloud CEO Thomas Kurian's keynote included Verizon, Honeywell, Intuit, Mattel, Reddit, Sphere Entertainment and McDonald's, to name a few. Google Cloud is leaning into its impressive customer roster, targeting industries and adding expertise and services to its go-to-market engine. Google Cloud has leveraged its AI know-how into a seat at the cloud vendor table and is now often an AI layer in multi-cloud environments.

Kurian said enterprise customers have integrated AI tools into their products and services and are now demonstrating business value. "We have more than 500 customers speaking at Cloud Next, and we're thrilled that these customers will share the real value that they're seeing from the use of our technology. They span a broad range," said Kurian.

The customers cited by Kurian span industries ranging from retail to healthcare to financial services. Google Cloud has also revamped its go-to-market approach to focus on industries, domain knowledge and use cases that drive returns. "We found the most important problems that customers care about and we have chosen to focus early on addressing them," said Kurian. "We've been fortunate that we made a bet early on with AI. The AI landscape has matured and we can offer many different pieces."

The showcase customer at Google Cloud Next was Sphere, which, along with Google DeepMind, Google Cloud, Magnopus and Warner Bros. Discovery, reinvented "The Wizard of Oz" with researchers, programmers, visual effects artists, AI engineers and archivists.

The "Wizard of Oz at the Sphere" effort revolved around using AI to recreate the 1939 movie to the Sphere and its sensory experience for a Aug. 28 degree. The companies developed a "super resolution" tool to turn celluloid frames from 1939 into high-definition imagery using Google Cloud models, AI outpainting and models to expand the scope of screens.


Kurian and Sphere CEO Jim Dolan said the goal was to honor the integrity of the original film and extend it to a new format. At a private screening event and keynote, Kurian described it as "almost like you were told to do AI and your first project was your PhD thesis, not your undergraduate."

Dolan added that it was the first time "I didn't feel like a customer. I felt like a partner. That's why this worked."

I saw the output from the collaboration, and it was striking how little the AI screwed up the original.


Google Cloud's AI hypercomputer, custom silicon and integrated stack

Google Cloud announced Ironwood, its 7th-generation TPU designed for faster model training and more efficient inference. Ironwood is the headliner with 4x peak compute and 6x the high-bandwidth memory, but Google Cloud also launched storage, inference and edge AI advances. Gemini will also run on Google Distributed Cloud infrastructure.

Ironwood is focused on powering responsive AI models. It can scale up to 9,216 liquid-cooled chips linked through Inter-Chip Interconnect (ICI) networking, and it gives developers the ability to use Google's Pathways software stack to combine tens of thousands of Ironwood TPUs.

Key items about Ironwood include:

  • For Google Cloud customers, Ironwood has two sizes for AI workloads--a 256-chip configuration and a 9,216-chip configuration.
  • When scaled to 9,216 chips per pod at 42.5 exaflops, Ironwood delivers more than 24x the compute power of the El Capitan supercomputer.
  • Ironwood has 192 GB of High Bandwidth Memory (HBM) capacity per chip.


For Google Cloud, integration between its hardware stack is critical. Kurian said Google Cloud's stack is meant to address training as well as inference. In addition to Ironwood, Google Cloud announced:

  • Hyperdisk Exapools, the next generation of block storage with up to exabytes of capacity and terabytes per second of performance per AI cluster.
  • Rapid Storage, a new Cloud Storage zonal bucket that improves latency and features 20x faster data access and 6TB/s throughput.
  • Cloud Storage Anywhere Cache, which reduces latency up to 70% with 2.5TB/s throughput. Anthropic is a big user.
  • A fully managed zonal parallel file system called Google Cloud Managed Lustre and Google Cloud Storage Intelligence for insights specific to a customer's environment.
  • A new GKE AI Inference Gateway to load balance inference requests.
  • Cloud WAN, which provides 40% lower latency and up to 40% savings. Cloud WAN ensures traffic targeting Google WAN enters and exits Google's high-performance network at the geographically closest point of presence.
  • Gemini on Google Distributed Cloud, which will run Gemini models on your on-premises infrastructure. Agentspace and Vertex AI will be available on Google Distributed Cloud.
  • Google Cloud also launched optimized software for AI training and inference including Cluster Director, Pathways, Google's internal machine learning runtime, and vLLM for TPUs, an efficient library for inference that can be used with TPUs across Compute Engine, GKE, Vertex AI and Dataflow.

Here's the overview of what's new on the infrastructure front.

Agent development tools and models

With Google Cloud Next, the company is making it clear that it’s a notable cog in multi-agent systems and development tools. The company launched an open-source Agent Development Kit and Agent Engine that support MCP, more than 100 connectors to players across the enterprise stack, and Agent2Agent, a communications standard that lets agents from different frameworks talk to each other.

More details include:

  • The open-source Agent Development Kit and Agent Engine are designed as a combo to let developers build multi-agent applications. The kit launches in Python and supports MCP, with other languages to be added in the future.
  • More than 100 connectors to access enterprise data and use cases. These pre-built connectors are designed for systems like Salesforce, ServiceNow, Jira, SAP, UiPath, Oracle and others.
  • Agent2Agent enables communication across agent frameworks so agents can securely collaborate, manage tasks, negotiate UX and discover capabilities.

Of those items, Agent2Agent (A2A) is the headliner.

Combined with MCP, Google Cloud's A2A effort highlights how agent interoperability standards have gone from nothing to something viable in just five months. Google Cloud said A2A has the support of more than 50 technology partners including Atlassian, Box, Intuit, LangChain, MongoDB, Salesforce, SAP, ServiceNow, UKG and Workday. Systems integrators across the board are supporting A2A.

Google Cloud emphasized that A2A complements MCP because it focuses on AI agent collaboration regardless of their underlying technology. 
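To make the interoperability point concrete, here is a minimal sketch of A2A-style discovery, in which an orchestrator reads a remote agent's published "agent card" and decides whether to delegate a skill to it. The card path, field names and URLs reflect the protocol's initial public draft as best understood and are assumptions for illustration, not a definitive client.

```python
# Hedged sketch of A2A-style agent discovery. The well-known card location and
# field names are approximations of the draft spec; treat them as illustrative.
import requests

def fetch_agent_card(base_url: str) -> dict:
    """A2A agents advertise their capabilities in a public 'agent card'."""
    resp = requests.get(f"{base_url}/.well-known/agent.json", timeout=10)
    resp.raise_for_status()
    return resp.json()

def find_agent_for_skill(agent_urls: list[str], wanted_skill: str) -> str | None:
    """Pick the first remote agent whose card advertises the skill we need."""
    for url in agent_urls:
        card = fetch_agent_card(url)
        skills = {s.get("id") for s in card.get("skills", [])}
        if wanted_skill in skills:
            return url
    return None

# An orchestrator could then hand the task to the chosen agent via the protocol's
# task exchange (JSON-RPC over HTTP), regardless of which framework built that agent.
target = find_agent_for_skill(["https://procurement.example.com"], "negotiate-contract")
print(target)
```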

Not surprisingly, Google Cloud rolled out a series of models that'll be on Vertex AI. Gemini 2.5 Flash and Pro are designed for complex reasoning and the company added a series of next-gen offerings such as Veo 2.0, Lyria and Chirp 3.

But the most interesting item on the model front was Model Optimizer, an experimental service that selects the best AI model for a given use case and requirements using Vertex AI's evaluation service. Model Optimizer will allow customers to tailor which models are used to meet a variety of business objectives.

Here’s the rundown:

  • Gemini 2.5 Flash & Pro will be available on Vertex AI. Google Cloud also said that Veo 2.0, Lyria and Chirp 3 will bring new audio generation and understanding capabilities to the platform.
  • Open Source Agent Development Kit, a framework for building multi-agent systems.
  • Agent Engine, which enables customers to deploy agents from any framework to a fully managed runtime.

Agents everywhere

Agentspace is being built out. Google Cloud is using Agentspace to enable companies to build agents and use prebuilt ones too. Agents are being added to Google Cloud's Customer Engagement Suite, Data Cloud for engineering, data science and analytics tools, Security Suite and Workspace.

Google Cloud is ensuring that enterprises have multiple touchpoints to access and procure agents via Agentspace and its broader platform. The company launched:

  • Agentspace Agent Gallery, a curated set of agents from Google, customers and partners.
  • Customer Engagement Suite updates that use Google AI to build agents with multimodal capabilities and human-like voices.
  • Use-case-specific agents such as a Food Ordering AI Agent designed for quick-serve restaurants that operate in multiple languages. Google Cloud also announced Automotive AI Agent, which gives automakers the ability to create and deliver custom in-vehicle assistants.
  • Customer Engagement Suite gets conversational agents in a new unified console as well as AI Coach and AI Trainer to help humans upsell and prebuilt agents for flight booking, ticketing, appointments and shopping assistance.
  • Customer Engagement Suite is getting the ability to use Google's latest voice technologies so agents can sound like a human.
  • Google outlined an Idea Generation Agent that runs a tournament of ideas for evaluation, as well as a Deep Research agent.
  • NotebookLM integration with Agentspace for natural language search results.
  • Developers will get Cloud Hub, which provides top insights across applications and infrastructure, agentic AI application capabilities in Firebase Studio, Code Assist Agents and new tools in Cloud Assist.
  • Data Cloud gets Data Agents tailored to users, Looker and AlloyDB integration with Agentspace, and LLM integration directly into BigQuery and AlloyDB.
  • On the security front, Google Security Operations will get an Alert triage agent to investigate alerts and respond. Google Cloud is also adding a malware analysis agent in Google Threat Intelligence.
  • Workspace will feature a series of agents and Gemini tools throughout the platform.
  • Google Cloud announced Google Workspace Flows to automate work with the help of AI agents.

Here’s a look at how Google Cloud’s agent stack comes together.


 


Google Cloud, UWM partner as mortgage battle revolves around automation, data, AI

United Wholesale Mortgage (UWM) said it will use Google Cloud for AI and analytics in an effort to automate more of the mortgage process.

The partnership, announced ahead of Google Cloud Next, highlights how the mortgage industry is looking to transform with first party data and automation and reload for an eventual rebound in the housing market. The Google Cloud partnership with UWM also sets up an interesting technology battle given that Rocket, built on AWS, has acquired Redfin as well as Mr. Cooper in recent weeks. Rocket's goal is to acquire first party data that fuels its models and cross-selling opportunities.

For Google Cloud, the UWM win gives it another big financial services customer.

UWM's technology platform includes ChatUWM, which serves as a borrower's personal guide through loan documentation and processes, and Trac+, which manages title review, closing and disbursement during the mortgage process.

Key points about the Google Cloud-UWM partnership:

  • UWM will integrate Google Cloud AI and machine learning in its lending platform that's used for underwriting automation, document processing and customer support.
  • UWM is using Google Cloud's Gemini 1.5 Flash to enhance underwriting automation. (A minimal sketch of this kind of document extraction follows this list.)
  • The companies will explore Google Cloud infrastructure to scale capacity and enhance security.
  • Google Cloud and UWM said the two companies will leverage AI to personalize loan recommendations and identify the right products for a borrower.
  • The two companies will announce more technology integrations and products in the months to come.
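For a rough sense of what Gemini-powered underwriting document automation can look like, here is a minimal sketch using the Vertex AI SDK with Gemini 1.5 Flash. The project ID, prompt, extracted fields and sample file are assumptions for illustration; UWM's actual pipeline has not been disclosed.

```python
# Minimal sketch of document-driven underwriting extraction with Gemini 1.5 Flash
# on Vertex AI. Project ID, prompt, fields and file are illustrative assumptions.
import json
import vertexai
from vertexai.generative_models import GenerationConfig, GenerativeModel, Part

vertexai.init(project="my-gcp-project", location="us-central1")
model = GenerativeModel("gemini-1.5-flash")

def extract_loan_fields(pdf_bytes: bytes) -> dict:
    """Pull structured underwriting fields out of a borrower document."""
    doc = Part.from_data(data=pdf_bytes, mime_type="application/pdf")
    prompt = ("Extract borrower_name, gross_monthly_income and loan_amount "
              "from this document as JSON.")
    response = model.generate_content(
        [prompt, doc],
        generation_config=GenerationConfig(response_mime_type="application/json"),
    )
    return json.loads(response.text)

with open("sample_paystub.pdf", "rb") as f:
    print(extract_loan_fields(f.read()))
```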

The big picture

Rocket's acquisition spree (Redfin and Mr. Cooper) is notable because it's focused on buying the first party data that can be used in its models. However, there's an additional thread to consider here. Rocket covered why the acquisitions made sense based on 30 petabytes of data, but there are also good business reasons to make the purchases beyond training AI models. Rocket's acquisitions are based on industry consolidation and feeding the sales funnel too.


The UWM-Google Cloud partnership has a similar ring to it and highlights how the battle between the No. 1 mortgage lender (UWM) and No. 2 (Rocket) is moving to AI and the cloud.

Here's what UWM CEO Mat Ishbia said on the company's fourth quarter earnings call:

"We continued to invest in cutting edge technology, including AI, investing in our people, and we're in the best position to capitalize on any change in the current market dynamics. There are about $2.5 trillion and growing in mortgage rates over 6%. And so it won't take much of a shift on rates for those loans to be in the money. No matter what happens to the market, we are focused on what we can control and making sure we are more prepared than our competition."

Holger Mueller, analyst at Constellation Research, said:

"The battle on which cloud your mortgage is being stored, serviced, analyzed and more is in full swing. As Google Cloud wants and needs to make progress to its larger cloud provider competitor and that happens with huge workloads. UWM is such a large workload and will become a substantial part of Google Cloud utilization, once all set and done. For UWM, it has a chance to use Google Cloud features and have both a leg up on the competition with a new implementation and new offerings. It can also build on the cloud platform that has 3-4 years of a lead over its competitors when it comes to what matters today, AI."
