How Verizon manages AI agent sprawl

Published April 26, 2026

Verizon has been scaling its use of AI agents across its enterprise with a focus on operational efficiency and customer experience. As a result, Verizon has had to put processes in place to navigate AI agent sprawl, as well as approaches to measuring returns and costs.

Speaking at Google Cloud Next 2026, Anil Kumar, VP of Consumer AI and Analytics at Verizon, outlined the company's approach to agentic AI in a briefing with industry analysts. The talk was notable because it highlighted how far agentic AI deployments have come, as well as how far they have to go.

Kumar said Verizon, a longtime Google Cloud customer that leverages the platform for customer experience and operations, sees agents as a workforce-wide strategy. It's not a niche tool. The goal is to get everyone using agents. "If everybody is using agents then we can get much more done versus very few people using them," said Kumar. "We have so many applications, different systems and different knowledge bases. Through Gemini Enterprise, we can bring all of them together and create a simple agent that can be utilized across the entire workforce."

Agents will ultimately be a unifying layer of intelligence for Verizon because they blend predictive, descriptive and generative AI. Verizon has built a LangChain layer for enterprise agents so it can have a single control plane across multiple data stores and clouds.

Verizon evaluates its AI agent plans based on total cost of ownership (TCO) and quantifiable returns such as improved customer satisfaction scores and resolution rates. Kumar said TCO is used because some large models carry token-based costs while smaller models are priced by compute. The blended TCO approach makes more sense as you scale.
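The blended TCO idea can be sketched in a few lines. This is a hedged illustration only: the function, field names and all prices below are hypothetical examples, not Verizon's actual cost model or any vendor's real pricing.

```python
# Hypothetical sketch: blending token-billed large-model usage with
# compute-billed small-model usage into one TCO figure per agent.
def agent_tco(token_count, price_per_1k_tokens, compute_hours, price_per_hour):
    """Return a blended monthly cost for a single agent."""
    token_cost = token_count / 1000 * price_per_1k_tokens  # large-model calls
    compute_cost = compute_hours * price_per_hour          # small-model hosting
    return token_cost + compute_cost

# Example: 2M tokens of large-model calls plus 50 compute-hours of a
# smaller model in one month (illustrative prices).
monthly_tco = agent_tco(2_000_000, 0.01, 50, 3.00)
print(monthly_tco)  # 170.0
```

The point of blending the two is that a per-token view alone undercounts agents built on self-hosted smaller models, and a compute-only view undercounts heavy users of hosted large models.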

Kumar said Verizon is walking a line between encouraging every employee to create an agent and avoiding a Wild West scenario. Verizon determines what happens to agents based on mission criticality and usage rates. If an agent gains adoption, it is standardized and promoted to an enterprise agent.

The harsh reality is that if you don't contain AI agent sprawl, you'll end up with something like this:

[Image: Agent sprawl]

Here’s a look at how Verizon solves for AI agent sprawl.

Determine where AI agents fit best. For Verizon, systems of engagement are often the best fit for AI agents as they can plug into customer support, CRM, marketing and operations. AI agents thrive in non-deterministic use cases.

Kumar noted that agents aren't as helpful in systems of record. If a process is highly deterministic, you don't need an agent and can use classic automation.

Agents that apply to mission-critical or high-priority use cases go through a central process and are registered in advance.

Encourage broad experimentation first. Verizon encourages any employee with platform access to create agents, but open experimentation applies only to low-risk, low-fidelity agents. "We are encouraging employees to create their own agents, but at the same time, we don't want to have [stale] agents sitting in Gemini Enterprise," said Kumar.

Usage determines which agents get voted off the island. Actual usage is the primary filter, according to Kumar. Agents with repeated usage can graduate to a broader enterprise scope. "If there's no usage, we don't want to run the [agent], because it's a waste of computing, waste of resources," said Kumar.

Graduate bottom-up agents to the broader enterprise. If an agent solves a problem across many employees and departments, you standardize it, add more governance and publish it on a central platform.

Monitor ongoing agent usage due to costs, and decommission low-ROI or unused agents on a regular basis. Don't be shy about culling the AI agent herd. Managing agent sprawl comes down to encouraging creation and then paring back repeatedly:

  • Encourage creation.
  • Observe usage.
  • Keep and promote high-usage agents.
  • Retire low-usage ones.
  • Apply stronger governance to high‑priority/sensitive agents.
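The loop above can be sketched as a simple triage rule. This is a hypothetical illustration of the create-observe-promote-retire cycle described in the article; the `Agent` class, status names and usage thresholds are assumptions, not Verizon's actual tooling.

```python
# Hypothetical sketch of an agent triage loop: promote high-usage agents,
# retire unused ones, and route sensitive ones into stronger governance.
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    monthly_uses: int
    mission_critical: bool = False

def triage(agent, promote_at=500, retire_below=10):
    """Return the agent's next lifecycle status (illustrative thresholds)."""
    if agent.mission_critical:
        return "governed"      # pre-registered, central process
    if agent.monthly_uses >= promote_at:
        return "enterprise"    # standardize and publish centrally
    if agent.monthly_uses < retire_below:
        return "retired"       # no usage = wasted compute
    return "experimental"      # keep observing

fleet = [Agent("billing-faq", 1200), Agent("one-off-demo", 2),
         Agent("fraud-check", 300, mission_critical=True)]
print([triage(a) for a in fleet])  # ['enterprise', 'retired', 'governed']
```

The design choice worth noting is that mission criticality overrides usage: a sensitive agent gets governance regardless of adoption, while everything else lives or dies by the usage filter.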
