Results

RPA and those older technologies aren’t dead yet

Robotic process automation (RPA) isn't dead and arguably should get more attention in your enterprise automation portfolio. Simply put, RPA still has a role in an overarching AI strategy and in many cases can be good enough. The same goes for old-school AI, machine learning, process automation and even hybrid cloud.

We’ll stick to RPA for now, but the general theme remains: Automation and returns are what matters. And you don’t have to rely solely on compute-heavy reasoning large language models to get there.

Rest assured that most vendors aren't going to preach the benefits of RPA, but orchestration is about more than AI agents. Process, automation and workflow matter. Use the right tool for the job. If a rule-based approach like RPA works, use it to deliver returns. Ditto for workflow engines, traditional AI and any other non-agentic technology. For things you already know how to handle, there’s no reason to force an LLM to reason repeatedly.

This theme has been bubbling up in recent months amid the agentic AI marketing barrage. CxOs in Constellation Research's BT150 have been noting that RPA is part of the generative AI mix and some acknowledged that the technology may be good enough in many use cases. In fact, older technologies—traditional AI and machine learning—are good enough to deliver significant value.

Marianne Lake, CEO of JPMorgan Chase's Consumer & Community Banking unit, set the scene during the bank’s annual investor day: "Despite the step change in productivity we expect from new AI capabilities over the next five years, we have been delivering significant value with more traditional models. Not every opportunity requires Gen AI to deliver it."

Vendors that play to value instead of the buzzwords seem to be faring well.

During his PegaWorld keynote, Pegasystems CEO Alan Trefler said orchestrating automation, workflows and AI is about finding the right tool for the task at hand. Yes, Pegasystems has RPA, but didn't mention it during PegaWorld or on recent earnings calls. The company has been riding its GenAI Blueprints for revenue growth and launched the Pega Agentic Process Fabric as well as a blueprint that’ll help you ditch legacy infrastructure.

Overall, Pegasystems' broad platform includes decisioning, workflow automation and low-code development.

Trefler said agentic AI is being talked about extensively due to "magical marketing." He said there will be thousands of AI agents running around in enterprises, leading to sprawl and a lack of orchestration.

"The right AI for the right purpose is absolutely critical and candidly forgotten by the pundits that just want to dump things in and hope everything goes right. Large language models aren't everything. Languages models are great for some stuff, but for other things you want to use other forms of AI," said Trefler.

Trefler, like Philipp Herzig, Chief AI Officer and Chief Technology Officer at SAP, argues that prompt engineering is dead. Why? Semantic approaches leave too much uncertainty for processes that need to be followed repeatedly. And you don’t need agentic AI to do everything, because the energy consumption alone will make your costs balloon.

"If you go down the philosophy of using the GPU to do the creation of the workflow and a workflow engine to execute the workflow, the workflow engine takes a 200th of the electricity because it's not reasoning. It's all this reasoning. You don't have to reason on things you already know about," said Trefler.

The upshot is that old-school AI, machine learning and rule-based RPA can be used in a comprehensive automation strategy. In other words, AI agents and genAI simply think (reason) too much.

Francis Castro, head of digital and technology customer operations at Unilever, said at PegaWorld: "I'm a technologist 25 years in the company, driving technology. Sometimes you fall in love choosing the right tool and the right technology. Sometimes we fall in love with the technology, or we fall in love with the problem, but we forget about what we want to achieve."

The takeaway from Pegasystems is that workflows, automation, agentic AI, process and RPA all go together. It's one continuum. ServiceNow sings from the same hymn book, but sure does talk a lot about AI agents today. In the end, RPA is good enough to automate specific, repetitive and rule-based tasks.

"While the market has been talking about Agentic AI and overloading buyers with a laundry list of AI agents, bots and orchestrators, Pega has been focused on the underlying processes and workflows that have long been their bread and butter," said Liz Miller, analyst at Constellation Research.

Systems integrators are busy building AI agents and have shown they're better at it than vendors. But the big picture for these integrators is automation of processes for customers and internally for business process management.

Verint is another vendor playing the value game. The company's first quarter results were better than expected as customers used its AI bots to automate specific processes without the need to change infrastructure or platforms.

Verint CEO Dan Bodner said: "First, more and more brands are fatigued by the AI noise and are looking for vendors that can deliver proven, tangible and strong AI business outcomes now. And second, brands are looking for vendors with hybrid cloud that can deploy AI solutions with no disruption and with a show me first approach."

The CX automation company's secret sauce is delivering value over talk about orchestration, LLMs and agentic AI protocols.

Here's a look at a Verint slide on returns for use cases:

In fact, Bodner mentioned LLMs just twice at the very end of Verint's earnings call; bots came up 24 times. Verint is agnostic about the LLMs it uses for its bots. The game for Verint is to automate "micro workflows" in various processes to deliver returns. Verint bots, built on the company's Da Vinci AI, are designed to automate specific tasks and workflows rather than being general purpose.

“Verint’s bet that simple is better paid off,” said Miller. “While everyone was ramping up the hype machine around AI, Verint dug into the idea that all of these innovations were great, but outcomes were better. So their simplified preset bot approach, where you start with the intended outcome and then connect the automation dots with automation, skills and workflows, works.”

Like Pegasystems, Verint doesn’t talk about RPA anymore. In 2019, Verint talked a lot about RPA and has plenty of docs about the technology lying around.

For UiPath CEO Daniel Dines, the company's RPA heritage is an advantage and he hasn’t banned the acronym yet. "Our extensive installed base of robots and AI capabilities already operating autonomously across more than 10,000 customers gives us unparalleled insight into real enterprise processes and workflows where agents are a natural extension," said Dines on UiPath's fiscal first quarter earnings call. "We uniquely bridge deterministic automation or RPA and probabilistic automation or agentic, allowing customers to extend automation into more complex adaptive workflows."
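As a rough illustration of that deterministic/probabilistic split, the routing logic could look something like the sketch below. The function names, fields and thresholds are hypothetical and not UiPath APIs; a real agent path would call a model rather than the stub shown here.

```python
# Hypothetical sketch of bridging deterministic automation (RPA-style rules)
# with probabilistic automation (an LLM agent). Names and thresholds are illustrative.

def handle_invoice_rpa(task: dict) -> str:
    """Deterministic path: fixed rules, same output every time, cheap to run."""
    if task["amount"] <= 10_000 and task["vendor_known"]:
        return "auto-approved"
    return "escalate"

def handle_invoice_agent(task: dict) -> str:
    """Probabilistic path: stand-in for an LLM agent that reasons over
    ambiguous or exceptional cases. Stubbed here for illustration."""
    return f"agent reviewing unusual invoice from {task['vendor_name']}"

def route(task: dict) -> str:
    # Send well-understood, rule-friendly work to the robot; reserve the
    # agent (and its compute cost) for adaptive, ambiguous cases.
    if task["vendor_known"] and not task["free_text_dispute"]:
        return handle_invoice_rpa(task)
    return handle_invoice_agent(task)

if __name__ == "__main__":
    print(route({"amount": 4_200, "vendor_known": True,
                 "free_text_dispute": False, "vendor_name": "Acme"}))
    print(route({"amount": 4_200, "vendor_known": False,
                 "free_text_dispute": True, "vendor_name": "NewCo"}))
```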

UiPath has been building out its automation platform and launched UiPath Maestro, which aims to leverage AI agents, RPA and humans to orchestrate processes. UiPath’s first quarter results were better than expected due to traction for its agentic automation platform.

"There is a tremendous benefit of combining AI agents with robots. And when you go and decide on an AI genetic automation platform, it's a natural way to think maybe we should bring the robots into the same platform," said Dines. "Again, the benefits from the security and governance perspective and having agents and robots and managing humans also in the same platform are tremendous."

Naturally, Dines is going to say RPA has a role in automation given UiPath has a significant legacy business tied to the technology. However, I don't think he's off. AI agents aren't needed for everything when a robot will do fine. And none of these agents are going to work without process intelligence.

CxOs need to deliver value and chasing agentic AI when there are other tools that provide returns faster isn't a great blueprint. Is RPA going to see a renaissance? Probably not. But RPA is definitely worth keeping in the automation toolbox. A lot of those older, less buzzworthy technologies should stick around too.


Celonis, SAP reach data access ceasefire amid litigation

SAP has agreed to not interfere with Celonis' data extractor for customer data until the litigation between the two companies has been resolved. In turn, Celonis has withdrawn its motion for a preliminary injunction.

As previously reported, the Celonis lawsuit against SAP is worth watching given agentic AI largely depends on enterprise technology vendors providing data access to other applications and AI agents.

SAP said: "SAP rejects Celonis’s claims and continues to seek dismissal of the case. In the meantime, SAP has agreed to maintain the status quo with Celonis and avoid confusion for the benefit of SAP’s customers. SAP will continue to evaluate its IP rights and take action as appropriate."

In March, Celonis sued SAP in federal district court in San Francisco alleging that SAP used anticompetitive practices to thwart Celonis' process mining applications. SAP owns Signavio, a process mining tool that's often combined with the ERP giant's platform.

Celonis alleged that SAP is using its ERP dominance to block Celonis in the process mining market.

Under the agreement, SAP will enable Celonis customers to access their own SAP data without additional fees or licenses.



P&G outlines Supply Chain 3.0, next digital transformation moves

Procter & Gamble's two-year restructuring includes a heavy dose of digital transformation, artificial intelligence and supply chain automation and optimization.

The headlines from Procter & Gamble's appearance at the Deutsche Bank Global Consumer Conference revolved around the company's move to cut 7,000 jobs. However, P&G Chief Financial Officer Andre Schulten and Chief Operating Officer Shailesh Jejurikar outlined the bigger picture.

P&G is navigating an uncertain economy, shifting tastes, inflation and tariffs. P&G's approach is to use productivity to fund growth initiatives over the years. Its next phase is focused on developing its systems and digital capabilities to support automation, data strategy, insights and analytics, said Schulten.

"We believe we now have the opportunity to step forward to enable the tremendous growth opportunities we have with an even more focused and efficient portfolio, supply network, and organization," said Schulten. "In fiscal 2026, we will begin a 2-year noncore restructuring program. This program includes 3 interdependent elements: portfolio choices, supply chain restructuring and organizational design changes."

P&G plans to shed brands in various categories; more detail on which categories will come in July when the company reports its fiscal year-end results.

Here's a look at the high level transformation moves from P&G:

  • The company will right-size its supply chain, optimize production locations to drive efficiencies, speed up innovation and cut costs.
  • P&G will make roles broader and shrink teams as the company leverages automation and digitization.
  • P&G will cut up to 7,000 non-manufacturing roles, or 15% of its non-manufacturing workforce. Those job cuts are incremental to what P&G has outlined for its Supply Chain 3.0 savings.

"This restructuring program is an important step toward ensuring our ability to deliver our long-term algorithm over the coming 2 to 3 years. It does not, however, remove the near-term challenges that we currently face," said Schulten. "All the more reason to double down now on the integrated growth strategy that has enabled strong results over the past 6-plus years, executing our strategy and accelerating this opportunity, especially under pressure is our path forward."

According to Schulten, future success for P&G requires innovation for product, packaging, brand communication and marketing and retail execution. Schulten said all of those variables have to work together to deliver value. No one thing can carry the team. He explained:

"Superior performing products and superior packages provide noticeably better benefits to consumers. They become aware of and learn about these products through superior brand communication. This comes to life in stores and online with superior retail execution and deliver superior consumer value at a price that is considered worth it across each price tier in which we choose to compete."

Here's a look at the P&G plan:

Fund investment with productivity gains. Schulten said P&G can mitigate cost and currency headwinds by becoming more efficient across cost of goods sold to marketing to in-store and online execution.

Data and insights across the value chain. Jejurikar said the company was looking for "better consumer insights on what's required for the specific job to be done; integrated technical capabilities applied across formulated chemistry, assembled products and devices to deliver the superior solution; integrated communication across package, shelf, online and other channels; and at a value that balances price and performance for the consumer and the retailer."

Use AI to make the marketing and advertising spend more efficient. Jejurikar said the company is using programmatic and algorithm-based media buying to find consumers most likely to be receptive to messaging. "Our proprietary Consumer 360 data platform enables brands to use target audience algorithms to serve ads at the right frequency each week, all year round, more effective reach and more cost efficient," he said. P&G's media reach in the US has grown to 80% from 64% over five years. Reach in Europe is now 75%.

He added:

"We are driving advertising effectiveness, starting with superior consumer insights and leveraging AI as a tool to deliver superior content creation. We're improving efficiency using AI tools for ad testing improving quality, cost and speed. Ads can now be tested and optimized in just a few days versus weeks at 1/10 of the cost versus prior methods."

In-store optimization via data and computer vision. Jejurikar said P&G was combining point-of-sale data with millions of retail shelf images to optimize shelf design. P&G has proprietary tools that enable the company to analyze assortment online and offline.

Supply Chain 3.0. Jejurikar said P&G is looking to extend that supply chain data from suppliers to customers to retail shelves. "We are investing in advanced supply planning technologies to better anticipate consumer demand and adjust production and inventory levels accordingly, helping minimize stock-outs, overproduction and waste," said Jejurikar.

The supply chain transformation will have a heavy dose of automation in manufacturing sites. P&G is capturing images and visual data on manufacturing lines to improve quality.

According to Jejurikar, P&G's warehousing center of excellence will be the hub for the company's 50 distribution centers. This hub will coordinate warehouse activity from the moment a truck enters the gate until it leaves. This coordination is improving productivity 50% on indirect administrative work at each site.

Jejurikar also outlined the KPIs for Supply Chain 3.0:

  • 98% on-shelf and online availability.
  • Up to $1.5 billion before tax in gross productivity savings each year.
  • 90% of free cash flow productivity.

Those targets are in addition to savings of $1.5 billion in cost of goods sold and $500 million in marketing previously outlined.


Broadcom delivers strong Q2 on AI networking demand

Broadcom reported strong second quarter results amid "robust demand for AI networking."

The company, which is thriving as hyperscale cloud providers buy processors for AI workloads, delivered second quarter earnings of $4.96 billion, or $1.03 a share, on revenue of $15 billion, up 20% from a year ago. Non-GAAP earnings were $1.58 a share, two cents ahead of Wall Street estimates.

Broadcom CEO Hock Tan said AI revenue in the second quarter was up 46% from a year ago to $4.4 billion. "We expect growth in AI semiconductor revenue to accelerate to $5.1 billion in Q3, delivering ten consecutive quarters of growth, as our hyperscale partners continue to invest," said Tan.

CFO Kirsten Spears said the company's free cash flow was a record $6.4 billion, up 44% from a year ago.

Semiconductor revenue in the second quarter was $8.41 billion, up 17% from a year ago. Software infrastructure revenue, led by VMware, was $6.6 billion, up 25% from a year ago.

As for the outlook, Broadcom projected third quarter revenue of $15.8 billion with adjusted EBITDA of at least 66% of projected revenue.

Constellation Research analyst Holger Mueller said:

"Broadcom Is firing on all cylinders breaking the $15 bilion milestone in revenue by a hair. The demand for AI keeps growing Broadcom's semiconductor and infrastructure software businesses. It's good to see an extra $400 million invested into R&D, even if it cost Broadcom some revenue. CEO Hock Tan and team know Broadcom needs to invest into product to keep the growth going."

On the conference call with analysts, Tan said the following:

  • "Custom AI accelerators grew double digits year-on-year, while AI networking grew over 170% year-on-year. AI networking, which is based on Ethernet was robust and represented 40% of our AI revenue. As a standard-based open protocol, Ethernet enables one single fabric for both scale out and scale up and remains the preferred choice by our hyperscale customers. Our networking portfolio of Tomahawk switches, Jericho routers and NICs is what's driving our success within AI clusters in hyperscalers."
  • "We continue to make excellent progress on the multiyear journey of enabling our 3 customers and 4 prospects to deploy custom AI accelerators. As we had articulated over 6 months ago, we eventually expect at least 3 customers to each deploy 1 million AI accelerated clusters in 2027, largely for training their frontier models. And we forecast and continue to do so a significant percentage of these deployments to be custom XPUs. These partners are still unwavering in their plan to invest despite the certain economic environment."
  • "There's no differentiation between training and inference in using merchant accelerators versus custom accelerators. I think the whole premise behind going towards custom accelerators continues, which is it's not a matter of cost alone. It is that as custom accelerators get used and get developed on a road map with any particular hyperscaler, there's a learning curve, a learning curve on how they could optimize the way the algorithms on their large language models gets written and tied to silicon."
  • "Why inference is very hot lately--we're only selling to a few customers hyperscalers with platforms and LLMs--. is these hyperscalers and those with LLMs need to justify all the spending they're doing. Doing training makes your frontier models smarter. There's no question. Make your frontier models by creating very clever algorithms that consumes a lot of compute for training smarter. You want to monetize inference. And that's what's driving it. Monetize. To justify a return on investment on training you create a lot of AI use cases, and consumption through inference. And that's what we are now starting to see among our small group of customers."
  • "Customers are increasingly turning to VCF (VMware Cloud Foundation) to create a modernized private cloud on-prem, which will enable them to repatriate workloads from public clouds while being able to run modern container-based applications and AI applications. Of our 10,000 largest customers, over 87% have now adopted VCF."

IBM's AI Data Lakehouse: Bridging Structured & Unstructured Data Insights

Want to see how #AI is making complex #data management look easy?💡 

At IBM Think 2025, Constellation analyst Holger Mueller interviews IBM's Miran Badzak and Edward Calvesbert about how companies can transform their data into a strategic asset.

Watch the full interview to understand:

✅ How to unlock 90% of your unused data
✅ AI tools that simplify database management
✅ Breakthrough technologies reshaping #enterprise intelligence

Learn more here ➡️ https://lnkd.in/eUBmtjhx

Watch the interview: https://www.youtube.com/embed/hQRxUXsvfzY?si=64A_Sc5nXbMnMtzY

Zscaler's master plan: Combine Zero Trust, data fabric and agentic AI

Zscaler has its own plans to consolidate your cybersecurity budget as it branches out from network security to securing data and agentic AI operations.

The company, which is known for its Zero Trust architecture, has been on a tear financially as it fleshes out its vision and makes acquisitions to extend its platform.

CEO Jay Chaudhry laid out the vision at the company's annual Zenith conference this week. "Our strategy is to make sure we secure your data no matter where it is with one policy. When all traffic is going through us when it goes to Internet and knowing that all data leads to the Internet, we are in the best position to really provide holistic data security for us," he said.

That data and network security approach will be very relevant as AI agents proliferate. He said securing AI agents is a natural extension of Zscaler's footprint.

"Zscaler is securing users to have right access to right application. AI agents are able to do same kind of stuff that your people did. I know many call center customers who are using call center agents. There will be other agents like that. That's no different than what we do. We are securing users and with similar technology with some of the additions will secure agents as well," said Chaudhry. "Identity of the agent becomes one piece of it. And there's some other things related to if agents are reaching out to LLMs and some of the other apps. We have some enhancement being made, but we are more natural than anybody else to solve this."

To deliver that vision, Zscaler has been busy.

  • At Zenith, the company launched a set of updates to its Zscaler Zero Trust Exchange platform including a unified appliance for Zero Trust Branch, which secures communications between branches, campuses, factories and various IoT devices.
  • The company also launched its Zero Trust Gateway for Cloud Workloads and Zscaler Microsegmentation for Cloud Workloads. Both efforts secure traffic and data running in hybrid environments.
  • Zscaler also outlined a set of AI tools including AI-powered Data Security Classification, generative AI predictions with prompt visibility, AI segmentation and Zscaler Digital Experience (ZDX) Network Intelligence.
  • The company also said it would acquire Red Canary, which is known for managed detection and response (MDR). Zscaler said the addition of Red Canary will give it automated and agentic workflows that can leverage Zscaler's data on its security cloud and intelligence from its research team.

Chaudhry said that Zscaler's purchase of Red Canary will give it the ability to power next-gen security operations centers (SOCs) with AI agents. Zscaler previously acquired the data fabric piece of the equation when it bought Avalor a little more than a year ago.

"The message here is really building a number of agentic technologies, agentic task agents that can do particular task, perceiving, reasoning, action being taken. It's getting very exciting, and they're all coming together," said Chaudhry.

Strong quarter in a lumpy cybersecurity industry

It's early in Zscaler's transformation to expand its total addressable market. The company's latest quarter stood out amid rivals that stumbled. CrowdStrike's quarter was a disappointment and Palo Alto Networks sold off despite a strong quarter.

Zscaler reported a fiscal third quarter net loss of $4.1 million, or 3 cents a share, on revenue of $678 million, up 23% from a year ago. Non-GAAP earnings were 84 cents a share. Those results were well ahead of estimates.

As for the outlook, Zscaler projected revenue of $705 million to $707 million with non-GAAP earnings of 79 cents a share to 80 cents a share.

For fiscal 2025, Zscaler is projecting revenue of $2.659 billion to $2.66 billion.

Chaudhry said the Zscaler platform has more than 50 million users and that is creating a network effect for data. The company's Zero Trust Exchange processed more than 100 trillion transactions in the last year, blocked 60 billion threats and enforced 5 trillion policies.

The game plan for Zscaler is clear: Take that data flywheel, which generates more than 20 petabytes of data, and use it to power the AI agents that'll automate cybersecurity operations.

"While legacy vendors are attempting to cobble together disjointed point products and calling it a platform, we are constantly expanding our core Zero Trust exchange by integrating new functionality to solve more and more of our customers' security concerns," said Chaudhry.

He noted that customers remain cautious about IT spending, but they're interested in taking out costs. The big question is whether Zscaler, Palo Alto Networks or CrowdStrike turns out to be the security budget consolidator over time.

 


MongoDB reports strong Q1 with revenue growth of 22%

MongoDB reported strong first quarter results powered by revenue growth of 22% from a year ago.

The company reported a first quarter net loss of $37.6 million, or 46 cents a share, on revenue of $549 million, up 22% from a year ago. Non-GAAP earnings in the quarter were $1 a share.

Wall Street was expecting MongoDB to report first quarter non-GAAP earnings of 67 cents a share on revenue of $528 million.

Keep in mind that MongoDB's recent quarterly results have been lumpy with hits and misses depending on consumption. MongoDB's outlook fell short of expectations following its fourth quarter results.

CEO Dev Ittycheria said the company is off to a strong start with Atlas revenue growth of 26%. MongoDB also said it had the highest total net customer additions in six years. "We are confident in our position to drive profitable growth as we benefit from this next wave of application development," said Ittycheria.

As for the outlook, MongoDB said second quarter sales will be between $548 million and $553 million with non-GAAP earnings of 62 cents a share to 66 cents a share. Analysts were modeling second quarter non-GAAP earnings of 58 cents a share on revenue of $549.28 million.

MongoDB added that annual revenue will be between $2.25 billion and $2.29 billion, up from its previous guidance of $2.24 billion to $2.28 billion. The company said non-GAAP earnings for the year will be between $2.94 and $3.12 a share.

During the quarter, MongoDB launched its MongoDB Model Context Protocol (MCP) server, named Mike Berry CFO and launched two new Voyage AI retrieval models.

Constellation Research analyst Holger Mueller said:

"MongoDB is growing nicely, fueled by the need data for AI, delivered in the cloud. Also good to see record new customer additions. The challenge remains for MongoDB to turn a profit, From about $100 million more in revenue, only $43 million made it to reducing its net loss. The good news is Dev Ittycheria kept Sales and Marketing constant, reduced G&A and R&D took in $22 million, which is key in the current period of innovation. Now the question is – can MongoDB repeat the same trick for Q2 and then break out a small profit? The growth engines remain the same for Q2."


AI SRE, Tech Acquisitions, Infrastructure Transformation | CRTV Episode 106

ConstellationTV Ep. 106 is live! This week, co-hosts Larry Dignan and Martin Schneider break down key shifts in #enterprise tech...

💡  Snowflake Summit: A look at leadership’s vision and #AI-powered innovation.
 🤝 Salesforce + Informatica: What the acquisition says about the future of #data and AI strategy.
 🧠 AI Infrastructure: Esteban Kolsky explains why enterprises are moving toward hybrid and private AI models for more control and agility.
 🚀 Startup Spotlight: Meet Ciroos. CEO Ronak Desai shares how the company is reimagining site reliability engineering (#SRE) with AI technology that will reduce incident response times and streamline operations.

Watch the full episode below for insights that matter to #IT leaders, data professionals, and technology decision-makers.

Watch ConstellationTV Episode 106: https://www.youtube.com/embed/HBlu3iOF6TE?si=KwNMSlfnWFNJ3s2_

Amazon revamps supply chain, last mile delivery, warehouses with AI models

Amazon is throwing its AI foundation model weight behind its supply chain as it optimizes routes with SCOT (Supply Chain Optimization Technology). With the move, Amazon is using its generative AI tools to highlight the returns that'll show up in its earnings results.

The company also highlighted its robotics, agentic AI and physical AI efforts.

At an event, Amazon outlined SCOT, which will touch every Amazon package in its supply chain. SCOT's AI foundation model powers the company's supply chain and today processes more than 400 million items across 270 different time spans. Strategically, advancing the supply chain, last mile delivery and robotics with foundation models makes a lot of sense. First, it improves operations and drives real returns for Amazon. Second, it's a nice showcase and first customer reference for Amazon Web Services (AWS).

Amazon CEO Andy Jassy has made a point of highlighting how AI is helping the company's overall operations. At re:Invent 2024, Jassy talked extensively about how continuous improvement in the supply chain can save a few pennies per package that add up to billions of dollars at scale. He also noted robotics and automation advances in warehouses and distribution centers.

Constellation Research CEO R "Ray" Wang was at the Amazon event and noted:

"Amazon is showing the power of Exponential Efficiency. Just like Uber optimized ride batching, dynamic pricing, and route optimization, Amazon is using its data to drive down costs, improve customer experience, reduce delivery times and perfect orders. Digital giants in an AI age have the ability to use their data to create massive operational efficiencies and improve customer experience at machine scale."

Key points about the SCOT model include:

  • SCOT is predicting what customers want before they click the buy button to reduce delivery times by almost a day while lowering carbon emissions.
  • The model predicts where customers will want orders delivered and when. SCOT also recognizes local demand patterns.
  • The supply chain model ingests weather patterns and planned promotions as well as traditional data.
  • Amazon said SCOT enables the company to position inventory closer to customers for fewer miles driven.
  • So far, SCOT has driven a 10% improvement in long-term national forecasts and a 20% improvement regionally.
  • SCOT is live in the US, Canada, Mexico and Brazil. The EU and other countries will go live in the near future.

Last mile genAI meets physical AI

Amazon also launched last mile generative AI mapping that leverages satellite imagery, road networks, land parcels and building footprints along with delivery scan data and GPS data from past deliveries. All of Amazon's data is used to produce a model of where packages will be dropped off.

The company said that advances in foundational models allowed it to scale reasoning and perception across petabytes of data to fine-tune without humans.

In October 2024, Amazon launched the first version of the mapping technology and was able to map more than 2.8 million apartment addresses as well as 4 million parking locations.

This data fed a more accurate geospatial model across the US that will now scale by 10x in 2025. Key items include:

  • AI mapping helps Amazon drivers navigate university campuses and office complexes by identifying optimal parking locations.
  • By November, Amazon is on track to refine apartment address-to-building mappings for more than 11 million apartments across 700,000 campuses.
  • Amazon said that it will have learned more than 130 million delivery locations, 200 million parking spots and 800,000 building entrances.

Agentic AI and robotics

Amazon also outlined how Project Vulcan is combining robotics and agentic AI.

The project enables robots to hear, understand natural language and act autonomously.

According to Amazon, the goal is to create systems of robots instead of specialists so they will be more valuable in warehouse roles as assistants.

The company noted that it's moving "beyond one brain per robot” to fine-tuning a single large model that works across multiple robots, tasks and sensory inputs.


OpenAI's enterprise business surging, says Altman

OpenAI CEO Sam Altman said the company's enterprise unit is doing well as businesses continue to invest in large language models and increasingly AI agents.

Speaking at Snowflake Summit 2025, Altman said the enterprises that have learned to iterate quickly are doing best. The challenge for enterprises is that AI is changing so quickly and that usually favors the agile, said Altman, who appeared on stage with Snowflake CEO Sridhar Ramaswamy.

"There's still a lot of hesitancy, and the models are changing so fast, and there's always a reason to wait for the next model," said Altman. "But when things are changing quickly, the companies that have the quickest iteration speed, make the cost of mistakes low and have a high learning rate win."

He added that enterprises are clearly making early bets.

Altman said that a year ago, he would have recommended startups run toward generative AI and enterprises should wait for more maturity and opt for pilots over production. Today, generative AI is more mainstream and OpenAI's enterprise business is seeing strong demand.

"Big companies are now using us for a lot of stuff. What's so different? They say it just took a while to figure it out. That's part of it. But the models just works so much more reliably. It does seem like sometime over the last year we hit a real inflection point for the usability of these models," said Altman.

Altman's comments landed a few days ahead of OpenAI's rollout of connectors to Dropbox and OneDrive for ChatGPT Team, Enterprise and Education users. The company also said Model Context Protocol (MCP) support is coming to Pro, Team and Enterprise accounts. 

OpenAI said it has 3 million paying business users, up from 2 million in February. 

Altman added:

"I think we'll be at the point next year where you can not only use a system to automate products and services, but the models will be able to figure out things that teams of people on their own can't do. And the companies that have gotten experience with these models are well positioned for a world where they can use an AI system to solve the most critical project. People who are ready for that, I think will have another big step change next year."

According to Altman, LLMs are more like interns today, but at some point soon they will be "more like an experienced software engineer."

"You hear about companies that are building agents to automate most of their customer support, sales and any number of other things. You hear people who say their job is to assign work to agents and look at quality and see how it fits together as they would with a team of relatively junior employees. It's not evenly distributed yet, but that's happening. I would bet next year that in some limited cases, at least in some small ways, we start to see agents that can help us discover new knowledge or can figure out solutions to business problems that are non-trivial. Right now, enterprises are focused on repetitive cognitive work to automate over a short time horizon. As that expands to longer time horizons and higher and higher levels, you get an AI scientist, an AI agent that can discover new science. That will be a significant moment in the world."

Other takeaways from Altman:

The ideal model. Altman said the ideal is "a very tiny model that has superhuman reasoning capabilities." "It can run ridiculously fast and 1 trillion tokens of context and access to every tool you can possibly imagine. And so it doesn't kind of matter what the problem is. Doesn't matter whether the model has the knowledge or the data in it or not," said Altman, who noted that framework isn't something that OpenAI is about to ship.

Altman added that using the models as a database is "sort of ridiculous" and expensive.

Prioritizing compute. Altman said that enterprises using the latest models are seeing real returns and you could solve hard problems with unlimited compute, but that's not realistic. Companies will get to the point where they will "be willing to try a lot more compute for the hardest problems and most valuable things," said Altman.
