Results

Accenture posts strong Q4, sees IT budget savings funding genAI projects

Accenture CEO Julie Sweet said enterprises aren't increasing IT budgets for generative AI, but are looking to save money on technology and reallocate spending toward genAI and data projects.

Speaking on Accenture's fourth quarter earnings call, Sweet said 2025 IT budgets are likely to look much like this year's as enterprises form their spending plans. She said:

"What we are seeing is the continued trend of trying to save money on IT to free up the spending on areas of GenAI. We haven't seen a change in overall spending. We'll see what the budgets come in January and February, but we're not expecting a big change. But what we also are seeing is that as they're saving money, they want to invest it in things like GenAI and data."

Sweet said the IT spending environment remains cautious and discretionary spending isn't likely to move higher.

Accenture ended the fourth quarter with $3 billion in genAI bookings for the fiscal year and expects another healthy increase in 2025. "We know there's clear demand. We're starting to see more of our clients move from proof-of-concept to larger implementations," said Sweet. "We're also continuing to see data pull-through."

In a nutshell, Sweet said genAI and AI will play out much like digital efforts in enterprises: at some point, AI will touch every unit, use case and operation. Sweet said Accenture's generative AI deals were averaging around $1 million but have moved to larger deals of $10 million or more.

She said:

"Like digital, AI is both the technology and a new way of working, and the full value will only come from strategies built on both productivity and growth. And it will be used in every part of the enterprise. We believe the introduction of GenAI signifies a transformative era that is set to drive growth for us and our clients over the next decade much like digital technology has in the last decade and continues to do so."

Accenture is also adopting genAI for productivity gains and to hone its services.

Q4, fiscal 2024 results

Accenture reported better-than-expected fourth quarter earnings of $2.66 a share on revenue of $16.4 billion, up 3% from a year ago. Non-GAAP earnings were $2.79 a share.

New generative AI bookings for the fourth quarter were $1 billion.

For fiscal 2024, Accenture reported earnings of $11.44 a share on revenue of $64.9 billion, up 1% from a year ago.

As for the outlook, Accenture projected fiscal 2025 revenue growth of 3% to 6% in local currency and earnings of $12.55 a share to $12.91 a share.

Accenture has 774,000 employees.


CFOs in Q3 hold optimism, but cautious about spending plans

A pair of CFO surveys highlight how finance chiefs remain optimistic about the economy, but are cautious about investment plans due to uncertainty.

Duke University's Fuqua School of Business and the Federal Reserve Banks of Richmond and Atlanta released their third quarter edition of The CFO Survey, which drew 450 respondents.

According to The CFO Survey, companies are still expecting a soft landing in the economy and plan to invest in infrastructure. However, 30% of firms are postponing, scaling down or canceling investment plans due to uncertainty about the US elections.

CFOs were more concerned about demand, sales and revenue in the third quarter than the second quarter. Concerns about inflation, labor and monetary policy receded in the third quarter compared to the second quarter.

The CFO Survey landed a week after Deloitte released its third quarter CFO Signals survey, which had 200 respondents from companies with at least $1 billion in revenue.

Deloitte said, "finance chiefs expressed concern about how talent shortages, wage inflation, and recent regulatory changes and proposals could impact their ability to manage and retain a skilled workforce."

The CFO Signals survey found that CFOs were also more cautious about spending. Deloitte found that CFOs were expecting 2024 earnings growth of 2.1%, less than the two-year survey average of 4.7%. CFOs also expected a slowdown in capital spending with growth of 3.4% in the third quarter, down from 6.2% a year ago.

Just 14% of CFOs rate the current North American economy as good, and only 19% see it improving in a year, according to Deloitte's CFO Signals survey.


Micron Technology: Q4, 2025 outlook driven by AI data center boom

For Micron Technology, more memory demand from data centers and AI workloads is going to mean more money.

The company reported better-than-expected fourth quarter results due to a boom in data center demand. Like other vendors in the infrastructure space, Micron is riding the AI wave, reporting record data center revenue and strong demand for data center DRAM, high-bandwidth memory and data center SSDs.

Micron Technology delivered fourth-quarter net income of $887 million, or 79 cents a share, on revenue of $7.75 billion, up 93% from $4 billion in the same quarter a year ago. Non-GAAP earnings were $1.18 a share. Wall Street was expecting non-GAAP fourth quarter earnings of $1.11 a share on revenue of $7.65 billion.

As for the outlook, Micron Technology said first quarter revenue will be about $8.7 billion, well ahead of Wall Street estimates of $8.21 billion. Non-GAAP earnings for the first quarter are expected to be $1.74 a share compared to estimates of $1.54 a share.

In prepared remarks, CEO Sanjay Mehrotra said:

“Robust data center demand is exceeding our leading-edge node supply and driving overall healthy supply-demand dynamics. As we move through calendar 2025, we expect a broadening of demand drivers, complementing strong demand in the data center. We are making investments to support artificial intelligence (AI)-driven demand, and our manufacturing network is well positioned to execute on these opportunities. We look forward to delivering a substantial revenue record with significantly improved profitability in fiscal 2025, beginning with our guidance for record quarterly revenue in fiscal Q1."

Constellation Research analyst Holger Mueller said:

"Micron is on a roll, almost doubling its revenue – and showing a stark contrast to YoY performance – where the company had an operating loss of close to  $1.5 billion – and now an operating profit of more than $1.5 billion - all a sign of high demand of course fueled by AI to its high performance memory chips. Kudos to Sanjay Mehrotra to have the intestinal fortitude to keep the cost base and investment base intact. Micron is riding the chip rollercoaster, which for Micron is on the way up."

Other key points from Micron Technology:

  • Data center demand is being driven by AI servers, but there's a refresh cycle for traditional servers that's just starting.
  • The high-bandwidth memory (HBM) total available market is expected to top $25 billion in 2025, up from about $4 billion in 2023.
  • "Our HBM is sold out for calendar 2024 and 2025, with pricing already determined for this time frame," said Mehrotra. "In calendar 2025 and 2026, we will have a more diversified HBM revenue profile as we have won business across a broad range of HBM customers."
  • Micron said data center SSDs topped $1 billion in sales in the fourth quarter.
  • PC makers have built up inventory due to rising memory prices and AI PCs, but Micron expects a better inventory picture by spring of 2025.
  • Micron expects to benefit as PC makers move to a minimum of 16GB of DRAM for value PCs and 32GB to 64GB for higher priced PCs. The same memory buildout is expected for the smartphone market too.
  • For fiscal 2024, Micron reported net income of $778 million, or 70 cents a share, on revenue of $25.11 billion, up from $15.54 billion a year ago.


Meta AI upgraded with Llama 3.2 models as Meta melds AI, AR, spatial computing

Meta has updated Meta AI with its Llama 3.2 models and launched new Ray-Ban smart glasses, the Meta Quest 3S and Orion, the company's first augmented reality glasses.

With the moves, announced at Meta Connect, the company is looking to meld AI, augmented reality and spatial computing. Meta also outlined Meta AI features that can be useful to businesses.

Meta said that more than 400 million people use Meta AI across its portfolio--Facebook, Messenger, Instagram and WhatsApp--on a monthly basis with 185 million using it each week. Llama 3.2 will give Meta AI multimodal features.

Here's a look at what's new for Meta AI:

  • Meta AI can be prompted by voice and will respond out loud, with different voice options.
  • Meta AI will answer questions about photos and edit them. Meta AI can also share AI-generated images and suggest captions.
  • A Meta AI translation tool will be tested with automatic dubbing and lip syncing for creators.
  • Businesses can use Meta AI to set up AIs that talk to customers, offer support and drive sales. Meta also added that advertisers have used Meta genAI tools to create more than 15 million ads in the last month.

Meta's plan for Meta AI is to use its platform and hardware ecosystem to drive usage. For instance, Ray-Ban Meta glasses will be able to record and send voice messages on WhatsApp and Messenger, get video help and suggest items and places when you're out.

The company also said that Meta AI will play a role in Meta Quest 3S, its mixed-reality headset. Meta Quest 3S has the same performance as Meta Quest 3, but at a lower price point of $299.99. Meta also said it has revamped its Meta Horizon OS to support 2D apps, adding travel mode and improving Meta AI on the device with a Hey Meta wake word. Meta Quest 3 prices for the 512GB version will drop from $649.99 to $499.99.

Meta's Orion glasses aim to build off of what the company has learned from the Ray Ban partnership, but the device will only be available to Meta employees and "select external audiences." Orion has a large field of view in the smallest AR form factor so far.

Meta AI will also run on Orion to add visualizations to the physical world. The company noted:

"While Orion won’t make its way into the hands of consumers, make no mistake: this is not a research prototype. It’s one of the most polished product prototypes we’ve ever developed and is truly representative of something that could ship to consumers. Rather than rushing to put it on shelves, we decided to focus on internal development first, which means we can keep building quickly and continue to push the boundaries of the technology, helping us arrive at an even better consumer product faster."


Verint builds out CX bots, analytics suite as AI approach resonates

Verint said it expanded its analytics suite for contact centers as it launched a series of bots designed to automate customer experience.

At its Engage conference this week, Verint, which bills itself as a CX Automation Company, launched the following:

  • The expansion of Verint Data Hub, which unifies behavioral data across customer touchpoints, to include the new Verint Genie Bot in the company's speech analytics platform as well as the Verint Data Insight Bot. The idea is to enable analysts to deliver CX insights faster by having a natural language conversation with the Verint analytics platform.
  • Verint Knowledge Automation Bot, which joins the company's portfolio of Agent Copilot Bots. The Verint Knowledge Automation Bot searches across enterprise content and uses generative AI to summarize results for human agents.

During Verint's recent second quarter earnings call, CEO Dan Bodner said contact centers are in the early stages of adopting AI. "We believe the AI opportunity in the contact center market is very large. The CX industry is spending about $2 trillion annually on labor costs and brands are seeking AI-powered bots that can deliver tangible business outcomes," he said. "AI adoption in our market is currently in its early stages."

Verint's stack includes the Verint Open Platform; Verint Da Vinci, a factory for specialized bots with ongoing training; and the Data Hub. Verint launched its Open Platform a year ago with 40 AI bots. Bodner added that customers typically buy a few bots and then scale as the returns come in.

This over-the-top hybrid strategy is working out for Verint since enterprises can get rolling in weeks instead of months. Verint said it is seeing bot consumption gains as enterprises realize they can deploy one bot for a use case and then scale without committing to a complete platform overhaul.

Constellation Research analyst Liz Miller said:

"Verint's AI approach has been one that encourages customers to start their AI journey from right where they stand, selecting a singular challenge or need to address and deploying preconfigured Bots to address those very specific needs. One example of this is their newly revealed "Genie Bot" which will be released as a Beta for now.

This bot addresses the reality in the contact center that there is more business intelligence demands than resources as analytics and operations teams are often spread thin asked to dive deep into the data and answer big "why" questions around interactions and experiences. With the Bot, agents and supervisors will be able to interrogate data themselves to better understand where and how interactions can be improved, even identify trends in agent performance that can be proactively addressed with training or upskilling. This understanding can then be turned into data rich slides to share insights and intelligence with peers and leadership."

Bodner said:

"It's the very beginning of AI impacting the capacity. It's also interesting that from those customers that now have the capacity increased, they’re not just choosing to reduce agent count, some of them are choosing to dedicate agents to improve customer retention, and some of them are actually dedicating agents to increased sales. So, we see really an interesting trend. AI can turn the contact center from a service center to a revenue generating center."

What remains to be seen is how the CX AI agent/bot market develops. Verint is in a competitive market and giants like Google Cloud are getting into the contact center space. OpenAI and T-Mobile also announced a genAI CX partnership. Verint's Bodner argued that the company is in a good position because it is neutral and can put customers on an AI journey quickly and generate cost savings.

Verint said its fiscal year 2025 revenue will be $933 million with non-GAAP earnings of $2.90 a share. Verint's AI bookings in the second quarter increased more than 40%. 

Bodner said:

"There are a lot of companies that are now attracted to the contact center market because obviously there’s a growing TAM and it looks like a great opportunity to for more automation based on Gen AI. But we see companies coming to the market with Gen AI and then IT is struggling to really take that Gen AI technology and create real business outcomes.

And that’s true for the hyperscalers, but also for smaller companies who just want to deliver Gen AI tools. And there’s a big difference between a generic Gen AI model and basically to train the model, to embed the model into existing workflow based on the contact center expertise that we have."


Agentic AI, Oracle Cloud Infrastructure, Dreamforce | ConstellationTV Episode 89

ConstellationTV episode 89 just dropped! 📺 Co-hosts Martin Schneider and Larry Dignan discuss the latest #tech news, focusing on Salesforce and its Dreamforce announcements around the integration of agentic #AI and the upgraded data #cloud supporting unstructured #data.

Then Larry interviews Constellation analyst Holger Mueller about Oracle's success with its #cloud infrastructure, #AI capabilities, and strategic partnerships, including Amazon Web Services (AWS) and Microsoft. Holger also touches on Workday's AI strategy and public cloud adoption.

🎙 Finally, watch a live interview from Constellation's #AIF24NYC with Tina Chakrabarty from Sanofi, who emphasizes data literacy and a balanced approach to AI development in Sanofi's 2025 strategies.

00:00 - Meet the hosts
01:30 - #Enterprise tech news updates (Dreamforce updates, agentic AI)
13:05 - Oracle and Workday analysis with Holger Mueller
24:00 - 2025 AI strategies with Sanofi's Tina Chakrabarty
27:43 - Bloopers!

ConstellationTV is a bi-weekly Web series hosted by Constellation analysts. Tune in live at 9:00 a.m. PT/12:00 p.m. ET every other Wednesday!

Watch the episode on ConstellationTV: https://www.youtube.com/embed/SMb-SGZMKqo

Google Cloud files EC complaint vs. Microsoft Azure: What it means for enterprises

Google Cloud's move to file a complaint with the European Commission over Microsoft's alleged anti-competitive business practices with Azure highlights how the cloud battle is moving to courtrooms and regulators. 

The big three hyperscalers--Amazon Web Services, Microsoft Azure and Google Cloud--were already locked in a sometimes chippy battle for market share. With its EC complaint, Google Cloud is upping the ante to include regulators.

Google Cloud has also made similar appeals to regulators in the UK, where cloud market competition is under review. Google Cloud's parent, Alphabet, has been targeted by EU regulators over its search business.

In a blog post, Google Cloud said Microsoft's licensing terms push European customers to Azure over competitor clouds if they want to preserve their Windows Server license pricing. Google Cloud alleges that Microsoft marks up its licensing costs if customers use other clouds.

In its UK testimony in July, Google Cloud noted that Microsoft's licensing practices designed to push customers to Azure had the biggest impact on enterprises, which have legacy ties to the software giant.

Google Cloud said:

“Like many others, we have attempted to engage directly with Microsoft. We have kicked off an industry dialogue on fair and open cloud licensing. And we have advocated on behalf of European customers and partners who fear retaliation in the form of audits or worse if they speak up. Unfortunately, instead of changing its practices, Microsoft has struck one-off deals with a small group of companies.”

While these antitrust complaints take time to play out, there are a few takeaways worth pondering.

Cloud costs are a big concern. Google Cloud's complaint in the EU vs. Azure is just the latest indicator that cloud costs are an issue. Akamai, which has its own cloud computing infrastructure as a service, has launched Project Cirrus to migrate third-party public cloud workloads to its own infrastructure. As a result, Akamai has been able to cut its public cloud costs by 40% in year one with 70% savings projected in year two.

Microsoft could alter its pricing practices ahead of any EC action, assuming the Google Cloud complaint goes anywhere.

AI workloads will bring cloud costs under even more scrutiny. The playing field for AI workloads in the cloud is a little broader with the rise of specialist providers, but the big three dominate here too. The race for AI workloads is well underway and public cloud costs are likely to rise for most enterprises. Those costs are a big reason why on-premises AI infrastructure will be in the mix.

While the big three cloud players duke it out, Oracle Cloud may be a winner. After all, Oracle now has positioned its databases in all three hyperscalers and can be a beneficiary as enterprises move workloads around. Oracle has mastered the art of co-opetition in many ways.

Smartsheet to go private in deal valued at $8.4 billion

Blackstone and Vista Equity Partners are taking Smartsheet private in an all-cash deal valued at $8.4 billion, or $56.40 a share.

The price is a 41% premium to Smartsheet's volume-weighted average share price over the 90 trading days ending July 17. In recent months, numerous reports noted that Smartsheet was in talks to go private.

Mark Mader, CEO of Smartsheet, said the deal will "accelerate our vision of modernizing work management for enterprises." Blackstone and Vista Equity Partners said Smartsheet will benefit from the combined firms' scale and network of companies. For instance, Vista is focused on enterprise software, data and technology.

Constellation Research analyst Liz Miller said:

"Smartsheet going private is an interesting move as the very idea of what work and project management means today, especially in this age of AI where work is being forever changed by automation. The interesting differentiation with Smartsheet isn’t just their capacity to help manage, automate and optimize work and projects across the enterprise but also their past acquisitions like Brandfolder."

Smartsheet will have a 45-day go-shop period, expiring Nov. 8, during which the company can solicit other acquisition offers.

For the second quarter, Smartsheet reported revenue of $276.4 million, up 17% from a year ago. Annual recurring revenue was $1.09 billion. Smartsheet reported earnings of $7.9 million, or 6 cents a share.

Smartsheet, which competes with Asana and monday.com, had 2,056 customers with ARR of more than $100,000. The company projected fiscal 2025 revenue of $1.116 billion to $1.121 billion.

 


Google Cloud rolls out new Gemini models, AI agents, customer engagement suite

Google Cloud launched a series of updates including new Gemini 1.5 Flash and 1.5 Pro models with a 2-million-token context window, grounding with Google Search, premade Gems for Gemini in Google Workspace and a series of AI agents designed for customer engagement and conversation.

The updates, outlined at a Gemini at Work event, come as generative AI players increasingly focus on agentic AI. Google is looking to drive Gemini throughout its platform. The pitch from Google Cloud is that its unified stack can enable enterprises to tap into multiple foundational models including Gemini, create agents with an integrated developer platform and deploy AI agents with grounding in enterprise data on optimized infrastructure.

Google Cloud's agent push was noted by Google Cloud CEO Thomas Kurian at an investment conference recently. Kurian cited a series of use cases in telecom and other industries. Kurian said Google Cloud is introducing new applications for customer experience and customer service. "Think of it as you can go on the web, on a mobile app, you can call a call center or be at a retail point of sale, and you can have a digital agent, help you assist you in searching for information, finding answers to questions using either chat or voice calls," said Kurian.

Google Cloud is showcasing more than 50 customer stories and case studies for Gemini deployments, including a big push into customer engagement.

During his Gemini at Work keynote, Kurian said customer agents will focus on real-world engagement and natural interaction with voice, and will understand the information needed to give a correct answer. "Customer agents can interact in natural ways without having to navigate menus and traverse systems," he said. "Agents can synthesize all the information you want and your data privately and securely."

Duncan Lennox, VP & GM of applied AI at Google Cloud, said "the enterprise is shaping up to be one of the most impactful transformations that I've seen in my career." Lennox added that AI agents have the potential "to revolutionize how businesses operate, how people interact with technology and even solve some of the world's biggest challenges."

Lennox said a Google Cloud survey found that 61% of organizations are using GenAI in production and increasingly looking to drive returns. Lennox argued that agents are going to enable new applications and experiences in the enterprise.

As for the news, Google Cloud outlined the following:

Vertex AI (all GA unless otherwise noted)

  • New Gemini 1.5 Flash and 1.5 Pro models with a 2-million-token context window, double what was available before.
  • Controlled generation, which lets you dictate the format of the model's output (see the sketch after this list).
  • Prompt Optimizer in preview.
  • Prompt Management SDK.
  • GenAI Evaluation Service.
  • Distillation for Gemini models and supervised fine-tuning for Gemini 1.5 Pro and Flash.
  • Chirp v2.
  • Imagen 3 editing and tuning in preview.
  • Grounding with Google Search, with dynamic retrieval.
  • Multimodal function calling.
  • Expanded machine learning processing in North America, EMEA and Japan/Asia Pacific.
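
To make the controlled generation item above concrete, here is a minimal sketch, assuming the Vertex AI Python SDK, of how a developer might ask Gemini 1.5 Pro to return output conforming to a JSON schema. The project ID, model version string and schema are illustrative placeholders, not part of Google Cloud's announcement.

```python
# Minimal sketch (not an official Google sample): controlled generation with
# Gemini 1.5 Pro on Vertex AI. Project ID, model version and schema below are
# placeholder assumptions.
import vertexai
from vertexai.generative_models import GenerationConfig, GenerativeModel

vertexai.init(project="your-project-id", location="us-central1")  # hypothetical project

model = GenerativeModel("gemini-1.5-pro-002")  # model ID may differ by release

# Controlled generation: constrain the response to a JSON schema instead of free text.
config = GenerationConfig(
    response_mime_type="application/json",
    response_schema={
        "type": "OBJECT",
        "properties": {
            "product": {"type": "STRING"},
            "sentiment": {"type": "STRING", "enum": ["positive", "neutral", "negative"]},
        },
        "required": ["product", "sentiment"],
    },
)

response = model.generate_content(
    "Classify this review: 'The new headset is light, sharp and worth the price.'",
    generation_config=config,
)
print(response.text)  # JSON string that conforms to the schema above
```

The 2-million-token context window doesn't change this call pattern; it simply lets generate_content accept far larger inputs, such as long documents or call transcripts.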

Google Workspace

  • Premade Gems in Gemini for brainstorming, writing social media posts and coding.
  • Custom Knowledge in Gems to carry out repetitive tasks with specific instructions.
  • Vids, GA by the end of the year. Vids can start with a single prompt and guide users to tell a story.
  • Summarize and Compare PDFs.
  • Gemini in Chat.
  • Gemini for Workspace Certifications.

Customer Engagement with Google AI

  • 1.5 Flash for Customer Engagement with Google AI.
  • Deterministic and Generative Conversational Agents in preview.
  • Agent Assist Coaching Model in preview.
  • Agent Assist Summarization in preview.
  • Agent Assist Smart Reply.
  • Agent Assist Translation in preview.

What's an agent?

With the term agent being used extensively, Erwan Menard, Director of Product Management at Google AI, was asked in a briefing how the company segments agentic AI.

The question is a good one considering that in just the last two weeks, Salesforce, Workday, Microsoft, HubSpot, ServiceNow and Oracle all talked about AI agents, likely overloading CxOs who have spent the last 18 months trying to move genAI from pilot to production. Other genAI front-runners--Rocket, Intuit, JPMorgan Chase--have mostly taken the DIY approach and are now evolving their strategies.

Menard said there are three flavors of agents across the Google Cloud portfolio. First, there are pre-built agents embedded into experiences in Google Workspace. Then there are Google pre-built agents designed for customer engagement platforms. And then there are agents being built by enterprises using Google Cloud.

The Gemini at Work event will feature a hefty dose of companies that are building agents on Google Cloud. The genAI use cases going to production the fastest are ones that are built into existing applications and those aimed at contact centers, HR and other environments, said Menard.

Menard said:

"We think of AI agents as systems that use an AI technique to push you goals and complete tasks on behalf of users. An agent basically understands your intent, turns it into action. That's how we think of the word agent."

Google Cloud sees agentic AI revolving around two factors: the complexity of the workflows being automated, which determines how many systems an agent must interact with, and the degree of agency.

"As we try to get more business impact--let's say a task specific agent that would execute a task on your behalf--we're going to go into workflows that we want to automate, and so we need to interact with many more systems," said Menard.

Agency will also be critical to agentic AI deployments. Agency refers to "the ability of the agent to learn to make decisions, proactively, take action, to achieve a desired goal with a minimum human supervision," said Menard.

Enterprises will likely follow this progression, said Menard, based on what Google Cloud has seen with enterprise customers (a minimal sketch of the first step follows the list).

  1. Task specific agents will become an early focus.
  2. Then there will be assistants that can help a human accomplish a task faster.
  3. Multi-agent systems will then emerge to take a complex task and address it end to end.
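
As a rough illustration of step one in that progression, the sketch below shows the basic shape of a task-specific agent: classify the user's intent, then route it to a single tool that completes the task. The intents, tools and keyword-based routing are hypothetical placeholders, not Google Cloud code; in production an LLM would typically handle the intent classification and tool selection.

```python
# Conceptual sketch of a task-specific agent: understand intent, turn it into
# an action via one tool. All names and logic here are illustrative placeholders.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class AgentResult:
    intent: str
    output: str


def check_order_status(order_id: str) -> str:
    # Placeholder for a call into an order-management system.
    return f"Order {order_id} ships tomorrow."


def reset_password(user: str) -> str:
    # Placeholder for a call into an identity system.
    return f"Password reset link sent to {user}."


# Each task-specific agent handles exactly one goal through one tool.
TOOLS: Dict[str, Callable[[str], str]] = {
    "order_status": check_order_status,
    "password_reset": reset_password,
}


def classify_intent(utterance: str) -> str:
    # In a real system an LLM would classify intent; keyword matching stands in here.
    text = utterance.lower()
    if "order" in text:
        return "order_status"
    if "password" in text:
        return "password_reset"
    return "unknown"


def run_agent(utterance: str, argument: str) -> AgentResult:
    intent = classify_intent(utterance)
    tool = TOOLS.get(intent)
    if tool is None:
        return AgentResult(intent, "Escalate to a human agent.")
    return AgentResult(intent, tool(argument))


print(run_agent("Where is my order?", "A1234").output)
```

Assistants and multi-agent systems extend this pattern by chaining several such tools and adding planning and coordination across them, which is where the complexity and agency Menard describes come in.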

Menard said:

"That's kind of the paradigm we're operating in terms of the agents that are being offered and going to production. Clearly, there are important decision factors for customers around the surface where the agent will be presented. Do I build a new surface and attract users, or do I meet the users where they are? Second is the skill set the customers have. Do you invest in building an agent or go with a pre-built agent, or a total DIY approach where you handpick the orchestration framework and all the different elements?"

Timeline to production will also be critical for enterprises, added Menard. "You could very much start with a pre-built experience to confirm the need and the benefit and then decide to decompose into with the DIY approach, to iterate further on your agent," he said, noting that Google Cloud's stack enables all levels and approaches to AI agents. "All of these are not conflicting but different expectations from customers."

Pilot to production takeaways

Speaking after the keynote, Kurian outlined a few takeaways based on what Google Cloud is seeing from customers as they move from pilots to production. Here are the big themes:

Timelines. "Cycle time isn't driven by models," said Kurian. "This is an actual software project and timelines depend on the systems the models interact with."

Kurian said a use case like using Gemini to create content for ad campaigns may take anywhere from 8 to 12 weeks. Enhancing search on a commerce site with conversational AI could take 4 to 8 weeks. If an enterprise is bringing search and AI conversations into a contact center, a project could take up to 6 months depending on the number of modern systems, legacy IT and APIs already in place. Projects where a company has to tap into an old PBX could take longer.

"A lot of it depends on whether you have to change the organization," explained Kurian. "It's not a technology problem alone. If a project doesn't require changes to an organization or workflow then it's faster."

Kurian said Google Cloud has a maturity model that it has shared with systems integrators so they "don't go in and [pitch] a big bang project." "There's a sequence to deliver value and we're often dealing with time windows," he said.

Change management and workflows. Kurian's comments on timelines highlight how important change management is with generative AI. Regulation, workflows, processes and technology debt and culture are all factors to consider.

Kurian said enterprises need to keep change management and processes in mind as they deploy AI agents. These processes are critical, but now that Google Cloud's Gemini models have memory, they can wait for an asset or a step before taking action.

"It's not a big bang. It's deliver the technology and methodology to deliver an AI solution," said Kurian.

Business value. Kurian also noted that genAI projects need to deliver value whether it's efficiency or revenue growth. The slew of Google Cloud customers noted at the Gemini at Work event have all seen business value. The metrics will differ by company, but the blueprint is the same. Create value quickly and then expand from there.


AI projects remain works in progress and pilots, say CxOs

Do-it-yourself AI projects are alive and well as CxOs are trying multiple approaches for generative AI, getting some projects to production and looking for better returns, according to a pop-up attendee survey at Constellation Research's AI Forum.

In a survey of 35 CxOs at the Constellation Research AI Forum, it's clear that the genAI playbook is far from solidified. Forty-three percent of respondents were from companies with revenue of more than $1 billion. Overall, the pop-up survey at AI Forum aimed to highlight what CxOs were doing directionally. Salesforce CEO Marc Benioff said do-it-yourself genAI doesn't make sense in the long run because there's too much work involved and most enterprises won't keep up. That argument has a lot of merit, but it remains to be seen how genAI projects play out. T-Mobile and OpenAI, for instance, announced they were building custom applications in a partnership.

Respondents indicated they were using multiple approaches to build AI capabilities. The majority (79%) said they were developing home-grown AI services on hyperscale cloud services and 48% were also using open-source frameworks and large language models. Many of these efforts included AI embedded in packaged applications they already use, such as Salesforce, Adobe, Oracle and SAP.

Automation was the biggest reason to implement AI, followed by cutting-edge capabilities at No. 2 and operational efficiency at No. 3.


ROI, however, was a bit elusive. Forty-five percent of respondents said they have yet to see ROI from AI technology and 31% said they've deployed models with returns. All respondents implementing AI said there was room for improvement whether they were seeing ROI or not. The majority of respondents said their AI investment would be up in 2025.

Investment priorities included data lakes, predictive analytics, natural language processing and image recognition.

As for functions, CxOs at AI Forum said they have scaled AI projects in employee productivity, back office, IT and sales and marketing.

Other items of note:

  • 40% of CxOs at the AI Forum noted they didn't have the human capital to successfully implement AI.
  • CxOs were recruiting, networking with internal peers, training existing employees and partnering with companies and universities to fill talent gaps.
  • Roles are changing as managers aim to acquire data expertise and restructure business models. Managers are also networking to gain knowledge.
  • CEOs, CTOs and CIOs are generally leading the AI charge.
  • Operational efficiency and revenue and growth are areas driving the most ROI for AI projects.
  • Trust, budget and data quality are the three challenges limiting ROI.
  • OpenAI, Llama and Anthropic models were the most popular, in that order.