Results

AWS posts Q3 revenue up 19% from a year ago, $110 billion annual run rate


Amazon delivered better-than-expected third quarter earnings as its AWS unit showed sales growth of 19%.

The AWS results land following strong cloud growth figures from Google Cloud and Microsoft Azure of 35% and 33% respectively. AWS, however, is working off much larger revenue figures.

Amazon reported third quarter net income of $15.3 billion, or $1.43 a share, on revenue of $158.9 billion, up 11% from a year ago. Wall Street was looking for earnings of $1.14 a share on revenue of $157.2 billion.

By the numbers for the third quarter:

  • North American commerce reported third quarter operating income of $5.7 billion on revenue of $95.5 billion, up 9% from a year ago.
  • International revenue was up 12% to $35.9 billion with operating income of $1.3 billion.
  • AWS delivered operating income of $10.4 billion, up from $7 billion in the same quarter a year ago. AWS revenue was $27.5 billion, up 19% from a year ago.
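
The "$110 billion annual run rate" in the headline is just the latest quarter annualized. Here's a quick back-of-the-envelope check in Python (figures in billions of dollars, taken from the numbers above; the year-ago quarter is inferred from the reported growth rate, not a disclosed figure):

```python
# AWS Q3 figures, in billions of dollars (from the report above).
aws_q3_revenue = 27.5
yoy_growth = 0.19

# Annual run rate: the quarterly figure multiplied by four quarters.
annual_run_rate = aws_q3_revenue * 4
print(annual_run_rate)  # 110.0 -> the "$110 billion" headline figure

# Implied year-ago quarter, backed out from the 19% growth rate.
implied_prior_q3 = aws_q3_revenue / (1 + yoy_growth)
print(round(implied_prior_q3, 1))  # 23.1
```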

Amazon CEO Andy Jassy said the company is executing well and is prepared for both the holiday shopping season and the AI and cloud infrastructure advances coming at AWS re:Invent in December.

Constellation Research analyst Holger Mueller said:

"It is remarkable for AWS to turn the trend of shrinking revenue growth and go back into growth mode. And this is before AWS re:Invent where major innovations for many offerings will be released and shared. But all eyes are on AI. If AWS gets re:Invent right it will show even more growth in the quarters ahead." 

As for the outlook, Amazon projected fourth quarter revenue of between $181.5 billion and $188.5 billion, up 7% to 11% from a year ago, with operating income between $16 billion and $20 billion.

On a conference call, Jassy said the following:

  • "We've seen significant reacceleration of AWS growth for the last four quarters. With the broadest functionality, the strongest security and operational performance and the deepest partner community, AWS continues to be a customer's partner of choice. There are signs of this in every part of AWS's business."

  • "Companies are focused on new efforts again, spending energy on modernizing their infrastructure from on-premises to the cloud. This modernization enables companies to save money, innovate more quickly, and get more productivity from their scarce engineering resources. However, it also allows them to organize their data in the right architecture and environment to do Generative AI at scale. It's much harder to be successful and competitive in Generative AI if your data is not in the cloud."

  • "While we have a deep partnership with NVIDIA, we've also heard from customers that they want better price performance on their AI workloads. As customers approach higher scale in their implementations, they realize quickly that AI can get costly. It's why we've invested in our own custom silicon in Trainium for training and Inferentia for inference. The second version of Trainium, Trainium2 is starting to ramp up in the next few weeks and will be very compelling for customers on price performance. We're seeing significant interest in these chips, and we've gone back to our manufacturing partners multiple times to produce much more than we'd originally planned."

  • "We're continuing to see strong adoption of Amazon Q, the most capable Generative AI-powered assistant for software development and to leverage your own data. Q has the highest reported code acceptance rates in the industry for multiline code suggestions. The team has added all sorts of capabilities in the last few months, but the very practical use case recently shared where Q Transform saved Amazon's teams $260 million and 4,500 developer years in migrating over 30,000 applications to new versions of the Java JDK has excited developers and prompted them to ask how else we could help them with tedious and painful transformations."

 


Meta's sneak peek 2025 budget: A lot more AI infrastructure spending


Meta's 2025 budget planning process goes like this: Spend heavily on AI infrastructure and use AI to drive efficiencies so you can plow more money into GPUs.

That peek into 2025 hyperscale budgeting was delivered by Meta CEO Mark Zuckerberg. Zuckerberg was a little more direct about the spending plans on AI, but the other hyperscale giants--Microsoft, Google Cloud and Amazon--have similar plans.

Speaking on a conference call, Zuckerberg was clear that Meta's appetite for GPUs--mostly from Nvidia--will be insatiable.

He said:

"We're training the Llama 4 models on a cluster that is bigger than 100,000 H100s or bigger than anything that I've seen reported for what others are doing."

Zuckerberg added that Meta is early in the budget process, but is targeting two themes: using AI for efficiency, which will partly fund more investment in infrastructure.

1. AI for efficiency. "First, it's clear that there are a lot of new opportunities to use new AI advances to accelerate our core business that should have strong ROI over the next few years, so I think we should invest more there," said Zuckerberg.

Meta's third quarter results were powered by monetization efficiency due to AI. Meta is also boosting engagement and optimizing ad delivery. The company delivered third quarter revenue of $40.59 billion with net income of $15.69 billion, or $6.03 a share. Meta's results were well ahead of Wall Street estimates.

As for the outlook, Meta projected fourth quarter revenue of $45 billion to $48 billion.

CFO Susan Li noted that Meta can boost productivity with AI as it optimizes monetization.

"On the use of AI and employee productivity, it's certainly something that we're very excited about. I don't know if we have anything particularly quantitative that we're sharing right now. I think there are different efficiency opportunities with AI that we've been focused on in terms of where we can reduce costs over time and generate savings through increasing internal productivity in areas like coding."

She said content moderation is another area where AI can boost productivity. Large language models (LLMs) will also improve multiple work streams in general and administrative categories. Li said Meta has headcount opportunities as well.

2. Investment in AI infrastructure. "Our AI investments continue to require serious infrastructure, and I expect to continue investing significantly there too," he said.

Specifically, Meta projected 2024 capital expenditures to be $38 billion to $40 billion, updated from the $37 billion to $40 billion range. The company now expects "a significant acceleration in infrastructure expense growth next year."

Zuckerberg, however, noted that infrastructure will become more efficient and that's why Meta has backed the Open Compute Project. He said:

"This stuff is obviously very expensive. When someone figures out a way to run this better, if they can run it 20% more effectively, then that will save us a huge amount of money. And that was sort of the experience that we had with open compute and part of why we are leaning so much into open source here."

 


How to burn down your legacy IT in 10 not-so-easy steps


David Giambruno, VP at Tivity Health, is the type of person who negotiates his exit package before ever taking a job. Why? He's going to burn your platforms, automate everything possible, cut costs and be hated by everyone at a company except the CFO.

"I figure out ways to make IT super-efficient, and so I've learned a lot. The biggest thing is having awesome severance and have your lawyer look at it, because everybody's getting angry," he said.

Giambruno has restructured IT operations at Revlon, Tribune Media, Shutterstock and Pitney Bowes. At Constellation Research's Connected Enterprise 2024, Giambruno laid down some truth. "I'm a tech masochist. I never get called by a CTO or CIO. I get called by the CFO or CEO. Generally, if there's some disaster you can fix it," he said.


Here's a look at the lessons:

  • Best practices are mediocrity. Most organizations mistake entropy for safety.
  • You have to burn the platform. "It is about pain and the burning platform. Without that no one ever wants to change," said Giambruno. "Starving a system is the only way to change."
  • "Spend no money on the old systems. If you're spending money on the old systems you'll never change," he said. "No legacy. Do not waste a second on an old system."

  • Automation makes everything happen. Automation means more speed and speed always wins.
  • No multi-cloud. Every cloud you add is about 3x the cost and 5x the security problem.
  • Run cloud native applications.
  • Always do proofs of concept because you'll need to show humans what's possible. "It's really about showing people what's possible because no one ever believes it," he said.
  • Proofs of concept are primarily for the CEO and CFO, but product people get interested when they realize how much faster you can deliver technologies.

  • "Nothing runs a computer better than another computer," said Giambruno. Automation means that costs come down and stay down.
  • "No one will like you. Vendors will hate you because you're taking away huge chunks of money from vendors. I take away huge amounts of money from internal teams too. Then you get a whole new set of vendors and whole new set of processes," he said. "I go from massive chaos to structure."
  • "Everybody chooses lock-in with a vendor. It's cheapest to pick one and then tell them to pound salt when my contract is over," said Giambruno.


Microsoft Q1 strong, Azure revenue growth 33%


Microsoft reported better-than-expected first quarter earnings as commercial cloud revenue was up 22% and Azure grew 33% from a year ago.

The company reported first quarter net income of $24.7 billion, or $3.30 a share, on revenue of $65.6 billion, up 16% from a year ago.

Wall Street was looking for first quarter earnings of $3.10 a share on revenue of $64.51 billion.

Microsoft CEO Satya Nadella said AI is "expanding our opportunity and winning new customers."

CFO Amy Hood said the company's first quarter execution "delivered a solid start to our fiscal year."

As for the outlook, Microsoft said second-quarter revenue will be between $68.1 billion to $69.1 billion. Wall Street was expecting Microsoft to deliver second-quarter revenue of $69.83 billion. 

Nadella said suppliers are late with data center infrastructure and the company won't be able to meet demand. 

Key points:

  • Azure revenue was up 33%.
  • Microsoft 365 Commercial products and cloud services revenue was up 13% from a year ago.
  • Microsoft 365 Consumer products and cloud services revenue was up 5%.
  • LinkedIn revenue was up 10%.
  • Dynamics 365 revenue growth was up 18%.

Speaking on an earnings conference call, Nadella said:

"At the silicon layer, our new Cobalt 100 VMs are being used by companies like Databricks, Elastic, Siemens, Snowflake, and Synopsys to power their general-purpose workloads at up to 50% better price performance than previous generations. On top of this, we are building out our next-generation AI infrastructure, innovating across the full stack to optimize our fleet for AI workloads."

Nadella added that data centers (DCs) are a constraint. He said:

"We ran into a set of constraints, which are everything because DCs don't get built overnight. So, there is DCs, there is power. And so that's sort of been the short-term constraint. Even in Q2, for example, some of the demand issues we have or our ability to fulfill demand is because of, in fact, external third-party stuff that we leased moving up. So that's the constraints we have. But in the long run, we do need effectively power and we need DCs. And some of these things are more long lead." 

Hood addressed capital expenditures.

"Capital expenditures, including finance leases, were $20 billion, in line with expectations, and cash paid for PP&E was $14.9 billion. Roughly half of our cloud and AI-related spend continues to be for long-lived assets that will support monetization over the next 15 years and beyond. The remaining cloud and AI spend is primarily for servers, both CPUs and GPUs, to serve customers based on demand signals."

She said that the capital expenses will pay off over time as capacity catches up to demand. 

"In H2, we still expect Azure growth to accelerate from H1 as our capital investments create an increase in available AI capacity to serve more of the growing demand."
 


Is AI data center buildout a case of irrational exuberance?


The cost of artificial intelligence inference and training will fall and enterprises need to question the current groupthink that revolves around a never-ending data center buildout cycle.

Speaking on a panel at Constellation Research's Connected Enterprise conference, Brian Behlendorf, CTO at the Open Wallet Foundation and Chief AI Strategist at The Linux Foundation, said "there's a lot of irrational exuberance about the amount of investment that's going to be required both to train models and do inference on them."

Behlendorf said the capacity buildout is going to lead to indigestion.

"I see a lot of enterprises that are cutting other programs, laying off staff and doing everything to conserve capital to be able to collect all the data in the world and build dumb models that they don't really know what they're going to do."

"Yet, the cost of training AI is going to come down dramatically. There are a raft of 10x improvements in training and inference costs, purely in software. We're also finding better structured data leads to higher quality models at smaller token sizes."

Behlendorf added that he expects commodity GPU hardware systems to emerge in the next few years. He noted that the idea that the industry is going to need nuclear reactors and an ongoing data center buildout cycle to train large language models is foolhardy.

"A more sober analysis is that you need to build capacity inside your organization at a personnel level and skills level on how to use these technologies, and hold off on the massive expansion of data centers," he said.


The theme of the panel revolved around open-source models and their role in generative AI, but panelists agreed that costs will come down due to open technologies. The upshot is that the Nvidia-OpenAI hammerlock on generative AI isn't going to last.

Other key points from the panel include:

Data hoarding doesn't work. Much of the genAI buildout revolves around the idea that data demand is insatiable. You can't have enough data is the common view. Jana Eggers, CEO of Nara Logics, disagreed:

"More data isn't going to solve your problem and the tech industry hasn't quite gotten it yet. Boards think that you should just go out and acquire more data."

Eggers said that enterprises need to profile the data they have and what's being acquired. Quality matters more than quantity. "Enterprises aren't even doing the basic checks on their own data or open data," said Eggers. "At the very start we tell our customers to profile their data."

It remains to be seen how long the view that data hoarding pays off will last.

Open models will lower costs, but hygiene will be an issue. Brittany Galli, CEO of BFG Ventures, said open models will improve efficiencies in AI, but hygiene will be a problem. "There's a ton of bad data and it's causing a lot of problems. You think that open models equal more transparency and higher efficiency, but the problem is hygiene," said Galli. "There is no perfect model that's going to be more accurate and unbiased. It's going to take time."

Invest smartly because you have to invest in AI. "I think we're to that point with AI that we know it's needed and you have to build or buy or get run over," said Galli. "There are no other options."

Be aware of what's really open about models and frameworks. Behlendorf said AI builders need to read the fine print. "We need to apply rigor to the use of the word open around AI," he said, noting that so-called open models often have a series of restrictions and lack transparency.

Models will become more efficient too and smaller. Jeff Welser, Vice President of IBM Research at the Almaden Lab, said smaller models and a wide selection of them will increase efficiency. "One reason you'll want open models is that you don't want to train them. You can choose to train for a specific portion or use case and then string them together," he said.

 

 


How leaders need to think about AI, genAI


Artificial intelligence and generative AI will tax leadership, but ultimately raise the bar for decision making.

Those were some of the takeaways from Cassie Kozyrkov, founder of Kozyr LLC, who delivered the keynote at Constellation Research's Connected Enterprise.

Kozyrkov is best known for founding the field of Decision Intelligence and serving as Google’s first Chief Decision Scientist, where she led the charge in Google’s transformation into an AI-first company. She's now an AI advisor for Gucci, NASA, Meta and others.

Here's a look at what leaders need to know about AI.

Every job is going to have some level of disruption due to AI. "Don't think in terms of jobs. Think in terms of what tasks of any given job are most likely to be disrupted," said Kozyrkov. "The key thing to understand is that every job has some component that's a little bit repetitive."

Repetitive work is going to be automated. "The repetitive and digitized task is the ideal target for AI automation when there aren't a lot of consequences for messing up performance," she said.

But automating repetitive work can hollow out your bench. "Here's the thing that I think too many people forget. When you hire somebody with no experience at all, the work you give them very early on is repetitive and easy to check. The perfect task for AI is also the perfect work for your intern or new graduate," said Kozyrkov. "A lot more of the junior person's work is going to get cannibalized by more senior folks."

Kozyrkov said:

"You should be preparing for what you're going to do with training your future cohort of leaders."

AI washing is trendy. "It is difficult to know what you're buying these days and what to expect--not only of the software systems but the complex collaboration between human and machine," she said.

Here's the test to see if there's AI washing or something like machine learning. "If it's written in Python, it's probably machine learning. If it's written in PowerPoint it's probably AI," said Kozyrkov. "Ask questions."

Strategy matters. AI models are just recipes and a human engineer has to think hard about the problem, how to solve it and come up with instructions. "You need to understand the task to come up with those instructions," said Kozyrkov. "When humans teach each other, sometimes we use exact instructions. Sometimes we do it another way and teach with examples. Data is just examples."

Leaders prefer examples when control matters. For more complex work, examples matter more. AI will force leaders to raise the bar on performance and be comfortable with change--there's no choice.

AI decisions belong to leaders, not subject matter experts. "We have a problem with absentee leadership, where a lot of folks think that AI is the business of the PhD in the world. It is the business of the leader, the decision maker, the domain expert, not the person who's good at mathematics," said Kozyrkov. "AI is the product of decision making with some very subjective decisions made by whoever was in charge. The worst thing you can do is think AI is some independent entity that's objective."

Generative AI makes things more complicated for leaders. "Generative AI has more than one right answer and more than one wrong answer," she said. "Test everything and test it in context. Trust nothing you haven't tested and use it carefully."

Kozyrkov said:

"It is hard to set criteria for Gen AI and always think in terms of who takes responsibility. That may be more of a limiting factor than some of the technology. So, at the end, is genAI an act of desperation or the frontier of innovation? It's absolutely both."

AI is a genie that can be a friend or foe. "What AI will absolutely do is raise the bar for your decision leadership. This is a genie that may grant you a wish, but we know the genie is dangerous," said Kozyrkov. "AI will demand more from us in the future. It is absolutely a leadership concern."


AMD's Q3 on target, data center unit revenue growth 122% from a year ago


AMD continues to see its revenue surge due to its data center unit, which posted sales growth of 122% from a year ago.

The company reported third quarter net income of $771 million, or 47 cents a share, on revenue of $6.8 billion. Non-GAAP earnings were 92 cents a share.

AMD CEO Dr. Lisa Su said record revenue "was led by higher sales of EPYC and Instinct data center products and robust demand for our Ryzen PC processors."

  • Data center revenue was $3.5 billion due to AMD Instinct GPU shipments as well as AMD EPYC CPUs.
  • PC revenue was $1.9 billion, up 29% from a year ago.
  • Embedded unit revenue was $927 million, down 25% from a year ago, and gaming revenue fell 69% from a year ago to $462 million.

As for the outlook, AMD projected fourth quarter revenue of $7.5 billion, give or take $300 million.

On a conference call, Su said:

  • "Data Center GPU revenue ramped as MI300X adoption expanded with cloud, OEM and AI customers. Microsoft and Meta expanded their use of MI300X accelerators to power their internal workloads in the quarter. Microsoft is now using MI300X broadly for multiple Copilot services powered by the family of GPT-4 models."
  • "Development on our MI400 series based on the CDNA Next architecture is also progressing very well towards a 2026 launch. We have built significant momentum across our data center AI business with deployments increasing across an expanding set of Cloud, Enterprise and AI customers. As a result, we now expect Data Center GPU revenue to exceed $5 billion in 2024, up from $4.5 billion we guided in July and our expectation of $2 billion when we started the year."
  • "In the Data Center alone, we expect the AI accelerator TAM will grow at more than 60% annually to $500 billion in 2028. To put that in context, this is roughly equivalent to annual sales for the entire semiconductor industry in 2023."
  • "We feel very good about the market from everything that we see, talking to customers, there's significant investment in trying to build out the infrastructure required across all of the AI workloads." 

Google Cloud Q3 revenue up 35% from a year ago, Alphabet results shine


Alphabet handily topped third-quarter expectations and its Google Cloud saw revenue growth of 35% from a year ago due to generative AI.

The company, which includes Google, Google Cloud and YouTube, reported third quarter net income of $26.3 billion, or $2.12 a share, on revenue of $88.27 billion, up 15% from a year ago.

Wall Street was expecting Alphabet to report third quarter earnings of $1.85 a share on revenue of $86.3 billion.

Google Cloud revenue was $11.4 billion, up 35% from a year ago. Alphabet said Google Cloud saw strength across AI infrastructure, genAI and core services.

CEO Sundar Pichai said the company's "long-term focus and investment in AI" are paying off. Specifically, Google Cloud is driving "deeper product adoption" with existing companies while landing larger deals and new enterprises.

Indeed, Google Cloud's operating income is accelerating. The company's third quarter operating income was $1.947 billion, up from $266 million a year ago.
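
The margin leverage implied by those figures can be sanity-checked with quick arithmetic (a rough sketch in Python; the year-ago revenue is inferred from the reported 35% growth rather than taken from a disclosed figure):

```python
# Google Cloud Q3 figures, in billions of dollars (from the report above).
gc_revenue = 11.4
gc_operating_income = 1.947

# Operating margin this quarter.
margin_pct = gc_operating_income / gc_revenue * 100
print(round(margin_pct, 1))  # 17.1

# Year-ago quarter: $266M operating income on implied revenue of ~$8.4B
# (backed out from the reported 35% growth).
implied_prior_revenue = gc_revenue / 1.35
prior_margin_pct = 0.266 / implied_prior_revenue * 100
print(round(prior_margin_pct, 1))  # roughly 3%
```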

Google's core services remain the cash cow with operating income of $30.86 billion.

Google Cloud has been picking up traction as it builds out services for the AI layer and also drills down into industries. 


 

Speaking on Alphabet's earnings conference call, CEO Sundar Pichai said Google Cloud's stack is paying off as enterprises leverage AI. 

Pichai said Alphabet continues to invest in AI infrastructure--including nuclear power for data centers. Google Cloud is also benefiting from workloads powered by new Nvidia GPUs as well as its own processors for AI workloads. CapEx in the fourth quarter will be similar to the third quarter tally of $13 billion. The largest component of that CapEx was servers, data centers and networking equipment. 

Takeaways from the conference call:

  • Google services are all leveraging Gemini models.
  • The company has unified teams in AI and machine learning to move faster. Pichai said the team behind NotebookLM highlights how smaller teams can move faster. "You'll see a rapid pace of innovation," he said.
  • Google is using AI to generate code that is then reviewed by engineers. 
  • Google is seeing search queries surge due to AI overviews and strong engagement as consumers ask longer and more nuanced questions. 
  • Circle to Search is available on more than 150 million Android devices. 
  • Google Cloud's ability to attract generative AI workloads is landing the company larger deals. 
  • Gemini API calls have grown 14x in the last six months. 

On a conference call with analysts, Pichai said the following:

  • "Customers use our AI platform together with our data platform, BigQuery, because we analyze multimodal data no matter where it is stored with ultra-low latency access to Gemini."
  • "Each week, Waymo is driving more than 1 million fully autonomous miles and serves over 150,000 paid rides. The first time any AV company has reached this kind of mainstream use."
  • "On the TPU front, I just spent some time with the teams on the road map ahead. I couldn't be more excited at the forward-looking road map, but all of it allows us to both plan ahead in the future and really drive an optimized architecture for it."
  • "We are in much more of a virtuous cycle with a lot of velocity in the underlying models. We've had two generations of Gemini model. We are working on the third generation, which is progressing well. And teams internally are now set up much better to consume the underlying model innovation and translate that into innovation within their products. There is aggressive road map ahead for 2025." 

Takeaways on successful AI, generative AI projects


Enterprise artificial intelligence projects--generative AI, agentic and everything in between--will often depend on old-school IT management techniques.

Here's a look at some of the takeaways from successful AI projects from a panel at Constellation Research's Connected Enterprise.

It's all about value and use cases. Laurie Wheeler, Chief Operating Officer, Information Services & Technology at MultiCare Health System, said AI projects need to be very clear about use cases and the value expected.

Find a champion. Wheeler, a BT150 inductee, said the projects that have done well have featured "great partnerships with operations." "Having a physician champion was critical to success," said Wheeler.

Don't be wedded to a technology or ideology. Chris Claridge, Chair of Trust Alliance NZ, said stakeholders need to be future oriented but leave technology ideology at the door. "You want people who are open enough about digital identity, fabric ontologies and choose vendors that allow interoperability," said Claridge.

Expectations. Patrick Nicolet, Chairman of Linebreak, said expectations need to be set with AI projects and there's a balance between what can be done and the art of the possible. Wheeler added that you can set expectations to hit a particular metric, but be prepared to be surprised at times.

Standards and quality matter. Wheeler added that champions also help with creating standards and then upholding them as a project scales.

Be prepared to manage fear. Claridge said AI projects often have a tangible fear about job loss attached to them. "That fear causes enormous concerns and fear," he said. "Managing the disruption of this technology is going to be a major issue. It is incredibly disruptive once you start to deploy and scale."

Cultural change. "What's different about AI is that we've been trained to bring answers to questions," said Nicolet. "AI is the opposite. We have to be good at asking the right question and have lots of answers. You really have to shift and that's challenging for any organization."

Claridge agreed and noted that organizations' purpose will be challenged by AI. "Organizations will have to challenge why they exist. AI is going to change the way data moves around and the actual activities of the organization," he said.

Change management. All the panelists said that change management is the secret sauce to AI projects. The communication has to be digestible to end users and people need to be reassured about their jobs. Keep in mind, however, that some of those fears are warranted.


2025 in preview: What Constellation Research’s analysts say

In 2025, you'll have to get ready for "knowledge," AI governance will move to the forefront, enterprise software models will be revamped, decision automation will depend on humans in the loop and data strategies will be a pain point.

At Constellation Research's Connected Enterprise conference in Half Moon Bay, the first panel revolved around the first cut of 2025 predictions.

Here's the recap.

Martin Schneider:

  • Growth strategies and customer journeys will evolve into orchestrated engagements designed for an entire lifecycle.
  • Revenue plans will be modeled from the ground up by AI.
  • AI-generated workflows will bring new approaches to customer data that will drive repeatable and scalable predictions.

Chirag Mehta:

  • Cybersecurity implementations will move toward focusing more on response than prevention.
  • Generative AI will change the way software is built to make it more secure.
  • Chief product officers will have all the tools needed to exploit product development for growth. CPOs will no longer be chief backlog officers.

Me:

  • The enterprise software model will change and alter the way CIOs buy applications. The problem is that revenue models are in flux. Value-based models, consumption models and traditional seat models will all be under fire.
  • Agentic AI orchestration and processes will be critical and a main focus for enterprises in the year ahead.

Andy Thurai:

  • AI projects will get real budgets and be under more scrutiny for returns on use cases.
  • AI governance will be critical as enterprises grow concerned about synthetic data.
  • Here's the problem: AI produces AI that is monitored by AI (see the conflict of interest here?).
  • 2025 will bring more data, more use cases and more issues. The lawyers will be busy.

Doug Henschen:

  • Enterprises will realize that they have real data problems to solve before implementing genAI.
  • "Seventy percent of enterprises in our AI survey aren't seeing the ROI. What these companies have in common is a lack of data, not enough scale and not enough cleanliness," said Henschen.
  • There will be a barrage of vendor announcements looking to solve these data issues to make data usable for AI.

Liz Miller:

  • 2025 will be the year when enterprises are actually doing what they should be with AI. That means guardrails and a focus on processes. "What we've learned about genAI is that when you automate a really old process with AI you really get a really old result," said Miller.
  • Enterprise buyers will hear a lot about "knowledge" in 2025, but CxOs shouldn't treat the topic as just another buzzword. Knowledge is about all the accumulated data across the enterprise that drives experiences.

Holger Mueller:

  • Human capital management will see significant changes. On the people front, employees and gig workers can be doing the same thing. Payroll will become a key sector for innovation. And applications will become more intuitive.
  • AI will become turbocharged by transactional data.
  • 2025 will be the year of quantum computing (again).

Ray Wang:

  • Automation is transforming markets and the field will revolve around decisions, not more AI.
  • "We're going to move the conversation out of AI into agents that can make decisions," said Wang.
  • Budgets will focus on exponential efficiency due to cost pressures.
  • Automation success will depend on where you put humans in the process loop. "The number one question for automation is where do you insert the humans," said Wang.
  • Automation will become less of a concern due to demographics. Most countries won't have enough working people to do the work so automation is necessary.