
AMD Q2 on target on strength of data center, PC chips


AMD's second quarter was in line with expectations as its data center unit delivered sales growth of 14%. The company said US restrictions on exports to China hurt sales.

The company reported second quarter earnings of $872 million, or 54 cents a share, on revenue of $7.68 billion, up 32% from a year ago. Non-GAAP earnings were 48 cents a share.

Wall Street was looking for non-GAAP second quarter earnings of 48 cents a share on revenue of $7.43 billion.

AMD said the quarter was hit by US government export controls on its AMD Instinct MI308 data center GPUs. Those restrictions led to an $800 million inventory charge.

Dr. Lisa Su, AMD CEO, said the company saw record server and PC processor sales and "robust demand across our computing and AI product portfolio." Su added that AMD expects more market share gains for its EPYC and Ryzen processors. Those chips are mostly benefiting from Intel's malaise. Intel recently reported a rocky quarter.

Key figures in the second quarter include:

  • Data center revenue was $3.2 billion in the second quarter, up 14%.
  • Client and gaming revenue was $3.6 billion, up 69% from a year ago. PC chips surged due to AMD's latest Ryzen desktop processors. Gaming revenue surged 73% due to semi-custom revenue and AMD Radeon GPU demand.
  • Embedded revenue was $824 million, down 4% from a year ago.

As for the outlook, AMD said third quarter revenue will be about $8.7 billion, give or take $300 million. The outlook doesn't include any revenue from AMD Instinct MI308 shipments to China.

Speaking on an earnings conference call, Su said:

  • "There are now nearly 1,200 EPYC cloud instances available globally and providers continue expanding both the breadth and regional availability of their AMD offerings. EPYC enterprise deployment grew significantly from the prior quarter, supported by new wins with large technology, automotive manufacturing, financial services and public sector customers."
  • "Our sovereign AI engagement accelerated in the quarter as governments around the world adopt AMD technology to build secure AI infrastructure and advance their economies. As one example, we announced a multibillion-dollar collaboration with Germany to build AI infrastructure powered entirely by AMD GPUs, CPUs and software."
  • "Demand for AMD-powered notebooks was strong, with sales growing by a large double-digit percentage year over year. We drove a richer mix of higher-ASP mobile processors year over year as we expanded our share in the premium notebook segment, where our Ryzen AI 300 CPUs deliver leadership performance and value for both general purpose and AI workloads. We also closed new enterprise wins with Forbes 2000 pharma, tech, automotive, financial services, aerospace and healthcare companies."

Holger Mueller, an analyst at Constellation Research, said:

"AMD can't catch a break as it can't ignite data center revenue with its AI chips. And while it's fair that they may be affected by tariffs and export controls, the writing was on the wall. That client and gaming are the growth engine to save the quarter is good overall but doesn't help Su and team play the data center and AI boom. Next quarter will be key in the formation of AMD's personality."


Monday's Musings: Here Come the AI Exponentials



Legacy Companies are Thinking in the Wrong Scale

Whatever gains are expected from digital transformation will be blown to shreds by AI Exponentials at a scale not seen since the advent of the internet. While at first this may sound like more AI hyperbole, early indications show that organizations that begin their journey as AI Natives by design have a tremendous advantage over the AI Enabled, who must first work down their legacy technology, cultural, and financial debt.

Constellation sees a progression of AI maturity that begins with AI Luddites and ends with AI Exponentials. Here are the defining characteristics (see Figure 1).

Source: Constellation Research, Inc.

Inside the Ten Attributes of AI Exponentials

  1. Level of AI investment. Organizations must overcome technical debt and then invest at double or triple their current rates in AI fundamentals such as data strategy, decision automation, and language models.
  2. Stage of AI maturity. Constellation’s AI maturity model progresses from augmentation through acceleration, automation, and agentic to autonomous.
  3. Value of data. How organizations value their data reflects their AI maturity. Those that take their data for granted differ sharply from those that deliberately source, curate, nurture, and renew it.
  4. Monetization models. Traditional businesses monetize the sale of products and services.  Greater levels of digitization enable experiences, outcomes, and revenue share.
  5. Means of production. This attribute assigns the default assumption of how work gets done and ranges from full human involvement to full digital labor.
  6. Agentic usage. Organizations range from no agents to multiple agents and multiple platforms.  The goal is to create an agentic ecosystem.
  7. Machine scale. This attribute addresses the percentage of human led or machine led output.
  8. Profit per employee.  As AI investments increase, expect digital labor to create massive leverage as profit per employee grows.
  9. Growth expectations. AI ushers in an era of exponential efficiency and growth. Expectations move from double-digit percentage growth to ten times and hundred times growth.
  10. Partnerships. Partnerships mature from technology providers and enablers to new data signals and data collectives that share data. The most advanced AI Exponentials build Data, Inc. business models that monetize data in complex ecosystems.

The Bottom Line: AI Exponentials are Here

AI Natives and AI Exponentials have taken the market by storm.  In three years, Constellation expects the first $1 billion services company to be staffed by 1000 FTEs.  A 100 person software company will take out a $100 billion ARR company in the next three years.  The first single person $1B ARR company will arrive within five years.  Companies that are driving millions of ARR with few employees dominate the new landscape. With AI intelligence doubling every 7 months, the pace of innovation has never been faster.  Constellation Research is tracking these new companies as the Age of AI completely changes the landscape.

Your POV

Have you made the frameshift to exponential thinking? Will you be ready for AI Exponentials?  When will you choose an AI Exponential over a legacy vendor?

Add your comments to the blog or reach me via email: R (at) ConstellationR (dot) com or R (at) SoftwareInsider (dot) org. Please let us know if you need help with your strategy efforts. Here’s how we can assist:

  • Working with your boards to keep them up to date on technology and governance.
  • Connecting with other innovation minded leaders
  • Sharing best practices
  • Vendor selection
  • Implementation partner selection
  • Providing contract negotiations and software licensing support
  • Demystifying software licensing

Reprints can be purchased through Constellation Research, Inc. To request official reprints in PDF format, please contact Sales.

Disclosures

Although we work closely with many mega software vendors, we want you to trust us. For the full disclosure policy, stay tuned for the full client list on the Constellation Research website. * Not responsible for any factual errors or omissions. However, happy to correct any errors upon email receipt.

Constellation Research recommends that readers consult a stock professional for their investment guidance. Investors should understand the potential conflicts of interest analysts might face. Constellation does not underwrite or own the securities of the companies the analysts cover. Analysts themselves sometimes own stocks in the companies they cover—either directly or indirectly, such as through employee stock-purchase pools in which they and their colleagues participate. As a general matter, investors should not rely solely on an analyst’s recommendation when deciding whether to buy, hold, or sell a stock. Instead, they should also do their own research—such as reading the prospectus for new companies or for public companies, the quarterly and annual reports filed with the SEC—to confirm whether a particular investment is appropriate for them in light of their individual financial circumstances.

Copyright © 2001 – 2025 R Wang and Insider Associates, LLC All rights reserved.

Contact the Sales team to purchase this report on an a la carte basis or join the Constellation Executive Network.


Thomson Reuters brings agentic AI to legal workflows


Thomson Reuters launched CoCounsel Legal with Deep Research and guided workflows, an AI agent designed to answer legal questions, develop and draft reports, and provide workflows for discovery and depositions.

The launch highlights how agentic AI can be used in industries such as legal, evolving from tools that require prompts to agents that can take on tasks.

Thomson Reuters has a bevy of legal products and services including CoCounsel, a generative AI assistant for legal and accounting professionals, Westlaw and Amlaw. Thomson Reuters has more than 20 billion legal documents that were used to train its models.

Thomson Reuters' data and reasoning models are complemented by domain experts. CoCounsel Legal will be a separate product addressing litigation outcomes and will also be embedded into other services such as Westlaw Advantage.

Key items:

  • CoCounsel Legal includes Deep Research, which is grounded with Thomson Reuters content and tools such as Westlaw Advantage.
  • CoCounsel Legal is built to reason, plan and deliver legal research.
  • The AI agent is designed to understand process, sourcing of answers and argument foundations.
  • CoCounsel Legal can generate multi-step research plans, trace logic, deliver Westlaw citation-backed reports, draft complaints, discovery requests and responses, and operate with humans in the loop.
  • Thomson Reuters said it tested CoCounsel with Deep Research with more than 1,200 customers and attorneys.

According to Thomson Reuters, more than 12,200 law firms, 4,900 corporate legal departments and the majority of top US Courts and Am Law 100 firms use CoCounsel.

I caught up with David Wong, Chief Product Officer at Thomson Reuters, and Omar Bari, VP of Applied Research at Thomson Reuters Labs, to talk about the approach to CoCounsel Legal with Deep Research and guided workflows. Here are the key points:

A model and cloud agnostic approach. CoCounsel Legal with Deep Research uses multiple models that are best suited for the task at hand. Thomson Reuters contracts with OpenAI, Google and Anthropic for foundational models and the big four cloud providers, said Wong.

Developing multi-agent systems. Bari said the multi-agent system behind CoCounsel Legal with Deep Research features agents for research, planning, discovery and workflow orchestration. "We needed multiple agents and the ability to launch in parallel," said Bari.

Bari said:

"We knew pretty quickly that we wanted to build a custom agent system for legal deep research that lets agents navigate Westlaw content like an expert researcher would. And that meant taking a lot of the rich content that we have in Westlaw and making it available via tools to agents and then using the best frontier models for the job." 

The importance of process. Wong said Thomson Reuters built CoCounsel Legal with Deep Research based on the process that's used by trained legal researchers. "Legal research is taught in law school as a discipline. Legal research is often different because you're trying to often support or to critique an argument or some type of legal proceeding," said Wong. "We mimicked the process used by trained researchers and the agent is autonomous. Humans trained the models and created the process steps." When the system runs it is fully autonomous, but humans are focused on validation, quality and evaluation.

Build your orchestration layer. Bari and Wong said Thomson Reuters built its own orchestration framework for CoCounsel Legal with Deep Research. "We built the agent scaffolding ourselves," said Bari. The effort required continuous tweaking for high quality function calling, instructions, evaluation, memory management and orchestration, he added.

There are agent orchestration offerings, but most are first generation. "It's pretty easy to get to a prototype or demo, but production requires a lot of work on the details and each piece of orchestration," said Bari.
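Bari's distinction between demo-grade and production-grade orchestration comes down to the scaffolding: launching specialist agents in parallel, then gathering and validating their outputs. A minimal sketch of that fan-out pattern follows; the agent roles are hypothetical stand-ins borrowed from the ones Bari mentions (research, planning, discovery), not Thomson Reuters' actual framework:

```python
import asyncio

# Hypothetical agent roles modeled on the ones Bari mentions; in a real
# system each would wrap frontier-model calls and Westlaw-style
# retrieval tools rather than a stub.
async def run_agent(role: str, task: str) -> str:
    await asyncio.sleep(0)  # stand-in for model/tool calls
    return f"[{role}] findings for: {task}"

async def orchestrate(task: str) -> list[str]:
    # Launch specialist agents in parallel, then gather their outputs
    # for a downstream synthesis/validation step (human in the loop).
    roles = ["research", "planning", "discovery"]
    return await asyncio.gather(*(run_agent(r, task) for r in roles))

results = asyncio.run(orchestrate("draft a deposition outline"))
for line in results:
    print(line)
```

A production system would replace the `run_agent` stub with model and tool calls and add the function-calling, memory management and evaluation layers Bari describes, which is exactly the "lot of work on the details" he warns about.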


Palantir's software revolution: Forget sales people, let value do the talking


Palantir CEO Alex Karp is a bit opinionated and garners his share of haters. But the returns on Palantir are attracting enterprises to the point where word of mouth among customers scales.

In a stellar second quarter, Palantir gave investors a little bit of everything. Palantir delivered revenue growth of 48% in the second quarter as US commercial and government sales surged. The company also raised its outlook.

Palantir reported second quarter earnings of $327 million, or 13 cents a share, on revenue of $1.004 billion, up 48% from a year ago. Wall Street was looking for non-GAAP earnings of 14 cents a share on revenue of $939.5 million.

In a shareholder letter, Karp quoted himself and C.S. Lewis and argued that his company "will become the dominant software company of the future."

Palantir reported second quarter US revenue of $733 million, up 68% from a year ago. US commercial revenue was $306 million, up 93% from a year ago. US government revenue was up 53% from a year ago to $426 million. The company closed 157 deals of at least $1 million and 66 of at least $5 million. Palantir landed 42 deals worth more than $10 million.

As for the outlook, Palantir projected third quarter revenue of $1.083 billion to $1.087 billion with adjusted income from operations of $493 million to $497 million. For 2025, Palantir projected revenue between $4.142 billion and $4.15 billion. US commercial revenue for 2025 will top $1.302 billion, up about 85% from a year ago. Adjusted income from operations will be $1.912 billion to $1.92 billion.

While the numbers shined, Palantir’s conference call with analysts featured a bevy of insights. Here’s a look:

AIP and enterprise traction

Ryan Taylor, Chief Revenue and Legal Officer, said enterprises are using the company's software to make LLMs work as they should. "LLMs simply don't work in the real world without Palantir. This is the reality fueling our growth," said Taylor.

He cited Fannie Mae, Citibank, Nebraska Medicine and Lear as customers seeing strong returns. Taylor also noted Palantir's AIP is gaining traction at a rapid clip. This commercial momentum started to bubble up in late 2023 with Palantir's boot camps for AIP.


"Lear Corporation recently signed a 5-year extension. Over the past 2.5 years, they have leveraged Foundry and AIP to support over 11,000 users and more than 175 use cases, including proactively managing their tariff exposure, automating multiple administrative workflows and dynamically balancing their manufacturing lines," said Taylor.

Ontology and AI FDE

Shyam Sankar outlined enterprises that have replatformed on Palantir, largely referring to moves to the company's data ontology with the help of its AI FDE (Forward Deployed Engineer). An FDE is an engineer focused on deploying Palantir and moving customers to value quickly. AI FDE, launched last month at DevCon 3, is an autonomous agent that runs on Palantir's AIP platform, delegates tasks and optimizes as needed.

Sankar said:

"A substantial development over the last couple of quarters is the realization and acceleration of our vision of Ontology web services as an architectural concept for our customers. AIP isn't just software our customers use, it's software our customers are building their software on. Software companies are re-platforming away from the highly unopinionated services and building blocks of the hyperscaler stack onto AIP with its highly opinionated building blocks that get you to value 10x faster."

Yes, Palantir's argument is that building blocks in your tech stack need to be opinionated and decisive.

Sankar also noted that Palantir's investment in AI FDE is shortening time to value.

"AI FDE is designed to enable autonomous execution across a wide array of tasks, including creating and editing ontology, building data transforms, creating functions, debugging issues and building applications. With its own closed-loop error handling, AI FDE can identify and correct issues and notify human users if needed, and it's been designed for seamless collaboration with humans in the loop through integration with AIPs branching," said Sankar.

The end of software sales?

Karp was asked whether Palantir could continue to grow without a direct sales force.

His answer was clear: Palantir isn't going to load up on sales people. No fancy dinners. No effort to "convince you to buy something."

Karp added:

"Our primary sales force now, and I think likely in the future, are going to be current customers telling other customers."

A sales army would just diminish Palantir's credibility. "Yes, you don't have 10,000 people roaming around selling something they don't understand. But the advantage is we go from once we come in the door, we come in with enormous credibility," said Karp. "The person we're selling to believes we will make them a lot of money, save them expenses or we will make their soldiers safer and more lethal."

Because the word of mouth around Palantir's value is good, the company can open with higher-level discussions with CxOs, said Karp, adding that the game is about value creation more than software. Karp indirectly took jabs at SAP on the earnings call.

While Palantir isn't going all-in on direct salespeople, it is building out its network with systems integrators including Deloitte, Accenture, Booz Allen and a bevy of others.

Time to value and outright cockiness

To say Karp and Palantir are lightning rods would be an understatement.

Karp, never shy about an opinion or two, said the ROI generated with Palantir will do the talking. "I've been cautioned to be a little modest about our bombastic numbers, but honestly, there's no authentic way to be anything but have enormous pride and gratefulness about these extraordinary numbers," said Karp.

Karp noted that Palantir's ability to push the Rule of 40 score to 94% shows the company is firing on all cylinders.
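For context, the Rule of 40 holds that a software company's revenue growth rate plus its profit margin should exceed 40%. A minimal sketch of the arithmetic; the 46% adjusted operating margin below is an assumed figure chosen to pair with the reported 48% growth, since Palantir's exact inputs aren't broken out here:

```python
def rule_of_40(revenue_growth: float, profit_margin: float) -> float:
    """Rule of 40 score: growth rate plus profit margin, both as fractions."""
    return revenue_growth + profit_margin

# Reported 48% revenue growth plus an assumed ~46% adjusted operating
# margin lands at the 94% score Karp cited.
score = rule_of_40(0.48, 0.46)
print(f"Rule of 40 score: {score:.0%}")  # prints "Rule of 40 score: 94%"
```

The point of the metric is the trade-off: a company can run low margins while growing fast, or high margins while growing slowly, and still score well; posting 48% growth with margins in the mid-40s does both at once.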

"There are almost no parasitic elements to this company. We have a small sales force. We have very little BS internally. We have a flat hierarchy. We have the most qualified interesting people heterodox on their beliefs," said Karp.

Karp said Palantir has earned the right to say what it wants. The company is teaching its customers how to attain its unit economics and telling CxOs: "If you want to have your first amendment rights to an opinion again, get our unit economics and then you too can say things that are true in public like we do."


Palantir's Q2: Growth in US commercial, government accelerates


Palantir delivered revenue growth of 48% in the second quarter as US commercial and government sales surged.

The company also raised its outlook.

Palantir reported second quarter earnings of $327 million, or 13 cents a share, on revenue of $1.004 billion, up 48% from a year ago.

Wall Street was looking for non-GAAP earnings of 14 cents a share on revenue of $939.5 million.

In a shareholder letter, Palantir CEO Alex Karp quoted himself and C.S. Lewis and argued that his company "will become the dominant software company of the future."

By the numbers for the second quarter:

  • Palantir reported second quarter US revenue of $733 million, up 68% from a year ago. US commercial revenue was $306 million, up 93% from a year ago. US government revenue was up 53% from a year ago to $426 million.
  • The company closed 157 deals of at least $1 million and 66 of at least $5 million.
  • Palantir landed 42 deals worth more than $10 million.
  • In the quarter, Palantir closed $2.27 billion of total contract value, up 140% from a year ago.

As for the outlook, Palantir projected third quarter revenue of $1.083 billion to $1.087 billion with adjusted income from operations of $493 million to $497 million.

For 2025, Palantir projected revenue between $4.142 billion and $4.15 billion. US commercial revenue for 2025 will top $1.302 billion, up about 85% from a year ago. Adjusted income from operations will be $1.912 billion to $1.92 billion.

 


Wayfair starts to reap rewards from optimization, tech replatforming efforts


Wayfair has optimized its technology and operations to the point where it can grow both its top and bottom lines.

The home retailer delivered net income of $15 million, or 11 cents a share, on revenue of $3.3 billion, up 5% from a year ago. Non-GAAP earnings for the company were 87 cents a share.

For Wayfair, the results were the best since 2021. Wayfair has suffered from a Covid pandemic boom and bust cycle. Niraj Shah, CEO of Wayfair, said "we can and will grow profitably while taking significant share in the market."

A big part of that profitability push has been a technology overhaul that largely revolved around a move to Google Cloud. When we last checked in with Wayfair, it was in the middle innings of a replatforming. Now that work is largely complete.

Shah added that Wayfair aims to invest in the future, grow current profitability and maximize free cash flow in the long run. Those goals will require continuous optimization, efficiency, AI and new features.

"Our model allows us to service the products with the best value for our customers, enabling us and our suppliers to gain share and grow revenue," said Shah.

Shah described the furniture and home goods market as "stable-ish." The higher-end market is stronger than the mass market, but overall demand is "bumping along the bottom after a few years of declines."

Here's a look at Wayfair's big initiatives and how it is flowing through to the bottom line.

Supply chain and logistics. Wayfair has an inventory light approach to its supply chain, but CastleGate, the company's proprietary logistics network, is performing well. The network covers inbound logistics, storage and outbound fulfillment.

CastleGate Forwarding is an inbound logistics and ocean freight forwarding operation that gives suppliers volume rates with carriers. Wayfair consolidates goods to ship and smaller suppliers have been increasing CastleGate usage.

According to Wayfair, CastleGate Forwarding saw a 40% year-over-year increase in total volume in the second quarter, and long-term inbound commitments are up 30% from a year ago. Wayfair is growing revenue by offering a third-party logistics service tailored to the home category.

Replatforming to Google Cloud is now mostly complete. CFO Kate Gulliver said that Wayfair's second quarter free cash flow of $230 million was its strongest since the third quarter of 2020. Capital expenditures were lower due to a technology restructuring after the replatforming.

Customer and supplier experience improvements. Shah said that Wayfair has a 2,500-person technology organization that has been focused on the replatforming of core systems to Google Cloud. Now that the migration is complete, that team is focused on increasing product velocity and innovation.

"Now that we're very far into that replatforming effort, a lot of the cycles of the team are now back building features and functions to improve the customer experience and supplier experience," said Shah. "And you see that affect things, whether it's conversion rates, enabling suppliers to do more, launching new genAI-powered features and delivering efficiency gains in our operations."

Surgical ad spending with a focus on ROI. Wayfair in the fourth quarter began investing in influencers on Instagram and TikTok. Shah said that influencer investment has performed well, but the spend is modest.

More importantly, Shah said Wayfair has lowered ad costs through "a lot of testing and enhancements to some of our measurement models." "We've also been able to identify pockets of our spend, which we do not believe were contributing at the economic payback we wanted," said Shah. "Even though they were creating some revenue for us, it was not at a cost level that would make sense to us."

AI: Generative and agentic. Shah said there are multiple consumer-facing areas where the experience is being improved by generative AI; search results, product descriptions and imagery, and accuracy are just a few.

Shah said Wayfair in the long run is looking at using AI and agents to guide customers because "there's a lot more product discovery and content around trends." Wayfair is also developing features like Decorify and Muse to give shoppers personalization based on price and style.

The company is also looking at partnerships and ways to work with LLM companies such as OpenAI, Google Gemini and Perplexity, said Shah.

Wayfair has spent recent years rightsizing its organization and teams. That restructuring has been a distraction, but now Wayfair teams can focus on new programs including Wayfair Rewards, a loyalty program; Wayfair Verified, a set of goods hand-selected and inspected by Wayfair; and logistics efforts.

"The recipe keeps getting better, the technology cycles are available to drive the business forward, and we've been launching and growing new programs," said Shah.


AWS' AI strategy: Jassy's long talking and the big picture


Amazon CEO Andy Jassy's long-winded defense of Amazon Web Services' AI strategy sure caused some consternation, but fears are likely misplaced. After all, nuance doesn't play well on Wall Street and neither do the laws of large numbers.

The hubbub over Amazon's second quarter earnings report was largely attributed to AWS' growth rate of 17.5% vs. growth rates at Microsoft Azure and Google Cloud, which were 34% and 32%, respectively.

Jassy's short answer is that part of AWS' growth rate was due to the laws of large numbers. AWS has an annual revenue run rate of $123 billion compared to Azure at $75 billion and Google Cloud at $50 billion. Backlog for AWS as of June 30 was $195 billion, up 25% from a year ago.
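The law-of-large-numbers point is easy to check against those run rates. Treating the article's run rates and growth rates as given, and back-calculating the implied year-ago bases (the dollar deltas below are derived for illustration, not reported figures), the absolute revenue each cloud added in a year comes out roughly comparable for AWS and Azure:

```python
# Back-of-the-envelope: absolute dollar growth implied by each cloud's
# annual run rate and year-over-year growth rate. Run rates (in $B) and
# growth rates are from the article; year-ago bases are derived.
clouds = {
    "AWS": (123.0, 0.175),
    "Azure": (75.0, 0.34),
    "Google Cloud": (50.0, 0.32),
}

for name, (run_rate, growth) in clouds.items():
    year_ago = run_rate / (1 + growth)  # implied base a year earlier
    added = run_rate - year_ago         # absolute dollars added
    print(f"{name}: +${added:.1f}B on a ${run_rate:.0f}B run rate")
```

On these figures AWS added roughly $18 billion in annualized revenue versus about $19 billion for Azure and $12 billion for Google Cloud, so AWS's lower percentage still represents one of the largest absolute gains.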

Turns out that AWS' AI strategy is difficult to grok given the company is focused on developers, large language model choice and building blocks. AWS is downright practical and spent its AWS Summit New York talking about the architecture and approaches needed to make AI agents scale in the enterprise.

I'd argue that messaging is needed since that's what CxOs are struggling with, but understand why an eat-your-vegetables approach isn't as invigorating as the hype machine. Nevertheless, Jassy said AWS is ramping Nvidia and Trainium2 instances as fast as it can to meet demand. Capacity is being consumed as fast as it's put in, said Jassy, who noted energy and supply constraints are the biggest blockers. "We have more demand than we have capacity at this point," he said.

On the earnings call, Jassy made the following points:

  • Inference workloads are running on AWS infrastructure and that'll grow over time as workloads move to production.
  • Enterprises are at an early stage of AI agent adoption. Cost and security are going to be huge issues as AI agents are adopted in enterprises.
  • Price and performance will matter more as enterprises scale.
  • AWS is focused on efficiency in building and deploying AI agents for enterprises.

With that backdrop let's annotate Jassy's big defense of AWS, which was blamed for Amazon shares falling Friday.

Morgan Stanley analyst Brian Nowak asked whether AWS was falling behind on AI and what to expect in the next 12 months.

Jassy said:

"I think it is so early right now in AI. If you look at what's really happening in the space, you have -- it's very top heavy. So you have a small number of very large frontier models that are being trained that spend a lot on computing, a couple of which are being trained on top of AWS and others are being trained elsewhere. And then you also have, I would say, a relatively small number of very large-scale generative AI applications.

The one category would be chatbots with the largest by a fair bit being ChatGPT, but the other category being really, I'll call it, coding agents. So these are companies like Cursor, Vercel, Lovable and some of the companies like that. Again, several of which run significant chunks on top of AWS. And then you've got a very large number of generative AI applications that are in pilot mode -- or they're in pilots or that are being developed as we speak and a very substantial number of agents that also people are starting to try to build and figure out how to get into production in a broad way, but they're all -- they're quite early.

Takeaway: Vendor talk about AI applications is way ahead of actual production deployments at enterprises.

And many of them that are out there are significant, but they're just smaller in terms of usage relative to some of those top heavy applications I mentioned earlier. We have a very significant number of enterprises and startups who are running applications on top of AWS' AI services and then -- but they're all -- again, like the amount of usage and the expansiveness of the use cases and how much people are putting them into production and the number of agents that are going to exist.

Takeaway: AWS will make money on the compute and storage that will go along with AI services as much as the AI offerings.

It's still just earlier stage than it's going to be and so then when you think about what's going to matter in AI, what's going to -- what are customers going to care about when they're thinking about what infrastructure to use, I think you kind of have to look at the different layers of the stack. And I think for those that are -- both building models, but also just -- if you look at where the real costs are, they're going to ultimately be in inference. Today, so much of the cost is in training because customers are really training their models and trying to figure out how to get the applications into production.

But at scale, 80% to 90% of the cost will be in inference because you only train periodically, but you're spinning out predictions and inferences all the time. And so what they're going to care a lot about is the compute and the hardware they're using. And we have a very deep partnership with NVIDIA and will for as long as I can foresee, but we saw this movie in the CPU space with Intel, where customers are anchoring for better price performance. And so, just like in the CPU space, we built our own custom silicon in Graviton, which is about 40% better price performance than the other leading x86 processors.

Takeaway: The value of AI will be all about inference.
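Jassy's 80%-to-90% figure follows from simple arithmetic: training spend is episodic while inference spend accrues every day a model serves traffic. A minimal sketch of that cost split, with all dollar figures invented purely for illustration (none of these are AWS or Anthropic numbers):

```python
# Hypothetical illustration of the claim that, at scale, inference
# dominates AI compute spend: training happens periodically while
# inference runs continuously. All inputs below are made-up numbers.

def inference_cost_share(train_runs_per_year: int,
                         cost_per_train_run: float,
                         inference_cost_per_day: float) -> float:
    """Fraction of annual compute spend that goes to inference."""
    training = train_runs_per_year * cost_per_train_run
    inference = inference_cost_per_day * 365
    return inference / (training + inference)

# Example: quarterly retraining at $1M per run vs. $30K/day of serving.
share = inference_cost_share(4, 1_000_000, 30_000)
print(f"{share:.0%}")  # → 73%
```

With these assumed inputs, inference already accounts for roughly three quarters of annual spend; raise the serving load or retrain less often and the share climbs toward Jassy's 80% to 90% range.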

We've done the same thing on the custom silicon side in AI with Trainium and our second version of Trainium2 is really -- it's become the backbone of Anthropic's next Claude models they're training on top of, and it's become the backbone of Bedrock and the inference that we do.

I think a lot of the inference -- it's about 30% to 40% better price performance than the other GPU providers out there right now, and we're already working on our third version of Trainium as well. A lot of the compute and the inference is going to ultimately be run on top of Trainium2.

Takeaway: Like compute, GPUs will commoditize too.

And I think that price performance is going to matter to people as they get to scale. Then I would say that middle layer of the stack are really -- it's a combination of services that customers care about to be able to build models and then to be able to leverage existing leading frontier models and then build high-quality generative AI applications that do inference at scale. And we see it for people building models, they continue to use SageMaker AI very expansively, and then Bedrock, when you're leveraging leading frontier models is also growing very substantially.

And as I said in my opening comments, the number of agents of scale is still really small in the scheme of what's going to be the case, but part of the problem is it's actually hard to actually build agents. And it's hard to deploy these agents in a secure and scalable way.

The launches we made recently in Strands, that make it much easier to build agents, and then Agent Core, that makes it much easier to deploy at scale and in a secure way, are being very well received, and customers are excited it's going to change what's possible on the agent side.

Takeaway: AWS is playing for AI at scale and that requires foundational building blocks being built now.

Remember, 85% to 90% of the global IT spend is still on-premises. If you believe that equation is going to flip, which I do, you have a lot of legacy infrastructure that you've got to move. These are mainframes. These are VMware instances, and when we build agents like AWS Transform to make it much easier to move mainframes to the cloud, much easier to move VMware to the cloud, much easier to move .NET Windows to .NET Linux to save money, those are compelling for enterprises -- or things like Kiro that allow customers to develop in a much easier way and in a much more structured way, which is why I think people are excited about it.

I really like the inputs and the set of services that we're building in the AI space today. Customers really like them and they're resonating with them. I still think it's very early days in AI and in terms of adoption. But the other thing I would just say is this: remember, we're at a stage right now where so much of the activity is training and figuring out how to get your generative AI applications into production.

Takeaway: The core cloud business is just fine.

People aren't paying as close attention as they will to making sure that those generative AI applications are operating where the rest of their data and infrastructure are. Remember, a lot of generative AI inference is just going to be another building block like compute, storage and database. And so people are going to actually want to run those applications close to where the other applications are running, where their data is.

There's just so many more applications and data running in AWS than anywhere else. And I'm very optimistic about as we get to a bigger scale what's going to happen to AWS on the AI side. And I think we have a set of services that is unique top to bottom in the stack. I think on the last part about what do we expect with respect to acceleration, we don't give guidance by segment.

But I do believe that the combination of more enterprises who have resumed their march to modernize their infrastructure and move from on-premises to the cloud, coupled with the fact that AI is going to accelerate in terms of more companies deploying more AI applications into production that start to scale, coupled with the fact that I do think that more capacity is going to come online in the coming months and quarters, make me optimistic about the AWS business."

Takeaway: AWS is playing the long game, and it's a somewhat boring-is-beautiful approach to AI.

And yes, Jassy's defense could have been tighter.

 


Enterprise technology customers look to AI, efficiency to combat uncertainty

Enterprises are leveraging artificial intelligence to drive efficiencies designed to offset everything from tariffs and inflation to the cost of growth investments.

Cognizant CEO Ravi Kumar said: "The AI opportunity is a double engine transformation for our clients, both on productivity and innovation. In the second quarter, we delivered a healthy combination of wins in AI efficiency-led large deals and innovation-led projects with Agentic AI unlocking new revenue pools and spend cycles."

Also see: Infosys sees good demand for AI agents

This theme has surfaced repeatedly from integrators, vendors and customers alike. Here's a look at what CxOs are saying on second quarter earnings calls.

UPS

UPS has a program called Efficiency Reimagined to drive process efficiency and digital efforts. "We are redesigning end-to-end processes to drive savings, like a new global payment strategy. Here, we've centralized how we make and receive payments under a digital-first strategy, which will drive efficiency for UPS and improve the customer experience," said UPS CEO Carol Tome.

The company is also offering digital services that are enabling customers to remap supply chains.

"In the second quarter, nearly 90% of all cross-border transactions were processed digitally. Given our proven trade expertise and vast global network, our customers are coming to us for solutions that will help them navigate tariff uncertainty," said Tome. "In fact, so far this year, we've engaged in over 600 supply chain mapping assessments to help customers visualize, evaluate and optimize their global supply chains, including looking at opportunities for nearshoring."

UnitedHealth Group

UnitedHealth Group is looking to AI to drive efficiency efforts.

Patrick Conway, CEO of Optum, a unit of UnitedHealth, said on the parent company's second quarter earnings call: "We are aggressively advancing operational disciplines across our portfolio of businesses. The more concentrated operating model I mentioned earlier plays into more standardized approaches, predictable outcomes and lower operating costs. We will complete the final stages of our technology integration, which will enable meaningful advances with emerging technologies like AI to drive efficiency gains. For 2026, we expect to deliver almost $1 billion in cost reductions."

Royal Caribbean

Royal Caribbean Cruises President and CEO Jason Liberty said the company is looking to drive efficiency as well as revenue through better customer experiences.

Liberty said:

"We're utilizing disruptive technology like AI and other tools to be able to -- to manage 15 million price points a day and to be able to listen to what our customers are looking for and curate what our customers are looking for that are relevant to them. That enhances the experience for them, takes friction out of the experience and also allows us to be more efficient and gain more margin."

Merck

Merck CFO Caroline Litchfield said the company is looking to save $3 billion in costs to reinvest in higher growth businesses.

"In terms of this $3 billion saving opportunity, which will come through productivity across our enterprise. It will impact the R&D line, SG&A as well as cost of goods. That said, we will reinvest all of that $3 billion plus further investments, especially in R&D, given the strength of our pipeline as well as in SG&A over time as we launch the new products and look to excel in the marketplace with those launches in order to drive long-term growth for our company."

Google

Google CEO Sundar Pichai:

"As we ramp our AI investments, we continue to focus on driving improvements in productivity and efficiency to offset growth in technical infrastructure-related expenses, particularly from higher depreciation."

Waste Management

Waste Management President and Chief Operating Officer John Morris:

"One of the clearest indicators of the progress we're making is our ability to consistently reduce operating costs as a percentage of revenue. Structurally lowering our cost base isn't about temporary cuts, it's about using technology and process discipline to build a more efficient, scalable model for the long term, and our team delivered that in Q2. The second quarter marks a record period in which we achieved operating expenses below 60% of revenue.

This reflects the significant progress we've made in connecting the full value chain of WM from routing and fleet management to customer communication and maintenance. Our connected fleet continues to serve as a key differentiator. We achieved a 70 basis point improvement in repair and maintenance costs as a percentage of revenue in the second quarter as real-time telematics are helping us anticipate and resolve vehicle issues faster, reduce downtime and streamline maintenance scheduling."

ADP

ADP CEO Maria Black:

"On the AI front, we continued the rollout of ADP Assist which provides the latest AI-driven capabilities into our products, and we're seeing fantastic engagement from our clients with millions of interactions in fiscal '25.

To further build on our unmatched expertise, we have also deployed these tools across ADP to thousands of our associates, driving efficiencies in our sales, service and technology functions. By coupling our decades of experience with our significant data insights and AI investments, we are simplifying work for our associates and elevating the end-to-end client experience."

Hershey

Hershey CEO Michele Buck said:

"As we got hit by some of the record high cocoa prices early on, we stated that our approach was going to be taking a long-term approach to ensure continued category health, and we've done that. We've continued to spend on our brands, we've invested in technology with our new ERP platform, and then new AI and tech-enabled capabilities that have driven significant efficiency, whether in the transformation program or other places."


ServiceNow, Salesforce invest $1.5 billion in Genesys as Five9 CEO retires

ServiceNow and Salesforce will invest $1.5 billion in Genesys, a cloud customer experience platform.

Genesys said that the proceeds from the ServiceNow and Salesforce investment will be used to buy shares from existing equity holders. Hellman & Friedman and Permira will remain majority shareholders of Genesys.

According to the company, Genesys Cloud has $2.1 billion in annual recurring revenue as of the first quarter ended April 30, good for growth of 35%.

ServiceNow and Salesforce already have partnerships and integrations with Genesys. CX Cloud from Genesys and Salesforce integrates Genesys Cloud and Salesforce Service Cloud. Unified Experience from Genesys and ServiceNow combines Genesys Cloud and ServiceNow Customer Service Management.

The contact center market has been picking up. Nice acquired Cognigy for nearly $1 billion. Genesys competes with Nice, Five9, Zoom, Amazon Connect and Microsoft Dynamics Contact Center, to name a few.

Also see: Constellation ShortList™ Contact Center as a Service (CCaaS)

Separately, Five9 CEO Mike Burkland said he will retire from his role. Five9 reported second quarter revenue of $283.3 million, up 12% from a year ago, with net income of $1.2 million. The company projected 2025 revenue between $1.1435 billion and $1.1495 billion.

 


Apple Q3 strong ahead of iPhone launches

Apple's third quarter results were better than expected as the company delivered 10% revenue growth from a year ago.

The company reported earnings of $1.57 a share on revenue of $94 billion. Wall Street was looking for June quarter earnings of $1.43 a share on revenue of $89.54 billion.

In a statement, CEO Tim Cook said the company's June quarter saw strong growth across product lines and geographies.

By the numbers in the third quarter:

  • iPhone sales were $44.58 billion, up from $39.3 billion.
  • Mac revenue was $8.05 billion, up from $7 billion.
  • iPad revenue was $6.58 billion, down from $7.16 billion.
  • Wearables revenue was $7.4 billion, down from $8.09 billion.
  • Services sales were $27.42 billion, up from $24.2 billion.

Apple has been under fire for its slow-moving AI strategy and the pace of its progress with Apple Intelligence.

Cook said on Apple's conference call:

  • "We saw an acceleration of growth around the world in the vast majority of markets we track, including Greater China and many emerging markets. And we had June quarter revenue records in more than two dozen countries and regions, including the U.S., Canada, Latin America, Western Europe, the Middle East, India and South Asia. These results were driven by double-digit growth across iPhone, Mac and Services."
  • "We see AI as one of the most profound technologies of our lifetime. We are embedding it across our devices and platforms and across the company. We are also significantly growing our investments. Apple has always been about taking the most advanced technologies and making them easy to use and accessible for everyone. And that's at the heart of our AI strategy."
  • "The situation around tariffs is evolving, so let me provide some color there. For the June quarter, we incurred approximately $800 million of tariff-related costs. For the September quarter, assuming the current global tariff rates, policies and applications do not change for the balance of the quarter and no new tariffs are added, we estimate the impact to add about $1.1 billion to our costs. This estimate should not be used to make projections for future quarters as there are many factors that could change, including tariff rates."

 
