
AGI may be far away, but 'jagged AI' will still take jobs

The enterprise AI market appears to be embracing a little nuance, and CxOs would be wise to avoid banter about artificial general intelligence and think about systems that drive returns in the real world. Those returns will likely be generated with fewer workers.

Microsoft CEO Satya Nadella had an interesting exchange with a Wall Street analyst who asked about AGI, Microsoft's deal with OpenAI and the never-ending rise in capital expenditures. The exchange followed Microsoft’s strong first quarter results.

Nadella said: "How are these AI systems going to truly be deployed in the real world, make a real difference and make a return for both the customers who are deploying them and then obviously, the providers of these systems? Even as the intelligence capability increases, the problem is it's always going to still be jagged. You may even have a capability that's fantastic at a particular task, but it may not uniformly grow. So, what is required is in fact, these systems, whether it is GitHub Agent HQ or the M365 Copilot system. Don't think of this as a product. Think of it as a system that in some sense smooths out those jagged edges and really helps the capability."

As for AGI, Nadella said it's not coming anytime soon. "I think we will be in this jagged intelligence phase for a long time," said Nadella. "We feel very good about building these systems as organizing layers for agents to help customers. I feel pretty good about the progress in AI, but I don't think AGI as defined by us in our (OpenAI) contract is ever going to be achieved anytime soon. I do believe we can drive value for customers with advances in AI models by building these systems."

What about returns for Microsoft? Don't sweat that one. Microsoft generated first quarter free cash flow of $25.7 billion, up 33% from a year ago, even as it stepped up capital expenditures to $34.9 billion with half of that sum going to GPUs and CPUs. CFO Amy Hood said Microsoft is capacity constrained through the fiscal year but is seeing strong demand signals for AI and returns ahead.

In other words, Microsoft doesn't have to chase AGI. Neither does Alphabet, which said it will spend $91 billion to $93 billion in 2025 capital expenditures. Alphabet is seeing strong demand for its TPU instances as well as its Google Cloud AI services and Nvidia-based offerings.

Alphabet CEO Sundar Pichai said the company is seeing extensive productivity gains from AI and monetizing advances in multiple ways: advertising, YouTube and Google Cloud. The pace of big-bang LLM developments may be more spaced out, but the returns are there.

In addition, enterprises are taking a more systems-based approach to AI and focusing on multiple models and agents. "Our packaged enterprise agents in Gemini Enterprise are optimized for a variety of domains, are highly differentiated and offer significant out-of-box value to customers. We have already crossed 2 million subscribers across 700 companies," said Pichai.

At the Google Public Sector Summit, a few panels noted that "2026 will be the year of AI ROI."

This systems-based approach to AI, agent layers and governance is why ServiceNow CEO Bill McDermott could hardly contain himself on the company's third quarter earnings call.

One product that's resonating with customers is ServiceNow's AI Control Tower. That interest highlights how agentic AI, various models and governance are more the story than AGI or whatever OpenAI's Sam Altman is cooking up this week.

ServiceNow's Amit Zavery, Chief Product and Operating Officer, said: "What we have done is we have integrated all the different systems out there to give you full visibility and control. It resonates instantly."

All of these comments sound very pragmatic. But there are already signs that this productivity is going to lead to fewer jobs. The job market may not be so much fire as no hire.

You don't need AGI for a white collar recession

First, let's start with the obvious. This week was when a lot of folks seemed to realize at once that we're in a white collar recession. To be realistic, the white collar job losses have been mounting for two years (you only need to ask someone who lost a gig and can't find one). And now the recent data points are adding up, and it appears AI is taking jobs.

UPS cut 14,000 corporate jobs and 35,000 total. The disclosure, which was made on UPS's third quarter earnings call, boils down to AI and automation and the need to align headcount with volume declines.

Brian Dykes, CFO of UPS, said: "We finished down nearly 34,000 positions year-over-year, which includes a reduction from our driver voluntary separation program. Nearly 1/3 of the reductions occurred in September."

UPS CEO Carol Tome noted that UPS is automating everything it can.

Amazon did its part to ding white collar jobs when it announced that it was cutting 14,000 corporate jobs. The rationale? AI requires fewer layers of management. Yes, most companies talk about augmenting human workers, but the reality is you just don't need as many people. Also see: AWS fires up Project Rainier, Trainium2 cluster for Anthropic

Other companies are either cutting white collar jobs (Target, GM) or holding the line on hiring (JPMorgan Chase). Alphabet said it is holding the line on headcount.

Alphabet CFO Anat Ashkenazi said generative AI is enabling the company to become more efficient across multiple fronts, including headcount and infrastructure. "This is not a onetime effort but an ongoing way in which we operate the business," she said.

And this isn't just corporate America. At the Google Public Sector Summit in Washington DC, it was a common belief that AI agents were going to take on more work.

Ed Van Buren, an applied AI strategic growth offering leader at Deloitte, said upskilling will be critical. "Most federal agencies are smaller than they were last year. But still government has critical work that has to get done. It's going to be important for industry to help out a smaller government workforce. The Trump Administration is saying very directly that AI and emerging technologies are going to augment the existing remaining Federal workforce," said Van Buren.

A few thoughts:

  • Coming out of Constellation Research's Connected Enterprise agentic AI discussions, I was more optimistic about humans and their ability to find work. This week, I'm back to thinking we're going to need a lot fewer people to get work done. The point? The AI vs. human employment reality is going to ebb and flow, as will your emotions. See: CCE 2025: AI agents: Dreams, reality and what's next

  • No government has an answer or even rough plan for these job losses. Something that stuck with me from both Constellation Research's AI Forum in Washington DC and Google Public Sector is that more will be done with less due to AI. Aside from vague talk of upskilling, retooling work and retraining humans there's no plan for dealing with the labor losses. AI is one reason why the economy is being revamped with a manufacturing spin, but once the construction on AI factories is done where's the work? Should manufacturing come back to the US, there will still be fewer people and more robots.
  • Ultimately, we get to a place where AI-enabled entrepreneurship will be rewarded. However, it's unclear whether everyone is suited to be an entrepreneur.
  • With any luck AI will be like previous technology shifts where humans adapt and new roles are created. The disruption in between will take years to play out.

The Ghosts That Hold Us Back: How Leaders Can Get Out of Their Own Way | DisrupTV Ep. 416

In this episode of DisrupTV, co-hosts R “Ray” Wang, CEO of Constellation Research and best-selling author of Disrupting Digital Business, and Vala Afshar, Chief Digital Evangelist at Salesforce, sit down with two powerful voices reshaping the conversation around leadership and innovation:

  • Scott D. Anthony, author of Epic Disruptions: 11 Innovations That Shaped Our Modern World, and
  • Muriel Maignan Wilkins, author of Leadership Unblocked: Break Through the Beliefs that Limit Your Potential.

Together, they explore how innovation and leadership intersect—and why the key to unlocking the future often lies in looking inward.

From Florence Nightingale to AI: The Patterns of True Disruption

Scott D. Anthony’s Epic Disruptions traces the arc of innovation across history, from the printing press and gunpowder to AI and clean technology. One of his favorite examples? Florence Nightingale, whose pioneering work in data visualization, public health reform, and education redefined modern healthcare.

By using data to tell a story, Nightingale proved that ideas gain power when they connect emotionally and intellectually. Her ability to pair insight with communication sparked system-wide change—an essential lesson for today’s innovators.

  • “Florence Nightingale didn’t just collect data—she told a story that moved the world.”

Anthony argues that true disruptors act at a systemic level, combining vision with storytelling to drive enduring transformation. He also notes that the dynamics of disruption—whether in healthcare, energy, or AI—are remarkably consistent: bold ideas collide with entrenched systems, and progress depends on leaders who can manage that tension.

When “Disruption” Gets Misunderstood

Anthony also revisits the legacy of Clay Christensen, who first introduced the concept of disruptive innovation. He points out that many modern organizations misuse the term, labeling any change or new product as “disruptive.” This dilution of meaning matters because it leads to confusion—and often, misguided strategy.

Christensen himself might have seen technologies like generative AI as “new capabilities with new downsides,” a reminder that innovation is never purely positive. The leaders who succeed are those who stay grounded in reality—balancing vision with humility.

The Ghosts That Haunt Innovation

Transitioning from the external to the internal, Muriel Maignan Wilkins offers a complementary perspective: innovation doesn’t fail because of bad ideas—it fails because of blocked leaders.

In Leadership Unblocked, she identifies seven common beliefs that quietly sabotage potential, including:

  • “I need to be involved.”
  • “I can’t say no.”
  • “I know I’m right.”
  • “I can’t make mistakes.”

These beliefs often come from early experiences, past successes, or a leader’s “origin story.” Left unexamined, they become ghosts—the internal voices that keep people from evolving.

  • “Every organization has ghosts. The trick is learning which ones still serve you—and which ones are holding you back.”

The Power of Self-Awareness and Coaching

Wilkins emphasizes that the first step to unblocking leadership is self-awareness—recognizing your role in the obstacles you face. Leaders often blame external forces (the market, their teams, or technology) when the real issue lies in their own habits or fears.

Her process of recognize → reframe → rebuild helps leaders replace limiting beliefs with empowering ones. This mindset shift not only transforms individuals but also inspires the teams they lead.

Coaching, she adds, is a critical tool for this process. Great leaders, like legendary basketball coach John Wooden, lead through questions, curiosity, and compassion—not control. Effective leaders don’t just manage—they coach their people to see and solve problems differently.

Why Mindset Drives Innovation More Than Strategy

Both Anthony and Wilkins agree: the biggest disruptions don’t start with technology—they start with a mindset shift. From Florence Nightingale to the rise of AI, innovation requires courage, empathy, and adaptability.
And from legacy corporations to startups, leadership success now depends on self-reflection as much as on strategic skill.

  • “The hardest part of innovation isn’t the technology—it’s the mindset.”

As DisrupTV co-host R "Ray" Wang notes, today’s most transformative leaders are those who know when to pause, question assumptions, and unlearn outdated beliefs.

Key Takeaways

  • Disruption is systemic. The same forces that shaped the printing press and the compass are reshaping AI and clean tech today.
  • Self-awareness is innovation fuel. You can’t disrupt your industry if you’re stuck in your own beliefs.
  • Storytelling is power. As Florence Nightingale proved, data alone doesn’t change the world—stories do.
  • Leadership starts with unblocking. Before you can transform your organization, you must first transform yourself.

Final Thoughts: Innovation Starts Within

Leading through disruption isn’t about reacting to the latest trend or technology—it’s about developing the self-awareness and courage to evolve from within.

When leaders confront their hidden blockers and shift their mindset, they unlock the potential not only to adapt—but to define the next era of innovation.

Watch the full episode of DisrupTV to hear Scott D. Anthony and Muriel Maignan Wilkins discuss how to break through hidden blockers, master your mindset, and lead with purpose in an age of constant change.


Google Public Sector Summit 2025: Takeaways on data, AI, ROI from 7 technology leaders

Public sector technology executives laid out a series of takeaways and best practices at the Google Public Sector Summit in Washington, D.C., in October. The takeaways ranged from focusing on your data foundation to use cases for artificial intelligence (AI) agents and the importance of training and human-in-the-loop processes.

Here’s a tour of what seven public sector technology leaders had to say (PDF).

More from Google Public Sector Summit 2025:

It's all about data first

Dr. Chrysoula Malogianni, Associate Vice President of Innovation at ODU: Your data is everything

Old Dominion University worked with Google Public Sector to launch MonarchSphere, a platform that weaves AI throughout ODU's student experience, research and operations.

Malogianni said your AI success largely depends on your data. “Have a data plan. AI is not a catastrophe or a panacea. AI can’t do anything. You need robust data. You need infrastructure and a data foundation so you can validate AI. You need to also start preparing your target population for AI adoption. If we don't understand the AI, you won’t have a plan.”

She added that ODU put a lot of work into the data foundation along with Google Public Sector. Among the key assets:

  • 20 years of recorded courses and data that can be combined with real-time data from interactions
  • Notebooks for mind maps and course outlines to create assistants with the help of instructional designers
  • Data types from transcripts, advisers, and student interests
  • Combined course data and public data to enable students to create personalized journeys

And don’t forget the leadership. “It’s important to have visionary leadership, because transformation doesn’t start from technology. It starts from visionary leadership, appropriate partnership, and having a good plan,” said Malogianni.

See: Old Dominion, Google Public Sector Create AI Incubator

Matthew Gunkel, CIO, University of California, Riverside: Data Governance Matters

Matthew Gunkel, CIO at the University of California, Riverside, said on a panel at the Google Public Sector Summit that he has been working with Google Workspace and Gemini to speed up code transformations, surface unstructured data, and pull together different data sources.

“On the student success side, we really had an opportunity to look at where we can do advanced forecasting and modeling and then work to really use data governance. We’ve been putting data governance in place over the last 12 to 18 months to accelerate our ability to manage a resource-constrained environment. We have a lot of classroom challenges where we don’t have seats and we have too many students. How can we forecast and plan effectively and efficiently but also communicate that back to the degree programs?”

Gunkel said the data governance strategy has enabled the university to start to deploy Gemini agents on top of broad institutional data in Google Cloud BigQuery.

Data governance has been critical, because AI requires an organization to “manage information much more closely and frequently with purpose,” he added.

“AI really is worthless without the data behind it,” said Gunkel, who noted that the university is working closely with Google on Gemini use cases but needs the data strategy aligned.

Key items for UC Riverside include:

  • Improving enrollment support leveraging data and AI to inform students what degree programs are available
  • Semantic data mining to leverage unstructured information from transcripts
  • Automating transcript verification and using AI to deliver high-quality plans of action

Use Cases Abound

Ted Ross, CIO of the City of Los Angeles: Use Cases and Training Matter

The City of Los Angeles and Google Public Sector announced a partnership that layers Google Workspace with Gemini throughout the city’s transformation.

Speaking on a panel, Ross laid out some tips and best practices:

  • Use cases are not hard to find. Ross said that in government, use cases abound for areas where AI can improve efficiencies. Information dissemination and analysis are big ones. “In emergency management, AI has the ability to synthesize real-time information from utilities, cities, counties, states, and multiple jurisdictions,” said Ross. “This information also has to be multilingual.”
  • Ross added that it helps to think through AI use cases in terms of personas. “Think from the perspective of personas like the broad workforce, managers, front lines,” said Ross.
  • Don’t scrimp on training. “I’m a huge fan of training and giving employees an intro to AI,” said Ross, who added that the training and use of AI are critical to employee engagement. “AI is a once-in-a-generation shift of how people are computing, and you have to train the workforce so you can launch them into the future and build AI fluency. Make the investment in training now.”

From left to right: Ted Ross, CIO, City of Los Angeles; Kenneth Zellers, Commissioner, State of Missouri; Richard Smyth, Associate Vice President of Innovation and IT Services, Georgetown University; and Tony Orlando, Managing Director, Partner, and Specialty Sales, Google Public Sector.

Richard Smyth, Associate Vice President of Innovation and IT Services, Georgetown University: Student Lifecycle Management

Speaking on a panel, Smyth said Georgetown is leveraging AI and Gemini to improve the lifecycle of students through different phases, all the way to becoming alumni.

“What we want to do is make sure we can create an experience that allows students to succeed, successfully deliver on their program, but also to become strong alumni and to give back to the university as well as to the community,” Smyth said. “We think about the touchpoints that happen all the way through that journey. Historically, those systems and processes don’t interconnect, so we’re thinking of ways that we can use Gemini and Google Workspace to connect those processes and systems to create the ultimate experience.”

Key points from Smyth about Georgetown’s approach to AI:

  • “We’re focused on customer-centricity but also value.”
  • Georgetown needed to interact with a wide variety of departments and teams that engage the student community.
  • Training to use AI correctly was a necessity.
  • Solving pain points for individuals in various departments meant that those individuals bought into the AI plan.
  • Time was a big value driver. “We were going through this journey with the pilot with 60 to 80 people, and one of the things that came out was time savings. Over the course of the year, we could save $650,000 to $700,000, and that was just with the folks in the pilot program. Imagine if you scaled that across the university,” said Smyth.

Ultimately, Smyth said, the goal is to use that time savings to shift more of the focus to the student experience.

Kenneth J. Zellers, Commissioner of the State of Missouri: AI Enables Digital Services

The State of Missouri is focused on digital citizen services at the speed of business, said Zellers.

To become digital, Zellers said, AI has to be able to collapse silos. “We have 17 different departments, and our 6.2 million customers can go online and access any of them,” he said. “It’s not a facade. We’re using AI to move people to various portals. When the AI is good, it’s efficient.”

Zellers said there are multiple use cases across the state government. Here are a few:

  • Bill reviews where AI can speed up the compare-and-contrast process with previous versions: “You should still go back through it, but AI saves what used to take hours and hours and compresses them to a few minutes,” said Zellers.
  • Department of Revenue Answers (Dora), a chatbot that gives answers to taxpayers: There have been more than six million interactions.
  • Wrangling facility management, design, and construction projects: Zellers said the goal is to use AI and data from multiple assets and construction projects and become more predictive to generate savings.
  • Like other CxOs in the public sector, Zellers emphasized training and finding influencers who can lead others to use AI tools. He said the state focused on introductory AI training that included workers on every level.

“We invited people from senior level,” he said. “We invited people from frontline and administrative staff. But a lot of people forget that the administrative staff is in the middle. Sometimes they get left behind. Those are the true influencers. So, we had the initial training, and then we started getting calls because people go back to the department and talk about what they saw.”
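The bill-review use case Zellers described above has a deterministic "compare" step before any AI summarization happens. As a minimal sketch, Python's standard-library difflib can produce the version-to-version diff that an AI layer would then condense; the bill text below is invented for illustration.

```python
# Sketch of the deterministic compare step in bill review: a unified diff
# between two versions of a bill, which an AI summary layer could condense.
# The bill language here is invented for illustration.
import difflib

old = ["Section 1. The tax rate shall be 4.0 percent.",
       "Section 2. Effective January 1."]
new = ["Section 1. The tax rate shall be 4.5 percent.",
       "Section 2. Effective January 1."]

# lineterm="" keeps the header lines free of trailing newlines
diff = list(difflib.unified_diff(old, new, fromfile="v1", tofile="v2", lineterm=""))
print("\n".join(diff))
```

The diff surfaces only the changed section (the rate change in Section 1), which is the "hours to minutes" compression Zellers described: reviewers read the delta, not the whole bill.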

Returns on Investment

Mansour Sharha, Innovation and Technology Director for the City of Dearborn: Build Trust, Drive ROI

Sharha said the City of Dearborn is one of the more diverse cities in the U.S., home to residents from multiple Middle Eastern countries who speak a range of dialects.

The big use case for Dearborn was using Gemini to translate documents and assist residents. Instead of using a human agent to translate, Gemini was able to solve problems via a chatbot.

“When we started the model, we provided 10 to 15 questions that residents asked, and two years later, we’re actually addressing more than 75 questions,” said Sharha. “That provided a huge value for us in terms of staffing and efficiencies.”

Sharha said Dearborn can now use its call center for more complicated queries without adding more staff.

To deploy AI, Sharha said, he had to earn the trust of each department. “We started from the bottom up and really listened to the people who will be using the technology,” he said. “A lot of people are afraid of AI. It’s really about learning AI so you can be more efficient. We built that trust and focused as a team on providing [added value] to each department.”

Use cases include:

  • Using Gemini to translate documents into different languages for citizens.
  • Request-for-proposal (RFP) responses via AI. Sharha said Dearborn created a checklist of what an RFP should include, and now the process is automated at the front end. Humans do the evaluation once RFPs are culled from 30 to 50 down to a handful.
  • Planning and zoning commission requests are also sped up with AI, and the approval process has gone from four to five weeks to three to five days.
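Dearborn's front-end RFP culling can be thought of as a checklist filter: responses missing required sections are screened out before humans evaluate the rest. The sketch below is hypothetical — section names, the tolerance parameter, and the data shape are all invented, not Dearborn's actual system.

```python
# Hypothetical sketch of checklist-based RFP culling, loosely modeled on the
# Dearborn approach described above. Section names and thresholds are invented.

REQUIRED_SECTIONS = {"scope of work", "pricing", "timeline", "references"}

def missing_sections(rfp_text: str) -> set[str]:
    """Return the required checklist sections absent from an RFP response."""
    text = rfp_text.lower()
    return {s for s in REQUIRED_SECTIONS if s not in text}

def cull(responses: dict[str, str], max_missing: int = 0) -> list[str]:
    """Keep only responses whose missing-section count is within tolerance."""
    return [name for name, text in responses.items()
            if len(missing_sections(text)) <= max_missing]

responses = {
    "vendor_a": "Scope of Work... Pricing... Timeline... References...",
    "vendor_b": "Pricing... Timeline...",  # incomplete response
}
print(cull(responses))  # ['vendor_a'] — only the complete response passes
```

The point of the design is the division of labor Sharha described: a mechanical checklist cuts 30 to 50 responses down to a handful, and humans evaluate only the survivors.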

Looking ahead, Sharha said, Dearborn has plans to leverage AI for police and fire data and analytics.

Marcie Kahbody, Deputy Secretary of Technology and Agency Information Officer, California State Transportation Agency: Human in Loop

The California State Transportation Agency (Caltrans) has terabytes of siloed data that’s hard to get to but nevertheless a big asset. Caltrans collects data from 39,000 ground detectors, transportation management systems, and 3,000 cameras. The plan is to use that data to support the Caltrans vulnerable roadway users (VRU) plan, designed to end road fatalities and serious injuries on California roadways by 2050.

The agency started a pilot with Google Public Sector focused on wrangling the data and using it to improve safety. “We started with a sandbox so we don’t put any personal identifiable information (PII), and then we started sharing it with Gemini,” Kahbody said.

Caltrans conducted risk assessments, ran proofs of concept (POCs), and laid out needs with cybersecurity professionals, she added. Today Caltrans engineers and analysts can look at California Highway Patrol collision data to remediate high-collision areas in minutes, compared with a few days before.

"With VRU, it’s now just a click of a button to analyze,” she said.

However, you still need humans in the loop. "We always have a human in the loop. The engineer looks at the data closely to make sure that it’s valid and there’s no hallucination," said Kahbody. "It’s saving us a lot of time and provides more time for our engineers to do the tasks that are more valuable."

The project will move to production in January, she said. Reports will include recommendations to make roads safer, including better traffic signals, pedestrian lines, and areas that could be remediated. The returns will be measured in decreases in road fatalities and injuries.

Going forward, Caltrans will be launching a series of POCs and moving them into production. “We serve 23,000 employees,” said Kahbody. “Communication and change management is huge. We had a lot of communication with the unions, labor relations, and legal folks to understand that AI is not replacing anybody. Our strategy is providing tools to help engineers do a better job so they have more time to be strategic.”


How DocuSign Uses AI to Drive Customer-Centric Transformation

Esteban Kolsky interviews Marc LeCours, Global Customer Services leader at DocuSign, about the company’s impressive journey from single-product e-signature startup to a $3B leader delivering end-to-end intelligent agreement management. 📈

Marc shared how DocuSign leverages Salesforce integration, adopted Certinia to deliver consistent customer outcomes, and now uses #AI to proactively enhance #CX. He emphasized the importance of a customer-centric mindset and the balance between automation and the human touch.

If you want real-world examples of #digital transformation, scaling operations, and using AI for tangible #business outcomes, you’ll find plenty of value in hearing Marc’s perspective! 👇


AWS Startup Partner Week: Interviews with Julia Chen and Matt Yanchyshyn

Don't miss this interview with AWS superstars Julia Chen and Matt Yanchyshyn at the Amazon Web Services (AWS) Startup Partner Summit. Their insights on startup success, AI-driven growth, and AWS’s robust partner support are shaping the next wave of innovation. 

💡 Key topics included product-led growth, exclusive AWS programs, and how co-selling fuels rapid scale for startups and partners. Enterprise technology leaders: Learn how partnership opportunities can help your organization lead in a rapidly evolving landscape.


AWS re:Invent 2025 to feature Trainium3, scaling secure AI agents

Amazon Web Services' custom AI chip Trainium2 is fully subscribed, and Trainium3 will feature about a 40% performance boost and the ability to reach more customers, said Amazon CEO Andy Jassy.

Jassy used a strong third quarter earnings report and AWS launch of Project Rainier, a cluster of up to 1 million Trainium2 processors used to train Anthropic's next Claude model, to lay out a few big themes you'll hear from AWS re:Invent 2025 in less than a month.

Leading up to Amazon's quarter, the narrative was that AWS is trailing Microsoft Azure and Google Cloud in AI. The latter features its own custom TPU AI chips, which have also landed Anthropic as a customer. Note that the "AWS is trailing in AI" narrative also played out last year leading up to re:Invent.

With that backdrop, Jassy made it clear that AWS's custom AI chips are going to broaden the market for enterprise AI and give customers more price-performance options. AWS also sees Amazon Bedrock with Trainium3 as the lead inference engine and a business that'll be as big as EC2.

Jassy also said the tipping point for more AWS customers will be AI agents that are secure. Like Google Cloud and its integrated AI stack, AWS has its own version that'll feature Amazon SageMaker and Amazon Bedrock along with AgentCore running on Trainium. However, Jassy was quick to note that AWS buys a lot of Nvidia, AMD and Intel too.

This integrated AI stack, Trainium3 and the building of secure AI agents will be the core themes of re:Invent, which appears to be a little more speeds and feeds than in recent years. Here's a look.

Trainium scales

Now that Project Rainier is operational, Jassy has a lot more to talk about when it comes to AI infrastructure, Trainium prospects and capacity.

He said AWS has added more than 3.8 gigawatts of power in the past 12 months, double AWS' capacity in 2022. AWS will double capacity again by 2027 and plans to add at least another gigawatt of power in the fourth quarter.

Jassy added that the additional capacity includes power, data centers and chips, mostly Trainium and Nvidia.

The Anthropic effort also gives AWS a flagship customer where it can expand.

"Trainium2 continues to see strong adoption, is fully subscribed and is now a multibillion-dollar business that grew 150% quarter-over-quarter," said Jassy. "Today, Trainium is being used by a small number of very large customers, but we expect to accommodate more customers starting with Trainium3."

Jassy said AWS is monetizing capacity as soon as it comes online. Amazon has spent nearly $90 billion on capital expenditures in 2025, and most of it relates to AWS infrastructure and Trainium. Some of that sum, expected to hit $125 billion for the full year, goes to fulfillment and transportation for commerce.

Trainium2 has a few very large customers, noted Jassy, and Trainium3 will broaden the base. "As customers start to contemplate broader scale of their production workloads, moving to being AI-focused and using inference, they badly care about price performance," said Jassy. "We have a lot of demand for Trainium. Trainium3 should preview at the end of this year with much fuller volumes coming in the beginning of '26. We have a lot of customers, both very large and medium-sized, who are quite interested in Trainium3."

Amazon's custom chips are made by Annapurna Labs, a chip designer acquired in 2015 for $350 million. Along with 2012's $750 million purchase of Kiva, a robotics company that enabled Amazon to automate its warehouses, Annapurna ranks among the company's smartest acquisitions.

AI agents: Secure, horizontal

"We're bringing the same building block approach to AI. SageMaker makes it much simpler for companies to build and deploy their own foundation models. Bedrock gives customers leading selection of foundation models and superior price performance to deploy inference into their next-generation applications," said Jassy. "A lot of the future value companies will get from AI will be in the form of agents. AWS is heavily investing in this area and well positioned to be a leader."

Jassy said enterprises will both create agents and use agents from other companies. AWS will cater to the builders, who can leverage platforms like Strands, Kiro, Transform and AgentCore. Behind the scenes, AWS will also be a key player for the ecosystem offering packaged agents.

While the focus is on builders, AWS is also targeting business users with efforts like Amazon QuickSuite.

"I think that the number of companies who are working on building agents is very significant. I do believe that a lot of the value that companies will realize over time in AI will come from agents," said Jassy. "When you talk to enterprises or companies that care a lot about security and scale, they're starting to build agents, and they don't really feel like they've had building blocks that allow them to have the type of secure, scalable agents that they need to bet their businesses and their customer experience and their data on."

Efforts like AgentCore are designed to be a primitive building block for AI agents, akin to compute, storage and database.

Inference everywhere

The strategy for AWS revolves around playing for inference, which will be a much larger category than training. Jassy said:

"We're building Bedrock to be the biggest inference engine in the world, and in the long run, we believe Bedrock could be as big a business for AWS as EC2. The majority of token usage in Amazon Bedrock is already running on Trainium. We're also continuing to work closely with chip partners like NVIDIA, with whom we continue to order very significant amounts, as well as with AMD and Intel. These are very important partners with whom we expect to keep growing our relationships over time."

Simply put, a proliferation of AI agents is going to require a lot of inference. AWS wants the bulk of those workloads.

And enterprises are increasingly going to look for the best performance at the right price. That reality is why Jassy is so bullish on AWS' custom silicon, which includes Graviton as well as Trainium.

"For our customers to be able to use AI as expansively as they want, they're going to need better price performance, and they care about it deeply. Remember, it's still relatively early days at this point," said Jassy.

Constellation Research's take

Holger Mueller, analyst at Constellation Research, said:

"It's good to see AWS getting into the supercomputer business. Remarkably, AWS is doing this with its in-house chips, which are proof points for both Amazon's R&D chops and its frugality when it comes to infrastructure spending. The frugality makes for competitive AI spending that CxOs welcome. AWS is on par with Google Cloud at the hardware level when it comes to running custom silicon.

AWS and Google Cloud are the only cloud players running custom silicon at scale. Why? They have unique scalability and cost-performance needs in their core businesses. Amazon has razor-thin retail margins. Google has its freemium offerings. Where Google is ahead is putting custom algorithms on custom hardware (TensorFlow on TPUs). Amazon is still recovering from missing out on the initial AI wave in that sense. It's questionable whether AWS would have come out with something like TensorFlow, as it usually is an adopter of what is already being used. That's Amazon's retail DNA."


Apple's annual services revenue tops $109 billion

Apple's fourth quarter results were mixed, but services revenue topped $100 billion on an annual basis. Overall, the results were better than expected.

The company reported fourth quarter earnings of $1.85 a share on revenue of $102.5 billion, up 8% from a year ago.

Wall Street was expecting fourth quarter earnings of $1.77 a share on revenue of $102.25 billion.

The September quarter only includes a few days of sales of the iPhone 17 lineup.

CEO Tim Cook touted the company's portfolio, which includes iPhone 17 devices, iPhone Air and refreshed MacBook Pro and iPad Pro models.

On a conference call with analysts, Cook said: "We're also seeing developers take advantage of our on device foundation models. So excited for a more personalized Siri. We're making good progress on it, and as we've shared, we expect to release it next year."

By the numbers:

  • Apple reported fiscal 2025 revenue of $416.16 billion with net income of $112 billion.
  • Services revenue was $28.75 billion for the fourth quarter and $109.16 billion for the year.
  • iPhone revenue was $49.02 billion in the fourth quarter and $209.59 billion for the year.
  • Mac sales were $8.73 billion in the fourth quarter.
  • iPad sales in the fourth quarter were flat from a year ago at $6.95 billion.
  • Wearables, home and accessories revenue was $9.01 billion, down slightly from a year ago.
  • China revenue in the fourth quarter was $14.49 billion, down from $15 billion a year ago.

Constellation Research analyst Holger Mueller said:

"Apple had a good quarter. The good news is that the main platform, the iPhone, got another boost, showing that innovation gets Apple users to upgrade. Now Apple just has to deliver more of it. The Apple Intelligence release is the big milestone to watch. With rising iPhone sales and the platform and services up as well, it's not surprising Apple is creating record sales. Services now bring in more revenue for Apple than all non-iPhone categories together. Good to see. Now all eyes are on the always critical holiday quarter. Wearables, iPad and likely Mac won't help much. Apple is and remains the iPhone and, increasingly, the iPhone services company."


AWS Q3 revenue growth accelerates to 20%, best growth since 2022

Amazon Web Services' revenue growth accelerated to 20% in the third quarter as the unit delivered operating income of $11.4 billion on revenue of $33 billion.

The AWS results landed in a better-than-expected quarter for Amazon overall. Amazon reported third quarter net income of $21.1 billion, or $1.95 a share, on revenue of $180.2 billion.

Wall Street was expecting Amazon to report earnings of $1.56 a share on revenue of $177.76 billion.

AWS' revenue growth of 20% was above expectations. Microsoft said Azure revenue was up 40% and Google Cloud delivered sales growth of 34%. Both of those rivals are operating off a smaller base than AWS, which is on a $132 billion annual revenue run rate.

Andy Jassy, Amazon CEO, said:

"AWS is growing at a pace we haven’t seen since 2022, re-accelerating to 20.2% YoY. We continue to see strong demand in AI and core infrastructure, and we’ve been focused on accelerating capacity – adding more than 3.8 gigawatts in the past 12 months."

AWS fired up its Project Rainier data center cluster designed for Anthropic's AI workloads. AWS also said its Trainium2 custom AI chip is "fully subscribed and a multi-billion-dollar business."

In addition, AWS said it saw strong adoption of Transform, an AI agent that makes it easier to migrate to AWS. Transform has saved 700,000 hours in migration work.

As for the outlook, Amazon said fourth quarter revenue will be between $206 billion and $213 billion, up 10% to 13%. Operating income will be between $21 billion and $26 billion.

On a conference call with analysts, Jassy said:

  • Backlog in the third quarter was $200 billion and "doesn't include several unannounced new deals in October."
  • "A lot of the future value companies will get from AI will be in the form of agents. AWS is heavily investing in this area. Companies will both create their own agents and use agents from other companies. For those building their own, it's been harder to build than shipping. For companies who successfully built agents, they hesitated putting them into production because they lack secure, scalable runtime services or memory or observability built specifically for agents. It's why we launched AgentCore, a set of infrastructure building blocks that allow builders to deploy scalable agents."
  • "AWS continues to earn most of the big enterprise and government transformations to the cloud. AWS is where the preponderance of companies' data and workloads reside, and that's part of why most companies want to run AI on AWS. We need to have the requisite capacity. We've been focused on accelerating capacity the last several months, adding more than 3.8 gigawatts of power in the past 12 months. To put that into perspective, we're now double the power capacity that AWS was in 2022, and we're on track to double again by 2027."
  • "You're going to see us continue to be very aggressive investing in capacity, because we see demand as fast as we're adding capacity right now. As fast as we're bringing capacity in right now, we are monetizing it."
  • "Starting with Trainium3, we're building Bedrock to be the biggest inference engine in the world. In the long run, we believe Bedrock will be as big a business for AWS as EC2."
  • "We have a small number of very large customers on Trainium2, but because it is 30% to 40% better price performance, there are customers contemplating broader scale for AI-focused workloads and inference. Trainium3 should preview at the end of this year, and heading into the beginning of 2026 we have a lot of customers very interested. Trainium3 will be 40% better than Trainium2."
  • "We buy a lot of Nvidia, but history shows there's never just one player that satisfies everyone's needs. But we have our own strong chip team. For our customers to use AI expansively, they're going to need better price performance."

Here's a look at the third quarter numbers:

  • North American commerce sales were $106.3 billion, up 11%, with operating income of $4.8 billion.
  • International commerce sales were $40.9 billion, up 14% from a year ago, with operating income of $1.2 billion.
  • Free cash flow for the trailing 12 months was $14.8 billion, down from $47.7 billion a year ago. Amazon said it has spent $50.9 billion on property and equipment, mostly GPUs, CPUs and data centers.

  • Advertising revenue in the third quarter was $17.7 billion, up 24% from a year ago.
  • Subscription services revenue was $12.57 billion, up 11% from a year ago.
  • Amazon ended the quarter with 1,578,000 employees.
  • Amazon took a $2.5 billion charge to settle a lawsuit with the Federal Trade Commission.
  • The company took a $1.8 billion charge related to severance costs and layoffs. Operating income would have been $21.7 billion without those charges.

Constellation Research analyst Holger Mueller said:

"Amazon is picking up speed again across all segments, with AWS leading the charge. But it didn't move on the profit side, due to regulatory fines and restructuring. Evidently, Andy Jassy is confident that Amazon can grow with fewer employees, a sign that the internal AI offerings are maturing. Jassy would not shift the human-to-machine ratio if AWS AI wasn't ready. That is a very good confidence indicator for CxOs buying from AWS that they can start investing in the maturing AI offerings of AWS."


How Wayfair's replatforming sets it up for agentic AI commerce

Wayfair plays in a rough neighborhood where it has to deal with tariffs, an anemic housing market and consumer purchasing patterns that were skewed by the Covid-19 pandemic. However, a technology replatforming during that volatility has made it more agile and able to position itself for AI.

The home goods retailer shined this week with third quarter results that handily topped estimates. Wayfair delivered a third quarter net loss of $99 million on revenue of $3.1 billion, up 8% from a year ago. Non-GAAP earnings in the quarter were 70 cents a share, 26 cents a share ahead of Wall Street estimates.

Wayfair had 21.2 million active customers in the third quarter, down 2.3% from a year ago, but it was able to increase net revenue per active customer. A big reason: Wayfair used technology to make it easier for people to design and buy. Repeat orders were up 6.8% from a year ago.

Niraj Shah, CEO of Wayfair, said the company has benefited from "the groundwork we've laid over multiple years directly driving share capture and profitability despite a category that remains stubbornly sluggish."

Shah added:

"We completed the bulk of our replatforming earlier this year and the timing couldn't have been better as we are in the early innings of a new phase in how customers shop online. While AI has certainly become the buzzword of late, we've been on the forefront of machine learning for a long time, leveraging algorithms to drive everything from pricing decisions to marketing investments. Today, there is new ground being broken with the proliferation and sophistication of generative AI, and Wayfair is a leader in the application of AI in retail."

Simply put, Wayfair doesn’t necessarily need a housing market recovery to thrive.

On the earnings call, Wayfair CTO Fiona Tan, a BT150 member, laid out the company's AI plans. Here's a look at the strategy.

AI as growth engine

Tan said generative AI and agentic AI are central to Wayfair's next growth phase. The company is leveraging its long history with machine learning (pricing, cataloging, marketing) and is now scaling generative AI capabilities across the customer journey, operations and the supplier ecosystem.

“Our investments in AI are pragmatic and results-oriented, centered on three key strategic outcomes: reinventing the customer journey, supercharging our operations and teams, and powering our platform and ecosystem,” she said.

Reinventing the customer journey

Wayfair is transforming the shopping experience into an AI-powered growth flywheel—inspire, engage, learn, personalize—to move beyond traditional personalization.

Key initiatives include:

  • Muse: A proprietary AI-powered inspiration engine that generates photorealistic, shoppable room scenes to attract low-intent shoppers.
  • Discover Tab: Integrates insights from Muse to create a looping shopping experience that drives longer visits and higher conversion.
  • Interest-Based Carousels: Personalize product recommendations based on lifestyle, with future context signals like weather and location.
  • LLM-Powered Search & Visual Search: Moves beyond keywords—customers can upload a photo and find similar products instantly.
  • AI Assistant & Designer-Quality Recommendations: Combines Wayfair’s designer expertise with LLMs to produce curated, personalized design matches—customers shown these are 33% more likely to add to cart or purchase.
  • “Complete the Look” (in testing): Generates full AI-styled rooms using real shoppable items from the catalog.

Operations

AI is being embedded across operations to boost efficiency and accuracy. Here’s a look at the moving parts:

  • Catalog Enrichment: Generative AI improves product data quality and consistency, driving higher add-to-cart rates.
  • Duplicate Detection: AI identifies redundant listings, cutting manual review costs by 75%.
  • AI Customer Service Agents: 24/7 fully autonomous bots handle common inquiries, while human agents use AI copilots with intent-based routing and reasoning models for complex issues.
  • Trust & Safety: Multimodal AI detects fraudulent imagery in real time.
  • AI for All Employees: Every employee has access to a generative AI license; company-wide Gen AI Innovation Challenge encourages practical AI adoption across departments.

“We’re making AI experimentation part of our everyday culture. Our teams are learning with the same urgency and curiosity we expect of the technology itself,” said Tan.

Platform and ecosystem

Tan said the supplier side of the Wayfair marketplace will leverage a series of AI agents as well as generative AI tools. Here’s a look:

  • AI Agents for Suppliers: Automate ticket classification and resolution, reducing manual work.
  • Generative AI for SEO & Ads: LLMs optimize product titles and ad copy—boosting Google visibility, free traffic, and ad performance.
  • Generative Engine Optimization: Ensures Wayfair products are surfaced in AI-driven search and chat platforms.

“We believe customer attention will flow to the most trustworthy API, not the loudest ad,” said Tan.

Agentic commerce

Tan said Wayfair is building a dual-pronged agentic AI strategy that revolves around the following.

  • Integrate with AI Platforms (e.g., Google, OpenAI, Perplexity) — ensuring its vast catalog is verifiable, discoverable, and fully transactable within AI environments.
  • Strengthen Wayfair’s Own Moats — emphasizing proprietary data, curated catalog, verified supply chains, and delivery reliability.

“Our plan is to make our catalog fully transactable on leading AI platforms, allowing customers to shop with confidence wherever their journey begins,” she said.

If successful, Tan said that AI-driven commerce will serve as a moat for Wayfair. “In a world of AI-driven commerce, retailers with a large, well-detailed catalog, verified supply chains, and deep technology capabilities are advantaged,” she said.

 


ServiceNow Q3 shines as it raises outlook due to enterprise AI demand

ServiceNow Q3 shines as it raises outlook due to enterprise AI demand

ServiceNow said its third-quarter results were better than expected, raised its outlook and said it will split its stock 5-for-1.

The company reported third quarter earnings of $502 million, or $2.40 a share, on revenue of $3.41 billion, up 22% from a year ago. Non-GAAP earnings were $4.82 a share.

Wall Street was expecting third quarter earnings of $4.27 a share on revenue of $3.35 billion.

ServiceNow said current remaining performance obligations were $11.35 billion in the third quarter, up 21% from a year ago. Remaining performance obligations were $24.3 billion, up 24% from a year ago.

Bill McDermott, CEO of ServiceNow, said the company was in an elite club and that "enterprise AI was a great neighborhood to be in." McDermott touted ServiceNow CRM, the company's control tower for agentic AI and fast time to value.

CFO Gina Mastantuono said "Now Assist, U.S. Federal, Workflow Data Fabric, and RaptorDB were all ahead of plan."

As for the outlook, ServiceNow projected fourth quarter subscription revenue of $3.42 billion to $3.43 billion, up nearly 20% from a year ago. For fiscal 2025, ServiceNow projected subscription revenue of $12.83 billion to $12.84 billion.

Here are a few choice quotes from McDermott, who was particularly enthusiastic on the ServiceNow third quarter earnings call:

  • "Here's the headline: ServiceNow is one of the most durable, consistent overperforming growth companies in the enterprise software industry. When you think about brands shaping the future, you have GPU leaders like Nvidia, hyperscalers, foundation models and one company integrating all together, the AI workflow company ServiceNow. It used to be the MAG 7. Now there's a new category. I'm calling this Super Eight. That's the Mag 7 plus ServiceNow."
  • "Our AI Control Tower deal volume more than quadrupled quarter over quarter in Q3. Just since the end of May, AI agent assist consumption has increased over 55x. That's the foundation of a beautiful hockey stick that's coming to you."
  • "ServiceNow's workflow engine is creating the roadmap that AI agents follow to get work done."
  • "Enterprises invested a lot into legacy CRM deployments. For all that investment, they got a sprawling mess of instances and silos. They want a better way with AI. This applies to many legacy vendors, some more than others. Our AI experience turns CRM into an AI-first system of action that drives growth and customer loyalty."

 
