
AWS adds AI agent policy, evaluation tools to Amazon Bedrock AgentCore

Amazon Web Services is adding AI agent policy and evaluation features to Amazon Bedrock AgentCore in a move aimed at solving the big issues that keep enterprises from moving AI agents from proof of concept to production.

At AWS re:Invent 2025, the cloud giant updated Amazon Bedrock AgentCore, which recently became generally available.

Amazon Bedrock AgentCore sits in the tools and infrastructure layer for building AI agents and applications along with Amazon Bedrock, Amazon Nova and Strands Agents. That AI building infrastructure also includes Amazon SageMaker and compute including AWS Trainium and Inferentia.


Customers using Amazon Bedrock AgentCore include PGA Tour, which deployed an automated content system for player coverage; Salesforce's Heroku, which built an app development agent called Heroku Vibes; and Grupo Elfa, which deployed three agents for price quote processing. AWS CEO Matt Garman also cited Nasdaq and Bristol Myers Squibb as AgentCore customers.

Garman said the addition of policy and evaluation can free up innovation. "Most customers feel that they're blocked from being able to deploy agents to their most valuable, critical use cases today," said Garman.

Mark Roy, Tech Lead, Agentic AI at AWS, said CIOs are racing to deploy AI agents, but want insurance in the form of governance and evaluations to scale them.

Here's a look at the policy and evaluation additions to Amazon Bedrock AgentCore.

Policy in Amazon Bedrock AgentCore is designed to ensure AI agents stay within defined boundaries without slowing them down. The policy system is integrated with AgentCore Gateway to intercept every call before execution.

Amazon Bedrock AgentCore Policy does the following (a conceptual sketch follows the list):

  • Gives administrators control over what agents can access, what actions they can perform and under what conditions.
  • Processes thousands of requests per second while maintaining operational speed.
  • Lets teams create policies using natural language and align them with audit rules without custom code.
  • Lets enterprises define clear policies once and apply them across the organization.
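AWS didn't publish AgentCore Policy's syntax or API in this announcement, so the snippet below is only a minimal, self-contained Python sketch of the interception idea: check who can take what action, under what conditions, before a call executes. The ToolCall class, POLICIES table and check() helper are hypothetical stand-ins for illustration, not the AgentCore Policy API.

```python
# Hypothetical sketch of policy interception: allow a tool call only if a
# declared rule permits that agent, that action, and those conditions.
# These names are illustrative stand-ins, NOT the AgentCore Policy API.
from dataclasses import dataclass


@dataclass
class ToolCall:
    agent: str     # which agent is acting
    action: str    # e.g. "refund_order"
    amount: float  # a condition the policy can inspect


# Hypothetical policies: what each agent may do, and under what conditions.
POLICIES = {
    "support-agent": {
        "refund_order": lambda call: call.amount <= 250.0,
        "lookup_order": lambda call: True,
    }
}


def check(call: ToolCall) -> bool:
    """Intercept a call before execution; allow it only if a policy permits it."""
    allowed_actions = POLICIES.get(call.agent, {})
    condition = allowed_actions.get(call.action)
    return bool(condition and condition(call))


if __name__ == "__main__":
    print(check(ToolCall("support-agent", "lookup_order", 0.0)))     # True: allowed unconditionally
    print(check(ToolCall("support-agent", "refund_order", 100.0)))   # True: within the limit
    print(check(ToolCall("support-agent", "refund_order", 5000.0)))  # False: condition fails
    print(check(ToolCall("support-agent", "delete_account", 0.0)))   # False: action not granted
```

In AgentCore, that kind of check sits in AgentCore Gateway rather than in application code, which is how AWS says it can enforce policies without slowing agents down.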

"You need to have visibility into each step of the agent action, and also stop unsafe actions before they happen," explained Vivek Singh, AgentCore Senior Product Manager at AWS. "This includes robust observability, so if something goes wrong, you can pinpoint exactly into what steps the agent took and how the agent came to that conclusion. You also need the ability to set some of your business policies in real time."

Amazon Bedrock AgentCore Evaluations adds a set of 13 built-in evaluators to assess AI agent behavior for correctness, helpfulness and safety. The evaluators enable developers to deploy reliable agents with real-time quality monitoring and automated risk assessment.

Enterprises can also create custom evaluators for quality assessments using preferred prompts and models. The evaluations are also integrated into AgentCore Observability via Amazon CloudWatch for unified monitoring.

According to AWS, AgentCore Evaluations monitors the real-world behavior of AI agents in production. An LLM is used to judge responses for each metric and then write explanations. Evaluations can also run on demand so developers can validate AI agents before production and ensure smooth upgrades.
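To make the LLM-as-judge idea concrete, here is a minimal sketch of that pattern using the Bedrock Converse API via boto3. The metric name, judge prompt and model ID are illustrative assumptions, not AgentCore's 13 built-in evaluators, and running it requires AWS credentials plus access to the chosen Bedrock model.

```python
# Minimal LLM-as-judge sketch: score one agent answer on one metric and have
# the judge model explain the score. The prompt, metric and model ID below are
# illustrative assumptions, not AgentCore Evaluations' built-in evaluators.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

JUDGE_PROMPT = (
    "You are an evaluator. Metric: {metric}.\n"
    "Question: {question}\n"
    "Agent answer: {answer}\n"
    'Respond with JSON only: {{"score": 1-5, "explanation": "..."}}'
)


def judge(question: str, answer: str, metric: str = "helpfulness",
          model_id: str = "amazon.nova-lite-v1:0") -> dict:
    """Ask a judge model for a score and an explanation on a single metric."""
    response = bedrock.converse(
        modelId=model_id,
        messages=[{
            "role": "user",
            "content": [{"text": JUDGE_PROMPT.format(
                metric=metric, question=question, answer=answer)}],
        }],
    )
    text = response["output"]["message"]["content"][0]["text"]
    return json.loads(text)  # a production evaluator would parse more defensively


if __name__ == "__main__":
    print(judge("What does AgentCore Policy do?",
                "It checks agent actions against defined policies before they run."))
```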

Amazon Bedrock AgentCore Evaluations is available in preview.

With the addition of episodic memory functionality to AgentCore Memory, AgentCore can enable agents to learn from successes and failures, adapt and build knowledge over time.

Along with the AgentCore updates, AWS also built out Strands Agents, an open-source Python software development kit announced in May. Strands Agents is aimed at building agents with a few lines of code, with native integration with Model Context Protocol (MCP) servers and AWS services. It's designed for rapid development.
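Here's a minimal sketch in the spirit of the SDK's published quickstart: a custom tool plus an agent that can call it. The get_order_status tool is a made-up illustration, the default model provider assumes Amazon Bedrock access is configured, and exact package and parameter names may vary by version.

```python
# Minimal Strands Agents sketch: define a tool, hand it to an agent, ask a
# question. The tool is a stub for illustration; by default the SDK expects
# Amazon Bedrock model access to be configured.
# pip install strands-agents

from strands import Agent, tool


@tool
def get_order_status(order_id: str) -> str:
    """Look up the status of an order (stubbed for illustration)."""
    return f"Order {order_id} has shipped."


agent = Agent(
    tools=[get_order_status],
    system_prompt="You are a support agent. Use tools when they help.",
)

agent("What's the status of order 42?")
```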

AWS announced the following for Strands Agents:

  • Strands Agents SDK for TypeScript, so developers can choose between TypeScript and Python and run agents in client applications.
  • Strands Agents SDK for edge devices, so developers can run agents using local models.
  • AWS also said it is experimenting with steering tools in Strands Agents to make agents more context-aware without front-loading all agent instructions into a single prompt. The idea is to use steering handlers to make agents more flexible while reducing token costs.
  • Strands Agents Evaluations, an evaluation framework to test agent quality, interactions and goal completion.

Constellation Research analyst Holger Mueller said:

"AWS is continuing its systematic build out of AgentCore with the new capabilities announced today. And that is key for CxOs because for advanced AI adopters in 2026 it is going to be the battle of the AI frameworks. Who will enable their enterprise to build AI powered Next Generations Applications that help automate and lower costs?"
 

 


AWS launches Amazon Nova Forge, Nova 2 Omni

Amazon Web Services launched Amazon Nova Forge, a service that gives enterprises the ability to train their own foundational models.

With Amazon Nova Forge, AWS is looking to deliver real enterprise outcomes. The reality has been that off-the-shelf foundational models are inaccurate in many enterprise use cases. When enterprises build on top of open source models, results can degrade as more data is added.

Speaking at AWS re:Invent 2025, AWS CEO Matt Garman said the company is focused on model selection as well as building out Nova. Enterprises will need multiple models as a starter kit to customize with their own enterprise data.

“Your data is unique. It's what differentiates you from the competition," said Garman. "If your models have more specific knowledge about you and your data and your processes, you can do a lot more."


The goal with Amazon Nova Forge is to get better results with proprietary data in a seamless way. Garman said Nova Forge is aimed at enabling enterprises to add their unique expertise to a foundation model. He added that today's techniques can only go so far in adding expertise to models.

Amazon Nova Forge gives enterprises the ability to do the following (a conceptual sketch follows the list):

  • Select a starting model and checkpoint.
  • Mix proprietary data with Amazon-curated datasets.
  • Leverage multiple checkpoints to ensure performance.
  • Deploy the resulting custom models on Amazon Bedrock for AI agent deployments.
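AWS hasn't published a Nova Forge SDK surface in this announcement, so the sketch below is purely conceptual: stubbed Python that maps one method to each of the four steps above. Every class, method and identifier in it is a hypothetical stand-in, not the Nova Forge API.

```python
# Purely conceptual, stub-based sketch of the four Nova Forge steps listed
# above. Every name here is a hypothetical stand-in for illustration only;
# this is NOT the Nova Forge API.

class NovaForgeWorkflowSketch:
    def select_checkpoint(self, base_model: str, stage: str) -> dict:
        # Step 1: choose a Nova starting model and training checkpoint.
        return {"base_model": base_model, "stage": stage}

    def blend_data(self, run: dict, proprietary: list, curated: list) -> dict:
        # Step 2: mix proprietary data with Amazon-curated datasets.
        return {**run, "training_data": proprietary + curated}

    def evaluate_checkpoints(self, run: dict, benchmarks: list) -> dict:
        # Step 3: compare checkpoints against benchmarks to guard performance.
        return {**run, "benchmarks": benchmarks, "best": "checkpoint-placeholder"}

    def deploy_to_bedrock(self, run: dict) -> str:
        # Step 4: host the resulting custom model (a "novella") on Amazon Bedrock.
        return f"bedrock://custom-models/{run['base_model']}"


forge = NovaForgeWorkflowSketch()
run = forge.select_checkpoint("nova-2-lite", stage="post-training")
run = forge.blend_data(run, proprietary=["internal-docs"], curated=["amazon-curated-set"])
run = forge.evaluate_checkpoints(run, benchmarks=["domain-qa"])
print(forge.deploy_to_bedrock(run))
```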

Garman said:

“With Nova Forge, you get exclusive access to a variety of Nova training checkpoints, and then you get the ability for you to blend in your own proprietary data together with an Amazon curated training data set at every stage of the model training. This allows you to produce a model that deeply understands your information, all without forgetting the core information the thing has been trained on. We call these resulting models novellas, and then we allow you to easily upload your novella and run it in Bedrock.”

Nova Forge is being used by Reddit as well as Sony. Sony said it was putting Nova Forge in the middle of its agentic AI architecture, which is built on AgentCore.

Nova Forge is part of a broader Amazon Nova rollout. The company, which first introduced Nova models a year ago and has updated them since, launched Amazon Nova 2, including the following versions:

  • Nova 2 Lite, a fast, cost-effective reasoning model.
  • Nova 2 Pro, AWS' most intelligent reasoning model.
  • Nova 2 Sonic, a speech-to-speech foundational model for conversational AI that's embedded into Amazon Connect.
  • Nova 2 Omni, a multimodal reasoning and image generation model.

“Over the last year, we've actually extended Nova family to support more use cases and deliver more possibilities for you that deliver real value,” said Garman, who said AWS will continue to add Nova models.

Amazon Bedrock is adding new models from Google, Nvidia, Mistral (including Mistral Large 3 and Ministral 3), Alibaba's Qwen and Amazon.

“We think model choice is so critical. We've never believed that there was going to be one model to rule them all, but rather that there will be a ton of great models out there, and it's why we've continued to rapidly build upon an already wide selection of models,” said Garman. “We have open weights models and proprietary models, general purpose, specialized ones. We have really large ones and small models, and we've nearly doubled the number of models that we offer in Bedrock over the last year.”


AWS launches AI factory service, Trainium 3 with Trainium 4 on deck

Amazon Web Services launched AWS AI Factories, said Trainium 3 was generally available and outlined plans for Trainium 4.

The focus on custom silicon lands as AWS emphasizes that it is still a strong partner to Nvidia, and the company outlined new instances for the latest Nvidia GPUs.

During a keynote at re:Invent 2025, AWS CEO Matt Garman followed a string of big infrastructure announcements, including plans to invest $50 billion in high-performance computing and AI data centers for the US government, the launch of Project Rainier for Anthropic and a deal with OpenAI.

Garman said AI infrastructure will require new building blocks and processes to create agents. "AI assistants are starting to give way to AI agents that can perform tasks and automate on your behalf. This is where we're starting to see material business returns from your AI investments," said Garman. "I believe that the advent of AI agents has brought us to an inflection point in AI's trajectory. It's turning from a technical wonder into something that delivers us real value. This change is going to have as much impact on your business as the internet or the cloud."

AWS has added 3.8 gigawatts of data center capacity over the last year, with Trainium capacity growing more than 150%. The company has also increased its network backbone by 50%. “In the last year alone, we've added 3.8 gigawatts of data center capacity, more than anyone in the world. And we have the world's largest private network, which has increased 50% over the last 12 months to now be more than 9 million kilometers of terrestrial and subsea cable,” said Garman during his keynote.


While the launch of Trainium 3 was telegraphed on Amazon’s earnings call, Trainium 4 was also previewed. There was also a messaging twist in that AWS noted that Trainium, which was originally launched as an AI model training chip, is also being used heavily for inference.

Garman said AWS has already deployed more than 1 million Trainium processors and is selling them as fast as they can be produced.

Among the details:

  • Trainium 3 and its UltraServers are positioned to offer the best price performance for large-scale AI training and inference. Compared to Trainium 2, Trainium 3 and its UltraServers will have 4.4x more compute, 3.9x higher memory bandwidth and 3.5x more tokens per megawatt.
  • Garman said AWS has seen big performance gains by installing Trainium 3 in its UltraServers.
  • Trainium 4 will build on Trainium 3 with 6x the performance (FP4), 4x the memory bandwidth and 2x the memory capacity.

AWS' custom silicon will in part power AWS AI Factories, which also launched at re:Invent. AWS AI Factories are customer-specific AI infrastructure built, scaled and managed by AWS.

The general idea behind AWS AI Factories is that the cloud provider can take the expertise from the projects behind the Anthropic, OpenAI and Humain deals and democratize AI factories for large enterprises and the public sector.

“We're enabling customers to deploy dedicated AI infrastructure from AWS in their own data centers for their exclusive use,” said Garman. “AWS AI factories operate like a private AWS region, letting customers leverage their own data center space and power capacity that they've already acquired. We also give them access to leading AWS AI infrastructure and services.”

Garman also made sure Nvidia instances were front and center. The pitch is that AWS is the best place to run Nvidia GPUs for reliability, uptime and availability.

AWS launched P6e instances based on Nvidia's GB200 and GB300 AI accelerators. These instances are an upgrade over P6 instances based on the standalone B200 and B300 GPUs.


ServiceNow acquires Veza, will integrate into AI Control Tower

ServiceNow said it will acquire Veza in a move that will bring identity tools to its security and risk portfolio.

Terms of the deal weren't disclosed.

ServiceNow said in a statement that Veza specializes in identity security and enables enterprises to understand and control who and what has access to data, applications, systems and AI artifacts.

Veza's main technology is its Access Graph, which maps and analyzes access relationships across human, machine and AI identities. That last part is critical given that vendors are adding identity and access technologies for AI agents.

The plan for ServiceNow is to add Veza to its AI Control Tower, which governs and orchestrates AI agents. ServiceNow will also add Veza to its security and risk portfolio including vulnerability response, incident response and integrated risk management. ServiceNow's security and risk applications have more than $1 billion in annual contract value.

Veza, founded in 2020, has more than 150 global enterprise customers.

Constellation Research analyst Holger Mueller said:

"While ServiceNow keeps declaring AI platform readiness, it keeps making key architecture decisions and Veza is no exception. While the acquisition makes sense and maybe also differentiating, CxOs should expect ripple effects across the architecture from a runtime and implementation perspective."


MongoDB Q3 surges on Atlas demand

MongoDB revenue surged in the third quarter courtesy of 30% revenue growth in its Atlas platform.

The company, which recently named CJ Desai as CEO, reported a third quarter net loss of $2 million, or 2 cents a share, on revenue of $628.3 million, up 19% from a year ago. Non-GAAP earnings were $1.32 a share.

Wall Street was expecting MongoDB to report non-GAAP earnings of 79 cents a share on revenue of $593.44 million.

MongoDB said its Atlas revenue was up 30% from a year ago. Atlas now represents 75% of revenue. The company added 2,600 customers in the third quarter and as of Oct. 31 had 62,500 total customers.

Desai said the third quarter was driven by "continued strength in Atlas" and the company "delivered meaningful margin outperformance."

As for the outlook, MongoDB raised its guidance for fiscal 2026 and the fourth quarter. For the fourth quarter, MongoDB said revenue will be between $665 million and $670 million with non-GAAP earnings of $1.44 a share to $1.48 a share.

For fiscal 2026, MongoDB is projecting revenue of $2.434 billion to $2.439 billion with non-GAAP earnings of $4.76 a share to $4.80 a share.

On a conference call, Desai said:

  • "MongoDB has the potential to become the generational modern data platform of this evolving era, an opportunity that comes once in a lifetime. I am a truly customer-obsessed leader. So during my diligence, I spoke with multiple customers. Across these conversations, the message was clear. MongoDB already powers core, mission-critical workloads were enterprises that are modernizing their technology stack. At the same time, MongoDB is uniquely positioned at the center of the AI platform shift."
  • "There is still significant room to broaden our footprint within the enterprise. A strong example of this expansion opportunity is a major global insurance provider that has adopted MongoDB broadly across its enterprise. The company selected MongoDB Atlas to modernize several mission-critical systems, including its next-generation policy administration platform, analytics rating engine, unstructured data repositories and hundreds of supporting services.Since moving its policy platform to Atlas, the insurer has expanded from just a small set of regions to nationwide and significantly accelerated the rollout of new products and distribution channels."
  • "As AI adoption accelerates, MongoDB's positioned not just to participate in the wave, but to help define it. we are already beginning to see this play out with AI-native customers."
  • "We are also seeing meaningful traction among large enterprises that are starting to build AI applications that have a material impact on their business." 

AWS Transform aims for custom code, enterprise tech debt

Amazon Web Services launched AWS Transform custom, a new agent to modernize the custom code that got you into your technical debt pickle in the first place.

The launch, which AWS kicked off at re:Invent 2025 with a stunt that included blowing up a server rack, is the latest in a portfolio of Transform services.

In a blog post, AWS said Transform custom combines pre-built transformations to upgrade code in Java, Node.js and Python with the ability to define custom transformation work.

The goal is to use the agent to free up developers. AWS Transform custom includes:

  • Command line and web interfaces to define transformations through natural language and execute them on local codebases.
  • The ability to operate autonomously.
  • A web interface that provides tracking for teams and transformation progress.
  • Support for runtime upgrades without the need for additional information.
  • The ability to learn enterprise code patterns that have evolved over time.
  • Configuration options so DevOps teams can connect AWS Transform custom to continuous integration and continuous delivery tools.

AWS also announced the following AWS Transform updates:

  • AWS followed up the general availability of AWS Transform for .NET with full stack Windows modernization. AWS Transform will cover application, UI, database, and deployment layers and map Windows stacks to AWS services.
  • AWS Transform for mainframe adds a capability called Reimagine, which uses AI to revamp an enterprise architecture using business logic, patterns, data lineage and legacy source code. With the service, customers can transform mainframe applications into microservices and modern architectures. AWS Transform for mainframe also includes automated test plan generation, collection scripts and AI tools to accelerate testing timelines.

Holger Mueller, an analyst at Constellation Research, said:

"AI has tremendous power when transforming and translating language, and code is nothing but a (very structured) language. As such it has a lot of promise for code renovation and modernization. Good to see AWS tackling this, but it is kind of skimming the low hanging fruit by modernizing 5-10 year old code assets. Staying away from 10-29+ year old code assets, these require code translation from the legacy to a modern programming language. And that is where the prize for CxOs really is."

 


AWS, Google Cloud engineer interconnect between their clouds

Amazon Web Services and Google Cloud are making their clouds interoperable in a move that will be welcomed by multicloud enterprises.

In a blog post, the two cloud giants said they are simplifying how enterprises string together interconnects between AWS and Google Cloud. The news lands as AWS kicks off its AWS re:Invent 2025 conference in Las Vegas. 

The two companies jointly engineered a multicloud networking system that uses both AWS Interconnect - multicloud and Google Cloud's Cross-Cloud Interconnect.

According to AWS and Google Cloud, the companies will also introduce a new open specification for network interoperability. Customers should be able to establish private high-speed connectivity between AWS and Google Cloud.

With enterprises rolling out architectures for AI adoption and AI agents, the idea of do-it-yourself cloud networking wasn't going to fly. For example, Salesforce, which uses both AWS and Google Cloud, said the interconnect between the two clouds will be critical for Salesforce Data 360.

Here’s a look at how the interconnect between AWS and Google Cloud would work.

AWS said that the unified specification can be adopted by any cloud provider.

Key points:

  • The multicloud connectivity from AWS and Google Cloud means there will be a managed, cloud-native experience.
  • The joint effort will abstract physical connectivity, network addressing and routing policies.
  • Bandwidth can be provisioned on demand via a customer's preferred cloud console or API.
  • The companies published the API specifications.

Holger Mueller, an analyst at Constellation Research, said the collaboration is a good first step.

"Enterprises have their data fragmented across the cloud, but AI forces them to connect them. So it is good to see the partnership between AWS and Google to help customers. But voiding the spec of the interconnect, specifically latency - we can only go for the precursor of this - between Azure and OCI. And that was too slow for analytics use cases - hence Oracle moved the Exadata machines inside of Azure. We will see what use cases CxOs can power from the new partnership - but based on the past - hope should not be too high."

Rob Kennedy, VP of Network Services at AWS, said at re:Invent 2025 that AWS will also be connecting to Microsoft's Azure interconnect in the near future. Key points from Kennedy:

  • Customers asked for interconnects that went across clouds, not just to on-premises data centers. "We decided to really solve this problem for our customers and just create a fully managed solution that kind of abstracts away all the physical components," said Kennedy. "They can simply turn up bandwidth between multiple locations and cloud providers."
  • Defining the standard should make it easier to combine clouds and "we've already got buy-in from both GCP and Azure," said Kennedy. "We hope to continue to get buy-in from others as we continue to move forward. And it's a full global service."

Customers can define the bandwidth needed with a click and get budget predictability.


AWS Marketplace adds solutions-based offers, Agent Mode

AWS Marketplace is moving beyond individual apps to offer multiple products bundled into one package, faster private offers and an Agent Mode. AWS Marketplace is also accelerating post-purchase processes.

With the move, AWS Marketplace is evolving to enable partners, software providers and integrators to combine components to address complex enterprise use cases, offer flexible pricing and utilize one procurement flow. Customers get better transparency into each component, simplified negotiation and purchasing, and one seller of record.

The news, outlined ahead of AWS re:Invent 2025, is the latest iteration in how AWS is as much an enterprise technology marketplace as it is a cloud provider. AWS Marketplace has more than 3 million subscriptions enabled, and more than 99% of the top 1,000 AWS customers have at least one AWS Marketplace subscription.

Speaking during a briefing, Matt Yanchyshyn, Vice President of AWS Marketplace and Partner Services, said the club of vendors that have sold more than $1 billion on AWS Marketplace is growing. Salesforce, Databricks and CrowdStrike are in the club, but partners such as Presidio are also conducting transactions at scale.

Yanchyshyn added that the July launch of AI agents in AWS Marketplace is scaling. AWS is also using its own AI agents and generative AI tools on the marketplace. "We have a new suite of AI powered capabilities that are facilitating both discovery, product comparisons, but also purchase through the marketplace," he said.

For AWS Marketplace, which has more than 30,000 public listings, 3,500 channel partners, 10,000 professional services providers and 6,000 sellers in 70 categories, the vision is to remove friction from buying services and software from partners. AWS Marketplace is looking to offer automated deployment experiences as well as built-in integration. For instance, CrowdStrike, one of AWS Marketplace's biggest partners, now has CrowdStrike Falcon Next-Gen SIEM for AWS with pay-as-you-go pricing, self-service procurement, an automated deployment experience and built-in integration with AWS CloudTrail, Security Hub and GuardDuty.

The flow consists of subscribing, selecting services and using 1-click launch resources for integration and deployment.

Here's what AWS Marketplace added.

Solution-based buying, where customers can purchase multiple products and services in one flow. The seller of record sends a consolidated offer that includes private offers from all components in a stack. The customer reviews the terms and pricing, the costs for each component and the total contract cost, and accepts or declines.

AWS Marketplace's move to support a stack in one purchasing flow reflects the reality that most enterprise purchases aren't done in isolation. Customers, in theory, avoid the hassle of compiling software and services themselves.

Yanchyshyn said the solution approach on AWS Marketplace should enable more industry- and use-case focused sales. "We're starting to see more industry vertical type solutions come to the marketplace as well," he said. "We need to cater to the needs of the sellers, and, more importantly, our joint customers as well. They're looking for solutions that solve their specific use cases, not always point products."

Independent software vendors can use AWS Marketplace to combine software with implementation services, package complementary products or team up with channel partners and integrators. Systems integrators and channel partners can package services with software they're authorized to resell, align with outcomes and simplify procurement for enterprise-wide deployments.

Here's a look at some representative solutions.

Since anyone can put solutions together, Yanchyshyn said he expects some interesting packages to emerge. He noted that GitLab and MongoDB often sell services along with their software. "The lines between resellers and professional services and ISVs have definitely started to blur over the last few years," he said. "I think we're going to start to see some interesting models evolve with people reselling other people's stuff."

AWS Marketplace also added the following.

Agent Mode, a conversational interface that guides buyers through research and analysis, lets them upload requirements documentation and provides in-depth comparisons.

From there, Agent Mode generates downloadable proposals. Agent Mode also features a Model Context Protocol server for accessing AI tools and building discovery experiences.

AI-enhanced search for better precision when searching by use case. Buyers can narrow down results with smart category filters, product grouping and specializations at a glance.

Express private offers, where AI on AWS Marketplace evaluates customer needs and aligns them with seller-defined parameters. In other words, offers can be instantly generated. "It allows the seller to essentially expose their private rate card, their discount sheet, and allow us to issue private offers on behalf of the seller to customers," said Yanchyshyn.

Buyers qualify for offers, skip pricing negotiations and procure software at discounted rates. Sellers get flexible pricing at scale, can streamline private offers for standard deals and can refocus sales teams.

Variable payments for professional services. AWS Marketplace is supporting flexible services engagements with contract pricing that includes variable payments, upfront payments and installment plans. Embedded into private offers, these options let services firms bill customers as work is delivered, based on outcomes or milestones, or as time and materials are consumed.

In the variable payment for professional services model, customers get transparency and control and can review and approve requests manually or automate them.

AWS Marketplace added automation features in AWS Partner Central in the AWS Console. Features include API automation, streamlined access to Partner Central and Marketplace Management Portal features, connections to other AWS partners, a personalized Partner assistant powered by Amazon Q, and enhanced user management tools.

New Partner Central APIs, which connect business tools to Partner Central to automate co-selling processes. AWS Marketplace added Opportunity API, Leads API, Account API, Solution API, Benefits API and Connections API.

The AWS Marketplace additions complement new features announced Nov. 19. Leading into re:Invent, AWS Marketplace added billing transfer, which gives customers the ability to retain access to their management account while AWS invoicing and cost data is transferred to channel partner accounts.

AWS Marketplace has also streamlined post-purchase setup. AWS added identity and access management tools that allow partners to request time-limited access to customer AWS accounts, and streamlined product setup and ongoing maintenance.


Warby Parker's third act revolves around AI

Warby Parker is entering a third act that will be powered by AI on two fronts: productivity and revenue growth.

Neil Blumenthal, President and Co-CEO, said on the company's third quarter earnings call that the eyeglass retailer's next act will be "defined by innovation through AI."

Blumenthal added:

"We plan to leverage AI to develop new products like AI glasses, to enhance our customer and patient experience like our homegrown first true to scale virtual try-on that now encompasses features like glasses eraser and adviser and to drive productivity and accelerate EBITDA expansion. We previously announced that we'll be working with Google to bring intelligent eyewear to market and are excited to share that we're partnering with Samsung as well."

For Warby Parker's next act to pay off, a lot will be riding on the Google partnership, which was announced in May. More news could be outlined in the coming months, with CES 2026 or Google I/O as obvious events to showcase a new product.

Google is looking to be a fast follower to Meta, which has had success adding technology to Ray-Bans. Meta has a long-term partnership with Ray-Ban parent EssilorLuxottica.

Warby Parker emerged as one of the first Internet lifestyle brands and sold glasses online with frames starting at $95. Warby Parker, through multiple business cycles, has kept that trademark pricing.

As for the second act, Warby Parker scaled its brick-and-mortar footprint and began providing eye exams and contacts. Today, the company's glasses are often covered by insurance and the customer base has diversified enough to show different buying patterns. For instance, Warby Parker's single vision and contacts business lost some momentum since those buyers are younger and struggling amid economic uncertainty. Sales of progressives had more resilience because the customers are older and better off.

The retailer's AI strategy is broad and designed to drive productivity, new customer experiences and new products. Here's a look at the plan.

AI productivity efforts

Blumenthal, along with Co-CEO and Principal Financial and Accounting Officer David Gilboa, recently held the company's annual One Vision Summit, which brings retail leaders and optometrists together.

Because its AI-driven tools on the website and in Warby Parker's app have been so successful, the company has been able to sunset its Home Try-on program.

Under that program, Warby Parker would send prospective customers a set of frames to try on at home and then send back. The program, which was critical to getting consumers used to buying glasses online, also came with shipping costs and a longer sales cycle.

Sunsetting Home Try-on is part of an effort to align with evolving customer preferences and technology while simplifying operations and carrying less inventory.

"We're encouraged by the engagement and conversion we're seeing from AI-powered tools like Advisor, which gives us confidence in our ability to drive the channel long term. Lastly, we continue to expand our holistic vision care offerings as part of our broader strategy to serve all of our customers' needs," said Gilboa.

Blumenthal highlighted a few ways AI was driving Warby Parker metrics.

  • "We're using AI in our eyewear design process and even evaluating technical designs as we leverage AI as part of our customer journey flow," said Blumenthal.
  • "Our brand and creative tools are leveraging new tools to bring down the cost of content creation as we think about photo shoots and the production costs," he said.
  • "Every corporate team member is using often multiple AI tools per day and we're finding increased productivity across our headquarters team," said Blumenthal.

AI will also play a role in Warby Parker's proprietary point-of-sale system and software for optometrists so they don't get bogged down with administrative tasks, said Blumenthal.

AI-powered glasses

The plan for Warby Parker is to expand its total addressable market beyond eyeglasses. The company ended the third quarter with 2.7 million active customers, up 9.3% from a year ago, with average revenue per customer of $320.

Warby Parker is projecting 2025 revenue of $871 million to $874 million, up about 13%.

Although Warby Parker primarily sells online, its stores are acquiring customers, said Gilboa. Retail revenue in the third quarter was up 20% from a year ago due to the addition of 15 new stores. The retailer also has a partnership with Target to add shops.

Those stores will be critical when Warby Parker shows off its Google-powered AI glasses. Blumenthal said the company upgraded its optical labs to support future growth, faster delivery times and ultimately fulfillment for AI glasses.

Details of the Google-Warby Parker partnership have been coming out in recent months. In many respects, Google is a big brother to Warby Parker. Google will be covering a chunk of the expenses to stand up a product including product development, experiences and the demo environment, which will be delivered in Warby Parker stores.

Google also said it will invest $75 million in Warby Parker subject to hitting collaboration milestones.

Gilboa said in September that Google AI will be embedded into Warby Parker glasses designed for all-day wear for prescription or non-prescription lenses.

"Our understanding is that the primary use cases are really to replace AirPods or take hands-free photos and our products will do that exceptionally well. But the reason that we were really excited to partner with Google is because of their AI capabilities throughout their organization with Gemini and DeepMind," said Gilboa. "Google also has massive capabilities across Android, Gmail, Google Maps and search."

Gilboa added:

"We think that they're going to be really transformative in terms of how people engage with technology and will enable them to stop being tethered to kind of pulling a screen out of their pocket and engage more with the real world."

 


The AI PC upgrade cycle is crawling amid murky value

AI PCs are supposed to be seeing an upgrade boom, but the revolution is still on the runway.

HP and Dell Technologies are still optimistic about the PC upgrade cycle that revolves around AI PCs and the end of Windows 10 support. There are a lot of older PCs out there.

Apparently, not many people want to talk to their PCs because we're still on the runway for this AI PC boom. The commercial upgrade cycle looks better than consumer, but the situation can be summed up as hurry up and wait.

HP reported a shaky fourth quarter and said it will cut 4,000 to 6,000 employees to drive $1 billion of gross run-rate savings over the next three years. HP is planning to use AI to become more productive.

Enrique Lores, HP CEO, said the company is looking to show the way by delivering productivity gains with AI PCs.

Speaking on HP's fourth quarter earnings call, Lores said the company doubled revenue for AI PCs, but penetration of the devices sits at about 30%. The driver of AI PC upgrades revolves around being prepared for applications that have yet to show up.

"Customers want to be ready as soon as applications start taking advantage of the capabilities of these products," said Lores, who said HP is working with customers to create apps that leverage AI capabilities. Lores said that Microsoft's tools to manage PCs with voice could be a driver.

HP is hoping that being customer zero will help sell AI PCs. "We have deployed these solutions internally in HP with not only the PCs but with a curated set of applications, we have seen up to 17% of productivity improvement," said Lores.

Now HP is moving PCs. Revenue in the fourth quarter for the PC unit was up 8% from a year ago largely due to commercial and premium consumer devices.

"With 40% of the installed base still on Windows 10 at the end of Q4, the Windows 11 refresh will remain a tailwind for the PC market into 2026. And demand for AI PCs continues to accelerate, now representing more than 30% of our shipments this quarter," said Lores.

It's worth noting that Lores said the following during HP's fourth quarter call in 2024.

"We have not changed our view on the impact that AI PCs are going to have and current results support the assumptions that we have seen. The AI PCs are going to drive an improvement of average selling price. What we have been saying until now when we confirm is three years from now, we expect them to be between 40% and 60% of the mix and half around between a 5% and a 10% impact on the overall category."

"If you ask me how confident I am about the impact the AI PCs are going to have is even more than before because I have seen them in action. I see the opportunity that they bring."

But why buy a PC if we're still waiting for applications that take advantage of AI on the device?

For Dell Technologies, the AI PC story is similar. The difference with Dell Technologies is that PCs are mostly commercial and the reality is Wall Street is more tuned into AI servers.

Nevertheless, there's an optimistic hurry-up-and-wait theme with Dell's AI PCs too. Dell Technologies Chief Operating Officer Jeff Clarke said on the company's third quarter earnings call:

"We have not completed the Windows 11 transition. In fact, if you were to look at it relative to the previous OS end of service, we are 10, 12 points behind at that point with Windows 11 than we were the previous generation.

The installed base is roughly 1.5 billion units. We have about 500 million of them capable of running Windows 11 that haven't been upgraded. And we have another 500 million that are 4 years old that can't run Windows 11. Those are all rich opportunities to upgrade towards Windows 11 and modern technology.

Equally important, AI PCs, small language models, more capable applications, improvements in operating systems and their capabilities and the embedded AI there, the use of an NPU, the capability of an NPU and future PCs gives me the view that the PC market will continue to flourish going forward."

Clarke then went on to define "flourish." He said Dell is expecting the PC market to be roughly flat with a year ago.

In November 2024, Clarke said he saw indications that customers were lining up new AI PCs for the first half of 2025. Enterprises have been upgrading as part of a normal cycle, but it's not a boom by any means. In other words, PC users should have been primed to upgrade to AI PCs a year ago, yet they aren't rushing to do so.

For Lenovo, the company said AI PCs are 33% of PC shipments, just ahead of HP's percentage. Luca Rossi, Executive VP & President of Lenovo's Intelligent Devices Group, said the company has more than 30% market share in Windows AI PCs.

"We are also not standing by, and we look forward to what will be the new AI native device era," said Rossi.

Rossi isn't alone. Every vendor in the AI PC market is still looking forward to the AI revolution, which apparently has been delayed.

My take

As someone who has been looking to upgrade my laptop for at least a year just based on the reality that it's more than 4 years old, I get the delay in the upgrade cycle.

For starters, I don't see the point of an AI PC. I don't want to talk to the equivalent of what is a productivity toaster. And if I did, I could riff with ChatGPT or Google Gemini on my not-AI-PC without any issues.

Now I get the privacy argument and see how on-device models could be handy, but there's not enough value for me to upgrade.

It's a tougher sell to waste expendable income on an AI PC when there's no killer app. Toss in economic concerns, and it's no surprise the PC upgrade cycle is slow.

And Windows 11 isn't much of a sell either. My fleet of PCs was capable of upgrading, so that's not rushing things either. In addition, Microsoft seems to be carpet bombing me with Copilot pitches at every turn. I generally click the "x" when Copilot tries to be helpful. AI can't stand for annoying interruptions.

Simply put, if I want Copilot I'll reach out. Otherwise leave me alone.

A year in the AI PC upgrade cycle

 
