Results

AWS launches Graviton5 as custom silicon march continues

AWS launched new Graviton5-based instances with 192 cores per chip and a 5x larger cache as the company continues its custom processor cadence.

Graviton continues to power more than half of the new CPU capacity for AWS for the third year in a row. AWS said that 98% of its top 1,000 EC2 customers have used Graviton.

Although Trainium is getting most of the attention at AWS re:Invent 2025, Graviton is part of the mix, as is Inferentia. AWS cited Adobe, Pinterest, SAP, Snowflake and a bevy of others as Graviton customers.

"The Graviton processor came from a brand new design based on delivering the best price performance for workloads that customers run every day in the cloud," said Dave Brown, VP of Compute and Machine Learning Services at AWS.

In a keynote, Brown and AWS SVP Peter DeSantis noted the following about AWS' custom silicon strategy.

  • AI is expensive to run.
  • AWS is investing heavily in lowering the costs of running workloads.
  • With custom processors, AWS can leverage its Nitro virtualization layer and reduce jitter.
  • Cloud workloads need to be continually optimized for price-performance benefits.
  • By controlling the entire stack from processor to server, AWS could implement innovations like direct-to-silicon cooling, which reduced fan power consumption by 33%.
  • Custom processors allow AWS to iterate and optimize performance with greater control over the hardware and innovation cadence.

Another way to put it is that everybody's margin is an opportunity for AWS. Whether you look at Trainium, Nova models or Graviton, AWS is looking to commoditize.

Key items about Graviton:

  • Graviton5-based EC2 M9g instances have 192 cores in a single package.
  • Latency is improved by up to 33%.
  • Graviton5 has a 5x larger L3 cache, 2.6x more than Graviton4.
  • The processor has up to 15% higher network bandwidth and 20% higher Amazon Elastic Block Store (EBS) bandwidth across instance sizes.
  • Graviton5 is built on a 3nm process node.

Graviton5 instances leverage sixth-generation Nitro Cards to offload virtualization, storage and networking functions to dedicated hardware.
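
For teams evaluating the new chips, adoption largely comes down to targeting the new instance family when launching capacity. Below is a minimal sketch using boto3; the M9g family name comes from the announcement, but the instance size suffix and AMI ID are illustrative assumptions, so check the EC2 documentation for the sizes and Arm64 images actually offered.

```python
# Minimal sketch: launching a Graviton5-based EC2 instance with boto3.
# The instance size "m9g.xlarge" and the AMI ID are illustrative placeholders;
# check the EC2 console or CLI for the sizes and arm64 AMIs actually offered.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: use an arm64 AMI
    InstanceType="m9g.xlarge",         # Graviton5-based M9g family (size is an assumption)
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "graviton5-eval"}],
    }],
)

print(response["Instances"][0]["InstanceId"])
```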


HPE delivers solid Q4 results, but server sales fall

Hewlett Packard Enterprise reported solid fourth quarter results, but server sales fell 5% due to the timing of AI server shipments and lower US government spending.

HPE reported fourth quarter earnings of 11 cents a share on revenue of $9.7 billion, up 14% from a year ago. Non-GAAP earnings were 62 cents a share. HPE recently outlined its long-term outlook and strategy on its investor day.

Wall Street was expecting HPE to report non-GAAP fourth quarter earnings of 58 cents a share on revenue of $9.91 billion.

However, HPE's results were mixed by product line. For instance, server revenue was down 5% to $4.5 billion. Networking revenue, boosted by the acquisition of Juniper Networks, was $2.8 billion, up 150% from a year ago. Hybrid cloud revenue was $1.2 billion, down 12% from a year ago.

"HPE had a good quarter, but the growth came all from networking. Surprisingly, HPE server revenue was down, and blaming the customer mix change for it is a surprising move," said Constellation Research analyst Holger Mueller. "More interesting is also that half of HPE profit came from networking, but which underlined how important the Juniper acquiaition has been HPE. Now it's all about Antonio Neri and team getting growth back into the server segment."

For the fiscal year, HPE reported a net loss of $59 million, or 4 cents a share, on revenue of $34.3 billion.

CEO Antonio Neri said HPE "finished a transformative year with a strong fourth quarter of profitable growth and disciplined execution."

As for the outlook, HPE projected first quarter revenue of $9 billion to $9.4 billion with non-GAAP per share earnings of 57 cents to 62 cents.

For fiscal 2026, HPE said revenue will grow 17% to 22% with non-GAAP per share earnings between $2.25 and $2.45.

At HPE Discover Barcelona, the company announced new features for GreenLake cloud, launched unified AIOps across Aruba and Juniper Networks, and unveiled an AI factory partnership with Nvidia.

On a conference call, Neri said:

  • "The underlying demand environment was strong throughout the quarter with orders growing faster than revenues. We saw an acceleration in orders in the last weeks of the quarter, signaling solid demand for our portfolio."
  • "As we look to 2026, we will draw on our supply chain expertise to secure critical commodity supply and exercise our pricing management discipline. We expect DRAM and NAND costs to continue to increase in 2026, the majority of which we expect to pass to the market while monitoring demand."
  • "We have discipline in passing through the cost through our pricing which again we already did in November."

Salesforce delivers strong Q3 as Agentforce, Data 360 surge

Salesforce reported better-than-expected third quarter results, said it has reached 9,500 paid deals for Agentforce and upped its outlook for the fiscal year.

The company reported third quarter net income of $2.086 billion, or $2.19 a share, on revenue of $10.26 billion, up 9% from a year ago. Non-GAAP earnings were $3.25 a share.

Wall Street was looking for non-GAAP earnings of $2.86 a share on revenue of $10.27 billion.

The company, which just closed its Informatica purchase, said its current remaining performance obligations in the third quarter were $29.4 billion, up 11% from a year ago.

CEO Marc Benioff said, "Our Agentforce and Data 360 products are the momentum drivers, hitting nearly $1.4 billion in ARR." He added that Salesforce processed 3.2 trillion tokens.

Key figures:

  • Agentforce accounts in production were up 70% from a year ago.
  • 50% of Agentforce and Data 360 bookings in the third quarter came from existing customers.
  • Data 360 ingested 32 trillion records in the third quarter.

As for the outlook, Salesforce projected fourth quarter revenue of $11.13 billion to $11.23 billion, up 11% to 12%, with non-GAAP earnings of $3.02 to $3.04 a share.

Salesforce said fiscal 2026 revenue will be $41.45 billion to $41.55 billion with non-GAAP earnings of $11.75 a share to $11.77 a share.

Key themes from the Salesforce call:

Benioff said:

  • "We're really excited about the harmonization, integration, federation that Informatica plus Data 360 plus MuleSoft is giving us. And that's going to strengthen our overall leadership in data and, of course, AI. And we're ensuring that we have the distribution capacity. That's extremely important for us because we are a direct seller in place to support long-term growth."
  • "Six of our top 10 deals in the quarter are now driven by companies that just want to transform with Agentforce."
  • "We use all of the large language models. They're all great. We love all of them. We love all of our children, but they're also all just commodities, and we can have the choice of choosing whatever one we want, whether it's OpenAI or Gemini or Anthropic or there's other open source ones. They're all very good at this point. So we can swap them in and out. The lowest cost one is the best one for us, making us basically the top user of these foundation models." 
  • "I just want to make sure everybody realizes we're not building data centers at Salesforce. We're preserving our gross margins and our cash flow. But we will use the data centers that are being built. And we will take advantage of the lower cost that we're seeing in the market from the incredible build-out of data centers."

Snowflake expands Anthropic partnership, delivers strong Q3

Snowflake expanded its partnership with Anthropic to cover joint sales efforts and model integration with Snowflake Intelligence, detailed AWS Marketplace growth and teamed up with Accenture on enterprise AI deployments. The company also reported strong third quarter results.

The news lands after Snowflake's acquisition of Select Star. The purchase expands Snowflake Horizon Catalog's view into enterprise data. Select Star integrates with multiple database systems and business intelligence tools as well as data pipelines.

Here's a look at the partnerships, which were announced shortly after Snowflake's earnings.

Anthropic and Snowflake expand partnership

The partnership expansion brings Anthropic models to Snowflake Cortex AI in a deal valued at $200 million.

Anthropic and Snowflake will collaborate on joint sales efforts and go-to-market motions.

Since the two companies announced their initial partnership, thousands of customers have signed up for Claude models on Snowflake.

Anthropic Claude will be a key model powering Snowflake Intelligence, the company's data intelligence agent.
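
For context on what the integration looks like in practice, Claude models surface through Cortex functions that can be called directly from SQL. The snippet below is a rough sketch using the Snowflake Python connector; the connection details are placeholders and the model identifier string is an assumption that depends on which Claude models Cortex exposes in a given account and region.

```python
# Rough sketch: calling an Anthropic model through Snowflake Cortex from Python.
# Connection details are placeholders; the model name is an assumption and
# should match whatever Claude identifiers Cortex exposes in your region.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="...",
    warehouse="my_wh",
)

query = """
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'claude-3-5-sonnet',
        'Summarize last quarter''s support tickets in three bullets.'
    )
"""

cur = conn.cursor()
cur.execute(query)
print(cur.fetchone()[0])
```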

Sridhar Ramaswamy, CEO of Snowflake, said the partnership with Anthropic was expanded because the two companies are seeing usage surge when they work together.

Anthropic CEO Dario Amodei said "this partnership expansion is a direct result of the momentum and demand we’ve already seen from customers who are driving real business value with Anthropic and Snowflake."

Here's a look at Snowflake's deals with AWS and Accenture.

  • Accenture and Snowflake launched the Accenture Snowflake Business Group to offer AI and data transformation services. The two companies cited Caterpillar as a customer. Accenture and Snowflake will meld Accenture's AI Refinery with Snowflake Intelligence and Snowflake Cortex AI.
  • Snowflake said it has been leveraging AWS Marketplace to drive sales and adoption. The company said AWS Marketplace sales have doubled to $2 billion. Snowflake said it has been recognized by AWS across 14 Partner Award categories. The two companies are collaborating on multiple integrations.

Better-than-expected third quarter results

Snowflake reported a third quarter net loss of $291.6 million, or 87 cents a share. Non-GAAP earnings were 35 cents a share on revenue of $1.21 billion, up 29% from a year ago.

Wall Street was looking for non-GAAP earnings of 31 cents a share on revenue of $1.18 billion.

Snowflake said it had 688 customers with trailing 12-month product revenue greater than $1 million. Remaining performance obligations were $7.88 billion, up 37% from a year ago.

As for the outlook, Snowflake projected product revenue of $1.195 billion to $1.2 billion, up 27% from a year ago. Ramaswamy said Snowflake Intelligence, the company's enterprise AI agent, saw "the fastest adoption ramp in Snowflake history."


Amazon Connect gets its due at AWS re:Invent 2025

Amazon Connect is the lead for AWS when it comes to taking a bunch of building blocks and "primitives" and compiling them for enterprise use cases. Amazon Connect isn't about being a contact center application as much as it is addressing customer-facing use cases starting with service.

The Amazon Connect playbook is being used for multiple product areas such as Amazon Bedrock, AgentCore and AWS Security Hub to name a few. Amazon Connect has surpassed $1 billion in annual recurring revenue. 

During a re:Invent 2025 keynote Wednesday, Colleen Aubrey, SVP, Applied AI Solutions at AWS, said Amazon Connect is a precursor to an AI-powered future of work that revolves around more than automation and efficiency. "What I've learned building on Amazon is that transformation and agility are not opposites. They're actually partners," said Aubrey. "The real prize of AI is new products, new services, better customer experiences and new business models, not less effort."

AI agents require a rethinking of work. Aubrey said one of the places where AI has become a teammate is Amazon Connect, which is used internally for customer service as well as by multiple enterprises. "What we see across many customers is that the center of gravity for customer experience is the same. It's starting in the contact center. But let's be clear, the contact center has expanded beyond just the interactions," said Aubrey.

At re:Invent 2025, Amazon Connect rolled out agentic self-service tools that give AI agents the ability to understand, reason and act across voice and messaging channels. AI agents can automate routine and complex tasks and supervisors can spin them up with tracking and identity-based security.


Amazon Connect is also the biggest beneficiary of Nova Sonic, AWS's advanced speech model. Amazon Connect leverages Nova Sonic as well as a set of models via Amazon Bedrock, with Bedrock AgentCore underneath.

In addition, Amazon Connect is getting tools to analyze conversations for context and sentiment, prepare documentation and handle routine processes. Amazon Connect also has AI-powered product recommendations to turn conversations into potentially revenue-driving engagements.

Other features new to Amazon Connect include:

 


AWS CEO Garman on hardware ambitions, Trainium demand, AI and jobs

AWS CEO Matt Garman said Trainium 3 early demand is strong, the company's hardware ambitions revolve around providing cloud services and enterprises are seeing strong returns amid AI bubble talk.

Here's a look at what Garman said at an ask me anything meeting with analysts.

Hardware ambitions. Garman said:

"Our focus is, is about cloud services and hardware in support of that. And so we don't sell hardware. We're not don't necessarily have plans to, although I wouldn't rule it out ever in the future, but it's not currently what we focus on, we're very focused on building the world's best infrastructure for customers to run on, and what we sell, the services and AI factories is no different than that."

Garman added that Amazon is obviously building hardware in Graviton and Trainium custom silicon, but that's in support of services.

SaaS-y efforts. Garman was asked about the success of Amazon Connect and AWS' recent moves to compile services into more of a suite. Garman said:

"When we launched Connect, nothing like that existed, and we thought that we could do a better job. We had a lot of learnings from internal use and I think that's resonated with customers. That's why it's a billion dollar plus business and there will be others like that."

Garman said software development is another space where AWS can offer applications and cited Kiro. Healthcare is another possibility. He added:

"We don't have a concerted plan around SaaS, and we wouldn't go into it just because we want to go into it. And I think it's more there's an area where we think we have a differentiated idea that we can offer some interesting value to customers. We would always consider it. But it's more around that for us, we love leaning into our partners."


Useful life of AI infrastructure. Amazon is on a 5-year depreciation schedule, noting others are pushing 6 years. "For AI infrastructure, we do five years because we think that there may be shorter life there," said Garman. "We have 20 years on our core infrastructure to know roughly how long CPUs last, drives last, network gear, data centers, etc."

Garman said AI infrastructure may be different since it's evolving so fast. As AI infrastructure moves from training to inference, it's unclear what the useful life will be. "We actually use the same training infrastructure. And the benefits that go into that, whether it's larger models or better bandwidth or other things like that, actually benefit inference as well," said Garman.

Using multiple models, including smaller ones, will also impact the useful life of AI infrastructure. "I think that we're kind of trying to figure it out. I think as you think about a mixture of models, you actually are going to be able to send models to the right size of infrastructure to run it, ultimately, and take advantage of that," he said.
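
To make the stakes of that accounting choice concrete, here is a back-of-the-envelope sketch of how the assumed useful life changes annual depreciation expense. The capex figure is purely hypothetical and is not an AWS number.

```python
# Back-of-the-envelope: straight-line depreciation of AI hardware spend.
# The $10B capex figure is hypothetical; it only illustrates how sensitive
# annual expense is to the assumed useful life (5 vs. 6 years).
capex = 10_000_000_000  # hypothetical AI infrastructure spend

for useful_life_years in (5, 6):
    annual_expense = capex / useful_life_years
    print(f"{useful_life_years}-year schedule: "
          f"${annual_expense / 1e9:.2f}B of depreciation per year")

# 5-year schedule: $2.00B of depreciation per year
# 6-year schedule: $1.67B of depreciation per year
```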

Building AI agents. Garman said AWS is focusing on offering building blocks as well as applications like AgentCore. Large enterprises want building blocks to build agents. Smaller firms will look for a complete package. "AWS has always been giving small customers the capabilities that only the largest companies used to have," said Garman.

AI bubble? It depends. "I don't expect capex to slow down. We'll keep spending and we'll keep growing. It's a capital intensive business and always has been."

He said if you're a VC funding a zero revenue business we may be in an AI bubble. Garman noted that on a CIO panel, multiple executives said they were seeing significantly positive ROI. "I've never met an enterprise that was seeing really good positive ROI investments just decide not to do it," said Garman. "That's my signal of how things are going currently. The industry is still supply constrained by something. Chip capacity, power capacity, laser capacity and things like that."

Developers. Garman was asked if AWS was refocusing on the developer and he said the company is always focused on developers.

Garman said:

"It's always been important. But the focus where we are in the world right now is how much developers are driving some of that innovation. It's an area where I think we can add a ton of value. It's a customer segment that's incredibly important for me and the team."

"We can bring a lot of differentiated value. We think that we can turbocharge what developers can do."

AI and jobs. Garman said, "I don't think AI is replacing jobs, but it is changing them."

Garman added that training will be critical. "We want them to understand how to use AI tools. We want them to figure out how they use AI to code. We want them to figure out how they use AI in their jobs," said Garman. "And because those roles are going to change, we'll continue to iterate on our trainings as well."

AI workloads. Garman was asked about whether AWS was getting AI workloads. He said the multi-model approach has paid off. "Most of our customers are building their AI production applications on AWS. They want them integrated where their applications are. They want them integrated where their data is. They want a choice of models. They actually want a platform to build inference that has enterprise controls and gives them the best price performance, and they're seeing price performance gains," said Garman, who said AWS will embrace multiple models whether they come from Google, OpenAI or anyone else. "I think we have a really differentiated story for customers on how they customize AI for them."

Multi-cloud. Garman said AI workloads will be inherently multi-cloud. The key will be to offer observability across all of the clouds as well as security and network connectivity.

Trainium. Garman has said Trainium 2 was oversubscribed. Trainium 3 is ramping and just became generally available. "I expect to sell those as fast as we land them as well," he said. "The response to Trainium 3 has been much stronger than Trainium 2."

When asked about AWS custom silicon vs. others, Garman said the company is buying plenty of Nvidia and AMD chips. AWS will follow demand.

On Trainium 4, Garman said AWS' custom silicon will link up with Nvidia's NVLink Fusion and others.

Quantum computing. Garman said:

"I think quantum was going to be a super powerful technology. It's a big investment area for us. Our lab is making some and there's much attacks, and who knows me right that way? There's much different paths on quantum. I like the way that we're going around the error correction. And the team has made some really big advancements over the last year.

But no one has made a useful quantum computer yet. People who should dig into it right now are researchers. There's not really good business reason right now."

Leo. Garman said Amazon was bullish on Leo and satellite internet services. "I think Leo is going to unlock a number of new use cases. I think there's a big consumer business as well as a big business opportunity. There are a lot of companies that would love getting a gigabit line in lightly connected areas or out in the field," said Garman.

Robotics and physical AI. Garman said he was excited about physical AI models and robotics, but noted that the models aren't ready just yet. "I think physical AI and agents are going to play a big role and be hugely transformative, but there just hasn't been a prevalence of data," said Garman.

He also noted that it's unclear whether startups sell the brain of the robot or the robot. The market is in its infancy--even though Amazon is one of the largest buyers of robots. "It's early but it's an area that I'm excited about," said Garman.


AWS doubles down on customizing, fine-tuning AI models and agents

Dr. Swami Sivasubramanian, Vice President of Agentic AI, made the case that AWS' suite of AI tools is best suited for wrangling AI agents and customizing models to deliver business outcomes.

Speaking at his AWS re:Invent 2025 keynote, Sivasubramanian said:

"The question isn't whether you should customize your models, but how quickly can you get started?"

The future to Sivasubramanian is custom, high-quality models that can carry out enterprise-specific tasks efficiently. "As agents become easier to build, the next big question emerges, how do we make them more efficient? Today's off-the-shelf models have broad intelligence. They can handle complex tool use, multi-step reasoning and unexpected situations, but they aren't always the most efficient," said Sivasubramanian. "And this efficiency is not just about cost. It's about latency. How quickly can your agent respond? It's about scale. Can it handle quick demand? It's about agility. Can you iterate and improve quickly?"

Sivasubramanian said the barrage of announcements at re:Invent 2025 was about removing complexity and cost from model customization without requiring an army of PhDs. Sivasubramanian followed up on earlier re:Invent announcements revolving around Amazon AgentCore, AWS Marketplace and multiple other products.


Here's a look at the news items from Sivasubramanian's keynote:

  • Amazon Bedrock is getting new model customization tools that feature reinforcement fine-tuning, which can deliver accuracy gains of 66% over base models. Amazon Bedrock automates the reinforcement fine-tuning workflows without needing machine learning expertise. Amazon Nova is the first model offered, with other models coming soon. (A rough sketch of what starting a customization job looks like follows this list.)
  • Amazon SageMaker AI is gaining serverless customization for multiple AI models including Amazon Nova, DeepSeek, GPT-OSS, Llama and Qwen. Amazon SageMaker AI will support reinforcement learning via a simple interface. Models can be customized in days instead of months.
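
As a rough illustration, the sketch below kicks off a job with Bedrock's existing model customization API via boto3. The base model identifier, role ARN, S3 paths and hyperparameters are placeholders, and the new reinforcement fine-tuning workflow described at re:Invent may expose different options than the standard fine-tuning job shown here.

```python
# Rough sketch: starting a Bedrock model customization (fine-tuning) job with boto3.
# Model identifier, role ARN, S3 paths and hyperparameters are placeholders;
# the new reinforcement fine-tuning workflow may expose different settings.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

job = bedrock.create_model_customization_job(
    jobName="nova-support-agent-ft",
    customModelName="nova-support-agent",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="amazon.nova-lite-v1:0",   # placeholder base model
    customizationType="FINE_TUNING",
    trainingDataConfig={"s3Uri": "s3://my-bucket/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://my-bucket/output/"},
    hyperParameters={"epochCount": "2"},
)

print(job["jobArn"])
```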


Be Bold or Be Replaced: AI Agents, Human Courage & the New Enterprise Reality | DisrupTV Ep. 419

This week on DisrupTV, hosts Vala Afshar and R "Ray" Wang sat down with two leaders shaping the future of enterprise AI and leadership: Marty Kihn, SVP of Strategy at Salesforce and author of Agent Force, and Ranjay Gulati, author of How to Be Bold: The Surprising Science of Everyday Courage.

From the rise of agentic AI inside the enterprise to the science behind building courageous organizations, this episode delivered a powerful look at the mindset and technology needed to lead through massive transformation.

Inside Salesforce’s Agent Force: AI Agents at Enterprise Scale

Kihn kicked off the discussion with an inside look at Agent Force, Salesforce’s platform designed to help organizations create, test, deploy, and monitor AI agents safely and at scale.

He emphasized that building enterprise-grade AI requires more than great models—it demands:

  • Safety and trust: personal data masking, toxicity detection, and bias mitigation
  • Data grounding: keeping agents connected to accurate, real-time business context
  • Narrowing scope: focusing agents on specific, well-defined tasks instead of vague ambitions
  • Extensive testing: because LLM-driven agents can be non-deterministic and unpredictable

Kihn framed the future as a hybrid human–agent workforce, where AI augments employees, automates workflows, and gives leaders clearer articulation of what tasks need attention.

He also highlighted the three A’s of the agentic enterprise:

  • Automation — Streamlining repetitive work
  • Augmentation — Boosting human capability
  • Articulation — Making business processes and decisions explicit, observable, and improvable

As organizations move toward this future, Kihn argued, they must invest in protocols and standards that ensure AI agents can communicate with each other and with core enterprise systems. He pointed to emerging frameworks like:

  • Model Context Protocol (MCP)
  • Agent2Agent Protocol (A2A)
  • Advertising Context Protocol (AdCP)

These will shape the future of orchestration across the agentic enterprise.
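
For readers unfamiliar with MCP, the practical artifact is usually a small configuration file telling a client which tool servers an agent may talk to. The sketch below writes one in the commonly used "mcpServers" format; the server name, command and output file name are illustrative placeholders, and each MCP-aware client documents its own config location.

```python
# Illustrative sketch: a minimal MCP client configuration in the common
# "mcpServers" format. Server names, commands and the output file name are
# placeholders; each MCP-aware client documents its own config location.
import json

config = {
    "mcpServers": {
        "crm-tools": {                       # hypothetical server name
            "command": "python",
            "args": ["crm_mcp_server.py"],   # hypothetical local MCP server script
        }
    }
}

with open("mcp.json", "w") as f:
    json.dump(config, f, indent=2)
```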

Building Courage in an AI-Transformed World with Ranjay Gulati

If agentic AI is the technology of the future, courage is the human capability that will define who thrives in it. Ranjay Gulati joined the show to discuss the science behind bold leadership, inspired by his new book, which features a foreword from none other than the Dalai Lama.

Gulati’s message was clear:

  • Courage isn’t the absence of fear—it’s action despite it.

He explored how leaders can cultivate courage at scale inside their organizations by focusing on:

  • Self-efficacy — the belief that you can succeed
  • Deliberate action — small moves that build momentum
  • Purpose — a deeper mission that fuels resilience
  • Generosity — creating a “clan” mindset where people support one another

Gulati argued that courage will be essential as companies navigate accelerating advancements like autonomous vehicles, AI-driven automation, and massive shifts in job roles.

  • “Fear is real,” he said. “But courage emerges when people feel capable, connected, and committed to something bigger than themselves.”

Why Courage and AI Must Evolve Together

A unifying theme from both guests:

  • The future belongs to organizations that blend bold leadership with trustworthy AI.
  • AI will automate and augment work, but humans must bring purpose and judgment.
  • Agents will take over complex workflows, but leaders must set the vision and build trust.
  • Uncertainty will grow—but so will opportunity for those willing to act boldly.

As Ray and Vala wrapped the episode, they emphasized that the path forward requires both technical excellence and human bravery.

Key Takeaways

  • Agent Force is Salesforce’s platform for building and managing enterprise AI agents.
  • Enterprises must focus on safety, data grounding, and narrow scope when deploying AI.
  • Emerging protocols like MCP and A2A will standardize how agents operate at scale.
  • Courage is a learnable skill, driven by purpose, generosity, and action.
  • The future of work will be shaped by hybrid human–AI teams, requiring both innovation and bold leadership.

Final Thoughts: Innovation Starts Within

As enterprises step into the agentic era, one truth is becoming increasingly clear: technology alone won’t determine who succeeds—courage will. AI agents like those built on Salesforce’s Agent Force will accelerate work, reshape roles, and unlock new forms of value. But it’s bold, purpose-driven leaders who will guide teams through uncertainty, build trust in these new systems, and inspire a culture willing to experiment, learn, and evolve.

This episode was a reminder that the organizations ready to blend AI innovation with human bravery will be the ones that define the next decade. Whether you're exploring agentic architectures or building a more courageous workforce, now is the time to lean in.

If you're navigating AI transformation—or preparing your teams for the future—this episode is a must-watch.



CrowdStrike reports strong Q3, 22% revenue growth

CrowdStrike reported strong third quarter results as the company continued to land security wallet share with revenue growth of 22%.

The cybersecurity company reported a third quarter net loss of $34 million, or 14 cents a share, on revenue of $1.23 billion, up 22% from a year ago. Non-GAAP earnings were 96 cents a share.

Wall Street was expecting CrowdStrike to report non-GAAP earnings of 94 cents a share on revenue of $1.22 billion.

CrowdStrike CEO George Kurtz said the third quarter "was one of our best quarters in company history" as net new annual recurring revenue was up 73% from a year ago. CrowdStrike is dueling with Palo Alto Networks to convince enterprises to consolidate cybersecurity platforms.

Burt Podbere, CrowdStrike's CFO, said AI-related demand was strong as customers consumed more of the company's Falcon platform and Flex subscription plans. The company said that second half fiscal 2026 net new annual recurring revenue growth will remain north of 50%.

For the fourth quarter, CrowdStrike projected revenue of $1.29 billion to $1.3 billion with non-GAAP per share earnings of $1.09 to $1.11. For fiscal 2026, CrowdStrike projected revenue of $4.797 billion to $4.807 billion with non-GAAP earnings of $3.70 to $3.72 a share.


AWS' Kiro launches autonomous agents for individual developers

Kiro, Amazon Web Services' next-gen AI developer platform, is launching autonomous agents in preview for individual users, with an agent for each developer, GitHub integration and issue assignments.

Kiro, which recently became generally available, has been expanding tools and features on a two-week cadence since being introduced in July as an agentic AI integrated development environment. Amazon is standardizing its developers on Kiro. 

Speaking during a re:Invent 2025 keynote, CEO Matt Garman said:

"We really see the potential for the entire developer experience, and frankly, the way that software is built to be completely reimagined. We're taking what's exciting about AI and software development and adding structure to it. This is why we launched Kiro, the agentic development environment for structured AI coding,"


With individual accounts and an autonomous agent that learns how you develop, Kiro is moving closer to its goal of taking developers from prompt to prototype faster. Key points about the Kiro autonomous agent, announced at AWS re:Invent 2025, include:

  • The agent is designed for persistent operation beyond individual coding sessions.
  • This AI-powered companion can manage work across multiple repositories and tools like GitHub and Jira.
  • Kiro’s autonomous agent can research an implementation approach to a new feature for an existing code base.
  • It learns from user preferences, orchestrates sub-agents for specialized tasks, and maintains context throughout projects. Early use cases range from bug triage and multi-repo refactoring to maintenance campaigns—tasks often too routine or time-consuming for human developers.

AWS's Kiro autonomous agent is part of a broader plan to offer a unified software development platform that features one interface covering everything from planning to deployment. Nikhil Swaminathan, Kiro’s product lead, said the autonomous agent will be able to spin up sub-agents and complete tasks and judge quality to make the move to production.

Garman said the autonomous agent in Kiro is part of what AWS calls a frontier agent. Frontier agents can carry out longer projects autonomously by operating in the background.

The idea is that the autonomous agent in Kiro will move from tasks to being able to give feedback to a developer. “Just being able to have feedback come through and making it very personalized will be a win,” said Swaminathan. “We’re launching with one agent per developer and will have a private beta. We’ll be expanding more from there.”

"This is AWS first foray into the autonomous AI world when it comes to its largest user population: developers. Good to see the focus, it is likely to increase developer velocity. Leaders of developers and CxOs are now waiting (desperately) for a SDLC focussed autonomous AI offering from AWS," said Holger Mueller, a Constellation Research analyst. 

Ultimately, Kiro autonomous agents for teams will launch.

Kiro is also getting Powers, extensions that let developers quickly augment a Kiro agent for workflows including design integration, hosting and data handling. Powers gives agents only the tools and context they need. Powers is being launched with partners such as Figma and Netlify.

“The concept of a power is to give developers the ability to extend the core agent. Often what happens with frameworks and tools and technologies is that people keep shipping improvements and the model itself is not trained to understand, there's a lot of trial and error with figuring out how to configure the agent rules in the right way to get the output that you need,” said Swaminathan.

These re:Invent 2025 announcements build on new Kiro features announced in November alongside general availability. Some of those additions include:

  • Kiro has added Anthropic's Claude Opus 4.5 model.
  • The new version of the Kiro IDE can measure whether code is up to specifications with property-based testing. Kiro will go into a project's specs, extract properties that indicate how a system should work and then test against them. (See the sketch after this list for what property-based testing looks like in general.)
  • The Kiro agent is available in your terminal. Developers can use the command line interface (CLI) to build features, automate workflows, analyze errors and trace bugs in multiple terminals. Kiro CLI works with the same steering files and Model Context Protocol (MCP) settings that are in the Kiro IDE.
  • Kiro Teams. Kiro is available for teams via the AWS IAM Identity Center with support for other identity providers on deck.
  • A startup program for Kiro Pro+. Startups that have raised funds up to Series B can apply for AWS credits for Kiro until Dec. 31.
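
For readers unfamiliar with the approach, property-based testing checks stated invariants against many generated inputs rather than a handful of hand-picked examples. The sketch below uses the general-purpose Hypothesis library to illustrate the idea; it is not tied to Kiro's implementation, and the deduplication function is just a stand-in.

```python
# Generic illustration of property-based testing with the Hypothesis library
# (not Kiro-specific): the framework generates many inputs and checks that the
# stated properties hold for all of them. Run with pytest.
from hypothesis import given, strategies as st

def dedupe_preserving_order(items):
    seen, out = set(), []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

@given(st.lists(st.integers()))
def test_dedupe_properties(items):
    result = dedupe_preserving_order(items)
    # Property 1: no duplicates remain.
    assert len(result) == len(set(result))
    # Property 2: every original element is still present.
    assert set(result) == set(items)
```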