Editor in Chief of Constellation Insights
Constellation Research
About Larry Dignan:
Dignan was most recently Celonis Media’s Editor-in-Chief, where he sat at the intersection of media and marketing. He is the former Editor-in-Chief of ZDNet and has covered the technology industry and transformation trends for more than two decades, publishing articles in CNET, Knowledge@Wharton, Wall Street Week, Interactive Week, The New York Times, and Financial Planning.
He is also an Adjunct Professor at Temple University and a member of the Advisory Board for The Fox Business School's Institute of Business and Information Technology.
Constellation Insights does the following:
Cover the buy side and sell side of enterprise tech with news, analysis, profiles, interviews, and event coverage of vendors, as well as Constellation Research's community and…
Oracle reported a mixed second quarter and said it has sold its Ampere unit because it's not strategic to design and manufacture its own chips. Oracle CTO Larry Ellison said, "we are now committed to a policy of chip neutrality where we work closely with all our CPU and GPU suppliers."
Ellison noted that it will continue to buy the latest GPUs from Nvidia but "we need to be prepared and able to deploy whatever chips our customers want to buy," said Ellison, who added that "there are going to be a lot of changes in AI technology over the next few years and we must remain agile in response to those changes."
The company reported second quarter earnings of $6.13 billion, or $2.10 a share, on revenue of $16.06 billion, up 14% from a year ago. Non-GAAP earnings were $2.26 a share. Wall Street was looking for non-GAAP earnings of $1.64 a share on revenue of $16.19 billion. Oracle's earnings got a $2.7 billion pre-tax boost due to the sale of Ampere.
Oracle's closely watched cloud infrastructure sales were $4.1 billion, up 68% from a year ago. Cloud revenue overall in the second quarter checked in at $8 billion, up 34%. SaaS revenue was $3.9 billion in the quarter, up 11% from a year ago.
The company said that remaining performance obligations were $523 billion, up 438% from a year ago, due to large contracts with OpenAI. How Oracle will finance its data center buildout has been a hot topic for the company.
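As a rough sanity check on those growth rates, here's the back-of-the-envelope arithmetic; the implied prior-year figures are illustrative estimates derived from the reported percentages, not numbers Oracle disclosed.

```python
# Illustrative check of the year-over-year figures quoted above.
revenue_q2 = 16.06   # $ billions, reported, up 14% year over year
rpo_q2 = 523.0       # $ billions, reported, up 438% year over year

implied_prior_revenue = revenue_q2 / 1.14   # ~$14.1B
implied_prior_rpo = rpo_q2 / (1 + 4.38)     # ~$97.2B

print(f"Implied prior-year Q2 revenue: ~${implied_prior_revenue:.1f}B")
print(f"Implied prior-year RPO: ~${implied_prior_rpo:.1f}B")
```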
Oracle co-CEO Clay Magouyrk said Oracle is halfway through building 72 multi-cloud data centers embedded in AWS, Google Cloud and Microsoft Azure. Mike Sicilia, Oracle's other co-CEO, said the company is embedding AI and automation throughout its applications.
On a conference call with analysts, Doug Kehring, Oracle's Principal Financial Officer, said the fiscal year revenue expectation remains at $67 billion, but the RPO added this quarter will bump up revenue by $4 billion for fiscal 2027. Kehring said fiscal 2026 capital expenditures will be $15 billion higher than forecast last quarter due to contracts that can be monetized quickly.
"While we continue to experience significant and unprecedented demand for our cloud services, we will pursue further business expansion only when it meets our profitability requirements and the capital is available on favorable terms," said Kehring. He said third quarter total cloud revenue will be up 37% to 41% in constant currency.
According to Kehring, third quarter revenue will be up 16% to 18% with non-GAAP earnings between $1.64 and $1.68 a share in constant currency or $1.70 to $1.74 per share in US dollars.
Key points from the call revolved around Oracle's plans to stay investment grade and to build out only when the infrastructure can be monetized. After all, Oracle spent $35 billion in capital expenditures for the third quarter with $13 billion in negative cash flow.
Magouyrk said it was difficult to put a number on funding needs. "We actually have a lot of different options for how we go about delivering this capacity to customers. There's obviously the way that people think about it, which is we buy all the hardware up front. But we don't actually incur any expenses for these large data centers until they're actually operational."
Oracle's capital expenditures can also change based on whether customers bring their own GPUs or decide to rent capacity, said Magouyrk.
Magouyrk said: "We continue to see strong demand for AI infrastructure across training and inferencing. We follow a very rigorous process before accepting customer contracts. This process ensures that we have all the necessary ingredients delivered to customer success at margins that make sense for our business. We continue to carefully evaluate all future infrastructure investments, investing only when we have alignment across all necessary components to ensure profitable delivery for our customers."
"Over the next month, we see increasing customer demand with billions in identified pipelines," said Magouyrk.
Sicilia said: âIn our healthcare business, we now have 274 customers live in production on our clinical AI agent, and that number continues to rise daily.â
Vice President and Principal Analyst
Constellation Research
Michael Ni is Vice President and Principal Analyst at Constellation Research, covering the evolving Data-to-Decisions landscape—where CDOs, CIOs, and CPOs must modernize data infrastructure, integrate AI into decision-making, and scale automation to improve business outcomes.
Ni’s research examines how enterprises operationalize AI, automate decision-making, and integrate data management and analytics into core business processes. He focuses on the challenges of scaling AI-driven decision systems, aligning data strategy with business goals, and the growing role of data and decisioning “products” in enterprise ecosystems.
With 25+ years as a product and GTM executive across enterprise software, AI platforms, and analytics-driven technologies, Ni brings a practitioner’s perspective to…
[Adapted from the transcript of Michael Ni's video interview]
As someone who closely tracks emerging trends in data, AI, and analytics, attending AWS re:Invent 2025 felt like a peek into the future of business innovation. This year’s announcements showcased a clear evolution in AWS’s strategy—not just focused on providing the raw infrastructure for innovation, but positioning itself as a powerhouse for enabling AI agents, automated decision-making, cross-cloud collaboration, and advanced analytics workflows.
For executives navigating today’s complex data landscape—whether you’re a CIO, CDAO, or business leader—here’s my breakdown of the most consequential themes from AWS re:Invent 2025, what they signal for the future, and how you might prepare your organization for this shift.
AI Gets Strategic: AWS Is Climbing Up the Stack
“It’s no longer just about storage and compute rentals,” I heard echoed in conversations throughout the event. AWS is reshaping its narrative by moving up the stack, directly targeting decision-making workflows and governance, and introducing tools for running autonomous AI agents. Historically, infrastructure has been a significant part of AWS's keynote strategy. However, this year we saw a surprising shift—key infrastructure announcements were squeezed into the final moments of Matt Garman’s presentation, delivered in rapid-fire fashion (25 announcements in ten minutes).
This pivot toward process automation and higher-level AI work is clearly a response to competitive dynamics. As Google and Microsoft continue to push higher-value AI experiences into the enterprise market—through tools such as Google Cloud’s AI offerings and Microsoft’s Fabric and Copilot expansions—AWS is adapting by streamlining automation agents and governance workflows.
For technology leaders, the implication is clear: AWS isn’t just renting compute and storage anymore; it wants to be the home of your AI-driven decision-making processes. This is a call to evaluate your architecture—specifically, where AI governance and process automation will sit in your stack.
Agents Are the New Runtime
If there's one overarching theme that stands out, it's the rise of AI agents as the next big runtime. AWS unveiled its agent-focused model operations platform, pushing "agents" as fully autonomous decision-makers that move beyond copilots. Key innovations here include the introduction of AgentCore and Frontier Agents within Bedrock.
AgentCore builds upon Bedrock's prior functionality as a model endpoint platform, adding capabilities for observability, identity management, gateways, policy controls, and both short-term and long-term memory. Frontier Agents, meanwhile, represent specialized deployments, such as security agents, DevOps agents, and virtual team members, that can autonomously drive workflows over extended periods.
The real challenge lies in decision points around governance and human oversight. Questions such as “Who owns agent behavior? Should agents work within a human-in-the-loop framework or fully autonomously? How do we guard against risks when scaling automation?” will become critical as organizations deploy hundreds—or even thousands—of agents across processes. AWS is promising ROI with up to 5-to-10x productivity shifts, signaling that process automation driven by autonomous agents is worth serious investment heading into next year.
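To make the human-in-the-loop question concrete, here is a minimal, hypothetical policy-gate sketch; the action names, risk scores, and thresholds are illustrative assumptions on my part, not an AWS or AgentCore API.

```python
# Hypothetical agent governance gate: every proposed action is checked against
# a policy and either auto-approved, routed to a human reviewer, or rejected.
from dataclasses import dataclass

@dataclass
class ProposedAction:
    agent_id: str
    action: str          # e.g. "close_ticket", "issue_refund"
    risk_score: float    # 0.0 (benign) to 1.0 (high risk), scored upstream

def govern(action: ProposedAction, auto_threshold: float = 0.3,
           human_threshold: float = 0.7) -> str:
    """Return 'auto', 'human_review', or 'reject' for a proposed agent action."""
    if action.risk_score < auto_threshold:
        return "auto"              # fully autonomous path
    if action.risk_score < human_threshold:
        return "human_review"      # human-in-the-loop path
    return "reject"                # block and log for audit

print(govern(ProposedAction("agent-42", "issue_refund", risk_score=0.55)))
# -> "human_review"
```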
Unified Data Analytics Workspace: Sensing, Deciding, Acting
AWS is doubling down on its vision for a unified, comprehensive analytics workspace—one that integrates AI, data engineering, and decision automation seamlessly. The new updates to Studio, Notebooks, and QuickSight create a cohesive ecosystem for developers and analysts to sense, decide, and act all within a single environment.
Notable enhancements include the introduction of SageMaker data agents to address dynamic datasets, serverless notebooks (lightweight, scalable solutions ideal for deployment flexibility), and real-time catalog updates that enable metadata ingestion, notifications, and S3 integration.
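As a rough illustration of the metadata-plus-S3 pattern those catalog updates describe, here is a small sketch using standard boto3 calls; the bucket, database, table, and file names are hypothetical, AWS credentials are assumed, and this is not the specific SageMaker catalog API AWS announced.

```python
# Land a dataset in S3, then register table metadata so downstream tools can
# discover it. Names below are placeholders for illustration only.
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Upload the data file (assumes a local orders.parquet exists).
s3.put_object(Bucket="example-analytics-bucket",
              Key="sales/2025/12/orders.parquet",
              Body=open("orders.parquet", "rb"))

# Register the table in the data catalog so analysts and agents can find it.
glue.create_table(
    DatabaseName="example_db",
    TableInput={
        "Name": "orders",
        "TableType": "EXTERNAL_TABLE",
        "StorageDescriptor": {
            "Location": "s3://example-analytics-bucket/sales/2025/12/",
            "Columns": [{"Name": "order_id", "Type": "string"},
                        {"Name": "amount", "Type": "double"}],
        },
    },
)
```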
The strategic goal here is clear: AWS is positioning itself as a competitor to Microsoft Fabric and Databricks, aiming to win by delivering a unified flow for semantics, governance, and decision-making—all while maintaining developer choice. As enterprises increasingly seek to integrate analytics and decision-making in a connected environment, this new offering holds promise for executives seeking to improve efficiency in managing complex workflows.
Cost-Efficient, Open Compute Remains Central
While AWS's keynote pushed higher up the stack, foundational compute efficiency was far from neglected. Scaling cost-efficient AI workloads has become critical as organizations deploy increasingly complex and resource-intensive models, and AWS delivered announcements such as the Graviton5 processors, the Trainium 3 and Trainium 4 roadmaps, and enhanced GPU performance for ultra-scaled environments.
A particularly noteworthy focus was on inference cost management—a FinOps trend designed to help enterprises optimize token-level spending while running large-scale AI workloads. Questions such as “How do we generate tangible cost savings while scaling hundreds or thousands of agents?” highlight the real economics of deploying AI at scale.
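To ground that question, here is a simple token-cost estimator; the prices, call volumes, and token counts are made-up assumptions for the sake of the arithmetic, not AWS or model-provider pricing.

```python
# Illustrative monthly inference spend for a fleet of agents.
def monthly_inference_cost(agents: int, calls_per_agent_per_day: int,
                           input_tokens: int, output_tokens: int,
                           price_in_per_m: float, price_out_per_m: float) -> float:
    """Return estimated monthly spend in dollars (30-day month)."""
    calls = agents * calls_per_agent_per_day * 30
    cost_per_call = (input_tokens * price_in_per_m +
                     output_tokens * price_out_per_m) / 1_000_000
    return calls * cost_per_call

# Example: 1,000 agents, 200 calls/day each, 2,000 input + 500 output tokens,
# at hypothetical prices of $3 / $15 per million tokens -> about $81,000/month.
print(f"${monthly_inference_cost(1000, 200, 2000, 500, 3.0, 15.0):,.0f} per month")
```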
AWS’s investments here reinforce its value proposition for driving economic simplicity across the largest-scale deployments. Managing large-scale environments efficiently is central to AWS’s strategy to differentiate itself from competitors in the cloud AI space.
Multi-Cloud and Sovereign AI Strategies
AWS is embracing multi-cloud for the first time in a significant way—a notable shift from earlier years, when the narrative often centered on routing traffic exclusively to AWS regions. This year’s announcement of high-speed private connections between clouds, starting with Google Cloud, opens new possibilities for moving and sharing AI-driven workflows seamlessly across cloud boundaries.
Further investments into sovereign deployments, including specialized AI factories for regulated industries, highlight AWS’s understanding of evolving governance needs. In highly regulated sectors where data cannot reside outside specific geographical boundaries, solutions enabling decisions “where the data lives” (rather than where the cloud resides) will be crucial. As sovereign regulations tighten globally, expect these strategies to play an increasingly significant role in enterprise decision-making frameworks.
The Semantic Future
One of the major trends emerging—not just from AWS, but across the broader industry—is the rise of semantic layers as a bottleneck for trustworthy AI. Data catalogs are no longer just repositories for metadata; they’re becoming critical semantic and contextual layers that power AI-driven decisioning. Executives recognize the need to equip AI agents with the context required to ensure accuracy, governance, and trust across workflows.
A year ago, only a fraction of data leaders were talking about semantics. Today, over half of the executives I’ve spoken with cite context challenges for AI agents as a core roadblock to success.
Looking ahead, it’s clear that AWS is preparing to capitalize on this trend. Investments in areas such as S3 vector support, unified metadata systems, and Bedrock grounding updates for workflow context will likely expand into semantics next year. My bet for AWS re:Invent 2026? The company could establish itself as the go-to platform for semantic-driven enterprise workflows.
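To show what "context from a semantic layer" can look like in practice, here is a toy sketch that injects governed business definitions into an agent's prompt; the terms, sources, and prompt format are my own illustrative assumptions, not tied to any specific AWS or Bedrock API.

```python
# Toy semantic layer: business terms mapped to governed definitions that get
# prepended to an agent's question so answers use consistent, audited meanings.
SEMANTIC_LAYER = {
    "active customer": {
        "definition": "a customer with at least one order in the last 90 days",
        "source": "analytics.dim_customer joined to analytics.fct_orders",
    },
    "net revenue": {
        "definition": "gross revenue minus refunds and discounts",
        "source": "analytics.fct_revenue.net_amount",
    },
}

def ground_question(question: str) -> str:
    """Prepend governed definitions for any business terms found in the question."""
    context = [f"- {term}: {meta['definition']} (source: {meta['source']})"
               for term, meta in SEMANTIC_LAYER.items() if term in question.lower()]
    return ("Use these governed definitions:\n" + "\n".join(context) +
            f"\n\nQuestion: {question}")

print(ground_question("How many active customers did we add last quarter?"))
```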
Closing Thoughts
AWS isn’t just building infrastructure anymore. It aims to solidify its position as the platform for running AI agents, automating governance, and driving reliable decisions across scalable, efficient, multi-cloud environments.
Whether your focus is boosting productivity with autonomous agents, unifying analytics environments, or managing inference costs at scale, these announcements align clearly with where the market is heading. In 2026, strategic investments in semantic-based governance, autonomous process automation, and efficient compute will become essential competitive differentiators.
Let’s continue the dialogue—I’d love to hear what resonated most for your organization. If you want a deeper analysis of the implications for your business, feel free to drop me a message. For now, I hope this breakdown helps frame the strategic shifts from AWS re:Invent and position your enterprise for the next wave of AI-driven transformation.
Anthropic, Block and OpenAI have contributed their technologies, which are emerging standards for connecting AI agents, to the Agentic AI Foundation, a project under The Linux Foundation.
The move highlights how one of the biggest issues with deploying AI agents is connecting and governing them. Anthropic, Block and OpenAI contributed Model Context Protocol (MCP), goose and Agents.md, respectively.
MCP is the best known standard to connect AI models to tools, data and applications, but goose, a local-first AI agent framework, and Agents.md, a coding agent standard, will also be key to connecting agents.
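For a sense of what MCP standardizes, here is a minimal sketch of the JSON-RPC 2.0 messages an MCP client exchanges with a server to list and call tools; the tool name and arguments are hypothetical, and real clients typically use an MCP SDK over stdio or HTTP rather than hand-built dicts.

```python
# Two illustrative MCP requests: discover a server's tools, then invoke one.
import json

list_tools_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

call_tool_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "search_tickets",  # hypothetical tool exposed by a server
        "arguments": {"query": "open incidents", "limit": 5},
    },
}

print(json.dumps(call_tool_request, indent=2))
```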
The goal of the Agentic AI Foundation (AAIF) is to provide "a neutral, open foundation to ensure agentic AI evolves transparently and collaboratively."
At first glance, AAIF has most of the big AI agent players lined up. Platinum members include Amazon Web Services, Anthropic, Block, Bloomberg, Cloudflare, Google, Microsoft and OpenAI.
AAIF is launching about a year after Anthropic released MCP, which has become a key standard. Multiple technology vendors have released MCP servers.
The AAIF noted that it has multiple members including Cisco, Datadog, Docker, IBM, Okta, Oracle, Salesforce, Snowflake and a host of others.
Accenture and Anthropic launched a partnership that revolves around driving Claude deployments in the enterprise.
The news lands a few days after Accenture announced a similar partnership with OpenAI.
Here's the reality with Accenture's Anthropic and OpenAI deals: Implementing AI isn't as easy as portrayed, and both LLM providers are going to need partners and integrators to scale. The hyperscale cloud providers, AWS, Google Cloud and Microsoft Azure, all have extensive partner ecosystems, and Google Cloud and AWS have their own models.
Under the Anthropic deal, Accenture will train 30,000 consultants to drive Claude adoption and become a premier AI partner for using Claude Code.
The two companies will also deliver joint offerings for CIOs and for deploying AI in financial services, life sciences, health and the public sector.
For Anthropic, the Accenture deal is about distribution.
As for Accenture's OpenAI partnership, the consulting firm agreed to leverage ChatGPT Enterprise internally and be a primary AI partner.
The companies also launched an AI program aimed at multiple industries including financial services, life sciences, health and the public sector.
CVS Health is betting it can leverage technology and AI to create an "engagement as a service" strategy and an integrated platform that ties together its brands that include Aetna, CVS Caremark, CVS Pharmacy and Health Care Delivery.
The company is outlining its plans at its investor day. CVS Health CEO David Joyner said the company is focused on "building a simpler, more connected and more affordable health care experience for consumers, health care professionals, and payors."
In late 2023, CVS laid out a strategy that revolved around a data flywheel across its customer base that would improve healthcare delivery across multiple touch points. At the time, CEO Karen Lynch touted a plan to "provide panoramic care" for its members. Lynch was replaced by Joyner in October 2024.
Although leadership changed, CVS Health's data assets and touch points remain. CVS Health is still sprawling with 300,000 employees, 185 million consumers and 1.5 million relationships with health care providers. In addition, CVS Health has multiple touch points via its pharmacies, health care delivery facilities and stores.
The working theory for CVS Health—and other large enterprises—is that AI can be used to break down data silos and provide better experiences.
CVS projected 2025 revenue of at least $400 billion with operating income of $4.37 billion to $4.54 billion. For 2026, CVS sees revenue of at least $400 billion with operating income of $13.26 billion to $13.60 billion. CVS Health is projecting margin improvements at Aetna and CVS Caremark with continued earnings at CVS Pharmacy, which the company calls the "front door to healthcare."
The challenge and promise for CVS Health is integrating its various units so there's one platform to engage consumers and deliver healthcare. CVS Health's 2023 plan revolved around a data flywheel and since then AI has emerged and may make the company's plans easier to implement.
Here's the high-level plan.
For this next chapter to come together, CVS Health will need to deliver more healthcare value via "Engagement as a Service" with one platform that'll combine its units to be useful to consumers.
CVS Health's argument for its credibility as a technology provider is that AI is already being used at the company's Aetna, CVS Caremark, CVS Pharmacy and health care delivery facilities. That knowhow will feed the engagement system and enable the company to be proactive, reach consumers and create an ecosystem.
There's no question that healthcare delivery can be more digital and customer friendly. The big question is whether CVS Health is the company to finally make headway on the problem.
Internal efficiencies
In a presentation, CVS Health argued that it can engage consumers better because it has already leveraged AI internally to become more efficient.
CVS Health said it has saved more than $1 billion to invest in new growth and AI.
Some examples of operating efficiencies include:
In the Aetna unit, the company has saved 90 minutes a day per nurse by consolidating clinical documentation, reduced call center volume by 30% and created one care management system from four disparate ones.
Aetna will accelerate interoperability of clinical data.
Oak Street Health, which provides health care services, has ambient AI scribe tools at 90% of its facilities. CVS Health's facilities include Oak Street Health, Signify Health and MinuteClinic.
CVS Caremark is now able to process more than 300 claims per second at peak periods with a 99% first call resolution.
CVS Pharmacy will continue to invest in procurement and supply chain to drive down prices.
Does that ability to drive internal efficiencies mean CVS Health can become an engagement platform provider? We'll find out.
Meet the health care consumer engagement platform
The big plan from CVS Health is to create a platform that will reimagine health care. Tilak Mandadi, Chief Experience and Technology Officer at CVS Health, laid out the plan.
With better engagement, CVS Health can enhance access, simplify navigation and improve affordability. CVS Health is aiming to become a trusted source and consolidate a fragmented experience.
CVS Health certainly has the reach. CVS Health said more than 185 million consumers engage with the company each year and 85% of Americans live within 10 miles of a CVS Health location. With its omnichannel reach, CVS Health can ultimately commercialize its platform.
For CVS Health, one integrated engagement platform can drive growth for all of its units.
What's unclear is whether CVS Health can become a technology provider. The plan for CVS Health is to launch a series of technologies and services in 2026. The play here is to not only engage its own customer base but create an ecosystem revolving around its platform.
IBM said it will acquire data streaming company Confluent in a deal valued at $11 billion. The deal will give IBM an open-source data platform with an annual revenue run rate topping $1 billion that can provide governed data to AI agents.
In a statement, IBM said the deal will add to non-GAAP earnings in the first year and boost free cash flow in year two after the deal closes. IBM CEO Arvind Krishna said the companies will "enable enterprises to deploy generative and agentic AI better and faster by providing trusted communication and data flow between environments, applications and APIs."
Confluent gives IBM a platform that will connect and reuse data for applications, notably AI agents. In many ways, Confluent will be to IBM what MuleSoft and Informatica are to Salesforce--data connection and integration engines that provide the information AI agents will need to make decisions. Confluent has said that AI will drive the company's next phase of growth due to the need for real-time data streaming.
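To illustrate the streaming pattern at the heart of Confluent's platform, here is a minimal sketch using the confluent-kafka Python client to publish an event that downstream systems, including AI agents, can consume in real time; the broker address, topic, and payload are hypothetical.

```python
# Publish an order event to a Kafka topic; downstream consumers (analytics
# jobs, AI agents) read the same stream in real time.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

event = {"order_id": "o-1001", "status": "shipped", "amount": 42.50}
producer.produce(
    "orders",                     # topic
    key=event["order_id"],
    value=json.dumps(event),
)
producer.flush()  # block until the event is delivered to the broker
```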
Here's a look at the Confluent stack, which has multiple deployment options: Confluent Cloud, a managed data streaming platform; Confluent Platform, a self-managed deployment; WarpStream, a hybrid deployment model; and a private cloud offering.
Confluent CEO Jay Kreps said IBM's scale will accelerate its strategy and boost go-to-market efforts. IBM will use Confluent to advance its hybrid cloud and AI strategy, enhance efforts across the company's portfolio and give it a growth engine. IBM said Confluent will also complement Red Hat and its data and automation portfolio.
Confluent is based on Apache Kafka and offers managed open source data streaming. That approach fits with IBM's open source cred, but it also appeals to enterprises looking to avoid lock-in. Confluent's value prop was outlined by Kreps at a November investor conference:
A managed open source data platform focused on what's needed for cloud engineering and running distributed data systems.
Real-time data streaming via a "broad platform where all the parts work together" including connectivity, governance and real-time processing with Confluent's Flink compute offering.
A cloud-native system that works across all environments. "The role of the technology is to kind of act as like a central nervous system that plugs together all the applications and parts of the company," said Kreps.
Here's what IBM is getting with Confluent:
A data platform that can be used to not only wrangle data but also offer point-in-time queries on static data that can better automate decisions. Confluent has argued that companies are software and that data will enable continuous action.
Permission to play in a hot market. Confluent competes with a range of players including hyperscalers like AWS and its Kinesis.
Connectors and governance that will collapse enterprise systems with an AI layer that turns into the user interface. Kreps said: "We've conceived software applications as being primarily these little islands of UI. And if you think about how these systems are going to work together, that's become less true over the years and AI is probably making it even less true. The access to the data and APIs that drive the functionality is going to be as important as the thing you see on your phone or web browser."
Solid revenue growth and non-GAAP profitability. Confluent went public in June 2021 and through its third quarter report in October topped a $1 billion annual revenue run rate. Cloud revenue in the third quarter was up 24% to $161 million.
Industry data plays that can be scaled with IBM's vertical focus and IBM Consulting.
Constellation Research's take
Michael Ni, a Constellation Research analyst, said:
"This deal gives IBM the nervous system it was missing. Without real-time streaming, you canât empower real-time agents, AI-driven workflows, or dynamic decisioning. They all depend on with streaming context, signals, and intent-driven triggers. IBM just closed a structural gap and jumped into the autonomous enterprise platform race with Microsoft, AWS, Google, and Databricks. Expect accelerated consolidation across data and AI platforms as IBM continues to repositions itself as one of the major data infrastructure consolidators."
Constellation Research analyst Holger Mueller said:
"IBM is taking a now proven strategy to its third iteration: Buy a key piece of technology with an open source flavor--RedHat, Hashicorp and now Confluent--that enterprises want and need professional services to implement. Confluent's streaming data capabilities are relevant in the current phase of AI adoption in the enterprise as it provides data to all the places where it is needed."
"Iterative." "Where's the big bang?" "Practical."
Those are a few of the words I've heard when analysts are summing up AWS re:Invent 2025. As usual, there was a firehose of news announcements and talks about strategy today and going forward.
What has emerged is that AWS is becoming a different company. Yes, AWS is firmly committed to developers and dedicated to creating the building blocks needed to scale agentic AI. But AWS is clearly more than an infrastructure company now. It's not quite a software company either. When AWS CEO Matt Garman goes through announcements like AWS S3 Vectors in 30 seconds, you know the company is a bit more curated.
Here's all the infrastructure stuff AWS announced. In previous years, this slide would take up the whole keynote. In 2025, it's a 10-minute fast break bit.
Like its customers, AWS is on a journey that's being reshaped by AI agents. The AWS picture is never complete. The other thing to realize about AWS is that there's now a 6-month cadence for big rollouts. AWS Summit New York, where the company first took its practical approach to AI for a spin, is now a mini re:Invent. AgentCore, AWS' most important launch of 2025, launched at AWS Summit New York.
Frontier agents are more about the future of work than a category.
During his keynote, Garman introduced the term with a trio of software development tools. This concept of a frontier agent revolves around having an AI teammate. My hunch is that the term will likely be renamed.
Garman outlined that frontier agents are a new class of agents that are autonomous, scalable and work over time without human intervention. In other words, they're more like teammates.
The frontier agent riff is a concept that's far from fully baked, but you can see AWS focusing on expansion beyond the software development lifecycle. "We think we're only at the beginning of frontier agents," said Garman.
A few observations about where frontier agents may lead:
The concept moves AI agents beyond tools and into collaborators.
AI will ultimately just have to work. The current AI user interface, which will break down software and data silos, isn't human centric. You shouldn't need to learn how to work with an AI agent.
Change management and trust will be critical to move forward with this teammate concept.
Systems of work that have been around for decades will need to be revamped.
However, multiple enterprises and vendors are coalescing around the same vision.
"I believe that over the next few years, agentic teammates will be essential to every team as essential as the people sitting right next to you, they will fundamentally transform how companies build and deliver for their customers," said Colleen Aubrey, SVP of Applied AI Solutions at AWS, during a keynote.
Blue Origin's William Brennan, Vice President of Technology Transformation, highlighted how the space company has deployed more than 2,700 agents into production to assist engineering, manufacturing, software and supply chain teams.
"Everyone at Blue Origin is expected to build and collaborate with AI agents to make their work better and faster," said Brennan, who said the company built its multi-agent orchestration system on AWS. "By equipping our teams with knowledgeable and capable agents, we were able to dramatically accelerate the product life cycle, increase production rate, and, most importantly, reduce the cost of access to space."
It's not surprising that Blue Origin, owned by Jeff Bezos, is an AWS customer, but the use cases were notable.
AI is still far too complicated.
AWS is doing what it does--launch building blocks, combine them into services and solve problems. Yes, AWS made strides in abstracting the creation of AI agents and customizing models, but it's still very early.
There. Isn't. An. AI. Easy. Button.
Garman said AWS is focusing on offering building blocks as well as applications like AgentCore. Large enterprises want building blocks to build agents. Smaller firms will look for a complete package. "AWS has always been giving small customers the capabilities that only the largest companies used to have," said Garman.
Dr. Swami Sivasubramanian, Vice President of Agentic AI, said the aim is to reduce cost and complexity so you don't need an army of PhDs to implement AI. Ironically, many of the speakers this week at re:Invent 2025 had PhDs.
AWS has a software strategy and it's likely to revolve around use cases.
Yes, AWS had its Amazon Bedrock and AgentCore announcements, but the software to watch is Amazon Connect, which has more than $1 billion in annual recurring revenue and is likely to be the starter kit for what Aubrey highlighted about the future of work.
Amazon Connect got some play at re:Invent and largely flies under the radar. Amazon Connect added agentic AI features and the package of applications and building blocks isn't about contact centers as much as it is customer service use cases.
In addition, AWS is tackling other use cases including software development and now model customization. Is AWS a SaaS player? Not really, says Garman: "We don't have a concerted plan around SaaS, and we wouldn't go into it just because we want to go into it. And I think it's more there's an area where we think we have a differentiated idea that we can offer some interesting value to customers. We would always consider it. But it's more around that for us, we love leaning into our partners."
Margin is AWS' opportunity.
Amazon has said for years that fat margins are its opportunity. That approach comes from Amazon's retail DNA where margins in the best of times are single digits. AWS grew up taking margin away from enterprise incumbents. It still is if you just look at AWS Transform. I thought about this margin mantra repeatedly as AWS talked about its Trainium 3 and Trainium 4 launches. Like Google Cloud and its TPUs, AWS can rake in dough by just acquiring some of those workloads that support Nvidia's fat margins. For now, there are AI bottlenecks everywhere, but Trainium 3 is going to see strong demand and likely crib AI inference workloads.
"The response to Trainium 3 has been much stronger than Trainium 2," said Garman.
AI is forcing multi-cloud approaches and hyperscale cloud cooperation.
AWS and Google Cloud announced an interconnect deal and Microsoft Azure will be in the mix too. If it weren't for Oracle's partnership with all three hyperscalers, there'd be some surprise. AI is forcing the clouds to collaborate. If you're keeping score at home, hell has frozen over a few times already.
AWS Marketplace is a juggernaut.
Given Amazon's commerce roots, this takeaway shouldn't be that surprising. However, AWS is removing friction from buying enterprise software at a steady pace. The ability to buy "solutions" is going to be a win for AWS and buyers. When AWS Marketplace is combined with the partner network, it's clear that AWS has its ground game going well.
The $1 billion AWS Marketplace club is also growing. Snowflake said this week that its AWS Marketplace sales have doubled to $2 billion.
Analyst takeaways from re:Invent 2025
R "Ray" Wang:
"AWS has figured out the AI game that they're going to play. It's about builders. It's how builders interface with marketplaces, how builders interface with ISVs, and how builders interface with corporate teams. Everything AWS is doing right now is focused on helping people get there. They were behind on that story, and we're starting to see something different. AWS realizes it has to give everybody the tools they need."
"AWS is moving up the stack and that's the important thing. Amazon is almost an apps company, but we can't say that because it's AI. AWS is going to crank out as many agents as it can. Customers are going to take them and get stuff one. The AI agents AWS uses internally will be the stuff you get to use internally too."
"I think the partners are really excited, and that's the most important piece. They've been selling so much for Amazon. I mean, it's night and day from three years ago."
"This is the first Amazon re:Invent where there wasn't a lot of talk about the future. Something was missing. What about Leo? What about quantum? People think quantum is far away. I think quantum is the one thing that will pull Amazon in a direction it may not expect."
Holger Mueller:
"AWS re:Invent was different in that AWS is moving up the stack and infrastructure as a service was underplayed."
"The big miss from AWS side, they didn't talk enough about the data side of things. How do you start with lots of framework talk without the data talk."
"I talked to a lot of customers who believed in Athena and were wondering where it was in the keynote. You have to at least show consistency in that you're building stuff."
"Amazon was teeing up something. It was saying we understand the problem here, our abstractions and the outcomes needed."
Mike Ni:
"AI and data clearly go together, and it goes with the brand promise of, you know, best components better together."
"AgentCore delivering on policy and evaluations was important since those pieces are needed to deliver agentic AI."
"With Nova Forge, you're talking about leveraging your first party data. It's fundamental now to actually go beyond the generic with increased accuracy."
Chirag Mehta:
"Kiro got disproportionate focus in terms of the keynote and attention."
"Partner-ed solution sales exceeded direct sales so we're seeing a new wave of builders where companies are building solutions on top of existing products. Partners are driving adoption."
"Frontier agents are the most murky of terms, but the general idea is that AWS is putting a stake in the ground and saying we're going to build the best DevOps agent anyone can ever have."
Liz Miller:
"This was the first AWS re:Invent, where things like Amazon Connect and applied AI solutions took center stage and a deserved spot in the keynote. You're seeing customers take the building blocks and putting them into action and seeing business results."
"With the Nova suite, AWS has intentionally created very durable, usable and price performant foundation models that can be used to build frontier agents. Nova Sonic is one of the best when it comes to delivering speech to speech."
"I'm going to be watching those customers in 2026. There's a collision between what you're buying from Amazon and what you're buying through AWS. Amazon is a commercially ready package like Amazon Ads and Amazon Connect. AWS is the Lego blocks. When does the Amazon customer look at what they're doing with Amazon and AWS holistically?"
Principal Analyst and Founder
Constellation Research
R “Ray” Wang is the CEO of Silicon Valley-based Constellation Research Inc. He co-hosts DisrupTV, a weekly enterprise tech and leadership webcast that averages 50,000 views per episode and blogs at www.raywang.org. His ground-breaking best-selling book on digital transformation, Disrupting Digital Business, was published by Harvard Business Review Press in 2015. Ray's new book about Digital Giants and the future of business, titled, Everybody Wants to Rule The World was released in July 2021. Wang is well-quoted and frequently interviewed by media outlets such as the Wall Street Journal, Fox Business, CNBC, Yahoo Finance, Cheddar, and Bloomberg.
Short Bio
R “Ray” Wang (pronounced WAHNG) is the Founder, Chairman, and Principal Analyst of Silicon Valley-based Constellation…
Director of Marketing, Communications & Operations
Constellation Research
Elle is the Director of Marketing at Constellation Research and producer of the weekly enterprise tech show, DisrupTV. She leads marketing and communications efforts across all avenues while also managing an array of programs including the SuperNova Awards, Business Transformation 150, AX100, AI150, S50 and ShortLists. She has previous experience in public relations and has a strong passion for the marketing world.
Contact Elle at [email protected].
Mission Grade Intelligence, AI Factories, and the Rise of Introvert Branding: Highlights from DisrupTV Episode 420
This week on DisrupTV, hosts Vala Afshar and R "Ray" Wang sat down with three leaders shaping the future of AI, risk intelligence, and human-centered leadership: Benji Hutchinson, CEO of Babel Street; Mukund Gopalan, Global Chief Data Officer at Ingram Micro and AI150 executive; and Goldie Chan—once dubbed the “Oprah of LinkedIn”—author of Personal Branding for Introverts.
The episode explored everything from national security–grade AI to enterprise-scale AI factories to the strengths introverts bring to leadership and brand-building, bringing together leaders pushing the boundaries of national security, commercial compliance, enterprise AI, and human-centered brand building.
Mission Grade Risk Intelligence: Bringing National Security Rigor to the Enterprise
Benji Hutchinson opened the conversation by unpacking the concept of mission grade risk intelligence—a discipline born in national security that is increasingly essential to commercial operations.
Hutchinson explained that federal agencies and large enterprises now face many of the same threats: cyber intrusions, fraud, identity risk, and global instability. Yet less than 30% of Fortune 2000 organizations truly understand mission grade intelligence or the dual-use technologies that power it. Too many rely on outdated, simplistic tools when modern AI systems can match names across languages, analyze massive datasets, and detect risks in real time.
As AI evolves from statistical models to agentic systems, Hutchinson sees a future where intelligent agents automate routine workflows and deliver fast, high-fidelity intelligence—dramatically increasing organizational resilience.
Ingram Micro’s AI Factory: Industrializing Intelligence at Scale
Next, Mukund Gopalan, Global CDO at Ingram Micro, detailed how the company is building an AI factory—a scalable framework for deploying AI across every part of the enterprise.
Rather than treating AI as a collection of disconnected experiments, the AI factory approach emphasizes:
Data quality and trustworthiness
Human engagement and change management
Transformation of business processes—not just tech adoption
Gopalan described how Ingram Micro is using AI agents to automate back-office tasks, simplify operations, improve customer experiences, and identify sales opportunities earlier and more accurately. Their partnership with Google accelerates this journey, but the real magic, he stressed, comes from keeping humans in the loop: employees must understand the system’s capabilities, limits, and decision-making logic.
Personal Branding for Introverts: Turning Quiet Strengths into Leadership Power
The conversation shifted gears with Goldie Chan, branding strategist and author of Personal Branding for Introverts. Goldie challenged the stereotype that introverts are shy or passive, noting that many are “loud introverts”—individuals who can thrive publicly but need solitude to recharge.
Drawing from her personal journey, including a life-changing cancer diagnosis, Goldie highlighted the importance of living a “recommendable life.” She also shared her signature 5Cs of personal branding:
Clarity
Community
Content
Consistency
Connection
Introverts, she emphasized, have distinct superpowers: deep thinking, analytical skills, empathetic listening, and strong one-on-one communication. With intentionality and thoughtful boundaries, they can build powerful and authentic personal brands.
Key Takeaways from Episode 420
1. Mission-grade intelligence is no longer optional.
National security-level risk analysis is now relevant across industries as threats grow more sophisticated.
2. AI factories are the future of enterprise transformation.
Organizations need systematic, repeatable AI workflows—not isolated pilots—to realize meaningful ROI.
3. Humans remain essential in AI-driven systems.
Change management, training, and transparency are as important as the algorithms.
4. Introverts have strategic advantages in leadership.
Their strengths—listening, deep thinking, and relationship-building—translate directly into trust and influence.
5. Personal brands thrive when rooted in authenticity.
Goldie Chan’s 5Cs offer a roadmap for building a sustainable, human-centered brand.
Final Thoughts
Episode 420 showcased a powerful blend of technology and humanity. From national security intelligence to enterprise AI factories to introvert-centered leadership, this week’s guests highlighted what’s required to navigate the next era of innovation: smarter systems, stronger communities, and more intentional storytelling.
As DisrupTV continues to feature top leaders, authors, and innovators, one theme remains clear—AI may accelerate the future, but it’s people who shape it.
Vice President and Principal Analyst
Constellation Research
Chirag Mehta is Vice President and Principal Analyst focusing on cybersecurity, next-gen application development, and product-led growth.
With over 25 years of experience, he has built, shipped, marketed, and sold successful enterprise SaaS products and solutions across startups, mid-size, and large companies. As a product leader overseeing engineering, product management, and design, he has consistently driven revenue growth and product innovation. He also held key leadership roles in product marketing, corporate strategy, ecosystem partnerships, and business development, leveraging his expertise to make a significant impact on various aspects of product success.
His holistic research approach on cybersecurity is grounded in the reality that as sophisticated AI-led attacks become…
Vice President and Principal Analyst
Constellation Research
Holger Mueller is VP and Principal Analyst for Constellation Research for the fundamental enablers of the cloud, IaaS, PaaS and next generation Applications, with forays up the tech stack into BigData and Analytics, HR Tech, and sometimes SaaS. Holger provides strategy and counsel to key clients, including Chief Information Officers, Chief Technology Officers, Chief Product Officers, Chief HR Officers, investment analysts, venture capitalists, sell-side firms, and technology buyers.
Coverage Areas:
Future of Work
Tech Optimization & Innovation
Background:
Before joining Constellation Research, Mueller was VP of Products for NorthgateArinso, a KKR company. There, he led the transformation of products to the cloud and laid the foundation for new Business…
Vice President & Principal Analyst
Constellation Research
About Liz Miller:
Liz Miller is Vice President and Principal Analyst at Constellation, focused on the org-wide team sport known as customer experience. While covering CX as an enterprise strategy, Miller spends time zeroing in on the functional demands of Marketing and Service and the evolving role of the Chief Marketing Officer, the rise of the Chief Experience Officer, the evolution of customer engagement and the rising requirement for a new security posture that accounts for the threat to brand trust in this age of AI. With over 30 years of marketing, Miller offers strategic guidance on the leadership, business transformation and technology requirements to deliver on today’s CX strategies. She has worked with global marketing organizations to transform…
Hannah Mason is a marketing director at Constellation Research. She oversees brand positioning, creative direction, and content development across digital platforms, events, and campaigns. Her work spans branding and design—including graphics, sales collateral, and event presentations—along with website content management, newsletters, and Constellation Insights.
Mason also manages surveys and produces Constellation’s video collateral, including ConstellationTV. Her role ensures that the Constellation brand is communicated with clarity, creativity, and consistency across all channels.
Contact Hannah at [email protected].
AWS re:Invent in Las Vegas marked a pivotal shift in the cloud giant's trajectory. Moving beyond its traditional focus on core compute primitives, Amazon Web Services is aggressively "moving up the stack" to dominate applied AI, data strategy, and agentic frameworks.
Hosted by Liz Miller, the Constellation Research analyst team of Holger Mueller, Mike Ni, Larry Dignan, R "Ray" Wang, and Chirag Mehta distilled the event's many themes into what matters most for an executive audience.
Here is the strategic analysis of AWS’s new direction.
The Big Pivot: Moving Up the Stack to Applied AI
For years, AWS was defined by infrastructure. That era is evolving. Holger Mueller noted a distinct change in tone, observing that AWS is "moving up the stack, less infrastructure as a service". The keynotes moved away from compute fundamentals to focus on higher-level solutions.
This shift is driven by a demand for results. Liz Miller emphasized that "AI-powered solutions and applied AI solutions took center stage," moving the conversation from theoretical technology to demonstrable business outcomes. The narrative has shifted from providing raw tools to showcasing results from frameworks like Nova Forge.
The "Untold Hero": Data Context Over Model Quality
While models grab headlines, analysts argued that data remains the true differentiator. Mike Ni identified data as the "untold story," arguing that the bottleneck to AI success is no longer the model itself but the context it provides.
"It’s actually the context," Ni stated. "And this is where you heard the underlying story of data".
To succeed in 2026, organizations must leverage first-party data. Ni highlighted AWS's introduction of AgentCore and Vector Search technology as critical moves to help enterprises enrich models with relevant, proprietary context.
Cloud Economics: Weaponizing Margins
AWS is returning to its retail roots to squeeze competitors. Larry Dignan observed that AWS is aggressively targeting rivals' margins through vertical integration and custom silicon.
"Everybody’s margin is AWS’s gain," Dignan explained. "Whether you look at Trainium, Nova, they’re just going to…collapse all the rivals’ margin and basically run with it".
For executives, this signals a future of more efficient, price-competitive solutions as AWS leverages its scale to win on cost-effectiveness.
The Rise of "Frontier Agents" and the New Builder
The definition of a "builder" is expanding. R "Ray" Wang noted that AWS has clarified its AI strategy: it is about empowering builders to interface with marketplaces, ISVs, and corporate teams.
However, the terminology is shifting. Chirag Mehta pointed out a bold pivot from "applications" to "frontier agents"—first-party AI frameworks designed for enterprise tasks. "We cannot call them applications, but we can call them frontier agents," Mehta noted.
This aligns with a broader ecosystem shift where partner-led solution sales have exceeded direct sales for the first time, signaling a new wave of builders leveraging collaborative ecosystems.
The Misses: Leaving Legacy Behind?
The aggressive focus on AI did leave some gaps. Holger Mueller criticized the lack of attention to data and legacy tools, specifically noting the omission of Athena and virtual desktop updates.
"No mention of Athena in the keynote… Misvalue. At least you have to show consistency with your building stuff," Mueller remarked. This suggests AWS may be prioritizing its AI mission at the expense of visible support for its legacy service portfolio.
Strategic Outlook: What to Watch for in 2026
As AWS pushes into 2026, the panel offered specific predictions on where the cloud giant goes next:
The Agent War: Holger Mueller predicts a "battle of the agentic frameworks," defined by who can build faster, cheaper agentic applications.
Incremental Innovation: Larry Dignan foresees a move toward sustained execution rather than "big bang" announcements, settling into a "steady case" of ongoing innovation.
Partner-Driven AI: Chirag Mehta advocates that partners will be the primary engine driving AI adoption through the marketplace.
Quantum Disruption: R "Ray" Wang predicts that Quantum computing will eventually "pull Amazon in a direction they may not have expected".
Holistic Adoption: Ultimately, success will be measured by customer behavior. Liz Miller will be watching for when customers stop viewing AWS as a "Lego set" and start seeing it as a holistic solution provider.
AWS is undergoing a profound transformation. By betting big on agents, partners, and silicon economics, it is aiming to turn "Lego blocks" into integrated business results.
Vice President and Principal Analyst
Constellation Research
Holger Mueller is VP and Principal Analyst for Constellation Research for the fundamental enablers of the cloud, IaaS, PaaS and next generation Applications, with forays up the tech stack into BigData and Analytics, HR Tech, and sometimes SaaS. Holger provides strategy and counsel to key clients, including Chief Information Officers, Chief Technology Officers, Chief Product Officers, Chief HR Officers, investment analysts, venture capitalists, sell-side firms, and technology buyers.
Coverage Areas:
Future of Work
Tech Optimization & Innovation
Background:
Before joining Constellation Research, Mueller was VP of Products for NorthgateArinso, a KKR company. There, he led the transformation of products to the cloud and laid the foundation for new Business…
Read more
Editor in Chief of Constellation Insights
Constellation Research
About Larry Dignan:
Dignan was most recently Celonis Media’s Editor-in-Chief, where he sat at the intersection of media and marketing. He is the former Editor-in-Chief of ZDNet and has covered the technology industry and transformation trends for more than two decades, publishing articles in CNET, Knowledge@Wharton, Wall Street Week, Interactive Week, The New York Times, and Financial Planning.
He is also an Adjunct Professor at Temple University and a member of the Advisory Board for The Fox Business School's Institute of Business and Information Technology.
Constellation Insights does the following:
Cover the buy side and sell side of enterprise tech with news, analysis, profiles, interviews, and event coverage of vendors, as well as Constellation Research's community and…
Read more
Vice President & Principal Analyst
Constellation Research
About Liz Miller:
Liz Miller is Vice President and Principal Analyst at Constellation, focused on the org-wide team sport known as customer experience. While covering CX as an enterprise strategy, Miller spends time zeroing in on the functional demands of Marketing and Service and the evolving role of the Chief Marketing Officer, the rise of the Chief Experience Officer, the evolution of customer engagement and the rising requirement for a new security posture that accounts for the threat to brand trust in this age of AI. With over 30 years of marketing experience, Miller offers strategic guidance on the leadership, business transformation and technology requirements to deliver on today’s CX strategies. She has worked with global marketing organizations to transform…
Read more
Vice President and Principal Analyst
Constellation Research
Michael Ni is Vice President and Principal Analyst at Constellation Research, covering the evolving Data-to-Decisions landscape—where CDOs, CIOs, and CPOs must modernize data infrastructure, integrate AI into decision-making, and scale automation to improve business outcomes.
Ni’s research examines how enterprises operationalize AI, automate decision-making, and integrate data management and analytics into core business processes. He focuses on the challenges of scaling AI-driven decision systems, aligning data strategy with business goals, and the growing role of data and decisioning “products” in enterprise ecosystems.
With 25+ years as a product and GTM executive across enterprise software, AI platforms, and analytics-driven technologies, Ni brings a practitioner’s perspective to…
Read more
Principal Analyst and Founder
Constellation Research
R “Ray” Wang is the CEO of Silicon Valley-based Constellation Research Inc. He co-hosts DisrupTV, a weekly enterprise tech and leadership webcast that averages 50,000 views per episode and blogs at www.raywang.org. His ground-breaking best-selling book on digital transformation, Disrupting Digital Business, was published by Harvard Business Review Press in 2015. Ray's new book about Digital Giants and the future of business, titled Everybody Wants to Rule The World, was released in July 2021. Wang is well-quoted and frequently interviewed by media outlets such as the Wall Street Journal, Fox Business, CNBC, Yahoo Finance, Cheddar, and Bloomberg.
Short Bio
R “Ray” Wang (pronounced WAHNG) is the Founder, Chairman, and Principal Analyst of Silicon Valley-based Constellation…
Read more
Hannah Mason is a marketing director at Constellation Research. She oversees brand positioning, creative direction, and content development across digital platforms, events, and campaigns. Her work spans branding and design—including graphics, sales collateral, and event presentations—along with website content management, newsletters, and Constellation Insights.
Mason also manages surveys and produces Constellation’s video collateral, including ConstellationTV. Her role ensures that the Constellation brand is communicated with clarity, creativity, and consistency across all channels.
Contact Hannah at [email protected].
Read more
LIVE from Amazon Web Services (AWS) re:Invent, Constellation analysts give their unique POVs on the conference and report a clear shift: #AWS is moving from infrastructure to applied AI and agents...
Key takeaways from this ConstellationTV episode:
💡 AI = models + data. Nova, Nova Forge, and Nova Sonic are solid, but the real differentiator is how AWS helps customers govern and activate their first-party #data.
💡 From IaaS to “frontier #agents.” AWS is positioning itself as an “almost apps” company with frontier agents built on its own foundation models.
💡 Amazon Connect as a proof point. Now a $1B+ ARR business, Connect shows how applied AI can handle complex service and support at both #enterprise and SMB scale.
💡 Partners and marketplace on top. For the first time, partner-led solutions exceeded direct sales, signaling that an ecosystem of builders and ISVs will drive AI adoption.
AI is still early and “complicated as hell,” but the move to intelligent, outcome-focused solutions is unmistakable.