Results

Workday's Q3 highlights push, pull, patience of its AI strategy

Workday is rounding out its AI strategy, building out its platform with multiple tuck-in acquisitions and looking to become an AI agent player because it can leverage its unified HR and finance data.

The company's third quarter earnings were better than expected, but also highlight how the grand AI strategy, which was outlined at Workday Rising, is just starting.

Workday reported third quarter net income of $252 million, or 94 cents a share, on revenue of $2.43 billion, up 12.6% from a year ago. Non-GAAP earnings were $2.32 a share, 15 cents ahead of Wall Street estimates. As for the outlook, Workday projected fourth quarter subscription revenue growth of 15.5%. For fiscal 2026, Workday projected subscription revenue growth of 14.4%. Fourth quarter operating margin guidance of 28.5% was slightly lower than expected. Workday has had a busy few months.

"I've been on the road a lot lately, meeting with our customers and prospects, and they're all saying the same thing. They see the potential of AI but they're stuck with disconnected systems, bad data and closed platforms. That's where Workday gives them the ultimate advantage by unifying HR and finance on one intelligent platform," said Workday CEO Carl Eschenbach.

The catch is that Workday has just closed key acquisitions to build out its intelligent platform and announced a few more designed to connect AI agents broadly across the enterprise.

"While other vendors confuse the market with thousands of overlapping general purpose agents, we're focused on what we do best, and that is building powerful agents for HR and finance that deliver real ROI and measurable business value," said Eschenbach, who said data quality is hindering enterprise AI.

If this theme sounds familiar it's because you've heard something similar from multiple enterprise software vendors. You've also heard the same argument from all the AI-driven startups looking to become the new SaaS leaders.

Making sense of Workday's acquisitions

Workday has been on a run of tuck-in acquisitions since its February 2024 purchase of HiredScore. The company just announced the acquisition of Pipedream and closed Sana, which is billed as the future front door to Workday. Paradox, Flowise and Evisort are all deals that are designed to expand Workday's AI agent ambitions.

These tuck-in deals have largely created Workday's AI agent flywheel. These acquisitions also give Workday attachments to sell to core platforms. For instance, Eschenbach said on Workday's third quarter earnings call that the company is selling Paradox, which focuses on frontline workers, attached to its recruiting software. HiredScore also rides along with recruiting.

Eschenbach said:

"We have the industry-leading AI recruiting platform out there today. At the same time, this is now a new product that is a land-only product for our sales force who can now go and sell Paradox not only on top of Workday or back into our installed base, but also into our competitors' environment. In fact, a significant portion of their existing customers aren't Workday today. And we're going to continue to leverage that go-to-market model, so it gives us another land product without someone having to decide completely on Workday, HR or finance, they can go just with Paradox. And I've seen that come up multiple times just in the first 60 days of us having this great asset."

As for the returns, Eschenbach noted that "for every dollar of recruiting we sell, we sell about $2.50 of HiredScore on top of it." Evisort, a document intelligence company for contract management acquired in September 2024, is growing at a triple-digit clip for Workday. And Paradox opens the frontline worker recruiting market to Workday.

Zane Rowe, Workday CFO, said both Sana and Paradox are contributing 1.5 points to the company's fourth quarter subscription revenue outlook.

Sana will follow a similar playbook and be sold on top of Workday Learning.

"And then obviously, we're going to refresh our UI/UX, leveraging the Sana platform going forward," said Eschenbach.

Gerrit Kazmaier, President of Product and Technology, said on the third quarter earnings call that Sana will be the "leading UI experience for Workday" and a "complete conversational experience."

He said:

"Imagine every employee having access to HR and finance AI at scale. What that means in cost reduction. On the other side, you can see what drives that interest. And thirdly, Sana goes much more beyond that. And I would recommend you look at the big picture with also Pipedream, adding 3,000 connectors to the Sana platform, which now allows our customers to take Sana knowledge management actions in Workday and the actions that Pipedream adds to really drive enterprise-wide AI transformation with that model."

What Workday is doing is acquiring add-on features and front-ends to the company's HR and financial data.

Using a multi-cloud approach built on AWS and Google Cloud, Workday is mandating that customers move from its own data centers to the public cloud. The thinking is that Workday will be able to leverage best-of-breed tooling and ship innovation faster.

Workday customers will have to update Workday tenant URLs and reconfigure integrations.

The big picture

What Workday is really working toward is an army of controlled AI agents focused on driving enterprise productivity and processes. But to do that you need the data and process intelligence.

Eschenbach said during Workday's Analyst/Investor Day at Workday Rising: "It's not about the quantity of agents you're bringing to market. It's the quality of agents and they have to drive real business value. They have to drive real outcomes."

Kazmaier said AI will be the new UI and enterprise vendors will have to provide leading experiences or lose share.

Speaking on Workday's earnings call, Kazmaier outlined the key ingredients for deploying agentic AI:

  1. "The first thing that you need is a vast set of data, which basically describes the domain, the domain of finance, the domain of HR. And you need a vast set of data that basically codifies how their data is being used."
  2. "You need to have strong semantics and clarity about what data we present. You need to have a data model that defines what data element represents what entity in the business? What do they relate on? And what are the rules for these business entities? They have integrity, they have meaning, they have purpose."
  3. "You need to have a business process system, which now basically tells you how to activate this data in a way that can drive towards a business outcome. Even more so, you need to have clarity on what that business outcome is."
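As a loose illustration (not Workday code; every name here is invented), Kazmaier's three ingredients can be sketched as a semantic layer of entities with integrity rules plus a process layer that activates validated data toward an outcome:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the three ingredients: domain data, a semantic
# model that gives the data meaning and rules, and a business process
# that activates it toward an outcome. Names are illustrative only.

@dataclass
class Entity:
    """Semantic layer: what a data element represents and its rules."""
    name: str
    attributes: dict
    rules: list = field(default_factory=list)

    def is_valid(self) -> bool:
        # Every rule must hold for the data to have integrity.
        return all(rule(self.attributes) for rule in self.rules)

@dataclass
class ProcessStep:
    """Process layer: how validated data drives toward an outcome."""
    description: str
    action: callable

def run_process(entity: Entity, steps: list) -> list:
    """Activate the data only if it passes its semantic rules."""
    if not entity.is_valid():
        raise ValueError(f"{entity.name} violates its semantic rules")
    return [step.action(entity.attributes) for step in steps]

# Example: an HR "Requisition" entity with one integrity rule.
requisition = Entity(
    name="Requisition",
    attributes={"title": "Analyst", "budget": 90000},
    rules=[lambda a: a["budget"] > 0],
)
steps = [ProcessStep("post job", lambda a: f"Posted {a['title']}")]
print(run_process(requisition, steps))  # ['Posted Analyst']
```

The point of the sketch is the ordering: the process layer refuses to act on data the semantic layer has not validated, which is the dependency Kazmaier is describing.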

Workday's argument is that its consolidated data set and process data is the differentiator. The bet is that data and process drive AI not the other way around.


Dell Technologies ups AI server shipment outlook amid strong Q3

Dell Technologies saw record AI server orders in the third quarter and raised its fiscal 2026 AI shipment guidance to $25 billion, up 150% from a year ago.

The company reported third quarter earnings of $2.28 a share on revenue of $27 billion, up 11% from a year ago. Non-GAAP earnings in the third quarter were $2.59 a share. Wall Street was expecting Dell to report non-GAAP earnings of $2.48 a share on revenue of $27.3 billion.

Dell also named David Kennedy as CFO on a permanent basis; he had been interim CFO. Kennedy said fiscal 2026 revenue will be $111.7 billion.

Jeff Clarke, chief operating officer of Dell, said the company has landed $30 billion in AI server orders year to date. "Our five-quarter pipeline is multiples of our $18.4 billion backlog with a mix of neocloud, sovereign and enterprise customers," said Clarke, who noted that Dell is building high-performance systems as well as complex clusters.

Like recent quarters, Dell's growth was powered by its infrastructure solutions group. The client solutions group has struggled to deliver revenue growth.

For the infrastructure unit, Dell reported third quarter revenue of $14.1 billion, up 24% from a year ago. Operating income was $1.7 billion, up 16% from a year ago. Servers and networking revenue was $10.2 billion, up 37% from a year ago, and storage revenue fell 1% to $4 billion.

For the PC unit, Dell reported operating income of $748 million in the third quarter on revenue of $12.5 billion, up 3% from a year ago. Commercial client revenue was up 5% and consumer revenue fell 7%.

As for the outlook, Dell projected the following:

  • Fourth quarter revenue will be between $31 billion and $32 billion, up 32% from a year ago. Fourth quarter non-GAAP earnings will be $3.50 a share.
  • Fiscal 2026 AI server shipments will be about $25 billion, up 150%.
  • Fiscal 2026 non-GAAP earnings will be $9.92 a share on revenue between $111.2 billion and $112.2 billion, up 17%.

On the earnings call, Clarke said:

  • Dell has AI racks operational within 24 to 36 hours of delivery with uptime topping 99%.
  • The company shipped $5.6 billion in AI servers in the quarter. 
  • Traditional server demand grew double digits in EMEA, with North America growth accelerating. 
  • All-flash array storage systems had double digit demand growth. 
  • On the supply chain Clarke said: "We are well positioned across our commodity basket - Q3 was deflationary, and our outlook for Q4 is largely unchanged from last quarter. Looking ahead to next year, there will be dynamics that we will have to navigate, but we are confident in our ability to secure supply and adjust pricing as needed."

Kennedy also touched on the fiscal 2027 outlook. He said: "We have strong conviction in our AI business, supported by what we see in our backlog, the pipeline, and ongoing customer discussions. We’ve proven we can execute and deliver for our customers in this space."
 


A peek at IBM's practical approach to quantum computing

An IBM executive said the company's steady and practical approach to quantum computing will win out over the bluster that's emerging from multiple vendors.

Speaking at an investor conference, Ric Lewis, senior vice president of infrastructure at IBM, was asked about Big Blue's approach to quantum computing. Lewis said quantum computing isn't about pumping out press releases as much as it is about practical use cases and believable roadmaps.

"We are taking a very practical, rational approach to it," said Lewis. "We're not expecting some scientific breakthrough at this point. It's a matter of engineering and execution to get where we need to go."

Lewis added:

"When I watch other quantum people and what they're saying I watch for a few things. One, do they have a believable roadmap. Not a roadmap, not just an aspiration but have you shown through your progress over the last five years that you're on a certain trajectory. Do you have a believable roadmap for the next several steps?"

IBM's roadmap revolves around delivering quantum advantage in the next year and error correction and other capabilities later in the decade. "In '28, '29 we believe we'll be transacting on a system kind of level," said Lewis. "We're already transacting and we have clients that are buying cycles of quantum."

Lewis said IBM's roadmap is predictable and features an ecosystem of partners and a software stack. IBM features Qiskit, quantum computing software that has solid adoption.

Holger Mueller, an analyst at Constellation Research, said:

"Technical breakthroughs in commercial production do not happen overnight, but are the result of a string of successful completions of architectural advances that get delivered on time and functionally working. That is what IBM has done over the last 2-3 years. That is the progress and fidelity enterprises want to see when adopting a new technology, and a quantum platform is no exception."

Overall, Lewis said quantum computing is fragile and systems will need more resilience. He argued that combining classic and quantum techniques for error correction in quantum computing will be a practical approach to solving big problems.

"I also look for a philosophy that says quantum is not a replacement to classical," said Lewis. "When you combine them together, you end up with something very strong. And since we play strong in classical, we play strong in AI, and I think we're the leader in quantum, we're really well positioned for as the industry gets to this kind of 2030 time frame and all that TAM. So we're pretty bullish and excited about it, though cautious and practical. Just keep executing the road map, make our steps, and we're going to be in a really good spot."

My take

Lewis has a point regarding the bluster meter in quantum computing. I'll tend to listen more to executives who refrain from the trash talk.

The reality is deployments at scale and returns on investment are probably five years away. Compute, networking and hybrid HPC-quantum systems are in the nascent stages. If you listen to quantum executives and the leading players, most companies are talking the same timelines.

Bluster has led to bigger valuations for some of the pure plays and enabled them to build strong balance sheets despite paltry sales. I'm willing to bet we're entering a new quantum computing phase where being more understated plays better. In the end, you need to deliver the qubits, error correction, software stack and scalability over the press release count.


Alibaba Cloud Q2 revenue surges 34% courtesy of AI, Qwen momentum

Alibaba's cloud revenue in the second quarter surged 34%, driven by AI workloads. The cloud business is now on an annual revenue run rate of more than $22 billion.

The Chinese retail and cloud giant said second quarter revenue for its Cloud Intelligence Group was $5.59 billion, up 34% from a year ago. Earnings before interest, taxes and amortization were $506 million in the second quarter.

Alibaba said it is investing heavily in building its vertical AI stack. The company is also betting heavily on open source large language models. As of Oct. 31, Alibaba's open source Qwen models have led to more than 180,000 derivative models on Hugging Face.

The company added that it is seeing "accelerating adoption of our AI products across a broad range of enterprise customers, with a growing focus on value-added applications including coding assistants."

Alibaba said it will continue to invest in AI products and its AI infrastructure.

In September, Alibaba Cloud outlined upgrades to its stack including new servers, networking, distributed storage and computing clusters. That infrastructure complements what Alibaba calls Platform for AI along with model training and inference services.

Alibaba Cloud's stack includes:

  • An upgraded set of databases, containers, storage and compute services optimized for data and AI workloads.
  • Qwen3 family of models including Qwen3-Max, which has instruct and thinking versions, and Qwen3-Omni, a multilingual and multimodal model.
  • The upcoming Wan2.5 video generation model.
  • An upgraded agent development and application platform led by Model Studio.
  • Model-Studio-ADK (Agent Development Kit), designed for enterprise use cases.
  • An upgraded AgentBay, a multimodal cloud operating environment and expert agent platform.
  • AgentOne, an enterprise AI application platform.

Zoom delivers strong Q3 as enterprise traction, AI Companion, Zoom Phone gain

Zoom reported better-than-expected third quarter results as the company gained wallet share and grew the number of customers contributing more than $100,000 in trailing 12 month revenue.

The company reported third quarter net income of $612.9 million, or $2.01 a share, on revenue of $1.23 billion, up 4.4% from a year ago. Non-GAAP earnings per share in the quarter was $1.52.

Wall Street was expecting Zoom to report earnings of $1.44 a share on revenue of $1.21 billion.

Zoom continued to show more growth in its enterprise segment. Enterprise revenue was $741.4 million, up 6.1% from a year ago, while online revenue, which mostly comes from small businesses and consumers, was $488.4 million, up 2%.

By the numbers:

  • Zoom reported that it had 4,363 customers contributing more than $100,000 in trailing 12 months revenue.
  • Online monthly churn in the third quarter was 2.7%, or flat from a year ago.
  • Workvivo had 1,225 customers, up 70% from a year ago.
  • Zoom Phone passed 10 million paid seats in the third quarter.
  • Zoom ended the quarter with $7.9 billion in cash and equivalents.

In prepared remarks, CEO Eric Yuan said the company is seeing strong demand for its AI Companion. He said team chat monthly active users were up 20% from a year ago. Zoom’s AI Companion integrates with Google Workspace, Microsoft 365 and Teams, Slack, Salesforce and ServiceNow.

"AI isn’t just bolstering our core, it’s opening new revenue streams and deeper customer value through customization and automation. Two quarters in, Custom AI Companion is scaling with several Fortune 200 wins and broad interest," said Yuan.

He noted that customer experience is gaining due to AI. "Within Customer Experience, AI has become a clear differentiator, creating additional monetization opportunities. Nine of our top ten CX deals involved paid AI, such as Zoom Virtual Agent or AI Expert Assist, as enterprises use Zoom to deliver faster, more personalized service," Yuan said.

As for the outlook, Zoom projected fourth quarter revenue between $1.23 billion and $1.235 billion with non-GAAP earnings between $1.48 a share and $1.49 a share.

For fiscal 2026, Zoom said revenue will be between $4.852 billion and $4.857 billion with non-GAAP earnings between $5.95 and $5.97 a share.


AWS to invest $50 billion for US govt HPC, AI infrastructure

Amazon said it will invest $50 billion to build and deploy AI and high performance computing AWS infrastructure built for the US government.

Construction on the infrastructure will break ground in 2026. The new investment will add almost 1.3 gigawatts of compute capacity across AWS Top Secret, AWS Secret and AWS GovCloud Regions for multiple classification levels.

Hyperscale cloud providers have been investing billions of dollars to build out capacity for AI. These cloud providers also see a big total addressable market in public sector and government customers.

According to AWS, which kicks off its re:Invent 2025 conference next week, the US government focused infrastructure will be powered by its AWS Trainium AI chips as well as Nvidia GPUs. Federal agencies can use the infrastructure to access Amazon SageMaker AI, Amazon Bedrock and multiple models including Anthropic's Claude and Amazon Nova.

Use cases for the new infrastructure will range from national security to scientific research and innovation. AWS has been building US government focused clouds and infrastructure since 2011.

AWS CEO Matt Garman said the company's first-ever supercomputing infrastructure for government customers will give agencies "expanded access to advanced AI capabilities."

Constellation Research analyst Holger Mueller said:

"AWS has the largest government share of the cloud providers, and as such has practically the most to lose. Implicitly AWS confirms that HPC is a potential disruptor for commercial relations of cloud vendors - and therefore has to invest in it. The key question is going to be if AWS can pony up the capex for other key hardware innovations, AI super computers first, quantum platforms next."


NATO to deploy Google Distributed Cloud

NATO will deploy Google Distributed Cloud, an air-gapped version of Google Cloud that runs on-premises.

Under a multi-million dollar contract with NATO's Communication and Information Agency (NCIA), Google will deliver sovereign cloud services to NATO for edge computing and AI use cases.

GDC is a sovereign cloud in a box that’s physically disconnected from the internet but includes everything to deploy virtual machines, run workloads and use services such as Vertex AI.

Google Distributed Cloud (GDC) was a big theme at Google Public Sector's annual conference in Washington DC in October. GDC was being used in public sector deployments, defense use cases and sovereign data and cloud implementations. Typically, GDC was being used in edge locations with limited to no connectivity and hardened environments.

According to a statement, GDC will support the NATO Communication and Information Agency's Joint Analysis, Training and Education Centre, which is using the infrastructure to modernize and manage classified workloads.

With GDC, NATO will maintain data residency and operational controls as well as autonomy. NATO has been building an interoperable communications and information systems architecture that can bring together 12 allied nations and 36 NATO entities quickly and at scale. The goal is to enable forces to operate together on the fly.

Constellation Research's Holger Mueller said:

"Google Cloud has for a long time invested in its next-generation computing platform, originally Anthos, now Google Distributed Cloud (GDC). The key selection criteria for a next-gen computing platform is workload portability between public clouds and on premises. The required characteristic is identicality. The tech stack in the supported deployment is identical, portable and has investment protection of code assets. With Gemini and Vertex AI running on GDC, Google sets itself apart from the other next-gen computing vendors, and is therefore very well secured for air gapped solutions required for military workloads."


The enterprise LLM questions you should be asking

Large language models are at an interesting juncture. LLM breakthroughs have slowed and there are questions about whether they will lead to artificial general intelligence. Coupled with concerns about an AI infrastructure bubble, LLMs are going to be closely watched--especially since they're the key ingredient of agentic AI.

In the end, LLMs don't have to necessarily lead to some superintelligence to have a big impact on enterprises. Enterprise AI and the AI market that fascinates venture capitalists and Wall Street investors are two different markets. There are enough AI returns for enterprises even if LLMs stagnate for the next year.

With that backdrop and this week’s headlines, it's worth pondering the key LLM questions.

Note I do not have the answers but usually know the questions to ask. Here's a look at the key LLM questions as 2025 comes to a close.

Will LLMs--and the AI agents they power--upend software as a service? This debate has been bubbling up throughout the year. The LLMs vs. SaaS debate is far from settled, but it's no surprise that enterprise software vendors are scrambling to present themselves as platforms. No more cross-selling clouds. No more functional silos. Today, the masters of the cross sell are talking platforms. Salesforce, Microsoft, Workday, ServiceNow and a cast of hundreds are creating AI agents that work across their applications. Meanwhile, OpenAI and Anthropic are looking to break the SaaS margin profile to woo enterprises. The idea is simple: Relegate systems of record to plumbing where ChatGPT or Claude is the interface and workflow engine. It's early in this LLM vs. SaaS debate, but the SaaS crowd has plenty of disgruntled customers looking for alternatives. The running joke is SaaS and healthcare are the two enterprise categories where prices are always guaranteed to go up. LLMs could be a SaaS replacement or just a nice negotiation tool.

Five years from now will we all say, 'those LLMs turned out to be a kick ass enterprise search'? The more I use LLMs, the more I think their greatest contribution is perusing structured and unstructured data and surfacing it easily. LLMs clearly collapse the time spent on conducting searches and doing superficial research. Yes, LLMs will make stuff up, but they're a great starting point. When combined with enterprise data and repositories that have been useless for years, LLMs are revamping the search game for companies. Suddenly, context engineering is a thing.
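A minimal sketch of that enterprise-search pattern: retrieve the most relevant internal documents, then assemble them into a prompt as context. Plain keyword overlap stands in here for embeddings and a real LLM call, and every document and name is invented for illustration:

```python
import re
from collections import Counter

# Toy retrieval-then-prompt pipeline. A production system would use
# embedding similarity and send build_prompt's output to an LLM.

def tokenize(text: str) -> Counter:
    """Lowercase word counts, the simplest possible representation."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def retrieve(query: str, docs: dict, k: int = 2) -> list:
    """Rank documents by shared-token count with the query."""
    q = tokenize(query)
    scored = sorted(
        docs.items(),
        key=lambda item: sum((tokenize(item[1]) & q).values()),
        reverse=True,
    )
    return [name for name, _ in scored[:k]]

def build_prompt(query: str, docs: dict) -> str:
    """Assemble retrieved documents into context for the model."""
    context = "\n".join(docs[name] for name in retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Invented internal repository.
docs = {
    "hr-policy": "Parental leave policy grants twelve weeks of leave.",
    "it-policy": "Laptops are refreshed every three years.",
    "finance": "Expense reports are due within thirty days.",
}
print(retrieve("how many weeks of parental leave", docs, k=1))  # ['hr-policy']
```

The "context engineering" the column mentions is essentially the work of deciding what goes into `build_prompt`: which documents, in what order, under what token budget.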

Is the future of UI generative? Enterprises have paid SaaS providers for years for data stores, workflow models and user interfaces (that may or may not be swell). If the UI layer collapses, what exactly are you buying? Sure, SaaS providers are talking about how they could be a headless platform, but that'll likely mean lower prices. The idea that LLMs could spin up relevant user interfaces on the fly has been appealing--if not a bit theoretical. However, Google's launch of Gemini 3 features a lot of interface goodies where widgets and layout themes are presented on the fly. Suddenly, a search query can provide an answer that comes in a magazine format. Answers can have code for functions like mortgage calculators. According to Google, Gemini 3 knows good design principles. Watch these UI developments closely because there's likely a big impact on enterprise software in the future.

Google published a generative UI paper to go along with the Gemini 3 launch. The company said: "Our evaluations indicate that, when ignoring generation speed, the interfaces from our generative UI implementations are strongly preferred by human raters compared to standard LLM outputs. This work represents a first step toward fully AI-generated user experiences, where users automatically get dynamic interfaces tailored to their needs, rather than having to select from an existing catalog of applications."
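To make the generative UI idea concrete, here is a hypothetical sketch in which a model emits a JSON layout spec and the client renders it on the fly. The spec format is invented for illustration and is not Gemini's actual output format:

```python
import json

# Toy renderer for a model-emitted layout spec. Instead of a fixed
# screen, the client builds whatever widgets the model asked for.

def render(spec_json: str) -> str:
    """Render an invented widget spec into a plain-text mockup."""
    spec = json.loads(spec_json)
    lines = [f"== {spec['title']} =="]
    for widget in spec["widgets"]:
        if widget["type"] == "text":
            lines.append(widget["value"])
        elif widget["type"] == "slider":
            lines.append(
                f"[slider] {widget['label']}: {widget['min']}-{widget['max']}"
            )
    return "\n".join(lines)

# A model might answer a mortgage question with an interactive layout
# rather than prose:
spec = json.dumps({
    "title": "Mortgage Calculator",
    "widgets": [
        {"type": "slider", "label": "Rate %", "min": 3, "max": 9},
        {"type": "text", "value": "Monthly payment updates as you drag."},
    ],
})
print(render(spec))
```

The enterprise software implication is in that division of labor: the vendor's UI becomes whatever the model decides to generate, and the renderer is generic.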

Will AI agents be implemented without forward deployed engineers? The most popular technology job today is the "forward deployed engineer." These folks are technical experts who can also consult and co-innovate with customers. Forward deployed engineers swoop in, surface use cases, get the data models in shape and then help you implement AI agents and your digital workforce. The big idea is that forward deployed engineers can get you from pilot to production faster. A cynic would say software vendors are starting to look like consultants, which is fine since consultants are also offering software. Palantir made forward deployed engineers and its data ontology popular and now enterprise vendors are all over it. Forward deployed engineers are swell, but they likely inflate the price tag for enterprises. The big question is when AI agents can do a lot of this work themselves.

ServiceNow is investing in forward deployed engineers as well as automating as much of the implementation as possible. Amit Zavery, ServiceNow's President, Chief Product Officer & COO, said: "We have 100-plus prepackaged workflows with Agentic built in. So, you don't have to do a lot of handholding to get going. Of course, there are going to be co-innovation required. There might be something specific for our customers. That's why we're investing in FD kind of a model with forward deployed engineers who are really AI black belt who can work very closely with customers on the AI expertise required for some of those use cases."

Are LLMs a dead end in the pursuit of artificial general intelligence? Call it AGI. Call it superintelligence. The working theory is that LLMs will lead to AGI. All the cool kids think so. Here's the recipe for AGI: Advance LLMs with a ridiculous amount of GPUs, data centers, land and power, and bam, here's superintelligence. That simplified recipe is behind big valuations, lots of debt and monetization schemes that may or may not work out. If interested in a contrarian view, it's worth checking out this Wall Street Journal profile of Yann LeCun, who headed Meta's Fundamental AI Research group. LeCun argues for world models, which are trained on visual information instead of text. The punchline of the WSJ story: "If you are a Ph.D. student in AI, you should absolutely not work on LLMs."


Quantum, Geo-Targeting, and Freshworks Strategic Pivot

ConstellationTV episode 118 covers the latest enterprise tech developments, creative advertising solutions, and strategic pivots from leading organizations. Here are the main takeaways:

SAP’s Bold Move in AI Development

During SAP TechEd Berlin, SAP made a significant announcement regarding its AI innovation. For the first time, SAP released its own large language model (LLM), known as SAP RPT-1, a groundbreaking tool specialized in understanding and processing tabular data. As highlighted in the discussion, this move underscores SAP’s unique position in the enterprise software space, being the first to ship an AI model specifically tailored to meet business users’ needs in this domain.

SAP’s renewed focus on solving practical challenges for its users was another focal point. Instead of being consumed by AI hype, the company demonstrated a clear understanding of its customers' pain points. One major issue is the S/4HANA upgrade conundrum, where legacy code and architecture present substantial hurdles. SAP’s innovative use of AI to assist with ABAP code reviews and streamline transitions to their cherished “clean core” highlights their dedication to helping businesses future-proof their systems.

This marks a significant shift for SAP, finally bridging the gap between AI innovation and users’ pressing needs—a trajectory that other enterprise software companies should observe closely.

The IBM Quantum Revolution

Moving on to quantum computing, IBM's Quantum Developer Conference showcased some groundbreaking milestones. Among the highlights were the successful demonstration of error correction enabled by the Loon processor and further hardware advances with the new Nighthawk processor. These developments reveal not only the technical leaps happening in the quantum computing space but also a growing interest in equipping developers to capitalize on this technology.

With over 500 quantum developers in attendance, it’s evident that quantum computing is reaching an inflection point, where theoretical research is transforming into practical applications. For businesses, harnessing quantum computing could soon mean better performance optimization, predictive analysis on an unprecedented scale, and a competitive edge in data-heavy industries. However, businesses must prepare for the learning and skill transition this shift demands.

Advertising Innovations and Business Strategy Pivots: Oracle Analytics-Driven Geotargeting Campaigns

ConstellationTV episode 118 features an interview with Scott Searcy, a CX Supernova Award winner. Scott shared insights into his geotargeting project, built on Oracle Analytics Cloud, which is a prime example of how businesses can creatively leverage data. Born out of a zoning change that permitted new digital assets, the project saw Scott's team use advanced analytics tools to overlay political boundary data onto latitude-longitude maps.

The project resulted in highly precise advertising campaigns that targeted swing states during political races. By identifying these opportunities, Scott’s team not only increased revenue but also demonstrated the power of combining real-world scenarios like zoning changes with enterprise-level analytics.

The implications go beyond politics. These tools have expanded to more versatile use cases, including live events, sporting events, and concert series. Leveraging geotargeting to maximize revenue while minimizing waste is a masterclass in creative, data-driven advertising. Enterprises should examine this approach to optimize their advertising efforts.
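The core overlay step, checking which advertising assets fall inside a given political or zoning boundary, can be sketched as a point-in-polygon test. This is an illustrative pure-Python version with made-up coordinates, not Oracle Analytics Cloud's actual implementation:

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: does the point (lon, lat) fall inside the polygon?

    polygon is a list of (lon, lat) vertices; the shape is closed implicitly.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count how many edges a rightward horizontal ray from the point crosses.
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical rectangular district and billboard locations for illustration
district = [(-90.0, 40.0), (-88.0, 40.0), (-88.0, 42.0), (-90.0, 42.0)]
billboards = {"B1": (-89.0, 41.0), "B2": (-85.0, 41.0)}
in_district = {name: point_in_polygon(lon, lat, district)
               for name, (lon, lat) in billboards.items()}
```

Production tools typically delegate this to a geospatial engine, but the principle is the same: join asset coordinates against boundary polygons, then route campaigns only to assets inside the target area.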

Freshworks’ Strategic Pivot

Liz Miller provides commentary on Freshworks’ recent strategic pivot—perhaps the most instructive segment of this episode for business leaders grappling with organizational complexity. Freshworks, which had previously aimed for an ambitious full-platform engagement solution, recently shifted focus toward simplifying service delivery.

Freshworks identified a core challenge many organizations face: the chaos and complexity of managing expansive platforms across diverse audiences. By re-centering its efforts on improving both employee-facing and customer-facing service delivery, Freshworks embraced simplicity as its key to growth.

Liz captured the essence of this shift with a standout quote: “Freshworks has centered on this idea that complexity is the enemy of growth.” This philosophy not only redefines their offerings but showcases a leadership team willing to pivot boldly when the market demands it.

The company’s decision reflects the need for intentional focus in today’s fast-evolving landscape. In an era where companies often chase ambitious platform solutions, Freshworks demonstrates the value of listening to the market and staying sharply attuned to user needs—something many organizations can learn from.

Central to this pivot is their Chief Marketing Officer, Mika Yamamoto, whose leadership played a crucial role in reshaping Freshworks’ vision. Mika’s ability to address the dual challenges of internal complexity and external customer satisfaction epitomizes the kind of leadership required in turbulent markets.

Practical Applications of Artificial Intelligence

Throughout the segment, a recurring theme emerges: AI’s transition from hype to application. The conversation highlights recent trends where businesses are gradually moving away from fretting about AI’s potential to “replace jobs” toward exploring its more practical use cases to address persistent challenges.

Liz Miller observes that organizations are now asking tougher questions, like: “How can AI simplify my daily operations rather than make them more complex?” AI implementations in areas such as marketing, project management, and customer service are proving far more impactful than lofty, open-ended claims about what AI might someday solve.

Yet, there’s a layer of disillusionment, too. While the technology’s potential remains vast, Liz warns that businesses must set realistic expectations for its current capabilities, especially regarding the tools and solutions promised to revolutionize workflows. Understanding the present limitations of AI is just as important as understanding its possibilities.

What Businesses Can Learn from Episode 118

  1. Enterprise Tech Success Lies in Addressing Real Problems: Both SAP and IBM serve as shining examples of how to effectively innovate within the tech space. From developing AI models rooted in functionality to pushing boundaries in quantum computing while providing accessible educational resources, these companies demonstrate that solving tangible business problems should take precedence over chasing headlines.

  2. Creativity in Advertising is Grounded in Data: Scott Searcy’s geotargeting campaign offers lessons beyond just political advertising. Businesses must learn to combine analytics with real-world constraints to uncover untapped opportunities, whether through location-based campaigns, event planning, or revenue optimization strategies.
     
  3. The Boldness to Pivot Pays Off: Freshworks’ strategic simplification serves as a powerful narrative for businesses grappling with uncertainty and internal chaos. Practicing the discipline of stepping back, re-focusing, and simplifying offerings can yield greater long-term growth—and Freshworks is proving that firsthand.
     
  4. AI Hype Must Translate into Practicality: While AI remains a dominant force in enterprise solutions, its role is shifting toward practicality. Organizations must evaluate AI implementations not just for their innovation potential but for their ability to solve everyday challenges.

IBM, Cisco aim to scale, network quantum systems


IBM and Cisco said they will build a connected network of quantum computers that aims to scale to hundreds of thousands of qubits.

The companies plan to demonstrate multiple networked quantum computers within five years.

Both companies' quantum computing plans start with hybrid quantum-HPC systems before moving to fully quantum architectures. IBM's quantum efforts are well known, and Cisco has been ramping up its quantum networking work.

IBM and Cisco, two classical computing giants, are battling pure-play quantum companies that are also investing heavily in networking quantum systems. For instance, IonQ has acquired multiple startups focused on quantum networking.

See: Quantum computing pure plays duel with giants, rivals

Key points:

  • IBM and Cisco plan to design a network of large-scale fault tolerant quantum systems by the early 2030s.
  • The companies will demonstrate networked quantum systems within five years.
  • The quantum network will serve as the base for a quantum internet, communications and sensing by the late 2030s.
  • IBM is focused on quantum computing with Cisco delivering on networking.
  • According to the companies, this quantum network could run with "potentially trillions of quantum gates."

Jay Gambetta, Director of IBM Research and IBM Fellow, said the Cisco partnership aligns with the company's roadmap. "By working with Cisco to explore how to link multiple quantum computers like these together into a distributed network, we will pursue how to further scale quantum's computational power. And as we build the future of compute, our vision will push the frontiers of what quantum computers can do within a larger high-performance computing architecture," said Gambetta.

Vijoy Pandey, GM/SVP at Outshift by Cisco, said quantum useful scale is about networking as much as compute. "IBM is building quantum computers with aggressive roadmaps for scale-up, and we are bringing quantum networking that enables scale-out," he said.

The networked quantum computing system will aim to entangle qubits from multiple separate quantum computers located in distinct cryogenic environments.

IBM and Cisco will have to create new connections and a supporting software stack that can preserve quantum states, distribute entanglement resources and network systems.
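The resource this stack must preserve and distribute is entanglement. A Bell pair, the basic unit of entanglement that networked quantum machines would share, can be sketched with a tiny pure-Python statevector simulation; this illustrates the physics only, not IBM's or Cisco's software:

```python
import math

def matvec(m, v):
    """Multiply a matrix (list of rows) by a state vector."""
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

s = 1 / math.sqrt(2)
# Hadamard on qubit 0, with basis states ordered |00>, |01>, |10>, |11>
H0 = [[s, 0, s, 0],
      [0, s, 0, s],
      [s, 0, -s, 0],
      [0, s, 0, -s]]
# CNOT with qubit 0 as control, qubit 1 as target (swaps |10> and |11>)
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1, 0, 0, 0]                      # start in |00>
state = matvec(CNOT, matvec(H0, state))   # Bell pair (|00> + |11>) / sqrt(2)
probs = [abs(a) ** 2 for a in state]
# Only |00> and |11> carry weight: measuring one qubit fixes the other,
# which is the correlation a quantum network must carry between machines.
```

In the IBM-Cisco scheme, the two qubits of such a pair would live in separate cryostats, which is why the interconnect must carry quantum states faithfully rather than just classical bits.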

The to-do list for this collaboration is extensive.

  • The companies said they will explore how to transmit qubits over longer distances with various optical technologies and transfer quantum information.
  • IBM will build a quantum networking unit to serve as the interface to the quantum processing unit (QPU). This quantum networking unit will take stationary quantum information in the QPU and convert it into information that can be sent over a network.
  • Cisco will develop a high-speed software protocol that can reconfigure network paths for quantum information.
  • The companies will create the hardware and open source software to act as a network bridge.
  • IBM will work with the Superconducting Quantum Materials and Systems Center (SQMS) to figure out how many quantum networking units could be used within quantum data centers and demonstrate the connections within the next three years.

In a diagram these networked quantum computers would look like this:
