
IBM launches z17 mainframe, eyes AI workloads

IBM launched its z17 mainframe, which pairs its Telum II processor and Spyre Accelerator with AI tools for inference workloads, as well as AI agents from its watsonx platform.

While many folks think of mainframes as creaky old systems designed to process transactions without downtime, the z systems are big business for IBM and drive a strong upgrade cycle. In addition, IBM has reinvented the mainframe a few times, first for cloud workloads and now for AI.

IBM's bet is that the z17, which will be available June 18, can take on new workloads beyond transaction processing. IBM said the system can score 100% of its transactions in real time and process 50% more AI inference operations per day than the z16.

According to IBM, the z17 can handle more than 250 AI use cases, including assessing loan risk, managing chatbot services, supporting medical image analysis and curbing retail shrink.

For IBM, the z17 is also a showcase for its integrated stack of R&D, software, AI and hardware.

Components include:

  • Telum II processor, which has a second-gen on-chip AI accelerator with a 1 millisecond response time.
  • IBM Spyre Accelerator that will be available in the fourth quarter of 2025 via PCIe card. Spyre will bring genAI tools to the mainframe and run assistants leveraging data on the system.
  • Spyre will enable z17 to run a host of IBM Granite models natively.
  • A series of AI agents and assistants from IBM watsonx including watsonx Code Assistant for Z and Assistant for Z. Watsonx Assistant for Z will be integrated into Z Operations Unite for AI-driven incident detection and resolution.
  • Z Operations Unite, available in May, combines logs from IBM Z in OpenTelemetry format to streamline operations and detect anomalies.
  • IBM will also include HashiCorp tools to standardize secrets management.
  • z17 will also include data security tools via Telum to identify and protect data using natural language.
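Z Operations Unite's trick is normalizing disparate IBM Z logs into one shape. A minimal sketch of what that normalization might produce: the `to_otel_log_record` helper and its values are hypothetical, but the field names (`Timestamp`, `SeverityText`, `Body`, `Resource`, `Attributes`) come from the OpenTelemetry log data model.

```python
# Illustrative sketch: wrapping a raw mainframe log line in the
# OpenTelemetry log data model fields that Z Operations Unite emits.
# The helper and the sample values are hypothetical.
from datetime import datetime, timezone

def to_otel_log_record(subsystem: str, message: str, severity: str) -> dict:
    """Normalize a raw log line into OTel log-data-model fields."""
    return {
        "Timestamp": datetime.now(timezone.utc).isoformat(),
        "SeverityText": severity,
        "Body": message,
        # Resource identifies where the log came from (hypothetical attrs).
        "Resource": {"service.name": subsystem, "host.type": "ibm-z"},
        "Attributes": {"log.source": "z/OS"},
    }

record = to_otel_log_record("CICS", "Transaction abend ASRA", "ERROR")
```

Once every subsystem's logs share this shape, anomaly detection can operate on one stream instead of per-format parsers.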

In addition to the AI focus, IBM z17 will feature z/OS 3.2 that will be released in the third quarter. The operating system will support AI capabilities, modern data access methods, NoSQL databases and hybrid cloud data processing.

Constellation Research's take

R "Ray" Wang, CEO of Constellation Research, said:

"While many may have written off the mainframe, there are three reasons this release is significant:

  • Y2Q is closer than we think and the Z is quantum secure.
  • Cost per MIPS/kwh gives the Z very efficient performance.
  • AI is one area where a return to on-premises is a real value prop.

When speaking with existing customers who are using mainframe, they are mostly looking to increase their investment."

Constellation Research analyst Holger Mueller said:

"The typical adage of hardware platforms going to die does not apply to the IBM mainframe. Z has had at least 7 lives and is now back in the AI era. And its core value proposition - bringing data and processing power into one place - is even more attractive in the AI era than in the long-past client-server era. Kudos goes to IBM for innovating and delivering value on System Z for enterprises over decades (APIs, Java, hybrid cloud and more come to mind as highlights of the last 20 years)."


Vint Cerf on AI, critical thinking, the internet's future

Vint Cerf said the introduction of AI is about much more than technology, so much so that it may make sense to bring in a few sociologists, psychologists and anthropologists to help with policy.

Cerf, currently Vice President and Chief Internet Evangelist at Google, contributes to global policy development and is widely known as a father of the internet as co-designer of the TCP/IP protocols and architecture.

In an interview with DisrupTV, Cerf touched on multiple topics including space, AI and future connectivity. "We're entering into a period of abundance of computing and communication capability that will, I think, enable some amazing accomplishments over time," said Cerf. "It's hard to believe that so much has happened over the past 50 years, and there's so much more to go."

Here are the key takeaways from Cerf:

Evolution of internet capacity. "The increasing capacity of the internet to move data" is a significant development, said Cerf. He said there's also a need for higher speeds and optical fiber to accommodate a growing base of internet users.

Expansion beyond Earth. Cerf said "the expansion of the internet to low Earth orbiting satellite systems and off-planet expansion, including the development of an interplanetary internet backbone" is the future of connectivity.

AI's role in technology. Cerf emphasized the importance of AI in "accelerating software development and improving accessibility for people with disabilities." Cerf said AI can enhance user experiences.

Critical thinking in the age of AI. Cerf said "the importance of critical thinking and understanding the provenance of information in the age of AI" is greater now than it ever was. People need critical thinking to tell what's real--and what isn't.

Cerf said:

"When we introduce technology, we should bring with us sociologists and psychologists and anthropologists to help us understand the impact of the technology on the societies that are ingesting it. We didn't really do that with the internet. We really seriously need to do that today, because the technology we're now introducing is artificial intelligence."

Internet responsibility. Cerf said that while the internet has connected like-minded people, it has also enabled harmful activities and filter bubbles. He emphasized "the need for accountability and responsibility" in online interactions.

Internet education. Cerf proposed the idea of an "internet driver's license" to educate people on safe and effective internet use, highlighting the importance of digital literacy.


IBM acquires Hakkoda

IBM said it has acquired data and AI consultant Hakkoda in a deal that will give it data migration platform services. Hakkoda is a big partner for Snowflake.

The purchase gives IBM Consulting more heft in data migrations and transformation. IBM Consulting acknowledges that services to modernize data estates are critical to any future enterprise AI efforts.

Terms of the deal weren't disclosed. IBM has been rounding out its offerings with a series of acquisitions.

IBM said Hakkoda's portfolio includes:

  • Generative AI tools that speed up data modernization projects.
  • A strong customer base in financial services, public sector, healthcare and life sciences.
  • BI modernization tools.
  • Managed services for Snowflake.
  • Award-winning services for Snowflake engagements. Hakkoda is an Elite Snowflake partner as well as an advanced-tier partner of AWS.

IBM said Hakkoda will also expand on IBM Consulting Advantage, a program that uses AI to speed up consulting delivery.



Meta launches Llama 4 suite, ups ante in LLM wars

Meta launched its Llama 4 family of open-weight models, with two models available on AWS, Microsoft Azure and Google Cloud.

With the large language model (LLM) wars well underway, Meta announced Llama 4 on a Saturday so it could have the spotlight for a bit. With LLMs developing at a rapid clip, one advance can be overshadowed by another in just hours.

According to Meta, the first installment of the Llama 4 suite is just the start. The company said more details about its AI vision will be outlined at LlamaCon on April 29.

Here's the Llama 4 family announced so far, which features natively multimodal AI:

Llama 4 Behemoth: A 288B active parameter model with 16 experts and 2 trillion total parameters. Behemoth is in preview and still training, and is Llama's most intelligent teacher model for distillation.

Meta said other Llama 4 models in its suite are distilled from Behemoth, which outperforms GPT-4.5, Claude Sonnet 3.7 and Gemini 2.0 Pro on multiple STEM benchmarks.

Llama 4 Maverick: A 17B active parameter model with 128 experts and 400B total parameters. Maverick, which is available now, is natively multimodal with a 1M-token context length.

Meta said Maverick beats OpenAI's GPT-4o and Gemini 2.0 Flash and posts results comparable to DeepSeek v3 on reasoning and coding.

Llama 4 Scout: A 17B active parameter model available now with 16 experts and 109B total parameters. Scout has a 10M-token context length and is optimized for inference.

Meta said Scout is more powerful than all of the previous Llama models and fits in a single Nvidia H100 GPU. Meta said Scout outperforms Gemma 3, Gemini 2.0 Flash-Lite and Mistral 3.1.
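The active-versus-total parameter counts above come from mixture-of-experts routing: only a few expert subnetworks run per token. A rough sketch of the arithmetic, with an invented shared/per-expert split chosen so the totals land near Maverick's published figures, not Meta's actual layer shapes:

```python
# Illustrative mixture-of-experts arithmetic: a model can hold 400B total
# parameters while running only ~17B "active" parameters per token,
# because the router picks a small subset of experts each step.
# The shared/per-expert split below is hypothetical.
def active_params(shared: float, per_expert: float,
                  n_experts: int, top_k: int) -> tuple:
    total = shared + per_expert * n_experts   # everything stored on disk
    active = shared + per_expert * top_k      # what actually runs per token
    return total, active

# Hypothetical Maverick-style split: ~14B shared params, 128 experts of
# ~3B each, one expert routed per token.
total, active = active_params(shared=14e9, per_expert=3e9,
                              n_experts=128, top_k=1)
```

This is why Scout, with only 16 experts, can fit in a single GPU while still drawing on far more total capacity than its active parameter count suggests.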

Meta's Llama 4 launch is notable because enterprises are using it widely via the big three cloud providers. AWS said Llama 4 is available on Amazon Bedrock, Microsoft Azure features it on Azure AI Foundry and Google Cloud has the LLM family on Vertex AI. Companies are taking Llama models, which are available on Hugging Face and Llama.com, and tailoring them to specific use cases.

Meta CEO Mark Zuckerberg has said the company's goal is to make Llama the flagship open source model and compete with or top proprietary competitors. Meta is also leveraging Llama 4 throughout its properties, so taking it for a spin requires WhatsApp, Instagram or Facebook.

Here's a look at the Llama 4 architecture followed by benchmarks for Maverick.

Constellation Research's take

Constellation Research analyst Holger Mueller said:

"If someone wonders who is the leader in a tech category, watch the announcements rivals make before a major user conference. In this case, Google Cloud Next is looming, and both Microsoft and Meta made their boldest AI announcements yet. In Meta's case, even previewing its Behemoth model is designed to stake out new ground. The Meta pitch of proving Llama across the three major cloud providers appeals to CxOs, as it allows AI automation portability across cloud providers. On the flip side, the gap to Google is clear when it comes to Meta. Llama 4 is 12 months late with the 1M context window and with being multimodal in Llama 4 Maverick. That was the news at Google Cloud Next in 2024."


After volatile first quarter, these 10 questions loom over enterprise technology, CxOs

In enterprise technology you don't always have the answers, but that should never stop you from knowing the right questions. With that in mind, let's ponder a few questions now that the first quarter is closed and the second quarter is just ramping up. These themes are likely to reappear in our news and analysis in the months ahead.

Here’s a look at the big questions for the second quarter.

What's the state of enterprise tech spending? The first quarter earnings calls and outlook for the second quarter and 2025 are about to get really interesting. The enterprises reporting results in the second half of March mostly pointed to either a slowdown or a pause in spending.

Geopolitics are a big reason why enterprises can't plan ahead. You can expect a lot of tech vendors talking about elongated selling cycles, more deliberation over deals and spending pauses.

For a glimpse of how CEOs are on the struggle bus with geopolitical planning, check out Restoration Hardware CEO Gary Friedman, who was volleying with analysts about his supply chain right as President Trump was announcing tariffs. "Like we're just really well-positioned right now. I think housing is the headwind. I guess the stock went down based on some of the numbers we reported and then it got killed because of a--oh shit okay--I just looked at the screen. I hadn't looked at it," said Friedman, who was talking as Vietnam was hit with a big tariff. "It got hit when the tariff came out and everybody can see in our 10-K where we're sourcing from. It's not a secret and we're not trying to disguise it by putting everything in an Asia bucket."

There were a lot of Gary Friedmans this week. And those CxOs all buy enterprise technology.


What projects will get funded in a recessionary/stagflation environment? Optimization has been an ongoing theme in enterprise technology whether it's via automation or artificial intelligence. Some enterprises will look to reload in a downturn for the rebound. For instance, Rocket bought Redfin and Mr. Cooper in a move to consolidate the mortgage industry. These acquisitions are notable since they are also about acquiring the first-party data to feed models.

The time is now to position for the next economic cycle. How many enterprises will be able to reload?

Will mergers and acquisitions be justified based on first-party data? Rocket's acquisition spree (Redfin and Mr. Cooper) is notable because it's focused on buying the first-party data that can be used in its models. However, there's an additional thread to consider here. Rocket covered why the acquisitions made sense based on 30 petabytes of data, but there are also good business reasons to make the purchases beyond training AI models. Rocket's acquisitions are based on industry consolidation and feeding the sales funnel too.


Will agentic AI be a wait-and-see affair? We've had the vendor announcements. We've had the slideware talking about orchestration layers. We've even had vendors rolling out agents throughout their platforms just waiting for customers to use them under a consumption model.

However, CxOs in the Constellation Research network are taking a measured approach to AI agents. Yes, these enterprise leaders are bullish on AI agents and the promise. However, CxOs also realize that their AI agents will need to be cross-platform, have communication standards, and include a heavy dose of process knowledge and automation to really work.

Add it up and it sounds like the agentic AI spending curve is going to be more of a second half of 2025 phenomenon. Also see Constellation Research’s report on AI trends for 2025 and beyond.


Will consumption models from SaaS vendors be welcome? Enterprises are starting to see their traditional SaaS contracts (seats and subscriptions) add a consumption layer to account for agents. ServiceNow CEO Bill McDermott calls this hybrid monetization model a Goldilocks scenario. It's quite possible that consumption models will be a headache for IT budget forecasting.
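The hybrid model can be sketched in a few lines. The rates below are invented, but the budgeting problem is visible: the consumption term varies with agent usage month to month while the seat term stays fixed.

```python
# Sketch of "hybrid monetization": usage-based agent consumption layered
# on top of per-seat subscriptions. All rates are made up for illustration.
def hybrid_bill(seats: int, seat_price: float,
                agent_actions: int, price_per_action: float) -> float:
    subscription = seats * seat_price            # fixed, forecastable
    consumption = agent_actions * price_per_action  # varies with usage
    return subscription + consumption

# 500 seats at $50/seat plus 120,000 agent actions at $0.01 each.
bill = hybrid_bill(500, 50.0, 120_000, 0.01)
```

Forecasting the subscription term is trivial; forecasting the consumption term requires predicting agent usage, which is exactly the new headache for IT budgets.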

Why can't vendors come up with something better than AI studio when it comes to product names? As vendors tout AI agent orchestration platforms the one common thread is the dreaded "AI Studio." Now it makes sense that these vendors need AI studios to design, manage and orchestrate AI agents, but a new name would be lovely.

Will quantum computing enterprise use cases go mainstream in 2025? The quantum computing announcements just keep coming with vendors claiming various breakthroughs and supremacy over classical computing.

So far, 2025 appears to be the year of quantum computing.

Is the AI infrastructure boom now a bubble? This question doesn't necessarily affect enterprises given that the AI infrastructure boom rides with a few companies you can count on two hands--Nvidia, OpenAI, Meta, Google Cloud, AWS and Microsoft Azure with funding from the likes of BlackRock and SoftBank.

But the leverage, big plans and commoditization of models and potentially GPUs may point to turbulence ahead.

Do we need to start sketching out humanoid robotic projects? It’s clearly early, but the intersection of humanoid robotics and AI is going to be interesting.

Will mergers and acquisitions pick up? Google Cloud’s $32 billion purchase of Wiz indicates that the company thinks it can gain regulatory approval. Other acquisitions are more in the tuck-in variety. Nevertheless, the dealmaking appears to be picking up. Given that HPE still can’t get its Juniper acquisition across the finish line, I’d say the jury is still out on M&A activity.


Microsoft updates Copilot, Azure AI Foundry

Microsoft added a bevy of new features to Copilot that enhance its memory, its ability to take action and its research capabilities, and also updated Azure AI Foundry with tools to build AI agent systems.

The news, which arrives as Microsoft celebrates its 50th birthday, comes as large language model (LLM) advancements are happening almost daily. Here's the rundown of Microsoft's Copilot updates. Also see: Google Gemini vs. OpenAI, DeepSeek vs. Qwen: What we're learning from model wars

  • Copilot now has memory and personalization to retain details from user interactions so it can add suggestions, reminders and personalized answers.
  • Microsoft said Copilot can take action on your behalf via Actions and navigate most sites on the web to book tickets and make reservations.
  • Copilot Vision is available on mobile and Windows.
  • Copilot Deep Research has been added to handle research tasks.
  • Microsoft added Copilot Search to bring Bing directly to Copilot as well as Pages, Podcasts and Shopping.

For Azure AI Foundry, which now has more than 60,000 customers, Microsoft outlined the following updates to position it as an agent factory:

  • AI Red Teaming Agent, which is in public preview, will test AI models to uncover safety vulnerabilities.
  • Agent evaluations, also in public preview, will provide risk and quality assessments for AI agent systems.
  • Semantic Kernel agent framework, which is available, simplifies the code developers need to build and coordinate multiple agents in a system.
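The coordination problem such a framework simplifies can be sketched generically: several specialized agents each transform a running result in sequence. This is plain Python for illustration, not the actual Semantic Kernel API; the agent names and handlers are invented.

```python
# Generic sketch of multi-agent coordination: each agent handles one step
# and hands its output to the next. Frameworks like Semantic Kernel
# automate this wiring; the code below is only an illustration.
from typing import Callable, List, Tuple

def run_pipeline(task: str, agents: List[Tuple[str, Callable[[str], str]]]) -> str:
    result = task
    for name, handler in agents:
        result = handler(result)  # each agent transforms the running result
    return result

# Hypothetical agents, each appending its contribution.
agents = [
    ("researcher", lambda t: t + " | findings gathered"),
    ("writer", lambda t: t + " | draft written"),
    ("reviewer", lambda t: t + " | approved"),
]
output = run_pipeline("summarize incident", agents)
```

What the framework adds on top of this loop is the hard part: model calls, tool access, error handling and routing decisions between agents.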



Long live on-prem data centers in the age of AI. We're serious

The enterprise data center was supposed to be dead by now, but the reality is on-premises workloads are getting new life courtesy of AI workloads.

That's a key takeaway from Constellation Research analyst Holger Mueller. In a report, "The Case for Data Centers Is Alive and Well in 2025," Mueller acknowledged the reality for on-premises infrastructure and said "workloads are not moving to the public cloud as fast as the prognosis looked a decade ago, so enterprises need to operate in a hybrid cloud environment, with critical enterprise workloads operating both in the public cloud and on-premises, in the private cloud."

Here are a few reasons why hybrid cloud is here to stay:

  • CxOs have to support a range of workloads. Yes, AI agents get the headlines, but enterprises are taking AI all the way to the edge. Those edge use cases are more on-prem.
  • Data center utilization needs make on-premises more cost effective. Performance in many regions is still better with on-prem data centers even with cloud advances.
  • IT operations are complex and require more heterogeneity. Data centers are a nice hedge for shifting business priorities, leverage vs. public cloud providers and budgeting (capital expenses vs. operating expenses).
  • Compliance is more of an issue due to large software portfolios, data residency and regulations that make data centers more attractive.
  • AI automation will need on-prem platforms and predictable workloads run better on data centers.

Mueller noted that every enterprise will have a different take on on-premises computing. Variables include where SAP workloads will reside, data gravity and innovation from cloud vendors vs. hardware vendors.

Bottom line: Data centers have been declared dead for years, but CxOs can see lower cost of ownership and perks as long as they keep options open with a cloudlike consumption model and can leverage high performance computing.



HOT TAKE: ServiceNow’s Logik.ai Pick Up Signals CPQ’s Shift to Experience

Once thought of as a “seller’s tool,” Configure, Price, Quote (CPQ) solutions have in recent years shifted to far more robust, interactive and dynamic experiential platforms. Once the domain of high-touch consumer commerce motions, CPQ has, thanks to mainstay players heavily investing in customer experience and sleek interfaces, elevated the configuration experience to everything from software to services, without sacrificing the pinpoint accuracy of pricing, discounting and quoting that sellers depend on.

So, it shouldn’t be a shock that on their quest to shake up what cross-enterprise customer relationship management IS, ServiceNow has picked up the AI-forward sales and commerce CPQ darling, Logik.ai. The real question will be how quickly ServiceNow customers reimagine how they can empower selling and commerce no matter where sellers or buyers exist outside of a traditional CRM or Sales environment...and how far CPQ could reach to impact revenue in critical touchpoints like field service, service desk or contact center.

What We Know About the Deal: Not much has been revealed about the financial details of the deal. What we know is that ServiceNow has signed a definitive agreement to purchase the company, founded in 2021, which in January announced year-over-year revenue that tripled in size, with a customer roster of many Fortune 1000 enterprises. We also know that Logik.ai had successfully raised $16 million in Series A (2023) and $25 million in Series B (2024) funding. Coming out of the gate as Logik.io, the AI-powered CPQ had been backed by the likes of Emergence Capital, Salesforce Ventures, and ServiceNow Ventures…which once again has this analyst wondering just how many AI solutions ServiceNow plans on sneaking out from Salesforce. What we do know is that this won’t be the first acquisition rodeo for Logik.ai’s leadership team, who hail from a very familiar name in CPQ. The deal is expected to close in mid-2025, pending regulatory approvals and closing conditions. Check out the official release, and the CR Insights on the announcement that includes thoughts from my colleague covering Revenue and Growth technologies, Martin Schneider.

What Makes Logik.ai so Interesting: From the user interface to the AI recommendations that flow across the entire experience, there is little doubt Logik.ai represents an evolution in the CPQ market. And that also should not be a surprise, as the leadership team are true veterans of the space. Their deep roots in commerce-driven, customer-centric experience carry a tell-tale heritage reminiscent of the literal OG of CPQs, BigMachines, acquired by Oracle and to this day one of the blueprints for experiential interfaces. For BigMachines, the experience was largely differentiated by a keen understanding that these moments of configuration were also opportunities to capture high-fidelity signal from the customer. That is exactly where Logik.ai started and hasn’t stopped innovating, injecting a massive dose of AI and automation without leaving that heritage of experience design behind. For this analyst, Logik.ai had me at conversational quoting and a highly configurable rules engine, but then again, who doesn’t love saving time and focusing on the efficiency of the revenue process. But where Logik.ai gets really interesting is in its ecosystem and how that has lifted CPQ out from that traditional path of order management. With partnerships with Shopify, Adobe and BigCommerce, Logik.ai hits the ground running across multiple facets of the ServiceNow portfolio. And no…I haven’t even TOUCHED on their recently announced subscription management offerings that offer much-needed relief to fast-moving commerce. So bottom line, there is a lot to like here…and a lot that can spread across the ServiceNow ecosystem.

For CX and Strategy Leaders: Yes, ServiceNow is expanding its reach and has emerged as a significant CRM player ideally suited for organizations with complex selling or commerce motions. This addition begins to round out a motion that is omnichannel on purpose, data-driven by design and intentionally leveraging the “platform of platforms” approach to AI and meaningful conversational moments that can and should beget even more meaningful moments. However, this should not be considered a single-function tool or an upgrade to a selling system. That would be far too limiting and overlooks the opportunity to finally answer that age-old question of how to empower sellers in non-traditional commerce and revenue roles. In the past we’ve relied on sales and order management self-service motions to empower field service to upsell or cross-sell. Likewise, we’ve inserted self-service commerce modules into contact center or service desk flows that have often not connected to the customer’s actual real-time journey. What has resulted are sometimes clunky experiences where self-service is most often self-serving, routing customers back to sellers and off their preferred channel or experience of choice.

As CPQ continues to become increasingly experiential, expect to see the field deploying these commerce opportunities not just to move the revenue needle, but also to collect signal and data from customers that have proven vital to future engagement and future opportunity. For CX strategists, ask now about where and how AI tools like Logik.ai are integrated into personalization journeys. How is the signal provided directly by the customer being integrated into CRM, not just to track the transaction but to inform the next best engagement opportunity? Does Service have access to the configuration AND the quote…or are we leaving that front-line resource blind to the customer’s original intentions?

But also don’t ignore the AI and workflows that Logik.ai brings to the table…because rest assured ServiceNow has not. These aren’t just random acts of automation, but should be seen as the next evolution of what a platform of platforms is meant to curate, orchestrate and expose when putting AI to work.

Parting Thoughts: There will always be order management and quoting tools that rightly focus on the effectiveness and efficiency of the seller’s role. These tools set out to manage the complexity of the selling process within an organization. Tools that also manage and orchestrate discounts and the quote-to-cash model remain critical mainstays of the stack, and will continue to manage the complexity of financial processes. But what is important to consider here is where and how AI-driven interactive CPQ tools can manage the complexity of a customer’s needs and expectations. This is ServiceNow resetting the mold of how experience works and flows, and, to be honest, it is totally in line with how Constellation Research has long asserted customer experience strategy SHOULD be viewed: as an enterprise-wide team sport that delivers durable, profitable experiences connecting buyers to brands. In this case, that strategy is decidedly interactive, driven by conversation, rich with data and silly with intentional acts of automation, and it hasn’t forgotten that even the most complex sales and commerce cycles shouldn’t be painful. This is about allowing CPQ to be an outside-in experience tool where everybody wins.


ServiceNow acquires Logik.ai, as it steps up CRM efforts

ServiceNow said it will acquire Logik.ai, which specializes in AI-driven configure, price and quote (CPQ) software.

Terms of the deal weren't disclosed. ServiceNow said that the purchase of Logik.ai will expand its reach into CRM. ServiceNow said Logik.ai will also accelerate its efforts in sales and order management processes.

 

ServiceNow said Logik.ai will bring CPQ tools that include transaction management as well as a rules engine for deals. ServiceNow will integrate Logik.ai into its CRM and Industry Workflows, a fast-growing category for the software vendor.
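What a CPQ rules engine does can be sketched minimally: configuration rules gate which product combinations are valid, then pricing and discounting are applied. The product names, the rule and the prices below are invented for illustration and have nothing to do with Logik.ai's actual catalog.

```python
# Tiny sketch of the CPQ pattern: validate a configuration against rules,
# then price and discount it. Everything here is hypothetical.
PRICES = {"base_license": 1000.0, "premium_support": 300.0, "analytics": 450.0}

def validate(config: set) -> None:
    # Example configuration rule: analytics requires premium support.
    if "analytics" in config and "premium_support" not in config:
        raise ValueError("analytics requires premium_support")

def quote(config: set, discount: float = 0.0) -> float:
    validate(config)
    subtotal = sum(PRICES[item] for item in config)
    return round(subtotal * (1 - discount), 2)

price = quote({"base_license", "premium_support", "analytics"}, discount=0.10)
```

Real engines layer hundreds of such rules plus approval workflows on top, but the core loop, validate then price, is the same.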

The Logik.ai purchase is the latest in a set of tuck-in acquisitions for ServiceNow.

Other key points about the Logik.ai purchase include:

  • Logik.ai features workflows for direct sales, partner, direct-to-business and consumer self-service.
  • Logik.ai's platform is designed to be composable so it can handle large volumes of quotes at scale.
  • ServiceNow was already a Logik.ai technology partner along with other companies including integrations with Salesforce, Oracle and Adobe.
  • ServiceNow is making a big push into CRM with its latest Yokohama release.

Constellation Research's take

Constellation Research analyst Liz Miller said:

"CPQ is an important yet sometimes overlooked part of the customer’s journey and experience. While often seen as a seller’s tool to close deals faster and quote more effectively, CPQ can deliver high fidelity signal from customers about current and future needs. With Logik.ai, ServiceNow brings on a highly configurable, API forward solution that not only ties into their CRM functionality, but could also be a valuable addition to existing service-centric offerings, bringing much needed commerce options into service environments."

Constellation Research analyst Martin Schneider said:

"The addition of CPQ capabilities to ServiceNow‘s expanding CRM portfolio adds an interesting angle, given how the company is approaching the sales side of CRM. These new capabilities will definitely build out their offerings for human-assisted sales, but also enable the ability for companies using ServiceNow to offer interesting self service options where customers can better configure the right products and services they need. These options increase customer satisfaction and reduce operation costs and streamline repeat business with less friction. Another interesting component will be how the company measures these new capabilities with its agentic AI platform to further optimize how product bundles are configured and pricing is optimized."

 


The Shifting Sands of AI: Why Enterprise Leaders Need to Look Beyond OpenAI


A Rapidly Evolving Landscape; OpenAI's Disappearing Moat

We've been watching the generative AI landscape transform at breathtaking speed, and what concerns us most is how quickly the narrative around OpenAI has shifted from "unassailable market leader" to "company facing existential challenges." As leaders who have spent our careers at the intersection of technology, policy, and enterprise strategy, we believe that organizations making multi-million dollar AI investments need to understand the broader context beyond the marketing hype.

The concept of a "moat" in business refers to sustainable competitive advantages that protect a company from competitors. OpenAI's initial moat was built on first-mover advantage, technical superiority, and massive funding. All three pillars are now showing significant cracks.

Microsoft, OpenAI's primary backer, has begun testing outside models from xAI, Meta, and even Chinese company DeepSeek. Simultaneously, Apple appears to be reconsidering its OpenAI partnership, now engaging with Google about Siri integration. These moves by two of the world's most valuable companies signal serious concerns about OpenAI's trajectory.

The technical superiority argument is also collapsing. OpenAI's rushed GPT-4.5 release shows a 30% error rate—significantly worse than both Anthropic's Claude 3.7 and xAI's Grok3. When your core product is underperforming relative to competitors, enterprise customers take notice.

 

Competition Is Intensifying; The Economics Don't Add Up

While OpenAI struggles, competitors are gaining momentum. Anthropic secured a $3 billion investment from Google and released Claude 3.7, which many consider technically superior to OpenAI's offerings. Elon Musk's xAI launched Grok3 with impressive deep research capabilities. Even OpenAI's former CTO, Mira Murati, launched Thinking Machines Lab and raised $2 billion at a $9 billion valuation in just two weeks.

And we can't ignore developments from China. Within the last few weeks, a Chinese company announced what it described as the world's first fully autonomous AI agent, called Manus. Unlike some overhyped Western announcements, Chinese AI capabilities have generally delivered on their promises. This represents both competitive and geopolitical considerations for enterprise leaders.

The financial picture is equally concerning. OpenAI is reportedly burning through $1 billion monthly and could lose up to $44 billion by next year. Sam Altman himself admitted they lose money on every $200/month ChatGPT subscription. Their recent announcement of enterprise offerings priced between $2,000-$20,000 monthly appears to be a desperate attempt to stem these losses.

This pricing strategy reveals a company pivoting toward enterprise customers out of necessity rather than strength. But this market is already dominated by Microsoft, Amazon, and Google, who have decades-long relationships with Fortune 500 companies. OpenAI faces an uphill battle against entrenched competitors with deeper pockets and broader offerings.

Despite the recent headline-grabbing $40 billion funding round that catapulted OpenAI's valuation to $300 billion and reports that the company's revenue has grown by 30% in three months, the company still doesn't expect to break even until 2029—four years from now! This timeline raises serious questions about the sustainability of their business model, especially as they continue to burn through cash at an alarming rate.

In a telling strategic pivot, OpenAI has also announced plans to launch an open-weights reasoning model that developers can run on their own hardware. This represents a significant departure from their closed system subscription model and suggests an acknowledgment that their current approach may not remain competitive in the long term. This move appears to be a course correction in response to mounting pressure from both open-source alternatives and competitors offering more flexible deployment options.

 

Strategic Implications for Enterprise Leaders

For CEOs, CTOs, CIOs, and CMOs, these developments necessitate a more sophisticated approach to AI strategy. The days of simply "partnering with OpenAI" as a complete AI strategy are over. We believe enterprise leaders need to consider:

  • Geopolitical factors: How will US-China tensions affect your AI supply chain? What regulatory frameworks are emerging in different regions?

  • Economic sustainability: Are your AI partners financially viable for the long term? What happens if they significantly raise prices or pivot their business models?

  • Technical diversification: How can you build an AI architecture that isn't dependent on a single provider?

Enterprise clients can implement what we call a "multi-modal, multi-model" approach. This means leveraging different AI models for different use cases and maintaining the flexibility to switch providers as the landscape evolves. The companies that will win in the AI era aren't those that pick the "right" vendor today, but those that build adaptable AI architectures.
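The "multi-modal, multi-model" approach described above can be sketched as a thin routing layer that keeps vendor choice out of application code. This is a minimal illustration, not a reference to any real vendor SDK; all class and provider names here are hypothetical, and in practice each callable would wrap a different provider's API client.

```python
# A minimal sketch of a provider-agnostic "multi-model" routing layer.
# Every name below is a hypothetical illustration, not a real vendor API.
from dataclasses import dataclass, field
from typing import Callable, Dict

# Each provider sits behind the same callable signature, so swapping
# vendors is a configuration change, not an application rewrite.
CompletionFn = Callable[[str], str]


@dataclass
class ModelRouter:
    # Maps a use case (e.g. "summarize", "code") to a provider callable.
    routes: Dict[str, CompletionFn] = field(default_factory=dict)
    # Fallback used for any use case without an explicit route.
    default: CompletionFn = staticmethod(lambda prompt: "(no provider configured)")

    def register(self, use_case: str, fn: CompletionFn) -> None:
        self.routes[use_case] = fn

    def complete(self, use_case: str, prompt: str) -> str:
        # Route to the registered provider, or fall back to the default.
        return self.routes.get(use_case, self.default)(prompt)


# Stand-in providers; real implementations would call different vendor APIs.
router = ModelRouter()
router.register("summarize", lambda p: f"[vendor-a] summary of: {p}")
router.register("code", lambda p: f"[vendor-b] code for: {p}")

print(router.complete("summarize", "Q3 earnings call"))
print(router.complete("translate", "bonjour"))  # unrouted: uses the default
```

The design point is that switching a use case from one provider to another touches only the `register` calls, which is the flexibility the strategy above argues for as the vendor landscape shifts.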

OpenAI's current valuation approaching $300 billion seems increasingly disconnected from economic reality. While they deserve credit for catalyzing the current AI revolution, enterprise leaders need to recognize that we're entering a new phase where multiple players will drive innovation.

The next 18 months will be critical. We'll see consolidation among smaller AI companies, continued heavy investment from tech giants, and potentially surprising moves from nation-states viewing AI as critical infrastructure. Enterprise leaders need to stay informed not just about the technology, but about these broader market and geopolitical dynamics.

The bottom line for enterprise leaders: your AI strategy needs to be as sophisticated as the technology itself.

Look beyond the hype, consider the full spectrum of factors at play, and build flexibility into your approach. We believe the latest "wave" of the current AI revolution is just beginning, and the winners will be those who navigate its complexities with clear-eyed strategic thinking.

 
