Results

Informatica says it'll hit Q1 targets, says not actively in acquisition talks

Informatica said its first quarter results will be at the upper end of its guidance given in February and is "not currently engaged in any discussions about being acquired."

Following its fourth quarter results, Informatica projected first quarter revenue of $375 million to $395 million with subscription annual recurring revenue of $1.135 billion to $1.155 billion. Cloud subscription ARR was projected to be between $645 million and $655 million.

Informatica also reaffirmed its 2024 financial outlook and said Chief Product Officer Jitesh Ghai is leaving to pursue an executive position at another company.

The outlook and statement from Informatica lands amid reports that the company was in talks to be acquired by Salesforce. The Wall Street Journal initially reported Salesforce was in advanced talks to acquire Informatica, but later said the two parties couldn't agree on a price.

Reports of a Salesforce-Informatica combination also gave investors a chance to vote with their money. Salesforce shares took a hit on the news.

Informatica's position in the market has improved because data management platforms are critical for enterprises adopting artificial intelligence applications. In addition, Informatica is viewed as a neutral party among multiple enterprise applications.

An acquisition by Salesforce would have altered that neutrality. Informatica would have had some overlap with Salesforce's MuleSoft unit but would have been largely additive in data integration, metadata management, data governance and master data management. Informatica would have also turbocharged Salesforce's fast-growing Data Cloud.


Financial services firms see genAI use cases leading to efficiency boom

Generative AI use cases are proliferating in financial services, illustrating a somewhat counterintuitive trend: heavily regulated industries appear to be better set up for artificial intelligence because they have their data and controls in order.

Goldman Sachs CEO David Solomon said he's bullish on generative AI as an investment and financing theme as well as for internal operations. Solomon appeared at the Google Cloud Next keynote and then followed up on the bank's first quarter earnings call. Goldman Sachs CIO Marco Argenti outlined the firm’s AI thoughts nearly a year ago. Solomon noted:

"For our own operations, we have a leading team of engineers dedicated to exploring and applying machine learning and artificial intelligence applications. We are focused on enhancing productivity, particularly for our developers, and increasing operating efficiency while maintaining a high bar for quality, security, and controls."

JPMorgan Chase CEO Jamie Dimon also devoted considerable space in his shareholder letter to genAI. The company has more than 400 use cases in production. "We're also exploring the potential that genAI can unlock across a range of domains, most notably in software engineering, customer service and operations, as well as in general employee productivity," said Dimon.

What's interesting about genAI use cases is that regulated industries can move ahead of other industries. Why? Regulated industries already have the data best practices and controls required for generative AI deployments. That case was made by Matt Wood, AWS VP of AI, in a recent meetup.

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

Data from Stanford University's 2024 AI Index Report highlights how financial services is a standout when it comes to embedding AI into the business. Citing McKinsey data, the report shows how the industry stands out. Here are a few charts to ponder.

At Google Cloud Next, Bernd Leukert, Chief Technology, Data and Innovation Officer at Deutsche Bank, and KeyBank CIO Dean Kontul put a few anecdotes around the use case data. They outlined the use cases being deployed. Kontul noted that 2023 was more about banks navigating interest rates than generative AI. But 2024 features genAI as more of a priority.

Leukert and Kontul said generative AI is a technology where enterprises can't afford to wait because the technology will change business. "I think you have to pick your spot on the hype curve and look at what it can do for operational efficiency," said Kontul. "You have to control for costs, but generative AI use case is going to be real, tangible expense takeouts and eventually remove customer friction so there will be use cases around revenue as well."

Other realities driving use cases for generative AI include:

  • High-level sponsorship internally. CEOs want genAI, and that top-down support will move proofs of concept to production.
  • Generative AI intersects with process improvement and automation.
  • GenAI projects will require an ongoing conversation with employees about their roles in the future, how to upskill and reskill and remain with enterprises in different capacities. Leukert said his bank "introduced as a principle that genAI is augmenting human capabilities, not replacing the human."

"There is so much acceleration of movement with generative AI," said Leukert.

Here's a look at some of the use cases highlighted by big banks in recent days.

Document processing to execution workflows. Leukert said generative AI's promise is that it can take thousands of documents coming from customers, process them and then take actions.

Any use case that can be deployed once and then scaled horizontally. Leukert said Deutsche Bank looked at that document processing use case and then noted that it was applicable across multiple services. "We reached out to the business and said let's collect use cases and cluster them into categories and then say if it works in that category for that type of a theme, then we know it is possible to be applicable across multiple application areas across multiple themes in that category," he said.

Business re-engineering. Kontul said genAI will transform business, so it makes sense to think beyond individual use cases. Efficiency is the initial target as genAI can make marketing, software, documentation and code development more efficient. "GenAI will touch every employee in some way at the bank and will do it in a more intuitive way than other technologies in the past," said Kontul.

Risk management. Leukert said generative AI is enabling the bank to be a stronger adviser in times of crisis and better manage risks. For instance, Deutsche Bank has been pouring data into its risk management to better manage liquidity demand during crises (Covid-19, Russia's war with Ukraine) and model next moves. "We think that banks need to manage risks at a much more detailed level than they have in the past to be prepared for the unknown," said Leukert.

Research and data collection. Leukert said content management is a primary use case especially for Deutsche Bank Research, which provides research reports to customers. For analysts, 80% of the work is data collection and 20% is digesting and building the report. Generative AI can now automate much of the data collection and build content to be turned into a report.

Compliance. Kontul said KeyBank has a pilot using genAI to keep apprised of all regulatory changes across states that affect the bank's products.

Engineering productivity. Kontul and Leukert both said their banks were pursuing genAI for coding and engineering productivity. Developer productivity also has the attention of the big players such as Goldman Sachs and JPMorgan Chase.


Equifax bets on Google Cloud Vertex AI to speed up model, scores, data ingestion

Equifax said it is deploying Google Cloud's Vertex AI platform across its systems as it accelerates data ingestion as well as new product launches.

Speaking on Equifax's first quarter earnings call, CEO Mark Begor said driving AI innovation is key to the company's growth goals via products such as Ignite and Interconnect. Begor said:

"During 2024, we're deploying both Equifax proprietary explainable AI along with Google Vertex AI across Ignite, Interconnect and our global transaction systems.

For Equifax, Vertex AI enables faster and more predictive model development on our Ignite platform. And for our clients, Ignite, which combines data analytics and technology into one cloud-based ecosystem, customers can connect their data with our unique data through our identity resolution process to gain a single holistic view of consumers."

Constellation Research previously covered Equifax's cloud transformation in a customer story. Equifax also was a reference account at Google Cloud Next.

Equifax reported first quarter revenue of $1.39 billion, up 7% from a year ago, with net income of $124.9 million.

Those platforms, combined with Equifax's broad reach into datasets, mean faster ingestion and analytics as well as new products and services.

Begor said Equifax has access to 100% of the US population through its data sets in a single data fabric. He added that Equifax's cloud infrastructure is processing data 5x faster than its legacy applications could.

Equifax also plans to complete its cloud transformation, close data centers and save $300 million a year in 2024. From there, Equifax will focus on growing its EFX.AI offerings with higher-performing models, scores and data products.

"Completing the cloud transformation also frees up our team to fully focus on growth and expanding innovation, new products and new markets. Our progress towards completing the cloud is gaining momentum with over 70% of our total revenue in the new Equifax Cloud at the end of the quarter," said Begor. "And we're focused on executing the remaining steps to reach 90% with Equifax revenue in the cloud by year-end."



Meta launches Llama 3, wide availability with more versions on deck

Meta launched its Llama 3 open source large language model and said it will be available on AWS, Databricks, Google Cloud, Microsoft Azure, Snowflake, IBM WatsonX and Nvidia NIM, with support from enterprise hardware platforms.

Yes, we're in the age of weekly LLMs that leapfrog each other, but Llama 3, which will initially come in 8B and 70B parameter versions with more on deck, is of interest to enterprises.

Why? Companies are likely to look to capable open source LLMs and then fine-tune them with enterprise data. Llama 3 represents a backdoor enterprise play for Meta.

In a blog post, Meta outlined the Llama 3 effort, which is a big leap over Llama 2. "Improvements in our post-training procedures substantially reduced false refusal rates, improved alignment, and increased diversity in model responses. We also saw greatly improved capabilities like reasoning, code generation, and instruction following making Llama 3 more steerable," said Meta.

Meta added that it evaluated Llama 3 based on prompts covering 12 use cases including brainstorming, advice, coding, creative writing, extraction and summarization to name a few. In this use case testing scenario, Llama 3 70B topped Claude Sonnet, Mistral Medium and Llama 2.

According to Meta, Llama 3 is being deployed across its applications including Facebook, Instagram and WhatsApp. Llama 3 is also available for a spin on the web.

Going forward, Meta said a 400B parameter model is training and it'll "release multiple models with new capabilities including multimodality, the ability to converse in multiple languages, a much longer context window, and stronger overall capabilities."


Infosys acquires In-tech, posts mixed Q4 with genAI progress

Infosys CEO Salil Parekh said the services provider is landing large deals and "seeing excellent traction with our clients for generative AI work," but its fourth quarter was mixed. Infosys also said it would acquire In-tech, an engineering and R&D services provider catering to the German automotive industry.

The company reported fourth-quarter earnings of $958 million on revenue of $4.56 billion, flat from a year ago. Wall Street was expecting Infosys to report fourth quarter earnings of 17 cents a share. For the year ended March 31, Infosys reported earnings of $3.17 billion on revenue of $18.56 billion.

For fiscal 2025, Infosys projected revenue growth of 1% to 3% with operating margins of 20% to 22%.

Constellation Research CEO Ray Wang said Infosys is facing a bevy of moving parts that the company will have to navigate. "The combination of AI arbitrage, margin compression, and exponential efficiency is having an effect on the overall market.  Constellation expects that the overall service market will be flat to single digit growth for the next year," said Wang.

Constellation ShortList™ Digital Transformation Services (DTX): Global

Indeed, Infosys outlined a bevy of items on its earnings call. During the fourth quarter, Infosys said it had "a rescoping and renegotiation of one of the large contracts in the financial services segment." That renegotiation led to a 1% revenue hit, but 85% of the contract continued as is.

On a conference call, Parekh said fiscal 2024 brought in $17.7 billion in large deals. These deals revolve around cost efficiency and consolidation.

Here's a look at what Infosys is facing.

The good

Generative AI remains a highlight for Infosys. Parekh said:

"We're working on projects across software engineering, process optimization, customer support, advisory services and sales and marketing areas. We're working with all market-leading open access and closed large language models.

As an example, in software development, we've generated over 3 million lines of code using one of generative AI large language models. In several situations, we've trained the large language models with client specific data within our projects. We've embedded generative AI in our services and developed playbooks for each of our offerings."

Public and private cloud migrations remain a priority. "We continue to work closely with the major public cloud providers and on private cloud programs for clients. Cloud with data is the foundation for AI and generative AI and Cobalt encompasses all of our cloud capabilities," said Parekh.

Data and automation. Parekh said the acquisition of In-tech played into the data strategy for Infosys. "We see data structuring, access, assimilation critical to make large language models and foundation models to work effectively, and we see good traction in our offering to get enterprises data ready for AI," he said.

The challenges

Jayesh Sanghrajka, CFO of Infosys, said the company saw 180 basis points of margin compression quarter over quarter due to the renegotiation of the large contract, salary increases, brand building and visa expenses.

Infosys offset the margin hit somewhat through lower post-sales customer support costs and an efficiency effort called Project Maximus.

On the economy, Sanghrajka said:

"We continue to see macroeconomic effects of high inflation as well as high interest rates. This is leading to cautious spend by clients who are focusing on investing in services like data, digital, AI and cloud."

Industry demand

Sanghrajka said industry demand was mixed with strength in financial services and manufacturing as well as retail.

Financial services: "Financial services firms are actively looking to move workloads to cloud, pipeline and deal wins are strong and we are working with our clients on cost optimization and growth initiatives."

Manufacturing: "There is increased traction in areas like engineering, IoT, supply chain, smart manufacturing and digital transformation. In addition, our differentiated approach to AI is helping us gain mind and market share. Topaz resonates well with the clients. We have a healthy pipeline of large and mega deals."

Retail: "In retail, clients are leveraging GenAI to frame use cases for delivering business value. Large engagements are continuing S/4HANA and along with infra, apps, process and enterprise modernization. Cost takeout remains primary focus."

Communications: Clients remain cautious, and budgets are tight. Sanghrajka said cost takeout, AI and database initiatives may show promise.

Overall, Sanghrajka said Infosys should benefit with large deals recently won as well as AI. "We are witnessing more deals around vendor consolidation and infra managed services. Deal pipeline of large and mega deals is strong due to our sustained efforts and proactive pitches of our cost takeouts and digital transformation, etc., across the subsectors," he said.



SAS launches industry-focused models, Model cards that serve as AI nutrition labels

SAS said it will launch a series of AI models that are lightweight and focused on industry use cases. The company also added generative AI features to its Viya platform and unveiled "nutrition labels" for models.

The news, outlined at the company's SAS Innovate conference in Las Vegas, is part of the company's broader investment in AI.

"Once seen as the laggard in the industry, the decades of experience in mastering data by SAS is now very valuable in the packaging of data models for easy consumption for customers for AI.  They are making it easier for customers to put AI to work," said Constellation Research CEO Ray Wang. 

Here's the breakdown of what was announced:

Industry-focused models. SAS said it will roll out a series of models for individual licensing, starting with an AI assistant for warehouse space optimization designed to let nontechnical users optimize and plan faster.

The general idea of the industry AI models is to give enterprises something they can deploy quickly with low overhead costs. SAS will target financial, healthcare, manufacturing and public sector AI models.


SAS' bet is that it can move beyond large language models and drive value with industry-proven AI models for fraud detection, supply chain, document conversation and healthcare payments to name a few.

SAS Viya gets generative AI tools. SAS Viya will get trustworthy genAI tools and introduce a synthetic data generator called SAS Data Maker.

Viya has genAI orchestration tools to integrate external models, Viya Copilot for developer, data science and business productivity, Data Maker, which is in private preview, and genAI features in SAS Customer Intelligence 360.

Model cards and AI governance. SAS also said that it will offer model cards, which are a nutrition label for AI designed to flag bias and model drift, as well as AI governance services.

SAS said model cards will arrive as a Viya feature in mid-2024.

SAS Viya Workbench goes GA. SAS announced the general availability of SAS Viya Workbench, aimed at model developers. Viya Workbench is a self-service compute environment for data prep, analysis and model development. It will be available by the end of 2024 with SAS and Python initially and R to follow. Viya Workbench has two development environment options: Jupyter Notebook/JupyterLab and Visual Studio Code.

AWS and SAS expand partnership. SAS said it has expanded its hosted managed services to AWS including SAS Viya. SAS' full product suite is available on AWS. 

Constellation Research’s take

Andy Thurai, analyst at Constellation Research, said:

“Viya Copilot is a useful offering that can help users with multiple tasks and is likely to be particularly useful in knowledge gap analysis and data wrangling tasks. Viya Copilot could be useful in reducing the manual tasks that take up data scientist time.

SAS Data Maker, in private preview, can help users create synthetic data but only in tabular format. The problem is that AI needs a lot of unstructured data, which is harder to generate. SAS Data Maker may not be of much use to many organizations that need synthetic data for AI today. I hope SAS can eventually get there.

Packaged AI models and industry-specific lightweight models are interesting. It is notable that more vendors seem to be moving toward smaller models. Google just last week announced a specialized-models concept that can run on edge and other lightweight locations.

There is an argument that specialized, lightweight models may underperform without augmentation. These models need to be heavily trained on industry-specific data for them to be useful. Model cards and AI governance advisory services can help enterprises improve AI governance. However, there are smaller startups such as Guardrails AI that offer much broader offerings in this space.

SAS’s AI initiatives are noteworthy and should interest the company’s customers. But the AI market is moving fast and many other vendors have already announced more advanced and game-changing features.”

Constellation Research analyst Doug Henschen gave his take on Viya Copilot and SAS Data Maker as well as the big picture. 

"Both Viya Copilot and SAS Data Maker are in private preview at this point, so I’d say they are potentially significant. The part that stands out for me is Data Maker, as SAS is one of the few vendors that is talking about and addressing the need for synthetic data generation. Constellation believes data scarcity will limit the accuracy and effectiveness of AI-based systems. SAS is one of the few companies talking about this capability in the context of their generative AI capabilities."

"SAS’s Copilot is similar to what many vendors have announced, and what a few now have available in public preview or even general availability. The capability that is somewhat differentiated is SAS Data Maker. All the big cloud vendors and big companies like IBM support synthetic data generation, but they don’t tend to talk about it in the same context as natural language-based GenAI. It’s a technique in which you feed the AI samples of data that you might have at a relatively small scale and the system will then generate similar, non-privacy-sensitive data sets for use in training on a massive scale." 

"SAS has taken a conservative approach, but it has moved more quickly to infuse GenAI into its Customer 360 app, because that’s where competitors including Adobe and Salesforce have been pushing GenAI aggressively. Large customers doing analytics and AI at scale do not switch horses quickly based on this or that hot new feature. In fact, plenty of companies and CXOs are being very cautious about GenAI. What you will see, and what we have already seen over the years, is innovation teams doing experiments with cutting-edge vendors or cutting-edge capabilities provided by cloud vendors, for example. So SAS has to keep up. I see the company’s Generative AI Orchestration announcement as a signal to customers that SAS will enable customers to tap into a proven stable of open models when they are ready to pursue GenAI at scale.

"The pace of innovation is constantly accelerating, particularly in GenAI, so I’d like to see these private-preview announcements move into public preview and general availability as quickly as possible. I’d also like to see more detail on the portfolio of models SAS plans to make available for orchestration and where those models fit with various use cases and industry applications."
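The synthetic data technique Henschen describes — fit a model to a small sample of real tabular data, then generate similar but non-privacy-sensitive rows at scale — can be sketched in a few lines. This is a deliberately naive illustration (each column sampled independently from a fitted distribution), not how SAS Data Maker actually works; all names and the sample data here are hypothetical.

```python
import random
import statistics

def fit_tabular_sampler(rows):
    """Fit a naive per-column model: mean/stdev for numeric columns,
    empirical frequencies for categorical ones. Columns are modeled
    independently; production tools also preserve cross-column correlations."""
    columns = {}
    for key in rows[0]:
        values = [r[key] for r in rows]
        if all(isinstance(v, (int, float)) for v in values):
            columns[key] = ("numeric", statistics.mean(values), statistics.stdev(values))
        else:
            columns[key] = ("categorical", values)

    def sample(n, seed=None):
        rng = random.Random(seed)
        out = []
        for _ in range(n):
            row = {}
            for key, spec in columns.items():
                if spec[0] == "numeric":
                    row[key] = rng.gauss(spec[1], spec[2])   # draw from fitted normal
                else:
                    row[key] = rng.choice(spec[1])           # draw from observed values
            out.append(row)
        return out

    return sample

# A small "real" sample; the synthetic rows mimic its shape without copying any row.
seed_rows = [
    {"age": 34, "balance": 1200.0, "segment": "retail"},
    {"age": 51, "balance": 8800.0, "segment": "private"},
    {"age": 29, "balance": 430.0, "segment": "retail"},
    {"age": 46, "balance": 5600.0, "segment": "commercial"},
]
sample = fit_tabular_sampler(seed_rows)
synthetic = sample(1000, seed=7)  # 1,000 rows generated from 4 real ones
```

The point of the sketch is Henschen's: a small, sensitive sample in, an arbitrarily large non-sensitive training set out.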


Acquisitions, TPU Chips, Zendesk | ConstellationTV Episode 78

ConstellationTV episode 78 is here! Watch co-hosts Liz Miller and Holger Mueller analyze the latest enterprise tech news (Google TPU chips, acquisitions: HubSpot rumors, Salesforce/Informatica).

Then hear from Constellation analysts live at Google Cloud Next and conclude with analysis from Liz on Zendesk Relate 2024. Watch until the end for bloopers!

0:00 - Introduction
1:50 - Enterprise technology news coverage
14:30 - Google Cloud Next key takeaways
25:36 - Zendesk Relate 2024 analysis
31:25 - Bloopers!

ConstellationTV is a bi-weekly Web series hosted by Constellation analysts. Tune in live at 9:00 a.m. PT/12:00 p.m. ET every other Wednesday!


77% of CxOs see competitive advantage from AI, says survey

Seventy-seven percent of CxOs believe AI will give their companies competitive advantage, but 91% of companies will determine that they don't have enough data to achieve a level of precision needed for trust, according to a Constellation Research and Dialpad survey.

The survey is based on responses from more than 1,000 senior executives in the US, Canada, UK, Australia and New Zealand about their AI initiatives.

Key takeaways from the survey include:

  • 77% of leaders believe AI will give them competitive advantage.
  • 75% of respondents believe AI will have a significant impact on their roles in the next three years.
  • 54% are concerned about AI regulation.
  • 38% are moderately to extremely concerned about AI.
  • 72% of CxOs plan to reskill workers for AI.
  • 69% of respondents are already using AI at work.
  • 33% of executives say their companies are using two AI solutions, 15% are using three and 9.5% are using four or more.

The survey also focused on early adopters who are applying AI to analytics, automating work, content creation, forecasting and insights. These CxOs are betting that the Return on Transformation Investments (RTI) with AI will come from efficiency, revenue and growth, compliance and risk and proactive monitoring.

While security, data leakage, generative AI hallucinations and cost are key concerns for CxOs, high-quality, high-volume data has emerged as a longer-term concern. Ninety-one percent of companies will determine they don't have enough data to achieve a level of precision they trust. For now, however, 66.5% of CxOs believe their teams are getting enough data to power AI efforts.



UnitedHealth sees $1.35 billion to $1.6 billion hit in 2024 due to Change Healthcare cyberattack

UnitedHealth Group has tallied up the costs from its Change Healthcare cyberattack including direct response, funding to care providers and lost revenue as the incident sucked out $3 billion of cash flow in the first quarter.

For 2024, UnitedHealth said the tab for the Change Healthcare cyberattack could be as high as $1.6 billion.

In February, UnitedHealth's Change Healthcare unit was hit with a ransomware attack. Because Change Healthcare processes claims and handles other financial processes, prescriptions couldn't get filled and physicians ran low on cash.

On March 27, UnitedHealth said Change Healthcare could process medical claims, but its update page notes that some payer processes are being restored through April. The company also said it has provided more than $6 billion in advance funding and interest-free loans to care providers.

The Change Healthcare costs are couched in non-GAAP results and pro forma noise; you have to scroll to the bottom of UnitedHealth's first quarter earnings release to get a feel for them. Here's the breakdown:

  • $1.22 billion: First quarter net loss for UnitedHealth, but some of that was due to the sale of a subsidiary.
  • $279 million: Business disruption impacts to UnitedHealth's Optum unit, which houses Change Healthcare. Business disruption impacts refer to revenue lost during the cyberattack.
  • $593 million: Total direct response costs due to the cyberattack. Costs attributed to the Optum unit were $363 million.
  • $872 million: Total UnitedHealth costs related to the Change Healthcare attack.
  • $1.35 billion to $1.6 billion: Total cyberattack hit for 2024 as projected by UnitedHealth, or $1.15 a share to $1.35 a share.
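The quarterly line items above are internally consistent: the Optum business disruption impact plus total direct response costs sum to UnitedHealth's stated total. A quick arithmetic check, using only the figures as reported:

```python
# Figures from UnitedHealth's Q1 disclosure, in dollars.
business_disruption = 279_000_000  # revenue lost at Optum during the cyberattack
direct_response = 593_000_000      # total direct response costs
total_q1_costs = business_disruption + direct_response

# Matches the $872 million total UnitedHealth attributed to the attack.
assert total_q1_costs == 872_000_000
```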

Adjusted for the Change Healthcare fiasco, UnitedHealth reported earnings of $6.91 per share on revenue of $99.8 billion. Both figures topped Wall Street estimates. UnitedHealth's adjusted figures included the revenue hit to Change Healthcare and excluded direct response costs.


Broadcom CEO Tan says VMware customers can get support extensions amid subscription transition

Broadcom CEO Hock Tan penned another missive to VMware customers arguing that the software vendor has lowered the price of VMware Cloud Foundation, poured money into research and development, benefited partners and will complete the transition to subscriptions.

Tan also noted that Broadcom is extending support contracts for VMware customers struggling with the transition to subscription pricing.

The latest blog from Tan comes as Reuters reported the European Commission has received complaints about Broadcom's VMware pricing changes and the regulator sent requests for information to Broadcom.

Broadcom has had a steady cadence of blogs that appear to be aimed at allaying VMware customer concerns. To date, Nutanix has been the biggest beneficiary of VMware customer angst. It's unclear whether Broadcom's blog barrage is hitting the mark, but the missives collectively acknowledge that VMware customers may be a smidge disgruntled.

Here's a look at the key points from Tan in order of importance.

Broadcom acknowledges that "fast-moving (VMware) change may require more time." Tan wrote:

"We continue to learn from our customers on how best to prepare them for success by ensuring they always have the transition time and support they need. In particular, the subscription pricing model does involve a change in the timing of customers' expenditures and the balance of those expenditures between capital and operating spending. We heard that fast-moving change may require more time, so we have given support extensions to many customers who came up for renewal while these changes were rolling out. We have always been and remain ready to work with our customers on their specific concerns."

Customers can keep their existing perpetual licenses for vSphere. Tan wrote:

"It is important to emphasize that nothing about the transition to subscription pricing affects our customers’ ability to use their existing perpetual licenses. Customers have the right to continue to use older vSphere versions they have previously licensed, and they can continue to receive maintenance and support by signing up for one of our subscription offerings.

To ensure that customers whose maintenance and support contracts have expired and choose to not continue on one of our subscription offerings are able to use perpetual licenses in a safe and secure fashion, we are announcing free access to zero-day security patches for supported versions of vSphere, and we’ll add other VMware products over time."

VMware is standardizing its pricing metric across cloud providers to per-core licensing to match its end-customer licensing. Tan said this standardization will let enterprises seamlessly move VMware Cloud Foundation from on-premises to the cloud and back if needed.
