Rubrik's IPO: Everything enterprises need to know

Rubrik, an enterprise backup and recovery company, has filed for an initial public offering in a move that indicates a new batch of security vendors is likely to hit the market as companies prep their post-breach strategies. Rubrik, which trades under the ticker RBRK, priced its IPO at $32 a share, above its expected range, raising $752 million at a $5.6 billion valuation.

The nine-year-old company, which is on the Constellation ShortList™ for Enterprise Backup and Recovery, competes with Dell, Veeam, Cohesity (which is combining with Veritas' data recovery unit) and others. The IPO market has bounced back in 2024 from a two-year drought.

Chirag Mehta, an analyst at Constellation Research, said Rubrik is hitting the market when post-breach planning is a hot topic. Mehta said:

"As AI-led attacks become more sophisticated and breaches become inevitable, security and technology leaders are actively planning for their post-breach strategy. Organizations are bolstering their post-breach resilience to effectively mitigate the impact of cyber incidents and expedite recovery processes. Organizations recognize the intrinsic value of data as a strategic asset and are prioritizing measures to safeguard its integrity, availability, and confidentiality. Rubrik has an important role to play in helping CxOs as data resilience emerges as a linchpin of cyber resilience."

Holger Mueller, Constellation Research analyst, said Rubrik's IPO will give the company a mindshare lift, but then the real work begins. "For Rubrik it will be key to manage expectations and have an eye on managing investor expectations. For instance Rubrik has done good progress improving profitability, it now has to deliver towards that and other shareholder needs quarter by quarter," he said.

Here's what you need to know about Rubrik:

The platform. Rubrik Security Cloud (RSC) has a series of layers designed to address AI security and protect data from threats. The platform also covers data protection assessment, security posture, analytics and recovery.

Simply put, Rubrik's platform is built on Zero Trust design principles to address the fact that cyberattacks are inevitable, so the recovery matters more than ever. "Our Zero Trust Data Security platform assumes that information technology infrastructure will be breached, and nothing can be trusted without authentication," said Rubrik in its regulatory filing.

Indeed, Rubrik disclosed that in March 2023 a malicious third party accessed a limited amount of information in one of its non-production IT testing environments. The incident didn't include access to customer or sensitive data.

Customer base. Rubrik has more than 6,100 customers as of Jan. 31 and 1,742 of them have subscription annual recurring revenue of more than $100,000. For fiscal 2023, Rubrik had 5,000 customers. Rubrik said its cloud ARR for fiscal 2024 was $525 million.

Rubrik is spending heavily to acquire customers and fund its go-to-market operations. For fiscal 2024, Rubrik delivered a net loss of $354.2 million on revenue of $627.9 million, up from $599.82 million in fiscal 2023. Rubrik spent $482.53 million on sales and marketing in fiscal 2024. In fiscal 2023 and fiscal 2024, operating cash flow was $19.3 million and $(4.5) million, respectively, and free cash flow was $(15.0) million and $(24.5) million, respectively.

Much of that investment in sales and marketing is to transition customers from subscription term-licenses to SaaS. Rubrik cited this transition as a risk factor:

"We are implementing certain initiatives to accelerate our existing customers’ migration to RSC as part of our business transition to SaaS, which include enforcement of migration deadlines. These initiatives may be perceived negatively by our customers. For example, these initiatives may require customers to prioritize preparation for their migration over other organizational needs, potentially resulting in diversion of resources. For certain existing customers, the perceived benefits from undertaking the migration may be outweighed by the anticipated time and effort required to prepare for and execute the migration, resulting in potential delays in customers’ transition to RSC."

Rubrik was founded in December 2013 and launched its first products and services in 2016. RSC, however, has only been offered as a subscription-based cloud platform since fiscal 2023. Rubrik noted that RSC now accounts for the majority of revenue as sales of Rubrik-branded appliances and licenses and support diminish.

Leadership. Bipul Sinha, CEO, Chairman and Co-Founder, has been a software engineer, venture capitalist and a CEO. In a shareholder letter, Sinha said Rubrik is trying to leverage market disruption to create a durable business. "We built a distinct architecture to combine data and metadata from business applications across the cloud to ensure data security and availability irrespective of incidence," he said. "This allowed us to transform backup data into a strategic asset that sits at the epicenter of security and artificial intelligence."

The generative AI play. No IPO can go forward without generative AI as a hook. Rubrik said generative AI breakthroughs will require more guardrails for security, privacy and compliance. Generative AI will also mean more sophisticated attacks. In its IPO filing, Rubrik argues that its approach accounts for backup and recovery after a cyberattack.

Rubrik also offers Ruby for AI data defense and recovery. In its filing, Rubrik said:

"Ruby is designed to augment human efforts with its generative AI capabilities, helping customers scale their data security operations with automation, boosting productivity, and bridging the users’ skills gap. Ruby uses Microsoft Azure OpenAI Service in combination with our own proprietary, internally developed software. Our proprietary software augments user queries to generate prompts that are submitted to the Azure OpenAI model and also enhances the model output to generate responses presented back to the user."

The company added that it chose Microsoft Azure OpenAI Service based on its security features that keep data within Rubrik's control. Microsoft is a core Rubrik partner.
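The query-augmentation and output-enhancement pattern Rubrik describes in its filing can be sketched at a high level. This is a hypothetical illustration only: the function and field names are invented for the example, and a stub stands in for the actual Azure OpenAI round-trip.

```python
# Hypothetical sketch of the prompt-augmentation pattern described in the
# filing: proprietary code wraps the user's query with context before the
# model call, then post-processes the model's output. Names are illustrative.

def augment_query(user_query: str, context: dict) -> str:
    """Wrap the raw user query with additional context before submission."""
    scope = ", ".join(context.get("data_sources", []))
    return (
        f"You are a data-security assistant. Relevant sources: {scope}.\n"
        f"User question: {user_query}"
    )

def enhance_output(raw_answer: str, context: dict) -> str:
    """Post-process the model response before presenting it to the user."""
    footer = f"\n[Scope: {context.get('tenant', 'unknown')}]"
    return raw_answer.strip() + footer

def ask(user_query: str, context: dict, model_call) -> str:
    """model_call stands in for the hosted-model API round-trip."""
    prompt = augment_query(user_query, context)
    return enhance_output(model_call(prompt), context)

# Stubbed model call so the sketch runs without any API access.
answer = ask(
    "Which backups were touched by ransomware?",
    {"data_sources": ["M365", "VMware"], "tenant": "acme"},
    model_call=lambda p: "Two snapshots show anomalous encryption activity.",
)
```

The point of the wrapper layer is that the vendor's own logic, not the end user, controls what reaches the model and what comes back.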

Generative AI is also cited in Rubrik's lengthy risk factor section for everything from code to intellectual property, regulation globally, liability and customer adoption.

What does Rubrik secure? Rubrik works across the main enterprise applications and platforms, including VMware, Microsoft Hyper-V, Microsoft SQL Server, Oracle, Microsoft Windows, Nutanix, Kubernetes, Cassandra, MongoDB, Linux, UNIX, AIX, NAS, Epic, SAP HANA, Google Cloud, Azure, AWS, M365 (Microsoft Teams, SharePoint, Exchange Online, and OneDrive), and Atlassian Jira Cloud.

Competition. Multiple competitors were cited including Dell Technologies, IBM, Commvault, Veeam and Cohesity as well as cloud data management vendors in some areas and broader cybersecurity platforms.

Supermicro makes Rubrik-branded appliances. Rubrik said: "A large majority of the customer enterprise data we secure relies upon Rubrik-branded Appliances, which are currently built on servers supplied and designed by Super Micro Computer, Inc., or Supermicro."

Rubrik offers a Ransomware Recovery Warranty. Rubrik said one risk is liability. The company said:

"As part of our ransomware recovery warranty, or the Ransomware Recovery Warranty, we also provide certain customers with up to $10,000,000 for recovery expenses related to data recovery and restoration in the event that data backed up using our solutions cannot be recovered following a ransomware attack. As part of the Ransomware Recovery Warranty, if an eligible customer’s data that has been backed-up onto a Rubrik-branded Appliance, Rubrik-certified compatible third-party commodity server, or a Rubrik-hosted cloud platform, is not successfully recovered by way of one of our data security products due to a failure of such solution, we will reimburse the customer for its reasonable and necessary fees and expenses to restore, recover, or recreate its data up to $10,000,000. As of January 31, 2024, there had been no claims made under the Ransomware Recovery Warranty."


Here's why Meta is spending $35 billion to $40 billion on AI infrastructure, roadmap

As long as the ad revenue continues to flow into Meta properties such as Facebook and Instagram, CEO Mark Zuckerberg is going to invest aggressively in an AI buildout. "I think it makes sense to go for it," he said.

That's the message from Meta following its first quarter results, which landed after the company launched its powerful large language model (LLM) Llama 3. Meta reported first quarter net income of $12.37 billion, or $4.71 a share, with revenue of $36.46 billion, up 27%. Meta said its second quarter revenue will be $36.5 billion to $39 billion. The real zinger was that Meta said its 2024 capital expenditures would be $35 billion to $40 billion, above previous guidance of $30 billion to $37 billion.

"Meta needs to keep investing into AI so it keeps the consumers and clicks that drive ad revenue. If you lose the traffic because AI is better somewhere else it will be hard for Meta to keep the flywheel going," said Constellation Research analyst Holger Mueller. "And the foe is Google, which has a similar spend, but a lead when it comes to custom algorithms on custom silicon. Zuckerberg has no alternative but to spend and stay relevant. Meta must stay ahead of its ad competitors Google, and then Microsoft as well."

Needless to say, that capital expenditure guidance led to a bevy of analyst questions about Meta's AI strategy and potential. Zuckerberg was happy to elaborate. Here's what he had to say about Meta's AI buildout and the long-term view. The upshot is that Meta intends to be an AI leader.

Meta AI as helper. Zuckerberg said Meta AI will be an assistant across the company's apps and glasses that will connect people via APIs for everything from commerce to customer support as well as coding and development. The initial rollout of Meta AI has had "tens of millions of people" trying it. "We believe that Meta AI with Llama 3 is now the most intelligent AI assistant that you can freely use. And now that we have the superior quality product we are making it easier for lots of people to use them within WhatsApp, Messenger, Instagram, and Facebook," said Zuckerberg.

Further model improvements. Meta has said it is working on a 400 billion parameter model that is still in training. "I expect our models are just going to improve further from open-source contributions," he said. "We have the talent, data and ability to scale infrastructure to build the world's leading AI models and services. And this leads me to believe that we should invest significantly more over the coming years to build even more advanced models and the largest scale AI services in the world," said Zuckerberg.

This AI investment isn't cheap. Zuckerberg said Meta will operate the rest of its business as efficiently as possible and will shift those savings to AI investments. Nevertheless, Meta is going to spend heavily. "We will still grow our investment envelope meaningfully before we make much revenue from some of these new products," he said.

Meta CFO Susan Li said:

"As we develop more advanced and compute intensive recommendation models, and scale capacity for our generative AI training and inference needs, we expect that having sufficient infrastructure capacity will be critical to realizing many of these opportunities. As a result, we expect that we will invest significantly more in infrastructure over the coming years."

Investors aren't going to like the AI investment, but Meta is used to it. The company has "historically seen a lot of volatility in our stock during this phase of our product playbook, where we're investing in scaling and new products but aren't yet monetizing it," said Zuckerberg. It has happened before with Reels, news feed and the transition to mobile. APIs are likely to be a profitable AI service for Meta. "Building a leading AI will also be a larger undertaking than the other experiences we've added to our apps, and this is likely going to take several years," he added.

The payoff is there. Zuckerberg said:

"Once our new AI services reach scale, we have a strong track record of monetizing them effectively. There are several ways to build a massive business here, including scaling business messaging, introducing ads or paid content into AI interactions. And enabling people to pay to use bigger AI models and access more compute. And on top of that. AI is already helping us improve app engagement which naturally leads to seeing more ads and improving ads directly to deliver more value."

Zuckerberg said that 30% of the posts on Facebook's feed are served by AI and 50% of Instagram content is AI recommended.

The cost efficiency of model training will be a priority. Zuckerberg said open-source contributions will likely enable efficiency gains as will Meta's proprietary accelerator chips. The goal is to run AI workloads on a less expensive stack.

There's an AI and augmented reality intersection. "Glasses are the ideal device for an AI assistant because you can let them see what you see and hear what you hear. So, they have full context on what's going on around you, as they help you with whatever you're trying to do," said Zuckerberg.

And the time is now. Zuckerberg said:

"With the latest models, we're not just building good AI models that are that are going to be capable of building some new goods, social and commerce products. I actually think we're at a place where we've shown that we can build leading models and be the leading AI company in the world, and that opens up a lot of additional opportunities."

The future. Zuckerberg said:

"I think that the next phase for a lot of these things are handling more complex tasks and becoming more like agents rather than just chatbots. You're going to give an agent something to do like an intent or a goal. Then it goes off and actually performs many queries on its own in the background in order to help accomplish your goal. Whether that goal is researching something online, or eventually finding the right thing that you're looking to buy. I think people don't even realize that they will be able to ask computers to do it for them.

I think the opportunity is really big. So, it makes sense to go for it. And we're going to, and I think it's going to be a really good long-term investment."


ServiceNow posts strong Q1, touts genAI uptake

ServiceNow posted strong first quarter results and said its genAI offerings are the "fastest selling in the company’s history."

ServiceNow reported first quarter net income of $347 million, or $1.67 a share. Non-GAAP earnings were $3.41 a share. Revenue for the first quarter was $2.6 billion, up 24% from a year ago. Wall Street was expecting ServiceNow to report earnings of $3.13 a share on revenue of $2.59 billion.

CEO Bill McDermott said: "Our GenAI offerings are the fastest selling in the company’s history. We are humbled by the trust our customers are investing in our platform."


The company said its current remaining performance obligations (contract revenue to be recognized over the next 12 months) stood at $8.45 billion. The company had 1,933 total customers with more than $1 million in annual contract value. ServiceNow said it had eight deals in the quarter worth more than $5 million and four worth more than $10 million.

Speaking on a conference call with analysts, McDermott said "GenAI adoption remained on a tear in Q1. Companies are leaning into GenAI as a powerful deflationary force to drive productivity." He added that the pipeline for Pro Plus, which features GenAI, is strong.

McDermott added:

"Process optimization is the number one Gen AI use case in the global economy today. This is why ServiceNow's strategic relevance as the AI platform for business transformation has never been higher. Every business workflow in every enterprise will be engineered with Gen AI at its core. We are the single pane of glass that enables end-to-end digital transformation." 

As for the outlook, ServiceNow projected second-quarter subscription revenue of $2.52 billion to $2.53 billion, up 21.5% to 22% from a year ago. For 2024, ServiceNow projected subscription revenue of $10.56 billion to $10.57 billion.

The company ended the quarter with 23,362 employees, up from 22,668 as of Dec. 31.


IBM acquires HashiCorp for $6.4 billion, reports mixed Q1

IBM said it will acquire HashiCorp in a deal valued at $6.4 billion as it builds out its infrastructure and security lifecycle management tools to go along with its hybrid cloud and AI portfolio.

The purchase price equates to $35 a share in cash for HashiCorp shareholders.

IBM CEO Arvind Krishna said the acquisition will help customers "manage the complexity of today's infrastructure and application sprawl" as they build out hybrid cloud and generative AI infrastructure. HashiCorp is a play on hybrid and multi-cloud workflows and a solid add-on to Red Hat.

According to IBM, HashiCorp will accelerate its growth and cross-selling opportunities with Red Hat, watsonx, data security, IT automation and consulting. HashiCorp, which has more than 4,400 customers, is expected to be accretive to IBM's adjusted EBITDA within the first year of closing.

Krishna said:

"IBM’s and HashiCorp’s combined portfolios will help clients manage growing application and infrastructure complexity and create a comprehensive hybrid cloud platform designed for the AI era.”

Constellation Research's take

Constellation Research analyst Dion Hinchcliffe said:

"Hashicorp is the latest in a long spree of acquisitions over the last year by IBM CEO Arvind Krishna to round out their cloud offerings to make them more competitive with the hyperscalers. Hashicorp has struggled at times to crack the enterprise sales market, despite being one of the cooler companies on the block. While IBM will undoubtedly use Hashicorp's strong developer 'street cred' as a proof point in its own offerings, it remains to be seen if Hashicorp can retain its perceived neutrality as a cloud infrastructure software provider at a time that robust multicloud and crosscloud support continues to grow in importance."

Constellation Research analyst Holger Mueller said:

"Hashicorp does not make sense for IBM. The services model lives from being independent and now they may look biased. And the service revenue around DevOps is going to dry up. The multicloud aspect of HashiCorp makes sense from an IBM credibility perspective." 

Constellation Research analyst Chirag Mehta said:

"HashiCorp's switch from MPL 2.0 to BSL 1.1 for future products has sparked concern among developers. The BSL's perceived limitations on open-source contribution worry them, potentially impacting the long-term development of these products, particularly those previously under the more permissive MPL.

In light of IBM's strong commitment to open source, many developers are urging them to consider a license like Apache 2.0 for the new HashiCorp products. This license allows for wider modification and distribution, even commercially. By embracing a more open-source friendly license, IBM has a golden opportunity to gain developer trust, a crucial factor for successfully integrating HashiCorp's infrastructure and security solutions into their go-to-market strategy.  Ultimately, this shift could strengthen both IBM's offerings and its position in the strategic multi-cloud and cybersecurity domains."

First quarter results

Separately, IBM reported first quarter earnings of $1.6 billion, or $1.69 a share, on revenue of $14.5 billion, up 1% in constant currency. Non-GAAP earnings for the quarter were $1.68 a share.

Wall Street was expecting IBM to report first quarter earnings of $1.59 a share on revenue of $14.54 billion.

  • IBM said software revenue was up 5% in the first quarter with consulting revenue flat. Infrastructure revenue was down 1%.
  • In software, Red Hat revenue was up 9% with automation up 13%. Data and AI revenue was up 1% and security fell 3%.
  • For 2024, IBM is projecting revenue growth in the mid-single digit range with $12 billion in free cash flow.



Moderna uses OpenAI's ChatGPT Enterprise to scale 750 GPTs

Moderna said it is using OpenAI's ChatGPT Enterprise to scale custom models across its business.

According to Moderna, the company launched its own instance of ChatGPT called mChat built on OpenAI's API. Moderna said it had 80% internal adoption initially and then it deployed ChatGPT Enterprise with analytics, image generation and GPTs.

Moderna said that it has deployed more than 750 GPTs across the company and multiple functions including legal, research, manufacturing and commercial. These assistants augment employees and offer personalized support.

One example of these GPTs is Dose ID GPT, which uses ChatGPT Enterprise to evaluate the optimal vaccine dose. Dose ID provides rationale, references to sources and generates charts of key findings.

The Moderna example illustrates how OpenAI is scaling its enterprise efforts beyond the reach it has with Microsoft. For instance, OpenAI's Moderna case study quotes Brice Challamel, Head of AI Products and Platforms at Moderna, saying that Moderna evaluated mChat, Copilot and ChatGPT Enterprise before making a decision.

Other takeaways on the Moderna-OpenAI partnership:

  • Moderna's 750 GPTs took about two months to create.
  • Each user has 120 ChatGPT Enterprise conversations per week on average.
  • 40% of weekly active users created GPTs.
  • The legal team has 100% adoption of ChatGPT Enterprise.

Nvidia acquires Run.ai for GPU workload orchestration

Nvidia said it has acquired Run.ai, a startup focused on GPU workload management and orchestration.

Run.ai's platform is Kubernetes-based and will help Nvidia customers distribute workloads across cloud, edge and data center infrastructure.

Terms of the deal weren't disclosed, but CTech put the purchase at about $700 million.

Nvidia said Run.ai's open platform will serve as an orchestration layer for GPU clusters. AI deployments will need the ability to orchestrate and optimize generative AI training and inference for speed and cost.

Run.ai's platform provides a centralized interface to manage compute infrastructure, access to cluster resources and the ability to pool resources.
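The resource-pooling idea can be illustrated with a toy allocator. This is purely a conceptual sketch, not Run.ai's actual API; the class and method names are invented for the example.

```python
# Toy sketch of pooled GPU allocation, the kind of orchestration an overlay
# like Run.ai layers on top of a cluster. Purely illustrative; names invented.

from dataclasses import dataclass, field

@dataclass
class GpuPool:
    total: int                       # GPUs contributed by all nodes
    allocations: dict = field(default_factory=dict)

    @property
    def free(self) -> int:
        return self.total - sum(self.allocations.values())

    def request(self, job: str, gpus: int) -> bool:
        """Grant GPUs from the shared pool if enough are free."""
        if gpus <= self.free:
            self.allocations[job] = gpus
            return True
        return False                 # job queues until capacity frees up

    def release(self, job: str) -> None:
        self.allocations.pop(job, None)

pool = GpuPool(total=8)              # e.g. two 4-GPU nodes pooled together
pool.request("training-run", 6)      # large training job is granted
pool.request("inference", 4)         # denied: only 2 GPUs remain free
pool.release("training-run")
pool.request("inference", 4)         # now succeeds
```

The value of pooling is exactly this: jobs draw from the cluster's combined capacity rather than being pinned to whatever a single node happens to have free.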

According to Nvidia, Run.ai's products will keep their current business model for the immediate future. Run.ai's product roadmap will be integrated into Nvidia DGX Cloud. Nvidia DGX and DGX Cloud customers will get access to Run.ai for AI workloads and large language model deployments.

Run.ai is already integrated with much of Nvidia's software including Nvidia AI Enterprise and DGX.


Snowflake launches Arctic LLM, to release details under Apache 2.0 license

Snowflake launched Arctic, an open-source large language model (LLM) optimized for enterprise workloads and efficiency. The move highlights how data platforms are increasingly launching LLMs to combine with their data platforms.

For Snowflake, Arctic is also among the first launches under new CEO Sridhar Ramaswamy's tenure. Arctic will be part of a larger LLM family built by Snowflake.

The Arctic launch also lands as Databricks launched DBRX, an LLM that has been well-received. Ramaswamy said Arctic represents a "watershed moment for Snowflake" and highlights what open-source AI can do. For good measure, Meta launched its Llama 3 LLM last week. Ramaswamy's mission is to speed up Snowflake's product cycles and innovation.

Snowflake said it will release Arctic's weights under an Apache 2.0 license and detail how the LLM was trained. Snowflake is pitching Arctic as an LLM that balances intelligence and compute. Snowflake's plan is clear: scale Arctic usage across the 9,400 companies on its data platform. These companies will then consume more of Snowflake's platform.

Key points to note about Snowflake Arctic:

  • Snowflake is ensuring it has open-source credibility. Snowflake said it will provide code templates, flexible inference and training options and the ability to customize Arctic via multiple frameworks.
  • Frameworks for Arctic will include Nvidia NIM with Nvidia TensorRT-LLM, vLLM, and Hugging Face.
  • Arctic will be available for serverless inference in Snowflake Cortex, which offers machine learning and AI in the Data Cloud along with model gardens and catalogs from Nvidia, Hugging Face, Lamini, Microsoft Azure and Together.
  • Snowflake said Arctic's mixture of experts (MoE) architecture is designed to activate 17 billion of its 480 billion parameters at a time for token efficiency. Snowflake claims it activates roughly 50% fewer parameters than DBRX.
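A mixture-of-experts layer routes each token to only the top-scoring experts, so the active parameter count is a fraction of the total. Here is a miniature sketch of that routing; the sizes are illustrative, not Arctic's actual configuration.

```python
import numpy as np

# Toy mixture-of-experts forward pass: only the top-k experts (by gate
# score) run for each token, so active parameters are a fraction of the
# total. Sizes here are tiny and illustrative.

rng = np.random.default_rng(0)
n_experts, d = 8, 16
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
gate_w = rng.standard_normal((d, n_experts))

def moe_forward(x: np.ndarray, top_k: int = 2) -> np.ndarray:
    scores = x @ gate_w                       # one gate score per expert
    top = np.argsort(scores)[-top_k:]         # pick the top-k experts
    w = np.exp(scores[top])
    w = w / w.sum()                           # softmax over selected experts
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

x = rng.standard_normal(d)
y = moe_forward(x)
active_frac = 2 / 8   # only 2 of 8 expert weight matrices ran for this token
```

With 2 of 8 experts active, only a quarter of the expert parameters run per token; Arctic's claimed 17 billion of 480 billion is the same idea at scale.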

Although Snowflake launched Arctic, the company said it will still give access to multiple LLMs in its Data Cloud.

Constellation Research's take

Constellation Research analyst Doug Henschen said:

"It's good to see Snowflake moving quickly, under new CEO Sridhar Ramaswamy, to catch up in the GenAI race. Snowflake rival Databricks started introducing LLMs last year with its release of Dolly and it followed up this March with the release of DBRX, an open-source model that seems to be getting lots of traction. Snowflake is clearly responding to the competitive threat, given the press release’s comparisons between the new Arctic LLM and DBRX.  I’d like to know more about the breadth of intended use cases. Snowflake says Arctic outperforms DBRX, Llama 2 70B and Mixtral-8x7B on coding and SQL generation while providing “leading performance” on general language understanding. I’d like to see independent tests, but the breadth of customer adoption will be the ultimate gauge of success. It’s important to note that Snowflake Cortex, the vendor’s platform for AI, ML and GenAI development and development, is still in preview at this point. As a customer I would want to look beyond the performance claims and know more about vendor indemnification and risks when using LLMs in conjunction with RAG techniques." 

Constellation Research analyst Andy Thurai said:

The massive war between open-source and closed-source LLMs is heating up with multiple competitors. Massive LLMs are available for free, allowing enterprise users to fine-tune the models with their own enterprise data. On that note, a few items from this release are notable:

  • This is licensed under Apache 2.0 which permits ungated personal, research, and commercial use. This is different than many other open-source LLM providers such as Meta’s Llama series, which allows use for personal and research purposes with limitations on commercial use.
  • Databricks, which is gaining market share and momentum fast, had a massive leg up with its acquisition of MosaicML in knowledge, skilled resources, stockpile of GPUs, and expertise to train massive LLMs. Every big cloud and data vendor is going after this market by announcing their own variations, including Google, AWS, Databricks, Microsoft, IBM, Anthropic, Cohere, Salesforce, Twitter/X, and now Snowflake.
  • Snowflake aims to make this process easier by providing code templates and flexible inference and training options to deploy in existing AI and ML frameworks.
  • Snowflake provides options for serverless inference which could help with expanding massively distributed inference networks to operate on demand.
  • The company is trying to go after two specific markets: search and code generation.
  • Snowflake has an advantage over other LLM providers from a data lake standpoint. If Snowflake can convince users to keep their data in its data lakes to train custom models, or fine-tune or apply RAG to them, it can compete easily. That said, Snowflake is late; many enterprises are already experimenting with other LLM providers. By providing hosting options for many other open-source LLMs in Cortex alongside Arctic, Snowflake is hoping to catch up.

Microsoft expands at Coca-Cola as part of multi-cloud strategy that includes AWS

Microsoft and The Coca-Cola Co. announced a five-year $1.1 billion deal that includes Azure OpenAI, Microsoft 365 and apps including Power BI and Dynamics 365 as the beverage company rounds out its multi-cloud strategy.

The announcement is part of an emerging multi-cloud strategy at Coca-Cola. Amazon Web Services is also a provider at the company and its various partners and units. Coca-Cola is sprawling and has a bevy of publicly traded bottlers in its orbit (Coca-Cola European Partners, Coca-Cola Femsa and Coca-Cola Bottling Co.). In other words, Coca-Cola likely has more than one technology vendor in every category.

Coca-Cola's contract with Amazon is a global one that includes AWS, Prime and Amazon Ads, sources say. The Coca-Cola Co. has been an AWS customer since 2013 and continues to use multiple services.

Under the Microsoft partnership, Coca-Cola has made a $1.1 billion commitment that includes the following:

  • Experiments with Azure OpenAI Services for generative AI use cases and testing for Copilot for Microsoft 365. Microsoft said Coca-Cola has been using Azure OpenAI Service for a year.
  • Migration of applications to Microsoft Azure.
  • Use of multiple Microsoft cloud applications and platforms.

Coca-Cola's initial partnership with Microsoft was worth $250 million in 2020.

Coca-Cola doesn't disclose its annual technology budget in its regulatory filings or the breakdown of its cloud spending. However, companies are starting to disclose the splits between cloud vendors. For instance, Equifax recently disclosed how it has split its spending between Google Cloud and AWS in recent years. We detailed the breakdown in our Equifax customer story and how it is approaching AI and data products.


CR CX Convos: Is Customer Success Successful?

Customer success can be a valuable, meaningful and profitable endeavor for technology buyers and vendors. So why does customer success seem to miss the mark so often? Is it a misalignment of goals? Teams coming into the process too late? Perhaps a combination of factors? When Liz Miller first wrote about customer success in a blog post, she didn't think she'd hit a nerve...but she did. So, the conversation continues! In this CR CX Convo, Liz dives back into the customer success conversation and shares the story of a complex brand that leveraged its field service management partner, IFS, to define value and turn customer success into rapid, on-demand scale that delivered even greater value.

On CR Conversations: video at https://www.youtube.com/embed/tXDU9aaZMLI

Analytical Data Platforms 101: Data Lakes, Data Warehouses and 'Lakehouses' Explained

Constellation Research explores how analytical data platforms are evolving and what to expect in a modern platform.

On Insights: video at https://www.youtube.com/embed/Sdc8idvK9ds