Results

CrowdStrike, AWS expand partnership revolving around CrowdStrike Falcon and Amazon Bedrock

CrowdStrike and Amazon Web Services expanded a partnership where Amazon will standardize on CrowdStrike's Falcon platform and CrowdStrike will expand usage of Amazon Bedrock and SageMaker.

The partnership is notable on a few fronts. For starters, CrowdStrike is in a cybersecurity dogfight with players such as Palo Alto Networks as enterprises consolidate vendors. On the AWS side, CrowdStrike is a showcase for how Amazon SageMaker and Bedrock can scale models: CrowdStrike's Charlotte AI is a generative AI assistant that can serve as a security analyst.

Chirag Mehta on the intersection of cybersecurity, design thinking and AI

Amazon CEO Andy Jassy said on the company's earnings conference call that enterprises have been receptive to Bedrock's approach to model choice.

According to CrowdStrike regulatory filings, the company's platform is largely built on AWS. CrowdStrike said it is optimizing its cloud infrastructure as it scales. In CrowdStrike's annual report, it said:

"Because of the importance of AWS’ services to our business and AWS’ position in the cloud-based server industry, any renegotiation or renewal of our agreement with AWS may be on terms that are significantly less favorable to us than our current agreement. If our cloud-based server costs were to increase, our business, results of operations and financial condition may be adversely affected. Although we expect that we could receive similar services from other third parties, if any of our arrangements with AWS are terminated, we could experience interruptions on our Falcon platform and in our ability to make our solutions available to customers, as well as delays and additional expenses in arranging alternative cloud infrastructure services."

In other words, the expanded CrowdStrike-AWS strategic partnership works out for both sides. Here are the details of the partnership:

  • Amazon will consolidate cloud security vendors on CrowdStrike Falcon Cloud Security.
  • Amazon will use Falcon Next-Gen SIEM to secure big data logging and Threat Detection and Response to thwart identity-based attacks.
  • Amazon will unify its endpoint detection and response on Falcon.
  • CrowdStrike will expand the use of Amazon Bedrock and Amazon SageMaker.
  • CrowdStrike's use of Bedrock will include Anthropic's Claude family of large language models (LLMs); a generic example of calling Claude through Bedrock follows this list.
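
CrowdStrike hasn't published implementation details for how Charlotte AI will call Bedrock, but for readers unfamiliar with the service, here is a minimal, generic sketch of invoking an Anthropic Claude model through Amazon Bedrock with boto3. The region, model ID and security-flavored prompt are illustrative assumptions, not details from the partnership.

```python
# Illustrative only: a generic Amazon Bedrock call to an Anthropic Claude model.
# This is not CrowdStrike's implementation; region, model ID and prompt are assumptions.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "anthropic_version": "bedrock-2023-05-31",  # required field for Claude models on Bedrock
    "max_tokens": 512,
    "messages": [
        {
            "role": "user",
            "content": "Summarize the likely intent behind repeated failed logins "
                       "followed by a successful login from a new geography.",
        }
    ],
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example Claude model ID on Bedrock
    body=json.dumps(body),
)

print(json.loads(response["body"].read())["content"][0]["text"])
```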

MongoDB adds to Atlas platform, scales partnerships, flexibility

MongoDB launched new Atlas features and integrations with Microsoft Azure, Google Cloud and Amazon Web Services as well as an expanded partner program. The effort, announced at MongoDB.local NYC, is designed to make it easier for developers to scale MongoDB applications across clouds and edge infrastructure.

The company's strategy revolves around flexibility and accessing data across multiple locations, said Scott Sanchez, Vice President of Marketing at MongoDB.

"Not every AI model is going to be in every file, or every region and some workloads may only be possible in a specific location or specific hyperscaler," said Sanchez. "So, flexibility really matters. Data comes from all these different places and sources and formats, and being able to put that in one place and represented in a single document-based model is crucial in this AI world."

Here's the breakdown of what was announced at MongoDB.local NYC, an event that also includes an investor session with Wall Street analysts.

MongoDB Atlas: Stream Processing GA, Search Nodes on Azure, Edge Server

MongoDB outlined new features in MongoDB Atlas and the general availability of MongoDB Atlas Stream Processing.

The company said the new capabilities are designed to make it easier to build, deploy and scale data applications and services. Sahir Azam, Chief Product Officer at MongoDB, said Atlas is targeting optimization and reducing costs as well as building applications.

For instance, MongoDB Atlas Stream Processing will give developers the ability to leverage data in motion and at rest to power applications for Internet of Things (IoT) devices, inventory feeds and browsing behavior.

MongoDB also said Atlas Search Nodes is available on Microsoft Azure in a move that combines the company's data platform with Azure generative AI workloads and services. Enterprises will be able to use Atlas Vector Search and Atlas Search on Azure to optimize generative AI applications.

Atlas Search Nodes on Azure puts the service on all three hyperscalers; it is already available on AWS and Google Cloud.

And MongoDB Atlas Edge Server was launched so customers can run distributed applications closer to end users. AI workloads are often moving to the edge for lower latency as well as proximity to data.

Atlas Edge Server, available in public preview, is a local instance that can synchronize data when connectivity is spotty, supports data tiering and maintains a local data layer for low latency.

"MongoDB advances its capabilities with key AI additions to its scope. Database support for AI is key for enterprises to keep using databases – without having to export data to a third party AI data platforms," said Constellation Research analyst Holger Mueller. "This makes MongoDB even more attractive to power next generation applications for an enterprise and combined with multi-cloud capabilities is compelling."

MongoDB AI Applications Program

MongoDB launched the MongoDB AI Applications Program, or MAAP, that includes a bevy of technology partners, foundational models and advisory services to enable enterprises to deploy generative AI applications faster.

The goal of MAAP is to create an integrated stack from MongoDB and partners including AWS, Google Cloud and Microsoft Azure that will feature frameworks and models from Anthropic, Cohere, LlamaIndex, LangChain and a host of others. The stack features MongoDB Atlas and aims to be turnkey for enterprises looking to build genAI apps.

MAAP includes:

  • Strategies and roadmaps for genAI applications and support services via MongoDB Professional Services and consulting partners.
  • A curated selection of foundational models for multiple use cases from Anthropic, Cohere, Meta, Mistral, OpenAI and others.
  • Reference architecture, integration technology and prescriptive guidance.
  • AI jump-start sessions with industry experts.

MongoDB Atlas Vector Search integration with Amazon Bedrock

MongoDB said it has integrated Atlas Vector Search with Knowledge Bases for Amazon Bedrock, giving enterprises the ability to build genAI apps using managed foundation models.

The combination of MongoDB Atlas Vector Search and Amazon Bedrock is aimed at the companies' many joint customers.

According to MongoDB, customers can use the Atlas Vector Search and Bedrock integration to customize large language models from a bevy of providers with real-time data that's converted into vector embeddings by MongoDB.
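
To illustrate the pattern MongoDB describes -- converting data into vector embeddings and querying them alongside a foundation model -- here is a rough Python sketch that generates an embedding with an Amazon Titan model on Bedrock and runs an Atlas $vectorSearch aggregation. The connection string, database, collection, index name and field names are hypothetical; the managed Knowledge Bases integration handles this plumbing automatically.

```python
# Rough sketch of the embed-then-search pattern; all names below are hypothetical.
import json

import boto3
from pymongo import MongoClient

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
client = MongoClient("mongodb+srv://user:password@cluster0.example.mongodb.net")
collection = client["support"]["articles"]  # hypothetical database and collection


def embed(text: str) -> list[float]:
    """Convert text into a vector embedding with an Amazon Titan embedding model."""
    resp = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(resp["body"].read())["embedding"]


# Atlas Vector Search: index name and embedding field must match the Atlas index definition.
results = collection.aggregate([
    {
        "$vectorSearch": {
            "index": "vector_index",          # hypothetical Atlas Vector Search index
            "path": "embedding",              # hypothetical field storing document vectors
            "queryVector": embed("How do I rotate my API keys?"),
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"title": 1, "body": 1, "_id": 0}},
])

for doc in results:
    print(doc["title"])
```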

MongoDB, Google Cloud to optimize Gemini Code Assist for MongoDB developers

MongoDB and Google Cloud said the companies will optimize Gemini Code Assist with MongoDB suggestions, answers and code.

Gemini Code Assist will give developers information, documentation and best practices for MongoDB code.

MongoDB said that Gemini Code Assist is trained on publicly available datasets with full codebase awareness and integration with various code editors and repositories.

The two companies said the optimization for Gemini Code Assist and MongoDB will be available in the "coming months."

Constellation Research's take

Mueller said MongoDB is focusing on what CxOs care about. He said:

"It's good to see MongoDB not only covering the GenAI basics with the usual vector announcement that has become staple for all database vendors – but also focusing what really matters to CxO. What matters to CxOs is building next generation applications that fuel Enterprise Acceleration. The ability to build multicloud applications that can execute code all the way to the edge is critical. And good to see the multicloud strategy also in the AI announcements by partnering with the relative best that the cloud platforms have to offer – Bedrock for AWS and Gemini for Google Cloud."

 


AI Budgets, Data Platforms, Results-Driven Comms Strategies | ConstellationTV Episode 79

This week on episode 79 of ConstellationTV, co-hosts Dion Hinchcliffe and Doug Henschen talk #enterprise tech news with Larry Dignan (#AI budgets, Microsoft Phi-3 Model, Snowflake's Arctic LLM)...

Dion then talks platform-based #communication strategies and Chirag Mehta previews the RSA #security conference he's attending.

Round out the episode with Doug's helpful framework for analytical #data platforms... and don't miss the bloopers!

0:00 - Introduction
1:16 - Enterprise #Tech News
13:14 - Using Platform-Based Comms Strategies to Drive #Business Results
20:29 - Preview of 2024 RSA Security Conference
22:13 - Analytical Data Platforms 101: Data Lakes, Data Warehouses, and Lakehouses
34:41 - Bloopers!

Don't miss the next ConstellationTV episode, dropping in two weeks with co-hosts Liz Miller and Holger Mueller!

Watch the episode on ConstellationTV: https://www.youtube.com/embed/4I7rDy1CNis?si=Qf4WcpOVIRrAHFKq

Atlassian launches Rovo, consolidates Jira Work Management, Jira Software

Atlassian launched Atlassian Rovo, a generative AI assistant built on Atlassian Intelligence, which will operate across the company's teamwork platform. In addition, Atlassian said it was combining Jira Software and Jira Work Management into one project management tool.

The company announced its product updates at Atlassian Team '24 in Las Vegas.

Rovo is designed to find, learn and act on information stored across an enterprise: it surfaces data, understands it, delivers insights and uses specialized agents to handle tasks.

With the move, Atlassian Rovo will leverage one data model, dubbed the teamwork graph, which will pull data from the company's applications and other SaaS apps. The goal is to deliver one view of goals, knowledge, actions, projects and execution.

Core components of Atlassian Rovo include:

  • Rovo Search, which will comb through content wherever it is stored (Google Drive, Microsoft SharePoint, GitHub, Slack, etc.) and query across applications. Rovo Search will identify teammates, projects and information needed to make decisions. According to Atlassian, Rovo Search will connect niche and custom apps via API and offer enterprise-grade data governance.
  • Insights, which are delivered via knowledge cards that offer context about projects, goals and teammates.
  • Rovo Chat, a conversational bot that is built on company data and learns as it goes. Rovo will surface information and offer follow-up questions.

On the backend, Atlassian Intelligence will feature generative AI in editor tools across the company's portfolio, AI-powered summaries, Loom AI workflows, virtual help center agents, AIOps and natural language AI automation rules.

Constellation Research's take on Atlassian's Rovo 

Constellation Research analyst Andy Thurai received a full demo and deep dive of Atlassian's AI efforts. Here's Thurai's assessment:

"Rovo (a name designed to satisfy international customers) is primarily an enterprise knowledge and search tool. Powered by Atlassian Intelligence, you can search across Jira and Confluence for information within the platform. Currently, Rovo is limited to Atlassian and some third-party products, but you'll eventually be able to search Atlassian's marketplace. Rovo provides the contextual information that was hard to reach on the Atlassian platform before.

Atlassian uses an OpenAI private instance on the backend but has a specific agreement with OpenAI so it can’t retain the data used for prompting. OpenAI also can't use the data to train an LLM. The chatbots are currently limited only to structured data with no specific plan or timeline for unstructured data. In a demo, the chatbot had contextual awareness from Confluence and Jira and a focus on workforce productivity. 

I also liked AI summarization in the Atlassian platform. When product teams are rushed for time, employees can ask the agents to summarize the critical points without reading a bunch of lengthy documents. Rovo can create actionable items based on those documents. One customer was able to take a backlog risk analysis from two months to 20 minutes with the help of Rovo.

Rovo comes with 20+ default agents but can be extended by customers with no-code options. Since its release a few months ago, 500 internal agents have been created.

Overall, Atlassian has quietly done quite a bit of work on the AI front. Many of the new features are in beta mode so be sure to test after the full release. Atlassian focused on the system of work and developed a bevy of capabilities. Given that competition is very limited for the knowledge worker category, Atlassian can gain traction. Atlassian's sales motion is geared toward mid-sized enterprises, but the company is trying to move up.

Going forward, Atlassian may have to address pricing since all its go-to-market and pricing motions are geared toward large teams collaborating in an agile production cycle. With generative AI, team sizes are going to shrink. Atlassian needs to move the model away from seat-based licensing to a value-based AI-driven pricing model."

It's just Jira now

In addition to the Rovo news, Atlassian said it is taking the best of Jira Work Management and Jira Software to create one project management suite called Jira.

Jira will include goal tracking, shortcuts via AI, visualization and integrations with Confluence and Loom for knowledge sharing.

For enterprises, Atlassian said they will be able to combine SKUs and have one project management software invoice.

Atlassian added that Goals in Jira will launch in the next few months to visualize tasks and track progress toward goals. Atlassian AI will break work into digestible chunks, and Jira will feature list views, calendar integration and collaboration tools.

CEO transition and earnings

Atlassian's conference kicked off a week after the company reported third quarter earnings and said it would transition to one CEO over the co-CEO model. Co-founder Scott Farquhar will step down as co-CEO effective Aug. 31 and Mike Cannon-Brookes will lead the company as CEO.

Farquhar is leaving to spend more time with his young family and philanthropy while remaining an active board member. 

The company in the third quarter reported revenue of $1.2 billion, up 30% from a year ago with net income of $12.8 million, or 5 cents a share. Atlassian said it now gets most of its revenue from its cloud products and has 300,000 customers on its cloud. Non-GAAP earnings for the third quarter were 89 cents a share.

For the fourth quarter, Atlassian projected revenue between $1.12 billion and $1.13 billion with cloud revenue growth of 32%.

Cannon-Brookes said, "we're incredibly bullish about AI" and the scale across the Atlassian platform is "one of the areas that I always think is underestimated in terms of durable growth and in terms of long-term advantage."

CFO Joe Binz said Atlassian is navigating a mixed demand picture. On the third quarter earnings conference call, Binz said:

"Enterprise was healthy across both cloud and data center and that drove the record billings, strong growth in annual multiyear agreements. Strong migration and good momentum in sales of premium and enterprise additions of our products will roll through our revenue results.

The macro impact on SMB, on the other hand, continued to be challenging, although also in-line with expectations. And that macro headwind in SMB lands primarily in cloud, given SMB makes up a significant part of that business."


Anthropic launches Claude Team plan, iOS app

Anthropic said it will launch a Team plan for Claude, its large language model, at $30 per user a month with a minimum of five seats, along with an iOS app.

With the move, Claude will compete with OpenAI's ChatGPT plans. Microsoft and Google both have apps for Copilot and Gemini, respectively.

Anthropic's Team plan will give teams a workspace and tools for managing users and billing. The Claude iOS app features the Claude 3 model family, syncs chat history and supports photo uploads.

Key items about the Team plan:

  • The Team plan has more usage per user than the Pro plan and access to all of the Claude models including Opus, Sonnet and Haiku.
  • A 200,000 token context window to process long documents and have multi-step conversations (a rough example of feeding a long document into that window follows this list).
  • Admin tools and billing management.
  • All of the features in Claude Pro.
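
The Team plan itself is a claude.ai product, but the 200,000 token context window is a property of the Claude 3 model family. As a rough illustration of what that window enables, here is a minimal sketch using Anthropic's Python SDK; the file path, prompt and model choice are assumptions.

```python
# Rough sketch: passing a long document to a Claude 3 model, which accepts roughly
# 200K tokens of context. File path and prompt are assumptions; requires `pip install anthropic`.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

with open("annual_report.txt") as f:  # hypothetical long document
    document = f.read()

message = client.messages.create(
    model="claude-3-opus-20240229",  # Opus; Sonnet and Haiku expose the same context window
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": f"{document}\n\nSummarize the key risk factors in five bullet points.",
        }
    ],
)

print(message.content[0].text)
```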

AMD Q1 delivers data center, AI sales surge of 80%

AMD reported better-than-expected first quarter earnings largely due to strong data center growth and the ramp of the company's MI300 AI accelerator.

The company reported first quarter earnings of $123 million, or 7 cents a share, on revenue of $5.5 billion. Non-GAAP first quarter earnings were 62 cents a share.

Wall Street was expecting AMD to report first quarter earnings of 61 cents a share on revenue of $5.45 billion.

AMD is critical as a second supplier for AI processors and GPUs as enterprises and cloud providers spend heavily on Nvidia.

Lisa Su, AMD CEO, said the "widespread deployment of AI is driving demand for significantly more compute across a broad range of markets. We are executing very well as we ramp up our data center business and enable AI capabilities across our product portfolio."

As for the outlook, AMD projected second quarter revenue of $5.7 billion, give or take $300 million.

By unit, AMD posted record data center revenue in the first quarter of $2.3 billion, up 80% from a year ago. Growth was driven by AMD Instinct GPUs and 4th Gen AMD EPYC CPUs. The PC unit had first quarter revenue of $1.4 billion, up 85% from a year ago. Gaming revenue was $922 million, down 48% from a year ago. Embedded revenue in the first quarter was $846 million, down 46% from a year ago.

On an earnings conference call, Su said server CPU sales were strong in a seasonally down first quarter due to "growth in enterprise adoption and expanded cloud deployments."

She said there are nearly 900 AMD powered public cloud instances across hyperscalers.

Regarding AI, Su said:

"In the enterprise, we have seen signs of improving demand as CIOs need to add more general purpose and AI compute capacity while maintaining the physical footprint and power needs of their current infrastructure."

She added that MI300 is the fastest ramping product in AMD history and has passed $1 billion in total sales in less than two quarters. "We now expect data center GPU revenue to exceed $4 billion in 2024, up from $3.5 billion we guided in January," said Su. "Longer term, we're increasingly working closer with our Cloud and Enterprise customers as we expand and accelerate our AI hardware and software roadmaps and grow our data center GPU footprint."

She added:

"AI represents an unprecedented opportunity for AMD. While there has been significant growth in AI infrastructure build outs, we're still in the very early stages of what we believe is going to be a period of sustained growth driven by an insatiable demand for both high performance AI and general-purpose compute."

Su also was bullish on prospects for AMD's Ryzen processors and AI PCs with additional market share gains in commercial accounts.


AWS annual revenue run rate hits $100 billion as growth accelerates

Amazon Web Services revenue growth accelerated in the first quarter as the cloud giant reported sales of $25 billion.

Amazon reported overall first quarter net income of $10.4 billion, or 98 cents a share, on revenue of $143.3 billion, up 13%. Wall Street was expecting Amazon to report earnings of 83 cents a share on revenue of $142.56 billion.

With the Amazon and AWS results, it's clear that hyperscale cloud providers are landing AI workloads.

More: Microsoft Azure revenue in Q3 up 31% | Alphabet shows Q1 strength in Google Cloud, initiates dividend

AWS delivered first quarter operating income of $9.4 billion on revenue of $25 billion, up 17% from a year ago. Fourth quarter revenue growth for AWS was 13%. Wedbush was expecting AWS first quarter revenue of $24.6 billion. AWS announced the general availability of Amazon Q earlier in the day.

By the numbers:

  • Amazon's North America commerce unit had first quarter revenue of $86.3 billion with operating income of $5 billion.
  • International commerce sales in the first quarter were $31.9 billion, up 10% from a year ago, with operating income of $900 million.
  • Amazon's first quarter net income includes a $2 billion non-operating loss from the company's investment in Rivian.
  • Amazon advertising revenue in the first quarter was $11.8 billion, up 24% from a year ago.

CEO Andy Jassy said AWS was benefiting from "the combination of companies renewing their infrastructure modernization efforts and the appeal of AWS’s AI capabilities is reaccelerating AWS’s growth rate (now at a $100 billion annual revenue run rate)."

Speaking on an analyst conference call, CEO Jassy talked up cloud demand, Amazon Bedrock and the company's approach to generative AI. He said: 

"Companies have largely completed the lion's share of their cost optimization and have turned their attention to newer initiatives. We see considerable momentum on the AI front where we've accumulated a multi billion dollar revenue run rate already."

Jassy touted Bedrock and said enterprises are increasingly looking at generative AI strategies that revolve around model selection and customization ability. He said Bedrock's recent launch of custom model import was "a sneaky launch as it satisfies a customer request that has not been met yet."

More: Amazon Bedrock gets custom model import, evaluation tools, new Titan models

"The prospect of these two linchpin services in SageMaker and Bedrock working well together is quite appealing the top of the stack for the Gen AI applications being built," said Jassy, who added that Q is off to a good start with enterprises. 

He said capital spending will be up as AWS builds out data centers to meet demand. 

"The more demand AWS has, the more we have to procure new data centers power and hardware. And as a reminder, we spend most of the capital upfront. As you've seen over the last several years, we make that up an operating margin and free cash flow down the road as demand scales out. We don't spend the capital without very clear signals that we can monetize it. We remain very bullish in AWS. We're at $100 billion dollar annualized revenue run right now and 85% or more of the global IT spend remains on premises. And this is before you even calculate genAI.  There's a very large opportunity in front of us."

As for the outlook, Amazon projected second quarter sales of $144 billion to $149 billion, up 7% to 11% from the previous year.



Amazon Q generally available with new pricing plans

Amazon Q is generally available across the Amazon Web Services ecosystem and the generative AI capabilities come with new pricing models.

AWS' Q generative AI assistant was announced at re:Invent and has been tested across multiple use cases before availability. Amazon Q is a layer in the AWS stack that serves as glue across multiple services.

The headliners for Amazon Q are Amazon Q Developer, a coding assistant, Amazon Q Business, designed to make employees more productive, and Amazon Q Apps, which are part of Amazon Q Business and can automate business tasks. Amazon Q Apps are in preview. 

As for pricing, Amazon Q Business has two tiers. Amazon Q Business Lite is $3 per user a month for basic functionality. Amazon Q Business Pro, which includes all features, Amazon QuickSight and Amazon Q Apps, goes for $20 per user a month.

Amazon Q Developer has a free tier and Pro, which is $19 per month per user.

AWS has a few pricing examples. Here's one for Amazon Q Business; a scripted version of the same math follows the breakdown below.

You are an enterprise company with 5,000 employees looking to deploy Amazon Q Business. You decide to purchase Amazon Q Business Lite for 4,500 users and Amazon Q Business Pro for 500 users. You have 1 million enterprise documents across sources like SharePoint, Confluence, and ServiceNow that need indexing with an Enterprise Index. Your monthly charges will be as follows:

  • Enterprise Index for 1M documents will need 50 index units of 20K-document capacity each (assuming the extracted text size of 1M documents is less than 200 MB * 50 units = 10 GB):
  • $0.264 per hour x 50 units x 24 hours x 30 days = $9,504

User subscriptions:

  • 4,500 users * $3 per user/month = $13,500
  • 500 users * $20 per user/month = $10,000
  • Total user subscriptions: $23,500

In summary, your monthly charges are as follows:

  • Enterprise Index: $9,504
  • User subscriptions: $23,500
  • Total per month: $33,004
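
The arithmetic above is easy to script. Here is a minimal Python sketch that reproduces AWS' example figures; the index unit rate and per-user prices come from the example itself, not from a pricing API.

```python
# Reproduces the Amazon Q Business example above; rates are taken from the example, not an API.
HOURS_PER_MONTH = 24 * 30


def q_business_monthly_cost(lite_users: int, pro_users: int, index_units: int) -> dict:
    index_cost = 0.264 * index_units * HOURS_PER_MONTH  # $0.264 per index unit per hour
    subscriptions = lite_users * 3 + pro_users * 20     # $3 Lite, $20 Pro per user per month
    return {
        "enterprise_index": index_cost,
        "user_subscriptions": subscriptions,
        "total": index_cost + subscriptions,
    }


# 1M documents at 20K documents per index unit => 50 index units
print(q_business_monthly_cost(lite_users=4_500, pro_users=500, index_units=50))
# {'enterprise_index': 9504.0, 'user_subscriptions': 23500, 'total': 33004.0}
```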

The return-on-investment case for Amazon Q is straightforward--Amazon Q will provide a productivity boost. Amazon Q Developer also has tools to help customers optimize their AWS environments by analyzing billing trends, consumption and costs by region.

Amazon Q Business is able to connect to more than 40 common business tools to surface insights and serve up dashboards on the fly.



Coursera's outlook highlights how genAI reskilling will be lumpy

The state of reskilling in the generative AI era looks like it's going to be a bit lumpy if Coursera's first quarter results and outlook are any indication. Coursera CEO Jeff Maggioncalda said, "we remain in the early stages of understanding how generative AI will reshape the way we live, learn and work."

Coursera's first quarter results were mixed as earnings beat expectations, but revenue fell short. The second quarter outlook from Coursera calls for revenue between $162 million and $166 million, well below Wall Street estimates of $177.8 million. Coursera said 2024 revenue will be $695 million to $705 million, short of Wall Street's $736.5 million estimate.

The company has three operating units--consumer, enterprise and degrees. Consumer revenue was up 18% in the first quarter compared to a year ago and enterprise and degrees revenue was up 10%. AI courses were driving demand in consumer and building in enterprise and degrees.

More: Coursera's generative AI transformation is about a year old

However, for all the talk of reskilling in the genAI era there are false starts. Citing an Accenture report, Maggioncalda noted that 95% of employees see value in working with generative AI, but only 5% of organizations are actively reskilling their workforce at scale.

For a company like Coursera, the challenges are leveraging genAI for content creation in an accurate way, being able to swap models as needed and reaching learners beyond AI builders. Coursera has built out its AI courses and landed deals with enterprises as well as universities.

Maggioncalda said Coursera partners have built out more than 75 new courses and projects in generative AI since the start of the year. Coursera is also using generative AI to power Coursera Course Builder, which will enable "any business, government or campus customer, to easily and quickly produce high-quality custom private courses at scale."

In addition, Coursera Coach is designed to be an AI-powered tutor. So, what's the problem? Here are a few.

  • Consumer revenue. Ken Hahn, CFO of Coursera, said consumer revenue "was softer than anticipated" in North America. "We underperformed in our North American region, where we are experiencing a lower volume and conversion of paid learners compounded by the delay of the key content launch from one of our educator partners as compared to the timing in our financial plan," said Hahn.
  • Businesses are mixed when it comes to reskilling. Government and campus verticals are showing strong demand for Coursera for Business, but corporate learning budgets are tight. "We continue to see a divergence in performance across our verticals, specifically pressuring Coursera for Business, offset by momentum in our other two verticals, government and campus," said Hahn. "Corporate learning budgets remain under pressure."
  • The degree revolution hasn't arrived just yet. Coursera's degree revenue was $14.8 million in the first quarter, up 10% from a year ago. Total number of students was 22,200, up 23% from a year ago. There are no content costs for degree revenue, so the segment sees gross margins of 100% of revenue. Hahn said:

"We remain focused on the long-term opportunity in degrees. We believe that our platform is uniquely positioned to fundamentally transform the college degree. We need to start validating that potential with renewed and increasing growth. We believe that the path to better degrees growth lies in working with our university partners to create stronger pathways between our consumer segment where we benefit from scale and the growing selection of pathway degree programs."

Maggioncalda said AI content is driving engagement, but there needs to be more courses that cater to a larger base. He said:

"People want new AI content, both for the builders who are building these models. But also, the users, people who need to learn how to use this stuff. We see broad appetite 4x what we saw last year in terms of people taking AI-related content."

Ultimately, the population of AI users is going to be much larger than the builders. Coursera needs to accelerate content launches that add AI throughout existing courses and launch modules that educate people on how their roles will change.

Maggioncalda said:

"Generative AI will have a huge impact on the way people do their jobs.

They're going to need to learn new skills to be, you name it, a PR comms person or a financial analyst or a supply chain manager or a UX designer. We think there's a very broad opportunity to really refresh the content to appeal to strong demand that we're seeing from learners around generative AI."


Snapchat gets boost from lower cloud infrastructure costs

Snapchat made a bet on using machine learning and AI to improve its advertising platform, increase content engagement and ultimately accelerate revenue growth. If it could optimize its infrastructure spending, Snapchat would be able to grow the bottom line.

The first quarter gave an indicator that Snapchat's bets are starting to pay off. What's unclear is whether the company can continue to optimize its cloud spending since the first quarter bottom line was helped along by credits from hyperscale cloud providers.

Google Cloud is Snapchat's primary cloud provider, but Amazon Web Services is in the mix, according to Snapchat's regulatory filings.

CFO Derek Anderson said:

"We benefited from higher-than-average service provider credits in Q1 that helped to further reduce infrastructure costs in Q1. As a result, infrastructure cost per DAU declined from $0.84 in Q4 of 2023 to $0.80 in Q1 of 2024."

Saving 4 cents per daily active user doesn't sound like much until you scale those savings across 422 million daily active users. In Snapchat's results, cloud costs--operating expenses--turn up in cost of revenue. Snapchat executives said on the company's first quarter earnings call that the company is spending about $100 million a quarter on machine learning and AI.
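
A rough back-of-the-envelope calculation, assuming the reported cost-per-DAU figures are quarterly, shows why those 4 cents matter at Snapchat's scale:

```python
# Back-of-the-envelope math: quarterly savings from a 4-cent drop in infrastructure cost per DAU.
# Assumes the reported cost-per-DAU figures are quarterly and DAUs hold steady at 422 million.
dau = 422_000_000
cost_per_dau_q4_2023 = 0.84
cost_per_dau_q1_2024 = 0.80

savings = (cost_per_dau_q4_2023 - cost_per_dau_q1_2024) * dau
print(f"Implied quarterly savings: ${savings / 1e6:.1f}M")  # roughly $16.9M
```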

"CxOs need to consider that cloud infrastructure costs are driven by usage, no matter how efficient an application's usage of the infrastructure is," said Constellation Research analyst Holger Mueller. "So for Snapchat shaving infrastructure cost is a step in the right direction – but not an insight on the efficiency of its coding."

The company reported a net loss of $305.1 million, or 19 cents a share, on revenue of $1.19 billion, up 21% from a year ago. Non-GAAP earnings were 3 cents a share, well above expectations.

Snapchat CEO Evan Spiegel said in a shareholder letter and the earnings call that the company is improving content ranking and personalization with the help of its AI investments.

"We've built larger and more advanced ranking models that are driving improvements in content engagement. In addition, we made significant progress toward unifying the ranking models between Spotlight and Stories to a single backend stack that ranks all content types," said Spiegel. "A single, unified stack will benefit Snapchatters by showing them the most relevant and entertaining content across Snapchat and helping creators find and deepen engagement with their audience."

An improving ad market also helped Snapchat, but its cloud savings flowed faster to the bottom line. Snapchat noted that it has improved its cloud infrastructure unit costs with "engineering efficiency and pricing improvements."

Indeed, Snapchat was an active participant in Google Cloud Next with sessions on moving to a microservices architecture and how it is using BigQuery and other data platforms. Snapchat has been built on Google Cloud since its inception and participated in 10 sessions at Google Cloud Next.

However, Snapchat said quarterly costs per daily active user (DAU) will be in the 83 cents to 85 cents range for the remainder of 2024. It's worth noting that Snapchat's infrastructure cost per DAU was 59 cents in the first quarter of 2023 and 70 cents in the second quarter of 2023 as AI investments ramped, before escalating to current levels of 80 cents to 85 cents per DAU.

Snapchat's infrastructure cost per DAU guidance seems to indicate that the company still has to work on its cloud optimizations and account for its AI investments. It's possible that the first quarter cost of revenue was helped in large part by one-off cloud credits.

Fully optimized--assuming infrastructure costs from 2022 are the benchmark--Snapchat would be looking at infrastructure costs of 59 cents per DAU.

Spiegel said that infrastructure costs are slowing quarter over quarter as the company optimizes its spend even as it boosts revenue growth. 
