Five9 acquires Aceyus, aims to expand analytics, enterprise reach

Five9 said it will acquire Aceyus, which ingests data from multiple customer experience systems and provides insights and analytics. Separately, Five9 reported second quarter earnings and projected $908 million to $910 million in 2023 revenue.

Aceyus provides call center analytics, customer journey data, omni-channel reporting and multiple integrations. Terms of the deal weren't disclosed.

According to Five9, Aceyus will provide its CX platform with contextual data across disparate systems. Aceyus' integration catalog will also boost Five9's data lake, and Five9 said Aceyus will bolster its AI and automation portfolio.

Mike Burkland, Five9 CEO, said Aceyus and Five9 have multiple joint accounts. "The addition of Aceyus will extend our platform to further facilitate the migration of large enterprise customers to the cloud and to leverage contextual data to deliver personalized experiences," said Burkland in a statement.

Separately, Five9 reported second quarter revenue of $222.9 million, up 18% from a year ago. Five9 reported a second quarter net loss of $21.7 million. Non-GAAP earnings for the second quarter were $37.4 million, or 52 cents a share.

In the second quarter, Five9 reported enterprise subscription revenue growth of 28%. About 87% of Five9's revenue is enterprise.

As for the outlook, Five9 projected third quarter revenue of $223.5 million to $224.5 million with non-GAAP earnings of 42 cents a share to 44 cents a share. For 2023, Five9 projected non-GAAP earnings of $1.79 a share to $1.83 a share on revenue of $908 million to $910 million.

Constellation Research's take

Constellation Research analyst Liz Miller handicapped Five9's recent developments. Here's Miller's take:

Five9 Buys Aceyus: The opportunity here is to accelerate the shift from on-prem to cloud, especially when it comes to all that messy, complex and crazy customer data that delivers the robust, personalized and highly contextual experience Five9 looks to provide in an agile, cloud-first way. Not only is the contact center a treasure trove of critical (and often untapped) customer data, it also sits at the front line of CX delivery. By harmonizing the gold found in the true voice of the customer with the stores of customer data that can be brought in from any number of CX outposts, we get a new, even more potent fuel for CX-centric AI applications. But the danger is that this runs the risk of creating another silo of customer data that behaves like a single-use appliance for a single department or function. With Aceyus, the vision is to ingest and harmonize complex data, which many will recognize as the siren's song of the enterprise customer data platform (CDP). So the question is: will this vision assume that all of the teams at the front lines of real CX (e.g., sales, service AND marketing) will each have their own CDP? Will this Five9-Aceyus offering sit next to, above or below an Amperity, Salesforce Data Cloud (formerly Genie), Adobe Real-Time CDP or Segment implementation?

Five9 Earnings: These results are a testament to a march up-market that started almost two years ago. The item to focus on here is the partner-driven growth behind this continued acceleration. Over 60% of Five9's international implementations are being done by partners, and there is phenomenal demand from partners to continue advancing, especially with the new innovations around AI. One note from the earnings indicates that 15 partners achieved over $1 million in bookings for the quarter. Five9 hasn't always been touted as the "best" partner in the contact center market, but that has changed, and the company is actively investing in making sure that partners are happy and that channel growth and bookings are healthy.


Zoom's customer data terms for training AI may be just the beginning

Zoom updated its terms of service to give the platform the right to use some customer data to train its AI models. Customers who enable Zoom's generative AI features must sign a consent form allowing their content to be used for model training.

However, Zoom said that it will not use audio, video or chat content for training models without consent.

These new terms are outlined in a blog post, and other tech providers are likely to pursue similar efforts in the future. The trade-off will be sharing your data vs. generative AI pricing.

In the post, Zoom noted the following:

  • Service generated data--telemetry and diagnostic data--is Zoom's data and can be used to train models to improve the experience.
  • Meeting recordings are owned by the customer; Zoom has a license to use them to deliver the service.
  • Zoom's new generative AI features--Zoom IQ Meeting Summary and Zoom IQ Team Chat Compose--are currently in a free trial. Zoom account owners control whether to enable those features, but if they do, user content will be used to train AI models. No third-party model training will be allowed.

Constellation Research analyst Dion Hinchcliffe said:

"Zoom certainly touched upon a major nerve of marketplace fears when its recently updated Terms of Service granted it an essentially unlimited license to all user content (video, audio, text) that passes through its platform. The license is for a litany of uses, ranging from product improvement to training its generative artificial intelligence models. The big concern of course, is that customer IP and people's private information will get stored in such models, where it could be misused. There are also all sorts of very thorny issues with regulatory regimes like HIPAA that are implicated and likely violated by these license clauses as well.

For its part, Zoom has tried to clarify what it actually does with its license, both in a blog post and in boldface in the new terms of service. Essentially, the company says it does not use this license without prior consent from the user within the app, yet the terms of service still grant Zoom the license regardless. Given the major uproar this change in terms has caused, this is going to be a widely watched test case -- and there will no doubt be others -- that will pave the early path for how vendors and the market negotiate this very sensitive subject. The view from this analyst, at least, is that vendors should go out of their way to take the high road with customer data. Those that don't establish and maintain very high levels of trust with customers regarding their data will not enjoy the fruits of the coming AI revolution."

A few quick thoughts:

  • Zoom's terms are likely to cause a kerfuffle today, but over time they'll be standard.
  • AI model training and data sharing may lead to discounts as both vendors and customers weigh cost vs. the value of data.
  • While collaboration is the theme with Zoom's terms, the data vs. licensing cost trade-off will become more interesting with mission-critical data from CRM and ERP systems.
  • Zoom’s terms will be more of an enterprise issue. Small business customers may not care. On an individual basis we’re all used to being the product (Facebook is exhibit A).
  • It's likely vendors are going to wait and see how customers react to Zoom's terms before doing anything similar.

 


Education gets schooled in generative AI

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly.

Generative AI, which so far appears to be loved by students and loathed by educators, is coming to education as it's embedded in courseware as well as learning management systems.

Recent earnings reports from Pearson, Coursera and Instructure, the company behind the dominant Canvas learning management system, all featured a heavy dose of generative AI and product talk. At a high level, the takeaways are:

  • Courseware will get generative AI and large language models (LLMs) that highlight proprietary IP and datasets from education companies.
  • Learning management systems will embed generative AI and create alliances with companies that are disruptors.
  • Education will leverage generative AI for personalized learning experiences and create content at scale.

Here's a tour of what's happening in the education technology stack.

Pearson: Sees AI value in proprietary data

Pearson is best known for its courseware and typically known as an education publisher. CEO Andy Bird said generative AI will create turbulence in education, but the technology is likely to be a "long term positive" for Pearson's business in higher education and across the portfolio.

"We believe the value of our proprietary IP and datasets will increase over time. We have deep AI experience and expertise across the whole company. We're starting to introduce new AI enabled products across the business," said Bird, speaking on the company’s earnings conference call. "What this interest does demonstrate is the real value to be had of owning your own intellectual property. We're also continuing to monitor legal and legislative developments very closely."

Bird, who noted Pearson is early in its AI journey, said there are positive implications of embedding generative AI into its higher-ed courseware. "We've been working with one goal in mind, namely, how to improve the learning experience for both faculty and student," said Bird. "We're not interested in utilizing this technology merely to provide students with a shortcut to an answer. When we tested different LLMs with a question from Campbell's Biology, they often didn't get it correct. So, we believe delivering products that are reliable, accurate and trustworthy is paramount."

Tony Prentice, Pearson's chief product officer, said the company is embedding generative AI into its Pearson+ and MyLab & Mastering study tools. The takeaway: Pearson can leverage its content library and surface insights and educational opportunities with generative AI.

Instructure: Working AI into Canvas

Instructure, the company behind learning management system Canvas, said it will partner and embed generative AI throughout its platform. CEO Steve Daly said the goal is to leverage generative AI "to empower educators to meet students where they are in their educational journey."

Canvas has launched an emerging AI marketplace that will give educators access to AI tools that are integrated into Canvas and ensure privacy and security, said Daly.

At its InstructureCon 2023 conference, Instructure outlined the following advances in its platform:

  • A partnership with Khan Academy, which will bring Khan Academy's AI tools and content to Canvas. The two companies are looking for design partners and early adopters for the 2024-2025 school year.
  • AI-assisted course templates and layouts to improve efficiency, reduce administrative tasks and pace assignments.
  • In-context student support with the integration of AI writing tutor Khanmigo, part of Khan Academy.
  • AI tools to surface analytics and insights.

Coursera: Multiple AI plays

Coursera is worth watching in the education stack for a few reasons. First, it's a higher-ed play. But it also has a big enterprise training business as well as a consumer unit. Coursera is aiming to not only use generative AI for content, but also to reskill employees and democratize knowledge about AI.

In the first quarter, Coursera outlined where AI fits in the reskilling space. Now Coursera has Coursera Coach, "a virtual learning partner, powered by generative AI and grounded in our expert content."

Speaking about Coursera Coach, CEO Jeff Maggioncalda said:

"It is designed to allow learners to ask questions and receive personalized explanations and answers, get personalized evaluations and feedback on their submissions, receive context-relevant examples and practice questions, and discover quick video lecture summaries and resources to better understand a specific concept. We launched a beta version of Coach to millions of Coursera Plus subscribers during the quarter and continue to be excited about the early feedback."

The company also launched Coursera ChatGPT, a plug-in that surfaces and personalizes Coursera's catalog.

Maggioncalda said Coursera ChatGPT is "like an academic counselor." "The ChatGPT plugin allows learners using GPT-4 to identify recommended content and credentials based on the subject or career field the learner says they’re interested in exploring. It’s one example of the initiatives we are working on related to generative AI and reimagining the personalized discovery experience," said Maggioncalda.

Coursera also has a machine learning translation effort to localize content at scale. In the second quarter, Coursera delivered subtitle translation for 2,000 courses in seven different languages.


Why Llama 2, open-source models change the LLM game

Meta recently open-sourced Llama 2 and made it free for research and commercial uses. The move quickly put Llama 2 on the open-source leaderboard for large language models (LLMs) and spurred enterprises to give it a spin.

While the move is notable, there are a bevy of nuances for enterprises to consider. Here's a look at the moving parts as enterprises put Llama 2 through its paces and craft generative AI use cases. This research note is based in part on a transcription of a CRTV discussion between Constellation Research's Larry Dignan and Andy Thurai.

Why Llama 2 matters. Thurai said OpenAI captured the imagination of the enterprise, but there's a need for open-source large language models (LLMs). "Everybody is going crazy about OpenAI, but what people don't realize is that after your proof of concept there is a usage model that adds up," said Thurai. "It can get quite expensive, so companies are looking for alternatives with open-source models."

Thurai said:

"Meta's offering is the first that is fully open-sourced and free to use commercially, truly democratizing AI foundational models. It is easier to retrain and fine-tune these models at a much cheaper cost than massive LLMs. Meta also released the code and the training data set freely. And wider availability can make this popular sooner. It is available on Azure (through Azure AI model catalog), on Hugging Face, AWS (via Amazon Sagemaker Jumpstart), and even Alibaba Cloud."

Llama 2's sizes. Llama 2 is also interesting for enterprises because it can be used for small language models or specialized models, said Thurai. Llama 2 also offers more parameters and sizes. "There are three primary variations: 7 billion parameters, 13 billion and 70 billion," explained Thurai. "These are comparatively much smaller models than ChatGPT, but more accurate." Those sizes quickly put Llama 2 on the Hugging Face leaderboards.

 

Is Llama 2 open source? Thurai said there has been a good amount of debate about whether Meta's language model is open source. Usually, open-source software is available for anyone to use without restrictions. Llama 2 has conditions about commercial use, said Thurai. "The average enterprise isn't likely to hit that commercial use number so it's not much of a restriction. Meta put restrictions in because it doesn't want other companies to use Llama 2 against the company in a competitive situation," said Thurai.

Will enterprises use Llama 2? Thurai said enterprises will try Llama 2 for pilots and proof of concept projects, but beyond that point usage is debatable. "It's tough to say what will happen, you have to read the rules carefully to ensure Meta doesn't come after you for licensing infringement," said Thurai.

Using alternatives. Thurai said Llama 2 is worth exploring, but the Falcon LLM is popular, as is MosaicML, which now falls under Databricks' umbrella. Open-source models should be in the enterprise mix, but it's worth knowing the vendors' business models. "The money is in helping you train your own models," said Thurai. For now, enterprises should try alternatives with an eye toward costs. After all, most companies won't have the resources to grab open-source models and train them with proprietary data. Managed model training will also be important.

What's next? Thurai said enterprises are exploring multiple LLM options and it's too early to tell where they'll land. Some enterprises will lean toward proprietary models with industry-specific use cases. Others will fine-tune open-source models. "There will be a lot of variations," said Thurai.

Thurai also said there will be a divergence between proprietary and open-source LLMs. "I expect to see more divergence, with closed-garden models like GPT-4 and Bard growing alongside open models like Falcon and Llama 2," he said.

Thurai added:

"I expect more LLMs to come to market. Some will try to go big, and some will try to go small. But in order to differentiate, I expect domain-specific small language models to appear that will be very specific to industry verticals.

I also expect a lot of smaller companies to provide the missing pieces to make usage of LLMs much better. I also expect more open-source models to hit the market soon."


Salesforce launches Einstein Studio: What you need to know

Salesforce launched Einstein Studio, which allows customers to bring their own large language models (LLMs) from services such as AWS' Amazon SageMaker and Google Cloud Vertex AI via Salesforce Data Cloud.

With Einstein Studio, enterprises can bring their own LLMs to Salesforce Data Cloud without moving data. Einstein Studio, which is generally available, can use custom LLMs alongside Einstein GPT's LLMs and removes the need for extract, transform and load (ETL).

The theme of model choice has become a key topic among enterprise buyers. AWS has also hit the bring-your-own-model theme in recent days.

Salesforce Einstein Studio has a control panel to manage AI models, zero-ETL integration, connections to Salesforce data and no-code tools.

Here's what Einstein Studio means for customers, according to Constellation Research analysts Liz Miller and Andy Thurai.

Bring your own model approaches. Miller said:

"Einstein Studio is intended to be the point of model centralization and governance for a Salesforce customer. The goal is to be a center point and NOT the whole or only model. Larger enterprise customers were already developing models in everything from Sagemaker to Vertex. This move is in line with Salesforce's more partnership-forward and ecosystem friendly approach. Similar to how Data Cloud invites those lakehouse connections across Snowflake and Databricks, BigQuery and beyond, Einstein Studio is issuing an invitation for all AI models to find a spot in the Salesforce ecosystem."

What's in it for Salesforce? Thurai said:

"This move by Salesforce is rather brilliant. Providing your own set of tools to create AI models is not easy; you have to constantly update the platform almost daily. For true data scientists who are used to other tools, Salesforce provides options such as AWS SageMaker and Google Vertex AI, as well as other AI model training platforms, to train their own models or fine-tune existing ones with custom data. The familiarity of the platform combined with the power of custom data makes this unique."

Model sprawl. Miller said:

"Most businesses are drowning in models that are under development and even more that have never made it out of development. Einstein Studio is Salesforce attempting to unlock AI investments faster by providing a fast, cost effective and readily available on-ramp for AI into business use cases."

Zero-ETL. "Zero-ETL means there will be no data movement. Data movement has been a major issue when it comes to model training," said Thurai. "Data needs to be moved around, managed, maintained, cataloged, secured and governed. Keeping the data where it is and creating the models where it is convenient is somewhat unique. Many vendors want you to create the model in their platform so you can be more tethered to them."

What's in it for customers? Miller said the bring-your-own-model approach from Salesforce means it's a vendor that's committed to moving with the market. Salesforce is moving to "deliver usable AI tools and solutions that bring AI models out of the development cycles and into view for many average business users."

"Right now, business users in sales, marketing and service are being promised a lot in this age of AI. But there is very limited runway to prove that these tool investments will yield real, profitable, meaningful CX results," said Miller. "Salesforce is building customers' glide paths to AI success."


AWS closes generative AI narrative gap, plays on choice, Trainium, Inferentia chips

Amazon CEO Andy Jassy fleshed out Amazon Web Services' generative AI narrative, providing a multi-layered view that also includes a good dose of its own silicon.

When it comes to generative AI, AWS has been caught between Microsoft Azure and Google Cloud Platform, two companies that have app-layer stories to tell. What's an app-layer story? Think co-pilots galore. That co-pilot-led narrative is dominated by tech vendors that play at the application layer--Microsoft, Google and Salesforce, to name a few.

Jassy's argument on Amazon's earnings conference call is that generative AI also means a lot of large language model (LLM) training. That training is why Nvidia is the prom queen at the generative AI dance.

But Jassy outlined two key points. Nvidia GPUs won't be the only option for model training. Yes, there will be AMD, but there will also be AWS' custom silicon. To AWS, generative AI comprises three layers, and so far the application layer has received all the buzz. AWS is going to play in the lower layers--compute and models as a service.

He said (emphasis mine):

"At the lowest layer is the compute required to train foundational models and do inference or make predictions. Customers are excited by Amazon EC2 P5 instances powered by NVIDIA H100 GPUs to train large models and develop generative AI applications. However, to date, there's only been one viable option in the market for everybody and supply has been scarce.

That, along with the chip expertise we've built over the last several years, prompted us to start working several years ago on our own custom AI chips for training called Trainium and inference called Inferentia that are on their second versions already and are a very appealing price performance option for customers building and running large language models. We're optimistic that a lot of large language model training and inference will be run on AWS' Trainium and Inferentia chips in the future."

Training models isn't cheap and those with the infrastructure are going to fare well. Not everyone needs a luxury training processor.

The middle layer of the generative AI game will be LLMs as a service. Managed services are the AWS specialty in the cloud. Jassy said:

"We think of the middle layer as being large language models as a service. Stepping back for a second, to develop these large language models, it takes billions of dollars and multiple years to develop. Most companies tell us that they don't want to consume that resource building themselves. Rather, they want access to those large language models, want to customize them with their own data without leaking their proprietary data into the general model, have all the security, privacy and platform features in AWS work with this new enhanced model and then have it all wrapped in a managed service."

Jassy's comments line up with what we've heard repeatedly from CXOs. Some enterprises are looking at private cloud options for training. Some are thinking about going on-premises for training. Others want model choice, including smaller LLMs that are use-case specific. Choice isn't a bad thing, and it's highly likely that not every enterprise is going to pay OpenAI's tolls.

This LLM choice mantra was seen last week when AWS outlined Bedrock at AWS Summit New York.

These first two layers are where AWS will play, said Jassy. "If you think about these first 2 layers I've talked about, what we're doing is democratizing access to generative AI, lowering the cost of training and running models, enabling access to large language model of choice instead of there only being one option," said Jassy.

What about the apps? Jassy said AWS is an enabler with services like CodeWhisperer. "Inside Amazon, every one of our teams is working on building generative AI applications that reinvent and enhance their customers' experience. But while we will build a number of these applications ourselves, most will be built by other companies, and we're optimistic that the largest number of these will be built on AWS," he said.

For good measure, Jassy reiterated what many vendors and customers have been saying. Without a good data strategy, you don't have AI, generative or otherwise. And by the way, AWS is embedded in a bunch of data management plays.

See: JPMorgan Chase: Digital transformation, AI and data strategy sets up generative AI | Goldman Sachs CIO Marco Argenti on AI, data, mental models for disruption 

Add it up and AWS laid out its generative AI case and got back into the perception and mindshare game. Yes, AWS growth rates are stable (12% in the second quarter) and that's slower than its rivals. But there's also a much bigger base. AWS is also seeing a lot of cost optimization from customers. In the end, it's likely those generative AI workloads are going to boost AWS, which is "starting to see some good traction with our customers' new volumes."

Naturally, analysts want to know how AWS is going to monetize generative AI. The short answer is AWS is going to see more volume. What's unclear is whether there will be some add-on model. My guess is probably not. For starters, the upcharge for generative AI is an approach for SaaS companies that is going to get tired quickly.

Jassy, however, said it's way too early. "I think we're in the very early stages there. We're a few steps into a marathon in my opinion. I think it's going to be transformative, and I think it's going to transform virtually every customer experience that we know. But I think it's really early," said Jassy. "I think most companies are still figuring out how they want to approach it. They're figuring out how to train models. They want to -- they don't want to build their own very large language models. They want to take other models and customize them, and services like Bedrock enable them to do so. But it's very early. And so, I expect that will be very large, but it will be in the future."

Takeaways

Here's what AWS accomplished on Amazon's earnings conference call:

  1. AWS provided a more nuanced view of generative AI that plays to its core strengths--developers and enterprise builders. By not going co-pilot happy, AWS' narrative was almost refreshing.
  2. Outlined how important custom silicon will be. Nvidia infrastructure isn't cheap and CXOs will be looking for whatever processors get the training job done.
  3. Put Jassy back into a familiar role: AWS lead singer.
  4. Gave enterprise buyers a sermon more in line with their current thinking on generative AI.
  5. Allayed concerns from Wall Street. I noted the narrative gap when it came to the cloud and generative AI players and how AWS got lost. Analysts will be on the bandwagon again based on Amazon's second quarter results. Why does Wall Street matter? CXOs watch a lot of CNBC and Bloomberg too.


Apple Q3 services revenue shines

Apple's fiscal third quarter revenue was down 1% from a year ago, iPhone and iPad sales were light relative to expectations and services revenue surged.

For the third quarter, Apple reported earnings per share of $1.26 with revenue of $81.8 billion.

Wall Street was expecting Apple to report fiscal third quarter revenue of $81.7 billion with earnings of $1.19 a share. As usual, iPhone revenue was expected to carry the quarter.

Going into the quarter, analysts were also interested in Apple's traction in India as well as its ability to sell services to the iPhone installed base. Apple iPhone 15 sales won't land until November/December.

In a statement, CEO Tim Cook said third quarter services revenue hit a record in the June quarter with more than 1 billion paid subscriptions.

By the numbers:

  • Apple's China revenue was $15.76 billion, up from $14.6 billion a year ago.
  • iPhone sales were $39.7 billion in the third quarter, down from $40.67 billion a year ago.
  • Mac sales were $6.84 billion, down from $7.38 billion a year ago.
  • iPad sales in the third quarter were $5.79 billion, down from $7.22 billion a year ago.
  • Wearables revenue was $8.38 billion, up from $8.08 billion a year ago.
  • Services revenue in the third quarter landed at $21.2 billion, up from $19.6 billion a year ago.

Estimates for iPhone sales were $39.9 billion, with iPad and Mac delivering $6.4 billion and $6.6 billion, respectively. Services revenue was expected to come in at $20.8 billion. All estimates were from Refinitiv.

Speaking on a conference call with analysts, Cook said Apple saw record sales in India as well as other countries. India is a closely watched growth region for Apple. Cook said the macroeconomic picture remains mixed and that Apple continues to "manage deliberately and innovate relentlessly."

Key comments include:

  • Cook said he was "thrilled" by the response to Vision Pro.
  • Mac sales were down 7% in the quarter, but Apple completed the transition to its own processors.
  • iPad revenue was down 20% in the quarter due to a difficult comparison to a year ago when the iPad Air launched. Back to school season can be a tailwind to iPad.
  • Apple Watch sales were in line with the company's expectations.
  • Services saw a sequential acceleration in the quarter with Apple Care, Apple TV and Apple Pay showing strength. Cook also said Apple Card is riding Apple Pay momentum.

Amazon Web Services posts Q2 revenue growth of 12%, teases AI launches

Amazon Web Services sales in the second quarter were up 12% from a year ago to $22.1 billion with operating income of $5.4 billion.

Overall, Amazon reported second quarter net income of $6.7 billion, or 65 cents a share, on revenue of $134.4 billion. The earnings include a $200 million gain from the valuation of Rivian.

Wall Street was expecting Amazon to report second quarter non-GAAP earnings of 35 cents a share on revenue of $131.5 billion. AWS revenue was expected to be about $21.8 billion.

Leading up to Amazon's report, analysts were focused on AWS growth and cost cutting. Google Cloud grew at a 28% clip in the second quarter and Microsoft Azure grew 26% in its latest quarter, albeit from smaller bases. Microsoft doesn't break out Azure revenue.

By segment:

  • North American e-commerce sales were up 11% in the second quarter to $82.5 billion with operating income of $3.2 billion.

  • International e-commerce sales were $29.7 billion, up 10% from a year ago, with an operating loss of $900 million.

  • AWS had operating income of $5.4 billion on revenue of $22.1 billion.

In a statement, CEO Andy Jassy said the company has continued to lower its cost on the retail side and said the following about AI:

"Our AWS growth stabilized as customers started shifting from cost optimization to new workload deployment, and AWS has continued to add to its meaningful leadership position in the cloud with a slew of generative AI releases that make it much easier and more cost-effective for companies to train and run models (Trainium and Inferentia chips), customize Large Language Models to build generative AI applications and agents (Bedrock), and write code much more efficiently with CodeWhisperer."

As for the outlook, Amazon projected third quarter sales between $138 billion and $143 billion, or up 9% to 13% from a year ago. Operating income is expected to be between $5.5 billion and $8.5 billion. 


Meta's Llama 2 and what that means for GenerativeAI

What's happening with Meta's latest Llama 2 #LLM and its new partnership with Microsoft? What does this mean for the future of #GenerativeAI?

Constellation analyst Andy ThurAI sits down with Constellation Editor in Chief Larry Dignan to discuss trends and pitfalls of large language models, and how #enterprises will use #LLMs and #AI technology in the future...

Watch on ConstellationTV Insights: https://www.youtube.com/embed/5a6z9KfneGQ

SAP user group DSAG rips S/4HANA innovation plans, maintenance increases

User groups are not happy about SAP's plan to offer its latest innovations only to cloud customers.

During SAP's second quarter earnings call, SAP CEO Christian Klein said:

"SAP's newest innovations and capabilities will only be delivered in SAP public cloud and SAP private cloud using RISE with SAP as the enabler. This is how we will deliver these innovations with speed, agility, quality and efficiency. Our new innovations will not be available for on-premises or hosted on-premises ERP customers on hyperscalers."

The upshot is that SAP wants its customers to migrate to S/4HANA in the cloud.

In a release, DSAG, the German-speaking SAP user group, blasted the plan to make innovations such as AI and the green ledger available only to customers using SAP S/4HANA Cloud, Public Edition or SAP S/4HANA Cloud, Private Edition via GROW with SAP or RISE with SAP contracts.

DSAG said:

"On-premises customers cannot benefit from major innovations such as artificial intelligence (AI) and green ledger. This also applies to larger function modules and extensions based on the Business Technology Platform (BTP). At the same time, SAP plans to increase maintenance fees. From the point of view of DSAG, SAP is leaving numerous loyal customer companies in the lurch with this approach."

According to DSAG, SAP's cloud-only plans will hit on-premises customers, customers running SAP on hyperscalers and managed services providers hard. DSAG also blasted SAP's 30% price increase for new innovations.

Thomas Henzler, DSAG Board Member for Licenses, Service & Support, called the price increase and SAP operating model a "real showstopper and a big disappointment." DSAG advised companies to reevaluate planned S/4HANA implementations.

DSAG's key gripes include:

  • SAP's focus on cloud ERP customers creates a two-tier system.
  • On-premises customers have invested in S/4HANA and will lack sustainability and AI innovations.
  • SAP didn't justify the move beyond it being a good business decision for the company.
  • Sustainability innovations saw price increases before the products were ready.
  • All innovations for S/4HANA clouds should be available to on-premises customers.
  • The value of maintenance fees from SAP is unclear.

Constellation Research's take

Constellation Research analyst Holger Mueller said:

"Price increases for maintenance and support are in the basic enterprise software strategy book of vendors – especially when they want and need to move customers to new offerings. But customers know this and are sensitive to any pressure tactics like this – so SAP needs to be very careful. Otherwise customers may be motivated to start talking to SAP competitors – something SAP does not want and frankly cannot afford. SAP needs to convert the clear majority of its install base to S/4HANA to remain the SAP as we know it: the leading ERP vendor on this planet."
