Results

The generative AI buildout, overcapacity and what history tells us

Spending on generative AI infrastructure is accelerating at a breakneck pace, but the "build it and they will come" approach may lead to overcapacity or some serious indigestion.

That is the argument from MIT's Daron Acemoglu and Goldman Sachs Research's Jim Covello. In a recent Goldman Sachs podcast, the two were skeptical about whether the $1 trillion expected to be spent on AI capex will pay off, and several incremental data points add to that skepticism.

Overcapacity ahead?

Covello noted that the internet buildout was huge and eventually enabled companies and services, like Uber, that arrived much later. The issue is that the buildout didn't pay off big for about 30 years. "It ends badly when you build things the world is not ready for," said Covello. "When you wind up with a whole bunch of capacity because you build something that isn't going to get utilized, it takes a while to grow into that supply."

He said: "One of the biggest lessons I've learned over 25 years here is bubbles take a long time to burst. So, the build of this could go on a long time before we see any kind of manifestation of the problem. I'm very respectful of how long they can go on."

Clearly, the AI buildout is underway. Alphabet, Amazon Web Services, Oracle and Microsoft are all spending billions to build genAI capacity. Companies like OpenAI and Anthropic are also spending. Most of the genAI infrastructure profits are going to Nvidia for now.

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

This buildout could lead to indigestion should genAI fail to deliver the returns. "Very few companies are actually saving any money at all doing this," said Covello. "How long do we have to go before people start to really question?"

To hear Covello tell it, we're clearly in the FOMO stage with genAI. It's likely that genAI won't fizzle like the metaverse, but you never know.

For now, genAI infrastructure spending isn't going to ease. Alphabet CEO Sundar Pichai crystallized the FOMO-fueled AI infrastructure boom when he answered a question about the company's genAI capital spending. Speaking on Alphabet’s second quarter earnings call, Pichai said:

"We are at an early stage of what I view as a very transformative era. When we go through a curve like this, the risk of under-investing is dramatically greater than the risk of over-investing. Even in scenarios where it turns out that we are over-investing, the infrastructure is widely useful for us. I think not investing to be at the frontier definitely has much more significant downside. Having said that, we obsess around every dollar we put in. Our teams work super hard and I'm proud of the efficiency work, be it optimization of hardware, software, model deployment across our fleet."

Covello said: "AI is pie in the sky big picture. If you build it, they will come, just you got to trust this because technology always evolves and we're a couple of years into this. And there's not a single thing that this has been used for that's cost effective at this point. I think there's an unbelievable misunderstanding of what the technology can do today."

The fundamental issue with genAI is that the buildout starts from a different point. E-commerce started out cheaper. Mobile turned out to be cheaper. The internet made everything cheaper. "With AI, you're starting from a very high-cost base. I think there's a lot of revisionist history about how things start expensive and get cheaper. Nobody started with a trillion dollars," said Covello.

GenAI can get cheaper, but first Nvidia needs real competition. "Why is AI so expensive? It's really the GPU costs," said Covello. "I think the big determination in whether AI costs ever become affordable is whether there are other players that can come in and provide chips alongside Nvidia."

The economics

Acemoglu recently penned a research paper that questioned the economic benefit from genAI. Acemoglu said genAI economic prognostications depend on how quickly the technology can be integrated into an organization. Some genAI projects will boost productivity. Other companies will find that genAI was a waste of time and money.

The big issue is time horizon. Acemoglu said most genAI economic prognostications, which range from a 1.5% to 3.4% boost to average annual GDP over the next decade, carry too many uncertainties. His paper concludes that AI-driven productivity improvements at the task level will increase total factor productivity by 0.71% over 10 years. That amount is nontrivial, but too modest to justify today's genAI building boom. You can find plenty of folks who disagree with that conclusion.

Acemoglu said: "I think economic theory actually puts a lot of discipline on how some of these effects can work once we leave out those things like amazing new products coming online, something much better than silicon coming, for example, in five years. That's good. That happens. All right, that's big. But once you leave those out, the way that you're going to get productivity effects is you look at what fraction of the things that we do in the production process are impacted and how that impact is going to change our productivity or reduce our costs."
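The task-level arithmetic Acemoglu describes can be sketched as a back-of-the-envelope calculation. The input values below are illustrative assumptions, not figures from his paper; they are chosen only so the result lands near the roughly 0.7%-over-a-decade estimate cited above.

```python
# Back-of-the-envelope sketch of the task-based framing:
# aggregate TFP gain over a decade is roughly the product of
#   (share of tasks exposed to AI) x
#   (share of exposed tasks that are cost-effective to automate) x
#   (average cost savings on the tasks that are automated).
# All three inputs below are assumptions for illustration only.

share_of_tasks_exposed = 0.20   # assumed: fraction of work tasks AI could touch
share_cost_effective = 0.23     # assumed: fraction of exposed tasks worth automating
avg_cost_savings = 0.15         # assumed: average savings on automated tasks

tfp_gain_10yr = share_of_tasks_exposed * share_cost_effective * avg_cost_savings
annualized = (1 + tfp_gain_10yr) ** (1 / 10) - 1

print(f"10-year TFP gain: {tfp_gain_10yr:.2%}")  # ~0.69% with these inputs
print(f"Annualized:       {annualized:.3%}")
```

The point of the exercise is that the headline number is a product of small fractions: even generous assumptions about any one factor get multiplied down by the others, which is why the task-based estimate comes in well below the 1.5% to 3.4% annual GDP forecasts.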

Acemoglu's view is that genAI isn't going to be big enough to replace humans in the short run. Transport, manufacturing, utilities and other industries have people interacting with the real world. GenAI can offload some of that work, but not much in the next few years. Pure mental tasks can be affected, and the amount of work replaced there isn't trivial, but it's not huge, he said.

In 10 years, genAI may be cost effective and move GDP, but a lot has to happen first. Acemoglu said: "I am less convinced that we're going to get there very quickly by just throwing more GPU capacity at it. Any estimate of what can be achieved within any time horizon is going to be uncertain. There's a view among some people in the industry that you double the amount of data, you double the amount of compute capacity, same number of GPU units or their processing power, and you're going to double the capacities of the AI model."

There are a few issues with scaling genAI and driving this alleged productivity boom. For starters, it's unclear what doubling capabilities of genAI will do in economic terms. Meanwhile, more data doesn't matter if it doesn't improve predictions and help you solve problems. It's also not clear whether data or compute will be cheap enough to really move productivity.

Acemoglu said: "I think there is the possibility that there could be very severe limits to where we can go with the current architecture. Human cognition doesn't just rely on a single mode. It involves many different types of cognitive processes, different types of sensory inputs, different types of reasoning. The current architecture of large language models has proven to be more impressive than many people would have predicted, but I think it still takes a big leap of faith. There are all the sorts of uncertainties."

In the end, genAI's march to superintelligence and a productivity boom depends on time horizon, Acemoglu said.

A Sequoia report makes you go hmm

In June, venture capital firm Sequoia chimed in on the AI buildout. Sequoia partner David Cahn noted the big gap between the revenue expectations implied by the AI infrastructure buildout and actual revenue growth, which Cahn used as a proxy for end-user value.

Cahn first raised the issue in September 2023 but has since updated his argument as conditions changed. He noted that the supply shortage for GPUs has been "almost entirely eliminated."

In addition, GPU stockpiles at large cloud providers are growing. If these stockpiles grow enough, demand will decrease--and so will Nvidia's valuation. Cahn's other concern is that most of the direct AI revenue is going to OpenAI; Microsoft, Google, Apple and Meta need a big leap in direct AI revenue. Here's a look at Sequoia's payback figures.

Cahn said: "Speculative frenzies are part of technology, and so they are not something to be afraid of. Those who remain level-headed through this moment have the chance to build extremely important companies. But we need to make sure not to believe in the delusion that has now spread from Silicon Valley to the rest of the country, and indeed the world. That delusion says that we’re all going to get rich quick, because AGI is coming tomorrow, and we all need to stockpile the only valuable resource, which is GPUs."

My take

First, this topic has been on my mind for a few months. The data center buildout, the machinations of bitcoin miners suddenly pivoting to AI with a new narrative and the lack of trickle-down economics beyond Nvidia have given me pause. After all, I've seen this movie before with the dotcom boom and bust, the 2008 financial crisis, the COVID-19 pandemic component panic and the current housing bubble.

All of these bubble cycles include a gold rush that leads to too much capacity. This capacity usually pans out in the long run, but squeezes investors that aren't first movers.

Nvidia is priced for perfection and it's only logical that there will be some indigestion ahead as the genAI supply chain starts optimizing for costs. There's a rush to buy GPUs now, but you can only build data centers so fast. Nvidia GPUs have become their own asset class, but competition looms.

The big genAI spenders are touting massive capex budgets that may not be tolerated by Wall Street forever. This spending is fine until sales growth and/or productivity gains slow. Spending on genAI will always work--until it doesn't.

And perhaps the biggest reason to play contrarian about the genAI boom: I've heard cashiers, grandparents and randoms on the street all talking about Nvidia and genAI--and I don't live in Silicon Valley.

In the end, it's worth paying attention to these genAI contrarians but keep in mind that bubbles last longer than you think. Also keep in mind no one rings a bell at the top, but there are signs worth watching.

Insights Archive

7 takeaways on emerging trends from ARX 2024

Constellation Research analysts outlined emerging trends for the second half of 2024 at ARX 2024. Here's a look at the themes that were surfaced.

Constellation Research CEO Ray Wang outlined the following macroeconomic backdrop influencing enterprise technology buying decisions:

  • Business decisions are on hold due to the election cycle, interest rates and the need to push for exponential efficiency.
  • Companies are wrestling with AI arbitrage and when it makes sense to insert humans into automated processes.
  • Vendors will need to enable either 10x growth improvement for enterprises or be able to deliver at 1/10th of the cost. "You'll have to be either better or cheaper," said Wang. As a result, there will be some interesting duels between vendors trying to preserve margins and those enabling margin compression and value.

Themes to watch across the Constellation Research coverage areas.

Growth operations are becoming critical to AI strategies. Constellation Research analyst Martin Schneider said sales automation, revenue operations and customer success teams are converging into a growth ops organization. AI is breaking down silos between those teams, and enterprises need to forge a holistic strategy for growth operations. "Chief growth officers will emerge to drive holistic strategy," said Schneider, adding that chief growth officers will take a more holistic view than chief revenue officers and sales leaders.

Cybersecurity will be about resilience and response instead of prevention, said Constellation Research analyst Chirag Mehta. The cybersecurity stories and best practices that emerge will revolve around how enterprises bounce back from data breaches. You're not going to avoid them.

Infinite computing and AI will enable new opportunities. Constellation Research analyst Holger Mueller said that infinite computing has become a reality now and genAI is democratizing data and making sense of it. Data contained in documents and transactional systems will enable new processes and expand what enterprises can do. "We'll move from infinite computing to infinite deep learning and automation," said Mueller.

The intersection between CX and AI is going to surface a host of issues including data droughts, personalization and a bunch of trust concerns, said Constellation Research analyst Liz Miller. Privacy regulation will also be a big issue as AI and CX converge. "CX will be a proving ground for AI and where the business meets the customer," said Miller.

There's a buy vs. build debate emerging around generative AI. As genAI is operationalized, enterprises are focusing on industry use cases and verticalization. It's currently unclear whether vendors will be able to go vertical or whether enterprises will build. The other wrinkle to ponder is vendor pricing models as generative AI goes vertical, according to Constellation Research analyst Andy Thurai, who noted value-based and usage pricing will be in the mix.

Data platform vendors have invested heavily in AI and they can't pull back even though they're not necessarily seeing a payoff, said Constellation Research analyst Doug Henschen. "Vendors are way ahead of what customers are doing," said Henschen.

Open vs. closed models. As enterprises develop their data and AI strategies, one key debate will revolve around open-source large language models versus proprietary offerings. If AI is to be democratized, the industry has to move toward an open system.

Salesforce’s State of Sales Survey Reveals Roadblocks Still Remain with AI: Can RevOps be the Heroes?

Salesforce released its annual “State of Sales” survey this week. The company surveys more than 5,500 sales professionals each year from around the world to gain insights on the pain points and opportunities facing sellers. This year, as in recent years, AI dominates the headspace, but other positive surprises were revealed as well. 

The most positive news was that 79% of respondents said sales have increased over last year. As we fully cycle out of supply chain and other post-pandemic issues, this is not a huge surprise, but positive nonetheless. Challenges still remain for sellers, as they cited changing customer needs and expectations, competition with other businesses, lingering supply chain issues, macroeconomic conditions, and inadequate or ineffective tools/technology as their biggest barriers to success.

Perhaps less surprising: while 81% of respondents claim to be using AI in some form today, sellers still said that up to 70% of their time is spent on non-selling activities. There's a lag between AI adoption and AI becoming the productivity booster vendors claim it to be.

Lack of budget, headcount and training to effectively implement AI were the main reasons cited by those not yet seeing strong returns from AI investments. Nearly a third of RevOps professionals also have concerns about data security, completeness and accuracy. The same share expressed concerns about having sufficient human oversight of AI — for example, monitoring AI outputs to ensure they're correct. RevOps respondents also pointed to customer distrust as a common obstacle they've faced while implementing AI. Only 55% of business buyers trust AI to be as accurate as a human, according to survey results.

While AI may not be a silver bullet, the survey did note that 83% of sales teams with AI saw revenue growth in the past year — versus 66% of teams without AI. 

For those looking to improve on existing AI investments, or just getting started, RevOps teams can take a more strategic and phased approach to where AI should be implemented. They can work with IT to provide both the training and the guardrails that improve usage, effectiveness and security. RevOps leaders need to be critical stakeholders when building out strategy, evaluating technology and rolling out AI to salespeople as part of a larger enablement initiative.

IBM's Q2 led by software revenue

IBM reported a better-than-expected second quarter fueled by software revenue growth.

Big Blue reported second quarter net income of $1.8 billion, or $1.96 a share, on revenue of $15.8 billion, up 2% from a year ago. Non-GAAP earnings were $2.43 a share.

Wall Street was looking for IBM to report second quarter earnings of $2.17 a share on revenue of $15.62 billion.

As for the outlook, IBM projected annual revenue growth in the mid-single digit range with free cash flow topping $12 billion.

By the numbers for the second quarter:

  • Software revenue was $6.7 billion, up 7%. Red Hat revenue was up 7% with automation sales growth of 15%. Data and AI revenue in the quarter fell 3%.
  • Consulting revenue was $5.2 billion, down 1% from a year ago.
  • Infrastructure revenue was $3.6 billion, up 0.7% from a year ago. IBM Z revenue was up 6%.

IBM CEO Arvind Krishna said the company saw strength in hybrid cloud demand and software.

"Technology spending remains robust as it continues to serve as a key competitive advantage allowing businesses to scale, drive efficiencies and fuel growth. As we stated last quarter, factors such as interest rates and inflation impacted timing of decision making and discretionary spend in consulting. Overall, we remain confident in the positive macro-outlook for technology spending but acknowledge this impact."

Krishna added that watsonx and IBM's generative AI offerings have been infused across its business. He said genAI has been used in consulting, Red Hat and even IBM Z. IBM is also focusing on offering models suited to enterprises.

He said:

"Choosing the right AI model is crucial for success in scaling AI. While large general-purpose models are great for starting on AI use cases, clients are finding that smaller models are essential for cost effective AI strategies. Smaller models are also much easier to customize and tune. IBM's Granite models, ranging from 3 billion to 34 billion parameters and trained on 116 programming languages, consistently achieve top performance for a variety of coding tasks. To put cost in perspective, these fit-for-purpose models can be approximately 90% less expensive than large models."

IBM's book of business related to generative AI is now $2 billion inception to date. 

ServiceNow Q2 strong, Desai out

ServiceNow reported better-than-expected second quarter earnings and announced that president and chief operating officer CJ Desai will leave the company following an internal investigation.

Long-time ServiceNow executive Chris Bedi will serve as interim chief product officer. Bedi had previously served as Chief Digital Information Officer and Chief Customer Officer.

Here's what ServiceNow had to say about Desai's departure and its internal investigation that stemmed from an employee complaint:

"As a result of the investigation, the Company’s Board of Directors determined Company policy was violated regarding the hiring of the former Chief Information Officer of the U.S. Army. As such, the hired individual, who led the company’s public sector thought leadership and business development efforts since March 2023, departed the company. In addition, the Company and CJ Desai, President and Chief Operating Officer, came to a mutual agreement that Desai would resign from all positions with the Company effective immediately. The company believes this was an isolated incident."

ServiceNow reported second quarter earnings of $3.13 a share on revenue of $2.627 billion, up 22% from a year ago. Wall Street was expecting second quarter earnings of $2.84 a share on revenue of $2.61 billion.

As for the outlook, ServiceNow said third quarter subscription revenue will be between $2.66 billion and $2.665 billion, up about 20%. For 2024, ServiceNow said subscription revenue will be between $10.57 billion and $10.58 billion, up 22%.

ServiceNow earlier in the day announced the acquisition of Raytion, a genAI-powered search provider whose technology will be integrated into the Now Platform. Boomi and ServiceNow also formed a strategic partnership that will blend Boomi's application programming interface management and automation platform with ServiceNow's Now Platform. In addition, Salesforce and Workday said they will combine data to enable employee workflows in a move aimed at ServiceNow.

Speaking on an earnings conference call, ServiceNow CEO Bill McDermott said the company has signed 11 NowAssist deals worth more than $1 million in ACV. "Enterprises are investing in business transformation. They are investing in AI. They are building a new reference architecture for the decades to come. This is the largest, most compelling business opportunity in the world. We are bullish on what's ahead," said McDermott.

Salesforce, Workday form unified data foundation aimed at employee workflows

Salesforce and Workday formed a strategic partnership that revolves around a unified data foundation that connects Workday financial and HR data with Salesforce CRM data to streamline workflows.

With the move, Workday and Salesforce are combining forces to deliver an AI-powered assistant for employee use cases that include onboarding, health benefits and career development.

The upshot here is that the Salesforce and Workday partnership adds seamless integrations across the platforms to extend into workflows. ServiceNow has become an increasing threat to HR and CRM use cases championed by Workday and Salesforce.

Constellation Research CEO Ray Wang said the integration of business processes and data is critical to CxOs. "The shared data foundation between Workday and Salesforce will enable these partners to deliver AI capabilities that could completely transform the employee experience," said Wang, who added:

"Organizations are having a tough time bringing all their datasets from multiple systems into one place. Workday and Salesforce are often in the same organization and represent a large bucket of data that will be needed for AI. Executives want to ask how many FTEs they have in an area and whether or not they should add more people in sales lead generation or marketing. To do that, the data and the process have to come together in one place."

Here are the moving parts of the Salesforce and Workday partnership:

  • The partnership combines Salesforce's Agentforce Platform and Einstein AI with Workday AI and the Workday platform.
  • The AI employee service agent will be powered by the unified data foundation and natural language, and will run on the Einstein 1 Platform and Workday AI.
  • According to the companies, the AI employee service agent will use LLMs built on the common data platform, which is built on Salesforce Data Cloud.
  • Salesforce and Workday said the combination will enable employees to take action and automate tasks.
  • Workday will be natively integrated inside Slack.
  • The partnership is aimed at boosting productivity for joint customers.

Salesforce CEO Marc Benioff said the partnership with Workday "to jointly build an employee service agent" will enable employees to "get answers, learn new skills, solve problems and take action quickly and efficiently."

Carl Eschenbach, CEO of Workday, added that the partnership will boost employee experiences with genAI. For employers, the common Salesforce and Workday data layer will improve workforce planning, financial planning and sales enablement.

Pitchit Comes Out of Stealth as LQaaS Space Continues to Heat Up

The lead qualification as a service (LQaaS) sector has been heating up rapidly as more embedded AI capabilities streamline and automate the lead qualification process. One of the biggest barriers to fully automated lead qualification has been the preponderance of data silos that make it difficult to generate a truly predictive lead score from incomplete prospect data. But as AI breaks down data silos, confidence continues to grow in SaaS lead qualification tools that augment CRM systems.

But in some industries, speed is more important than deep qualification. Think of telecommunications firms trying to land new customers who might be in the process of switching providers: it's a race to grab attention and win over the customer. Time is of the essence, but you also need to align with buyer preferences and the buyer's own timeline and availability.

Enter Pitchit. The company says it “automates the first 24 hours of manual labor required to qualify inbound leads.” As soon as a marketing-qualified lead enters a CRM, there is a 24-hour time window where a salesperson must manually qualify the lead as quickly as possible before the lead loses interest. Historically, this stage in the sales pipeline was labor-intensive and required direct phone calls, discovery meetings or live chat conversations to finish the qualification process.

The notion is to quickly engage with prospects and leads, and at the best moment hand off to a human. This enables the AI to do the heavy lifting, sifting through volumes of leads to find those ready to engage, both filtering by data-based qualification as well as readiness to engage with the brand. 

The company says it can sync leads from 7,000+ channels — social media, CRM, email, SMS, and more — to quickly quote pricing, book meetings, handle objections, and capture personally identifiable information (PII) before handing qualified leads off to a human sales rep. 

By focusing on purchase-ready leads, brands with a large volume of B2C sales across multiple consumer touchpoints can dramatically increase sales qualified lead (SQL) conversion rates, lower the time and cost associated with every sale, and maximize revenue per sales representative. In addition, by reducing sales friction, brands can offer their customers a faster and more personalized sales experience while reducing costs associated with lead qualification, like SDR headcount, training costs, etc. 

To date, Pitchit claims it has helped telecom and insurance sales teams qualify 531,000+ leads, save 4,400+ labor hours, book 4,000+ meetings, and close $280 million in customer revenue. Customers using Pitchit experienced a 250% increase in their lead qualification rate, on average. Coming out of stealth, the company has raised a $2.5 million seed round.

For telcos and insurance providers, tools like Pitchit can be a simple, cloud-based addition to the lead management stack. The company is still honing its pricing, but for now it gets paid based on conversions, which creates a low-risk, high-reward investment for businesses that have high volumes of leads, historically low conversion rates and short time windows to capitalize on consumer interest.

Google Cloud Q2 revenue $10.35 billion, lands $1 billion in operating income

Google Cloud revenue for the second quarter was a better-than-expected $10.35 billion, up from $8.03 billion a year ago. Analysts were modeling $10.2 billion for Google Cloud revenue.

It was the first quarter in which Google Cloud's operating income topped $1 billion; the unit reported second quarter operating income of $1.17 billion.

Alphabet, parent of Google, reported second quarter earnings of $18.37 billion, or $1.89 a share, on revenue of $84.74 billion, up 14% from a year ago. Wall Street was looking for second quarter earnings of $1.84 a share on revenue of $84.19 billion.

Google’s second quarter earnings landed as the company’s plans to buy Wiz and HubSpot fell apart. Google has gone shopping in recent months to bulk up Google Cloud. Google Cloud also forged a pact with Oracle.

Speaking on an earnings conference call, Pichai said Nvidia's latest platform will be coming to Google Cloud. "We continue to invest in designing and building robust and efficient infrastructure to support our efforts in AI given the many opportunities we see ahead," said Pichai.

He added that Google is also looking to drive efficiency in its AI models and matching "the right model size to the complexity of the query to minimize the impact on costs and latency."

For Google Cloud, the Oracle partnership is a way to embed its AI services into more enterprises. Customer references cited by Google Cloud for AI adoption include Bayer, Best Buy, Discover Financial and TD Bank, to name a few. Also see what Equifax and Wayfair have done with Google Cloud.

 


Meta launches Llama 3.1 405B and for Zuckerberg it's personal

Meta released Llama 3.1 405B, an open "frontier-level model" that aims to perform as well as proprietary models. For Meta CEO Mark Zuckerberg, the Llama cadence is designed to play the long game and bet that open-source models ultimately win.

The 405B model supports synthetic data generation and model distillation, two capabilities that haven't been available in open-source models. Meta also released upgraded 8B and 70B Llama models with a context length of 128K and better reasoning. The models are multilingual and support multiple use cases.

For Zuckerberg, Llama is a mission. In a blog post, he argued that large language models (LLMs) will develop much like Linux did: first there was closed-source Unix, and over time open-source Linux won. He said:

"I believe that AI will develop in a similar way. Today, several tech companies are developing leading closed models. But open source is quickly closing the gap. Last year, Llama 2 was only comparable to an older generation of models behind the frontier. This year, Llama 3 is competitive with the most advanced models and leading in some areas. Starting next year, we expect future Llama models to become the most advanced in the industry. But even before that, Llama is already leading on openness, modifiability, and cost efficiency."

Meta's latest Llama release and letter from Zuckerberg are designed to court developers who want to fine-tune models, evaluate them for specific applications, pre-train them and make them their own. The company said developers can leverage workflows and directions from partners such as Nvidia, AWS, Google Cloud, Microsoft Azure, Dell Technologies, Databricks and others.

Zuckerberg added that developers want affordable model options that can be fine-tuned on sensitive data while avoiding vendor lock-in. Llama is a high-profile effort, but not out of character for Meta, which also led the Open Compute Project.

The win for Meta's approach with Llama is that it can leverage the open-source community and build an ecosystem. Meta wants Llama to be a standard and can play the neutral party since its business model isn't built on selling access to LLMs.


In the end though, Meta's Llama efforts may be personal for Zuckerberg. He said:

"One of my formative experiences has been building our services constrained by what Apple will let us build on their platforms. Between the way they tax developers, the arbitrary rules they apply, and all the product innovations they block from shipping, it’s clear that Meta and many other companies would be freed up to build much better services for people if we could build the best versions of our products and competitors were not able to constrain what we could build. On a philosophical level, this is a major reason why I believe so strongly in building open ecosystems in AI and AR/VR for the next generation of computing."


CrowdStrike CEO called to testify before House committee

CrowdStrike CEO George Kurtz is being called to the House to testify about the global IT outage that has hampered enterprises--notably airlines like Delta--for days.

House Committee on Homeland Security Chairman Mark E. Green, MD (R-TN) and Subcommittee on Cybersecurity and Infrastructure Protection Chairman Andrew Garbarino (R-NY) sent a letter to Kurtz requesting public testimony.

Kurtz's testimony will be closely watched, but he’s hardly the first technology CEO to testify and take lumps from lawmakers. CrowdStrike’s outage lands just as cybersecurity vendors are pushing platformization in a bid to consolidate IT budgets. While the cybersecurity industry sees innovation from startups and smaller companies, it’s still dominated by a few large vendors, including CrowdStrike. Some vendors, like Palo Alto Networks, advocate for platformization, aiming for even greater consolidation.

Green and Garbarino wrote:

"We write in response to the global information technology (IT) outage, which has been attributed to a “defect” in a CrowdStrike software update that impacted Microsoft Windows. While we appreciate CrowdStrike’s response and coordination with stakeholders, we cannot ignore the magnitude of this incident, which some have claimed is the largest IT outage in history. In less than one day, we have seen major impacts to key functions of the global economy, including aviation, healthcare, banking, media, and emergency services."

The letter also requests that CrowdStrike schedule a hearing with the Subcommittee on Cybersecurity and Infrastructure Protection by 5 p.m. on July 24.

Delta has struggled with restoring services, but the CrowdStrike outage affected multiple industries. For its part, CrowdStrike has documented what went wrong and published a remediation and guidance hub.

Kurtz said in a statement:

"I want to sincerely apologize directly to all of you for the outage. All of CrowdStrike understands the gravity and impact of the situation. We quickly identified the issue and deployed a fix, allowing us to focus diligently on restoring customer systems as our highest priority."
