Results

Creatio’s “Energy” Release Fuels its Continued Disruption of the CRM Space

Creatio has been taking a “no code” approach to building a midmarket- and enterprise-focused CRM platform and set of applications for about a decade now. The company has offered a competitive suite of CRM tools spanning marketing automation, sales automation, and customer service automation that rests upon a solid workflow engine (the company was previously named “BPMonline”).

So, with an existing solid foundation in workflow, the innovations we are seeing in AI provide a unique opportunity for Creatio to offer its customers the ability to supercharge existing workflow-oriented CRM with generative and agentic AI tools to drive user productivity, reduce costs, and provide an enhanced customer experience. And that is just what its latest release, dubbed “Energy,” aims to do.

Embedded AI to Drive CRM Adoption and ROI

The highlights of the Energy release include a new “AI Command Center.” The goal of the AI Command Center is to combine prescriptive, generative, and agentic AI into a single destination. It allows admins to design, deploy, and refine AI skills in one place, both to optimize the performance of AI implementations and to better audit and moderate AI usage across the organization.

The AI builder tools in Energy are impressive, and bring Creatio’s AI vision closer to parity with most CRM providers. However, given its base of small and midsize business customers, Creatio is releasing a slew of pre-built AI tools for multiple use cases. These initial 20 pre-built tools cover marketing, sales, and support use cases. For example, users can more quickly segment target lists, leverage generative AI to create personalized digital engagements, and take advantage of integrated predictive analytics to refine offers and promotions to improve conversion rates.

The AI advancements build upon existing strong copilot and other generative AI tools inside Creatio’s offerings. These tools have provided inline, contextual insights around leads and accounts, and even predictive and prescriptive insights within the reporting and analytics tools. With agentic AI advancements, Creatio users can now better use AI to automate common tasks in the system, driving productivity and enabling growing but resource-constrained teams to “do more with less” and to manage growth efficiently without constantly adding human headcount. And by providing a more streamlined and effective employee experience, organizations can potentially retain more employees, as workers who are more productive are less likely to experience burnout.

Out-of-the-box AI tools and related platform enhancements include support for the following use cases:

  • Sales: Meeting Scheduling, Conversion Score Insights, Lead and Opportunity Summaries
  • Marketing: Bounce Responses Analysis, Enhanced Email Subject Lines, Text Rewriting, Text Translation
  • Service and Support: Case Resolution Recommendations, Case Performance Analysis, Case Summaries, Knowledge Base Articles
  • General CRM: a brand new drag-and-drop Email Designer, revamped Product Catalog, redesigned pages for Order and Contract Management, improved navigation panels, Ada AI chatbot integration, embedded Google Analytics integration, and many others.

Flexible Pricing for Growing Businesses

In a recent launch event for the Energy release, Creatio showcased some customers who are taking advantage of the platform, as well as how they plan to leverage the new AI capabilities to drive productivity and foster growth without having to make significant people hires in the process, further driving efficiency. But what was notable was the size of Creatio’s reference customers. These were manufacturers and other firms with more than a thousand CRM users across marketing, sales, and support. Creatio has also landed marquee enterprise reference customers including AMD, Howdens, Colgate-Palmolive, and MetLife, to name a few. In short, Creatio is proving itself a legitimate platform that can scale to meet enterprise customers’ needs as their businesses grow.

But as businesses grow and their needs change, traditional CRM pricing can actually stymie an organization’s ability to be successful in its CRM initiatives. High per-user pricing in legacy CRM offerings, as well as arbitrary silos between application functions (for example: you cannot use any aspect of marketing automation as a sales automation user without a significant increase in spend), have forced line-of-business and IT leaders to make hard decisions about who has access to the system and what functionality they can utilize.

Creatio is trying to solve this by offering a far more flexible pricing model that better meets the needs of growing businesses. The company offers Growth, Enterprise, and Unlimited platform access plans that enable users to build CRM deployments that fit their unique business needs, plus low-cost ($15 per user per month) feature set access for its core sales, marketing, and support tools. Platform plans start at $25 per user per month, making the platform compelling to businesses of all sizes. In addition, Creatio includes its AI capabilities, including the new features in Energy, as part of the base price. This gives users the ability to both experiment and deploy AI without a lot of risk in terms of cost, complexity, or security concerns.
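As a rough back-of-the-envelope sketch of what that pricing model implies (list prices from above only; team size is hypothetical and real quotes with discounts will vary):

```python
# Back-of-the-envelope Creatio cost sketch. Assumptions: list prices only,
# no volume discounts, a hypothetical 50-user team where every user gets the
# Growth platform plan plus one feature set. Real quotes will differ.
users = 50
platform_per_user = 25   # Growth platform plan, $/user/month (from the article)
feature_per_user = 15    # one feature set (sales, marketing, or support), $/user/month

monthly = users * (platform_per_user + feature_per_user)
annual = monthly * 12
print(f"${monthly:,}/month, ${annual:,}/year")  # $2,000/month, $24,000/year
```

Because the AI capabilities are bundled into the base price, no separate line item is needed for them in this sketch, which is the point of Creatio's positioning.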

For businesses looking to deploy their first packaged CRM system, Creatio should be on any short-list. The competitive platform fees, the ability to enable every employee to access data and functionality as needed without premium SaaS prices, and the fact that the AI capabilities carry no additional fees make Creatio a compelling alternative to more premium-priced offerings.

Add to that, the included AI tools can enable smaller businesses to expand without headcount additions, and the return on investment could far outweigh the cost of ownership when we consider the ability to grow without hiring more workers. But beyond pricing, the workflow-oriented platform, the no-code approach, and the well-integrated marketing, sales, and support applications offer growing and midsized enterprises a tool they can really “own.” They can create a differentiated employee and customer experience without a lot of IT assistance, or needing to engage VARs or SIs for every change in the system, driving a more effective and efficient CRM deployment.


ExxonMobil to outline process, automation, AI efforts

ExxonMobil is looking to save $15 billion in operating costs by 2027 and it will leverage artificial intelligence in combination with process intelligence to get there.

During ExxonMobil's third quarter earnings call, CEO Darren Woods said the company would outline how it will leverage AI across the enterprise via its global business services (GBS) group. That unit is responsible for end-to-end processes across the company.

"The organization is getting more efficient and effective at the core task of driving value in the company," said Woods. "The technology side of the equation will drive a double effect of higher revenue and lower costs to improve profitability."

On December 11, ExxonMobil will outline its AI and process transformation plans for investors. Woods said, "AI is part of the equation" and there's a "concerted effort to apply that new technology."

CFO Kathryn Mikells said AI is just one part of the mix. The company has been historically siloed and lacked standardized processes. Those silos "made it more difficult to apply any single type of technology across the company."

She said:

"We'll be continuing to automate much of what we do today manually and that's going to drive improved efficiency and a way better experience for our people, our customers and our vendors. We're not always the easiest company to do business with when it comes to information technology and self-service. We have a pretty complicated IT environment we're in the process of simplifying to drive a much higher degree of automation into the business."

Speaking at Celonis' Celosphere 2024 event in Munich last month, John DiTullio, Process Transformation Executive in ExxonMobil's Global Business Solutions unit, highlighted how the company is putting in the technology that'll lead to more automation and process efficiency.

"In ExxonMobil we talk in the B's not M's but to get to the billions you have to do it million by million," said DiTullio, who was an early adopter of Celonis' Process Intelligence Graph, which analyzes processes across an enterprise as well as the interdependencies.

DiTullio said ExxonMobil is using Celonis to better migrate from various SAP ECC6 instances to SAP S/4HANA. That IT transformation is one example where ExxonMobil is becoming more efficient, and there are many more, said DiTullio.

Celonis is being used for multiple use cases, including procurement, supply chain, logistics, maintenance, inventory, and accounts payable and receivable, riding on top of systems such as SAP and Salesforce to optimize processes.

ExxonMobil is also a large Amazon Web Services customer, and the stack includes Snowflake and a bevy of other vendors. Like any enterprise of ExxonMobil's scale, the company is trying to optimize a complex IT stack that includes one of everything.

"Value comes in many shapes and sizes and collaboration is enabled by leveraging scale and a common language to make decisions," said DiTullio. "Data speaks louder."

Although ExxonMobil won't outline its AI plans until next month, it appears that the data infrastructure and stage is set for AI adoption and more automation.


AI data center building boom: Four themes to know

The tide has turned as Wall Street is starting to ask questions about the capital expenditures being laid out for data centers designed for generative AI. Some answers from CEOs have been better than others, but common themes around logistics, land and power, business models and efficiencies have emerged.

Here are the big themes to note amid this AI-fueled data center binge.

Logistics matter

During Amazon's third quarter earnings call, CEO Andy Jassy made a few critical points about the data center buildout. First, he noted that data center assets last 20 to 30 years and have a long runway for monetization. But when asked about margins, Jassy said the following:

"I think one of the least understood parts about AWS, over time, is that it is a massive logistics challenge. If you think about it, we have 35 or so regions around the world, which is an area of the world where we have multiple data centers, and then probably about 130 availability zones, or data centers, and then we have thousands of SKUs we have to land in all those facilities.

If you land too little of them, you end up with shortages, which end up in outages for customers. Most don't end up with too little, they end up with too much. And if you end up with too much, the economics are woefully inefficient."

Jassy added that AWS has sophisticated models to anticipate capacity and what services are offered. Given that it's early in the AI boom, demand is volatile and less predictable. "We have significant demand signals giving us an idea about how much we need," said Jassy, who acknowledged that there are lower margins because the AI market is immature.

Following the logic of Jassy's comments, you'd expect that companies used to building out data centers--Meta and Alphabet--would have advantages over a company like Microsoft, historically a software vendor.

Indeed, Microsoft CEO Satya Nadella said:

"We have run into lots of external constraints because this demand all showed up pretty fast," he said. "DCs don't get built overnight. There is the DC, there is power. And so that's sort of been the short-term constraint. Even in Q2, for example, some of the demand issues we have or our ability to fulfill demand is because of external third-party stuff that we leased moving up. In the long run, we do need effective power and we need DCs. And some of these things are more long lead."

Lead times, land and power

In the end, AI data centers require a lot more than the gear that goes in them.

Comments from data center players like Equinix highlight that the lead times for genAI facilities go well beyond GPU supplies. "Customer requirements and data center designs are evolving rapidly. Energy constraints and long-term development cycles pose challenges to our industry's ability to serve customers effectively," said Equinix CEO Adaire Fox-Martin, who noted that her company has the land and power commitments to continue to scale.

Digital Realty Trust CEO Andrew Power noted that hyperscalers are turning to nuclear energy for future power needs, but these purchase agreements (small nuclear reactors, for instance) will in many cases take years to deliver.

"Sourcing available power is just one piece of the data center infrastructure puzzle. Supply chain management, construction management and operating expertise are all challenges that customers rely on Digital Realty to solve," said Power.

Some business models lead to multiple AI wins

Alphabet and Meta are both spending massive amounts on data centers and have already told Wall Street 2025 capital expenditures are going higher.

Neither company talked about logistics or capacity issues with data center buildouts. These firms do depend on Nvidia GPUs, but also have their own processors. Amazon, Alphabet and Meta have been building data centers at scale since inception.

The big difference for Alphabet and Meta is that they have the models and business models to better justify the data center buildout. Both Alphabet's Google and Meta are using AI to better monetize their core properties.

Meta CEO Mark Zuckerberg said the company is leveraging generative AI across the properties to bolster efficiency and drive revenue.

Alphabet CEO Sundar Pichai had a similar refrain. Yes, AI is a big investment, but services like AI Overview, Circle to Search, and Lens can drive revenue and engagement.

Massive efficiencies ahead

Feel free to question the theory that you need to build out a data center footprint based on today's relatively inefficient AI workloads.

AI workload efficiency is improving for a variety of reasons, but the primary one is money. Sure, Zuckerberg loves open-source AI and infrastructure, but he's also following the money.

Zuckerberg noted that infrastructure will become more efficient and that's why Meta has backed the Open Compute Project. He said:

"This stuff is obviously very expensive. When someone figures out a way to run this better, if they can run it 20% more effectively, then that will save us a huge amount of money. And that was sort of the experience that we had with open compute and part of why we are leaning so much into open source here."

Pichai made a similar point. He said:

"We are also doing important work inside our data centers to drive efficiencies while making significant hardware and model improvements. For example, we shared that since we first began testing AI Overviews, we have lowered machine cost per query significantly. In 18 months, we reduced cost by more than 90% for these queries through hardware, engineering and technical breakthroughs while doubling the size of our custom Gemini model."
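Pichai's figures imply an even larger efficiency gain than the headline number suggests. A hedged arithmetic sketch (treating "more than 90%" and "doubling" as exactly 90% and 2x for illustration): if cost per query fell 90% while model size doubled, cost per query per unit of model size improved by a factor of about 20.

```python
# Illustration only: treats "more than 90%" and "doubling" as exactly 90% and 2x.
old_cost_per_query = 1.0                              # normalized baseline
new_cost_per_query = old_cost_per_query * (1 - 0.90)  # "lowered cost by more than 90%"
model_size_multiplier = 2.0                           # "doubling the size of our custom Gemini model"

# Improvement in cost per query per unit of model size
improvement = (old_cost_per_query / new_cost_per_query) * model_size_multiplier
print(round(improvement))  # 20
```

That roughly 20x improvement over 18 months is the kind of curve that makes today's capacity math hard to extrapolate from.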

Speaking on a panel at Constellation Research's Connected Enterprise conference, Brian Behlendorf, CTO at the Open Wallet Foundation and Chief AI Strategist at The Linux Foundation, said "there's a lot of irrational exuberance about the amount of investment that's going to be required both to train models and do inference on them."

"The cost of training AI is going to come down dramatically. There is a raft of 10x improvements in training and inference costs, purely in software. We're also finding better structured data leads to higher-quality models at smaller token sizes," he said.


Constellation Research's Connected Enterprise 2024: All the takeaways

Constellation Research's Connected Enterprise 2024 spurred big ideas, a community of tech leaders and dozens of takeaways. It's a fire hose of information designed to get you thinking about what's around the corner.

With that in mind, here's a look at the takeaways that I’ve flagged for follow up. Much of this reporter's notebook includes themes that could emerge over time. We'll be posting our interviews from CCE 2024 in the weeks to come.

Put active inference on your radar. "There are several technologies that are coming, and one of them is that our Internet is evolving into spatial domains that are programmable," said Denise Holt, Founder and CEO of AIX Global Media. "It'll create digital twins of everything, and this is going to provide a grounding layer of reality for this next evolution of agents."

Active inference will have an understanding of the real world. Holt said active inference will be able to take in sensor information, ground it to reality and then leverage edge computing for inference. "Active inference enables the distribution of these agents to where they're not tethered to a giant database that is consuming all of this energy," she said. "The processing gets distributed to edge computing and edge devices. And then the other aspect of it is that it uses the right data in the moment for the task at hand, so it's not having to process through all of this irrelevant data to come up with an output that's close to your expectations. It actually deals with real world data in the moment over time."

Key themes of active inference. In the talk, Holt did a deep dive into active inference. Here are a few takeaways to know:

  • Active inference mimics biological intelligence and can overcome the limits of AI and machine learning.
  • Active inference is explainable and capable of governance.
  • Real-time decision making is a core feature of active inference and agents are building blocks that continually adapt to their environment.
  • Active inference is decentralized.
  • There isn't a big data requirement because active inference uses real world context.
  • Agents that power active inference leverage sensor data from IoT, cameras and robotics and measure it against a real-world model.
  • The standards of the spatial web, which enables active inference, have been in development for the last four years.
  • Active inference will be powered by digital twins of everything and nested ecosystems within them.

Holt said the impact goes like this: "We will have smart cities, autonomous systems, improving the efficiency of everything, global supply chains, personal and critical systems. Homeostasis of all complex systems can be achieved--healthcare, education, environmental management. This is adaptive. It's efficient edge computing, improved human AI, collaboration, cooperation and safety."

Focus on the application of AI, not the technology itself. "AI is all about the application of technology. AI is a universal technology instead of a small pocket of innovation," said Hari Shetty, Chief Strategist and Technology Officer at Wipro. "It's about the outcomes of technology for our clients."

EQ will save us. John Nosta, President of NostaLab, argued that large language models (LLMs) are expanding the cognitive dynamic with humans. He said:

"The cognitive experience itself is supported by large language models, and it makes it better. It makes it enjoyable. LLMs are tuned to your brain's creative frequency. It taps the genius within you to facilitate this dialog. LLM is your favorite teacher that you had in first grade or second grade. It gets you and that cognitive dynamic, that iterative dynamic, facilitates faster, better, and it lets you enjoy the ride."

Nosta said there is a risk that AI will cut human knowledge into smaller pieces, but doesn't see it happening. Why? Emotional intelligence. "I think that AI is going to be smarter. But maybe there is light at the end of the tunnel if humans have an EQ based process. This is where it gets sticky," said Nosta. "There's a fundamental human component here that will ultimately be the domain of humanity."

Agentic AI has runway, but there are multiple challenges ahead. A panel focused on making agents more human surfaced the following questions:

  • Can agents become more human with synthetic emotion?
  • Agent sprawl will mean a nearly infinite number of types.
  • It's unclear what role regulation will play. Will agents comply with EU's GDPR?
  • Can orchestration make managing agents worse?
  • Agents are focused on autonomous tasks today, but are likely to morph into avatars and creations that carry more meaning to humans.

"The Internet was decentralized. AI is centralized, closed and only a few win. We need to break that model. We have designed centralized scarcity. We need decentralized abundance. Otherwise, the AI overlords will take over," said Constellation Research CEO & Founder R “Ray” Wang.

One of those overlords is Nvidia, but there are questions emerging that perhaps this data center buildout for brute-force approaches to AI is misguided. Brian Behlendorf, CTO at the Open Wallet Foundation and Chief AI Strategist at The Linux Foundation, said, "there's a lot of irrational exuberance about the amount of investment that's going to be required both to train models and do inference on them."

Change management is never out of fashion--even in new projects including AI. One common theme from the successful AI projects cited at CCE was change management--cultural, technology and emotional intelligence.

"Hold on to your ideas a little lightly. Experts of the past are not the experts of tomorrow," said Vala Afshar, Chief Digital Evangelist, Salesforce and Co-Founder and Co-Host of DisrupTVShow. "Be humble and kind and know you can be wrong."

Innovation sometimes means burning legacy technology to the ground. David Giambruno, VP Tivity Health, walked through the steps of cutting IT budgets, eliminating tech debt and enabling innovation. Exponential efficiency isn’t easy.

Get ready to hear a lot about knowledge as a theme. Enterprise buyers will hear a lot about "knowledge" in 2025, but CxOs shouldn't treat the topic as just another buzzword, said Constellation Research analyst Liz Miller. Knowledge is about all the accumulated data across the enterprise that drives experiences.

The enterprise software buying process is broken and it's swallowing up the CIO role. I held a panel on marketplaces and removing friction from the enterprise software buying process. "I think the CIO role is becoming the chief cat herder role," said Ashwin Rangan, who has held CIO roles across multiple enterprises. "The CIO is turning into a chief procurement officer for technology vs. the joy of innovation. The innovation track is being disrupted by trying to figure out how to buy all of this stuff."

Automating repetitive work can hollow out your bench. "Here's the thing that I think too many people forget. When you hire somebody with no experience at all, the work you give them very early on is repetitive and easy to check. The perfect task for AI is also the perfect work for your intern or new graduate," said Cassie Kozyrkov, Founder, Kozyr LLC, who delivered the keynote at Constellation Research's Connected Enterprise. "A lot more of the junior person's work is going to get cannibalized by more senior folks."

Alan Boehme, Future Tech Advisor to CEO H&M, argued that personalization doesn't exist. "Personalization doesn't exist because people don't know what they want. You're just leading them down paths--that's not personalizing," he said.

Privacy doesn't exist either. "Our job as designers, as brands, is to make the fact that you have no privacy worth your while," said Benjamin Wiener, SVP and Strategic Business Unit Head, Cognizant Moment.

Industries are a made-up business construct that's outdated. Rita McGrath, Founder, Valize Strategy and Professor, Columbia Business School, said industries are false constructs that hamper leadership. "We made up what an industry is and what its boundaries are and who participates in it. If you think about most of your companies, what industry are you really in? Who are you really competing with? It's not always an obvious question," said McGrath.

Governance as an AI enabler. Governance was a recurring theme on day one as an enabler for responsible AI. Although governance isn't a fun topic, it's needed in data and AI for transparency and fairness. Without those two things you don't have trust and your AI projects are likely to fail.

Cybersecurity is also a key theme for boards. The odd thing is that cybersecurity budgets have taken a hit to fund AI. Focus on the response to cyberattacks, which, by the way, will be more AI-driven.

Capture the data that matters. Numerous execs pushed back on the urge to collect data on everything. The argument was more for quality than quantity. "The biggest opportunity is to actually capture more data not to change models, but improve the day-to-day job. If you think of the typical day of a broker, there is so much data we are not capturing," said Ibrahim Gokcen, Chief Data & Analytics Officer at AON. "There are all these PDF files of codes and proposals and policies we connect and feed into models, dashboards and insights."

Active metadata will be critical. Mark Potter, CEO of Actian, said: “Active metadata is all about understanding how to map the lineage of data and knowledge. Graphs that allow you to understand how data is connected to other sources or people will be key.”

GenAI and agents are your enterprise software UI. Boomi CEO Steve Lucas said ERP vendors should all be concerned because AI is the new UI. Chatbots and AI are going to replace the UI that requires you "to log in and find some random pain in the ass thing and spend 7 hours figuring it out. You're going to see AI as the new UI and ERP vendors know it."

The future of health will be AI, tech enabled. A panel of healthcare professionals said they expect the industry to move toward robotic care by 2050, sensor and scanning technologies at scale, and preventative care that is incentivized through an award system. "At the end of the day, it's about human nature and that's a beast of its own," said David Giambruno, VP Tivity Health.

Employee experience can be assisted by software and AI, but you need humans. Systems, tools and processes are foundational to employee experience especially with AI. Monica Kumar, EVP & CMO at Extreme Networks, said software can enable a good employee experience, but requires the worker to provide input. Others agreed. "Excellent employee experiences are technology combined with people and good data to improve the software," said Anne Kao, Board Advisor, Product, AppFaktors.

2025 themes. With budget season well underway, 2025 brings a few big questions to ponder: How will AI be funded, and what projects will lose out? In other words, when will AI get its own budget line? How do enterprises approach growth in the age of AI? See: The art, ROI and FOMO of 2025 AI budget planning



Intel defends Gaudi 3 as it misses 2024 sales targets

Intel CEO Pat Gelsinger argued that the company's Gaudi 3 AI accelerator can deliver strong total cost of ownership even as it falls short of its $500 million revenue target for 2024.

Gelsinger said on Intel's third quarter earnings call that the company is encouraged by Gaudi 3 interest despite the sales setback. Gaudi 3 AI accelerators are Intel's play to be in the mix for AI workloads. Nvidia is the clear leader in AI infrastructure with AMD playing No. 2 competitor.

Here's what Gelsinger had to say:

"While the Gaudi 3 benchmarks have been impressive, and we are pleased by our recent collaboration with IBM to deploy Gaudi 3 as a service on IBM Cloud, the overall uptake of Gaudi has been slower than we anticipated as adoption rates were impacted by the product transition from Gaudi 2 to Gaudi 3 and software ease of use.

As a result, we will not achieve our target of $500 million in revenue for Gaudi in 2024. That said, taking a longer-term view, we remain encouraged by the market available to us. There is clear need for solutions with superior TCO based on open standards and we are continuing to enhance the Gaudi value proposition."

When pressed by analysts, Gelsinger said Gaudi 3 should be viewed as part of a CPU-plus-accelerator combination.

Gelsinger made the following points:

  • CPU is playing an increasing role in data center AI compute due to inference. "As you go into enterprise AI, we expect [the CPU] to play a more prominent role; databases, embedding, refinement are much more attuned to CPU workloads. And our strategy there is CPU plus accelerator or CPU plus Gaudi," he said.
  • Enterprise use cases will be more about inference and CPUs and Gaudi 3.
  • Intel is seeing a good pipeline for Gaudi 3 and early interest.

Intel's third quarter results and fourth quarter outlook left room for optimism, but the chipmaker still has a lot of work to do.

By the numbers:

  • Intel said it expects fourth quarter revenue to be between $13.3 billion and $14.3 billion and the midpoint of $13.8 billion was above the $13.66 billion estimate. Adjusted earnings for the fourth quarter are expected to be 12 cents a share, above the 8 cents a share estimate.
  • Intel reported a third quarter net loss of $16.6 billion, which includes restructuring charges, on revenue of $13.3 billion, down 6% from a year ago. On a non-GAAP basis, Intel lost $2 billion, or 46 cents a share.
  • The company took a $2.8 billion restructuring charge in the third quarter.
  • Intel said its client computing unit revenue was down 7% in the third quarter and the data center division was up 9%.

 


Apple Q4 ahead of estimates as iPhone sales beat expectations

Apple reported a solid fourth quarter as earnings, revenue and iPhone sales were above expectations.
 
The company reported fourth quarter earnings of $1.64 a share on revenue of $94.93 billion. Wall Street was looking for earnings of $1.60 a share on revenue of $94.58 billion.
 
With the results, Apple allayed some fears about iPhone sales. iPhone revenue was $46.22 billion in the fourth quarter, ahead of expectations. However, Mac, iPad and services revenue fell short of what many analysts were expecting.
 
CEO Tim Cook said revenue in the fourth quarter was at a record level and up 6% from a year ago. He touted new product cycles for iPhone, Apple Watch and AirPods.

AWS posts Q3 revenue up 19% from a year ago, $110 billion annual run rate

Amazon delivered better-than-expected third quarter earnings as its AWS unit showed sales growth of 19%.

The AWS results land following strong cloud growth figures from Google Cloud and Microsoft Azure of 35% and 33% respectively. AWS, however, is working off much larger revenue figures.

Amazon reported third quarter net income of $15.3 billion, or $1.43 a share, on revenue of $158.9 billion, up 11% from a year ago. Wall Street was looking for earnings of $1.14 a share on revenue of $157.2 billion.

By the numbers for the third quarter:

  • North American commerce reported third quarter operating income of $5.7 billion on revenue of $95.5 billion, up 9% from a year ago.
  • International revenue was up 12% to $35.9 billion with operating income of $1.3 billion.
  • AWS delivered operating income of $10.4 billion, up from $7 billion in the same quarter a year ago. AWS revenue was $27.5 billion, up 19% from a year ago.
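The headline $110 billion run rate is simply the latest quarter annualized, and the figures above also imply AWS's operating margin. A quick sketch of that arithmetic, using only numbers stated in the article:

```python
# AWS Q3 figures from the article, in billions of dollars.
aws_revenue = 27.5
aws_operating_income = 10.4

# Annualized run rate: the latest quarter extrapolated over four quarters.
run_rate = aws_revenue * 4  # matches the "$110 billion annual run rate" headline

# Operating margin for the quarter.
margin = aws_operating_income / aws_revenue

print(f"run rate: ${run_rate:.0f}B, operating margin: {margin:.1%}")
# prints "run rate: $110B, operating margin: 37.8%"
```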

Amazon CEO Andy Jassy said the company is executing well, is prepared for the holiday shopping season, and will detail AI and cloud infrastructure advances at AWS re:Invent in December.

Constellation Research analyst Holger Mueller said:

"It is remarkable for AWS to turn the trend of shrinking revenue growth and go back into growth mode. And this is before AWS re:Invent where major innovations for many offerings will be released and shared. But all eyes are on AI. If AWS gets re:Invent right it will show even more growth in the quarters ahead." 

As for the outlook, Amazon projected fourth quarter revenue of $181.5 billion to $188.5 billion, up 7% to 11%, with operating income between $16 billion and $20 billion.

On a conference call, Jassy said the following:

  • "We've seen significant reacceleration of AWS growth for the last four quarters. With the broadest functionality, the strongest security and operational performance and the deepest partner community, AWS continues to be a customer's partner of choice. There are signs of this in every part of AWS's business."

  • "Companies are focused on new efforts again, spending energy on modernizing their infrastructure from on-premises to the cloud. This modernization enables companies to save money, innovate more quickly, and get more productivity from their scarce engineering resources. However, it also allows them to organize their data in the right architecture and environment to do Generative AI at scale. It's much harder to be successful and competitive in Generative AI if your data is not in the cloud."

  • "While we have a deep partnership with NVIDIA, we've also heard from customers that they want better price performance on their AI workloads. As customers approach higher scale in their implementations, they realize quickly that AI can get costly. It's why we've invested in our own custom silicon in Trainium for training and Inferentia for inference. The second version of Trainium, Trainium2 is starting to ramp up in the next few weeks and will be very compelling for customers on price performance. We're seeing significant interest in these chips, and we've gone back to our manufacturing partners multiple times to produce much more than we'd originally planned."

  • "We're continuing to see strong adoption of Amazon Q, the most capable Generative AI-powered assistant for software development and to leverage your own data. Q has the highest reported code acceptance rates in the industry for multiline code suggestions. The team has added all sorts of capabilities in the last few months, but the very practical use case recently shared, where Q Transform saved Amazon's teams $260 million and 4,500 developer years in migrating over 30,000 applications to new versions of the Java JDK, has excited developers and prompted them to ask how else we could help them with tedious and painful transformations."

 


Meta's sneak peek 2025 budget: A lot more AI infrastructure spending

Meta's 2025 budget planning process goes like this: Spend heavily on AI infrastructure and use AI to drive efficiencies so you can plow more money into GPUs.

That peek into 2025 hyperscale budgeting was delivered by Meta CEO Mark Zuckerberg. Zuckerberg was a little more direct about the spending plans on AI, but the other hyperscale giants--Microsoft, Google Cloud and Amazon--have similar plans.

Speaking on a conference call, Zuckerberg was clear that Meta's appetite for GPUs--mostly from Nvidia--will be insatiable.

He said:

"We're training the Llama 4 models on a cluster that is bigger than 100,000 H100s or bigger than anything that I've seen reported for what others are doing."

Zuckerberg added that Meta is early in the budget process, but is targeting the following: use AI for efficiency gains that will partly fund more investment in infrastructure.

1. AI for efficiency. "First, it's clear that there are a lot of new opportunities to use new AI advances to accelerate our core business that should have strong ROI over the next few years, so I think we should invest more there," said Zuckerberg.

Meta's third quarter results were powered by monetization efficiency due to AI. Meta is also boosting engagement and optimizing ad delivery. The company delivered third quarter revenue of $40.59 billion with net income of $15.69 billion, or $6.03 a share. Meta's results were well ahead of Wall Street estimates.

As for the outlook, Meta projected fourth quarter revenue of $45 billion to $48 billion.

CFO Susan Li noted that Meta can boost productivity with AI as it optimizes monetization.

"On the use of AI and employee productivity, it's certainly something that we're very excited about. I don't know if we have anything particularly quantitative that we're sharing right now. I think there are different efficiency opportunities with AI that we've been focused on in terms of where we can reduce costs over time and generate savings through increasing internal productivity in areas like coding."

She said content moderation is another area where AI can boost productivity, and large language models (LLMs) will also improve multiple work streams in general and administrative categories. Li said Meta has headcount opportunities as well.

2. Investment in AI infrastructure. "Our AI investments continue to require serious infrastructure, and I expect to continue investing significantly there too," he said.

Specifically, Meta projected 2024 capital expenditures to be $38 billion to $40 billion, updated from the $37 billion to $40 billion range. The company now expects "a significant acceleration in infrastructure expense growth next year."

Zuckerberg, however, noted that infrastructure will become more efficient and that's why Meta has backed the Open Compute Project. He said:

"This stuff is obviously very expensive. When someone figures out a way to run this better, if they can run it 20% more effectively, then that will save us a huge amount of money. And that was sort of the experience that we had with open compute and part of why we are leaning so much into open source here."

 


How to burn down your legacy IT in 10 not-so-easy steps

David Giambruno, VP at Tivity Health, is the type of person who negotiates his exit package before ever taking a job. Why? He's going to burn down your platforms, automate everything possible, cut costs and be hated by everyone at the company except the CFO.

"I figure out ways to make IT super-efficient, and so I've learned a lot. The biggest thing is having awesome severance and have your lawyer look at it, because everybody's getting angry," he said.

Giambruno has restructured IT operations at Revlon, Tribune Media, Shutterstock and Pitney Bowes. At Constellation Research's Connected Enterprise 2024, Giambruno laid down some truth. "I'm a tech masochist. I never get called by a CTO or CIO. I get called by the CFO or CEO. Generally, if there's some disaster you can fix it," he said.


Here's a look at the lessons:

  • Best practices are mediocrity. Most organizations mistake entropy for safety.
  • You have to burn the platform. "It is about pain and the burning platform. Without that no one ever wants to change," said Giambruno. "Starting a system is the only way to change."
  • "Spend no money on the old systems. If you're spending money on the old systems you'll never change," he said. "No legacy. Do not waste a second on an old system."

  • Automation makes everything happen. Automation means more speed and speed always wins.
  • No multi-cloud. Every cloud you add is about 3x the cost and 5x the security problem.
  • Run cloud native applications.
  • Always do proofs of concepts because you'll need to show humans what's possible. "It's really about showing people what's possible because no one ever believes it," he said.
  • Proofs of concepts are for CEO and CFO primarily, but product people are interested when they realize how much faster you can deliver technologies.

  • "Nothing runs a computer better than another computer," said Giambruno. Automation means that costs come down and stay down.
  • "No one will like you. Vendors will hate you because you're taking away huge chunks of money from vendors. I take away huge amounts of money from internal teams too. Then you get a whole new set of vendors and whole new set of processes," he said. "I go from massive chaos to structure."
  • "Everybody chooses lock in with a vendor. It's cheapest to pick one and then tell them to pound salt when my contract is over," said Giambruno.


Microsoft Q1 strong, Azure revenue growth 33%

Microsoft reported better-than-expected first quarter earnings as commercial cloud revenue was up 22% and Azure grew 33% from a year ago.

The company reported first quarter net income of $24.7 billion, or $3.30 a share, on revenue of $65.6 billion, up 16% from a year ago.

Wall Street was looking for first quarter earnings of $3.10 a share on revenue of $64.51 billion.

Microsoft CEO Satya Nadella said AI is "expanding our opportunity and winning new customers."

CFO Amy Hood said the company's first quarter execution "delivered a solid start to our fiscal year."

As for the outlook, Microsoft said second-quarter revenue will be between $68.1 billion and $69.1 billion. Wall Street was expecting Microsoft to deliver second-quarter revenue of $69.83 billion.

Nadella said suppliers are late with data center infrastructure and the company won't be able to meet demand. 

Key points:

  • Azure revenue was up 33%.
  • Microsoft 365 Commercial products and cloud services revenue was up 13% from a year ago.
  • Microsoft 365 Consumer products and cloud services revenue was up 5%.
  • LinkedIn revenue was up 10%.
  • Dynamics 365 revenue growth was up 18%.

Speaking on an earnings conference call, Nadella said:

"At the silicon layer, our new Cobalt 100 VMs are being used by companies like Databricks, Elastic, Siemens, Snowflake, and Synopsys to power their general-purpose workloads at up to 50% better price performance than previous generations. On top of this, we are building out our next-generation AI infrastructure, innovating across the full stack to optimize our fleet for AI workloads."

Nadella added that data centers (DCs) are a constraint. He said:

"We ran into a set of constraints, which are everything because DCs don't get built overnight. So, there is DCs, there is power. And so that's sort of been the short-term constraint. Even in Q2, for example, some of the demand issues we have or our ability to fulfill demand is because of, in fact, external third-party stuff that we leased moving up. So that's the constraints we have. But in the long run, we do need effectively power and we need DCs. And some of these things are more long lead." 

Hood addressed capital expenditures.

"Capital expenditures, including finance leases, were $20 billion, in line with expectations, and cash paid for PP&E was $14.9 billion. Roughly half of our cloud and AI-related spend continues to be for long-lived assets that will support monetization over the next 15 years and beyond. The remaining cloud and AI spend is primarily for servers, both CPUs and GPUs, to serve customers based on demand signals."

She said that the capital expenses will pay off over time as capacity catches up to demand. 

"In H2, we still expect Azure growth to accelerate from H1 as our capital investments create an increase in available AI capacity to serve more of the growing demand."
 
