
Google Cloud's healthcare push accelerates with Vertex AI, MedLM, Healthcare Data Engine


Google Cloud said its Vertex AI Search for Healthcare, with integrations with MedLM and Healthcare Data Engine v2, is generally available.

The announcement, made at HIMSS 2024, highlights how Google Cloud is pushing into the healthcare industry, which wrestles with data silos and multiple electronic medical record systems. Google Cloud's plan is to combine Healthcare Data Engine, MedLM and foundational models to drive efficiencies.

Aashima Gupta, Global Director of Healthcare Strategy & Solutions at Google Cloud, said generative AI use cases started out with skepticism in healthcare, but the technology is now seen as a key component of innovation. The industry is "assessing where the most data-intensive manual work is for non-value tasks such as note taking, summarization, finding information and synthesizing information," she said.

"We are seeing possibilities for genAI across the value chain," said Gupta, who noted that employees, nurses and clinicians are seeing workflow improvements. She added that the industry is looking for low-hanging fruit for use cases and expanding from there. Healthcare providers are also investing in the data foundation needed to implement genAI.


Google Cloud's play in healthcare is to lean into its search capabilities and leverage genAI to synthesize information across multiple systems. "GenAI has moved much faster than any project we have seen, and we are moving from pilots in some cases into production," said Gupta.

Here's a breakdown of the announcements:

Vertex AI Search for Healthcare. Lisa O'Malley, Senior Director of Product Management for Google Cloud Industry Products, said the company is looking to make processes like prior authorization claims, submissions and reviews easier between providers and payers. In life sciences, O'Malley said Google Cloud is looking to reduce the administrative burden of clinical trials.

"This is really about driving insights and information to clinicians," said O'Malley.

Healthcare Data Engine v2. O'Malley said that MedLM and Google Cloud's Healthcare Data Engine v2 are moving beyond proof-of-concept to production in healthcare organizations. "We are starting to see these assistants do more complex tasks like nurses creating a handover to the next shift," said O'Malley.

Richard Clarke, Chief Analytics Officer at Highmark Health, said generative AI is part of an effort to improve experiences. "We're doing everything we can to be more proactive, personalized and enabling patients and clinicians to focus on health," he said. "In order to do that we knew we needed a better digital experience, interoperability and data and AI."

Clarke said the Google Cloud healthcare tools are building blocks to create better outcomes. For instance, Highmark is using Vertex AI and Healthcare Data Engine to connect the dots between Epic, an electronic medical record system, surveys, records and systems of engagement to provide insights and proactive interventions.

"This next best action system is leveraging Vertex for more than 700 billion data points across our entire population to identify next best actions," said Clarke. "We see that growing substantially as we onboard new data sources."

In the second quarter, Highmark will use Google Cloud to create a learning healthcare system, which is a closed-loop system that will collect information from Epic and other systems and make recommendations. "It will be rolled out by the end of the second quarter," said Clarke. "I think what we're finally getting past is being data rich and insight poor. It is a use case rich environment. The challenge has been scale."


Oracle's Q3 IaaS revenue up 49%


Oracle reported better-than-expected fiscal third quarter earnings and said cloud revenue was $5.1 billion, up 25% from a year ago. Infrastructure-as-a-service revenue was $1.8 billion, up 49% from a year ago, with SaaS at $3.3 billion, up 14% from a year ago.

The company reported third quarter earnings of 85 cents a share ($1.41 a share non-GAAP) on revenue of $13.3 billion, up 7% from a year ago.

Wall Street was expecting Oracle to report fiscal third quarter earnings of $1.38 a share on revenue of $13.3 billion.

CEO Safra Catz said Oracle landed large cloud infrastructure contracts in the third quarter to drive the company's remaining performance obligations up 29% to $80 billion.

"We expect to continue receiving large contracts reserving cloud infrastructure capacity because the demand for our Gen2 AI infrastructure substantially exceeds supply—despite the fact we are opening new and expanding existing cloud datacenters," said Catz.

Catz added that 43% of Oracle's $80 billion RPO will be recognized as revenue over the next four quarters.
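Those figures are easy to sanity-check. In the sketch below, the $80 billion RPO and the 43% near-term share come from Catz's comments; the even four-quarter spread is an illustrative assumption, not Oracle guidance.

```python
# Back-of-the-envelope check of Oracle's RPO figures.
# $80B RPO and 43% near-term recognition are from the earnings call;
# spreading it evenly across four quarters is an assumption for illustration.
rpo_billion = 80.0
near_term_share = 0.43

recognized_next_four_quarters = rpo_billion * near_term_share  # dollars of RPO converting to revenue
avg_per_quarter = recognized_next_four_quarters / 4            # hypothetical even spread

print(f"~${recognized_next_four_quarters:.1f}B over the next four quarters")
print(f"~${avg_per_quarter:.1f}B per quarter if spread evenly")
```

That works out to roughly $34.4 billion over the next four quarters, or about $8.6 billion a quarter if the recognition were even.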

Oracle CTO Larry Ellison said Oracle finished moving most of its Cerner customers to its cloud infrastructure. That move sets the stage for product launches.

Ellison said:

"In Q4, Oracle will start delivering its completely new Ambulatory Clinic Cloud Application Suite to these same customers. This new AI-driven system features an integrated voice interface called the Clinical Digital Assistant that automatically generates doctors' notes and updates Electronic Health Records—saving precious time and improving health data accuracy."

Ellison, who has been relatively quiet about Cerner, said Oracle is set to "transform Cerner and Oracle Health into a high-growth business for years to come."

On a conference call with analysts, Catz said "OCI has emerged as the largest driver of our overall revenue acceleration" and the company is landing workloads via its integrated stack for AI and databases. "This quarter marks the first time our total cloud revenue is more than our total license support revenue," said Catz.

Catz added that OCI consumption revenue was up 63% in the quarter and could have been higher with more capacity. Cloud database services revenue was up 34% and is seen as "the third leg of revenue growth alongside SaaS and OCI," she said.

As for the data center buildout, Catz said she expects overall gross margins to continue to go higher. "While we continue to build data center capacity, overall gross margins will go higher as more of our cloud regions fill up," she said. "We monitor these expenses carefully to ensure gross margin percentages expand as we scale. We are working as quickly as we can to get the cloud capacity built out given the enormity of our backlog and pipeline."

She said capital expenditures should end the fiscal year around $7 billion to $7.5 billion. "We now have 68 customer-facing cloud regions live, with 47 public cloud regions around the world and another eight being built. 12 of these public cloud regions interconnect with Azure and more locations with Microsoft are coming online soon," said Catz.

Going forward, Catz said the company's operating margins will improve from cloud scale and bringing Cerner profitability up to Oracle's standards. She said fourth quarter revenue will accelerate from last year and will be significantly higher in fiscal 2025 without the Cerner headwind.

In the fourth quarter, Oracle revenue including Cerner will grow 4% to 6% and without Cerner will grow 6% to 8%. Total cloud revenue without Cerner will grow 20% to 24% in the fourth quarter. Non-GAAP earnings will be between $1.62 and $1.66 a share.

Ellison said:

"In addition to selling infrastructure for training AI large language models, Oracle is completely reengineering its industry specific applications for generative AI."

Later on the conference call, Ellison had some interesting comments about data sovereignty and how the company's autonomous database may make some strange bedfellows. 

Ellison added his take on multi-clouds and opened the door to building OCI regions inside of other hyperscalers in a move that would rhyme with Oracle's Azure partnership. Oracle's Autonomous Database would be the reason multiple clouds would partner with the company. Ellison said:

"We expect the multi-cloud initiative to continue to expand amongst other hyperscalers, where we build OCI regions inside of and coexisting with their existing cloud infrastructure. We think the era of walled gardens is coming to an end. What customers really want is the ability to use multiple clouds that talk to one another. It is really called cloud computing. It's not called a bunch of separate clouds. We expect multi-cloud to become the norm and Oracle DB to be available everywhere. We think that will preserve our franchise in database because the autonomous database is a unique piece of technology, and there's nothing like it in the world. No one else is working on anything like that. No one else is even trying to duplicate the autonomous database. We think it will become a very successful product. In every cloud."

Constellation Research's take

Constellation Research analyst Holger Mueller said:

"Things are just humming for Oracle these days, out of the perfect alignment of OCI, Database and Generative AI revenues. And Oracle has a pretty healthy SaaS business, too. The investment streak into OCI seems to be slowly coming to an end, with Oracle investing ‘only’ almost $6 billion in CAPEX, practically a third of cash flow.

It looks like Oracle has managed the build out of its data centers close to where its database customers are due to data residency and privacy. With the Exadata platform being the best place to run the Oracle database, it was only a question of time before local data centers would be available.

Credit goes to Larry Ellison for seeing the transformative nature of GenAI earlier than many other tech titans, and then investing substantially in a very large fleet of Nvidia servers. The bet is definitely not cheap but that gamble is paying off now. Now all eyes are on Oracle's traditionally very strong Q4."


AX100 interview: How The Joint maps the customer journey as it scales


Charles Nelles, CTO at The Joint Chiropractic, has to deliver a customer experience that encompasses multiple moving parts, segments and stakeholders including patients, franchisees and doctors. In some ways, Nelles is building systems to herd cats.

I was interested in the business model of The Joint since it rhymed with businesses such as U.S. Physical Therapy and VCA Antech, which is now owned by Mars. The general idea is that a parent company can roll up independent practitioners and physicians and provide technology platforms and efficiencies. Many physicians want to focus on giving care instead of the grunt work involved with running a business.


The catch is that it's hard to provide a consistent customer and employee experience in those models. And The Joint is scaling rapidly. In 2023, The Joint performed 13.6 million adjustments, treated 1.7 million unique patients, and added 923,000 new patients. For 2023, the company reported a net loss of $9.8 million on revenue of $117.7 million.

I caught up with Nelles in an interview at Constellation Research's Ambient Experience in Austin. Here's a recap of our chat.

The business. Nelles said that 75% of The Joint's clinics are not owned by doctors and the majority of them aren't owned by the company. "We focus on people that are looking for regular adjustments to just improve and maintain quality of life, range of motion, health and wellness," said Nelles.

The Joint has a business model that's built on efficiency and throughput. The typical patient is not taking a lot of time out of the day for an adjustment but stopping in during errands.

During The Joint's fourth quarter earnings conference call, CEO Peter Holt said the company is refranchising corporate owned clinics. He added that monthly memberships in 2023 contributed 85% of system-wide sales, up from 84% a year ago. "As we move into 2024, we've renewed our mission to improve quality of life through routine and affordable chiropractic care, and we’ve advanced our vision to be the champion of chiropractic," said Holt.

Providing a trio of experiences. Nelles noted that The Joint is managing the experience of three different players. The patient needs the ultimate experience, but there's the franchisee as well as the doctors, who get more of an employee experience. "The other adventure for me is a lot of franchise owners are either successful franchisees that compare our tech to every other tech and every other franchise across the globe. Or they are successful business people that need another passive source of income," said Nelles.

Who owns the customer journey? The Joint controls the journey for new patients with intake and ongoing relationships. Doctors control more of the experience for existing patients. "We have a wellness coordinator who's up front handling patient intake, new patients coming in and existing patients coming through. We have the doctor, who is at the core of that experience. And then we also have the franchisee, who is running that clinic," said Nelles.

On the company level, Holt said The Joint is focusing on the top of the funnel to bring new customers to the company.

"We focused on initiatives to drive new patients including increasing our media efficiency by adjusting our channel mix and increasing our working media spend to reach even more prospective patients. This adjusted media mix pairs with our patient strategy to ensure that we're delivering the message of affordable, convenient chiropractic care to those most likely consumers.

Additionally, we plan new promotions and offers aimed directly at adding new patients. To take advantage of our local differences, we're creating more robust local store marketing programs by providing proven tactics and more nuanced tools for our system."

Nelles said The Joint just completed customer journey mapping and is going through the experiences for people that are new to chiropractic care and ones that are familiar with it. "With the journey mapped we can start talking about the outcome we want and the brand experience we're expecting," said Nelles.

The Joint will also enable initial patient bookings, reengage lapsed customers and automate messages to keep in touch.

The tech stack. The Joint has to deal with electronic health records, compliance as well as a series of business systems supporting franchises.

"You want consistency across all the sites. With The Joint you can go into any site in any state and get adjusted. It needs to be a familiar chiropractic experience. A lot of that comes through those systems and delivering content for training, measuring performance against company standards, and answering questions," said Nelles.

Nelles said The Joint decided to buy a base CRM system, SugarCRM, and then build customizations. "When we started this, we were doing something no one else was doing," said Nelles. "It was challenging but we had to build from scratch. We bought something we're not going to have to maintain and put our secret sauce on top of it. We added on top of the CRM and built two portions--the back office where we're managing patient visits and the front office for managing the user experience."

Change management. Nelles said he has spent time with the contact center to handle the three flavors of callers--customers, doctors and franchisees. "It is very challenging to mix all of those together," said Nelles.

Generative AI. Nelles said the power of generative AI will be to use the right customer data to personalize experiences. The data will inform what content needs to be delivered and in what channel. "I think we're headed toward handing over the experience eventually to customers and that's what we're looking for with a lot of different audiences," said Nelles. "How do I get the right information about them, deliver the right content and right channel so customers are getting whatever value they need out of the experience."

Employee and customer experiences. For The Joint, employee and customer experiences are intertwined. "You rarely have one without the other," said Nelles. "We are looking to remove complexity."

Most valuable metrics. Nelles said the number one metric is attrition. "We focus on conversion and attrition," said Nelles. "Did we keep you and are you happy? Or did you leave? Some people will come in and join driven by pain and when that goes away they leave."

The goal of the tech stack is to bring customers value and support the lifecycle with processes that are quick, easy and frictionless.

For Holt, The Joint’s most critical metrics are patient counts, conversions, and attrition.

Nelles said: "A lot of what we're working on is communicating with you in the middle. We've nailed how you talk about pain and what we can do for you, but once you're in the middle and feeling good we need to educate better and maintain relationships longer. It's a mix of education, coaching and maintaining that doctor patient relationship."


Lessons learned from four Zoho customers


Zoho customers Sparex, Luxer One, ITV Studios and Bergen Logistics are deriving returns from using multiple applications, a common data model and embedded insights.

At Zoho Day 2024, Constellation Research analysts caught up with Zoho customers to talk about implementations, business challenges and future plans. Here's a look at the interviews, what we learned and how companies are thinking about leveraging artificial intelligence.

Zoho CEO Sridhar Vembu: Long-term thought leadership on innovation, technology, people

Kris James, Business Intelligence Manager at Sparex

Sparex, a Zoho Analytics customer, is a tractor and agricultural parts manufacturer and distributor with $100 million in sales. Sparex has been using Zoho Analytics for 6 years to aggregate data from 18 different subsidiaries and ERP systems around the globe.

James explained the setup:

"We have 18 different subsidiaries around the globe, and they all have different ERP systems. We merge them into one through our SQL Server Management Studio and reduce the formatting issues and discrepancies between the systems. Once we've cleaned that up, we ingest it in a half hour with the Zoho data bridge tool. It's about 10 million rows in about half an hour daily. We distribute that amongst 200 users on Zoho."
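James' description amounts to a classic extract-transform-load flow: per-subsidiary ERP exports with different field names and date formats get mapped onto one shared schema before loading into the analytics layer. A minimal illustrative sketch of that normalization step (not Sparex's actual code; all field names, formats and values are invented):

```python
# Illustrative ETL normalization: map rows from subsidiaries whose ERP
# exports use different column names and date formats onto one shared schema.
from datetime import datetime

# Each subsidiary's export layout (hypothetical).
SUBSIDIARY_SCHEMAS = {
    "uk": {"sku": "part_no", "qty": "quantity", "date": "order_date", "fmt": "%d/%m/%Y"},
    "us": {"sku": "item_id", "qty": "units", "date": "ordered_on", "fmt": "%m-%d-%Y"},
}

def normalize(subsidiary: str, row: dict) -> dict:
    """Map a subsidiary-specific row onto the shared reporting schema."""
    m = SUBSIDIARY_SCHEMAS[subsidiary]
    return {
        "subsidiary": subsidiary,
        "sku": row[m["sku"]].strip().upper(),                 # fix formatting discrepancies
        "qty": int(row[m["qty"]]),
        "order_date": datetime.strptime(row[m["date"]], m["fmt"]).date().isoformat(),
    }

rows = [
    ("uk", {"part_no": " s123 ", "quantity": "4", "order_date": "05/03/2024"}),
    ("us", {"item_id": "S123", "units": "2", "ordered_on": "03-05-2024"}),
]
merged = [normalize(sub, r) for sub, r in rows]
for r in merged:
    print(r)
```

Once every subsidiary's rows share one schema, a single daily bulk load into the analytics tool is straightforward, which is essentially the role the Zoho data bridge plays in Sparex's pipeline.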

Zoho usage scaled by department, said James. Sales reps started first as Sparex scaled Zoho by bringing on subsidiaries. Managing directors took to Zoho Analytics next followed by head office departments. Today, product management, purchasing, supply planning and finance are pulling some form of reporting data from Zoho Analytics. Of the 200 Zoho Analytics users, about 140 are sales reps.

James' approach to dashboarding is to tailor it to the business purpose, and sometimes a simple pivot table is fine.

"You can host widgets, dashboards and charts all in one go, but we also send automated emails. We have one report that identifies which products are coming towards the end of life. We flag those in our system and send that out to all the product managers and the purchasing teams."

Zoho Analytics was chosen by Sparex after comparing it to Microsoft Power BI, Tableau and Qlik, said James. Zoho Analytics was the pick since Sparex wasn't yet leveraging all of the data in the platform and the embedded analytics approach made sense. Constellation Research analyst Doug Henschen wrote a report noting that in many cases the embedded analytics of your existing applications should be considered before adding another tool.

"The scope of how we could improve still was so much further within Zoho. Zoho pricing was more competitive, and the features and support were much more friendly. The support is fantastic," said James.

Going forward, James said Sparex is looking at the AI tools built into Zoho, namely Zia Insights and data preparation features, and already using them to some degree. "Ask Zia is so helpful for our users so they don't have to wait to email me and ask a question," said James. "They can type in a question and get an answer."

James added that Sparex wants to use Zoho globally but needs more language support. Zoho Analytics works in English, Spanish and French, but Sparex has more locations around the world.   

Matt Kuczka, Lead Application Analyst at Luxer One

Luxer One launched as a locker service for dry cleaning in 2005, developed a software platform to manage lockers in 2012, and steadily built out a network of lockers to accept deliveries--even for perishable goods. En route to more than 200 million packages received, Luxer One has been installed in apartment buildings, retailers, universities, offices and libraries.

In an interview with Constellation Research analyst Liz Miller, Kuczka said Luxer One started in San Francisco and Sacramento with a simple concept: "People would drop off their stuff, pick it up two days later and that's how the company started," he said. "From there, it grew quickly into having lockers in apartment buildings, condos, multifamily and retail. People can pick up their packages anytime."

Luxer One built an internal system to manage deliveries, but has used Zoho for CRM, service desk and capturing data for Zoho Analytics. "We knew initially what we wanted to measure: How many packages, what sizes and time of day," explained Kuczka. "Once we started adding in additional data from our CRM and Desk, we saw a lot of tickets where a certain locker system had the same issues over and over."

"Our use grew naturally over time, and we got more insights and better reports," said Kuczka. "We create a dashboard for the things that a customer may want to see. We cater to each customer."

Luxer One has been able to collect data for Zoho Analytics in part because it is on the same platform as other applications. Going forward, Kuczka said Luxer One sees potential for AI and automating data chores like standardizing and cleansing.

He said:

"We're realizing we need to automate more, to take those jobs that are mundane that people do over and over again and try to automate them and build them. And this sink of data allows us to do that."

Kuczka said his company creates a data flywheel the more it scales its lockers. The data flows into the service desk, tickets, and customer interactions. Luxer One is experimenting with large language models with its chat bot and classifying question categories. "We're at a very early stage and we've tested a little bit. We're able to automate some tasks like reset codes for lockers," said Kuczka.
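The triage Kuczka describes (classify an incoming question into a category, automate the simple ones such as locker code resets, route the rest to an agent) can be sketched even without an LLM in the loop. A hypothetical keyword-based version, with all category names and keywords invented for illustration:

```python
# Hypothetical ticket triage: classify a message, automate the simple
# categories (e.g. locker code resets), route everything else to an agent.
# Categories and keywords are invented for illustration.
CATEGORY_KEYWORDS = {
    "code_reset": ["reset code", "access code", "locker code"],
    "delivery_issue": ["missing package", "wrong locker", "not delivered"],
}

def classify(message: str) -> str:
    """Return the first matching category, or fall back to human review."""
    text = message.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(k in text for k in keywords):
            return category
    return "human_review"

def handle(message: str) -> str:
    """Automate code resets; route everything else to a service agent."""
    category = classify(message)
    if category == "code_reset":
        return "automated: new locker code issued"
    return f"routed to agent ({category})"

print(handle("I need to reset code for locker 14"))
print(handle("My package is not delivered yet"))
```

In practice an LLM would replace the keyword matcher for classification, which is the experimentation Kuczka describes, while the automated actions behind each category stay deterministic.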

Rob O'Brien, Head of International Technology at ITV Studios

ITV is a British free-to-air public broadcast television network that has a studios division and a broadcasting and streaming arm. ITV Studios creates and distributes TV content and operates in 13 countries. O'Brien said ITV Studios needed to track data beyond spreadsheets and looked to Zoho Analytics.

ITV has evolved over the years via consolidation of regional networks and needed to streamline operations and functions. "We needed a system to bring data together, so we weren't pitching the same idea and increasing costs. We needed some kind of audit trail," explained O'Brien.

Previously, ITV Studios would track 40 to 100 productions by email and Google Sheets. This approach became really difficult during the COVID-19 pandemic. "We needed a better place to put all of our data into a single place so our leadership could understand where our shows are being made, budgets, crew and locations," said O'Brien.

"We also wanted to add things like risk values, mitigating facts and markup comments so we can adjust plans."

The business side of ITV saw the argument for analytics quickly and was clear about requirements and what it wanted to achieve. "Given COVID, we had to get a system up and running quickly," said O'Brien. "That led to a low-code toolset. We had an MVP product in about six weeks. There was change management afterwards, but we knew the requirements and didn't need to get complex."

O'Brien said ITV used an agile approach to move quickly. Change management was needed to ensure that employees filled in data fields and forms with production data for the risk and legal teams to make decisions. ITV leveraged Zoho Creator and additional products on the platform. Today, ITV is working through various Zoho features on the platform. "This journey over the last five years we've learned about the power of low code, what you can do with it, and integrations into tools like analytics to create dashboards," said O'Brien.

Keith Cooper, Vice President of Customer Experience at Bergen Logistics

Bergen Logistics is a third-party logistics company primarily serving eye, fashion and lifestyle brands globally.

Cooper said Bergen Logistics was struggling to track leads that were coming in and went to market for a CRM system. Ultimately, Cooper went with Zoho CRM and later used Zoho Desk for client requests and service tracking before adopting Zoho Analytics.

"With the growing number of applications Zoho had available we went with a Zoho One license," said Cooper. "We've been on Zoho One for two years and we have 250 users in our organization."

A deciding factor for Zoho CRM was whether Bergen Logistics would need consultants to implement it. "We knew what the processes should be but weren't sure how we would do the implementation. Once we opened up the hood and dug around, we saw it was fairly straightforward," said Cooper. He said the company used outside help for a few processes, but it was mostly a DIY project to implement Zoho.

Bergen Logistics' stack includes a proprietary warehouse management system that feeds data into the Zoho application stack. "We more than tripled our annual sales production, improved our time to respond and resolve by a hundredfold," said Cooper, who said Zoho is being rolled out globally.

With the implementation, Cooper said Bergen Logistics has sharpened its KPIs, homing in on response time by individual service agent and whether a case was resolved.

Being on one platform also makes it easier to get a dashboard to see where customer issues stand. "One of the biggest benefits of having a cloud-based platform is that anybody in the organization at any time can find what's going on with a particular account or a particular issue," said Cooper. "We have full visibility and can build dashboards for real-time performance in any operational area."

"We've been much better at forecasting. We know what our customer acquisition costs are and manage the lifetime value of a customer," said Cooper. "Bergen Logistics is not the lowest cost provider in our space so we have to make sure we're helping customers know the value proposition and manage the sales process."

Going forward, Bergen Logistics is looking to add Zoho applications for account-based marketing as well as embedded artificial intelligence for further insights.


#ZohoDay2024 Customer Interviews: Matt Kuczka, Luxer One


Liz Miller talks with Matt Kuczka, lead applications analyst for Luxer One, about the impact of Zoho Analytics 📊 on supply chain issues for their #enterprise... including accurate and clean #data points, better measurement capabilities, and automated #customer reports.

Today, Luxer One has removed the bulk of data speculation and enabled #accessibility and data-driven decision-making across their enterprise.

👉 If you're considering adopting Zoho #analytics tools, watch the full interview to learn more from Matt's experience...

Video: https://www.youtube.com/embed/dYwQBdjq-fw

#ZohoDay2024 Customer Interviews: Keith Cooper, Bergen Logistics


Dion Hinchcliffe talks with Keith Cooper of Bergen Logistics about Zoho tools solving lead tracking issues in their enterprise and transforming their customer journey, time taken to resolve issues, helpful sales enablement data, and overall communication across their enterprise teams.

If you're considering adopting the Zoho Suite, watch the full interview to learn more about Keith's onboarding experience and continued success at Bergen Logistics.

Video: https://www.youtube.com/embed/8aL8-fMT8N4

#ZohoDay2024 Customer Interviews: Rob O'Brien, ITV


Holger Mueller talks with Rob O'Brien, Head of International Technology for ITV 📺 about how Zoho technology transformed their unstructured #risk data systems through efficient and customizable #storage management that enabled ITV to connect the dots and make data-driven decisions.

If you're considering adopting Zoho #datamanagement systems, watch the full interview to learn more about Rob's experience...

Video: https://www.youtube.com/embed/xMu_unYOlcE

Broadcom posts solid Q1, reiterates outlook

Broadcom's first quarter results were better than expected, as technology buyers closely watch the company's comments about retaining VMware customers. Broadcom's chip business continues to benefit from the AI buildout.

The company reported first quarter earnings of $1.32 billion, or $2.84 a share, on revenue of $11.96 billion, up 34% from a year ago. Non-GAAP earnings for the first quarter were $10.99 a share. Revenue growth excluding VMware was 11% in the first quarter.

Wall Street was expecting Broadcom to report first quarter earnings of $10.42 a share on revenue of $11.72 billion.

Broadcom CEO Hock Tan said the acquisition of VMware is "accelerating revenue growth in our infrastructure software segment, as customers deploy VMware Cloud Foundation." He also noted that networking demand was strong as data centers retool for AI.

While Tan talked up VMware, signals in the field are less than positive.

Broadcom reiterated its fiscal 2024 guidance of revenue of $50 billion and adjusted EBITDA of $30 billion.

In the first quarter, semiconductor revenue was 62% of the total, while infrastructure software (primarily CA and VMware) was 38%.

Tan said VMware will grow at a double-digit percentage rate sequentially. "This is simply a result of our strategy with VMware," said Tan. "We are focused on upselling customers, particularly those who are already running their compute workloads on vSphere virtualization tools, to upgrade to VMware Cloud Foundation, otherwise branded as VCF."

"VCF is the complete software stack, integrating compute, storage and networking that virtualizes and modernizes our customers data centers. This on-prem self-service cloud platform provides our customers with a competent and an alternative to public cloud."

In other words, Tan's bet is that if VMware upsells, customers will come. Tan also said that AI workloads will increasingly run on-premises for cost savings, and that means VCF.

Tan was asked, with a bit of skepticism, about the VMware upselling plan. He said:

"We're very focused on selling upselling and helping customers to not just buy but deploy this private cloud, but what we call virtual private cloud solution or platform on their on-prem data centers. It has been very successful so far, and I agree that it's early evenings at this point. We've been very prepared to launch this push on private cloud."

Tan was also asked about VMware's annual revenue run rate and agreed that the company is still at an $11 billion to $12 billion pace. Analysts were focused on VMware questions even after starting with AI questions.

Tan said Broadcom is focusing on go-to-market and engineering VCF so it is more easily deployed. He added that the focus is on about 1,000 strategic customers that will be on-premises and leveraging hybrid deployments. "Most of these customers do not have an on-prem data center that resembles what's in the cloud, which is very high availability, very low latency, and highly resilient," said Tan. "What we are offering with VCF replicates what you get in the public cloud and we are seeing it in the level of bookings."


MongoDB Q1, fiscal year outlook light, but eyes stable workload gains

MongoDB's outlook for the first quarter and fiscal year was lighter than expected even as its fourth quarter results were strong.

First, the good news. MongoDB reported a fourth quarter net loss of $55.5 million, or 77 cents a share, on revenue of $458 million, up 27% from a year ago. Non-GAAP earnings for the fourth quarter were 86 cents a share. MongoDB was expected to report fourth quarter earnings of 48 cents a share on revenue of $435.55 million.

As for the outlook, MongoDB projected first quarter revenue of $436 million to $440 million with non-GAAP earnings of 34 cents a share to 39 cents a share. Wall Street was looking for revenue of $449.08 million with non-GAAP earnings of 61 cents a share.

For fiscal 2025, MongoDB projected revenue of $1.9 billion to $1.93 billion with non-GAAP earnings of $2.27 a share to $2.49 a share. Wall Street was expecting annual sales of $2.03 billion with non-GAAP earnings of $3.22 a share.

MongoDB's outlook landed a few days after Snowflake took a hit on its outlook and CEO change.

In a statement, MongoDB CEO Dev Ittycheria said the company "will continue to invest in our key product development and go-to-market initiatives." The company ended its fourth quarter with more than 47,800 customers and 2,052 customers paying more than $100,000.

“MongoDB’s results reflect the belt tightening and slower growth we’re seeing across the mainstream IT market outside of the white-hot pockets tied to AI,” said Doug Henschen, VP and principal analyst at Constellation Research. “Nonetheless, MongoDB turned in another quarter of steady, double-digit growth and has plenty of relational migration and net new customer win opportunities to continue on its current healthy growth path.”

For fiscal 2024, MongoDB reported a net loss of $176.6 million, or $2.48 a share, on revenue of $1.68 billion, up 31% from a year ago. Non-GAAP earnings were $3.33 a share.

On a conference call with analysts, Ittycheria said he saw solid and stable growth for consumption. 

"Overall we are pleased with our performance in the fourth quarter. We had a healthy quarter of your business led by continued strength and new workload acquisition within our existing Atlas customers. I see stable consumption growth going into next year. Consumption trends have been steady for several quarters now."

However, Ittycheria noted that it's early in the AI application buildout and enterprises need to move beyond pilots to ramp consumption at scale.

He laid out the progression. 

"I strongly believe that AI will be a significant driver of long term growth for MongoDB. We are in the early days of AI akin to the dial-up days of the Internet era. To put things in context, it's important to understand that there are three layers to the AI stack. The first layer is the underlying compute and LLM. The second layer is the fine tuning of models and building of AI applications. And the third layer is deploying and running applications that end users interact with. MongoDB strategies to operate as a second and third layers."

"Today, the vast majority of AI spend is happening at the first layer, that is, investment in compute to train and run LLMs, neither of which is an area in which we compete. Our enterprise customers today are still largely in the experimentation and prototyping stages of building their initial AI applications.

"We expect that it will take time for enterprises to deploy production workloads at scale.

"Platforms like MongoDB will benefit as customers build AI applications to drive meaningful operating efficiencies, create compelling customer experiences, and pursue new growth opportunities."

The short version is that MongoDB expects the fiscal 2025 consumption and adoption trends to be similar to fiscal 2024. Ittycheria said MongoDB will focus on workload acquisition, build out its go-to-market operations and hone its migration playbook for relational databases. "This year we are investing in a number of pilots leveraging AI for relational migrations paired with services to substantially simplify and scale the process," he said.
 


AI is Changing Cloud Workloads, Here's How CIOs Can Prepare

For years, the promise of the public cloud has been the primary end-game for IT infrastructure, offering enterprises the most flexible and scalable platform for their SaaS applications. These applications, characterized by their inherently dynamic nature, typically experience significant fluctuations in usage, a key aspect that public cloud was designed explicitly to address. However, times are changing and AI is now poised to upend cloud economics by altering the very nature of workloads, just as cloud spend becomes the top issue for CIOs when it comes to IT infrastructure.

Traditional Web workloads have long exhibited spiky usage patterns, with traffic surging during peak hours (e.g., Black Friday sales) and plummeting during off-peak times and unexpected events. This unpredictable demand curve has aligned perfectly with the on-demand nature of cloud resources – businesses can easily scale their cloud instances up or down to meet the fluctuating needs of their Web applications, paying only for the resources they utilize. However, the overall complexion of cloud workloads has recently shifted due to generative AI, throwing capacity planning into flux. A widely followed 2023 study by Flexera found that optimizing cloud costs has just moved to the top priority of cloud teams (64% of respondents), as they struggle with both workload forecasting and the growing impact of AI on their cloud compute mix.

So there's little question now: The arrival of generative AI has officially thrown a wrench into this well-established dynamic. Unlike Web applications, which exhibit intermittent bursts of high compute demand, AI models are insatiable consumers of continuous compute power, requiring consistent and substantial resources throughout both their training and operational phases. Training a large language model, for instance, can devour vast amounts of compute power for weeks or even months on end, relentlessly pushing the boundaries of available processing capacity. This long-term hunger for compute resources has ignited a fierce competition among cloud providers, each vying to be the leader in AI in the cloud. Evidence for this abounds: GPUs, the chips that provide most of the capacity for training and running AI models, have recently propelled the stocks of AI chip providers like NVIDIA into historic territories, and the trend is just beginning.

Cloud Consumption in Generative AI Era - Where Workloads Run Best

The top three hyperscalers, Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP), are each aggressively optimizing their public cloud offerings for AI workloads, investing heavily in the development of AI cloud capacity as well as their own custom AI chips, including Google's well-established Tensor Processing Unit (TPU), AWS's new Trainium2 chip, and Microsoft's upcoming Maia 100 AI processor, each of which competes with Nvidia's data center-friendly H200. These specialized chips offer significantly improved performance and efficiency for AI workloads compared to traditional CPUs. Of these, notably, only the H200 will be widely available for use within private enterprise data centers, making AI chips an emerging risk factor for a new type of cloud lock-in and adding to the concerns of CIOs seeking to come to grips with this new cloud landscape.

While this relentless pursuit of ever-greater AI performance promises significant advancements in various fields, it comes at a substantial additional cost. Hyperscalers must factor in the research and development expenses associated with creating custom AI chips and cloud-based AI tools, alongside the razor-thin profit margins typical of the cloud industry. Additionally, the staggering infrastructure required to support these highly demanding AI workloads translates into significant long-term capital expenditure for cloud providers. Ultimately, these costs are passed on to the cloud customer in the form of service fees, raising a critical question: as AI workloads continue to grow in complexity and resource demands, is the public cloud the most cost-effective solution for most AI workloads in the long run? This is the fundamental question today as AI becomes a growing percentage of overall compute utilization.

AI Greatly Transforms Cloud Economics

The shift towards AI workloads throws a stark light on the limitations of traditional cloud pricing models designed for bursty web applications. Unlike CPUs, which can be easily ramped up or down, GPUs, the workhorses of AI training and inference, are a different beast altogether. These specialized processors excel at parallel processing, making them ideal for the computationally intensive tasks involved in AI. However, unlike CPUs that can be idled during downtime, GPUs are most cost efficient when constantly utilized. While cloud providers typically offer GPU instances with per-second billing cycles, they argue that with their special chips and "paying only for what you use" they offer the highest net cost efficiency for AI workloads. This creates a scenario where businesses will indeed pay just for the AI workloads they need, but still face the increasingly large overhead cost components that were easier to hide when the cloud was a smaller industry, and which now also include custom silicon for AI that each hyperscaler designs and builds itself, though some also leverage third parties like NVIDIA.
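
The utilization argument above can be made concrete with a back-of-the-envelope model. The sketch below is a hypothetical comparison, with all prices (hourly GPU rental rate, purchase cost, amortization period, operating overhead) invented for illustration rather than taken from any provider's price list:

```python
# Break-even sketch: at what sustained utilization does an owned GPU
# become cheaper than renting one on demand? All numbers below are
# illustrative assumptions, not quotes from any cloud provider.

CLOUD_RATE_PER_HOUR = 4.00   # assumed on-demand price for one GPU instance
OWNED_GPU_COST = 30_000.0    # assumed purchase price of one GPU
AMORTIZATION_YEARS = 3       # assumed useful life of the hardware
HOURLY_OVERHEAD = 0.60       # assumed power/cooling/ops cost per owned hour

HOURS = AMORTIZATION_YEARS * 365 * 24  # total hours in the amortization window

def cloud_cost(utilization: float) -> float:
    """Per-second billing means you pay only for the hours you are busy."""
    return CLOUD_RATE_PER_HOUR * HOURS * utilization

def owned_cost(utilization: float) -> float:
    """Capex is sunk regardless of utilization; overhead runs around the clock."""
    return OWNED_GPU_COST + HOURLY_OVERHEAD * HOURS

def break_even_utilization() -> float:
    """Sustained utilization above which owning beats renting."""
    return (OWNED_GPU_COST / HOURS + HOURLY_OVERHEAD) / CLOUD_RATE_PER_HOUR

if __name__ == "__main__":
    print(f"break-even utilization: {break_even_utilization():.0%}")
    for util in (0.10, 0.50, 0.90):
        cheaper = "cloud" if cloud_cost(util) < owned_cost(util) else "owned"
        print(f"at {util:.0%} busy: {cheaper} is cheaper")
```

With these assumed numbers the break-even lands around 44% sustained utilization; the point is not the exact figure but the shape of the trade-off: per-use billing wins for bursty demand, while steadily busy GPUs favor owned capacity.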

Furthermore, the ever-evolving nature of AI training necessitates a continuous cycle of experimentation and improvement. Businesses are no longer dealing with static applications; whether they realize it or not, they're now engaged in a perpetual competitive race to develop and refine better AI models. This ongoing process requires readily available compute resources for training and fine-tuning, pushing IT departments to grapple with the financial implications of constantly running AI workloads in the public cloud. The high bar for entry in terms of infrastructure investment and ongoing operational costs associated with large-scale AI training is creating a fertile ground for alternative solutions.

Cloud Consumption in Generative AI Era - Cost Components of Workloads

The take-away: Any certainty that public cloud was the best place for all AI workloads has greatly receded. CIOs are now considering all their options, which yes, still include the hyperscalers' AI services, but also specialty cloud providers, AI training service bureaus, and private GPU clouds.

Perhaps the most disruptive trend this evolving AI landscape is witnessing is the rise of a new class of cloud providers specifically designed to cater to the unique needs of AI workloads. Smaller but more nimble players like Vultr and Paperspace are carving out a niche by offering cloud instances optimized for GPU workloads. These providers often leverage economies of scale by utilizing custom hardware and innovative pricing models that align billing more closely with actual compute usage. Additionally, larger enterprises are increasingly exploring private cloud deployments as a means to maximize control over their AI infrastructure and optimize FinOps (financial management of the cloud), including the burgeoning practice of FinOps for AI. By bringing AI workloads in-house, businesses aim to squeeze every penny out of cost overhead and gain greater flexibility and strategic autonomy in managing their ever-growing compute needs for AI training and operations. This shift towards private and specialized cloud solutions suggests a potential bifurcation within the cloud computing market, with established hyperscalers potentially facing pressure from more targeted, cost-efficient alternatives.

Navigating the AI Cloud Conundrum: A Roadmap for CIOs

The future of cloud computing is undeniably intertwined with the relentless rise of AI. However, for CIOs, this presents a strategic conundrum. Public cloud providers offer unparalleled scalability and access to cutting-edge AI tools, but their cost structures are often ill-suited for always-on, high-performance AI workloads. The path forward necessitates a careful balancing act between agility, cost-efficiency, and control.

Here's a roadmap for CIOs to prepare for this AI-driven cloud future:

  • Invest in Advanced AI Expertise: Building a competent internal team with expertise in AI development, data science, and especially full stack cloud infrastructure management is now vital. This allows for a deeper understanding of workload requirements, informed infrastructure decisions, and internal build-out if needed. Upskilling for AI is now preferred to hiring in many cases.
  • Hybrid Cloud Strategies: A hybrid cloud approach, leveraging both public and private cloud resources, is now the target environment, as we saw last year in my research on the rebalancing between public and private cloud. Bursty workloads can reside in the public cloud, while mission-critical, always-on AI workloads can be migrated to a private cloud environment, optimizing cost and performance.
  • Containerization: Containerization technology such as Docker and its robust ML/AI support allows for efficient, rapid packaging and re-deployment of AI models across various cloud environments. This fosters portability and flexibility in choosing the most cost-effective infrastructure for specific workloads. Kubernetes remains popular with larger enterprises, while Docker is favored by mid-market firms in managing AI deployments.
  • Cost Optimization Tools: Utilize cloud cost management platforms optimized for AI, like Cast AI, that offer granular insights into resource utilization and spending patterns and can cut costs by up to 50% in some cases. This enables proactive cost management strategies for AI workloads in either the public or private cloud, and allows evaluation of whether private cloud is a more optimal environment for a given AI workload depending on the optimizations needed.
  • Security & Regulatory Considerations: AI workloads in the public cloud raise significant concerns about data security, regulatory and compliance requirements, as well as potential biases. CIOs must implement robust security protocols, conduct thorough risk assessments, and ensure alignment with all relevant regulations. This is another decision point that reflects heavily on the choice between public and private clouds, as private AI deployment can provide considerably more proactive, granular control over data residency and privacy issues with AI.
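
As a thought experiment, the placement logic implied by this roadmap can be encoded as a few explicit rules. Everything here (the field names, the 60% utilization threshold, and the rules themselves) is an illustrative assumption, not a prescription from any vendor:

```python
# Hypothetical workload-placement helper encoding the roadmap's logic:
# compliance constraints dominate, then dependence on hyperscaler-only
# AI services, then the utilization economics of bursty vs. steady load.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    avg_utilization: float       # fraction of time compute is busy
    data_residency_bound: bool   # regulated data that must stay on-prem
    needs_latest_ai_tools: bool  # depends on managed hyperscaler AI services

def place(w: Workload) -> str:
    """Return a suggested venue for one workload."""
    if w.data_residency_bound:
        return "private cloud"   # compliance outweighs cost considerations
    if w.needs_latest_ai_tools:
        return "public cloud"    # managed AI services live with hyperscalers
    if w.avg_utilization >= 0.6:
        return "private cloud"   # steady load: amortize owned GPUs
    return "public cloud"        # bursty load: pay only for what you use

workloads = [
    Workload("web storefront", 0.15, False, False),
    Workload("LLM fine-tuning", 0.85, False, False),
    Workload("patient-record search", 0.40, True, False),
]
for w in workloads:
    print(f"{w.name}: {place(w)}")
```

A real decision would weigh many more factors (egress fees, data gravity, staffing), but making the rules explicit is a useful first step toward the FinOps-for-AI practice described above.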

Competitive Ramifications

The ability to navigate this new landscape will have significant competitive ramifications. Companies that can develop and deploy AI models most cost-effectively and securely will, put simply, gain a significant edge in their industry. Conversely, those struggling to optimize AI workloads for the cloud will fall behind, as their investments just won't get them as far as their peers'.

The Bottom Line

The future of AI in the cloud demands forward-thinking strategies. As AI workloads reshape the cloud landscape, CIOs are presented with a unique challenge and opportunity. The future demands not just technical expertise in generative AI and large language models, but also a spirit of creative adaptability, a rethinking of cloud orthodoxy, and eagle-eyed vision. Crafting a clear and frequently updated AI roadmap for the enterprise will be crucial to mobilize IT and the business, outlining key decision points and strategic considerations as lessons are learned.

This journey requires continuous learning and the ability to evolve alongside the technology. IT leaders must embrace a growth mindset, explore diverse deployment models, and remain open to new possibilities; this will be vital for success. Those who are prepared to adapt and learn will not only survive but prosper in this transformative era. The cloud, once a playground for bursty applications, is now evolving into a dynamic ecosystem where always-on, fully loaded AI workloads reign supreme. For the CIOs who embrace this change with an expansive vision and an open mind as to where AI workloads will best operate over time, the future holds immense promise.

My Related Research

A Roadmap to Generative AI at Work

Spatial Computing and AI: Competing Inflection Points

How to Embark on the Transformation of Work with Artificial Intelligence

AWS re:Invent 2023: Perspectives for the CIO

Dreamforce 2023: Implications for IT and AI Adopters

Video: Moving Beyond Multicloud to Crosscloud

My new IT Strategy Platforms ShortList

My current Digital Transformation Target Platforms ShortList

Private Cloud a Compelling Option for CIOs: Insights from New Research

The Future of Money: Digital Assets in the Cloud
