Results

Microsoft unveils Majorana 1, aims to scale quantum computing

Microsoft launched Majorana 1, a quantum computing chip with a Topological Core architecture.

Topological quantum computing encodes information in quasiparticles called "anyons," which can be arranged into patterns that form qubits. A topological superconductor is a material that hosts a new state of matter, and it is harnessed to create a more stable qubit that can be digitally controlled.

According to Microsoft, Majorana 1 is built on a breakthrough material that can observe and control Majorana particles to create more reliable and scalable qubits. Chetan Nayak, Microsoft Technical Fellow, said the goal for Majorana 1 was to invent "the transistor for the quantum age."

Microsoft is betting that Majorana 1 will be a more fault-tolerant way to scale quantum computing. The architecture used in Majorana 1 creates a path to fit 1 million qubits on a chip that fits in the palm of a hand.

There are various flavors of quantum computing in addition to the approach Microsoft is using:

  • Superconducting qubits are seen as general-purpose quantum computing options; vendors in this category include IBM, Google and Rigetti Computing.
  • Trapped-ion quantum computing offers high fidelity and long coherence times. IonQ is the big player in this category, along with Quantinuum, which was created by the merger of Honeywell's quantum unit and Cambridge Quantum.
  • Neutral atom quantum computing has the potential to scale better, and QuEra is a player here.
  • Quantum annealing is designed for optimization rather than general-purpose computing, and D-Wave has championed this approach.

Microsoft said Majorana 1 has eight topological qubits on a chip and can scale from there. Microsoft is building its own hardware as well as partnering with the likes of Quantinuum and Atom Computing.

Years or decades?

Quantum computing has been in the middle of a big debate about whether it'll be useful in years or decades. Nvidia CEO Jensen Huang said in January that quantum computing was 15 to 30 years away from being useful. Microsoft said its approach will scale quantum computing "within years, not decades."

Microsoft outlined Majorana 1 in a paper in Nature.

Under a program with DARPA, Microsoft said it will build the world's first fault-tolerant prototype based on topological qubits.

Nayak said Microsoft's plan now revolves around "making more complex devices," including its first QPU, Majorana 1, with a topological core. Nayak said Microsoft "can scale to a million qubits on a chip the size of a watch face."

Given Microsoft's developments, the move by Quantinuum to combine quantum and generative AI, and various hybrid HPC and quantum efforts, enterprises need to start preparing potential use cases.

Indeed, cloud vendors, which will deliver quantum instances, have been busy setting up services to get enterprises quantum ready. Given the mileposts, quantum computing is developing quickly.

Constellation Research analyst Holger Mueller said:

"Right when you think quantum computing approaches were set we have a new approach and vocabulary to learn with topological qubits and majorana. It is good to see that alternate approaches are feasible, promising and could accelerate the path to quantum, but it does leave a few question marks for other quantum vendors scaling out alternate approaches. What will matter for CxOs will be a consistent software layer across platforms to traverse quantum platforms. But first we need to see the viable quantum platforms."

 


Let's Talk Configure, Price, Quote | 2025 Q1 ShortList Spotlight

Configure, Price, Quote (CPQ) is often seen as just a sales tool. Constellation analyst Liz Miller explains how CPQ can be a critical part of customer experience (#CX) and a valuable source of high-quality #data for business. 📊

CPQ solutions must go beyond streamlining the quoting process. They should integrate with the broader #tech stack, leverage #AI for guided selling and recommendations, and provide a visually engaging experience for customers.

💡 Liz highlights PROS Smart CPQ as a stand-out solution bringing deep pricing expertise and data-driven capabilities to the #B2B CPQ space. Watch below to learn about CPQ's potential and why PROS should factor into your technology buying decisions.

Video: https://www.youtube.com/embed/zTkciYcpCr0

Airbnb: With tech stack in place, expansion plans accelerate

Airbnb plans to become a platform for travel akin to what Amazon is in commerce, courtesy of a multi-year transformation and a new tech stack.

Speaking on the company's fourth quarter earnings call, Airbnb CEO Brian Chesky said the company has spent the "past several years preparing for Airbnb's next chapter" and rolled out more than 535 features and upgrades in its app over the last two years.

Indeed, that velocity is due to a revamped tech stack designed to take advantage of AI and improve the overall experience. Airbnb is largely built on Amazon Web Services and has nearly 5 billion visitors a year.

Chesky said Airbnb has launched Guest Favorites, which makes it easier for guests to find listings; the Co-Host Network, which helps owners find local hosts to manage their Airbnb; and destination and map improvements. For good measure, Airbnb is redesigning its checkout experience to remove friction.

These improvements have boosted conversion rates and helped drive fourth quarter results. Airbnb reported fourth quarter net income of $461 million on revenue of $2.5 billion, up 12% from a year ago. For 2024, Airbnb reported net income of $2.6 billion on revenue of $11.1 billion, up 12% from a year ago.

Chesky in February 2023 noted that AI would benefit Airbnb's long tail of data. Since Airbnb's 2020 IPO, revenue has tripled. In many ways, Airbnb appears to be a travel version of Uber, which has a model that revolves around data.

Chesky said:

"We rebuilt our platform from the ground up with a new technology stack. We've also upgraded our messaging system into a single unified platform, making communication between guests and hosts smoother more reliable. Now, with this new tech platform, we are able to innovate faster and expand beyond short term rentals into becoming an extensible platform with a range of new offerings and 2025 marks the start of Airbnb's next chapter."

Simply put, Airbnb plans to expand beyond its core business to adjacent areas starting in May. Airbnb plans to invest $200 million to $250 million in launching and scaling new businesses. A lot of that spending will go to product development and marketing.

The AI strategy

With the new tech stack, Chesky said Airbnb wants to be in position for trends like agentic AI, but noted that "it's still really early."

"I think AI is going to have a profound impact on travel, but I don't think it's yet fundamentally changed for any of the large travel platforms," said Chesky. "We want to be the leading company for AI enabled traveling and eventually living."

Chesky noted that trip planning with AI is early and not ready for prime time. "We're actually choosing a totally different approach, which is we're actually starting with customer service. So later this year, we're going to be rolling out as part of our summer release, AI powered customer support," said Chesky, who noted genAI works well with multiple languages and can read thousands of documents easily.

Airbnb CEO: GenAI's impact on app experiences minimal so far

Going forward, the plan for Airbnb is to take that AI customer service agent and bring it to Airbnb search to "eventually graduate to be a travel and living concierge," said Chesky.

Chesky said AI will drive efficiencies for customer service as well as developer productivity.

With model prices falling, the commoditization of models will drive value to platforms. Chesky noted that Airbnb aims to be the traveling and living platform.

In the big picture, Chesky said that Airbnb can leverage AI effectively because it is one brand and one app.

He said:

"We want the Airbnb app, kind of similar to Amazon, to be one place you go for all of your traveling and living needs. A place to stay is just really, frankly, a very small part of the overall equation. Every new business we launch, we'd like to be strong enough. It could stand alone, but it makes the core business stronger. I think that each business could take three to five years to scale. A great business could get to a billion dollars of revenue. Doesn't mean all of them will. And you should be able to expect, like, one or a couple businesses to launch every single year for the next five years. We're going to start initially with things very closely adjacent to travel."

Chesky added that new businesses for now will stay close to travel and then Airbnb will expand from there. For now there are dozens of adjacent markets for Airbnb to expand into.

The tech stack

Airbnb has been a relatively vocal customer of AWS, although it has been quiet in recent years. Airbnb was a featured customer at AWS re:Invent 2022, and has noted in its annual report that it is busy integrating AI into its tech stack.

Airbnb noted in its annual report that "our technology platform incorporates the use of AI and ML Technologies, for example, for fraud detection, search, enabling customized features and enhancing community support."

Chesky seemed to indicate that Airbnb was focused on multiple models and optimization, which would point to services like Amazon Bedrock. DeepSeek and cheaper models will also bring down prices, he noted. "I think it's a really exciting time in the space because you've seen like with DeepSeek and more competition with models is models are getting cheaper or nearly free, they're getting faster and they're getting more intelligent and for all this purpose, starting to get commoditized," he said.
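To make the multi-model idea concrete, here is a minimal sketch, assuming access to Amazon Bedrock via boto3 and its Converse API, of sending the same prompt to more than one model and comparing token usage. The model IDs and prompt are illustrative placeholders, and nothing here reflects Airbnb's actual implementation.

    # Minimal sketch of the multi-model pattern described above, assuming
    # Amazon Bedrock access via boto3. Model IDs are illustrative; availability
    # varies by AWS region and account. This is not Airbnb's implementation.
    import boto3

    bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

    # Hypothetical candidate models to compare on the same prompt.
    MODEL_IDS = [
        "anthropic.claude-3-haiku-20240307-v1:0",
        "amazon.titan-text-express-v1",
    ]

    def compare_models(prompt: str) -> None:
        for model_id in MODEL_IDS:
            response = bedrock.converse(
                modelId=model_id,
                messages=[{"role": "user", "content": [{"text": prompt}]}],
                inferenceConfig={"maxTokens": 256, "temperature": 0.2},
            )
            usage = response["usage"]  # token counts, useful for cost tracking
            text = response["output"]["message"]["content"][0]["text"]
            print(f"{model_id}: {usage['inputTokens']} in / {usage['outputTokens']} out")
            print(text[:120], "...")

    if __name__ == "__main__":
        compare_models("Summarize this guest message and suggest a polite reply: ...")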

Airbnb said in its annual report that it is investing in developing, maintaining and operating AI and machine learning models. That investment will also increase compute costs in the future. Airbnb is also investing in developing its proprietary data sets.

Beyond the AWS foundation, Airbnb decidedly has a build culture focused on open source tools, sponsoring them and building frameworks that can be used to continuously improve and integrate new technologies.

Notable upgrades include the following:

 


AI agents bring consumption models to SaaS: Goldilocks or headache?

Enterprise procurement departments are already annoyed with software-as-a-service contracts, and AI agents--and the consumption-based models that go with them--are likely to make deals even more complicated.

Welcome to the new world of enterprise software--licenses, seats and a heavy dose of credits and consumption charges. Get ready for conversations like the following:

CFO: "Our IT operating expenses are running hot."

Procurement: "Yeah, we were dinged by extra AI packs, power ups and consumption units."

IT: "We need to optimize our AI agent costs. These $2 conversations with AI are adding up."

Within a few weeks you can rest assured that your enterprise software providers will be layering in consumption models. On the bright side, CIOs are used to consumption models from their hyperscale cloud providers including AWS, Microsoft Azure and Google Cloud as well as data platforms such as Snowflake and Databricks. The bad news: CIOs have struggled to manage those cloud consumption costs for years.

In recent weeks, we've seen the following:

HubSpot CEO Yamini Rangan explained on the company's fourth quarter earnings call that the company's approach has been to create an AI-first product without add-ons for AI. Monetization has come by raising prices for the overall product. Going forward, hybrid models based on seats and usage will be the norm for AI agents.

"I do think that the future of pricing for AI will be hybrid. We'll have both seat based and usage based pricing. Right now, we're focused on delivering value with our agents and as more customers get consistent value with AI, we will introduce usage based pricing," explained Rangan. "The pricing model will be a combination of usage and seats based pricing. But what is really important is that we will consistently focus on delivering value first before adding on to our seat-based models and then monetizing based on usage."

And those are just some recent examples. Salesforce monetizes Agentforce via consumption pricing, Adobe has sold credits for its AI usage and Dynatrace, Confluent and others are going the same route. You can expect a consumption announcement from a SaaS vendor almost weekly going forward.

What does this mean for the enterprise buyer?

Consumption will surge. These consumption models from software vendors often include a healthy free tier because they want more usage. Once enterprises move past free tiers there will be a learning curve to optimize costs and AI agent use cases. Remember how shocked companies were when they thought the cloud cut costs? Get ready for the SaaS-y version. The good news is that DeepSeek and cheaper models will also bring down prices.
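As a rough illustration of that learning curve, here is a back-of-the-envelope sketch with entirely hypothetical prices and free-tier limits (echoing the "$2 conversations" quip above, not any vendor's actual rate card):

    # Back-of-the-envelope consumption estimate. All numbers are hypothetical
    # assumptions for illustration, not any vendor's actual pricing.
    FREE_CONVERSATIONS_PER_MONTH = 1_000     # assumed free tier
    PRICE_PER_CONVERSATION = 2.00            # assumed rate, per the "$2 conversations" line

    def monthly_agent_cost(conversations: int) -> float:
        """Cost after the free tier is exhausted."""
        billable = max(0, conversations - FREE_CONVERSATIONS_PER_MONTH)
        return billable * PRICE_PER_CONVERSATION

    for volume in (500, 5_000, 50_000):
        print(f"{volume:>7,} conversations/month -> ${monthly_agent_cost(volume):,.2f}")
    # 500 -> $0.00; 5,000 -> $8,000.00; 50,000 -> $98,000.00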

Transparency will be at a premium. AI agents are going to require pricing transparency that SaaS vendors aren't used to providing. Enterprise software vendors will need to provide the same cost and consumption dashboards that hyperscale cloud players deliver. One customer, an early adopter of AI agents, said he expects the same cost transparency from his SaaS vendors as he gets from AWS. If anything, SaaS vendors should have better transparency.

Cloud marketplaces will be critical. Efforts like AWS Marketplace that enable enterprises to purchase software and roll up procurement under one dashboard will become popular. Procurement is already coming around to cloud marketplaces and consumption transparency will accelerate that move.

It's unclear who will manage the digital labor force. One news item that was notable this week was the Workday Agent System of Record. The big idea is that Workday already manages human capital and it can extend into digital capital, aka AI agents too, and track returns and onboarding.

Attribution of outcomes will be the missing link in AI agent consumption models. Ron Miller, operating partner and head of editorial at boldstart ventures, said on DisrupTV that figuring out which vendor is responsible for an outcome is going to be messy. Miller said: "If you start talking about outcome pricing as another element of consumption-based pricing it's chaotic. Who was responsible for the outcome? Was it the Salesforce piece? Was it the Box piece? Was it the ServiceNow? That's just another piece of all this."

Customers will demand cross-platform AI agent transparency. Miller's take revolves around the harsh reality of AI agents today: Every vendor thinks enterprises only operate on one platform. The reality is that there will be AI agent workflows that may be connected from AWS to Google Cloud to Salesforce to Workday to ServiceNow. How do you optimize that mess when each vendor has different pricing? AI agents will make those per-core pricing schemes look straightforward.

The Goldilocks scenario will be delayed. ServiceNow CEO Bill McDermott obviously thinks that the seat, subscription and consumption hybrid model is a win-win for vendors and customers, but I'll bet that there will be a lot of grumbling on the way to Goldilocks.


Palo Alto Networks delivers strong Q2, launches Cortex Cloud

Palo Alto Networks reported better-than-expected second quarter results and indicated that demand was helped by the need for AI-driven cybersecurity and enterprises consolidating platforms.

The company reported second quarter net income of $300 million, or 38 cents a share, on revenue of $2.3 billion, up 14% from a year ago. Non-GAAP earnings were 81 cents a share.

Wall Street was expecting Palo Alto Networks to report non-GAAP earnings of 78 cents a share on revenue of $2.24 billion.

CEO Nikesh Arora said performance "was fueled by customers adopting technology driven by the imperative of AI, including cloud investment and infrastructure modernization." The company said it had 75 platformization deals in the second quarter, up 45% from a year ago. A year ago, Palo Alto launched its platform play to consolidate cybersecurity.

The company also said it is on track to halve the contract labor supporting its IT processes thanks to AI, and that 80% of its engineers are using a coding copilot.

Palo Alto Networks also raised its outlook for the third quarter and fiscal 2025. For the third quarter, Palo Alto Networks projected revenue of $2.26 billion to $2.29 billion, up 14% to 15% from a year ago. Non-GAAP earnings will be between 76 cents a share to 77 cents a share. Next-gen security annual recurring revenue will grow between 33% and 34%.

For fiscal 2025, Palo Alto Networks projected revenue between $9.14 billion and $9.19 billion, up 14% from fiscal 2024. Non-GAAP earnings for fiscal 2025 will be $3.18 to $3.24 a share.

Separately, the company announced two new board members. Helle Thorning-Schmidt, former prime minister of Denmark, and Ralph Hamers, former chief executive officer of UBS Group AG and ING Group, will join the company's expanded board of directors.

Palo Alto Networks also launched Cortex Cloud, which combines the company's Prisma Cloud and Cortex CDR offerings. The platform will use AI to prioritize and automate remediation and deliver a new user experience.

Cortex Cloud includes application security, risk management tools with AI, the ability to stop attacks in real time and a security operations center. Existing Prisma Cloud customers will be upgraded to Cortex Cloud.

Shortlists that include Palo Alto Networks:


SAP launches Business Data Cloud, partnership with Databricks: Here's what it means

SAP launched Business Data Cloud, a platform that gives applications access to structured and unstructured data across its platform as well as third-party data. The effort is getting a big assist from Databricks.

Simply put, SAP's plan to open its applications to third-party data fills in a key part of its strategy to put its Joule AI everywhere, leverage AI agents with real business context and automate processes.

According to SAP, the ability to integrate the business data in its ERP systems, third-party data and use-case knowhow in areas like finance, supply chain and life sciences gives its Joule assistant an advantage. SAP also aims to leverage its process mining and automation applications such as Signavio.

The company previewed its AI strategy during its fourth quarter earnings call. At a high-level, SAP's product strategy includes the following pillars:

  • AI everywhere unified by Joule and AI embedded across SAP's platform including Customer Data Hub, SAP Knowledge Graph and SAP Foundation Model.
  • BDC that includes a harmonized data model across SAP, Insight Apps for decision-making and actions and an ecosystem of partners.
  • A suite that's integrated across user experiences with the ability to configure by customer, industry and use case.

SAP's argument for being an enterprise AI leader is that it has the transactional systems and business processes data, the ability with BDC to unify data types, and systems of AI agents led by Joule. SAP CEO Christian Klein said Business Data Cloud will "combine SAP's expertise in mission-critical, end-to-end processes and semantically rich data with Databricks' data engineering capabilities."

Databricks CEO Ali Ghodsi said the SAP partnership will help enterprises "bring together all their data regardless of format or where it lives."

With BDC, SAP is looking to deliver "fully-managed SAP data products" across business processes in finance, procurement and supply chain. SAP will also infuse contextual data and AI into SAP S/4HANA, SAP Ariba and SAP SuccessFactors. 

BDC will also include insight apps that will use data products and AI models connected to real-time data for analytics across the enterprise. 

Joule, SAP's genAI copilot, will create agents that combine processes and data. Enterprises will also have the ability to build, deploy and manage their own AI agents. 

Speaking at SAP's launch event, Klein said SAP Business Data Cloud will make its AI foundation stronger and enable more analytics and insights. That foundation sets SAP up for Joule Agents.

Klein said:

"We are announcing the availability of Joule agents for claims management, sales and customer service that will help you resolve disputes faster, elevate the speed and quality of sales engagements and process customer inquiries more efficiently. This library of jewel agents is expanding this year to include more agents across human resources, supply chain, spend management, finance and more."

Klein said Joule agents will be able to leverage BDC to run simulations and make better decisions.

Constellation Research analyst Holger Mueller said SAP's strategy makes sense because it needs to line up its data repository with AI use cases. He said:

"For the longest time SAP was not ready for big data, but that did not hurt the German ERP giant - as it did not hurt any of its competitors. But the need for an AI data repository switched this. It is good to see SAP doing the right thing with the SAP Business Data Cloud (BDC). Under the hood BDC is mostly Databricks, but SAP customers who want and need to build AI powered next generation applications don't care. SAP just needs the capability. Equally important is the SAP intention that BDC is open for third party data since any data lake in 2025 needs to be open for any content."

Here's a look at the various moving parts of SAP's Business Data Cloud and the Databricks partnership.

  • The launch of BDC gives SAP a centralized data approach to support AI use cases and the application stack.
  • For SAP, BDC gives the company the ability to harmonize data across multiple applications and clouds. SAP is also betting on an open ecosystem with integrations with Databricks, Google Cloud, Collibra, Confluent and others.
  • BDC will also use Open Resource Discovery, an open-source metadata standard to describe and share data and include governance with automated compliance tools.
  • At the core of BDC is Databricks. SAP customers will get native access to SAP data within Databricks with zero-copy data movement and Unity Catalog for governance, along with most Databricks capabilities except for AI/BI, Lakehouse Federation and advanced data engineering (see the sketch after this list).
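As Henschen notes below, the zero-copy bridge relies on Delta Sharing. Purely as an illustration of that pattern, here is a minimal sketch using the open-source delta-sharing Python client; the profile file and the share, schema and table names are hypothetical and do not reflect SAP BDC's actual packaging.

    # Generic Delta Sharing client pattern (open-source delta-sharing package),
    # shown only to illustrate the zero-copy access idea. The profile file and
    # share/schema/table names below are hypothetical, not SAP BDC specifics.
    import delta_sharing

    # A .share profile file contains the sharing server endpoint and bearer token.
    profile_path = "bdc_example.share"  # hypothetical credentials file
    table_url = profile_path + "#finance_share.sap_s4.gl_line_items"  # hypothetical

    # List the tables exposed by the provider, then load one into pandas without
    # copying the data into a separate warehouse first.
    client = delta_sharing.SharingClient(profile_path)
    print(client.list_all_tables())

    df = delta_sharing.load_as_pandas(table_url)
    print(df.head())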

With the Databricks partnership, SAP is aiming to solve a customer pain point--integrating SAP data with non-SAP data. For SAP, the Databricks partnership may also help customers migrate under the SAP RISE program.

For Databricks, the win would be having SAP customers migrate from SAP Databricks to native Databricks over time. It remains to be seen how customers choose between SAP HANA and Databricks SQL.

Initial reaction to SAP BDC was positive. DSAG, which represents SAP's German customers, said in a statement: "SAP is aiming to harmonize SAP data management across systems with this offering. This new solution in turn accesses various partial solutions, such as SAP Analytics Cloud (SAP SAC), SAP Datasphere, SAP Business Warehouse and SAP BW/4HANA in the S/4HANA Private Cloud Edition."

"The business data fabric is at the heart of SAP BDC. This is where the data is processed semantically and made available in a standardized way. The data used can come from the connected ecosystem - either directly from all SAP applications or already prepared via existing business warehouse systems," said Sebastian Westphal, DSAG CTO.

Constellation Research's take

Constellation Research analyst Doug Henschen handicapped the SAP Databricks partnership and how it works with BDC. He said:

"According to Databricks, the number one ask among joint SAP customers is a simpler, easier way to get data from SAP into Databricks. The new Databricks on SAP BDC (Business Data Cloud) partnership package gives these joint customers a zero-copy bridge via Delta Sharing that ensures that all the semantics of SAP applications remain intact. The assertion is that this new Databricks instance type will provide simpler and easier data access at lower cost than the alternative of exporting data from SAP with third-party data integration tools. The caveats: this offering is exclusively for customers that have gone cloud with SAP Rise (and not on-premises SAP ECC deployments). This instance type does not include Databricks AI/BI (in a nod to SAP Analytics Cloud) and does not include Lakeflow or Lakehouse Federation."

Henschen's bottom line:

"I suspect new customers will eventually want the Lakeflow and Lakehouse Federation capabilities, but will have to upgrade to get them."

Mueller's bottom line:

"SAP BDC is the most important platform innovation for SAP on this side of the millennium. People will mention HANA as well, but HANA was a replacement for partner RDBMS and did not expand SAP's ability to build the business applications of the 21st century. With SAP BDC, SAP goes back partnering for platform (in this case Databricks) and it is more than relevant for SAP customers as it allows to address non-structured data. While AI is the driver, insights to action and just good old analytics are the benefits. Moreover, BDC created the data foundation for SAP to build its next generation ERP applications. And finally, BDC may likely create the incentive as well as the necessity for SAP customers to upgrade to the cloud." 

Related research:


Cisco Q2 powered by Splunk; AI infrastructure sales pick up

Cisco delivered strong second quarter results powered by Splunk and strong demand for AI infrastructure.

The company has been a work in progress as it added Splunk to bolster its software and security reach while AI infrastructure sales lagged. That lag appears to be over.

Cisco reported second quarter earnings of 61 cents a share on revenue of $14 billion, up 9% from a year ago. Non-GAAP earnings were 94 cents a share. Wall Street was looking for Cisco to report second quarter non-GAAP earnings of 91 cents a share on revenue of $13.78 billion.

For the third quarter, Cisco projected revenue between $13.9 billion and $14.1 billion with non-GAAP earnings of 90 cents to 92 cents a share. For fiscal 2025, Cisco projected revenue of $56 billion to $56.5 billion with non-GAAP earnings of $3.68 to $3.74 a share.

CEO Chuck Robbins said that "as AI becomes more pervasive, we are well positioned to help our customers scale their network infrastructure, increase their data capacity requirements, and adopt best-in-class AI security."

Splunk adds genAI tools, more Cisco touchpoints across observability and security

Robbins added that Cisco is seeing strong demand despite macroeconomic concerns:

"Despite the uncertainty that's going on in the in the US and in the marketplace around the world, I think the one thing our customers understand is that their need to continue spending on technology is just there."

Regarding tariffs, Robbins said that the company has built them into its guidance. 

"It's such a fluid environment right now, it's very difficult to say what's actually going to happen. I wanted to protect the guide and ensure that we built in the 25% that's been proposed. We have a supply chain team that, over the last several years, has built a lot of muscle around the tariff that we had in China and how do we work our way around that. We've game planned out several scenarios and steps we can take, depending on what actually goes into effect."

By the numbers:

  • Cisco product orders in the second quarter were up 29% from a year ago and up 11% excluding Splunk.
  • The company raised its dividend to 41 cents per share and expanded its stock buyback program with an additional $15 billion authorized.
  • Security revenue in the second quarter was up 117% due to Splunk with observability up 47%. Networking revenue was down 3% from a year ago.

New switches, new demand

Cisco also recently launched new smart switches that aim to meld AI, security and networking. Cisco announced the Cisco N9300 Series Smart Switches, which feature embedded DPUs for AI data centers. The switches feature Cisco Silicon One E100 and AMD Pensando DPUs that can offload data processing to improve overall performance.

In addition, Cisco Hypershield is being embedded into the new switches in a move that brings together networking and security.

The launch of new switches is part of Cisco’s plan to reimagine data centers for AI training and inferencing. The first available Cisco N9300 Smart Switch, which features 24 100G ports, will ship in spring 2025. A higher-end model, which will feature 48 25G ports, two 100G ports, and six 400G ports, is targeted for summer 2025.

On the new switches, Robbins said there is strong demand for networking gear that's more integrated. "This switch is a great example of the innovation that we are going to drive in our core networking portfolio," he said. "Security needs to be deployed everywhere. And this really allows us to integrate security deeply into the network and at the speed of the network."

Robbins added that there will be more demand for networking as agentic AI taxes networks. He said: "It's clear that agentic AI work streams are going to put more capacity onto the network."


DeepSeek, Mental Health AI, ShortLists | ConstellationTV Episode 98

📺 ConstellationTV ep. 98 is here! Co-hosts Martin Schneider and Larry Dignan give a news roundup, analyzing how #DeepSeek is shifting the focus to value, price, and performance in #enterprise technology, and Zoho's potential to become an enterprise platform player with its comprehensive #AI strategy for mid-market companies. 

Next, Larry interviews Matt Lewis, founder of LLMental whose mission is to leverage #generativeAI for augmented mental wellness - both for individuals and enterprises.

Finally, Martin previews his 2025 Q1 Constellation ShortLists, highlighting the leading CRM and customer management solutions.

00:00 - Meet the Hosts
00:21 - #Enterprise Tech News
11:14 - Interview with Matt Lewis, LLMental
35:22 - ShortList Highlight
40:08 - Bloopers!

ConstellationTV is a bi-weekly Web series hosted by Constellation analysts. Tune in live at 9:00 a.m. PT / 12:00 p.m. ET every other Wednesday!

Video: https://www.youtube.com/embed/ucDAtad_J8c

Freshworks delivers strong Q4: Is it a giant killer?

Freshworks is planning to expand by courting enterprises fed up with their SaaS providers and layering in AI in its platform. Freshworks is also looking to land ServiceNow defectors.

Speaking on Freshworks' fourth quarter earnings call, CEO Dennis Woodside portrayed the company as a giant killer. He said:

"We ended the year with over 72,200 customers who’ve chosen Freshworks CX and EX (customer and employee experience) software to transform their business. Time and again, overpriced legacy software vendors with overcomplicated products drive customers directly into our hands.

More mid-market and enterprise customers are turning to Freshworks as they leave behind our largest IT competitors. We believe that’s because big SaaS vendors are overcharging and underserving their customers, particularly in the mid-market."

Freshworks offers customer experience software and IT service, operations and asset management applications, which fall under the company's EX category.

Woodside specifically focused on Freshworks' ability to poach ServiceNow customers for its Freshservice software. He cited wins over ServiceNow at a hard-drive manufacturer, the city and county of San Francisco and Mesa Airlines.

The Freshworks CEO added:

"Coherent, a global manufacturer of industrial and laser equipment, transitioned 500 internal agents and all ITSM workflows from multiple tools, including ServiceNow to Freshservice. Coherent recently expanded its use of Freshservice beyond IT to their HR department, supporting 25,000 employees. They have plans to expand Freshservice to additional teams, such as facilities and procurement."

Constellation Research analyst Liz Miller said:

"What we are seeing with Freshworks is the result of focus and doubling down on solving real experience issues through a service-driven automated approach. Freshworks had been attempting to compete in a very broad market swath and courting a broad list of customers…the we can be everything to anyone approach. In reality, the midmarket and small business needs focused solutions that can implement fast and scale even faster without breaking the bank at a critical time of growth and velocity."

To reinforce Freshworks' giant killer strategy, the company is also looking to monetize its Freddy Copilot. "We expect AI to be a tailwind for our business as customers are realizing tangible business value," said Woodside. "After launching Freddy Copilot in February, we ended the year with more than 2,200 customers, reflecting quarterly net adds of more than 500 or 30% growth quarter-over-quarter."

Freshworks had Copilot attach rates of more than 50% in new deals worth more than $30,000. For customer experience, Freshworks saw more than 1,300 customers using Freddy AI Agent.

Freshworks plans employee experience push to land midmarket companies

To keep momentum, Freshworks is also adding executives from some of the giants it is hunting. The company recently hired Srini Raghavan as chief product officer. Raghavan is an alum of RingCentral, Five9 and Cisco. Freshworks also hired Venki Subramanian, SVP of Product Management and an alum of SAP and ServiceNow.

Strong quarterly results

Freshworks reported a fourth quarter net loss of $23.8 million, or 7 cents a share, on revenue of $194.6 million, up 22% from a year ago. Non-GAAP earnings in the quarter were 14 cents a share.

For 2024, Freshworks reported a net loss of 32 cents a share on revenue of $720.4 million, up 21% from a year ago. Non-GAAP earnings for 2024 were 43 cents a share.

As for the outlook, Freshworks projected first quarter non-GAAP earnings of 12 cents a share to 14 cents a share on revenue of $190 million to $193 million, up 15% to 17%. For fiscal 2025, Freshworks is projecting revenue of $809 million to $821 million, up 12% to 14%, with non-GAAP earnings of 52 cents a share to 54 cents a share.


Supermicro's accounting woes curtail fiscal 2025 revenue outlook

Supermicro cut its revenue outlook for fiscal 2025 as the company still hasn't filed its audited annual report. The company said that it intends to make filings by Feb. 25 and raised $700 million in convertible bonds.

For fiscal 2025, Supermicro is projecting revenue of $23.5 billion to $25 billion. The company previously projected fiscal 2025 revenue of $26 billion to $30 billion.

The company, one of Nvidia's biggest customers, has seen sales surge due to demand for AI workloads. Supermicro has also seen its accounting headaches surge too along with subpoenas from the Department of Justice and Securities and Exchange Commission.

The company issued preliminary second quarter results.

However, CEO Charles Liang acknowledged that uncertainty around Supermicro's financials has hampered demand and cash flow. Nevertheless, Liang said the company "is well positioned to grow AI infrastructure design wins based on Nvidia Blackwell" and can deliver $40 billion in revenue in fiscal 2026. Supermicro competes with HPE and Dell Technologies, among others.

As for the unaudited second quarter results, Supermicro said earnings will be between 50 cents and 52 cents a share. Revenue for the second quarter will be between $5.6 billion and $5.7 billion. Non-GAAP earnings will be between 58 cents and 60 cents a share.

For the third quarter, Supermicro said sales will be between $5 billion and $6 billion, with non-GAAP earnings of 46 cents to 62 cents a share.

On a conference call, Liang said Supermicro has swapped auditors after the previous one quit, and has added a new CFO and chief commercial officer.

What remains to be seen is how fast Supermicro can regain credibility. Liang said the company is focused on technology wins. He said:

"Our NVIDIA Blackwell products are shipping now. We have begun volume shipments of both air cooled 10U and liquid-cooled 4U NVIDIA B200 HGX systems. Meanwhile our NVIDIA GB200 NVL72 racks are fully ready as well. Utilizing our system building blocks, we are going to soon offer more brand-new platforms for customers seeking further optimized, higher-density and even greener AI solutions. While most key components are ramping at full speed, it will take some time to fulfill our current AI solution backlogs. Some customers also need more time to finish their DLC data centers build out. At the same time, we see strong new demands keep coming in from enterprises, CSPs, sovereign entities, and hyperscalers. "

Liang touted Supermicro liquid cooled infrastructure wins for xAI's Colossus AI supercomputer. Indeed, Supermicro still operates amid an AI infrastructure boom.

On the cash front, it's worth noting that Supermicro began the second quarter with $2.1 billion in cash but ended with $1.4 billion in cash. Here's the explanation from CFO David Weigand.

"Super Micro began the quarter with approximately $2.1 billion in cash and recorded approximately $320 million in GAAP net income for the second quarter. Cash was provided from lower inventory and other sources totaling $1.5 billion. The company used cash to pay down accounts payable by $1.2 billion, realized higher other receivables from purchase rebates and prepaid inventory of $484 million, increased accounts receivable by $335 million, reduced bank loans by $346 million net, incurred capital expenditures of $28 million and had other uses of cash totaling $87 million. This resulted in a reduction in cash of $660 million, thereby ending the Company’s 2QFY25 quarter with $1.4 billion in cash in December. We have continued to prudently manage our working capital and ended January 2025 with approximately $2 billion in cash."
