Results

Digital Asset Management: The Last Mile Of CX Delivery

Don't miss this CR #CX Convo! Liz Miller sits down with Adobe's Shelly Chiang to unpack the game-changing potential of modern Digital Asset Management (#DAM). 

Key Insights:
📌 DAM is no longer just a digital filing cabinet
📌 #AI is transforming how teams discover, use, and activate content
📌 Modern DAM breaks down silos across marketing, sales, HR, and global teams
📌 Intelligent search means finding the RIGHT asset in seconds, not hours

Looking to turn your content into a strategic advantage? This convo is a must-watch for #marketing leaders trying to scale creativity and drive efficiency.

Watch the full interview!

On CR Conversations: https://www.youtube.com/embed/RjgUh4vrHk4?si=oHwHF7VbWLjZbB2O

CFOs lose appetite for risk in Q2 amid economy worries, says Deloitte

Chief financial officers are paring expectations for the economy, revenue, earnings and capital investments for the second quarter as they become more risk averse, according to Deloitte's latest CFO Signals report.

Deloitte's quarterly report showed that CFO sentiment fell in the second quarter, with the confidence score slipping to 5.4, indicating medium confidence, from 6.4 in the first quarter, a reading that indicated high confidence.

By the numbers:

  • 23% of CFOs rate the North American economy as good now. In the first quarter, half of CFOs said the North American economy was good.
  • 46% of CFOs said the US stock market was undervalued, while 41% said it was overvalued.
  • 53% of CFOs said debt financing was attractive.
  • One in three CFOs believe it's a good time to take on risk. That reading is the lowest since the third quarter of 2024 and well below the 60% of CFOs who thought the first quarter was a good time to take risk.

The top external and internal risks for CFOs were worth noting. On the external side:

  • 53% of CFOs said the economy was the top external risk.
  • 51% cited cybersecurity.
  • 43% cited interest rates.
  • 42% cited supply chain disruptions.
  • 42% cited inflation.
  • 37% cited geopolitics, followed by 32% who cited taxes.

The top internal risks included the following:

  • 46% of CFOs were worried about talent.
  • 46% were worried about a lack of agility and resilience.
  • 45% cited cost management.
  • 43% cited efficiency and productivity.
  • 43% cited data compatibility and accessibility.
  • 43% cited technology deployment including AI.
  • 36% cited strategy execution.

Ingram Micro confirms ransomware attack, eyes recovery

Ingram Micro, a large technology distributor, said it has been hit with a ransomware attack that took down its systems.

The press release confirms reports from BleepingComputer as well as comments on Reddit about the outage. Ingram Micro systems have been down since Thursday. The company said Tuesday that it has been restoring systems, but is taking orders via phone and email in many cases.

In a statement, Ingram Micro said:

"Ingram Micro recently identified ransomware on certain of its internal systems. Promptly after learning of the issue, the Company took steps to secure the relevant environment, including proactively taking certain systems offline and implementing other mitigation measures. The Company also launched an investigation with the assistance of leading cybersecurity experts and notified law enforcement.

Ingram Micro is working diligently to restore the affected systems so that it can process and ship orders, and the Company apologizes for any disruption this issue is causing its customers, vendor partners, and others."

According to BleepingComputer, Ingram Micro was hit by the SafePay ransomware gang.

On July 7, Ingram Micro posted an update saying it has made progress restoring systems, but is using its unified support organization to fulfill orders. The company said:

"Today, we made important progress on restoring our transactional business. Subscription orders, including renewals and modifications, are available globally and are being processed centrally via Ingram Micro’s support organization. Additionally, we are now able to process orders received by phone or email from the UK, Germany, France, Italy, Spain, Brazil, India, China, Portugal and Nordics. Some limitations still exist with hardware and other technology orders, which will be clarified as orders are placed. To place subscription orders, customers should contact Unified Support. For general inquiries, customers should contact their sales representative."

Ingram Micro didn't detail what systems have been impacted, but the company's recently announced Xvantage distribution platform is reportedly one of them. Ingram Micro had scheduled a preview of Xvantage components for July 17.

Xvantage was launched in 2022 and now uses AI to automate quote creation, order management and real-time tracking.

The thing to watch now is whether Ingram Micro pays the ransom to get its systems back or continues to absorb a sales hit. The outage began just before the long July 4th holiday weekend, when business was likely to be slow, but every additional day of downtime costs sales. Ingram Micro will also have to detail costs associated with the attack.

Based on Ingram Micro's first quarter sales of $12.28 billion, the company stands to lose more than $136 million in sales for each day it can't fulfill orders. For comparison, Clorox had to issue a profit warning due to a cyberattack and saw sales decline more than 20%.
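
That per-day estimate follows from simple arithmetic on the quarterly figure, assuming a roughly 90-day quarter:

```python
quarterly_sales = 12.28e9  # Ingram Micro's first quarter sales, in dollars
days_in_quarter = 90       # rough assumption; fiscal quarters vary slightly

daily_sales = quarterly_sales / days_in_quarter
print(round(daily_sales / 1e6))  # millions of dollars of sales at risk per day
```

That works out to roughly $136 million a day, matching the figure above.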

CoreWeave buys Core Scientific for $9 billion

CoreWeave said it will acquire Core Scientific in an all-stock deal valued at about $9 billion. Core Scientific, a crypto mining company, was a data center provider to CoreWeave.

According to CoreWeave, the acquisition of Core Scientific will eliminate about $10 billion of future lease obligations, create $500 million in annual run-rate cost savings by the end of 2027 and simplify operations.

CoreWeave said it may "repurpose or divest" Core Scientific's crypto mining business over time.

Under the terms of the deal, expected to close in the fourth quarter, CoreWeave will issue 0.1235 shares of its Class A stock for each Core Scientific share. When the deal closes, Core Scientific shareholders will own less than 10% of CoreWeave.

The purchase of Core Scientific will give CoreWeave 1.3 GW of gross power and an expanded data center footprint. CoreWeave also said the deal will give it 1 GW of power for expansion.

Michael Intrator, CEO of CoreWeave, said adding Core Scientific's data center network enables it to "significantly enhance operating efficiency and de-risk our future expansion, solidifying our growth trajectory."

CoreWeave said the purchase will reduce its cost of capital, give it more control over its footprint and add expertise in construction, power and site management. CoreWeave derives 83% of its revenue from three customers, led by Microsoft. The company also recently announced deals with OpenAI and IBM.

The AI cloud provider has been expanding rapidly since its March 2025 IPO. CoreWeave has been standing up systems based on Nvidia's latest technology, acquiring firms like Weights & Biases and growing revenue at a rapid clip amid a boom in AI infrastructure.

For the first quarter, CoreWeave reported revenue of $981.63 million, up 420% from a year ago. The company's net loss was $314 million. Non-GAAP net loss for the first quarter was $149.55 million.

Sakana AI: Think LLM dream teams, not single models

Enterprises may want to start thinking of large language models (LLMs) as ensemble casts that can combine knowledge and reasoning to complete tasks, according to Japanese AI lab Sakana AI.

Sakana AI in a research paper outlined a method called Multi-LLM AB-MCTS (Adaptive Branching Monte Carlo Tree Search) that uses a collection of LLMs to cooperate, perform trial-and-error and leverage strengths to solve complex problems.

In a post, Sakana AI said:

"Frontier AI models like ChatGPT, Gemini, Grok, and DeepSeek are evolving at a breathtaking pace amidst fierce competition. However, no matter how advanced they become, each model retains its own individuality stemming from its unique training data and methods. We see these biases and varied aptitudes not as limitations, but as precious resources for creating collective intelligence. Just as a dream team of diverse human experts tackles complex problems, AIs should also collaborate by bringing their unique strengths to the table."

Sakana AI said AB-MCTS is a method for inference-time scaling to enable frontier AIs to cooperate and revisit problems and solutions. Sakana AI released the algorithm as an open source framework called TreeQuest, which has a flexible API that allows users to use AB-MCTS for tasks with multiple LLMs and custom scoring.
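
As a rough illustration of the adaptive-branching idea, deciding at each step whether to "go wider" with a fresh candidate or "go deeper" by refining the most promising one, with every model proposing at each step, here is a toy Python sketch. The models, scoring function and search loop below are illustrative stand-ins, not TreeQuest's API or Sakana AI's actual algorithm.

```python
import random

random.seed(0)

TARGET = 42  # toy "problem": recover a hidden number by propose-and-refine

# Stand-ins for two different LLMs, each with its own bias. In a real
# Multi-LLM AB-MCTS setup these would be API calls to frontier models.
def model_a(base):
    if base is None:                      # "go wider": propose from scratch
        return random.randint(0, 100)
    return base + random.randint(-5, 10)  # "go deeper": refine a candidate

def model_b(base):
    if base is None:
        return random.randint(0, 100)
    return base + random.randint(-10, 5)

MODELS = [model_a, model_b]

def score(candidate):
    # Higher is better. Real systems would score with unit tests,
    # verifiers or judge models rather than a known target.
    return -abs(candidate - TARGET)

def ab_search(iterations=200):
    """Each iteration either widens the search with fresh candidates or
    deepens it by refining the best candidate so far, with every model
    contributing a proposal."""
    candidates = []
    for _ in range(iterations):
        widen = not candidates or random.random() < 0.3
        base = None if widen else max(candidates, key=score)
        for model in MODELS:
            candidates.append(model(base))
    return max(candidates, key=score)

best = ab_search()
```

Even in this toy, a candidate produced by one proposer becomes the hint another refines, which is the collective-intelligence point Sakana AI is making.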

What's interesting is that Sakana AI gets out of that zero-sum LLM argument. The companies behind LLM training would like you to think there's one model to rule them all. And you'd do the same if you were spending so much on training models and wanted to lock in customers for scale and returns.

Sakana AI's deceptively simple solution can only come from a company that's not trying to play LLM leapfrog every few minutes. The power of AI is in the ability to maximize the potential of each LLM. Sakana AI said:

"We saw examples where problems that were unsolvable by any single LLM were solved by combining multiple LLMs. This went beyond simply assigning the best LLM to each problem. In (an) example, even though the solution initially generated by o4-mini was incorrect, DeepSeek-R1-0528 and Gemini-2.5-Pro were able to use it as a hint to arrive at the correct solution in the next step. This demonstrates that Multi-LLM AB-MCTS can flexibly combine frontier models to solve previously unsolvable problems, pushing the limits of what is achievable by using LLMs as a collective intelligence."

A few thoughts:

  • Sakana AI's research and its move to emphasize collective intelligence over one LLM and stack is critical for enterprises that need to create architectures that don't lock them into one provider.
  • AB-MCTS could play into what agentic AI needs to become to be effective and complement emerging standards such as Model Context Protocol (MCP) and Agent2Agent.
  • If combining multiple models to solve problems becomes frictionless, the costs will plunge. Will you need to pay up for OpenAI when you can leverage LLMs like DeepSeek combined with Gemini and a few others? 
  • Enterprises may want to start thinking about how to build decision engines instead of an overall AI stack. 
  • We could see a scenario where a collective of LLMs achieves superintelligence before any one model or provider. If that scenario plays out, can LLM giants maintain valuations?
  • The value in AI may not be in the infrastructure or foundational models in the long run, but the architecture and approaches.

AI's boom and the questions few ask

The money being thrown around AI talent and infrastructure is staggering, but the return on investment may be sketchy for longer time frames. What happens if AI demand doesn't deliver triple-digit growth forever?

In recent weeks, we've seen the following:

Oracle is predicting revenue gains for fiscal 2028. CEO Safra Catz told employees Oracle is off to a strong start in fiscal 2026 and the company signed multiple large cloud deals "including one that is expected to contribute more than $30 billion in annual revenue starting in FY28." Bloomberg later reported that Oracle's big cloud deal was with OpenAI.

Meta CEO Mark Zuckerberg is trying to hire a dream team and throwing billions into the effort. Zuckerberg is chasing superintelligence, but supergroups can be tough to manage.

CoreWeave said it's the first AI cloud provider to deploy Nvidia's GB300 NVL72 systems for customers. CoreWeave has also signed an $11.9 billion deal with OpenAI for future compute capacity for model training. CoreWeave's model is fairly simple: lever up with debt ($8.8 billion as of March 31) and grow your way out of it as future demand materializes. The issue: CoreWeave paid $460 million to service its debt in the first quarter and delivered overall net cash of $61 million. Simply put, CoreWeave would be a great business if it didn't have to pay interest rates between 9% and 15% depending on the credit facility. The company had cash and equivalents of $1.3 billion as of March 31, and has since raised another $7.5 billion in debt financing for its AI data center buildout.

As previously noted, the AI infrastructure game is really just a big leveraged bet that's working for now, but it's worth asking a few questions.

  • How much of the AI boom is dependent on OpenAI posting crazy growth years into the future? Turns out a good bit. Oracle is building like mad on the bet that OpenAI is going to be bigger, badder and superintelligent in two years. What could go wrong? Well, Google, China's AI champions, Microsoft competition, hardware risks and a model training wall to name a few. CoreWeave is betting that OpenAI will be "a significant customer in future periods." Let's hope so. Microsoft was 72% of revenue in the first quarter. Three customers were 83% of CoreWeave revenue.
  • Is Microsoft the smartest of the bunch? Microsoft is allowing OpenAI to diversify its infrastructure spending so it doesn't have to fork over so much dough. Microsoft and OpenAI are bickering over the terms of their partnership as the latter tries to ultimately go public and needs a new structure.
  • Will Nvidia's rivals be good enough? The base of this AI infrastructure boom is Nvidia. Giants are spending mostly on Nvidia, but the market is diversifying with hyperscale cloud custom silicon and AMD. Is it possible that levering up to buy Nvidia GPUs isn't a slam dunk?
  • When will the AI infrastructure music stop? The only guarantee is that the spending boom will pause and there will be a glut. Timelines are debatable, but rest assured that deals based on demand years into the future are going to produce spectacular failures.

Add it up and AI infrastructure is looking a lot more like the sports world. Billionaires are spending hundreds of millions, if not billions, of dollars on players who may produce in the future (or not). A $300 million contract for a player often doesn't pay off. These AI deals aren't much different.

Grammarly gets more interesting with Coda, Superhuman deals

Grammarly built a strong business by bringing its AI writing assistant to wherever you work, but recent acquisitions of Coda and Superhuman and new capital point to much larger ambitions.

The company's acquisition this week of Superhuman, which gives Grammarly a native email platform, is just the latest in a series of moves to redefine the company in the AI age. Grammarly is best known as a handy writing tool, but large language models (LLMs) threaten to usurp it.

The fix? Grammarly, which has annual revenue of more than $700 million, is going to become an AI-native productivity platform.

Here's the recap of recent events:

Mehrotra, former CTO and product chief at YouTube, noted that Grammarly has "a massive opportunity to reinvent productivity as we know it." Grammarly with Coda set out a mission to focus on how AI agents can improve applications and work across the enterprise.

Grammarly's writing assistant is used across 500,000 apps by more than 40 million people daily. Coda brought Coda Docs, a productivity suite, and Coda Brain, which surfaces corporate knowledge, to Grammarly.

Mehrotra said Superhuman will give Grammarly customers a place to collaborate and be a staging ground for orchestrating AI agents. Grammarly works across more than 20 email providers, but it can do more with a native email platform.

The vision here is that the Grammarly platform will use AI agents to triage your inbox, schedule meetings, analyze your content and write full emails in your voice.

As for the future, Grammarly said the following:

"The future platform will enable scenarios where users can work with multiple agents simultaneously. For example, while writing a customer memo, users could have Grammarly’s trusted communication agent handle spelling and grammar, while a sales agent ensures accuracy of sales facts, a support agent provides context about recent customer issues, and a marketing agent suggests optimal feature positioning."

Now what?

Grammarly clearly has the parts for a broad productivity platform and now has to integrate them. Grammarly is selling its various parts separately, but the magic will really happen with an integrated platform.

Rest assured that a platform launch is on deck.

It's also possible that Grammarly is going to need a rebrand. Grammarly is certainly handy enough to drive $700 million in annual revenue, but it sounds like a feature more than a productivity suite. Perhaps Grammarly simply becomes Superhuman or comes up with a new moniker.

Constellation Research's take on Grammarly's moves was generally positive: the strategy is on point, but it remains to be seen whether Grammarly can pull its users into a broader productivity platform.

Esteban Kolsky, an analyst at Constellation Research, said Grammarly's recent moves are a "way to ensure survival in AI world." He added that "Grammarly's core offering is superseded by AI now so the company needed a new hook. Superhuman is a decent one, but unclear if too little too late."

Holger Mueller, an analyst at Constellation Research, said that Grammarly needed to have more native email support and Superhuman fills the void. The ability to integrate with multiple email systems was useful, but Grammarly needed native support for email, which is the most written document type and natural collaboration space.

Liz Miller, an analyst at Constellation Research, noted that Grammarly just got a lot more interesting with the Superhuman purchase since it makes its future of work mantra more of a reality. Don't understate Grammarly's core offering though.

"Users like it, it delivers the exact value they believe they are opting in for and Grammarly's understanding of language and its imperfections is a tangible and sellable asset to anyone," said Miller, who agreed with the idea that Grammarly needed a new hook.

Constellation Research analyst Michael Ni added:

"This Superhuman purchase isn’t just Grammarly buying an email client—it’s a full-blown AI productivity fabric play. Start with always-on assistance that already has mass appeal, drop into real-time action in Superhuman, and pull it all together in Coda. Boom. It’s AI at the point of thought, decision, and execution. Expect this to challenge those who are playing for the worker "pane of glass" for where work gets done and how it gets measured."

Boardroom AI Investment, Customer Success, Agentic Fatigue | ConstellationTV Episode 108

Don't miss ConstellationTV episode 108! 📺 This week, co-hosts Larry Dignan and Martin Schneider kick off with #enterprise tech trends, including the slowing pace of AI adoption, innovation fatigue and data challenges, and how practical AI with measurable productivity trumps agentic AI hype.

Next, Martin tees up his latest Market Overview on customer success, focusing on...
📌 Customer success as a full-journey strategic function
📌 Moving beyond renewals to driving growth and expansion
📌 Leveraging AI, community tools, and education platforms

Finally, 2025 AI150 inductee David Bray, PhD, shares AI wisdom for the boardroom. Bray recommends that boards adopt flexible 9-12 month plans, embrace decision elasticity, and prioritize adaptability over rigid strategies.

Watch the full episode below!👇 

00:00 - Meet the Hosts
00:23 - Enterprise Tech News
11:56 - Customer Success Research
17:37 - Interview with David Bray

On ConstellationTV: https://www.youtube.com/embed/-Oyx7jcZMD0?si=RSIxf7_n60mrBRyD

HPE completes Juniper Networks purchase, eyes integration next

HPE has closed the acquisition of Juniper Networks in a move that will double its networking business.

Completing the deal took more than 18 months.

The company announced plans to buy Juniper in January 2024 for $14 billion, received shareholder approval in April of that year and then ran into regulators and a new administration in the US.

Last month, HPE said it settled with the US Department of Justice and agreed to divest its Instant On campus and branch networking business and provide limited access to Juniper's Mist AIOps technology.

HPE, which just held its Discover annual conference, is looking to use Juniper to offer a complete AI stack and improve its margins. Juniper delivered first quarter net income of $64 million on revenue of $1.28 billion, up 11% from a year ago.

CEO Antonio Neri said HPE is looking to capitalize on the convergence of AI infrastructure and networking and expand its total addressable market. In a blog post, Neri added:

"AI relies on vast, distributed datasets that must be connected securely and sustainably to train or fine-tune foundational or agentic models, and to deploy them for inferencing. That means the network must do more than simply connect users, servers, and storage. It must adapt, scale, and continually become more intelligent."

With the deal closed, now the hard work begins. Here's what HPE said it would do with Juniper in the fold.

  • Integrate Juniper networking with HPE's stack across AI infrastructure and hybrid cloud.
  • Expand into adjacent markets including data center, firewalls and routers.
  • Develop AI-centric integrated systems.
  • Offer a complete stack via HPE's global sales teams and channel.
  • Grow non-GAAP earnings in the first year after the close with the combined networking business accounting for more than 50% of HPE operating income.

Constellation Research analyst Holger Mueller said:

"A deal that looked like it may not clear regulatory hurdles has made it to the finish line. A compliment for HPE and its leadership tenacity - and a strategic win for HPE as it bolsters its networking business - almost a decade after the Aruba acquisition. Juniper gives HPE key capabilities of software defined networking and even more importantly - sizeable public cloud revenue, which has been an area of growth that has eluded HPE. The deal takes HPE back to the future when it was an HP that offered almost all a CIO needed to buy for an enterprise."

Here’s a look at Juniper’s trended revenue it will bring to HPE.

In its first quarter commentary, Juniper said:

“Product orders remained strong and were better than expected, growing double digits year-over-year for the fourth consecutive quarter. Cloud orders continued to be particularly robust, growing triple digits on a year-over-year basis and double digits sequentially, as these customers invest to enable their AI initiatives. Enterprise orders saw double digit year-over-year growth, with orders for Mist and other products attached to the Mist cloud growing more than 40% year-over-year. Service Provider orders were down on a year-over-year basis.”

Cloudflare's pay per crawl system takes aim at AI crawler freebies

Cloudflare said it is offering a "pay per crawl" plan where websites will automatically block AI crawlers and can charge for access. Is this the start of the data dark ages for AI?

The move by Cloudflare makes a lot of sense. Publishers and content creators are providing data for AI models to consume for free as traffic tanks. Some sites have licensing deals, but many don't. As Cloudflare noted, people are getting content from models and not driving traffic to sites. We're consuming derivatives, not originals, with large language models in the middle.

Cloudflare is proposing a system where news sites, publishers and social media platforms can be paid per crawl. Basically, we're talking about a paywall for humans and machines.

The pay per crawl system is in private beta and could be the start of similar models. While this effort is focused on content producers, it does raise a few interesting potential developments for enterprises.
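
Cloudflare has pointed to the long-dormant HTTP 402 "Payment Required" status code as the hook for this model. The gateway logic below is a hypothetical sketch of how such a system might decide what to serve; the parameters and decision rules are illustrative assumptions, not Cloudflare's actual implementation.

```python
def crawl_response(is_ai_crawler: bool, has_payment_token: bool,
                   price_cents: int) -> int:
    """Return the HTTP status a hypothetical pay-per-crawl gateway sends."""
    if not is_ai_crawler:
        return 200  # human visitors pass through normally
    if has_payment_token:
        return 200  # the crawler agreed to the price: serve the content
    if price_cents > 0:
        return 402  # Payment Required: quote the per-crawl price
    return 403  # the site blocks AI crawlers outright

# A site charging 5 cents per crawl challenges an unpaying AI crawler:
print(crawl_response(is_ai_crawler=True, has_payment_token=False, price_cents=5))
```

In the agentic scenario Cloudflare sketches, an agent's budget logic would sit on the other side of that 402 response, deciding whether the quoted price is worth paying.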

Brands will want their content out there and would enable AI crawlers. The downside for users is that if all content creators blocked AI crawlers, we'd have a web of marketing speak.

Will models advance without unfettered free access to content? If we live in the land of paywalls for humans and machines, the data scarcity issue will only get worse. Constellation Research CEO R “Ray” Wang has noted that data is going to become scarce and create a data dark age in 2027.

Constellation Research analyst Michael Ni said the Cloudflare move highlights three issues:

  • It reflects the broader trend of open data going dark as collectives look to monetize.
  • It also reflects the shifting business model needed to fund quality information.
  • It has implications for those needing data to ensure accurate decisions, whether automated or guided.

Perhaps LLMs will be evaluated on their access to the most current and accurate information for grounding purposes. If grounding becomes a larger part of LLM evaluation, it's likely that Google and its Gemini models would have an edge.

How Cloudflare's pay per crawl system develops is worth watching and enterprises will need to ponder the following going forward.

  • What's the impact on model performance if there's data scarcity for new information?
  • Should industries form data collectives to ensure there is accurate information for model training?
  • Will enterprises need to rely on synthetic data setups if a data dark age emerges?
  • How do content pay roads impact AI agents? Cloudflare said:

"The true potential of pay per crawl may emerge in an agentic world. What if an agentic paywall could operate entirely programmatically? Imagine asking your favorite deep research program to help you synthesize the latest cancer research or a legal brief, or just help you find the best restaurant in Soho — and then giving that agent a budget to spend to acquire the best and most relevant content."
