Will IonQ make quantum computing enterprise relevant in 2025?

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly.

IonQ, seen as one of the major players in quantum computing, is arguing that enterprise relevance for its nascent market will arrive in 2025--well before most observers expect.

That argument, made during IonQ's Analyst Day Sept. 19, is notable and may surprise folks who are betting that quantum computing will have enterprise relevance in a decade or so. Chuckle if you will, but I'll argue it's worth hearing IonQ out as it develops its #AQ 64 quantum system for 2025 commercial deployments.

First, let's state the obvious--quantum computing is in the early stage. In some ways, quantum computing is a fascinating market that could be the next big thing after generative AI, cloud, mobile, Internet and personal computing. It's not a question of IF quantum computing takes off, but WHEN.

You can see Constellation Research Shortlists from Holger Mueller to get the lay of the quantum computing land.

IonQ's financials tell the tale of early-stage markets. For instance, IonQ has an estimated $52.5 million in bookings for fiscal 2023, but year-to-date revenue through June 30 was $9.8 million with a net loss of $71.05 million. IonQ's second-quarter revenue was $5.5 million, up from $2.6 million a year ago. The company does have more than $500 million in cash to figure things out, and executives noted IonQ will be self-sufficient without raising more dough.

Whether IonQ is worth a market capitalization of more than $3 billion remains to be seen.

CEO Peter Chapman made the argument that IonQ can be one of those generational companies, but in the meantime, it's hiring a lot of talent from the likes of Nvidia, Microsoft, Amazon, Oracle, Uber and Apple. IonQ has also partnered with QuantumBasel to establish a European quantum data center housing its #AQ 35 and #AQ 64 systems and signed a memorandum of understanding with South Korea's Ministry of Science and ICT.

Here's a look at my takeaways from IonQ's Analyst Day. I sat through 5 hours of presentations and 103 slides, so you didn't have to.

The big picture. Quantum computing will be important, but the big question is when. Chapman argued that quantum computing will be enterprise relevant and solve real problems when its #AQ 64 system scales up in 2025. Chapman said there's a big difference between solving real problems and waiting for quantum supremacy or quantum advantage to do anything. "We do not care about quantum advantage or quantum supremacy," said Chapman. "The test for me is this. Can I solve a customer problem with a better mousetrap at a better price? Our goal is to build quantum computers that solve problems for customers."

IonQ CFO Thomas Kramer added that "it's too late to start a quantum computing company if you wait for quantum supremacy."

IonQ is pragmatic. A deep dive into the company's production and engineering plans revolved around usage of common parts, modularity and rack-mounted systems. Sure, IonQ's presentation was as quantum geeky as the rest of the field, but the company is homed in on solving problems and setting the stage for on-premises deployments, servicing and maintaining systems as easily as possible.

There's a solid roadmap, and IonQ has customer references and use cases. IonQ outlined four customers where it has helped develop algorithms that can scale on future quantum systems. The general idea: Develop the algorithms for quantum now so when the compute lands you'll be ready to roll. Those customers were also heavy hitters: Airbus, Hyundai, GE Research and the Air Force Research Laboratory.

The company also is self-aware. IonQ has added strong executives, outlined a strong plan and has matured a lot since its October 2021 special purpose acquisition company (SPAC) IPO. Executives noted that IonQ has evolved from an academic and research driven organization to one that is focused on engineering. The next evolution will be moving from an engineering focused org to a product focused one. That evolution to be product focused will be "the next phase over the next several years," said Chapman.

Today, IonQ is engineering driven, and that means tackling issues like error correction. Chapman and his team noted that #AQ 64 may not require error correction since it'll be able to mitigate issues ahead of time. "Error mitigation is a statistical approach to remove errors before systems are delivered to the customer," said Chapman. IonQ is pursuing both options at this phase of #AQ 64 development.

IonQ is more of a services firm today. IonQ is public but still an early-stage company that talks in terms of bookings and interest without actual sales. IonQ looks more like a services firm as it develops products, algorithms and its ecosystem. The company will sell hardware and has multiple revenue options, but today it's helping customers with know-how, proofs of concept, applications and use cases. This approach isn't surprising, and there's precedent: Palantir and C3 AI were more consulting and services firms before becoming more product focused.

Manufacturing and supply chain are big unknowns. IonQ is building its manufacturing facility in Seattle, but it remains to be seen if it can deliver quantum systems at scale. IonQ has to build out its supply chain, source components, vertically integrate as needed and decide what parts it needs to create itself. IonQ's Seattle Manufacturing Hub and Data Center is set to start manufacturing in the fourth quarter.

Software will be critical. IonQ said that its software approach will be critical for everything from reducing noise in quantum systems to error mitigation and connecting to the broader ecosystem that'll include quantum processors, GPUs and CPUs. In addition, software will need to be developed for quantum systems while still working with classical computing.

What does the revenue model look like in the future? Executives walked through the go-to-market approach and future revenue streams. Production systems for commercial, government and academic customers will create hardware revenue, but there will also be access agreements, usage-based models and work-completed models. Broadly speaking, IonQ's future revenue drivers include:

  • Application co-development where IonQ partners with companies to develop end-to-end quantum systems.
  • Partner cloud access via hyperscale cloud providers.
  • Preferred computing agreements.
  • Dedicated hardware. "We are seeing sustained interest from multiple parties," said Kramer. "Hardware will be sold and quantum will run on-premises."
  • Apps and software.

Chapman added that there's a lot of interest in quantum networking and that has potential too. IonQ could also be involved with designing products with quantum systems. "At some point in the future, we'll be doing designs and getting royalties for things like battery design and drug discovery," said Chapman. "If 10 to 15 years from now our only source of revenue is systems sales, we somehow failed."

Final thought. It's easy to argue that IonQ will simply be roadkill for much larger players including IBM, Google, Nvidia and a bevy of others. Then again, IonQ has a pragmatic approach and focus that potential rivals don't. For now, track the quantum computing space in your futures file.

HOT TAKE: Salesforce Announces Airkit.ai Intention and Advances Promise for Easy Trusted AI

The dust hasn’t even settled from the mega-campground known as Dreamforce, and already Salesforce is following through on a promise made in those crowded halls of Moscone Center: AI should be simple, trusted and available for all to deploy in meaningful ways. Salesforce has announced its intention to acquire Airkit.ai, a low-code/no-code bot builder that has been hot in the customer service and contact center space with its easy-to-deploy-and-manage AI-powered agents.

Airkit.ai’s claim to fame has largely grown around a belief that AI-empowered bots should do far more than spit back FAQs and simple help responses. Instead, Airkit.ai has encouraged users to think beyond answers and into more proactive, rich and resolution-driving engagements. This has found a natural sweet spot in commerce, where the outcome of a bot experience is measured in positive (and profitable) experiences rather than in engagement deflection as the purpose of a self-service motion.

What We Know About The Deal: Not much.

In the press release officially launching the intention news, Salesforce declined to provide any terms or dollar amounts for the deal and noted that it would not be disclosing any further details about the acquisition. What we DO know is that this is not the first time Salesforce has acquired a company from Airkit.ai’s founders, Adam Evans and Stephen Ehikian. In 2014, the duo sold their previous company, RelateIQ, to Salesforce for $390 million, making them part of the Salesforce boomerang trend that has seen former employees AND former entrepreneurs make their way back to the fold. For what it’s worth, Airkit.ai was not an unknown quantity to Salesforce, as it was part of the Salesforce Ventures portfolio that, as of late, has been hyper-focused on AI solutions and tools. In fact, Salesforce Ventures was part of Airkit.ai’s initial funding back in 2020. Airkit.ai will become part of Service Cloud and will continue to be led by Evans, who was co-founder and CTO of both Airkit.ai and RelateIQ.

Interestingly, both RelateIQ and Airkit.ai represent building-block pieces for Salesforce. When it was time to advance its automation goals, RelateIQ was a great pickup to leverage unstructured data across things like social networks, chats and calendars to automate the sales process. Now, in this "Data + AI Era" for Salesforce, Airkit.ai helps build AI-powered customer service agents that learn from everything from business policies to customer data from transactions or previous engagements.

The acquisition is expected to close in the second half of Salesforce’s fiscal year 2024.

What This Means for Salesforce: Bots everywhere!

In the early days of Salesforce’s Einstein strategy, bots played a central, if not exclusive, role with the introduction of multi-channel and multilingual “Einstein bots” that could automate common tasks and answer common questions. But in this new world of AI- and data-enriched, contextual and personalized engagement as the table stakes of a profitable customer experience, these simple answer-focused, deflection-as-outcome bots wouldn’t necessarily be best of breed. In fact, according to Airkit.ai, leading-edge, experience-driven brands need to “ditch” bots that center on “deflection and containment KPIs” in favor of intelligent experiences that embrace “resolution as the ultimate mark of success.”

With Data Cloud in place and the addition of the Einstein Platform and Einstein Studio on top of the Salesforce Platform, the company is now ready to supercharge its customers' capacity to not just deliver results through AI-powered bots, but to have the actual data and AI infrastructure in place to ensure that sales and service teams can safely, quickly and easily deploy digital cross-channel assistants. This is as much a play for Salesforce’s Service Cloud as it is for the growing portfolio of smart features in Commerce Cloud, as Airkit.ai offers proactive service and commerce engagements that keep customers happy and coming back for more.

The Bottom Line: A Nice Little Pickup That Could Deliver Big Applications

This pickup makes good on the promise that deploying AI tools and experiences shouldn’t require a costly cadre of data scientists and prompt engineers to get an engagement up and rolling. While Airkit.ai will find its initial home under the Service Cloud banner, it is unlikely that the functionality of Airkit.ai’s easy-to-configure-and-deploy bots will stay exclusively within those cloud walls. Expect to see functional use cases ready for deployment right out of the gate as this acquisition closes, with bots and engagements tailored for service, sales and marketing. But I’m also excited to see how David Schmaier and team weave in that strong history of industry-centered expertise and products to extend the Airkit.ai use case far beyond familiar customer service and contact center storylines as Salesforce dives into industry-specific bots that are proactive and contextual to a customer's or employee's journey.

IT incidents, response hurdles drive up enterprise costs, says Constellation Research survey

Enterprises are being barraged by IT incidents, face a shortage of skilled personnel and lack the time to follow best practices or automate response processes, according to a new Constellation Research report. And major IT incidents aren't cheap.

In fact, 53% of enterprises have seen anywhere from 3 to 10 major IT incidents in the past 12 months, up from 47% the previous year. More shocking: 56% of respondents say at least 50% of incidents could have been avoided with best practices. In addition, 55% of enterprises say major IT incidents have cost their organizations less than $100,000, with 27% putting expenses at $100,000 to $500,000 and 12% citing costs of $500,000 to $1 million.

Those are some of the findings of Constellation Research's recent report, "An Executive Guide to Faster Incident Resolution" by Andy Thurai. The report is based on a survey by Constellation Research and Dimensional Research of more than 300 respondents. A third of respondents were incident responders, a third were their direct managers and another third were budget holders of the incident response unit.

Thurai's report is timely given Cisco's $28 billion acquisition of Splunk and its push to expand in the observability, security and AI markets. One key finding in the survey is that 40% of respondents take 10 to 30 minutes just to identify an incident (not resolve it).

One big reason for this time to identify an incident is enterprises have siloed IT observability tools. Other enterprises have legacy tools that are slow. And assuming observability tools work well, users can quickly run into alert fatigue and miss critical incidents.

Simply put, a Cisco-Splunk combination of observability forces makes sense as vendors consolidate in the field.

It remains to be seen whether Cisco with Splunk can help enterprises respond to IT incidents. In the meantime, here are a few not-so-fun facts from Thurai's report.

  • 46% of respondents said automating resolution response to IT incidents was the biggest area to improve.
  • 36% said they could improve incident identification and find a resolution.
  • 49% of all incidents are straightforward and responses could be automated.
  • 64% said AI would be critical to identifying the root cause of incidents.
  • 34% said the top reasons for major IT incidents were manual processes and human error.
Microsoft's Copilot enterprise upsell begins Nov. 1, Copilot fatigue will follow

Microsoft 365 Copilot will be available to enterprises Nov. 1 in a move that will test the limits of the add-on approach to cloud services and create a new condition: Copilot fatigue.

At an event ostensibly focused on Microsoft Surface hardware, the software and cloud giant outlined plans to roll out Copilot to Windows 11 with more than 150 new features. Microsoft is also adding OpenAI's latest DALL-E model to Bing and updating Bing Chat Enterprise.

But the real experiment begins Nov. 1. Enterprises will have to start game planning for Microsoft's $30 per user per month add-on for Microsoft 365 E3, E5, Business Standard and Business Premium customers. Do you simply add all of your employees? Pick a few core functions? Wait and see? Another possibility: Enterprises will have to start budgeting for Microsoft's Copilot add-ons as well as other monetization models from core vendors including ServiceNow, Salesforce, Adobe and Google.
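The budgeting math is simple but adds up fast. Here's a back-of-the-envelope sketch using the announced $30 per user per month list price; any volume discounting is an unknown, so the figures below assume list price across the board.

```python
# Back-of-the-envelope Copilot budgeting at the announced list price.
# Assumes $30/user/month with no volume discount (discounts, if any,
# haven't been disclosed).

COPILOT_PER_USER_PER_MONTH = 30  # USD, Microsoft 365 Copilot add-on

def annual_copilot_cost(seats: int,
                        price_per_month: float = COPILOT_PER_USER_PER_MONTH) -> float:
    """Annual add-on cost for a given number of licensed seats."""
    return seats * price_per_month * 12

# Compare a "license everyone" rollout with a pilot of a few core functions.
for seats in (100, 1_000, 10_000):
    print(f"{seats:>6} seats -> ${annual_copilot_cost(seats):,.0f}/year")
# 10,000 seats comes to $3.6 million a year -- before any other
# vendor's generative AI add-ons land in the same budget.
```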

Add it up and you can easily see how generative AI add-ons are going to be like your streaming subscriptions. You ditched cable for streaming only to find that all you created was a DIY cable subscription. Enterprises ditched software licensing for the cloud only to find more per-seat subscription charges for things you may or may not use. The really scary thought: Generative AI (Copilot) may just give Clippy more brainpower and scale.

Microsoft's Copilot fiesta in Windows is one area where the new use cases have gone too far. Do I really want copilots in Paint and Notepad? Those two programs are popular because they're kind of dumb. If I wanted smart, I'd use Photoshop and Word. Sometimes you just don't want or need the overkill. I feel the same way about appliances: I prefer dumb and dependable. Your operating system should be the same way.

Multiply Microsoft's Copilot plans by similar efforts from Google (Duet AI everywhere) and other helpful models giving you insights and recommendations at every turn, and I'm already fatigued.

Next step: HR sessions talking about copilot fatigue. In a few years, we'll have therapy sessions about copilots not allowing us to think, being annoying and simply helping us too much. You can almost hear the rank-and-file workers telling HR reps that their copilots hallucinate, lie, are too demanding and won't shut up. Even worse: That copilot in FP&A almost got someone fired.

Copilot fatigue will happen. You heard it here first.  

Cisco acquires Splunk in $28 billion observability, AI and cybersecurity play

Cisco said it will acquire Splunk for $28 billion, or $157 a share, in a deal that will give the networking giant a big play in security, AI and observability.

On a conference call, CEO Chuck Robbins said the Splunk deal (statement, Cisco blog) will make Cisco one of the largest software companies. It also transforms Cisco by positioning the company in the growth markets of AI, security and observability. "The value of data only increases. That's why this deal makes sense," said Robbins. "Our combined capabilities will provide an end-to-end data platform."

Robbins added that the two companies' product lines are complementary and provide "full observability for the entire IT stack." He added that Cisco has a big opportunity to expand into AI-driven networks.

Splunk CEO Gary Steele will join Cisco's executive team and report to Robbins. “Uniting with Cisco represents the next phase of Splunk's growth journey," said Steele, adding that the two companies have compatible cultures and scale to invest and expand market reach.

Among the key details:

  • Splunk will add $4 billion in ARR to Cisco.
  • Cisco expects the deal to be cash flow positive and gross margin accretive in the first year after the close and non-GAAP accretive in the second year.
  • The deal won't impact Cisco's stock buyback or dividend program.
  • The acquisition is expected to close in the third quarter of calendar 2024.

Observability is becoming a hot space. For instance, New Relic recently went private in a deal valued at $6.5 billion.

"There's a natural synergy when you can handle threat detection and security with AI. That's what you get with Cisco and Splunk," said Constellation Research CEO Ray Wang. "Customers get better network security and Splunk gets a key home and Cisco has a better story that drives AI valuation."

Robbins said the combination of Cisco's security business and data flow from Splunk can enable the company to solve more enterprise issues. "We also think there's an opportunity to expand our global presence," said Robbins. Steele added that international expansion can also boost Splunk and expand go-to-market opportunities. Although the companies said there isn't much product overlap, Cisco does have its own observability platform in the same areas as Splunk. 

The combination of Cisco and Splunk may also help enterprises looking to improve incident response. In a recent research report, Constellation Research analyst Andy Thurai found that 57% of incident response teams have more issues than they can handle.

Here are the Constellation Research Shortlists where Splunk appears.

Constellation Research's take

Given Cisco's reach in the enterprise technology market, there are a bevy of analysts covering the company. Here's how team Constellation Research assessed the deal.

Andy Thurai, the Constellation Research analyst following AIOps and the observability markets, made the following takeaways:

  • The first thought that comes to mind is that this deal is a very natural fit. Cisco entered the observability race with its acquisitions of AppDynamics and ThousandEyes and has been trying to build a full-stack observability (FSO) platform for a while. Adding Splunk to this mix brings true full-stack observability capabilities across APM, DEM, logs and other observability areas, plus SIEM, to add to Cisco's own network monitoring.
  • Given that there is very little product overlap, Cisco gets a big customer base and potential upsell opportunity. A big TAM expansion.
  • Cisco's XDR is well established and has been around for a while. Adding Splunk SIEM to the mix, assuming the companies can integrate soon, will be a huge boost. Still, it will be hard to integrate two big platforms with considerable technical debt built up over the years.
  • Cisco is known for channel selling. It can push Splunk and the FSO platform through the channel when it is ready.
  • Some nervous customers are already reaching out and asking for opinions and strategies on what to do. The pricing strategy will be a mess for a while. Splunk has recently moved to mostly consumption-based pricing. Cisco needs to figure out how to integrate this pricing model with its model soon.
  • Though the deal rationale suggests taking advantage of AI, security, and observability, I don't see it as much in AI. I see the synergies in security and observability. Neither company is a leading player in applied AI in their solutions. Splunk is ahead of Cisco on that front, but both need to catch up.
  • Cisco took a while to integrate AppDynamics and ThousandEyes long after those acquisitions. I hope this integration goes more smoothly. Splunk was already struggling with too many acquisitions: TruSTAR, TwinWave, Phantom Cyber and Metafor on the security side, and Flowmill, Rigor, Plumbr, SignalFx, Omnition and VictorOps on the observability side. My advice would be to dump the smaller, less useful ones to concentrate on the bigger goal.
  • While the acquisition price seems high, this gives Cisco an opportunity to expand its TAM. The observability market is growing and if Cisco and Splunk can integrate their platforms soon they can win on observability. Overall, this deal is good for both companies.  

Constellation Research analyst Dion Hinchcliffe said enterprise buyers will be thinking about vendor consolidation and pricing:

"Cisco's acquisition of Splunk is a bold move with the potential to reshape a key new sector of the IT industry. While there is a good bit of skepticism about Cisco's ability to preserve Splunk's culture of innovation, its massive customer base and global reach will likely help Splunk achieve even greater growth. CIOs will be watching closely to see if Cisco can address their growing concerns about vendor consolidation, with resulting higher prices, and see if they deliver a successful outcome for both companies' customers."

Constellation Research analyst Doug Henschen said:

"Cisco tried to get into enterprise software by acquiring Composite Software a few years back, and it didn't work out. Splunk/AppDynamics, in contrast, are a closer fit for Cisco in being focused on network and log data analysis, not enterprise data integration like Composite (now part of TIBCO)."

Holger Mueller, Constellation Research analyst, added:

"Cisco has long realized its future cannot be the network alone. This is the boldest move to understand what is happening in the network, and on the network. If executed right, it will be a better and more vertically integrated Cisco than before, thus creating value for CxOs."

 

Amazon's new Alexa devices get new LLM, generative AI spin

Amazon held an event to highlight new devices and a large language model (LLM) and generative AI tutorial broke out.

In a sign that products are becoming more about algorithms and generative AI than hardware design, the star of Amazon's Devices and Services event was an LLM that makes Alexa more conversational, can absorb real-time information and can reliably make the right API calls. The Amazon LLM is also proactive and can use the company's visual ID feature to know when you're about to talk and read body language. Amazon's Echo devices are a consumer use case with an enterprise LLM behind it.

Simply put, this new LLM should negate the need to repeatedly say "Alexa." Interactions should also become less transactional. Dave Limp, outgoing chief of Amazon's devices unit, said the new proprietary LLM is built on five foundational capabilities.

  • Conversation. Amazon has drawn on nine years of data on what makes a conversation. Conversations are built on words, body language, eye contact and gestures. That's why the model is built so Alexa can recognize cues on screened devices.
  • Real-world applications. Alexa isn't just a chat box in a browser. As a result, it has to interact with APIs and make correct choices.
  • Personalization. An LLM in the home must be personalized to you and your family.
  • Personality. A more conversational Alexa will be able to have more opinions to go with jokes.
  • Trust. Performance with privacy matters.

Limp's demo with Alexa and its new LLM illustrated some key upgrades. Alexa was able to stop and start a conversation and remember previous context. Amazon said the revamped Alexa will roll out early next year.

Rohit Prasad, senior vice president and head scientist of Amazon Artificial General Intelligence, said what makes Alexa's LLM unique is that it "doesn't just tell you things but does things."

As a result, Amazon tuned the LLM for voice as well as multiple points of context. Real-time connections to APIs will make Alexa better integrated into the smart home and more natural. Key points about Alexa's new LLM upgrades:

  • Alexa's automatic speech recognition (ASR) system has been revamped with new machine learning models, algorithms and hardware and is moving to a large text-to-speech (LTTS) model that's trained on thousands of hours of audio data instead of the hundreds of hours used previously.
  • The ASR model is built on a multibillion-parameter model trained on short, goal-oriented speech as well as long-form conversations. Amazon said the large ASR model will move from CPUs to hardware-accelerated processing and use frames of input speech based on 30-millisecond snapshots of the speech signal frequency spectrum.
  • Alexa's new speech-to-speech model is LLM based and produces output directly from input speech. The move will enable Alexa to laugh and have conversational tools.
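To make the 30-millisecond-frame detail concrete, here is an illustrative sketch (not Amazon's actual pipeline, whose internals aren't public) of the standard front-end step it describes: slicing an audio signal into 30 ms frames and computing each frame's frequency spectrum. The 16 kHz sample rate is an assumption typical for speech.

```python
import numpy as np

SAMPLE_RATE = 16_000                         # samples/second (assumed)
FRAME_MS = 30
FRAME_LEN = SAMPLE_RATE * FRAME_MS // 1000   # 480 samples per 30 ms frame

def frame_spectra(signal: np.ndarray) -> np.ndarray:
    """Split a 1-D signal into non-overlapping 30 ms frames and
    return the magnitude spectrum of each frame."""
    n_frames = len(signal) // FRAME_LEN
    frames = signal[: n_frames * FRAME_LEN].reshape(n_frames, FRAME_LEN)
    window = np.hanning(FRAME_LEN)           # taper frame edges before the FFT
    return np.abs(np.fft.rfft(frames * window, axis=1))

# One second of a 440 Hz tone as a stand-in for speech input.
one_second = np.sin(2 * np.pi * 440 * np.arange(SAMPLE_RATE) / SAMPLE_RATE)
spectra = frame_spectra(one_second)
print(spectra.shape)  # (33, 241): 33 frames, 241 frequency bins each
```

A real ASR front end would typically use overlapping frames and mel-scale filterbanks on top of these spectra, but the framing idea is the same.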

While the models stole the show, Amazon launched a series of features including automatic lighting, call translation and emergency assist services to go along with hardware including the Echo Show 8, Echo Hub and new Fire TV Sticks and Fire TVs with generative AI updates. 

Next-Generation Customer Experience Data to Decisions Innovation & Product-led Growth Future of Work Tech Optimization Digital Safety, Privacy & Cybersecurity amazon AI GenerativeAI ML Machine Learning LLMs Agentic AI Analytics Automation Disruptive Technology Chief Information Officer Chief Executive Officer Chief Technology Officer Chief AI Officer Chief Data Officer Chief Analytics Officer Chief Information Security Officer Chief Product Officer

ServiceNow launches Now Assist across Now Platform, pricing bundles

ServiceNow launched Now Assist generative AI tools across its platform and offered some specifics on add-on pricing.

With its Now Platform Vancouver Release, ServiceNow launched Now Assist across its core categories. Now Assist is built on a ServiceNow domain-specific large language model (LLM).

Specifically, ServiceNow launched Now Assist for IT Service Management (ITSM), Customer Service Management (CSM), HR Service Delivery (HRSD) and Creator.

With the move, generative AI can be embedded across ServiceNow workflows. For instance, Now Assist across the ServiceNow portfolio can provide contextual summaries and incident resolution notes, curb the manual work required by agents and generate code.

The big question is how much these generative AI features will cost. Vendors have taken multiple approaches including across-the-board price increases, add-ons, new SKUs and credits for generative AI usage.

ServiceNow said feedback from 150 early customers of Now Assist indicates that there are gains in productivity in weeks. CEO Bill McDermott, speaking on ServiceNow's second quarter earnings call, said the company hopes to capture some of the value customers are receiving and was upbeat about premium SKUs.

According to ServiceNow, customers can buy add-ons to get access to Now Assist generative AI. Among the details:

  • Customers with a ServiceNow IT Service Management, Customer Service Management or HR Service Delivery Pro or Enterprise solution can buy Professional Plus or Enterprise Plus add-ons to get Now Assist.
  • Pro Plus and Enterprise Plus add-ons are available to customers with industry-specific licenses such as Telecommunications, Financial Services and Healthcare.
  • Now Assist for Creator is available with Creator Plus licenses.
  • Pro Plus, Enterprise Plus and Creator Plus add-ons are purchased by seat. Each seat allows a user to execute a certain number of "assists." More complex generative AI interactions eat up more assists. This approach sounds similar to Adobe's credit program for generative AI.
  • ServiceNow didn't detail the initial entitlement of assists but noted more capacity can be purchased.
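The mechanics ServiceNow describes amount to a per-seat credit meter. Here is a hypothetical sketch of how such metering works; the per-interaction costs and the entitlement figure are invented for illustration, since ServiceNow hasn't published either.

```python
# Hypothetical sketch of seat-based "assists" metering: each seat
# carries an entitlement, and more complex generative AI interactions
# consume more assists. All costs/entitlements below are invented.

ASSIST_COST = {
    "summarize": 1,         # simple contextual summary
    "resolution_notes": 2,  # incident resolution notes
    "generate_code": 5,     # complex interactions burn more assists
}

class Seat:
    def __init__(self, entitlement: int):
        self.remaining = entitlement

    def use(self, interaction: str) -> bool:
        """Deduct the interaction's cost; return False when the seat
        is out of capacity and more assists must be purchased."""
        cost = ASSIST_COST[interaction]
        if cost > self.remaining:
            return False
        self.remaining -= cost
        return True

seat = Seat(entitlement=10)
assert seat.use("summarize")          # 9 assists left
assert seat.use("generate_code")      # 4 assists left
assert not seat.use("generate_code")  # 5 > 4: time to buy more capacity
```

Under a model like this, enterprise cost hinges less on seat count than on the mix of simple versus complex interactions each seat runs.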

Enterprise pricing will vary based on volume discounts, the number of seats used for generative AI and complexity of interactions. The only certainty is that you'll be paying ServiceNow more for generative AI capabilities.

Dreamforce 2023: Implications for IT and AI Adopters

Under the glimmering lights of Dreamforce 2023 in San Francisco last week, the atmosphere was palpable with anticipation as the SaaS leader, now the #3 software company in the world according to estimates, made its most strategic AI announcement yet. Industry leaders, tech enthusiasts, IT practitioners, and Salesforce's core CRM audience all gathered together at Moscone Center to witness the unveiling of Salesforce's latest AI advancements, keenly aware of the seismic shift that AI integration will bring to their organizations and IT departments.

The stakes at Dreamforce were high this year, with the need to deliver on the promises of major gains that AI can provide in terms of business/IT efficiency as well as highly automated -- or at least significantly AI-augmented -- customer-centric operations. IT attendees were focused on the implications of integrating Salesforce's powerful new AI capabilities into their existing product portfolios, enterprise architectures, and operating models. While the announcements themselves have already been covered by Constellation Insights, I instead seek to understand how these developments will change the way businesses operate. More critically, how will IT departments and early generative AI adopters need to assess and eventually integrate Salesforce's most serious foray yet into artificial intelligence?

The new Salesforce Einstein 1 Platform

Photo: Salesforce's main AI announcements at Dreamforce 2023 centered around the new Einstein 1 Platform

The AI Announcements at Dreamforce 2023

  • Einstein 1 Platform: A new unified platform for AI development and deployment, built on the Salesforce Customer 360 platform. It fully supports generative AI and can use many of the most common large language models.
  • Salesforce Data Cloud: This is the company's hyperscale data platform that enables businesses to connect, harmonize, and activate a wide range of their customer data across Salesforce applications. It is also the foundation of the Customer 360 platform. Most significant was the messaging about "bring your own lake" and "bring your own LLM or model." The AI in the Einstein 1 Platform primarily operates on the information in Data Cloud.
  • Einstein Trust Layer: This is a new security architecture built natively into the Einstein 1 Platform. It is designed to help businesses adopt and use AI responsibly and securely while preserving their customer data privacy and security standards. It assures secure retrieval of data for AI use, a grounding capability to ensure factual answers, data masking, audit trails, zero data retention if needed, and something known as "toxicity detection," which is sure to generate questions about bias yet is also an important move toward ensuring AI is business-ready.
  • Copilot and Copilot Studio: A set of AI work accelerators for Salesforce's various cloud products (sales, marketing, and service), plus a matching low-code/no-code AI development tool that enables anyone to create and deploy their own AI copilots, even without coding experience.
  • Free Tier for Data Cloud and Tableau: Salesforce announced that it will be offering free Data Cloud and Tableau licenses to all customers, making it easier for businesses of all sizes to access and manage their data. This will make AI adoption easier and less expensive in the beginning for some customers.
  • Data Cloud-Driven Flow: A new feature that allows businesses to automate workflows based on data from the Data Cloud.
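The "bring your own LLM or model" messaging implies a pluggable seam between the platform and whatever model sits behind it. A minimal sketch of such a seam, with invented class and method names (this is not Salesforce's API): the calling code depends only on a narrow interface, so customers can swap in an in-house or vendor model without changing application logic.

```python
from typing import Protocol

class LLM(Protocol):
    """Narrow interface the platform codes against (hypothetical)."""
    def complete(self, prompt: str) -> str: ...

class InHouseModel:
    def complete(self, prompt: str) -> str:
        return f"[in-house] {prompt}"

class VendorModel:
    def complete(self, prompt: str) -> str:
        return f"[vendor] {prompt}"

def answer(model: LLM, question: str) -> str:
    # Application logic never depends on which concrete model is plugged in.
    return model.complete(question)

print(answer(InHouseModel(), "Summarize the account"))
```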

Einstein Copilot Studio by Salesforce

Photo: Einstein Copilot Studio enables anyone to AI-enable tasks through prompt and skill builders

Implications for IT Departments and AI Adopters

The unbelievably rapid proliferation of AI technologies is currently making waves across the IT landscape. For IT departments and AI adopters, the current era is not just about embracing AI head-on. Instead, it's also about 1) detangling the intricate web of various fast-evolving AI platforms and 2) sorting out AI features added to existing products already being used, along with potent new foundational models and LLMs, all while selecting a refined shortlist of top-tier AI solutions that will somehow work well together.

This challenge is further amplified by findings from a recent Constellation Research survey I conducted, wherein nearly a quarter of CIOs and CDOs admitted to putting a halt on ad hoc generative AI apps and features. Their concerns mainly revolve around the obscurity of data flow and the processes through which these products assimilate and share corporate knowledge. Given the long-term implications, organizations have to act swiftly to prepare for AI, with top activities including understanding AI's competitive impact, the leading areas in the business to deploy AI, defining ethical AI policies, ensuring AI is safe to use, protecting data integrity, and training their workforce to work with AI tools. Salesforce, with the positioning and feature set of its Einstein 1 Platform, is indeed staking its claim in offering a formidable top-level enterprise AI capability. However, integrating such a robust tool further requires businesses to properly strategize on making it work coherently, cost-effectively, and beneficially with their existing systems and processes.

With these concerns in mind, here are the top implications of the Einstein 1 Platform, with an eye toward the needs of IT departments and generative AI adopters:

1. Managing Multiple Enterprise LLMs: How will enterprises stack their various LLMs together? This was increasingly the question of the day at Dreamforce 2023, with domain-specific LLMs (medical, coding, etc.) as well as external plus internal enterprise LLMs. With the integration of AI tools, IT departments will need to ensure simultaneous cross-model usability and coherence between multiple foundation models. The AI's constant learning and evolving processes can lead to potential conflicts or redundancies if not managed correctly. In a high-priority focus area that demands clarity, it's still not clear a) how well the Einstein 1 Platform fully enables stacking, b) how strongly Salesforce's new specially tuned XGen-7B large language model will feature in the platform, and c) how enterprises will make all of these models co-exist effectively as organizations sort out their LLM mixes.
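One common pattern for making several domain LLMs coexist is a router that dispatches each request to the model registered for its domain, with a general-purpose fallback. This is a generic sketch, not the Einstein 1 Platform's mechanism; the registry contents are illustrative.

```python
def make_router(models: dict, default: str):
    """Return a function mapping a domain to its registered model name."""
    def route(domain: str) -> str:
        # Unknown domains fall back to the general-purpose model.
        return models.get(domain, models[default])
    return route

# Hypothetical registry of domain-specific and general models.
models = {
    "medical": "med-llm",
    "coding": "code-llm",
    "general": "general-llm",
}
route = make_router(models, default="general")

assert route("coding") == "code-llm"
assert route("legal") == "general-llm"  # no legal model registered, so fallback
```

The governance question the sketch surfaces: someone has to own that registry, decide which domains get dedicated models, and keep routing behavior coherent as the mix evolves.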

2. How Does the Safety Layer in the Einstein 1 Platform Fit into an Enterprise AI Landscape? The introduction of AI necessitates robust safety and security layers. As AI learns and evolves, the system must prevent it from making decisions that could harm the enterprise or its customers. It must protect data/IP while producing useful, accurate output. This means IT departments will need to have stringent safety protocols in place and conduct regular audits. But most importantly, IT orgs and AI adopters must determine whether Salesforce's ambitious and heavily touted safety layer can serve as the root-level safety layer or just one part of a broader trust regime.
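In the spirit of a trust layer, the basic shape is a pipeline: mask sensitive data before the prompt reaches a model, then screen the output before it reaches the user. The regex and blocklist below are toy stand-ins, and none of this reflects Salesforce's actual implementation.

```python
import re

# Toy PII pattern and toxicity blocklist (placeholders, not production-grade).
EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
BLOCKLIST = {"slur1", "slur2"}

def mask_pii(text: str) -> str:
    """Replace email addresses with a token before the model sees the prompt."""
    return EMAIL.sub("<EMAIL>", text)

def screen_output(text: str) -> str:
    """Reject model output containing blocklisted terms."""
    if any(term in text.lower() for term in BLOCKLIST):
        raise ValueError("response failed toxicity screen")
    return text

def guarded_call(model, prompt: str) -> str:
    # Inbound masking, model call, outbound screening -- in that order.
    return screen_output(model(mask_pii(prompt)))

reply = guarded_call(lambda p: f"echo: {p}", "Contact jane@example.com")
assert "jane@example.com" not in reply
assert "<EMAIL>" in reply
```

The architectural question for IT orgs is whether such a layer sits at the platform level (as Salesforce proposes) or at an enterprise-wide chokepoint that all AI traffic, Salesforce or otherwise, must pass through.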

3. Data Management and Privacy: With the Einstein 1 Platform collecting vast amounts of data for better insights, managing this data becomes paramount. It raises questions about data storage, retrieval, and most importantly, privacy. IT departments will need to ensure data residency, PII protection, GDPR compliance and other regional data protection regulations. Understanding how Data Cloud supports these needs and plays well with other data platforms will be key. For its part, Salesforce seems committed, rightly, to openness and interoperability, yet keeping strict control and oversight over corporate data with its trust layer.

4. Intellectual Property Protection and Ownership: As AI systems generate new content, such as marketing drafts or product ideas, the lines around intellectual property rights can blur. Enterprises will need to set clear terms about IP ownership. Salesforce was very clear that it has data retention and control policies, and that it isn't going to make a business from your data. But we're barely in the first inning of generative AI, and issues such as who owns the output and who bears legal liability if an AI platform produces plagiarized or otherwise unauthorized content remain unsettled. Microsoft recently offered indemnities for legal liability on AI copyright claims, and it's not clear if Salesforce will follow suit. These are compelling subjects given the growing involvement of legal and compliance teams in enterprise generative AI projects.

5. Impact on Workers With Einstein Copilot: Copilots and similar AI-driven assistants are fast arriving and becoming commonplace. The workforce will soon undergo a significant shift as most rote work is eliminated. Workers will need training to work alongside AI, and IT departments will have to ensure that these copilots function as aids rather than replacements. But work itself is going to change as AI assists a growing percentage of tasks. Getting results from AI depends on broad deployment of copilots, but this also has many downstream ramifications that organizations should get ahead of now. Einstein Copilot Studio is going to be a major force for unleashing productivity across many business functions, but it will also likely create significant worker redundancy for certain classes of tasks.

6. Strategic Integration: Enterprises will need a clear strategy for integrating the Einstein 1 Platform and its associated features into their AI portfolios and ModelOps regimes. That means aligning business objectives with AI capabilities, ensuring that the technology serves the company's broader goals. But it also means ensuring the data foundation, trust layers, and enterprise-wide AI operations are all connected and operating holistically. It's not clear yet how well the Einstein 1 Platform integrates at each level, but this analyst at least believes the commitment is there from Salesforce to close any shortfalls.

7. Operational Redundancies: The integration of AI could lead to operational redundancies, especially in areas like customer service or data management. Enterprises will need to identify these early and potentially retrain or redistribute workforce resources. In 2024, IT orgs and AI adoption teams must identify how the mix of staffing profiles will change -- sometimes significantly -- across the business and IT.

8. Ethical Considerations and Guardrails: The profound capabilities of AI systems like Einstein bring forth ethical questions. How much autonomy should these systems have? Where is the line drawn for data collection and customer privacy? Enterprises will need to address these concerns head-on, establishing clear ethical guidelines and actionable policies for AI adoption and usage. As anyone can build copilots or use generative AI in work processes, it will become increasingly easy to tap into customer data for uses that may not seem questionable at first glance but that proper oversight would show to be disallowed. Guardrails -- almost certainly automated -- will need to prevent inappropriate usage of AI at scale, and may well be one of the major issues that ultimately slows broad deployment until they work to a high degree of capability. Salesforce's trust/safety layer appears robust enough to grow this capability, but does not appear to have it in full-fledged form today.
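An automated guardrail of this kind often reduces to purpose-based access control: every copilot request declares a purpose, and a default-deny policy table decides whether that purpose may touch a given data class. The policy entries below are invented for illustration; no vendor's actual policy model is implied.

```python
# Hypothetical policy table: (data class, declared purpose) -> permitted?
POLICY = {
    ("customer_pii", "case_resolution"): True,
    ("customer_pii", "marketing_lookalike"): False,
    ("product_docs", "case_resolution"): True,
}

def allowed(data_class: str, purpose: str) -> bool:
    # Default-deny: anything not explicitly permitted is blocked.
    return POLICY.get((data_class, purpose), False)

assert allowed("customer_pii", "case_resolution")
assert not allowed("customer_pii", "marketing_lookalike")
assert not allowed("customer_pii", "unknown_purpose")  # unlisted = denied
```

Default-deny is the design choice that matters here: it forces each new AI use of customer data through an explicit governance decision rather than discovering misuse after the fact.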

The Openness of Salesforce’s AI Strategy

Photo: Salesforce's Openness to 3rd Party Data Lakes and AI Models is Significant and Important

Ultimately, while Salesforce's announcements at Dreamforce 2023 showcased the vast potential of AI in transforming business operations, they also underscored the myriad considerations for IT departments. As AI continues its inexorable rise, IT departments and AI adoption teams stand at the forefront, guiding their enterprises through uncharted territories and ensuring that the promises of today translate into the competitive differentiators and historic opportunities of tomorrow. The Dreamforce 2023 announcements keep the company squarely in the AI race, but they also raise a number of important issues that must be addressed. Not adopting AI until every issue is resolved is simply not an option for most organizations today, which badly need to build experience and maturity in generative AI right now to secure their futures.

My Related Research

How to Embark on the Transformation of Work with Artificial Intelligence

Analysis: Microsoft's AI and Copilot Announcements for the Digital Workplace

How Visual Collaboration Vendors are Adding Artificial Intelligence to their Platforms

How Generative AI Has Supercharged the Future of Work

Analysis of the White House's Guidance on Responsible AI Research, Development, and Deployment

How Leading Digital Workplace Vendors Are Enabling Hybrid Work

Every Worker is a Digital Artisan of Their Career Now

How to Think About and Prepare for Hybrid Work

Why Community Belongs at the Center of Today’s Remote Work Strategies

Reimagining the Post-Pandemic Employee Experience

It’s Time to Think About the Post-2023 Employee Experience

Research Report: Building a Next-Generation Employee Experience

Revisiting How to Cultivate Connected Organizations in an Age of Coronavirus

How Work Will Evolve in a Digital Post-Pandemic Society

Creating the Modern Digital Workplace and Employee Experience

The Challenging State of Employee Experience and Digital Workplace Today

The Most Vital Hybrid Work Management Skill: Network Leadership

Clorox hit with cyberattack, sees financial hit, product shortages

Clorox disclosed it will see a material hit to its first quarter results following a cybersecurity attack that hampered production and led to product shortages.

The company is just the latest in a series of companies likely to see a financial hit due to cyberattacks. Caesars and MGM both had incidents that took systems offline and brought websites down.

Clorox in a regulatory filing said it had to switch to "manual ordering and processing procedures" after identifying unauthorized activity on its IT systems. Clorox added that it is operating "at a lower rate of order processing and has recently begun to experience an elevated level of consumer product availability issues."

Indeed, Clorox's website that sells direct to consumers notes that the site is "undergoing system maintenance." Most products on the site are out of stock.

According to Clorox, the cybersecurity incident disrupted systems across the company's operations. In response to the attack, Clorox took its systems offline and expects to switch back to normal automated order processing the week of Sept. 25.

The company said:

"Clorox has already resumed production at the vast majority of its manufacturing sites and expects the ramp up to full production to occur over time. At this time, the Company cannot estimate how long it will take to resume fully normalized operations.

Clorox is still evaluating the extent of the financial and business impact. Due to the order processing delays and elevated level of product outages, the Company now believes the impact will be material on Q1 financial results. It is premature for the Company to determine longer-term impact, including fiscal year outlook, given the ongoing recovery."

Clorox will offer more details on the financial hit after it has more visibility.


Oracle's plan for Cerner: Cloud shift, generative code rewrite

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly.

Oracle has big plans for its Cerner acquisition, but it has integration work ahead including moving a customer base to the cloud and rewriting applications with a little help from generative AI.

As a refresher, Oracle closed its $28 billion acquisition of Cerner in June 2022. The plan for Oracle is clear: Use its cloud and data knowhow to transform healthcare.

But lost amid talk of Oracle Cloud Infrastructure, growth rates, partnerships with Microsoft and Oracle Cloud World was how much attention Cerner integration received from CTO Larry Ellison and CEO Safra Catz. Consider:

  • On the first quarter earnings call, Cerner was mentioned 30 times, second only to cloud at 71 mentions.
  • Catz gave revenue growth guidance including Cerner and excluding it. Catz cited headwinds on moving Cerner customers from a license model to a cloud one. And she noted that Cerner profitability has to get to "Oracle standards." Catz said: "We are in an accelerated transition of Cerner to the cloud. This transition is resulting in some near-term headwinds to the Cerner growth rate as customers move from license purchases, which are recognized upfront, to cloud subscriptions, which are recognized ratably. Again, excluding Cerner, I remain committed to accelerating our total revenue growth rate this fiscal year as well as maintaining our current high cloud growth rate for the year."
  • Ellison said that Cerner, aka Oracle Health, has been "awarded two large new contracts with a total value of over $1 billion" to be recognized in the current quarter.

Rest assured that Cerner and healthcare will get a breakout when Oracle hosts its analyst meeting at Oracle Cloud World. At its health conference tied to Cloud World in Las Vegas, Oracle made a series of announcements that highlight the company's healthcare push. Oracle outlined generative AI capabilities across its health product suite; healthcare workplace optimization, financial planning and supply chain applications; and customer wins such as Tenet Healthcare, Providence and Loblaw.

Ellison outlined the Cerner integration.

First, Oracle is moving Cerner's Millennium electronic health record (EHR) platform to the cloud and rewriting the software in pieces. At Oracle Cloud World, Oracle also outlined its next-generation Millennium EHR with generative AI tools and public APIs running on Oracle Cloud.

Ellison said:

"There's a two-phase process with Cerner. The first thing is to get the lift and shift and get the existing system hardened, which we've done and moving the customers to the cloud, which we are in the process of moving everybody to the cloud. That will give them better performance, better security and new features will then start showing up with the system."

And then Oracle is replacing Cerner features with new ones. Ellison said this process will replace the old Cerner system with a new one. He added that Oracle isn't rewriting the code in Java though. It is using generative AI. Ellison said:

"We have an application generator called APEX. And we are not writing code for the new Cerner. We are generating that code in APEX, and it's going extremely well. Again, one of the great things about code generators is they don't make mistakes. Well, either they make the same mistake over and over again or once you fix the mistake, you fix it everywhere. So, the code gen -- we are using a code generator, and to write the new features in Cerner and it's coming along very, very nicely."

Finally, Oracle will transition the old Cerner business to a new model. Catz covered that point, but Ellison reiterated that the transition to a cloud model is a headwind as revenue is recognized over time. Catz also said Oracle "still has a way to go" on Cerner expenses, but changes will become more obvious in future quarters.


"We are always looking to save as much as we can, and to spend as little while still really transforming Cerner into a modern system in its entirety," said Catz.

Ellison added that Oracle loves to save money. "One of the things we did with our data centers is we automated them. We saved labor costs, and we have better security and better reliability because we eliminated human error," he said. "With the rewrite of Cerner, it's not armies of programmers that are going be rewriting this. We are generating the new Millennium software using APEX. And that's also going to save us a lot of human labor and generate higher quality code and higher quality user interfaces and better security all at once."
