
Atlassian Rovo AI additions go GA with consumption pricing on deck

Atlassian said Rovo, a generative AI assistant that operates across the company's platform, and its latest AI features are now generally available across its products. Atlassian also introduced Rovo Agents.

The company said it will offer Rovo at $20 per user per month on annual subscriptions, $24 per user per month on monthly plans, and consumption-based pricing in mid-2025. Licensing is based on Rovo use per site, where any user with access to a site is a billable user, and enterprises pay only once per billable user.
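For illustration, here is a back-of-the-envelope comparison of the two seat-based plans using the list prices above; it is a sketch only, and the billable-user count is hypothetical, while per-site billing nuances and the forthcoming consumption tier are left out of scope.

```python
# Back-of-the-envelope cost comparison of Rovo's two seat-based plans, using
# the list prices cited above. The 500-user count is hypothetical; per-site
# billing rules and the consumption tier due in mid-2025 are not modeled.

ANNUAL_PLAN = 20   # $/user/month when billed annually
MONTHLY_PLAN = 24  # $/user/month when billed monthly

def yearly_cost(billable_users: int, rate_per_user_per_month: float) -> float:
    """Total Rovo cost for one year for a given number of billable users."""
    return billable_users * rate_per_user_per_month * 12

users = 500  # hypothetical site with 500 billable users
annual = yearly_cost(users, ANNUAL_PLAN)    # $120,000 per year
monthly = yearly_cost(users, MONTHLY_PLAN)  # $144,000 per year
premium = (monthly - annual) / annual       # 0.20, i.e. a 20% premium
print(f"Annual plan: ${annual:,.0f}/yr; monthly plan: ${monthly:,.0f}/yr ({premium:.0%} premium)")
```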

Enterprise software vendors have been tweaking monetization models as some vendors focus on consumption or even conversations with an AI agent.

In May, Atlassian launched Rovo with the following core components:

  • Rovo Search, which will comb through content wherever it is stored (Google Drive, Microsoft SharePoint, GitHub, Slack, etc.) and query across applications. Rovo Search will identify the teammates, projects and information needed to make decisions, connect niche and custom apps via API, and come with enterprise-grade data governance.
  • Insights, which are delivered via knowledge cards that offer context about projects, goals and teammates.
  • Rovo Chat, a conversational bot that is built on company data and learns as it goes.

In a briefing, Jamil Valliani, Head of Product AI at Atlassian, cited early customers that have boosted development team efficiency by about 25% using Rovo. Rovo beta testers said they've saved one to two hours per week.

Atlassian is hooking Rovo Search into third-party sources via connectors and feeding that data into Rovo Chat and throughout the platform.

The company also outlined Rovo Agents, including out-of-the-box agents from Atlassian's marketplace partners. Atlassian is providing more than 20 out-of-the-box agents along with low-code and no-code tools to build custom Rovo Agents.

According to Atlassian, Rovo Agents can speed up the development process by automatically generating code plans, code recommendations and pull requests based on task descriptions, requirements and context.

Other updates for Atlassian Intelligence include:

  • Jira Service Management will use AI to group related alerts, surface critical incidents, and suggest the right resources and subject matter experts (a minimal grouping sketch follows this list). The AIOps capabilities also capture incident timelines, generate post-incident reviews and summarize details.
  • Jira Service Management virtual service agent will automate support across multiple platforms and add new onboarding and automation enhancements.
  • Loom will get AI-powered automated workflows via integrations with Jira and Confluence.
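Atlassian has not detailed how its alert grouping works; as a rough, vendor-neutral illustration of the general idea, the sketch below clusters alerts that hit the same service within a short time window. The Alert fields and the 10-minute window are assumptions, not Jira Service Management's actual model or API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Alert:
    service: str        # hypothetical fields, for illustration only
    message: str
    timestamp: datetime

def group_alerts(alerts: list[Alert],
                 window: timedelta = timedelta(minutes=10)) -> list[list[Alert]]:
    """Group alerts that hit the same service within a short time window."""
    groups: list[list[Alert]] = []
    for alert in sorted(alerts, key=lambda a: a.timestamp):
        for group in groups:
            last = group[-1]
            if last.service == alert.service and alert.timestamp - last.timestamp <= window:
                group.append(alert)
                break
        else:
            groups.append([alert])
    return groups
```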

Atlassian's AI additions will be critical to the company's future growth. In August, Atlassian projected first quarter revenue of $1.149 billion to $1.157 billion, below the consensus estimate of $1.16 billion. For fiscal 2025, Atlassian projected revenue growth of about 16%, below the 18% expected by Wall Street.

The company at the time cited uncertain macroeconomic conditions and an evolving go-to-market strategy.

Speaking at an investment conference, Atlassian Chief Operating Officer Anu Bharadwaj said early adoption of Atlassian Intelligence and Rovo has been strong.

"Thousands of customers have adopted Atlassian intelligence already so far, and I'm very pleased with the repeated usage that it gets because one of the interesting things about AI is where are the use cases where you can unlock tangible productivity benefits. I think it is still early innings, so I’m very much looking forward to seeing how that plays out."

Regarding pricing, Bharadwaj said Atlassian has raised prices for its cloud products over time as it has added AI, automation and new features. "The price increases are very much in tune with the amount of customer value that we are able to deliver," she said. "In terms of seat-based versus not, I do think that there is an interesting exploration there around consumption-based pricing, which we will really think through, especially in an AI world, where we talk about virtual agents, which will be different than a seat-based model."


How GE Healthcare is approaching generative AI, LLMs, and transformation

GE Healthcare has been working on machine learning, deep learning and artificial intelligence for years, but now the company sees an inflection point where generative AI can transform healthcare, from products to workflows to efficiencies that improve the customer experience.

Parminder Bhatia, Chief AI Officer of GE Healthcare, said the emergence of multimodal large language models (LLMs) can uniquely improve healthcare, which runs on data from many modalities: imaging, clinical notes, voice interactions, electronic health records and more.

Before GE Healthcare, Bhatia oversaw generative AI and large language models at Amazon Web Services. His group worked on Amazon Q and Amazon Bedrock. GE Healthcare and AWS recently announced a partnership to transform healthcare with a focus on purpose-built generative AI models using services such as Bedrock.

We caught up with Bhatia, an AI 150 inductee, at Constellation Research's AI Forum in New York to talk shop. Here's a look at the takeaways.

GE Healthcare's approach to AI. Bhatia has been in his current role for about 18 months, overseeing the strategy and vision for AI at GE Healthcare. The AI strategy revolves around AI embedded in MRI, CT and X-ray machines as well as digital platforms focused on clinical and operational efficiencies across a hospital.

"There's a lot of focus on how we build these technologies that can really streamline workflow," said Bhatia. For instance, AI in an MRI machine that can reduce scan time by 50% with the same quality doubles the efficiency and productivity of the workforce.

Other examples of AI's role at GE Healthcare include AI in ultrasound equipment that can act as a copilot, remote scans and imaging and technologies that "improve the efficiencies and accelerate getting better diagnosis, solving problems in treatment and cancer areas as well," said Bhatia.

GE Healthcare has been a pioneer in machine learning and deep learning for more than a decade and has had the highest number of FDA authorizations for AI-enabled devices three years in a row.

Why generative AI and healthcare go together. Bhatia said LLMs have dominated the conversation, but the real excitement is that they are multimodal, and that multimodality makes them a natural fit for healthcare.

He said:

"These technologies are truly multimodal in nature and that means they're more tailored for healthcare, which consists of data coming from different modalities, imaging data, clinical notes, voice interaction, your EHRs and other data. As these technologies were being built out it made sense for me to get back into healthcare. It's the perfect opportunity to apply these applications."

Patient experience and AI. Bhatia said AI will ultimately have an impact on the patient experience as workflows and staffing levels are improved from diagnosis to screening to treatment and therapy. GE Healthcare's Command Center is using AI to streamline hospital operations, manage staffing and send triggers for actions. While many of those technologies don't affect the patient directly, the patient experience is improved with capacity planning.

"These technologies streamline operations and that becomes relevant across a spectrum of things," said Bhatia. "Patient guidance will also be key as we take care from inside the hospital to outside with patient monitoring and virtual care at home."

These hospital workflows will give a longitudinal patient view across care that improves experiences, he said.

Indeed, GE Healthcare recently acquired MIM Software, a company that manages workflows from diagnosis to treatment and therapy. MIM figures into several of the developments below.

Personalization of care. Bhatia said AI will also play a big role in personalized cancer treatments that deliver targeted radiation to kill cancer cells.

"In the next three to five years, you're going to have thousands of variations in which these different radiopharmaceutical drugs can be given to the individual patients," he said. "MIM Software is designed to address the complexities that happen across the system, where it provides solutions to navigate the expanding landscape of personalized treatment."

Bhatia added:

"A lot of these things are starting with operational efficiency, but also combining multimodal data. I think that's where AI is becoming a key enabler, not just at the diagnosis level, but health clinicians can streamline the longitudinal view of the patient's data, which is truly multimodal. That technology and data can really streamline the operations, which has impact on better therapy and more personalized therapy for patients as well."

GE Healthcare's hybrid AI strategy. Bhatia said the company is taking a hybrid approach to AI and investing in talent focused on cloud and AI. "We are bringing a lot of that muscle for cloud and AI across the spectrum," he said. "That becomes the key component as we're looking into a lot of problems and challenges as well."

The hybrid strategy will mean "a lot of things happen on prem and a lot of things will happen in the cloud to accelerate and transform," said Bhatia. With AWS, GE Healthcare will look to build its own foundational models as well as use multiple LLMs for everything from workflows to equipment to treatments and imaging. Bhatia said:

"The partnership we announced with AWS is about strategy and foundational model building for building our own proprietary genAI, streamlining workflows and developing use cases. The partnership is really 1+1 is greater than 2 because you get a lot of benefits from security and scale with AWS and GE Healthcare being in more than 160 countries."

This approach to hybrid AI will also mean multiple partnerships for clinical research. Ultimately, GE Healthcare wants to be able to predict whether a patient is going to skip or arrive late to appointments, adapt workflows accordingly, and build in flexibility, said Bhatia.

Model choice. Bhatia said flexibility with foundational models is critical. "One model is not going to solve all problems and you'll have to look to the clinical side and the operational side of things," he said. "The first place AI can have an impact is to alleviate cognitive and data overload to highlight what's relevant."

Bhatia added that models will need to be adapted for specific use cases, and open-source models have the same potential to be tailored to a specific use case.

AI as a horizontal and vertical tool. Bhatia said it's important for AI leaders to think about generative AI as a horizontal enabler and a technology that can be used to drill down in specific areas. He said:

"You can build these AI algorithms for breast cancer, but they can easily be adapted to prostate cancer or lung cancer. And I think that's where these technologies are becoming really game changer. How do you adapt them, not just looking into the vertical side of things, going from diagnosis to treatment therapy and the entire patient journey, but also how they can be adapted across the spectrum?"


HOT TAKE: SupportLogic New Features Help Leverage Support as Revenue Driver

SupportLogic has been in business since 2016 and has primarily been seen as a tool that helps support leaders deliver an enhanced support experience (or “SX,” as the company brands it). This has mostly been achieved by using SupportLogic's ML and sentiment analysis to extract “signals” from emails and other text-based data inside customer cases to prevent escalations and provide better agent quality control.

But the company has long understood that the signals it extracts are far more valuable than the core use cases of escalation avoidance and more intelligent case routing. CEO Krishna Raj Raja has always called the support center a “revenue center” rather than a cost center - positing that support organizations are the true front line when it comes to the actual voice of the customer. 

In B2B relationships (especially in high tech, where SupportLogic has focused), this rings true. Think about it: CRM data from sales interactions holds some important deal data but typically ends when the deal is closed. Marketing data captures interest and only a little insight into products purchased, how they are used, and the customer experience. But case data includes a treasure trove of insights around the products actually deployed, how they are used, user satisfaction and dissatisfaction - as well as signals around what a customer might be missing: features, configurations, additional products and so on that can lead to a more successful (and profitable) relationship if acted upon at the right time and in the right manner.

Enter “Expand” - a new feature set from SupportLogic that builds on its signal extraction capabilities and account health scores to arm customer success and other revenue teams with previously hidden cross-sell and upsell opportunities. As the company puts it, SupportLogic's Expand module brings real-time account health visibility to account management teams, helping them identify upsell and cross-sell opportunities, monitor customer satisfaction, and act on early warning signs that may signal potential issues or churn risks. This creates an interesting workflow where true customer signals can flow to customer success, account execs and other revenue roles to better act on opportunities - signals that typically get lost in unstructured text or are not captured properly at all.

The new features of Expand are designed to provide a comprehensive view of account health, combining insights from multiple data sources, allowing teams to take proactive measures for growth. It includes the following core features:

  • Account Health Score: A unified score that reflects the overall health of the customer relationship, combining sentiment data, support history, and product usage signals (a minimal scoring sketch follows this list).
  • Account Commercial Signals: New commercial signals that significantly enhance customer retention, drive revenue growth, and foster long-term customer loyalty.
    • Signals include: Churn risk, renewal likelihood, competitive consideration, expansion opportunity, price sensitivity, license upgrades and downgrades.
  • Account Summarization: Generative AI-based automated summaries that capture the status of key accounts, making it easy for account managers to stay informed.
  • Account Alerts: Real-time alerts for changes in account health, including early warnings on potential churn or upsell opportunities.
  • Account CRM Widget: Seamless integration into popular CRM platforms, enabling account managers to view account health directly from their CRM dashboards.
  • Integration with Gainsight CS: Built-in compatibility with leading Customer Success platforms like Gainsight CS, offering streamlined workflows for customer success and account management teams.
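SupportLogic has not published the formula behind its Account Health Score, so the sketch below is only an illustration of how such a unified score could blend the signal types listed above. The inputs, weights and 0-100 scale are assumptions, not the vendor's actual method.

```python
def account_health_score(
    avg_sentiment: float,    # -1.0 (negative) .. 1.0 (positive), from case text
    escalation_rate: float,  # escalations per 100 cases
    usage_trend: float,      # -1.0 (declining) .. 1.0 (growing) product usage
    weights: tuple[float, float, float] = (0.4, 0.3, 0.3),  # assumed weights
) -> float:
    """Blend three normalized signals into a 0-100 health score (illustrative only)."""
    sentiment_component = (avg_sentiment + 1) / 2              # map to 0..1
    escalation_component = max(0.0, 1 - escalation_rate / 20)  # 0..1, capped at 20 per 100 cases
    usage_component = (usage_trend + 1) / 2                    # map to 0..1
    w_s, w_e, w_u = weights
    return round(100 * (w_s * sentiment_component +
                        w_e * escalation_component +
                        w_u * usage_component), 1)

# Mildly positive sentiment, few escalations, flat usage -> prints 66.5
print(account_health_score(avg_sentiment=0.3, escalation_rate=3.0, usage_trend=0.0))
```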

The new Expand feature set is indicative of an overall shift toward tech that better supports a full-journey approach to growth optimization. According to SupportLogic, organizations that understand, and act on, the types of signals that can be gleaned from support interactions can increase customer lifetime value and reduce churn in a more profitable and seamless manner.

In the age of agentic AI and a lot of “easy button” promises from leading CX vendors, buyers should look more closely at how use cases such as account health and expansion revenue analysis are truly supported by generic genAI and agent tools, as well as how they are truly addressed by the copilots and agents found in leading CRM platforms. Tools like SupportLogic are not “easy buttons,” but for businesses with longer, complex support cycles that include a lot of back and forth from which product and sentiment signals can be extracted, they are worth considering, as many broader-based AI tools are not as finely tuned for such specific B2B use cases.


Nvidia launches NIM Agent Blueprint for cybersecurity

Nvidia launched a NIM Agent Blueprint for cybersecurity as it continues to expand use cases for its microservices and AI agent platform.

At its AI Summit in Washington, DC, Nvidia outlined its NIM Agent Blueprint for container security. The cybersecurity blueprint combines Nvidia's Morpheus cybersecurity AI framework, Nvidia cuVS and RAPIDS data analytics to accelerate vulnerability (CVE) analysis at scale.

The cybersecurity blueprint is included in Nvidia AI Enterprise, the GPU giant's flagship software platform for AI applications.

Nvidia has had a steady stream of NIM Agent Blueprint news as it aims to make agentic AI more commonplace in enterprises.

According to Nvidia, its NIM Agent Blueprint for container security enables enterprises to use generative AI to digest information and then explain vulnerabilities using natural language. Companies can then create agents for cybersecurity workflows.
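Nvidia has not published the blueprint's internals here, so the following is a vendor-neutral sketch of the workflow just described: filter the CVEs relevant to a container image, then hand each one to a generative model for a plain-language explanation. The data structures are hypothetical and explain_with_llm() is a stub; none of this uses Morpheus, cuVS or RAPIDS APIs.

```python
from dataclasses import dataclass

@dataclass
class CVE:
    cve_id: str
    package: str
    severity: str  # e.g. "CRITICAL", "HIGH"
    summary: str

def relevant_cves(cves: list[CVE], installed_packages: set[str]) -> list[CVE]:
    """Keep only CVEs whose affected package is present in the container image."""
    return [c for c in cves if c.package in installed_packages]

def explain_with_llm(cve: CVE) -> str:
    """Stub: a real pipeline would call a generative model here."""
    return (f"{cve.cve_id} ({cve.severity}) affects {cve.package}: {cve.summary} "
            f"Recommended action: upgrade or patch the package.")

def triage(cves: list[CVE], installed_packages: set[str]) -> list[str]:
    """Explain the relevant CVEs, critical ones first."""
    ordered = sorted(relevant_cves(cves, installed_packages),
                     key=lambda c: c.severity != "CRITICAL")
    return [explain_with_llm(c) for c in ordered]
```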

Nvidia added that Deloitte is among the first to use Nvidia NIM Agent Blueprint for container security in its cybersecurity applications.

Here's a look at the architecture.

Among other notable items from Nvidia at its AI Summit:

More on agentic AI:


SAP gives ABAP code a genAI boost, adds data lake capabilities to Datasphere

SAP at its TechEd conference delivered its share of AI agent headlines as its Joule generative AI assistant becomes one assistant across its platform, but as a more practical matter, expanded developer options for ABAP and new data lake capabilities will have a much larger impact.

The goal for SAP is to move custom SAP ECC code to clean S/4HANA code as soon as possible, and the company outlined a series of moves to make that happen faster with a big assist from generative AI.

At TechEd, SAP said it will enable ABAP developers to generate high-quality code with its Joule generative AI copilot that complies with SAP's ABAP Cloud development model. According to SAP, "Joule will also be able to generate explanations for legacy code, making it easier to modernize legacy codebases and migrate to a clean core." ABAP is a programming language created and used by SAP for developing application programs that run in the SAP ABAP runtime environment.

Constellation Research analyst Holger Mueller did a deep dive on the implications for ABAP developers, who will get extended custom fields, business logic and processes. Mueller noted that the ABAP additions to SAP Build will give SAP the ability to update legacy code at scale. ABAP has 2 million active developers.

By the end of 2024, SAP Build will include access from ABAP development tools and environments for SAP S/4HANA Cloud. The integration will enable developers to create and monitor ABAP Cloud projects in SAP Build.

The other big move by SAP revolves around new embedded data lake features for SAP Datasphere. By the end of the fourth quarter, Datasphere will have a data lake option to complement existing storage, and businesses will be able to analyze data across hybrid environments while preserving context and logic.

SAP said the data lake capabilities include:

  • An integrated object store for more efficient data transformation and processing.
  • Spark compute based on existing Datasphere data integration.
  • The ability to access data on integrated object stores without physically copying the data (see the generic Spark sketch after this list).
  • Ultimately, users will be able to import and integrate data at scale from SAP sources and non-SAP sources.
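SAP has not detailed the developer-facing interfaces here, so the snippet below is only a generic PySpark illustration of the pattern the list describes: querying files that sit in an object store in place, without copying them. The bucket path and column names are placeholders, not SAP Datasphere's actual API.

```python
from pyspark.sql import SparkSession

# Generic illustration: query Parquet data in object storage in place,
# without materializing a copy. The path and columns are placeholders.
spark = SparkSession.builder.appName("object-store-in-place-query").getOrCreate()

orders = spark.read.parquet("s3a://example-bucket/sales/orders/")  # illustrative path
orders.createOrReplaceTempView("orders")

summary = spark.sql("""
    SELECT region, SUM(amount) AS total_amount
    FROM orders
    GROUP BY region
""")
summary.show()
```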

Mueller said:

"SAP shows broad investment across its technology platform, which is the key focus for all enterprise software vendors, readying their SaaS suites for the era of Infinite Computing. SAP with its new data lake capabilities adds (for the first time) object storage abilities. With this move, SAP for the first time gives customers the chance to holistically build next generation of applications powered by AI on its platform. Equally is important that SAP finally shows some love to its 2M+ active ABAP developers with support of ABAP in SAP Build, as well as making Joule available in ABAP. Critical for the ecosystem is that SAP now finally allows partners to build custom ABAP Code in SAP S/4 HANA Cloud Public Edition – a key move to help SAP customers to upgrade to S/4HANA. The impact of these three makes the solid progress on AI pale, where SAP shows the right execution – widening and deepening Joule capabilities. What stands out for Joule is the design point to have a single AI assistant across SAP."

Here's a look at everything announced at TechEd:

  • SAP Build will get the ability to give Joule custom skills as well as use SAP HANA Cloud to ground large language models using its vector engine (a generic grounding sketch follows this list).
  • Joule will get multiple AI agents that will combine business function expertise with the ability to carry out complex workflows. Joule will bring together specialized AI agents in areas like supply chain, procurement and finance.
  • SAP will add two out-of-the-box autonomous AI agent use cases: dispute management for incorrect or missing invoices, duplicate payments and the like, and a financial accounting use case to streamline financial processes.
  • Joule will support 80% of SAP's most used business tasks by the end of the year. Joule will also be available in SAP Service Cloud and SAP Concur as well as SAP S/4HANA Cloud Public Edition. Simply put, Joule will be integrated into all of SAP's clouds.
  • SAP Knowledge Graph launched as a business context tool that is preloaded with ABAP tables, CDS views, APIs, and key data models so enterprises can ground AI models.
  • The company said it added Anthropic Claude 3.5 Sonnet via Amazon Bedrock to its generative AI hub along with the addition of IBM Granite foundation models, Meta Llama 3.1 and Mistral Large 2 and Codestral.
  • SAP Generative AI Hub gets improvements to customize pre-trained AI models, a new software development kit, and new regions from the big three hyperscale cloud providers.
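The grounding mentioned in the first bullet above is, generically, retrieval-augmented generation: embed documents, retrieve the ones closest to a question, and prepend them to the prompt. The sketch below shows that pattern with a stub embedding function and plain NumPy similarity; it does not use SAP HANA Cloud's vector engine or Joule's APIs.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stub embedding: hash characters into a small fixed-size unit vector."""
    vec = np.zeros(64)
    for i, ch in enumerate(text.lower()):
        vec[(i + ord(ch)) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def retrieve(question: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the question (cosine similarity)."""
    q = embed(question)
    return sorted(documents, key=lambda d: float(np.dot(q, embed(d))), reverse=True)[:k]

def grounded_prompt(question: str, documents: list[str]) -> str:
    """Build a prompt that grounds the model in the retrieved context."""
    context = "\n".join(retrieve(question, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```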

Intuit's Enterprise Suite could upend midmarket ERP

One of Intuit's big bets for the years ahead is to disrupt the midmarket ERP market with Intuit Enterprise Suite as it aims to fill the gap between QuickBooks Online Advanced and the costly ERP implementations required when companies grow.

The company launched Intuit Enterprise Suite last month, combining its unified platform powered by Intuit Assist, a common data store, services from its various offerings and generative AI to take aim at an $89 billion total addressable market for larger, midmarket businesses.

Intuit launched Enterprise Suite in the US for multi-entity, service and project-based businesses. These early adopters are providing a feedback loop for Intuit to continue to iterate on its platform.

Sasan Goodarzi, CEO of Intuit, laid out the importance of Intuit Enterprise Suite during the company's recent Investor Day. "We have no intention to serve enterprise businesses, but every intention to serve large mid-market businesses," said Goodarzi, who noted there's a massive gap in the ERP market for companies that are growing and complex but don't have the time for an ERP implementation. "You go out and talk to large customers and their words are 'big ERP is an organ transplant, it's too expensive and it's not about the yearly expense as much as it is the couple of years it takes to shift to an ERP solution.'"

Goodarzi explained that Enterprise Suite has been in the works for a long time, but AI has enabled it to serve customers far north of $3 million in revenue that have multiple locations, the need for segment reporting and various requirements. In other words, Intuit has a continuum that serves prosumers with QuickBooks, small businesses with QuickBooks Online Advanced and midmarket firms with Enterprise Suite.

Among early adopters of Intuit Enterprise Suite, annual revenue per customer (ARPC) is about $20,000. That sum is a win for Intuit and businesses since Enterprise Suite can consolidate an average of 10 business apps used by midmarket companies. QuickBooks Online Advanced ARPC for fiscal 2024 was $3,299.

For fiscal 2025, Intuit is projecting revenue growth of 12% to 13% to $18.16 billion to $18.35 billion. Its Global Business Solutions Group, which includes QuickBooks, Mailchimp and Enterprise Suite as well as workforce management tools, will deliver fiscal 2025 revenue north of $11 billion, up 16% to 17% from a year ago.

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

Laurent Sellier, SVP of Product for Midmarket in the Global Business Solutions Group, said: "It's common for customers to allocate tens of thousands of dollars annually for business management software purchases, and tens of thousands or even hundreds of thousands more for external support to get new systems set up. We are encouraged by the customer feedback from early adopters; they're telling us they're getting a lot of value by being on one platform, having one source of truth and easy onboarding due to the familiar navigation."

Intuit estimated that there are 1.9 million mid-market businesses in its priority markets. There are 283,000 QuickBooks Online Advanced customers today, up 28% from a year ago, and those customers now have an upgrade path to Intuit Enterprise Suite. In a demonstration, Intuit used a construction company with 25 employees and $25 million in revenue as an Enterprise Suite customer.

Enterprise Suite includes:

  • Financial and accounting tools to prepare financial statements and manage intercompany transactions. Intuit also uses AI to automate planning tasks such as cash flow management, budgeting and profit and loss forecasts with dashboards.
  • KPI tracking and insights by project and industry. These KPIs also sync with employee payroll and time data as well as historical financials.
  • Mailchimp marketing integration to find, retain and manage customers.
  • HR features for onboarding processes, healthcare, retirement and workers comp benefits. Payroll tax calculations, deductions and filings are automated with AI to catch errors.
  • Accounts payable and receivable automation and reconciliation.
  • Access to experts and services for customer success and customization. Intuit Enterprise Suite will have versions for construction, non-profit, service and project-based businesses.

Impact on competition

Intuit Enterprise Suite may disrupt midmarket ERP by providing a familiar upgrade path for businesses that would otherwise move to Oracle NetSuite, Sage, Microsoft Dynamics and other players on the bet they'd grow into the functionality. ADP is also expanding in human capital management and could effectively follow the same playbook as Intuit.

For Intuit, Enterprise Suite can also give it a way to leverage its unified platform, data quality and generative AI tools. That platform also gives Intuit the ability to move upstream.

“Innovation for ERP – no matter for large or medium or small enterprises – needs to come from a modern platform, an enterprise application platform (EAP) that supports the generic use case of extension, integration and building proprietary automation, enabled by low-code / no-code,” said Constellation Research analyst Holger Mueller. Equally, it requires building on a common data foundation that supports both analytics and AI. “Getting these two offerings right is critical and Intuit is making good progress on both ends.”

Will Intuit ever threaten SAP, Workday, Oracle and Infor? No. But will it prolong the sales cycle to move up to those ERP packages? You bet.

Recent enterprise application research:


Why Every Organization Needs To Rethink Its Growth Strategy in the Age of AI | Big Idea Report

Constellation analyst Martin Schneider unpacks his latest report, discussing the need for organizations to adapt their growth strategies in the age of AI and emphasizing the importance of the office of the chief growth officer and the role of AI as an enabling technology. He also highlights the need for a modern growth strategy, taking into account the shift from the subscription economy to the retention economy and the post-pandemic economy.

Access the Full Report Here

View the full transcript here: (Please note this has not been edited and may contain errors.)

Hi, this is Martin Schneider, Vice President and Principal Analyst at Constellation Research. I'm excited to talk about a new Big Idea report I've just published called Why Every Organization Needs to Rethink Its Growth Strategy in the Age of AI. And that "why" is a really big why. We've seen a lot of changes in the market and in the economy over the past 5, 10, 15 years; we really saw this wholesale shift to a subscription, slash retention, slash everything-as-a-service economy. But what we've not seen is our go-to-market strategies really supporting this fully.

We've operationalized around product and service delivery with these models. We've made revenue recognition changes, things like that. But for a lot of organizations across almost every industry, most people are still thinking in the old way of, you know, new business drives everything, lead-to-opportunity conversions are the most important metrics. So people are really just not thinking about the right metrics and the right approach. And that needs to change right now. What's really been interesting is leading organizations are creating either an office or a role called the chief growth officer. So the C-suite is being augmented to include a more strategic leader when it comes to planning for growth, because you need to plan for growth across the entire customer life cycle. It's not just about, you know, elevating a chief revenue officer, right? We're really talking about a strategic member of the C-suite here, so that chief growth officer is becoming more and more prevalent, and more and more leading companies have that role installed.

What's really kind of interesting in supporting that is the ascendancy of what we would call RevOps to more of a GrowthOps function, where you're seeing, you know, again, that strategic middle office not just being kind of part of the bean counting in the deal desk, but really turning more into a strategic office where we're thinking about approaches to growth, thinking about optimizing workflows and key business processes, and taking a leading role in how we utilize technology to optimize our growth strategies. You know, because they really have that interesting vantage point - they're not as mired in the weeds like sales leaders are.

They're not totally just thinking about campaigns and content the way a marketing leader might be. And they're not stuck with, you know, fire drills and keeping customers happy the way chief customer officers are. They can really attack the issue strategically and have that really great vantage point. So that's really interesting. And of course, with the age of AI in the title, AI is a catalyst, but it's not driving these changes; these changes and these, you know, disruptions have been here, and they're affecting us. The great thing about AI is it allows even the smallest organization to take on these challenges head on, a little faster, a little better, a little cheaper, right? So it really is a catalyst and an accelerator for kind of rethinking your growth strategy, but it's not the reason we're making these changes, right? So it's important to have that perspective. Also in the report, I talk about four elements of a modern growth strategy, and, you know, some foundational elements that you really need to be thinking about as you rethink your approach to growth. And then finally, 10 questions that you can ask yourself or the organization or your C-suite as you rethink your growth strategy, or even just level set and say, you know, where are we in our growth strategies?

So again, this is a really important report for growth leaders, for chief growth officers, for CROs and for other C-suite members who are really thinking about growth and understanding the challenges, the pressures and all the changes we've been facing over the last, you know, decade plus, and how we can really approach these head on, leveraging AI to make it maybe a little easier, a little more cost effective and get more outsized results for these new strategies. So it's definitely a report worth checking out. If you're a client, you can access it in the library today, and if you're not, just contact us and find out how to be a client, because this is kind of a can't-miss report for growth leaders. Thanks a lot.

On ConstellationTV: https://www.youtube.com/embed/qvNyB1BEXe4

Big Idea: Why Every Organization Needs to Rethink its Growth Strategy in the Age of AI



A new Constellation "Big Idea" report has just been published, written by me, covering the topic of "Why Every Organization Needs to Rethink Its Growth Strategy in the Age of AI." And I know, it sounds pretty heavy and ominous - but I do believe most organizations are working from outdated and/or incomplete approaches to planning for growth.

Today, nearly all industries are being disrupted by multiple factors. Big changes such as the shifts to subscription/retention economic models, the impact of AI, and the need to deal with rising costs have necessitated a new perspective on growth. Meanwhile, customer expectations are changing - and they are not asking but rather expecting or demanding that you meet them where and when they want. No longer can B2B organizations look solely to sales and marketing as the bastion for growth. Instead, a wider view of growth is needed—one in which all customer-facing departments contribute in a more reliable and scalable manner to overall lift.

And while we have amended our service/product delivery models to meet these changes, and even created "customer success" departments and motions to support them - there is still a huge gap in how high-level growth strategies account for these new realities. That has to change.

In this report - I explain the big "why" in terms of all of these disruptions and drivers, and why taking an approach to growth that incorporates a "full journey" view pays off in terms of more profitable, scalable growth. To meet these changes, the report highlights the need for a new C-suite member: the Chief Growth Officer (CGO) - a role/committee with a more strategic position and perspective. While CROs and CMOs have important roles, their remit and the metrics they use for success are rarely aligned with the full-journey orchestration that supports a modern growth strategy. Supporting the CGO, RevOps teams are ascending into "GrowthOps" departments - elevating the middle office from more tactical "deal desk" support to more proactive, strategic stakeholders. RevOps has a unique position in terms of having a "crow's nest view" of go-to-market operations, and can be more strategic and process-optimization oriented, while sales, marketing and support/success leaders are often too deep in the tactical weeds to think about long-term strategy.

And while AI is a major catalyst and accelerator when it comes to modernizing approaches to growth - it is not THE driver. The disruptions and challenges facing growth leaders predate this AI revolution - but the good news is that growth leaders can leverage AI to reimagine growth strategies faster, and with less heavy lifting than ever before. In the report we delve into some go-to-market use cases and where AI can be applied with the least effort to drive the most results.

The report also offers up four elements of a modern growth strategy - and how your organization can adopt them. Finally, the report provides 10 key questions to ask when rethinking your growth strategies - delving into critical issues that can assist any organization wherever they may be on their growth journey.

This report is available now for Constellation clients in our Research Library. If you are not a client and are interested in accessing the report, you can contact us at [email protected] to learn how.


BioNTech, InstaDeep bet on genAI models to advance R&D, drug discovery, cancer treatment

BioNTech's InstaDeep, which was acquired in 2023 for about $682 million, has released a series of foundational generative AI models for proteins and DNA on its DeepChain platform and outlined a supercomputing cluster called Kyber.

The news, outlined at BioNTech's AI Innovation Day, highlights how foundational models are branching out into industry-specific use cases. In BioNTech's case, its InstaDeep unit is looking to embed AI throughout the life sciences, R&D and drug discovery value chain.

InstaDeep has even created an AI-driven lab agent built on its proprietary data and Meta's Llama family of models.

BioNTech in recent years has been best known for its COVID-19 vaccine partnership with Pfizer. However, BioNTech historically focused on mRNA cancer treatments. BioNTech is betting that AI can drive its drug pipeline for years to come with its acquisition of InstaDeep, which counted Google as an investor. BioNTech and InstaDeep formed a joint AI lab in 2020 and the partnership quickly accelerated.

Ugur Sahin, CEO of BioNTech, explained the company's bet on InstaDeep, which has its own supercomputing cluster called Kyber. Kyber is coming online in Paris and enables InstaDeep to train its own foundational models without the cost and queuing involved with cloud computing.

Sahin said:

"Every cancer treatment for every patient is a new battle. Every cancer cell is different. How can we develop treatments that address tumor cells? Cancer is evolving. Cancer is adaptable. This has now become a high-level computational question."

Sahin added that cancer treatment in the future will start with clinical samples from the patient and an analysis of genetic changes in tumor cells that will generate about 4 terabytes of data for each patient. "We need AI, machine learning and algorithms to come to the right conclusions," he said. "AI gives us the opportunity to do that at a much deeper and faster scale."

Is BioNTech a biotech company or an AI company? Both. Life sciences and AI are likely to become symbiotic.

Ryan Richardson, Chief Strategy Officer at BioNTech, said the company is looking to build an "AI personalized immunotherapy platform." The value drivers for the InstaDeep purchase revolved around cost efficiencies from internalizing model training, building foundational models for vaccines and therapeutics and applying AI to drug discovery.

"The primary use case is to embed AI in drug discovery with the ability to combine our therapeutic platforms on one hand, which are very novel, and the AI capabilities that InstaDeep brings to bear," said Richardson. "There is truly profound disruptive potential in terms of developing or discovering new drugs."

Karim Beguir, CEO and Co-Founder of InstaDeep, said the goal is to work closely with BioNTech to become "a leader in digital biology." Beguir added that for InstaDeep and BioNTech to lead in digital biology, his company also needs to be a leader in AI. "The same technology can apply to multiple use cases," said Beguir. "We are leaders in industrial optimization within biology and outside of biology these add up together. The objective is to continue to be a leading power in the world of AI."

Here's a look at what InstaDeep is working on as part of BioNTech.

A supercomputing cluster named Kyber. Beguir said the Kyber supercluster is built on 224 Nvidia H100 GPUs, 86,000 CPU cores, 1.7 petabytes of persistent storage and a 400 Gbps RoCE network. The cluster, built on premises with Dell, totals about 0.5 exaFLOPS and is one of the top 20 H100 GPU clusters globally.
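As a rough sanity check on the roughly 0.5 exaFLOPS figure, assume about 2 petaFLOPS of low-precision throughput per H100; the exact number depends on precision and sparsity, so treat this as an order-of-magnitude estimate only.

```python
GPUS = 224
PFLOPS_PER_GPU = 2.0  # assumed low-precision throughput per H100 (precision/sparsity dependent)

total_pflops = GPUS * PFLOPS_PER_GPU  # 448 petaFLOPS
total_exaflops = total_pflops / 1000  # ~0.45 exaFLOPS, in line with the ~0.5 figure
print(f"{total_exaflops:.2f} exaFLOPS")
```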

"We are now able to take all the work that we have built upon over the last several years and scale it up over the next five, six, seven, 10 years," said Beguir.

InstaDeep uses an in-house rack design that's easy to expand with modular nodes that offer consistent performance, cost, power and cooling. Standard designs will minimize costs over time. InstaDeep also tailored its AI software stack to its workloads with open standards.

Beguir said InstaDeep built the supercluster to avoid vendor lock-in and benefit from predictable costs while scaling models. Kyber enabled InstaDeep to train genAI models with more than 15 billion parameters with hardware efficiency on par with the latest Meta Llama 3.1 foundational model.

Bayesian Flow Networks (BFNs), a new class of generative model that uses Bayesian inference to update beliefs about data. BFNs generate discrete data in a continuous way, making them well suited for proteomics tasks such as protein folding, function prediction, antibody design and sequence generation.
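For readers unfamiliar with the idea, the core of a BFN is a Bayesian update of the model's belief parameters rather than of noisy samples. The snippet below is a simplified sketch of the discrete-data update, loosely following the original Bayesian Flow Networks formulation: the prior belief over classes is reweighted by a noisy observation and renormalized.

```python
import numpy as np

def bfn_discrete_update(theta: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Simplified BFN-style Bayesian update for categorical beliefs.

    theta: current belief over classes (non-negative, sums to 1).
    y: noisy observation in logit form.
    """
    unnormalized = theta * np.exp(y)
    return unnormalized / unnormalized.sum()

# Example: a weak observation favoring the third class shifts the belief toward it.
print(bfn_discrete_update(np.array([0.25, 0.25, 0.25, 0.25]),
                          np.array([0.0, 0.0, 1.0, 0.0])))
```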

InstaDeep wants to use BFNs to build foundational models based on heterogeneous scientific data to give scientists more flexibility. A model called AbBFN-X is designed to be a multimodal model for antibodies, with 26 different attributes jointly modeled.

DeepChain, a platform designed to use AI to accelerate the R&D pipeline, gains new features. DeepChain is getting generative protein models, ProtBFN and AbBFN, and foundational models for DNA, Nucleotide Transformer and SegmentNT. These models, which can be customized and fine-tuned, are available on Hugging Face under the genomics tag.
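Because the models are published on Hugging Face, they can be pulled with the standard transformers loaders; the snippet below shows the generic pattern. The model ID is illustrative (check InstaDeep's Hugging Face organization for the exact published names), and some of these models require trust_remote_code.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_ID = "InstaDeepAI/nucleotide-transformer-500m-human-ref"  # illustrative ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID, trust_remote_code=True)

sequence = "ATTCTGGCGTACACGTTAAGGCC"  # toy DNA sequence
inputs = tokenizer(sequence, return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch, tokens, vocabulary)
```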

Laila AI agents built on Meta Llama 3.1. Laila is integrated throughout the DeepChain platform and can recommend models and analyze data with internal and external tools. Laila can also visualize results, plot data and zoom in on certain DNA sequences and positions.

InstaDeep executives said that Laila, which comes in multiple sizes, is more than a chat bot and can use its expert knowledge of biology to reason, make decisions and provide feedback.

The company is also working to leverage its models across scientific and R&D workflows. InstaDeep has designed AI tools to automate labs, annotate tissue, segment pathology images and identify novel therapeutic targets.

Constellation Research analyst Holger Mueller said:

"While most of the CxO attention is on cloud platforms and AI vendors when it comes for the latest on genAI, there is substantial innovation coming from the biotech industry as well. BioNtech (where the founder would go on vacation with its workstation) acquired its own AI startup and it is showing significant progress on what matters at the moment - the 'uber ai' that chooses the right AI / statistical models for positive outcomes in protein folding, cancer research and more. It's good to see more AI model competition, especially coming from a practitioner."

More:


Accenture to use Nvidia stack for agentic AI


Accenture has formed a Nvidia Business Group that will deploy agentic AI using Nvidia's full stack. The move puts some systems integrator heft behind Nvidia's software ecosystem.

As noted during Nvidia's recent second-quarter earnings call, the real competitive moat around the chip giant's business is its software ecosystem. That ecosystem also enables Nvidia to sell its GPUs and AI accelerators. Nvidia made a series of announcements highlighting its ability to leverage its software ecosystem to boost performance. In addition, Nvidia is looking to make it easier for enterprises to bring generative AI projects from pilot to production.

In a statement, Accenture said its Nvidia Business Group will include 30,000 consultants trained on the chipmaker's stack. Accenture's AI Refinery platform will also leverage Nvidia's architecture and foundational models. Accenture said it has booked more than $3 billion in generative AI business.

Specifically, Accenture will combine its AI Refinery and Nvidia's AI Foundry, AI Enterprise and Omniverse to focus on process optimization, simulations and sovereign AI. The latter has been identified by Nvidia and other infrastructure players like Oracle as a hot market.

The companies will also use Nvidia NIM Agent Blueprint for virtual facility robot fleet simulation and combine it with Eclipse Automation, an Accenture unit that focuses on manufacturing automation.


Accenture said AI Refinery will be available on all public and private cloud platforms. Accenture said it has used its AI Refinery and AI agents to cut manual steps in campaigns by 25% to 35%, save 6% and increase speed to market by 25% to 55%.

More on agentic AI:
