
How Seattle Seahawks are approaching genAI, model choices with AWS

The Seattle Seahawks are using Amazon Bedrock, generative AI and other AWS services to distribute video and content faster with a focus on quick returns as well as long-tail opportunities. Here's a look at the project and lessons learned so far.

The generative AI project, which started this season, is part of a multi-year extension between the Seahawks and AWS, the team's official cloud, machine learning, AI and generative AI provider. Under the deal, the Seahawks are automating content distribution as well as transcribing, summarizing and distributing press conferences across multiple channels and languages.

AWS and the Seahawks will also integrate generative AI throughout business operations. The Seahawks and AWS first partnered in 2019 on NFL Next Gen Stats, insights on player health, performance and scouting. Lumen Field, home of the Seahawks, is also a showcase for Amazon's Just Walk Out technology.

I caught up with Kenton Olson, Seattle Seahawks Vice President of Digital & Emerging Media, to walk through the generative AI content project and what's next.

The project. Olson said the Seahawks will publish more than 1,000 videos throughout the year. The goal was to speed up the time it takes to get videos from the creation team and editors to production. After the 2023 season, the Seahawks looked to accelerate the process with Olson's content team of 11, which focuses on digital content and platforms.

"We use multiple AWS products for everything from encoding and transcribing video to hosting," said Olson. "We were excited to use Amazon Bedrock to provide some automation to the videos we're shooting to save time and get stuff out faster."

For now, the Seahawks are focused on media availability videos and press conferences. The Seahawks will do about 300 press conferences throughout the year with players and coaches.

In the future, Olson said generative AI will provide an assist for podcasting and the entire video workflow. "As we move forward, we'll train the AI and make sure we tune it because every video is a little bit different," said Olson. "We started with press conferences and are learning."

The process before and after. Olson said the previous process took about 45 minutes to an hour to take a video from creation to publishing and streaming across various channels. The video processing and publishing process had 60 steps. "We're now in a situation where once the video is submitted it's published in about 10 minutes in a worst-case scenario," said Olson. "That includes things like translating and providing a summary that would have taken us hours before."
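
The transcribe-summarize-translate step described above can be sketched with Amazon Bedrock's Converse API. This is a hedged illustration, not the Seahawks' actual pipeline: the model ID, prompt wording and three-sentence format are all assumptions for the example.

```python
# Hypothetical sketch: summarize a press-conference transcript in a target
# language via Amazon Bedrock. Model ID and prompt are illustrative only.

def build_prompt(transcript: str, language: str) -> str:
    """Compose one instruction asking for a short summary in `language`."""
    return (
        f"Summarize this press conference in three sentences, in {language}:\n\n"
        + transcript.strip()
    )

def summarize(transcript: str, language: str = "English",
              model_id: str = "anthropic.claude-3-haiku-20240307-v1:0") -> str:
    """Call Bedrock's Converse API and return the model's text reply."""
    import boto3  # imported lazily so prompt building is testable offline
    client = boto3.client("bedrock-runtime")
    resp = client.converse(
        modelId=model_id,
        messages=[{"role": "user",
                   "content": [{"text": build_prompt(transcript, language)}]}],
    )
    return resp["output"]["message"]["content"][0]["text"]
```

Calling `summarize(transcript, "Spanish")` for each target language is one plausible way a single submitted video fans out into summaries and translations within minutes.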

    Returns on investment. Olson said the initial return is time saved that frees his team up to think of new types of content to create. "We'd like our people thinking of new types of content not necessarily pushing buttons to publish something," said Olson.

By the end of the season, Olson expects to save hours "so that our content creators can focus on creating other things for our fans and exposing that content."

    Another early return is that the Seahawks can provide more in-depth information with generative AI summaries that can give fans more opportunities to discover content in unique ways. "We're also excited to see how our search engine referrals and various components are improved by providing more rich metadata," said Olson.

    Longer term, Olson said generative AI can boost the returns of the Seahawks video archive, which will be critical since the franchise will soon enter its 50th season. Olson said:

"We have done a good amount of work over the past couple of years to take old Betamax tapes off the shelf and digitize those. We don't have a lot of real good data on all those, and so we're working with AWS right now to figure out how to process them and get a lot more data about who's in the video and what they talked about. In the future, we could say here's a Jim Zorn video of him talking about something and do it within seconds. Today that would be a lot of manual scrubbing. As we move forward, there will be opportunities to talk about our history."

    More from the genAI field:

Model choices. Olson said the plan from the beginning was to test multiple models and analyze them based on quality of output without human intervention. His team has already swapped a few models out for certain use cases as the project moved from pilot to production. "It definitely took us some tinkering to understand what model makes sense and which doesn't. The tremendous thing about Bedrock is that we can use many different models," he said. "When we built this process, we knew these models are all changing. The model we're using now is really great, but for all we know there's some model in six or seven months that we'll want to move to."
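
The test-and-swap pattern Olson describes can be sketched as a small evaluation loop over candidate Bedrock model IDs. The candidate IDs and the `score()` heuristic below are invented for illustration; a real team would plug in its own quality checks.

```python
# Hedged sketch of comparing candidate Bedrock models for one use case.
# Model IDs are illustrative; score() is a stand-in quality heuristic.

CANDIDATES = [
    "anthropic.claude-3-haiku-20240307-v1:0",
    "amazon.titan-text-express-v1",
    "meta.llama3-70b-instruct-v1:0",
]

def score(output: str) -> float:
    """Toy quality heuristic: prefer non-empty answers near ~60 words."""
    if not output.strip():
        return 0.0
    return 1.0 / (1 + abs(len(output.split()) - 60))

def best_model(prompt: str) -> str:
    """Run the same prompt through each candidate and keep the top scorer."""
    import boto3  # lazy import: the scoring logic alone needs no AWS access
    client = boto3.client("bedrock-runtime")
    scored = []
    for model_id in CANDIDATES:
        resp = client.converse(
            modelId=model_id,
            messages=[{"role": "user", "content": [{"text": prompt}]}],
        )
        text = resp["output"]["message"]["content"][0]["text"]
        scored.append((score(text), model_id))
    return max(scored)[1]
```

Because Bedrock exposes every model behind the same Converse call, moving to "some model in six or seven months" is a change to the `CANDIDATES` list rather than to the pipeline.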

    Humans in the loop. Olson said the primary goal of the genAI project was to focus his team on more content and new ideas. The process for the video team is to bookend video production with human oversight. At the end of the process, humans make the quality checks and decide to publish, but the models have gotten to the point where "we're hitting publish more than having to make edits," said Olson.

Olson's team also had to give the models unique spellings of names as well as new players, since roster changes happen daily and weekly. "We really work on ingesting our roster before every video to make sure the latest players are there," he said. Today, the models get an update every time there's a roster change.
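
One plausible way to implement the "ingest the roster before every video" step is an Amazon Transcribe custom vocabulary refreshed on each roster change. The vocabulary name and roster source below are assumptions; Transcribe's phrase format (hyphens in place of spaces for multi-word entries) is real.

```python
# Hedged sketch: keep player-name spellings current by syncing the roster
# into an Amazon Transcribe custom vocabulary. Names and the vocabulary
# name are illustrative assumptions.

def roster_phrases(roster: list[str]) -> list[str]:
    """Transcribe phrases use hyphens instead of spaces in multi-word entries."""
    return [name.strip().replace(" ", "-") for name in roster if name.strip()]

def sync_roster_vocabulary(roster: list[str],
                           vocabulary_name: str = "seahawks-roster") -> None:
    """Re-run whenever the roster changes so new players are recognized."""
    import boto3  # lazy import keeps the phrase formatting testable offline
    transcribe = boto3.client("transcribe")
    transcribe.update_vocabulary(
        VocabularyName=vocabulary_name,
        LanguageCode="en-US",
        Phrases=roster_phrases(roster),
    )
```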

    What's more intriguing to Olson is the human input at the front end of the process. He said that generative AI speeds up ideation and allows creators to try new things in seconds and iterate from there.

The project timeline. Olson said the Seahawks started on the genAI project in the late spring with building components. By the time the season started, the Seahawks were ready to go. "It took us about two months of adding pieces and iterating to make sure we could move forward," said Olson. "It was more about adjusting the model to fit our needs and making sure we use it in the correct way."


    OpenAI, AI Sustainability, CX Optimization | ConstellationTV Episode 90

    We made it to ConstellationTV episode 90! 📺 Hear co-hosts Holger Mueller and Liz Miller discuss enterprise technology news, including AI Forum highlights, AI integration in workforce management, and the impact of OpenAI's recent funding and future innovation.

Then R "Ray" Wang has an engaging conversation with Sol Salinas, EVP and Sustainability Lead for Capgemini Americas, on the role of sustainability, climate tech, and AI in addressing environmental challenges.

    The episode concludes with a CR CX Convo with leaders from The Scotts Miracle-Gro Company on the importance of empathy in customer experience.

    00:00 - Meet the hosts
    01:24 - Enterprise tech news updates (AIF 2024, workforce management AI, OpenAI)
    17:11 - Sustainability and AI with Capgemini's Sol Salinas
    29:40 - CR CX Convo: Tests CX to Optimize for Extraordinary Growth
    42:13 - Bloopers!

ConstellationTV is a bi-weekly Web series hosted by Constellation analysts; tune in live at 9:00 a.m. PT/ 12:00 p.m. ET every other Wednesday!


    How Seattle Seahawks are approaching genAI, model choices with AWS | Constellation Insights Interview

The Seattle Seahawks are revolutionizing their video content distribution with the help of #AWS and #generativeAI. In a recent interview, Kenton Olson, VP of Digital and Emerging Media for the Seahawks, shared insights into their innovative project with Larry Dignan, Editor in Chief of Constellation Insights.

    A few highlights include...

📌 The team produces over 1,000 videos per year, creating a time-consuming distribution challenge. By integrating AWS services like MediaConvert, Transcribe, and Bedrock, they've reduced publishing time from 30-45 minutes to just 10 minutes.
    📌 Generative AI allows the Seahawks to automatically generate video summaries, translations, and metadata - saving their content team valuable time to focus on creating new, engaging content.
    📌 The AI-powered system has improved the searchability and fan discoverability of Seahawks videos by providing richer metadata. This enhances the team's ability to reach and connect with their passionate fanbase.
    📌 The flexibility of AWS Bedrock enables the Seahawks to easily test and swap AI models, ensuring they can adapt to the latest advancements in generative AI technology.

    This example showcases how sports organizations can leverage the power of #cloud and #AI to streamline operations, enhance fan engagement, and unlock new content opportunities. The Seahawks are leading the way in transforming the digital fan experience. #AWS #AI #SportsTech #SeattleSeahawks #DigitalTransformation


    Zoom launches AI Companion 2.0, eyes enterprise, industry expansion

    Zoom outlined its roadmap and upcoming products that include AI Companion 2.0 across its platform, a focus on work management for frontline workers and a deeper dive into contact center, education and healthcare markets.

In a briefing, Smita Hashim, Chief Product Officer at Zoom, said the company's AI Companion effort initially revolved around the theme of "meet happy" by adding tools for engagement, productivity and collaboration. With AI Companion 2.0, Zoom is expanding its view more toward "work happy."

    "We have expanded our vision to work happy. And what work happy really means for us from our perspective is helping our customers and users have the time to really be what is uniquely human to them," said Hashim, who noted that AI Companion is now activated on more than 4 million accounts and 57% of the Fortune 500. "AI Companion is going to be your personal assistant that can work across Zoom Workplace."

This vision will be outlined at the company's Zoomtopia conference, which will feature CEO Eric Yuan's keynote as well as customer sessions. The upshot with Zoom's product launches and roadmap, which lands in the fourth quarter of 2024 and extends into 2025, is that the company is gunning for Microsoft Teams. The challenge for Zoom will be overcoming AI assistant fatigue--every enterprise application has one--and breaking through the Microsoft bundle.

Hashim addressed AI agent, companion and copilot fatigue directly.

    "Sprawl is a challenge for our users and frankly some of the implementations are confusing. Why do I need an agent just for SharePoint? I don't want hundreds and thousands of agents running around in my user interface. With AI Companion 2.0 we see it as a super-agent that will have skills to connect to Workday, Jira and various workflows to help you get more done."

    Constellation Research analyst Holger Mueller said:

"Zoom has made massive progress in the last year to position itself beyond just synchronous communication, making it more and more an alternative to the omnipresent Teams. The question will be – can Zoom overcome the power of the Microsoft enterprise agreement with innovation? It certainly has some key capabilities here, especially on the AI side – where it makes AI insights and automation more easily accessible than other products, as well as offering a single AI assistant with Zoom AI Companion. On the new offering side, the Frontline Worker offering has a lot of potential; Zoom got the capability mix right, now it has to get the price point right. Overall Zoom has the ability to change the future of work – again."

    Zoom's announcements at Zoomtopia include:

Zoom AI Companion 2.0, available October 2024. Zoom AI Companion 2.0 works across the Zoom Workplace platform, adds context, synthesizes data from meetings, chats and docs as well as Microsoft Outlook, Office, Gmail and other applications, and can take action.

    • Takeaway: AI Companion 2.0 is Zoom's horizontal AI and agent play. By connecting to other enterprise data repositories Zoom is betting that it can be the lead AI collaboration tool. The fact that Zoom doesn't charge extra for AI Companion is a big selling point for adoption.

Custom AI Companion add-on for Zoom Workplace, which enables enterprises to customize AI Companion and connect it to business apps and data sets. This customization ability is expected in the first half of 2025 at $12 per user per month. AI Companion is typically included in Zoom Workplace without an add-on fee.

    • Takeaway: Hashim said the ability to customize was one of the big customer requests. The move to personalize AI Companion is a natural progression. "Our customers and their organizations are unique. They work across different applications, different data sources. Their employees are unique. They have different goals, they have different ambitions and customers have been asking us how AI companion could expand to address all of those needs," she said.

Zoom AI Studio, which will customize AI Companion with connectors to knowledge bases, fine-tuning and AI skills.

    • Takeaway: Custom AI Companion add-on and AI Studio will enable AI Companion to move across workflow applications and take actions. This ability will also enable Zoom to provide personal employee coaching and avatars.

    Zoom Tasks, which will use AI Companion to detect tasks across Workplace and sync updates.

    AI Companion for Workvivo for employee engagement as well as a listening suite to gauge employee sentiment. Will Meta's Workplace shutdown be a boon for Zoom's Workvivo?

Contact center enhancements such as dynamic agent guides, suggested answers and supervisor tools.

Industry enhancements with AI Companion for Educators including lesson plans, Zoom for Healthcare including Zoom Workplace for Clinicians, and Zoom Workplace for Frontline, which is aimed at workers in the field. Frontline workers were a big focus for Meta's Workplace, which is winding down and moving customers to Zoom's Workvivo.

• Takeaway: Zoom's education efforts are notable since they alleviate hybrid learning pain points, and connectors to Canvas and other common education applications are a win. In addition, Zoom Workplace for Clinicians is a nice way to leverage the platform overall. Zoom's focus on healthcare looks to build on its 36% telemedicine market share. Zoom said more than 140,000 healthcare institutions use Zoom. Zoom Workplace for Frontline is also an area with a lot of white space for the company.


    Atlassian Rovo AI additions go GA with consumption pricing on deck

    Atlassian said its latest AI features and Rovo, a generative AI assistant that operates across the company's platform, are generally available across the company's products. Atlassian also introduced Rovo Agents.

The company said it will offer Rovo at $20 per user per month for annual subscriptions, $24 per user per month billed monthly, and consumption pricing in mid-2025. Licensing will be based on Rovo use per site, where any user with access to the site is billable. Enterprises pay only once per billable user.
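
The per-site billing rule as described reduces to set arithmetic: a user with access to several Rovo-enabled sites is counted once. A minimal sketch of that calculation, with the $20 annual rate from the announcement and invented site/user names:

```python
# Sketch of the stated Rovo billing rule: billable users are the union of
# users across Rovo-enabled sites, and each user is paid for only once.

def billable_users(site_access: dict[str, set[str]]) -> set[str]:
    """Union of users across all Rovo-enabled sites; duplicates collapse."""
    users: set[str] = set()
    for site_users in site_access.values():
        users |= site_users
    return users

def monthly_cost(site_access: dict[str, set[str]], per_user: float = 20.0) -> float:
    """Monthly bill at the announced annual-subscription rate per user."""
    return len(billable_users(site_access)) * per_user
```

For example, a user who can reach both a Jira site and a Confluence site with Rovo enabled contributes $20, not $40, to the monthly total.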

    Enterprise software vendors have been tweaking monetization models as some vendors focus on consumption or even conversations with an AI agent.

    In May, Atlassian launched Rovo with the following core components:

• Rovo Search, which will comb through content wherever it is stored (Google Drive, Microsoft SharePoint, GitHub, Slack etc.) and query across applications. Rovo Search will identify teammates, projects and information needed to make decisions. It will connect niche and custom apps via API and have enterprise-grade data governance.
    • Insights, which are delivered via knowledge cards that offer context about projects, goals and teammates.
    • Rovo Chat, a conversational bot that is built on company data and learns as it goes.

In a briefing, Jamil Valliani, Head of Product AI at Atlassian, cited early customers who have boosted efficiency by about 25% using Rovo with their development teams. Rovo beta testers said they've saved one to two hours per week.

Atlassian is connecting Rovo Search to data sources via connectors and feeding that data to Rovo Chat and throughout the platform.

The company also outlined Rovo Agents, which operate out-of-the-box from Atlassian's marketplace partners. Atlassian is providing more than 20 out-of-the-box agents as well as low- and no-code tools to build your own Rovo Agents.

    According to Atlassian, Rovo Agents can speed up the development process by automatically generating code plans, code recommendations and pull requests based on task descriptions, requirements and context.

    Other updates for Atlassian Intelligence include:

• Jira Service Management will use AI to group related alerts, surface critical incidents and suggest the right resources and subject matter experts. The AIOps capabilities also capture incident timelines, generate post-incident reviews and summarize details.
    • Jira Service Management virtual service agent will automate support across multiple platforms and add new onboarding and automation enhancements.
    • Loom will get AI-powered automated workflows via integrations with Jira and Confluence.

    Atlassian's AI additions will be critical to the company's future growth. In August, Atlassian projected first quarter revenue of $1.149 billion to $1.157 billion, below the consensus estimate of $1.16 billion. For fiscal 2025, Atlassian projected revenue growth of about 16%, below the 18% expected by Wall Street.

    The company at the time cited uncertain macroeconomic conditions and an evolving go-to-market strategy.

    Speaking at an investment conference, Atlassian Chief Operating Officer Anu Bharadwaj said early adoption of Atlassian Intelligence and Rovo has been strong.

"Thousands of customers have adopted Atlassian Intelligence already so far, and I'm very pleased with the repeated usage that it gets because one of the interesting things about AI is where are the use cases where you can unlock tangible productivity benefits. I think it is still early innings, so I’m very much looking forward to seeing how that plays out."

Regarding pricing, Bharadwaj said Atlassian has raised prices for its cloud products over time as it has added AI, automation and new features. "The price increases are very much in tune with the amount of customer value that we are able to deliver," she said. "In terms of seat-based versus not, I do think that there is an interesting exploration there around consumption-based pricing, which we will really think through, especially in an AI world, where we talk about virtual agents, which will be different than a seat-based model."


    How GE Healthcare is approaching generative AI, LLMs, and transformation

    GE Healthcare has been working on machine learning, deep learning and artificial intelligence for years, but now the company sees an inflection point where generative AI can transform healthcare from products to workflow to efficiencies that improve the customer experience.

    Parminder Bhatia, Chief AI Officer of GE Healthcare, said the emergence of multimodal large language models (LLMs) can uniquely improve healthcare, which is built on everything from different modalities, imaging data, clinical notes, voice interaction, electronic health records and other data.

    Before GE Healthcare, Bhatia oversaw generative AI and large language models at Amazon Web Services. His group worked on Amazon Q and Amazon Bedrock. GE Healthcare and AWS recently announced a partnership to transform healthcare with a focus on purpose-built generative AI models using services such as Bedrock.

    We caught up with Bhatia, an AI 150 inductee, at Constellation Research's AI Forum in New York to talk shop. Here's a look at the takeaways.

    GE Healthcare's approach to AI. Bhatia has been in his current role for about 18 months overseeing the strategy and vision for AI at GE Healthcare. For GE Healthcare, the AI strategy revolves around the AI going into the MRI, CT and X-ray machines as well as digital platforms that focus on clinical and operational efficiencies across a hospital.

    "There's a lot of focus on how we build these technologies that can really streamline workflow," said Bhatia. For instance, AI in an MRI machine that can reduce scan time by 50% with the same quality doubles the efficiency and productivity of the workforce.

    Other examples of AI's role at GE Healthcare include AI in ultrasound equipment that can act as a copilot, remote scans and imaging and technologies that "improve the efficiencies and accelerate getting better diagnosis, solving problems in treatment and cancer areas as well," said Bhatia.

GE Healthcare has been a pioneer in machine learning and deep learning for more than a decade and has held the highest number of FDA-approved AI authorizations for three years in a row.

    Why generative AI and healthcare go together. Bhatia said LLMs have been all the talk, but the excitement around them is that they are multimodal. That ability to be multimodal means they apply well to healthcare.

    He said:

    "These technologies are truly multimodal in nature and that means they're more tailored for healthcare, which consists of data coming from different modalities, imaging data, clinical notes, voice interaction, your EHRs and other data. As these technologies were being built out it made sense for me to get back into healthcare. It's the perfect opportunity to apply these applications."

    Patient experience and AI. Bhatia said AI will ultimately have an impact on the patient experience as workflows and staffing levels are improved for diagnosis to screening to treatment and therapy. GE Healthcare Command Center is using AI to streamline hospital operations, manage staffing and send triggers for actions. While many of those technologies don't affect the patient directly, the patient experience is improved with capacity planning.

    "These technologies streamline operations and that becomes relevant across a spectrum of things," said Bhatia. "Patient guidance will also be key as we take care from inside the hospital to outside with patient monitoring and virtual care at home."

    These hospital workflows will give a longitudinal patient view across care that improves experiences, he said.

Indeed, GE Healthcare recently acquired MIM Software, a company that manages workflows from diagnosis to treatment and therapy.

Personalization of care. Bhatia said AI will also play a big role in personalized cancer treatments that deliver targeted radiation to kill cancer cells.

    "In the next three to five years, you're going to have thousands of variations in which these different radiopharmaceutical drugs can be given to the individual patients," he said. "MIM Software is designed to address the complexities that happen across the system, where it provides solutions to navigate the expanding landscape of personalized treatment."

    Bhatia added:

    "A lot of these things are starting with operational efficiency, but also combining multimodal data. I think that's where AI is becoming a key enabler, not just at the diagnosis level, but health clinicians can streamline the longitudinal view of the patient's data, which is truly multimodal. That technology and data can really streamline the operations, which has impact on better therapy and more personalized therapy for patients as well."

GE Healthcare's hybrid AI strategy. Bhatia said the company is taking a hybrid approach to AI and investing in talent focused on cloud and AI. "We are bringing a lot of that muscle for cloud and AI across the spectrum," he said. "That becomes the key component as we're looking into a lot of problems and challenges as well."

The hybrid strategy will mean "a lot of things happen on prem and a lot of things will happen in the cloud to accelerate and transform," said Bhatia. With AWS, GE Healthcare will look to build its own foundational models as well as use multiple LLMs for everything from workflows to equipment to treatments and imaging. Bhatia said:

    "The partnership we announced with AWS is about strategy and foundational model building for building our own proprietary genAI, streamlining workflows and developing use cases. The partnership is really 1+1 is greater than 2 because you get a lot of benefits from security and scale with AWS and GE Healthcare being in more than 160 countries."

This approach to hybrid AI will also mean multiple partnerships for clinical research. Ultimately, GE Healthcare wants to be able to predict if a patient is going to skip or arrive late to appointments, adapt workflows and build in flexibility, said Bhatia.

    Model choice. Bhatia said flexibility with foundational models is critical. "One model is not going to solve all problems and you'll have to look to the clinical side and the operational side of things," he said. "The first place AI can have an impact is to alleviate cognitive and data overload to highlight what's relevant."

Bhatia added that models will also need to be adapted for specific use cases. Open-source models have potential to be adapted for specific use cases as well.

    AI as a horizontal and vertical tool. Bhatia said it's important for AI leaders to think about generative AI as a horizontal enabler and a technology that can be used to drill down in specific areas. He said:

"You can build these AI algorithms for breast cancer, but they can easily be adapted to prostate cancer or lung cancer. And I think that's where these technologies are becoming a real game changer. How do you adapt them, not just looking into the vertical side of things, going from diagnosis to treatment therapy and the entire patient journey, but also how they can be adapted across the spectrum?"


    HOT TAKE: SupportLogic New Features Help Leverage Support as Revenue Driver

    SupportLogic has been in business since 2016, and has primarily been seen as a tool that helps support leaders drive a more enhanced support experience (or “SX” as the company brands it). This has mostly been achieved by using SupportLogic’s ML and sentiment analysis to extract “signals” from emails and other text-based data inside customer cases to prevent escalations, and provide better agent quality control. 

    But the company has long understood that the signals it extracts are far more valuable than the core use cases of escalation avoidance and more intelligent case routing. CEO Krishna Raj Raja has always called the support center a “revenue center” rather than a cost center - positing that support organizations are the true front line when it comes to the actual voice of the customer. 

In B2B relationships (especially in high tech, where SupportLogic has focused), this rings true. Think about it - CRM data from sales interactions holds some important deal data, but typically ends when the deal is closed. Marketing data captures only interest, plus a little insight about products purchased and used and the customer experience. But case data includes a treasure trove of insights around actual products deployed, how they are used, user satisfaction and dissatisfaction - as well as signals around what a customer might be missing: features, configurations, additional products, etc. that can lead to a more successful (and profitable) relationship, if acted upon at the right time and in the right manner.

Enter “Expand” - a new feature set from SupportLogic that expands its signal extraction capabilities around account health scores to equip customer success and other revenue teams with previously hidden cross-sell and upsell opportunities. As the company puts it, SupportLogic's Expand module brings real-time account health visibility to account management teams, helping them identify upsell and cross-sell opportunities, monitor customer satisfaction, and act on early warning signs that may signal potential issues or churn risks. This provides an interesting workflow where true customer signals can flow to customer success, account execs, etc. to better act on revenue opportunities - signals that typically get lost in unstructured text, or are not captured properly at all.

    The new features of Expand are designed to provide a comprehensive view of account health, combining insights from multiple data sources, allowing teams to take proactive measures for growth. It includes the following core features:

    • Account Health Score: A unified score that reflects the overall health of the customer relationship, combining sentiment data, support history, and product usage signals.
    • Account Commercial Signals: Commercial signals designed to improve customer retention, drive revenue growth, and foster long-term customer loyalty.
      • Signals include: Churn risk, renewal likelihood, competitive consideration, expansion opportunity, price sensitivity, license upgrades and downgrades.
    • Account Summarization: Generative AI-based automated summaries that capture the status of key accounts, making it easy for account managers to stay informed.
    • Account Alerts: Real-time alerts for changes in account health, including early warnings on potential churn or upsell opportunities.
    • Account CRM Widget: Seamless integration into popular CRM platforms, enabling account managers to view account health directly from their CRM dashboards.
    • Integration with Gainsight CS: Built-in compatibility with leading Customer Success platforms like Gainsight CS, offering streamlined workflows for customer success and account management teams.
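A unified Account Health Score that blends sentiment, support history, and usage signals can be illustrated with a minimal sketch. The signal names, weights, and 0-100 scale below are hypothetical assumptions for illustration, not SupportLogic's actual model.

```python
from dataclasses import dataclass

@dataclass
class AccountSignals:
    """Normalized inputs in [0, 1]; names are illustrative only."""
    sentiment: float        # e.g. rolling average of case-sentiment scores
    support_history: float  # e.g. inverse of escalation/reopen rate
    product_usage: float    # e.g. active-seat utilization

# Hypothetical weights -- a real product would learn or tune these.
WEIGHTS = {"sentiment": 0.4, "support_history": 0.35, "product_usage": 0.25}

def account_health_score(s: AccountSignals) -> float:
    """Weighted blend of the three signals, scaled to 0-100."""
    raw = (WEIGHTS["sentiment"] * s.sentiment
           + WEIGHTS["support_history"] * s.support_history
           + WEIGHTS["product_usage"] * s.product_usage)
    return round(raw * 100, 1)

print(account_health_score(AccountSignals(0.8, 0.6, 0.9)))  # 75.5
```

The point of a composite score like this is that a drop in any one input (say, sentiment) moves the headline number, which is what makes threshold-based alerts on the score possible.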

    The new Expand feature set is indicative of a broader shift toward technology that supports a full-journey approach to growth optimization. According to SupportLogic, organizations that understand, and act on, the types of signals that can be gleaned from support interactions can increase customer lifetime value and reduce churn in a more profitable and seamless manner.

    In the age of agentic AI, and a lot of “easy button” promises from leading CX vendors, buyers should look more closely at how use cases such as account health and expansion revenue analysis are truly supported by generic generative AI and agent tools, as well as how they are addressed by the copilots and agents found in leading CRM platforms. Tools like SupportLogic are not “easy buttons,” but for businesses with longer, complex support cycles that include a lot of back and forth from which product and sentiment signals can be extracted, they are worth considering, as many broader-based AI tools are not as finely tuned for such specific B2B use cases.

    [Image: SupportLogic Expand in Salesforce screenshot]

    Nvidia launches NIM Agent Blueprint for cybersecurity


    Nvidia launched a NIM Agent Blueprint for cybersecurity as it continues to expand use cases for its microservices and AI agent platform.

    At its AI Summit in Washington, D.C., Nvidia outlined its NIM Agent Blueprint for container security. The cybersecurity blueprint combines Nvidia's Morpheus cybersecurity AI framework, Nvidia cuVS and RAPIDS data analytics to accelerate analysis of common vulnerabilities and exposures (CVEs) at scale.

    The cybersecurity blueprint is included in Nvidia AI Enterprise, the GPU giant's flagship software platform for AI applications.

    Nvidia has had a steady stream of NIM Agent Blueprint news as it aims to make agentic AI more commonplace in enterprises.

    According to Nvidia, its NIM Agent Blueprint for container security enables enterprises to use generative AI to digest information and then explain vulnerabilities using natural language. Companies can then create agents for cybersecurity workflows.
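The idea of digesting vulnerability data and explaining it in natural language can be sketched generically. The code below is illustrative only: it uses made-up CVE records and a simple template, not Nvidia's Morpheus, cuVS, or NIM APIs, and CVSS severity bands follow the common v3.x qualitative scale.

```python
# Illustrative CVE-triage sketch -- not Nvidia's Morpheus/cuVS/NIM APIs.
# CVE records here are fabricated examples for demonstration.
cves = [
    {"id": "CVE-2024-0001", "cvss": 9.8, "package": "openssl",
     "summary": "heap buffer overflow in certificate parsing"},
    {"id": "CVE-2024-0002", "cvss": 4.3, "package": "libxml2",
     "summary": "denial of service via crafted XML"},
]

def explain(cve: dict) -> str:
    """Render a structured CVE record as a plain-language triage note."""
    severity = ("critical" if cve["cvss"] >= 9.0
                else "high" if cve["cvss"] >= 7.0
                else "moderate" if cve["cvss"] >= 4.0 else "low")
    return (f"{cve['id']} affects {cve['package']}: {cve['summary']} "
            f"(CVSS {cve['cvss']}, {severity} severity).")

# Triage the highest-severity findings first.
for cve in sorted(cves, key=lambda c: -c["cvss"]):
    print(explain(cve))
```

In a production blueprint, the template step would be replaced by an LLM call grounded in the scanned container's package inventory; the ranking-then-explaining workflow is the part agents automate.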

    Nvidia added that Deloitte is among the first to use Nvidia NIM Agent Blueprint for container security in its cybersecurity applications.

    Here's a look at the architecture.

    Among other notable items from Nvidia at its AI Summit:

    More on agentic AI:


    SAP gives ABAP code a genAI boost, adds data lake capabilities to Datasphere


    SAP at its TechEd conference delivered its share of AI agent headlines as its Joule generative AI assistant extends across the company's platform, but as a practical matter, expanded developer options for ABAP and new data lake capabilities will have a much larger impact.

    The goal for SAP is to move customers' custom SAP ECC code to S/4HANA clean core code as soon as possible, and the company outlined a series of moves to make that happen faster with a big assist from generative AI.

    At TechEd, SAP said it will enable ABAP developers to generate high-quality code with its Joule generative AI copilot that complies with SAP's ABAP cloud development model. According to SAP, "Joule will also be able to generate explanations for legacy code, making it easier to modernize legacy codebases and migrate to a clean core." ABAP is a programming language that runs in the SAP ABAP runtime environment, created and used by SAP for the development of application programs.

    Constellation Research analyst Holger Mueller did a deep dive on the implications for ABAP developers, which will get extended customer fields, business logic and processes. Mueller noted that the ABAP additions to SAP Build will give SAP the ability to update legacy code at scale. ABAP has 2 million active developers.

    By the end of 2024, SAP Build will include access from ABAP development tools and environments for SAP S/4HANA Cloud. The integration will enable developers to create and monitor ABAP Cloud projects in SAP Build.

    The other big move by SAP revolved around new embedded data lake features for SAP Datasphere. By the end of the fourth quarter, Datasphere will have a data lake option to complement existing storage. Businesses will be able to analyze data across hybrid environments and preserve context and logic.

    SAP said the data lake capabilities include:

    • An integrated object store for more efficient data transformation and processing.
    • Spark compute based on existing Datasphere data integration.
    • The ability to access data on integrated object stores without physically copying the data.
    • Ultimately, users will be able to import and integrate data at scale from SAP sources and non-SAP sources.

    Mueller said:

    "SAP shows broad investment across its technology platform, which is the key focus for all enterprise software vendors, readying their SaaS suites for the era of Infinite Computing. SAP with its new data lake capabilities adds (for the first time) object storage abilities. With this move, SAP for the first time gives customers the chance to holistically build next generation of applications powered by AI on its platform. Equally is important that SAP finally shows some love to its 2M+ active ABAP developers with support of ABAP in SAP Build, as well as making Joule available in ABAP. Critical for the ecosystem is that SAP now finally allows partners to build custom ABAP Code in SAP S/4 HANA Cloud Public Edition – a key move to help SAP customers to upgrade to S/4HANA. The impact of these three makes the solid progress on AI pale, where SAP shows the right execution – widening and deepening Joule capabilities. What stands out for Joule is the design point to have a single AI assistant across SAP."

    Here's a look at everything announced at TechEd:

    • SAP Build will get the ability to give Joule custom skills as well as use SAP HANA Cloud to ground large language models using its vector engine.
    • Joule will get multiple AI agents that will combine business function expertise with the ability to carry out complex workflows. Joule will bring together specialized AI agents in areas like supply chain, procurement and finance.
    • SAP will add two out-of-the-box autonomous AI agent use cases including dispute management for incorrect or missing invoices, duplicate payments and the like and a financial accounting use case to streamline financial processes.
    • Joule will support 80% of SAP's most used business tasks by the end of the year. Joule will also be available in SAP Service Cloud and SAP Concur as well as SAP S/4HANA Cloud Public Edition. Simply put, Joule will be integrated into all of SAP's clouds.
    • SAP Knowledge Graph launched as a business context tool that is preloaded with ABAP tables, CDS views, APIs, and key data models so enterprises can ground AI models.
    • The company said it added Anthropic Claude 3.5 Sonnet via Amazon Bedrock to its generative AI hub along with the addition of IBM Granite foundation models, Meta Llama 3.1 and Mistral Large 2 and Codestral.
    • SAP Generative AI Hub gets improvements to customize pre-trained AI models, a new software development kit, and new regions from the big three hyperscale cloud providers.
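The bullet about using a vector engine to ground large language models can be sketched generically. The snippet below is not SAP HANA Cloud's actual API; the document names, toy 3-dimensional embeddings, and similarity threshold are assumptions chosen to show the retrieval step that grounding relies on.

```python
import math

# Illustrative grounding sketch -- not SAP HANA Cloud's vector engine API.
# Embeddings here are toy 3-d vectors; a real system would use an embedding model.
docs = {
    "invoice-policy": [0.9, 0.1, 0.0],
    "travel-policy":  [0.1, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, k=1):
    """Return the k documents most similar to the query embedding.

    The returned documents would be stuffed into the LLM prompt as
    grounding context before generation.
    """
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.05]))  # ['invoice-policy']
```

The design point is the same regardless of vendor: nearest-neighbor search over embeddings selects enterprise documents to anchor the model's answer, which is what reduces hallucination on business data.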

    Intuit's Enterprise Suite could upend midmarket ERP


    One of Intuit's big bets for the years ahead is to disrupt the midmarket ERP market with its Intuit Enterprise Suite, as it aims to fill a gap between QuickBooks Online Advanced and the costly ERP implementations that are required when companies grow.

    The company launched Intuit Enterprise Suite last month, combining its unified platform powered by Intuit Assist, a common data store, services from its various offerings and generative AI to take aim at an $89 billion total addressable market of larger, mid-market businesses.

    Intuit launched Enterprise Suite in the US for multi-entity, service and project-based businesses. These early adopters are providing a feedback loop for Intuit to continue to iterate on its platform.

    Sasan Goodarzi, CEO of Intuit, laid out the importance of Intuit Enterprise Suite during the company's recent Investor Day. "We have no intention to serve enterprise businesses, but every intention to serve large mid-market businesses," said Goodarzi, who noted there's a massive gap in the ERP market for companies that are growing and complex but don't have the time for an ERP implementation. "You go out and talk to large customers and their words are 'big ERP is an organ transplant, it's too expensive and it's not about the yearly expense as much as it is the couple of years it takes to shift to an ERP solution.'"

    Goodarzi explained that Enterprise Suite has been in the works for a long time, but AI has enabled it to serve customers far north of $3 million in revenue that have multiple locations, the need for segment reporting and various requirements. In other words, Intuit has a continuum that serves prosumers with QuickBooks, small businesses with QuickBooks Online Advanced and midmarket firms with Enterprise Suite.

    Among early adopters of Intuit Enterprise Suite, annual revenue per customer (ARPC) is about $20,000. That sum is a win for Intuit and businesses since Enterprise Suite can consolidate an average of 10 business apps used by midmarket companies. QuickBooks Online Advanced ARPC for fiscal 2024 was $3,299.

    For fiscal 2025, Intuit is projecting revenue growth of 12% to 13% to $18.16 billion to $18.35 billion. Its Global Business Solutions Group, which includes QuickBooks, Mailchimp and Enterprise Suite as well as workforce management tools, will deliver fiscal 2025 revenue north of $11 billion, up 16% to 17% from a year ago.

    This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

    Laurent Sellier, SVP Product for Midmarket in Global Business Solutions Group, said: "It's common for customers to allocate tens of thousands of dollars annually for business management software purchases, and even hundreds of thousands more for external support to get new systems set up. We are encouraged by the customer feedback from early adopters and they're telling us they're getting a lot of value by being on one platform, having one source of truth and easy onboarding due to the familiar navigation."

    Intuit estimated that there are 1.9 million mid-market businesses in its priority markets. There are 283,000 QuickBooks Online Advanced customers today, up 28% from a year ago. Those customers now have an upgrade path to Intuit Enterprise Suite. In a demonstration, Intuit used a construction company with 25 employees and $25 million in revenue as an Enterprise Suite customer.

    Enterprise Suite includes:

    • Financial and accounting tools to prepare financial statements and manage intercompany transactions. Intuit also uses AI to automate planning tasks such as cash flow management, budgeting and profit and loss forecasts with dashboards.
    • KPI tracking and insights by project and industry. These KPIs also sync with employee payroll and time data as well as historical financials.
    • Mailchimp marketing integration to find, retain and manage customers.
    • HR features for onboarding processes, healthcare, retirement and workers comp benefits. Payroll tax calculations, deductions and filings are automated with AI to catch errors.
    • Accounts payable and receivable automation and reconciliation.
    • Access to experts and services for customer success and customization. Intuit Enterprise Suite will have versions for construction, non-profit, service and project-based businesses.

    Impact on competition

    Intuit Enterprise Suite may disrupt midmarket ERP by providing a familiar upgrade path for businesses that would otherwise move toward Oracle NetSuite, Sage, Microsoft Dynamics and other players on the bet they'd grow into the functionality. ADP is also expanding in human capital management and could effectively follow the same playbook as Intuit.

    For Intuit, Enterprise Suite can also give it a way to leverage its unified platform, data quality and generative AI tools. That platform also gives Intuit the ability to move upstream.

    “Innovation for ERP – no matter for large or medium or small enterprises – needs to come from a modern platform, an enterprise application platform (EAP) that supports the generic use case of extension, integration and building proprietary automation, enabled by low-code / no-code,” said Constellation Research analyst Holger Mueller. He added that it equally requires building on a common data foundation that supports both analytics and AI: “Getting these two offerings right is critical and Intuit is making good progress on both ends.”

    Will Intuit ever threaten SAP, Workday, Oracle and Infor? No. But will it prolong the sales cycle to move up to those ERP packages? You bet.

    Recent enterprise application research:
