
Google Cloud Next: The role of genAI agents, enterprise use cases

Google Cloud pitched an agent-oriented vision for generative AI at Google Cloud Next and highlighted a bevy of emerging use cases going from pilot to production.

"We are now building generative AI agents," said Google Cloud CEO Thomas Kurian. "Agents are intelligent entities that take action to help you achieve specific goals."

These actions can range from helping a shopper find a dress and picking health benefits to supporting nursing shift handoffs, bolstering security defenses and building applications. The agents Google Cloud showed during the keynote were built with its Gemini large language model, but presumably other LLMs are possible via the company's Model Garden.

Google Cloud continues to "offer widely used first party, third party and open-source models," said Kurian. "Vertex AI can be used to tune, augment, manage and monitor these models."

In many ways, Kurian's riff about agents is Google Cloud's answer to Microsoft's Copilot stack and AWS' Q. What Google Cloud did was tie agents to business outcomes and processes that could be automated. "These agents would connect with other agents as well as humans," said Kurian.

Kurian added that genAI agents powered by Gemini models will be the connective tissue between all of Google Cloud's services.

Constellation Research analyst Holger Mueller summed up Google Cloud's approach with agents:

"In the AI race Google provides the right mix of assistants/agents (not the inflationary number of co-pilots like Microsoft) while providing the Über AI with Gemini Cloud Assist (which has the same ambitions like Amazon's AWS' Q). And all of that on the best hardware infrastructure from chips to intra-data center networking and public networking. Google Cloud is powered by Gemini, the most advanced LLM out there, and offers grounding services with Google Search. All in all Google keeps it lead of 3-4 years when it comes to custom algorithms on custom silicon."

Here's a tour of use cases by the type of agents being deployed on Google Cloud.

Customer agents. For enterprises, customer agents are viewed as extra sales and service people. These agents are able to listen carefully, understand your needs and recommend products and services.

Mercedes-Benz highlighted multiple customer agent experiences, both in the car and for customizing models to buy. "The sales assistant helps customers to seamlessly interact with Mercedes when booking a test drive or navigating through offerings," said Mercedes-Benz CEO Ola Källenius.

Enterprises cited by Google Cloud appeared to be gravitating toward genAI as a service engine. Discover Financial uses genAI to search and summarize procedures during calls and IHG Hotels & Resorts is building a travel planning tool for guests.

In addition, Target is optimizing offers and curbside pickup on its app and site, Best Buy is building an agent to troubleshoot product issues and manage order deliveries, and Paramount+ is using genAI to personalize viewing recommendations.

Google Cloud customer agents can be tailored by conversation flow, languages and subject matter and then know when to hand off to a human agent.

Employee agents. The returns on employee agents are relatively straightforward: Remove repetitive tasks so employees can be more productive. Employee agents can also streamline chores such as health benefits enrollment.

Most of the employee agent examples were tethered to Gemini models running through Google Workspace, but via Vertex AI extensions, models can connect to any external or internal API. Uber CEO Dara Khosrowshahi said employee agents were being built to aid support teams, summarize user communications and reduce marketing agency spending.
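To make the "connect a model to an internal API" idea concrete, here is a minimal, hedged sketch using function calling in the Vertex AI Python SDK, a close cousin of Vertex AI extensions. The project ID, model name and the get_order_status function are illustrative assumptions, not anything Google or Uber has published.

```python
import vertexai
from vertexai.generative_models import GenerativeModel, FunctionDeclaration, Tool, Part

vertexai.init(project="my-project", location="us-central1")  # hypothetical project

# Hypothetical internal API described to the model as a callable function.
get_order_status = FunctionDeclaration(
    name="get_order_status",
    description="Look up the delivery status of a customer order.",
    parameters={
        "type": "object",
        "properties": {"order_id": {"type": "string"}},
        "required": ["order_id"],
    },
)

model = GenerativeModel("gemini-1.5-pro", tools=[Tool(function_declarations=[get_order_status])])
chat = model.start_chat()

response = chat.send_message("Where is order 12345?")
call = response.candidates[0].content.parts[0].function_call  # model asks the app to call the API
print(call.name, dict(call.args))

# The application calls its own API, then feeds the result back for a grounded answer.
response = chat.send_message(
    Part.from_function_response(name="get_order_status", response={"status": "out for delivery"})
)
print(response.text)
```

Note the division of labor: the model never calls the API itself; it returns a structured function call, the application executes it, and the result is returned to the model for the final reply.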

Also see: How Uber's tech stack, datasets drive AI, experience, growth

Other use cases included Dasa, a Brazil-based medical diagnostic company, using agents to surface relevant findings in test results; Etsy optimizing ad models; and Pepperdine University, which is using Gemini to provide captions and notes across multiple languages.

Gemini-powered agents in Workspace are also being used to analyze RFPs, contracts and other corporate documents. This analysis of large documents and paperwork automation was a key use case across companies such as HCA Healthcare and Bristol Myers Squibb.
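As a rough idea of how that kind of document analysis can be wired up, here is a hedged sketch that sends a contract or RFP stored in Cloud Storage to Gemini 1.5 Pro via the Vertex AI SDK. The bucket path, project ID and prompt are hypothetical; this is not how HCA Healthcare or Bristol Myers Squibb built their workflows.

```python
import vertexai
from vertexai.generative_models import GenerativeModel, Part

vertexai.init(project="my-project", location="us-central1")  # hypothetical project
model = GenerativeModel("gemini-1.5-pro")

# Hypothetical Cloud Storage path to an RFP; Gemini 1.5 Pro accepts PDF input.
rfp = Part.from_uri("gs://my-bucket/contracts/rfp.pdf", mime_type="application/pdf")

response = model.generate_content([
    rfp,
    "Summarize the scope, deadlines, pricing terms and evaluation criteria as bullet points.",
])
print(response.text)
```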

Home Depot is leveraging Gemini for its Sidekick application that manages inventory. See: How Home Depot blends art and science of customer experience

Creative agents. Like employee agents, creative agents have been tied to Workspace in the Google Cloud ecosystem. However, I saw an AWS demo where a marketer or ad agency team can create mood boards, pick models and compress content concepting from days or weeks to minutes.

For Google Cloud, creative agents are all about using Gemini to create slides, images and text. Carrefour is using Vertex AI to create dynamic campaigns across social networks quickly.

Procter & Gamble is using Google Cloud's Imagen model to develop images and creative assets. Canva is using Vertex AI to power its Magic Design for Video editing tools.

WPP is using Gemini 1.5 Pro to power its media activation tools.

The returns from creative agents can be powerful: enterprises can avoid media waste and its associated costs across a campaign. In addition, storyboards can be created and tweaked quickly.

Related: Middle managers and genAI | Why you'll need a chief AI officer | Enterprise generative AI use cases, applications about to surge | CEOs aim genAI at efficiency, automation, says Fortune/Deloitte survey

Data agents. A common use case is using generative AI to search, analyze and summarize document, video and audio repositories to surface insights. A good data agent is one that can answer questions and then tell us what questions we should be asking.

Suresh Kumar, CTO of Walmart, said the retailer is using data agents to comb BigQuery and surface insights for personalization, supply chain signals and better product listings.

Data agents are being deployed for drug discovery and medical treatments. Mayo Clinic is using data agents to search more than 50 petabytes of clinical data.

In addition, delivery carriers and airlines are using data agents to optimize shipments and routes.

Data agents can be deployed for data preparation, discovery, analysis, governance and to create data pipelines. These agents can also provide notifications when KPIs are being met or are in jeopardy.
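A bare-bones version of that KPI-watching pattern might look like the hedged sketch below: pull metrics from BigQuery with the standard Python client, then ask Gemini to flag KPIs in jeopardy. The project, dataset, table and column names are made up for illustration.

```python
from google.cloud import bigquery
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")  # hypothetical project
bq = bigquery.Client(project="my-project")

# Hypothetical KPI table: one row per region with revenue vs. target.
rows = bq.query(
    "SELECT region, revenue, target "
    "FROM `my-project.sales.daily_kpis` WHERE date = CURRENT_DATE()"
).result()

kpi_text = "\n".join(f"{r.region}: revenue={r.revenue}, target={r.target}" for r in rows)

model = GenerativeModel("gemini-1.5-pro")
response = model.generate_content(
    "For each region below, state whether the revenue KPI is met or in jeopardy, "
    "and suggest one follow-up question an analyst should ask:\n" + kpi_text
)
print(response.text)  # could be routed to email, chat or a ticketing system
```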

Constellation Research analyst Doug Henschen said the data agent argument is strong.

"The vision for Data Agents is pretty compelling, with a key point made by Google Cloud being that multi-modal opportunities lie ahead. Multi-modal GenAI-powered data agents will unlock combinations of structured and unstructured data including video, audio, images and code and correlations with structured data. One scenario that Alphabet CEO Sundar Pichai shared was that of an insurance company adjuster that might combine video, images and text to automate a claims process. With BigQuery at the center, Google Cloud foresees data agents applying multiple engine to data, whether SQL, Spark, search or whatever to solve business problems."

BT150 CXO zeitgeist: Data lakehouses, large models vs. small, genAI hype vs. reality

Code agents. Goldman Sachs CEO David Solomon said genAI's ability to boost developer productivity was promising. "There's evidence that generative AI tools for assisted coding can boost developer efficiency and we're excited about that," said Solomon, who added that genAI is being used to analyze content and market signals and boost client engagement.

Goldman Sachs rival JPMorgan Chase also sees a boom in developer productivity with genAI code assistance. See: JPMorgan Chase CEO Dimon: AI projects pay for themselves, private cloud buildout critical

Wayfair CTO Fiona Tan said the retailer is standardizing on Gemini Code Assist and seeing improvements via Gemini 1.5 Pro. Google Cloud itself is also leveraging Gemini Code Assist and has increased productivity by 30%.

Security agents. Anyone following the ongoing battle among Palo Alto Networks, CrowdStrike and Zscaler knows generative AI has a big role in security. Google Cloud said that Palo Alto Networks will build on top of Google Cloud AI.

Google Cloud said security agents are designed to incorporate data and intelligence to serve up insights and incident response faster. The win is that generative AI can create a multiplier effect for cybersecurity analysts by analyzing large samples of malicious code.

Charles Schwab and Pfizer were cited as Google Cloud security customers. The goal of a security agent is to identify and address threats, summarize and explain findings, and recommend next steps and remediation playbooks quickly. Ultimately, security agents will automate responses.

Constellation Research analyst Chirag Mehta analyzed Google Cloud's security strategy in a research note. He said:

"As a Google Cloud prospect or customer, take a comprehensive inventory of your current security tools landscape, encompassing Google Cloud and its partner ecosystem. Engage with Google Cloud and security tool vendors to discuss their roadmaps for Google Cloud, with a specific focus on how they plan to leverage AI to address your unique requirements. Additionally, consider exploring tools that offer multi-cloud support, regardless of your primary cloud provider, to future proof your security infrastructure."

 


Intel launches Gaudi 3 accelerator with availability in Q2

Intel said its Gaudi 3 AI accelerator will be available in the second quarter with systems from Dell Technologies, HPE, Lenovo and Supermicro on tap. Intel, along with AMD, is hoping to give Nvidia some competition. 

The chipmaker's Gaudi 3 launch, announced at the Intel Vision conference, is the linchpin of Intel's plans to garner AI training and inference workloads and take share from Nvidia.

According to Intel, Gaudi 3 delivers 50% better inference and 40% better power efficiency on average than Nvidia's H100, at a lower cost. It's worth noting that Nvidia has outlined its Blackwell GPUs and accelerators, which leapfrog H100 performance.

Nevertheless, model training will be a balancing act between speed and compute costs. Enterprises will use a bevy of options for AI workloads including Nvidia, AMD and Intel as well as in-house offerings from AWS with Trainium and Inferentia and Google Cloud TPUs.

Key points about Gaudi 3:

  • Intel Gaudi 3 is manufactured on a 5nm process and uses its engines in parallel for deep learning compute and scale.
  • Gaudi 3 has a compute engine of 64 AI-custom and programmable tensor processor cores and eight matrix multiplication engines.
  • Memory boost for generative AI processing.
  • 24 200-gigabit Ethernet ports integrated into Gaudi 3 for networking speed.
  • PyTorch framework integration and optimized Hugging Face models (see the hedged sketch after this list).
  • Gaudi 3 PCIe add-in cards.
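On the PyTorch point above, code typically targets Gaudi through Intel's Habana PyTorch bridge, which registers an "hpu" device. The sketch below is a minimal, hedged illustration; it assumes the Gaudi software stack is installed, and the exact API can differ by release.

```python
import torch
import torch.nn as nn
# Intel's Habana bridge registers the "hpu" device type with PyTorch (assumes the Gaudi stack is installed).
import habana_frameworks.torch.core as htcore

device = torch.device("hpu")
model = nn.Linear(1024, 1024).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(32, 1024, device=device)
y = torch.randn(32, 1024, device=device)

for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    htcore.mark_step()   # flush the lazily built graph to the accelerator
    optimizer.step()
    htcore.mark_step()

print(loss.item())
```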

To go along with the Gaudi 3 launch, Intel said it will create an open platform for enterprise AI along with SAP, Red Hat, VMware and other companies. It is also working with the Ultra Ethernet Consortium and will launch a series of network interface cards and AI connectivity chiplets.

 


Consider HPE Greenlake in your Hybrid Cloud IT Investment | CR ShortList Spotlight

Paging all CXOs 📣 If you use the Constellation ShortList portfolio to narrow your search for leading enterprise technologies, don't miss this interview!👇

In 2024, we selected Hewlett Packard Enterprise Greenlake as one of the leading #transformation target platforms. R "Ray" Wang sits down with Fidelma Russo, CTO of HPE, to talk through why CXOs should strongly consider using HPE Greenlake to reach their digital transformation goals.

Watch the interview on YouTube: https://www.youtube.com/embed/Z06C3yDRius?si=jX4dAiv1__oSOLmJ

MongoDB Atlas expands Google Cloud Vertex AI integration, eyes vertical use cases

MongoDB expanded integrations with Google Cloud's Vertex AI, BigQuery, Google Distributed Cloud and Google Cloud Manufacturing Data Engine.

The expanded collaboration between MongoDB and Google Cloud boils down to a common theme: Enterprises need more seamless ways to build generative AI applications with their proprietary data.

MongoDB's news landed as Google Cloud kicked off its Google Cloud Next conference in Las Vegas with a barrage of new product features and announcements, with Vertex AI and BigQuery as the headliners.

Key items in the MongoDB and Google Cloud expanded partnership include:

  • Google Cloud Vertex AI will have an extension for MongoDB Atlas and Spark integration with BigQuery.
  • MongoDB Atlas will be integrated into the Google Cloud Manufacturing Data Engine, which is focused on the manufacturing vertical.
  • MongoDB joins Google Cloud's Industry Value Network, which is designed to expand industry-focused AI. MongoDB and Google Cloud are also working on industry integrations for retail.
  • MongoDB Atlas Search Nodes are generally available on Google Cloud.
  • And MongoDB Enterprise Advanced on Google Distributed Cloud is aimed at regulated industries that need to comply with data privacy requirements.

For MongoDB, a Google Cloud partner of the year, the partnership with the No. 3 cloud provider gives it more reach into key industries. Enterprises are increasingly looking to multiple language models to fine-tune for industry-specific applications while keeping first-party data secure.
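One common way the "genAI apps on proprietary data" theme plays out is retrieval over operational data with Atlas Vector Search, which can then feed a model on Vertex AI. The hedged sketch below uses pymongo's $vectorSearch aggregation stage; the connection string, index name, collection and embedding are placeholders.

```python
from pymongo import MongoClient

client = MongoClient("mongodb+srv://user:pass@cluster0.example.mongodb.net")  # hypothetical URI
products = client["retail"]["products"]

query_vector = [0.0] * 768  # replace with a real embedding of the user's question (e.g. from Vertex AI)

pipeline = [
    {
        "$vectorSearch": {
            "index": "product_embeddings",   # hypothetical Atlas Vector Search index
            "path": "embedding",
            "queryVector": query_vector,
            "numCandidates": 100,
            "limit": 5,
        }
    },
    {"$project": {"_id": 0, "name": 1, "description": 1}},
]

for doc in products.aggregate(pipeline):
    print(doc)  # candidate context to pass to an LLM for a grounded answer
```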

MongoDB and Google Cloud have been partners since 2018 and have thousands of joint customers.



Google Cloud Next 2024: Google Cloud aims to be data, AI platform of choice

Google Cloud outlined a series of services and enhancements across its platform in a bid to make it easier for enterprises to bring their data to generative AI models, build applications and deploy them at scale. Google Cloud's data analytics services will unify under the BigQuery umbrella and Vertex AI becomes the venue to tune, orchestrate and deploy models.

Ultimately, Google Cloud is bidding to be the AI-optimized stack of choice that will enable companies to deploy a series of agents that can automate workflows and carry out tasks. And by the way, Google Cloud is offering model choices, but is embedding Gemini everywhere.

Google Cloud CEO Thomas Kurian said:

"We're building AI to be an open, vertically optimized stack. This stack consists of advances with our AI supercomputer, which is now used by over 90% of AI unicorns. There are advances to improve the efficiency and scale of training and serving and a wide portfolio of different kinds of system optimizations, allowing us to provide developers and organizations with the market leading cost performance for training and inferencing models."

Kurian added that Google Cloud Next 2024 will include more than 1,000 new products and features across its platform. "We continue with our strategy to help organizations drive digital transformation using our cloud platform and AI," said Kurian.


The announcements filling out this vision are plentiful, but here are the big launches.

  • At the infrastructure layer, Google Cloud outlined the latest GPU/TPU support, PyTorch enhancements, Axion, Google's first custom-designed Arm-based processor, and confidential computing enhancements. Distributed Cloud will be aimed at sovereignty workloads and AI-anywhere use cases.
  • Google Axion delivers up to 50% better performance and 60% better energy efficiency compared to x86 based instances. 
  • Google Cloud said that Nvidia's latest Grace Blackwell GPUs will be served up as instances in early 2025. The company also outlined A3 Mega, a generally available instance that has twice the bandwidth per GPU compared to A3 instances.
  • On the model choice front, Google Cloud announced general availability for Gemini 1.0 and public preview for Gemini 1.5 Pro, as well as Grounding with Google Search, which provides fresh, grounded information. Google Cloud also announced Imagen 2.0 editing in general availability, a private preview of text-to-live-image, LangChain on Vertex AI, and Vertex AI prompt management and assistance.
  • Gemini is being added to BigQuery, Databases, Vector indexing and Looker to name a few. Gemini is also being added to Google Cloud's security offerings. 
  • Google Workspace will get Google Vids, a way to collaborate and tell stories at work via Gemini, Vertex AI and Workspace integration, as well as a $10 per user per month add-on SKU for Gemini in Meet and Chat.
  • For databases, Gemini will power an AI database assistant across Google's offerings. AlloyDB will also add an extension for faster vector search, and AlloyDB will be able to retrieve information for LLMs with natural language. Vector support and integration with LangChain will be deployed across Google databases. (A hedged vector search sketch follows this list.)
  • On data analytics, BigQuery will become a unified platform for data to AI to support multimodal data and workloads with Gemini and a series of AI integrations. Specifically, BigQuery upgrades include a metastore for a unified data foundation, unified governance and cataloging and the addition of Apache Spark and Apache Kafka integration. Google Cloud is putting all of its data analytics services together on BigQuery.

  • With security, Google is using Gemini for SecOps and Threat Intelligence as well as adding an enterprise browser for Chrome.
  • Developers will get Gemini Code Assist, Cloud Assist for cloud operations and insights, and enhanced integrations with partners such as Datadog, DataStax, Elasticsearch, HashiCorp, SingleStore and Redis.
  • Customer references cited by Google Cloud for AI adoption include Bayer, Best Buy, Discover Financial and TD Bank to name a few. Also see what Equifax and Wayfair have done with Google Cloud. 
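Picking up the database bullet above: because AlloyDB is PostgreSQL-compatible, vector retrieval generally follows the familiar pgvector pattern. The sketch below is a hedged illustration, not Google's published extension API; the connection details, table, column and embedding are invented.

```python
import psycopg2

# Hypothetical connection to an AlloyDB (PostgreSQL-compatible) instance.
conn = psycopg2.connect(host="10.0.0.5", dbname="catalog", user="app", password="secret")
cur = conn.cursor()

query_embedding = [0.0] * 768  # replace with a real embedding of the user's question
vector_literal = "[" + ",".join(str(v) for v in query_embedding) + "]"

# pgvector-style nearest-neighbour search over a hypothetical `items` table.
cur.execute(
    "SELECT id, title FROM items ORDER BY embedding <-> %s::vector LIMIT 5",
    (vector_literal,),
)
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()
```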

The race with AWS and Microsoft Azure

Google Cloud is No. 3 among the hyperscale cloud vendors, but Kurian said the company has a real play for deploying AI models and its data platform. "Customers want three things. They want a platform that allows them to build and deploy AI models at scale. They want that platform to be differentiated. They want an organization that owns its own models and is able to vertically optimize the models. And they want integrated AI across the portfolio," said Kurian.

Kurian also touted model choices, which is something AWS recently spoke about with Bedrock. Kurian said grounding of models will also be critical. He said:

"We're introducing grounding with Google search. Not just grounding on your own enterprise data, and then evaluating. We provide that platform that offers a set of services that works with all the models. People are able to choose the platform and then choose the latest model or the best model for their needs. Many organizations now recognize that they need to take an enterprise AI platform, not pick a model. Models are changing week to week, month to month. They need a common foundational platform to do that."

Ultimately, Kurian is betting that Google Cloud can gain ground with a series of AI agents that can carry out tasks, understand processes and context and orchestrate workflows. He said customers are using a combination of Google Cloud building blocks "not just to do individual tasks but to orchestrate process flow."

Ultimately, cloud providers are looking to evolve to become the model orchestration layer for AI.

Google Cloud agents everywhere

The linchpin of Google Cloud's agent vision revolves around Vertex AI, which will include an Agent Builder and Model Builder to go along with a wide selection of models.

Agent Builder will feature no code, low code and full code varieties to orchestrate, ground and augment models, take action and process documents.

In a blog post, Kurian outlined the importance of agents to the generative AI landscape. He said agents can understand multi-modal information and learn over time to handle transactions and business processes. Best Buy, Etsy, The Home Depot and ING Bank are utilizing agents.

Kurian and Google Cloud executives said these agents can be deployed in multiple contexts and venues including contact center, security, healthcare, retail and media to name a few. The consolidation of data analytics under BigQuery in a unified platform will hand off to Vertex AI.

Google Cloud will also layer in a series of MLOps services to bring generative AI from pilots to production including prompt management to create a feedback loop to continually improve and revise prompts.

Constellation Research's take

Constellation Research CEO Ray Wang said the following announcements stuck out on Day 1 of Google Cloud Next:

  • Gemini is now across software development, application life cycle, security, data analytics, BI, and databases.
  • Model selection. Customers want to bring their own models, and being able to choose from Gemini, Llama, Gemma and Anthropic is what customers want.
  • Google Cloud is providing choice in chips, from TPUs to Nvidia GPUs to CPUs in the data center.
  • Security models allow for air-gapped capabilities, meaning they work well for government. Threat intelligence was beefed up.
  • Google Workspace is getting new features, from Vids to AI meetings that will take notes in 69 languages.

Constellation Research analyst Doug Henschen covered many of the announcements; here's his take on the Google Cloud Next news:

Gemini integration into BigQuery, Looker and GCP's databases. Henschen said:

"It’s significant both in terms of the depth and breadth of GenAI capabilities promised both within each product and across the entire portfolio of services. Focusing on BigQuery, the breadth of Gemini assistance is a differentiator, spanning from ingestion, data preparation, cleansing, and low-code data pipeline building to query recommendation, query cost and performance optimization, semantic search, and Python and SQL code generation."  

What Gemini brings to the data platform. Henschen said:

"GenAI capabilities are showing up in a lot of databases, but Google is going deeper and doing it more comprehensively across its portfolio. The capabilities are still in preview, mind you, but Google has also updated the underlying model since last year’s Duet announcements. Google says Gemini will deliver higher accuracy and better performance at a lower cost. As with most GenAI features, the promise is making sophisticated tasks easier for untrained users while improving the productivity of more experienced users."

Google Cloud's AI strategy to date. Henschen said:

"Google's data and AI strategy is about delivering a comprehensive platform with well-integrated capabilities so you can do it all without having to move data around or cobble together disparate services. BigQuery, in particular, has become the focal point, with tight integration with Vertex AI, for  AI, ML and GenAI, and Google Looker, for analytics. Another differentiator is  multi-cloud support through BigQuery Omni access to other clouds as well as enterprise applications, such as Salesforce, with a new Zero-ETL capability."

Google Cloud's analytics strategy relative to rivals. Henschen said:

"It’s not just about tacking on GenAI features. The big push across all the hyperscale clouds it to offer a single platform for data that seamlessly supports all your AI, GenAI, analytics and wider application development and operational needs. That’s also Microsoft’s push with Fabric and AWS’s push with its extensive portfolio of services. I give Google credit for bringing together a well-integrated platform centered around BigQuery and Vertex. AWS, for its part, has been doing more integration across its vast portfolio of services in recent years. Microsoft Fabric is very new and unproven at this point, but it’s also messaging about doing it all with one platform, putting an emphasis on familiarity and ease of use."   

What Google should do next. Henschen said:

"I’d say Google just needs to stick to its strategy, which has been very consistent. Google BigQuery and Vertex AI are strong, well-integrated services that are only getting stronger.  Google Cloud is playing catch-up a bit on the transactional side with AlloyDB, which is a much more recent introduction that goes up against Amazon Aurora. I’m sure Google will keep on improving that product while also retaining the open partnerships it has had with leading independents, such as MongoDB. Google leads with its strengths, but openness to third-party vendors and model providers  has been another consistent and important part of the Google Cloud strategy."   


JPMorgan Chase CEO Dimon: AI projects pay for themselves, private cloud buildout critical

JPMorgan Chase CEO Jamie Dimon issued his annual shareholder letter and provided an incremental update on the company's artificial intelligence efforts as well as private cloud buildout.

In the letter, Dimon covered the expected interest rate outlook and geopolitical uncertainty, but also spent a good bit of space on AI, generative AI and transitioning to the cloud, which enables JPMorgan Chase to roll out services faster.

JPMorgan Chase: Digital transformation, AI and data strategy sets up generative AI (download PDF) | JPMorgan Chase: Why we're the biggest tech spender in banking

Here are some of the technical highlights that update JPMorgan Chase's AI strategy as outlined in Constellation Research's case study (download PDF).

  • AI's big picture. Dimon said: "We are completely convinced the consequences will be extraordinary and possibly as transformational as some of the major technological inventions of the past several hundred years: Think the printing press, the steam engine, electricity, computing and the Internet, among others."
  • JPMorgan Chase has 2,000 AI and machine learning experts and data scientists.
  • The company has more than 400 use cases in production. "We're also exploring the potential that generative AI (GenAI) can unlock across a range of domains, most notably in software engineering, customer service and operations, as well as in general employee productivity," said Dimon, who added that generative AI will help the company "reimagine entire business workflows."  Also: BT150 CXO zeitgeist: Data lakehouses, large models vs. small, genAI hype vs. reality
  • JPMorgan Chase will continue to invest in AI and "many of these projects pay for themselves." Dimon added that AI has the potential to augment most jobs, reduce roles and create new ones.
  • To enable new AI capabilities, JPMorgan Chase has to migrate its data estate to the public cloud. "These new data platforms offer high-performance compute power, which will unlock our ability to use our data in ways that are hard to contemplate today," said Dimon.
  • AI is being incorporated into JPMorgan Chase's risk and control frameworks to counter threats.
  • Multicloud is critical to avoid lock-in. Dimon said JPMorgan Chase's cloud plans will include multiple clouds--private and public.
  • JPMorgan Chase is building 4 new private cloud data centers for $2 billion.
  • Most workloads and data will be in public and private clouds. "To date, about 50% of our applications run a large part of their processing in the public or private cloud. Approximately 70% of our data is now running in the public or private cloud. By the end of 2024, we aim to have 70% of applications and 75% of data moved to the public or private cloud," said Dimon. "The new data centers are around 30% more efficient than our existing legacy data centers. Going to the public cloud can provide 30% additional efficiency if done correctly (efficiency improves when your data and applications have been modified, or “refactored,” to enable new cloud services)."

Related: Middle managers and genAI | Why you'll need a chief AI officer | Enterprise generative AI use cases, applications about to surge | CEOs aim genAI at efficiency, automation, says Fortune/Deloitte survey


Splunk’s Acquisition by Cisco Accelerates Convergence of Network, Security, and Observability, Fueled by AI

Two questions have haunted me for two decades: first, can we really address security without addressing networks? Second, are observability and security like oil and water? With customers adopting SASE (Secure Access Service Edge) solutions, we have begun to see a convergence of networks and security. Now, with Splunk's acquisition, Cisco has answered my second question.

Cisco officially closed its acquisition of Splunk a few days ago. Last week, the leadership from Cisco and Splunk communicated their joint vision in an executive roundtable that I attended. The roundtable featured Liz Centoni, Cisco EVP and Chief Customer Experience Officer; Tom Casey, SVP of Splunk Products & Technology; and Jeetu Patel, Cisco EVP and GM of Security and Collaboration. Together, they offered a glimpse into how they see the future of Cisco and Splunk working together to better serve their customers. Here's a breakdown of the key points, my brief analysis, and recommendations for customers.

Leaders' Vision: A Unique Combination of Security, Observability, and AI

The core message from Cisco and Splunk was clear: data, observability, and AI are the cornerstones of modern security.  Jeetu Patel emphasized,

"If you want to be a world-class security company, you have to be a world-class AI company, and if you want to be a world-class AI company, you have to be a world-class data company."

What excites both companies is the unique combination their offerings bring. Cisco's strength in networking and security complements Splunk's expertise in data platform and observability. As Tom Casey highlighted, "There's a lot of complements between the two areas."

This unique combination, according to Jeetu, "doesn't exist in the market today."  A fully integrated security and observability platform with AI at its core has the potential to revolutionize how organizations approach security.

Jeetu outlined three key objectives: enhancing efficacy through generative AI, improving user experience, and optimizing economics. In his closing remarks, he reiterated the companies' commitment to customer-centric innovation, “We will always start from the customer first and work backwards. We love our customers keeping us honest and making sure that we can actually drive to the outcomes.”

My Insights

  • Cisco's acquisition of Splunk presents a unique opportunity to address customers' end-to-end cybersecurity needs, leveraging the power of AI and data analytics. Although integrations require time and effort, when executed effectively, they have the potential to solve complex challenges and enhance operational efficiency.
  • Cisco and Splunk are culturally different companies: Cisco is a mature networking player known for its robust partner ecosystem, while Splunk boasts a broader developer reach with its security and observability offerings. Harnessing each other's strengths, they can foster a thriving cybersecurity ecosystem that paves a path for companies to build compelling solutions on their platform. A complementary acquisition typically benefits customers more than an overlapping one.
  • Traditionally, observability and security were seen as distinct areas. Yet, their merging offers Cisco and Splunk an unprecedented opportunity to tackle enduring cybersecurity issues, aggravated by data silos. In a time when organizations struggle with a scarcity of cybersecurity expertise, Cisco's ambition to democratize cybersecurity via AI reflects prevailing industry patterns we observe, placing emphasis on enhanced tool adoption and fortified security posture.

Recommendations for CxOs

  • As you navigate the convergence of networking and security, and formulate your AI strategy, reassess your current cybersecurity landscape. Look for opportunities to streamline your tools to drive increased adoption. Remember, increased adoption is superior to the allure of extravagant features.
  • Advocate for improved integration between Cisco and Splunk by articulating specific business outcomes you aim to achieve. Communicate your expectations to both companies' leadership regarding the enhancements you anticipate in the coming weeks and months. Consider attending their conferences this summer, Cisco Live and Splunk .conf24, to gain insights into their respective product roadmaps and provide your valuable feedback.
  • As you craft your security strategy and execution plan, check out our "11 Top Cybersecurity Trends of 2024 and Beyond." (If you're a vendor and don't have access to the report please contact me for a courtesy copy.) Drawing insights from numerous conversations with security, technology, and business leaders as well as extensive market research, this cybersecurity trends report offers a holistic view into the broader cybersecurity landscape. It also offers tangible recommendations for CxOs who are frantically navigating the cybersecurity maze to design and operationalize their cybersecurity strategy, with the objective to improve their defenses against increasingly sophisticated attacks.

BT150 CXO zeitgeist: Data lakehouses, large models vs. small, genAI hype vs. reality

Enterprises need to focus on data lakehouse strategies in 2024 to properly take advantage of generative AI; model architecture will be critical to managing large and small models; fine tuning is more difficult than you'd think; and CXOs were weary of database vendors glomming on to genAI hype.

Those were some of the takeaways from Constellation Research's April 5 BT150 CXO meetup.

These gatherings, held under the Chatham House Rule, are a venue to share information and emerging trends.

Here's a look at the topics from the meetup.

Fine tuning isn't as easy as you'd think. While fine tuning and customizing a foundational model should be easier than training a large language model from scratch, the process is more involved. The tooling isn't mature enough yet for fine tuning at scale and enterprises are evaluating where to host data.
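As a sense of what "more involved" means, even a parameter-efficient approach such as LoRA, shown in the hedged sketch below with Hugging Face's peft library, only covers the training step; data curation, evaluation, safety review and hosting are where most of the effort and tooling gaps sit. The model name and hyperparameters are placeholders, not recommendations.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "google/gemma-2b"  # placeholder open-weights model
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Train small low-rank adapters instead of the full model.
lora = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # attention projections; names vary by architecture
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # typically well under 1% of the weights are trainable

# The training loop/Trainer, data preparation, evaluation and serving are omitted;
# in practice, that surrounding tooling is where most of the work (and immaturity) lives.
```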

Get that data lakehouse. Enterprises are coming around to the reality that they need to have a data strategy before even thinking about AI. Considerations include:

  • Ability to move data to models in real time.
  • Need to combine enterprise data with third party data.
  • Costs.
  • Need for real-time data ingestion.
  • Build your own enterprise data lakehouse.
  • A benefit of a data lakehouse strategy is that the business will also see business intelligence and analytics gains, along with the promise of big data.
  • Take 2024 to nail the data lakehouse strategy to prepare your company for AI.

Foundational model strategy. CXOs and Constellation Research analysts expect industry- and role-specific models to emerge. In addition, enterprises will need model strategies that incorporate language specific to their industries and companies. Enterprises will need to think through model architectures to manage models for finance, HR, manufacturing and other functions.

Kill switches. There was a good debate on our call about the need for an AI kill switch. On one side, models will be hacked and when that happens, you'll need to be able to pull the plug and recover. The argument against the kill switch concept is that other parts of the enterprise don't automatically shut down.

Database vendors and genAI hype. CXOs were exhausted by transactional database vendors that are glomming on to the generative AI hype. These vendors are mostly concerned about you moving your data away from them. Wait until you see a feasible generative AI solution from database vendors before falling for the hype.

Small models vs. large ones. LLMs could be seen by enterprises as boiling the ocean and many CXOs and vendors are talking about smaller models that are specific to a task or process. The reality is that models will require a hybrid approach. Some models will be large, some small and some will be run locally too.

Model suites will always win? CXOs will take a best-of-breed approach to models due to conditions and hardware limitations, but ultimately the suite approach is likely to win. Specialist models will be better for some tasks, but economies of scale over time favor generalists and suites. The feedback loop of more data and context is likely to favor large models. Beware of small-model chatter from vendors without a comprehensive AI strategy or access to a large language model.

Generative experiences with avatars. One CXO was piloting a series of avatars to personalize generative experiences by language and use case. This avatar-meets-genAI approach appears to be positive for both the host and the participant. The CXO noted that starting with a framework, governance and privacy controls is a key enabler for generative AI use cases.

AI will create interesting dynamics in the labor pool. Analytics and data science roles are likely to be impacted despite what recent surveys have indicated. A college student with the free time to experiment with prompts can replicate the experience of someone doing predictive analytics and data science for decades. Simply put, the entire skill model for enterprises is going to change.

High performance computing will change due to generative AI. HPC is going to have to evolve since it is in the middle of the generative AI revolution. Nvidia's Blackwell launch featured a series of GPU clusters that will likely compete with supercomputers. Generative AI workloads will fundamentally change compute.


Will demographics mitigate the genAI hit to workers?

The debate over generative AI and its impact on the workforce is just heating up since the technology hasn't scaled at most enterprises. One of the biggest questions to ponder is whether genAI's impact will be muted by demographics.

Last week, we covered the intersection of tech layoffs, generative AI and middle management, which is taking the biggest hit. The Daily Show's Jon Stewart also had an interesting riff on the "promise of AI."

It's easy to conclude that generative AI is going to take jobs from humans. But there's another argument that genAI will be needed just to maintain and improve productivity levels because there will be fewer workers. There’s a demographic donut hole in the workforce that may be partially ameliorated by genAI.

During Paychex's third quarter earnings call, CEO John Gibson highlighted how small and mid-sized businesses were struggling to find employees. Gibson also noted that the pace of retirements from Baby Boomers is only going to pick up. Meanwhile, generation X doesn't have the numbers to fill the institutional knowledge gap.

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

According to the Bureau of Labor Statistics, the civilian labor force participation rate hasn't recovered to its pre-COVID-19 pandemic level. And the US Census Bureau said that 16.8% of the population is 65 and older, a share projected to grow to 22% by 2040.

Gibson said Paychex is using AI to drive insights on retention as well as integrations with Indeed. Paychex's partnership with Visier will offer compensation insights.

He added:

"The simple fact is we have a generational change happening in the labor force. Participation rates remain below pre-pandemic levels and it's going to be very difficult given the rate of retirements that we're seeing in Baby Boomers to really see that change. And what you see in the prime age workers were actually at record highs. The problem is not enough prime age people to fill all the opportunities."

"We need to do more to allow businesses to invest in productivity and drive productivity enhancements and that's not going to replace workers. That's going to enable them to get the work done with less workers than are going to exist in the marketplace. I think this is a systemic problem."

Related: Middle managers and genAI | Why you'll need a chief AI officer | Enterprise generative AI use cases, applications about to surge | CEOs aim genAI at efficiency, automation, says Fortune/Deloitte survey

Gibson said there's a productivity gap that'll occur as younger workers replace older ones. The only nuance here is that older workers may not all leave the workforce on schedule.

Pew Research found that 19% of Americans ages 65 and older were employed in 2023, nearly double the share from 35 years ago. The typical worker age 65 or older earns $22 an hour.

Workers ages 75 and older are the fastest-growing age group in the workforce. Today, about 9% of adults that age are employed. BlackRock CEO Larry Fink should be excited about that development given he sounded alarm bells about retirement funding and noted that one fix for Social Security would be working longer. Living to age 80 isn't terribly uncommon today.

How these workforce dynamics play out is anyone’s guess. Today, it’s hard to reconcile layoffs in technology with rosier IT employment prospects outlined by CompTIA.

Bottom line: The generative AI hit to the workforce is inevitable, but there's a more nuanced position to take amid the doom and gloom. One thing is certain: Generative AI is going to be a public policy issue.


Wipro names Pallia CEO to replace Delaporte

Wipro has named Srini Pallia CEO, effective immediately, replacing Thierry Delaporte, who stepped down to pursue other interests.

Pallia most recently was CEO of Wipro's Americas 1 unit and has been with the company for more than three decades. He was responsible for Wipro Americas' vision, strategy and industries, and was president of Wipro's consumer business and head of business applications services. Delaporte will remain with Wipro through the end of May for the transition to Pallia.

Wipro's Americas 1 unit includes healthcare and medical devices, consumer goods and life sciences, retail, transportation and services, communications, media and information services, technology products and platforms in the US and Latin America. Wipro's Americas 2 division is focused on financial services, manufacturing, technology and energy and utilities in the US and Canada.

The US accounted for 56% of Wipro's fiscal 2023 revenue.

In a statement, Pallia said he was "excited to build on the strong foundation established by Thierry and lead Wipro on its next growth trajectory."

Delaporte led Wipro through a transformational phase in his four years at the helm. The company has launched its ai360 strategy, which revolves around embedding AI throughout its services and offerings.

For the nine months ended Dec. 31, Wipro reported revenue of $8.12 billion with profit of $993 million. Revenue was modestly higher relative to a year ago. For fiscal 2023, Wipro reported revenue of $11 billion, up 14% from 2022, with profits of $1.38 billion.
