
Amazon Bedrock gets custom model import, evaluation tools, new Titan models


Enterprises will be able to import their own large language models (LLMs) into Amazon Bedrock and evaluate models based on use cases, said Amazon Web Services. AWS also launched two Amazon Titan models.

The ability to import custom models and evaluate them plays into the broader themes of generative AI choice and orchestration. AWS' Bedrock bet is that enterprises will use multiple models and need a neutral platform to orchestrate them.

In addition, enterprises are showing broad interest in using open-source models and then customizing with their own data.

According to AWS, Amazon Bedrock Custom Model Import will give companies the ability to import and access their custom models via API in Bedrock. These custom models could be added to the various model choices in Bedrock.

Related: AWS' Matt Wood on model choice, orchestration, Q and evaluating LLMs

Enterprises will be able to add models to Amazon Bedrock that were customized in Amazon SageMaker or by a third-party tool provider, with automated validation. AWS said imported custom models and Amazon Bedrock models will use the same API. Custom Model Import is in private preview and supports the Flan-T5, Llama and Mistral open model architectures, with more planned.
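Since AWS says imported custom models share the same API as first-party Bedrock models, invoking one should look like any other `invoke_model` call against the `bedrock-runtime` client. A rough Python sketch; the model ARN, request-body schema, and `max_gen_len` parameter here are illustrative assumptions, since the payload format follows whatever the underlying model expects:

```python
import json

def build_invoke_args(model_id: str, prompt: str) -> dict:
    # Arguments for bedrock-runtime invoke_model; the body schema
    # (here a Llama-style prompt payload) varies by model family.
    return {
        "modelId": model_id,
        "contentType": "application/json",
        "accept": "application/json",
        "body": json.dumps({"prompt": prompt, "max_gen_len": 256}),
    }

def invoke_imported_model(model_id: str, prompt: str) -> dict:
    # Requires AWS credentials and access to the imported model.
    import boto3  # imported lazily so the sketch runs without the SDK installed
    client = boto3.client("bedrock-runtime")
    response = client.invoke_model(**build_invoke_args(model_id, prompt))
    return json.loads(response["body"].read())

# A custom model imported into Bedrock is addressed by its ARN, just like
# a first-party model ID (hypothetical ARN shown):
args = build_invoke_args(
    "arn:aws:bedrock:us-east-1:111122223333:imported-model/example", "Hello"
)
```

The point of the sketch is that nothing in the calling code changes when the model behind the ARN is a customer's own import rather than an AWS-hosted base model.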

AWS also said Model Evaluation in Amazon Bedrock is now available. Model Evaluation analyzes and compares models against various use cases, workflows and other evaluation criteria. AWS is also adding Guardrails for Amazon Bedrock so enterprises can control model responses.

As for the Titan models, AWS said Amazon Titan Image Generator, which has invisible watermarking, and the latest Amazon Titan Text Embeddings are generally available exclusively in Bedrock. Amazon Titan Text Embeddings V2 is optimized for Retrieval Augmented Generation (RAG) use cases.
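Embeddings models such as Titan Text Embeddings V2 feed RAG by turning documents and queries into vectors, with retrieval ranking documents by cosine similarity before the best matches are handed to the LLM. A minimal, self-contained sketch of that retrieval step, using toy hand-made vectors in place of real Titan embeddings:

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product over the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy 3-dimensional "embeddings"; a real pipeline would call the
# embeddings model to produce vectors for each document chunk.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.9, 0.1],
    "warranty terms": [0.5, 0.5, 0.0],
}
query = [0.85, 0.15, 0.05]  # embedding of the user's question

# Retrieve the closest document to ground the LLM's answer.
best = max(docs, key=lambda name: cosine(query, docs[name]))
```

In production the same shape holds; only the vectors come from the embeddings model and usually live in a vector store rather than a dict.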

 


Google Unveils Ambitious AI-Driven Security Strategy at Google Cloud Next'24


In eight years, the Google Cloud Next conference has grown from a few hundred attendees at a nondescript location in San Francisco to more than 30,000 attendees in one of the largest convention centers in Las Vegas, complete with a private concert in a football stadium. Google Cloud has come just as far, growing its revenue 5x in the last five years.
 
As conference attendees patiently waited in a long line at an overpriced Starbucks for their caffeine fix on the way to a keynote, rushed between keynotes, sessions, and the expo, and made a pit stop at a crummy food court before retiring to their hotel rooms, you could not go even a minute without hearing the word “AI.” The energy was evident and the enthusiasm palpable. In one of the most striking moments, during a developer keynote, a Google Cloud speaker referred to 2023 as “legacy.” That’s how much Google Cloud has changed.

Standing tall on a keynote stage in front of thousands of attendees, Thomas Kurian, the CEO of Google Cloud, announced the company's latest AI model, Gemini 1.5 Pro. A much taller screen behind him flashed a slide in large Google Sans font, and an audible gasp from the audience followed. The model has a context window of 1 million tokens, the largest in the industry, which translates to analyzing data of various modalities such as one hour of video, 30,000 lines of code, 11 hours of audio, or 700,000 words. Gemini now powers various Google Cloud offerings, including its security portfolio.

Source: Google Cloud

Google Cloud's Differentiation Strategy: AI and Security Take Center Stage

When questioned about competition in a Q&A session with analysts, Kurian, in a measured and deliberate answer, highlighted AI and security as Google Cloud's key differentiators. He also touted Google Cloud's open approach of giving customers options at each layer, whether they prefer first-party, third-party, or open-source AI models.

 
AI and ML are a natural fit for security, as they can help process billions of signals to identify attack patterns, profile adversaries, and respond to threats in real-time. Generative AI copilots have become increasingly prevalent in enterprise software, providing a conversational interface for end users to analyze extensive data and gain insights through simple English queries. Google Cloud has also embraced this trend, announcing Gemini to integrate AI capabilities across its product portfolio, including security offerings.

The Evolution of Google Cloud's Security Strategy

Google Cloud's security narrative traces back to Google's early days as a native cloud company, where scaling infrastructure to support billions of users securely was paramount. Google developed its security stack from hardware to applications, encompassing nearly every system layer. Google then brought this native security to Google Cloud to differentiate and compete against other public cloud providers, Amazon and Microsoft. It also paved the way for externalizing internal services as products.
 
In 2017, Google introduced security services like Identity Aware Proxy (IAP) and Data Loss Prevention (DLP), but they gained little traction. The 2018 announcement of Security Command Center (SCC) in beta marked a turning point, serving as a hub for protecting Google Cloud assets and integrating ISV partner solutions. While Google Cloud wasn't yet seen as a security solutions company, this marked the beginning of its SecOps journey.
 
In 2019, Google integrated Chronicle (with Backstory and VirusTotal) from Alphabet's X moonshot factory into Google Cloud, followed by the 2022 acquisition of Mandiant. These additions expanded Google Cloud's security portfolio but introduced disparate products with different purposes, target customers, and underlying cloud platforms.
Since then, Google Cloud has focused on unifying the experiences, expanding Chronicle into a SecOps platform with SIEM and SOAR functionality, and competing as a security vendor beyond just a cloud provider. The announcement of Security Command Center Enterprise at Next'24 marked a significant milestone, converging cloud security with enterprise security and supporting AWS and Azure on the same "security fabric" as Google's SecOps functionality. Additionally, Google consolidated all security teams under a single organization with a general manager overseeing security as a business beyond Google Cloud, making Next'24 a coming-out party for Google Cloud's security offerings.

Unifying Threat Intelligence and Security Operations

To better understand cybersecurity, let's draw a parallel with investment banking. Investment banks have equity research teams that analyze stocks and provide recommendations, while sales and trading teams execute transactions based on this research. They collaborate, sharing real-time market insights to inform trading strategies.
 
Similarly, in cybersecurity, threat intelligence analyzes adversaries, incidents, and malware by scanning vast data volumes. This intelligence is crucial for operations teams managing digital asset security. However, these teams often lack shared tools and unified underlying information.
 
Google Cloud aims to change this with a new risk management approach, unifying CNAPP and TDIR (threat intelligence, detection, and response) on a single "security fabric" platform. By combining multiple categories, Google is attempting to define a new category on its own, a feat no other security vendor has successfully achieved. While categories can be limiting and primarily benefit vendors, Google does have an opportunity to differentiate, but it’s not going to be easy.

Source: Google Cloud

Securing Chrome as an Endpoint

As customers access applications through web browsers, Google recognizes the importance of securing Chrome as an endpoint to provide robust protection. With Chrome's widespread adoption in the enterprise sector, customers seek enhanced security features without requiring a separate browser. By treating Chrome as a secured endpoint, Google can significantly enhance endpoint security, even if operating system vulnerabilities remain. This approach also opens up a new cybersecurity category for Google to explore.
 
At Next'24, Google announced the general availability of Chrome Enterprise Premium, featuring Data Loss Prevention (DLP) and context-aware proxy capabilities. Notably, Google Cloud introduced Identity Aware Proxy (IAP) and DLP in 2017. While many customers have initiated their Secure Access Service Edge (SASE) journey, most won't complete it in the near future. Instead, they're seeking incremental solutions like proxies and DLP to supplement their existing tools. Google Cloud can capitalize on this demand with its offerings and those from its ecosystem partners, providing a comprehensive security suite.

Balancing AI Innovation with Core Cloud Security Investments

Google has acknowledged that certain areas, such as NGFW (Next-Generation Firewall), require significant investment without clear differentiation, and has opted to partner with Palo Alto Networks instead of building its own software firewall. This approach allows Google to serve its customers while avoiding unnecessary development efforts.
 
Humans are the weakest link in security. Google's previous lack of focus on Identity and Access Management (IAM) has been surprising, given its critical role in cybersecurity. However, Google is now addressing this underinvested area, introducing Privileged Access Manager in preview at Next'24. Leveraging AI, Google has the opportunity to revolutionize identity management and automate manual workflows, reducing the risk of severe breaches often caused by human error.
 
In addition to IAM, Google is investing in other critical security areas, such as Cloud Armor for DDoS protection and Autokey for customer-managed encryption keys. It's crucial for Google to continue differentiating its core cloud security offerings, ensuring the core platform receives the necessary investment, rather than disproportionately focusing on AI.

Challenges ahead 

Google Cloud has laid a strong foundation with its security portfolio, and its intentions are commendable. Nevertheless, there are challenges that need to be addressed to fully realize its vision.

Crafting a Unified and Compelling Security Story

Google Cloud's security portfolio messaging is overly product-focused, rather than solution-centric. Customers are looking for solutions; they don’t want to solve a puzzle of mapping problems to products. On top of that, by combining CNAPP and TDIR, Google is creating a unique category, but this complexity can make it difficult for customers to understand how Google Cloud security offerings align with their specific needs, especially if they don't primarily use Google Cloud. This is particularly challenging for organizations that don't consider Google a security vendor, despite Google's multi-cloud security offerings. 
 
Google will have a decision to make: Do they see their security offerings as a differentiator that draws customers to Google Cloud, or as products that compete on their own merits regardless of the underlying cloud platform? Ideally, Google can achieve both, being the most secure cloud and a highly differentiated multi-cloud security vendor, but it needs to articulate a unified and compelling story that supports that vision.

Hybrid Environments and The IoT and OT Security Gap

Despite growing adoption of public cloud, 80% of IT workloads remain on-premises. While public cloud providers anticipate customers will modernize and migrate their on-premises workloads, the reality is that the majority of legacy workloads will remain on-premises in the near future. Moreover, the vast array of connected Operational Technology (OT) and Internet of Things (IoT) devices, running various legacy and non-standard operating systems, poses significant security challenges. These systems are often exploited by attackers, who leverage them as gateways to propagate malware to other systems.
 
As a native cloud company, Google is not well-positioned to secure these on-premises assets, nor should it attempt to build on-premises solutions. Instead, Google needs to foster strong partnerships to address this critical gap. Most organizations will operate in a hybrid environment for the foreseeable future, and industries with a high concentration of vulnerable OT and IoT devices—such as healthcare, retail, oil and gas, manufacturing, mining, and utilities—require comprehensive security solutions. Google Cloud must develop a compelling and comprehensive strategy to support these organizations and ensure the security of their on-premises workloads and devices.

Balancing Competition and Partnership, a Delicate Dance

Google is navigating a delicate balance with its security partners, many of whom have made significant investments in Google Cloud to serve their shared customers. While Google will inevitably compete with some of these partners, not just on Google Cloud but across public cloud platforms, it has committed to protecting their interests and has contractual obligations in place. These partners generate substantial revenue for Google Cloud, making it crucial for Google to tread carefully and avoid disrupting this revenue stream.
 
To strike the right balance, Google must compete and differentiate its offerings with clarity and integrity, driven by a genuine vision to innovate and improve products, rather than simply seeking to boost growth. By doing so, Google can maintain trust and collaboration with its partners while advancing its own goals.

Recommendations for customers

Assess and Align Your Security Tools Landscape

As a Google Cloud prospect or customer, take a comprehensive inventory of your current security tools landscape, encompassing Google Cloud and its partner ecosystem. Engage with Google Cloud and security tool vendors to discuss their roadmaps for Google Cloud, with a specific focus on how they plan to leverage AI to address your unique requirements. Additionally, consider exploring tools that offer multi-cloud support, regardless of your primary cloud provider, to future-proof your security infrastructure.

Go Beyond Categories to Solve Specific Problems

Instead of shopping by category, focus on solving specific problems. Shift your mindset from being a vendor manager to a problem-solver, targeting the outcomes you want to achieve. Remember, your unique starting point and system maturity may differ from others, so avoid a one-size-fits-all approach. Tailor your solutions to your distinct needs and goals.

Advocate for Your Needs and Shape the Security Conversation

Clearly articulate and communicate your expectations to vendors, as security is a complex and vast domain. Recognize that vendors prioritize their product roadmaps, just as you prioritize your needs. Engage in open discussions with fellow security and technology leaders to share strategies and learn from their experiences. Be an active and vocal participant in the community, shaping the conversation and influencing the solutions that meet your needs.

Stay Ahead of Cybersecurity Threats with Expert Trends Report

As you craft your security strategy and execution plan, check out our "11 Top Cybersecurity Trends of 2024 and Beyond." (If you're a vendor and don't have access to the report please contact me for a courtesy copy.) Drawing insights from numerous conversations with security, technology, and business leaders as well as extensive market research, this cybersecurity trends report offers a holistic view into the broader cybersecurity landscape. It also offers tangible recommendations for CxOs who are frantically navigating the cybersecurity maze to design and operationalize their cybersecurity strategy, with the objective to improve their defenses against increasingly sophisticated attacks.


SAP's Q1: Net loss on restructuring, cloud revenue growth of 24%


SAP reported a first quarter loss due to a restructuring charge but said its cloud revenue was up 24% with cloud ERP revenue growth of 32%. SAP said its cloud backlog was €14.2 billion, up 27% from a year ago.

The company's first quarter results come amid a restructuring effort that cut 8,000 jobs. SAP is also moving customers to S/4HANA through its SAP RISE program and looking to layer in business intelligence throughout its applications via generative AI.

SAP reported first quarter revenue of €8.04 billion, up 8% from a year ago with a net loss of €824 million. That sum included a €2.2 billion restructuring charge. Adjusted earnings were €944 million, down from €1.01 billion a year ago.

CEO Christian Klein said SAP was "off to a great start in 2024 and we’re confident we’ll achieve our goals for the year." SAP reiterated its outlook for the year. Klein said that Business AI, cross-selling and winning midmarket customers would drive growth.

Dominik Asam, SAP CFO, said the company's restructuring effort is designed to allow the company "to focus our investments on the Business AI opportunity while decoupling expenses from revenue growth."

SAP’s quarterly report lands as Constellation Research’s BT150 CXOs have gripes about SAP’s RISE program. Some of the feedback:

  • One CIO asked the group for opinions on SAP's RISE program and being forced from on-premises to the cloud. The goal was to have a strategy for SAP in place by the end of the year.
  • CXOs weren't thrilled about SAP RISE and items like licensing credits for legacy environments. A CIO wondered what would prevent a customer from moving away from SAP--especially since the enterprise operates in a space that doesn't garner investment from the enterprise software giant.
  • SAP's RISE program is viewed as an exercise in financial engineering more than something that benefits customers.

Our BT150 CXOs aren't alone. SAP's German speaking user group takes aim at cloud contracts, BTP and more | SAP user group DSAG rips S/4HANA innovation plans, maintenance increases | SAP retools for generative AI, cuts 8,000 jobs, sets 2024, 2025 ambition

The company said SAP RISE has potential to convert customers to the cloud and enable SAP to increase wallet share.

The company is also betting on its Joule copilot experience to drive demand in the future, but first it needs to convert customers to S/4HANA.

Key items in the quarter include:

  • Software license revenue fell 26% in the quarter.
  • Service revenue of €1.08 billion was flat from a year ago.
  • SAP expects to exit 2024 with a headcount total on par with 2023. The company eliminated 8,000 positions but plans to fill new ones focused on future business needs via new hiring and internal reskilling.
  • SAP’s restructuring depends on the uptake of voluntary leave programs, but the company doesn’t yet have visibility into uptake among its German workforce.
  • SAP reiterated its previous 2024 outlook of €17.0 billion to €17.3 billion cloud revenue at constant currencies, up 24% to 27%. Full year cloud and software revenue will be €29 billion to €29.5 billion at constant currencies, up 8% to 10% at constant currencies.
  • The company is expecting a Net Promoter Score of 9 to 13 for 2024. Scores above 50 are excellent, with scores above 80 considered world class.
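For reference, NPS is the percentage of promoters (ratings of 9-10) minus the percentage of detractors (ratings of 0-6), yielding a score between -100 and 100. A quick illustrative calculation with made-up survey numbers:

```python
def nps(promoters: int, passives: int, detractors: int) -> int:
    # NPS = %promoters - %detractors, rounded to a whole score (-100..100).
    total = promoters + passives + detractors
    return round(100 * (promoters - detractors) / total)

# Hypothetical survey: 31 promoters, 49 passives, 20 detractors out of 100
# respondents yields a score of 11, inside SAP's forecast band of 9 to 13.
score = nps(31, 49, 20)
```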

Klein on RISE progress

On a conference call with analysts, Klein said there's a lot of runway with RISE with SAP and said companies will need to upgrade to transform. SAP is also offering incentives to migrate to S/4HANA across its portfolio. He said:

"Our installed base is large with over EUR11 billion remaining support revenue to be converted to the cloud. Typically, by a factor of around two to two. On top, the EUR700 billion Cloud ERP market offers significant cross selling opportunities, and I have no doubt that SAP's integrated best of suite capabilities will win in the core business of our customers. As part of RISE and via the clean core journey, SAP and our ecosystem will help our customers to remove the ERP custom code and instead develop integrated ERP extensions on BTP. This gives us an immense additional revenue potential considering that customers in the on premise world spend up to EUR7 on custom code for every euro they invest in ERP software."

He also said GROW with SAP has potential for smaller enterprises.

"As SAP's greenfield cloud ERP offering for net new customers or new business units of large enterprises, GROW delivers go lives in weeks for every business model in every industry in every country. With our ERP solution, SME customers can grow and scale their business without migrating to a new ERP. Ultimately, RISE and GROW offer customers similar advantages, innovation, modularity, scalability, and integration."

SAP Business AI is another growth driver for the company and Klein said the plan is to infuse Joule and Business AI across the portfolio. "Joule will be our new user experience via natural language, our one front end. We have based our Joule roadmap on an analysis of the most frequent business and analytical transactions of our end users. This way, we make sure that the most heavily used transactions will be fully AI enabled by the end of this year," said Klein.

He added that SAP is embedding genAI in its cloud portfolio and has released more than 30 new AI scenarios across the cloud portfolio and has more than 100 in the pipeline. 

Klein was asked about SAP customers migrating to S/4HANA. The CEO said that SAP is taking a modular approach with megadeals for faster time to value. Custom code has hampered migrations. 

"When customers decide to move to RISE, they're not just doing a move of their current environments and replicating the same capability. In fact, far from it. They're trying to transform, operationally process all of their data, all of their capability to serve now and into the future," said Klein. "They're setting their business up. And so when they look at these, which is why, to answer your question, the larger deals are because they look at a multi-year roadmap of capability transitioning from an older state, including non-SAP."

Constellation Research's take

Constellation Research analyst Holger Mueller said:

"SAP has its work cut out this financial year. It needs to get going on its new go-to-market setup for S/4HANA upgrades, aka the Thomas Saueressig organization, while shuffling 10%+ of its employee base (8,000 as part of the reshuffle and then attrition), making S/4HANA more interesting, showing the value of AI, and growing. These are all necessary steps for Christian Klein and team – but a lot of things to master at the same time. All eyes will be on the 2027 ECC upgrade deadline, which at this point is no longer plausible for large-scale SAP customers. Will these core SAP customers (many in Germany) move because of the AI promise anyway – or will they wait until 2029 or even 2030, which will be the new deadline? Time will tell."

 


How Verizon buckets its AI, genAI use cases


Verizon CEO Hans Vestberg is splitting the telecom giant's AI use cases into three buckets: optimizing processes, product experiences and revenue growth.

Vestberg's thinking on AI, including generative AI, highlights how CEOs are maturing in their generative AI approaches. First, there was a wave of demand for shiny new objects. Then there was the wave of efficiency for AI use cases. And revenue growth is percolating as a goal but is far less developed so far.

Related: Middle managers and genAI | Why you'll need a chief AI officer | Enterprise generative AI use cases, applications about to surge | CEOs aim genAI at efficiency, automation, says Fortune/Deloitte survey | 77% of CxOs see competitive advantage from AI, says survey | Google Cloud Next: The role of genAI agents, enterprise use cases

Here's what Vestberg said on Verizon's first quarter earnings call:

"Our AI strategy focuses on three priorities. First, optimizing internal processes and operations through machine learning, such as creating efficiencies in fuel consumption. AI is already centered to our cost transformation program and will become even more important over time. Secondly, enhancing product experiences with AI capabilities like the personalized plan recommendation on myPlan, which is producing good early results. And thirdly, establishing an AI-based revenue stream by commercializing our network's unique low latency, high bandwidth, and robust mobile edge compute capabilities. Generative AI workloads represent a great long-term opportunity for us. As we expand our network and increase our performance advantage, we're also making Verizon a more efficient organization."

If you unpack those comments, Verizon is seeing tangible results across the spectrum. "We already had several generative AI projects going live," he said.

  • Vestberg said Verizon has outsourced much of its customer service infrastructure without interruption, but also sees opportunities for AI to improve service.
  • On the experience side, Vestberg said Verizon's myPlan effort, which offers consumers more customization options, is personalized with the help of AI.
  • For efficiency results, AI has been in the mix for driving network performance but is used now for capacity deployment and power consumption. "We are using AI and generative AI already now commercially. So this is not the playing ground for us. We just see more opportunities," said Vestberg.

Naturally, Verizon also sees edge networks as well as private networks being a big driver of AI workloads. Vestberg said:

"On the flip side, of course we also see revenues. Our network was built for AI. That was my thought when I built Verizon Intelligent Edge Network five years ago or six years ago, that we're going to have compute and storage at the Edge. AI is sort of built for that with the low latency we have on the 5G network. And as we are deploying our 5G right now, with the mobile edge compute and AI, this is a great long-term opportunity for us using AI."


Informatica says it'll hit Q1 targets, says not actively in acquisition talks


Informatica said its first quarter results will be at the upper end of its guidance given in February and is "not currently engaged in any discussions about being acquired."

Following its fourth quarter results, Informatica projected first quarter revenue of $375 million to $395 million with subscription annual recurring revenue of $1.135 billion to $1.155 billion. Cloud subscription ARR was projected to be between $645 million to $655 million.

Informatica also reaffirmed its 2024 financial outlook and said Jitesh Ghai, its Chief Product Officer, is leaving to pursue an executive position at another company.

The outlook and statement from Informatica lands amid reports that the company was in talks to be acquired by Salesforce. The Wall Street Journal initially reported Salesforce was in advanced talks to acquire Informatica, but later said the two parties couldn't agree on a price.

Reports of a Salesforce-Informatica combination also gave investors time to vote with their money. Salesforce shares took a hit on the news. 

Informatica's position in the market has improved as the need for data management platforms is critical for enterprises to adopt artificial intelligence applications. In addition, Informatica is viewed as a neutral party amid multiple enterprise applications.

An acquisition by Salesforce would have altered that neutrality. Informatica would have had some overlap with Salesforce's MuleSoft unit but would have been largely additive in data integration, metadata management, data governance and master data management. Informatica would have also turbocharged Salesforce's fast-growing Data Cloud.


Financial services firms see genAI use cases leading to efficiency boom


Generative AI use cases are proliferating in financial services, illustrating a somewhat counterintuitive trend: heavily regulated industries appear to be better set up for artificial intelligence because they have their data and controls in order.

Goldman Sachs CEO David Solomon said he's bullish on generative AI as an investment and financing theme as well as for internal operations. Solomon turned up at the Google Cloud Next keynote and then followed up on the bank's first quarter earnings call. Goldman Sachs CIO Marco Argenti outlined the firm’s AI thoughts nearly a year ago. Solomon noted:

"For our own operations, we have a leading team of engineers dedicated to exploring and applying machine learning and artificial intelligence applications. We are focused on enhancing productivity, particularly for our developers, and increasing operating efficiency while maintaining a high bar for quality, security, and controls."

JPMorgan Chase CEO Jamie Dimon also devoted a lot of space in his shareholder letter to genAI. The company has more than 400 use cases in production. "We're also exploring the potential that genAI can unlock across a range of domains, most notably in software engineering, customer service and operations, as well as in general employee productivity," said Dimon.

What's interesting about genAI use cases is that regulated industries can move ahead of other industries. Why? Regulated industries already have the data best practices and controls required for generative AI deployments. That case was made by AWS' Matt Wood, VP of AI, in a recent meetup.

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly and is brought to you by Hitachi Vantara.

Data from Stanford University's 2024 AI Index Report, citing McKinsey research, highlights how financial services is a standout when it comes to embedding AI into the business. Here are a few charts to ponder.

At Google Cloud Next, Bernd Leukert, Chief Technology, Data and Innovation Officer at Deutsche Bank, and KeyBank CIO Dean Kontul put a few anecdotes around the use case data. They outlined the use cases being deployed. Kontul noted that 2023 was more about banks navigating interest rates than generative AI. But 2024 features genAI as more of a priority.

Leukert and Kontul said generative AI is a technology where enterprises can't afford to wait because the technology will change business. "I think you have to pick your spot on the hype curve and look at what it can do for operational efficiency," said Kontul. "You have to control for costs, but generative AI use case is going to be real, tangible expense takeouts and eventually remove customer friction so there will be use cases around revenue as well."

Other realities driving use cases for generative AI include:

  • High-level internal sponsorship. CEOs want genAI and that top-down support will move proofs of concept to production.
  • There's an intersection between process improvement, automation and generative AI.
  • GenAI projects will require an ongoing conversation with employees about their roles in the future, how to upskill and reskill and remain with enterprises in different capacities. Leukert said his bank "introduced as a principle that genAI is augmenting human capabilities, not replacing the human."

"There is so much acceleration of movement with generative AI," said Leukert.

Here's a look at some of the use cases highlighted by big banks in recent days.

Document processing to execution workflows. Leukert said generative AI's promise is that it can take thousands of documents coming from customers, process them and then take actions.

Any use case that can be deployed once and then scaled horizontally. Leukert said Deutsche Bank looked at that document processing use case and then noted that it was applicable across multiple services. "We reached out to the business and said let's collect use cases and cluster them into categories and then say if it works in that category for that type of a theme, then we know it is possible to be applicable across multiple application areas across multiple themes in that category," he said.
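The document-to-action workflow Leukert describes can be sketched in a few lines: classify each incoming document, extract what matters, then route it to an action. The classify logic below is a trivial keyword placeholder standing in for an LLM call, and the routing table and action names are illustrative assumptions, not Deutsche Bank's actual system.

```python
# Hedged sketch of a document-processing-to-execution pipeline.
# classify() is a keyword stub standing in for an LLM classifier;
# ACTIONS is a hypothetical routing table for illustration only.

def classify(text: str) -> str:
    """Assign a document type; a real system would call a model here."""
    lowered = text.lower()
    if "invoice" in lowered:
        return "invoice"
    if "address" in lowered:
        return "address_change"
    return "other"

ACTIONS = {
    "invoice": lambda doc: f"queued payment review for: {doc[:30]}",
    "address_change": lambda doc: f"updated customer record from: {doc[:30]}",
    "other": lambda doc: "routed to manual review",
}

def process(documents):
    """Classify each document and dispatch it to the matching action."""
    return [ACTIONS[classify(d)](d) for d in documents]

results = process([
    "Invoice #42 for consulting services",
    "Please update my mailing address",
])
print(results)
```

The horizontal-scaling point follows naturally: once the classify-extract-route skeleton works for one document category, adding a new business theme is mostly a matter of extending the routing table.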

Business re-engineering. Kontul said genAI will transform business, so it makes sense to think beyond individual use cases. Efficiency is the initial target as genAI can make marketing, software, documentation and code development more efficient. "GenAI will touch every employee in some way at the bank and will do it in a more intuitive way than other technologies in the past," said Kontul.

Risk management. Leukert said generative AI is enabling the bank to be a stronger adviser in times of crisis and better manage risks. For instance, Deutsche Bank has been pouring data into its risk management to better manage liquidity demand during crises (Covid-19, Russia's war with Ukraine) and model next moves. "We think that banks need to manage risks at a much more detailed level than they have in the past to be prepared for the unknown," said Leukert.

Research and data collection. Leukert said content management is a primary use case especially for Deutsche Bank Research, which provides research reports to customers. For analysts, 80% of the work is data collection and 20% is digesting and building the report. Generative AI can now automate much of the data collection and build content to be turned into a report.

Compliance. Kontul said KeyBank has a pilot using genAI to keep apprised of all regulatory changes across states that affect the bank's products.

Engineering productivity. Kontul and Leukert both said their banks were pursuing genAI for coding and engineering productivity. Developer productivity also has the attention of the big players such as Goldman Sachs and JPMorgan Chase.


Equifax bets on Google Cloud Vertex AI to speed up model, scores, data ingestion

Equifax said it is deploying Google Cloud's Vertex AI platform across its systems as it accelerates data ingestion as well as new product launches.

Speaking on Equifax's first quarter earnings call, CEO Mark Begor said driving AI innovation is key to the company's growth goals via products such as Ignite and Interconnect. Begor said:

"During 2024, we're deploying both Equifax proprietary explainable AI along with Google Vertex AI across Ignite, Interconnect and our global transaction systems.

For Equifax, Vertex AI enables faster and more predictive model development on our Ignite platform. And for our clients, Ignite, which combines data analytics and technology into one cloud-based ecosystem, customers can connect their data with our unique data through our identity resolution process to gain a single holistic view of consumers."

Constellation Research previously covered Equifax's cloud transformation in a customer story. Equifax also was a reference account at Google Cloud Next.

Equifax reported first quarter revenue of $1.39 billion, up 7% from a year ago, with net income of $124.9 million.

Those platforms, combined with Equifax's broad reach into datasets, mean faster ingestion and analytics as well as new products and services.

Begor said Equifax has access to 100% of the US population through its data sets in a single data fabric. He added that Equifax's cloud infrastructure is processing data 5x faster than its legacy applications could.

Equifax also plans to complete its cloud transformation in 2024, closing data centers and saving $300 million a year. From there, Equifax will focus on growing its EFX.AI offerings with higher performing models, scores and data products.

"Completing the cloud transformation also frees up our team to fully focus on growth and expanding innovation, new products and new markets. Our progress towards completing the cloud is gaining momentum with over 70% of our total revenue in the new Equifax Cloud at the end of the quarter," said Begor. "And we're focused on executing the remaining steps to reach 90% with Equifax revenue in the cloud by year-end."


Meta launches Llama 3, wide availability with more versions on deck

Meta launched its Llama 3 open source large language model and said it will be available on AWS, Databricks, Google Cloud, Microsoft Azure, Snowflake, IBM WatsonX, Nvidia NIM and have support from enterprise hardware platforms.

Yes, we're in the age of weekly LLMs that leapfrog each other, but Llama 3, which will initially come in 8B and 70B parameter versions with more on deck, is of interest to enterprises.

Why? Companies are likely to adopt capable open source LLMs and then fine-tune them with enterprise data. Llama 3 represents a backdoor enterprise play for Meta.

In a blog post, Meta outlined the Llama 3 effort, which is a big leap over Llama 2. "Improvements in our post-training procedures substantially reduced false refusal rates, improved alignment, and increased diversity in model responses. We also saw greatly improved capabilities like reasoning, code generation, and instruction following making Llama 3 more steerable," said Meta.

Meta added that it evaluated Llama 3 based on prompts covering 12 use cases including brainstorming, advice, coding, creative writing, extraction and summarization to name a few. In this use case testing scenario, Llama 3 70B topped Claude Sonnet, Mistral Medium and Llama 2.

According to Meta, Llama 3 is being deployed across its applications including Facebook, Instagram and WhatsApp. Llama 3 is also available for a spin on the web.

Going forward, Meta said a 400B parameter model is in training and it will "release multiple models with new capabilities including multimodality, the ability to converse in multiple languages, a much longer context window, and stronger overall capabilities."


Infosys acquires In-tech, posts mixed Q4 with genAI progress

Infosys CEO Salil Parekh said the services provider is landing large deals and "seeing excellent traction with our clients for generative AI work," but its fourth quarter was mixed. Infosys also said it would acquire In-tech, an engineering and R&D services provider catering to the German automotive industry.

The company reported fourth-quarter earnings of $958 million on revenue of $4.56 billion, flat from a year ago. Wall Street was expecting Infosys to report fourth quarter earnings of 17 cents a share. For the year ended March 31, Infosys reported earnings of $3.17 billion on revenue of $18.56 billion.

For fiscal 2025, Infosys projected revenue growth of 1% to 3% with operating margins of 20% to 22%.

Constellation Research CEO Ray Wang said Infosys is facing a bevy of moving parts that the company will have to navigate. "The combination of AI arbitrage, margin compression, and exponential efficiency is having an effect on the overall market. Constellation expects that the overall service market will be flat to single digit growth for the next year," said Wang.

Constellation ShortList™ Digital Transformation Services (DTX): Global

Indeed, Infosys outlined a number of items on its earnings call. During the fourth quarter, Infosys said it had "a rescoping and renegotiation of one of the large contracts in the financial services segment." That renegotiation led to a 1% revenue hit, but 85% of the contract continued as is.

On a conference call, Parekh said fiscal 2024 brought in $17.7 billion in large deals. These large deals revolve around cost efficiency and consolidation.

Here's a look at what Infosys is facing.

The good

Generative AI remains a highlight for Infosys. Parekh said:

"We're working on projects across software engineering, process optimization, customer support, advisory services and sales and marketing areas. We're working with all market-leading open access and closed large language models.

As an example, in software development, we've generated over 3 million lines of code using one of generative AI large language models. In several situations, we've trained the large language models with client specific data within our projects. We've embedded generative AI in our services and developed playbooks for each of our offerings."

Public and private cloud migrations remain a priority. "We continue to work closely with the major public cloud providers and on private cloud programs for clients. Cloud with data is the foundation for AI and generative AI and Cobalt encompasses all of our cloud capabilities," said Parekh.

Data and automation. Parekh said the acquisition of In-tech played into the data strategy for Infosys. "We see data structuring, access, assimilation critical to make large language models and foundation models to work effectively, and we see good traction in our offering to get enterprises, data ready for AI," he said.

The challenges

Jayesh Sanghrajka, CFO of Infosys, said the company saw 180 basis points of margin compression quarter over quarter due to the renegotiation of the large contract, salary increases, brand building and visa expenses.

Infosys partially offset the margin hit through lower post-sales customer support costs and efficiency efforts under Project Maximus.

On the economy, Sanghrajka said:

"We continue to see macroeconomic effects of high inflation as well as high interest rates. This is leading to cautious spend by clients who are focusing on investing in services like data, digital, AI and cloud."

Industry demand

Sanghrajka said industry demand was mixed with strength in financial services and manufacturing as well as retail.

Financial services: "Financial services firms are actively looking to move workloads to cloud, pipeline and deal wins are strong and we are working with our clients on cost optimization and growth initiatives."

Manufacturing: "There is increased traction in areas like engineering, IoT, supply chain, smart manufacturing and digital transformation. In addition, our differentiated approach to AI is helping us gain mind and market share. Topaz resonates well with the clients. We have a healthy pipeline of large and mega deals."

Retail: "In retail, clients are leveraging GenAI to frame use cases for delivering business value. Large engagements are continuing S/4HANA and along with infra, apps, process and enterprise modernization. Cost takeout remains primary focus."

Communications: Clients remain cautious, and budgets are tight. Sanghrajka said cost takeout, AI and database initiatives may show promise.

Overall, Sanghrajka said Infosys should benefit with large deals recently won as well as AI. "We are witnessing more deals around vendor consolidation and infra managed services. Deal pipeline of large and mega deals is strong due to our sustained efforts and proactive pitches of our cost takeouts and digital transformation, etc., across the subsectors," he said.


SAS launches industry-focused models, Model cards that serve as AI nutrition labels

SAS said it will launch a series of AI models that are lightweight and focused on industry use cases. The company also added generative AI features to its Viya platform and unveiled "nutrition labels" for models.

The news, outlined at the company's SAS Innovate conference in Las Vegas, is part of the company's broader investment in AI.

"Once seen as the laggard in the industry, the decades of experience in mastering data by SAS is now very valuable in the packaging of data models for easy consumption for customers for AI.  They are making it easier for customers to put AI to work," said Constellation Research CEO Ray Wang. 

Here's the breakdown of what was announced:

Industry-focused models. SAS said it will roll out a series of individually licensed models, starting with an AI assistant for warehouse space optimization designed to enable nontechnical users to optimize and plan faster.

The general idea of the industry AI models is to give enterprises something they can deploy quickly with low overhead costs. SAS will target financial, healthcare, manufacturing and public sector AI models.


SAS' bet is that it can move beyond large language models and drive value with industry-proven AI models for fraud detection, supply chain, document conversation and healthcare payments to name a few.

SAS Viya gets generative AI tools. SAS Viya will get trustworthy genAI tools and introduce a synthetic data generator called SAS Data Maker.

Viya has genAI orchestration tools to integrate external models; Viya Copilot for developer, data science and business productivity; Data Maker, which is in private preview; and genAI features in SAS Customer Intelligence 360.

Model cards and AI governance. SAS also said that it will offer model cards, which are a nutrition label for AI designed to flag bias and model drift, as well as AI governance services.

SAS said model cards will arrive as a Viya feature in mid-2024.
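To make the "nutrition label" idea concrete, a model card is essentially structured metadata attached to a model: what it's for, what it was trained on, how it performs, and any bias or drift flags. The sketch below is a generic illustration loosely following the model-card literature; the field names are assumptions, not SAS's actual schema.

```python
from dataclasses import dataclass, field

# Illustrative model card structure in the "nutrition label" spirit.
# Field names are generic assumptions, not SAS's Viya schema.

@dataclass
class ModelCard:
    name: str
    intended_use: str
    training_data: str
    accuracy: float
    bias_flags: list = field(default_factory=list)  # e.g. disparate error rates
    drift_status: str = "not_evaluated"             # e.g. "stable", "drifting"

card = ModelCard(
    name="fraud-detector-v3",
    intended_use="card-transaction fraud scoring",
    training_data="2022-2023 anonymized transactions",
    accuracy=0.94,
    bias_flags=["higher false-positive rate for new accounts"],
)
print(card.drift_status)
```

The value of the label format is that governance tooling can scan these fields programmatically, flagging any deployed model whose `bias_flags` list is non-empty or whose `drift_status` has changed since the last review.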

SAS Viya Workbench goes GA. SAS announced the general availability of SAS Viya Workbench, aimed at model developers. Viya Workbench is a self-service compute environment for data prep, analysis and model development. Viya Workbench will be available by the end of 2024 with SAS and Python support initially and R to follow. Viya Workbench offers two development environment options: Jupyter Notebook/JupyterLab and Visual Studio Code.

AWS and SAS expand partnership. SAS said it has expanded its hosted managed services to AWS including SAS Viya. SAS' full product suite is available on AWS. 

Constellation Research’s take

Andy Thurai, analyst at Constellation Research, said:

“Viya Copilot is a useful offering that can help users with multiple tasks and is likely to be particularly useful in knowledge gap analysis and data wrangling tasks. Viya Copilot could be useful in reducing the manual tasks that take up data scientists' time.

SAS Data Maker, in private preview, can help users create synthetic data but only in tabular format. The problem is that AI needs a lot of unstructured data, which is harder to generate. SAS Data Maker may not be of much use to many organizations that need synthetic data for AI today. I hope SAS can eventually get there.

Packaged AI models and industry-specific lightweight models are interesting. It is notable that more vendors seem to be moving toward smaller models. Google just last week announced the concept of specialized models that can run on edge devices and in other lightweight locations.

There is an argument that specialized, lightweight models may underperform without augmentation. These models need to be heavily trained on industry-specific data to be useful. Model cards and AI governance advisory services can help enterprises improve AI governance. However, smaller startups such as Guardrails AI offer much broader capabilities in this space.

SAS’s AI initiatives are noteworthy and should interest the company’s customers. But the AI market is moving fast and many other vendors have already announced more advanced and game-changing features.”

Constellation Research analyst Doug Henschen gave his take on Viya Copilot and SAS Data Maker as well as the big picture. 

"Both Viya Copilot and SAS Data Maker are in private preview at this point, so I’d say they are potentially significant. The part that stands out for me is Data Maker, as SAS is one of the few vendors that is talking about and addressing the need for synthetic data generation. Constellation believes data scarcity will limit the accuracy and effectiveness of AI-based systems. SAS is one of the few companies talking about this capability in the context of their generative AI capabilities."

"SAS’s Copilot is similar to what many vendors have announced, and what a few now have available in public preview or even general availability. The capability that is somewhat differentiated is SAS Data Maker. All the big cloud vendors and big companies like IBM support synthetic data generation, but they don’t tend to talk about it in the same context as natural language-based GenAI. It’s a technique in which you feed the AI samples of data that you might have at a relatively small scale and the system will then generate similar, non-privacy-sensitive data sets for use in training on a massive scale." 

"SAS has taken a conservative approach, but it has moved more quickly to infuse GenAI into its Customer 360 app, because that’s where competitors including Adobe and Salesforce have been pushing GenAI aggressively. Large customers doing analytics and AI at scale do not switch horses quickly based on this or that hot new feature. In fact, plenty of companies and CXOs are being very cautious about GenAI. What you will see, and what we have already seen over the years, is innovation teams doing experiments with cutting-edge vendors or cutting-edge capabilities provided by cloud vendors, for example. So SAS has to keep up. I see the company’s Generative AI Orchestration announcement as a signal to customers that SAS will enable customers to tap into a proven stable of open models when they are ready to pursue GenAI at scale.

The pace of innovation is constantly accelerating, particularly in GenAI, so I’d like to see these private-preview announcements move into public preview and general availability as quickly as possible. I’d also like to see more detail on the portfolio of models SAS plans to make available for orchestration and where those models fit with various use cases and industry applications."
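The synthetic-data technique Henschen describes, fitting a small real sample and then generating similar non-sensitive data at scale, can be sketched in a few lines. This is a deliberately minimal illustration that fits each numeric column independently with a normal distribution; real tools (and, presumably, SAS Data Maker) also model cross-column correlations, categorical fields and privacy guarantees.

```python
import random
import statistics

# Minimal sketch of tabular synthetic data generation: fit simple
# per-column statistics on a small real sample, then sample new rows.
# Columns are treated independently here for brevity; production
# generators model correlations and privacy constraints as well.

def fit_columns(rows):
    """Estimate (mean, stdev) for each numeric column of the sample."""
    cols = list(zip(*rows))
    return [(statistics.mean(c), statistics.stdev(c)) for c in cols]

def generate_synthetic(rows, n, seed=0):
    """Draw n synthetic rows from per-column normal fits."""
    rng = random.Random(seed)
    params = fit_columns(rows)
    return [[rng.gauss(mu, sd) for mu, sd in params] for _ in range(n)]

# Small "real" sample: hypothetical (age, balance) pairs
sample = [[34, 1200.0], [41, 2300.0], [29, 800.0], [52, 4100.0]]
synthetic = generate_synthetic(sample, n=1000)
print(len(synthetic), len(synthetic[0]))  # 1000 rows, 2 columns
```

The payoff is exactly the one Henschen points to: a handful of real rows yields an arbitrarily large, statistically similar training set that carries none of the original records.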
