Sam's Club CEO Nicholas on AI, frictionless commerce, focus on members

Chris Nicholas, President and CEO of Sam's Club, said artificial intelligence is enabling the company to "take 100 million tasks out of our clubs" even as it adds more associates.

The game plan for Sam's Club, a unit of Walmart, is to leverage technology to enable employees to solve customer problems and drive engagement, said Nicholas, speaking at the Constellation Research AI Forum in New York.

Nicholas said:

"We will take 100 million tasks out of our clubs, and we will have more associates. Why? What are they doing? They are solving members problems. They are driving engagement. They are building our E commerce business. They are connecting on services that we're offering. What it does is it opens up the aperture of people, and efficiency needs to be some kind of like negative connection."

Nicholas said technology needs to help people build careers, solve problems and then find the next round of opportunities. He added:

"Going forward, the superpower of a business to win from a business perspective is going to be in the connections with people. It just is. I know you can have great nuance with the tone of a generative AI system, and that will solve the problems I don't want my associates to solve. Then they can spend time solving the more deeply connected, more empathic solutions."

Other takeaways from Nicholas:

AI and technology implementations need to start with design thinking, the problems that need to be solved and the customer journey. Nicholas said personalization, Sam's Club member value and experiences are solved "only by technology and the application of data assimilated through artificial intelligence."

More than a third of Sam's Club shopping visits use Scan and Go technology for frictionless checkouts. "The adoption curve is rapid and once people have done it, they never go back," said Nicholas.

Computer vision is boosting inventory management. Nicholas said:

"We were literally asking people to walk around the club, look for pallets and write on a clipboard if they could see a palette or not. This still happens in most places. Nobody was doing inventory. Terrible, terrible idea, and they're doing it every day. What we did was we had these floor scrubbers, and we said, 'how about put cameras on the floor scrubbers?' We take 23 million images a day of where every palette is, everywhere, multiple times a day, so that we know what's in stock, what's not in stock, where the palettes are, what needs to come down, when and how. By doing that, you just enabled the associate with the app to say, what's the next best task? What we realized through that is that the quality of computer vision imagery getting was so high that we've put it to other applications."

Focus on the work and people, not the technology. Nicholas said enterprises can get carried away with technology, data, AI and new applications, but "the real value is in the actual work that people are doing." "If you take friction away from their lives and empower them, you don't need market adoption," said Nicholas. "You just need to make it easy and adoption will happen."

Boomi CEO Lucas: AI agents will outnumber your human employees soon

Boomi CEO Steve Lucas said AI agents will outnumber the people in your business in less than three years.

"The digital imperative is how do I work with agents? The number of agents will outnumber the number of humans in less than three years," said Lucas, speaking at Constellation Research's AI Forum in New York. "It will be overwhelming. It will be fast and we're not prepared."

Lucas said he's challenging his CIO to rid Boomi of expense reports: it's 2024, and it's time to work with AI to make sure humans never have to approve an expense report.

"The question everyone should be looking at is how can I augment every single business process that I have today with AI," said Lucas. "Try a process, and find a way AI can augment it and reduce human time consumption. No. 2 is how can you automate my business and eliminate the need for humans in specific areas."

Constellation Research analyst Holger Mueller said:

"The three year horizon is probably too conservative. If the trend continues that agents can be built by low code and no code tools and citizen developers. We will reach that point easily by end of 2025. The hunger for more automation powered by a nothing and also with AI is unstoppable. Give business users the tools and they will go fish."

Lucas said he wasn't talking about reducing jobs as much as eliminating redundancies. The big issue will be orchestrating agents, he added. In a nutshell, Lucas sees layers of agents all checking for hallucinations, accounting compliance and other issues.

Boomi is working on a registry to manage agents. "We are building what we call an Active Directory for agents, where you can register them, track them, understand the decisions, revoke authority, and do it in real time. If you don't have that, it doesn't matter what agents you invent. It doesn't matter what you do with AI. You have to have insight, transparency, explainability, control," said Lucas.
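
Lucas didn't describe the registry's internals, but the "Active Directory for agents" idea (register, track, understand decisions, revoke authority in real time) maps naturally onto a small data structure. The sketch below is purely illustrative; the class names and fields are assumptions, not Boomi's design.

```python
# Illustrative sketch of an agent registry: register agents, log their
# decisions for auditability, and revoke authority in real time.
# Class names and fields are assumptions, not Boomi's implementation.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AgentRecord:
    agent_id: str
    owner: str
    scopes: list[str]                           # what the agent may do
    active: bool = True
    decisions: list[dict] = field(default_factory=list)

class AgentRegistry:
    def __init__(self) -> None:
        self._agents: dict[str, AgentRecord] = {}

    def register(self, agent_id: str, owner: str, scopes: list[str]) -> None:
        self._agents[agent_id] = AgentRecord(agent_id, owner, scopes)

    def log_decision(self, agent_id: str, summary: str) -> None:
        # Keep an auditable trail of what each agent decided and when.
        self._agents[agent_id].decisions.append(
            {"at": datetime.now(timezone.utc).isoformat(), "summary": summary}
        )

    def revoke(self, agent_id: str) -> None:
        # Revocation takes effect immediately for all subsequent checks.
        self._agents[agent_id].active = False

    def is_authorized(self, agent_id: str, scope: str) -> bool:
        rec = self._agents.get(agent_id)
        return bool(rec and rec.active and scope in rec.scopes)
```

In practice such a registry would sit behind an API and an identity provider, but even this shape shows why revocation and decision logging belong in one place.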

"Under Steve Lucas' watch, Boomi has accelerated its pace of innovation and is now leading the way among integration platform as a service (iPaaS) vendors toward AI orchestration and the development of agentic applications," said Constellation Research analyst Doug Henschen.

Other takeaways from Lucas' talk:

Data strategy and quality remain job one. Lucas said "the rule hasn't changed. It's garbage in, garbage out." Lucas said he's "amazed at how few companies are really prepared across the complex landscape of apps, databases, APIs and models, and don't know where the data comes from to feed these systems."

Boomi is feeding information into Llama 3 models internally. "We give it everything from all of our product, pricing and packaging data, which changes frequently," explained Lucas. "We feed it Slack channel information from all of our employee, customer success and support conversations. It knows everything. Why? A challenge I'd put in front of you is to build an agent that is better than any human in a role."

AI improvement. Lucas said his bet is that AI can get 10% incrementally better every year. "If AI gets just 10% better each year we may not reach artificial general intelligence, but we will get close," said Lucas. "Whether it is truly conscious or not won't matter; it will do most of our jobs better than we will as defined today. This is the AI Big Bang, the real version."

Will we be talking about AI a year from now? Lucas said the context of AI conversations will change, but the topic will remain. "We are going to need agents, registries and protocols. We will need whole protocols for agents to define how they communicate so we humans can even understand what we're talking about. What does that stack look like from data all the way up to development? Agents will communicate at speeds we cannot even understand."

An Origin Story

What exactly makes any data valuable?

In my previous blog, The Future of Data Protection, I started to look at what it is that makes data valuable. I think this is the best way to frame the future of data protection. In each application, we must know where the value in a piece of data lies if we are to protect it.

There are so many different things that might matter about a piece of data and thus make it valuable:

  • Authorship, including the authority or reputation of the author(s).
  • Evidence, references, peer review, repeatability and so on.   
  • In the case of identifiable (personal) data, the individual’s consent to have the data processed.
  • Details of the data collection process, ethics approval, or instrumentation as applicable.
  • The algorithms (including software version numbers) used in analytics or automated decisions.
  • Data processing system audits.
  • Sometimes the locality or jurisdiction where data has been held is important.
  • As data is added to, who were the contributors, and what were their affiliations?
  • The release of data to the public or specific users may need specific approvals.
  • What rights or conditions attach to released data as to further use or distribution?

A lot of this boils down to origin. Where did a given piece of data come from? 

This simple question is inherently difficult to answer for most data, because raw data of course is just ones and zeros, able to be copied ad infinitum for near zero cost.

But several interesting approaches are emerging for telling the story behind a piece of data; that is, conveying the origins of data. These are some of the first examples of the solutions category I call Data Protection Infostructure.

Proof of personhood

How can we tell human authors and artists from robots?  Or new bank account applicants from bots? The rise of Generative AI and synthetic identities has driven the need to know if we are dealing with a person or an automaton.

Identity crime is frequently perpetrated using stolen personal data. To fight this, we need to know not just the original source of identification data but also the source of each presentation.  In other words, what path did a piece of important data take to get to where it needs to be used?

A sub-category of Data Protection Infostructure is emerging around proof of personhood.

Delivering this sort of assurance in a commercially sustainable way is proving harder than it looks. Only recently, an especially promising start-up, IDPartner Systems, led by digital identity veteran Rod Boothby, was unexpectedly wound up.

Content Provenance

A conceptually elegant capability with plenty of technical precedents is to digitally sign important content at the source, to convey its provenance. That’s how code signing works.

The Coalition for Content Provenance and Authenticity (C2PA) is developing a set of PKI-based standards with which content creators can be endorsed and certified with individual signing keys. C2PA will be implemented within existing authority and reputation structures such as broadcast media licensing, journalist credentialing, academic publishing and peer review.

Similar proposals are in varying stages of development for watermarking generative AI outputs and digitally signing photographic images immediately after capture, within camera sensors.
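
To make the signing idea concrete, here is a minimal sketch of signing content at the source and verifying it later, using Python's cryptography library. It illustrates the general pattern only, not the C2PA manifest format or its certificate chains.

```python
# Minimal sketch: sign content at the source, verify it downstream.
# Illustrates the general pattern, not the C2PA manifest format.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The creator (a camera, a newsroom tool, a publisher) holds a private key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

content = b"raw image bytes or article text"
signature = private_key.sign(content)   # distributed alongside the content

# Anyone with the creator's public key (e.g. via a C2PA-style certificate)
# can check the content is unchanged since it was signed.
try:
    public_key.verify(signature, content)
    print("Provenance check passed: content matches the creator's signature")
except InvalidSignature:
    print("Provenance check failed: content altered or signed by someone else")
```

In a C2PA deployment the public key would be bound to the creator's identity through a certificate authority rather than exchanged by hand, which is where the authority and reputation structures mentioned above come in.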

Confidential Computing

The path taken by any important data today can be complicated.

For example, powerful AI-based image processing is now built into many smartphone cameras; the automatic manipulation of regular photographs can be controversial, even if it’s not intended to mislead.

And the importance of Big Data and AI in all sorts of customer management and decision support systems has led to strengthened consumer protections (most notably in the European Union’s AI Act) to provide algorithmic accountability and explainability.

So, data now flows through complex and increasingly automated supply chains. Signing important data “at the source” isn’t enough when it goes through so many perfectly legitimate processing stages before reaching a consumer or a decision maker. Data may be transformed by AI systems that have been shaped by vastly greater volumes of training data. Moreover, those AI models may be evolving in real time, so the state of an algorithm or software program might be just as important to a computation as the input data was.  

And we haven’t even touched on all the cryptographic key management needed for reliable signing and scalable verification.

For these reasons and more, there is an urgent need to safeguard data value chains in their entirety — from the rawest of raw data, before it leaves the silicon, through all processing and transformations. We are approaching a point in the growth of computing where every change to every piece of data needs to be accounted for.

Such a degree of control might seem fanciful but the Confidential Computing movement has the vision and, moreover, the key technology players that are needed to fundamentally harden every single link in the data supply ecosystem.  


Enterprises leading with AI plan next genAI, agentic AI phases

Enterprises that are leaders in artificial intelligence are evolving quickly and laying out plans for their next phases. The next phase of genAI deployments revolves around integrating AI into the business, optimizing processes, agility and scale. You should watch these companies closely for the road ahead.

When I think of leading AI companies, I'm usually looking for the buy side of the enterprise equation. These companies are deploying AI as well as integrating it into their business. For instance, Rocket spent years deploying its data platform and working with machine learning and AI models. When generative AI hit the enterprise, Rocket had all the DNA to move quickly and leverage the technology to deliver better customer experiences.

Intuit was in a similar position and saw its bets on data science and then AI work. It has an Investor Day coming up that'll feature a lot of strategy and AI. JPMorgan Chase is another player that has moved to the next phase of its genAI strategy. The upshot is that JPMorgan Chase has moved to integrating AI operations into its various business units.

JPMorgan Chase: Digital transformation, AI and data strategy sets up generative AI (download PDF) | JPMorgan Chase: Why we're the biggest tech spender in banking

With that backdrop, it's prudent to think of customer stories and case studies as living documents. Consider Rocket. Constellation Insights documented Rocket's use of AI and strategy (PDF) in May. Since then, the company has named a new Chief Technology Officer and outlined its next steps in its AI progression. Perhaps the biggest takeaway from Rocket is that there are no overnight AI successes. Rocket has spent $500 million over the last five years on Rocket Logic, its proprietary loan origination system that uses AI to streamline income verification, document processing and underwriting.

Simply put, AI isn't merely lift and shift. The work is never done.

Rocket CEO Varun Krishna said the company's "super stack" of technology is critical. "What makes our super stack special from a technology perspective? We have created a groundbreaking new architecture. It's data powered, humanity driven, and self-learning. This engine fuels every aspect of our ecosystem and we've spent years perfecting it. It's now driving efficiency, velocity and experience across the company," he said.

Heather Lovier, Chief Operating Officer at Rocket, said the technology stack has been applied to every process at Rocket. In underwriting, the company has "taken complex processes and the categories of income, property, asset and credit and broken them down into hundreds of thousands of discrete tasks in order to apply automation and AI." That approach also expands into the experience layer. Lovier said it's a game of inches and continual improvement with AI. "We've been working on AI long before it was sexy," she said.

During Rocket's investor day, CTO Shawn Malhotra, who started in May after being Head of Engineering and Product Development at Thomson Reuters, laid out the plan. First, Malhotra outlined how Rocket's previous technology decisions left it in good shape for genAI. "AI is not new. It's been around for a while and it's been powerful for a while. Rocket delivered its first production AI models back in 2012," said Malhotra. "We now have more than 200 AI models in production adding real value for our business and our clients. It's important to remember that AI is not a what, it's a how that enables powerful outcomes. Those outcomes are what matter to our clients and our business."

For Rocket, those outcomes revolve around using AI to enhance the "entire homeownership journey" with more seamless processes that are efficient and personalized. This strategy means AI touches every customer touchpoint. Rocket works with Amazon Web Services and Anthropic to meld third-party models and proprietary systems. "We're always going to focus on our secret sauce and our proprietary AI models, but then we're going to deeply partner with the world's best on great large language models that are across domains," said Malhotra. "We're not going to just buy this from them. We're going to co-create."

GenAI is also about productivity. Rocket said closing times for refinancing and home equity loans are 30% to 45% faster than industry benchmarks. The company also uses AI to process more than 300,000 transcripts from client calls weekly to extract data points.

Malhotra said the next phase of AI for Rocket is about accelerating experiences in the homeownership journey. GenAI will be deployed more in marketing automation, customer interactions and servicing. Rocket also plans to double its use of AI in software development and customer-facing operations.

"We're helping our developers produce more code automatically in the last 30 days. We estimate that 27% of our code over a quarter was written by the AI. That's a good start, but soon, we're going to be doubling the number of developers who are using these tools," said Malhotra.

AI-powered chat across all digital platforms was another key theme. Malhotra said the models will be improved to blend human empathy with AI responses.

He outlined the following AI services that are being enhanced or rolled out.

Rocket Data Platform: Malhotra said the data platform needs continual improvement, with new features such as ID resolution, more data ingestion and democratization of access with natural language.

Rocket Exchange: A platform with 122 proprietary models powered by 6TB of data that allows the company to accurately price mortgage-based securities in less than a minute.

Rocket Logic Assistant: A personal assistant for mortgage bankers, Rocket Logic Assistant transcribes calls, auto-completes mortgage applications, and extracts key data points. This helps bankers focus on relationship-building and customer service rather than manual tasks. Rocket plans to expand its capabilities and automate more of the mortgage process.

Pathfinder and Rocket Logic Synopsis: These tools use generative AI to assist customer service agents by reducing the time needed to resolve customer queries. This has led to a 68% reduction in the time to resolve client requests.

Rocket Navigator will be built out so non-technical team members can leverage AI and contribute to product development.

Rocket said it is increasingly focused on developing AI tools to improve the midfunnel experiences with a focus on personalization and long-tail engagement. These tools will deliver bespoke guidance, educational content, and tailored recommendations to potential homebuyers over time.

The company also hinted that it wants to develop AI-powered real estate tools to help homebuyers search for properties, get insights and automate more of the home-buying process beyond mortgages. This effort is likely to be built on data from Rocket Homes and other properties in its ecosystem.

The Future of Data Protection

I recently released my latest Constellation ShortList™ for “Data Protection Infostructure”. In this blog post and the next, I drill into what these sorts of solutions are seeking to do.

“Data protection” in many parts of the world is simply synonymous with data privacy.  For instance, the General Data Protection Regulation (GDPR) is a data privacy regime; it is very specifically about limiting the flow of personal information, a special class of data.  Further, Europeans tend to refer to privacy regulators as Data Protection Authorities, and the conventional privacy compliance tool is the Data Protection Impact Assessment.  

So “Data Protection” in Europe has a narrow, even technical, meaning.

Now, regular readers will know I am a huge fan of regulating the collection, use and disclosure of personal information. A great many problems of the digital era, from Surveillance Capitalism to Deep Fakes, can be tackled by more strenuous and creative application of regular privacy rules featured in most legal systems.

Nevertheless, there is more to data protection than privacy. Privacy by its nature is restrictive. I’d like to spark a broader discussion about what it is about data that needs protecting. We could begin by asking, What is it that makes data valuable?  

First let’s review how security professionals think about data.

Conventional wisdom in data security is that threats to information assets can be viewed in three different dimensions: Confidentiality, Integrity and Availability (or “C-I-A”). Different asset classes can be stronger to different degrees in any of these dimensions. For instance, patient information needs to be especially confidential but medical records also need to have high availability if they are to be useful at a point of care, and high integrity (error resistance) to keep patients safe.

On the other hand, historical employee records — often retained for legal reasons for seven years or more — might not need to be highly available, so archiving on magnetic tape or even paper is worthwhile to keep personal data away from hackers.

But the “C-I-A” perspective is missing so many of the richer dimensions that make data valuable.

Consider three current hot topics:

  1. Identity Theft is generally perpetrated by data thieves who acquire personal data and use it to impersonate their victims. The problem is that automated identification systems can’t tell if personal data is being presented by the individual concerned or by an imposter (see also my analysis of data breaches).
  2. Deep Fakes are images or audio that look or sound like real people but have actually been synthesised artificially (typically by Generative AI) instead of recording the real thing.
  3. And speaking of AI, there is increasing interest in the history of how models are trained. What sort of training data was used? Was it broad and deep enough to be free of bias? Were people in that data aware that it would be used to train AIs?

Availability, Integrity and Confidentiality are not useful ways to think about safeguarding data in any of these cases. Think about how most LLMs today are trained on "public domain" data. No matter where you stand on the question of creators' intellectual property rights, we would all agree it's too late to make the artworks in question confidential.

Instead of "C", "I" or "A", stakeholders across these and similar examples may want assurances that:

  • personal data submitted by a purported individual opening an account or applying for a job was really presented by that person
  • creative works used to train an AI model have been licensed for use
  • medical data used to train a diagnostic tool has been audited for bias and came from patients who gave informed consent
  • the science behind a diagnostic tool has been properly evaluated, and
  • software used to generate a particular result was version controlled and can be wound back to an earlier release if bugs are found.

From one digital use case to another, there will be different aspects or qualities of the data concerned that make the data fit for purpose — or in other words, valuable.

In my next blog, I will focus on one such dimension that’s missing from the traditional C-I-A picture: the origins of data.

 

Constellation Energy, Microsoft ink nuclear power pact for AI data center

Constellation Energy said that it is restarting Three Mile Island (TMI) Unit 1 and will sell about 835 megawatts of power to Microsoft for AI workloads.

In a release, Constellation Energy said that the deal with Microsoft is its largest power purchase agreement. TMI Unit 1 is adjacent to TMI Unit 2, which was shut down in 1979 and is being decommissioned. TMI Unit 1 is an independent facility and hasn't been impacted by Unit 2. Constellation Energy bought TMI Unit 1 in 1999.

Nuclear power has seen a resurgence in interest as the electricity grid strains under data center workloads due to generative AI. In addition, technology giants are trying to find a way to power AI workloads and hit carbon neutral goals. Simply put, hyperscale cloud nuclear deals may become more commonplace. In January, Amazon Web Services acquired a data center attached to Talen Energy's nuclear plant. Talen Energy will sell power to AWS.

Generative AI driving interest in nuclear power for data centers

In recent weeks, the drumbeat behind nuclear power as a solution for AI data center needs has picked up. Oracle CTO Larry Ellison talked up nuclear-powered data centers on the company's earnings conference call. Meanwhile, OpenAI CEO Sam Altman is chairman of Oklo, which is touting mini-nuclear reactors that can scale with data centers.

Constellation Energy has advocated for co-locating data centers at nuclear power plants as a way to build out infrastructure for AI quickly.

For Constellation Energy, the Microsoft deal is big. Five years ago, the power company shut down Three Mile Island Unit 1 due to poor economics. In a statement, Joe Dominguez, CEO of Constellation Energy, said "powering industries critical to our nation’s global economic and technological competitiveness, including data centers, requires an abundance of energy that is carbon-free and reliable every hour of every day, and nuclear plants are the only energy sources that can consistently deliver on that promise."

Earlier this week, Microsoft announced a partnership with BlackRock, Global Infrastructure Partners and MGX to invest in data centers and energy infrastructure to power AI.

Speaking on Constellation Energy's second quarter earnings call last month, Dominguez said:

"We're continuing to do well in our discussions and negotiations with data center companies. The simple fact is that data centers are coming and they're essential to America's national security and economic competitiveness. And it's absolutely critical that the U.S. not fall behind it. Time is of the essence. We simply cannot wait years for the data centers that are going to bring transformations."

Dominguez added that sustainability is also playing a role in nuclear and renewable energy demand:

"We're seeing more evidence of our customers, not just data center customers, but customers as a whole, evolving in their sustainability journeys from buying annual clean energy products to starting to match their hourly consumption with clean energy."

Bottom line: Nuclear power is likely to play a big role in the AI factory buildout.

 

How Iron Mountain built its InSight DXP on MongoDB

Iron Mountain recently announced that its InSight Digital Experience Platform (DXP) will use MongoDB Atlas and MongoDB Atlas Vector Search for document processing, workflow automation and information governance.

The company's InSight DXP is a product that highlights Iron Mountain's overall transformation. The company has been best known for its shredding and disposal of physical documents, but has expanded into digital services, document lifecycle management and even leasing data centers.

Iron Mountain's revenue for the first six months of 2024 was $3 billion, up 13% from a year ago. Storage rental revenue was $1.8 billion for that time period with services revenue of $1.2 billion.

We caught up with Adam Williams, Vice President of Global Platforms at Iron Mountain, to talk about use of MongoDB and scaling. Here's a look at some of the key takeaways.

The move from physical documents to digital transformation services. Williams said Iron Mountain has housed and digitized many physical assets, including microfilm and microfiche. "Customers were then asking us 'can you digitize those for us and manage them as well?'" said Williams. "That's where we got into content management and repositories."

The decision to build instead of buy. Williams said Iron Mountain initially used a bevy of vendors for enterprise content management and content services platforms. "We work with large banks, large insurance companies and government agencies that have petabytes of data. For us to be able to store that data we need to have a very elastic and scalable database," said Williams. Iron Mountain decided to build on MongoDB's NoSQL and Atlas platform and consolidated a search vendor it was using.

"We were looking for the ability to do more at scale but without the overhead," said Williams, who noted that vector search was also critical. "We entered genAI, so we were able to take Mongo Atlas, search, the vectorization and then those SQL capabilities, and instead of using a patchwork of vendors, we're able to work with a single vendor. But more importantly, we only move the data once. We don't have to move the data three different times to three different places and then pay for it in three different places."

A multicloud architecture. Given Iron Mountain's footprint across the Fortune 500, many customers have data residency requirements. Iron Mountain supports AWS, Microsoft Azure and Google Cloud, but has stringent multicloud requirements that led to MongoDB instead of separate databases on each cloud, said Williams.

A platform to accommodate multiple customers. InSight DXP's front end and user experience is built by Iron Mountain. Williams explained:

"We have a modern user experience that we built. What's really needed is a user experience is customizable. At Iron Mountain, we deal with a lot of different industries. Our technology strategy with the platform that we built was actually designed out of frustration with all of the different industries that we work for. I found myself as a leader having to work with energy in the morning, healthcare in the afternoon, and then the next day I'm talking to financial services. You end up in this never-ending cycle where you can't be enough to please everybody. By going into platform approach, we're able to create customizable experiences for the different industries. With our updated user interface, we bring workflow, which we've built, and a connector strategy. We also bring in our data processing capability with the intelligent document processing that we've built."

Williams added that InSight DXP can pull in data whether it's digitized by Iron Mountain or already digital, transform it, extract the metadata and move it into workflows with information governance and management. GenAI can be used to gain insights from unstructured data.

The InSight DXP platform includes intelligent document processing built on traditional AI, machine learning and internal models, along with content management that can absorb documents, edit and manage data, and apply governance such as retention schedules and disposition of assets. Customers need audit-ready compliance for all documents and data.

Business models. Iron Mountain charges a subscription for InSight DXP based on the number of workflows, document types, number of users, overall size of assets and other metrics. Iron Mountain pays MongoDB based on consumption.

GenAI. Williams said Iron Mountain is building out its genAI product and the challenge is offering those services affordably. "We want to make sure we understand all of the finances behind genAI and we're getting better views from MongoDB on the costs by the different services and the different SKUs," said Williams. Security compliance with genAI is also critical for regulated industries, and MongoDB has the controls to understand what data is being shared, he added. Iron Mountain is generally using Microsoft's Azure OpenAI Service since most customers are comfortable with it and there's a data privacy guarantee.

Amazon launches Project Amelia for third party sellers, powered by AWS

Amazon is rolling out a series of generative AI tools for third-party sellers built on Amazon Web Services' models and Amazon Bedrock.

The announcement of the tools highlights how Amazon can showcase AWS and its advertising capabilities with its commerce platform. Among the tools of note is Project Amelia, which serves as a personal selling expert.

Project Amelia, which is in beta and built on Amazon Bedrock, understands a seller's business and provides recommendations, insights and information. Amazon said Project Amelia will ultimately be able to recognize opportunities, diagnose issues, offer tips on how to grow revenue and optimize inventory and act as an agent to take action autonomously.

Project Amelia can:

  • Answer knowledge-based questions with personalized information and best practices.
  • Serve up sales data, traffic information and business metrics.
  • Resolve issues and take actions.

Other genAI tools being rolled out to Amazon sellers, typically small and midsized businesses, include:

Generative AI content and product listings. Amazon said it is upgrading capabilities with its genAI product listings so sellers can create multiple listings and workflows at the same time. For instance, a seller could upload a spreadsheet with listing details and Amazon genAI would create titles, bullet points and descriptions.

A+ Content, which gives brands on Amazon the ability to develop custom content, images and carousels, and to iterate. A+ Content is available today in the US with more countries rolling out by the end of the year.

Personalized product recommendations on the website and in Amazon's shopping app. Amazon uses genAI to tailor recommendations based on time, topic and season. Amazon said:

"By analyzing product attributes and customer shopping information, like preferences, search, browsing, and purchase history, we leverage a Large Language Model (LLM) to edit a product title to highlight features that we believe are most important to the customer and their current shopping activity. To ensure these titles accurately reflect what matters most to each individual, another LLM, known as an evaluator LLM, challenges and improves the results, assuring customers see the best possible product information."

GenAI video ads. Amazon is leveraging genAI and its ad unit to create video ads. Amazon launched Video Generator, which will give sellers Amazon's ad tools to create video based on a single product image for free.

T-Mobile, OpenAI to launch custom AI-driven IntentCX system

T-Mobile said it is custom-building an "intent-driven AI-decisioning platform" called IntentCX with OpenAI. The partnership with OpenAI is part of a broader AI push by T-Mobile.

The partnership between OpenAI and T-Mobile is notable on a few fronts. First, the OpenAI deal with T-Mobile highlights the frenemy arrangement with Microsoft. In addition, the T-Mobile-OpenAI deal comes a day after Salesforce CEO Marc Benioff ranted against do-it-yourself approaches to generative and agentic AI.

Under the OpenAI and T-Mobile deal, the companies will combine the wireless carrier's real-time data on intent, customers and sentiment with OpenAI models. The two companies said they will continue to collaborate on AI services and tools as part of a multi-year effort.

According to the companies, IntentCX will do the following:

  • Apply understanding and knowledge to every interaction.
  • Resolve issues and take proactive actions.
  • Maximize T-Mobile customer journeys and ultimately provide a blueprint that can be commercialized to other industries.
  • Personalize service with a combination of humans and digital agents.
  • Navigate multi-threaded conversations across languages with context.
  • Take action autonomously where needed.
  • Tap into OpenAI's latest models to improve engagement.

IntentCX will be trained on T-Mobile's customer care and Team of Experts business processes, as well as billions of data points from customer interactions.

In a statement, T-Mobile CEO Mike Sievert said:

"IntentCX is much more than chatbots. Our customers leave millions of clues about how they want to be treated through their real experiences and interactions, and now we’ll use that deep data to supercharge our Care team as they work to perfect customer journeys."

For OpenAI CEO Sam Altman, the T-Mobile deal could pave the way for industry-specific platforms leveraging its models. T-Mobile said it is actively testing IntentCX with implementation on tap for 2025.

Along with the OpenAI deal, T-Mobile announced the following:

  • A technology partnership with Nvidia, Ericsson and Nokia to leverage AI in mobile networking infrastructure. The AI radio access network (RAN) will combine T-Mobile's 5G expertise with Nvidia AI Aerial platform and networking knowhow from Ericsson and Nokia. This effort will increase speeds and leverage AI to bolster gaming, video, social media and augmented reality.
  • A plan to enable AI customer experiences and grow market share. T-Mobile aims to reach 12 million 5G broadband customers by 2028 using excess capacity, a more than 50% increase from its previous target of 7 to 8 million customers by 2025. Service revenue growth is expected to have a compound annual growth rate of 5% between 2023 and 2027. AI and efficiencies are expected to boost adjusted Ebitda and cash flow. T-Mobile projected $18 billion to $19 billion in adjusted free cash flow in 2027.
  • T-Priority, a network slice for first responders.

HubSpot launches Breeze AI agents, Breeze Intelligence for data enrichment

HubSpot launched Breeze, the company's AI offering that features copilots and agents, and Breeze Intelligence, which enriches data and identifies the best prospects.

The tandem of Breeze and Breeze Intelligence, outlined in HubSpot's Fall 2024 Spotlight, is designed to help marketing, sales and service teams drive revenue.

Breeze includes Copilot, an AI work companion, four Breeze Agents focused on content, social media, prospecting and customer interactions, and more than 80 additional features. HubSpot also announced agent.ai, a marketplace for agents that has Agent Builder. HubSpot said agent.ai has more than 47,000 users and more than 1,700 builders.

With Breeze Intelligence, HubSpot is looking to solve multiple pain points. Breeze Intelligence includes data enrichment via more than 200 million buyer and company profiles for contacts in HubSpot Smart CRM, buyer intent signals designed to find the best prospects, and form-shortening tools that increase conversion by filling in information Breeze Intelligence already knows.

"Our customers acquired a bunch of point solutions, and they are struggling to integrate all of those data points into actionable insights and to control the cost, and they're looking for much better ways to be able to get all of that information into a single platform," said HubSpot CEO Yamini Rangan on the company's second quarter earnings call. 

HubSpot also updated Marketing Hub and Content Hub with Content Remix for video, tools like Lead Scoring and Google Enhanced Conversions, and tools to measure impact with Marketing Analytics Suite.
