Google I/O 2024: Multimodal Gemini, Project Astra, AI agents and 'teammates'

Google I/O 2024 featured a bevy of generative AI advances that will be layered throughout Google's product portfolio--photos, music, Workspace, search, video and other areas--but the real takeaway was the company's vision for agents, models training models, and systems that work on your behalf.

Sundar Pichai, Alphabet CEO, outlined the vision a day after OpenAI unveiled GPT-4o:

"With multi-modality soon you'll be able to mix and match inputs and outputs. This is what we mean when we say it's an IO for a new generation. That's one of the opportunities we see with AI agents. I think of them as intelligent systems that show reasoning, planning and memory and are able to think multiple steps ahead. They work across software and systems, all to get something done on your behalf and most importantly, under your supervision.

We are still in the early days, and you will see glimpses of our approach throughout the day."

Search is obviously the most important venue for Google generative AI given that it funds the model buildout. Google rolled out multistep reasoning in search and features such as AI Overview. Google search will ultimately reason, do research for you and tap into real-time data. We assume that this work on your behalf will feature advertising, but the data loop may be more important. 

But search was just the obvious headliner--for developers and Wall Street. Here are the takeaways to note from Google I/O's barrage of news, which served as a sequel to Google Cloud Next.

Project Astra. Astra is a "universal AI agent" designed to be helpful in everyday life. Astra is one of the reasons Gemini is multimodal and why Google has AI models training AI models. Google DeepMind chief Demis Hassabis said Astra is a creation built on models training models and data feedback loops.

"At any one time, we have many different models in training, and we use our very large and powerful ones to help teach and train our production ready models together with user feedback," said Hassabis.

The token arms race. Pichai outlined how 1.5 million developers are using Gemini models to create next-gen AI applications. Gemini 1.5 Pro will now have a 2 million token context window and aggressive pricing. Gemini 1.5 Pro will be available today in Workspace Labs and be capable of synthesizing meetings, documents and your Gmail quickly.
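For a sense of scale, here's a back-of-the-envelope sketch of what a 2 million token context window holds. The characters-per-token and words-per-page figures below are generic rules of thumb, not Gemini-specific numbers:

```python
# Rough capacity of a long-context window. The conversion constants are
# common heuristics (assumptions), not values published for Gemini:
# real token counts depend on the tokenizer and the language of the text.
CHARS_PER_TOKEN = 4      # assumed average characters per token
WORDS_PER_TOKEN = 0.75   # assumed average words per token
WORDS_PER_PAGE = 500     # assumed words on a single-spaced page

def window_capacity(tokens: int) -> dict:
    """Estimate how much text fits in a context window of `tokens` tokens."""
    words = tokens * WORDS_PER_TOKEN
    return {
        "characters": tokens * CHARS_PER_TOKEN,
        "words": int(words),
        "pages": int(words / WORDS_PER_PAGE),
    }

print(window_capacity(2_000_000))
# On these assumptions, 2M tokens is roughly 1.5 million words (~3,000 pages).
```

By the same heuristics, the 1 million token window cited for Gemini 1.5 Flash would hold about half that.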

Trillium is a new TPU that will be available on Google Cloud for machine learning and AI workloads. "Trillium delivers a 4.7x improvement in compute performance per chip over the previous generation. [It's our] most efficient and performant TPU today. [We] will make Trillium available to our cloud customers in late 2024," said Pichai, who noted that Google Cloud will offer Nvidia Blackwell GPUs too. Pichai's message was that Google has the scale to keep building out infrastructure for the AI arms race, whether that's liquid cooling advances, custom processors or fiber cables around the world.

Gemini 1.5 Flash is a model expected to approach Gemini 1.5 Pro's capabilities while being faster. Gemini 1.5 Flash is designed for low-latency tasks, in line with the small-but-fast model trend. Gemini 1.5 Flash, available in public preview, is built for speed and real-time answers such as customer service responses, even with a context window of up to 1 million tokens.

Search will have planning capabilities. Google pitched a vision where search can answer complex questions and plan. When looking for ideas, Google will give you an AI-generated answer powered by Gemini. These answers and suggestions will likely be monetized. AI-organized search results will appear when you're looking for inspiration. Don't be surprised if these planning capabilities turn up in Google Cloud services. Would business and process planning be much of a leap?

Meet your AI teammate. Workspace's Gemini-powered side panel will be available this month. These features across Workspace apps--Gmail, Meet, Docs--were announced at Google Cloud Next. Google demonstrated AI Teammate, an agent that has a Workspace account and participates in tasks and projects. The demo featured "Chip," an AI teammate configured by the business. The timeline? Stay tuned. Google is also rolling out Gems, personalized Gemini assistants.

Gemini Nano will be multimodal on Android devices. This on-device AI announcement front-runs Apple's WWDC, but the move highlights how smartphones and edge devices will run models with low latency and create new experiences. Android 15 will feature "Gemini at the core."

AWS names Garman CEO effective June 3

Matt Garman will become the new CEO of Amazon Web Services as Adam Selipsky is stepping down. Garman takes over June 3.

The Verge first published the memo from Jassy to AWS employees. Sources confirmed the report was accurate, with more to come from the company shortly.

Amazon published the memos from Jassy, Selipsky and Garman.

Garman currently is Senior Vice President of AWS Sales, Marketing and Global Services. Previously, he was Vice President of AWS Compute Services.

Selipsky was one of the first vice presidents at AWS in 2005, spent 11 years there and then left to become CEO of Tableau. When Andy Jassy took over as CEO of Amazon, Selipsky returned to run AWS.

In his staff email, Jassy said when Selipsky returned it was expected that he'd lead for a few years and develop the bench of executive talent.

AWS has seen customers optimize their cloud spending while working through generative AI plans. With the launch of Amazon Q and the uptake of Amazon Bedrock, AWS is on a good trajectory with a $100 billion revenue run rate.

AWS annual revenue run rate hits $100 billion as growth accelerates | Constellation ShortList™ Global IaaS for Next-Gen Applications | Constellation ShortList™ Artificial Intelligence and Machine Learning Cloud Platforms

Constellation Research CEO Ray Wang said:

"Adam played a key role in Andy Jassy’s transition to CEO of overall Amazon.  The current morale has not been the best, and customers’ expectations remain high as we enter an Age of AI. This sets Matt Garman up to be in a good place to take AWS to the next level."

Alibaba's cloud unit growth anemic in Q4 with AI green shoots ahead

Alibaba's Cloud Intelligence group posted revenue growth of 3% for the quarter ending March 31, well below global cloud providers.

In the company's fourth quarter earnings report, Alibaba--China's largest e-commerce company--said its Cloud Intelligence Group reported revenue of $3.54 billion, up 3% from a year ago. Adjusted EBITDA for Alibaba's Cloud Intelligence Group was $198 million.

According to Alibaba, revenue growth for the cloud unit was driven by the company's other businesses. Excluding Alibaba-consolidated subsidiaries, revenue fell slightly. Alibaba had planned to spin off its cloud unit but shelved those plans, citing uncertainty over trade restrictions.

Compared to other cloud vendors, Alibaba's cloud unit is sputtering. AWS grew revenue 13% in the first quarter and hit a $100 billion run rate. Microsoft Azure revenue growth in its most recent quarter was 31% and Google Cloud's first quarter revenue growth was 28%.

Constellation ShortList™ Global IaaS for Next-Gen Applications | Constellation ShortList™ Artificial Intelligence and Machine Learning Cloud Platforms

Alibaba said it has been focusing on high-quality cloud revenue and transitioning away from "low-margin project-based revenues." "We expect the strong revenue growth in public cloud and AI-related products will offset the impact of the roll-off of project-based revenues," the company said.

According to Alibaba, cloud revenue related to AI is surging with strength from foundational model companies, financial services and automotive.

For fiscal 2024, Alibaba Cloud Intelligence Group reported revenue of $14.73 billion, up 3% from a year ago, with EBITDA of $848 million.

Alibaba reported fiscal 2024 net income of $11.04 billion on revenue of $130.35 billion, up 7% from a year ago. Alibaba also announced a bevy of executive appointments in various units. Fourth quarter revenue growth for Alibaba was 7%. 

OpenAI's GPT-4o: A look at short-term, mid-term and long-term implications

OpenAI's launch of GPT-4o appears to have upped the large language model (LLM) ante with a real-time conversational chat interface that recognizes audio and video and detects emotions. Here's a look at the implications for the enterprise in the short term and the long run.

First, the details. OpenAI's GPT-4o enables you to do a lot more than traditional models. In fact, GPT-4o is more in line with what you'd see in a science fiction movie. It's cool, odd and scary at the same time. For what it's worth, the "o" stands for omni, and GPT-4o can respond to audio in an average of 320 milliseconds, roughly matching human conversational response times. It's also more efficient and cheaper.

Here's a look at the benchmarks.

And in a blog post, OpenAI CEO Sam Altman said: "the new voice (and video) mode is the best computer interface I’ve ever used. It feels like AI from the movies; and it’s still a bit surprising to me that it’s real. Getting to human-level response times and expressiveness turns out to be a big change."

Meanwhile, OpenAI aims to make its new model affordable with a free tier and then access via ChatGPT Plus and Teams with a rollout to enterprises on deck.

Here are a few thoughts about what this all means.

  • Short-term: This launch is almost comical in how it lands just ahead of Apple's WWDC. Apple has been widely reported to be in negotiations with OpenAI about embedding its models into iOS. And in case you haven't noticed (and who hasn't?), Siri has needed a new brain for years. OpenAI's GPT-4o basically seals the Apple deal and a massive revenue stream.
  • Mid-term: OpenAI has a close relationship with Microsoft and that's been mutually beneficial. The problem for OpenAI is that Microsoft is more likely to own that customer relationship than OpenAI. It's obvious GPT-4o can help build more direct enterprise relationships. Moderna and OpenAI may just be a start as enterprises will want more direct access to GPT-4o.
  • Mid-term: Enterprise use cases with GPT-4o are going to surge. Assuming OpenAI's latest and greatest model can closely replicate a human, customer experiences are likely to become even more human.
  • Short-term: GPT-4o is interesting because OpenAI is so good at big bang AI. You can expect the competition to heat up even more from LLM rivals, who may not be so far behind.
  • Short-term: The enterprise pendulum has swung to more model choices, but key platforms like Amazon Bedrock feature a bevy of choices that usually don't include OpenAI. OpenAI with GPT-4o may be able to carve out relationships with hyperscalers not named Microsoft Azure. Enterprises are going to want to swap out models as needed, but OpenAI's latest LLM may be too good to ignore. Despite what enterprises say about being vendor neutral, they usually gravitate to one provider and lock-in (and complain about it later).
  • Long-term: GPT-4o appears to be a big leap and enough to trigger an even larger white-collar recession in the future. There are also a bevy of other cultural issues to ponder. As the LLM race accelerates, these issues are only going to become larger.
  • Long-term: OpenAI has already made ChatGPT a verb to some degree. Broad access at a reasonable price may seal the deal. The company said:

"GPT-4o's text and image capabilities are starting to roll out today in ChatGPT. We are making GPT-4o available in the free tier, and to Plus users with up to 5x higher message limits. We'll roll out a new version of Voice Mode with GPT-4o in alpha within ChatGPT Plus in the coming weeks.

Developers can also now access GPT-4o in the API as a text and vision model. GPT-4o is 2x faster, half the price, and has 5x higher rate limits compared to GPT-4 Turbo. We plan to launch support for GPT-4o's new audio and video capabilities to a small group of trusted partners in the API in the coming weeks."
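The "half the price" claim translates directly into workload budgeting. A minimal sketch, assuming a placeholder per-token price for GPT-4 Turbo (actual rates are on OpenAI's pricing page and change over time):

```python
# Illustrative cost comparison built only on OpenAI's stated 2x price ratio.
# The GPT-4 Turbo price below is an assumed placeholder, not an official rate.
GPT4_TURBO_USD_PER_1K_INPUT = 0.01                         # assumption
GPT4O_USD_PER_1K_INPUT = GPT4_TURBO_USD_PER_1K_INPUT / 2   # "half the price"

def monthly_input_cost(tokens_per_month: int, usd_per_1k: float) -> float:
    """Cost of the input side of a workload at a given per-1K-token price."""
    return tokens_per_month / 1_000 * usd_per_1k

# A hypothetical workload of 100M input tokens per month:
turbo_cost = monthly_input_cost(100_000_000, GPT4_TURBO_USD_PER_1K_INPUT)
gpt4o_cost = monthly_input_cost(100_000_000, GPT4O_USD_PER_1K_INPUT)
print(turbo_cost, gpt4o_cost)  # the GPT-4o figure is half the Turbo figure
```

The 5x higher rate limits compound the effect: the same budget buys more tokens, and they can be consumed faster.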


Rocket Companies’ genAI strategy: Playing both the short and the long game

Rocket Companies, a fintech company with mortgage, real estate, and personal finance businesses, is starting to see the payoff from its generative AI efforts as well as a bet on AWS’s Amazon Bedrock.

The company reported first-quarter revenue of $1.4 billion and net income of $291 million. The first quarter topped Wall Street’s expectations as well as the company’s internal guidance. On an earnings conference call, Rocket CEO Varun Krishna, formerly an executive at Intuit, PayPal, Groupon, and Microsoft, said the company is taking share, focusing on what it can control as interest rates and the mortgage market ebb and flow, and investing in artificial intelligence (AI) to transform the business.

“The reason that we’re obsessed with AI is because it brings a number of transformative benefits to our business,” says Krishna. “We’re playing both the short and the long game, gaining momentum and achieving success while strategically planning and executing for the long term. We are committed to delivering industry-leading experiences powered by AI benefiting our clients, mortgage brokers, real estate agents, financial institution partners, and our team members alike.”

See PDF version of this customer story.

Rocket Companies’ journey to becoming an AI-driven mortgage and lending disruptor has been a long one. The company started as Rock Financial in 1985; created Mortgage In A Box, a mail-in mortgage application, in 1996; expanded into loans and became Quicken Loans in 1999; became the largest provider in FHA loans in 2014; became the largest residential mortgage lender in 2017; and went public in 2020. Throughout its history, Rocket has had to manage through real estate and lending boom-and-bust cycles.

In April 2024, Rocket launched Rocket Logic, an AI platform built on insights from more than 10 petabytes of proprietary data and 50 million annual call transcripts. Rocket Logic scans and identifies files for documentation, uses computer vision models to extract data from documents, and saved underwriters 5,000 hours of manual work in February.

Rocket quickly followed up with Rocket Logic – Synopsis, an AI tool that analyzes and transcribes customer calls, analyzes sentiment, and identifies patterns. Synopsis is built on AWS and Amazon Bedrock, which features models from Anthropic, Cohere, Meta, Mistral, and others; has made 70% of client interactions self-service; and will learn from homeowner communications preferences over time.

Krishna also cites a new AI effort from Rocket Homes called Explore Spaces Visual Space, which enables users to upload photos of features they deem important and use image recognition to find homes. Another generative AI pilot enables clients to update their verified approval letters by using their voice. That use of AI will save bankers and underwriters time, since they manually adjust letter modifications almost 300,000 times a year.

“AI eliminates the drudgery of burdensome, time-consuming manual tasks so that our team members can spend more time on making human connections and producing higher-value work. Ultimately, with AI, we are driving operational efficiency, speed, accuracy, and personalization at massive scale,” says Krishna.

The plan for Rocket is to continue to roll out AI services on Rocket Logic. Rocket’s strategy is to leverage generative AI in a model-agnostic way to gain market share during the mortgage-and-lending downturn. Recent Rocket Logic additions include Rocket Logic Assistant, which follows conversations in real time, and Rocket Logic Docs, a document processing platform that can extract data from loan applications, W-2s, and bank statements. In February 2024, Rocket said nearly 90% of documents were automatically processed.
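As an illustration of the task Rocket Logic Docs automates, here is a toy rule-based extractor over OCR'd text. The field names and regex patterns are hypothetical; Rocket's production pipeline uses trained computer vision models rather than hand-written rules:

```python
import re

# Hypothetical field patterns for a W-2-like document (illustration only).
PATTERNS = {
    "employer_ein": re.compile(r"EIN[:\s]+(\d{2}-\d{7})"),
    "wages": re.compile(r"Wages[:\s]+\$?([\d,]+\.\d{2})"),
}

def extract_fields(text: str) -> dict:
    """Return every pattern that matches in the OCR'd document text."""
    results = {}
    for field, pattern in PATTERNS.items():
        match = pattern.search(text)
        if match:
            results[field] = match.group(1)
    return results

sample = "Form W-2  EIN: 38-1234567  Wages: $85,250.00"
print(extract_fields(sample))  # both fields extracted from the sample line
```

Automating this per-field extraction at scale is what turns into the thousands of saved underwriting hours Rocket reports.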

“We believe artificial intelligence is evolving rapidly and approaching a critical inflection point, where knowledge engineering, machine learning, automation, and personalization will be at the center of how clients buy, sell, and finance homes,” explained Rocket in its annual report.

The necessary data foundation

Like many other companies looking to scale generative AI, Rocket’s focus on data science and data governance sets the stage. The lesson in 2024 is clear: The companies, such as Intuit, JPMorgan Chase, and Equifax, that have their data strategies down can leverage generative AI for competitive advantage.

Dian Xu, director of engineering in Data Intelligence at Rocket Central, speaking at AWS re:Invent 2023, outlined how the company had evolved from a legacy big data infrastructure to a more scalable AWS infrastructure and ultimately Amazon SageMaker and Bedrock.

Xu explained that Rocket had an open-source data lake in 2017 that worked well enough but that the company’s volume subsequently doubled and then tripled. “We realized the legacy structure couldn’t scale and that data ingestion took too long,” said Xu. “We knew we had to modernize.”

It didn’t help that legacy providers all had contracts coming up for renewal, plus support costs. Xu said Rocket had $1 million in fixed costs on top of cloud costs. Rocket used AWS’s migration acceleration program, retained cloud credits, and saved $3 million annually on supporting the data infrastructure.

Rocket’s journey included multiple services for data management and analytics before the company landed with SageMaker, which is used to manage models, deploy them, and provide an interface for multiple skill levels.

At the time of re:Invent, Xu said Rocket was prepared for generative AI, due to its data infrastructure, and was looking at Amazon DataZone for governance, CodeWhisperer, and Amazon Bedrock. Six months later, Rocket was outlining its Rocket Logic AI platform and Synopsis.

For Rocket CEO Krishna, the data foundation is the linchpin in model training. “The key to AI is continuous training of models with recursive feedback loops and data. We are organizing this invaluable data to construct unified client profiles in a centralized repository,” he says. “From this repository, we trained models to gain deeper insights and analytics to personalize all future interactions with our clients. The ultimate objective is to deliver an industry-best client experience that translates into better conversion rates and higher client lifetime value and to just get continuously better and better at it.”

The returns on generative AI

Rocket executives say Rocket Logic is already generating strong returns. The company says that Rocket Logic automation reduced the number of times an employee interacts with a loan by 25% in the first quarter, compared to a year ago.

Turn times for Rocket clients to close on a home purchase declined by 25% from August 2022 to February 2024. As a result, Rocket is closing loans nearly 2.5 times as fast as the industry average.

In addition, generative AI saves hours of manual work.

Research: Enterprises Must Now Cultivate a Capable and Diverse AI Model Garden

Rocket Logic Docs saved more than 5,000 hours of manual work for Rocket’s underwriters in February 2024. Extracting data points from documents saved an additional 4,000 hours of manual work.

Rocket CFO Brian Brown says, “AI is bringing tangible business value through enhanced operational efficiency, velocity, and accuracy at scale. The most apparent and significant value add that I’ve seen is augmenting team member capacity through operational efficiency.”

Brown says Synopsis is taking over manual tasks such as populating mortgage applications and classifying documents. “With AI handling this work, our team members have more time to provide tailored advice and engage in higher-value conversations with our clients.” He adds that Synopsis cut a quarter of manual tasks in the first quarter compared to a year ago.

Other generative AI returns from Rocket’s first quarter earnings conference call include:

  • 170,000 hours saved per year
  • First-call resolution improved 10% with Synopsis after a few weeks
  • Zero audit findings with generative AI income verification

Rocket executives say AI is about growth and efficiency and that both are on the same continuum. Brown says generative AI brings the ability to add more capacity into the system. “We did $79 billion in originations last year, and we believe we can put significantly more capacity through the system,” says Brown.

Krishna’s take is that the savings from AI can drive investment and growth as well as velocity. He says:

“The thing I’m excited about is that our AI strategy is specifically designed to create and unlock operating leverage. It will allow us to grow our capacity without increasing head count. And it will allow us to actually build our company and grow durably. So, we don’t look at this AI investment as a head count reducer. I mean that’s not how you build a growth company durably.

“But the combination of being able to invest in technology and have an ongoing principle around efficiency is how we think we’re going to create a durable flywheel.”

Sustainability 50 interview: Ann Arbor's Missy Stults on data and the importance of storytelling

Missy Stults, Sustainability and Innovations Director for Ann Arbor, MI, has seen sustainability grow up in her community and become more mature in measuring the impact on the climate. Stults noted that data is critical to sustainability, but storytelling is just as important.

Stults, one of Constellation Research's Sustainability 50 members for 2024, caught up with me to talk sustainability and rallying a community. Here's a look at some of the takeaways:

Herding carbon cats. Stults noted the challenges with tracking carbon emissions across a community, supply chain or any other ecosystem. She said Ann Arbor's local government is responsible for just shy of 2% of the community's greenhouse gas emissions. Those emissions cover buildings, water treatment plants and other infrastructure. "If I'm going to move towards carbon neutrality for the whole community in a just and equitable way, I've got to work with the whole community to do that. So, we do it through a lot of different techniques," she said. There are regulations and sticks, but carrots like resources, services, rebates and utility programs are more fun.

And sometimes you just have to show people what happens if we don't address climate change. Bad beer turns out to be an interesting illustration. Stults said:

"We've worked with local brewers and we brewed a pretty crappy beer, where all the ingredients were stressed to mimic what climate change would do to the conditions here in Michigan. Michigan has really great beer, but climate change is going to threaten that. So, we produced a craft beer and it was called fail of the earth. People got to try it and were like 'beer cannot taste like this.' We simply must do something about climate change."

The data. Stults said Ann Arbor tracks greenhouse gas inventories and there are protocols for local government operations just like businesses have. Standardization is key. But the data wrangling to date is less than perfect. Stults said:

"We also have to figure out things like purchasing. If you think about purchasing it's incredibly complex because we're trying to move to scope 3 and lifecycle analyses. I need to understand not just electric use and natural gas consumption, but what 120,000 people are buying, where those materials are coming from and how they're produced. There's a big movement in the local government field to create methodologies for how we can do that in a meaningful enough way. It'll never be specific, but we can at least have some generalized data that helps us make more informed policy decisions."
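The "generalized data" Stults describes usually takes the form of a spend-based estimate: multiply each purchasing category's spend by an average emission factor. A minimal sketch with made-up factors (real inventories draw on published factor sets such as the EPA's supply chain emission factors):

```python
# Spend-based Scope 3 estimate. The kg CO2e per dollar factors below are
# invented for illustration; a real inventory would use published factors.
EMISSION_FACTORS_KG_PER_USD = {
    "construction": 0.30,
    "food": 0.50,
    "office_supplies": 0.20,
}

def scope3_estimate(spend_by_category: dict) -> float:
    """Total estimated kg CO2e implied by purchasing spend per category."""
    return sum(
        usd * EMISSION_FACTORS_KG_PER_USD[category]
        for category, usd in spend_by_category.items()
    )

spend = {"construction": 1_000_000, "food": 200_000, "office_supplies": 50_000}
print(scope3_estimate(spend))  # 410000.0 kg CO2e on these assumed factors
```

As Stults says, it will never be specific, but it is consistent enough to compare policy options year over year.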

Storytelling matters as much as data. "We spend a lot of time standing up quantitative metrics in our work at the local level, but storytelling is just as important," she said. "What does this work mean to the actual people who live here? What does it mean for the local business? Why are they making these investments in these practices? Why are restaurants doubling down on plant forward diets? Or why are they working on sourcing from sustainable local businesses? We're a storytelling species. We have to tell stories of what this looks like."

The evolution of sustainability. When Stults started in her role, the team was small and the survival of the department depended on businesses and people wanting to work on sustainability. Today, Ann Arbor has passed a tax to fund climate work with more than 70% of the vote. Sustainability is part of the community identity. And given natural disasters, supply chain impacts and other climate issues, sustainability has become more of an issue for everyone. "People are just more aware. And I think there's a willingness to do something about it," said Stults.

Supply chains and sustainability. One reason sustainability has become more prominent is its relationship with the supply chain. And economic incentives between sustainability and supply chain are aligned. Stults said:

"We're paying attention to where we get goods, supplies and materials and where the labor is coming from. We're also asking questions about what our suppliers are doing about that. How are you reducing your own emissions? How are you making sure your supply chains are resilient, and that includes your employees? How are we thinking about this system holistically? I think that has evolved a lot in the last several years. We were pretty unsophisticated in this space, even five years ago."

The Role of AI in Sustainability | Sustainability 50 Interviews

Constellation Insights Editor-in-Chief Larry Dignan interviews Sandeep Chandna, a 2024 Sustainability 50 winner and Chief Sustainability Officer of Tech Mahindra, about how he's using AI technology to transform sustainability initiatives.

Video: https://www.youtube.com/embed/ylprRdgJKoo

Platform Based Communications Approach for Unified Experience

Dion Hinchcliffe, VP and Principal Analyst at Constellation Research, explains how digital experience benefits from a systemic approach.

Video: https://www.youtube.com/embed/91AqQSPGqhE

What is an iPaaS? Integration Platform as a Service Explained

Constellation Research explains the components and trends in Integration Platform as a Service and what to expect in a next-generation iPaaS offering.

Video: https://www.youtube.com/embed/s74O4Et69pE

Boomi aims to ease SAP Datasphere migrations

Boomi said it has enhanced Boomi for SAP to ease the migration of business data into SAP Datasphere.

The move, outlined at Boomi World, aims to solve a pain point for SAP customers facing the end of support for SAP Business Warehouse at the end of 2027. SAP customers have groused about SAP Datasphere as well as the need to support third-party data outside of SAP systems.

For SAP, Datasphere is a critical part of its process and automation plans since it rides alongside SAP Signavio and LeanIX. SAP also partners with UiPath for its automation platform, and Celonis is frequently plugged into SAP systems. Multiple players want to be your automation platform.

Boomi's plan is to use Boomi for SAP to accelerate the transition to SAP Datasphere on AWS through its iPaaS and Amazon Redshift. Boomi noted that today "the move to SAP Datasphere requires significant investment and substantial effort from highly-skilled individuals."

Steve Lucas, Boomi CEO, said the company's Enterprise Platform combined with AWS is a more cost-effective way to move to SAP Datasphere: customers can avoid SAP Business Warehouse upgrades and staging areas, tier data more efficiently to SAP Datasphere and Redshift, and better integrate and prepare data for AI and analytics.
