
Rocket Companies’ genAI strategy: Playing both the short and the long game

Rocket Companies, a fintech company with mortgage, real estate, and personal finance businesses, is starting to see the payoff from its generative AI efforts as well as a bet on AWS’s Amazon Bedrock.

The company reported first-quarter revenue of $1.4 billion and net income of $291 million, topping Wall Street’s expectations as well as the company’s internal guidance. On an earnings conference call, Rocket CEO Varun Krishna, formerly an executive at Intuit, PayPal, Groupon, and Microsoft, said the company is taking share, focusing on what it can control as interest rates and the mortgage market ebb and flow, and investing in artificial intelligence (AI) to transform the business.

“The reason that we’re obsessed with AI is because it brings a number of transformative benefits to our business,” says Krishna. “We’re playing both the short and the long game, gaining momentum and achieving success while strategically planning and executing for the long term. We are committed to delivering industry-leading experiences powered by AI benefiting our clients, mortgage brokers, real estate agents, financial institution partners, and our team members alike.”


Rocket Companies’ journey to becoming an AI-driven mortgage and lending disruptor has been a long one. The company started as Rock Financial in 1985; created Mortgage In A Box, a mail-in mortgage application, in 1996; expanded into loans and became Quicken Loans in 1999; became the largest provider of FHA loans in 2014; became the largest residential mortgage lender in 2017; and went public in 2020. Throughout its history, Rocket has had to manage through real estate and lending boom-and-bust cycles.


In April 2024, Rocket launched Rocket Logic, an AI platform built on insights from more than 10 million petabytes of proprietary data and 50 million annual call transcripts. Rocket Logic scans and identifies files for documentation, uses computer vision models to extract data from documents, and saved underwriters 5,000 hours of manual work in February.

Rocket quickly followed up with Rocket Logic – Synopsis, an AI tool that analyzes and transcribes customer calls, analyzes sentiment, and identifies patterns. Synopsis is built on AWS and Amazon Bedrock, which features models from Anthropic, Cohere, Meta, Mistral, and others; has made 70% of client interactions self-service; and will learn from homeowner communications preferences over time.
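
The article doesn't detail how Synopsis works internally, but the Bedrock piece can be sketched. Below is a minimal, hypothetical example of assembling a call-analysis request for Amazon Bedrock's Converse API with boto3; the model ID, prompt wording, and helper names are illustrative assumptions, not Rocket's implementation.

```python
def build_converse_request(transcript: str,
                           model_id: str = "anthropic.claude-3-haiku-20240307-v1:0") -> dict:
    """Assemble kwargs for bedrock_runtime.converse(): summarize a call, tag sentiment.

    The model ID and prompt are illustrative stand-ins, not Rocket's actual setup.
    """
    prompt = (
        "Summarize this client call in two sentences, then label the overall "
        "sentiment as positive, neutral, or negative.\n\n" + transcript
    )
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": 300, "temperature": 0.2},
    }


def analyze_call(bedrock_runtime, transcript: str) -> str:
    """Send the request via a boto3 'bedrock-runtime' client; return the model's text."""
    resp = bedrock_runtime.converse(**build_converse_request(transcript))
    return resp["output"]["message"]["content"][0]["text"]
```

Calling `analyze_call(boto3.client("bedrock-runtime"), transcript)` would require AWS credentials and Bedrock model access; the request builder itself runs anywhere.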

Krishna also cites a new AI effort from Rocket Homes called Explore Spaces, which enables users to upload photos of features they deem important and uses image recognition to find matching homes. Another generative AI pilot enables clients to update their verified approval letters by using their voice. That use of AI will save bankers and underwriters time, since they manually adjust approval letters almost 300,000 times a year.

“AI eliminates the drudgery of burdensome, time-consuming manual tasks so that our team members can spend more time on making human connections and producing higher-value work. Ultimately, with AI, we are driving operational efficiency, speed, accuracy, and personalization at massive scale,” says Krishna.

The plan for Rocket is to continue to roll out AI services on Rocket Logic. Rocket’s strategy is to leverage generative AI in a model-agnostic way to gain market share during the mortgage-and-lending downturn. Recent Rocket Logic additions include Rocket Logic Assistant, which follows conversations in real time, and Rocket Logic Docs, a document processing platform that can extract data from loan applications, W-2s, and bank statements. In February 2024, Rocket said nearly 90% of documents were automatically processed.
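
A model-agnostic posture usually means putting an adapter layer between the application and the model providers, so a model swap never touches the callers. This toy sketch illustrates the pattern; the provider names and echo responses are placeholders, not Rocket's code.

```python
from typing import Callable, Dict


class ModelRouter:
    """Toy adapter layer: every provider sits behind the same prompt -> text signature."""

    def __init__(self) -> None:
        self._providers: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, fn: Callable[[str], str]) -> None:
        self._providers[name] = fn

    def complete(self, provider: str, prompt: str) -> str:
        if provider not in self._providers:
            raise KeyError(f"unknown provider: {provider}")
        return self._providers[provider](prompt)


# Placeholder "models" that just echo; real backends would call Bedrock, etc.
router = ModelRouter()
router.register("claude", lambda p: f"[claude] {p}")
router.register("mistral", lambda p: f"[mistral] {p}")
```

Because every backend satisfies the same signature, adding or replacing a model is a one-line `register` call.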

“We believe artificial intelligence is evolving rapidly and approaching a critical inflection point, where knowledge engineering, machine learning, automation, and personalization will be at the center of how clients buy, sell, and finance homes,” explained Rocket in its annual report.

The necessary data foundation

Like many other companies looking to scale generative AI, Rocket has set the stage with a focus on data science and data governance. The lesson of 2024 is clear: Companies such as Intuit, JPMorgan Chase, and Equifax that have their data strategies down can leverage generative AI for competitive advantage.

Dian Xu, director of engineering in Data Intelligence at Rocket Central, speaking at AWS re:Invent 2023, outlined how the company had evolved from a legacy big data infrastructure to a more scalable AWS infrastructure and ultimately Amazon SageMaker and Bedrock.

Xu explained that Rocket had an open-source data lake in 2017 that worked well enough but that the company’s volume subsequently doubled and then tripled. “We realized the legacy structure couldn’t scale and that data ingestion took too long,” said Xu. “We knew we had to modernize.”

It didn’t help that legacy providers all had contracts up for renewal plus ongoing support costs. Xu said Rocket carried $1 million in fixed costs on top of cloud costs. Rocket used AWS’s migration acceleration program, retained cloud credits, and saved $3 million annually on supporting the data infrastructure.

Rocket’s journey included multiple services for data management and analytics before the company landed with SageMaker, which is used to manage models, deploy them, and provide an interface for multiple skill levels.

At the time of re:Invent, Xu said Rocket was prepared for generative AI, thanks to its data infrastructure, and was evaluating Amazon DataZone for governance, Amazon CodeWhisperer, and Amazon Bedrock. Six months later, Rocket was outlining its Rocket Logic AI platform and Synopsis.

For Rocket CEO Krishna, the data foundation is the linchpin in model training. “The key to AI is continuous training of models with recursive feedback loops and data. We are organizing this invaluable data to construct unified client profiles in a centralized repository,” he says. “From this repository, we trained models to gain deeper insights and analytics to personalize all future interactions with our clients. The ultimate objective is to deliver an industry-best client experience that translates into better conversion rates and higher client lifetime value and to just get continuously better and better at it.”
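
The unified-client-profile idea Krishna describes reduces to a familiar pattern: fold interaction events into one record per client. A toy sketch follows; the event and profile field names are illustrative assumptions, since the source doesn't describe Rocket's schema.

```python
from collections import defaultdict


def build_profiles(events: list) -> dict:
    """Fold interaction events into one profile per client ID.

    Each event is a dict like {"client_id": ..., "channel": ...};
    the field names are illustrative, not Rocket's schema.
    """
    profiles = defaultdict(lambda: {"channels": set(), "interactions": 0})
    for event in events:
        profile = profiles[event["client_id"]]
        profile["channels"].add(event["channel"])   # remember where the client showed up
        profile["interactions"] += 1                # count touches for downstream analytics
    return dict(profiles)
```

A real repository would persist these profiles and feed them back into model training, which is the feedback loop Krishna is pointing at.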

The returns on generative AI

Rocket executives say Rocket Logic is already generating strong returns. The company says that Rocket Logic automation reduced the number of times an employee interacts with a loan by 25% in the first quarter, compared to a year ago.

Turn times for Rocket clients to close on a home purchase declined by 25% from August 2022 to February 2024. As a result, Rocket is closing loans nearly 2.5 times as fast as the industry average.

In addition, generative AI saves hours of manual work.


Rocket Logic Docs saved more than 5,000 hours of manual work for Rocket’s underwriters in February 2024. Extracting data points from documents saved an additional 4,000 hours of manual work.

Rocket CFO Brian Brown says, “AI is bringing tangible business value through enhanced operational efficiency, velocity, and accuracy at scale. The most apparent and significant value add that I’ve seen is augmenting team member capacity through operational efficiency.”

Brown says Synopsis is taking over manual tasks such as populating mortgage applications and classifying documents. “With AI handling this work, our team members have more time to provide tailored advice and engage in higher-value conversations with our clients.” He adds that Synopsis cut manual tasks by a quarter in the first quarter compared to a year ago.

Other generative AI returns from Rocket’s first quarter earnings conference call include:

  • 170,000 hours saved per year
  • First-call resolution improved 10% with Synopsis after a few weeks
  • Zero audit findings with generative AI income verification

Rocket executives say AI is about growth and efficiency and that both are on the same continuum. Brown says generative AI brings the ability to add more capacity into the system. “We did $79 billion in originations last year, and we believe we can put significantly more capacity through the system,” says Brown.

Krishna’s take is that the savings from AI can drive investment and growth as well as velocity. He says:

“The thing I’m excited about is that our AI strategy is specifically designed to create and unlock operating leverage. It will allow us to grow our capacity without increasing head count. And it will allow us to actually build our company and grow durably. So, we don’t look at this AI investment as a head count reducer. I mean that’s not how you build a growth company durably.

“But the combination of being able to invest in technology and have an ongoing principle around efficiency is how we think we’re going to create a durable flywheel.”


Sustainability 50 interview: Ann Arbor's Missy Stults on data and the importance of storytelling

Missy Stults, Sustainability and Innovations Director for Ann Arbor, MI, has seen sustainability grow up in her community and become more mature in measuring the impact on the climate. Stults noted that data is critical to sustainability, but storytelling is just as important.

Stults, one of Constellation Research's Sustainability 50 members for 2024, caught up with me to talk sustainability and rallying a community. Here's a look at some of the takeaways:

Herding carbon cats. Stults noted the challenges with tracking carbon emissions across a community, supply chain, or any other ecosystem. She said Ann Arbor's local government is responsible for just shy of 2% of the community's greenhouse gas emissions. Those emissions cover buildings, water treatment plants and other infrastructure. "If I'm going to move towards carbon neutrality for the whole community in a just and equitable way, I've got to work with the whole community to do that. So, we do it through a lot of different techniques," she said. There are regulations and sticks, but carrots like resources, services, rebates and utility programs are more fun.

And sometimes you just have to show people what happens if we don't address climate change. Bad beer turns out to be an interesting illustration. Stults said:

"We've worked with local brewers and we brewed a pretty crappy beer, where all the ingredients were stressed to mimic what climate change would do to the conditions here in Michigan. Michigan has really great beer, but climate change is going to threaten that. So, we produced a craft beer and it was called fail of the earth. People got to try it and were like 'beer cannot taste like this.' We simply must do something about climate change."

The data. Stults said Ann Arbor tracks greenhouse gas inventories and there are protocols for local government operations just like businesses have. Standardization is key. But the data wrangling to date is less than perfect. Stults said:

"We also have to figure out things like purchasing. If you think about purchasing it's incredibly complex because we're trying to move to scope 3 and lifecycle analyses. I need to understand not just electric use and natural gas consumption, but what 120,000 people are buying, where those materials are coming from and how they're produced. There's a big movement in the local government field to create methodologies for how we can do that in a meaningful enough way. It'll never be specific, but we can at least have some generalized data that helps us make more informed policy decisions."
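
The "generalized data" approach Stults describes is typically spend-based accounting: multiply purchases by category-level emissions factors. A toy sketch of the arithmetic follows; the factors are invented for illustration, where real ones would come from published databases such as the EPA's environmentally-extended input-output tables.

```python
# Spend-based scope 3 estimate: category-level emissions factors
# (kg CO2e per dollar) times spend. Factors below are invented examples.
FACTORS_KG_CO2E_PER_DOLLAR = {
    "food": 0.5,
    "electronics": 0.25,
    "construction": 0.75,
}


def scope3_estimate(purchases: list) -> float:
    """purchases: (category, dollars) pairs -> total kg CO2e.

    Unknown categories contribute zero, mirroring the 'never specific,
    but generalized' data Stults describes.
    """
    return sum(FACTORS_KG_CO2E_PER_DOLLAR.get(category, 0.0) * spend
               for category, spend in purchases)
```

Even this crude roll-up is enough to rank purchasing categories and steer policy, which is the point Stults makes about "meaningful enough" methodologies.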

Storytelling matters as much as data. "We spend a lot of time standing up quantitative metrics in our work at the local level, but storytelling is just as important," she said. "What does this work mean to the actual people who live here? What does it mean for the local business? Why are they making these investments in these practices? Why are restaurants doubling down on plant-forward diets? Or why are they working on sourcing from sustainable local businesses? We're a storytelling species. We have to tell stories of what this looks like."

The evolution of sustainability. When Stults started in her role, the team was small and the survival of the department depended on businesses and people wanting to work on sustainability. Today, Ann Arbor has passed a tax to fund climate work with more than 70% of the vote. Sustainability is part of the community identity. And given natural disasters, supply chain impacts and other climate issues, sustainability has become more of an issue for everyone. "People are just more aware. And I think there's a willingness to do something about it," said Stults.

Supply chains and sustainability. One reason sustainability has become more prominent is its relationship with the supply chain. And economic incentives between sustainability and supply chain are aligned. Stults said:

"We're paying attention to where we get goods, supplies and materials and where the labor is coming from. We're also asking questions about what our suppliers are doing about that. How are you reducing your own emissions? How are you making sure your supply chains are resilient, and that includes your employees? How are we thinking about this system holistically? I think that has evolved a lot in the last several years. We were pretty unsophisticated in this space, even five years ago."


The Role of AI in Sustainability | Sustainability 50 Interviews

Constellation Insights Editor-in-Chief Larry Dignan interviews Sandeep Chandna, a 2024 Sustainability 50 winner and Chief Sustainability Officer of Tech Mahindra, about how he's using AI and technology to transform sustainability initiatives.


Platform Based Communications Approach for Unified Experience

Dion Hinchcliffe, VP and Principal Analyst at Constellation Research, explains how digital experience benefits from a systemic approach.


What is an iPaaS? Integration Platform as a Service Explained

Constellation Research explains the components and trends in Integration Platform as a Service and what to expect in a next-generation iPaaS offering.


Boomi aims to ease SAP Datasphere migrations

Boomi said it has enhanced Boomi for SAP to ease the migration of business data into SAP Datasphere.

The move, outlined at Boomi World, aims to solve a pain point for SAP customers that face the end of support for SAP Business Warehouse at the end of 2027. SAP customers have groused about SAP Datasphere as well as the need to support third party data outside of SAP systems.

For SAP, Datasphere is a critical part of its process and automation plans since it can ride alongside SAP Signavio and LeanIX. SAP is also partnered with UiPath for its automation platform. Celonis is also frequently plugged into SAP systems. There are multiple players that want to be your automation platform.

Boomi's plan is to use Boomi for SAP to accelerate the transition to SAP Datasphere on AWS through its iPaaS and Amazon Redshift. Boomi noted that today "the move to SAP Datasphere requires significant investment and substantial effort from highly-skilled individuals."

Steve Lucas, Boomi CEO, said the company's Enterprise Platform combined with AWS is a more cost-effective way to move to SAP Datasphere: customers can avoid SAP Business Warehouse upgrades and staging areas, tier data more efficiently to SAP Datasphere and Redshift, and better integrate and prepare data for AI and analytics.


Arm's data center takeover: A lumpy revolution

Arm Holdings' chip designs may take over the data center over time as GPUs, custom cloud processors and Nvidia's march to AI factories gain momentum. But the road to licensing and royalty nirvana is going to be lumpy.

And lumpy it is for Arm's fourth quarter results and first quarter outlook. Arm's third quarter ascent caught Wall Street by surprise, but the fourth quarter earnings and first quarter outlook had to deal with much higher expectations.

The chip designer said first quarter non-GAAP earnings will be between 32 cents and 36 cents a share on revenue of $875 million to $925 million. Fiscal 2025 revenue will be between $3.8 billion and $4.2 billion with adjusted earnings between $1.45 and $1.65 a share. The first quarter outlook was above estimates, and the fiscal year guidance was in line.

Higher expectations go with the territory when shares year to date were up 41% going into an earnings report. Arm peaked at $164 after its third quarter report and will fall under $100 today. In other words, Arm has gone from a poster child of trickle-down generative AI economics to not being able to deliver the growth expected.

As with most things in life, the truth lies in the middle. For Arm, that truth looks promising, but chip designs take time to work through data center buildouts. Also keep in mind that Arm stands to benefit from AI processing at the edge--namely AI PCs and smartphones.

Here's a look at the moving parts of Arm.

Future growth is all about royalty revenue from chips based on Armv9 designs. In the fourth quarter, Arm said Armv9 technology contributed 20% of royalty revenue due to smartphones, servers, IoT and networking devices. That's up from 15% in the third quarter.

Arm CEO Rene Haas said on the company's earnings conference call:

"What we're seeing is the acceleration of v8 to v9, which drives not only better royalties, but we're also seeing more CPUs inside the chip, which compounds that royalty growth really across all end markets. V9 adoption will only continue to increase."

V9 adoption will be faster than previous Arm designs because of the uptake in infrastructure and subsystems.

That uptake of Armv9 takes time and revenue recognition may be lumpy. CFO Jason Child said licensing revenue will be "lumpy" due to timing of revenue recognition. Arm said the first half of fiscal 2025 will be about 40% of licensing revenue for the year. Royalty revenue will continue to grow in the mid-20% range. Child said Arm has a pipeline of new licenses and royalty bearing chips to maintain revenue growth of 20% for fiscal 2026 and 2027.
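
As a rough illustration of what that guidance compounds to, take fiscal 2025 revenue of about $4 billion (the midpoint of the $3.8 billion to $4.2 billion range) and grow it 20% a year. This is back-of-the-envelope arithmetic under the stated growth assumption, not a forecast.

```python
def project_revenue(base: float, growth_rate: float, years: int) -> list:
    """Compound `base` revenue by `growth_rate` for `years` years."""
    projections, revenue = [], base
    for _ in range(years):
        revenue *= 1 + growth_rate
        projections.append(round(revenue, 2))
    return projections


# Midpoint of Arm's fiscal 2025 guidance ($B), compounded at ~20% through FY2027.
fy26_fy27 = project_revenue(4.0, 0.20, 2)
```

That pencils out to roughly $4.8 billion in fiscal 2026 and $5.8 billion in fiscal 2027 if the 20% pace holds.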

Nvidia is Arm's BFF. Nvidia's surge is going to bring Arm along for the ride. Haas said:

"With Nvidia's most recent announcement, Grace Blackwell, you are going to see an acceleration of Arm in the data center in these AI applications. One of the benefits that you get in terms of designing a chip such as Grace Blackwell is by integrating the Arm CPU with the Nvidia GPU, you're able to get an interconnect between the CPU and the GPU that allows for much higher access to memory, which is one of the limiting factors for training and inference applications."

Don't forget the CPU play. Seventy percent of the world's population is using Arm-based CPUs. Those CPUs will become an AI play as AI workloads are moved to the edge. "Our licensing activity is probably the best proxy for that. The way to think about licensing revenue as it applies to AI is as software is moving faster than hardware, the hardware designs need to be upgraded quickly to make sure they can capture the needs of these new AI workloads," said Haas.

Hyperscalers will pay Arm because they need energy efficiency. Google's Axion processor is custom, based on Arm, and will be used for inference and training. AWS's Graviton and Trainium are Arm-based, as is Microsoft's custom processor. And then there's the big fish in Nvidia's Grace Blackwell superchip.

Networking is a market. Arm recently announced Ethos-U85, which adds transformer network support to Arm Ethos products, which aim to bring genAI to embedded devices.

Compute subsystems will bring Arm more growth. Arm said it will see growth from Arm Compute Subsystems (CSS), which integrate various Arm technologies for more off-the-shelf components.

Haas said:

"And our first customer in the Neoverse space doing a design, Microsoft, their Cobalt chip is now ramping. We are oversubscribed on this compute subsystem strategy. We have far more demand for the product than anticipated, and we are anticipating growing that significantly over time."


Boomi melds API management, AI integration via platform updates, two acquisitions

Boomi outlined a vision that puts integration platform as a service (iPaaS) at the heart of connecting AI agents as well as APIs and announced the acquisitions of APIIDA's federated API management business and API management assets from Cloud Software Group.

Speaking at Boomi World, CEO Steve Lucas said the company is looking to end "operational overhead and API sprawl" with its platform and enable scale for AI use cases.

Lucas said: "Connectivity remains a critical challenge for almost every organization. The chief culprit is digital fragmentation, a byproduct of digital shifts that, paradoxically, lead to digital silos and disjointed technical architectures that leave the average enterprise now juggling over 364 applications and numerous API gateways. AI thrives on reliable, secure, and current data, yet too often, this data is fragmented, difficult to govern, and not securely managed."

Boomi's Enterprise Platform aims to address those issues with new features for API management in AI, out-of-the-box AI agents, automated AI orchestration workflows, and tools for data quality, data lineage and metadata management.

Key additions to Boomi's platform include:

  • The Boomi AI agent framework, which integrates various agents.
  • Boomi Answers, an agent for prescriptive help.
  • Boomi DataDetective, an agent for classifying data fields, protecting sensitive data and tracking data movement.
  • Boomi DesignGen, an agent for building integrations.
  • Boomi Scribe, which automatically documents existing and built-by-AI integrations.
  • The ability to add third party agents via APIs as well as Boomi GPT. To that end, Boomi announced a partnership with Vianai Systems, which provides conversational AI tools for finance.
  • Boomi DataHub, a data access layer for integration pipelines and master data management.

Constellation Research analyst Doug Henschen is at Boomi World and relayed the following take:

"Boomi shared a very forward-looking vision at BoomiWorld 24 for GenAI agent-assisted integration and automation. The four agents initially released -- Boomi Answers for prescriptive assistance, Boomi DataDetective for automatically classifying data and detecting PII, Boomi DesignGen for autonomously generating integrations, and Boomi Scribe for documenting existing integrations – are right in Boomi’s integration and automation wheelhouse, but company CEO, Steve Lucas, also promised an ambitious variety of agents yet to come to the Boomi Agent Garden. From financial analysis to dashboard building to marketing automation, Lucas wasn’t shy about promising broad-ranging, GenAI-based capabilities yet to come, whether provided by Boomi or by third-party partners."

 


SAP, IBM Consulting pair up on process transformation, genAI

IBM and SAP said they will collaborate on generative AI models aimed at industries.

The "Value Generation" partnership will focus on generative AI and industry cloud applications. SAP is trying to migrate its customer base to the cloud and S/4HANA via its RISE with SAP program. IBM's approach to generative AI includes an open ecosystem and models that are focused on industries and specific use cases.

SAP and IBM have been partners for decades and have a bevy of joint customers across industries. This generative AI partnership revolves around IBM Consulting, which is already helping migrate SAP customers.

The companies said IBM Consulting and SAP will focus on the following areas revolving around RISE with SAP:

  • AI and process improvements. IBM and SAP said they are looking to apply AI to SAP business problems and industry cloud applications. IBM will extend AI into SAP-driven finance, supply chain and human capital management systems. IBM will also use SAP Signavio for process mining and SAP Business AI to retool processes.
  • Industry generative AI use cases. The two companies said they will focus on industrial manufacturing, consumer packaged goods, retail, defense, automotive and utilities. IBM is building models for these industries already and would tie in SAP processes.
  • Reference architectures. IBM and SAP said they will provide reference architectures to define data, process, systems, orchestration and automation. IBM said it will leverage SAP BTP, SAP Signavio and LeanIX. IBM Consulting is a big partner of Celonis and UiPath too.
  • IBM models available on SAP. IBM said watsonx will be available on SAP's Generative AI Hub so its Granite model family will be available across SAP apps.

 


ServiceNow to integrate Now Assist with Microsoft Copilot

ServiceNow and Microsoft are integrating their generative AI assistants, ServiceNow Now Assist and Microsoft Copilot.

The news, delivered at ServiceNow's Knowledge 2024 event, is part of ServiceNow's effort to partner with a wide range of enterprise software players, including Microsoft and SAP. ServiceNow and Microsoft said their updated strategic alliance will bring the companies' generative AI assistants into one experience.

According to ServiceNow and Microsoft, the generative AI integration will be available in the fall.


Here's a look at the details:

  • ServiceNow's Now Assist and its workflows will be integrated into Microsoft Copilot to execute productivity tasks from various apps.
  • CJ Desai, president and chief operating officer at ServiceNow, noted that large enterprise players will have to collaborate to benefit customers.
  • The integration is designed to reduce context switching between apps.
  • ServiceNow's assistants will be available within Microsoft 365.
  • Copilot will be able to hand off employee requests to Now Assist from within Microsoft Teams.

Going forward, the companies said employees will be able to use Copilot in Microsoft 365 applications from ServiceNow to create documents based on ServiceNow prompts.

Separately, ServiceNow enhanced its Contract Management Pro, Security Operations and Field Service Management applications with AI-enabled collaboration and process tools. The company also announced an integration of IBM's watsonx and Now Assist. 

