Q2 tech earnings themes to watch

This post first appeared in the Constellation Insight newsletter, which features bespoke content weekly. 

Earnings season is on deck, and we're going to see a parade of technology vendors with second quarter reports. Last quarter was sluggish but better than feared, since most folks were bracing for a recession. When Armageddon didn't come, the stock market rejoiced. It didn't hurt that generative AI euphoria drove tech stock prices higher, too.

Now that the stock market has had a nice run, it's possible a lot of optimism is already priced in. Simply put, we have a different setup for tech companies this quarter. It's unlikely that tech giants will post massive sales gains--excluding Nvidia, of course--since enterprises are still wary. Yes, the cost cutting may be over, but tech buyers aren't going to be giddy right away.

To cut through all the noise, I'm looking for the following signals from earnings season.

Cloud demand. The big three cloud vendors--Microsoft, Amazon Web Services and Google Cloud--are all seeing slowing growth. In the first quarter, there was a lot of talk about cost optimization over cost cutting (funny how similar those two sound). Oracle saw strong cloud sales, though. In addition, Constellation Research's Dion Hinchcliffe found that enterprises are looking to private cloud because it can save money.

In the second quarter, every demand signal from cloud providers will be overanalyzed by Wall Street. Meanwhile, cloud providers are buying gear to support generative AI workloads that are still largely experimental for enterprises. AI will drive cloud computing demand, but the timing is debatable.

Software price increases. Salesforce launched a series of AI services, became more efficient and raised list prices about 9% on average across clouds. Salesforce's increases are effective Aug. 1, but the CRM giant won’t be the only company raising prices. Watch software company earnings to see how much growth has been fueled by price increases.

Generative AI buzz. In the first quarter, generative AI was the hot topic. The second quarter will likely feature even more generative AI talk. I'll be looking at earnings calls from enterprise buyers to see what they say about generative AI: concerns, efficiency gains and production implementations. The first quarter earnings transcripts turned up a decent amount of generative AI commentary across multiple industries.

Data revenue streams. While enterprises are training large language models and looking for competitive advantage, new revenue streams are going to appear in surprising places. My hunch is that Shutterstock's move to license its IP so OpenAI can train models is just the beginning. There are multiple companies in various industries that can license data to train models. Simply put, I think we're going to see every company try to monetize its data.

IT spending. Hardware is hot given how AI is reshaping data center demand. I cannot recall talking this much about semiconductors in years. AI is going to drive a hardware upgrade cycle at some point. I'll be looking to see if hardware demand expands beyond Nvidia and semiconductors and into networking gear and servers.

Contact centers to evolve, become data treasure trove in Experience Enterprises

Customer experience teams--marketing, commerce, customer service, support and the contact center--are under pressure to become a unified growth engine that drives engagement and revenue opportunities, and that pressure is going to force the contact center to evolve.

Liz Miller, Principal Analyst at Constellation Research, focused on customer experience (CX) and where the contact center fits in a recent report. The report, "Connecting Experiences From Employees to Customers: 5 Trends Shifting Priorities and Strategies in the Modern Contact Center," examines the state of CX, how customers are in charge of their experience journey and how companies need to evolve to become Experience Enterprises.

Miller wrote about the shift to becoming an Experience Enterprise:

"It is a shift that has been a long time coming as individual teams have evolved from operational deployment teams into centers of functional excellence within an experience-driven enterprise. Sales’ role becomes that of revenue-optimization engine, whereas Marketing helms growth identification and amplification. Service becomes a key communication channel, proactively resolving issues and scaling one-to-one engagements that derive value for customers. Across all functions runs a common call for AI-powered, data-driven solutions to guide customers as well as employees through processes. Guided selling, self-service support, and focused hyperpersonalization to drive brand loyalty and advocacy begin to feel like shared services rather than individual, independent, disconnected functions or teams.

This shift has most acutely impacted the contact center, pushing Service as a whole out of the shadow of being an “operational cost center” and into the light of being a hub for profitable relationship-driving via purpose-built connected experiences. This move, where all three skill sets of CX delivery—selling, servicing, and marketing—must be executed by agents and supervisors, turns the contact center into a strategic growth hub for the experience-driven enterprise."

The upshot: Contact center agents will need to serve as brand ambassadors, storytellers, and sales problem solvers. Enterprises will have to enable those agents.

Many companies aren't there yet due to limited cloud computing maturity, the expertise needed to leverage AI, and functional silos. Miller provides the questions to ask as well as takeaways about where the contact center fits in. Among the takeaways:

  • Contact centers are adopting new CX metrics. These metrics include quality of issue resolution, impact on customer lifetime value and revenue outcomes.
  • Contact centers are a data goldmine that can be leveraged for AI and new workflows. Customer records with agent notes, history, attitude, preferences, sentiment and other information can provide real insights into product development, marketing and sales.
  • Customers make no distinction about functional CX silos within an enterprise.
  • Enterprises will need to clearly articulate AI strategies in the contact center and disclose when answers are generated by AI.
  • The technology behind the contact center isn't always easy to modernize and transform and enterprises may choose to keep on-premises systems.

Wipro launches Wipro ai360, will invest $1 billion over 3 years to advance AI

Wipro said it will invest $1 billion over the next three years to advance artificial intelligence via Wipro ai360, a unit focused on end-to-end AI innovation.

According to the services and consulting giant, Wipro ai360 will build on the company's existing investments and integrate AI into every platform and tool used internally and offered to clients.

Wipro added that responsible AI will be at the core of Wipro ai360's efforts. Thierry Delaporte, CEO of Wipro Limited, said in a statement that Wipro ai360 will target multiple industries, business models and challenges. Wipro ai360 will have 20 innovation centers and digital pods, more than 300 patents and experience via 2,000 AI engagements.

If you zoom out, Wipro ai360 highlights how technology vendors are embedding generative AI and related tools everywhere. However, enterprise buyers are wary of AI and its implications for compliance, first-party data and security, even as boardrooms push for rapid adoption.

In other words, there's a lot to figure out when it comes to integrating AI everywhere.

Wipro said it plans to train its 250,000 employees on AI fundamentals and responsible use over the next 12 months with more customized training available.

Wipro ai360 includes:

  • 30,000 Wipro experts in analytics and AI across multiple businesses.
  • New AI capabilities across cloud, analytics, design, consulting, security, and engineering as well as new processes and practices.
  • Wipro's innovation hub Lab45 will be part of Wipro ai360 to speed up research and co-innovation for AI.
  • R&D efforts to improve AI, data and platform capabilities.

In addition, Wipro said it will accelerate startup investments via its Wipro Ventures arm. The company will also launch a GenAI Seed Accelerator.

Shutterstock's generative AI way forward: 6-year training data deal with OpenAI

Is Shutterstock mapping the way forward for IP owners in the generative AI age or making an epic business model mistake?

Check back in a few years.

Shutterstock, which has more than 615 million images enriched with metadata and 2.2 million global customers, will provide training data for OpenAI models. Shutterstock and OpenAI launched a partnership in January, but the expanded deal will give OpenAI a license to access Shutterstock training data for image, video and music libraries and metadata.

In return, Shutterstock gains priority access to OpenAI technology and continues to use DALL-E's generative AI tools. In addition, Shutterstock and OpenAI will collaborate on bringing generative AI to the GIPHY platform.

Last week, Shutterstock said that it will indemnify enterprise customers following a similar move by Adobe.

The takeaway is that Shutterstock sees generative AI as a way to expand its total addressable market. Shutterstock will use generative AI tools to drive subscriptions and licenses and improve its products. What remains to be seen is whether granting OpenAI access to Shutterstock's first-party data will become a business risk.

A historical comparison would be Starz, which licensed its catalog to Netflix in 2008. Netflix then used the Starz catalog to build its streaming business. Starz pulled its content from Netflix in 2011.

Other companies with extensive media catalogs have waffled between taking money upfront and licensing content to digital platforms and keeping it exclusive.

Whether Shutterstock regrets granting OpenAI access to training data remains to be seen. Shutterstock's e-commerce channel is self-serve and is 60% of the company's business. Those subscribers could ultimately choose OpenAI in the future. However, Shutterstock has its own generative AI offerings too.

Shutterstock CEO Paul Hennessy said on the company's first quarter earnings call that generative AI requires "experimentation at scale so we understand this new market."

Hennessy said generative AI drives engagement for its platform and enables other companies to create products by licensing its data. He said:

"We are aggressively investing in bringing generative AI to our customers. After launching our AI image generation platform in partnership with OpenAI in January, we had last reported that users had created 3 million assets in the two weeks immediately following the launch. In the three months since inception, almost 1 million users have created more than 20 million assets on our platform. To put that in context, Shutterstock has averaged 10 million new images every quarter since 2020, and so the pace thus far in generative images created far exceeds the growth we’ve seen historically in our content engine.

Although it’s too early to provide any definitive statements on generative AI’s revenue potential, I’m excited to report some early indicators that speak to high engagement and the exciting potential of this new technology across the entire user journey."
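Hennessy's pace claim is easy to sanity-check with quick arithmetic (taking the figures from the call at face value and treating the three-month window as one quarter):

```python
# Compare the generative-asset pace with Shutterstock's historical
# content growth, using the figures cited on the earnings call.
generated_assets = 20_000_000        # AI-generated assets in ~3 months
historical_per_quarter = 10_000_000  # average new images per quarter since 2020

ratio = generated_assets / historical_per_quarter
print(f"Generative pace is {ratio:.0f}x historical quarterly image growth")  # 2x
```

In other words, users generated assets at roughly twice the rate Shutterstock's content engine historically added images.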

To Hennessy, there's no conflict between building generative AI products with Shutterstock's first-party data and licensing that data to others to grow its total contract value.

"In addition to the major investments we’re making in developing and delivering generative AI for our customers, we continue to be highly encouraged by our pipeline of data partnerships to help large tech companies train their models to develop their own generative AI products and solutions. The need to use metadata for generative AI model training is expanding, and we are seeing new companies invest with urgency to build commercial products within their core area of focus. We are also seeing our pipeline expand for existing customers across multiple asset types: customers who started with images looking at video, and customers looking at music and 3D content for model training. The use cases for training data are also expanding and we are seeing opportunities that are increasingly industry-specific and for specific commercial products."

Salesforce raises list prices across clouds by about 9%

Salesforce is raising prices across its clouds starting in August.

In a blog post, Salesforce said it will increase list prices by an average of about 9% across Sales Cloud, Service Cloud, Marketing Cloud, its industry clouds and Tableau.

The news was good for Salesforce shares as the company has cut costs, kept customers and is now raising prices.

Salesforce noted that the company's last list price increase was 7 years ago. The primary argument for the price increase is that Salesforce has been rolling out new innovations around generative AI. Recent headlines include the launch of AI Cloud, Einstein GPT, Sales and Service GPT as well as other updates.

According to Salesforce, Professional Edition list prices will increase $5 to $80, Enterprise Edition will jump $15 to $165 and Unlimited Edition will go for $330, up $30. Similar increases will be implemented across Industries, Marketing Cloud Engagement and Account Engagement, CRM Analytics and Tableau.
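Those per-edition deltas can be checked against the ~9% average claim with a quick calculation (inferring the old list prices as the new price minus the stated increase):

```python
# Salesforce's stated per-user list price changes by edition.
# Old prices are inferred as the new price minus the stated increase.
editions = {
    "Professional": {"new": 80, "increase": 5},
    "Enterprise": {"new": 165, "increase": 15},
    "Unlimited": {"new": 330, "increase": 30},
}

pct_changes = {
    name: 100 * p["increase"] / (p["new"] - p["increase"])
    for name, p in editions.items()
}

for name, pct in pct_changes.items():
    print(f"{name}: +{pct:.1f}%")   # Professional: +6.7%, the others: +10.0%

average = sum(pct_changes.values()) / len(pct_changes)
print(f"Simple average: +{average:.1f}%")  # +8.9%, close to the stated ~9%
```

A simple average of the three editions lands at about 8.9%, consistent with the "about 9%" figure, with the biggest percentage jumps hitting Enterprise and Unlimited.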

Constellation Research CEO Ray Wang said:

"SaaS pricing was supposed to get cheaper with time, and Salesforce was the standard bearer.

Unfortunately, customers now pay for licenses before implementation, they overbuy licenses and can't reduce them, and now they are subject to vendor lock-in and price increases despite record profits by the cloud providers.

Customers should band together and reject these price increases before it's too late.

Salesforce had amazing profits, and customers entrusted Salesforce to achieve economies of scale and pass cost savings to the customer, not hold them in vendor lock-in."

While many enterprises have multi-year contracts and discounts across multiple clouds, Salesforce's increases will add up when it's time to renew.

Salesforce has said that 20% of its customers have more than 4 clouds.

Marketing Transformation Next-Generation Customer Experience Data to Decisions Future of Work Innovation & Product-led Growth New C-Suite Sales Marketing Digital Safety, Privacy & Cybersecurity Tech Optimization salesforce AI GenerativeAI ML Machine Learning LLMs Agentic AI Analytics Automation Disruptive Technology Chief Information Officer Chief Executive Officer Chief Technology Officer Chief AI Officer Chief Data Officer Chief Analytics Officer Chief Information Security Officer Chief Product Officer

A look at the surprises of 2023 so far...

We're careening into the second half of the year and it's always worth a bit of reflection. What stood out for me were the surprises.

Here's a quick look.

☁️ The private cloud is kind of cool now. You've seen the cloud providers all report slowing growth (except for Oracle, which is coming off a smaller base), but the real surprise was that CIOs were thinking a lot more about the private cloud. Dion Hinchcliffe's research report on the private cloud's newfound popularity highlighted how platforms like HPE GreenLake were garnering demand. In a nutshell, public cloud providers haven't been passing on savings, which is encouraging enterprises to move workloads such as AI on-premises.

🤖 The intensity of the generative AI theme. Exiting 2022, it was clear that OpenAI captured lightning in a bottle. What wasn't clear: Generative AI euphoria would fuel the stock market, revive tech stocks and create a groundswell of press releases and product rollouts. When it comes to generative AI, the technology sector is taking a "build it and they will come" approach. Nvidia is the biggest winner in the generative AI buildout, but the wealth is starting to spread around. Enterprise tech buyers remain cautiously optimistic about generative AI, but acknowledge the potential risks too.

🏁 Big vendors move fast. In technology lore, the storyline is usually one that revolves around a startup upending a sleeping giant. Generative AI is an area that's highlighting how fast the giants are moving. Microsoft caught Google off guard and the latter rallied after a rough start. Salesforce outlined a generative AI roadmap in a hurry as did a bevy of other vendors. Quantum computing is being driven by giants too. Bottom line: It's way harder to sneak up on a giant today.

📉 The recession never happened. When the calendar turned to January, there was a wide consensus that the economy was going to struggle. Interest rates were surging, layoffs hit key sectors like technology and CFOs were hitting the brakes. Instead, inflation cooled a smidge, earnings weren't as horrible as expected and technology vendors saw stable demand. We're not out of the economic woods yet, but the first half didn't produce the slowdown expected.

👩🏽‍💻 The future of work isn't the present of work yet. I can't believe we're still debating the all-in-office approach vs. the all-remote approach. Like everything in life, the extreme cases are the few and the middle is the many. Downtown isn't booming, commercial real estate is a mess and clearly, we all didn't run back to the office. Nevertheless, the entirely remote work life is becoming rarer. It's obvious that a hybrid approach has emerged. Nevertheless, we'll keep arguing. Also see: How a writing-based culture can rewrite work

JPMorgan Chase: Digital transformation, AI and data strategy sets up generative AI


JPMorgan Chase will deliver more than $1.5 billion in business value from artificial intelligence and machine learning efforts in 2023 as it leverages its 500 petabytes of data across 300 use cases in production.

"We've always been a data driven company," said Larry Feinsmith, Managing Director and Head of Technology Strategy, Innovation, & Partnerships at JPMorgan Chase. Feinsmith, speaking with Databricks CEO Ali Ghodsi during a keynote at the company’s Data + AI Summit, said JPMorgan Chase has been continually investing in data, AI, business intelligence tools and dashboards.

Indeed, JPMorgan Chase said it will spend $15.3 billion on technology investments in 2023. JPMorgan Chase's technology budget has grown at a 7% compound annual growth rate over the last four years.
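Those two figures imply a tech budget of roughly $11.7 billion four years ago; here's a back-of-the-envelope check, taking the 7% CAGR at face value:

```python
# Back out the approximate tech budget four years ago from the 2023
# figure and the stated 7% compound annual growth rate (CAGR).
budget_2023 = 15.3  # billions of dollars
cagr = 0.07
years = 4

budget_then = budget_2023 / (1 + cagr) ** years
print(f"Implied budget four years ago: ${budget_then:.1f}B")  # ~$11.7B
```

That's an increase of more than $3.5 billion in annual technology spending over four years.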

Feinsmith said the bank's AI/ML strategy is one of the big reasons JPMorgan Chase migrated to the public cloud. "If you look at our size and scale, the only way to deploy at scale is to do it through platforms," said Feinsmith. "Everyone has an opinion on data platforms, but you can efficiently move the data once and manage. Once you start moving data around it's highly inefficient and breaks the lineage."

JPMorgan Chase, a customer of Databricks, Snowflake and MongoDB, has multiple platforms, according to Feinsmith. It has an internal platform, JADE (JPMorgan Chase Advanced Data Ecosystem) for moving and managing data and one called Infinite AI for data scientists. "Equally as important as the data is the capabilities that surround that data," said Feinsmith, adding that data discovery, data lineage, governance, compliance and model lifecycle are critical.

According to Feinsmith, JPMorgan Chase's AI efforts start with a business focus with data scientists and AI/ML experts embedded into each business.

Feinsmith said JPMorgan Chase is leveraging streaming data, and he said he was a fan of Databricks' Lakehouse architecture and new AI features because it's easier to move and process data in one environment instead of two architectures: a data warehouse for business intelligence and a data lake for AI. JPMorgan Chase deploys a central but federated data strategy, and interoperability between data platforms is important. "Data has to be interoperable," Feinsmith told Ghodsi. "Not all of our data will wind up in Databricks. Interoperability is very important."

That comment rhymes with what other enterprise technology buyers have said. Despite a lot of talk about consolidating vendors--mostly from vendors looking to gain share--enterprise buyers want to keep options open. How JPMorgan Chase has approached its tech stack is instructive.

The digital transformation behind the AI

At JPMorgan Chase's Investor Day in May, Lori Beer, Global CIO at the bank, gave an overview of the bank's technology strategy. In 2022, JPMorgan Chase launched a plan to deliver leading technology at scale with its team of 57,000 employees.

"Products and platforms need a strong foundation to be successful, and ours are underpinned by our mission to modernize our technology and practices," explained Beer. "We are already delivering product features 20% faster than last year, and we continue to modernize our applications, leverage software as a service and retire legacy applications."

JPMorgan Chase is moving to a multi-vendor public cloud approach while optimizing its owned data centers. The company is also embedding data and insights throughout the organization, said Beer. Those efforts will pave the way for large language models (LLMs) and other advances in the future.

"We have driven $300 million in efficiency through modern engineering practices and labor productivity, and we have developed a framework that enables us to identify further opportunities in the future. Our infrastructure modernization efforts have yielded an additional $200 million in productivity, driven by improved utilization and vendor rationalization," said Beer.

Here's a look at the key pillars of JPMorgan Chase's digital transformation.

Applications. Beer said the bank has decommissioned more than 2,500 legacy applications since 2017 and is focusing on modernizing software to deliver products faster. The bank has more than 560 SaaS applications, up 14% from 2022. By using industry-leading SaaS applications, Beer said it will be easier to scale new products to more than 290,000 employees.

Infrastructure modernization. Beer said:

"To date, we have moved about 60% of our in-scope applications to new data centers, which are 30% more efficient, and this translates to 16,000 fewer hardware assets. We are also migrating applications to utilize the benefit of public and private cloud. 38% of our infrastructure is now in the cloud, which is up 8 percentage points year-over-year. In total, 56% of our infrastructure spend is modern. Over the next three years, we have line of sight to have nearly 80% on modern infrastructure. Of the remainder, half are mainframes, which are highly efficient and already run in our new data centers."

JPMorgan Chase has kept infrastructure expenses flat even though compute and storage volumes have increased 50% since 2019, said Beer. For example, Chase.com is now served through AWS and averages 15 releases a week.

Engineering. Beer said JPMorgan Chase is equipping its 43,000 engineers with modern tools to boost productivity. The bank has adopted a framework to speed up the move from backlog to production via agile development practices.

Data and AI. Beer said:

"We have made tremendous progress building what we believe is a competitive advantage for JPMorgan Chase. We have over 900 data scientists, 600 machine learning engineers and about 1,000 people involved in data management. We also have a 200-person top notch AI research team looking at the hardest problems in the new frontiers of finance."

Specifically, Beer said AI is helping JPMorgan Chase deliver more personalized products and experiences to customers, with $220 million in benefits in the last year. At JPMorgan Chase's Commercial Bank, AI provided growth signals and product suggestions for bankers. That move provided $100 million in benefits, said Beer.

The data mesh

To capitalize on AI, JPMorgan Chase created a data mesh architecture that is designed to ensure data is shareable across the enterprise in a secure and compliant way. The bank outlined its data mesh architecture at a 2021 Data Mesh Learning meetup.

JPMorgan Chase said its data approach is to define data products that are curated by people who understand the data and its management requirements. Data products are defined as groups of data from systems that support the business. These data groups are stored in product-specific data lakes, each separated by its own cloud-based storage layer. JPMorgan Chase catalogs the data in each lake using AWS technologies such as S3 and Glue.

Data is then consumed by applications that are separated from each other and the data lakes. JPMorgan Chase said it makes the data lake visible to data users to query it.

At a high level, JPMorgan Chase said its approach will empower data product owners to manage and use data for decisions, share data without copying it and provide visibility into data sharing and lineage.

According to JPMorgan Chase, its architecture keeps data storage bills down and ensures accuracy. Since data doesn't physically leave the data lake, JPMorgan Chase said it's easier to enforce decisions product owners make about their data and ensure proper access controls.
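The query-in-place idea can be sketched in a few lines of Python. This is purely illustrative: the class and names below are hypothetical, not JPMorgan Chase's actual implementation. The point is that consumers receive query results, never a copy of the lake's raw storage, and every access is checked against the product owner's policy and logged for lineage:

```python
# Illustrative sketch of query-in-place access in a data mesh.
# Hypothetical names for illustration only -- not an actual implementation.

class DataProduct:
    def __init__(self, name, rows, allowed_consumers):
        self.name = name
        self._rows = rows                       # data stays inside the product
        self._allowed = set(allowed_consumers)  # owner-managed access policy
        self.access_log = []                    # visibility into data sharing

    def query(self, consumer, predicate):
        if consumer not in self._allowed:
            raise PermissionError(f"{consumer} is not authorized for {self.name}")
        self.access_log.append(consumer)
        # Results are computed in place and returned; raw storage never leaves.
        return [row for row in self._rows if predicate(row)]

payments = DataProduct(
    "payments",
    rows=[{"id": 1, "amount": 250}, {"id": 2, "amount": 4200}],
    allowed_consumers={"fraud-analytics"},
)

large = payments.query("fraud-analytics", lambda r: r["amount"] > 1000)
print(large)                # [{'id': 2, 'amount': 4200}]
print(payments.access_log)  # ['fraud-analytics']
```

An unauthorized consumer (say, a hypothetical "marketing" team) would get a `PermissionError` instead of data, which is how owner decisions and access controls are enforced without copying data out of the lake.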

How JPMorgan Chase will address generative AI

Given JPMorgan Chase's data strategy and architecture, the bank can more easily leverage new technologies like generative AI. Feinsmith at the Databricks conference said JPMorgan Chase was optimistic about generative AI but said it's very early in the game.

"There's a lot of optimism and a lot of excitement about generative AI. Businesses all know about it and generative AI will make us more productive," said Feinsmith. "But we won't roll out generative AI until we can do it in a responsible way. We won't roll it out until it's done in an entirely responsible manner. It's going to take time."

In the meantime, JPMorgan Chase's Feinsmith said the bank is working through the generative AI risks. The promise for JPMorgan Chase is obvious: take its 500 petabytes of data, use it to train models, make it valuable and then add value to open-source models.

Beer outlined the JPMorgan Chase approach during the bank's Investor Day in May.

"We couldn't discuss AI without mentioning GPT and large language models. We recognize the power and opportunity of these tools and are committed to exploring all the ways they can deliver value for the firm. We are actively configuring our environment and capabilities to enable them. In fact, we have a number of use cases leveraging GPT-4 and other open-source models currently under testing and evaluation."

With Databricks, MongoDB and Snowflake all adding generative AI and large language model (LLM) capabilities to the data stack, enterprises will have the tools when ready.

JPMorgan Chase has named Teresa Heitsenrether its chief data and analytics officer, a central role overseeing the adoption of AI across the bank. Heitsenrether oversees data use, governance and controls with the aim of harnessing AI technologies to effectively and responsibly develop new products, improve productivity and enhance risk management.

Heitsenrether is a 35-year veteran of JPMorgan Chase and previously was Global Head of Securities Services from 2015 to 2023.

Beer explained JPMorgan Chase’s approach to responsible AI:

“We take the responsible use of AI very seriously, and we have an interdisciplinary team, including ethicists, data scientists, engineers, AI researchers and risk and control professionals helping us assess the risk and build appropriate controls to prevent unintended misuse, comply with regulation, and promote trust with our customers and communities. We know the industry is making remarkably fast progress, but we have a strong view that successful AI is responsible AI."

How a writing-based culture can rewrite work

Adam Nathan, CEO of Almanac, said modern work is broken and needs to be fixed by reframing remote work, creating writing-based cultures and processes and providing enough space for the magic of human collaboration.

Nathan's approach, which was outlined on DisrupTV Episode 328, is worth a listen starting at the 40-minute mark. Here are some of the takeaways.

Work itself is broken. "It was broken before Covid but since then it's very clear that where we work has changed but how we work hasn't. Teams are experiencing a ton of burnout, a ton of chaos at work. People are just not being able to get stuff done," said Nathan.

Creating a modern work method. Nathan said Almanac set out to conduct 5,000 interviews with organizations that have mastered collaboration. The research informed Almanac's modern work method. "We largely found that regardless of what a team does, purpose of the company or location is that the teams that are working the fastest and delivering the most value work with a lot more structure, more transparency and can't wait for meetings," said Nathan.

That tired remote work debate. "I think there has been a very loud push especially in the New York Times and from what I call old white guys on Twitter to return to the office, but if you actually look at the data on this, remote work as a percent of the workforce has actually continued to grow even after the end of Covid-era restrictions in September 2022. If you look at white collar professional jobs before the pandemic, about 22% of the workforce was working in a remote or hybrid fashion. Today that number is 66%. I don't love this debate between office versus remote. It's a tiring one if you think about remote work as internet work," said Nathan. He added:

"Internet work is a disruptive and inexorable trend. Just like our consumer lives have moved from shopping in person to e-commerce and hanging out in person to social media, the same thing is happening to work. Working on the internet is not going away anytime soon. I think the questions we're asking are almost all the wrong ones. Theory and data don't support this idea that life is going to return to how it was."

7 future of work themes to know now | 12 lessons for when work, life collide | Future of work research and insights

Advantages of internet work. "For workers, there's obviously flexibility and freedom. You can work when you want and where you want, and that gives people a lot more time to get into focus and flow and to balance their lives better between work, family, friends and hobbies. I think that's why CEOs and owners are pushing back so much," said Nathan. "There's another chapter in this tension between capital and labor and who controls the leverage. Labor has gotten broad new freedoms, and I think CEOs, people who own real estate and maybe some elected officials feel discomfort with how this new normal is going to work out. There's not the same sense of control anymore over people's time and location. For the last 50 to 70 years managers have been managing by presence: butts in seats and did you attend meetings. I don't think that was at all correlated with effectiveness, growth or value delivery."

Well-managed teams do well remote or in person (the office hides dysfunction). "The other silly thing is that remote is not a place--it's the absence of one. What we've seen in the data is that remote work really exposes how teams are functioning. Well-managed teams tend to do better in remote settings because they already have good systems and structures and processes in place," said Nathan. "There's a high trust level. Teams that were dysfunctional don't have the theater of the office to cover it up, so all the dysfunction is exposed. These teams are often facing the choice of do we want to improve how we're working, or revert and ignore it by going back into the office. There are bosses that clearly don't know how to operate in a distributed environment and would prefer the control an office creates."

Culture of writing. Amazon is well known for requiring employees to draft a memo before any meeting. Bridgewater is another example of an organization with high performance and a culture and decision-making process based on writing. "There's this misconception that the only way to get stuff done in stressful environments is to get everyone together, create a lot of chaos and move really fast," explained Nathan. "In the Marines slow means smooth and smooth means fast. A lot of organizations we've interviewed and observed are calm working environments. Everything feels really smooth, everyone's really calm and yet they're moving extremely fast in part due to a culture of writing."

How to get there? Nathan said high performing organizations start with a doc before a meeting. "Sometimes the doc obviates the need for a meeting. Even when there is a meeting everyone has read beforehand and commented. It makes the synchronous time they're spending more effective," he said.

Another move is to identify which recurring meetings aren't useful anymore. "What happens in organizations is that back-to-back meetings are just an accumulation of things that were once useful," said Nathan. Use documents to cancel meetings, and store them so teams can find answers easily.

Generative AI's impact on writing cultures. "I think the main thing LLMs are doing right now is producing fuzzy first drafts. It's the average of everything out there to give you an answer. Now we have a better chat interface that's going to get us to look over a larger amount of information much faster and produce a better outcome," said Nathan. "I think the productivity curve of what we can do with writing is to move up and to the right."

"What happens in the future is there are going to be some people who are going to really be able to exploit this technology to their advantage and some people who fall behind. Teamwork and collaboration are still a deeply human exercise. The human brain is constantly rewiring based on interactions it has with other people." 

The magic of human collaboration. "What makes collaboration so magical is we don't know what will happen when we get together to work on a problem together. We might see AI almost like a collaborator in some ways but LLMs are just looking at past information, decisions, and knowledge," said Nathan. "I think the magic of human collaboration will always be there and what we do together might be more elevated because we have better technology to automate the overhead work."


Rivian: AI, data power customer experiences

Rivian is betting that data, AI and machine learning will continually improve its customer experience and detect and prevent future vehicle issues.

Speaking at the Databricks Data + AI Summit, Wassym Bensaid, SVP of Software Development at Rivian, walked through how the electric vehicle manufacturer created an architecture that enables it to ingest telemetry data from vehicles, boost battery life and roll out new features with over-the-air (OTA) updates.

Bensaid said that the data ecosystem for EV makers goes well beyond autonomy and other technologies that grab headlines. "With software defined vehicles and an amazing hardware platform you can have all-in-one vehicles," he said. "Everything at Rivian is data driven from our supply chain to manufacturing to the customer relationship."

Rivian produced 13,992 vehicles in the second quarter and said it is on track to produce 50,000 for the year.

Speaking at an investor conference June 15, Rivian CFO Claire McDonough said the EV maker is looking to own "the full end-to-end ownership experience for commercial customers and for consumers as well." On the commercial side, Rivian counts Amazon as its flagship customer with a purpose-built delivery vehicle connected to fleet management software called FleetOS.

Data from the Rivian vehicle is shared with Amazon's back-end software system to improve efficiency and make life easier for drivers with perks like cooled seats.

For consumers, McDonough outlined:

"What we started with was how to create a seamless transaction experience. If you go online and you buy a Rivian, you can purchase insurance, financing and trade-in of your vehicle in about 6 minutes. Is that convenient and fast? Some of the early investments we’ve made have really ensured that we had this integrated experience for our customers with services like financing and insurance. Over time, we’re constantly updating our vehicles with over-the-air updates. And we’ve added incremental drive modes and feature sets to the vehicles. Over time, we will have the opportunity to create features that can be bundled or paid features for consumers. But right now, we’re really excited about offering continuous value accretion for Rivian owners that have seen the range of their vehicles increase over the lifespan of their ownership and true enhancements to some of the new drive modes that we’ve offered as well."

The architecture

Bensaid said Rivian initially struggled with data silos and multiple systems, data types and tools. Rivian also had a team of experts focused on its data strategy that ultimately became a bottleneck. As a result, Bensaid said Rivian moved to democratize access to data while ensuring security, privacy and governance.

Rivian used Databricks and its Lakehouse platform to build a new architecture on top of data lakes that could scale. Bensaid said Rivian also uses Databricks Unity Catalog to create one version of the truth.

The EV maker has automated more than 95% of its Databricks provisioning workflows. Rivian's stack includes a data and analytics layer that runs through Rivian technology, cloud, product development and operations, products and services.

"We're using data and AI to achieve and unlock business outcomes," said Bensaid.

Ultimately, Rivian is planning to improve the customer experience and become more predictive about maintenance and performance. "Imagine a world where a vehicle will self-monitor its health and schedule its own appointments to deliver an amazing experience," said Bensaid.
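To make the predictive-maintenance idea concrete, here is a minimal sketch of the kind of telemetry check a self-monitoring vehicle could run. Every name, field and threshold below is an assumption for illustration; none of it is Rivian's actual schema or logic.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical telemetry record; field names are illustrative, not Rivian's schema.
@dataclass
class BatteryReading:
    vehicle_id: str
    pack_temp_c: float
    cell_voltage_delta_mv: float  # spread between strongest and weakest cell

def needs_service(readings: list,
                  temp_limit_c: float = 45.0,
                  delta_limit_mv: float = 80.0) -> bool:
    """Flag a vehicle for proactive service when recent telemetry exceeds
    conservative thresholds (the thresholds here are invented)."""
    if not readings:
        return False
    avg_temp = mean(r.pack_temp_c for r in readings)
    worst_delta = max(r.cell_voltage_delta_mv for r in readings)
    return avg_temp > temp_limit_c or worst_delta > delta_limit_mv

healthy = [BatteryReading("R1T-001", 32.0, 20.0), BatteryReading("R1T-001", 35.5, 25.0)]
degraded = [BatteryReading("R1T-002", 33.0, 30.0), BatteryReading("R1T-002", 38.0, 95.0)]

print(needs_service(healthy))   # False
print(needs_service(degraded))  # True: one reading's cell spread exceeds 80 mV
```

In a production stack like the one Bensaid described, a check like this would run over streaming telemetry in the Lakehouse rather than in-memory lists, with the flag feeding a service-scheduling workflow.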



How JetBlue is leveraging AI, LLMs to be 'most data-driven airline in the world'

JetBlue is actively using artificial intelligence and machine learning across its business and deploying generative AI for internal operations and, ultimately, revenue-producing products.

Speaking at the Databricks Data + AI Summit, Sai Ravuru, Senior Manager of Data Science and Analytics at the airline, walked through how the company is using Databricks Lakehouse on multiple fronts. Databricks launched a series of new additions to its platform and said it will acquire MosaicML.

"Over the last two years, we've made investments in data science and data refinement so raw data is continuously hydrated and reliable," said Ravuru. He said that AI and machine learning teams at JetBlue work alongside data scientists. "AI/ML scouts for the next use case before handing off to the data science team," explained Ravuru, who noted the goal for JetBlue is to be the most data-driven airline.

Ravuru said that data touches every part of JetBlue's business including operations, commercial and support functions. JetBlue is creating a unified digital twin of its business with cross-team collaboration and process-driven data science fueled by data from multiple systems.

Databricks Lakehouse absorbs data and powers modeling across JetBlue's data footprint.


The airline has leveraged Databricks' platform to create an ecosystem of models called BlueSky to enable decision making. "The BlueSky product was built from scratch internally," said Ravuru. "It is a continually refreshed network with embedded LLM and real-time components for frontline staff."

BlueSky serves as JetBlue's AI-driven operating system.

Ravuru also said that JetBlue has created a unified LLM called BlueBot that uses open-source models complemented by corporate data integrated with BlueSky. BlueBot can be used by all teams at JetBlue since access to data is governed by role. For instance, the finance team may see data from SAP and regulatory filings, a new employee may just be served FAQs, and operations would see maintenance information, explained Ravuru.

"BlueBot brings crew members much closer to data and insights without change management," he said.
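The role-governed access pattern Ravuru describes can be sketched in a few lines: the same assistant serves everyone, but the corpus each question may draw on depends on the user's role. The role names and source names below are assumptions for illustration, not JetBlue's actual configuration, and the retrieval step itself is elided.

```python
# Map each role to the governed data sources it may query
# (hypothetical names, not JetBlue's real systems).
ROLE_SOURCES = {
    "finance":    ["sap_extracts", "regulatory_filings"],
    "operations": ["maintenance_logs"],
    "new_hire":   ["faq"],
}

def sources_for(role: str) -> list:
    """Resolve which sources a role may query; unknown roles
    fall back to the public FAQ corpus only."""
    return ROLE_SOURCES.get(role, ["faq"])

def build_prompt(role: str, question: str) -> str:
    """Assemble an LLM prompt constrained to the role's permitted
    sources; this shows only the governance gate, not retrieval."""
    allowed = ", ".join(sources_for(role))
    return f"Answer using only these sources: {allowed}.\nQuestion: {question}"

print(sources_for("finance"))
print(build_prompt("new_hire", "How do I request time off?"))
```

The design point is that governance lives in the routing layer, not the model: one shared LLM, with per-role scoping enforced before any data reaches the prompt.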

JetBlue is using Databricks for generative AI use cases that are experimental as well as production.

What's next? JetBlue is looking at LLMs to create new revenue channels so customers "can book from BlueBot or plan trips better." In addition, JetBlue is looking at efficiency gains by using LLMs to provide the "technical operations team with WebMD style diagnoses for each and every aircraft."

