Box CEO Aaron Levie outlined his take on generative AI, software table stakes, productivity and the importance of being neutral as enterprises race to integrate the technology.
The comments from Levie came on Box's first quarter earnings conference call. The company reported first quarter earnings of 2 cents a share on revenue of $251.9 million, up 5.6% from a year ago. For the second quarter, Box projected revenue of $260 million to $262 million with non-GAAP earnings of 34 cents to 35 cents a share. The results and outlook were better than expected.
Here's a look at what Levie had to say about large language models (LLMs) and generative AI in the enterprise.
LLMs can bring visibility into unstructured corporate data. Levie said:
"For years we've been able to ask questions about our structured data, like the information that's in a database, ERP system, or CRM system. You can ask those systems for financial forecasts, sales pipeline results, inventory levels, supply chain details, and more. But we’ve had limited ability to ask questions of our unstructured data, like content, which is 80% of corporate data. And now we can. By safely bringing leading AI models to enterprise data, enterprises can truly unlock the value that lies in their content.
To do this, we need a way to connect these models safely, securely, and compliantly to our enterprise content.
Imagine being able to instantly ask things like 'how many days of parental leave can I take?' of an HR document, or 'please summarize this report and provide five key takeaways' of a quarterly earnings document, or 'how would you pitch this product to a customer in the automotive industry?' when looking at a product overview document."
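The pattern Levie describes is, at its simplest, grounding a model's answer in a document by packing the document text into the prompt. A minimal sketch of that idea is below; the helper function, document text, and question are illustrative, and the commented-out API call follows the OpenAI Python SDK (one of the partners Levie names), not any Box-specific interface.

```python
def build_document_prompt(document: str, question: str) -> list[dict]:
    """Assemble a chat-style prompt that asks `question` about `document`.

    The system message instructs the model to answer only from the
    supplied document, which is the grounding step behind the
    'ask questions of your content' use cases Levie describes.
    """
    system = (
        "Answer the user's question using only the document below. "
        "If the answer is not in the document, say so.\n\n"
        f"--- DOCUMENT ---\n{document}\n--- END DOCUMENT ---"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

# Illustrative HR-document example from the quote above.
messages = build_document_prompt(
    document="Employees may take up to 16 weeks of paid parental leave.",
    question="How many days of parental leave can I take?",
)

# With the OpenAI SDK, the prompt would then be sent like this
# (network call, not executed here; model name is an assumption):
# from openai import OpenAI
# reply = OpenAI().chat.completions.create(model="gpt-4o", messages=messages)
```

In production, the "safely, securely, and compliantly" part Levie emphasizes is the hard piece: permissions checks before the document ever reaches the model, and retrieval to select only the relevant content rather than sending whole repositories.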
Neutrality on AI will matter to enterprise customers. Levie noted:
"As a platform-neutral vendor, we will also be AI-neutral, which means as new AI breakthroughs emerge from more vendors over time, we’ll be in a position to bring the full power of their technology to Box and our customers. In addition to our collaboration with OpenAI, we recently announced that we are building on our strategic partnership with Google Cloud to integrate Google’s advanced AI models into Box AI to create new ways for joint customers to work smarter and more productively with generative AI."
OpenAI collaboration will likely lead to more Microsoft integration for Box. Levie said:
"One, we're partnering with OpenAI, which in turn leads you to partnering more broadly over time with Microsoft as well, given that the OpenAI models generally run on Azure. So there's, I think, a lot of exciting potential in that collaboration, and it's an area where we're going to cooperate with them pretty meaningfully. Customers will be able to leverage essentially the same AI that they would see in Microsoft products, but within Box as well. So that adds a bit of benefit to our relationship there with Copilot."
New use cases, table stakes and new models. From a product standpoint, Levie said some generative AI capabilities will simply be table stakes: generating content with AI, asking questions about content, and AI-driven workflows. Incremental monetization could come through platform APIs or multiproduct suites. "What we've already shown, and are very public about now, is that we are going to be building this technology full force, and we think it's transformational in how we can work with our unstructured data and our content," said Levie.
"I think everybody is trying to figure out their strategy of how they bring generative AI to their enterprise use cases, which is going to -- which requires a substantial amount of work in kind of the abstraction layer between AI models, customer data and cloud infrastructure, and that's exactly what we're building out."
Productivity gains. Levie said:
"With AI, I think you have such a rapid alignment of customers on testing new use cases, trying out new products and capabilities. Obviously, things like ChatGPT have been front and center. Some companies are fully banning that; some companies are enabling it. And what I think companies are trying to figure out is where the productivity gain is going to most come from. Is it going to come from going into an AI interface like ChatGPT, asking a question, and getting an answer back? Or is it going to come from AI reasoning over existing data and existing workflows in an enterprise, becoming a productivity boost for those kinds of use cases?"
He added that his personal opinion was that generative AI is going to boost productivity by taking on a variety of subtasks a knowledge worker has to handle.
"I strongly believe that this is going to have a net positive impact on knowledge worker productivity, as opposed to a net replacement of large swaths of knowledge work. If you look at the actual tasks that any one of us does in our jobs, across our roughly 2,500 employees or anybody that we interact with, the vast majority of work that we're actually doing is a collection of many subtasks; hundreds, thousands of subtasks that require us to maintain a large degree of context.
And I think AI is going after those individual subtasks, and in some cases collections of subtasks, in a way that will just make us more productive overall. Maybe some roles will be 5% more productive, some roles may be 50% more productive. But I think the net result is that we accelerate into the future faster, as opposed to simply doing less work.
Instead of having a sales rep or an engineer waste time searching for information, they can be doing the more fun, productive parts of their job: working with a customer, or getting code released and building a feature. And so I think that's the kind of impact on the total knowledge worker population.
So, I'm firmly in the optimist camp on this one in terms of what it does to jobs."