Oracle said its generative AI managed service on Oracle Cloud Infrastructure (OCI) is generally available, and the enterprise software giant plans to infuse it throughout its database and application offerings.

The company said its OCI Generative AI service will integrate large language models (LLMs) from Cohere and Meta's Llama 2 for multiple business use cases. Those two options fall short of the lineups offered by other hyperscalers, which have taken a shopping-mall approach to models, but the way Oracle is deploying the generative AI service may appeal to enterprises.

In a blog post, Greg Pavlik, senior vice president of OCI, said:

"We took a holistic approach to generative AI as we thought through the complete picture of what enterprises truly need to successfully implement generative AI. We’re also increasingly adapting models to real-world enterprise scenarios."


The OCI Generative AI service can be used in Oracle Cloud or on-premises via an OCI Dedicated Region. That twist may appeal to the regulated industries that Oracle already caters to, said Constellation Research analyst Andy Thurai. He said:

"While OCI's managed LLM as a service, via API access, is a compelling option, it is currently limited to just Cohere and Meta's Llama 2. Currently, the use cases are also very limited to text generation, summarization, and semantic similarity tasks.

Oracle's option to use the generative AI service in the Oracle cloud and on-premises via OCI dedicated region is a somewhat unique proposition that might be interesting to some large enterprise customers -- especially the ones in regulated industries.

In terms of overall generative AI offerings, Oracle is far behind all three cloud providers. However, the option to integrate generative AI in Oracle's ERP, HCM, SCM, and CX applications running on OCI could make this offering more attractive, if priced right."

Doug Henschen, Constellation Research analyst, said:

"It's notable that Oracle is hosting two moderately sized foundation models that promise lower-cost operation than the large public models. Use of Cohere models will be indemnified by Oracle while Llama 2 is an open source option that will enable customers to build custom models. Being hosted on OCI, both options keep data and model training entirely inside Oracle's cloud, avoiding cross-cloud calls to external public models."

Here are the key points of the announcement at Oracle's CloudWorld Tour:

  • OCI Generative AI service supports more than 100 languages and features improved GPU cluster management.
  • Oracle will embed AI across its applications and converged database via pre-built services rather than a toolkit.
  • Enterprises can consume the service via API calls.
  • Customers can refine the OCI Generative AI service models with retrieval-augmented generation (RAG) techniques. To that end, OCI Generative AI Agents is in beta and can be combined with a RAG agent for customization.
  • The OCI Generative AI Agents beta supports OCI OpenSearch, and Oracle said it will extend support to Oracle Database 23c and MySQL HeatWave via their vector search tools.
  • Oracle said it will deliver prebuilt agent actions across its suite of SaaS applications.
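Oracle has not published implementation details for the RAG customization described above, but the general pattern is well established: retrieve the enterprise documents most relevant to a query (typically via vector search, as in the Oracle Database 23c and MySQL HeatWave integrations mentioned), then ground the model's prompt in them. The sketch below illustrates that pattern only; every name in it is illustrative, and the toy bag-of-words similarity stands in for a real embedding model and vector store.

```python
# Illustrative sketch of the retrieval-augmented generation (RAG) pattern,
# not the OCI Generative AI Agents API. A real deployment would use an
# embedding model and a vector store (e.g., OCI OpenSearch) for retrieval.
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding' standing in for a real vector model."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=1):
    """Rank documents by similarity to the query (vector-search stand-in)."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, documents):
    """Prepend retrieved context so the LLM answers from enterprise data."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [
    "Expense reports must be filed within 30 days of travel.",
    "The cafeteria is open from 8am to 3pm on weekdays.",
]
prompt = build_prompt("When are expense reports due?", docs)
# The prompt now carries the expense-policy document as grounding context;
# it would be sent to a hosted model (Cohere or Llama 2, in Oracle's case).
```

The appeal of this approach for regulated industries is that only the retrieved snippets reach the model, and in Oracle's hosted setup both the data and the model stay inside OCI.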


In addition, Oracle launched OCI Data Science AI Quick Actions, a no-code feature of the OCI Data Science service that will enable integration with multiple LLMs and open-source models.

The company said it also improved its existing AI offerings for vision, speech, document understanding and translation.
