Oracle to fine-tune AI for the enterprise
Oracle’s plans to strengthen its artificial intelligence (AI) offerings may be just one of many such announcements to come in the enterprise application space, but the database and application suite giant has, in fact, been working to add AI across its stack for some time.
In September 2023, it announced Oracle Cloud Infrastructure (OCI) Generative AI in beta. The service has now moved to general availability.
In January 2024, Oracle also announced a number of new beta products, including OCI Generative AI Agents – built on top of its OCI OpenSearch service – and OCI Data Science AI Quick Actions. The betas for these will open in February 2024.
But perhaps of most interest to Oracle customers, as well as CIOs and chief data officers looking to apply AI tools to their corporate data, are Oracle’s plans to add pre-built AI agent actions across its software-as-a-service (SaaS) products. These include the Oracle Fusion Cloud Applications Suite and Oracle NetSuite, as well as the Oracle Cerner platform for healthcare records.
The plans are part of a significant investment by Oracle in AI. According to Holger Mueller, principal analyst and vice-president at Constellation Research: “[Oracle] have been investing 50% of free cash flow in cloud. Most of it went to AI.” Much of that has, in turn, been invested in Nvidia technology, he said.
The move is partly defensive and partly a stake in the ground: a large percentage of the world’s enterprise data sits in Oracle databases and in platforms that run on Oracle technology.
The technology company will not want to see customers move away from its platforms, or add other vendors’ AI to their stacks, in order to take advantage of technologies such as large language models (LLMs).
And CIOs will not want to lift and shift large volumes of data from their existing Oracle infrastructure – whether on premises or in OCI – to another supplier, if they can avoid it.
“We don’t require customers to move their data outside the data store to access AI services. Instead, we bring AI technology to where our customers’ data resides,” Vinod Mamtani, Oracle’s vice-president and general manager for OCI generative AI services, told the media during the January launch event for its OCI generative AI products.
In essence, Oracle customers will be able to use their own private data to “fine-tune” public LLMs, delivering results that are specific to their organisation.
Oracle will give access to Oracle Database 23c with AI Vector Search and, Mamtani said, MySQL HeatWave with Vector Store. This is fundamental to Oracle’s strategy: customers should not have to move data to a separate supplier’s database to access AI capabilities.
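The idea behind vector search is straightforward to illustrate. The sketch below is a minimal, conceptual example in plain Python, not Oracle’s own API: the embed() function and the sample documents are hypothetical stand-ins for a real embedding model and a real table. In a database-resident implementation such as the ones Oracle describes, the embeddings would live in a vector column and the database engine would perform the distance search.

```python
# Conceptual sketch of vector similarity search; not Oracle's API.
import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    # Hypothetical stand-in for a real embedding model: hashes words
    # into a fixed-size vector so the example runs without any service.
    vec = np.zeros(dim)
    for word in text.lower().split():
        vec[hash(word) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Hypothetical "table" of documents and their precomputed embeddings.
documents = [
    "Quarterly revenue grew in the cloud infrastructure segment",
    "Employee onboarding checklist for the HR portal",
    "Supply chain disruption report for the EMEA region",
]
doc_vectors = np.stack([embed(d) for d in documents])

def search(query: str, k: int = 2) -> list[tuple[float, str]]:
    # Score every stored embedding by cosine similarity, highest first.
    q = embed(query)
    scores = doc_vectors @ q
    top = np.argsort(scores)[::-1][:k]
    return [(float(scores[i]), documents[i]) for i in top]

for score, doc in search("cloud revenue growth"):
    print(f"{score:.2f}  {doc}")
```

Keeping this search inside the database, rather than exporting embeddings to a separate system, is the design choice Oracle is emphasising: the data never leaves the store it already lives in.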
Additionally, Oracle will launch OCI Data Science AI Quick Actions, again in beta from February 2024. These tools will give Oracle users “no-code” access to a number of open source LLMs, including those from Cohere, Meta and Mistral AI.
Oracle will also add generative AI capabilities into its database offerings, so Oracle shops can build their own AI-powered applications. According to Mamtani, the technology could also be used to help organisations manage their technology stacks.
But for many business users, the real value could lie not just in better analytics, but in Oracle’s plans to add AI into its cloud-based enterprise applications.
“What we have done over the past two quarters is to infuse generative AI technology into every application that’s out there, whether it’s HCM [human capital management], SCM [supply chain management], ERP, CX or NetSuite,” noted Mamtani. “We have built hundreds of use cases where we have generative AI solutions integrated into the apps.”
Here, Oracle appears to be moving ahead of its competitors by bringing in AI tools for both data scientists and general users.
“It is significant as it is the first IaaS-PaaS-SaaS announcement,” said Constellation’s Mueller. “All other players offer only one or two pieces of the whole part. It is also critical as most of the world’s mission-critical transactional data is in an Oracle database – and enterprises need to get to the data for their AI endeavours.”
Doing away with the need to move data removes many of the practical obstacles around transferring what can be very large data sets between platforms, or even between cloud providers.
And enterprises should gain better results from training AI against their own data rather than the public data used to build current LLMs. By definition, those models are trained on the public data that is available, not the specific information firms hold on their own systems.
Guarding against ‘AI hallucinations’
Oracle’s first step in this direction will be to launch a RAG – or retrieval-augmented generation – agent in beta. The OCI Generative AI Agents service with RAG agent sets out to combine LLMs and enterprise search.
“The key idea behind this agent is we have enterprises that have huge knowledge bases with unstructured data or documents in them,” said Mamtani. “Now [LLM] models have been trained on public datasets, they haven’t been trained on enterprise datasets that are within the enterprise premises. Enterprises want to be able to leverage [their] data. They want their employees to be able to talk to their data.”
The aim is to allow employees to use natural language to interrogate enterprise data through OpenSearch, and the agent will return results along with citations and references to the documents and data sources it has used. Employees can ask follow-up questions, which the agent can then use to refine its results.
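The retrieve-then-generate flow Mamtani describes can be sketched in a few lines. The example below is an illustration under assumed names, not the OCI Generative AI Agents service: KNOWLEDGE_BASE, retrieve() and call_llm() are hypothetical placeholders for an enterprise document store, an OpenSearch-style index and a hosted LLM. What it shows is how retrieved passages are passed to the model as context, and how source names are carried through so the answer comes back with citations.

```python
# Minimal retrieval-augmented generation (RAG) sketch; names are hypothetical.

KNOWLEDGE_BASE = {
    "policy-042.pdf": "Staff may carry over five days of unused leave into the next year.",
    "handbook-ch3.pdf": "Expense claims must be submitted within 30 days of travel.",
}

def retrieve(question: str, k: int = 2) -> list[tuple[str, str]]:
    # Stand-in for enterprise search: rank documents by shared words.
    q_words = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE.items(),
        key=lambda item: len(q_words & set(item[1].lower().split())),
        reverse=True,
    )
    return scored[:k]

def call_llm(prompt: str) -> str:
    # Placeholder for a hosted model; a real agent would send the prompt
    # to an LLM and return an answer grounded in the supplied context.
    return "[model answer based on the context above]"

def answer(question: str) -> str:
    sources = retrieve(question)
    # Ground the model in retrieved passages and keep the source names
    # so the response can be returned with citations.
    context = "\n".join(f"[{name}] {text}" for name, text in sources)
    prompt = (
        "Answer using only the context below and cite the sources.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    reply = call_llm(prompt)
    citations = ", ".join(name for name, _ in sources)
    return f"{reply}\nSources: {citations}"

print(answer("How many leave days can I carry over?"))
```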
This process should help the agents guard against “AI hallucinations”, where AI models return erroneous results. Oracle will also add content moderation to its AI agents, so that firms do not waste resources running prompts that will not produce relevant results. Oracle plans support for more than 100 languages.
The AI tools will run in the Oracle cloud, or on premises via OCI Dedicated Region. This will be an important consideration for organisations that have to comply with data sovereignty rules, or simply want to be absolutely sure they know where their information is being held.
Oracle is also taking further steps to ensure security and data privacy, which is hardly a surprise given the sensitivity of critical enterprise data. Models are hosted on OCI, data is not shared with LLM providers such as Cohere or Meta, and there is no intermixing of customer data sets. These steps are essential if customers using the “fine-tuning” model are to trust the process.
In all, Oracle’s venture into generative AI appears to offer plenty of potential for businesses that already operate Oracle infrastructure and Oracle cloud technologies.
Constellation’s Mueller pointed out that Oracle’s investment in Nvidia technology also makes it a logical home for firms that use Nvidia-based AI. In fact, both Nvidia and Microsoft use Oracle to run AI workloads.
Oracle is likely to face competition both from cloud providers such as Google Cloud Platform and from more hardware-focused AI offerings such as those from HPE.
“But if you have Nvidia AI workloads, or plan to have [them], Oracle is a very attractive place. And that is what CIOs are looking for now,” Mueller said, adding that CIOs are also weighing cost, privacy and compliance.
“Oracle is good on all three,” he said. “It is faster: it turns out building a cloud for databases is a good cloud for AI. Given your database runs there, privacy is given. For compliance, Oracle has an ace here with its aggressive location strategy.”