Open source AI models are closing the gap with their closed source counterparts in the debate between open and closed models.
Since the introduction of Meta's Llama generative AI models in February 2023, more enterprises have started to run their AI applications on open source models.
Cloud providers like Google have also noticed this shift and have accommodated enterprises by introducing models from open source vendors such as Mistral AI and Meta. At the same time, proprietary closed source generative AI models from OpenAI, Anthropic and others continue to attract widespread enterprise interest.
But the growing popularity of open source and open models has also opened the door for AI vendors like Together AI that support enterprises using open source models. Together AI runs its own private cloud and provides managed services for model fine-tuning and deployment. It also contributes to open source research models and databases.
"We do believe that the future includes open source AI," said Jamie De Guerre, senior vice president of product at Together AI, on the latest episode of TechTarget's Targeting AI podcast.
"We think that in the future there will be organizations that do that on top of a closed source model," De Guerre added. "However, there's also going to be a significant number of organizations in the future that deploy their applications on top of an open source model."
Enterprises use and fine-tune open source models for concrete reasons, according to De Guerre.
For one, open models offer enterprises more privacy controls within their own infrastructure, he said. Enterprises also have more flexibility. When organizations customize open source models, the resulting model is something they own.
"If you think of organizations making a significant investment in generative AI, we think that most of them will want to own their destiny," he said. "They'll want to own that future."
Enterprises can also choose where to deploy their fine-tuned models.
However, there are degrees of openness that separate what is fully open source from what is merely an open model, De Guerre said.
An open model refers to a model released by a vendor without the training data or the training code used to build it; only the model weights are provided.
"It still provides a lot of value because organizations can download it in their organization, deeply fine-tune it and own any resulting kind of fine-tuned version," De Guerre said. "But the models that go even further to release the training source code, as well as the training data used, really help the open community grow and help the open research around generative AI continue to innovate."
Esther Ajao is a TechTarget Editorial news writer and podcast host covering artificial intelligence software and systems. Shaun Sutner is senior news director for TechTarget Editorial's information management team, driving coverage of artificial intelligence, unified communications, analytics and data management technologies. Together, they host the Targeting AI podcast series.