The Expanding World of AI and Postgres

September 25, 2024

It wasn’t long ago that AI was considered a niche topic of interest reserved for researchers and academics. But as AI/ML engineers with extensive research backgrounds entered the industry, AI transitioned from a specialized area of study to a widely adopted and influential technology. Today, everyone’s talking about AI–from software engineers and cybersecurity analysts to linguists, front-end engineers, and the world at large.

In a recent conversation, EDB Technical Fellow Marc Linster and Machine Learning Engineer Bilge Ince discussed how AI evolved and how Postgres and EDB are contributing to this journey. Watch the full conversation on YouTube here.


How did AI go from a niche interest to the central focus of our industry?

Bilge explained how the demand for AI is driving its integration into various products, especially software technology. 

“We’re in the midst of a multi-year phase of democratization and commoditization of AI,” says Bilge, “and I believe this happened with the rise of foundational models.” 

Foundational machine learning models and large language models (LLMs) have significantly accelerated the capability, and therefore the adoption, of AI by enabling the representation and flexible distribution of vast knowledge and experience.

Is EDB developing AI for Postgres? Or Postgres for AI?

There’s a critical relationship between a strong model and its data. In order for AI to work, the data must be stored and delivered in different formats that meet real-world needs. 

Many enterprises today trust Postgres to store and manage their critical business data due to its enterprise-grade quality of service, resilience, security, compliance, support, scalability and performance. As AI becomes more common in production environments, where real data and high standards of quality are essential, Postgres stands out as the ideal platform. Its robust, battle-tested infrastructure ensures AI applications are not only powerful but also reliable and secure, meeting the stringent demands of modern businesses. As Marc Linster states, “AI is tied to Postgres because there’s no AI without data.” 

By integrating AI with Postgres, enterprises are equipped to derive actionable insights, make strategic data-driven decisions, and maintain a competitive edge in the market.

So it works both ways: AI is for Postgres. And Postgres is for AI. 

“AI is tied to Postgres because there's no AI without data.” 

– Marc Linster, EDB Technical Fellow 

What are the key challenges EDB is tackling in the AI space?

EDB is focused on creating a top-tier platform for implementing AI solutions by leveraging the operational maturity and strengths of the Postgres platform. While Postgres provides a strong foundation, AI, especially generative AI, increasingly relies on non-traditional data that a relational database may struggle to handle efficiently. That’s why one of our primary technical priorities is to enhance Postgres to accommodate new methods of storing and managing large amounts of multimodal data in a disaggregated lakehouse architecture.

The world is talking about LLMs/foundational models, but in the Postgres context, we're talking about vectors. How are these connected?

Everyone is focused on LLMs because we communicate through language, and LLMs are built to model it. They were designed to satisfy a range of language needs, such as translation, text generation, and summarization.

The core capability of LLMs is representing words, phrases, and entire texts as numerical vectors through a process known as embedding. Input text is converted to a multi-dimensional vector by an embedding model, which captures its semantic meaning and contextual relationships.

Arithmetic operations are used to understand how close the vectors are in a vector space. As vectors get closer, they’re considered to have similar meanings. For example, if we’re looking for fast cars, we’ll probably find Porsche, Ferrari and Lamborghini close together. Fiat 500, Lada, and Topolino are probably far away from these. Vectors allow us to represent these complex semantic things. 
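The "closeness" above is typically measured with cosine similarity. Here is a minimal sketch using tiny, made-up 3-dimensional vectors (real embedding models produce hundreds or thousands of dimensions):

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity: closer to 1.0 means more similar in direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" invented for illustration only.
embeddings = {
    "Porsche":  [0.90, 0.80, 0.10],
    "Ferrari":  [0.85, 0.90, 0.15],
    "Fiat 500": [0.10, 0.20, 0.90],
}

# Sports cars cluster together; the city car sits far away.
print(cosine_similarity(embeddings["Porsche"], embeddings["Ferrari"]))   # high
print(cosine_similarity(embeddings["Porsche"], embeddings["Fiat 500"]))  # low
```

In practice the same comparison is done over vectors produced by an embedding model, not hand-written ones.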

With the rise of AI, there’s been a growing need to handle vector data within databases. Just as Postgres can handle geographic data, text, and more, it can also handle vectors with ease. The pgvector extension supports vector operations in Postgres, making it possible to store, index, and query vector data efficiently. Together, Postgres and pgvector leverage the strengths of both AI models and relational databases to power advanced, efficient, and scalable applications.

“As [Postgres] is a multimodal representation system that can handle geographic data, texts, and more, it can also handle vectors very easily. This is what makes it an ideal AI platform.”

– Bilge Ince, Machine Learning Engineer, EDB 

What are the differences between service-based and proprietary LLMs?

Let's start with the service-based models like OpenAI's GPT models and Anthropic's Claude. These are generally considered closed source. They’re easily accessed through APIs, and you don't need to worry about hosting or maintaining the model yourself. They're convenient, but can be costly for high-volume use and may have limitations on customization.

On the other hand, there are models you can download and run locally or on your own servers like Mistral. These give you more control and potentially lower costs for high usage, but require more technical expertise and computational resources.

There's also a middle ground of "open weights" models, where the model architecture and weights are available, but the training code isn't. Llama 3 by Meta is a great example. Models like Llama give you fine-tuning and deployment flexibility while still protecting the creator's intellectual property.

Choosing between these options depends on factors like control, cost, scalability, and privacy. Service-based models are easier to use and scale, while downloadable models offer more control and privacy. Open source models provide transparency and customization, whereas closed-source models often come with optimized performance and professional support.

How do ‘context size,’ ‘fine-tuning,’ and ‘Retrieval Augmented Generation’ relate to LLMs?

Context size refers to the maximum amount of text (tokens) that an LLM can process at once. This limit is crucial because it affects how much information the model can consider when generating a response. 
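One practical consequence of a context limit is that long inputs must be trimmed to fit. The sketch below uses a naive whitespace "tokenizer" purely for illustration; real LLMs count subword tokens (e.g. BPE), so actual limits differ:

```python
def truncate_to_context(text, max_tokens):
    """Keep only the first max_tokens 'tokens' of the text.
    Whitespace splitting is a rough stand-in for a real subword tokenizer."""
    tokens = text.split()
    return " ".join(tokens[:max_tokens])

doc = "Postgres stores and manages critical business data. " * 100
prompt = truncate_to_context(doc, max_tokens=50)
print(len(prompt.split()))  # 50
```

Production systems usually do something smarter than hard truncation, such as chunking the document and retrieving only the relevant pieces, which is exactly what RAG addresses.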

Fine-tuning is the process of further training an existing LLM on your specific data. For example, a general LLM can be fine-tuned on medical texts to improve its performance on healthcare-related queries. This makes the model more accurate and specialized for specific use cases.

Retrieval Augmented Generation (RAG) is a technique that combines the broad knowledge of an LLM with specific information retrieved from a separate database. It's a way to give the LLM access to up-to-date, proprietary data that it wasn't trained on. EDB Envoy and our Migration Portal’s AI Co-Pilot are good examples of RAG usage.
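The RAG flow is: embed the question, retrieve the most similar documents from your store, and place them in the prompt. The sketch below is a toy end-to-end version; the character-frequency `embed` function is a made-up stand-in for a real embedding model, and the final LLM call is omitted:

```python
from math import sqrt

def l2(a, b):
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def embed(text):
    # Stand-in embedding: letter-frequency vector over 'a'..'j'.
    # A real pipeline would call an embedding model here.
    return [text.lower().count(c) / max(len(text), 1) for c in "abcdefghij"]

def retrieve(question, documents, k=1):
    """Return the k documents whose embeddings are closest to the question's."""
    q = embed(question)
    return sorted(documents, key=lambda d: l2(q, embed(d)))[:k]

def build_prompt(question, documents):
    context = "\n".join(retrieve(question, documents, k=1))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

docs = [
    "EDB adds pgvector support to Postgres.",
    "Bananas are rich in potassium.",
]
prompt = build_prompt("How does Postgres handle vectors?", docs)
print(prompt)  # the relevant Postgres document is pulled into the prompt
```

In a Postgres-backed RAG system, the `retrieve` step is simply a pgvector similarity query against your own tables, which is what keeps the LLM's answers grounded in your data.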

These techniques help bridge the gap between the LLM's general knowledge and your specific requirements, making the model much more useful and accurate for your particular use case.

How is EDB using AI today?

At EDB, we’re integrating AI into a number of our solutions. We utilize AI in EDB Envoy, our new custom-built chatbot that helps customers get more out of their data. You can try it on our homepage today. We also use AI to translate Oracle code into Postgres code and streamline migration through our Migration Portal.

We’re contributing to open source by adding pgvector support to PrivateGPT, enabling users to ask questions about their documents using the power of LLMs without internet connectivity. The goal is to provide a similar experience as ChatGPT and the OpenAI API, while mitigating privacy concerns.

“AI is made for Postgres. And Postgres is made for AI.”

– Bilge Ince, Machine Learning Engineer, EDB 

We’re also working on combining the proprietary aidb extension with the open source pgvector extension to create a consolidated solution for vector storage and similarity search, simplifying AI application development. We aim to provide a full platform where developers can easily implement enterprise-grade AI solutions without delving into complex AI implementations, while still having the same feature capabilities. Learn more.

With advanced, easily accessible AI capabilities integrated with robust data management solutions powered by Postgres, the sky's the limit.

Watch the full World of AI discussion 

Preview the EDB Postgres AI database

Explore our Envoy chatbot 

