Unleashing AI with PostgreSQL: What is Generative AI?

August 15, 2024

We created our Unleashing AI video series and blog series to help users understand and capitalize on the benefits of AI and Postgres. In our first and second blog articles, we outlined the different types of AI and explored the business and operational benefits of data lakehouses. Now we’ll move on to cover generative AI in more detail, since it’s changing the way information is accessed, created, and used.

Today there’s a lot of buzz around generative AI, which is a form of deep learning. While deep learning isn’t new, generative AI stands apart because it combines multiple deep learning techniques to generate new data and content rather than merely analyze existing data. The architecture behind this capability is the “transformer,” so named because it transforms input data, through a series of processing steps, into output data.

Transformer architecture powers large language models (LLMs), which are the driving force behind the generative AI industry. While multiple vendors offer commercial LLMs, an open ecosystem has also evolved around the Hugging Face platform, where models and datasets are available for free.

Understanding encoder and decoder models

Transformer architecture includes two phases (a short code sketch follows the list):

  • Encoder Phase: Input data (text, images, audio) is encoded into embeddings (arrays of floating-point numbers) that capture the semantic meaning of the input.
  • Decoder Phase: The embeddings are fed into a decoder that generates new output data based on the encoded semantics.
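Both phases are visible in a few lines of Python. Here’s a minimal sketch using the open Hugging Face transformers library and the small t5-small checkpoint; the model and prompt are illustrative, and any sequence-to-sequence model exposes the same encoder/decoder split:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# t5-small is a small, freely available encoder-decoder model.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

text = "translate English to German: The cat sat on the mat."
inputs = tokenizer(text, return_tensors="pt")

# Encoder phase: input tokens become embeddings, arrays of floating-point
# numbers that capture the input's semantics.
encoder_outputs = model.get_encoder()(**inputs)
print(encoder_outputs.last_hidden_state.shape)  # (batch, tokens, embedding_dim)

# Decoder phase: the model decodes those semantics into new output text.
output_ids = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```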

A wide range of generative AI applications are powered by the encoder-decoder process (two of them are sketched in code after the list):

  • Machine translation: Encode text in one language into embeddings, decode into another language.
  • Summarization: Encode lengthy text into embeddings, decode into a concise summary.
  • Classification: Encoder-only models don’t generate new unstructured data; instead, they encode the input into embeddings and output a classification score (e.g., detecting profanity).
  • Language generation: Decoder-only models such as GPT (Generative Pre-trained Transformer) generate new data and human-like text continuations.
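The Hugging Face pipeline API wraps several of these application types behind one interface. A hedged sketch, assuming the transformers library is installed; the checkpoints are illustrative defaults, not recommendations:

```python
from transformers import pipeline

# Encoder-only classification: the input is encoded into embeddings and
# reduced to a label plus a confidence score.
classifier = pipeline("sentiment-analysis")
print(classifier("Postgres makes AI workloads straightforward."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]

# Decoder-only generation: a GPT-style model continues the prompt with
# new, human-like text.
generator = pipeline("text-generation", model="gpt2")
print(generator("Generative AI is", max_new_tokens=20)[0]["generated_text"])
```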

Embeddings are the secret sauce

To understand the generative AI landscape, it’s crucial to recognize the role of embeddings in the transformer architecture. Embeddings, arrays of floating-point numbers, act as the bridge connecting the encoder and decoder components. As numerical representations of input data that capture its essential features, they are the secret sauce that ties the entire generative AI process together.
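Because embeddings are just arrays of numbers, they can live in Postgres right next to the data they describe. Below is a minimal sketch, assuming a Postgres instance with the pgvector extension available and the psycopg, pgvector, and sentence-transformers Python packages installed; the table name, model, and connection string are illustrative:

```python
import psycopg
from pgvector.psycopg import register_vector
from sentence_transformers import SentenceTransformer

# all-MiniLM-L6-v2 is a small open encoder that produces 384-dimensional
# embeddings; any encoder model works the same way.
model = SentenceTransformer("all-MiniLM-L6-v2")
text = "PostgreSQL can store and search embeddings."
embedding = model.encode(text)

with psycopg.connect("dbname=postgres") as conn:  # illustrative connection string
    conn.execute("CREATE EXTENSION IF NOT EXISTS vector")
    register_vector(conn)  # lets psycopg send and receive pgvector values
    conn.execute(
        "CREATE TABLE IF NOT EXISTS docs "
        "(id serial PRIMARY KEY, body text, embedding vector(384))"
    )
    conn.execute(
        "INSERT INTO docs (body, embedding) VALUES (%s, %s)", (text, embedding)
    )
    # '<->' is pgvector's Euclidean distance operator: nearest neighbors first.
    row = conn.execute(
        "SELECT body FROM docs ORDER BY embedding <-> %s LIMIT 1", (embedding,)
    ).fetchone()
    print(row[0])
```

Keeping embeddings in the same database as the source rows means the SQL you already use to manage data also powers semantic similarity search.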

Overall, the transformer architecture, which combines encoders, decoders, and embeddings, underpins generative AI, allowing input data to be transformed into novel generated outputs by leveraging the capabilities of large AI models.

EDB’s Chief Architect for Analytics and AI, Torsten Steinbach, discusses generative AI. Watch the video here.

EDB Postgres AI for AI workloads

EDB Postgres AI delivers unparalleled flexibility for enterprise AI, enabling seamless data access for large language models (LLMs). To see how EDB Postgres AI can unlock transformative AI capabilities for your business, just reach out.
 

Watch the video to learn more about generative AI 

Read the white paper: Intelligent Data: Unleashing AI with PostgreSQL 
