Introduction


Vectors are fundamental to LLMs and, by extension, RAGs. Because vectors encode meaning as positions in a shared space, they enable semantic search, and understanding how they work is essential to getting the most out of RAGs — helping you tune your systems for optimal results.

Different RAGs have different requirements — some prioritize precision, while others need the backing LLM to be more creative in responding to prompts. Azure AI Search supports vector search and provides extensive configuration options.

Before using vector search, you need to create embeddings. An embedding represents a piece of data as a vector in a high-dimensional space, where similar meanings map to nearby points. This makes it possible to index not only text — the most common data type in traditional search — but many other data types as well.
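To make the idea concrete, here's a minimal sketch — independent of any Azure SDK, with made-up four-dimensional vectors standing in for real embeddings (which typically have hundreds or thousands of dimensions). It shows the core operation behind vector search: comparing embeddings with cosine similarity, so that semantically related items score higher than unrelated ones.

```python
import math

def cosine_similarity(a, b):
    """Score how closely two embedding vectors point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy embeddings — real models produce far larger vectors.
vec_cat = [0.9, 0.1, 0.0, 0.2]
vec_kitten = [0.8, 0.2, 0.1, 0.3]
vec_car = [0.1, 0.9, 0.8, 0.0]

# Related concepts land close together in the vector space...
print(cosine_similarity(vec_cat, vec_kitten))  # high score
# ...while unrelated ones land far apart.
print(cosine_similarity(vec_cat, vec_car))     # low score
```

In Azure AI Search, an embedding model generates these vectors for your documents and queries, and the service performs this kind of similarity comparison at scale over the index.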

By the end of this lesson, you will learn to:

  • Configure vector search capabilities in Azure AI Search.
  • Generate and store embeddings for efficient semantic search.
  • Implement a basic vector search query using Azure AI Search.