Vectors are fundamental to LLMs and, by extension, to retrieval-augmented generation (RAG). Because vectors encode meaning numerically, they enable semantic search, and understanding how they work is essential to getting the most out of a RAG system and fine-tuning it for optimal results.
Different RAG systems have different requirements: some prioritize precision, while others need the backing LLM to respond to prompts more creatively. Azure AI Search supports vector search and provides extensive configuration options to tune for either goal.
Before using vector search, you need to create embeddings. Embeddings represent data as points in a vector space, where semantically similar items sit close together. This makes it possible to index not only text, the most common data type in traditional search, but many other data types as well.
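To build intuition for why nearby points in a vector space mean similar content, here's a minimal sketch. The three-dimensional "embeddings" below are invented toy values (real embedding models, such as those available through Azure OpenAI, produce hundreds or thousands of dimensions); only the cosine-similarity math is standard.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity measures the angle between two vectors,
    # ignoring their magnitudes: 1.0 means same direction, 0 means
    # orthogonal (unrelated).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" — values invented for illustration.
dog = [0.9, 0.1, 0.0]
puppy = [0.8, 0.2, 0.1]
car = [0.0, 0.1, 0.9]

print(cosine_similarity(dog, puppy))  # high: related meanings
print(cosine_similarity(dog, car))    # low: unrelated meanings
```

This is exactly the comparison a vector index performs at scale: the query is embedded into the same space, and the closest stored vectors are returned as the most semantically relevant results.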
By the end of this lesson, you will learn to:
Configure vector search capabilities in Azure AI Search.
Generate and store embeddings for efficient semantic search.
Implement a basic vector search query using Azure AI Search.
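As a preview of the third objective, here's a hedged, in-memory sketch of what a basic vector search query does conceptually. The `vector_search` function, document IDs, and vectors are all invented for illustration; Azure AI Search performs this matching server-side, typically with an approximate algorithm rather than the exhaustive scan shown here.

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def vector_search(query_vector, index, k=2):
    # Score every document against the query vector and return the
    # IDs of the k closest matches. A real search service avoids the
    # full scan by using an approximate nearest-neighbor index.
    scored = [
        (cosine_similarity(query_vector, doc["vector"]), doc["id"])
        for doc in index
    ]
    scored.sort(reverse=True)
    return [doc_id for _, doc_id in scored[:k]]

# A tiny in-memory "index" — vectors invented for illustration.
index = [
    {"id": "doc-a", "vector": [0.9, 0.1]},
    {"id": "doc-b", "vector": [0.1, 0.9]},
    {"id": "doc-c", "vector": [0.7, 0.3]},
]

print(vector_search([1.0, 0.0], index, k=2))  # → ['doc-a', 'doc-c']
```

The lesson shows how to express the same idea as a real query against an Azure AI Search index, where the service handles scoring and ranking for you.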
This content was released on Nov 15, 2024. The official support period is 6 months from this date.
Introduction to Vector Search with Azure AI Search.