In retrieval-augmented generation (RAG), relevant documents are retrieved from an external dataset and passed to the LLM as additional context at query time.
This is useful when we want to ask questions about specific documents (e.g., our PDFs, a set of videos, etc.).
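Below is a minimal sketch of this flow using LangChain. It assumes the langchain-community, langchain-openai, langchain-text-splitters, and faiss-cpu packages are installed, an OpenAI API key is configured, and the file name, model names, and question are placeholders; exact import paths may differ across LangChain versions.

```python
from langchain_community.document_loaders import PyPDFLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from langchain_community.vectorstores import FAISS

# 1. Load the source document and split it into overlapping chunks.
docs = PyPDFLoader("example.pdf").load()  # placeholder path
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(docs)

# 2. Embed the chunks and index them in a vector store.
vectorstore = FAISS.from_documents(chunks, OpenAIEmbeddings())
retriever = vectorstore.as_retriever(search_kwargs={"k": 4})

# 3. At query time, retrieve the most relevant chunks and give them
#    to the LLM as context alongside the question.
question = "What does the document say about X?"  # placeholder question
context = "\n\n".join(d.page_content for d in retriever.invoke(question))
llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model name
answer = llm.invoke(
    f"Answer using only this context:\n{context}\n\nQuestion: {question}"
)
print(answer.content)
```

In practice the same pattern applies to other sources (videos via transcripts, web pages, etc.): load, chunk, embed, retrieve, then answer with the retrieved context.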
What AI tools did you use?
LangChain
Share a 3–5 minute demo video (Google Drive link only) explaining your core idea, tools used, challenges faced, and how you improved your solution — make sure the link is viewable.