About the Speaker
I have a background in machine learning and natural language processing (NLP), having previously worked at Microsoft and Walmart. I currently contribute to Google's efforts to improve the quality and features of Google Gemini in the multilingual space. I hold a master's degree from IIT Kanpur and a bachelor's degree from NIT Hamirpur. Additionally, I dedicate time to teaching machine learning, deep learning, and LLMs from scratch.
Title
Beyond the Black Box: Using RAG in LLMs
Abstract
While Large Language Models (LLMs) possess a deep 'parameterized' understanding of language, they are limited by a fixed knowledge cutoff and a tendency to 'hallucinate' when they lack specific data. Retrieval-Augmented Generation (RAG) transforms this process by essentially giving the LLM an 'open-book exam': it grounds responses in up-to-date, external documents rather than relying solely on the model's memory. In this talk, we will explore the technical architecture behind RAG, including vector databases and embeddings, and discuss how it provides a cost-effective, real-time solution for building more accurate and reliable AI applications.
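To make the 'open-book exam' idea concrete, here is a minimal sketch of the retrieval step. It is an illustration only: a toy word-count vector stands in for a real embedding model, and a plain Python list stands in for a vector database; the document texts are invented examples.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy "embedding": a sparse word-count vector. A real RAG system would
    # use a learned embedding model; this stand-in keeps the sketch runnable.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# A tiny in-memory "vector database": (embedding, document) pairs.
documents = [
    "Vector databases store embeddings for fast similarity search.",
    "RAG retrieves external documents to ground LLM answers.",
    "A knowledge cutoff limits what a model memorized during training.",
]
index = [(embed(doc), doc) for doc in documents]

def retrieve(query, k=1):
    # Embed the query and return the k most similar documents.
    q = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[0]), reverse=True)
    return [doc for _, doc in ranked[:k]]

def build_prompt(query):
    # The "open-book exam": prepend retrieved context so the LLM grounds
    # its answer instead of relying on parameterized memory alone.
    context = "\n".join(retrieve(query))
    return (f"Context:\n{context}\n\n"
            f"Question: {query}\nAnswer using only the context above.")

print(build_prompt("How does RAG ground answers in external documents?"))
```

In a production pipeline, the same three moves remain (embed, search, prepend), but the embedding function would be a trained model and the linear scan would be replaced by an approximate-nearest-neighbor index.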