Dec 30, 2025
| Term | Definition |
|---|---|
| LLM Hallucination | When an LLM generates plausible but incorrect or fabricated information. |
| Training Cutoff Date | The latest date of the data used to train an LLM; the model lacks knowledge of events after this date. |
| RAG (Retrieval-Augmented Generation) | Approach combining retrieval (vector stores/search) with LLM generation to ground answers. |
| Vector Store | A database that stores embeddings so that context relevant to a query can be retrieved by similarity search. |
| Fine-Tuning | Further training a pre-trained LLM on domain-specific data to improve its accuracy on that domain. |
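The RAG and vector-store entries above can be sketched in miniature. The following is a toy illustration, not a production implementation: the embeddings are hand-written 2-D vectors (real systems use an embedding model), `ToyVectorStore` is a hypothetical name, and the final prompt would be sent to an LLM rather than printed. It shows the core RAG loop: embed documents, retrieve the most similar one by cosine similarity, and ground the generation prompt in that retrieved context.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

class ToyVectorStore:
    """Minimal in-memory vector store: (embedding, text) pairs."""
    def __init__(self):
        self.items = []

    def add(self, embedding, text):
        self.items.append((embedding, text))

    def search(self, query_embedding, k=1):
        """Return the k texts whose embeddings are most similar to the query."""
        ranked = sorted(self.items,
                        key=lambda it: cosine(it[0], query_embedding),
                        reverse=True)
        return [text for _, text in ranked[:k]]

# Hand-written toy embeddings; a real system would compute these with a model.
store = ToyVectorStore()
store.add([1.0, 0.0], "The product launched in March 2025.")   # hypothetical doc
store.add([0.0, 1.0], "Fine-tuning adapts a model to a domain.")

query_vec = [0.9, 0.1]  # toy embedding of the user's question
context = store.search(query_vec, k=1)[0]

# Ground the LLM prompt in retrieved context to reduce hallucination.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: When did the product launch?"
print(context)
```

Because the retrieved passage is injected into the prompt, the model can answer from supplied facts (even ones after its training cutoff) instead of guessing, which is the mechanism by which RAG mitigates hallucination.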