The Vault LLM Assistant plugin lets you query your vault's content or generate new content using large language models. It offers two modes: Query, for asking questions, and Create, for drafting full notes, each with the option to include context from selected folders. Responses include citations linking back to your source files and can be copied as plain text or markdown, or saved directly as notes with AI-generated titles. The plugin supports both OpenAI and Google Gemini, with flexible model settings and folder-level control over which parts of the vault are scanned. It's especially useful for summarising scattered notes, exploring complex ideas, or drafting new content faster.