# Smart Note Agent
An agentic AI assistant plugin for Obsidian that can read and — with your approval — modify vault notes, powered by your choice of LLM provider.
## Features
- Agentic chat — multi-turn conversations with an autonomous tool-calling loop
- Vault tools — full-text search, read notes, list folders, follow backlinks and outgoing links, get the active note and current selection
- Edit mode — create, edit (full replace or unified patch), delete, and move notes; every change is shown as a diff for your review before it is committed
- Multiple providers — OpenAI, Anthropic, DeepSeek, Qwen (Alibaba), Kimi (Moonshot), Zhipu (GLM), MiniMax, OpenRouter, Ollama (local), and any custom OpenAI- or Anthropic-compatible endpoint
- Three modes
  - Ask — read-only; for Q&A and research without touching your vault
  - Edit — full write access; all changes require your approval via a diff UI
  - Scheduled — automated background runs (daily summary, weekly review) with restricted write access
- Scheduled tasks — daily summaries and weekly reviews written automatically to configurable folders
- Auto-compaction — conversation history is compacted transparently when approaching the model's context limit
- Per-provider profiles — separate API key, base URL, and model saved per provider
- User profile — optional personal description injected into every system prompt
- i18n — English and Simplified Chinese UI, auto-detected from Obsidian's language setting
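The auto-compaction behavior can be sketched roughly as follows. This is an illustration only: the names, the 80% threshold, and the 4-characters-per-token estimate are assumptions, not the plugin's actual implementation.

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Rough token estimate: about 4 characters per token for English text.
function estimateTokens(messages: ChatMessage[]): number {
  return Math.ceil(messages.reduce((sum, m) => sum + m.content.length, 0) / 4);
}

// When history approaches the model's context limit, replace the oldest
// messages with a single summary placeholder, keeping the recent turns.
function compactHistory(
  messages: ChatMessage[],
  contextLimit: number,
  keepRecent = 4
): ChatMessage[] {
  if (estimateTokens(messages) < contextLimit * 0.8) return messages;
  const recent = messages.slice(-keepRecent);
  const summary: ChatMessage = {
    role: "system",
    content: "[Earlier conversation compacted: summary produced by the model]",
  };
  return [summary, ...recent];
}
```

In practice the summary would be generated by the model itself; the point is that compaction happens transparently before a request would overflow the context window.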
## Installation

### Community plugins (recommended)

- Open Settings → Community plugins and disable Safe mode if prompted
- Click Browse and search for **Smart Note Agent**
- Click Install, then Enable
### Manual

- Download `main.js`, `manifest.json`, and `styles.css` from the latest release
- Create the folder `<your-vault>/.obsidian/plugins/smart-note-agent/`
- Copy the three files into that folder
- Reload Obsidian and enable the plugin in Settings → Community plugins
## Setup
- Open Settings → Smart Note Agent
- Select your LLM provider and enter your API key
- Optionally set a custom base URL (for self-hosted or proxy endpoints) and model name
- Choose a mode: Ask for read-only access, Edit for write access
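A per-provider profile might look roughly like this; the field names below are illustrative, not the plugin's actual settings schema.

```typescript
// Illustrative shape of a per-provider profile; field names are
// assumptions, not the plugin's real settings schema.
interface ProviderProfile {
  apiKey: string;
  baseUrl?: string; // optional override for self-hosted or proxy endpoints
  model: string;
}

// One profile is kept per provider, so switching providers restores
// that provider's own key, base URL, and model.
const profiles: Record<string, ProviderProfile> = {
  openai: { apiKey: "sk-example", model: "gpt-4o-mini" },
  ollama: { apiKey: "", baseUrl: "http://localhost:11434", model: "llama3" },
};
```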
## Usage

- Click the bot icon in the left ribbon, or run `Open Note Agent` from the command palette
- Type your message and press Enter (or Shift+Enter for a new line)
- In Edit mode, the agent proposes note changes as unified diffs — approve or reject each one before it is written to disk
- Use `New Chat` to start a fresh conversation; previous conversations are saved and accessible via the history panel
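The approval contract can be sketched as below. The types are hypothetical and only illustrate the guarantee that nothing reaches disk without your approval; the plugin's real internals may differ.

```typescript
// Hypothetical representation of a proposed change awaiting review.
interface ProposedEdit {
  path: string;              // target note path inside the vault
  kind: "replace" | "patch"; // full replacement or unified patch
  newContent: string;        // replacement text (for kind "replace")
}

// Nothing is written unless the user approves the diff.
function applyIfApproved(current: string, edit: ProposedEdit, approved: boolean): string {
  if (!approved) return current; // rejected: note left untouched
  if (edit.kind === "replace") return edit.newContent;
  // A "patch" edit would apply a unified diff here; omitted in this sketch.
  return current;
}
```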
## Supported Providers
| Provider | Notes |
|---|---|
| OpenAI | GPT-4o, GPT-4o-mini, o1, o3, etc. |
| Anthropic | Claude 3.5 / 4 series |
| DeepSeek | deepseek-v3, deepseek-r1 |
| Qwen | Alibaba Cloud Dashscope (qwen-plus, qwen-max, etc.) |
| Kimi | Moonshot AI (moonshot-v1 series) |
| Zhipu | GLM-4 series |
| MiniMax | MiniMax-Text series |
| OpenRouter | Any model via openrouter.ai |
| Ollama | Local models — Llama, Mistral, Qwen, etc. |
| Custom | Any OpenAI-compatible or Anthropic-compatible endpoint |
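For the Custom provider, "OpenAI-compatible" means the endpoint accepts the OpenAI Chat Completions request shape. A minimal sketch of such a request follows; the helper is illustrative, not the plugin's actual code, though the URL path and headers follow the public OpenAI API convention.

```typescript
interface ChatCompletionRequest {
  model: string;
  messages: { role: string; content: string }[];
}

// Build the HTTP request for an OpenAI-compatible /v1/chat/completions call.
function buildChatRequest(
  baseUrl: string,
  apiKey: string,
  body: ChatCompletionRequest
): { url: string; method: string; headers: Record<string, string>; body: string } {
  return {
    url: `${baseUrl.replace(/\/+$/, "")}/v1/chat/completions`,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(body),
  };
}
```

Any server that answers this request shape (including local gateways and proxies) can be used via a custom base URL.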
## Disclosures
### Account required
Using a remote LLM provider (OpenAI, Anthropic, DeepSeek, Qwen, Kimi, Zhipu, Z.ai, MiniMax, OpenRouter) requires an account with that provider and a valid API key. Local providers (Ollama, LM Studio) and custom self-hosted endpoints require no account.
### Payment may be required
Remote LLM providers charge for API usage. You are billed directly by the provider you choose — this plugin has no subscription or in-app purchase of its own. Local providers (Ollama, LM Studio) are free.
### Network use
When you send a message, the plugin transmits your message text and relevant vault context to the LLM provider you have configured. No data is sent to any service by default — the plugin is inert until you supply an API key and send a message. The following remote services may be contacted, depending on your provider selection:
| Provider | Endpoint |
|---|---|
| OpenAI | https://api.openai.com |
| Anthropic | https://api.anthropic.com |
| DeepSeek | https://api.deepseek.com |
| Qwen (Alibaba Cloud) | https://dashscope.aliyuncs.com |
| Kimi (Moonshot AI) | https://api.moonshot.cn |
| Zhipu / Z.ai | https://open.bigmodel.cn, https://open.z.ai |
| MiniMax | https://api.minimax.chat |
| OpenRouter | https://openrouter.ai |
Ollama and LM Studio communicate only with localhost — no data leaves your machine.
## Development

```shell
npm install
npm run dev     # esbuild watch mode — rebuilds on save
npm run build   # tsc type-check + production bundle
npm test        # unit tests (Vitest)
```
The plugin artifacts (`main.js`, `manifest.json`, `styles.css`) are output to `dist/` after each build. For local testing, symlink or copy the `dist/` folder into your vault at `<vault>/.obsidian/plugins/smart-note-agent/`.