Smart Note Agent

by binhong87


An agentic AI assistant plugin for Obsidian that can read and — with your approval — modify vault notes, powered by your choice of LLM provider.

Chinese documentation (中文文档)

Features

  • Agentic chat — multi-turn conversations with an autonomous tool-calling loop
  • Vault tools — full-text search, read notes, list folders, follow backlinks and outgoing links, get the active note and current selection
  • Edit mode — create, edit (full replace or unified patch), delete, and move notes; every change is shown as a diff for your review before it is committed
  • Multiple providers — OpenAI, Anthropic, DeepSeek, Qwen (Alibaba), Kimi (Moonshot), Zhipu (GLM), MiniMax, OpenRouter, Ollama (local), and any custom OpenAI- or Anthropic-compatible endpoint
  • Three modes
    • Ask — read-only; for Q&A and research without touching your vault
    • Edit — full write access; all changes require your approval via a diff UI
    • Scheduled — automated background runs (daily summary, weekly review) with restricted write access
  • Scheduled tasks — daily summaries and weekly reviews written automatically to configurable folders
  • Auto-compaction — conversation history is compacted transparently when approaching the model's context limit
  • Per-provider profiles — separate API key, base URL, and model saved per provider
  • User profile — optional personal description injected into every system prompt
  • i18n — English and Simplified Chinese UI, auto-detected from Obsidian's language setting

Installation

  1. Open Settings → Community plugins and turn off Restricted mode (formerly Safe mode) if prompted
  2. Click Browse and search for Smart Note Agent
  3. Click Install, then Enable

Manual

  1. Download main.js, manifest.json, and styles.css from the latest release
  2. Create the folder <your-vault>/.obsidian/plugins/smart-note-agent/
  3. Copy the three files into that folder
  4. Reload Obsidian and enable the plugin in Settings → Community plugins
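The manual steps above can be sketched in the shell. The vault path below is a temporary stand-in for illustration; substitute your actual vault, and copy the three files you downloaded from the release page:

```shell
# Stand-in vault path for illustration only; replace with your real vault.
VAULT="$(mktemp -d)/MyVault"
PLUGIN_DIR="$VAULT/.obsidian/plugins/smart-note-agent"

# Step 2: create the plugin folder.
mkdir -p "$PLUGIN_DIR"

# Step 3: copy the release artifacts (assumed to be in the current directory):
# cp main.js manifest.json styles.css "$PLUGIN_DIR"/

echo "Created $PLUGIN_DIR"
```

After copying, step 4 (reload and enable) is done inside Obsidian itself.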

Setup

  1. Open Settings → Smart Note Agent
  2. Select your LLM provider and enter your API key
  3. Optionally set a custom base URL (for self-hosted or proxy endpoints) and model name
  4. Choose a mode: Ask for read-only access, Edit for write access

Usage

  • Click the bot icon in the left ribbon, or run Open Note Agent from the command palette
  • Type your message and press Enter (or Shift+Enter for a new line)
  • In Edit mode, the agent proposes note changes as unified diffs — approve or reject each one before it is written to disk
  • Use New Chat to start a fresh conversation; previous conversations are saved and accessible via the history panel
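To illustrate the unified-diff format used for review (this reproduces the format, not the plugin's actual output), `diff -u` over a before/after copy of a hypothetical note shows the shape of a proposed change:

```shell
# Two versions of a note, before and after a hypothetical agent edit.
printf '# Meeting Notes\n\n- old action item\n' > note-before.md
printf '# Meeting Notes\n\n- updated action item\n' > note-after.md

# Unified diff: removed lines start with '-', added lines with '+'.
diff -u note-before.md note-after.md || true   # diff exits 1 when files differ
```

Lines prefixed with `-` would be removed from the note and lines prefixed with `+` added; approving the change applies it, rejecting leaves the file untouched.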

Supported Providers

| Provider   | Notes                                                    |
| ---------- | -------------------------------------------------------- |
| OpenAI     | GPT-4o, GPT-4o-mini, o1, o3, etc.                        |
| Anthropic  | Claude 3.5 / 4 series                                    |
| DeepSeek   | deepseek-v3, deepseek-r1                                 |
| Qwen       | Alibaba Cloud Dashscope (qwen-plus, qwen-max, etc.)      |
| Kimi       | Moonshot AI (moonshot-v1 series)                         |
| Zhipu      | GLM-4 series                                             |
| MiniMax    | MiniMax-Text series                                      |
| OpenRouter | Any model via openrouter.ai                              |
| Ollama     | Local models — Llama, Mistral, Qwen, etc.                |
| Custom     | Any OpenAI-compatible or Anthropic-compatible endpoint   |

Disclosures

Account required

Using a remote LLM provider (OpenAI, Anthropic, DeepSeek, Qwen, Kimi, Zhipu, Z.ai, MiniMax, OpenRouter) requires an account with that provider and a valid API key. Local providers (Ollama, LM Studio) and custom self-hosted endpoints require no account.

Payment may be required

Remote LLM providers charge for API usage. You are billed directly by the provider you choose — this plugin has no subscription or in-app purchase of its own. Local providers (Ollama, LM Studio) are free.

Network use

When you send a message, the plugin transmits your message text and relevant vault context to the LLM provider you have configured. No data is sent to any service by default — the plugin is inert until you supply an API key and send a message. The following remote services may be contacted, depending on your provider selection:

| Provider             | Endpoint                                          |
| -------------------- | ------------------------------------------------- |
| OpenAI               | https://api.openai.com                            |
| Anthropic            | https://api.anthropic.com                         |
| DeepSeek             | https://api.deepseek.com                          |
| Qwen (Alibaba Cloud) | https://dashscope.aliyuncs.com                    |
| Kimi (Moonshot AI)   | https://api.moonshot.cn                           |
| Zhipu / Z.ai         | https://open.bigmodel.cn, https://open.z.ai       |
| MiniMax              | https://api.minimax.chat                          |
| OpenRouter           | https://openrouter.ai                             |

Ollama and LM Studio communicate only with localhost — no data leaves your machine.

Development

npm install
npm run dev       # esbuild watch mode — rebuilds on save
npm run build     # tsc type-check + production bundle
npm test          # unit tests (Vitest)

For local testing, symlink or copy the dist/ folder into your vault:

<vault>/.obsidian/plugins/smart-note-agent/

The plugin artifacts (main.js, manifest.json, styles.css) are output to dist/ after each build.
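For example, the symlink approach can look like this; the vault path is a throwaway stand-in here, so point it at your real vault instead:

```shell
# Throwaway stand-in for a real vault path; replace with yours.
VAULT="$(mktemp -d)/DevVault"
mkdir -p "$VAULT/.obsidian/plugins"

# Link the build output so each `npm run dev` rebuild is picked up
# by Obsidian after a reload.
mkdir -p dist                        # ensure dist/ exists before linking
ln -s "$PWD/dist" "$VAULT/.obsidian/plugins/smart-note-agent"
ls -l "$VAULT/.obsidian/plugins"
```

Copying instead of symlinking also works, but then the files must be re-copied after every build.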

License

MIT