Description
Category: Learning & Knowledge Management
The Mini-RAG plugin enables local retrieval-augmented generation by connecting your notes to a locally running LLM through Ollama. You can start a chat in the context of a specific note or folder, so the model references only the relevant content when generating responses. It supports any model installed in Ollama and provides controls for model selection, temperature adjustment, and even context-free chatting when you want unconstrained responses. Interactions can be initiated directly from right-click menus in the editor or sidebar, and conversations can be saved for later reference.
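The context-sensitive flow described above can be sketched against Ollama's local HTTP API (default port 11434). This is not the plugin's actual code; `build_prompt` and `ask` are hypothetical names, and the model name is an assumption — but the endpoint and payload follow Ollama's standard `/api/generate` interface:

```python
import json
import urllib.request

# Ollama's default local generation endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_prompt(note_text: str, question: str) -> str:
    """Prepend note content as context, mirroring a context-sensitive chat."""
    return (
        "Use only the following note as context when answering.\n\n"
        f"---\n{note_text}\n---\n\n"
        f"Question: {question}"
    )

def ask(note_text: str, question: str,
        model: str = "llama3", temperature: float = 0.7) -> str:
    """Send a single non-streaming generation request to the local Ollama server."""
    payload = json.dumps({
        "model": model,                        # any model pulled into Ollama
        "prompt": build_prompt(note_text, question),
        "stream": False,                       # return one complete response
        "options": {"temperature": temperature},
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A context-free chat would simply pass the question alone, without the note wrapper.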
Stats
- 1 stargazer
- 72 downloads
- 0 forks
- 8 days since creation
- 34 days since last commit
- 65 days since last release
- 0 total pull requests (0 open, 0 closed, 0 merged)
- 0 total issues (0 open, 0 closed)
- 0 commits in the last year
Requirements (Experimental)
Ollama installed and running locally
Latest Version
2 months ago
Changelog
Description
First release of the Mini-RAG plugin for Obsidian.
Features
- Context-Sensitive Chats with Local LLMs
- Context-Free Chats with Local LLMs
Similar Plugins
• Similar plugins are suggested based on tags shared between plugins.