The CoCo AskAI plugin integrates AI-powered assistance into Obsidian, supporting both OpenAI and local Ollama models. It offers interactive note assistance through multi-card pop-up interactions, conversation history management, and customizable templates that define AI behavior or task execution. Users can streamline their workflow with commands for adjusting model parameters and shortcuts for quick AI queries. The plugin supports writing and organization by enabling contextual AI interactions, dynamic menus, and task processing driven by user-defined prompts and settings.
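To make the dual-backend idea concrete, here is a minimal sketch of how a plugin might render a user template and route the prompt to either OpenAI or a local Ollama server. The interface, function names, and default values below are illustrative assumptions, not code from CoCo AskAI; only the Obsidian `requestUrl` helper and the public OpenAI/Ollama HTTP endpoints are real.

```typescript
// Hypothetical sketch: names and settings are assumptions, not CoCo AskAI's actual code.
import { requestUrl } from "obsidian";

interface AskAISettings {
  provider: "openai" | "ollama";
  openaiApiKey: string;
  openaiModel: string; // e.g. "gpt-4o-mini"
  ollamaUrl: string;   // e.g. "http://localhost:11434"
  ollamaModel: string; // e.g. "llama3"
}

// Apply a user-defined template to the selected note text before sending it.
function applyTemplate(template: string, selection: string): string {
  return template.replace("{{selection}}", selection);
}

// Send the rendered prompt to whichever backend the user configured.
async function askAI(settings: AskAISettings, prompt: string): Promise<string> {
  if (settings.provider === "openai") {
    const res = await requestUrl({
      url: "https://api.openai.com/v1/chat/completions",
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${settings.openaiApiKey}`,
      },
      body: JSON.stringify({
        model: settings.openaiModel,
        messages: [{ role: "user", content: prompt }],
      }),
    });
    return res.json.choices[0].message.content;
  }
  // Ollama's local HTTP API: non-streaming generate call.
  const res = await requestUrl({
    url: `${settings.ollamaUrl}/api/generate`,
    method: "POST",
    contentType: "application/json",
    body: JSON.stringify({
      model: settings.ollamaModel,
      prompt,
      stream: false,
    }),
  });
  return res.json.response;
}
```

A command or pop-up card would then call `askAI(settings, applyTemplate(userTemplate, selectedText))` and display the returned string in the editor or a modal.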
The Ollama Chat plugin enables users to interact with a locally hosted large language model (LLM) to ask questions directly about their Obsidian notes. It indexes files at startup and updates the index when files are modified, keeping responses current. Users can open a modal via shortcuts or commands to query the LLM. The plugin supports running a local model and plans to introduce features such as real-time text streaming and predefined commands for common queries, like summarizing notes or topics.
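The startup indexing and modify-event handling can be sketched with the standard Obsidian plugin API. The class, command id, model name, and the naive concatenation of note contents below are assumptions for illustration, not Ollama Chat's actual implementation; the vault and event APIs and the Ollama `/api/generate` endpoint are real.

```typescript
// Hypothetical sketch: names and logic are assumptions, not Ollama Chat's actual code.
import { Plugin, TFile, requestUrl } from "obsidian";

export default class NoteIndexSketch extends Plugin {
  // In-memory index mapping file path -> note contents.
  private index = new Map<string, string>();

  async onload() {
    // Build the index once at startup from every Markdown file in the vault.
    for (const file of this.app.vault.getMarkdownFiles()) {
      this.index.set(file.path, await this.app.vault.cachedRead(file));
    }

    // Keep the index current when a note is edited.
    this.registerEvent(
      this.app.vault.on("modify", async (file) => {
        if (file instanceof TFile && file.extension === "md") {
          this.index.set(file.path, await this.app.vault.cachedRead(file));
        }
      })
    );

    // Command that could back the question modal: send indexed notes
    // plus the user's question to the local Ollama server.
    this.addCommand({
      id: "ask-notes-sketch",
      name: "Ask a question about your notes (sketch)",
      callback: () => this.ask("What are the main topics in my notes?"),
    });
  }

  private async ask(question: string): Promise<string> {
    // Naive context construction: concatenate notes and truncate (a real
    // plugin would rank or chunk them to fit the model's context window).
    const context = [...this.index.values()].join("\n---\n").slice(0, 8000);
    const res = await requestUrl({
      url: "http://localhost:11434/api/generate",
      method: "POST",
      contentType: "application/json",
      body: JSON.stringify({
        model: "llama3",
        prompt: `Answer using these notes:\n${context}\n\nQuestion: ${question}`,
        stream: false,
      }),
    });
    return res.json.response;
  }
}
```

The planned streaming feature would instead set `stream: true` and append tokens to the modal as Ollama returns them.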