The LLM Shortcut plugin streamlines prompt management by turning a folder of saved prompts into instantly accessible commands. Each file in the prompt directory becomes a command that can be triggered directly, with the currently open note automatically passed as context to the chosen LLM provider. It supports any OpenAI-compatible API, giving you flexibility in choosing an AI backend while keeping all activity local, with no external logging. The tree-like folder structure is preserved as a navigable list, making it easy to organize and access prompts for different use cases.
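
The sketch below illustrates, in rough terms, how such a command could combine a saved prompt file with the open note and send both to an OpenAI-compatible chat completions endpoint. It is not the plugin's actual code; the function name, parameters (`baseUrl`, `apiKey`, `model`), and message layout are illustrative assumptions, while the request shape follows the standard OpenAI-compatible API.

```typescript
// Hypothetical sketch: send a saved prompt plus the open note to an
// OpenAI-compatible endpoint. Names and structure are assumptions,
// not the plugin's real implementation.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function runPrompt(
  baseUrl: string,        // e.g. "https://api.openai.com/v1" or a local server
  apiKey: string,
  model: string,
  promptFileText: string, // contents of the selected prompt file
  activeNoteText: string  // contents of the currently open note
): Promise<string> {
  const messages: ChatMessage[] = [
    // The saved prompt acts as the instructions...
    { role: "system", content: promptFileText },
    // ...and the open note is passed along as context.
    { role: "user", content: activeNoteText },
  ];

  // Standard OpenAI-compatible chat completions request.
  const response = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages }),
  });

  if (!response.ok) {
    throw new Error(`LLM request failed: ${response.status}`);
  }

  const data = await response.json();
  return data.choices[0].message.content;
}
```

Because the request format is the common OpenAI-compatible one, the same call works whether `baseUrl` points at a hosted provider or a locally running server.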