"There are now several Obsidian plugins available that allow you to use local LLMs instead of a commercial LLM provider", so PKM explorer tries Copilot for Obsidian plugin on his Windows laptop, because this plugin "not only offers the possibility to have a question and answer dialogue ... but also lets you [index your vault]... you can query the content... has three modes of operation:
It can also perform NLP on selected text.
It works with multiple LLM providers via API, as well as with local LLMs via Ollama or LM Studio, and "you can use a local embedding model to index your vault". The full article gives more detail on installation and configuration, then walks through one example per mode. TL;DR: the results seem to be accurate, but slow on the mid-range laptop used.
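To make the "index your vault with a local embedding model, then query it" idea concrete, here is a minimal sketch of how such a workflow could look against a local Ollama server. This is not the plugin's actual code; the model names ("nomic-embed-text", "llama3"), the vault path, and the naive single-note retrieval are illustrative assumptions.

```python
# Minimal sketch (not Copilot for Obsidian's implementation): embed Markdown
# notes with a local embedding model served by Ollama, then answer a question
# using the best-matching note as context. Assumes Ollama is running on
# localhost:11434 and the example models have already been pulled.
import json
import pathlib
import urllib.request

OLLAMA = "http://localhost:11434"

def _post(path: str, payload: dict) -> dict:
    # Small helper for Ollama's JSON-over-HTTP API.
    req = urllib.request.Request(
        OLLAMA + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def embed(text: str) -> list[float]:
    # Local embedding model; /api/embeddings returns {"embedding": [...]}.
    return _post("/api/embeddings",
                 {"model": "nomic-embed-text", "prompt": text})["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def answer(vault: str, question: str) -> str:
    # "Index the vault": embed every Markdown note once.
    notes = {p: p.read_text(encoding="utf-8")
             for p in pathlib.Path(vault).rglob("*.md")}
    index = {p: embed(text) for p, text in notes.items()}
    # "Query the content": retrieve the closest note and hand it to a local LLM.
    q_vec = embed(question)
    best = max(index, key=lambda p: cosine(index[p], q_vec))
    prompt = (f"Answer using only this note:\n\n{notes[best]}\n\n"
              f"Question: {question}")
    return _post("/api/generate",
                 {"model": "llama3", "prompt": prompt, "stream": False})["response"]

if __name__ == "__main__":
    print(answer("MyVault", "What did I write about local LLMs?"))
```

Because both the embedding and generation steps run locally, indexing a large vault on a mid-range laptop can be slow, which matches the article's experience.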