

@emory @obsidianmd i will be honest, that is a question you might be more equipped to answer than i am, but here are the links if you wanna check it out. i see it has a folder named LLMProviders, but i am not sure if that is what you mean or not.
obsidian://show-plugin?id=copilot
https://github.com/logancyang/obsidian-copilot
https://github.com/logancyang/obsidian-copilot/tree/master/src/LLMProviders
@gabek @obsidianmd @emory I do not. It seems to work with all of its features when you use local inference via Ollama.
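For context on what local inference via Ollama looks like: Ollama runs an HTTP server on localhost, and a plugin's provider layer can send chat requests straight to it, so nothing leaves your machine. Below is a minimal TypeScript sketch of that pattern, not Copilot's actual code; the endpoint is Ollama's default, and the model name "llama3" is an assumption (it has to be pulled locally first).

```typescript
// Minimal sketch: talking to a local Ollama server the way a plugin's
// LLM provider layer might. The endpoint is Ollama's default; the
// model name is an assumption, not anything from Copilot's source.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function ollamaChat(
  messages: ChatMessage[],
  model = "llama3" // hypothetical choice; use whatever model you've pulled
): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // Ollama's non-streaming /api/chat response puts the reply in message.content
  return data.message.content;
}

// Example: everything stays local, no cloud API key involved
ollamaChat([{ role: "user", content: "Summarize my note on project planning." }])
  .then(console.log)
  .catch(console.error);
```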