Support Ollama
complete
MindMac
Get up and running with large language models locally by supporting Ollama.
Suggested by
lu_chin2k
MindMac
complete
From version 1.8.11, you can use Ollama with MindMac directly, without LiteLLM.
Sam
MindMac: that’s awesome! Thank you!
MindMac
Sam: Very quick reaction 🙌
ibnbd
MindMac: I've been waiting for this for a very long time, thank you!
ibnbd
The app doesn't open after adding Ollama; it crashes. I will send you the logs by email.
MindMac
ibnbd: Please upgrade to version 1.8.12 to fix the crash issue.
Josh Taylor
First off, I want to preface this by saying MindMac is great, thanks for all the hard work! I used the free version to make sure it actually runs and the settings work well, and purchased in <1 minute :-).
It's confusing that the homepage says it supports Ollama, but there is no configuration option or documentation (from what I can see).
Could you add notes about this to the Configuration tab? Running another program like LiteLLM in the background is annoying and uses unnecessary resources...
Ollama has its own API; could this be supported?
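For context, the native Ollama API is a small local HTTP interface. A minimal sketch of calling it, assuming an Ollama server on the default port 11434 and a model already pulled (the "llama2" name below is only an example):

```python
import json
import urllib.request

# Assumes Ollama is running locally on its default port (11434)
# and that a model (here "llama2", as an example) has been pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "stream": False,  # ask for a single JSON response instead of a stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read().decode("utf-8"))
    print(body["response"])
```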
MindMac
Josh Taylor: Thank you for your kind words. Actually, I'm waiting for this PR https://github.com/jmorganca/ollama/pull/1331. If it gets merged, we will be able to use MindMac directly with Ollama. Please stay tuned for further updates.
MindMac
Josh Taylor: I decided to support the current Ollama API. Please upgrade to version 1.8.10 to use Ollama with MindMac directly, without LiteLLM.
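A quick way to verify that a local Ollama server is reachable before pointing a client at it is to list the locally pulled models. A sketch, assuming the default endpoint http://localhost:11434:

```python
import json
import urllib.request

# Assumes a local Ollama server on the default port 11434.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.loads(resp.read().decode("utf-8"))

# Each entry describes a locally pulled model that a client can select.
for model in tags.get("models", []):
    print(model["name"])
```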
MindMac
Stream mode now works with LiteLLM + Ollama in version 1.8.2. Sam lu_chin2k
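Stream mode here means receiving the answer token by token through LiteLLM's OpenAI-compatible proxy instead of waiting for the full reply. A rough sketch of consuming such a stream with the openai Python package, assuming a LiteLLM proxy is already running in front of Ollama (the port 8000 and the model alias below are assumptions, adjust them to your setup):

```python
from openai import OpenAI

# Assumes a LiteLLM proxy is already running in front of Ollama,
# e.g. on http://localhost:8000 (port and model alias are assumptions).
client = OpenAI(base_url="http://localhost:8000", api_key="anything")

stream = client.chat.completions.create(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "Write a haiku about local LLMs."}],
    stream=True,  # receive the answer incrementally, chunk by chunk
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```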
lu_chin2k
MindMac: Cool. Thanks.
MindMac
I just found a workaround solution to use Ollama with MindMac by using LiteLLM. Please check this video https://www.youtube.com/watch?v=bZfV70YMuH0 for more details. I will integrate Ollama deeply in the future version. Stay tuned for further updates.
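The workaround puts LiteLLM between an OpenAI-style client and the local Ollama server. The same idea, sketched directly with the litellm Python package, assuming Ollama is running on its default port and a model has been pulled (the "llama2" name is only an example):

```python
import litellm

# Assumes a local Ollama server on the default port 11434
# and an already pulled model ("llama2" is only an example name).
response = litellm.completion(
    model="ollama/llama2",  # the "ollama/" prefix routes the call to Ollama
    messages=[{"role": "user", "content": "Summarize what Ollama does in one sentence."}],
    api_base="http://localhost:11434",
)

# LiteLLM normalizes the reply into an OpenAI-style response object.
print(response.choices[0].message.content)
```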
lu_chin2k
MindMac: Thanks a lot for making MindMac work with Ollama.
Sam
MindMac: Nice work with Ollama :)
Sam
As mentioned, this would be an absolutely amazing feature. Integrating directly with Ollama would save us from having to run LM Studio + Ollama at the same time for multiple apps.
Thanks for your work on this!
MindMac
While we work on implementing this feature, we encourage you to use LMStudio with MindMac to utilize LLMs. For more details, please refer to the video https://youtu.be/3KcVp5QQ1Ak
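LM Studio can expose a local OpenAI-compatible server (by default at http://localhost:1234/v1 once its server is started), so any OpenAI-style client can point at it. A small sketch, assuming that server is running and a model is loaded (the model identifier below is a placeholder):

```python
from openai import OpenAI

# Assumes LM Studio's local server is running (default: http://localhost:1234/v1)
# and that a model is already loaded in LM Studio.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

completion = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio serves whichever model is loaded
    messages=[{"role": "user", "content": "Say hello from a locally hosted model."}],
)

print(completion.choices[0].message.content)
```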
MindMac
planned