Support Ollama, so users can get up and running with large language models locally.
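For context, a minimal sketch of what an Ollama integration could look like: Ollama exposes a local REST API (port 11434 by default), and its `/api/generate` endpoint accepts a model name and prompt. The model name `llama3` below is just an example; any locally pulled model would work.

```python
import json
import urllib.request

# Ollama's local server listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming request for Ollama's /api/generate endpoint."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` running and the model pulled):
# print(generate("llama3", "Why is the sky blue?"))
```

Because everything runs against `localhost`, no prompts or responses leave the user's machine.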
Suggested by lu_chin2k