Ollama
19 Sep 2024
Installing the new Go-based Fabric LLM CLI
Fabric’s installation instructions on its website suggest that this new version, rewritten in Go, should be easier to install than the original Python implementation. Although pipx made things relatively easy before, I ran into trouble because my installed version of Go did not meet the requirements of the newer Fabric package.
My solution was to try pkgx, a tool I had been looking for an excuse to use.
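As a sketch of that approach (the wrapper function name is my own; `go install github.com/danielmiessler/fabric@latest` is Fabric’s documented install command, and pkgx fetches requested tools on demand), letting pkgx supply a current Go toolchain might look like this:

```shell
#!/usr/bin/env bash
set -u

# Use pkgx to fetch a current Go toolchain on demand instead of relying on
# the (possibly outdated) system-installed Go.
install_fabric() {
  pkgx go install github.com/danielmiessler/fabric@latest
}

# install_fabric   # uncomment to run; the binary lands in "$(go env GOPATH)/bin"
```

The appeal of pkgx here is that nothing needs to be upgraded system-wide: the toolchain version is resolved per invocation.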
30 Jan 2024
Run Large Language Models Locally With *ollama.ai's* Docker Image
Ollama is an open-source framework for self-hosting large language models (LLMs), offering an experience similar to ChatGPT or Google’s Bard. The official Docker image makes it painless to host any number of the supported models.
Shell script to get an Ollama model running
The example “run_ollama_container.sh” script below takes the LLM model name as an argument (assigned to $LLMMODEL), but the `docker exec … ollama pull` and `docker exec … ollama run` commands can be called additional times to start other models within the same running container.
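The original script isn’t reproduced here, but based on the description it might be sketched roughly as follows. The container name, volume name, and default model are assumptions on my part; the `ollama/ollama` image, port 11434, and the `ollama pull`/`ollama run` subcommands come from the official Ollama Docker instructions.

```shell
#!/usr/bin/env bash
# Sketch of a run_ollama_container.sh-style helper (illustrative, not the
# original script). Takes the model name as the first argument.
set -u

LLMMODEL="${1:-llama2}"   # assumed default model; pass another name to override

# Start the official Ollama container, persisting downloaded models
# in a named Docker volume.
start_container() {
  docker run -d --name ollama \
    -v ollama:/root/.ollama \
    -p 11434:11434 \
    ollama/ollama
}

# Download the requested model inside the running container.
pull_model() {
  docker exec ollama ollama pull "$LLMMODEL"
}

# Open an interactive chat session with the model.
run_model() {
  docker exec -it ollama ollama run "$LLMMODEL"
}

# start_container && pull_model && run_model   # uncomment when Docker is available
```

Invoking `pull_model` and `run_model` again with a different value of $LLMMODEL starts another model inside the same running container, as described above.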