TIL that docker for mac doesn’t let containers access the GPU on a mac running apple silicon.
this is annoying: it means i need to run ollama natively with some wrappers, which isn’t the end of the world, but a bit more hassle than i was looking for.
misc. notes in no particular order
default ollama port: 11434
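a quick way to check that something is actually listening on that port (assumes curl is installed; /api/tags is ollama’s model-list endpoint):

```shell
# hit the model-list endpoint on the default port; -s keeps curl quiet,
# and the fallback echo fires if nothing is listening there
curl -s http://localhost:11434/api/tags || echo "nothing listening on 11434"
```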
making it possible to connect from off localhost (e.g., from the ipad using enchanted) requires a bit more chicanery:
- stop ollama
- run: launchctl setenv OLLAMA_HOST "0.0.0.0"
- (re)start ollama
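the three steps above as one paste-able sketch (macOS only; launchctl setenv sets the variable at the launchd level so a restarted app/service sees it — the pkill line is my assumption about how the server was started):

```shell
# macOS only: launchctl is the front-end to launchd
if command -v launchctl >/dev/null; then
  # 1. stop any running server (or quit the menu-bar app by hand)
  pkill -x ollama 2>/dev/null || true
  # 2. set the bind address at the launchd level
  launchctl setenv OLLAMA_HOST "0.0.0.0"
  # 3. restart; the server now binds 0.0.0.0:11434 instead of loopback only
  ollama serve >/tmp/ollama.log 2>&1 &
else
  echo "not macOS; nothing to do"
fi
```

note that launchctl setenv doesn’t survive a reboot, so it needs re-running (or a login item) after restarting the machine.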
if using the non-cask (formula) version from homebrew, it’s possible to handle this instead by modifying the plist file that homebrew creates for the service entry.
remember this for future reference.
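for that route, the edit is adding an EnvironmentVariables dict to the service plist — the path below is where homebrew services typically puts it on apple silicon, so treat it as an assumption:

```xml
<!-- inside the top-level <dict> of ~/Library/LaunchAgents/homebrew.mxcl.ollama.plist -->
<key>EnvironmentVariables</key>
<dict>
    <key>OLLAMA_HOST</key>
    <string>0.0.0.0</string>
</dict>
```

then restart the service (brew services restart ollama). one caveat: homebrew can regenerate this plist on upgrade, so the launchctl setenv route may be less brittle.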
documentation pointers
- https://github.com/ollama/ollama/issues/3581 - a rather snarky bit of discussion on this topic.
meta
- location: minneapolis, mn
- weather: ⛅️ (Partly cloudy) +44°F(+41°F)