TIL: ollama notes
TIL - docker for mac doesn’t allow containers to access the GPU on a mac running apple silicon.
this is annoying: it means i need to run ollama with some wrappers, which isn’t the end of the world, but it’s a bit more hassle than i was looking for.
misc. notes in no particular order
- default ollama port: 11434
- making it so i can connect from off localhost (i.e., from the ipad using enchanted) requires a bit more chicanery; steps below, with a command-level sketch after them:
- stop ollama
- launchctl setenv OLLAMA_HOST "0.0.0.0"
- (re)start ollama
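putting those three steps together as commands, assuming the menu-bar app flavor of ollama (the app name passed to osascript/open and the LAN address in the curl are stand-ins for whatever is actually in use):
# quit whatever ollama instance is running (here, the menu-bar app)
osascript -e 'tell application "Ollama" to quit'
# have launchd hand OLLAMA_HOST to anything it starts, so the server binds every interface
launchctl setenv OLLAMA_HOST "0.0.0.0"
# relaunch the app; the server should now listen on 0.0.0.0:11434
open -a Ollama
# from the ipad or any other device on the LAN (192.168.1.20 is a stand-in for the mac's address)
curl http://192.168.1.20:11434/api/tags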
if using the non-cask version from homebrew, it’s possible to handle this via a modification to the plist file that homebrew creates for the service entry.
remember this for future reference.
documentation pointers
- https://github.com/ollama/ollama/issues/3581 - a rather snarky bit of discussion on this topic.
update the plist and move on with life
brew services stop ollama
- to make sure nothing bumps the config while it’s being edited. the following should be placed in
/opt/homebrew/opt/ollama/homebrew.mxcl.ollama.plist:
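something along these lines, i.e. the standard launchd EnvironmentVariables key added inside the plist’s top-level dict (the rest of the file stays exactly as homebrew generated it):
<key>EnvironmentVariables</key>
<dict>
    <key>OLLAMA_HOST</key>
    <string>0.0.0.0</string>
</dict>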
brew services start ollama
- to start ollama again.
there are probably some “allowing remote connections” prompts to work through on the GUI.
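a couple of sanity checks once it’s back up (assumes lsof is available; the hostname in the curl is a stand-in for whatever the mac answers to on the LAN):
# confirm the listener is bound to *:11434 rather than just 127.0.0.1
lsof -nP -iTCP:11434 -sTCP:LISTEN
# and that the API answers from another machine
curl http://my-mac.local:11434/api/tags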
20250617 update
it looks like there’s some good news on the ollama cask front: remote access can now be enabled as part of the settings in the mac app. this should obviate some of the hassle of updating the plist to set the environment variable.
ref: Release v0.9.1 · ollama/ollama
that’s a welcome option.
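quick check that the installed build is new enough to have that setting (the toggle itself lives in the app’s settings UI):
# prints the installed ollama version; anything >= 0.9.1 should have the network setting
ollama --version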
meta
- location: minneapolis, mn
- weather: ⛅️ (Partly cloudy) +44°F(+41°F)