If it's just the novelty aspect of it or some ideological reason, that's fine, but it should be explained in the blog post before someone thinks this is a sane and logical way to run Ollama on a gaming PC.
- reproducible (with minor adjustments, even on non-WSL systems)
- if you're used to Nix, not much beats it in terms of stability, maintainability, upgradability, and fun (?)
- additional services are typically easier to set up, e.g. tailscale-acl (used by the author), which uses Pulumi under the hood
- despite some downsides (disk speed was an issue when I used it), WSL is surprisingly capable
It would make more sense for AMD I suppose where Ollama's Windows support is lacking compared to Linux.
That said, neat tricks useful for other stuff as well.
i use NixOS-WSL at work to have the same Emacs configuration as on my laptop, and that's fine, except the Windows filesystem performance makes me want to throw the whole system in a dumpster. but on my home gaming machine i have some games that only run on Windows, so i just used Ollama's Windows installer, which works with my GPU and adds an autostart entry.
these days the Windows box sits in a dark corner of my network with Tailscale (again, just the Windows install), also running Sunshine so i can launch Steam games from my laptop.
Running locally has a lot of advantages: privacy, getting to learn how to run LLMs, not having to deal with quotas, logins, or outages.
Way better utilization of expensive hardware as well, ofc.
The bet being that I can get most games to work on it - that was the sticking point. (Thanks to Valve, I think it’ll work out.)
Then I rebooted my system and found that Steam had broken Gnome and I couldn't log in and had to go into safe mode and debug from the command line. 1 hour in, 1 thing installed.
I'll try again in 10 years.
Installing Steam breaking GNOME sounds wild.
If you are using a system-controlling Nix (nix-darwin, NixOS…), it's as easy as `services.ollama.enable = true;`, maybe adding `acceleration = "cuda"` to force GPU usage or `host = "0.0.0.0"` to allow connections to Ollama that are not local to your system. In a home-manager situation it is even easier: just include `pkgs.ollama` in your `home.packages`, with an `.override { }` for the same options above. That should be it, really.
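Spelled out as a sketch (option paths per the nixpkgs `services.ollama` module; exact names can vary by channel version, so treat this as an assumption to check against the options search):

```nix
# NixOS / system-controlling Nix: run Ollama as a service.
services.ollama = {
  enable = true;
  acceleration = "cuda";   # force GPU usage ("rocm" for AMD)
  host = "0.0.0.0";        # accept connections that are not local to the box
};

# home-manager: just install the package, overriding acceleration.
home.packages = [
  (pkgs.ollama.override { acceleration = "cuda"; })
];
```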
I will say that if you have a more complex NixOS setup that patches the kernel, or can’t lean on Cachix for some reason, the ollama package takes a long time to compile. My setup at home runs on a 3950X Threadripper, and when Ollama compiles it uses all the cores at 99% for about 16 minutes.
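The long compile is usually the CUDA-enabled build not being in the official binary cache; one mitigation (a sketch, not from the comment above) is adding the community cuda-maintainers Cachix cache as a substituter. The public key is elided here; copy the real one from the cache's page on cachix.org:

```nix
# Pull prebuilt CUDA packages from the community cache so the
# CUDA-enabled ollama doesn't have to compile locally.
nix.settings = {
  substituters = [ "https://cuda-maintainers.cachix.org" ];
  # placeholder: use the actual key published on the cache's Cachix page
  trusted-public-keys = [ "cuda-maintainers.cachix.org-1:<key>" ];
};
```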
It's fast as hell. Though you will need at least two GPUs to divide between Ollama and anything else that needs one (display/games/Proxmox).
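One way to do that split on NixOS is to pin the Ollama service to a single GPU via CUDA's device mask, leaving the other for the display. A sketch assuming the nixpkgs module's `environmentVariables` option exists on your channel; the GPU index is just an example:

```nix
services.ollama = {
  enable = true;
  acceleration = "cuda";
  environmentVariables = {
    # Restrict Ollama to the second GPU (index 1); GPU 0 stays free
    # for display/games.
    CUDA_VISIBLE_DEVICES = "1";
  };
};
```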
> I refused to manage a separate Ubuntu box that would need reconfiguring from scratch.
Immediately followed by:
> After hacking away at it for a number of weeks
Hmmm
>LLM
stinky