That’s like the lowest possible bar to clear.
Whenever you try to build something via pip, the build very often fails. The days when NumPy would build from source off PyPI are long over. In fact, at least 50% of attempted package builds fail.
The alternative of binary wheels is flaky.
That's not a development dependency manager. System package management is a different kind of issue, even if there's a bit of overlap.
> because the libraries do not change all the time
That's not true in practice. Spend enough time with larger projects or do some software packaging and you'll learn that the pain is everywhere.
I then yum-installed a lib and its headers, and it worked well.
C++ on a Microsoft platform is the worst. I can’t speak for Mac. C++ on Linux is quite pleasant. Most of the comments like yours feel biased for unstated reasons.
I think the real solution here is to just only use python dependency management for python things and to use something like nix for everything else.
NixOS is a stark contrast to Python here. It makes things that can't be done deterministically difficult to do at all. Maybe this sounds extreme from the outside, but I'd rather be warned off a dependency as soon as I attempt to use it than years later, when I get a user or contributor who can't make it work for some environmental reason I didn't foresee, and now everything rests on finding some hacky way to make it work.
If Nix can be used to solve Python's packaging problems, participating packages will have to practice the same kind of avoidance (or put in the work to fix such hazards up front). I'm not sure if the wider python community is willing to do that, but as someone who writes a lot of python myself and wants it to not be painful down the line, I am.
Now when I cd into a project, direnv + nix notices the dependencies that that project needs and makes them available, whatever their language. When I cd into a different project, I get an entirely different set of dependencies. There's pretty much nothing installed with system scope. Just a shell and an editor.
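For the record, the setup described here usually boils down to two small files per project (a sketch; the package names are illustrative, not a recommendation):

```nix
# shell.nix -- declares everything this project needs, whatever the language
{ pkgs ? import <nixpkgs> {} }:
pkgs.mkShell {
  packages = [
    pkgs.python312     # the interpreter itself
    pkgs.poetry        # Python-level dependency manager
    pkgs.zlib          # native libraries live here too, not in pip
  ];
}
```

plus a one-line .envrc next to it containing `use nix`, so direnv loads and unloads the environment as you cd in and out.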
Both of these are language agnostic, but the level of encapsulation is quite different and one is much better than the other. (There are still plenty of problems, but they can be fixed with a commit instead of a change of habit.)
The idea that every language needs a different package manager, and that each of those needs to package everything that might be useful when called from that language, whether or not it is written in that language... It just doesn't scale.
Between pip, poetry and pyproject.toml, things are now quite good IMHO.
I don't really know Rust, or Cargo, but I never have trouble building any Rust program: "cargo build [--release]" is all I need to know. Easy. Even many C programs are actually quite easy: "./configure", "make", and optionally "make install". "./configure" has a nice "--help". There is a lot to be said about the ugly generated autotools soup, but the UX for people just wanting to build/run it without in-depth knowledge of the system is actually quite decent. cmake is a regression here.
With Python, "pip install" gives me an entire screen full of errors about venv and "externally managed" and whatnot. I don't care. I just want to run it. I don't want a bunch of venvs, I just want to install or run the damn program. I've taken to just using "pip install --break-system-packages", which installs to ~/.local. It works, shrug.
Last time I wanted to just run a project with a few small modifications, I had a hard time. I ended up just editing ~/.local/lib/python/[...] Again, it worked, so whatever.
All of this is really where Python and some other languages/build systems fail. Many people running this are not $language_x programmers or experts, and I don't want to read up on every system I come across. That's not a reasonable demand.
Any system that doesn't allow non-users of that language to use it in simple easy steps needs work. Python's system is one such system.
That's your problem right there.
Virtual environments are the Python ecosystem's solution to the problem of wanting to install different things on the same machine that have different conflicting requirements.
If you refuse to use virtual environments and you install more than one separate Python project you're going to run into conflicting requirements and it's going to suck.
Have you tried pipx? If you're just installing Python tools (and not hacking on them yourself) it's fantastic - it manages separate virtual environments for each of your installations without you having to think about them (or even know what a virtual environment is).
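Usage, for reference, looks something like this (a transcript sketch; httpie is only an example tool):

```
$ pipx install httpie     # gets its own private venv, managed for you
$ pipx list               # shows each tool and the apps it exposes
$ pipx upgrade httpie     # upgrades only that tool's venv
$ pipx run cowsay moo     # one-off run without a permanent install
```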
The simplest answer, IMO, is to download the Python source code, build it, and then run make altinstall. It’ll install in parallel with system Python, and you can then alias the new executable path so you no longer have to think about it. Assuming you already have gcc’s tool chain installed, it takes roughly 10-15 minutes to build. Not a big deal.
If you're installing for a small script, then doing python -m venv little_project in your home dir is straightforward; just activate it afterwards [1]
I'm using rye[2] now and it's very similar to Rust's Cargo: it wraps a bunch of the standard toolchain and manages standalone Python versions in the background, so it doesn't fall into the trap of Linux system Python issues.
[1]https://docs.python.org/3/library/venv.html [2]https://rye.astral.sh/
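Spelled out, the small-script flow amounts to this (a sketch; little_project is just a throwaway name):

```shell
# Create and try out a venv for a small script
python3 -m venv little_project       # the env is just a directory
. little_project/bin/activate        # prepends little_project/bin to $PATH
command -v python                    # now resolves inside little_project
deactivate                           # restores the previous PATH
```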
They're really not that different from any other packaging system like JS's or Rust's. The only difference is that instead of relying on your current directory to find the libraries / binaries (and thus requiring you to wrap binary calls with some wrapper that searches a specific path), they rely on you sourcing an `activate` script. That's really just it.
Create a Virtualenv:
$ virtualenv myenv
Activate it, and now it is added to your $PATH:
$ . myenv/bin/activate
There really is nothing more to it in the normal case. If you don't want to have to remember it, create a global virtualenv somewhere, source its activate script in your .bashrc, and forget it ever existed.
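For what it's worth, activate itself is a short shell script that does little more than this (a simplified sketch, not the verbatim script; the path is illustrative):

```shell
# Essentially what myenv/bin/activate does:
export VIRTUAL_ENV="$HOME/myenv"        # remember which env is active
export PATH="$VIRTUAL_ENV/bin:$PATH"    # so "python" and "pip" resolve here first
unset PYTHONHOME                        # avoid overriding the env's interpreter
# (it also tweaks the shell prompt and defines a "deactivate" function)
```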
Some days later, in some woods or cave, people will hear your screams of rage and despair.
Dev/test with relaxed pip installs, freeze deployment dependencies with pip freeze / pip-tools / poetry / whatever you like, and what's the problem?
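Concretely, that workflow is just (a transcript sketch; requests is only an example package, and the pinned versions will vary):

```
$ pip install requests               # dev: relaxed, unpinned
$ pip freeze > requirements.txt      # deploy: pin exactly what you tested
$ pip install -r requirements.txt    # reproduce that exact set elsewhere
```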