If your answer is "delete the venv and recreate it", what do you do when your code now has a bunch of errors it didn't have before?
If your answer is "ignore it", what do you do when you try to run the project on a new system and find half the imports are missing?
None of these problems are insurmountable of course. But they're niggling irritations. And of course they become a lot harder when you try to work with someone else's project, or come back to a project from a couple of years ago and find it doesn't work.
As someone with a similar approach (not using requirements.txt, but using all the basic tools and not using any kind of workflow tool or sophisticated package manager), I don't understand the question. I just have a workflow where this can't happen.
Why would the wrong venv be activated?
I activate a venv according to the project I'm currently working on. If the venv for my current code isn't active, it's because nothing is active. And I use my one global Pip through a wrapper, which (politely and tersely) bonks me if I don't have a virtual environment active. (Other users could rely on the distro bonking them, assuming Python>=3.11. But my global Pip is actually the Pipx-vendored one, so I protect myself from installing into its environment.)
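To be concrete, a guard like that is only a few lines of shell. This is a sketch of the idea, entirely my own invention rather than the actual wrapper described above:

```shell
# Refuse to run pip unless a virtual environment is active.
# Relies on the VIRTUAL_ENV variable that activate scripts export.
pip() {
    if [ -z "${VIRTUAL_ENV:-}" ]; then
        echo "pip: no virtual environment active" >&2
        return 1
    fi
    command pip "$@"   # fall through to the real pip once a venv is active
}
```

Drop that in your shell rc file and a bare `pip install` outside a venv fails fast instead of polluting a global environment.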
You might as well be asking Poetry or uv users: "what do you do when you 'accidentally' manually copy another project's pyproject.toml over the current one and then try to update?" I'm pretty sure they won't be able to protect you from that.
>If your answer is "delete the venv and recreate it", what do you do when your code now has a bunch of errors it didn't have before?
If it did somehow happen, that would be the approach - but the code simply wouldn't have those errors. Because that venv has its own up-to-date listing of requirements; so when I recreated the venv, it would naturally just contain what it needs to. If the listing were somehow out of date, I would have to fix that anyway, and this would be a prompt to do so. Do tools like Poetry and uv scan my source code and somehow figure out what dependencies (and versions) I need? If not, I'm not any further behind here.
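For the record, keeping that "up-to-date listing" with only the basic tools looks something like this - file names are illustrative, not a prescription:

```shell
# Create a fresh venv and activate it.
python3 -m venv demo-venv
. demo-venv/bin/activate
# ...pip install whatever the project needs, then snapshot it:
pip freeze > requirements.lock   # exact versions, transitive deps included
deactivate
# Recreating the venv later is just:
#   rm -rf demo-venv && python3 -m venv demo-venv
#   . demo-venv/bin/activate && pip install -r requirements.lock
```

`pip freeze` records every installed package at its exact version, which is what makes the recreated venv come out the same.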
>And of course they become a lot harder when you try to work with someone else's project, or come back to a project from a couple of years ago and find it doesn't work.
I spent this morning exploring ways to install Pip 0.2 in a Python 2.7 virtual environment, "cleanly" (i.e. without directly editing/moving/copying stuff) starting from scratch with system Python 3.12. (It can't be done directly, for a variety of reasons; the simplest approach is to let a specific version of `virtualenv` make the environment with an "up-to-date" 20.3.4 Pip bootstrap, and then have that Pip downgrade itself.)
I can deal with someone else's (or past me's) requirements.txt being a little wonky.
Because when you activate a venv in a given terminal window it stays active until you deliberately deactivate it, and one terminal and one venv look much like another.
> I activate a venv according to the project I'm currently working on.
So just manual discipline? It works (most of the time), but in my experience there's a "discipline budget"; every little niggle you have to worry about manually saps your ability to think about the actual business problem.
> "what do you do when you 'accidentally' manually copy another project's pyproject.toml over the current one and then try to update?" I'm pretty sure they won't be able to protect you from that.
Copying pyproject.toml is a lot less routine than changing directories in a terminal window. But if I did that I'd just git checkout/revert to the original version.
> the code simply wouldn't have those errors. Because that venv has its own up-to-date listing of requirements; so when I recreated the venv, it would naturally just contain what it needs to.
So how do you ensure that? pip's dependency resolution is nondeterministic; dependency versions aren't locked by default; and even if you pin the versions of your immediate dependencies, the versions of your transitive dependencies are still unpinned.
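Concretely, take a hypothetical requirements.txt like:

```
# Pins only the direct dependency:
requests==2.31.0
# requests itself declares e.g. "urllib3>=1.21.1,<3" - so two installs
# months apart can resolve different urllib3 versions from this same file.
```

Nothing in that file stops the second install from pulling a newer (possibly behavior-changing) urllib3 than the first did.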
> If the listing were somehow out of date, I would have to fix that anyway, and this would be a prompt to do so.
Flagging up outdated dependencies can be helpful, but getting forced to update while you're in the middle of working on a feature (or maybe even working on a different project) is rather less so. Especially since you don't know what you're updating - the old versions were in the venv you just clobbered and then deleted, so you don't know which dependency is causing the error and you've got no way to bisect versions to find out when a change happened.
> Do tools like Poetry and uv scan my source code and somehow figure out what dependencies (and versions) I need? If not, I'm not any further behind here.
uv has deterministic dependency resolution with a lock file that, crucially, it uses by default without you needing to do anything. So if you wiped out your cache or something (or even switched to a new computer) you get the same dependency versions you had before. There's no venv to clobber in the first place because you're not manually activating environments and installing dependencies - when you "uv run myproject", uv resolves the dependencies you listed in pyproject.toml on the fly, so there's no intermediate non-version-controlled thing to get out of sync and cause confusion. (I mean, maybe there is a virtualenv somewhere, but if so it's transparent to me as a user.)
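And the pyproject.toml driving that flow is nothing exotic - a minimal hypothetical example:

```toml
[project]
name = "myproject"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
    "requests>=2.31",
]
```

You list only your direct dependencies with whatever ranges you want; uv writes the exact resolved versions (transitive deps included) to uv.lock, which you commit alongside it.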
> I spent this morning exploring ways to install Pip 0.2 in a Python 2.7 virtual environment, "cleanly" (i.e. without directly editing/moving/copying stuff) starting from scratch with system Python 3.12. (It can't be done directly, for a variety of reasons; the simplest approach is to let a specific version of `virtualenv` make the environment with an "up-to-date" 20.3.4 Pip bootstrap, and then have that Pip downgrade itself.)
Putting pip inside Python was dumb and is another pitfall uv avoids/fixes.