>Respectively, yes. The ability to create venvs so fast that it becomes a silent operation the end user never thinks about anymore.
I might just blow your mind here:
$ time python -m venv with-pip
real 0m3.248s
user 0m3.016s
sys 0m0.219s
$ time python -m venv --without-pip without-pip
real 0m0.054s
user 0m0.046s
sys 0m0.009s
The thing that actually takes time is installing Pip into the venv. I already have local demonstrations showing that this installation can be an order of magnitude faster in native Python. But it's also completely unnecessary to do that:
$ source without-pip/bin/activate
(without-pip) $ ~/.local/bin/pip --python `which python` install package-installation-test
Collecting package-installation-test
Using cached package_installation_test-1.0.0-py3-none-any.whl.metadata (3.1 kB)
Using cached package_installation_test-1.0.0-py3-none-any.whl (3.1 kB)
Installing collected packages: package-installation-test
Successfully installed package-installation-test-1.0.0
I have wrappers for this, of course (and I'm explicitly showing the path to a separate Pip that's already on my path for demonstration purposes).
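A minimal sketch of what such a wrapper could look like (the name `mkvenv` and the exact behaviour are my invention, not the author's; it assumes a pip new enough to support `--python`, i.e. 22.3 or later):

```shell
# Create a venv without pip, then optionally install packages into it
# using a pip that lives outside the venv, via pip's --python option.
mkvenv() {
    venv_dir="$1"
    shift
    python3 -m venv --without-pip "$venv_dir" || return 1
    # Any remaining arguments are packages to install into the new venv.
    if [ "$#" -gt 0 ]; then
        pip --python "$venv_dir/bin/python" install "$@"
    fi
}
```

Usage: `mkvenv myenv requests` creates `myenv` in well under a second and then installs `requests` into it with the shared pip.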
> a killer feature is the ability to inline dependencies in your Python source code, then use: uv tool run <scriptname>
Yes, Uv implements PEP 723 "Inline Script Metadata" (https://peps.python.org/pep-0723/). The underlying idea originally came from Paul Moore of the Pip dev team, whose competing PEP 722 lost out (see https://discuss.python.org/t/_/29905). He's been talking about a feature like this for quite a while, although I can't easily find the older discussion. He seems to consider it out of scope for Pip, but it's also available in Pipx as of version 1.4.2 (https://pipx.pypa.io/stable/CHANGELOG/).
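For reference, the PEP 723 format is a specially delimited comment block at the top of the script (the dependency named here is just a placeholder):

```python
# /// script
# requires-python = ">=3.8"
# dependencies = [
#   "package-installation-test",
# ]
# ///

# A PEP 723-aware runner (uv, pipx, etc.) reads the comment block above,
# sets up an environment with the declared dependencies, then runs the
# script in it. The script body itself is ordinary Python.
MESSAGE = "running with inline script metadata"
print(MESSAGE)
```

Run with a plain `python` it behaves like any other script; run with `uv run script.py` or `pipx run script.py`, the dependencies are provisioned first.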
> The first run is a bit slower due to downloads and etc, but the second and subsequent runs are a bunch of internal symlink shuffling.
Part of why Pip is slow at this is that it insists on checking PyPI for newer versions even when it has a cached copy, and that its internal cache is designed to simulate an Internet connection, going through all the usual metadata parsing etc. instead of just storing the wheels directly. But it's also just slow at actually installing a package when it already has the wheel.
In principle, nothing prevents a Python program from doing caching sensibly and from shuffling symlinks around.
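As a toy illustration of the symlink approach (my sketch, not how Uv actually does it): link cached files into a target directory instead of copying them, so a repeat "install" costs almost nothing.

```python
import tempfile
from pathlib import Path

def link_from_cache(cache_dir: Path, target_dir: Path) -> int:
    """Symlink every cached file into target_dir instead of copying.

    Returns the number of links created. Names that already exist are
    skipped, so running it again over the same cache is nearly free.
    """
    target_dir.mkdir(parents=True, exist_ok=True)
    created = 0
    for entry in cache_dir.iterdir():
        dest = target_dir / entry.name
        if not dest.exists():
            dest.symlink_to(entry)
            created += 1
    return created

# Demo with throwaway directories standing in for a wheel cache
# and a venv's site-packages.
with tempfile.TemporaryDirectory() as tmp:
    cache = Path(tmp) / "cache"
    cache.mkdir()
    (cache / "module_a.py").write_text("VALUE = 1\n")
    site = Path(tmp) / "site-packages"
    first = link_from_cache(cache, site)   # links the one cached file
    second = link_from_cache(cache, site)  # no-op on the second run
```

A real installer would also have to handle data files, entry-point scripts, and bytecode, but none of that requires leaving Python.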