Edit: I don't mean to disparage projects like this and Pipfile. Both are great efforts to bring the packaging interface in line with what's available in other languages, and might be the only way up and out of the current state of affairs.
From binary wheels (including for different Linux architectures), to things like local caching of packages (taking LOTS of load off the main servers), to the PyPA GitHub organisation [0], to `python -m venv` working.
Also lots of work around standardising things in PEPs, and writing documentation for people.
I would like to applaud all the hard work people have done over the years on python packaging. It really is quite nice these days, and I look forward to all the improvements coming up (like pipenv!).
I'd suggest people check out fades [1] (for running scripts and automatically downloading dependencies into a venv), as well as conda [2], the alternative package manager.
[1] https://fades.readthedocs.io/en/release-5/readme.html#what-d...
Now it's hard to compete with JS on some stuff: it's the only language on the most popular dev platform (the web), and it has one implicit, standardized async model by default.
It's hard to compete with Rust on some stuff: it's compiled and fast, can easily produce standalone binaries, and has a checker that can prevent many bugs.
But this. The package manager. We can compete. And yet we are late.
It's partially my fault since it's a project I had in mind for years and never took the time to work on. It's partially everybody's fault I guess :)
Recently, I built a small CLI tool in Python, and learned all of the bits needed to build, test and package my application "the right way". I knew Python syntax before, but it was a lot of effort to set this up. The difference in the experience between Python and Rust or .NET Core is actually shocking, and most of it isn't down to anything that Python couldn't do, just the current state of the tooling.
I'm curious what I've been missing about pip that makes it problematic - I've never used the other tools you mentioned (setuptools/distutils/ez_install) - so I can't comment on them, but, on the flip side, I've never had to use them, so maybe my requirements are somewhat more simple than yours.
Another thing is providing a standalone build. Something you can just ship without asking the client to run commands in the terminal to make it work. I use Nuitka (http://nuitka.net/) for this. It's a fantastic project, but man, it's a lot of work for something that works out of the box in Go or Rust.
One last thing is generating OS packages (msi/deb/rpm/dmg/snap). Your sysadmin will like you. Pex (https://pypi.python.org/pypi/pex) is the closest, but not very standard.
Other pet peeves of mine:
- you can't easily move virtualenvs;
- creating a setup.py is very hard for a beginner and has numerous traps;
- setup.py files are executable. Meh.
- what's with this setup.cfg containing 2 lines? And MANIFEST.in being a separate file. Why do I have to put config in tox.ini too? And one file for each of my linters? I want ONE setup.cfg with all the config for all the tools in my project, and be done with it. TOML can handle rich sections; just stop creating new files.
- accessing files with pkg_resources is way harder than it should be. I made a wrapper for this (http://sametmax.com/embarquer-un-fichier-non-python-propreme...).
- one place to have __version__, please. I want it readable in my code AND in my package metadata, without having to use regex or have side effects on imports.
- remove the "can't build wheel" message when it's useless. It scares newcomers.
- README is the long_description. Don't make me read it manually.
- how do I provide vendors in a clean way ?
- install_requires, extras_require, setup_requires, tests_require... Make it one require with hooks and tags and be done with it.
- creating a setup.py test config is way harder than it should be and breaks in CI on strange edge cases.
- can we get a PEP on a standard project structure and build it into our tools, to be done with it? We all have src/package + setup.py at the root anyway.
- pip installs packages in the site-packages dir of the Python executable it's installed for. It makes sense, and I think Python deals pretty well with having various versions installed on the same machine. But God, people are confused by this. Now you can recommend "python -m pip", but it's very verbose and it assumes people know which version of Python is behind the "python" executable. On Windows it can be any, and they must choose with... yet another command ("py")! pipenv just bypasses that by assuming you want a virtualenv, and is able to access it. It's a very good call.
- pip install --user will create commands you can't use unless you edit your PATH. This makes newcomers go mad.
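On the `__version__` peeve above: a common workaround (not an official mechanism; `mypkg/__init__.py` is a hypothetical path) is to keep the string in the package and have setup.py extract it with a regex, so the version is readable both in code and in the package metadata without import side effects:

```python
import re
from pathlib import Path

def read_version(init_path):
    """Extract __version__ from a Python file without importing it,
    so setup.py gets the version with no import side effects."""
    text = Path(init_path).read_text()
    match = re.search(r'__version__\s*=\s*["\']([^"\']+)["\']', text)
    if not match:
        raise RuntimeError("No __version__ found in %s" % init_path)
    return match.group(1)

# In setup.py, something like:
# setup(name="mypkg", version=read_version("mypkg/__init__.py"), ...)
```

It works, but it's exactly the kind of regex boilerplate the comment is complaining about having to write.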
Why exe? How do you package libraries using this new tool you are envisioning?
Also notable, IMO, is the lack of a tool like rbenv or rustup for python. I can't tell you how many times I have had to try to figure out which python version a given pip executable worked with.
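One way to sidestep the "which Python does this pip belong to" puzzle: `pip --version` reports the interpreter it is bound to, and invoking it as `python -m pip` removes the ambiguity entirely. A small sketch (this assumes pip is installed for the running interpreter):

```python
import subprocess
import sys

# `python -m pip --version` prints something like:
#   pip 23.0 from /usr/lib/python3.11/site-packages/pip (python 3.11)
# which tells you exactly which interpreter this pip installs for.
result = subprocess.run(
    [sys.executable, "-m", "pip", "--version"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```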
I'd ask for your reasoning but it seems sametmax has done a good job of that for you:
A more controversial statement with the same content probably wouldn't have been voted up, but thread parent is objectively true even though it doesn't contain its own proof.
Ha, that's what I came here to say!
Or better - a new packaging paradigm.
Maybe it's extremely uncool to say ... but I think Java still has the best packaging paradigm of all languages. Jars rule. Of course 'gradle' is kind of a confusing mess so they don't have dependencies worked out very well ...
Nevertheless I do feel that Python's packaging and dependency/versioning woes create a much bigger systematic problem than many realize.
Kudos to the author though ...
> It harnesses Pipfile, pip, and virtualenv into one single toolchain. It features very pretty terminal colors.
For a weekend project, this has some very nice things.
Which removes the need for me to run my own project that basically does these things... in a more or less worse way.
Everything I've come to expect from Reitz, and hopefully it'll gain some decent ground like other projects of the same author.
Often people have a requirements.live.txt, or other requirements files depending on the environment. Is that handled somehow? Can we use different files or sections? [ED: yes, different sections]
Still wondering to myself if this is worth the fragmentation for most people using requirements.txt ? Perhaps the different sections could have a "-r requirements.txt" in there, like how requirements.dev.txt can have "-r requirements.txt". [ED: the pipfile idea seems to have quite some people behind it, and pip will support it eventually. Seems it will be worth it to standardise these things. requirements.txt is a less jargony name compared to Pipfile though, and has a windows/gui friendly extension.]
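For reference, the draft Pipfile format already separates environments into TOML sections, roughly like this (the package names are just examples):

```toml
[packages]
requests = "*"

[dev-packages]
pytest = "*"
```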
Other tools can set up an environment, download stuff, and run the script. Will pipenv --shell somescript.py do what I want? (run the script with the requirements it needs). ((I guess I could just try it.)) [ED: doesn't seem so]
Why Pipfile with Caps? Seems sort of odd for a modern python Thing. It looks like a .ini file? [ED: standard still in development it seems. TOML syntax.]
With a setup.py set up, all you need to do is `pip install -e .` to download all the required packages. Or `pip install somepackage`. Lots of people make the setup.py file read the requirements.txt. Do you have some command for handling this integration? Or does it need to be done manually? [ED: seems this isn't considered / out of scope.]
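The "setup.py reads requirements.txt" pattern mentioned above is usually just a few lines; here's a sketch of the common approach (not something pipenv provides):

```python
from pathlib import Path

def parse_requirements(path):
    """Return the non-empty, non-comment lines of a requirements file,
    suitable for passing to setup(install_requires=...)."""
    lines = Path(path).read_text().splitlines()
    return [ln.strip() for ln in lines
            if ln.strip() and not ln.lstrip().startswith("#")]

# In setup.py:
# setup(..., install_requires=parse_requirements("requirements.txt"))
```

Note this only handles plain requirement lines; options like `-r other.txt` or `-e .` would need extra handling, which is part of why the pattern is debated.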
Is there a pep? [ED: too early it seems.]
It's a very similar set of tools. I use pip-compile which allows me to put all of my dependencies into a `requirements.in` file, and then "compile" them to a `requirements.txt` file as a lockfile (so that it is compatible with pip as currently exists).
This looks great, though, I'm excited to check it out!
Ruby and Node apps are particularly guilty of this, pulling in sometimes hundreds of packages, some of which need compilation. Compare that to a Go binary, which is download-and-use. These things can get very complicated very fast, even for developers or systems folks, let alone end users who may not be intimately familiar with that specific ecosystem.
It would be _super_ interesting if the Python and Ruby communities got together to harmonize every last detail of their packaging toolchain. Who is in?
- the JS community: npm's dependency graph, webpack's resolver and yarn's performance;
- the Rust community, with cargo.
The nearest equivalent is to place a file called '.ruby-version' in the top level directory, containing the version number of the Ruby you want to use. Version numbers come from https://github.com/rbenv/ruby-build/tree/master/share/ruby-b.... rbenv, chruby and rvm all support .ruby-version.
One difference from virtualenv is that the Ruby version managers share single installations of each version of Ruby. My understanding from occasional use of Virtualenv is that it copies the python installation into a new per-project subdirectory, which seems a bit wasteful to me.
> You can set the path config variable of Bundler to not place the project Gems in a central location which I think is cleaner and try to remember to always do now.
Yes, this is what I do. It gives me a shared 'clean' Ruby installation of the right version, plus a project-specific copy of all the gems the project depends on. To me this provides the best trade off between project isolation and not duplicating the whole world. You can set bundler up so this is done automatically by creating '~/.bundle/config' containing
---
BUNDLE_PATH: "vendor/bundle"
BUNDLE_BIN: ".bundle/bin"
(The BUNDLE_PATH one is the important one; see 'bundle config --help' for other options.)

With Nix[OS] you just run `nix-shell -p python[2,3] python[2,3]Packages.numpy ...` to get an environment with the required packages.
Of course this requires that the python library is packaged in nix, but in my experience the coverage is quite good, and it's not very hard to write packages once you get the hang of it.
It also possible (but currently a bit clumsy in some ways) to set up named and persistent environments.
I'm glad to see Python getting the same attention as other modern package managers. This is all great work!
As this is not cross-platform, and it would be nice to switch between Linux/Windows while coding to maintain platform compatibility, can the virtualenv envs be created with an OS platform & subsystem prefix? For example, having multiple envs at once:
- env/posix/bin/activate
- env/nt/Scripts/activate.bat

You can easily get a nice isolated python environment with some packages in nix without using pip, pyenv, etc. `nix-shell -p python pythonPackages.numpy ...`
So far I think it works quite well for most languages as long the needed packages are in nixpkgs.
Some of the tooling could be better, but the underlying model seems sound.
I'm not really convinced language-specific package managers are needed. Nix isn't perfect yet, but it has come a long way.
I had a dilemma about it. But after all, you cannot move your venv directory unless you use the `--relocatable` option. So, does anyone have a strong argument for creating venvs inside your project directory?
We just use a directory where we keep our dependencies. It's a matter of:
mkdir libs
pip install -t libs <package>
# then to run
PYTHONPATH=libs python app.py
From what I can tell, this accomplishes everything a venv does (except bringing the Python interpreter itself along) without requiring any extra tools or conventions to learn.

- you still need to set PYTHONPATH so that packages are recognized from `libs`
- packages have bin scripts sometimes that most likely, will be needed by the project
A more proper command:
- pip install -t ./libs --install-option="--install-scripts=./bin"

But still, this does not solve the PYTHONPATH issue. It won't be solvable, because all Python CLI tools and all scripts in ./bin must be aware of a PYTHONPATH that includes ./libs.
This is what venv does and pip alone cannot easily solve, replicating an entire python environment that is aware of project local pip packages
- ./bin. No commands for you.
- a way to specify your libs to a program that doesn't provide a way to set env vars.
- isolation from the system site-packages. This will cause subtle bugs.
- a clean pip freeze. No deps listing.
- and hence a lock file.
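The PYTHONPATH mechanism behind the `pip install -t libs` approach can be checked in a couple of lines; entries from PYTHONPATH are prepended to the child interpreter's sys.path (the `libs` name here is just the example directory from above):

```python
import os
import subprocess
import sys

# Run a child interpreter with PYTHONPATH=libs and check that the
# directory shows up on its sys.path (newer Pythons may absolutize
# the entry, so match on the path suffix rather than the exact string).
env = dict(os.environ, PYTHONPATH="libs")
result = subprocess.run(
    [sys.executable, "-c",
     "import sys; print(any(p.endswith('libs') for p in sys.path))"],
    env=env, capture_output=True, text=True,
)
print(result.stdout.strip())  # expected: True
```

Which also illustrates the complaint: every child process and script needs that env var set, whereas a venv's interpreter carries the paths with it.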
I find tools like virtualenvwrapper and this one from Kenneth tend to solve issues I don't really have. A little bit of repetitive typing here and there is OK to burn knowledge into my mind; and the fewer leaky abstractions I have to deal with, the better.
- beginners don't have to understand the whole virtualenv shenanigans. I use pew myself to replace virtualenvwrapper, but I will switch to pipenv just to ease the pain for team members joining in.
- it enforces good dependency management practice with the TOML file and lock file. This is an issue in almost all projects I've worked on, including ones from Python experts. We all use only a requirements.txt file out of convenience, and never lock.
- it's one tool to do all the packaging stuff. No need for pip and virtualenv and a wrapper. You got one command.
The whole thing makes it way easier to get started for a beginner. No more activate. No more wondering about virtualenv. Automatic lock files are great, since no project I know of uses them because they are not well understood.
It's like node_modules (easy and obvious), but cleaner (no implicit magic).
Like.
PyGradle [2]: "The PyGradle build system is a set of Gradle plugins that can be used to build Python artifacts"
[1] https://github.com/linkedin/pygradle/blob/01d079e2b53bf9933a...
I use Python 2.7, 3.4, and 3.5 on various projects. Is there a way to choose between 3.4 and 3.5 using Pipenv? I'm using something like this with virtualenv:
$ virtualenv -p `which python3.5` .venv
$ virtualenv -p python3.5 .venv

Secondly, this tool allows you to freeze your requirements list at specific versions. So in your req file, you have the names of the packages you depend on. But in the req lock, you get all the pulled dependencies, recursively, with the exact versions you are using right now. The first lets you dev more easily; the second lets you deploy more easily.
All that, for less complexity. Win win.
At first glance, this doesn't seem to offer anything beyond what I already see from setup(). What am I missing?
It's unfortunate that CPython gave us distutils and took a very long time to converge on a built-in successor (setuptools?) that gives the right composability.
Glad that someone thought about a similar thing and made a tool to solve it!
This is a bit strange, because the python binary is always supposed to be Python 2. The Python 3 binary is supposed to be named python3. Some distributions don't follow this, but they're the weird non-conformant ones; it's not a behaviour that should really be relied on.
This is not correct. It's a symlink to python2 on systems that rely on calls to python to be python2. On modern systems, python is usually a symlink to python3. This is the case on Arch Linux and I believe other recent distro releases.
Given that they're not mutually compatible except in rare cases, it's a very silly thing to do. You can upgrade GCC with the same name because you know it will handle most of the same input. If you do that with Python you're breaking tons of existing scripts for very close to zero benefit. Why would you do that?
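Given the python/python2/python3 naming confusion, scripts that depend on a specific major version often guard at runtime rather than trusting the binary name; a minimal sketch:

```python
import sys

# Fail fast with a clear message instead of a confusing SyntaxError
# when the wrong interpreter picks up the script.
if sys.version_info < (3,):
    raise SystemExit("This script requires Python 3; run it with python3.")
print("Running under Python %d.%d" % sys.version_info[:2])
```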