The Python packaging ecosystem has long needed an overhaul and the implementation of already-designed features, but change often stalls on adoption.
I think it's time to propose a guiding principle for incremental change:
be carefully impatient
The cautious approach for delivering a new feature in the infrastructure looks like this:
- Design the change.
- Implement the change(s) needed, in a new major version (e.g. Metadata-2.0).
- Wait for the new version to be the default everywhere.
- Tell users they can use it.
This is frankly terrible. Firstly, we cannot really identify ‘default everywhere’. We can identify ‘default in known distributions’, but behind-the-firewall setups may lag arbitrarily far behind. Secondly, it makes the cycle time for getting user feedback extraordinarily long: decade-plus time windows. Thirdly, as a consequence, we run a large risk of running ahead of our users and delivering worse fixes and improvements than we would if they were using our latest work and giving us feedback.
So here is how I think we should deliver things instead:
- Design the change with specific care that it fails closed and is opt-in.
- Implement the change(s) needed, in a new minor version of the tools.
- Tell users they can use it.
So, why do I think we can skip waiting for it to be a default?
pip, wheel and setuptools are just as able to be updated as any other Python component. If someone is installing (say) numpy via pip (or easy_install), then by definition they are willing to use things from PyPI, and pip and setuptools are in that category.
And if they are not installing via pip, then the Python packaging ecosystem does not affect them.
If we have opt-in as a design principle, then the adoption process will be bottom up: projects that are willing to say to their users ‘you need new versions of pip and setuptools’ can do so, and use the feature immediately. Projects that want to support users installing their packages with pip but aren’t willing to ask that they also upgrade their pip can hold off.
If we have fails-closed as a design principle, then when a project has opted in, and the user installing the package hasn’t upgraded their pip, things will at least fail rather than silently doing the wrong thing.
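As a concrete illustration of the fails-closed idea, here is a minimal sketch (not any project's actual code) of a guard a setup.py could run before using newer features; the version parsing is deliberately simplified, and real code would use a proper version-comparison library:

```python
def parse_version(version):
    """Turn a version string like '17.1' into a tuple for numeric comparison.

    Deliberately simplified for illustration: non-numeric components
    (e.g. 'post1') are skipped rather than compared properly.
    """
    return tuple(int(part) for part in version.split(".") if part.isdigit())


def require_setuptools(minimum):
    """Abort with a clear message if the running setuptools is too old.

    This is the fails-closed behaviour: an old installer stops with an
    actionable error instead of silently ignoring newer features.
    """
    import setuptools
    if parse_version(setuptools.__version__) < parse_version(minimum):
        raise SystemExit(
            "setuptools >= %s is required (found %s); "
            "run 'pip install -U setuptools' and retry."
            % (minimum, setuptools.__version__)
        )
```

The key property is that the failure happens at install time, loudly, on the machine that needs the upgrade.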
I had experience of this in Mock recently: the 1.1.0 and up releases depended on setuptools 17.1. The minimum setuptools we could have depended on (while publishing wheels) was still newer than that in Ubuntu Precise (not to mention RHEL!), so we were forcing an upgrade regardless.
This worked OK, but we hit two significant issues. Firstly, folk with incorrect Python paths can end up shadowing system-installed packages, and for some reason ‘six’ triggered this for multiple users. Secondly, it took a number of different attempts to clearly signal the dependency, because the new features we were using did not fail closed: they were silently ignored by sufficiently old versions of setuptools.
We ended up with a `setup_requires="setuptools>17.1"` clause in setup.py, which we’re hopeful will fail, or Just Work, consistently.