I have never understood why people feel that pbr is better than just a regular setup.py. All it seems to do is move the exact same metadata into setup.cfg instead. Maybe it's a bit easier syntactically, but pulling in a whole package just for that seems a bit much, especially since you don't find yourself editing setup.py that often. You still need to learn all the fields, what they do, and what you should set them to.
That said... packaging stuff in Python could really do with being a lot simpler. pbr just doesn't seem to make it simpler; it moves the problem to a different file.
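For context, the pbr pattern looks roughly like this (a minimal sketch; the project name and field values are placeholders, field names per pbr's docs). setup.py shrinks to a stub:

    # setup.py -- with pbr, this is all that's left
    import setuptools

    setuptools.setup(
        setup_requires=['pbr'],
        pbr=True,
    )

and the metadata moves into setup.cfg:

    # setup.cfg
    [metadata]
    name = mypackage
    summary = A one-line description
    author = Jane Doe

    [files]
    packages =
        mypackage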
Prefacing with "all it seems to do" makes you sound like you haven't actually used PBR, which could also explain why you don't understand why people prefer PBR to setup.py.
If you think packaging is a matter of configuration, then PBR makes a lot of sense: it gives you a config file that you simply fill out, so you don't have to worry about the code.
I've used both PBR and setup.py quite a bit, and I personally prefer PBR since there are fewer things I need to debug when things go wrong.
That's the problem: there are more things you need to debug, since PBR doesn't get rid of any of the problems you encounter with setuptools or pip; it just adds its own.
Trying to paper over the (real) problems with the Python packaging toolchain by generating setup.py from a config file strikes me as a vanity project.
I have, in two different projects, and I've rejected PRs to introduce it in another three. Because "all it seems to do" is move the problem to a different location.
You also fail to give any examples of why it would be an improvement, aside from a very vague "fewer things I need to debug when things go wrong". Which I'd argue against, since now I also have to worry about things going wrong in PBR itself, which is one more thing to debug. This seems to be the general theme with pbr: there are no clear reasons why it's better, and when you ask about it you get these kinds of hand-wavy answers.
If you actually have some concrete examples of what it improves and why, I'm all ears.
Isn't that paradoxical? That code is for parsing requirements, but those requirements can't be parsed until the initial prerequisite (pip, in this case) is installed.
From my understanding, when you build the package, your machine extracts the relevant required packages and uploads that data to PyPI, which in turn sends it back to users when they install.
I think you may be misunderstanding the problem being discussed. The code that uses pip will crash if pip is not already installed. There is no way to parse and satisfy the dependency on pip, because the author chose to import pip as a library.
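To illustrate the anti-pattern (a hypothetical setup.py; `parse_requirements` lived at this import path in older pip releases and moved to a private module in pip 10):

    # setup.py that imports pip as a library -- crashes with ImportError
    # if pip isn't present, and broke outright when pip 10 moved its
    # internals to pip._internal.
    from pip.req import parse_requirements
    from setuptools import setup

    reqs = parse_requirements('requirements.txt', session='dummy')
    setup(
        name='mypackage',  # placeholder
        install_requires=[str(r.req) for r in reqs],
    )

Simply reading requirements.txt with open() and splitting it into lines avoids the import entirely.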
pip being a dependency is defined at upload time; it is parsed by the creator of the package, not the consumer, AFAIK.
If you used any other package manager, it would need to resolve dependencies from the additional info in the package index, resulting in pip being downloaded and installed before the desired package.
You don't necessarily always want to install the actual package (e.g. during development); keeping requirements in a separate file lets you install the deps but not the package...
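For example (standard pip invocations):

    pip install -r requirements.txt   # install only the dependencies
    pip install -e .                  # install the package itself (editable mode), plus its deps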
I run pip-tools' pip-compile to generate a locked set of dependencies:
- requirements.txt
- requirements_dev.txt
It gives me some of the benefit of Rust's Cargo.toml / Cargo.lock in the Python world (and it actually respects every package's dependency version declarations, unlike other tools such as pyup).
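For anyone unfamiliar, the pip-tools workflow looks roughly like this (file names are the tool's defaults; the package name is just an example):

    $ echo "requests" > requirements.in
    $ pip-compile requirements.in        # resolves and pins everything, writes requirements.txt
    $ pip-compile requirements_dev.in    # likewise writes requirements_dev.txt
    $ pip-sync requirements.txt          # make the environment match the pinned set exactly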
I'd suggest pipenv for this kind of thing. It will generate a Pipfile.lock, which is the (still in development, but pretty stable) official way to create a locked set of deps.
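Typical usage, for the curious (package names are just examples):

    $ pipenv install requests        # records it in Pipfile, resolves, writes Pipfile.lock
    $ pipenv install --dev pytest    # dev dependencies are tracked separately
    $ pipenv sync                    # install exactly what Pipfile.lock pins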
I'd rather recommend reading through https://python-packaging.readthedocs.io/en/latest/index.html to actually understand how packaging works.