
I have never understood why people feel that pbr is better than just a regular setup.py. All it seems to do is move the exact same metadata into setup.cfg instead. Maybe it's a bit easier syntactically, but pulling in a whole package just for that seems a bit much, especially since you don't find yourself editing setup.py that often. You still need to learn about all the fields, what they do, and what you should set them to.
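For reference, pbr's documented usage shrinks setup.py to a shim like the following, with everything else living in setup.cfg; this is a sketch, not from any real project:

    # With pbr, setup.py becomes a shim; name, author, classifiers,
    # requirements, etc. all move into setup.cfg.
    from setuptools import setup

    setup(
        setup_requires=['pbr'],
        pbr=True,
    )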

I'd much more recommend reading through https://python-packaging.readthedocs.io/en/latest/index.html to actually understand how packaging works.

That said... packaging stuff in Python could really do with being a lot simpler. Pbr just doesn't seem to make it simpler; it just moves the problem to a different file.



Prefacing with "all it seems to do" makes you sound like you haven't actually used PBR, which could also explain why you don't understand why people prefer PBR to setup.py.

If you think packaging is a matter of configuration, then PBR makes a lot of sense. It gives you a config file that you simply fill out so you don't have to worry yourself about the code.

I've used both PBR and setup.py quite a bit and I personally prefer PBR, since there are fewer things I need to debug when things go wrong.


That's the problem: there are more things you need to debug, since PBR does not get rid of any of the problems you encounter with setuptools or pip, but adds its own.

Trying to paper over the (real) problems with the Python packaging toolchain by generating setup.py from a config file strikes me as a vanity project.


I have, in two different projects, and I've rejected PRs to introduce it in another three. Because "all it seems to do" is move the problem to a different location.

You also fail to give any examples of why it would be an improvement, aside from a very vague "less things I need to debug when things go wrong". I'd argue the opposite, since now I also have to worry about things going wrong in PBR itself, which is one more thing to debug. This seems to be the general theme with pbr: there are no clear reasons why it's better, and when asked about it you get these kinds of hand-wavy answers.

If you actually have some concrete examples as to what it improves and why I'm all ears.


At first glance, pbr appears to solve the problem of having to duplicate my dependencies in both setup.py and requirements.txt.


I suppose it does do that, but so does this:

    import codecs

    with codecs.open('requirements.txt', encoding='utf-8') as f:
        requirements = f.read().splitlines()
    ...
    install_requires=requirements,
I don't think it's worth the dependency. But perhaps for others it is.
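In full context it's only a handful of lines in setup.py; a sketch, with a made-up package name:

    import codecs
    from setuptools import setup, find_packages

    # Read the dependencies straight out of requirements.txt
    with codecs.open('requirements.txt', encoding='utf-8') as f:
        requirements = f.read().splitlines()

    setup(
        name='example-package',  # hypothetical name
        version='0.1.0',
        packages=find_packages(),
        install_requires=requirements,
    )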


That works for simple use cases but it is not robust.

In case anyone sees this, at a minimum you need to use

  from pip.req import parse_requirements

  install_reqs = parse_requirements('path/to/requirements.txt', session='hack')
  requirements = [str(ir.req) for ir in install_reqs]
to be safe here.


What if the user doesn't have pip installed? This is before we've installed the requirements, so we shouldn't be assuming anything outside the stdlib.
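If you do want to stay stdlib-only, here's a rough sketch that copes with comments and blank lines (but not pip directives like -e or -r):

    # Stdlib-only sketch: keeps plain requirement lines such as
    # "foo==1.2"; skips comments, blank lines, and pip directives.
    def read_requirements(path='requirements.txt'):
        reqs = []
        with open(path) as f:
            for line in f:
                line = line.split('#', 1)[0].strip()
                if line and not line.startswith('-'):
                    reqs.append(line)
        return reqs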


If I'm not mistaken, the requirements are parsed and installed before the desired package itself is installed, so that should be fine.


Isn't that paradoxical? That code is for parsing requirements and those requirements can't be parsed until initial pre-requirements (pip, in this case) are installed.


Sorta.

From my understanding, when you build the package, your machine extracts the relevant required packages and uploads that metadata to PyPI, which in turn sends it back to users when they install.

At least that's how I think it works.


I think you may be misunderstanding the problem being discussed. The code that uses pip will crash if pip is not already installed. There is no way to parse and satisfy the dependency on pip, because the author chose to import pip as a library.


Pip being a dependency is defined at upload time; it is parsed by the creator of the package, not the consumer, afaik.

If you used any other package manager, it would need to resolve dependencies from the additional info in the package index, so pip would be downloaded and installed before the desired package.


This is actually less robust, since on top of relying on the setuptools API, you now also rely on the pip API.

If you control requirements.txt, there is nothing "not robust" about parsing it in setup.py.


You should read this: https://caremad.io/posts/2013/07/setup-vs-requirement/; it specifically references this "feature" of pbr.


Why are you using a requirements.txt if you have a setup.py that lists the dependencies? `pip install -e .` or `setup.py install` should just work in that case.


You don't necessarily always want to install the actual package (e.g. during development); keeping the requirements separate lets you install the deps but not the package itself.


What if you have your own package index? You can use `--index-url` in requirements.txt to specify that.


You can use `--index-url` when calling pip: https://pip.pypa.io/en/stable/reference/pip_wheel/#index-url

You can also use `dependency_links` in your setup.py to specify this, which allows deps hosted on GitHub and the like.
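A sketch of the `dependency_links` route, with made-up names (and note that newer pip releases have deprecated `dependency_links`):

    from setuptools import setup

    setup(
        name='example-package',        # hypothetical
        install_requires=['somedep'],  # hypothetical dependency
        dependency_links=[
            # point the dep at a GitHub tarball; #egg names it for pip
            'https://github.com/example/somedep/tarball/master#egg=somedep',
        ],
    )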


The approach I've been using:

Define my dependencies in

- setup.py

- requirements_dev.in

then run pip-tools' pip-compile to generate a locked set of dependencies:

- requirements.txt

- requirements_dev.txt

It gives me some of the benefits of Rust's Cargo.toml / Cargo.lock in the Python world (and it actually respects every package's dependency version declarations, unlike other tools such as pyup).


I'd suggest pipenv for this kind of thing. It will generate a Pipfile.lock, which is the (still in development, but pretty stable) official way to create a locked set of deps.



