
I personally don't feel that pip-compile is that useful. Everything you can achieve with it can be achieved with a "pip install" + freeze in a clean venv. OK, the error detection may have some use, but you still end up with almost the same effort (packages are downloaded, parsing happens, and you get an error).
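
Roughly the workflow I mean, as a sketch (file names are placeholders; virtualenv or python -m venv both work):

  virtualenv clean-env && . clean-env/bin/activate
  pip install -r requirements.txt      # only the top-level deps you actually care about
  pip freeze > requirements-full.txt   # pinned snapshot, including deps of deps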

IMHO we'd first need to standardize how dependencies are managed (requirements.txt, setup.py, ...) and then build a solution on top of that. If you have a requirements.txt-based setup, it's easy to e.g. recursively download/parse to get the full list of dependencies.
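
For example, with a newer pip something along these lines already pulls in the whole tree for you (directory name is arbitrary):

  pip download -r requirements.txt -d ./deps   # resolves and fetches deps of deps too
  ls ./deps                                    # the full set of distributions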

The requirements.in file reminds me of automake.



Agreed that this is doable, although I would find rebuilding a virtualenv every time I wanted to change a dependency pretty annoying. Perhaps pip-compile is just a stopgap until a better solution comes along, but I think I'd find it helpful.


I'd be more for a "pip diff" in that case, to show me the differences between the current venv and the changed layout.
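
No such subcommand exists as far as I know, but you can approximate it by diffing the frozen state of the venv against the proposed layout (bash process substitution; requirements-new.txt is just a made-up name for the changed file):

  diff <(pip freeze | sort) <(sort requirements-new.txt)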

I don't really like the idea of consolidating dependencies locally, because you'll end up with a big compiled requirements.txt which you then have to maintain yourself. IMHO it'll be a drag when you update a 3rd-party dependency whose own dependencies have changed...


So are you suggesting that you don't track "meta"-dependencies (dependencies of dependencies)? I've been bitten by issues from this before, and avoiding them is, I think, kind of the point of pip-compile.

I don't really see how it's much different from Gemfile.lock for Ruby or npm-shrinkwrap.json for Node.
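
For what it's worth, the pair of files ends up looking roughly like this (package and version numbers are only illustrative):

  # requirements.in -- what you actually ask for
  flask

  # requirements.txt -- what gets pinned, deps of deps included
  flask==0.10.1
  itsdangerous==0.23
  jinja2==2.7.2
  markupsafe==0.18
  werkzeug==0.9.4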


The Gemfile.lock (and, I assume, the Node equivalent) are reactive things, in the sense that they're generated from the final configuration (much like pip freeze).

I'm not really against tracking dependencies recursively, but not for the purpose of building one grand requirements file (which people will end up using and tweaking). I'm more interested in whether there are conflicts between different 3rd-party libs (e.g. if one needs package==1.0.0 and the other needs package==1.2.0) and whether the versions are backwards compatible (e.g. v1.4 vs v1.6 of Django).
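
For the conflict part, newer pip at least ships a check subcommand that validates the installed set, roughly:

  pip install -r requirements.txt
  pip check   # complains if one installed package needs package==1.0.0 while 1.2.0 is installed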

Now, if you want to change the version of a 3rd-party package which, in turn, has altered dependencies, you'd need to clean up the venv anyway to be on the safe side. Currently I end up just rebuilding the venv and performing a pip install. On distribution, you could build an equivalent of a Gemfile.lock via pip freeze (again). Hence, no real benefit from pip-compile. Still, a pip diff (e.g. between two requirements.txt files, tracking dependencies recursively) would be of more value IMHO, particularly if the devs agreed on a pattern/best practice.

There's also the matter of adding yet another file (requirements.in) to the setup framework (setup.py, metafiles...) and somewhat changing the meaning of requirements.txt.


Okay, that's a fair point. It just seems like Ruby and Node have successfully adopted two separate files for dependency tracking, so the same approach should work for Python as well.

I guess what you're suggesting is that requirements.txt should be the equivalent of a Gemfile and there should be a separate "requirements.lock" (or whatever) which tracks the output of pip freeze.

On top of that, I understand that you're suggesting that the generation of this lock file should be part of the process of setting up a virtualenv as opposed to a separate tool such as pip-compile.

If my interpretation is correct, then I totally agree. Although I still think pip-compile may be useful until such a tool exists :)
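
Something like this is the workflow I have in mind, folded into the venv setup ("requirements.lock" is a made-up name, mirroring Gemfile.lock):

  virtualenv env && . env/bin/activate
  pip install -r requirements.txt    # loose, Gemfile-style specs
  pip freeze > requirements.lock     # exact pins, Gemfile.lock-style
  # on deploy: pip install -r requirements.lock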



