Overall it feels like UV is the best thing to happen to Python packaging in two decades: it circumvented the endless non-productive discussions on PEPs and instead just built something that works and is fast. In Rust, naturally.
UV is great, but it also builds on existing PEPs. While they have the ability to experiment (which is great), they also benefit from those "endless non-productive discussions on PEPs", as you called it.
I think UV proves that dedicated funding can make a huge impact on a project and benefit the community. They are doing a damn good job.
They mostly took inspiration from other languages for UV. Cargo (Rust) was a huge inspiration, but they got stuff from Ruby as well, I believe. There was an episode on "The Changelog" about it. I don't remember them saying anything about PEPs, although that might just be me not having listened to the entire thing. However, Charlie Marsh was extremely insistent on the advantages of being a polyglot and hiring people with diverse programming experiences. So I think it's quite safe to assume that played a bigger role than just PEPs.
> I don't remember them saying anything about PEPs, although that might just be me not having listened to the entire thing. However, Charlie Marsh was extremely insistent on the advantages of being a polyglot and hiring people with diverse programming experiences. So I think it's quite safe to assume that played a bigger role than just PEPs.
Not that they took inspiration from PEPs, but they sought to implement those standards (for interoperability) and have been active in the discussion of new packaging-related PEPs.
I can't speak for the UV team. My 2C on how I would treat the PEPs: If there is an accepted one, and implementing it doesn't go too strongly against your competing design goals, do it for compatibility. This does not imply that the PEP is driving your design, or required to make your software. It is a way to improve compatibility.
I wrote an earlier one (rust, inspired by Cargo, managed deps, scripts, and py versions) called PyFlow that I abandoned, because nobody used it. "Why should I use this when we have pip, pipenv, and poetry?"
It probably doesn't help that there seem to be a few other projects out there with the same name or a similar name, which do completely different things.
It's true that where Python offers critical performance it's typically by providing a nice interface to existing compiled code. But people who work through those interfaces are still fundamentally "doing it in Python"; the most important "it" is that which makes a useful system on top of the number-crunching.
But putting that aside, a big part of uv's performance is due to things other than the implementation language. Most of the genuinely necessary parts of the installation process are I/O-bound, and work through C system calls even in pip. The crunchy bits are package resolution in rare cases (and lock files cache that result entirely), and pre-compiling Python to .pyc bytecode files (which is embarrassingly parallel if you don't need byte-for-byte reproducibility, and normally optional unless you're installing something with admin rights to be used by unprivileged users).
Uv simply has better internal design. You know that original performance chart "installing the Trio dependencies with a warm cache"?
It turns out that uv at the time defaulted to not pre-compiling while pip defaulted to doing it; an apples-to-apples comparison is not as drastic, but uv also does the compilation in parallel which pip hasn't been doing. I understand this functionality is coming to pip soon, but it just hasn't been a priority even though it's really not that hard (there's even higher-level support for it via the standard library `compileall` module!).
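To make the "embarrassingly parallel and not that hard" point concrete, here is a minimal sketch of parallel bytecode pre-compilation using the standard library's `compileall` module mentioned above. The temp-directory package is illustrative; `workers=0` means "use as many worker processes as CPUs":

```python
import compileall
import pathlib
import tempfile

# Illustrative stand-in for an installed package tree.
site = pathlib.Path(tempfile.mkdtemp())
pkg = site / "pkg"
pkg.mkdir()
(pkg / "__init__.py").write_text("x = 1\n")

# Compile every .py under `site` to .pyc in parallel.
# workers=0 -> one worker per CPU (supported since Python 3.5).
ok = compileall.compile_dir(site, quiet=1, workers=0)
```

This is roughly the whole job: walk the tree, compile each module, and let multiple worker processes share the work.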
More strikingly, though, uv's cache actually caches the unpacked files from a wheel. It doesn't have to unzip anything; it just hard-links the files. Pip's cache, on the other hand, is really an HTTPS cache; it basically simulates an Internet connection locally, "downloading" a wheel by copying (the cached artifact has a few bytes of metadata prepended) and unpacking it anew. And the files are organized and named according to a hash of the original URL, so you can't even trivially reach in there and directly grab a wheel. I guess this setup is a little better for code reuse given that it was originally designed without caching and with the assumption of always downloading from PyPI. But it's worse for, like, everything else.
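The hard-link trick described above can be sketched in a few lines; this is not uv's actual cache layout (the directory names here are made up), just the core idea that "installing" a cached file is a hard link, not a copy or an unzip:

```python
import os
import pathlib
import tempfile

# Illustrative directories, not uv's real cache layout.
root = pathlib.Path(tempfile.mkdtemp())
cache_dir = root / "cache"
site_packages = root / "site-packages"
cache_dir.mkdir()
site_packages.mkdir()

# A file already unpacked into the cache.
cached = cache_dir / "module.py"
cached.write_text("VALUE = 42\n")

# "Install" it by hard-linking: both paths point at the same inode,
# so no file data is copied at all.
target = site_packages / "module.py"
os.link(cached, target)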
Can someone explain why UV is so praised when Poetry achieved much of the same several years earlier? Maybe I missed the train, but I've been using Poetry since its first version, and all the benefits people praise UV for have long been part of my builds.
Ruff was the gateway drug for me, much better than the black/isort/etc combo.
Led me to try uv, which fixed a couple of egregious bugs in pip. Add speed and it’s a no brainer.
I don’t think Poetry has these advantages, and I heard about bugs early on. Is that completely fair? Probably not. But it’s obvious Astral's tools have funding and a competent team.
After using both for the past 5 years, I'd say the main difference between Poetry and uv is user experience and documentation. Poetry fought me every step of the way w.r.t. pip migration, venvs, and Poetry updates. uv just worked in these areas, which meant I was updating packages much more frequently, which increased my experience and confidence in uv in a virtuous cycle.
Poetry was in some senses before its time; people were frustrated with pip, but didn't really understand the problems they were encountering, meanwhile the ecosystem had started trying to move away from doing everything with Setuptools for everyone. There's a ton that I'd love to explain here about the history of pyproject.toml etc. but the point is that Poetry had its own idea about what it would mean to be an all-in-one tool... and so did everyone else. Meanwhile, the main packaging people had been designing standards with the expectation of making a UNIX-style tool ecosystem work.
Everyone seems to like uv's answer better, but I'm still a believer in composable toolchains, since I've already been using those forever. I actually was an early Poetry adopter for a few reasons. In particular, I would have been fine sticking with Setuptools for building projects if it had supported PEPs 517/518/621 promptly. 621 came later, but Poetry's workaround was nicer than Setuptools' to me. And it was easier to use with the new pyproject.toml setup, and I really wanted to get away from the expectation of using setup.py even for pure-Python projects.
But that was really it. The main selling point of Poetry was (and is) that they offered a lockfile and dependency resolution, but these weren't really things I personally needed. So there was nothing really to outweigh the downsides:
* The way Poetry does the actual installation is, as far as I can tell, not much different from what pip does. And there are a ton of problems with that model.
* The early days of Poetry were very inconsistent in terms of installation and upgrade procedures. At least once, it seemed the only thing that would work was a complete manual uninstall and reinstall, and I had to do my own research to figure out what to remove, since nothing was provided to automate the uninstallation.
* In the end, Poetry didn't have PEP 621 support for about four years (https://github.com/python-poetry/roadmap/issues/3 ; the OP was already almost a year after PEP acceptance in https://discuss.python.org/t/_/5472/109); there was this whole thing about how you were supposed to use pyproject.toml to describe the basic metadata of your project for packaging purposes, but if you used Poetry then you used Masonry to build, and that meant using a whole separate metadata configuration. Setuptools was slow in getting PEP 621 support off the ground (and really, PEP 621 itself was slow! It's hard to justify expecting anyone to edit pyproject.toml manually without PEP 621!), but Poetry was far slower still. I had already long given up on it at that point.
So for me, Poetry was basically there to provide Masonry, and Masonry was still sub-par. I was still creating venvs manually, using `twine` to upload to PyPI etc. because that's just how I think. Writing something like `poetry shell` (or `uv run`) makes about as much sense to me as `git run-unit-tests` would.
Good point! What are the other proper languages for such tooling, besides Go and maybe C++? But people... don't like (hate? fear?) C++ nowadays, and I doubt Go fits well here. So what else? (I'm not pushing for Rust! It just seems Rust fell into a vacuum it doesn't fully fit, like a square peg in a round hole.)
Not everyone would want to get into an Apple vendor-locked language (formally it isn't, but that's like saying Chromium is open source and unrelated to Alphabet/Google) and its development environment. Especially after JetBrains discontinued its Swift-oriented IDE [0]. I hate Xcode, sorry, after developing in it for too many years.
Swift has an open-source, fully supported first-party LSP integration and VSCode plugin, as well as a very active and welcoming community (forums). Using Swift w/o Xcode is now 100% possible, and very much recommended by some people when no Apple platform specifics (native iOS/macOS apps) are required.
The language was initially very much Apple-platforms oriented (had to be), but now that pretty much all the Apple stuff works well they moved beyond that.
Finally, where a language comes from does not determine whether you can write good code in it.