Hacker News

Yet more proof that "the best way to do anything in python is to not do it in python."


It's true that where Python offers serious performance, it's typically by providing a nice interface to existing compiled code. But people who work through those interfaces are still fundamentally "doing it in Python"; the most important "it" is whatever makes a useful system on top of the number-crunching.

But putting that aside, a big part of uv's performance is due to things other than the implementation language. Most of the actually necessary parts of the installation process are I/O bound, and work through C system calls even in pip. The crunchy bits are package resolution in rare cases (lock files cache the result entirely), and pre-compiling Python to .pyc bytecode files (which is embarrassingly parallel if you don't need byte-for-byte reproducibility, and is normally optional unless you're installing something with admin rights to be used by unprivileged users).

Uv simply has a better internal design. You know that original performance chart, "installing the Trio dependencies with a warm cache"?

It turns out that uv at the time defaulted to not pre-compiling while pip defaulted to doing it; an apples-to-apples comparison is not as drastic, but uv also does the compilation in parallel which pip hasn't been doing. I understand this functionality is coming to pip soon, but it just hasn't been a priority even though it's really not that hard (there's even higher-level support for it via the standard library `compileall` module!).
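For context on how little machinery parallel pre-compilation actually needs, here's a minimal sketch using the standard library's `compileall` (the paths and toy modules are made up for illustration; this isn't pip's or uv's actual code):

```python
import compileall
import tempfile
from pathlib import Path

# A couple of toy modules standing in for an installed package
# (this stand-in "site-packages" directory is hypothetical).
pkg = Path(tempfile.mkdtemp()) / "site-packages"
pkg.mkdir()
(pkg / "a.py").write_text("x = 1\n")
(pkg / "b.py").write_text("y = 2\n")

# workers=0 means one worker process per CPU core; compiling
# independent .py files is embarrassingly parallel.
ok = compileall.compile_dir(str(pkg), workers=0, quiet=1)

print(bool(ok))  # → True if every file compiled cleanly
```

After this runs, `pkg / "__pycache__"` holds one `.pyc` per module, which is exactly the artifact pip produces serially at install time.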

More strikingly, though, uv's cache actually caches the unpacked files from a wheel. It doesn't have to unzip anything; it just hard-links the files. Pip's cache, on the other hand, is really an HTTPS cache; it basically simulates an Internet connection locally, "downloading" a wheel by copying (the cached artifact has a few bytes of metadata prepended) and unpacking it anew. And the files are organized and named according to a hash of the original URL, so you can't even trivially reach in there and directly grab a wheel. I guess this setup is a little better for code reuse given that it was originally designed without caching and with the assumption of always downloading from PyPI. But it's worse for, like, everything else.
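To make the difference concrete, here's a toy illustration of the hard-link approach (not uv's actual code; the cache layout and file names are invented). "Installing" from a cache of already-unpacked files is one `link()` syscall per file, with no unzipping and no copying:

```python
import os
import tempfile
from pathlib import Path

# Hypothetical cache holding a wheel's already-unpacked files.
root = Path(tempfile.mkdtemp())
cache = root / "cache" / "demo-1.0"
site = root / "site-packages"
cache.mkdir(parents=True)
site.mkdir()

(cache / "demo.py").write_text("VERSION = '1.0'\n")

# "Install" by hard-linking each cached file into the environment.
for src in cache.iterdir():
    os.link(src, site / src.name)

# Both paths reference the same inode: no bytes were duplicated.
print(os.path.samefile(cache / "demo.py", site / "demo.py"))  # → True
```

A pip-style cache instead has to locate the artifact by URL hash, strip its metadata prefix, and unzip the wheel into place on every install.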


Yet more proof that confirmation bias exists.


Find something better than Django.

Or matplotlib.

Or PyTorch.


PyTorch is a C++ project with a Python wrapper.


>Find something better than Django

Rails. QED.


Rails is a good project no doubt.

Django comes batteries included for basic apps, including an admin.


Counterpoint: you still need a third-party library to implement a sane REST API in Django. Same with async jobs.


I would definitely not use Django to build a REST API.

It can work... but that's not what it was designed for.



