Well, I started with pip because it's what I was told to use. But it was slow and had footguns. And then I started using virtualenv, but that only solved part of the problem. So I switched to conda, which sometimes worked but wrecked my shell profile and often led to things mysteriously using the wrong version of a package. So someone told me to use pipenv, which was great until it was abandoned and picked up by someone who routinely broke the latest published version. So someone told me to use poetry, but it became unusably slow. So I switched back to pip with the built-in venv, but now I have the same problems I had before, with fewer features. So I switched to uv, because it actually worked. But the dependency I need is built and packaged differently for different operating systems and flavors of GPU, and now my coworkers can't get the project to install on their laptops.
I'm so glad all the Python packaging challenges are "solved"
I started with "sudo apt install python" a long time ago and this installed python2. This was during the decades-long transition from python2 to python3, so half the programs didn't work so I installed python3 via "sudo apt install python3". Of course now I had to switch between python2 and python3 depending on the program I wanted to run, that's why Debian/Ubuntu had "sudo update-alternatives --config python" for managing the symlink for "python" to either python2 or python3. But shortly after that, python3-based applications also didn't want to start with python3, because apt installed python3.4, but Python developers want to use the latest new features offered by python3.5 . Luckily, Debian/Ubuntu provided python3.5 in their backports/updates repositories. So for a couple of weeks things sort of worked, but then python3.7 was released, which definitely was too fresh for being offered in the OS distribution repositories, but thanks to the deadsnakes PPA, I could obtain a fourth-party build by fiddling with some PPA commands or adding some entries of debatable provenance to /etc/apt/lists.conf. So now I could get python3.7 via "sudo apt install python3.7". All went well again. Until some time later when I updated Home Assistant to its latest monthly release, which broke my installation, because the Home Assistant devs love the latest python3.8 features. And because python3.8 wasn't provided anymore in the deadsnakes PPA for my Ubuntu version, I had to look for a new alternative. Building python from source never worked, but thank heavens there is this new thing called pyenv (cf. pyenv), and with some luck as well as spending a weekend for understanding the differences between pyenv, pyvenv, venv, virtualenv (a.k.a. python-virtualenv), and pyenv-virtualenv, Home Assistant started up again.
This wall of text is an abridged excursion through my installing-python-on-Linux experience.
There is also my installing-python-on-Windows experience, which includes: the official installer (exe or msi?) from python.org; some Windows-provided system python, installable by setting a checkbox in Windows's system properties; NuGet, winget, Microsoft Store Python; WSL, WSL2; anaconda, conda, miniconda; WinPython...
I understand this is meant as caricature, but for local development, tools like mise or asdf are really something I've never looked back from. For containers it's either a versioned Docker image or compiling it yourself.
The problem for me, a non-python developer, is that I just don't know what to do, ever, to run an existing script or program.
It seems every project out there uses a different package manager, a different version of python, and a different config file to set all of that up.
Most of the time, I just have a random .py file somewhere. Sometimes it's a full project that I can look at to find out what package manager it's using. Sometimes it has instructions; most of the time it doesn't. _That's_ the situation I struggle with.
Do I just run ./script.py? python script.py? python3 script.py? python3.12 script.py? When I inevitably miss some dependencies, do I just pip install? python -m pip install? pipx install?
As a developer I'm sure that you just set it up and forget about it. And once things work, they probably keep working for you. But man, it really reflects negatively upon Python itself for me. I don't hate the language, but I sure hate the experience.
I believe what is missing is a way of distributing apps. You face similar issues if you get the C++ source of a random program - there are quite a few build systems in use! However, the compiled program can often just be zipped and shipped, somehow.
The C/C++ ecosystem is a bit saner, but requires more expertise to fix. As long as you figure out the build process, you can usually rely on the distro packages. For Node and Rust, people really like to use the latest version rather than the LTS one for their software.
uv solves this (with some new standards, like inline script metadata). Running ./script.py will now install the right python version, create a venv, and install the dependencies (very quickly) if they don't already exist.
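To make that concrete, here is a minimal sketch of the inline-script-metadata style (PEP 723) that uv understands; the shebang, the requests dependency, and the URL are just illustrative choices, not anything prescribed by uv:

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "requests",
# ]
# ///
"""uv reads the metadata block above, fetches a matching Python if needed,
builds a cached venv, and installs the dependencies before running the code."""

import requests  # provided by the ephemeral environment uv creates, not a global install

resp = requests.get("https://peps.python.org/api/peps.json", timeout=10)
resp.raise_for_status()
print(f"Fetched metadata for {len(resp.json())} PEPs")
```

With a header like that, `chmod +x script.py && ./script.py` (or `uv run script.py`) is all a coworker needs; there's no separate venv to create or requirements file to keep in sync.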
Also, I understand the appeal of mise, but I personally just prefer using uv for python and bun for typescript, both of which can run any version of python/node (or node-compatible code).
I still like the project though, but I tried to install elixir with it and it was a mess, man.
Your comment shows the sad state of software quality these days. Rust is the same: move fast and break things. And lately Mesa has started to suffer from the same disease. These days you basically need the same build env as the one on the developer's machine or the build will fail.
I was trying to install Stable Diffusion just yesterday. They use Conda, so I installed it and tried to follow the instructions. First, the yaml file they provided was not valid. Following the commands to install the packages explicitly failed because my Rust toolchain was old, so I updated it, only for some other Rust dependency to fail to build; it didn't even compile. Such a shit show.
Bad software quality is when you update your software frequently.
Instead, we should always choose the common denominator of the most obsolete software platform imaginable. If there is an OS that has not been maintained for several decades, then that is the baseline we should strive to support.
Using an operating system with old libraries and language runtimes is not a personal preference with the consequences of restricting oneself to older software versions, no, it is a commandment and must not be questioned.
Please no, I have to deal with old (but still supported) RHEL versions, this is definitely not the way to go.
You have to use ancient C++ standard versions, deal with bugs in libraries that were fixed years ago, lose out on all kinds of useful improvements, or you end up retrofitting a modern toolchain onto an old system (but you still have to deal with an old glibc).
It’s hell. Just make the tooling/QA good enough so that everyone can run on the latest stable OS not too long after it’s released.
I think I have a similar experience in some ways, but building python from source should work on Linux in my experience. On a Debian-ish system I'd expect that apt-installing build-essential and the libraries you need would be enough. I've done it with some pain on Red Hat-ish distros, which have tended to ship with python versions older than I have experience with. (I guess it's better these days..?)
I started at about the same time you did, and I've never seen an instance of software expecting a Python version newer than what is in Debian stable. It happens all the time for Node.js, Go, or Rust though.
I felt like python packaging was more or less fine, right up until pip started to warn me that I couldn't globally install packages anymore. So now I need to make a billion venvs to install the same ML and plotting libraries and dependencies, which I don't want in a requirements.txt for the project.
I just want packaging to fuck off and leave me alone. Changes here are always bad, because they're changes.
I'd otherwise agree but this problem seems unique to Python. I don't have problems like this with npm or composer or rubygems. Or at least very infrequently. It's almost every time I need to update dependencies or install on a new machine that the Python ecosystem decides I'm not worthy.
I think pip made some poor design choices very early, but pip stuck around for a long time and people kept using it. Of course things got out of control, and then people kept inventing new package managers until uv came along. I don't know enough about Python to understand how people could live with that for so long.
Honestly, until uv I thought this was the only sane way to package a python app. Now it's still the only sane way, and I'll use uv in the Dockerfile, which is honestly more complicated than their docs (or reason) would suggest.
No, they're saying that they have one venv they use for everything (i.e., it's basically a working "pip install --user").
I think it's a good thing that pip finally decided to refuse to overwrite distro package paths (the alternative is far worse), but breaking "pip install --user" along with it doesn't make sense.
Or, for that matter, that the ones they do have are compatible with packages that come from other places. I've seen language libraries get restructured when OS packagers got hold of them. That wasn't pretty.
I've walked the same rocky path and have the bleeding feet to show for it! My problem is that now my packaging/environment mental model is so muddled I frequently mix up the commands...
What's wrong with just using virtualenv? I never used anything else, and I never felt the need to. Maybe it's not as shiny as the other tools, but it just works.
The problem is you can do whatever you want in it, and then have no way of reproducing that.
pyproject.toml tries to fix that, though poetry kept wanting to use its own wonky names for the fields; I'm not sure why.
Once that is standardized, venvs should be cattle and not pets. That is all that is needed. uv makes that fast by hardlinking in the libraries and telling you the obvious (that venvs should be cattle and not pets).
I think poetry “lost” because they had to build ahead of the PEPs that standardized their way of doing things. I don’t think uv could exist without the work the poetry people put in; it served as a pretty powerful demonstration that better python packaging was possible.
There’s nothing wrong with just using virtualenv. I too have used virtualenv plus pip (and sometimes pyenv) for the longest time without issue.
However, uv is the first alternative that has tempted me to switch. uv offers performance improvements over pip and handles pyenv use cases as well. I’ll be paying attention to pyx to see how it pans out.
Nothing is inherently wrong with virtualenv. All these tools make virtual environments and offer some way to manage them. But virtualenv doesn't solve the problem of dependency management.
> But the dependency I need is built and packaged differently for different operating systems and flavors of GPU, and now my coworkers can't get the project to install on their laptops.
This is why containers are great IMO.
They're meant to solve the "well, it works on my machine" problem.
It is hilarious how Composer went from "haha, PHP has a package manager now, bless their hearts" to "why can't it work like Composer?"
I don't think PHP is perfect by any means but I don't dread using Composer, even in contexts like WP plugins where it is definitely not a "native" technology.
We actually built a platform that eliminates all these steps; you can now reproduce GitHub repos with zero manual config in 60% of cases. Check https://x.com/KeploreAI for more info.
We just launched it and are waiting for our first users to be astonished :). Let me know if you have any questions.