There is no such thing. Tell me, which combination of the 15+ virtual environment, dependency management, and Python version management tools would you use? And how would you prevent "project collision" (where one Python project bumps into another and one just stops working)? Example: SSL library differences across projects are a notorious culprit.
Python is garbage and I don't understand why people put up with this crap unless you seriously only run ONE SINGLE Python project at a time and don't care what else silently breaks. Having to run every Python app in its own Docker image (the only real solution to this, unless you want to learn Nix, which you really should, because its determinism makes it better, though it entails its own set of issues) is not a reasonable compromise.
This is incoherent to me. Your complaints are about packaging, but the Elixir wrapper doesn't deal with that in any way -- it just wraps uv, which you could use without Elixir.
What am I missing?
Also, typically when people say things like
> Tell me, which combination of the 15+ virtual environments, dependency management and Python version managers
It means they have been trapped in a cycle of thinking "just one more tool will surely solve my problem", instead of realising that the tools _are_ the problem, and if you just use the official methods (virtualenv and pip from a stock python install), things mostly just work.
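For what it's worth, the stock workflow described above is short enough to show in full (package name and paths are illustrative):

```shell
# Everything here ships with a stock CPython install; no extra managers.
python3 -m venv .venv     # create an isolated environment in ./.venv
. .venv/bin/activate      # activate it (.venv\Scripts\activate on Windows)
pip install requests      # installs into .venv only, not system-wide
deactivate                # leave the environment
```

Each project gets its own `.venv`, so two projects' dependencies never see each other.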
I agree. Python certainly had its speedbumps, but it's utterly manageable today and has been for years and years. It seems like people get hung up on there not being 1 official way to do things, but I think that's been great, too: the competition gave us nice things like Poetry and UV. The odds are slim that a Rust tool would've been accepted as the official Python.org-supplied system, but now we have it.
There are reasons to want something more featureful than plain pip. Even without them, pip+virtualenv has been completely usable for, what, 15 years now?
I've seen issues with pip + virtualenv (SSL lib issues, IIRC). I've always used those at minimum and have still run into problems. (I like to download random projects to try them out.) I've also seen Python projects silently go stale and stop working, or walk over other Python projects, because pip + virtualenv does NOT encompass all Python deps down to the metal. It also means you can't have two command-line Python apps available in the same shell environment, because PATH has to prefer one or the other at some point.
Here's a question: if you don't touch a project for a year, do you expect it to still work, or not? If your answer is the latter, then we simply won't see eye to eye on this.
That's not good enough. If I'm in the business of writing Python code, I (ideally) don't want to _also_ be in the business of working around Python design deficiencies. Either solve the problem definitively or don't try to solve it at all, because the middle road just leads to endless headaches for people WHILE ALSO disincentivizing a better solution.
Node has better dependency management than Python, and that's really saying something.
I don't see why it should be so binary. I said it "mostly" just works because there are no packaging systems which do exactly what you want 100% of the time.
I've had plenty of problems with node, for example. You mentioned nix, which is much better, but also comes with tons of hard trade-offs.
If a packaging tool doesn't do what I want, but I can understand why, and ultimately the tool is not to blame, that's fine by me. The issues I can think of fit reasonably well within this scope:
- requirement version conflicts: packages are updated by different developers, so sometimes their requirements might not be compatible with each other. That's not pip's fault, and it tells you what the problem is so you can resolve it.
- code that's not compatible with updated packages: this is mainly down to requirement versions which are specified too loosely, and not the fault of pip. If you want to lock dependencies to exact versions (like node does by default) you can do this too (with requirements.txt). It's a bit harsh to blame pip for not doing this for you; it's like blaming npm for not committing your package-lock.json. It would be better if your average Python developer were better at this.
- native library issues: some packages depend on you having specific libraries (and versions thereof) installed, and there's not much that pip can do about that. This is where your "ssl issues" come from. This is pretty common in python because it's used so much as "glue" between native libraries -- all the most fun packages are wrappers around native code. This has got a lot better in the past few years with manylinux wheels (which include native libraries). These require a lot of non-python-specific work to build, so I don't blame pip where they don't exist.
It's not perfect, but it's not a big enough deal to rant about or reject entirely if you would otherwise get a lot of value out of the ecosystem.
The thing is, most people who are writing python code are not in the business of writing python code. They're students, scientists, people with the word "business" or "analyst" in their title. They have bigger fish to fry than learning a different language ecosystem.
It took 30 years to get them to switch from Excel to Python. I think it's unrealistic to expect them to switch away from Python any time soon. So for better or worse, these are problems that we have to solve.
> at least be able to use Python, but in a very controlled, not-insane way
That's funny: about 10 years ago I started my career at a startup that ran Python business logic under Erlang (via a custom connector), which handled supervision and task distribution, and it looked insane to me at the time.
Even today I think it can be useful, but it is very hard to maintain, and containers are a good enough way to handle Python.
> containers are a good enough way to handle python
I disagree. My take is that they are an ugly enough way to handle Python. Among other problems, they don't let you easily mess with the code (one of many reasons this is ugly). Need access to something stateful from the containerized app? That's another PITA.
Was so glad when the Elixir guys came out with this recently, to at least be able to use Python, but in a very controlled, not-insane way: https://dashbit.co/blog/running-python-in-elixir-its-fine