Obligatory “there are now 15 competing standards”
For real though, this looks interesting. I'm a long-time Poetry user and I've been mostly happy with it, but I do think it could stand to be a little faster. I'll have to try this out sometime.
Poetry support is on their roadmap!
What’s that mean, like they aim to become a drop-in replacement for poetry too? Or make uv able to work with a poetry-style pyproject.toml? I couldn’t find any info about that.
Hmm, I just re-read the blog post and GitHub where I thought I read that and I think I was mistaken…
uv is fantastic. I would highly recommend it. I've used it in a quite complex environment with no issues (quite an achievement!) and it's about 10x faster than pip.

I mean... I guess it's not surprising given uv is written in Rust and pip is written in Python, but even so, given pip is surely IO-bound, I was expecting something like a 4x improvement. 10x is impressive.
The actual dependency resolution part, i.e. figuring out which versions of the dependencies can be used together, is notoriously CPU-bound.
At least as far as I'm aware, you generally use a SAT solver for dependency resolution (unless you don't care for correctness), and as Wikipedia puts it:
Boolean satisfiability is an NP-complete problem in general. As a result, only algorithms with exponential worst-case complexity are known.
There are quite sophisticated algorithms at this point, making use of heuristics and whatnot, but they're still just backtracking algorithms at their core. And as Wikipedia puts it so fittingly again:
backtracking is often much faster than brute-force enumeration
You know shit's inefficient when the best thing to compare it to is just randomly trying solutions.
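To make the backtracking idea concrete, here's a toy sketch of the core loop a resolver runs. The package index and version constraints below are made up, and real resolvers (pip's resolvelib, uv's PubGrub-based resolver) layer heuristics and conflict-driven learning on top of this basic shape:

```python
# Toy backtracking resolver: the index and constraints are invented.
# name -> {version: {dependency_name: set_of_allowed_versions}}
INDEX = {
    "app":  {1: {"web": {1, 2}, "db": {1, 2}}},
    "web":  {1: {"json": {1}}, 2: {"json": {2}}},
    "db":   {1: {"json": {1}}, 2: {"json": {1}}},
    "json": {1: {}, 2: {}},
}

def resolve(pending, chosen):
    """Pick a version for each pending requirement, backtracking on conflict."""
    if not pending:
        return chosen
    name, allowed = pending[0]
    for version in sorted(INDEX[name], reverse=True):  # prefer newest
        if version not in allowed:
            continue
        if name in chosen and chosen[name] != version:
            continue  # conflicts with an earlier choice
        new_pending = pending[1:] + [
            (dep, versions) for dep, versions in INDEX[name][version].items()
        ]
        result = resolve(new_pending, {**chosen, name: version})
        if result is not None:
            return result  # success somewhere down this branch
    return None  # every candidate failed: backtrack

print(resolve([("app", {1})], {}))
# {'app': 1, 'web': 1, 'db': 2, 'json': 1}
# (web 2 is tried first but forces json 2, which clashes with db's need
#  for json 1, so the resolver backs up and settles on web 1.)
```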
Interestingly, dependency resolution is not the only NP-hard problem uv tries to solve. During development, it also became clear that we needed some way to simplify PEP 508 marker expressions and ask questions like, "are these marker expressions disjoint?"
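For anyone who hasn't run into them, PEP 508 markers are the little environment conditions you can attach to a dependency. A quick illustration using the packaging library (the marker strings here are invented examples):

```python
# Illustration only: the marker strings are made up, but the syntax is
# standard PEP 508; `packaging` is the reference implementation.
from packaging.markers import Marker

a = Marker('python_version >= "3.9" and sys_platform == "linux"')
b = Marker('python_version < "3.8"')

# Evaluating a marker against one concrete environment is easy:
print(a.evaluate({"python_version": "3.11", "sys_platform": "linux"}))  # True

# But deciding whether two markers are disjoint (no environment satisfies
# both, as with `a` and `b` here) means reasoning over every possible
# environment, which is where the NP-hard flavour comes in.
```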
you generally use a SAT solver for dependency resolution (unless you don’t care for correctness)
Actually, Go's dependency system is specifically designed to avoid the need for global constraint solvers. Go has the most modern and elegant dependency versioning system that I'm aware of. Python was designed before people realised that its dependency style was a mistake.
I'm on the uv team. I am quite partial to this approach as well. Alas, it's difficult culturally to pull this off in a pre-existing ecosystem. And in the case of Python at least, it's not totally clear to me that it would avoid the need for solving NP-hard problems. See my other comment in this thread about simplifying PEP 508 marker expressions.
Other than avoiding the need for a SAT solver to resolve dependencies, the other thing I like about Go's approach is that it makes it very difficult to "lie" about the dependencies you support. In a maximal environment, it's very easy to "depend" on foo 1.0 while actually needing foo 1.1, without issues appearing immediately.
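For anyone who hasn't seen it, the Go approach being discussed is minimal version selection (MVS): every module declares the minimum version it needs of each dependency, and the build takes the maximum of those minimums, so no solver or backtracking is required. A toy sketch of the idea, with a made-up module graph:

```python
# Rough sketch of Go-style minimal version selection (MVS).
# The module graph is invented; versions are plain integers for clarity.
# Each (module, version) states the *minimum* version it needs of each dep.
REQUIRES = {
    ("app", 1): {"log": 2, "net": 1},
    ("log", 2): {"fmt": 1},
    ("log", 3): {"fmt": 2},
    ("net", 1): {"log": 3},
    ("fmt", 1): {},
    ("fmt", 2): {},
}

def mvs(root):
    """Walk the requirement graph, keeping the highest minimum seen per module."""
    selected = {root[0]: root[1]}
    stack = [root]
    while stack:
        module = stack.pop()
        for dep, min_version in REQUIRES[module].items():
            if min_version > selected.get(dep, 0):
                selected[dep] = min_version
                stack.append((dep, min_version))
    return selected

print(mvs(("app", 1)))
# {'app': 1, 'log': 3, 'net': 1, 'fmt': 2}
# No backtracking: each answer is just the max of the stated minimums.
```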
Oo hello. Didn't know that's what you were doing these days! Hope it goes well, though I'd be nervous about a realistic business plan.
Anyway, yeah bit too late for Python.
Having used it for work, I really don't understand the appeal, especially when compared to tools like Poetry. Uv persists in the dependency on requirements.txt, doesn't streamline the publishing process, and contrary to the claims, it's not a drop-in replacement for pip, as the command line API is different.
It's really fast, which is nice if you're working on a nightmare codebase with 3000 dependencies, but most of us aren't, and Poetry is pretty damned fast.
If uv offered some of what Poetry does for me, if at the very least we could finally do away with requirements.txt and adopt something more usable (baked into pyproject.toml, of course), then I'd be sold. But this is just faster pip.
Early on uv was only trying to replace pip. This latest update is a big step towards becoming a poetry (and pyenv/pipx) replacement too.
Now if they could just help defuckulate the PyPI search problem.
uv 0.3 introduces a cross platform lock file: https://docs.astral.sh/uv/concepts/projects/#lockfile
More precise details on the compatibility of uv pip with pip are documented here: https://docs.astral.sh/uv/pip/compatibility/
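Since the lockfile is plain TOML, it's also easy to inspect from a script. A minimal sketch, assuming the [[package]] name/version layout shown in the linked lockfile docs (check them for the actual schema):

```python
# Minimal sketch of peeking inside uv.lock, assuming a [[package]] array
# of tables with "name" and "version" keys. tomllib needs Python 3.11+.
import tomllib

with open("uv.lock", "rb") as f:
    lock = tomllib.load(f)

for pkg in lock.get("package", []):
    print(f"{pkg['name']}=={pkg['version']}")
```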
It's written in Rust.
All jokes about the Rust Evangelism Strike Force aside, various parts of the industry are finally starting to think that "If it's written in Rust, we have less to worry about with respect to that thing, so we won't torture the devs and force them to sneak it in the side door anyway."
It's a thing that I've been seeing at work for the last few years.
uv is currently only a pip replacement as a dependency resolver (and downloader); it was actually adopted by Astral from a different dev, afaik.
Their vision is to evolve it into a "Cargo for Python", so it's coming.
Is that a real problem? I've never considered that a python package manager should be or could be faster.
To be fair, I don't use python professionally.
definitely not the real reason for a project like this to exist. Python package management can be nightmarish at times depending on what you're doing. between barebones requirements.txt, Poetry, and the different condas there's a ton of fragmentation, and none of them do everything you'd want in an ideal way. above and beyond speed, i think uv is another attempt at it. but it could just be another classic xkcd moment where now there's just another standard to deal with
uv is a drop-in replacement for pip. There's no extra standard. It's Pareto better. Honestly, the Python community would do the world a favour if they deprecated pip and adopted uv as the official tool, but you can guess how likely that is...
as you might have guessed i haven't really tried it, but i have been reading about it. that said i have used "drop-in replacement" tools like this (we use pnpm at work), and a drop-in replacement is not without quirks. they wouldn't have made a different tool altogether if it was really a 1:1 replacement. just because the commands are the same doesn't mean it behaves the same. i.e. i doubt one person on the team could be using uv while everyone else sticks to pip
they wouldn’t have made a different tool altogether if it was really a 1:1 replacement
Why not? It's 10x faster.
I think it might have some other new features but you don't need to use those.
i doubt one person on the team could be using uv while everyone else sticks to pip
This is exactly what we do at work. There's no way I could convince everyone to switch to uv, so I just switch between them based on an environment variable.
It even supports random stuff like pip install --config-settings editable_mode=compat --editable foo, which is required for static tooling to work (e.g. Pyright).
Yes. For the project I work on, pip install takes about 60 seconds and replacing it with uv reduces that to about 7 seconds. That's a very significant improvement. Much less annoying interactively, and in CI we do this multiple times so it saves a significant chunk of time.
Just out of curiosity, how often do you have to run pip install?
I dunno, maybe once a week or so? We don't actually have a system that detects if your pip install is out of sync with pyproject.toml yet, so I run it occasionally just to make sure.

And it runs in CI around a dozen times for each PR. Yeah, not ideal, but there are goodish reasons which I can explain if you want.
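For what it's worth, the kind of sync check I mean would be something like this rough sketch, assuming dependencies are listed under [project] dependencies in pyproject.toml (extras, environment markers, and name normalization are glossed over; tomllib needs Python 3.11+):

```python
# Rough sketch of an "is my environment out of sync with pyproject.toml?"
# check. It only compares presence and version of each listed dependency.
import tomllib
from importlib import metadata
from packaging.requirements import Requirement

with open("pyproject.toml", "rb") as f:
    deps = tomllib.load(f)["project"].get("dependencies", [])

problems = []
for dep in deps:
    req = Requirement(dep)
    try:
        installed = metadata.version(req.name)
    except metadata.PackageNotFoundError:
        problems.append(f"{req.name}: not installed")
        continue
    if not req.specifier.contains(installed, prereleases=True):
        problems.append(f"{req.name}: have {installed}, need {req.specifier}")

print("\n".join(problems) or "environment matches pyproject.toml")
```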
No, that makes perfect sense. Thank you for explaining.
I like hearing about other people's environments, because it gives perspective.
The performance is just a "nice to have".
Python package management, especially at scale, is infuriating. At work we use Python microservices in Docker containers and it infuriates me trying to update the one our team is responsible for.

I always like to rant that Python third-party package management tools are a mistake. We should've gone for an "as simple as possible" setup instead of all this.

So I'm sceptical of uv on principle, since it's yet another third-party package manager, but if it can do all of this and not be a nightmare I'll be ok with it.
I think the main focus is on building out the toolchain. I would think being fast is just a side benefit, and the main benefit is that it's written in the same language as what they want to use for the rest of the "Cargo for Python" tooling.
uv is now capable of installing and managing Python itself, making it entirely self-bootstrapping:
Looking forward to this. One of Poetry's blind spots was ignoring the issue of managing Python versions themselves. I'm happy to see they're covering so many aspects of dependency management and reproducibility.
This is incredible. Truly hats off to the folks at Astral. Can't wait to try all this out and replace all our old bespoke tooling.
Isn't uv being used as a package manager/resolver in rye? I'm using rye for my new projects and it's nice because ruff and pytest are being unified in it too.
Yeah, it is. Eventually they want uv to have feature parity with Rye, and Rye will basically just be a pointer to uv.
Rye's developer on their plans for Rye in the context of uv's latest release:
Very impressive results. I think I’ll give the tool a try next time we’re working on a small project. I’m dissatisfied with the existing packaging solutions.
This is great!
@burntsushi@programming.dev, do you know if Astral is working with prefix.dev and their Pixi project? They seem to now have overlapping concerns.
I don't think they have anything to do with each other; it looks like prefix.dev uses conda packages.
Conda is their primary focus, but they support a lot more than conda packages.
Looks nice. The edge cases will be what determines if it gains adoption