I recently watched a talk by the author of uv that was surprisingly fascinating [1]. He goes into a few of the more notable hacks that they had to come up with to make it as fast as it is. The most interesting thing for me was that package resolution in Python, given the constraints defined (e.g. in requirements.txt), maps to a boolean satisfiability problem, which is NP-complete. So uv uses a custom SAT solver to do this. I totally under-appreciated how much goes into this software, and I'm bummed I have to use Poetry at work after having watched this talk.
[1] https://www.youtube.com/watch?v=gSKTfG1GXYQ
edit: NP-complete, not NP-hard
There are such cases in uv as well, and I’ve hit them quite often when I didn’t specify lower bounds (especially for boto3).
thundergolfer 4 days ago [-]
In our internal benchmarks miniconda is about as fast as uv at installing torch. How does uv's sat solver compare?
https://www.anaconda.com/blog/a-faster-conda-for-a-growing-c...
thundergolfer 4 days ago [-]
Kind of an aside as this doc is about the complexities of installing particular PyTorch versions, but I will say that uv is way faster at installing PyTorch than pip.
We run internal benchmarks of our custom container image builder and in the 'install torch' benchmark the p50 time saved when using `uv` is 25 seconds! (71.4s vs. 43.74s)
---
Aside 2: Seems there's a missing "involves" in this sentence: "As such, installing PyTorch typically often configuring a project to use the PyTorch index."
cik 4 days ago [-]
Joining your aside to tout the benefits of uv. We use uv combined with a simple proxy I wrote to cache Python dependencies, and then install them in parallel. uv also makes it simple to regenerate a requirements file and know who requires the dependencies, which in turn makes it easy to manage the ecosystem, analyze packages, and determine if we can reduce our footprint.
Between that latter feature, the proxy, and the parallelization, we've reduced build times across ~100 engineers by a solid 10 minutes. There are other things we do as well, but uv is a must-have nowadays.
freetonik 4 days ago [-]
Is your caching proxy open source?
cik 4 days ago [-]
It's just nginx. Here's a link to something someone did. It's close enough to be honest, unless you have our specialized needs.
https://github.com/hauntsaninja/nginx_pypi_cache
Also, use pyproject.toml to specify dependencies rather than manually installing stuff with uv pip.
ttyprintk 4 days ago [-]
Apparently, uv respects a key in pyproject.toml:
[[tool.uv.index]]
name = "pytorch-cu124"
url = "https://download.pytorch.org/whl/cu124"
explicit = true
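If I read the uv docs right, explicit = true means the index is only used for packages pinned to it, so you'd pair it with a pin along these lines (a sketch reusing the index name above):
[tool.uv.sources]
torch = { index = "pytorch-cu124" }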
gdiamos 4 days ago [-]
Nice, I wonder if there is a way to make it conditional, e.g. to pick a different key for a cpu vs cuda build.
kristjansson 4 days ago [-]
That’s basically what TFA is about? Set up multiple index urls, and use a bog-standard platform marker on the PyTorch dep to pick between them.
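Roughly along these lines, I believe (the pytorch-cpu index and the platform markers here are illustrative; TFA has the exact recipe):
[tool.uv.sources]
torch = [
  { index = "pytorch-cpu", marker = "sys_platform == 'darwin'" },
  { index = "pytorch-cu124", marker = "sys_platform == 'linux'" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true
With something like that, macOS resolves torch from the CPU index and Linux from the cu124 index defined earlier in the thread.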
ttyprintk 4 days ago [-]
Eh, in that case the environment variable is a bit more evident in a Dockerfile.
https://docs.astral.sh/uv/guides/integration/docker/#install...
samtheprogram 4 days ago [-]
Speeds up installation, or speeds up PyTorch in general?
gleenn 4 days ago [-]
Almost certainly only the install. uv is basically a pip substitute with a few other bells and whistles, but it shouldn't affect run time whatsoever.
pcwelder 4 days ago [-]
I just want to thank people behind uv. The tool is just amazing for development, packaging and running packages. And it's blazing fast!
minimaxir 4 days ago [-]
So uv caused a bit of an issue with me installing PyTorch over the weekend.
When installed with brew on my MacBook, uv currently has Python 3.13 as a dependency, which is fine. But PyTorch does not currently have a stable wheel that's compatible with Python 3.13! This resulted in very confusing errors. (Solution was to point to the Nightly index.)
That's technically PyTorch's fault, but it's indicative of why a specific page on installing PyTorch is necessary, and it's good to know the documentation specifically calls it out.
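For anyone hitting the same thing, pointing uv at the nightly index looks roughly like this, as far as I can tell (the cpu channel is just an example; there are per-CUDA-version nightly channels too):
[[tool.uv.index]]
name = "pytorch-nightly"
url = "https://download.pytorch.org/whl/nightly/cpu"
explicit = true
With explicit = true you'd also pin torch to that index under [tool.uv.sources]; drop it if you want the index consulted for everything.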
the_mitsuhiko 4 days ago [-]
Fun fact: that's called out in the doc you're commenting on :)
0cf8612b2e1e 4 days ago [-]
I have run into multiple package problems with 3.13, with non-obvious root-cause error messages. Thankfully, uv makes it trivial to switch to 3.12.
zanie 4 days ago [-]
I work on our error messages, feel free to open an issue and we'll do our best to make it clearer
emmanueloga_ 4 days ago [-]
Hey there, I experienced a hairy error message recently too while trying to install aider-chat from PyPI with Python 3.13 and Pixi (but I was told the error was coming from uv).
"Solution": `pixi add python=3.12`, then `pixi add --pypi aider-chat` succeeds without issues.
A message like "aider-chat seems to be incompatible with python=3.13, try downgrading to python-3.12" would be great, assuming this is really the case.
I can create an issue if it helps, but really quick here's the error I was getting and some talk on discord about it:
https://pastebin.com/UVkFstJH
https://discord.com/channels/1082332781146800168/12957237931...
Thanks! There's definitely room for improvement there. I'll see what we can do — in general it's a bit of an arcane task to extract clear suggestions from the resolver's error tree.
emmanueloga_ 3 days ago [-]
Sounds good, thank you!
breuleux 4 days ago [-]
I was trying to figure out how to set up a pyproject with uv that could support CUDA, ROCm and other device types this morning, and the next thing I knew, there was a new release adding pretty much exactly what I needed.
The pace of development on uv is really impressive.
mrbonner 4 days ago [-]
Does uv support global Python install now? I need something like Mise for this.
cbenz 4 days ago [-]
Yes via the "--system" option of the different commands.
Or via the global "python-preference" option set to "only-system".
Cf. https://docs.astral.sh/uv/concepts/python-versions/#adjustin... and https://docs.astral.sh/uv/reference/settings/#python-prefere...
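In pyproject.toml that presumably looks something like this (a sketch; the settings reference above is authoritative):
[tool.uv]
python-preference = "only-system"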
That is not global. From the uv getting started docs:
"When Python is installed by uv, it will not be available globally (i.e. via the python command). Support for this feature is planned for a future release. In the meantime, use uv run or create and activate a virtual environment to use python directly."
So yes, one needs mise/asdf/pyenv or similar for global installs for now.
The work can be tracked in https://github.com/astral-sh/uv/issues/6265
JimDabell 4 days ago [-]
You could always just `alias python="uv run python"`
chocolight 4 days ago [-]
I've read that Torch was dropping their Conda support, but won't everybody just move to Mamba, which is a drop-in replacement for Conda?
Conda (and Mamba) lets you avoid duplicating packages on disk between environments (not just the downloaded archives, but the resulting expanded files too).
How does uv compare in this regard?
alex_suzuki 4 days ago [-]
In a nutshell, what do I gain from switching to uv from my current workflow, which is:
1) create a venv (`python3.xx -m venv venv`)
2) install packages from a requirements.txt into that venv?
One limitation I know of is the inability to detect stale packages.
Apart from "blazing fast", which I'm not convinced really matters to me as I rarely touch the dependencies, what are the main reasons why uv is gaining traction?
reubenmorais 4 days ago [-]
You get correct version resolution (checking compatibility across the entire tree of deps-of-deps, including against different Python versions) and a lock file which represents a global state of the entire tree and gives you reproducibility of a working setup.
gugagore 4 days ago [-]
Does pip-compile do the same? Or what's the difference?
reubenmorais 4 days ago [-]
pip-compile, poetry, Pipenv, et al. all try to do roughly the same thing, with various caveats and design differences (e.g. Pipenv is not meant to be used for libraries, only applications). uv is the latest kid on the block.
infamia 3 days ago [-]
You might gain a couple of things...
No need to activate venvs and the almost inevitable Python pathing "murder mysteries". uv installs in venvs first, and only then someplace else (e.g., globally).
No more clunky typing `python -m pip install foo` even when you have activated your venv (or you think you have). `uv pip install foo` is nicer and easier to remember.
uv add will add new dependencies to your pyproject.toml so you don't have to.
uv can set up skeletons for new projects in a nice, modern way
For older projects, you can have uv resolve dependencies as of a certain date (see the sketch after this list). I imagine this is great for older projects, especially with numerous dependencies.
It might remove the need for pyenv or the need to rely on your system provided Python, since uv can install Python for your project.
Cross-platform lock files
I've just started looking into uv, so maybe my list isn't complete/very good. Some downsides: it's still green (it naturally has some bugs and lacks some features), and some might not trust/like that it's VC-backed.
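On resolving dependencies as of a certain date: if I'm reading the settings docs right, that's the exclude-newer option, e.g. (the timestamp is just an example):
[tool.uv]
exclude-newer = "2023-10-01T00:00:00Z"
There's also a matching --exclude-newer flag on the CLI, I believe.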
alex_suzuki 2 days ago [-]
> For older projects, you can have uv resolve dependencies as of a certain date. I imagine this is great for older projects, especially with numerous dependencies.
Interesting. I imagine this is a selling point for corporate environments.
imjonse 4 days ago [-]
One reason is that it is not just for project venv/deps management but can replace other tools like pipx and pyenv for most scenarios.
alex_suzuki 4 days ago [-]
PS: one thing I like about my current workflow is that no extra tools are needed, a base Python install is all that's required.
JimDabell 4 days ago [-]
It’s similar with uv. You have exactly one dependency on the host system – it’s just uv instead of Python. uv will then obtain the correct version of Python for your project. And uv is easier to install than Python – it’s literally just one binary.
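For example (a sketch, with placeholder name and version): with just this in pyproject.toml, `uv run` or `uv sync` should fetch a matching interpreter if none is installed, as I understand uv's default behavior:
[project]
name = "example-app"
version = "0.1.0"
requires-python = ">=3.12"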
alex_suzuki 3 days ago [-]
Interesting, thanks for pointing that out. Will give it a try.
antman 4 days ago [-]
Has anyone succeeded in packaging a uv PyTorch env into a PyInstaller exe or similar? I'm having a hard time, but I assume it can be automated.
Mxbonn 4 days ago [-]
Now that PyTorch is also ending their anaconda package distribution, I think a lot of ml/ds people should give uv a shot.
nickysielicki 4 days ago [-]
I had no idea this was happening. This comment just made my week. Thank God.
paradite 4 days ago [-]
I was just getting used to the pipenv and pyenv combo.
Is this worth switching to?
silvester23 4 days ago [-]
I would also say absolutely. We've been using pipenv for ~6 years and have managed to build a pretty good workflow around it. But uv is just _so much faster_. So we've started moving everything over to uv and I don't think we'll ever look back.
Migrating is not super hard; we wrote a small script that moves all the information from a Pipfile to a pyproject.toml, and it works like a charm.
jpalomaki 4 days ago [-]
I've been using a combination of pyenv, venv and Poetry in the past.
Now I have switched to uv with new projects. No problems so far. I definitely recommend giving it a go.
baggiponte 4 days ago [-]
Absolutely. No second thoughts.
cbenz 4 days ago [-]
Absolutely. It changed my Python developer life \o/
fernandotakai 4 days ago [-]
yes, 100%. it's not only faster, but it makes sense.
"When Python is installed by uv, it will not be available globally (i.e. via the python command). Support for this feature is planned for a future release. In the meantime, use uv run or create and activate a virtual environment to use python directly."
So yes, one needs mise/asdf/pyenv or similar for global installs for now.
Conda (and Mamba) allows to avoid duplicating packages on the disk between environments (not just the downloaded archives, but the resulting expanded files too).
How does uv compare in this regard?
One limitation I know of are the inability to detect stale packages.
Apart from „blazing fast“, which I‘m not convinced it really matters to me as I rarely touch the dependencies, what are the main reasons why uv is gaining traction?
No need to activate venvs and the almost inevitable Python pathing "murder mysteries". uv installs in venvs first, and only then someplace else (e.g., globally).
No more clunky typing `python -m pip install foo` even when you have activated your venv (or you think you have). `uv pip install foo` is nicer and easier to remember.
uv add will add new dependencies to your pyproject.toml so you don't have to.
uv can setup skeletons for new projects in a nice, modern way
For older projects, you can have uv to resolve dependencies as of a certain date. I imagine this is great for older projects, especially with numerous dependencies.
It might remove the need for pyenv or the need to rely on your system provided Python, since uv can install Python for your project.
Cross-platform lock files
I've just started looking in to uv, so maybe my list isn't complete/very good. Some down sides include, it's still green (has some bugs naturally and lacks some features) and some might not trust/like that it's VC backed.
Interesting. I imagine this is a selling point for corporate environments.
Is this worth switching to?
Migrating is not super hard, we wrote a small script that moves all the information from a Pipfile to a pyproject.toml and it works like a charm.
Now I have switched to uv with new projects. No problems so far. I definitely recommend giving it a go.