To be honest, this was just a matter of time. As a long-time Python developer, I just can't wrap my head around the lack of something like this. GitHub was going to add hosted packages for Python but never did because it "didn't align with their strategy objectives and a reallocation of resources" [1] (or some similar corpospeak).

Astral is a great company, and I don't think we can question what they've achieved and provided to the Python community. uv is a game changer: it solves one of the core issues with Python by providing a unified tool that's also fast, reliable, and easy to use. In fact, after using uv for the first time (coming from a combination of pyenv + poetry) I never wanted to go back, and all of my peers have had the same experience.

I'm glad it's Astral who is doing this. Of course they will have to make money one way or another, which is perfectly fine; I don't think anyone on this forum can be against that, as long as they are actually providing real value. But I was honestly tired of the paralysis on this matter. I did try to build a registry (pyhub.net), but being just one person with almost no resources and having another full-time business made it impossible.

Anyway, congrats to the team for the effort!

[1] https://github.com/orgs/community/discussions/8542
Not all Python packaging challenges are solved. The lesson learned is that there is no single solution for all problems. Getting more strings attached to VC-funded companies and leaning on their infrastructure is a high risk for any FOSS community.
> Why is it so hard to install PyTorch, or CUDA, or libraries like FlashAttention or DeepSpeed that build against PyTorch and CUDA?
This is so true! On Windows (and WSL) it is also exacerbated by some packages requiring the use of compilers bundled with outdated Visual Studio versions, some of which are only available by manually crafting download paths. I can't wait for a better dev experience.
This is effectively what Charlie said they were going to build last September when quizzed about their intended business model on Mastodon: https://hachyderm.io/@charliermarsh/113103564055291456
Astral folks that are around - there seems to be a bit of confusion in the product page that the blog post makes a little clearer.
> The next step in Python packaging
The headline is the confusing bit I think - "oh no, another tool already?"
IMO you should lean into stating this is going to be a paid product (answering how you plan to make money and become sustainable), and highlight that this will help solve private packaging problems.
I'm excited by this announcement by the way. Setting up scalable private python registries is a huge pain. Looking forward to it!
Soon: there are 14 competing Python packaging standards.
This is a joke, obviously. We've had more than 14 for years.
As I said a couple of weeks ago, they're gonna have to cash out at some point. The move won't be around uv; it'll be a protected private PyPI or something.
https://news.ycombinator.com/item?id=44712558
Now what do we have here?
What does GPU-aware mean in terms of a registry? Will `uv` inspect my local GPU spec and decide what the best set of packages would be to pull from Pyx?
Since this is a private, paid-for registry aimed at corporate clients, will there be an option to expose those registries externally as a public instance, but paid for by the company? That is, can I as a vendor pay for a Pyx registry for my own set of packages, and then provide that registry as an entrypoint for my customers?
The real pyx is an absolutely wonderful graphing package. It's like Tex in that everything looks wonderful and publication-quality.
Is there a big enough commercial market for private Python package registries to support an entire company and its staff? Looks like they're hiring $250k engineers, starting a $26k/year OSS fund, etc. Expenses seem a bit high if this is their first product, unless they plan on being acquired?
I'm brushing up on Python for a new job, and boy, what a ride. Not because of the language itself but because of the tooling around packages. I'm coming from Go and TS/JS, and while those two ecosystems have their own pros and cons, at least they're more or less straightforward to get onboarded with (there are one or two tools you need to know about). In Python there are dozens of tools/concepts related to packaging: pip, easy_install, setuptools, setup.py, PyPI, poetry, uv, venv, virtualenv, pipenv, wheels, ... There's even an entire website dedicated to this topic: https://packaging.python.org
I don't understand how a private company like Astral is leading here. Why is it so hard for the Python community to come up with a single tool to rule them all? (I know, https://xkcd.com/927/.) You could even copy what Go or Node are doing and make it Python-aware; no shame in that. Instead we get these who-knows-how-long-they-will-last tools every now and then.
They should remove the "There should be one-- and preferably only one --obvious way to do it." from the Python Zen.
Been waiting to see what Astral would do first (with regard to product). Seems like a mix of Artifactory and conda: Artifactory in providing a package server, and conda in trying to fix the difficulty that comes from Python packages with compiled components or dependencies. That's mostly solved by wheels, but PyTorch wheels requiring a specific CUDA version can still be a mess that conda fixes.
I lost track of how many different ways to install a Python library there are at the moment.
The real thing that I hope someone is able to solve is downloading such huge amounts of unnecessary code. As I understand it, the bulk of the torch binary is a huge nvfatbin compiled for every SM under the sun, when you usually just want it to run on whatever accelerator you have on hand. Even just making narrow builds like `pytorch-sm120a` (with stuff like thin cuBLAS binaries paired with it) available as a handy uv extra or something like that would make it much quicker and easier.
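To make the idea concrete, here is a minimal sketch of how a client could map a locally detected compute capability (e.g. from `torch.cuda.get_device_capability()` or `nvidia-smi`) to such a narrow build. The variant package names are hypothetical, echoing the `pytorch-sm120a` naming above; no such wheels exist today.

```python
# Hypothetical sketch: pick an architecture-specific wheel from a GPU's
# CUDA compute capability. The package names below are made up for
# illustration; only the capability-to-architecture pairs are real.
SM_VARIANTS = {
    (7, 0): "pytorch-sm70",     # Volta (V100)
    (8, 0): "pytorch-sm80",     # Ampere (A100)
    (8, 6): "pytorch-sm86",     # Ampere (RTX 30xx)
    (9, 0): "pytorch-sm90",     # Hopper (H100)
    (12, 0): "pytorch-sm120a",  # Blackwell (RTX 50xx)
}

def pick_wheel_variant(capability):
    """Return the narrow wheel for this GPU, or the fat build as a fallback."""
    return SM_VARIANTS.get(capability, "pytorch")
```

The fallback matters: an unrecognized capability still gets the fat binary, so a narrow-build scheme could be strictly additive rather than breaking anyone.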
It's interesting watching this part of the landscape heat up. For repos you've got stalwarts like Artifactory and Nexus, plus upstart Cloudsmith. For libraries you've got the OG ActiveState, Chainguard Libraries and, until someone is distracted by a shiny next week, Google Assured Open Source.
Sounds like Pyx is trying to do a bit of both.
Disclosure: I have interacted a bunch with folks from all of these things. Never worked for or been paid by, though.
Does anyone have insight into how this compares to Anaconda's approach? To me the two seem very similar: uv <-> conda, pyx <-> conda-forge.
Sure, Astral's products are remarkable and widely loved, but I would like to understand whether there's a USP beyond that.
Can I ask a dumb question? Why does Ruby (for example) not have this problem, while Python still can't ship a standard solution that isn't constantly changing and rolled up in some corporate offering?
I don’t know how I feel about one company dominating this space. I love what they do but what happens 5 years down the road?
I wonder whether it will have a flat namespace that everyone competes over or whether the top-level keys will be user/project identifiers of some sort. I hope the latter.
Just try using uv with Google Artifact Registry and AWS CodeArtifact: you need the username in the URL with GAR, and you need it in an environment variable with AWS. I'm not sure who's causing the pain of using uv with private repositories (though I'd bet it's uv), but they are selling a solution to that pain.
Full disclosure, uv rocks and is way better than poetry, setuptools and whatever complicated and non-zen way packaging python has had in the past.
Is this going to solve the combinatorial explosion of pre-building native dependencies for every possible target?
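The explosion is easy to make concrete with a back-of-the-envelope count. The axes below are illustrative, not exhaustive (real build matrices add OS versions, libc variants, and more), so the specific lists are assumptions:

```python
# Rough count of prebuilt-wheel combinations for a native package like
# PyTorch. Each axis is illustrative; real matrices are larger.
python_versions = ["3.9", "3.10", "3.11", "3.12", "3.13"]
platforms = ["manylinux_x86_64", "manylinux_aarch64", "macosx_arm64", "win_amd64"]
accelerators = ["cpu", "cu118", "cu121", "cu124", "rocm6"]

# Every combination is a separate build to compile, test, and host.
combinations = len(python_versions) * len(platforms) * len(accelerators)
print(combinations)  # 5 * 4 * 5 = 100 builds per release
```

And that is per release; multiply by a release cadence and the hosting and CI costs add up fast, which is why most projects prune the matrix instead of covering it.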
Python should get rid of its training wheels :^)
I really want to pay someone money to run package repo mirrors for me, but my problems have been more with npm than with PyPI. Astral, if you're listening... maybe tackle JS packaging too?
I personally use Nixpkgs to fully isolate the Python installation on a per-project basis. Inside this isolated env, once Python is built, pip is usually enough and works well.
This way, each of my repositories has its own Nix file defining the list of dependencies, and it can easily be built per system as well.
Pyx is just a registry, like PyPI, or did I misunderstand it?
The only thing that is unclear to me is to what extent this setup depends on the package publisher. PyPI might be terrible, but at least it just works when you want to publish. My worry is that this adds complexity for the people publishing free software, while the benefits go to consumers rather than maintainers.
Maybe they are only targeting dev tooling companies as a way to simplify how they distribute. Especially in the accelerated compute era.
When this releases it will be crazy. I've always wondered why something like this didn't already exist.
Really useful concept especially for school.
Cool idea! I think I could benefit from this at my job if they're able to eat Anaconda's lunch and provide secure, self-hosted artifacts.
If uv can install a Python version in 32ms (a time that I saw and confirmed just this morning), then sign me up.
Good on you guys!
I wanted to start a business exactly like this years ago, when I actually worked in Python. I ended up not doing so, because at the time (circa 2014-2015) I was told it would never take off, no way to get funding.
I'm glad you're able to do what ultimately I was not!
I think it's a good monetization strategy for the Astral team. I just hope that they don't lock uv to pyx, and that uv keeps working seamlessly with PyPI.
> There should be one-- and preferably only one --obvious way to do it.
Been waiting for something like this to make it easier to manage multi-package projects.
Astral is the coolest startup
My 2c when it comes to Python packaging:
1. Do dev in an isolated environment like Docker.
2. Install deps from config, i.e. build a new image when adding a dep.
I hate that they are using the pyx name; it's the extension for Cython files, and it's going to cause at least a moment of confusion. They could easily have checked for name collisions in the Python ecosystem, but they chose not to; that's like a middle finger to the community.
How do you pronounce "pyx"? Pikes, picks, pie-ex?
No thanks. For the majority of my use cases pip is just fine. I'm not here to chase time, just to live life.
Isn't all of this solved with containers (or virtual machines)? Just lxc/lxd/docker a new container, and no more worries about virtual environments or conda or whatever other bandaids. A container per project. A container per version of a project. Heck, a container per package in a project!
`uv` is incredible software and I love it. I am willing to try anything from this team.
I do not trust Astral.
Much ad language.
They do not explain what an installation of their software does to my system.
They use the word "platform".
Welp, I guess it's time to start pulling all the uv deps out of our builds and enjoy the extra 5 minutes of calm per deploy. I'm not gonna do another VC-fueled supply chain poisoning switcheroo under duress of someone else's time crunch to start churning profit.
How many Python packaging solutions do we need?
Thankfully I only use it for OS and application scripting and keep myself to the standard library and application exposed APIs as much as possible, even if there are marvelous alternatives outside the standard library.
Python packaging is the least Zen-of-Python thing about Python.
Using mypy regularly on a large project is pretty slow and painful, so I personally welcome any improvements, no matter the origin!
Who's going to write YAPP, yet another python packager? (Although I see the name is already taken for a different type of project.)
I feel like I must be the crazy one for never having a problem with just vanilla pip, PyPI, and venv (or virtualenv for old Python 2 stuff). Maybe it's just my use case?
I have no connection to Astral, and while we dabble in uv (which has been great), we mostly use conda because of CUDA linking etc., but an alternative is at the least welcome and at best an improvement, so I don't get the degree of cynicism or even hostility in this thread. If you don't want to use pyx, stick with PyPI or write your own damn Python packaging server.
Neat. uv is spectacular.
But I don't get it. How does it work? Why is it able to solve the Python runtime dependency problem? I thought uv had kinda already solved that? Why is a new thingamajig needed?
Sweet. Yet another Python packaging silver bullet. I can’t wait to ignore this one while I am waiting for next month’s.
>Modern
I'll pass. I'd rather have the battle-tested old thing, thanks.
An amazing number of words that basically say nothing
Again! ez_setup, setuptools, conda, poetry, uv, now this.
This industry is fucking doomed.
Python devs need to find God
What are the reasons that Python can't implement the same sort of module/packaging system as NodeJS? That seems to work well enough.
Executing a Python script in the same directory as some sort of project.json file that contains all the complicated dependency details would be a pretty good solution to me. But I'm probably missing a whole bunch of details. (Feel free to educate me).
In general I really dislike the current system of having to use new environment variables in a new session in order to isolate Py scripts. It has always seemed like a hack with lots of footguns. Especially if you forget which console is open.
Obligatory xkcd: https://xkcd.com/1987/
(I've seen https://xkcd.com/927/ in the comments a few times, but apparently I'm the first one who had this one in mind)
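The "project.json next to the script" idea above actually has a Python analogue already: PEP 723 inline script metadata, which uv understands today (`uv run script.py` builds the environment on the fly, with no venv to remember or activate). A minimal sketch; the dependency list is left empty here so the example stays stdlib-only:

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []   # e.g. ["requests>=2.31"]; uv installs these on `uv run`
# ///
# PEP 723 inline metadata: the dependency details live in the script
# itself, in the TOML block above, instead of a separate file.
import json

print(json.dumps({"inline_metadata": True}))
```

Plain `python script.py` still works for stdlib-only scripts; the metadata block is just comments, so it only takes effect under a tool that reads it.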
i wonder if nix has been considered
This is so comical; it's entirely https://xkcd.com/927/
Python has burned me with its packaging so many times.
Yay, _another_, probably incompatible, Python package manager has arrived.
Probably the more useful blog post: https://astral.sh/blog/introducing-pyx
> Waitlist
> Private registry
ouch.
It’s the year 21XX. Another HN story about python packaging being solved for good this time hits the front page. I try to use a python app, am impressed by its capability to generate 10 pages of red text and go back to cryosleep.
I've been burned too many times by embracing open source products like this.
We've been fed promises like these before. They will inevitably get acquired. Years of documentation, issues, and pull requests will be deleted with little-to-no notice. An exclusively commercial replacement will materialize from the new company that is inexplicably missing the features you relied on in the first place.