A concern:<p>More and more plainly, OpenAI and Anthropic are making plays to own (and lease) the "means of production" in software. OK - I'm a pretty happy renter right now.<p>As they gobble up previously open software stacks, how viable is it that these stacks remain open? It seems perfectly sensible to me that these providers and their users alike have an interest in further centralizing the dev lifecycle - e.g., if Claude Code or Codex are interfaces to cloud dev environments, then the models can get faster feedback cycles against build / test / etc. tooling.<p>But when the tooling authors are employees of one provider or another, you can bet that those providers will be at least a few versions ahead of the public releases of those build tools, and will enjoy local economies of scale in their pipelines that may not be public at all.
It’s a small tool shop building a tiny part of the Python ecosystem; let’s not overstate their importance. They burned through their VC money and needed an exit, and CLI toolchains are hyped now for LLMs, but this mostly sounds like an acquihire to me. Dev tools are among the hardest things to monetize, with very few real winners, so good for them to get a good exit.
Small tool shop, burning VC money, true. "Tiny part of the Python ecosystem" is an understatement given how much impact uv has made alone.
Just a tiny project with over 100 million downloads every month, over 4 million every day. No big deal. Just a small shop, don't overstate its importance.<p><a href="https://pypistats.org/packages/uv" rel="nofollow">https://pypistats.org/packages/uv</a>
Sure, but if tomorrow uv and ruff ceased to exist, we could all go back to any number of other solutions.
Ruff is nice, but not important, uv is one of the few things making the python ecosystem bearable. Python is a language for monkeys, and if you don't give monkeys good tools, they will forever entangle themselves and <i>you</i>. It is all garbage wrapped in garbage. At least let me deploy it without having to manually detangle all that garbage by version.<p>I'm done pretending this is a "right tools for the right job" kind of thing, there's wrong people in the right job, and they only know python. If no one self-writes code anymore anyway, at least use a language that isn't a clusterfuck of bad design decisions, and has 1 trillion lines of code in the collective memory of people who don't know what a stack is.
> If no one self-writes code anymore anyway, at least use a language that isn't a clusterfuck of bad design decisions<p>I can get behind the idea that LLMs probably don't need a language designed for humans if humans aren't writing it, but the rest of this is just daft. Python's popularity isn't just pure luck; in fact it's only been in recent years that the tooling has caught up to the point where it's as easy to set up as it is to write, which should really tell you something if people persevered with it anyway.<p>I'm sorry your favourite language doesn't have the recognition it so rightfully deserves, but reducing Python to just "stupid language for stupid people" is, well, stupid
Ahaha, I feel this comment.<p>I used to do backend development in superior languages, and sometimes do hobby frontend in superior languages, but my work is Python now. And it kind of has to be Python: we do machine learning, and I work with GDAL and PDAL and all these other weird libraries and everything has Python bindings! I search for "coherent point drift" and of course there's a Python library.<p>The superior languages I mentioned... perhaps they have like a library for JSON encoding and decoding. You need anything else? Great, now you're a library author and maintainer!
Python existed for years before uv with a huge ecosystem, and will continue to do so after uv dies (if it ever does)
uv, yes*, but really PEP 723:<p><a href="https://peps.python.org/pep-0723/" rel="nofollow">https://peps.python.org/pep-0723/</a><p>* <i>disclosure: We are a commercial client of astral.sh</i>
It’ll probably be a game changer for scripts, yes. Writing “portable” Python scripts was a nice exercise, though (and will be, for a while).
I agree uv is great but let’s not get carried away here. Poetry is good, pip was fine for many use-cases after they added native lock files.
if you are working on one tiny project on your machine that pips in four packages, you probably think pip was OK.<p>Circa 2017 I was working on systems that were complex enough that pip couldn't build them, and after I got to the bottom of it I knew it was not my fault but the fault of pip.<p>I built a system which could build usable environments out of pre-built wheels and sketched out the design of a system that was roughly 'uv but written in Python', but saw two problems: (1) a Python-dependent system can be destroyed by people messing with Python environments - my experience is that my Poetry install gets trashed every six months or so - and (2) there was just no awareness among the 'one tiny project on your machine that pips in four packages' people that there was a correctness problem at all. Everybody else was blaming themselves for the problem and didn't have a clear understanding of what was wrong with pip, of what a correct model for managing Python dependencies is (short answer: see Maven), or that a 100% correct model wasn't even possible and that we'd always have to settle for a 97% model. The politics looked intractable so I gave up.<p>Written in Rust, uv evaded the bootstrap problem, and it dealt with the adoption problem by targeting 'speed', as people would see the value in that even if they didn't see the value in 'correctness'. My system would have been faster than pip because it would have kept a cache, but uv is faster still.
I still believe Rust is a red herring here. Your ‘uv but written in Python’ would probably have had the same success uv has now, had you focused on speed over correctness. And I’ve yet to hear about pipx or Poetry getting trashed, but if it is a problem, I don’t think it’s impossible to solve in Python vs Rust.<p>> The politics looked intractable so I gave up.<p>So yeah, this is your <i>actual</i> problem. (Don’t worry, I’m in the same camp here.)
Well said.<p>I have used them all and UV is the only one that actually solves the problem.<p>It’s insane that people would suggest that Python can go back.
> everybody else was blaming themselves for a problem and didn't have a clear understanding of what was wrong with pip or what a correct model for managing python dependencies is (short answer: see maven)<p>I always looked down on the Java ecosystem but if it turns out Maven had a better story all along and we all overlooked it, that's wild.
Poetry and friends are so bad that many people continued just using `pip install -r requirements.txt` despite knowing about this other stuff.<p>Poetry having users isn’t the metric for success; pip having way fewer users is.
How is uv awesome and Poetry so bad? They do basically the same things except Astral re-invents the wheel but only part way instead of just relying on the existing tools. uv is fast. As far as I can tell, there's hardly any difference in functionality except for it also replacing PyEnv, which I never use anyway.
uv assuming your local Python is busted to hell and back helps a lot with isolation.<p>Poetry's CLI would often, for me, just fall over and crash. Crashing a lot is not a fundamental problem in the sense you can fix the bugs, but hey I'm not hitting uv crashes.<p>pipenv was even worse in terms of just hanging during package resolution. Tools that hang are not tools you want in a CI pipeline!<p>The end result: `uv run` I expect to work. `pipenv` or `poetry` calls I have to assume don't work, have to put retries into CI pipelines and things like that.
uv has a lot of sensible defaults that prevent clueless developers from shooting themselves in the foot. `uv sync`, for example, will uninstall packages not in pyproject.toml
let's get carried away.<p>`uv run` on a .py with inline script metadata has all the deps installed and your script running in a venv while Poetry is still deciding how to resolve...
OK, what am I missing? I've used Python for many, many years. What does uv give us over pip + venv + pyenv?<p>(I'm not doing this to be a dick, I genuinely want to know what the use case is)
I've used Python for roughly 15 years, and for 10 of those years I was paid primarily to write and maintain projects written in Python.<p>Things got <i>bearable</i> with virtualenv/virtualenvwrapper, but it was never what I would call great. Pip was always painful, and slow. I never looked forward to using them - and every time I worked on a new system, the amount of finagling I had to do to avoid problems, and the amount of time I spent supporting <i>other</i> people who had problems, was significant.<p>The day I first used uv is about as memorable to me as the day I first started using Python (roughly 2004) - everything changed.<p>I've used uv pretty much every single day since then and the joy has never left. Every operation is twitch-fast. There has never once been any issue. Combined with direnv, I can create projects/venvs on the fly so quickly I don't even bother using its various affordances to run projects without a venv.<p>To put it succinctly - uv gives me two things.<p>One - zero messing around with virtualenvwrapper and friends. For whatever reason, I've never once run into an error like "virtualenvwrapper.sh: There was a problem running the initialization hooks."<p>Two - fast. It may be the fastest software I've ever used. Everything is instant, so you never experience any cognitive distraction when creating a Python project and diving into anything - you think it, and it's done. I genuinely look <i>forward</i> to uv pip install - even when it's <i>not</i> already in cache, the parallel download is epically fast; always a joy.
Everything “just works” and is fast - and that’s basically it.<p>You can run a script with a one liner and it will automatically get you the same python and venv and everything as whoever distributed the python code, in milliseconds if the packages are already cached on your local computer.<p>Very easy to get going without even knowing what a venv or pypi or anything is.<p>If you are already an expert you get “faster simpler tooling” and if you are a complete beginner it’s “easy peasy lemon squeezy”.
for one, it's <i>one</i> tool, that does the job of all three.<p>it just <i>works</i>. i'm not sure how else to describe it other than less faffing about. it just does the right thing, every time. there's a tiny learning curve (mostly unlearning bad or redundant habits), but once you know how to wield it, it's a one stop shop.<p>and as mentioned, it's crazy fast.
It's not horrifically slow.
uv is nice, but not irreplaceable. An open source, maintenance mode fork would work just as fine. And even if all of uv disappeared today, I’d go just back to Poetry. Slower? Sure, a bit.<p>...and then I’ve read the rest of your comment. Please do go read the HN guidelines.
> making the python ecosystem bearable<p>You should really qualify that statement, it implies that the Python ecosystem is bearable.
Yes please - let's start by binning the whole internet's worth of JavaScript and its family.<p>See the point?
Maybe you could. I would stare longingly into the void, wondering if I can ever work on another Python project after having experienced uv, ruff, and ty.<p>Such an outcome would make me wonder about the wisdom of "It is better to have loved and lost than never to have loved at all."
I was using poetry pretty happily before uv came along. I’d probably go back.<p>Note that uv is fast because — yes, Rust, but also because it doesn’t have to handle a lot of legacy that pip does[1], and some smart language independent design choices.<p>If uv became unavailable, it’d suck but the world would move on.<p>[1] <a href="https://nesbitt.io/2025/12/26/how-uv-got-so-fast.html" rel="nofollow">https://nesbitt.io/2025/12/26/how-uv-got-so-fast.html</a>
It is an MIT licensed project, someone will absolutely fork it.
Ruff is performant but finds about half the issues Pylint does (see <a href="https://github.com/astral-sh/ruff/issues/970" rel="nofollow">https://github.com/astral-sh/ruff/issues/970</a>). Ty is quantitatively the worst of the well-known type checkers (see <a href="https://news.ycombinator.com/item?id=47398023">https://news.ycombinator.com/item?id=47398023</a>). Uv is Astral's only winner.
You are aware that ty has only recently entered beta status?<p>Ruff isn’t stable yet either, and it has evolved into the de facto standard for new projects. It has more than double the number of rules Pylint does. It was also downloaded more than 3 times as often as Pylint in the past month.<p>Pylint has some advantages, sure, but Ruff’s adoption speaks for itself. Pylint is 25 years old; you’d hope they do some things better.<p>Saying that uv is their only winner is a hilarious take.
Reread the comment I replied to:<p><i>> I would stare longingly into the void, wondering if I can ever work on another Python project after having experienced uv, ruff, and ty.</i><p>You think you're disagreeing with me, but you're agreeing. To wit: the original post is silly, because ty is beta quality and Ruff isn't stable yet either. Your words.<p>These are just tools, Pylint included. Use them, don't use them, or make them your whole personality to the point that you feel compelled to defend them when someone on the Internet points out their flaws. Whatever churns your butter.
>Saying that uv is their only winner is a hilarious take.<p>Nah, this news is good enough reason to move from Ruff back to Black and stay the course; I won't use anything else from Astral. I will use uv, but only until pip 2/++ gets its shit together and catches up; hopefully then, as a community, we should jump back on board and keep using pip even if it's not as good - it's free in the freedom sense.
Maybe consider something other than python.
Always choose the best tool for the job.<p>Then import that tool and check if __name__ == "__main__"
Good luck with that. I haven't been successful at convincing anyone to move away from it. I'm so fucking sick of writing Python at work lol
While I hope it never comes to that, all the code is MIT licensed; I would assume everyone would make the sensible decision to fork it.
I see Apache and MIT license files in their GitHub. What's to prevent the community from forking and continuing development if the licenses change?
Personally I would stop using Python again.
uv is the one thing that made it bearable.
I would just ditch Python, like I did 8 years ago.
>Sure, but if tomorrow uv and ruff ceased to exist, we could all go back to any number of other solutions.<p>Or, more relevant to this conversation: if they went closed-source tomorrow, the community could fork the current version.
Eurgh, I do not want to ever touch Poetry or pyenv again, thank you very much.
I wish that were also true for the case of Claude/Codex/etc
…if tomorrow <i>python</i> ceased to exist, we could all go back to any number of other solutions.
UV is so much nicer than the other options.
I mean, if you believe the hype on this website, Claude Code could build a perfect clone of uv in a few hours using only the documentation.
Don't understate its importance. I've been using Python for more than 30 years. They solved a problem that a lot of smart people didn't solve (*). The Python developer experience improved by an order of magnitude.<p>(*) Sure, they were standing on the shoulders of giants
I do feel like it is overstated, and the number of downloads is not a good metric at all. There are npm packages with many millions of downloads, too.
The “requests” package gets downloaded one billion times every month - should that be a multi-billion-dollar VC company as well? uv is a package manager plus other neat tooling; it’s great, but it’s hardly the essence of what makes Python awesome - it’s one of the many things that make this ecosystem flourish. If OpenAI enshittified it, people would just fork it or move on. That’s all I’m saying: it’s not in any way a single point of failure for the Python ecosystem.
That says more about the sad state of modern CI pipelines than anything about uv's popularity.<p>Not disputing that it's a great and widely used tool, BTW.
Not including direct downloads via the native installers, Homebrew, Winget, or Docker, mind you.
I mean, these sorts of numbers speak to the mind-bogglingly inefficient CI workflows we as an industry have built. I’d be surprised if there were 4 million people in the world who actually know what ‘uv’ is.
It's not difficult to download something yourself 4 million times every day to look popular :)
Right. If anything, this "tiny part" has pretty much <i>taken over</i> Python and turned it from OSS BDFL language into a company-backed one (like Erlang, Scala, C#).
They have some nice ideas. But if they turn to shit you can just fork their tools and use that instead.
VC money bailing out other VCs. A tale as old as time.
Do you have any statistics for that?
uv has almost 2x the number of monthly downloads Poetry has.<p>- <a href="https://pypistats.org/packages/poetry" rel="nofollow">https://pypistats.org/packages/poetry</a>
- <a href="https://pypistats.org/packages/uv" rel="nofollow">https://pypistats.org/packages/uv</a><p>In the 2024 Python developer survey, 18% of the ecosystem used Poetry. When I opened this manifold question[0], I'm pretty sure uv was about half of Poetry downloads.<p>Estimating from these numbers, probably about 30% of the ecosystem is using `uv` now. We'll get better numbers when the 2025 Python developer survey is published.<p>Also see this: <a href="https://biggo.com/news/202510140723_uv-overtakes-pip-in-ci-usage" rel="nofollow">https://biggo.com/news/202510140723_uv-overtakes-pip-in-ci-u...</a><p>[0]: <a href="https://manifold.markets/JeremiahEngland/will-uv-surpass-poetry-in-monthly-p" rel="nofollow">https://manifold.markets/JeremiahEngland/will-uv-surpass-poe...</a>
Anecdotally, every place I've worked at has switched over and never looked back.
Same. It's game-changing - leaps and bounds above every previous attempt to make Python's packaging, dependency management, and dev workflow easy. I don't know anyone who has tried uv and not immediately thrown every other tool out the window.
I use uv here and there but have a bunch of projects using regular pip with pip-tools to do a requirements.in -> requirements.txt as a lockfile workflow that I've never seen enough value in converting over. uv is clearly much faster but that's a pretty minor consideration unless I were for some reason changing project dependencies all day long.<p>Perhaps it never grabbed me as much because I've been running basically everything in Docker for years now, which takes care of Python versioning issues and caches the dependency install steps, so they only take a long time if they've changed. I also like containers for all of the other project setup and environment scaffolding stuff they roll up, e.g. having a consistently working GDAL environment available instantly for a project I haven't worked on in a long time.
2 things: First, you can (and should) replace your `pip install` with `uv pip install` for instant speed boost. This matters even for Docker builds.<p>Second, you can use uv to build and install to a separate venv in a Docker container and then, thanks to the wonders of multistage Docker builds, copy that venv to a new container and have a fully working minimal image in no time, with almost no effort.
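A rough sketch of that two-stage pattern follows. The image tags, paths, and `uv sync` flags here are illustrative assumptions - check Astral's own Docker guidance for current recommendations:

```dockerfile
# Stage 1: build the venv with uv (Astral publishes images bundling uv)
FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim AS builder
WORKDIR /app
# Copy only the dependency manifests first so this layer caches well
COPY pyproject.toml uv.lock ./
# Resolve strictly from the lockfile, skipping dev dependencies
RUN uv sync --frozen --no-dev
COPY . .

# Stage 2: slim runtime image without uv or any build tooling
FROM python:3.12-slim-bookworm
COPY --from=builder /app /app
# Put the venv's bin directory on PATH so `python` resolves to it
ENV PATH="/app/.venv/bin:$PATH"
WORKDIR /app
CMD ["python", "main.py"]
```

Because the dependency layer only changes when the lockfile does, rebuilds after pure code changes skip the install step entirely.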
Been in the Python game a long time and I've seen so many tools in this space come and go over the years. I still rely on good ol' pip and have had no issues. That said, we utilize mypy and ruff, and have moved to pyproject.toml etc. to remotely keep up with the times.
uv solved it; it will be the only tool people use in 2 more years. If you’re a Python shop / expert then you can do pip etc., but uv turned incidental Python + deps from a huge PITA for the rest of us into It Just Works simplicity on the same level as, or better than, Golang.
Then can they please figure out some way of invoking it that doesn't require prefixing everything with 'uv'?
You can source the virtualenv like normal.
For any command, you can create an 'alias' in your shell config. That way you can get rid of the prefix.
direnv, .envrc, "layout uv"<p><a href="https://github.com/direnv/direnv/wiki/Python#uv" rel="nofollow">https://github.com/direnv/direnv/wiki/Python#uv</a>
alias in ~/.zshrc?
uv run bash/zsh/your shell of choice
I don't want software on my computer that just downloads and installs random stuff. That is the job of the OS - in particular, the package manager.
You're welcome to live in the 90s dark ages. I feel this attitude, and the shape of old Linux distros like Debian that laboriously re-package years-old software, has been one of the biggest failures of open source and has squandered untold hours of human effort. It's a model that works okay for generic infrastructure, but it requires far too much labor and moves far too slowly, with quite a poor experience for end users and developers. Why else would all modern software development (going back to Perl's CPAN package manager in 1995) route around it?
Do you not use non-OS package managers?<p>If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool?
> Do you not use non-OS package managers?<p>Mostly no; sometimes I give up and still use pip as a separate user.<p>> If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool<p>I haven't felt the need to use Go, and the only Java software I use is in the OS repo. I don't want to use JS software for other reasons. This is one of the reasons why I don't like Rust rewrites. Python dependencies are very often in the OS repo. If there is anything else, I compile it from source, and I curse when software doesn't use or adhere to the GNU build system standard.
Thanks for explaining your workflow. It seems predictable, but like it <i>really</i> locks you into one of the few (albeit popular) programming languages that has many/most of its development libraries repackaged by your OS. There are plenty of very popular languages that don't offer that at all.<p>Go and Rust, specifically, seem a bit odd to be allergic to. Their "package managers" are largely downloading sources into your code repository, not downloading/installing truly arbitrary stuff. How is that different from your (presumably "wget the file into my repo or include path") workflow for depending on a header-only C library from the internet which your OS doesn't repackage?<p>I understand if your resistance to those platforms is because of how <i>much</i> source code things download, but that still seems qualitatively different to me from "npm install can do god-knows-what to my workstation" or "pip install can install packages that shadow system-wide trusted ones".
I hope you understand you are part of a very, very small minority.
Personally I run "apt install whateverineed"
In general I agree with you. But not for software dev packages.<p>The package manager I use, apt on Debian, does not package many Python development repos. They've got the big ones, e.g. requests, but not e.g. uuid6. And I wouldn't want it to - I like the limited Debian dev effort to be put towards the user experience and let the Python dev devs worry about packaging Python dev dependencies.
What’s the point of constraining oneself to what is in the OS package manager? I like to keep my dependencies up to date. The versions in the OS package manager are much older.<p>And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.
> What’s the point of constraining oneself to what is in the OS package manager? I like to keep my dependencies up to date. The versions in the OS package manager are much older.<p>I favor stability and the stripping of unwanted features (e.g. telemetry) by my OS vendor over cutting-edge software. If I really need that, I install it into /usr/local - that is what it is for, after all.<p>> And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.<p>This is a reason to select the OS. Software shouldn't require exact versions, but should stick to stable interfaces.
Don't worry, gramps, pip won't trigger your tinfoil hat.
Then don't use it?
Do you use pip?
[dead]
Geospatial tends to be the Achilles' heel of Python projects for me. Fiona is a wily beast of a package, and GDAL too. Conda helped some but was always so slow. Pip almost uniformly fails in this area for me.
As a point of information: Astral did not, in fact, burn through its VC money. I agree that dev tools are difficult to monetize, though.<p>(Source: I'm an Astral employee.)
> As a point of order: Astral did not, in fact, burn through its VC money.<p>That's a point of information, not a point of order.
Is pointing out the incorrect use of a point of order itself a point of order, or also a point of information?
You're right, I've edited it.
what does that mean
Finally, someone competent to answer the crucial question. Taking into account the enormous amount of excellent work you did, and the fact that dev tools are hard to monetize, what was your strategy?
You can find some resources on our strategy in previous blog posts, like this one on pyx[1].<p>[1]: <a href="https://astral.sh/blog/introducing-pyx" rel="nofollow">https://astral.sh/blog/introducing-pyx</a>
Check this HN thread from 8 months ago: <a href="https://news.ycombinator.com/item?id=44358216">https://news.ycombinator.com/item?id=44358216</a>
[dead]
uv is arguably this decade's most important addition to the Python ecosystem. They are small, but they are important.
uv is the best thing to happen to package management in Python.<p>It's not perfect, but it is light-years better than what preceded it.<p>I jumped ship to it and have not looked back. (So have many of my clients).
> Dev tools are among the hardest things to monetize with very few real winners, so good for them to get a good exit.<p>I'm on the fence about cancelling my JetBrains subscription I've had for nearly 10 years now. I just don't use it much. Zed and Claude Code cover all my needs, the only thing I need is a serious DataGrip alternative, but I might just sit down with Claude and build one for myself.
I'm curious what you think of <a href="https://seaquel.app" rel="nofollow">https://seaquel.app</a> as a DataGrip alternative.<p>It's a project I'm working on to build a database management product I've always wanted.<p>I spend way too much time obsessing over UX but hope people appreciate it :).
What do you want in your datagrip alternative. I'm working on some stuff, and interested to hear how people approach data with LLMs
I've been using DBeaver CE since I lost my DataGrip license.
They were hyped here without any pushback. Maybe OpenAI thinks the Astral folks will now evangelize and foist Codex and ChatGPT onto the open source "community".<p>People need to be very careful about resisting. OpenAI wants to make everyone unemployed, works with the Pentagon, steals IP, and copyright whistleblowers end up getting killed under mysterious circumstances.
Every uv-related post here had a few people going "i don't want to use VC funded stuff!! what about rug pull!!", although perhaps they were drowned out eventually all the people (like me) going "uv is fantastic and solves 15 years of python packaging hell"
given that they delivered the goods I would not say they were "hyped"
That was my feeling - more than 'owning' uv etc., I could see this as being about getting people on board who have a proven track record of delivering developer tooling that was loved enough to get wide adoption.
> tiny part of the Python ecosystem<p><a href="https://xkcd.com/2347/" rel="nofollow">https://xkcd.com/2347/</a>
uv is the de facto way to do projects. ty is really, really good. Ruff is the de facto linter. I mean, they’ve earned a lot of clout.
That's kind of like saying Cargo is a small part of the Rust ecosystem.<p>It's not there yet, but it's getting there.
uv and ruff are not a tiny part anymore; they're growing fast
This is so dystopian… they built something that worked and now are being “acquihired” into oblivion, and we’re supposed to be happy about it? I’m glad a few of the early people just got rich I guess, but it seems like a terrible system overall.
It's not any different from the launch of the FSF. There's a simple solution. If you don't want your lunch eaten by a private equity firm, make sure whatever tool you use is GPL licensed.
> If you don't want your lunch eaten by a private equity firm, make sure whatever tool you use is GPL licensed.<p>1. For the record: the GPL is entirely dependent on copyright.<p>2. If AI "clean-room" re-implementations are allowed to bypass copyright/licenses, the GPL won't protect you.
"Clean room" is doing a lot of heavy lifting here. Given that these models are trained on the entire corpus of human knowledge, and given how LLMs work, how can you honestly argue in court that this is a purely clean-room implementation?<p>This is right up there with Meta lawyers claiming that when they torrent it's totally legal, but when a single person torrents it's copyright infringement.
> If AI "clean-room" re-implementations are allow to bypass copyright/licenses, the GPL won't protect you.<p>Isn't that the same for the obligations under BSD/MIT/Apache? The problem they're trying to address is a different one from the problem of AI copyright washing. It's fair to avoid introducing additional problems while debunking another point.
Maybe I'm reading wrong here, but what's the implication of the clean room re-implementations? Someone else is cloning with a changed license, but if I'm still on the GPL licensed tool, how am I "not protected"?
1. Company A develops Project One as GPLv3<p>2. BigCo buys Company A<p>3a. Usually here BigCo would continue to develop Project One as GPLv3, or stop working on it, and the community would fork it and continue working on it as GPLv3<p>3b. BigCo does a "clean-room" reimplementation of Project One and releases it under a proprietary licence. The community can still fork the older version and work on it, but BigCo can continue to develop and sell their "original" version.
2. BigCo owns ProjectOne now
3a. Bigco is now free to release version N+1 as closed source only.
3b. Community can still fork the older version and work on it, but BigCo can continue to develop and sell their original version.
As a real world example, Redis was both Company A and BigCo. Project One is now ValKey.
There's basically no difference between GPL and BSD in that case.
If clean-room re-implementations are allowed to bypass copyright/licenses, is (software) copyright dead in general?
Well no, (clean-room) reimplementations of APIs have been done since time immemorial. Copyright applies to the work itself: if you independently implement the functionality of X, software copyright protects both works!<p>Patents protect ideas; copyright protects artistic expressions of ideas.
While the license is important, it's the community that plays the key role for me. VC-funded open source is not the same as community-developed open source. The first can disappear very quickly because of something like an acquihire; the second has more resilience and tends to either survive and evolve, or peter out as the context changes.<p>I'm careful not to rely too heavily on VC-funded open source whenever I can avoid it.
The biggest scam the mega-clouds and the Githubs ever pulled was convincing open source developers that the GPL was somehow out of vogue and BSD/MIT/Apache was better.<p>All so they could just vacuum it all up and resell it with impunity.
I don't remember GitHub or Amazon advocating MIT over GPL.<p>Feel free to prove me wrong by pointing out this massive amount of advocacy from "mega-clouds" that changed people's minds.<p>The ads, the mailing list posts, social media comments. Anything at all you can trace to "mega-clouds" execs.
Does the CEO of github count? <a href="https://youtu.be/-bAAlPXB2-c?t=180" rel="nofollow">https://youtu.be/-bAAlPXB2-c?t=180</a>
<a href="https://choosealicense.com/" rel="nofollow">https://choosealicense.com/</a><p><a href="https://choosealicense.com/about/" rel="nofollow">https://choosealicense.com/about/</a><p>> "GitHub wants to help developers choose an open source license for their source code."<p>This was built by GitHub Inc a very very long time ago.
I think this was more about "please choose _any_ license" because of the problem outlined here:<p><a href="https://opensource.stackexchange.com/questions/1150/is-my-code-floss-just-because-it-is-published-it-on-github" rel="nofollow">https://opensource.stackexchange.com/questions/1150/is-my-co...</a>
The site puts the MIT and GPLv3 front and center with a nice quick informative blurb on them. How are they pushing MIT over GPL?
I don't see anything on there saying that non-copyleft licenses are better, unless you are in an ecosystem that prefers a different license.
>This was built by GitHub Inc a very very long time ago.<p>So long ago, in fact, that it was five years before their acquisition by Microsoft.
I remember a somewhat prominent dev in the DC area putting on Twitter around 2012 or so something like "I do plenty of open source coding and I don't put a fucking license on it" and it stuck with me for all these years that it was a weird stance to take.
Dan Bernstein took that attitude back in the 90s - I think his personal theory of copyright went something like "if it doesn't have a license, then it's obviously public domain", which ran counter to the mainstream position of "if it doesn't have a license, then you have to treat it as proprietary".<p>And, sure, djb wasn't actually likely to sue you if you went ahead and distributed modified versions of his software... but no-one else was willing to take that risk, and it ended up killing qmail, djbdns, etc stone dead. His work ended up going to waste as a result.
I doubt the lack of license was the reason DJB's projects didn't take over the world. Most of them required heavy forking to break away from hardwired assumptions about the filesystem and play nice with the OS distribution, and DJB is himself notoriously difficult to work with. Still, qmail managed to establish maildir as the standard format and kill off mbox, and for that alone I'm eternally grateful.
Well, there were always plenty of patches available - it's just that lots of them conflicted with each other, and that <i>was</i> a product of the licensing.<p>Agreed with the rest, though. I relied heavily on qmail for about a decade, and learned a lot from the experience, even if it was a little terrifying on occasion!
> mainstream position<p>Not everybody is dictated to by corporate attorneys; I don’t think this is an accurate portrayal.
> his personal theory of copyright went something like "if it doesn't have a license, then it's obviously public domain"<p>I mean philosophically and morally, sure, one can take that position ... but copyright law does not work like that, at least not for anything published in the US after 1989 [1].<p>[1] <a href="https://www.copyright.gov/circs/circ03.pdf" rel="nofollow">https://www.copyright.gov/circs/circ03.pdf</a>
John Carmack said that about a week ago.
The big cloud providers are perfectly happy to use GPL'd stuff (see: Elastic, MySQL). They don't need to use embrace-and-extend, they're content with hosting.<p>The ones pushing for permissive licenses are rather companies like Apple, Android (and to some extent other parts of Google), Microsoft, Oracle. They want to push their proprietary stuff and one way to do that in the face of open source competition is by proprietary extensions.
They <i>do</i> sort the list of default licenses alphabetically,<p>and that seems like a strange choice…<p>Could you say more?
You probably mean AGPL. Companies hated GPL from the start and nothing has changed to this day. But the cloud is specifically against AGPL.
Huh? When you deploy something in the cloud, you don't have to share your GPL'ed stuff either. Google doesn't.
Since when has GPL stopped any company from doing so? I'd genuinely be happy to hear of a single time someone was actually pressed about their GPL compliance after FSF v. Cisco, it's like some immaterial sword of Damocles, the entire weight of which is the shame of losing face, which is fleeting in a society post-appeal-to-authority.
A GPL license helps but if support for a dependency is pulled you'll likely end up needing to divert more resources to maintain it anyways. There really isn't any guarantee against this cost - you either pay someone else to maintain it and hope they do a good job, build it in house and become an "also that thing" company, or follow a popular project without financially supporting it and just hope other people pick up your slack.<p>Preferring GPL licensed software means that you're immune to a sudden cut off of access so it's always advisable - but it's really important to stay on top of dependencies and be willing to pay the cost if support is withdrawn. So GPL helps but it isn't a full salve.
Somebody looked at Claude Code's binaries, and Anthropic is testing out their own app platform called antspace. Not sure why people are shocked; they've been cloning features of their API customers and adding them to their core products since day 1. Makes sense they will take user data and do it for Claude Code, by copying features or buying up what developers are using, so they can lock people into a stack. These are the same people that trained on every scrap of data they could get their hands on and now complain about distilling models from their output.<p><a href="https://x.com/AprilNEA/status/2034209430158619084" rel="nofollow">https://x.com/AprilNEA/status/2034209430158619084</a><p>Ironically, this type of stuff really makes me doubt their AGI claims. Why would they bother with this stuff if they were confident of having AGI within the next few years? They would be focused on replacing entire industries and not even make their models available at any price. Why bother with a PaaS if you think you are going to replace the entire software industry with AGI?
> they've been cloning features of their API customers and adding them to their core products since day 1<p>Is this not just the strategy of all platforms? Spy on all customers, see what works for them, and copy the most valuable business models. Amazon does that with all kinds of products.<p>Platforms will just grow to own the whole market, hike prices, lower quality, and pay close to nothing to employees. This is why we used to have monopoly regulations, before being greedy became a virtue.
It is exactly the strategy of all platforms - they get greedy to the point of screwing over their own customers. I've lost count of number of times I've seen a platform get popular and then expand to offer the same services as its customers, often even undercutting market rates.<p>Just wait till they offer "Developer Certification" so you have to pay them to get a shiny little badge and a certificate while they go around saying no badge = you're shit.
> this type of stuff really makes me doubt their AGI claims, why would they bother with this stuff if they were confident of having AGI within the next few years?<p>Because AGI doesn't grow in a cage; it requires a piece of software running somewhere. Someone has to build both to make that happen. That is like a high-school-level question.
Theoretically it only requires it for birth. One can argue that once we achieve the singularity, it could immediately scale on its own as it decides.
Typical llm user, thinks they're a genius.
You know, until today I dismissed all those concerns about <i>uv</i> being a commercial product, but now I am very concerned.<p>Microsoft has been a reasonable steward of GitHub and npm considering everything, but I don't feel so good about OpenAI. This makes me reconsider my use of <i>uv</i>, and Python as a whole, because <i>uv</i> did a lot to stop the insanity. Not least, Microsoft has been around since 1975, whereas I could picture OpenAI vanishing instantly in a fit of FOMO.
It's even worse than that: Astral took over python-build-standalone, and uv uses its Python builds on all platforms.<p>That means OpenAI will be able to do whatever they want to your Python binaries, including every Python binary in your deployments, with whatever telemetry they want to instrument into the builds.
In the many darker timelines that one can extrapolate, capturing essential tech stacks is just a precursor to capturing hiring.<p>Once we start seeing OpenAI and Anthropic getting into certifications and testing, they'll quickly become the gold standard. They won't even need to actually test anyone. People will simply consent to having their chat interactions analyzed.<p>The models collect more information about us than we could ever imagine because, definitionally, those features are unknown unknowns for humans. For ML, the gaps in our thinking carry far richer information about us than our actual vocabularies, topics of interest, or stylometric idiosyncrasies.
As if there will be hiring in the fullness of time.<p>There will come a day when you can will an entire business into existence at the press of a button. Maybe it has one or two people overseeing the business logic to make sure it doesn't go off the rails, but the point is that this is a 100x reduction in labor and a 100,000x speed up in terms of delivery.<p>They'll price this as a $1M button press.<p>Suddenly, labor capital cannot participate in the market anymore. Only financial capital can.<p>Suddenly, software startups are no longer viable.<p>This is coming.<p>The means of production are becoming privatized capital outlays, just like the railroads. And we will never own again.<p>There is nothing that says our careers must remain viable. There is nothing that says our output can remain competitive, attractive, or in demand. These are not laws.<p>Knowledge work may be a thing of the past in ten years' time. And the capital owners and hyperscalers will be the entirety of the market.<p>If we do not own these systems (and at this point is it even possible for open source to catch up?), we are fundamentally screwed.<p>I strongly believe that people not seeing this - downplaying this - are looking the other way while the asteroid approaches.<p>This. Is. The. End.
Oh are compilers going away? Or personal computers for that matter?<p>If the barrier to button-pressed companies goes that high up, the cost to run/consume the product also goes up. Making hand-rolled products cheaper.<p>Slower paced to roll out things? Sure.<p>That's the precarious balance these LLMs providers have to make. They can't just move on without the people feeding it data and value. The machine is not perpetual.
There could be opportunities we haven't anticipated.<p>What if labor organizes around human work and consumers are willing to pay the premium?<p>At that point, it's an arms race against the SotA models in order to deepen the resolution and harden the security mechanisms for capturing the human-affirming signals produced during work. Also, lowering the friction around verification.<p>In that timeline, workers would have to wear devices to monitor their GSR and record themselves on video to track their PPG. Inconvenient, and ultimately probably doomed, but it could extend or renew the horizon for certain kinds of knowledge work.
This is pure fantasy. Even if LLMs eventually became this capable (they will not), your whole scenario doesn't make any sense.<p>What is far more likely is that governments use AI to oppress their citizens with robots and drones. That is the thing to be scared of.
And never forget that we collectively cheered it on as the asteroid's crater got deeper and wider.
If it ever goes bad, well I hope that that’s an impetus for new open source projects to be started — and with improvements over and lessons learned from incumbent technologies, right at the v1 of said projects.
If LLMs turn out to be such a force multiplier, the way to fight it is to ensure that there are open source LLMs.
I think the issue is that LLMs are a cash problem as much as they are a technical problem. Consumer hardware architectures are still pretty unfriendly to running models that are actually competitive, so if you even want to do inference on a model that's going to reliably give you decent results, you're basically in enterprise territory. Unless you want to do it really slowly.<p>The issue that I see is that Nvidia etc. are incentivised to perpetuate that, so the open source community gets the table scraps of distills, fine-tunes, etc.
You got me thinking that what's going to happen is some GPU maker is going to offer a subsidized GPU (or RAM stick, or ...whatever) if the GPU can do calculations while your computer is idle, not unlike Folding@home. This way, the company can use the distributed fleet of customer computers to do large computations, while the customer gets a reasonably priced GPU again.
The kinds of GPUs that are in use in enterprise are 30-40k and require a ~10KW system. The challenge with lower power cards is that 30 1k cards are not as powerful, especially since usually you have a few of the enterprise cards in a single unit that can be joined efficiently via high bandwidth link. But even if someone else is paying the utility bill, what happens when the person you gave the card to just doesn’t run the software? Good luck getting your GPU back.
Consumer hardware is there. Grab a Mac or an AMD 395+ with Qwen Coder and Cline or OpenCode and you're getting 80% of the real efficiency.
New Strix Halo (395+) user here. It is very liberating to be able to "just" load the larger open-weight MoEs. At this param-count class, bigger is almost always better --- my own vibe check confirms this, but obviously this is not going to be anywhere close to the leading cost-optimized closed-weight models (Flash / Sonnet).<p>The tradeoff with these unified LPDDR machines is compute and memory throughput. You'll have to live with the ~50 token/sec rate, and compact your prefix aggressively. That said, I'd take the effortless local model capability over outright speed any day.<p>Hope the popularity of these machines could prompt future models to offer perfect size fits: 80 GiB quantized on 128 GiB box, 480 GiB quantized on 512 GiB box, etc.
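The sizing intuition behind those "80 GiB on a 128 GiB box" numbers is just arithmetic. A minimal sketch, under my own assumptions (an effective bits-per-weight figure that folds in quantization overhead; KV cache and runtime buffers ignored):

```python
# Back-of-the-envelope weight-memory estimate for a quantized model.
# Assumption: bits_per_weight is an *effective* figure (e.g. ~5 for a
# 4-bit-class quant once per-block scales are counted).
def weight_gib(params_billions: float, bits_per_weight: float) -> float:
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 2**30  # GiB

# A ~120B-param model at ~5 effective bits/weight lands near 70 GiB,
# which is why it fits comfortably in 128 GiB of unified memory.
print(f"{weight_gib(120, 5.0):.0f} GiB")
```

The same function explains why the bigger boxes want bigger quants: a 512 GiB machine can hold a far larger parameter count at the same bit width.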
The problem is that even if an OSS effort had the resources (massive data centers the size of NYC, packed with top-end custom GPU kit) to produce the weights, you need enormous VRAM-laden farms of GPUs to do inference on a model like Opus 4.6. Unless the very math of frontier LLMs changes, don’t expect frontier OSS models on par with it to be practical.
I feel like you're overstating the resources required by a couple orders of magnitude. You do need a GPU farm to do training, but probably only $100M, maybe $1B of GPUs. And yes, that's a lot of GPUs, but they will fit in a single datacenter, and even in dollar terms, there are many individual buildings in NYC that are cheaper.
I refer you to the data centers under construction roughly the size of Manhattan to do next generation model training. Granted they’re also to house inference, but my statement wasn’t hyperbole, it’s based on actual reality. To accommodate the next generation of frontier training it’s infeasible for any but the most wealthy organizations on earth to participate. OSS weights are toys. (Mind you i like toys)
> you need enormous VRAM laden farms of GPUs to do inference on a model like Opus 4.6.<p>It's probably a trade secret, but what's the actual per-user resource requirement to run the model?
There's already an ecosystem of essentially undifferentiated infrastructure providers that sell cheap inference of open weights models that have pretty tight margins.<p>If the open weights models are good, there are people looking to sell commodity access to it, much like a cloud provider selling you compute.
Open-source models will never be _truly_ competitive as long as obtaining quality datasets and training on them remains prohibitively expensive.<p>Plus, most users don't want to host their own models. Most users don't care that OpenAI, Anthropic and Google have a monopoly on LLMs. ChatGPT is a household name, and most of the big businesses are forcing Copilot and/or Claude onto their employees for "real work."<p>This is "everyone will have an email server/web server/Diaspora node/lemmy instance/Mastodon server" all over again.
That would be accepting the framing of your class enemy, there is no reason to do that.
unless they are also pirate LLMs, I don't see how any open source project could have pockets deep enough for the datacenters needed to seriously contend
If it goes bad? It’s too late by that point. And how is open source going to compete with billions of investment dollars?
If AI tools are as good as the CEOs claim, we should have no friction towards building multiple open source alternatives very quickly. Unless of course, they aren’t as good as they are being sold as, in which case, we have nothing to worry about.
What would the new open source projects do differently from the "old" ones? I don't think you can forbid model training on your code if your project is open source.
I think the good news here is that since OpenAI is a zombie company at this point this particular acquisition shouldn't be too concerning - and from what I've seen Anthropic has been building out in a direction of increased specialization. That said vertical integration is as much of a problem as it always was and it'd be excellent to see some sane merger oversight from the government.
Honestly, for now they seem to be buying companies built around Open Source projects which otherwise didn't really have a good story to pay for their development long-term anyway. And it seems like the primary reason is just expertise and tooling for building their CLI tools.<p>As long as they keep the original projects maintained and those aren't just acqui-hires, I think this is almost as good as we can hope for.<p>(thinking mainly about Bun here as the other one)
Hmm, from my perspective, an essential step to legitimize "vibecoding" in an enterprise setting is to have a clearly communicated best practice - and have the LLM be hyper-optimized for that setting.<p>Like having a system prompt which takes care of the project structure, languages, libraries, etc.<p>It's pretty much the first step to replacing devs, which is their current "North Star" (to be changed to the next profession after).<p>Once they've nailed that, the devs become even more of a tool than they already are (from the perspective of the enterprise).
But how does this work out in the long run, in the case of AGI?<p>If AGI becomes available, especially at the local and open-source level, shouldn't all these be democratized - meaning that the AGI can simply roll out the tooling you need.<p>After all, AGI is what all these companies are chasing.
Stop using MIT licensed software being run by small vc backed operations if you value stability. They are risky and often costly Trojan horses.
What do you mean? MIT is essentially as open as you can get. The worst that can happen is that they will relicense, eventually, to force big users to pay, but when that happens everybody knows how it goes: some consortium of other big companies forks it and continues development as if nothing happened.
Vercel did this as well. I called it "ecosystem capture" but I like "means of production" too.
These companies are telling us software development is over. They are positioning themselves as the means of production. You want to build anything you do it through them. And since ‘software is solved’ this is not a software but a user acquisition.
The risk is one of them buys it and pulls an Oracle / Java (Sun) stunt.
If it becomes too antagonistic, people will change. The desire to build things is larger than any given iron fist du jour. Just ask Oracle or IBM.
It's open source. Forking it is an option. And with AI, one-shotting a replacement is an option as well. Or having it make changes to your fork. Just because you can, doesn't mean you should do that of course.<p>The point is that the value of the accumulated know-how and skill that led to things like uv isn't lost even if the worst would happen to the company or people behind it. I don't think there are many signs of that. I don't think they had much of a revenue model around providing OSS tools. It's problematic for a lot of VC funded companies. An exit like this is as good as it gets. OpenAI now pays them to do their thing. Investors are probably pretty happy. And we maybe get to skip the enshittification that seems inevitable with the whole IPO/hedge funds circus that many VC funded OSS companies end up being subjected to. Problem solved. Congratulations to the team. They can continue doing what they love doing in a company that clearly loves all things Python. And who knows what they can do next when freed from having to worry about making investors happy?<p>Big companies and OSS have always had symbiotic relationships. Some of the largest contributors to open source are people working in big companies. OpenAI fits this tradition beautifully. Most big software companies actively contribute to OSS projects that are relevant or important to them. Even very secretive companies like Apple, or profit-focused sharks like Oracle. Google, Meta, IBM. There are very few large software companies that aren't doing that. OSS without these very large corporate sponsorships would just be a niche thing. Yes, there are a lot of small projects; I have a few of my own, even. But most of the big ones have some for-profit business behind them.<p>The real meta question is of course whether we still need a lot of the people-centered development tooling when AIs are starting to do essentially all of the heavy lifting in terms of coding. I think we might need very different tools soon.
"Bearded German philosopher" once again being uncannily applicable to 21st century happenings...
it never made sense to have devs all over the world doing the same task with tiny variation. Centralization was inevitable. LLMs might have been a step change but the trajectory was already set.
The exact opposite has started: every single developer with an LLM subscription now has 45 variations of any foundational tool and library, to cater to their weird use-case, because it was easier for the LLM to just modify it rather than adapting to it. Almost nobody upstreams such improvements (or they are too niche anyway).<p>The ecosystem will be this way for a while, if not the new normal.
I see it as: LLMs will solve the problems they create. They'll 10x your eccentric layout, modularity, monolith; they'll multiply your anti-DRY or try to make everything DRY. It's large-scale copy/paste/find/replace in a glorious crescendo of idiosyncratic programming.
>> "means of production" in software<p>Ah yes, it was impossible to write software before these companies existed, and the only way to write software is via the products from these companies. They sure do control the "means of production".
As the cost of building software trends towards $0, I don't see how one can realistically own "the" means of production rather than "a" means. Any competitor can generate a similar product cheaply.
This is a logical conclusion of most open source tools in a capitalist economy, it's been this way for decades.<p>Equivalent or better tools will pop up eventually, heck if AI is so fantastic then you could just make one of your own, be the change you want to see in the world, right?
These are MIT/Apache 2. Sure they can buy and influence the direction but they can't prevent forks if they stray from what users want.
Of course they're trying to capture existing tech stacks. The models themselves are plateauing (most advancement is coming from the non-LLM parts of the software), they took too much VC money so they need to make some of it back. So gobbling up wafers, software, etc... is the new plan for spending the money and trying to prevent catastrophic losses.
They’re not ahead on code though, and they recently announced giving up on rich media AI entirely lmao<p>The fuck does OpenAI have to offer?<p>Nothing I need.<p>The only reason Gemini is the best is UX, really running my own Mistral 7b is more than fine.<p>Because slow ass Gemini is still a slightly more convenient experience I use that.<p>Nobody OWNS nor will own the means of essentially thoughts. It’s such a silly idea I wonder if it’s propaganda.
Not a concern: LLMs can fork all these products while they're still open licensed.
Explain to me how this is any different than Microsoft, Blackrock, Google, Oracle, Berkshire or any other giant company acquiring their way to market share?
If our corporate overlords are gonna buy up all that is good, I’d rather it have been Anthropic and not that weirdo humans-need-food-and-care-for-inference-so-LLMs-aren’t-that-power-hungry Sam Altman. Man, that guy is weird.<p>Oh well. They’ll hopefully get options and make millions when the IPO happens. Everyone eventually sells out. Not everyone can be funded by MIT to live the GNU maximalist lifestyle.
What language is universally better than Python? I don't think Python is perfect, but it is definitely one of the best languages out there. It is elegant and it is has a huge ecosystem of libraries, frameworks and tutorials. There is a lot of battle-tested software in Python that is running businesses.
Maybe something with dependent typing like Idris2 or a future Haskell release?
What?! It's one of the slowest languages on the planet, it's not type safe, and it has no real concurrency.<p>I can't believe people say this with a straight face.
> It's one of the slowest languages on the planet<p>It's fast enough for many use cases. That doesn't mean that there is no room for optimization, but this is far less a deciding factor these days.<p>> it's not type safe<p>You can do static analysis with Mypy and other tools.<p>> it has not real concurrency.<p>There's different mechanisms for running things concurrently in Python. And there's an active effort to remove the GIL. I also have to ask: What is "real" concurrency?<p>Admittedly, the things you mention are not Python's strongest points. But they are far from being dealbreakers.
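To make the middle two points concrete, a minimal sketch (the function names are my own, purely illustrative): type hints that mypy can verify before the program ever runs, and stdlib thread-pool concurrency, which works fine for I/O-bound work even under the GIL.

```python
from concurrent.futures import ThreadPoolExecutor


def parse_port(value: str) -> int:
    # mypy flags any caller passing something other than str here,
    # at analysis time rather than at runtime
    return int(value)


def parse_all(values: list[str]) -> list[int]:
    # Threads overlap blocking work because blocking calls release the
    # GIL; CPU-bound work would reach for processes (or no-GIL builds).
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(parse_port, values))


print(parse_all(["80", "443", "8080"]))  # [80, 443, 8080]
```

Neither mechanism makes Python a statically typed or natively concurrent language, but both cover a lot of the practical ground the complaint is about.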
> cost of significantly better languages is essentially free<p>Is it? We still need meatspace humans to vet what these AI agents produce. Languages like C++ / Rust etc still require huge cognitive overhead relative to Python & that will not change anytime soon.<p>Unless the entire global economy can run on agents with minimal human supervision someone still has to grapple with the essential complexity of getting a computer to do useful things. At least with Python that complexity is locked away within the CPython interpreter.<p>Also an aside, when has a language ever gotten traction based solely on its technical merits? Popularity is driven by ease-of-use, fashion, mindshare, timing etc.
Yeah that's sort of fair today, although we have switched over most of our org to Rust and it hasn't been much of a problem. The LLM can usually explain small parts of code with high accuracy if you are unsure.<p>Overall the switch has been very much loved. Everything is faster and more stable, we haven't seen much of a reduction in output
Your stance is aggressive and provocative, but no less so than the challenge AI poses to software developers in general. I think what you say should be seriously entertained.<p>And as someone who loves Python and has written a lot of it, I tend to agree. It's increasingly clear the way to be productive with AI coding and the way to make it reliable is to make sure AI works within strong guardrails, with testsuites, etc. that combat and corral the inherent indeterminism and problems like prompt injection as much as possible.<p>Getting help from the language - having the static tooling be as strict and uncompromising as possible, and delegating having to deal with the pain to AI - seems the right way.
It's such a laughable take. First of all a language is never getting popular simply because it's good. Actually most used languages are usually terrible.[0]<p>Secondly it's non factual. Python's market share grew in 2025[1][2][3]. Probably driven by AI demand.<p>[0]: even truer for natural languages.<p>[1]: <a href="https://survey.stackoverflow.co/2025/technology#most-popular-technologies-language-language" rel="nofollow">https://survey.stackoverflow.co/2025/technology#most-popular...</a><p>[2]: <a href="https://survey.stackoverflow.co/2024/technology#most-popular-technologies-language-prof" rel="nofollow">https://survey.stackoverflow.co/2024/technology#most-popular...</a><p>[3]: <a href="https://pypl.github.io/PYPL.html" rel="nofollow">https://pypl.github.io/PYPL.html</a>
Yes most languages are terrible except the ones that are actually performant and good like Rust.<p>Do you really think AI agents of the future will be coding in Python??? What advantage would that possibly give them? That's the only laughable take here
Define future. If it means the next five years, yes, I absolutely expect people and machines to keep writing Python.
I think there are many examples throughout history of better performing options not displacing counterparts. I think, really, the only "laughable" thing here is the ignorance on display that's riding atop the arrogance.<p>Rust is great. But AI isn't displacing Python anytime soon.<p>Moreso it sucks that Astral's been bought by a company with such a horrible leader at the helm.
Yeah, the swath of billions of new devs with a lower barrier to trying out coding will gravitate to Python.
That's an interesting take, but I'm not sure 'easy to write' is the only advantage.<p>There is also a really good ecosystem of libraries, especially for scientific computing. My experience has been that Claude can write good c++ code, but it's not great about optimization. So, curated Python code can often be faster than an AI's reimplementation of an algorithm in c++.
I feel like this is a relatively hot take. Python has advantages beyond being easy to write. It's simple. It can do just about anything any other language can do. It's not the most performant on its own, but it's performant enough for 99% of use cases, and in the 1% you can write a new or use an existing C library instead. Its simplicity and ease of adoption make python very well represented in the training data.<p>If I ask an LLM or agentic AI to build something and don't specify what language to use, I'd wager that it'll choose python most of the time. Casual programmers like academics or students who ask ChatGPT to help them write a function to do X are likely to be using Python already.<p>I'm not a Python evangelist by any means but to suggest that AI is going to kill Python feels like a major stretch to me.<p>EDIT: when I say that Python can do anything any other language can do, that's with the adage in mind. Python is the second best language for every task.
Isn't that also an advantage for LLMs? Apart from more available data.
Let's see how it plays out. My current assumption is that degrees and CVs will become <i>more</i> important in the workplace. Things like good architecture, maintainability, coherence, they are all hard to measure. A true 10x developer without a college degree will lose to the PhD without any hard skills. And these types only speak python, so they will instruct the AI to type python. Or maybe they'll vibecode rust and elixir, I don't know. But the cynic in me strongly thinks this will make all our bullshitty jobs way more bullshitty, and impostors will profit the most.
hilariously bold take with no evidence to support the claim
Absolutely agree with this. I'm hoping that with the advent of agentic coding, Rust dominates in the next few years. It may even cause Wasm to become dominant as the new "applet" language.
What strikes me most about this acquisition isn't the AI angle. It's the question of why so many open source tools get built by startup teams in the first place.<p>I maintain an open source project funded by the Sovereign Tech Fund. Getting there wasn't easy: the application process is long, the amounts are modest compared to a VC round, and you have to build community trust before any of that becomes possible. But the result is a project that isn't on anyone's exit timeline.<p>I'm not saying the startup path is without its own difficulties. But structurally, it offloads the costs onto the community that eventually comes to depend on you. By the time those costs come due, the founders have either cashed out or the company is circling the drain, and the users are left holding the bag. What's happening to Astral fits that pattern almost too neatly.<p>The healthier model, I think, is to build community first and then seek public or nonprofit funding: NLnet, STF, or similar. It's slower and harder, but it doesn't have a built-in betrayal baked into the structure.<p>Part of what makes this difficult is that public funding for open source infrastructure is still very uneven geographically. I'm based in Korea, and there's essentially nothing here comparable to what European developers can access. I had no choice but to turn to European funds, because there was simply no domestic equivalent. That's a structural problem worth taking seriously. The more countries that leave this entirely to the private sector, the more we end up watching exactly this kind of thing play out.
I think this overstates the “betrayal” angle.<p>A lot of great open source comes out of startups because startups are really good at shipping fast and getting distribution (open source is part of this strategy). Users can try the tool immediately, and VC funding can put a lot of talent behind building something great very quickly.<p>The startup model absolutely creates incentive risk, but that’s true of any project that becomes important while depending on a relatively small set of maintainers or funders.<p>I’m not sure an acquisition is categorically different from a maintainer eventually moving on or burning out. In all of those cases, users who depend on the project take on some risk. That’s not unique to startups; it’s true of basically any software that becomes important.<p>There’s no perfect structure for open source here - public funding, nonprofit support, and startups all suck in their own ways.<p>And on the point you make about public funding being slow: yeah, talented people can’t work full-time on important things unless there’s serious funding behind it. uv got as good as it is because the funding let exceptional people work on it full-time with a level of intensity that public funding usually does not.
That's fair, and I don't really blame anyone for taking the startup route. It's often the only realistic path to working full-time on something you care about. My point is more that it shouldn't have to be. The more public funding flows into open source infrastructure, the less that tradeoff becomes necessary in the first place. Korea being almost entirely absent from that picture is part of why I feel this so keenly.
I cannot agree more, though I have little experience in open source. I knew the Korean environment for open source software would be tough before coming back from Europe; it seems much easier to target international traction than to focus on domestic interest.<p>Personally, I'd like to know, since you have been active in Korea, if there are any groups I could join.
uv's success is downstream of paying like 10 very good Rust tooling developers to work on uv full time.<p>Full time effort put into tools gets you a looooot of time to make things work well. Even ignoring the OSS stuff, many vendors seem to consider their own libs to be at best “some engineers spend a bit of time on them” projects!
Yeah, it's pretty hard to argue that at least for the python tooling use case, the typical open source methodologies had failed to solve things as well in a couple decades as uv did in a few years as a startup. The lesson we should take from that is probably more up for debate.
Most likely, because it is less money :-p. But also because it is less known and harder, as you already mentioned. Personally, I'm based in Mexico, and I would never have thought about trying to get nonprofit funding for a community project, nor would I know where to start to get that.
> <i>it doesn't have a built-in betrayal baked into the structure</i><p>Astral was founded as a private company. Its team has presumably worked hard to build something valuable. Calling their compensation for that work 'betrayal' is unfair.<p>Community-based software development sounds nice. But with rare exception, it gets outcompeted by start-ups. Start-ups work, and they work well. What is problematic is the tech giants' monopoly of the acquisition endpoint. Figuring out ways for communities to compete with said giants as potential acquirers is worth looking into–maybe public loans to groups of developers who can show (a) they're committed to keep paying for the product and (b) aren't getting kicked back.
> I maintain an open source project funded by the Sovereign Tech Fund.<p>I would absolutely love to know more about this if you are willing to share the story?
open source allows you to build community trust much faster, and community trust/adoption is key<p>I don't see any betrayal here, since the tools are still OSS - yeah OpenAI might take it a different direction and add a bunch of stuff I don't like/want, but I can still fork
capitalism / the bazaar is faster than communism / the cathedral, news at 11
This is a serious risk for the open source ecosystem and particularly the scientific ecosystem that over the last years has adopted many of these technologies. Having their future depend on a cap-ex heavy company that is currently (based on reporting) spending approx. 2.5 dollars to make a dollar of revenue and must have hypergrowth in the next years or perish is less than ideal. This should discourage anybody doing serious work from adopting more of the upcoming Astral technologies like ty and pyx. Hopefully, ruff and uv are large enough to be forked if (when) the time comes.
On the flip side, I'm not sure I ever saw a revenue plan or exit strategy for Astral <i>other</i> than acquihire. And most plausible bidders are unfortunate in one way or another.
Astral was building a private package hosting system for enterprise customers. That was their stated approach to becoming profitable, while continuing to fund their open source work.
Private package hosting sounds like a commodity that would be hard to differentiate.
A commodity yes, but could be wrapped in to work very nicely with the latest and greatest in python tooling. Remember, the only 2 ways to make money are by bundling and unbundling. This seems like a pretty easy bundling story.
It's also a crowded and super mature space between JFrog (Artifactory) and Sonatype (Nexus). They already support private PyPI repositories and are super locked in at pretty much every enterprise-level company out there.
Yeah you'd think so but somehow JFrog (makers of Artifactory) made half a billion dollars last year. I don't really understand that. Conda also makes an implausible amount of money.
Makes sense to me.<p>Most of the companies that spend $$$$ with them can't use public registries for production/production-adjacent workloads due to regulations and, secondarily a desire to mitigate supply chain risk.<p>Artifactory is a drop-in replacement for every kind of repository they'll need to work with, and it has a nice UI. They also support "pass-through" repositories that mirror the public repositories with the customization options these customers like to have. It also has image/artifact scanning, which cybersecurity teams love to use in their remediation reporting.<p>It's also relatively easy to spin up and scale. I don't work there, but I had to use Artifactory for a demo I built, and getting it up and running took very little time, even without AI assistance.
Yeah I mean I understand the demand. My previous company used Artifactory. I just don't understand why nobody has made a free option. It's so simple it seems like it would be a no brainer open source project.<p>Like, nobody really pays for web servers - there are too many good free options. They're far more complex than Artifactory.<p>I guess it's just that it's a product that only really appeals to private companies?
From my understanding there are a lot of companies that need their own package repositories, for a variety of reasons. I listened to a couple podcasts where Charlie Marsh outlined their plans for pyx, and why they felt their entry into that market would be profitable. My guess is that OpenAI just dangled way more money in their faces than what they were likely to get from pyx.<p>Having a private package index gives you a central place where all employees can install from, without having to screen what each person is installing. Also, if I remember right, there are some large AI and ML focused packages that benefit from an index that's tuned to your specific hardware and workflows.
Private artifact repositories also help to mitigate supply chain risk since you can host all of your screened packages and don't have to worry about something getting removed from mvn-central, PyPI, NPM, etc.<p>Plus the obvious need for a place to host proprietary internal libraries.
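For the concrete mechanics: uv can already resolve against a private index via pyproject.toml, so a pyx-style offering would mostly be the hosted side of this. A minimal sketch (the registry host here is hypothetical; replace it with your organization's actual mirror):

```toml
# pyproject.toml -- point uv at an internal registry instead of PyPI.
[[tool.uv.index]]
# Hypothetical internal mirror; any PEP 503 "simple" index works.
name = "internal"
url = "https://pypi.internal.example.com/simple"
# Make this the default index, replacing PyPI for all resolution.
default = true
```

With that in place, `uv add` and `uv sync` only ever talk to the internal host, which is exactly the screening/supply-chain story described above.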
We have some kind of simple pip repo that is private where I work. What would astral bring to the table?
How many people use that simple pip repo daily? If the number is not in the high hundreds, or a few thousands; maybe nothing. But once you get up there, any kind of better coordination layer is useful enough to pay money to a third party for, unless maintaining a layer over pip is your core competency.
I mean that was a thing at one point but I feel like it is baked into github/gitlab etc now
What would be the added value against JFrog or Nexus, for example?
that was never going to work, let's be honest
i mean ofc but like you can self-host pypi and the "Docker Hub" model isn't like VC-expected level returns especially as ECR and GHCR and the other repos exist
They could have joined projects like the Linux Foundation which try to not depend on any single donor, even though complete independence from big tech is not possible. I don't know the motivation behind Astral's approach, but this acquisition does leave a weird taste behind about how serious they were about truly open source software. Time will tell, I guess. (Edit: typo)
> I don't know the motivation behind Astral's approach, but this acquisition does leave a weird taste behind about how serious they were about truly open source software.<p>It was because Astral was VC funded.<p><a href="https://astral.sh/blog/announcing-astral-the-company-behind-ruff" rel="nofollow">https://astral.sh/blog/announcing-astral-the-company-behind-...</a>
My hope would be that this eventually pushes pip to adopt a similar feature-set and performance improvements. It's always a better story when the built-in tool is adequate instead of having to pick something. And yes UV is rust but it's pretty clear that Python could provide something within 2-5x the speed.
The problem is funding.<p>There seems to be a pervasive belief that the Python tooling and interpreter suck and are slow because the maintainers don’t care, or aren’t capable.<p>The actual problem is that there isn’t enough money to develop all of these systems properly.<p>Google says that Astral had 15 team members. Of course, it’s so hard to make these projections. But it wouldn’t shock me if uv and ruff are each individually multi-million dollar pieces of software.<p>If you’d like to invest a million dollars to improve pip, or work for free for 3 years to do it yourself, I’m not sure anyone would object.
pip isn't exactly a "built-in" tool. Beyond the python distribution having a stub module that downloads pip for you.
`ensurepip` does not "download pip for you". It bootstraps pip from a wheel included in a standard library sub-folder, (running pip's own code from within that wheel, using Python's built-in `zipimport` functionality).<p>That bootstrapping process just installs the wheel's contents, no Internet connection required. (Pip does, of course, download pip for you when you run its self-upgrade — since the standard library wheel will usually be out of date).
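The bundled-wheel claim is easy to check from the standard library itself. A quick sketch (on standard CPython builds; note that some Linux distros unbundle `ensurepip` into a separate package, in which case the `_bundled` directory may be absent):

```python
import ensurepip
import os

# Version of the pip wheel shipped with this interpreter's standard library.
print(ensurepip.version())

# On a standard CPython install the wheel sits in ensurepip/_bundled/
# next to the module -- no network access is needed to bootstrap from it.
bundled = os.path.join(os.path.dirname(ensurepip.__file__), "_bundled")
if os.path.isdir(bundled):
    print(sorted(os.listdir(bundled)))
```

Running `python -m ensurepip` then installs that wheel's contents offline, as described above.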
These tools are open source, if they lock them down the community will just fork them.
Nice idea in theory; in practice, how many folks down in Nebraska are going to show up?
as someone who works in the python tooling space I think you underestimate the number of people who would be willing to do this. i would personally help maintain a community fork of ruff if it got to the point where one was needed, though I draw the line at moving to nebraska first.
isn't that the point of open source software? Like when Oracle bought Sun: someone forked MySQL and created MariaDB.
Indeed, and how much has it kept the pace versus the alternatives?<p>Also, the survivors are the exception, not the rule.
This might be true for uv and ruff, and hopefully that will happen. But pyx is a platform with associated hosting and if successful would lock people into the Astral ecosystem, even if the code itself was open source.
I never adopted them, keep using mostly Python written stuff.<p>Either pay for the product, or use stuff that isn't dependent on VC money, this is always how it ends.
> I never adopted them, keep using mostly Python written stuff.<p>Maybe you use non-transitive pure Python dependencies, but it's likely that your tools and dependencies still rely on stuff in Rust or C (e.g.: py-cryptography and Python itself respectively).
I use mostly the batteries, given that the only purpose I have for Python, since version 1.6, is UNIX scripting tasks, beyond shell.<p>As mentioned multiple times, since my experience with Tcl and continuously rewriting stuff in C, I tend to avoid languages that don't come with JIT, or AOT, in the reference tooling.<p>I tend to work with Java, .NET, node, C++, for application code.<p>Naturally AI now changes that, still I tend to focus on approaches that are more classical Python with pip, venv, stuff written in C or C++ that is around for years.
There are ways to independently fund open source projects, though. I have previously contributed to the Python Software Foundation and to individual open source maintainers through GitHub donations (which are not dependent on GitHub, as there are many alternatives). Projects like the Linux Foundation exist, too. And government funding, especially for scientific endeavors or where software is used to fulfill critical state tasks, is an option, too. I refuse to subject to the hypercommercialization of software and still believe in the principles behind open source.
As opposed to Pip, which is obviously free and sustainable forever.
Would single maintainers of critical open source projects be a better situation?
I don't know how to search for that report, can you share it?
> This is a serious risk for the open source ecosystem and particularly the scientific ecosystem that over the last years has adopted many of these technologies.<p>At worst, it's just Anaconda II AI Boogaloo. The ecosystems will evolve and overcome, or will die and different ecosystems rise to meet the need going forward.<p>I anticipate OpenAI will get bored and ignore Astral's tools. Software entropy will do its thing and we will remember an actively developed uv as the good old days until something similar to cargo gets adopted as part of Python's standard distribution.
Possibly the worst possible news for the Python ecosystem. Absolutely devastating. Congrats to the team
Can't blame you for not trusting OpenAI, but it seems to me they would gain very little from fucking up uv (or more precisely doing things that have a side effect of fucking up uv), and they have tons of incentive to cultivate developer good will. Better to think of buying and supporting a project like this as a <i>very</i> cheap way to make developers think they're not so bad.
One thing to keep in mind is that uv was the product of a whole lot of packaging PEPs finally landing and standards being set. That combined with not having to support all the old baggage meant they could have an effect modernizing community packaging standards.<p>I hope those two factors mean that if things go really wrong, then the clean(ish) break with all the non standard complex legacy means an easier future for community packaging efforts.
I hope they got paid, I will be very sad if they didn't at least get G5 money.
Curious how well upstream contributors or projects get compensated in these sorts of headline-grabbing acquisitions (probably not at all, unfortunately).<p>OpenClaw notably was built around Mario Zechner's pi[0]; uv I believe was highly adapted from Armin Ronacher's rye[1], and uses indygreg's python-build-standalone[2] for distributing Python builds (both of which were eventually transferred to Astral).<p>[0]: <a href="https://github.com/badlogic/pi-mono" rel="nofollow">https://github.com/badlogic/pi-mono</a><p>[1]: <a href="https://github.com/astral-sh/rye" rel="nofollow">https://github.com/astral-sh/rye</a><p>[2]: <a href="https://github.com/astral-sh/python-build-standalone" rel="nofollow">https://github.com/astral-sh/python-build-standalone</a>
On the other hand, we get to see what other thing will try to replace pip
Yeah, no. There is much worse news than this.<p>In the worst case, Astral will stop developing their tools, and someone else will pick them up and continue polishing them. In the best case, they will just continue as they did until now, and nothing will really change on that front.<p>Astral is doing good work, but their greatest benefit for the ecosystem so far was showing what's possible and how it's done. Now everyone can take up the quest from here and continue. So any possible harm from here on out will not be that deep; at worst we will be missing out on many more cool things they could have built.
as a long-time python-hater, i am jack’s smug smiling face
Let the forks roll!
UV_DISABLE_AGENT=1 UV_DISABLE_AI_HINTS=1 uv add
Not who I would've liked to acquire Astral. As long as OpenAI doesn't force bad decisions on to Astral too hard, I'm very happy for the Astral team. They've been making some of the best Python tooling that has made the ecosystem so much better IME.
If Codex’s core quality is anything to go by, it’s time to create a community fork of UV
Eh, if it turns out to be too bad I guess I’ll just end up switching back to pipenv, which is the closest thing to uv (especially due to the automatic Python version management, but not as fast).
I would much rather use pipenv, if only it had the speed of uv.<p>Every interface Kenneth Reitz originally designed was fantastic to learn and use. I wish the influx of all these non-pythonistas changing the language over the last 10 years or so would go back and learn from his stuff.
Does pipenv download and install prebuilt interpreters when managing Python versions? Last I used it it relied on pyenv to do a local build, which is incredibly finicky on heterogenous fleets of computers.
People would just make pipenv fast? There are some new tools that can help with that..
The priorities of the tooling will change to help agents instead of human users directly. That's all that's happening.
This has me thinking about VS Code and VS Codium. I've used VS Code for a while now, but recently grew annoyed at the increasingly prevalent prompts to subscribe to various Microsoft AI tools. I know you can make them go away, but if you bounce between different systems, and particularly deal with installing VS Code on a regular basis, it becomes annoying.<p>I started using VS Codium, and it feels like using VS Code before the AI hype era. I wonder if we're going to see a commercial version of uv bloated with the things OpenAI wants us all to use, and a community version that's more like the uv we're using right now.
MS is actively making your life using VS Codium a pain. They removed the download button from the extension marketplace, making it very difficult to download extensions and install them in VS Codium, since VS Codium does not have access to the official MS extension marketplace. Many don't publish outside the marketplace, for example PlatformIO. [1]<p>[1] <a href="https://github.com/platformio/platformio-vscode-ide/issues/1802#issuecomment-3370225987" rel="nofollow">https://github.com/platformio/platformio-vscode-ide/issues/1...</a>
Also, Microsoft does not allow use of their LSP for python. You have to use the barebones Jedi LSP.
I've not struggled to find the things I need at <a href="https://open-vsx.org" rel="nofollow">https://open-vsx.org</a> (usually by searching directly within VSCodium), but then I only use it for editing things like markdown docs and presentations, LaTeX/Typst, rather than coding, which I prefer to do in a terminal and with a modal editor.
1. You can add a bookmark that executes enough JavaScript to download the VSIX as usual.<p>2. I think you can patch VSCodium's product.json to point at VSCode's marketplace, though that probably gets overwritten on every update.<p>Honestly though, it's easier to disable ~three settings in VSCode and call it a day.
Luckily I avoided extensions before switching to VS Codium.<p>Glad to hear that I am avoiding Microsoft's spam.
I really wanted to use vscodium but had to go back to vscode proper because the remote ssh extension is just nowhere near as good. The open source one uses a JS library to implement the SSH protocol rather than using a system binary which means many features (GSSAPI) aren't supported. Also just seems like a bad idea to use an SSH implementation that's not nearly as battle tested as openssh...
Not often that I audibly groan at a HN headline :-(
Same here. I’ve adopted uv across all of my Python projects and couldn’t be happier. ty looks very promising as well.<p>Probably inevitable, and I don’t blame the team, I just wish it were someone else.
Monkey paw curls tight<p>Microsoft acquires Astral<p>Wish comes with a cost
I kind of feel like the nature of the Python ecosystem is a dozen or so extremely useful frameworks/tools that everyone uses heavily for 3 years and then abandons and never speaks of again.<p>I'm not very deep in Python anymore, but every time I dip my toes back in it's a completely different set of tools, with some noticably rare exceptions (eg, numpy).
i think most of the previous package management tools have been like 70% solutions - for 70% of things, they're much better than previous tools, but then 30% of the problem goes unaddressed. For example, poetry does a good job at managing your package, but decided not to manage python versions. So, I still need some other bootstrap process involving a huge pile of bash.<p>uv feels like the first 100% coverage solution i've seen in the python space, and it also has great developer experience.<p>I can't speak to the rest of the ecosystem, when I write python it's extremely boring and I'm still using Flask same as i was in 2016.
I can't see how the ecosystem evolving could be a bad thing.
Ty, Ruff, UV, all great tools I recently started really using and I couldn't be happier with them.<p>Sigh
I think it may be the first time I am actually upset by an acquisition announcement. I am usually like "well, it is what it is", but this time it just feels like betrayal.
This is a weird pattern across OpenAI/Anthropic: buying startups building better tooling.<p>I don't really see the value for OAI/Anthropic, but it's nice to know that uv (+ ty and many others) and Bun will stay maintained!
Somebody took a deeper look at Claude Code and claims to find evidence of Anthropic's PaaS offering [1]. There's certainly money to be made by offering a nice platform where "citizen developers" can push their code.<p>From Astral the (fast) linter and type checker are pretty useful companions for agentic development.<p>[1] <a href="https://x.com/AprilNEA/status/2034209430158619084" rel="nofollow">https://x.com/AprilNEA/status/2034209430158619084</a>
`uv agent` and `bun agent` in 3....2.....1....
Good that they got some money and a longer runway, but I have my doubts the product will improve rather than be smothered to death.<p>Embrace, extend, extinguish. Time will tell.
The value is to control the tool chain from idea to production so it can be automated by agents. It's no secret that the final goal is to fully replace developers, the flow "idea to production". It's easier to control that flow if you control each tool and every step.<p>I won't be surprised if the next step is to acquire CI/CD tools.
Does OpenAI use a lot of python?<p>There is the literal benefit of "we use the hell out of this tool, we need to make sure it stays usable for us" and then there is what they can learn from or coerce the community in to doing.
I don't know about OpenAI using a lot of Python, but Astral builds all their tools in Rust and just exposes Python bindings. Codex is all Rust. It feels like a reasonable acquisition from that perspective. They're banking, at least in part, on the Astral team being able to integrate with and supercharge Codex.
Yeah, it’s all mostly heaps of Python internally aside from Codex.
They probably prompted for what they should do next and got this as a half-hallucinated response lol
> it's nice to know that uv (+ ty and many others) and Bun will stay maintained!<p>Depends if you think the bubble is going to pop, I suppose. In some sense, independence was insulation.
Isn't this something to do with their paid pyx(as opposed to ty/ruff etc) thingy?
I'm not so sure. I sort of wish they hadn't been acquired because these sort of acquihires usually result in stifling the competition while the incumbent stagnates. It definitely is an acquihire given OpenAI explicitly states they'll be joining the Codex team and only that their existing open-source projects will remain "maintained".
I'm expecting Anthropic to buy Zed
I mean they are “startups” on the way to mega-companies. They need internal tooling to match.
Why do you think that uv, etc. will stay maintained? They will for now, but as soon as cash is tight at OpenAI, they'll get culled so fast that you won't see it coming. This is the risk.
Noooo, uv, you were the chosen one! (meme)<p>Jokes aside, these tools are currently absolutely free to use, but imagine a future when your employers demand you use Claude Code because that's the only license agreement they have, and they stop their AI agents from using uv. Sure, we all know how to use uv, but there will also come a time and place when they will ask us to not write a single line of code manually, especially if you have your agents running in "feared the most by clueless middle managers" "production".<p>Are you ready for factionalism and sandbox wars? Because I'm not. I just want to write my code, push to "production" as I see fit and be happy as pixels start shifting around.
great for astral, sucks for uv. was nice to have sane tooling at least for a few years, thanks for the gift.
I really hope they don't kill off uv or turn it into some way to sell OpenAI services. But I suspect that's exactly what's going to happen :(
I don't know. yarn never really turned into a vehicle to sell Facebook, though you always kind of transiently knew it was FB that offered it. I imagine that sort of transient advertising is its own value, too.
Time for the PSF to consider something inspired by uv as a native solution.
This feels like a natural move given how important developer tooling has become to the AI ecosystem. If OpenAI can integrate Astral’s tooling into workflows more deeply (linting, packaging, etc.), it could significantly reduce friction for building AI-powered apps.<p>The interesting question is whether Astral stays relatively independent (like GitHub under Microsoft) or becomes tightly coupled to OpenAI’s platform.
Woah, first Anthropic buys Bun, now OpenAI Astral?<p>Seems like the big AI players love buying up the good dev tooling companies.<p>I hope this means the Astral folks can keep doing what they are doing, because I absolutely love uv (ruff is pretty nice too).
> I hope this means the Astral folks can keep doing what they are doing, because I absolutely love uv (ruff is pretty nice too).<p>That is definitely the plan!
Being in this industry for over 20 years probably jaded me a lot, I understand that's the plan but it's almost always the plan (or publicly stated as).<p>Only time will tell if it will not affect the ecosystem negatively, best of luck though, I really hope this time is different™.
I've been in the industry for similarly long, and I understand and sympathize with this view. All I can say is that _right now_, we're committed to maintaining our open-source tools with the same level of effort, care, and attention to detail as before. That does not change with this acquisition. No one can guarantee how motives, incentives, and decisions might change years down the line. But that's why we bake optionality into it with the tools being permissively licensed. That makes the worst-case scenarios have the shape of "fork and move on", and not "software disappears forever".
I personally get a lot of confidence in the permissive licensing (both in the current code quality, and the "backup plan" that I can keep using it in the event of an Astralnomical emergency); thank you for being open source!
> No one can guarantee how motives, incentives, and decisions might change years down the line. But that's why we bake optionality into it with the tools being permissively licensed. That makes the worst-case scenarios have the shape of "fork and move on", and not "software disappears forever".<p>Okay, so better prepare already, folks!
> All I can say is that _right now_<p>I've also been in the industry for similarly long and I believe you 100% that's "all you can say" ;)
Literally there is no public comment you are allowed to make that we haven't heard 100 times before.<p>Congratulations though!
JS vs Python wars, redux?
Yes just like "Whatsapp is joining Facebook but it'll always be free and independent"!!
>Seems like the big AI players love buying up the good dev tooling companies.<p>Would be a good mustache-twirling cartoon villain tactics, you know, try to prevent advances in developer experience to make vibecoding more attractive =)
It also hints even The Big Guys can’t LLM their tooling fully, and that current bleeding edge “AI” companies are doing that IT thing of making IT for IT (ie dev components, tooling, etc), instead of conquering some entire market on one continent or the other…
You know it's absolutely going that way. That's the lifecycle of corporate strategy.
This is good news to me, considering their open-source nature. If/when they go downhill there will still be the option to fork, and the previous work will still have been funded.<p>Now, for those wondering who would fork and maintain it for free: that is more a critique of FOSS in general.
Uff. Reading all the comments made my head hurt.<p>I love(d) `uv`. I think it's one of the best tools around for the Python ecosystem... hence the pit in my tummy when I read this.<p>Yes, congrats to the team and all that.<p>I'm more worried about the long term impact on the ecosystem, as is almost everybody who dropped a comment here.<p>My own thoughts echo somewhat what @SimonW wrote here [1]<p>[1] <a href="https://simonwillison.net/2026/Mar/19/openai-acquiring-astral/" rel="nofollow">https://simonwillison.net/2026/Mar/19/openai-acquiring-astra...</a><p>However, a forking strategy may (or may not) be the best thing for `uv`.<p>Could we count on the Astral team to keep uv in a separate foundation?
I love uv and the other tooling Astral has built. It really helped reinvigorate my love for Python over the last year.<p>Something like this was always inevitable. I just hope it doesn’t ruin a good thing.
I see a lot of comments that are "somebody should fork this" or "community will fork it" or similar.<p>I didn't see a single comment of "I will fork it" type.
Welp. I used to respect Astral. I hope someone responsible forks their Python tooling and maintains it. Ideally a foundation rather than a company.
I don't think it's a coincidence that a lot of the (stale) tooling that Astral replaced were managed by foundations instead of companies...
Yeah, well, the fact is that every person who ever touches Python needed uv, but only Astral folks created it. So, nope, there's no one capable of filling the void, just accept that it's fucked now. The best die first.
25 years of Python behind me, please let me tell you that hopefully you're wrong: we don't "need" uv :)
I write python all the time and I've never used it
If that were true, Astral wouldn't have been able to build it in the first place. It's an Open Source tool. Perhaps folks excited about working on it can move to the Python Foundation and maintain it there. Perhaps companies who saw today's acquisition and became deeply worried about the future of this tooling could help support and fund such an effort.
Part of the reason Astral as a team is so well liked is precisely <i>because</i> they are not part of the main fold or related to "Core Python"; they are an independent vendor, one that delivered high quality code and listened directly to users and their own (extensive) experience to do so, and they succeeded at that repeatedly. Python packaging has {been seen as, actually been} miserable <i>for years</i>, and so by the same token the capacity to believe in/buy into solutions from the "core project" has dwindled. "If it took Astral to fix it, why would it be any different going forward?"<p>So that's all it really comes down to; uv isn't loved just because it's great but because it is <i>in good hands</i>. This real/perceived change of hands pretty much explains all the downstream responses to the news that you see in this thread. Regardless of who bought them, any fork is going to have very, very big shoes to fill, and filling those shoes appropriately is the big worry.
Does that not also suggest (cautious, make sure we back it up with our actions) optimism about this acquisition? We're not breaking up the band. These tools will be in the same hands as before. And it would be extremely value-destructive to bring in a team like ours and then undermine what made us valued and successful.
Fair enough. But that does seem like something that'd depend more on the people than the organization. Whoever forks it will need to be trusted to continue to be "good hands", whatever organization they operate under the auspices of.
OpenAI continues to send mixed messages.<p>"Our AI can do anything a human can do, better, faster, cheaper" -> Buys a product instead of asking their AI to just make it.<p>Really doesn't give me confidence in your product!!!!
As someone who loves Astral and hates OpenAI, this is making me pretty sad.
I feel some <i>"commoditize your complements"</i> (Spolsky) vibes hearing about these acquisitions. Or, potentially, "<i>control</i> your complements"?<p>If you find your popular, expensive tool leans heavily upon third party tools, it doesn't seem a crazy idea to purchase them for peanuts (compared to your overall worth) to both optimize your tool to use them better and, maybe, reduce the efficacy of how your competitors use them (like changing the API over time, controlling the feature roadmap, etc.) Or maybe I'm being paranoid :-)
I don’t know who I would’ve liked to see buy them, but OpenAI is not it. Sad day for uv, ruff and ty users.
Happy for the team, sad for users. I just don’t believe their work will continue under new ownership
What happens when OpenAI’s burn dries up their cash?
They mysteriously gain a lot of government contracts.<p>In a completely unrelated event, Donald sues Sam for 10M$ for calling him old, Sam grudgingly agrees to pay him 16M$ and a beer.
That's where taxpayers come in as the ultimate bagholder.
They get more money from investors, go public, or get bought.
Taxpayers bail them out.
`uv` will get PIP'd
RAM prices go down. My hope, though, is that the period of elevated RAM prices will push Electron apps out of the market.
$110B will surely last for at least a year.
That money is going directly to Jensen as quickly as possible to secure OpenAI's place in the delivery queue
Was that $110B in cash? I wouldn't be surprised if it's based on something else (paid for using deals for stock).
"We must be regulated to contain the nuclear bomb like power of our products. Oh look it escaped again!", etc
Reading this news only leaves me worried about long-term future of these open source tools.
I have long found the VC model for open source questionable. If you are not selling popular enough direct enterprise support, what is the model to actually make money?<p>Take ruff: I have used it, but I had no idea it even had a company behind it... And I must not be the only one, and it must not be the only tool like it...
And so, more core functionality developers depend on becomes dependent on a continuing stream of billions in VC funding. What could go wrong?
Well, that's the first shoe dropping. Thankfully uv and ruff are MIT licensed and in a good place, so worst comes to worst...
The "commitment to open source" line in these press releases usually has a half-life of about 18 months before the telemetry starts getting invasive.
> uvex init my_new_slop_project --describe "make me the bestest saas that will make $1M ARR per day"
--disable_thinking
--disable_slop_scaffolded_feature<p>> uvex add other_slop_project --disable_peddled_package_recommendations<p>> implicitly phoning home your project, all source code, its metadata, and inferring whether your idea/use-case is worth steamrolling with their own version.<p>This is the future of “development”. Congrats to the team.
I am very unhappy about this. Astral tools like uv are key to my work/experimenting process. I think OpenAI sucks as a company.<p>That said, I hope the excellent Astral team got a good payday.
Congrats to the Astral team, they've done great work and deserve everything.<p>As a user of uv who was hoping it would be a long term stable predictable uninteresting part of my toolchain this sucks, right?
All other devtools now will follow suit - to get acquihired by openai or anthropic.
And this is why we don't use tools by VC funded corps.
I'm into this.<p>Anthropic acquiring Bun, now OpenAI acquiring Astral. Both show the big labs recognize that great AI coding tools require great developer tooling, and they are willing to pay for it rather than build inferior alternatives. Good outcome for the teams.<p>Not exactly a great look for the "AGI is right around the corner" crowd — if the labs had it, they would not need to buy software from humans.
This wasn't a software acquisition at all. They were already able to use Astral's software just like anyone else. They wanted a good dev-tools team so that's what they bought.<p>> the Astral team will join the Codex team at OpenAI and over time, we’ll explore deeper integrations that allow Codex to interact more directly with the tools developers already use
> we’ll explore deeper integrations that allow Codex to interact more directly with the tools developers already use<p>Gross.
That's the cognitive dissonance.<p>How can all of these be true at the same time:<p>Our AI will replace every SWE in the next 5 yrs<p>Our tooling is bad<p>We need to buy a company that brought a revolution in Python tooling<p>If LLMs and GPT-5 Codex are so good, then how isn't all tooling related to OpenAI being written by it?<p>Same goes for Claude. They say Claude writes Claude... Why isn't Claude Code then written by Claude in such a way that it has a platform-specific optimized binary rather than an Electron app?<p>It's almost as if they're generating hype for their sales/valuation
I'm assuming that they were buying great Rust devs (given codex is written in it).
This is your friendly PSA that pip-tools still exist.<p><a href="https://github.com/jazzband/pip-tools" rel="nofollow">https://github.com/jazzband/pip-tools</a>
goddammit<p><a href="https://jazzband.co/news/2026/03/14/sunsetting-jazzband" rel="nofollow">https://jazzband.co/news/2026/03/14/sunsetting-jazzband</a>
Ah but that's not a shiny new tool I can add to my cv /s
Get off it.<p>* uv is better than pip-tools in every single way, apart from it being VC-backed.<p>* Nobody is placing any value on uv being on someone’s CV.<p>You are essentially incorrectly LLM pattern-matching being a jaded, sarcastic “smart person”. Learn to tell the difference between a “shiny new thing” and a legitimately better tool. If you can’t do that, you’re at best a sub-par developer.
uv and ruff are among the best things that have happened in the Python ecosystem in the last few years. I hope this acquisition does not put them on a path to doom.
Astral took over python-build-standalone, and uv uses its custom Python builds in deployments.<p>I do not want OpenAI putting their fingers in my Python binaries, nor do I want their telemetry.
I like uv, but not sure this is a good path forward for the python ecosystem.
why? lots of good work came to Python by people who were sponsored by big tech companies. make Python better for them, and for a lot of other people too.<p>(sure, it's a bit different than contributing to CPython, but I'd argue not that different)
It is VERY different. One company now has complete control of the activities of the team developing these tools. Contributing to Python (money or time) gets you some influence, but doesn't allow you to dictate anything - there's still a team making the decisions.
Your heuristic is terrible. It’s not about it being a “big tech company”. I’m sure that you can be more intelligent and nuanced than that. It’s about fit, aligned goals, aligned vision, aligned incentives.<p>I do not see any alignment between the Astral that I want Astral to be, and OpenAI.
Mixed feelings, happy for the guy, he deserves it. Unhappy about whom he went with, though not sure if he had other buyers / offers in the mix?
This will solve the problem of when the package you want to install doesn't exist yet.
Astral was always going to have to find <i>some</i> way to sustain itself financially. They weren’t going to just make the best free tools in the ecosystem forever. uv is sufficiently entrenched as infrastructure that I’m sure it’ll take no time for a community fork to show up if they do anything stupid with it.
Personally, I'd expect a few good years of stewardship, and then a decline in investment. I can only hope there are enough community members to keep things going by then.
They should be allowed to make money from their work. Their work is MIT licensed, if it goes south it is rescuable by the community.<p>Things come and go, let’s not beat up some dudes who made some cool stuff, made everyone’s lives easier and then sold up. There is a timeline where this makes UV / python better.
This is why I still like to set up projects and environments with my own `make`, `venv`, and `pip`.
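For anyone unfamiliar with that workflow, it really is just a few lines of stdlib tooling (paths and file names here are only illustrative):

```shell
# Minimal no-uv setup: stdlib venv + pip, nothing installed globally.
python3 -m venv .venv                                 # create an isolated environment
.venv/bin/python -c 'import sys; print(sys.prefix)'   # this interpreter lives inside .venv
# .venv/bin/pip install -r requirements.txt           # then install your pinned deps
```

Wrap those lines in `make` targets and you have a reproducible-enough setup with zero third-party bootstrap dependencies.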
No it’s not. You do it because you didn’t see the point in changing what you do. Don’t pretend that this is your “I told you so” moment.
The QOL improvement offered by uv over your approach is certainly substantial, but it’s also easy to get off of uv. Nobody here feels legitimately “trapped” in the uv “ecosystem”. For 99.999% of Python projects, if they need to get off of uv, it’s going to be a very quick thing to do.<p>The disappointment and anger is because we’ve had a nice QOL improvement which is now more directly threatened in a way that it wasn’t before, and it’s always hard to go backwards. A QOL improvement that you never had in the first place. So…congrats?<p>Unless your point is “this is why I deprive myself of nice things, because they can go away”…which is just silly.
Yes, I think we are mostly in agreement. I tried `uv` several times and never wanted to add it as a dependency because I wasn't convinced by the QOL. That being said, the biggest concern I always had was just that it was adding another layer of complexity and similar to Conda or pyenv it just wasn't something I was going to like.<p>I probably did come off a bit 'told you so' but I guess it was more that it felt like this was finally an answer to a question/curiosity I've had about `uv` where I didn't understand the dissonance between how others felt about it and how I did.
What excites me about the OpenAI + Astral acquisition: Codex CLI, uv, and ruff are all written in Rust. Fast by design, and fully open source.
While I -- like most other commenters -- am dubious of both OpenAI and this acquisition, I think it's pretty reasonable to wait to see how this turns out before rushing to final judgment.<p>Everything I've seen from Astral and Charlie indicates they're brilliant, caring, and overall reasonable folks. I think it's unfair to jump to call them sell-outs and cast uv and the rest as doomed projects.
Sure, I don't think anybody disagrees, I sure don't. You never know, and all. It's just that we (the imaginary crowd you are arguing with) are not hopeful. And your "but wait, guys, I think they are good people!" is some quite pitiful attempt to console us. Sure, good, brilliant and caring, that's why we are upset in the first place. Always more sad when it's somebody you liked that dies.<p>And framing it as "sell-outs" is cheap rhetoric that means nothing. The fact is, they were the company who never really had a solid business model, but provide a lot of value for the community. Being acquired by some infinite-money company was always the best outcome they could hope for. Well, they did. Probably got a ton of money. Will it require some sacrifice? Well, some people would say that working for a company who makes products for the Department of War of the USA on conditions that even Anthropic found too ugly to satisfy, is enough of a sacrifice on its own. I am pretty sure though that most people would be willing to make this sacrifice for the right amount of money (with "right amount" being a variable part). So calling someone a sell-out is usually just bitterness about the fact that it wasn't you who managed to sell out. I mean, not judging someone for a sacrifice they make isn't the same thing as pretending they didn't make a sacrifice. Sometimes we (the world, they were trying to make better) are a sacrifice. That's all.
Charlie's fine. OpenAI are the problem here. Similar situation to steipete. Happy for the person, sad for the tool/ecosystem/everyone else.
Not similar at all. One has been a miracle for the Python ecosystem, another was a small scale Twitter hype-fart.
I suppose my point is: I would expect that Charlie and co. carried their negotiations with OpenAI with the same laser-focused, careful judgment that catapulted Astral to success in the first place. I don't mean to fanboy, but I generally trust that they made the best decision for not only them, but the Python community as a whole.
We always "wait and see" and it always turns out terrible. Even if the original founders stay on, eventually they will get pushed out when their morals conflict with company goals. Won't happen overnight, but uv will enshittify eventually.
As a non-Python dev I really thought uv and ty were great tools and liked their approaches, but I don't know how good it is that they are privately held... not a fan
Related (OpenAI announcement): <a href="https://news.ycombinator.com/item?id=47438716">https://news.ycombinator.com/item?id=47438716</a>
Given their statements, influencers and indeed their raison d'être I don't understand why this and bun was acquired? Why did they not just Ralph loop it? Claude is famously not made by humans anymore.<p>One prompt and call it a weekend. Surely they have the compute.<p>Oooooh, right.<p>Same reason we don't have windows 41 either.
The Bun acquisition made a little sense, Boris wanted Daddy Jarred to come clean up his mess, and Jarred is 100% able to deliver.<p>This doesn't make as much sense. OpenAI has a better low level engineering team and they don't have a hot mess with traction like Anthropic did. This seems more about acquiring people with dev ergonomics vision to push product direction, which I don't see being a huge win.
They do have a hot mess with traction amongst developers. Codex is far behind Claude Code (in both the GUI and TUI forms), and OpenAI's chief of applications recently announced a pivot to focus more on "productivity" (i.e. software and enterprise verticals) because B2B yields a lot more revenue than B2C.
Haha just migrated everything off openai and on ruff/uv/ty last week. Sorry guys, it's clearly my fault.
they said they will keep maintaining uv/ruff/ty... but that's an impossible promise to keep once their priorities change now that they're in bed with OAI.
Happy for the devs, they deserve the presumably massive payout for the amount of value they’ve brought to the Python community.
I don't get it. Why buy Astral? Why not just fund it? Why not just hire the company to do whatever work you want to get out of the team as part of the merger?<p>Why buy, when they can rent?<p>(Not to mention, multiple companies could hire and fund them.)
F*CK. take everything from me why dontcha?
UV, Ruff, and Ty are all very good things, hopefully that doesn't change and gets better.
i feel like moves like this make it even harder for new open-source tools to break through. there's already evidence that LLMs are biased toward established tools in their training data (you can check it here <a href="https://amplifying.ai/research/claude-code-picks" rel="nofollow">https://amplifying.ai/research/claude-code-picks</a>). when a dominant player acquires the most popular toolchain in an ecosystem, that bias only deepens. not because of any skewing, but because the acquired tools get more usage, more documentation, more community content. getting a new project into model weights at meaningful scale is already really hard. acquisitions like this make it even harder.
I'm also concerned about this, but I feel as though uv and ruff's explosive growth happening alongside and despite that of LLMs demonstrates that it's not a show-stopper. I vividly recall LLM coding agents defaulting to pip/poetry and black/flake8, etc. for new projects. It still does that to some extent, but I see them using uv and ruff by default -- without any steering from me -- with far greater frequency.<p>Perhaps it's naive optimism, but I generally have hope that new and improved tools will continue to gain adoption and shine through in the training data, especially as post-training and continual learning improve.
Tested the "Kagi LinkedIn Speak" translator[0] from a couple of days ago[1] on this. Works pretty great! If you translate it back and forth a number of times, it pretty much distills it to the essence.<p>[0] <a href="https://translate.kagi.com/?from=linkedin&to=en" rel="nofollow">https://translate.kagi.com/?from=linkedin&to=en</a><p>[1] <a href="https://news.ycombinator.com/item?id=47408703">https://news.ycombinator.com/item?id=47408703</a>
I hope OpenAI realizes they cannot buy developer goodwill.
They are not trying to buy developer goodwill, they are trying to catch up with Anthropic in terms of getting those B2B contracts, which is currently the most realistic path towards not running out of money.
1. The Register reports OpenAI is well ahead of Anthropic in B2B contracts. It's Anthropic playing catch-up, not OpenAI.<p>2. In any case, the announcement strongly suggests that customer acquisition had little to do with this. The stated purpose of the acquisition, as I read it, is an acquisition (plus acquihire?) to bolster their Codex product.<p>3. But if they were hoping for some developer goodwill as a secondary effect... well, see my note above.
I do hope everyone at Astral got a nice pay-out for this.<p>It does look like this is going to be the norm for popular open source projects related to the AI ecosystem, but I guess open source developers need to get paid somehow if a project is their only livelihood.<p>Shame for the end-user though, as you will always be second-guessing how they will ruin the tool, e.g. via data collection or AI-sloppifying it. It is likely OpenAI won't, but it is not a great feeling knowing a convenient tool you use is at the whim of a heartless mega-corp.
Interesting acquihire. I would have assumed MS would have snagged them (until their __layoffs__ last year). My gut is that this is more for Python expertise, and ruff/ty knowledge of linting code, than uv...<p>I'm a heavy user and instructor of uv. I'm teaching a course next week that features uv and ruff (as does my recent Effective Testing book).<p>Interesting to read the comments about looking for a change. Honestly, uv is so much better than anything else in the Python community right now. We've used projects sponsored by Meta (and other questionable companies) in the past. I'm going to continue enjoying uv while I can.
these (uv and bun) are not acquihires, they're acqui-rootaccess
Have you looked at the .github/ folder of any actively developed python packages lately? It has become difficult to find one where there <i>isn't</i> a few interesting people with code-execution-capable push/publish/cache-write access somewhere along the blown up transitive dependency/include chains.
We need to explore this angle. With OpenAI already acting as an intelligence-gathering apparatus for the US, this acquisition potentially gives it access to the code and environment variables of a good chunk of private projects, even ones that don't touch Codex.
Welp. Guess we just wait for the next package management tool to come around. Really thought uv was gonna be the one.<p>Good for Astral though I guess, they do great work. Just not optimistic this is gonna be good for python devs long term.
I'm confused as to what will happen to their platform product which was in closed beta - pyx. Since they no longer need to worry about money (I assume) they no longer need to chase after enterprise customers?
"OpenAI is focusing employee and investor attention on its enterprise business as the artificial intelligence startup gears up to go public, potentially by the end of the year, CNBC has learned."<p><a href="https://www.cnbc.com/2026/03/17/openai-preps-for-ipo-in-2026-says-chatgpt-must-be-productivity-tool.html" rel="nofollow">https://www.cnbc.com/2026/03/17/openai-preps-for-ipo-in-2026...</a>
Astral threads here have been surprisingly flag resistant and plentiful. This takeover explains a lot.<p>I suspect some OpenClaw "secure" sandbox is coming (Nvidia jealousy) with Astral delivering the packages for Docker within Docker within Qemu within Qubes. A self respecting AI stack must be convoluted.<p>I can't wait until all this implodes after the IPOs.
Would there be any interest in me fixing the bugs in Pyflow and getting it updated to install newer python versions? It's almost identical to uv in concept, but I haven't touched it in 6 years.<p>Astral has demonstrated that there is desire for this sort of "just works" thing, which I struggled with, and led me to abandoning it. (I.e.: "pip/venv/conda are fine, why do I want this?", despite my personal experience with those as high-friction)
Counting the days until "uv will stop being developed".
This might not be bad as long as Astral is allowed to continue to work on improving ty, uv and ruff. I do worry that they'll get distracted by their Codex job duties, though.
After investing a bunch in converting my projects to, and evangelizing uv, I feel betrayed. I smell stability troubles ahead. Should've stuck to Conda.
This is like, the worst outcome for Astral. This is severely disappointing. I say this as someone that uses Astral and OpenAI products practically every day. This is such a terrible fit.<p>I just hope that Charlie doesn’t trot around the dev circuit (like he has in years past) trying to sell everyone on this “being a good thing, actually”. I hope that he isn’t given the space to sell any story other than “we took the AI money despite it being a terrible fit”, because that story would just be a lie. The fact that this blog post is already trying to preemptively justify it—“well in my launch post what I said is…”—is extremely, extremely telling.<p>This is so hugely disappointing. And again, I am at this point quite bullish on AI. This isn’t a philosophical or anti-AI take at all, because those are easier to dismiss.<p>I’m not going to pretend to “congratulate the team” or whatever. As far as I’m concerned, that’s HN culture brain rot. Some of y’all in ‘startup culture’ may see getting acquired by OpenAI as some sort of big prize or worth celebrating or whatever, but I certainly don’t.
Fantastic for the team; I'm a huge fan of ruff and uv. I hope OpenAI continues with the OSS tooling and doesn't introduce restrictive licensing.
I’ve been thinking about purchasing zsh myself
to be expected at some point, but for the independence and best interest of the Python ecosystem, I don't think it's a plus.
It is interesting to see this after yesterday’s announcement of Unsloth Studio:<p><a href="https://news.ycombinator.com/item?id=47414032">https://news.ycombinator.com/item?id=47414032</a><p>Uv did solve a distribution problem for them.<p>There is still a lot of room to grow in the space of software packaging and distribution.
Great that I keep using traditional Python tools.
My initial reaction was being weirdly sad about this and I don't fully understand why yet. I read the headline, clicked into the link, and just went noooooooo. I really like uv and I hope it continues to do well, congrats to the team though and hope everyone there gets a good outcome.
Not surprised at all on this. I've been really suspicious about how hard `uv` was being pushed in 24/25.
I think the push has been entirely organic. Compared to existing tooling, uv is fantastically fast.<p>One of the bigger pain points I’ve faced in Python is dependency resolution. conda could take 30-60 minutes in some cases. uv took seconds.<p>A serious quality of life improvement.
> dependency resolution. conda could take 30-60 minutes<p>Quite literally this is what first raised the alarm bells for me. Dependency resolution complexity is more of a symptom. If that delay ends up being the point where Ops finally agrees that things have gone very wrong, then fixing that delay is not really helping hire the <i>maintenance</i> folks that can make those dependencies.. well, "dependable" again.
uv replaces pip (and venv, and pipx, and more), not conda. If you want a uv-equivalent that replaces conda, then look at pixi: <a href="https://pixi.prefix.dev/" rel="nofollow">https://pixi.prefix.dev/</a>
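To make the scope concrete, a rough command mapping (based on uv's documentation; flags simplified and not exhaustive):

```shell
# What uv subsumes, roughly (comments only; uv itself is not required to read this):
#   python -m venv .venv        ->  uv venv
#   pip install requests        ->  uv pip install requests
#   pipx install ruff           ->  uv tool install ruff
#   pip-compile requirements.in ->  uv pip compile requirements.in   (pip-tools workflow)
```

Note that conda-style binary packages (conda-forge, bioconda) are deliberately absent from this list; that's the gap pixi covers.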
Sure, it replaces pip, but _we used conda_ for env management. And the slow part was still _dependency resolution_, much like pip.<p>Was there a better option? I’m sure. Choices were made. Regrets were had. Switching to uv was a huge improvement for our purposes.
If all you are installing using conda is pypi packages and different versions of Python, then sure, uv is a fine replacement. I use it for that as well. But if you are using other conda/conda-forge/bioconda packages, then uv isn't a replacement since it cannot install those. However, Pixi can replace conda for that use-case and it's about as fast as uv since it uses uv's dependency resolution code
Are you legitimately unable to tell the difference between something being “pushed” and people just legitimately liking and using something?<p>If you’re going to be a grump about everything, sometimes your broken clock is going to be right. It’s still not worth showing off.
"was being pushed" ... by whom? I think there's widespread grassroots support for it because it's a good tool.
Hey now, I was a completely organic shill! I worked for free!
uv and ruff are incredible tools for python development, and I've loved my time using ty. This acquisition is absolutely terrible.
> It is increasingly clear to me that Codex is that frontier.<p>I'm not really sure about this.
I'd expect OpenAI to make some type of GitHub clone next, perhaps with Astral, or maybe with jujutsu.
Why? Github is already owned by Microsoft, who are deep in with OpenAI. And what worth would a Github-clone even have for the world? It's not like there is any important innovation left in that space at the moment, or are there any?
Will this be the beginning of the Great Rust vs Zig battle ?
That's unfortunate for the python community.
I can understand the move by Astral team, but it's still difficult to accept it.
I think it’s impossible to predict what will happen with this new trend of “large AI company acquires company making popular open source project”. The pessimist in me says that these products will either be enshittified over time, killed when the bubble bursts, or both. The pragmatist in me hopes that no matter what happens, uv and ruff will survive just like how many OSS projects have been forked or spun out of big companies. The optimist in me hopes that the extra money will push them to even greater heights, but the pessimist and the pragmatist beat the optimist to death a long time ago.
It’s open source. If you want it to go in a different direction fork it and take it in that direction. Instead of the optimist, the pessimist, and the pragmatist the guy you need is the chap who does some work.
I will start migrating from uv, ty and ruff first thing this weekend. It will be painful but not being dependent on OpenAI will be more than worth it.
nooooooooooooooo god why. I loved uv. just why
Company that repeatedly tells you software developers are obsoleted by their product buys more software developers instead of using said product to create software. Hmm.
I work at OpenAI. Software developers are not obsoleted by Codex or Claude Code, nor will they be soon.<p>For our teams, Codex is a massive productivity booster that actually increases the value of each dev. If you check our hiring page, you’ll see we are still hiring aggressively. Our ambitions are bigger than our current workforce, and we continue to pay top dollar for talented devs who want to join us in transforming how silicon chips provide value to humans.<p>Akin to how compilers reduced the demand for assembly but increased the demand for software engineering, I see Codex reducing the demand for hand-typed code but increasing the demand for software engineering. Codex can read and write code faster than you or me, but it still lacks a lot of intelligence and wisdom and context to do whole jobs autonomously.
This seems like a reasonable take. Maybe you could inform your CEO, the media and influencer sycophants, the tech companies that are laying off tens of thousands of developers while mandating the use of your company's tool, and everyone else responsible for us being inundated with outlandish claims that software engineering is dead on a literally daily basis. Hey, while I'm asking for wishes that won't be granted, maybe get people in your company to stop thinking they're so important that it's okay to buy 40% of the world's RAM supply with borrowed money, making it cost 4.5x as much for the rest of us?
They said it'll be good enough in two weeks, give them some time!
They're not buying developers, they are buying the whole ecosystem to produce software. Still aligned with their original message.
They're writing the software to end all softwares!
When someone at work talks about all software devs being replaced I link them to the Anthropic career pages.
As good as the team is, that's not what they're buying in this case.
What are they buying?
> Second, to our investors, especially Casey Aylward from Accel, who led our Seed and Series A, and Jennifer Li from Andreessen Horowitz, who led our Series B<p>They are buying out investors, it's like musical chairs.<p>The liquidity is going to be better on OpenAI, so it pleases everyone (less pressure from investors, more liquidity for investors).<p>The acquisition is just a collateral effect.
Are you implying that the revenue multiple on this acquisition is lower than OpenAI's, and that they'd make money by acquiring Astral and folding it into their own valuation multiple? I think that's not the case; I'd wager the revenue is nonexistent.<p>This was an acquihire (the author of ripgrep, rg, which Codex uses nearly exclusively for file operations, is part of the team at Astral).<p>So, 99% acquihire, 1% other financial trickery. I don't even know if Astral has any revenue or sells anything, candidly.
They raised 4M USD and have 26 full-time employees (paying $120K–$200K/yr, cf <a href="https://pitchbook.com/profiles/company/523411-93" rel="nofollow">https://pitchbook.com/profiles/company/523411-93</a> ).<p>That means the company had almost reached the end of its runway, so all these employees would have had to find new jobs.<p>It's a very, very good product, but it is open source and Apache/MIT licensed, so it's difficult to defend from anyone who just clicks fork, especially a large company like OpenAI with massive distribution.<p>Now that OpenAI has hired the employees, they have no more guarantees than if they had made direct offers to them.
So I don't see how the acquisition is collateral: it's an acquihire plain and simple, and if anything it's also supply-chain insurance, since they clearly use a lot of these tools downstream. As you noted, the licensing on the tools is extremely permissive, so there appears to be very little EV for an acquirer beyond the human capital building the tools or building out monetized features.<p>I'm not too plugged into the VC space for open-source/free tooling, but raising three rounds and growing your burn rate to $3M/yr in 24 months without revenue feels like a decently risky bag for those investors and staff, with no revenue path or exit. I'd be curious to know whether OpenAI went hunting for this or whether one of the investors placed it in their lap.<p>OpenAI has infamously been offering huge compensation packages to acquire talent, so this would be a relative bargain even at a modest valuation. As noted, Codex uses a lot of the tooling this team built here and previously. OpenAI has realized that competitors who do one thing better than them (like Claude with coding, before Codex) can open the door to disruption if they lapse; lots of people I know are moving to Claude for non-coding workflows because of its reputation and relatively mature, advanced client tools.
A brief note, your numbers are way off here — Astral subsequently raised a Series A and B (as mentioned in the blog post) but did not announce them. We were doing great financially.<p>(I work at Astral)
It seems you are one of the most active contributors there.<p>I would sincerely have understood (and even preferred) it if OpenAI had made a very generous offer to you personally as an individual contributor, rather than choosing a strategy where the main winners are the VCs of the purchased company.<p>From the outside, we see little to no revenue (no pricing page? no "contact us"? maybe some consulting?) and millions burned.<p>Whether it is 4 or 8 or 15M burned, no idea.<p>Who's going to fill that hole, and when? (Especially since PE funds work on five-year timelines, and the company dates from 2021.)<p>The end product is nice, but for an investor, being nice is not enough, so they must have deeper motives.
I mean you pirouetted onto the AI hype train before running out of working capital - I guess that's doing great financially by some definitions.
> They raised 4M USD<p>What was their pitch?
I can see why the former investors and Astral founders would like that, what I don't see is what OpenAI get out of the deal.
mindshare and a central piece of the python package management ecosystem.
IMO, they are buying businesses just to shut them down later and avoid potential competition. The recipe is not new; Google and Microsoft have practiced it for many years.
uv
And they're buying a company that writes tooling for Python in not-Python.
A tool might not be the best tool to build itself, doesn't mean it is not good.
You don't use a screwdriver to craft screwdrivers. Doesn't mean screwdrivers are inherently bad.
Next you’ll complain about CPython being written in C.
[flagged]
And buying a niche developer tool is helping with that?
which AI company hasn't?
"Fascism" is when military. The more military, the more fascist. According to this metric, the USSR / DDR with its "anti-fascist wall" was super extra fascist because they were armed to the teeth.
They were definitely totalitarian, with a slightly different mix of ideology. Fascist is a fairly good description here: it describes close collaboration between government and corporations to advance national goals. The US has had somewhat fascist tendencies for a long time now.
I was hoping that uv and ruff were the ones. I guess Python has a curse.
That didn't take long: <a href="https://news.ycombinator.com/item?id=46648627">https://news.ycombinator.com/item?id=46648627</a>
It was pretty obvious that some sort of acquisition was imminent. Astral is VC-funded and has to somehow generate returns for investors. An IPO is extremely unlikely in this market.
It was meant to be bought, so at least there's no more guessing.<p>Anthropic is building their app distribution platform, so it's no wonder OpenAI is thinking along the same lines; it would only surprise me if they moved slowly.
This feels weird: now if the AI bubble pops, it will take core technologies down with it, like runtimes (Bun) and package managers (Astral's uv).
now what's next? vite?
Time for <a href="https://zubanls.com/" rel="nofollow">https://zubanls.com/</a>
Great someone cashed out, time for the next startup idea.
Who advises on these acquisitions?<p>Or are they just using a dartboard?
Any move that strengthens future oligopolies is a net loss for all consumers.<p>I don't care how good/bad a company is, because I lived long enough to know that most of them started off like that. Good luck to the uv team.
If they just give Astral money to keep going, great, but I have difficulty believing they would be so altruistic. This is quite an upsetting acquisition.
py stuff aside for a sec, having this team working on Codex will be huge.<p>I didn’t see a way they ever dethroned Claude until now.<p>Happy as a Codex user, gloomy as a Python one.
Wtf!? Is this an early April Fools'? I've been recommending Astral tools left and right; looks like I'm out a good chunk of social capital on that.<p>Who's organizing a fork, or is Python back to having only shitty packaging available? :(
Does this mean pip is finally getting PIP'd?
Well this sucks. So the most evil new company has acquired the most exciting 'open' source company. The vibes are about to get bad. Hopefully this speeds up the next phase of AI destroying the economy and everything good about tech.
This leaves me a bit scared for uv and ruff to be honest.
> I am so excited to keep building with you.<p>Fixed: I am so excited to take these millions of dollars.
Don't love it. But, I'm glad the Astral folks are getting the bag.
rip uv
Just when I moved from poetry to uv.
Wonder if they can still use claude code in their repos now
Should I freeze my plans to migrate from `poetry` to `uv` at "${WORK}"?
Maybe it's time to get out my Cowlishaw Rexx book again…
Whoa, so Sam and Dario are just gonna buy out every popular open source project now?
will private packages hosted on pyx be available for openai to use as training data?
uv has been very useful, but I'm also looking at Pixi. Anyone have any experience with that? I hear good things about it.
Wow - this is potentially the most cynical Hacker News thread I've ever read. When did this place trade its curiosity and excitement about technology for constant doomerism?<p>Congrats Astral and co!
Open source tools are being snapped up by a company famous for reneging on its non-profit mission as soon as they sniffed some profit. Wow gee, imagine the cynicism.
I think it was always that way. When an org gets big and its leaders become powerful, then there's always a good portion of snark and mistrust from this community.
Blindly celebrating all acquisitions is not "curiosity and excitement about technology". Disappointing that you can't tell the difference between tech and the SV startup scene. They are different things.
Sorry, but I don't think cheering on OpenAI represents curiosity and excitement about technology.<p>I'm happy for the Astral team but in my opinion, big tech especially is incompatible with curiosity and hacker ethos.
This was pretty obvious to just about anyone tbh. FastAPI is probably next
The lead dev seems to think about using AI with FastAPI already: <a href="https://tiangolo.com/ideas/library-agent-skills/" rel="nofollow">https://tiangolo.com/ideas/library-agent-skills/</a>
Don't even joke
I thought some more about it, and unfortunately it makes sense. IIRC there were several "insider" blogposts from OpenAI that said something along the lines of "Yeah almost every service we write is FastAPI"
Yeah, and I wouldn't be surprised if some OLTP database platform is on the horizon as well; IIRC they already acquired Rockset. The eventual goal for these AI platforms is to own the entire e2e platform for agents: from the language ecosystem itself (Anthropic betting on TS/Bun), to local dev tooling using their harnesses, to the production-grade stuff an agent needs.
Where can I find some of those?
It would seem to me that purchasing a piece of software as an AI company is just an outright admission that they could not generate an equivalent piece of software for a better price?<p>If it was cheaper to use their internal AI to create these tools, they would.
How are they acquiring it without "open" in their name?
If you don't pay for your tools and support OSS financially, this is what happens.<p>Although Astral, being VC-funded, was already headed this way anyway.<p>Deno and Pydantic (both Sequoia-backed) will go the same way, as will many other VC-backed "open source" dev tools.<p>AI companies will keep buying up these very tools, putting them into their next model update to be used against you.<p>Rented back to you for $20/mo.
So instead of finally building an enterprise-grade package manager where you could pay for validated, verified, and secure packages, we're going to vibe project management and let a slop-spigot fill the trough. Brilliant. Incredibly pleased that the last sane tools in the entire Python ecosystem are getting gutted, discouraging the last few non-braindead devs from bothering.
Don't get me wrong, I love getting 300 Dependabot updates per day. It's a huge productivity booster, and even if you devote half your dev team to keeping this stuff up to date, you'd still be vulnerable to repo-jacking, because the entire package ecosystem is broken. The other thing I love about npm and PyPI is the way a single small team will re-download a TiB of packages in CI all day long (regardless of caching) for no reason. Love waiting for GitHub Actions to re-import infinite packages for the nth time before it times out and you restart it manually. Makes so much sense. Great work, all. Glad OpenAI is putting the nails in this coffin.
Amusing that the best python tools are written entirely in rust.
…amusing how? CPython is written in C, JVM is written in mix of cpp and Java, Rust was written using OCaml initially. Don’t know why you’re snickering. Do you also find it amusing that by the time cpp/rust team scaffolds and compiles initial boilerplate, python team is already making money?
That's what you get with toy languages.
I wonder who's going to pick up VoidZero
I see people in this thread complain about the acquisition but the source code of uv is right there [1]. Fork it and move on. If ClosedAI enshittifies uv, gather with a bunch of other people and prop up a new version.<p>[1] <a href="https://github.com/astral-sh/uv" rel="nofollow">https://github.com/astral-sh/uv</a>
No, you.<p>These comments are so silly and condescending. You aren’t being clever or smart. You think that you need to go to GitHub and find the repo for uv and put it in a little footnote? You think that you’re being a clever boy by doing that? Everyone knows how open source works. We are developers.
So many negative comments but not a single:<p>- I'm willing to pay for Astral ecosystem so it stays independent/open source<p>- I'm willing to fork the project
I (along with many others) always thought that Astral being VC backed is going to lead to a future disappointment for the community.
I don't understand how anyone is surprised at this point. VC project trying to build a brand just isn't going to lead to some utopic community.
I at least thought we'd get a few more good years before the rug pull.
I am actually quite saddened by this. It's very unlikely that I'll keep using uv now. I don't trust this kind of shit.
Damnit I was really rooting for uv :(
Pyright and ty are under the same roof now.
Related discussion: <a href="https://news.ycombinator.com/item?id=47438716">https://news.ycombinator.com/item?id=47438716</a>
It should have been FastAPI instead.
What happens when OpenAI goes bankrupt?
Why do I feel uneasy about this?
Can Astral's stuff be forked?
So no problem in joining OpenAI after the whole DoD/DoW mess?<p>> I started Astral to make programming more productive.<p>And now they help make killing more productive
Well shit, I feel betrayed. This is exactly the opposite of what I thought Charlie's goals were. I thought he was focused on making the Python ecosystem better.
So begins the uv-Bun war.
Unfortunate, since I trust Astral and can't say the same about OpenAI.
Booooooooooooooooooo
If I were to engage in Python development, what's the alternative to uv?
How does this make sense
So vite.dev is next.
Ugh, this isn't good.<p>I hate relying on anything that is controlled by a single company. Considering that Astral is basically brand new in the python timeline, it is concerning that they are already being acquired.<p>On the other hand, UV is so fast that it makes up for anything I find annoying about it.
Another HN darling falls from grace. But hey, the next one will not follow the same steps!
Undisclosed amount?
Oh no! This is actually terrible. Get ready for "premium tooling only available in Codex(TM)".
This is where POTUS should step in and stop this sale. Not cool.
Welp, back to pip
BTW, the Astral repo has Claude as one of its top contributors.
should I be glad I never got off pip?
Won't increase your subscriptions; people are dropping out in the millions.
This acquisition doesn't make much sense for the longevity of Astral's software, because Astral's software is orthogonal to Codex. It seems more like an acqui-hire. If tomorrow OpenAI were to stop funding Astral's software due to a cash crunch, it would be game over for uv et al. Codex doesn't need uv.
Goddamnit
Astral to Join OpenAI (astral.sh)
OpenAI to Acquire Astral (<a href="https://openai.com/index/openai-to-acquire-astral/" rel="nofollow">https://openai.com/index/openai-to-acquire-astral/</a>)<p>What can I say?
Terrible news for Python ecosystem. I guess the money was too much to reject this ridiculous offer.
I really loved uv, I am happy for the developers at astral but I am sad as a user seeing this :(<p>Any good alternatives to uv/plans for community fork of uv?
I really love uv.<p>It's always hard to really trust these corporate-funded open source products, but they've honestly been great.<p>…but I find it difficult to believe OpenAI owning the cornerstone of the Python tooling ecosystem is a good thing for the Python ecosystem.<p>There is no question OpenAI will start selling/bundling Codex (and Codex subscriptions) with uv.<p>I don't think I want my package manager doing that.
HN's favorite company meets HN's most hated company.<p>Hilarity in the comments will ensue.
Genuinely. uv is so awesome and OpenAI is so meh.<p>I'm not even sure how to feel about this news, but I'm a bit disappointed as a user, even if I'm happy for the devs that they got money for such a project. Man, I would've hoped any decent company could've bought them out rather than OpenAI of all things.<p>Maybe OpenAI wants to buy these loved companies to lessen some of the hate, but what it's actually doing is lessening the love we gave to companies like Astral. That's a real shame, because uv is (was?) so cool, and now we don't know where this tool might be headed, given it's in the hands of literally OpenAI :(
Thank you n-gate
"Sir, you now have twice as many private jets as Dario"<p>"But he owns a tooling company. WHY can't I have that? :( :("
congrats team !
what a shame
Codex team now has the legends who created Pyright and UV/Ruff/Ty.
nooo
WAT? uv now belongs to Trump mega-donors? That is not good for the Python ecosystem.
good job
Associated OpenAI post: <a href="https://openai.com/index/openai-to-acquire-astral/" rel="nofollow">https://openai.com/index/openai-to-acquire-astral/</a> (<a href="https://news.ycombinator.com/item?id=47438716">https://news.ycombinator.com/item?id=47438716</a>)
So, any good alternatives to uv?
Ah shit, back to poetry
Don’t you dare enshittify my uv.
This is the worst possible news. Fantastic team at Astral joining a bunch of scumbag scammers at "Open"AI.
well, fuck.
damn it, another one bites the dust sadly
[flagged]
Would you please stop breaking the site guidelines? We've warned/asked you countless times. You should know not to post like this by now.<p><a href="https://news.ycombinator.com/newsguidelines.html">https://news.ycombinator.com/newsguidelines.html</a>
Thanks for chiming in, dang.<p>While I do see that the tone of the comment was stinging, it came from the frustration I have experienced while developing in Python, and from this piece of news as well.<p>I didn't see it as a bad thing since it wasn't really aimed at anybody in particular; it was more an opinion on Python's shortcomings.<p>I will try to post more substantive, less emotional comments going forward.
Astral's tooling is excellent and almost makes up for Python being a badly-designed language. Almost.
Greed knows no limit<p>OpenAI is Microslop, so it's the classic EEE, nothing new to see<p>It's like with systemd now planning to enforce gov. age verification<p>People will censor you if you dare say something negative on this website<p>So i guess, <i>wears a clown hat</i> "congrats!"
Congrats!<p>This of course means more VC funding for FOSS tools since a successful exit is a positive signal.
Funding is as good as gone until the Iran mess is over.
> a successful exit is a positive signal<p>This is peak finance brainrot. In no scenario is abandoning ship a positive signal, even if you managed to pocket some valuables on the way out.<p>Let's stop celebrating dysfunctional business models and consolidation of the industry around finance bros who give zero fucks about said industry.
Solid move by Altman - a good signal they're serious about capturing the Claude Code market from Anthropic.<p>What I don't understand is why no one has bought JetBrains yet.<p>Atlassian? AWS? Google?
Because JetBrains' strategy wasn't to burn money on free tools and eventually exit with the jackpot. They have been profitable for over a decade, simply asking users to pay a fair price for a great product.
Atlassian? Bitbucket as a platform for agentic development.. shudder
Most likely because JetBrains is not for sale. Google almost certainly offered to buy it at some point.