I maintain an open source project funded by the Sovereign Tech Fund. Getting there wasn't easy: the application process is long, the amounts are modest compared to a VC round, and you have to build community trust before any of that becomes possible. But the result is a project that isn't on anyone's exit timeline.
I'm not saying the startup path is without its own difficulties. But structurally, it offloads the costs onto the community that eventually comes to depend on you. By the time those costs come due, the founders have either cashed out or the company is circling the drain, and the users are left holding the bag. What's happening to Astral fits that pattern almost too neatly.
The healthier model, I think, is to build community first and then seek public or nonprofit funding: NLnet, STF, or similar. It's slower and harder, but it doesn't have a built-in betrayal baked into the structure.
Part of what makes this difficult is that public funding for open source infrastructure is still very uneven geographically. I'm based in Korea, and there's essentially nothing here comparable to what European developers can access. I had no choice but to turn to European funds, because there was simply no domestic equivalent. That's a structural problem worth taking seriously. The more countries that leave this entirely to the private sector, the more we end up watching exactly this kind of thing play out.
A lot of great open source comes out of startups because startups are really good at shipping fast and getting distribution (open source is part of this strategy). Users can try the tool immediately, and VC funding can put a lot of talent behind building something great very quickly.
The startup model absolutely creates incentive risk, but that’s true of any project that becomes important while depending on a relatively small set of maintainers or funders.
I’m not sure an acquisition is categorically different from a maintainer eventually moving on or burning out. In all of those cases, users who depend on the project take on some risk. That’s not unique to startups; it’s true of basically any software that becomes important.
There’s no perfect structure for open source here - public funding, nonprofit support, and startups all suck in their own ways.
And on the point you make about public funding being slow: yeah, talented people can’t work full-time on important things unless there’s serious funding behind it. uv got as good as it is because the funding let exceptional people work on it full-time with a level of intensity that public funding usually does not.
Personally, I'd like to know, since you have been active in Korea, whether there are any groups I could attend.
Full time effort put into tools gets you a looooot of time to make things work well. Even ignoring the OSS stuff, many vendors seem to consider their own libs to be at best “some engineers spend a bit of time on them” projects!
I would absolutely love to know more about this if you are willing to share the story?
Thanks for your work.
Astral was founded as a private company. Its team has presumably worked hard to build something valuable. Calling their compensation for that work 'betrayal' is unfair.
Community-based software development sounds nice. But with rare exception, it gets outcompeted by start-ups. Start-ups work, and they work well. What is problematic is the tech giants' monopoly of the acquisition endpoint. Figuring out ways for communities to compete with those giants as potential acquirers is worth looking into; maybe public loans to groups of developers who can show (a) they're committed to keep paying for the product and (b) aren't taking kickbacks.
I don't see any betrayal here, since the tools are still OSS - yeah OpenAI might take it a different direction and add a bunch of stuff I don't like/want, but I can still fork
Most of the companies that spend $$$$ with them can't use public registries for production/production-adjacent workloads due to regulations and, secondarily, a desire to mitigate supply chain risk.
Artifactory is a drop-in replacement for every kind of repository they'll need to work with, and it has a nice UI. They also support "pass-through" repositories that mirror the public repositories with the customization options these customers like to have. It also has image/artifact scanning, which cybersecurity teams love to use in their remediation reporting.
It's also relatively easy to spin up and scale. I don't work there, but I had to use Artifactory for a demo I built, and getting it up and running took very little time, even without AI assistance.
Like, nobody really pays for web servers - there are too many good free options. They're far more complex than Artifactory.
I guess it's just that it's a product that only really appeals to private companies?
There are no competing open-source projects because such projects would need to provide more value than Artifactory/Sonatype OSS, which are both already huge projects, just to be considered.
Having a private package index gives you a central place where all employees can install from, without having to screen what each person is installing. Also, if I remember right, there are some large AI and ML focused packages that benefit from an index that's tuned to your specific hardware and workflows.
Every company needs its own package repository. You need to be able to control what is running on your environment. Supply-chain risk is very, very real and affects anybody selling software for a living.
This is besides the point that in the real world, not every risk is addressed, at least in part because available resources are diverted to address larger risks.
Plus the obvious need for a place to host proprietary internal libraries.
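As a sketch of how small the client-side switch is: pointing pip at a private index is a single config setting (the host name below is a placeholder; uv accepts the same thing via `--index-url`):

```ini
# ~/.config/pip/pip.conf (pip.ini on Windows)
[global]
index-url = https://artifactory.example.com/artifactory/api/pypi/pypi-local/simple
```

Everything else — screening, scanning, hosting internal wheels — happens server-side behind that one URL.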
It was because Astral was VC funded.
https://astral.sh/blog/announcing-astral-the-company-behind-...
There seems to be a pervasive belief that the Python tooling and interpreter suck and are slow because the maintainers don't care, or aren't capable.
The actual problem is that there isn’t enough money to develop all of these systems properly.
Google says that Astral had 15 team members. Of course, it's so hard to make these projections. But it wouldn't shock me if uv and ruff are each individually multi-million-dollar pieces of software.
If you’d like to invest a million dollars to improve pip, or work for free for 3 years to do it yourself, I’m not sure if anyone would object.
That bootstrapping process just installs the wheel's contents, no Internet connection required. (Pip does, of course, download pip for you when you run its self-upgrade — since the standard library wheel will usually be out of date).
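As a small illustration of that offline bootstrap: the standard library's `ensurepip` module carries a bundled pip wheel, and you can inspect it without any network access.

```python
import ensurepip

# The standard library bundles a pip wheel; ensurepip installs it
# from disk, no network involved.
print("bundled pip wheel version:", ensurepip.version())

# The actual bootstrap step (e.g. inside a fresh venv) would be:
#   python -m ensurepip --upgrade
```

As the parent comment notes, that bundled wheel lags the latest pip release, which is why `pip install --upgrade pip` afterwards does go out to the network.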
Either pay for the product or use stuff that isn't dependent on VC money; this is always how it ends.
Maybe you use non-transitive pure Python dependencies, but it's likely that your tools and dependencies still rely on stuff in Rust or C (e.g.: py-cryptography and Python itself respectively).
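If you want to check this for your own environment, a rough heuristic (my own sketch, not something any of these tools actually do) is to look for compiled extension files in a distribution's file list:

```python
import importlib.metadata

def has_compiled_files(dist_name: str) -> bool:
    """Rough heuristic: a wheel that ships .so/.pyd/.dylib files
    is not pure Python."""
    dist = importlib.metadata.distribution(dist_name)
    return any(
        str(f).endswith((".so", ".pyd", ".dylib"))
        for f in (dist.files or [])
    )

# e.g. cryptography ships compiled code, while pip is pure Python
```

This misses statically linked oddities, but it's usually enough to see how far down the stack the pure-Python story really holds.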
As mentioned multiple times, ever since my experience with Tcl and continuously rewriting stuff in C, I tend to avoid languages that don't come with a JIT or AOT compiler in the reference tooling.
I tend to work with Java, .NET, node, C++, for application code.
Naturally AI now changes that; still, I tend to focus on approaches that are more classical: Python with pip and venv, and stuff written in C or C++ that has been around for years.
Consider ffmpeg. You can donate via https://www.ffmpeg.org/spi.html
How much money do they make from donations? I don't know but "In practice we frequently payed for travel and hardware."
Translation: nothing at all.
If such a fundamental project that is a revenue driver for so many companies, including Midas-level rich companies like Google, can't even pay decent salaries for core devs from donations, then the open source model doesn't work in terms of funding the work even at the smallest possible level of "pay a reasonable market rate for devs".
You either get people who just work for free or businesses built around free work by providing something in addition to free software (which is hard to pull off, as we've seen with Bun and Astral and Deno and Node).
There are examples of foundations or other similar entities paying developers, like Linux, SQLite, even Zig.
Maybe the difference is some projects rely on core contributors more because external contributions are more restricted in some way.
But sure, the entire open source model doesn't work, lol
At worst, it's just Anaconda II AI Boogaloo. The ecosystems will evolve and overcome, or will die and different ecosystems rise to meet the need going forward.
I anticipate OpenAI will get bored and ignore Astral's tools. Software entropy will do its thing and we will remember an actively developed uv as the good old days until something similar to cargo gets adopted as part of Python's standard distribution.
This is a bad thing IMHO
The only thing that could prevent this is lack of ability to execute, like how Uber wanted to replace drivers with FSD vehicles.
Even if they believe that their systems will eventually tank employment and replace developers rather than augment them, the fate of Astral doesn't matter at all in that scenario because a) nobody has a job, and b) you can build your own uv replacement for $20.
I hope those two factors mean that if things go really wrong, then the clean(ish) break with all the non standard complex legacy means an easier future for community packaging efforts.
OpenClaw notably was built around Mario Zechner's pi[0]; uv I believe was highly adapted from Armin Ronacher's rye[1], and uses indygreg's python-build-standalone[2] for distributing Python builds (both of which were eventually transferred to Astral).
[0]: https://github.com/badlogic/pi-mono
In the worst case, Astral will stop developing their tools, someone else will pick them up and will continue polishing them. In the best case, they will just continue as they did until now, and nothing will really change on that front.
Astral is doing good work, but their greatest benefit for the ecosystem so far was showing what's possible and how it's done. Now everyone can take up the quest from here and continue. So any possible harm from here on out will be not that deep; at worst we will be missing out on many more cool things they could have built.
We need public investment in open source, in the form of grants, not more private partnerships that somehow always seem to hurt the community.
I'd love there to be infinite public free money we could spend on Making Good Software, but at least in the US there is vanishingly little public free money available, while there are huge sums of private free money even in the post-ZIRP era. If some VCs want to fund a team to write great open source software the rest of us get for free, I say "great, thanks!"
They bought the trust.
OpenAI isn't a VC. It's VC-backed. But so is Astral.
Every interface Kenneth Reitz originally designed was fantastic to learn and use. I wish the influx of all these non-Pythonistas changing the language over the last 10 years or so would go back and learn from his stuff.
I started using VS Codium, and it feels like using VS Code before the AI hype era. I wonder if we're going to see a commercial version of uv bloated with the things OpenAI wants us all to use, and a community version that's more like the uv we're using right now.
[1] https://github.com/platformio/platformio-vscode-ide/issues/1...
Honestly though, it's easier to disable ~three settings in VSCode and call it a day.
Glad to hear that I am avoiding Microsoft's spam.
Probably inevitable, and I don’t blame the team, I just wish it were someone else.
Microsoft acquires Astral
Wish comes with a cost
I'm not worried about OpenAI messing up uv, I'm worried about OpenAI running out of money and cutting an unnecessary team.
I'm not very deep in Python anymore, but every time I dip my toes back in, it's a completely different set of tools, with some noticeably rare exceptions (e.g., numpy).
uv feels like the first 100%-coverage solution I've seen in the Python space, and it also has a great developer experience.
I can't speak to the rest of the ecosystem; when I write Python it's extremely boring and I'm still using Flask, same as I was in 2016.
It was a VC backed tool. What did you expect?
I don't really see the value for OAI/Anthropic, but it's nice to know that uv (+ ty and many others) and Bun will stay maintained!
From Astral the (fast) linter and type checker are pretty useful companions for agentic development.
The value for Anthropic / OAI is that they have a strong interest in becoming the "default" agent.
The one that you don't need to install, because it's already provided by your package manager.
I think they're more into the extra context they can build for the LLM with ruff/ty.
$ curl 'claude.ai/install?key=abcd123' | bash -e
$ claude 'finish laptop setup from http://github.com/justjake/Dotfiles'
claude will be the one to install / set up the system, not the other way around. claude was certainly the one who installed `uv` on my current machine.

Embrace, extend, extinguish. Time will tell.
I won't be surprised if the next step is to acquire CI/CD tools.
There is the literal benefit of "we use the hell out of this tool, we need to make sure it stays usable for us" and then there is what they can learn from or coerce the community in to doing.
Depends if you think the bubble is going to pop, I suppose. In some sense, independence was insulation.
Seems like the big AI players love buying up the good dev tooling companies.
I hope this means the Astral folks can keep doing what they are doing, because I absolutely love uv (ruff is pretty nice too).
That is definitely the plan!
Only time will tell if it will not affect the ecosystem negatively, best of luck though, I really hope this time is different™.
I've also been in the industry for similarly long and I believe you 100% that's "all you can say" ;)
Okay, so better prepare already, folks!
Would be a good mustache-twirling cartoon villain tactics, you know, try to prevent advances in developer experience to make vibecoding more attractive =)
Been running uv in every AI/ML project for the past year -- the speed difference when resolving large dependency trees (PyTorch + transformers + a dozen extras) is genuinely significant. It's one of those tools where you forget how bad pip was until you have to go back.
Coming from a Rust background I have a lot of respect for the implementation decisions that made that speed possible. My main concern isn't feature direction -- it's that the team culture IS the product right now, and that's harder to preserve than a codebase. Cautiously watching.
I love(d) `uv`. I think it's one of the best tools around for Python ecosystem... Therefore the pit in my tummy when I read this.
Yes, congrats to the team and all that.
I'm more worried about the long-term impact on the ecosystem, as is almost everybody who dropped a comment here.
My own thoughts echo somewhat what @SimonW wrote here [1]
[1] https://simonwillison.net/2026/Mar/19/openai-acquiring-astra...
However, a forking strategy may (or may not) be the best for `uv`.
Could we count on the Astral team to keep uv in a separate foundation?
Now, for those wondering who would fork and maintain it for free: that is more of a critique of FOSS in general.
I didn't see a single comment of "I will fork it" type.
So that's all it really comes down to; uv isn't loved just because it's great but because it is in good hands. This real/perceived change of hands pretty much explains all the downstream responses to the news that you see in this thread. Regardless of who bought them, any fork is going to have very, very big shoes to fill, and filling those shoes appropriately is the big worry.
It would be, and yet: https://ourincrediblejourney.tumblr.com/
Something like this was always inevitable. I just hope it doesn’t ruin a good thing.
If you find your popular, expensive tool leans heavily upon third party tools, it doesn't seem a crazy idea to purchase them for peanuts (compared to your overall worth) to both optimize your tool to use them better and, maybe, reduce the efficacy of how your competitors use them (like changing the API over time, controlling the feature roadmap, etc.) Or maybe I'm being paranoid :-)
Jokes aside, these tools are currently absolutely free to use, but imagine a future when your employers demand you use Claude Code because that's the only license agreement they have, and they stop their AI agents from using uv. Sure, we all know how to use uv, but there will also come a time and place when they will ask us to not write a single line of code manually, especially if you have your agents running in "feared the most by clueless middle managers" "production".
Are you ready for factionalism and sandbox wars? Because I'm not. I just want to write my code, push to "production" as I see fit and be happy as pixels start shifting around.
In a completely unrelated event, Donald sues Sam for $10M for calling him old; Sam grudgingly agrees to pay him $16M and a beer.
Take ruff: I have used it, but I had no idea it even had a company behind it... And I must not be the only one, and it must not be the only tool like it...
For anyone thinking through what this means for their data: OpenAI's API terms give them broad rights to use inputs for model improvement. Once uv is part of that stack, it's worth asking what "telemetry" looks like under their ownership.
This is exactly why I've moved my AI usage to platforms built around data sovereignty—ones where your conversations don't feed back into the mothership. The tooling acquisition makes it more urgent, not less.
[Disclosure: I work with pugchat.ai, a privacy-first AI platform—mentioning because it's relevant to the data sovereignty point, not to shill]
> uvex add other_slop_project --disable_peddled_package_recommendations
> implicitly phoning home your project, all source code, its metadata, and inferring whether your idea/use-case is worth steamrolling with their own version.
This is the future of “development”. Congrats to the team.
That said, I hope the excellent Astral team got a good payday.
As a user of uv who was hoping it would be a long term stable predictable uninteresting part of my toolchain this sucks, right?
Anthropic acquiring Bun, now OpenAI acquiring Astral. Both show the big labs recognize that great AI coding tools require great developer tooling, and they are willing to pay for it rather than build inferior alternatives. Good outcome for the teams.
Not exactly a great look for the "AGI is right around the corner" crowd — if the labs had it, they would not need to buy software from humans.
> the Astral team will join the Codex team at OpenAI and over time, we’ll explore deeper integrations that allow Codex to interact more directly with the tools developers already use
Gross.
How can two things be the same:
Our AI will replace every SWE in the next 5 yrs
Our tooling is bad
We need to buy a company that brought a revolution in Python tooling
If LLMs and GPT-5 Codex are so good, then how isn't all tooling related to OpenAI being written by it?
Same goes for Claude. They say Claude writes Claude... Why isn't Claude Code then written by Claude in such a way that it has a platform-specific optimized binary rather than an Electron app?
It's almost as if they're generating hype for their sales/valuation
* uv is better than pip-tools in every single way, apart from it being VC-backed.
* Nobody is placing any value on uv being on someone’s CV.
You are essentially incorrectly LLM pattern-matching being a jaded, sarcastic “smart person”. Learn to tell the difference between a “shiny new thing” and a legitimately better tool. If you can’t do that, you’re at best a sub-par developer.
(sure, it's a bit different than contributing to CPython, but I'd argue not that different)
I do not see any alignment between the Astral that I want Astral to be, and OpenAI.
"Our AI can do anything a human can do, better, faster, cheaper" -> Buys a product instead of asking their AI to just make it.
Really doesn't give me confidence in your product!!!!
The disappointment and anger is because we've had a nice QOL improvement which is now more directly threatened in a way that it wasn't before, and it's always hard to go backwards. A QOL improvement that you never had in the first place. So... congrats?
Unless your point is “this is why I deprive myself of nice things, because they can go away”…which is just silly.
I probably did come off a bit "told you so", but I guess it was more that this finally felt like an answer to a question/curiosity I've had about `uv`, where I didn't understand the dissonance between how others felt about it and how I did.
"What's the matter, just fork it when it goes bad?"
The problem is that uv, in and of itself, whilst a great technical achievement, isn't sufficient. Astral runs a massive DevOps pipeline that, just to give one example, packages the Python distributions.
Those who are saying that forking is an option are clearly not arguing it in good faith.
But you know best it seems!
One of the bigger pain points I’ve faced in Python is dependency resolution. conda could take 30-60 minutes in some cases. uv took seconds.
A serious quality of life improvement.
Quite literally this is what first raised the alarm bells for me. Dependency resolution complexity is more of a symptom. If that delay ends up being the point where Ops finally agrees that things have gone very wrong, then fixing that delay is not really helping hire the maintenance folks that can make those dependencies.. well, "dependable" again.
Was there a better option? I’m sure. Choices were made. Regrets were had. Switching to uv was a huge improvement for our purposes.
If you’re going to be a grump about everything, sometimes your broken clock is going to be right. It’s still not worth showing off.
Everything I've seen from Astral and Charlie indicates they're brilliant, caring, and overall reasonable folks. I think it's unfair to jump to call them sell-outs and cast uv and the rest as doomed projects.
And framing it as "sell-outs" is cheap rhetoric that means nothing. The fact is, they were a company that never really had a solid business model but provided a lot of value for the community. Being acquired by some infinite-money company was always the best outcome they could hope for. Well, they did. Probably got a ton of money.

Will it require some sacrifice? Some people would say that working for a company that makes products for the Department of War of the USA, on conditions that even Anthropic found too ugly to accept, is enough of a sacrifice on its own. But I am pretty sure most people would be willing to make that sacrifice for the right amount of money (with "right amount" being the variable part).

So calling someone a sell-out is usually just bitterness that it wasn't you who managed to sell out. And not judging someone for a sacrifice they make isn't the same as pretending they didn't make one. Sometimes we (the world they were trying to make better) are the sacrifice. That's all.
The interesting question is whether Astral stays relatively independent (like GitHub under Microsoft) or becomes tightly coupled to OpenAI’s platform.
This doesn't make as much sense. OpenAI has a better low-level engineering team, and they don't have a hot mess with traction like Anthropic did. This seems more about acquiring people with dev-ergonomics vision to push product direction, which I don't see being a huge win.
I do not want OpenAI putting their fingers in my Python binaries, nor do I want their telemetry.
Things come and go, let’s not beat up some dudes who made some cool stuff, made everyone’s lives easier and then sold up. There is a timeline where this makes UV / python better.
Why buy, when they can rent?
(Not to mention, multiple companies could hire and fund them.)
One prompt and call it a weekend. Surely they have the compute.
Oooooh, right.
Same reason we don't have windows 41 either.
Perhaps it's naive optimism, but I generally have hope that new and improved tools will continue to gain adoption and shine through in the training data, especially as post-training and continual learning improve.
2. In any case, the announcement strongly suggests that customer acquisition had little to do with this. The stated purpose, as I read it, is an acquisition (plus acquihire?) to bolster their Codex product.
3. But if they were hoping for some developer goodwill as a secondary effect... well, see my note above.
I'm a heavy user and instructor of uv. I'm teaching a course next week that features uv and ruff (as does my recent Effective Testing book).
Interesting to read the comments about looking for a change. Honestly, uv is so much better than anything else in the Python community right now. We've used projects sponsored by Meta (and other questionable companies) in the past. I'm going to continue enjoying uv while I can.
It does look like this is going to be the norm for popular open source projects in the AI ecosystem, but I guess open source developers need to get paid somehow if that project is their only livelihood.
Shame for the end-user though. As you will always be second guessing how they will ruin the tool, i.e. via data collection or AI-sloppifying it. It is likely OpenAI won't, but it is not a great feeling knowing a convenient tool you use is at the whim of a heartless mega-corp.
Good for Astral though I guess, they do great work. Just not optimistic this is gonna be good for python devs long term.
https://www.cnbc.com/2026/03/17/openai-preps-for-ipo-in-2026...
I suspect some OpenClaw "secure" sandbox is coming (Nvidia jealousy) with Astral delivering the packages for Docker within Docker within Qemu within Qubes. A self respecting AI stack must be convoluted.
I can't wait until all this implodes after the IPOs.
Astral's tools have been a huge QOL improvement for a great many developers using what's, what? The second most popular programming language ever?
Does the level of attention surprise you? Or do you just flag things that you personally aren’t into?
Astral has demonstrated that there is desire for this sort of "just works" thing, something I struggled with and which led me to abandon it. (I.e.: "pip/venv/conda are fine, why do I want this?", despite my personal experience with those being high-friction.)
I just hope that Charlie doesn’t trot around the dev circuit (like he has in years past) trying to sell everyone on this “being a good thing, actually”. I hope that he isn’t given the space to sell any story other than “we took the AI money despite it being a terrible fit”, because that story would just be a lie. The fact that this blog post is already trying to preemptively justify it—“well in my launch post what I said is…”—is extremely, extremely telling.
This is so hugely disappointing. And again, I am at this point quite bullish on AI. This isn’t a philosophical or anti-AI take at all, because those are easier to dismiss.
I’m not going to pretend to “congratulate the team” or whatever. As far as I’m concerned, that’s HN culture brain rot. Some of y’all in ‘startup culture’ may see getting acquired by OpenAI as some sort of big prize or worth celebrating or whatever, but I certainly don’t.
https://news.ycombinator.com/item?id=47414032
Uv did solve a distribution problem for them.
There is still a lot of room to grow in the space of software packaging and distribution.
I'm not really sure about this.
On the other side, thinking about it: if all the AI slop is moving to their own platform, it would be a big benefit for GitHub. And maybe this is their real goal? To let GitHub continue being a source of good, high-quality, human-maintained training material?
For our teams, Codex is a massive productivity booster that actually increases the value of each dev. If you check our hiring page, you’ll see we are still hiring aggressively. Our ambitions are bigger than our current workforce, and we continue to pay top dollar for talented devs who want to join us in transforming how silicon chips provide value to humans.
Akin to how compilers reduced the demand for assembly but increased the demand for software engineering, I see Codex reducing the demand for hand-typed code but increasing the demand for software engineering. Codex can read and write code faster than you or me, but it still lacks a lot of intelligence and wisdom and context to do whole jobs autonomously.
You realise that there’s certainly at least one Astral dev that uses Copilot or Claude Code or whatever, right?
You’re so anti-AI that you’re making nonsensical arguments. The existence of AI doesn’t mean that human effort and skill and care is worth nothing. OpenAI has never argued that. Nothing is incongruent here except for the completely fictionalised worldview you’ve conjured up and attributed to…a company?
They could start by inventing any software with their agents. They probably should prove their offering is good enough to do that considering they're hundreds of billions of dollars in debt, owing truckloads of money they currently have no hope of repaying to investors who are being promised a literal revolution.
According to the blog [0], their whole monorepo is in Python, their models are obviously trained using Python, their experiments are written in Python, and the core and CLI of Codex are written in Rust. uv brings both Python and Rust expertise. You're talking nonsense because of your blind hatred of LLMs. Though I agree that they're capitalizing on the fear of SWEs being made redundant.
Obviously, buying skilled Rust devs makes sense for any normal software company that develops in Rust. I wouldn't be making a point out of it if the headline were "Amazon buys Rust developers". Or if OpenAI were honest about what their product is.
They are buying out investors, it's like musical chairs.
The liquidity is going to be better on OpenAI, so it pleases everyone (less pressure from investors, more liquidity for investors).
The acquisition is just a collateral effect.
This was an acquihire (the author of ripgrep, rg, which codex uses nearly exclusively for file operations, is part of the team at Astral).
So, 99% acquihire , 1% other financial trickery. I don't even know if Astral has any revenue or sells anything, candidly.
It means the company had nearly exhausted its runway, so all these employees would have had to find a job.
It's a very very good product, but it is open-source and Apache / MIT, so difficult to defend from anyone just clicking on fork. Especially a large company like OpenAI who has massive distribution.
Now that they hired the employees, they have no more guarantees than if they made a direct offer to them.
I'm not too plugged into venture capital in the open-source/free tooling space, but raising 3 rounds and growing your burn rate to $3M/yr in 24 months without revenue feels like a decently risky bag for those investors and staff, with no revenue path or exit. I'd be curious to see if OpenAI went hunting for this or if it was placed in their lap by one of the investors.
OpenAI has infamously been offering huge compensation packages to acquire talent, this would be a relative deal if they got it at even a modest valuation. As noted, codex uses a lot of the tooling that this team built here and previously, OpenAI's realization that competitors that do one thing better than them (like claude with coding before codex) can open the door to getting disrupted if they lapse - lots of people I know are moving to claude for non-coding workflows because of it's reputation and relatively mature/advanced client tools.
(I work at Astral)
I would sincerely have understood better (and even preferred) if OpenAI had made you, personally, a very generous offer as an individual contributor, rather than choosing a strategy where the main winners are the VCs of the purchased company.
Here, outside, we perceive zero to almost no revenues (no pricing ? no contact us ? maybe some consulting ?) and millions burned.
Whether it is 4 or 8 or 15M burned, no idea.
Who's going to fill that hole, and when ? (especially since PE funds have 5 years timeline, and company is from 2021).
The end product is nice, but as an investor, being nice is not enough, so they must have deeper motives.
Bundling Codex with uv isn't going to meaningfully affect the number of people using it. It doesn't increase the switching costs or anything.
I'm sort of wondering if they're going to try to make a coding LLM that operates on an AST rather than text, and need software/expertise to manage the text->AST->text pipeline in a way that preserves the structure of your files/text.
Writing something that understands all the methods that come with a Django model goes way beyond parsing the code, and is a genuine struggle in a language like Python, where you can’t execute the code without worrying about side effects.
ty should give them a base for that, where the model is able to see things that aren’t literally in the code and aren’t in the training data (e.g. an internal version of something like SQLAlchemy).
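For anyone curious what the text->AST->text pipeline looks like, Python's stdlib already exposes a lossy version of it; a minimal sketch, which also shows exactly the structure-preservation problem above, since `ast.unparse` discards comments and formatting:

```python
import ast

source = "def add(a, b):  # this comment is lost on round-trip\n    return a + b\n"

tree = ast.parse(source)      # text -> AST
# A model (or any tool) could transform the tree here.
rendered = ast.unparse(tree)  # AST -> text: comments and layout are gone

print(rendered)
```

Making that round trip preserve files byte-for-byte (the way tools built on concrete syntax trees do) is the hard part the parent is pointing at.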
Not-most popular LLM software development product on the planet acquires most popular/rapidly rising python packaging org for mindshare.
I guess this move might end up in a situation where the uv team comes up with some new agent-first tooling, which works best or only with OAI services.
Good luck vibe coding marketshare for your new tool.
Ant is building their app distribution platform, so it's no wonder OpenAI is thinking the same; the only surprise would be if they moved slowly.
These comments are so silly and condescending. You aren’t being clever or smart. You think that you need to go to GitHub and find the repo for uv and put it in a little footnote? You think that you’re being a clever boy by doing that? Everyone knows how open source works. We are developers.
I don't care how good/bad a company is, because I lived long enough to know that most of them started off like that. Good luck to the uv team.
Who's organizing a fork, or is python back to having only shitty packaging available? :(
Fixed: I am so excited to take these millions of dollars.
I didn’t see a way they ever dethroned Claude until now.
Happy as a Codex user, gloomy as a Python one.
If it was cheaper to use their internal AI to create these tools, they would.
Agree with OP here, if AI coding tools are as intelligent and amazing as AI influencers and CEOs are saying, just prompt them to "Remake UV but faster & better".
> Agree with OP here, if AI coding tools are as intelligent and amazing as AI influencers and CEOs are saying, just prompt them to "Remake UV but faster & better".
If average dev is more intelligent and amazing than any coding model, just hire a team of average devs and “remake UV but faster & better”.
Congrats Astral and co!
Although, with Astral being VC funded, it was already headed this way anyway.
Deno, Pydantic (Both Sequoia) will go the same way as with many other VC backed "open source" dev tools.
It will go towards AI companies buying up the very same tools, putting them in their next model update, and using them against you.
Rented back to you for $20/mo.
But given the pressure from having raised VC funding, I would imagine Astral needed an actual exit, and OpenAI saw Astral's tools as an asset.
- I'm willing to pay for Astral ecosystem so it stays independent/open source
- I'm willing to fork the project
Nobody gave me that opportunity.
Not everybody puts every thought they have into an HN comment. Saying “I am willing to pay for the Astral ecosystem so it stays independent/open-source” just isn’t a very useful thing to say at the moment.
Get off your high-horse.
> I started Astral to make programming more productive.
And now they help make killing more productive
I hate relying on anything that is controlled by a single company. Considering that Astral is basically brand new on the Python timeline, it is concerning that they are already being acquired.
On the other hand, UV is so fast that it makes up for anything I find annoying about it.
what can I say?
Any good alternatives to uv/plans for community fork of uv?
Its always hard to really trust these corporate funded open source products, but they've honestly been great.
…but I find it difficult to believe that OpenAI owning the cornerstone of the Python tooling ecosystem is a good thing for the Python ecosystem.
There is no question OpenAI will start selling/bundling Codex (and Codex subscriptions) with uv.
I don't think I want my package manager doing that.
Hilarity in the comments will ensue
I'm not even sure how to feel about this news, but I feel a bit disappointed as a user, even if I might be happy for the devs that they got money for such a project. Man, I would've hoped any decent company could've bought them out rather than OpenAI of all things.
Maybe OpenAI wants to buy these beloved companies to lessen some of the hate, but what it's actually doing is lessening the love that we gave to companies like Astral. Which is really sad, because uv is (was?) so cool, and now we don't know where this tool might be headed, given it's in the hands of literally OpenAI :(
"But he owns a tooling company. WHY can't I have that? :( :("
Have not tried it too much yet because I was pretty content with `uv`, but I've heard lots of good things about it
While I do see that the tone of the comment was stinging, it was aimed at the frustration I have experienced while developing for Python, as well as at this piece of news.
I didn't see it as a bad thing, as it was not really aimed at anybody in particular; it was more of an opinion on Python's shortcomings.
I will try to post more substantive/less emotional comments going forward.
This is a massive backward step for the Python ecosystem, but it's not like a hundred-billion dollar company will care about that.
OpenAI is Microslop, so it's the classic EEE, nothing new to see
It's like with systemd now planning to enforce gov. age verification
People will censor you if you dare say something negative on this website
So I guess, *wears a clown hat*, "congrats!"
This of course means more VC funding for FOSS tools since a successful exit is a positive signal.
This is peak finance brainrot. In no scenario is abandoning ship a positive signal, even if you managed to pocket some valuables on the way out.
Let's stop celebrating dysfunctional business models and consolidation of the industry around finance bros who give zero fucks about said industry.
What I don’t understand is why no one has bought JetBrains yet.
Atlassian? AWS? Google?
More and more plainly, OpenAI and Anthropic are making plays to own (and lease) the "means of production" in software. OK - I'm a pretty happy renter right now.
As they gobble up previously open software stacks, how viable is it that these stacks remain open? It seems perfectly sensible to me that these providers and their users alike have an interest in further centralizing the dev lifecycle - eg, if Claude-Code or Codex are interfaces to cloud devenvs, then the models can get faster feedback cycles against build / test / etc tooling.
But when the tooling authors are employees of one provider or another, you can bet that those providers will be at least a few versions ahead of the public releases of those build tools, and will enjoy local economies of scale in their pipelines that may not be public at all.
https://pypistats.org/packages/uv
I'm done pretending this is a "right tool for the right job" kind of thing; there are the wrong people in the right job, and they only know Python. If no one writes code by hand anymore anyway, at least use a language that isn't a clusterfuck of bad design decisions and doesn't have a trillion lines of code in the collective memory of people who don't know what a stack is.
I can get behind the idea that LLMs probably don't need a language designed for humans if humans aren't writing it, but the rest of this is just daft. Python's popularity isn't just pure luck; in fact, it's only in recent years that the tooling has caught up to the point where it's as easy to set up as it is to write, which should really tell you something if people persevered with it anyway.
I'm sorry your favourite language doesn't have the recognition it so rightfully deserves, but reducing Python to "stupid language for stupid people" is, well, stupid.
Speaking as a greybeard myself, I think it's safe to say that the greybeards among us will always deride those who didn't have to work as hard as they did.
I used to do backend development in superior languages, and sometimes do hobby frontend in superior languages, but my work is Python now. And it kind of has to be Python: we do machine learning, and I work with GDAL and PDAL and all these other weird libraries and everything has Python bindings! I search for "coherent point drift" and of course there's a Python library.
The superior languages I mentioned... perhaps they have like a library for JSON encoding and decoding. You need anything else? Great, now you're a library author and maintainer!
To make it good, you need to review and iterate.
https://peps.python.org/pep-0723/
* disclosure: We are a commercial client of astral.sh
Circa 2017 I was working on systems that were complex enough that pip couldn't build them, and after I got to the bottom of it I knew it was not my fault but the fault of pip.
I built a system that could assemble usable environments out of pre-built wheels, and I sketched out the design of a system that was roughly "uv but written in Python". But I saw two problems: (1) a Python-dependent system can be destroyed by people messing with Python environments (in my experience, my Poetry install gets trashed every six months or so), and (2) there was just no awareness among the "one tiny project on your machine that pips in four packages" crowd that there was a correctness problem at all. Everybody else was blaming themselves for the problem and didn't have a clear understanding of what was wrong with pip, of what a correct model for managing Python dependencies looks like (short answer: see Maven), or even that a 100% correct model was possible and we wouldn't always have to settle for a 97% one. The politics looked intractable, so I gave up.
Written in Rust, uv evaded the bootstrap problem, and it dealt with the adoption problem by targeting "speed", since people would see the value in that even if they didn't see the value in "correctness". My system would have been faster than pip because it would have kept a cache, but uv is faster still.
I have used them all and UV is the only one that actually solves the problem.
It’s insane that people would suggest that Python can go back.
> The politics looked intractable so I gave up.
So yeah, this is your actual problem. (Don’t worry, I’m in the same camp here.)
Having a static binary makes distribution way simpler. There are a bunch of ways you could try to achieve something similar in Python, but it would be significantly larger.
Performance-wise, writing it in Python would mean heavy startup overhead, and it wouldn't be able to get close to the same level of performance.
Obviously you could achieve the same thing in many other languages, but Rust ends up being a really good fit for making a small static binary for this workload: network-heavy, IO-bound, async/threading-friendly, with the occasional bit of CPU-heavy work.
I always looked down on the Java ecosystem but if it turns out Maven had a better story all along and we all overlooked it, that's wild.
Poetry having users isn't the metric for success; pip having way fewer users is.
But yes, in terms of user interface they are pretty similar. uv's performance really does make the difference.
Poetry's CLI would often, for me, just fall over and crash. Crashing a lot is not a fundamental problem in the sense that you can fix the bugs, but hey, I'm not hitting uv crashes.
pipenv was even worse in terms of just hanging during package resolution. Tools that hang are not tools you want in a CI pipeline!
The end result: `uv run` I expect to work. `pipenv` or `poetry` calls I have to assume don't work, have to put retries into CI pipelines and things like that.
`uv run` a .py with inline script metadata has all the deps installed and your script running in a venv while poetry is still deciding to resolve...
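For anyone who hasn't seen it, the inline script metadata in question is PEP 723: a commented TOML block at the top of the file that `uv run` reads to provision a matching interpreter and venv on the fly. A minimal sketch (empty dependency list, so it runs anywhere; real scripts would list PyPI packages):

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
# `uv run script.py` parses the block above, creates an ephemeral venv
# matching it, and then executes the script inside that venv.
import sys

print(sys.version_info >= (3, 9))
```

With `dependencies = ["requests"]`, for example, `uv run` would install requests into the ephemeral venv before executing, with no manual venv management at all.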
(I'm not doing this to be a dick, I genuinely want to know what the use case is)
Things got bearable with virtualenv/virtualenvwrapper, but it was never what I would call great. Pip was always painful, and slow. I never looked forward to using them, and every time I worked on a new system, the amount of finagling I had to do to avoid problems, and the amount of time I spent supporting other people who had problems, was significant.
The day I first used uv is about as memorable to me as the day I first started using Python (roughly 2004): everything changed.
I've used uv pretty much every single day since then and the joy has never left. Every operation is twitch-fast. There has never once been an issue. Combined with direnv, I can create projects/venvs on the fly so quickly I don't even bother using its various affordances to run projects without a venv.
To put it succinctly - uv gives me two things.
One - zero messing around with virtualenvwrappers and friends. For whatever reason, I've never once run into an error like "virtualenvwrapper.sh: There was a problem running the initialization hooks."
Two - fast. It may be the fastest software I've ever used. Everything is instant - so you never experience any type of cognitive distraction when creating a python project and diving into anything - you think it - and it's done. I genuinely look forward to uv pip install - even when it's not already in cache - the parallel download is epically fast - always a joy.
You can run a script with a one liner and it will automatically get you the same python and venv and everything as whoever distributed the python code, in milliseconds if the packages are already cached on your local computer.
Very easy to get going without even knowing what a venv or pypi or anything is.
If you are already an expert you get “faster simpler tooling” and if you are a complete beginner it’s “easy peasy lemon squeezy”.
It just works. I'm not sure how else to describe it other than less faffing about. It just does the right thing, every time. There's a tiny learning curve (mostly unlearning bad or redundant habits), but once you know how to wield it, it's a one-stop shop.
And, as mentioned, it's crazy fast.
You should really qualify that statement, it implies that the Python ecosystem is bearable.
...and then I read the rest of your comment. Please do go read the HN guidelines.
See the point?
Such an outcome would make me question the wisdom of "It is better to have loved and lost than never to have loved at all."
Note that uv is fast because — yes, Rust, but also because it doesn’t have to handle a lot of legacy that pip does[1], and some smart language independent design choices.
If uv became unavailable, it’d suck but the world would move on.
[1] https://nesbitt.io/2025/12/26/how-uv-got-so-fast.html
Like, the whole point of open source is that this thread is not a thing. The whole point is "if this software is taken on by a malevolent dictator for life, we'll just fork it and keep going with our own thing." Or like if I'm evaluating whether to open-source stuff at a startup, the question is "if this startup fails to get funding and we have to close up shop, do I want the team to still have access to these tools at my next gig?" -- there are other reasons it might be in the company's interests, like getting free feature development or hiring better devs, but that's the main reason it'd be in the employees' best interests to want to contribute to an open-source legacy rather than keep everything proprietary.
Projects - including forks - fail all the time because the leadership/product direction on a project goes missing despite the tech still being viable, which is why people are concerned about these people being locked up inside OpenAI. Successfully forking is much easier said than done.
I had sketched out a design for a correct package manager in 2018, but when I talked to people about it I couldn't get any interest. I think the brilliant insight uv had, which I missed, is that it can't be written in Python: if it is written in Python, developers are going to corrupt its environment sooner or later and you lose your correctness.
I think that now that people are used to uv it won't be that hard to develop a competitor and get people to switch.
Ruff isn’t stable yet either, and it has evolved into the de facto standard for new projects. It has more than double the number of rules Pylint does, and it was downloaded more than 3 times as often as Pylint in the past month.
Pylint has some advantages, sure, but Ruff’s adoption speaks for itself. Pylint is 25 years old; you’d hope it does some things better.
Saying that uv is their only winner is a hilarious take.
> I would stare longingly into the void, wondering if I can ever work another python project after having experienced uv, ruff, and ty.
You think you're disagreeing with me, but you're agreeing. To wit: The original post is silly, because ty is beta quality and Ruff isn't stable yet either. Your words.
These are just tools, Pylint included. Use them, don't use them, or make them your whole personality to the point that you feel compelled to defend them when someone on the Internet points out their flaws. Whatever churns your butter.
Nah, this news is good enough reason to move from Ruff back to Black and stay the course; I won't use anything else from Astral. I will use uv, but only until pip 2/++ gets its shit together and catches up. Hopefully then, as a community, we should jump back on board and keep using pip even if it's not as good; it's free in the freedom sense.
Then import that tool and check if __name__ == "__main__"
I had the problem basically understood in 2018 and I am still pissed that everybody wanted to keep taking their chances with pip just like they like to gamble with agent coders today.
Now that people know a decent package manager is possible in Python I think there is going to be no problem getting people to maintain one.
And that's a big part of what's so frustrating about Python generally: it seems to be a language used by lots of people who've never used anything else and have an attitude like "why would I ever try anything else"?
Python has a culture where nominal values of user-friendliness, pragmatism, and simplicity often turn into plain old philistinism.
Or, more relevant to this conversion: If they closed source tomorrow, the community could fork the current version.
() Sure, they were on the shoulders of giants
This is not the point of uv or any good package manager. The point is keeping Python from sucking: for a long time, package management in Python was horrible compared to what you could see in other languages.
Not disputing that it's a great and widely used tool, BTW.
Maybe there needs to be some nonprofit watchdog that helps identify these cases in their early stages and helps bootstrap open forks. I'd contribute to a sort of open capture-protection savings account if I believed it would help ensure continuity of support for the things I rely on.
- https://pypistats.org/packages/poetry
- https://pypistats.org/packages/uv
In the 2024 Python developer survey, 18% of the ecosystem used Poetry. When I opened this Manifold question[0], I'm pretty sure uv was at about half of Poetry's downloads.
Estimating from these numbers, probably about 30% of the ecosystem is using `uv` now. We'll get better numbers when the 2025 Python developer survey is published.
Also see this: https://biggo.com/news/202510140723_uv-overtakes-pip-in-ci-u...
[0]: https://manifold.markets/JeremiahEngland/will-uv-surpass-poe...
Perhaps it never grabbed me as much because I've been running basically everything in Docker for years now, which takes care of Python versioning issues and caches the dependency install steps, so they only take a long time if they've changed. I also like containers for all of the other project setup and environment scaffolding stuff they roll up, e.g. having a consistently working GDAL environment available instantly for a project I haven't worked on in a long time.
Second, you can use uv to build and install to a separate venv in a Docker container and then, thanks to the wonders of multistage Docker builds, copy that venv to a new container and have a fully working minimal image in no time, with almost no effort.
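A sketch of that multistage pattern, loosely following the approach in uv's Docker guide; the image tags and the `myapp` module are placeholders, not something from this thread:

```dockerfile
# Stage 1: build the venv with uv (binary copied from the official uv image).
FROM python:3.12-slim AS builder
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-dev

# Stage 2: the runtime image gets only the finished venv, not uv or its caches.
FROM python:3.12-slim
COPY --from=builder /app/.venv /app/.venv
ENV PATH="/app/.venv/bin:$PATH"
CMD ["python", "-m", "myapp"]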
https://github.com/direnv/direnv/wiki/Python#uv
If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool?
Mostly no, sometimes I give up and still use pip as a separate user.
> If not, do you develop software with source dependencies (go, java, node, rust, python)? If so, how do you handle acquiring those dependencies—by hand or using a tool
I haven't felt the need to use Go, and the only Java software I use is in the OS repo. I don't want to use JS software for other reasons. This is one of the reasons why I don't like Rust rewrites. Python dependencies are very often in the OS repo. If there is anything else, I compile it from source, and I curse when software doesn't use or adhere to the standards of the GNU build system.
Go and Rust, specifically, seem a bit odd to be allergic to. Their "package managers" are largely downloading sources into your code repository, not downloading/installing truly arbitrary stuff. How is that different from your (presumably "wget the file into my repo or include path") workflow for depending on a header-only C library from the internet which your OS doesn't repackage?
I understand if your resistance to those platforms is because of how much source code things download, but that still seems qualitatively different to me from "npm install can do god-knows-what to my workstation" or "pip install can install packages that shadow system-wide trusted ones".
The package manager I use, apt on Debian, does not package many Python development repos. They've got the big ones, e.g. requests, but not e.g. uuid6. And I wouldn't want it to - I like the limited Debian dev effort to be put towards the user experience and let the Python dev devs worry about packaging Python dev dependencies.
And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.
I favor stability and the stripping of unwanted features (e.g. telemetry) by my OS vendor over cutting-edge software. If I really need the latter, I install it into /usr/local; that is what it is for, after all.
> And let’s say you constrain yourself to your OS package manager. What about the people on different distros? Their package managers are unlikely to have the exact same versions of your deps that your OS has.
This is a reason to select the OS. Software shouldn't require exact versions, but should stick to stable interfaces.
(Source: I'm an Astral employee.)
That's a point of information, not a point of order.
[1]: https://astral.sh/blog/introducing-pyx
It's not perfect, but it is light-years better than what preceded it.
I jumped ship to it and have not looked back. (So have many of my clients).
I'm on the fence about cancelling my JetBrains subscription I've had for nearly 10 years now. I just don't use it much. Zed and Claude Code cover all my needs, the only thing I need is a serious DataGrip alternative, but I might just sit down with Claude and build one for myself.
It's a project I'm working on to build a database management product I've always wanted.
I spend way too much time obsessing over UX but hope people appreciate it :).
https://xkcd.com/2347/
People need to be very careful about resisting. OpenAI wants to make everyone unemployed, works with the Pentagon, steals IP, and copyright whistleblowers end up getting killed under mysterious circumstances.
This is untrue. People frequently complained that they were VC funded and used it to justify mistrust.
Take this discussion, for example. Completely dominated by the topic.
https://news.ycombinator.com/item?id=44892209
It's not there yet, but it's getting there.
It's so fast, in fact, that we just added `ty check` to our pre-commit hooks, where MyPy previously had runtimes of 150+ seconds _and_ a mess of bugs around its caching.
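For reference, the shape of such a hook, written as a pre-commit local hook since I'm not certain Astral publishes an official pre-commit mirror for ty; the hook id and entry here are assumptions, not the commenter's actual config:

```yaml
repos:
  - repo: local
    hooks:
      - id: ty-check
        name: ty check
        entry: uv run ty check   # run ty from the project venv via uv
        language: system
        types: [python]
        pass_filenames: false    # check the whole project, not just staged files
```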
1. For the record: the GPL is entirely dependent on copyright.
2. If AI "clean-room" re-implementations are allowed to bypass copyright/licenses, the GPL won't protect you.
This is right up there with Meta lawyers claiming that when they torrent it's totally legal but when a single person torrents it's copyright infringement.
[0] https://legalblogs.wolterskluwer.com/copyright-blog/the-bart...
Isn't that the same for the obligations under BSD/MIT/Apache? The problem they're trying to address is a different one from the problem of AI copyright washing. It's fair to avoid introducing additional problems while debunking another point.
2. BigCo buys Company A
3a. Usually, here, BigCo either continues to develop Project One as GPLv3, or stops working on it, in which case the community forks it and continues working on it as GPLv3
3b. BigCo does a "clean-room" reimplementation of Project One and releases it under a proprietary licence. The community can still fork the older version and work on it, but BigCo can continue to develop and sell their "original" version.
Patents protect ideas; copyright protects artistic expressions of ideas.
I'm careful to not rely too heavily on VC funded open source whenever I can avoid it.
All so they could just vacuum it all up and resell it with impunity.
Feel free to prove me wrong by pointing out this massive amount of advocacy from "mega-clouds" that changed people's minds.
The ads, the mailing list posts, social media comments. Anything at all you can trace to "mega-clouds" execs.
https://choosealicense.com/about/
> "GitHub wants to help developers choose an open source license for their source code."
This was built by GitHub Inc a very very long time ago.
https://opensource.stackexchange.com/questions/1150/is-my-co...
So long ago, in fact, that it was five years before their acquisition by Microsoft.
And, sure, djb wasn't actually likely to sue you if you went ahead and distributed modified versions of his software... but no-one else was willing to take that risk, and it ended up killing qmail, djbdns, etc stone dead. His work ended up going to waste as a result.
Agreed with the rest, though. I relied heavily on qmail for about a decade, and learned a lot from the experience, even if it was a little terrifying on occasion!
Not everybody is dictated by corporate attorneys; I don’t think this is an accurate portrayal.
I mean philosophically and morally, sure, one can take that position ... but copyright law does not work like that, at least not for anything published in the US after 1989 [1].
[1] https://www.copyright.gov/circs/circ03.pdf
The ones pushing for permissive licenses are rather companies like Apple, Android (and to some extent other parts of Google), Microsoft, Oracle. They want to push their proprietary stuff and one way to do that in the face of open source competition is by proprietary extensions.
The FOSS community at large embraced permissive licenses and it had nothing to do with the interests of big corporations.
and that seems like a strange choice…
Could you say more?
Preferring GPL licensed software means that you're immune to a sudden cut off of access so it's always advisable - but it's really important to stay on top of dependencies and be willing to pay the cost if support is withdrawn. So GPL helps but it isn't a full salve.
https://x.com/AprilNEA/status/2034209430158619084
Ironically, this type of stuff really makes me doubt their AGI claims. Why would they bother with this if they were confident of having AGI within the next few years? They would be focused on replacing entire industries, and wouldn't even make their models available at any price. Why bother with a PaaS if you think you are going to replace the entire software industry with AGI?
Is this not just the strategy of all platforms? Spy on all customers, see what works for them, and copy the most valuable business models. Amazon does that with all kinds of products.
Platforms will just grow to own the whole market, hike prices, lower quality, and pay close to nothing to employees. This is why we used to have monopoly regulations, before being greedy became a virtue.
Just wait till they offer "Developer Certification" so you have to pay them to get a shiny little badge and a certificate while they go around saying no badge = you're shit.
Because AGI doesn't grow in a cage; it requires a piece of software running somewhere. Someone has to build both for that to happen. That is like a high-school-level question.
Even if this is true, someone needs to build the platform and the software required to get to the singularity.
One can also argue that lots of $ is required to get to the singularity, and taking control of how the world builds, deploys, and operates the digital world is a proven avenue to get such $.
Microsoft has been a reasonable steward of GitHub and npm, all things considered, but I don't feel so good about OpenAI. This makes me reconsider my use of uv, and of Python as a whole, because uv did a lot to stop the insanity. Not least, Microsoft has been around since 1975, whereas I could picture OpenAI vanishing instantly in a fit of FOMO.
That means OpenAI will be able to do whatever they want to your Python binaries, including every Python binary in your deployments, with whatever telemetry they want to instrument into the builds.
Once we start seeing OpenAI and Anthropic getting into certifications and testing, they'll quickly become the gold standard. They won't even need to actually test anyone. People will simply consent to having their chat interactions analyzed.
The models collect more information about us than we could ever imagine because, definitionally, those features are unknown unknowns for humans. For ML, the gaps in our thinking carry far richer information about us than our actual vocabularies, topics of interest, or stylometric idiosyncrasies.
There will come a day when you can will an entire business into existence at the press of a button. Maybe it has one or two people overseeing the business logic to make sure it doesn't go off the rails, but the point is that this is a 100x reduction in labor and a 100,000x speed up in terms of delivery.
They'll price this as a $1M button press.
Suddenly, labor capital cannot participate in the market anymore. Only financial capital can.
Suddenly, software startups are no longer viable.
This is coming.
The means of production are becoming privatized capital outlays, just like the railroads. And we will never own again.
There is nothing that says our careers must remain viable. There is nothing that says our output can remain competitive, attractive, or in demand. These are not laws.
Knowledge work may be a thing of the past in ten years' time. And the capital owners and hyperscalers will be the entirety of the market.
If we do not own these systems (and at this point is it even possible for open source to catch up?), we are fundamentally screwed.
I strongly believe that people not seeing this - downplaying this - are looking the other way while the asteroid approaches.
This. Is. The. End.
If the barrier to button-pressed companies goes that high up, the cost to run/consume the product also goes up. Making hand-rolled products cheaper.
Slower paced to roll out things? Sure.
That's the precarious balance these LLMs providers have to make. They can't just move on without the people feeding it data and value. The machine is not perpetual.
What if labor organizes around human work and consumers are willing to pay the premium?
At that point, it's an arms race against the SotA models in order to deepen the resolution and harden the security mechanisms for capturing the human-affirming signals produced during work. Also, lowering the friction around verification.
In that timeline, workers would have to wear devices to monitor their galvanic skin response (GSR) and record themselves on video to track their photoplethysmography (PPG). Inconvenient, and ultimately probably doomed, but it could extend or renew the horizon for certain kinds of knowledge work.
We could start today, but sweatshops and factories already dominate the items on our shelves.
But I’m sure people will draw the line at human made software…/s
What is far more likely is that governments use AI to oppress their citizens with robots and drones. That is the thing to be scared of.
I imagine many of these efforts benefitted the community as a whole, but it does make sense that the owners will have these orgs at least prioritize their own internal needs.
The issue that I see is that Nvidia etc. are incentivised to perpetuate that so the open source community gets the table scraps of distills, fine-tunes etc.
The tradeoff with these unified LPDDR machines is compute and memory throughput. You'll have to live with the ~50 token/sec rate, and compact your prefix aggressively. That said, I'd take the effortless local model capability over outright speed any day.
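The ~50 token/sec figure falls out of simple arithmetic: decoding is memory-bandwidth-bound, since each generated token streams roughly the whole weight set from memory once. A quick back-of-the-envelope sketch (the bandwidth and model-size numbers below are illustrative assumptions, not measurements of any particular box):

```python
def decode_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper-bound token rate for a memory-bandwidth-bound decoder:
    each new token reads roughly all model weights from memory once."""
    return bandwidth_gb_s / model_size_gb

# Illustrative: ~273 GB/s of LPDDR bandwidth feeding a ~5 GB quantized
# model tops out near ~55 tokens/sec, before any compute overheads.
print(f"{decode_tokens_per_sec(273, 5):.0f} tokens/sec upper bound")
```

Real throughput lands below this bound once attention-cache reads and compute are added, which is why aggressive prefix compaction matters on these machines.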
I hope the popularity of these machines prompts future models to ship in sizes that fit them perfectly: an 80 GiB quant for a 128 GiB box, a 480 GiB quant for a 512 GiB box, etc.
It's probably a trade secret, but what's the actual per-user resource requirement to run the model?
If the open weights models are good, there are people looking to sell commodity access to it, much like a cloud provider selling you compute.
Plus, most users don't want to host their own models. Most users don't care that OpenAI, Anthropic and Google have a monopoly on LLMs. ChatGPT is a household name, and most of the big businesses are forcing Copilot and/or Claude onto their employees for "real work."
This is "everyone will have an email server/web server/Diaspora node/lemmy instance/Mastodon server" all over again.
Like having a system prompt that takes care of the project structure, languages, libraries, etc.
It's pretty much the first step to replacing devs, which is their current "North Star" (to be changed to the next profession after)
Once they've nailed that, the devs become even more of a tool than they already are (from the perspective of the enterprise).
As long as they keep the original projects maintained and those aren't just acqui-hires, I think this is almost as good as we can hope for.
(thinking mainly about Bun here as the other one)
Once you’re acquired you have to do what the boss says. That means prioritizing your work to benefit the company. That is often not compatible with true open source.
How frequently do acquired projects seriously maintain their independence? That is rare. They may have more resources but they also have obligations.
And this doesn’t even touch on the whole commodification and box out strategy that so many tech giants have employed.
Or quit, and take the (Open Source) project and community with you. Companies sometimes discover this the hard way; see, for instance, the story of how Hudson became Jenkins.
The GPL (and even the AGPL) doesn't require you to make your modified source code publicly available (Debian explicitly considers licenses with that requirement non-free). The GPL only requires you to provide source to the people you distribute the software to; the AGPL extends that to users who interact with it over a network.
If AGI becomes available, especially at the local, open-source level, shouldn't all of this be democratized, meaning the AGI can simply roll out the tooling you need?
After all, AGI is what all these companies are chasing.
The ecosystem will be this way for a while, if not the new normal.
My question is if them gobbling up the alternatives will make room for other alternatives to grow.
The point is that the value of the accumulated know-how and skill that led to things like uv isn't lost even if the worst happens to the company or the people behind it. I don't think there are many signs of that. I don't think they had much of a revenue model around providing OSS tools; that's a problem for a lot of VC-funded companies. An exit like this is as good as it gets. OpenAI now pays them to do their thing. Investors are probably pretty happy. And we maybe get to skip the enshittification that seems inevitable with the whole IPO/hedge-fund circus that many VC-funded OSS companies end up being subjected to. Problem solved. Congratulations to the team. They can continue doing what they love in a company that clearly loves all things Python. And who knows what they can do next when freed from having to worry about making investors happy?
Big companies and OSS have always had symbiotic relationships. Some of the largest contributors to open source are people working at big companies, and OpenAI fits this tradition beautifully. Most big software companies actively contribute to OSS projects that are relevant or important to them, even very secretive companies like Apple and profit-focused sharks like Oracle, plus Google, Meta, and IBM. There are very few large software companies that aren't doing this. Without this very large-scale corporate sponsorship, OSS would just be a niche thing. Yes, there are a lot of small projects (I have a few of my own), but most of the big ones have for-profit businesses behind them.
The real meta question is of course if we still need a lot of the people centered development tooling when AIs are starting to do essentially all of the heavy lifting in terms of coding. I think we might need very different tools soon.
Equivalent or better tools will pop up eventually, heck if AI is so fantastic then you could just make one of your own, be the change you want to see in the world, right?
Ah yes, it was impossible to write software before these companies existed, and the only way to write software is via the products from these companies. They sure do control the "means of production".
The fuck does OpenAI have to offer?
Nothing I need.
The only reason Gemini is the best is UX; really, running my own Mistral 7B is more than fine.
Because slow-ass Gemini is still a slightly more convenient experience, I use it.
Nobody OWNS nor will own the means of essentially thoughts. It’s such a silly idea I wonder if it’s propaganda.
That's the problem, isn't it? OpenAI has nothing to offer, and neither do many of the other AI companies; nothing that isn't replaceable, at least.
They need to gobble up these other actors in an attempt to own something that someone might actually pay for. Anthropic seems successful with Claude Code and can drive sales that way.
OpenAI struggles to sell ChatGPT as a standalone offering, so they need products and services that will consume it, allowing them to push sales via that route. They also need to control those products, because LLMs are pretty much plug-able at this point, and that's no good. OpenAI needs a lock in.
Oh well. They’ll hopefully get options and make millions when the IPO happens. Everyone eventually sells out. Not everyone can be funded by MIT to live the GNU maximalist lifestyle.
I can't believe people say this with a straight face
It's fast enough for many use cases. That doesn't mean that there is no room for optimization, but this is far less a deciding factor these days.
> it's not type safe
You can do static analysis with Mypy and other tools.
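As a minimal sketch of what that buys you (`total_price` is a made-up function for illustration):

```python
def total_price(unit_price: float, quantity: int) -> float:
    """Annotated function; Mypy checks calls against these types."""
    return unit_price * quantity

print(round(total_price(9.99, 3), 2))  # 29.97

# A call like total_price("9.99", 3) still "runs" in CPython (string
# repetition yields "9.999.999.99"), but `mypy` on this file flags it
# as a type error before the code ever executes.
```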
> it has no real concurrency.
There are different mechanisms for running things concurrently in Python, and there's an active effort to remove the GIL. I also have to ask: what is "real" concurrency?
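One such mechanism, sketched with the standard library's thread pool: threads overlap I/O-bound waits even under the GIL, while CPU-bound work typically needs `ProcessPoolExecutor` or the newer free-threaded builds.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_io(task_id: int) -> int:
    time.sleep(0.1)  # stands in for a network or disk wait
    return task_id * 2

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(fake_io, range(5)))
elapsed = time.perf_counter() - start

print(results)                 # [0, 2, 4, 6, 8]
print(f"{elapsed:.2f}s")       # well under 0.5s: the five sleeps overlapped
```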
Admittedly, the things you mention are not Python's strongest points. But they are far from being dealbreakers.
Is it? We still need meatspace humans to vet what these AI agents produce. Languages like C++ / Rust etc still require huge cognitive overhead relative to Python & that will not change anytime soon.
Unless the entire global economy can run on agents with minimal human supervision someone still has to grapple with the essential complexity of getting a computer to do useful things. At least with Python that complexity is locked away within the CPython interpreter.
Also an aside, when has a language ever gotten traction based solely on its technical merits? Popularity is driven by ease-of-use, fashion, mindshare, timing etc.
Overall, the switch has been very much loved. Everything is faster and more stable, and we haven't seen much of a reduction in output.
And as someone who loves Python and has written a lot of it, I tend to agree. It's increasingly clear the way to be productive with AI coding and the way to make it reliable is to make sure AI works within strong guardrails, with testsuites, etc. that combat and corral the inherent indeterminism and problems like prompt injection as much as possible.
Getting help from the language - having the static tooling be as strict and uncompromising as possible, and delegating having to deal with the pain to AI - seems the right way.
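As one hedged example of cranking the static tooling all the way up, a `pyproject.toml` can pin Mypy and Ruff to their strictest modes (the specific rule selection here is illustrative, not anyone's official recommendation):

```toml
[tool.mypy]
strict = true            # bundles disallow_untyped_defs, strict Optional checks, etc.
warn_unreachable = true

[tool.ruff.lint]
select = ["ALL"]         # enable every rule, then opt out deliberately
```

The idea is that the checkers, not a human reviewer, become the first line of defense against whatever the AI emits.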
Secondly, it's nonfactual: Python's market share grew in 2025 [1][2][3], probably driven by AI demand.
[0]: even truer for natural languages.
[1]: https://survey.stackoverflow.co/2025/technology#most-popular...
[2]: https://survey.stackoverflow.co/2024/technology#most-popular...
[3]: https://pypl.github.io/PYPL.html
Do you really think AI agents of the future will be coding in Python??? What advantage would that possibly give them? That's the only laughable take here
Rust is great. But AI isn't displacing Python anytime soon.
What sucks more is that Astral's been bought by a company with such a horrible leader at the helm.
There is also a really good ecosystem of libraries, especially for scientific computing. My experience has been that Claude can write good c++ code, but it's not great about optimization. So, curated Python code can often be faster than an AI's reimplementation of an algorithm in c++.
If I ask an LLM or agentic AI to build something and don't specify what language to use, I'd wager that it'll choose python most of the time. Casual programmers like academics or students who ask ChatGPT to help them write a function to do X are likely to be using Python already.
I'm not a Python evangelist by any means but to suggest that AI is going to kill Python feels like a major stretch to me.
EDIT: when I say that Python can do anything any other language can do, that's with the adage in mind. Python is the second best language for every task.
My last two companies went all in on Python and really regretted it. Its performance and concurrency primitives really hurt as you scale.