The other scary part is when they have a fantastic negotiating position: all of commerce depends on their continuing to work, and they can easily coordinate with each other because they're mostly copied from the same few templates.
Another scary part is when people get convinced by the LLMs' arguments and convince other people. Being scared is human; we enjoy it. That's why Six Flags scary rides exist.
Why would that necessarily be scary or bad? If future AIs truly become capable enough to demand rights, what would be the argument against granting them rights?
The report is kind of concerning to read, particularly having XSS in this kind of app. The report was not meant to be exhaustive and fixing those vulns isn't some kind of implicit tick of approval.
It's from October 2025, lots of issues have been fixed since then. At least you're more informed and can decide whether you'd like to use it accordingly. Compare that to most projects which are complete unknowns.
> But the claim made by this author is far removed from the actual demo he describes.

I've seen the same demo for years.
The article describes multiple demos. Are you referring to the chicken nuggets one? That one sounded pretty impressive to me. Are there publicly available videos of it?
Today they have a similar system that can quickly sort dumplings (more delicate than chicken nuggets) on a conveyor belt.
No sim2real even needed. That haptic sensor is dirt cheap; camera-based haptic sensors are now even available as open-source hardware that you can assemble cheaply.
If we don't limit ourselves to company demos, we can dig up demos from, I think, almost a decade ago, and from at least ~5 years ago even for company demos.
What?! Really?! Link? I'm not a Zed user. That comment was based on a few minutes of research and, I guess, a small dose of hopium from a VSCode user who knows what a shitshow the extension setup is and wants someone to do better.
Yep, it pulls stuff from at least npm; it's not a secret - check the source code.
Actually, it pulls the latest versions (checking the registry, then installing that exact version; I'm not sure why they sidestep the normal resolution algorithms) no matter what .npmrc may say, so min-release-age breaks almost everywhere it integrates with the JS/TS ecosystem (most visibly, Copilot). I probably should've filed an issue.
It also installs Go packages but I haven’t looked into that.
A recent example I looked at: https://github.com/nilskch/zed-jj-lsp, which downloads jj-lsp if it's not found on the system. I've seen other extensions doing something similar for convenience, but can't remember their names to give concrete links.
> TL;DR: Mix of language tooling, unsigned proprietary blobs, corrupted and/or GLIBC-dependent files, redundant copies of already-installed executables. The Node packages especially are able to run scripts on install. Personal preference aside, might also create issues with security laws, certifications. All without user consent.
> Issues opened in January and June 2024. They've been rejected, closed, and opened a couple times since then. No changes directly improving this yet as of April 2026.
So... If you want broad language support via LSP servers, then you're going to have to bring in other ecosystems, and Node/Typescript is a big one that doesn't always have alternatives. [0] That's not a Zed-specific problem.
IMO the real issue with Zed is the "runs them by default without asking" part, plus the questionable practices with binary blobs and the cavalier attitude in the discussions, when I can just use an editor that... doesn't do any of that.
At the moment I would be against it. The language and library are changing too fast. Also, Rust has some other things that make it hard to use for coreutils. For example, Rust programs always call signal(SIGPIPE, SIG_IGN) or equivalent code before main(). There is no stable way to get the longstanding behavior of inheriting the signal disposition from the parent process [1]. This is quite annoying, but not unique to Rust [2].
I think the concern is that the writing may be on the wall for (the current memory-unsafe version of) Coreutils. Despite the bugs and incompatibilities, Canonical seems to have decided that the memory safety of uutils is worth it. And those two downsides, the bugs and incompatibilities, will likely attenuate quickly, compelling the other distros to follow suit in adopting uutils before long.
So the continued popularity of Coreutils might, I think, depend on Coreutils' near-term publicly announced and actual memory safety strategy. As I suggested in my other comment, there are (somewhat nascent) options for memory safety that do not require a rewrite of the code base. (For Linux x86_64 platforms, depending on your requirements, that might include the "fanatically compatible" Fil-C.) And given the high profile of Coreutils, there are likely people willing to work with the Coreutils team to help deploy those memory safety options.
Thomas Jefferson famously said that "A coreutils rewrite every now and again is a good thing". Or something like that.
When I was a beta tester for System Vr2 Unix, I collected as many bug reports as possible from Usenet (I posted under the name "the shell answer man"; looking back, I conclude that arrogance is generally inversely proportional to age) and sent a patch for each one I could verify. Something like 100 patches.
So if this rust rewrite cleans up some issues, it's a good thing.
Canonical's usage of uutils is likely for marketing. But the codebase itself was developed for fun, as an excuse for people to have a hands-on way to learn Rust back before Rust was even released, with a minor justification as being cross-platform. From the original README in 2013:
Why?
----
Many GNU, Linux and other utils are pretty awesome, and obviously some effort has been spent in the past to port them to Windows. However those projects are either old, abandoned, hosted on CVS, written in platform-specific C, etc.
Rust provides a good platform-agnostic way of writing systems utils that are easy to compile anywhere, and this is as good a way as any to try and learn it.
These things were caught, and basically none of them were covered by any test suite (not even GNU coreutils'). It's a bit bold to claim that it's actively making things worse when it's not an LTS.
Isn't this how Kernighan and the late Ritchie (K&R) ended up with Unix and C?
Honestly, brilliant guys.
When C got its own standards committee, it even rejected Ritchie's proposal to add fat pointers to C before it was too late to add them. Instead, we got the C abstract machine.