

I’m primarily talking about the Win32 API when I talk about Windows, and for the Mac primarily about Foundation/AppKit (Cocoa) and the other system frameworks. What third-party libraries do or don’t do is their own thing.
There’s also nothing wrong with bundling specialized dependencies in principle, if you provide precompiled binaries. If the software is shipped via the system package manager, the package manager can handle the library versions, and in fact it should, as far as possible. Where this does become a problem is when you start shipping entire GUI toolkits (hello, bundled Qt, which breaks Plasma’s style plugins every time, because those aren’t ABI-compatible either).
On the other hand, think of the amount of time I’ve had to spend getting out of DLL hell on Windows. The Linux way is better and far more stable.
Try running an old precompiled Linux game (say, Unreal Tournament 2004). They can be a pain to get working. This is not just some “ooooh gotcha” case; it’s an important thing that’s missing for software preservation and cross-compatibility, because not everything can be compiled from source by distro packagers, and not every piece of unmaintained open-source software can be compiled on modern systems (and porting it might not be easy, because of the same problem).
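To make that concrete: an old binary’s DT_NEEDED entries name exact sonames (UT2004-era games typically want libstdc++.so.5), and if the distro only ships libstdc++.so.6, which isn’t ABI-compatible with .5, the dynamic loader gives up before main() even runs. A minimal sketch of the failure, assuming a glibc/ELF system (the soname here is just the classic example):

```c
/* Reproducing the “old game won’t start” failure by hand. At startup the
   dynamic loader does the equivalent of this for every DT_NEEDED entry;
   if any soname is missing, the binary never gets to run. */
#include <dlfcn.h>
#include <stdio.h>

int main(void) {
    void *handle = dlopen("libstdc++.so.5", RTLD_NOW);
    if (!handle) {
        /* On a modern distro this typically fails with something like
           “cannot open shared object file: No such file or directory”. */
        fprintf(stderr, "%s\n", dlerror());
        return 1;
    }
    dlclose(handle);
    return 0;
}
```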
I suppose what Linux is severely lacking is a comprehensive upwards-compatible system API (such as Win32 or Cocoa) that would reduce the churn between distros and between version releases. Something that is more than just libc.
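Even staying inside libc, compatibility only runs one way (build on an old glibc, run on a newer one), and going the other direction takes tricks. A minimal sketch, assuming glibc on x86-64, where memcpy@GLIBC_2.14 superseded memcpy@GLIBC_2.2.5: symbol versioning can pin a reference back to the old baseline so a binary built on a current distro still loads on older ones.

```c
/* Pin our memcpy reference to the old x86-64 baseline version so the
   resulting binary also loads on distros with an older glibc. Compile
   with -fno-builtin-memcpy so GCC doesn’t inline the call away. */
#include <string.h>

__asm__(".symver memcpy, memcpy@GLIBC_2.2.5");

int main(void) {
    char dst[8];
    memcpy(dst, "hi!", 4);
    return 0;
}
```

Win32 and Cocoa make the opposite promise: the platform keeps old entry points working, so shipped binaries don’t need per-release hacks like this.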
We could maybe have had this with GNUstep, for example (and it would have solved a bunch of other stuff too). But it looks like nobody cares about GNUstep, and people seem more interested in sidestepping the problem with questionably designed systems like Flatpak instead.
Ugh, that would complicate things. If that’s the case, all I can say is that it’s really negligent (and it goes back to what I originally said about the lack of a stable ABI really ruining Rust for me; technically I said static linking, but that’s really the core issue).
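For reference, the usual workaround: since rustc’s native ABI can change between compiler releases, a Rust dynamic library that has to outlive one toolchain exposes a C ABI (built as crate-type = ["cdylib"], with #[no_mangle] pub extern "C" exports) and is loaded like any other shared object. A sketch of the consuming side; the plugin name and symbol are hypothetical:

```c
/* Loading a hypothetical Rust cdylib through its C-ABI boundary; the
   C ABI is the only interface that stays stable across rustc versions. */
#include <dlfcn.h>
#include <stdio.h>

typedef int (*plugin_process_fn)(int);

int main(void) {
    void *handle = dlopen("./libplugin.so", RTLD_NOW);
    if (!handle) {
        fprintf(stderr, "load failed: %s\n", dlerror());
        return 1;
    }
    plugin_process_fn process =
        (plugin_process_fn)dlsym(handle, "plugin_process");
    if (!process) {
        fprintf(stderr, "missing symbol: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }
    printf("plugin_process(21) = %d\n", process(21));
    dlclose(handle);
    return 0;
}
```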