

2 years ago

AI is to computer science what black magic is to science.
Seriously, what do you get after spending days and days training a model? An inscrutable blob that may as well be proprietary software written for an alien CPU: studying it is damn near impossible, understanding how it works would take several lifetimes, and yet it works, so we trust these models and use them to solve problems that would normally be intractable with “real” computer science. And one day this trust will bite us in the ass, not in the form of an “AI rebellion”, but when every system that relies on AI becomes unreliable the moment it hits situations outside its training data.
Generally speaking, Linux needs better binary compatibility.
Currently, if you compile something, it’s usually dynamically linked against dozens of libraries present on your system, but if you give the executable to someone with a different distro, they may not have those libraries, or their versions may be too old or incompatible.
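To make the problem concrete, you can inspect exactly which shared libraries and versioned glibc symbols a binary demands (using `/bin/sh` here purely as an example; any dynamically linked executable works):

```shell
# List the shared libraries a binary needs and where *this* system resolves them.
# On another distro the same sonames may resolve to different versions, or be missing.
ldd /bin/sh

# Show the versioned glibc symbols the binary was linked against (e.g. GLIBC_2.34).
# If the target machine's libc is older than the newest version listed here,
# the dynamic loader refuses to start the program at all.
objdump -T /bin/sh | grep -o 'GLIBC_[0-9.]*' | sort -u
```

That second list is why a binary built on a new distro often won’t even start on an older one, regardless of which libraries are installed.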
Statically linking programs is often impossible and generally discouraged, making software distribution a nightmare. Flatpak and similar systems made things easier, but it’s such a crap solution: it basically means having an entire separate OS installed in parallel, with its own problems, like shipping a version of Mesa that’s too old for a new GPU. Applications should be able to be packaged with everything they need; there’s no reason for dynamic linking to be this central on Linux these days.
I’m not in favor of proprietary software, but better binary compatibility is a necessity for Linux to succeed, and I’m saying this as someone who’s been using Linux for over a decade and refuses to install any proprietary software. Sometimes I find myself running apps and games in Wine even when a native version is available, just to avoid the hassle of finding and probably compiling libobsoletecrap-5.so