To be fair, if you use only the Linux syscall interface, then a program that you compiled on x86 in the 90s will probably still run anywhere today. Linus is adamant about this.
But if you want to use... anything else, then it's unlikely to work at all unless you are very specific about your target. There isn't one company deciding that glibc or mesa or whatever is binary backward compatible on every kernel for every platform forever. Microsoft is, somewhat, one such company. That's why System32\*.dll have such stable interfaces -- it's their job to translate whatever late 90s system/graphics facilities some boomer dreamed up into whatever the current Windows hodgepodge of system services supports. It's no wonder Microsoft is trying to drop support for hardware like crazy.
This implicit compatibility isn't true for all Windows programs, though. Consider Visual Studio: I couldn't just compile my console program on my computer and run it on my dad's computer. He had to first install the "Redistributable," which for him and most people might as well be a rootkit and a super scary virus.
This is frankly the opposite of how it should've been. Who cares that the kernel ABI doesn't change from release to release? As an end-user, I couldn't care less. Even as a developer I care very little indeed, because I'm not writing drivers. I'm almost never programming directly against the kernel's interface but rather using my language's standard library, which is already an abstraction over the kernel interface.
Platform vendors (this includes distro maintainers) should recompile and re-package libraries for each new ABI in each OS release, done. End-users won't ever notice, they can run their applications portably because the user-mode library ABIs haven't changed... Which is what Win32 has done, and what glibc has consistently failed to do.
The Unix world was lazy about it because its approach was recompiling across somewhat source-compatible systems, thanks to POSIX; that gave you reasonably fast portability if you didn't stray too far off the beaten path.
But doing anything other than C (with Cfront, maybe), Fortran, and Pascal was a problem, even before you get to binary compat -- even from version to version (a legacy we still have today in glibc breaking binary compat all the time).
Microsoft went hard on the idea that if you bought or built a program for Windows version X, it would run on version X+1. You didn't have to buy a special upgraded version. Upgrading was easier.
The same approach later drove the introduction of things like the PC System Design Guide and ACPI, so you could just upgrade your computer instead of waiting for a special OS upgrade just to boot (as was common on other platforms, including the Mac, VMS, and the Unix workstation world).
Design-wise, the GUI parts of WinAPI aren't all that different from working with X Intrinsics and similar libraries (i.e. the parts above raw Xlib).
How is that successful cross platform?
Arguably there has been equal or larger effort invested/wasted in cross-platform and cross-distro frameworks/APIs/packaging and yet the result still doesn't work. Partly that's due to duplication of effort; there's (mostly) one WINE competing with Qt/Gtk/whatever times Snap/Flatpak/AppImage/whatever.
Because hundreds of developers, multiple open source projects and the backing of major corporations made it happen, not because Microsoft wanted it but in spite of it.
In this case the route to success was via marketing (isn't it always?), via market share, via application dominance (attracting developers to develop for the platform), and via insane levels of backwards compatibility. It was successful not because of the code itself (end users don't care two figs about the elegance of the code) but because they optimized for the end-user experience.
Linux optimized for the experienced, technically adept user, who wanted to fiddle, customize and could write programs. Apple optimized for the "now", ignoring the past and regularly made existing programs obsolete and unrunnable.
I wrote Windows programs in 1995. They still run today. They have run on every version of Windows since then, without even a recompile. Everything I have [1] just keeps running. And it turns out that's something users really want.
I get that we're all technical folk here. I get that we strive for technical excellence and elegance. I get that we operate in the "now", ignoring hardware and software from the past. But the market is different, and wants different things. If you want a successful business you need to understand the market, not just your own aesthetics. Microsoft understands that, and that's why the market (especially the business market) relies on them.
[1] - Except games. Copy protection on some of my games means they don't run anymore - but to be fair those were hacks designed specifically to prevent the game running in the first place.
It's an OK success.
A gigantic success would mean there's no friction at all running Windows apps on other platforms.
Even with all that development work it is a LONG way from easy to run Windows apps on Linux.
Will I ever target it? No, I'd rather you rip my bones and eyes out. But it's unarguably successful.
I guess he missed HTTP/3, which now makes up 35% of web traffic.
So IETF is the "big, top-down standardisation body" producing "bloated, inefficient and largely unused standards" here?
Wouldn't be the first time someone characterized it as such.
> The internet is an example of the implementation of a top-down approach when a scientist submits a paper for an RFC and iterates on it until it's de-facto protocol of the internet.
Yeah, he must be talking about the IETF. Very consensus-driven, most participants funded by vendors, difficult to iterate after RFC approval.
> While their nimbler competition is being adopted, iterated, and expanded. In the internet protocol use cases, OSI Model is now essentially just a theory taught in networking training, certification, and classes. In the real world, the internet is TCP/IP, and it's TCP IP that runs on computers, phones, and other devices.
Now I'm confused. TCP/IP are literally defined by IETF RFCs.
Microsoft, Google, and Apple have invested millions to polish their GUI solutions because that’s where their revenue comes from.
Doing it that way works great for open source where anyone can recompile the software for a new target, but for proprietary software they would have given you a Windows blob and that's about it.
Meanwhile the problems with Java were mostly not the JVM. Its current problem is, of course, Oracle.
Interesting provocative article, I bet it will be praised on some Microsoft sponsored conference.
Wine and Proton are not tributes to Win32's portability. They are symptoms of a desktop market that Microsoft locked hard enough that the rest of us had to reverse engineer our way out. Market damage, not collaboration.
The ecosystem was not won on technical merit. OEM per-processor licensing, embrace-extend-extinguish against Java and the web, document format lock-in, and a long pattern of obstructing standardization attempts that would constrain Windows (PWI in 1994, ECMA-234 in 1995, OpenDocument later) while pushing their own through when it extended reach.
No CS curriculum holds up Win32 as exemplary API design. No system copied it. A successful API earns adoption. Win32 enforced it.
With Linux, you have to target specific distros, do something insane like a giant bundle of everything, or static linking or some other craziness, or open up your source code and let someone else take the headache. Oh, and I almost forgot: install scripts that detect distros and install dependencies. And god help you if you need to ship a kernel module.
>The ecosystem was not won on technical merit. OEM per-processor licensing, embrace-extend-extinguish against Java and the web, document format lock-in, and a long pattern of obstructing standardization attempts that would constrain Windows (PWI in 1994, ECMA-234 in 1995, OpenDocument later) while pushing their own through when it extended reach.
Windows has broad hardware compatibility, a stable enough application platform (see above), aggressive backward compatibility, a large developer ecosystem, and distribution through OEMs. Those are technical merits, even if they are not the only merits.
Early Java was horrid for everybody except the architecture astronauts who could cram ten GoF design patterns into a hello world program. It only got traction because a different wannabe monopolist, Sun Microsystems, spent heavily to get it pushed into CS curriculums. Fortunately, the one-two punch of Linux and Intel killed Sun or we might all be cursing them today instead of Microsoft.
[1] https://en.wikipedia.org/wiki/United_States_v._Microsoft_Cor....
The Java stuff wasn't even the craziest part. The whole thing, from investigation to appeal, took the FTC and DOJ 11 years, and they were still unable to kill off Microsoft's dominance with Internet Explorer through lawfare. It took Mozilla, and later Google, only six years, nearly half that time, for an open-source web browser to overtake Internet Explorer in market share.
It turns out that a better product was all that was needed. It's too bad that the Mozilla Foundation has changed course and is now adamant that Firefox be as unusable as possible.
[EDIT] I'd actually say MS losing the Visual J++ lawsuit to Sun was a net positive, since it gave Hejlsberg the opportunity to create C#.
It was only after they fell apart and got bought by Oracle that things like the killing of OpenSolaris and the Java lawsuits started happening.
That is not what happened. Sun Microsystems had immense revenue and clout in the server and enterprise space because of the dotcom boom, so much so that their advertising declared "We're the dot in dotcom." Microsoft was trying to duke it out with them in the server space but Windows Server was just barely starting to become decent at that point so MS didn't get all that much traction.
When the dotcom bust hit, Sun went into a tailspin because of the glut of Sun server hardware from dead dotcoms at bargain basement prices. That eventually passed but by that time Linux + Intel was good enough to undercut both Sun and Microsoft in the server space. With no way to compete with free as in beer software, Sun was doomed.
Which is why Microsoft had to use such dirty tricks to prevent them from making inroads into workstations and desktops at the point that they still had competitive hardware.
> With no way to compete with free as in beer software, Sun was doomed.
Sun was a hardware company that did everything it could to commoditize software. That strategy works extremely well for hardware companies -- Intel successfully did the same thing for many years -- as long as their hardware is competitive.
They were perfectly content to sell SPARC hardware with Linux on it. But to do that they needed to sell enough of it to keep up the R&D, i.e. they needed to ship desktop chips in similar quantities to Intel instead of only servers.
That came at a cost and the market size of people that really really wanted / needed that field toughness was considerably smaller than the general office usage market.
Whereas if not for that, you could do both. Design a solid chip and then put dozens of them in a big iron cabinet for big money but also offer desktops with just one of them for prices that compete with Dell. Except that Dell's customers expect to open their existing Office documents and run their Windows API proprietary software and then won't buy from you.
Not even those that have Android/Linux NDK builds bother with porting to GNU/Linux.
Besides blaming Microsoft, look inward at the endless reboots of the audio stack, GNOME vs KDE vs XFCE vs Sway vs whatever is cool in Linux desktops this month, X11 vs Wayland, ...
I was a believer, until 2010, then went back into Windows 7. If it wasn't for gaming and .NET, I would probably be on macOS instead.
Taking care of Linux deployments is part of my job, so I know pretty well how it goes today; I don't need the standard "have you tried..." Linux forum replies.
> Not even those that have Android/Linux NDK builds, bother with porting to GNU/Linux.
It is a huge hassle to make a new build for a new platform. You double your build system, release management, and testing work compared to just one platform. Games are complicated, and testing all the dynamic behaviour is also complicated.
Making just a Win32 build really saves resources.
Also, Win32 has been a stable API for a long time. Linux APIs tend to change, and old games don't get rebuilt. The Win32 build is therefore also demonstrably a lot longer lived compared to anything you build on Linux.
That's also important because of the Stop Killing Games effort and so on.
Valve basically failed to provide the business value for those studios.
There has been little to no interest in doing the reverse, at least until WSL, which is just containers anyways. (WSfU barely counts as an "attempt.")
I would hardly consider anything relying on a compatibility shim "compatible." Especially since Wine is not a perfect shim!
You can't just go and download a precompiled blob from a website and run it everywhere, like you can with macOS and Windows.
glibc only targets one audience, one which can recompile its apps when needed.
What linux badly needs is a stable ABI for Userspace Apps, and Win32 is just that sadly.
Sure you can! It's called AppImage or Flatpak or Snap!
I'm also not sure why compiling is treated as some taboo? It's not like Windows where it's actually impossible to set up a toolchain. Your distro comes with one installed! So that means you can run a single installer file, just like in Windows, except this is a shell script or anything else—and it can just compile and link everything for you, quite easily! The user doesn't have to know or understand how this works at all.
Why is that bad? It's bad to run binary blobs...! It's good to tailor your software to your specific environment and hardware!
Virtually all software for the desktop is compiled once and then shipped to users and they never see the source.
This works for Windows and macOS because their ABIs allow it. For Linux you have to target each and every distro as if it were a whole OS, and keep up with it, or your app won't run anymore after a few years.
This is also the reason Linux has package managers while Windows didn't have an official one for nearly 40 years. So it's not all bad.
But yeah, then they need to track distros and such. I hope a couple of distros eventually offer better backward compatibility.
What about Linux do you think changes so much? Everything still speaks X11 or PulseAudio on desktop. More broadly, the standard library is...the standard library? What's the specific issue?
The article defines "success" in the Windows context as being "available everywhere". It does not address how it got to that point.
And sure, you might not like Microsoft, and you may not like how it became successful (using the above definition) but the fact that it is available everywhere is not in dispute.
Of course most successful things have murky pasts. We don't necessarily agree with how it got there, but there it is. That is, at least in the technical sense, irrelevant. You may prefer LPs or CDs, but streaming is now the successful way to get your music.
That doesn't mean it's the only way though, and of course you are free to not use Windows programs, or play games via Steam etc. That is your choice.
In my experience, even with games that have native Linux support, running through Proton seems to have fewer issues.
I emphatically disagree. It is a hilarious and catastrophic failure of Linux userspace that the best API for running games on Linux is Win32. This has absolutely nothing to do with Microsoft locking down the desktop market. It has 100% literally everything to do with Linux userspace being a clusterfuck of terrible design.
Linux adopted Win32 because it actually worked. They didn't have to. They could have simply invented a better API that didn't suck. But that's quite hard.
So yeah. I emphatically, but respectfully, disagree with your entire thesis.
> Linux adopted Win32 because it actually worked.
WINE would have been invented one way or another because enough people would have wanted to run Windows programs on Linux in a world where Linux had dominant market share. For Pete's sake, there are Commodore 64 emulators for Linux in a world where that system has been dead for decades. It has nothing to do with what "works" or not. WINE and Proton are developed as actively as they are today because Microslop has been able to market so effectively to convince average joes and business leaders to buy their crappy OS. This has nothing to do with the quality of APIs.
> They could have simply invented a better API that didn't suck.
Is there ever a situation where this statement isn't true? Everything built in software can be seen as sucking, and all software could have been written better the first time around.
No, it is Linux's severe failure that it lacked a singular, stable, and unified userspace GUI API.
Careful, some people are more hell-bent on ideologies than on making a reliable product. They'd never understand this.
This is one of the reasons why open source (free as in beer) will never work for anything serious. When my work depends upon a software someone made for free, there's an unnecessary power dynamic in play where since I didn't pay for it, they can rugpull me anytime. Before someone comes with the argument of forking, that's not how an economy works. I can't be the farmer, the truck driver, the salesman and sometimes even the buyer at the same time.
The kernel doesn't owe anything to the distros, which is insane; the distros don't owe anything to the various libraries and vice versa; and none of them owes anything to the application developers. It's a boat where each person is rowing in their own direction, because they don't care what others do with their code and they don't have to. Because they are working for FREE®.
The effect of this is catastrophic. Every time I try to switch, I can't find a single userspace application that works half as well as Windows applications. People need incentives to make their work good; no one does anything for free. Free just means it sucks.
If microsoft/apple/google tomorrow releases their own distro, every single one will abandon their flavor of the week arch/ubuntu/mint/fedora for that one. One that's made by people in exchange for money. I bet the ubuntu developers use macbooks.
Completely untrue - Linux "adopted" Win32 because the majority of video games are written for Windows (and thus Win32).
They could not have invented a better API because the entire reason Proton exists is because developers don't build native Linux games.
Correct. However, you grossly misunderstand why devs don't build native Linux games. The answer is that distributing proprietary software that works on "Linux" is an absolute unmitigated clusterfuck of pain, misery, and woe.
Cross-platform is very, very easy. It's a solved problem. There are indie devs with custom engines that can easily ship for Windows, macOS, iPhone, Android, PlayStation, Switch, and Xbox.
If Linux userspace had an ABI that wasn't garbage, then adding "Linux" to that list would be very easy. The fact that devs don't is your primary clue.
I keep putting "Linux" in quotes because back in the day, when r/LinuxGaming would spam Kickstarters begging for Linux support, that's what they asked for. Of course there is no such thing as shipping for Linux. There is the Linux kernel and glibc and a kajillion different distros that are all unique and terrible in a myriad of different ways.
And it turns out that Linux is such a minefield clusterfuck that the actually best ABI is Win32. It'd be great if Linux designed a new ABI that was better. Seriously, that'd be awesome. But in the meantime Win32 it is!
Studios don't target Linux, they target Windows and sometimes Mac.
Imagine if Flappy Bird targeted only iPhone, because there were only 50,000,000 Android users in the world (hardly worth supporting). Then Android creates an iPhone runtime on Android so people can play Flappy Bird on it, and you conclude "iPhone actually worked, this is evidence that Android is a hilarious and catastrophic failure."
The calculus is very very very boring and simple. Game devs will support every single platform on the planet in which the cost to support that platform - both directly and long-term maintenance - is less than the increased revenue that platform provides.
It is not uncommon for indie games on custom engines to support Windows, Playstation, Switch, Switch 2, Xbox. And, depending on the game, iOS or Android. Sometimes macOS although that's increasingly rare.
Native Linux builds are pretty rare, especially before Proton got pretty good with the Steam Deck release.
Supporting Linux is a monumentally tremendous pain in the ass. Radically more than literally any other platform. It is hands-down the hardest and most painful to support natively. So painful that emulating Win32 is a clear win. Valve's runtime helps a lot. And supporting another path is just a waste of time.
Very sad!
Companies end up not doing profitable things all the time, for many reasons. One rational reason is that while action 1 might be profitable, action 2 is even more profitable. So the fact that Linux is not supported does not show that it would not be profitable, but rather that there are other things the companies can use labour for which they think is even more profitable. If they could freely clone their employees (and "unclone" them afterwards) all profitable things would get done.
(This is just nitpicking about your economic argument, I have no reason to think your conclusion is wrong).
Funny, I have the same feelings after 5 seconds of using MSVC or looking at Win32 documentation. Or is it WinRT now, or is it .NET Core, or .NET Framework, or UWP or OLE or COM, or whatever the API du jour is which will be slightly incompatible and incomplete with the rest of the ecosystem in poorly documented and inscrutable ways?
Performance profiling and debugging tools are critical for game development. What's your equivalent to strace again, the one that's built into the system natively? There isn't one?
All major game engines I am aware of support native Linux builds and have for years, anyways.
I guess there's a reason 80% of the servers in the world run Windows. Because it's so hard to develop for. Er, uh...no wait!
> Performance profiling and debugging tools are critical for game development.
Profiling and debugging tools are RADICALLY superior on Windows. RADICALLY. GDB/LLDB is garbage. For debugging, Visual Studio (for adults, not VSCode), or on special occasion WinDbg, is great. Raddbg may be awesome some day and may also support Linux. That'll be great. Today is not that day.
Superluminal is spectacular. They're working very hard on a Linux version. It's taking them a long time because Linux is bad.
> All major game engines I am aware of support native Linux builds and have for years, anyways.
Unity and Unreal do have buttons to export to Linux. Most proprietary game engines don't have Linux clients. Linux for headless servers you control is fine.
> 80% of the servers
Yawn.
The Linux pain is trying to deploy proprietary binaries that run on customer machines which are infinite in variation. Running headless on a single Linux image you control is very different.
Anyhow. Let me know when you ship a game with 3D graphics to customers and have to deal with all their support issues!
Good luck! Then we can switch places so you can install for the 5.3% (and growing) of your gaming userbase that doesn't use an OS with ads in it.
P.S. he who lives in the house of WinDbg is not allowed to throw stones. At anyone. Ever. Nobody thinks that it is "great," you must be kidding.
It's the reason why Linux still hasn't taken off on desktop. People want good application support, not infinite customisations and blazing fast compile speeds. In fact, slow compile speeds might even give people an excuse to go chat with colleagues and take breaks. What you want is not what others want too
You know which platform is super duper mega easy to cross-compile to? And you know on which platform it is (almost) FUCKING IMPOSSIBLE to compile against an arbitrarily old version of glibc? The answer (in order) is Windows and Linux.
But in any case you still have it wrong. Compiling and running for your own machine isn't the problem. The question is: can you give me a binary that runs on my machine? And also, I'm not going to tell you what the environment is. But I will yell at you if it doesn't work.
Anyhow. Portable toolchain install for MSVC + WinSDK took about 30 seconds to download. Very easy. Here you go: https://gist.githubusercontent.com/mmozeiko/7f3162ec2988e81e...
I do agree it's annoying this isn't default behavior.
> you must be kidding.
Geez, you are very frustrating to communicate with. I literally said "on special occasion". Good grief. Visual Studio debugger is kinda mediocre but still best-in-class, and Linux doesn't even have an equivalent to compare against. WinDbg has some slick commands for super niche cases. Awful GUI with no discoverability though. (Just like Linux! Ba dum tss.)
This control flow has been taught everywhere and is the basis of Node's async loop (Node waits on I/O, WinAPI waits on keyboard/mouse/timers), so "no system copied it" is complete ignorance. It is the first thing I think of when I'm designing an async flow.
This is always the case.