Contrary to common understanding, LLVM wasn't the very first one, and neither was ACK; there are others predating both if you dive into the compiler literature.
ThreadX.
https://en.wikipedia.org/wiki/ThreadX#Products_using_ThreadX
This RTOS was later rebranded as Microsoft Azure RTOS, and later still made FOSS:
https://www.theregister.com/2023/11/28/microsoft_opens_sourc...
ThreadX is the RasPi firmware. The GPU is the primary processor of the Pi: the ARM cores are essentially just co-processors.
The surprise comes when you try to compile the minimal book version and find out that it is not as lean as presented in the book but actually depends on hundreds of assembler files (see https://github.com/rochus-keller/Minix3/tree/Minix3_Book_TCC).
Tanenbaum explicitly mentions multiple times that the book is a subset of the code because it would be too long to print with the library. So he covers mostly the main areas.
But the source code, in its entirety, is mounted under /usr/src, and it has all the assembly files in ACK syntax, mostly under lib I believe. You can compile it with a make command and it works as expected.
The author makes it seem like there's some terrible thing here. Am I missing some gory directory? Yes, the ACK syntax would need to be ported over to something more modern like NASM or FASM if someone wants to move the whole kitchen sink, with new linker scripts made as a result of the exported symbols, etc. It is painful, but alas, so is the archaic K&R C.
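For a flavor of what such a port involves, here is a rough side-by-side, from memory of the Minix-era ACK assembler conventions (not checked against the actual sources; the NASM half is a hypothetical translation):

```asm
! ACK syntax, roughly as it appears in the Minix .s files:
! '!' starts a comment, .define exports a symbol, .sect picks
! a section, and memory operands are written in parentheses.
.sect .text; .sect .rom; .sect .data; .sect .bss
.define _example
_example:
	push	ebp
	mov	ebp, esp
	mov	eax, (_some_var)	! load from memory
	pop	ebp
	ret

; Hypothetical NASM equivalent: ';' comments, 'global' for exports,
; 'section' directives, and square brackets for memory operands.
section .text
global _example
_example:
	push	ebp
	mov	ebp, esp
	mov	eax, [_some_var]	; load from memory
	pop	ebp
	ret
```

Mechanical in principle, but multiplied across hundreds of files, plus the linker-script work for the exported symbols, it adds up.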
I don’t know if that’s necessary though? It sounds like a waste of time to begin with.
I mean this book is ancient, and nobody really uses 32-bit protected mode. I’m mostly doing it out of curiosity even though I already stood up a small 64-bit long mode thinger.
Let me know what I’m missing!
Unfortunately, when MINIX3 was started, it was copied directly from MINIX2 and a lot of interesting stuff was left out.
I assume you mean because the assembler was manually migrated in later Minix versions, not because there is a tool which can do so automatically. Or did I miss this?
> and a lot of interesting stuff was left out
Can you please give examples of what you mean specifically?
One example is the new compiler driver that can drive both ACK and GCC, and that can automatically convert assembler and figure out which archiver to use to create libraries.
Another example is library support for filenames longer than 14 characters that was completely transparent. MINIX3 instead broke all backward compatibility by increasing the size of directory entries to a new fixed size (see the sketch below).
I'm sure there is more stuff; these are just a few I remember.
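To make the directory-entry change concrete, here is a sketch in C. The V7-style layout that MINIX 1/2 used is well known; the MINIX3 field sizes are from memory and may be off:

```c
/* Classic V7-style directory entry, as used by MINIX 1/2: a fixed
 * 16 bytes per entry, which is where the 14-character filename
 * limit comes from. */
struct direct {
    unsigned short d_ino;       /* inode number; 0 marks an empty slot */
    char           d_name[14];  /* not NUL-terminated at exactly 14 chars */
};

/* MINIX3 moved to a bigger fixed-size entry instead of extending
 * names transparently in the library, breaking on-disk
 * compatibility. Roughly (exact sizes from memory): */
struct direct_v3 {
    unsigned int d_ino;         /* wider inode number */
    char         d_name[60];    /* longer name, but still a fixed size */
};
```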
2025: https://news.ycombinator.com/item?id=42833638
2020: https://news.ycombinator.com/item?id=22310987 and https://news.ycombinator.com/item?id=22612420
But the repository is not "everything you need"; it actually relies on a lot from an existing platform: GCC, Lua, Make, Python, etc. So it seems you would typically use this to cross-compile.
It seems to be free now anyway, since 2005 according to the git history, under a 3-clause BSD license.
[1] https://www.gnu.org/gnu/thegnuproject.en.html
" Shortly before beginning the GNU Project, I heard about the Free University Compiler Kit, also known as VUCK. (The Dutch word for “free” is written with a v.) This was a compiler designed to handle multiple languages, including C and Pascal, and to support multiple target machines. I wrote to its author asking if GNU could use it.
He responded derisively, stating that the university was free but the compiler was not. I therefore decided that my first program for the GNU Project would be a multilanguage, multiplatform compiler."
And not only was the university 'free' while the compiler was not, neither was Minix, which was put out there through Prentice Hall in a series of books that you had to pay a fairly ridiculous amount of money for, even if you were a student there.
So the VU had the two main components of the free software world in their hand and botched them both because of simple greed.
I love how RMS has both these quotes in the same text:
"Please don't fall into the practice of calling the whole system “Linux,” since that means attributing our work to someone else. Please give us equal mention."
"This makes it difficult to write free drivers so that Linux and XFree86 can support new hardware."
And there are only a few lines between those quotes.
This explains the final 2 sentences of the original Linux announcement:
> PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that's all I have :-(.
The book publisher is blamed for preventing Minix from being freely distributed: https://en.wikipedia.org/wiki/Minix#Licensing
Ironically, that single-threaded nature of the FS made it a perfect match for my own little OS, and I happily hacked it to pieces to bootstrap it, using message passing into an FS executable. That trick probably saved me a year in bringing up the kernel to the point that the OS could compile itself, which greatly sped up development.
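For readers who haven't seen it: the MINIX FS runs as an ordinary process that clients talk to with synchronous messages, which is what makes a trick like this possible. A minimal sketch of the idea in C, with toy names and a toy message layout (the real MINIX message is a fixed-size union, and the actual request codes and process numbers differ):

```c
/* Toy sketch of driving a single-threaded FS server via messages. */
typedef struct {
    int   m_source;   /* filled in on reply: which process answered */
    int   m_type;     /* request code going in, status coming back */
    int   m_fd;       /* file descriptor */
    int   m_nbytes;   /* byte count */
    char *m_buffer;   /* caller's buffer */
} message;

/* MINIX-style primitive: send a request and block for the reply in
 * one step (modeled on MINIX's sendrec call). */
extern int sendrec(int proc_nr, message *m_ptr);

enum {
    FS_PROC_NR = 1,   /* hypothetical process number of the FS server */
    FS_READ    = 5    /* hypothetical request code */
};

/* read() becomes one synchronous round trip to the FS process.
 * Because the server handles one message at a time, the FS code can
 * stay single-threaded and oblivious to who is calling it. */
int fs_read(int fd, char *buf, int nbytes) {
    message m;
    m.m_type   = FS_READ;
    m.m_fd     = fd;
    m.m_nbytes = nbytes;
    m.m_buffer = buf;
    if (sendrec(FS_PROC_NR, &m) != 0)
        return -1;            /* the message exchange itself failed */
    return m.m_type;          /* reply: byte count or negative errno */
}
```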
Not to defend the textbook grift or the lack of vision here, but I strongly suspect an undergraduate minix course taught at VU would be very good. It’s not obvious to me that it would be inferior to the xv6-based course taught at MIT, for example.
Note that all I'm doing here is taking AST at his word that he developed Minix solely because the source to Unix wasn't free for universities to hack on. They could have adopted Linux from the day it became available, or at least from the beginning of the next academic year.
I remember taking a security-oriented class ages ago and hacking on an operating system that was already as dead as a trilobite, and we were all smart enough to realize this was not a triumph we'd be bragging about to our future children (or recruiters). Bleh.
So that already suggests a fantastic way to make some progress.
I think Tanenbaum had a unique vision at the time, but he went about it in the most ham-handed manner possible, and if not for the VU, Minix wouldn't even be remembered today. Linus had a huge advantage: he didn't have a lifestyle to support just yet.
For MINIX the situation was different and, I think, more unfortunate. AST wanted to make sure that everybody could obtain MINIX and made his publisher agree to distributing the MINIX sources and binaries on floppies. Not something the publisher really wanted; they wanted to sell AST's book. In return, the publisher got (as is usual for books) the exclusive right to distribute MINIX.
Right at the start that was fine, but when Usenet and the Internet took off, it became quite painful, with people trying to maintain and distribute patch sets.
A friend of mine was studying under Andy, and I had a chat with him about this at his Amstelveen residence prior to the release. He was dead set on doing it that way. As a non-student and relatively poor programmer, I pointed out to him that his chosen strategy would make Minix effectively unaffordable to me, in spite of his stated goal of 'unlocking unix'. So I ended up in Torvalds' camp when he released Linux as FOSS (I never contributed to either, but I figured as a user I should pick the one that would win the race, even if from a tech perspective I agreed more with Tanenbaum than with Torvalds).
Minix was (is?) flogged to students of the VU for much longer than was beneficial to those students; all that time and effort (many hundreds of man-years by now) could have gone into structurally improving Linux. But that would have required admitting a mistake.
MINIX was originally a private project of ast. It worked very well for the goal of teaching students the basics of operating systems.
One thing that might have been a waste of time is making the MINIX utilities POSIX compliant. Then again, many students would like an opportunity to work on something like that. The ones that wanted to work on Linux could just do that. Students worked in their free time on lots of interesting projects that were unrelated to the university.
Sure, but time is a very finite quantity. Wasting a couple of years on Tanenbaum's pet project may have left some residual knowledge about how operating systems work in general, but most of the developments they pursued were such dead-ends that even outside of the VU there was relatively little adoption. The world had moved to Linux and the VU refused to move with it.
From being ahead they ended up being behind.
Some people spent a lot more time on MINIX, but that was either as a hobby or as PhD students working on MINIX3. And MINIX3 generated lots of papers, including a best paper award, so it can hardly be seen as wasted from an academic point of view.
Another interesting fact: before Linux came to be, GCC only became relevant because Sun started the trend among UNIX vendors of splitting UNIX into user and developer SKUs, thus putting the whole development tooling behind an additional license.
I'll be honest, I don't understand your point here?
I'm not sure if I'm reading satire or they are having some fun trolling.
https://en.wikipedia.org/wiki/Vrije_Universiteit_Amsterdam
Amusingly, the Dutch verb "vrijen" does, in fact, mean to have sex.
"Vrij" == "free" (adjective), and "vrijen" == "to court/kiss/have sex" (verb, contextual).
Linux the kernel has the drivers.
https://compilers.iecc.com/comparch/article/92-04-041
UniPress made a PostScript back-end for ACK that they marketed with the NeWS version of Emacs, whose slogan was "C for yourself: PostScript for NeWS!"
https://news.ycombinator.com/item?id=42838736
>UniPress ported and sold a commercial version of the "Extended Amsterdam Compiler Kit" for Andrew Tanenbaum for many CPUs and versions of Unix (like they also ported and sold his Unix version of Emacs for James Gosling), so Emacs might have been compiled with ACK on the Cray, but I don't recall.
>During the late 80's and early 90's, UniPress's Enhanced ACK cost $9,995 for a full source license, $995 for an educational source license, with front ends for C, Pascal, BASIC, Modula-2, Occam, and Fortran, and backends for VAX, 68020, NS32000, Sparc, 80386, and others, on many contemporary versions of Unix.
>Rehmi Post at UniPress also made a back-end for ACK that compiled C to PostScript for the NeWS window system and PostScript printers, called "c2ps", which cost $2,995 for binaries or $14,995 for sources.
>Independently Arthur van Hoff wrote a different C to PostScript compiler called "PdB" at the Turing Institute, not related to c2ps. It was a much simpler, more powerful, more direct compiler written from scratch, and it supported object oriented PostScript programming in NeWS, subclassing PostScript from C or C from PostScript. I can't remember how much Turing sold it for, but I think it was less than c2ps.
https://compilers.iecc.com/comparch/article/92-04-041
https://donhopkins.com/home/archive/NeWS/NeScheme.txt
My goodness, this is hard to imagine today, when open source has driven the price of software (the code itself) to nil. And those are prices from decades ago. While I'm glad I don't have to pay $15K for a C to PostScript compiler, as someone who might have written similar software had I lived back in those days, I can imagine an alternate timeline where I'd be getting paid to write such tools instead of doing it as a hobby project.
> NeScheme.txt
Nice rabbit hole about LispScript, what a cool idea. I've been re-studying Scheme recently, its history and variants like s7, and was appreciating its elegance and smallness as a language, how relevant it still is. One of the books I'm reading uses Scheme for algorithmic music composition. (Notes from the Metalevel: An Introduction to Computer Composition)