> RISC OS gives applications access to much of the memory map, and so if a program accidentally scribbles over the wrong parts of that address space, the whole computer can freeze up – which in testing our Pi 400 did several times.
Yep. There are good aspects and bad aspects to old-school 1980s OS design.
On the other hand, the entire PC industry was built on DOS and 16-bit Windows which were exactly like this.
Apple made enough money, from selling classic 68K and PowerPC Macintoshes which were exactly like this, to buy the dying NeXT.
Early Linux was exactly like this, too. I remember running Red Hat Linux 4.2 on my SPARCstation and having kernel panics right, left and centre. Several a day, every day.
Everything we use today, all the hardware and all the software, was built both on software like this and from software like this.
> Early Linux was exactly like this, too. I remember running Red Hat Linux 4.2 on my SPARCstation and having kernel panics right, left and centre. Several a day, every day.
No, not exactly like this. Even early versions of Linux used separate address spaces for each user process and for system memory (like all but the earliest Unix systems), preventing unprivileged user-space processes from clobbering system memory. A wayward pointer in an unprivileged application is, strictly speaking, undefined behaviour, but on Unix systems it typically triggers a segmentation fault signal (which by default terminates the offending application), not a crash of the whole system.
That doesn't mean there were no bugs in the kernel, or crashes resulting from them. But even by the mid-nineties Linux, while perhaps not yet comparable with the likes of Solaris and Interactive Unix, was no worse than SCO Unix and much more stable than the 16-bit offerings from MS. Linux kernel crashes were rare (not as rare as today, thanks to the continuous effort of hundreds of contributors and the priority given to fixing regressions); more often, users ran into out-of-memory situations, which before the addition of the OOM killer could effectively freeze a Linux system for a long time or even indefinitely. Also, the X11 server, or rather some graphics card drivers (then part of the X11 server), weren't of quite the same quality, and a crash there took down the user's whole session (all of the applications started during it). In other words: used as a small server, Linux was reasonably stable early on; used as a GUI desktop system, not so much.
It's bullsh1t. It's not true. It wasn't an "investment". Microsoft stole Apple code and used it in Video for Windows. It got caught. Apple took MS to court and was going to win, so Microsoft settled. The marketing lizards spun this as "investing", but it's not true.
Not only have I walked this planet since the 1970s, I lived through this "lie" in all the key publications of the time, and I cited the Cult of Mac site on purpose, precisely because I was expecting that reply of yours.
Enough said.