
> UNIX as a whole, and C in particular weren't any kind of well-designed projects. They are awful inside and out. Their staying power isn't due to the goodness of the design or their remarkable performance. They are with us because of the network effect.

Can you explain why Algol-68 and PL/I didn't generate any network effects?

C may have survived for over 50 years because of network effects. But it didn't get in the position to have network effects by being "awful inside and out". It got into that position by being considerably better than the existing alternatives. (Better for actually writing programs, not better in any theoretical CS kind of way.)



I don't think it's fair to say that C was better by itself, as a language. It's more that it was better specifically in the niche of early microcomputers, in part because it had much less competition there. And then when that niche exploded - for other reasons - so did C.

In many ways, I think this parallels the rise of JavaScript, which was far from the best language in general, but happened to be the best language that would run in any browser - and so as the popularity of the web grew, so did JS.


C was better than Pascal? Or Fortran? Puh-leeease!

Despite a small community and not a lot of people working on Fortran compilers (at least as compared to C), Fortran programs usually still beat C on benchmarks.

And Pascal? -- Well, it has a much better grammar... like, it was specifically designed to be an unambiguous LL(1) language.

And these are only the two I can name off the top of my head that would straight-up win against C in almost every respect, or at least draw. And these two predate C.

If you look more closely at the history of UNIX and C, you realize that the people who created them weren't guided by some great ambition to make a good product... they were just pricks who didn't like to study what others had done before them, and thus invented their own square-wheeled bicycle. They then also lacked the insight into how defective their bicycle was, but were really eager to sell it to those who knew even less about bicycles. It was through pure luck that UNIX took off and won the OS race. It had nothing to do with its engineering qualities.


> Fortran programs usually still beat C on benchmarks.

Sure... for the kinds of programs that were written in Fortran, which tended to be math-heavy. But nobody wanted to write text-processing programs in Fortran, or parsers, or operating systems, or memory managers. (I mean, seriously, think about writing "grep" in Fortran. You might be able to do it, but C, for all its flaws, is still a far better tool for that kind of task.)
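(For a sense of scale, here is roughly what the core of such a filter looks like in C, using nothing but the standard library. A minimal sketch, not real grep: there's no regex support, and the 4096-byte line buffer is an assumption of the example.)

    /* A grep-like filter: print every line of stdin that contains a
       fixed substring. Minimal sketch: no regex support, and the
       4096-byte line buffer is an arbitrary assumption. */
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        char line[4096];

        if (argc != 2) {
            fprintf(stderr, "usage: %s PATTERN < input\n", argv[0]);
            return 2;
        }
        while (fgets(line, sizeof line, stdin) != NULL) {
            if (strstr(line, argv[1]) != NULL)  /* substring match */
                fputs(line, stdout);
        }
        return 0;
    }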

Pascal... better grammar. Horrible to actually use, though, at least before the Turbo Pascal extensions. Text processing was incredibly painful, because there was no such thing as a variable-length string, which was a crippling limitation. I/O was also pretty broken. It was very much not better than C.

Neither Fortran nor Pascal would straight-up win against C for general-purpose programming, still less for system programming.

> they were just pricks who didn't like to study what others did before them

Feel free to cool it with the ad-hominems. They are against site rules.


> Sure... for the kinds of programs that were written in Fortran,

As someone who has to look into the code of util-linux, which is very much written in C, I can tell you that C... well, shouldn't have been used there. And the few (but still significant number of) times I've had to make a trip into Linux kernel code, I can confidently tell you that C is a bad choice for that kind of program too.

There are no programs C is good for; or, to put it differently, C is not a good choice for solving any problem.

Just to give you an example of a bug I faced very recently, to hopefully illuminate the problem further: there's a utility called mdadm (short for "multiple devices admin"). "Multiple devices" is the Linux name for software RAID (basically RAID, with minor differences). So this utility must talk to the kernel a lot, especially to drivers such as raid0, raid1, etc.

The thing is, due to historical mishaps, in the previous iteration of this communication protocol tools were expected to use various ioctls to talk to the kernel. The system grew and grew, and outgrew itself. ioctls don't cut it anymore: they need to carry too much information, much more than can plausibly be stuffed into so simple a mechanism. So the new wave of kernel-to-userspace communication goes through sysfs. But here comes the horror of every C programmer: parsing! If you talk to sysfs, all you get is file streams. These streams contain structured data, but there's no unifying format (so you cannot piggy-back on someone else's hard work on a universal parsing library), nor are there any decent facilities in C for extracting data from file streams.
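(To make the pain concrete: even the trivial case -- reading one attribute -- looks like this in C. A minimal sketch; the /sys/block/md0/md/array_state path is just an illustrative example, and anything with multiple fields per file means hand-rolling a parser on top of this.)

    /* Read a single sysfs attribute and strip the trailing newline.
       Sketch only: the md0 path is an illustrative example, and real
       code would also need to handle values longer than the buffer. */
    #include <stdio.h>
    #include <string.h>

    static int read_sysfs_attr(const char *path, char *buf, size_t len)
    {
        FILE *f = fopen(path, "r");

        if (f == NULL)
            return -1;
        if (fgets(buf, (int)len, f) == NULL) {
            fclose(f);
            return -1;
        }
        fclose(f);
        buf[strcspn(buf, "\n")] = '\0';  /* trim the newline */
        return 0;
    }

    int main(void)
    {
        char state[64];

        if (read_sysfs_attr("/sys/block/md0/md/array_state",
                            state, sizeof state) == 0)
            printf("array_state: %s\n", state);
        return 0;
    }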

The result? -- The authors of mdadm discovered that in some circumstances ioctl isn't going to work anymore, specifically when dismantling MD devices, but they also realized that parsing the sysfs output was just too hard for them and... gave up. There's a "TODO" in their code, and it has been there for many years, saying they should be using sysfs... and nothing has been done about it.

I could blame the "lazy" mdadm programmers for it, but really, it's the fault of C. Even in some unappealing language like Python, this would've been a no-brainer...

Would Pascal win here? -- Absolutely. A lot of the fears C programmers have when it comes to dealing with strings are non-issues in Pascal. And besides, Pascal has evolved where C hasn't: Ada is in many ways a spiritual successor to Pascal.

> Feel free to cool it with the ad-hominems.

What you wrote is an ad-hominem attack itself; site rules or not, I don't care. What you quoted, however, is based on memoirs and first-person impressions from people familiar with the subject. It gives a fair and appropriate description of the characters in question. And while many of them are no longer with us, the remaining ones aren't likely to dispute the claim.


> Can you explain why Algol-68 and PL/I didn't generate any network effects?

The community was too small, and growth in the programming field was very rapid. I don't want to say "exponential" because I don't have the actual numbers, but you could see it: for decades -- through the 70s, 80s, and 90s -- you would almost never meet anyone with more than some five years of experience in almost any programming field. I started my career in the 90s, and so far, in real life, I have only met three programmers who started more than ten years before my time.

The new generation of programmers simply didn't know much of what the previous generation did, and this repeated many times over, not just with Algol or PL/I.

> C may have survived for over 50 years because of network effects. But it didn't get in the position to have network effects by being "awful inside and out".

I don't know how much you know about UNIX history, but you are willfully misquoting me. Literally, the success of UNIX, and by proxy of C, is the network effect -- in a more literal sense than you probably imagine. The appeal of UNIX was more or less this: the "real" computers of the day, the so-called "big iron", always had custom-made operating systems; essentially, every hardware model shipped with a different OS. UNIX was the first portable OS, in the sense that you could install it on more than one CPU architecture (not from the very start, but that was the goal, and they succeeded at it). And the reason people wanted a portable OS was the idea of how networks were to be built in those days: have a "real" big-iron computer, and give it a "side-kick" computer that handles the networking. The side-kicks would all run the same system and serve as adapters to the "real" computers. UNIX, in a sense, was a glorified router... at least, that's what it was meant to be.

Quite soon, programmers realized that instead of connecting "side-kick" computers to "real" computers, they might as well make the "real" computers run the same standardized OS. A much simpler one at that! Very little thought was given to why the systems on "real" computers had to be so complex and big. The same kind of enthusiasts who proclaimed that Emacs was huge but ended up using Eclipse, or who proclaimed that Ada was huge but ended up using C++, are the shortsighted programmers who promoted UNIX in its early days, wholeheartedly believing that the complexity would somehow evaporate and that they were getting a simple tool to solve complex problems...

So, yeah, UNIX became popular due to the network effect, literally and figuratively. C just piggy-backed on its success.


> I don't know how much do you know about UNIX history...

Perhaps more than you. I started a decade before you, so I was there for more of it than you were. (Not at the beginning, I admit.)

> but you are willfully misquoting me.

Not willfully - that takes intent. What, specifically, did I say you said that isn't what you said, or that was out of context? Having read your reply here, I still don't see what I'm misquoting.


Being there and studying it are very different things. It wouldn't surprise you to discover that the authors of UNIX were there, would it? And yet, somehow, I believe that I know better than they do -- how come?

You wouldn't believe it, but things are often easier to judge in hindsight than when you are involved in them...

> willfully

You pretend that I said that UNIX got into its position by being awful inside and out, but what I wrote is that it got into its position despite being awful inside and out. In other words, you pretend to misunderstand me, and then argue with something I didn't say.


You sure are free with accusations of bad faith on my part. That also is against site guidelines.



