
I really don't understand why people keep misunderstanding this post so badly. It's not a complaint about C as a programming language. It's a complaint that, due to so much infrastructure being implemented in C, anyone who wants to interact with that infrastructure is forced to deal with some of the constraints of C. C has moved beyond merely being a programming language and become the most common interface for in-process interoperability between languages[1], and that means everyone working at that level needs to care about C even if they have no intention of writing C.

It's understandable how we got here, but it's an entirely legitimate question - could things be better if we had an explicitly designed interoperability interface? Given my experiences with cgo, I'd be pretty solidly on the "Fuck yes" side of things.

(Of course, any such interface would probably end up being designed by committee and end up embodying chunks of ALGOL ABI or something instead, so this may not be the worst possible world but that doesn't mean we have to like it)

[1] I absolutely buy the argument that HTTP probably wins out for out of process



I don't see that as a problem. C has been the bedrock of computing since the 1970s because it is the most minimal way of speaking to the hardware in a mostly portable way. Anything can be done in C, from writing hardware drivers, to GUI applications and scientific computing. In fact I deplore the day people stopped using C for desktop applications and moved to bloated, sluggish Web frameworks to program desktop apps. Today's desktop apps are slower than Windows 95 era GUI programs because of that.


Ok, you're still missing the point. This isn't about C being good or bad, suitable or unsuitable. It's about whether it's good that C has, through no deliberate set of choices, ended up embodying the interface that lets us build Rust that can be called by Go.


Yes, because C is, by virtue of its history and central role in the development of all mainstream operating systems, the lowest common denominator.

Also, if I remember correctly, the first Rust and Go compilers were written in C.


Yes! It's easy to see why we got here, but that doesn't mean it's the optimal outcome!


Rust used OCaml. Go only used C because it was partially created by the C authors, who repurposed the Plan 9 C compiler for Go.

Usually it helps to know why some decision was taken, it isn't always because of the technology alone.


OCaml was used for Rust.


> Yes, because C is, by virtue of its history

Sure, history is great and all, but in C it's hard to reliably say "this int is 64 bits wide", because of the wobbly type system. Plus, the whole historical baggage of not having 128-bit-wide ints. Or sane strings (i.e. not null-terminated).


> in C it's hard to reliably say "this int is 64 bits wide"

That isn't really a problem any more (since C99). You can define it as uint64_t.

But we have a ton of existing APIs that are defined using the wobbly types, so we're kind of stuck with them. And even new APIs use the wobbly types, because the author didn't reach for the fixed-width ones for whatever reason.

But that is far from the only issue.

128-bit ints are definitely a problem, though: you don't even get agreement between different compilers on the same OS on the same hardware.


> 128-bit ints are definitely a problem, though: you don't even get agreement between different compilers on the same OS on the same hardware.

You technically have _BitInt(128) in C23, but I'm not sure that would even generate what you expect it to.


You're still thinking of C as a programming language but the blogpost is not about this, it's about using C to describe interfaces between other languages.

> because it is the most minimal way of speaking to the hardware in a mostly portable way.

C is not really the most minimal way to do so, and a lot of C is not portable anyway unless you want to go mad. It's just the most minimal and portable thing that we settled on. It's "good enough", but it still has a ton of resolvable problems.


Of some computing platforms.


> I don't see that as a problem.

It kinda is. Because it was made in the 1970s, and it shows (cough null-terminated strings uncough).

Or, you know, having a 64-bit-wide integer. Reliably.

You did read the article, right?


> could things be better if we had an explicitly designed interoperability interface?

Yes, we could define a language-agnostic binary interoperability standard with its own interface definition language, or IDL. Maybe call it something neutral like the component object model, or just COM[1]. :)

[1] https://en.wikipedia.org/wiki/Component_Object_Model


The general idea is sound. The implementation less so.


VHDL vs Verilog is a good parallel from the chip world. VHDL was designed from the ground up.

Verilog is loosely based on C. Most designs are done in Verilog.


VHDL tends to reign in European hardware companies.


And Japan, I'm told.


I wonder why there aren't many successful European hardware products.


I would assert that any company that doesn't go bankrupt is successful; it doesn't need to be late-stage capitalism.

Other than that: Nokia until Microsoft placed an agent on it, Philips, which co-developed the CD, ASML...


Three is not many


ST, Infineon, ARM(!!), NXP, Raspberry Pi Holdings, Nordic Semiconductor (original developers of AVR, now known for Bluetooth chips and such)...


" any company that doesn't go bankrupt is successful"


Well, VHDL was heavily based on Ada.


Of course things could be better. That doesn’t mean that we can just ignore the constraints imposed by the existing software landscape.

It’s not just C. There are a lot of things that could be substantially better in an OS than Linux, for example, or in client-server software and UI frameworks than the web stack. It nevertheless is quite unrealistic to ditch Linux or the web stack for something else. You have to work with what’s there.



