
Several years ago we had a data processing framework that allowed teams to process data incrementally, since most datasets were in the range of terabytes/day. The drawback was that it was append-only; i.e. you couldn't update previously processed output, only append to it. One team had a pipeline that needed to update older records, and there was a long discussion of proposals and convoluted solutions. I took a look at the total size of their input dataset and it was only a few gigabytes. I dropped into the discussion and said "This dataset is only a few gigabytes, why don't you just read it in full and overwrite the output every time?" The discussion went quiet for a minute, and then someone said "That's brilliant!" They only needed to change a few lines of code to make it happen.
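
Here's a minimal sketch of that full-refresh pattern in C (the file names and the pass-through "transform" are made up for illustration): re-read the whole small input on every run and atomically replace the output, instead of appending increments.

    #include <stdio.h>

    int main(void) {
        FILE *in  = fopen("input.csv", "r");
        FILE *out = fopen("output.csv.tmp", "w");
        if (!in || !out) { perror("fopen"); return 1; }

        char line[4096];
        while (fgets(line, sizeof line, in)) {
            /* ...transform the record here... */
            fputs(line, out);
        }

        fclose(in);
        fclose(out);
        /* Atomically swap in the freshly built output. */
        if (rename("output.csv.tmp", "output.csv") != 0) { perror("rename"); return 1; }
        return 0;
    }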

<sarcasm>Let's also abandon disk storage's linear block addressing and go back to CHS addressing</sarcasm>


It was down if you tried to access it while authenticated (i.e. you had a cookie), but it loaded fine for unauthenticated sessions (e.g. incognito).


I'm no expert myself, but is this the same as Russell's theory of types? This is from a quick Google AI search answer:

    Bertrand Russell developed type theory to avoid the paradoxes, like his own, that arose from naive set theory, which arose from the unrestricted use of predicates and collections. His solution, outlined in the 1908 article "Mathematical logic as based on the theory of types" and later expanded in Principia Mathematica (1910–1913), created a hierarchy of types to prevent self-referential paradoxes by ensuring that an entity could not be defined in terms of itself. He proposed a system where variables have specific types, and entities of a given type can only be built from entities of a lower type.


I don't know that much about PM, but from what I've read I have the impression that, for the purposes of paradox avoidance, it's exactly the same mechanism. PM in the end is quite different, though, and its lowest universe is much smaller than that of practical type theories.
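
For comparison, this is roughly how that stratification shows up in a practical type theory like Lean's (a minimal sketch, nothing to do with PM's own notation): every universe lives strictly in the next one up, so an entity can never be defined in terms of its own level.

    -- Each universe is a member of the next one up, never of itself.
    #check Nat        -- Nat : Type
    #check Type       -- Type : Type 1
    #check Type 1     -- Type 1 : Type 2

    -- Girard's paradox (the type-theoretic analogue of Russell's) would
    -- need `Type u : Type u`, which the hierarchy rules out.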


Markdown won. Simplicity always wins. Markdown is now the de facto documentation format, for better or for worse.


One of the best rotoscoped games I've ever played was Flashback[1] (1992). I was also wowed by Prince of Persia[2] (1989); it was well ahead of its time.

[1] https://store.steampowered.com/app/961620/Flashback

[2] https://en.wikipedia.org/wiki/Prince_of_Persia_(1989_video_g...


The problem with macro-laden C is that your code becomes foreign and opaque to others. You're building a new mini-language layer on top of the base language that only your codebase uses. This has been my experience with many large C projects: I see tons of macros used all over the place and I have no idea what they do unless I hunt down and understand each one of them.


Like the Linux kernel?

Macros are simply a fact of life in any decent-sized C codebase. The Linux kernel has some good guidance for keeping them from getting out of hand, but they're just something you have to learn to deal with.
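
To make that concrete, here's a self-contained toy in the style of the kernel's container_of (the struct and the usage are made up for illustration): perfectly idiomatic once you know the idiom, impenetrable the first time you run into it.

    #include <stddef.h>
    #include <stdio.h>

    /* Recover a pointer to the enclosing struct from a pointer to one of
     * its members, kernel-style. */
    #define container_of(ptr, type, member) \
        ((type *)((char *)(ptr) - offsetof(type, member)))

    struct node {
        int value;
        struct node *next;
    };

    int main(void) {
        struct node n = { .value = 42, .next = NULL };
        struct node **link = &n.next;               /* pointer to a member */
        struct node *owner = container_of(link, struct node, next);
        printf("%d\n", owner->value);               /* prints 42 */
        return 0;
    }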


I think this can be fine if the header provides a clean abstraction with well-defined behaviour in C, effectively an EDSL. For an extreme example, where it starts looking like a high-level language:

https://www.libcello.org/
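
On a much smaller scale (not taken from libcello; the names are made up), even a header-only foreach over C arrays starts reading like a built-in once its behaviour is documented:

    #include <stdio.h>

    /* Tiny EDSL-style macros: iterate over a fixed-size array while hiding
     * the index bookkeeping. The behaviour is plain C, just spelled
     * differently. */
    #define ARRAY_LEN(a) (sizeof(a) / sizeof((a)[0]))
    #define foreach(item, array) \
        for (size_t _i = 0; \
             _i < ARRAY_LEN(array) && ((item) = (array)[_i], 1); \
             ++_i)

    int main(void) {
        int nums[] = { 1, 2, 3, 4 };
        int n;
        foreach(n, nums) {
            printf("%d\n", n);
        }
        return 0;
    }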



I built something similar a while ago (although not as featureful): https://github.com/khaledh/pagemagic


> A note on interpreters: If the executable file starts with a shebang (#!), the kernel will use the shebang-specified interpreter to run the program. For example, #!/usr/bin/python3 will run the program using the Python interpreter, #!/bin/bash will run the program using the Bash shell, etc.

This caused me a lot of pain while trying to debug a 3rd party Java application that was trying to launch an executable script and throwing an IO error: "java.io.IOException: error=2, No such file or directory." I was puzzled because I knew the script was right there (it was referenced by its full path) and it had the executable bit set. It turned out that the shebang in the script was wrong, so the OS was complaining (the actual error from a shell would be "The file specified the interpreter '/foo/bar', which is not an executable command."), but the Java error was completely misleading :|

Note: If you wonder why I didn't see this error by running the script myself: I did, and it ran fine locally. But the application was running on a remote host that had a different path for the interpreter.
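
Here's a minimal C reproduction of the confusion (paths are hypothetical; assume ./broken.sh is executable and its shebang points at an interpreter that doesn't exist on the host): the kernel reports ENOENT for the missing interpreter, but errno alone can't tell you which file is missing, which is exactly why the Java message reads as if the script itself were gone.

    #include <errno.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void) {
        /* broken.sh exists and is executable, but starts with something
         * like "#!/nonexistent/interpreter". */
        execl("./broken.sh", "broken.sh", (char *)NULL);

        /* execl only returns on failure. This prints:
         * "execl failed: No such file or directory (errno=2)" */
        fprintf(stderr, "execl failed: %s (errno=%d)\n", strerror(errno), errno);
        return 1;
    }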


Note that this is not a Java-specific problem; it can occur with other programs as well. "No such file or directory" is just the human-readable description of ENOENT, which can be returned by a lot of syscalls. I typically just run the program through strace; then you quickly see what it actually did.
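
For example (the command line is illustrative, assuming a java -jar entry point), restricting the trace to exec calls makes the failing path jump out:

    strace -f -e trace=execve java -jar app.jar 2>&1 | grep ENOENT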


For those interested, I did a breakdown of the hashbang: https://blog.foletta.net/post/2021-04-19-what-the/


Also be aware that kernel support for shebangs depends on CONFIG_BINFMT_SCRIPT=y being in the kernel config.


The more you try to solve a problem at a large scale, the less empathetic to humans the solution becomes. This has been happening for a long while now, and IMO it has caused people to become "disconnected" from each other and more "connected" to their devices. AI is just a new catalyst for this process. My fear is that a time will come when interacting with humans is the exception, not the norm.

