
Seeing projects like this makes me wonder: how come great ideas often fail to be picked up, while bad ones are repeated again and again?

Take the idea of having a single language for the shell and system programming. It forces you to think harder about the design of the language, so that it works correctly both as a command-line interpreter and as a compiled language. Lisp machines had that. The Oberon system had that (in a graphical way, even). Forth systems had that. So it's not like we collectively didn't know. But neither Unix nor DOS/Windows even tried. Instead, they both feature shell languages that can best be described as "pathetic". Why did we have to wait until Windows 10 to have useful text selection in the CMD shell?

Or in programming languages, why is it that new languages always seem to take C and C++ as role models, while spectacularly failing to take any inspiration from Erlang, Ada, Smalltalk, Intentional Software, or Lisp? Why is XL, my own language, still the only one I know where it's easy to do multi-operator overloading, i.e. overloading the whole pattern A+B*C at once, something it has been doing for at least a decade? Why are we unable to create a language that works well both on GPUs and CPUs, or better yet, lets me easily describe an application that uses both?
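To make the idea concrete for people who haven't seen XL: this is not XL, just a minimal Python sketch of one way to approximate multi-operator overloading in a mainstream language. The trick (borrowed from C++ expression templates) is that `*` defers evaluation and only records its operands, so the subsequent `+` gets to see the whole A+B*C pattern as a single unit and can lower it however it likes (e.g. to a fused multiply-add). The class names here are hypothetical, purely for illustration.

```python
class Product:
    """Placeholder produced by B*C: records operands, computes nothing yet."""
    def __init__(self, b, c):
        self.b, self.c = b, c

class Val:
    def __init__(self, v):
        self.v = v

    def __mul__(self, other):
        # Defer: return a symbolic product instead of a number.
        return Product(self.v, other.v)

    def __add__(self, other):
        if isinstance(other, Product):
            # The whole A + B*C pattern is visible here as one unit,
            # so it can be handled by a single "overload".
            return self.v + other.b * other.c
        return self.v + other.v

a, b, c = Val(1.0), Val(2.0), Val(3.0)
result = a + b * c   # b*c builds a Product; the add sees the full pattern
```

This is clunky compared to overloading the pattern directly in the language, which is presumably the point being made above.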

Or in interactive apps, why do we still not have good reactive functional frameworks for the web, similar to what Tao3D does for interactive 3D? (See http://bit.ly/1G8VOmr for my rant on that topic). Why is creating an OpenGL application getting more complex with each generation, instead of simpler? Why can a kid easily use GameMaker, but none of the Apple developer tools?

Or in operating systems, why did Linus Torvalds decide to mimic Unix and not the kind of mind-blowing operating system design ideas that were floating around at the time? Like TaOS, the OS that recompiled byte code at page-in time and could run the same program on different processor architectures during its lifetime? Or like MacOS, the original one, and its built-in graphical toolbox where, unlike X11, you could easily draw an ellipse inside a rectangle irrespective of line width? Or like GeOS, a multitasking OS with vector fonts and GUI apps that ran in 64K?

In summary, why is it that we always seem to ignore good ideas, but have no qualms about repeating bad ones?

My theory is that inferior technology needs superior marketing, and that's how you win in the end. That's just a theory, of course ;-)



The canonical article on this is http://www.jwz.org/doc/worse-is-better.html from 25 years ago.

The common thread in what you're saying is "why do people not choose (technically pleasing solution) over (familiar solution with known warts)", and the answer to that is that change is really expensive and unpopular. Especially if it forces people to abandon their existing custom tooling, workflow, infrastructure, and skills.

Would most people use a new operating system with no web browser? Probably not. So that sets a minimum bar on the quantity of software you have to port to get it adopted. Which is a vast amount, especially if you want to write it in a new language as well. And new languages have new pitfalls.

For the shell, it's not clear that the same language is necessarily ideal: shell commands are ephemeral and the user wants to type as little as possible, while code lives forever and is generally more complex. Strong type systems and other forms of testability are important for code but clutter for the command line.

Linus chose Minix to replicate partially because he'd been taught it at university and partially because he could leverage the GNU infrastructure.


Worse is better

edit: damn, I took too long without refreshing the page before answering; someone did it better.




