
> where it describes design choices driven specifically by compile time

I'd like to point out two things here:

1. There are common library functions that we put in single files, and these files are imported by many others. The large recompile cost they incur is unfortunate, and I don't think it would have to be this way if dependencies were tracked at a more granular level than the file.
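A minimal sketch of what per-definition dependency tracking could look like (the module names, sources, and helper are all hypothetical): each definition in the shared file gets its own content hash, and a downstream module is marked dirty only if a definition it actually uses changed.

```python
import hashlib

def digest(src: str) -> str:
    """Content hash of a single definition's source text."""
    return hashlib.sha256(src.encode()).hexdigest()

# Per-definition source for a shared "utils" module, before and after an edit.
utils_v1 = {"parse": "def parse(s): ...", "log": "def log(m): ..."}
utils_v2 = {"parse": "def parse(s): ...", "log": "def log(m): print(m)"}

# Which definitions each downstream module actually imports (hypothetical).
uses = {"parser.py": {"parse"}, "server.py": {"log"}}

def dirty_modules(old, new, uses):
    """Return the modules whose used definitions changed."""
    changed = {name for name in new if digest(new[name]) != digest(old.get(name, ""))}
    return {mod for mod, deps in uses.items() if deps & changed}

print(dirty_modules(utils_v1, utils_v2, uses))  # only server.py needs a rebuild
```

With file-level tracking, both parser.py and server.py would rebuild after the edit to `log`; here only server.py does.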

2. The far larger contributor to compile times blowing up is coupling of concerns, and that is solely the developer's responsibility. For a type system that guarantees correctness of code, there is no way around the fact that all affected modules must be recompiled. E.g., if a fundamental law of physics were to change, the whole universe would have to be recomputed.

So I think for a 'substantial group of developers', the focus should be on helping people recognise what coupling is and how to design decoupled systems.



Beyond separate compilation, incremental compilation can go a long way here in minimizing what has to be recompiled on a change. You can even be incremental at the expression-tree level for really aggressive reductions in change latency (though batch throughput drops because of memoization overhead, not to mention memory-consumption concerns).
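The tree-level idea can be sketched roughly like this (the "compiler" here is a toy stand-in, not any real system): each subtree is memoized by content hash, so after an edit only the changed leaf and its ancestors do real work, while every untouched subtree is a cache hit. The cache itself is the memoization/memory overhead mentioned above.

```python
import hashlib

cache = {}      # content-hash -> compiled output (the memoization table)
compiles = 0    # count of actual compile steps performed

def compile_tree(node):
    """Compile a tree bottom-up; node is a leaf str or a tuple of children.
    Returns (content_hash, output). Unchanged subtrees hit the cache."""
    global compiles
    if isinstance(node, str):
        h = hashlib.sha256(node.encode()).hexdigest()
        if h in cache:
            return h, cache[h]
        compiles += 1
        out = node.upper()  # stand-in for real per-leaf compilation
    else:
        parts = [compile_tree(c) for c in node]
        h = hashlib.sha256("".join(p[0] for p in parts).encode()).hexdigest()
        if h in cache:
            return h, cache[h]
        compiles += 1
        out = "".join(p[1] for p in parts)  # stand-in for combining subtrees
    cache[h] = out
    return h, out

compile_tree((("a", "b"), ("c", "d")))
first_run = compiles                      # every subtree compiled once: 7
compile_tree((("a", "b"), ("c", "e")))    # edit a single leaf
incremental_run = compiles - first_run    # only the changed path recompiles: 3
print(first_run, incremental_run)
```

The batch-mode cost is visible too: every node pays for hashing and a cache probe even on the first, cold run.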



