
In those days I'd just be happy with a language that let me quickly manipulate 16-bit integers. AppleSoft used floating-point for all intermediate calculations, which made even simple loops very slow.

I enjoyed using GraFORTH, which was a 10 KB dialect of FORTH with bitmapped and 3D wireframe graphics (!) that compiled to threaded code.



Steve Wozniak's original Apple II Integer Basic had 16-bit signed integers, but no built-in floating point support.

Applesoft (licensed from Microsoft) had 16-bit integer variables (such as A%) as well as floating point, but you are right that it converted them to and from real with every operation, which was slow. They were useful for saving memory (2 bytes instead of 5) and not much else.
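A rough sketch of the memory-saving angle (array size and names are made up; the 2-vs-5-byte figure is per element, ignoring the small array header):

    10 DIM A%(1000) : REM INTEGER ARRAY, ABOUT 2 BYTES PER ELEMENT
    20 DIM B(1000)  : REM REAL ARRAY, ABOUT 5 BYTES PER ELEMENT
    30 A%(5) = 123 : B(5) = 123 : REM SAME VALUE, A% JUST USES LESS RAM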

There were BASIC extensions published in places like Nibble and Call-A.P.P.L.E. that added native integer math to Applesoft using the & command, so you could write things like "A% = B% &+ C%", and the operation was performed without conversion to real.
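As a sketch of how one of those extensions might be used (the "&+" form is taken from the comment above; the exact syntax varied between the published packages, so treat this as illustrative rather than any specific one):

    100 B% = 12345 : C% = 11111
    110 A% = B% &+ C% : REM 16-BIT ADD HANDLED BY THE & EXTENSION, NO FLOAT CONVERSION
    120 PRINT A%
    130 REM STOCK APPLESOFT WOULD EVALUATE A% = B% + C% BY CONVERTING
    140 REM BOTH OPERANDS TO REAL AND BACK, WHICH IS WHAT MADE IT SLOW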

Let's also not forget SWEET-16, Woz's software emulation of a 16-bit kind-of-RISC processor on the 6502, that had 16-bit arithmetic. Reading the source code of SWEET-16 blew my young, impressionable mind.



