Acorn was a curious company. It managed to get incredible amounts of work done by assigning big projects to individuals instead of teams. My memory is not to be relied on after all this time, but I seem to recall that in the final years there was a browser maintained by one guy, a port of Java by two, and an implementation of DirectX by another. Obviously all those projects were much smaller back then (around '98), but still, those devs were doing work that other companies assigned whole teams to. And in fact this does work, since communication overhead is reduced, but in many cases the increase in productivity loses strategically to the slower time to market.
From personal experience it's amazing how much more productive I am on solo projects vs working with other people. When you're solo you can just go, but in a team everything needs to be discussed or at least communicated.
Scaling software development has been THE vexing problem since day one. There's no doubt that the most efficient system is one that fits in the head of a single person; the challenge is: then what?
Either software development teams are a wonderful metaphor for multithreaded code, or multithreaded code is a wonderful metaphor for software development teams. I'm not sure which.
Each wants to claim a keyboard and a mouse, but due to new pair programming requirements set by management there is one fewer set of peripherals than there are developers.
Related to this subject is Casey Muratori's video about Conway's law and a possible extension to it. The communication overhead of working in teams, and the fact that it's harder to address cross-cutting concerns in them, is a key theme of it.
There is also Descartes' quote about how a work produced by one master is often better than one in which many are involved, because of the unifying vision.
It takes the coordinated effort of many, many people. The same with making the A-bomb, the same with making anything bigger in software.
It's nice that Linus started the kernel and Git, but nowadays he's not writing much code, and most likely he would not be able to personally review each and every PR.
You're right. We also have proverbs like "two heads are better than one" and "standing on the shoulders of giants", so people recognise that both sides are important and have their value.
Right now, I'm working on my own on a personal project, attempting to do something a little novel, and I appreciate being able to go back and refine my ideas/previous code based on things I learn and additional thinking (even rewriting from scratch). If I were working with someone else, I'd be more likely to face friction (like "stick to the suboptimal approach; it's not that bad") and cause trouble for teammates. So the value of working alone speaks more to me currently than the value of working in teams, but they both have their place.
I read The Mythical Man Month recently (first published in 1975), and while some of it is charmingly dated (have a secretary take your code and run it down for you!), it's astonishing how much of its discussion and advice for structuring a team of programmers remains relevant even today.
As a teenager, I misread the title and borrowed it from the library. Imagine my disappointment and embarrassment when I realized that there was no mention of a human-moth hybrid at all.
Efficient but not necessarily better. When I'm developing solo, I go back later and find I made some questionable decisions a team member would have identified.
My amusing if cynical take: it's a function of the number of opinions about how to do it. A project can tolerate 2 easily, 3 in many cases; 4 and above is difficult terrain. To scale up team count, you need to increase the count of "unopinionated, doesn't really care" devs, to prevent too many opinionated devs landing on the same part(s) of the project and conflicting. Put one or 2 on each pillar of the project - 3 tops if they work together excellently. If a project needs more bodies, drop in unopinionated devs. There's enough bus factor that they catch each other's code, but not so much that it grinds to a halt in communication overhead.
I've pretty much found that the most it scales linearly is to 2, and only then under good conditions such as working well together, greenfielding, and something with clear enough boundaries.
After that, well, it basically flatlines and even seems to decrease at times.
> but in many cases the increase in productivity loses strategically to the slower time to market.
I disagree that it always means slower time to market - if the individual is empowered and there is minimal process (no PMs, "grooming", estimates, etc.), a sharp individual can run circles around a full team.
Well, "many cases" isn't "always". A bigger team will get there faster if the development effort is larger than a certain size; if it's smaller, then an individual can win.
I recall seeing a Macromedia Director player, but never heard of a DirectX port. In any case, the lack of hardware floating point in most of their machines was looking like a big mistake by the mid-90s. Their compilers were also way off the pace, and that was getting to be a problem.
Tbh I think I am slightly bitter about having stuck with Acorn a bit too long, and should have jumped ship sooner. It is clear Acorn knew they were toast even before the Risc PC. A lot of these very impressive developments were consequently glorious wastes of time, which is kind of tragic too.
I'm not sure if the DirectX port ever saw the light of day. At that point the top brass were putting their hopes in the 'Network Computer' [1] and set-top boxes, so it may have shipped with one of those, or been intended to.
Brian McCullough's book How the Internet Happened (https://www.amazon.com/How-Internet-Happened-Netscape-iPhone...) angles this in terms of the centralized digital superhighway versus the open distributed internet. (Where thin clients, set top boxes, etc. are in the first group and web browsers and WWW protocol are the second.)
Essentially, by 2005 the open internet had won, but the iPhone (or more precisely: the App Store) realized the dream of the thin client and became the platform the NCs had initially targeted, turning the internet into a walled garden with a vengeance.
The NC is one of those things that feels like it was simply too far ahead of where the technology and the mindset of users were. Nowadays it would have much more traction thanks to SaaS.
In tech, one step ahead is an innovator. Two steps ahead is a martyr.
> The NC is one of those things that feels like it was simply too far ahead of where the technology and the mindset of users were.
They were just like the X dumb terminals that predated them by about a decade: far behind what the technology was offering, with storage and computing power becoming cheaper every year.
I'm glad they never caught on, and I hope the same happens to Chromeboxes/books. I don't want the prices of the common hardware I use to go up because of market shrinkage, with lots of people ditching real computers in favor of dumb terminals where even the simplest service must be accessed and run remotely, with no or reduced local storage/computing power.
Sorry for having an unpopular opinion, but to me SaaS is like going back 40-50 years to the mainframe era, and is essentially a way to put everything behind a counter so that users can be charged tomorrow for what today is still free.
Trivia at this point. But the Oracle Network Computer was a low-end x86 box running FreeBSD and a full-screen Netscape Navigator. Very much a product of its time.
(WebTV, later purchased by Microsoft, was the more successful product in this space.)
Acorn did the reference profile NC for Oracle, and it was an ARM7-based machine with NCOS, a stripped down version of RISC OS. The Acorn-built NCs were then sold under a variety of brandings, including Acorn's own and Xemplar (the Acorn/Apple education collaboration in the UK). Was the Oracle Network Computer a later variant?