It’s really important - and people seem unwilling to recognize this - just how much work browser vendors did 10-15 years ago to actually make the specs usable for the purpose of implementing a browser (the WHATWG work and ES3.1 were all about building out correct and complete specifications).
The W3C specs, and similar-era ES specs, could not be used to implement a browser, and a lot of the browser engineering work of the time was basically trying to find site breakage and then reverse engineering whatever Gecko and/or Trident did. It meant things that today might take just a few hours to implement by following the spec could take weeks. At one point I spent a month working out the correct behaviour and sequence of events that should be fired when pressing keys on a keyboard, because the specs both under-specified what should happen and were just wrong in some edge cases. It was a truly miserable era, and the people who seem hellbent on bashing the WHATWG and whatnot for some perceived slight remain absurd to me.
That's interesting. The people working on these specs seem to be doing idealistic work, even though their employers' interests lie more in protecting their moats, given that there are, more or less, only two viable "HTML engines" on the modern web. Is it really idealism, or are there interests at play that I don't know about? Is that stuff too low-level to be used as a competitive advantage?
The people working on browsers care about their users being able to use the internet in their browser. Absent the specification work, that just wasn't possible - it took years to get WebKit to the level where you could reliably expect all pages to work, even when spoofing the user agent (sites used to user-agent-gate access to only Firefox and IE; nowadays they just block everything that isn't Chrome[1]). The specification work helps everyone - without it you have to spend immense amounts of time reverse engineering whatever features other browsers ship if sites start using them, and similarly you don't have to worry as much that, if you ship some feature, some other browser will implement it slightly differently and then you have to reverse engineer what they did in the feature you shipped without specifying. That used to happen a lot. Similarly, producing a specification and going through the standards groups can help identify issues and prevent them becoming a problem before the feature is shipped. It's super easy to go "I have a problem" and then make a feature that solves your specific problem rather than the more general issue you're working with. Back while I worked on WebKit, Google engineers were chronic for this, and the problem is they would then frame any pushback as "you hate the web" rather than "oh yeah, maybe we should have one complete spec instead of dozens of slightly different overlapping ones".
Overall I'd say the bulk of the folk I worked with, from pretty much every company I interacted with, cared specifically about the web as a platform, independently of whatever their employer might be doing (Google, Microsoft, Apple, Mozilla, Opera, Yahoo, and Facebook are all companies I recall having significant amounts of communication with, as well as various academics, OSS projects, etc.), and there were very few times I encountered bad-faith behavior.
[1] Commercial site developers, especially "enterprise" ones, are absolutely abysmal developers. The only thing that has changed in the last few years is them switching from IE only to Chrome only.
Around 2004-2007 most of the key web specification writers (from the WHATWG: Hixie, etc.)
a) were employed by Opera Software
b) were surprisingly young (often around or even below the age of 18)
Then Google poached most of these people from their partner (Opera had by then invented the placed-browser search-engine business model for them; allegedly Google is paying Apple $15B/yr these days).
A few years later the scope of the web started growing in an insane manner, spearheaded by Google.
This later period was also the same time that the web got deeply javascriptified and largely turned from documents into apps that you download and execute. Before that it had been documents with sprinkles of javascript.
The bulk of the work was done by people from Mozilla, Apple, and even Microsoft (though that was mostly in the context of TC39/ECMAScript). Opera did contribute - for similar reasons to Apple: the existing specs were abysmal, and making usable browsers was hard simply due to the need to actually work out what a browser is meant to do. Some folk like Anne, Hixie, etc. were employed by Opera at the time, but to portray that as if it were the majority of the "key" work is disingenuous.
> Then Google poached most of these people
As much as I dislike google, claiming you can poach people implies that they're property, and speaks to the anti-poaching lawsuits of a decade or so ago.
> A few years later the scope of the web started growing in an insane manner, spearheaded by Google.
Google does take a shotgun approach to specifications, and one of the biggest issues when I left (the frustration of dealing with Google engineers significantly impacted the burnout I experienced) was them spewing out piles of half-assed "specifications" that helped whatever individual project they were working on at the time, rather than well-thought-out specs that provide a general solution to the general problem space. The constant battle against that was exhausting, especially in a world where forums like HN are filled with people just parroting AlexR-style talking points of "Apple is fighting [BS half-assed spec] because they hate the open web".
> speaks to the anti-poaching lawsuits of a decade or so ago.
There's a large difference between
a) colluding to not hire people between equally sized competitors
and
b) aggressively reaching out and poaching a (foreign) partner's key people, especially when that partner is 50-100x smaller
I'm sure you're aware of this so I'm left a bit confused here.
> Google does take a shotgun approach to specifications... in a world where forums like HN are filled with people just parroting AlexR style talking points of "Apple is fighting [BS half assed spec] because they hate the open web".
> b) aggressively reaching out and poaching a (foreign) partner's key people, especially when that partner is 50-100x smaller
> I'm sure you're aware of this so I'm left a bit confused here.
So to clarify, if you ever work for a small company you can't move to a bigger competitor? Or are you suggesting legislation that prohibits large companies paying more?
I'm really not clear what you think is correct here?
> Umm... Okay.
I was literally agreeing with you that the workload that comes from Google's approach of spewing out "specs" is excessive, especially when coupled with the attitude of Chrome fans.
You're not exactly wrong, if I parse it carefully with rigid dedication to your desired interpretation. The overall hyperbole of the argument prevents that interpretation from being natural to the reader. ("Websites in 2011 were just documents with a little sprinkle of javascript here and there" is where you 100% lose me.)
The WHATWG approach is not without its downsides, even setting aside the ideological points (e.g. that with no fixed releases you can never be done in any sense):
The spec is good and makes your life easy (well, as not-tremendously-difficult as it could ever be) if you go with the LibWeb/Ladybird approach of transcribing it into code line by line; then there is a yawning abyss of performance engineering and conformance bugs a mile wide; then there are all the other real implementations, which do not follow the spec word for word, for reasons of efficiency. There are myriad ways to write an LL(1) parser, all well known to be equivalent; but you are on the hook for proving the validity of every little discrepancy with the WHATWG algorithm that you've had to introduce for one reason or another.
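To make the "transcribe it line by line" style concrete, here is a sketch in Python of the WHATWG HTML spec's "rules for parsing integers", with each comment quoting the corresponding spec step. The function name and the convention of returning None for the spec's "return an error" are my own choices, not anything mandated by the spec:

```python
# Spec-transcription style, as the LibWeb/Ladybird approach described
# above: each comment is a step from the WHATWG HTML spec's
# "rules for parsing integers".

ASCII_WHITESPACE = "\t\n\x0c\r "
ASCII_DIGITS = "0123456789"

def parse_integer(input: str):
    # 1. Let input be the string being parsed. (the `input` argument)
    # 2. Let position be a pointer into input, initially pointing at
    #    the first character.
    position = 0
    # 3. Let sign have the value "positive".
    sign = "positive"
    # 4. Skip ASCII whitespace within input given position.
    while position < len(input) and input[position] in ASCII_WHITESPACE:
        position += 1
    # 5. If position is past the end of input, return an error.
    if position >= len(input):
        return None
    # 6. If the character indicated by position is U+002D (-), set sign
    #    to "negative" and advance position; otherwise, if it is
    #    U+002B (+), advance position.
    if input[position] == "-":
        sign = "negative"
        position += 1
    elif input[position] == "+":
        position += 1
    # 7. If position is past the end of input, return an error.
    if position >= len(input):
        return None
    # 8. If the character indicated by position is not an ASCII digit,
    #    return an error.
    if input[position] not in ASCII_DIGITS:
        return None
    # 9. Collect a sequence of code points that are ASCII digits, and
    #    interpret it as a base-ten integer.
    start = position
    while position < len(input) and input[position] in ASCII_DIGITS:
        position += 1
    value = int(input[start:position])
    # 10. If sign is "positive", return value; otherwise return 0 - value.
    return value if sign == "positive" else -value
```

This is exactly the "known good reference" shape: slow and literal, but every branch maps to a numbered spec step, so conformance is checkable by inspection. The moment you replace those loops with something faster, you own the burden of showing the replacement is observably equivalent.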
Speaking as a person who has implemented numerous web specs, you generally do start with a close approximation to the actual spec - I believe my original Map/Set implementations in JSC were exactly that.
Once you have a complete and correct implementation, you can actually measure things, work out the critical parts, and make them perform reasonably well.
But it’s really hard to do perf work if you don’t already have a reference/known good implementation.
And that’s what LibWeb is doing, which is giving them a workable baseline. Certainly enough that I’ve seen a few Kling videos where he was able to focus on perf issues. A lot of the core performance bottlenecks devolve to “cache more”, which then devolves to that most hallowed of problems: cache invalidation.
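A toy sketch of that “cache more, then invalidate” pattern, assuming nothing about any real engine’s internals (all names here are illustrative): a coarse tree-wide generation counter that every mutation bumps, so a cached result is valid exactly when the generation recorded at fill time still matches:

```python
# Illustrative only - not how any particular engine does it. A tree-wide
# generation counter invalidates every node's cache on any mutation.

class Tree:
    def __init__(self):
        self.generation = 0  # bumped on every mutation anywhere in the tree

class Node:
    def __init__(self, tree):
        self.tree = tree
        self.style = {}
        self._cached_layout = None
        self._cached_at = -1  # generation at which the cache was filled

    def set_style(self, key, value):
        self.style[key] = value
        self.tree.generation += 1  # any mutation invalidates all caches

    def layout(self):
        # Recompute only if the tree has changed since we last cached.
        if self._cached_at != self.tree.generation:
            self._cached_layout = self._compute_layout()
            self._cached_at = self.tree.generation
        return self._cached_layout

    def _compute_layout(self):
        # Stand-in for the genuinely expensive work.
        return dict(self.style)
```

The global counter over-invalidates (mutating one subtree throws away caches everywhere) but is trivially correct; the hard engineering is tracking finer-grained dependencies so unrelated caches survive, and that is precisely where invalidation bugs live.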