The author suggests HTTP/2 as a solution to web cruft.
I could be wrong, but I see the HTTP/2 ploy as a proposed way to deliver more cruft, faster.
What do you think is going to be in those compressed headers?
How large do HTTP headers need to be? What exactly are they trying to do? I look at headers on a daily basis and most of what I see is not for the benefit of users.
Can we safely assume the compressed headers that HTTP/2 enables would have nothing to do with advertising?
Again, I could be wrong, but in my estimation the solution to web cruft (unsolicited advertising) is not likely to come from a commercial entity that receives 98% of its revenue from web advertisers.
The web cruft problem parallels software bloat and the crapware problem (gratuitous junk software pre-installed on your devices before you purchase them).
The more resources that are available (CPU, memory, primary storage, bandwidth), the more developers use them for purposes that do not benefit users and that mostly waste users' time.
This is why computers (and the web) can still be slow even though both have increased exponentially in capacity and speed over the last two decades. I still run some very "old" software and with today's equipment it runs lightning fast.
The reason it is so fast is because the software has not been "updated".
HTTP/2 largely won't help with the problems mentioned in the article. If I'm loading 200+ assets from 30-50 hosts, HTTP/2 can't help much, because I'm making 30-50 TCP connections and fetching only 5-8 resources over each. HTTP/2's efficiency advantage over HTTP/1.1 doesn't show when fetching so few resources per connection.
HTTP/2 helps when you are downloading 200+ assets from 1 or 2 hosts.
I routinely use HTTP/1.1 pipelining from the command line to retrieve 100 assets at a time. But these are assets that I actually want: i.e., the content.
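For what it's worth, pipelining from a shell takes only a few lines. A minimal sketch (the host and paths here are placeholders, and `nc` stands in for whatever raw TCP client you prefer):

```shell
# Queue several HTTP/1.1 requests so they can be sent back-to-back
# over a single TCP connection (pipelining).
host=example.com   # placeholder host
requests() {
  printf 'GET /page1.html HTTP/1.1\r\nHost: %s\r\n\r\n' "$host"
  printf 'GET /page2.html HTTP/1.1\r\nHost: %s\r\n\r\n' "$host"
  # the last request asks the server to close the connection when done
  printf 'GET /page3.html HTTP/1.1\r\nConnection: close\r\nHost: %s\r\n\r\n' "$host"
}
# To actually send them on one connection:
#   requests | nc "$host" 80 > responses.http
requests | grep -c '^GET'   # count the pipelined requests
```

The server answers in order on the same connection, so one round trip of latency is paid once rather than per asset.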
Somehow I doubt that the 200+ "assets" coming from 1 or 2 hosts automatically when using a web browser authored by a corporation or "non-profit organization" that is connected to the ad sales business are going to be "assets" that I actually want.
> I could be wrong, but I see the HTTP/2 ploy as a proposed way to deliver more cruft, faster.
Could not possibly agree more. I tried my own HTTP/1 site in that site-tester, and it has a 105ms response time for entire pages. And my pages are dynamically generated.
HTTP/2 (and especially the Mozilla/Google-imposed TLS requirement) takes away the simplicity of the web: the thing that allowed you to write a few lines of code, and suddenly your old Commodore 64 was a web server. Increased complexity is going to lead to more and more of a monoculture, which is going to leave us all more vulnerable to attacks (like with OpenSSL and WordPress).
The article is also a little bit ironic for me. He speaks of unnecessary widgets added to pages, yet his unnecessary sidebar is overlapping the text in the article, cutting off the ends of every line, and even maximizing the window doesn't get rid of it ( picture: http://i.imgur.com/qJeyP5v.png ). I had to use "inspect element" to delete the sidebar to read the page in full. (I'm sure it doesn't happen for everyone or it'd have been fixed, but it does on my browser.)
It won't serve to eliminate cruft, but it will drastically reduce the effect it has on load times, which is a very significant part of the problem. If server push and connection multiplexing are used right, you could load an entire page in 1-2 round trips.
Hopefully, a lot of that second round trip would be extraneous (ads/tracking/etc) and the user can start using your content immediately.