Everyone has JavaScript, right? (kryogenix.org)
87 points by tosh on April 21, 2024 | 113 comments


> Does the corporate firewall block JavaScript? Because loads of them still do.

[citation needed]

> Does their ISP or mobile operator interfere with downloaded JavaScript? Sky accidentally block jQuery, Comcast insert ads into your script, and if you've never experienced this before, drive to an airport and use their wifi.

Ever heard of HTTPS?

> Do they have addons or plugins installed which inject script or alter the DOM in ways you didn't anticipate? There are thousands of browser extensions. Are you sure none interfere with your JS?

If the user installs an extension that breaks the web, it’s the user’s fault, and the developers should not cater to them.


> if the user installs an extension that breaks the web

It does not break the web. It may break individual sites that were written on naive assumptions; I see a lot of sites break because my extensions do not allow loading analytics libraries. That means their JS was dependent on those libraries actually being active.

I'd say this is a case of _unbreaking_ the web.


> It may break individual sites that were written on naive assumptions

Google Translate and many other DOM-modifying tools break React-based sites if they use refs.

I don't think that falls under "written on naive assumptions".

https://github.com/facebook/react/issues/11538

The issue is marked as closed, but you can still easily hit it on various sites and in various use cases.


Still, the user decides to use a crappy extension that breaks websites.

Some users email me that they cannot see images on my website. They have an adblocker rule that blocks every image whose filename contains "ad".

So dsa231dfsaade.jpg is blocked. And you say that is the website developer's fault?


Of course you can decide not to serve those visitors. But if you want to capture as much of your target audience as possible, you might want to consider which users can't see your website - and the OP raises a number of non-obvious ways in which relying on JS might affect that.


Sorry, but your comment's text contained "target", which is a blocked keyword, so the adblocker removed it.

Can you reword your comment?


Your example is a bit extreme, and it sounds like you are talking about your own website, which, with all due respect, I doubt anyone else is remotely invested in the success of, so you as a developer have an unusual degree of freedom to be this dismissive.

In the real world, anyone who is as dismissive as you is likely in a position where they're going to have a boss telling them to pull their head in. The reality is the vast, vast majority of users won't have a clue how 'disrespectful' the browser extension they've installed is of 'the wishes of the developer' or whatever, and a poor customer experience is a poor customer experience, no matter whose "fault" it is.


It is the reality of bad extensions (a user skill issue) that break websites.


If the site doesn't work with adblock, I just close the tab instead of using whatever skill it might take to hit disable


> Ever heard of HTTPS?

Ever heard of SSL MITM Proxy? Required by law (and corporate compliance departments) in many industries.

Also, Cloudflare. They are the MITM for what, like 80% of the internet now?


Cloudflare may be a MITM, but with the consent of the website owner. If they break JS (which I doubt they would), the website owner would certainly notice.

Police states and "internet security" mal^H^H^Hsoftware may MITM stuff. If you're actually doing business with customers in those police states, you probably have someone local who can test. "Internet security" software is unlikely to break things just to insert ads (as the ISP mentioned in the post did over plain HTTP).


> Required by law (and corporate compliance departments) in many industries.

Could you please provide details about this?



Thank you. I don’t doubt that certain governments twist the arms of companies for privileged access. From my understanding, however, those practices are not the norm in the west (and certainly not overtly codified). IMHO, 90%+ of the data captured in the “West” is from exploits and such.


Also very curious about this. And yes, there are companies that do that - and then it's usually called Deep Packet Inspection or similar, not MITM, because that's a term attributed to malicious parties.

And in that case, the proxy can manipulate both the JS and the HTML, so the argument of the linked page is still faulty.


Cloudflare does not fiddle with or inject JS without the server's consent. They're not Comcast.


/me laughs so hard that milk shoots out of his nose


This. The Cloudflare captcha page exists and will send you into an endless loop while you're on a 3G connection.


Not sure if this is the same kind of "endless loop" that I've experienced on the Cloudflare captcha page; while on a VPN, the only way I can get past the "I am a Human" captcha is to refresh the page and click the area where the checkbox appears as fast as I can with my mouse, before it even appears. It might take two or three tries, but it hasn't failed yet.

I have no idea why this works so well, but I haven't found another way (other than getting off of my VPN) to get past this screening page.


This doesn’t work unless you have access to the device, which mobile ISPs surely don’t. Some countries might mandate it but I don’t think it’s a feature to make your websites police state approved.


Corporate is the key part being missed here.

They provide the device(s), so they have access to them. There are countless ways to mess with the trust.

If anyone thinks they're safe from their employer due to SSL, lol.


I know that but the original comment was in response to:

> Does their ISP or mobile operator interfere with downloaded JavaScript?

In any case, if some rogue corporate proxy breaks your website, then either they pay you to fix it or you tell them to get lost.


Totally fair, not a judgement - just a reminder down here in the thread.


> Also, cloudflare. They are the MITM for what, like 80% of the internet now?

Please give a source for the claim that Cloudflare implements a man-in-the-middle attack.


The person you're replying to said Cloudflare was the man in the middle, not that they executed attacks.


So, you have found three points that might not apply. Does that make the point of the article invalid? I think it doesn't.


Those are the three arguments I find the least believable. The other arguments are also questionable in 2024. 2G networks are not really usable for Internet consumption. Outages or failures happen, but they can happen for an HTML-only page too. Pages where the primary goal is content consumption should use progressive enhancement and limited scripting, but that doesn't really make much sense for applications.


It could be interesting to explore how things fail. If it is super easy to fix, I'll do it.


Yeah, agreed. The general tone of the flowchart is misleading: the fact that tons of paths lead to "cannot use JavaScript" and only one leads to "can use JavaScript" creates the implication that the probability of the user having JS is low and the reverse probability is high.

The article's point seems to be: if you're writing a webapp or anything interactive, then a mere 99.999% of visitors will be able to use the content.


This whole topic is as interesting as "tabs vs spaces" or "emacs vs vim". There's a vocal minority of people that want to tell everybody else how the web should work, and everybody has ignored them for the last 20 years, and will continue to ignore them.


"If the user installs an extension that breaks the web, it’s the user’s fault, and the developers should not cater to them."

Likely so. My normal default is to browse the Web without JavaScript enabled; I've worked the Web this way since the 1990s. The reasons are:

(a) The web runs just so much faster without JS; those jitters, pauses, and delays caused by JS simply disappear. If people knew how much faster the non-JS mode actually is, many more would partake of it.

(b) These days, many web pages are enormous; it's not unusual to see pages that are over 7 or 8MB in size, so slow that one can just about have a coffee break before they load. Disable JS on one of these pages and its content can be well over an order of magnitude smaller.

(c) Without JS, ads are almost a thing of the past. I don't bother to run ad-blockers simply because I don't have to. Without JS, ads simply don't download.

(d) Without JS, much of the online spying and tracking simply disappears. Those who want to track users can't help using JS (it seems to go with the territory).

(e) Without JS, annoying audio and videos don't start automatically. That's good, because if I want to watch the content I can just copy its link to, say, NewPipe and have a much more pleasant experience.

(f) Without JS, pop-ups and messages such as those that ask for permission to accept cookies don't appear, and similarly for those asking for donations. Many web operators wait a few seconds before displaying these messages; by that time one is already reading the page content, so the message is deliberately aimed at distracting the viewer. Notable offenders are the Guardian and Wikipedia (especially during its money drives), but there are thousands more just like them.

(g) Web browsing sans JS is more secure; there's less chance of receiving nefarious scripts, etc.

It's not JavaScript that's the problem; rather, it is web developers who abuse it to end users' considerable disadvantage. Turning off JS is my attempt to level the playing field.

People ask how I manage with sites that have key functions that require JS to work and without which the sites are useless. Several answers: first, sometimes I do have to turn on JS, but it's certainly more the exception than the rule. To turn JS on and off I just hit a toggle button on the browser that invokes, say, the QuickJava add-on (and I never bother using 'cantankerous' browsers such as Chrome, Edge, etc.).

Second, when I come across a website that only displays a blank page if JS is disabled, and/or requires JS for some key function to work, I simply move on to somewhere else! If any web developers who adopt such practices are reading this, let me tell you categorically that there's nothing that will get me off a site faster; in a sub-second I'm out of there. QED!

Third, people say I'm missing out on many web enhancements by turning off JS. Often that's true, but in most instances I just do not want them. I find most pages that use these visual enhancements distracting and/or outright annoying. When I read a book I do not expect everything on the page to be continually popping in and out; the same goes for webpages. What's nice about HN's website is that its pages are static and carry no superfluous junk. That's how I like it. In my opinion, prerequisite training for web developers ought to include a compulsory course on human ergonomics (they'd then perhaps better understand what Web surfers actually want to view, and design their pages accordingly).

Finally, I'd add that I've noticed repeatedly over the years that the websites with the largest pages and the most superfluous junk are also the sites whose content I find least interesting. The fact is there's just so much content on the web that if 99% of websites were of the type I've no interest in visiting, I still wouldn't have time to visit all the others in my lifetime (I can afford to be very choosy).

In essence, on the Web it's easy to set one's own rules and be choosy about what one views. And I'm sure the reason more people don't act as I do is that many have been funneled into the narrow webcasting offered by the big internet giants, who of course all require surfers to have JavaScript enabled.


I feel like the author is stuck in the 2000s, when JS should indeed only have supported the browsing experience. But we're in 2024, where JS has reached almost the same level of support in every major browser.

Many of the arguments are just plain stupid. The JS hasn't loaded yet? So what about images? If your HTML document loads, your JS will too. If you're unsure about that, don't load from sources you don't control.

And if a MITM injects code into your JS, then they inject that into your HTML as well...


>The JS hasn't loaded yet?

My dad is an avid reader of news articles on his iPhone. That's mostly text with some photos here and there, ostensibly anyway.

Whenever I happen to see how much traffic he downloads over the better part of a working day, somehow he racks up almost, if not over, a gigabyte. And that's with iOS set to data saving mode.

1GB. For what should be mostly textual and static image content. What the sincere fuck?

The only reason people need phone data plans with sky-high data speeds and caps, aside from streaming, is that malvertisements and JavaShit waste away most of it. This is fucking cancer.


There’s no way even 10% of this is due to JS. It’s probably mostly images and videos which make up this data usage.


Exactly. Even of the part that is JavaScript it's mostly going to be tracking and ads. GA, segment and so on.

Use of react and fetch is not unreasonable in 2024 folks.


If you only use JS for progressive enhancement or optional features, you don't need to worry if your user has JS or not.

Most websites are inherently text, and most interactions are form-like. Unless the website is a web application that does complex and dynamic things, JS seems to make websites worse. JS-riddled sites are slow, have poor usability, and are often over-animated to the point of visual distraction.
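For what it's worth, progressive enhancement doesn't have to be much work either. A minimal sketch, assuming a plain HTML form (the form id and the /subscribe endpoint are invented): the form submits and works on its own, and this script, if it happens to load, merely upgrades it to an in-page submission.

  // Minimal progressive-enhancement sketch; the element id and the
  // endpoint are invented. Without this script the form still works.
  const form = document.querySelector('#subscribe-form');
  if (form && 'fetch' in window) {
    form.addEventListener('submit', async (event) => {
      event.preventDefault();
      try {
        const response = await fetch(form.action, {
          method: 'POST',
          body: new FormData(form),
        });
        form.insertAdjacentText('afterend',
          response.ok ? 'Subscribed.' : 'Something went wrong.');
      } catch {
        form.submit(); // network or script failure: plain submit still works
      }
    });
  }

If the script never arrives, nothing is lost; the baseline behaviour is the browser's own form handling.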


> most interactions are form-like

I'd say that most interactions are link-like, but that proves your point even further.


JS is both the best and worst thing to have happened to computing in the last few decades. Great creations have been built on it, but I constantly feel that the dev community comes from a completely different planet than me when it comes to thinking about quality, resources, and priorities. I often feel that the greater JS community does more harm than good when it comes to what can be built atop that ecosystem.


I have seen so many of these "you shouldn't do this completely normal, accepted and usual thing in $current_year because of corner case XYZ", but who is the target audience?

As a developer, I don't have the authority to decide I'm going to spend resources on making the site work without javascript. I can sneak in some extra hours, but continuously testing if the site works without it, and getting other developers to do the same is not a small task.

Managers? They are not going to read this.


There are lots of people in the world and some of them have unusual perspectives on how the world should work.


I did a double-take because there are two ways to read that. We have an unusual, thin slice of humanity here on HN: a minority with disproportionate power to impose its will upon billions while calling others with different lifestyles "marginal", "edge cases", and so on.


> Have they switched off JavaScript? Because people still do.

As an experiment, I tend to disable JavaScript periodically. It is amazing how fast and responsive applications become. The amount of cruft that is downloaded is insane! Typically my experiments end after a month or so, because a lot of websites don't work _at all_ with JavaScript disabled.


I generally browse with JS disabled; most things I read "just work".

To achieve this I make use of uMatrix, with the global default being JS and CSS disabled. Then, for sites where I consider it worth the cost and value the content, I selectively enable JS.

One other reason I could give for why not JS would be screen readers: I imagine that if content does not render without JS being active, those readers may well be rendered inoperative, or at best very poor to interact with.


I feel like this is trying to convince me to support clients without javascript.

The answer is no. Absolutely not.

If you do not have javascript then I don’t want you as a user. I don’t care why. Javascript or no service.


I have a pretty popular personal web app and I still get angry e-mails from people using Windows XP saying that my site is broken and I am a terrible developer for making broken websites.

I wonder how they even manage to browse the web daily. Or do they have to send daily hate mail to developers...


I'll support the latest version minus one of the major browsers.

That's all I'll support - if any user has a problem because they are on an old version, they should use some other website instead of mine.

Lack of JavaScript or latest web browser is a user problem.


that's just aggressive and lazy.


also, even the fastest-updating browsers, Firefox and Chrome, take several months to move most users to a new version. There's a spread across a few versions, tapering at about 3 or 4 back, but current minus 1 is gonna cut off double-digit percentages of users. Minus 2 or 3 would be reasonable for someone being aggressive and lazy, but 1 is kind of obscene (or bullshit).


why?


Only supporting the very latest browsers is basically a "works on my machine" attitude. That's lazy.

It means not supporting 11% of iPhone users and 18% of iPad users [0]. That's aggressive.

Adoption of evergreen browsers is also slower than that. Chrome 124 was released this week and even though 123 has been out for a month, the majority of desktop users are still on 122.

[0] https://developer.apple.com/support/app-store/


I make these talking points because I hate JavaScript as a developer. I won't ever bother listing all the reasons why, because they are so well known. I'll actively repeat these points ad nauseam in every meeting if you have this attitude with me on the team. To me, JS is a terrible language to work with, but I will make it about the user. You will lose this argument with the boss, because he doesn't care about languages; he cares about more users. So you can fold your arms and pout about it like this, and I will still win. Most of my apps work without JS, and when someone tries to introduce a JS-only feature, I make their lives miserable with the boss until they give up or quit their job.


so what are these arguments in favor of the user?

as a user i like websites that are interactive and fast to respond. from my experience that works better if the site is done in a frontend framework and built to reduce roundtrips to the server.

i am on a slow connection, and when on hackernews for example on a slow day half the time the pages don't even load. so i sit there wanting to reply to a comment and i can't because first the reply link fails, and when it finally loads, submit fails.

had hackernews been written with a frontend framework, then i could click reply and submit without internet access, and the page could store my comment until internet is back and send it without me having to babysit it.

even just plain reading would be better. with js new comments could be loaded in the background and they could be added to the page and marked as new without reloading the whole page. a much smoother experience than having to wait for a reload.

i don't know what kind of sites you are building. maybe in your domain this kind of example doesn't apply, but claiming that javascript is bad for users across the board is just plain nonsense. most users do not care whether something is done in js or not. they care that it functions well, and there are cases where javascript provides the better functionality. (submitting this comment took 1.3 seconds + another 300ms to load the updated page. and that's fast. with js it could happen in the background.)
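a rough sketch of what i mean, with invented names (the /comment endpoint and the localStorage key aren't real): queue the comment locally and flush it once the connection comes back.

  // rough sketch: queue a comment locally and send it when back online.
  // the endpoint "/comment" and the storage key are made up.
  function queueComment(parentId, text) {
    const pending = JSON.parse(localStorage.getItem('pendingComments') || '[]');
    pending.push({ parentId, text });
    localStorage.setItem('pendingComments', JSON.stringify(pending));
  }

  async function flushComments() {
    const pending = JSON.parse(localStorage.getItem('pendingComments') || '[]');
    const stillPending = [];
    for (const comment of pending) {
      try {
        await fetch('/comment', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(comment),
        });
      } catch {
        stillPending.push(comment); // still offline, keep it for next time
      }
    }
    localStorage.setItem('pendingComments', JSON.stringify(stillPending));
  }

  window.addEventListener('online', flushComments);

that's only the happy path; a real implementation would also handle server errors and duplicates.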


Both examples kinda sound like an anti-feature.

> had hackernews been written with a frontend framework, then i could click reply and submit without internet access, and the page could store my comment until internet is back and send it without me having to babysit it.

Which websites do this? I have never seen, nor would I expect, such behaviour. Would you expect it to still post if you close the tab? What if you close the tab, but open some other page on HN? Would you expect to see it in your profile?

> even just plain reading would would be better. with js new comments could be loaded in the background and they could be added to the page and marked as new without reloading the whole page. a much smoother experience than having to wait for a reload.

So when you’re on a slow (and possibly traffic-limited) connection you would want the site to hog the channel with update-fetching? Loading the new comments, shifting everything as you read it? This might be OK for linear flat comments, but for discussion trees this just sounds like a nightmare, again - who does this?


new comments don't need to appear automatically, they just could be preloaded so when i click to see them, they appear instantly.

if you have a fast connection then it is hard to imagine how grating it is that every click takes a second or two to resolve.

i hate waiting for stuff to load. when reading hackernews i frequently reload the page to get the new comments. and that always takes more time than i am comfortable with. it takes long enough for me to often end up focusing on something else. preloading comments does not take much bandwidth. less than reloading the whole page. even if i end up not reading the new ones. and it doesn't have to happen more than once a minute. maybe even less.

stackoverflow is another example that could be improved. it hides some of the comments under each answer, and when i click to reveal them it takes 1-2 seconds for them to show up. i don't know why they do that. they could load all the comments at once and hide some of them so that they'll show up instantly when i click.
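a sketch of the kind of preloading i mean, with invented names (the /newcomments endpoint and the element ids aren't real):

  // sketch: prefetch new comments in the background no more than once a
  // minute, and only insert them when the reader actually asks for them.
  // "/newcomments", "#show-new" and "#comments" are invented names.
  let lastCommentId = 0;   // newest comment already on the page
  let preloadedHtml = '';  // fetched but not yet shown

  async function prefetch() {
    try {
      const res = await fetch('/newcomments?after=' + lastCommentId);
      if (res.ok) preloadedHtml = await res.text();
    } catch {
      // offline or slow: just try again next time
    }
  }

  setInterval(prefetch, 60 * 1000);

  document.querySelector('#show-new').addEventListener('click', () => {
    document.querySelector('#comments').insertAdjacentHTML('beforeend', preloadedHtml);
    preloadedHtml = '';
  });

even if i never click, each poll costs a fraction of a full page reload, and a real version would also advance lastCommentId from the response.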


Which popular examples are engineered with these JS niceties in comment submissions?


I’m with you. On the Internet today, a web browser is a thing that runs JS. Clients without it are HTTP browsers. Given that no browser maker has shipped that as a default config in what, like 20 years?, that ship has sailed.

JS came out a year before CSS for Pete’s sake. It just turned 28. It’s finishing its residency and can rent cars. I get why people didn’t like it nearly 30 years ago. I didn’t because it didn’t run well on the Amiga I had back then. But complaining about it today makes me imagine someone complaining about “why don’t these so-called ‘web’ sites gracefully fall back to table layout for those of us who can’t or won’t use CSS?” I understand the fundamental difference. I just don’t think it matters anymore.


If you understand the fundamental difference, why are you using CSS as a comparison?


Because sometimes analogies are convenient, although imperfect. Close up, sure, JS is running random code on your computer and CSS is a non-executable description of a page’s appearance. Those are very different. Farther away, they’re both fundamental technologies of the modern web that most new things are built on, and disabling them breaks a huge portion of the web.


And sometimes analogies lead you astray, as in this case of comparing the age of some tech to biological age, or comparing random code (which, btw, also breaks plenty of the web) to styling.


I don't understand. Why not build most of the functionality in HTML + CSS and use JavaScript only for interactivity and "real time"?

Hacker News is a very good example of that done right. I have JavaScript disabled and I can use the site just fine. The only thing I can't do is collapse comment threads.


Because this closes you off from being able to use a whole suite of development options (such as front end JS frameworks).


> People still do.

Links to an 11-year-old article.


I still do. In fact, hello from a browser with JavaScript disabled


In the article's defence, it _was_ written around early 2015. [1]

[1] https://web.archive.org/web/20240000000000*/https://www.kryo...


I've noticed that on popular topics on Scott Alexander's ACX on Substack, once comment counts reach the high three digits or so, the page gets unbearably slow and sometimes even crashes my browser tab when I try to scroll or select text. I haven't bothered to try to debug what's going on - my guess would be they're bubblesorting the comments or something dumb like that - but I do know that if you turn JS off for that site, the post itself loads instantly, and shows you the whole thing at once rather than firing event handlers as you scroll down or click or select text.

So +1 to substack for working fine without JS enabled (even for subscriber-only posts) but if someone from there is reading this, can you take a look at what's causing the problem in the first place?


>> How many times have you had a mobile browser hang forever loading a page and then load it instantly when you refresh?

At least once per day; it's insane. I am not a front-end dev, so I never really thought about it. I often just click a news website and stop the connection manually to read it, instead of continuously waiting for the "proper" response.


related: JavaScript Naked Day https://news.ycombinator.com/item?id=40104842

i browse the web with firefox + ublockOrigin with 3rd-party {scripts,frames} blocked by default. it is a shame how many websites won't show their main asset (text) until i allow some (or all, if in a hurry/impatient) foreign-site javascript.

The post advocates an 'escalator' pattern, which is a good pattern to follow (for many things unrelated to the web as well).


This is why I think JS Naked Day is a cool concept. It's happening on Apr 24th. https://js-naked-day.org/


Maybe the author can strategically synthesize something novel, and still relevant. This is like arguing about cloud and hosted infrastructure in 2006.


LOL.

Your average JS engineer needs to have all the browsers stripped from their machine. They need to be given the following VMs: 1. a toolbar-riddled version of Windows that only has 8 gigs of RAM and at least one unreliable key; 2. a Chromebook, but with all the filters of the nearest school system; 3. Chrome, on Linux, with 512MB of RAM.

And they get whatever low-end Samsung J series phone that makes calls as their cell (cause no cheating).

I give that about 3 months till the web is fast again.


They should also give all designers the tiniest still-supported mobile phone screen, while enlarging the fonts by at least 20% and making the screen black and white.

I wouldn't go as far as making the language German but.....well actually.....yes put the phone in German as well!


Das ist keine Dzamdzung das ist Scheissung!


Who knew devs/corporations don't optimize the web for people with no spending power?


A lot of corporations make their revenue selling in great volumes to people with little spending power.

Walmart can’t afford to design their website to Louis Vuitton’s audience.


I talk about this so much at my workplace and every time I mention slower phones I get laughed at by my boss. It's idiotic.

"Just get an iPhone"

Like... my dude, you rolled out of uni and started your own business, started out making at least double what normal people make every month, and you buy a new phone every 2 years. This is not normal.

Our target customer is very literally the "average consumer", so this is the dumbest take I've ever seen.

I try to minimize garbage collection and optimize React rendering, but React is honestly quite difficult to get right when it comes to performance details, even after a few years of experience :(. It doesn't help that I explicitly don't get time to focus on these kinds of details.
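For what it's worth, the kind of tuning I mean looks roughly like this; it's a sketch with made-up component and prop names, not our actual code: memoize a row component and keep its callback prop stable so the whole list doesn't re-render (and re-allocate) on every parent update.

  // Sketch with invented names: memo() skips re-rendering rows whose
  // props haven't changed, and useCallback() keeps the callback's
  // identity stable between renders so memo() can actually kick in.
  import { createElement as h, memo, useCallback, useState } from 'react';

  const Row = memo(function Row({ item, onSelect }) {
    return h('li', { onClick: () => onSelect(item.id) }, item.label);
  });

  function List({ items }) {
    const [selectedId, setSelectedId] = useState(null);
    const onSelect = useCallback((id) => setSelectedId(id), []);
    return h('ul', { 'data-selected': selectedId },
      items.map((item) => h(Row, { key: item.id, item, onSelect })));
  }

Getting this consistently right across a large codebase is the hard part, not the individual lines.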


Meh, are slow phones really an issue?

I'm using a Motorola Android phone I bought three years ago for around $200 and it's absolutely fine for browsing the web and messaging. I could justify buying an iPhone (after all, it comes down to a dollar a day over two/three years), but I don't feel like I have to.


Slow phones on slow connections.

Try to access the Internet in modern Germany while crossing the country by train.


i have lived in germany. germans should suffer their slow internet until they get fed up and demand better.

so no, adapting my website so that it works on a german train is not an important goal. i would even argue that if germany is my audience it is counterproductive. demand better internet.


How thoughtful of you. There are many countries that are even worse, but sure, the comfort of the developer is more relevant than that of your users.


i am living in such a country now. and if those countries are my target audience i will certainly build for that. heck, i might even hire a dev in germany and send him traveling on trains as a test case ;-)

but if my target audience is germany, i will probably not focus on the tiny fraction of german train travelers.


Fair enough. As an Austrian, I'm deeply familiar with the experience on German trains, but that doesn't get any better with an iPhone either.

I can assure you that Apple's RF engineers working around here are suffering the same issues.


It gets better when website designers take that into consideration.


Why?

What could possibly be so important that it cannot wait until I am in a place with a faster connection?


"Why" is really the question of the universe.

Not everyone is you, and some people like to keep themselves busy reading stuff online.


And there are countless ways to keep oneself busy, the majority of which do not require Javascript-heavy websites.


As mentioned, not everyone is you.


And not everyone is you, either.

OP was talking about devs needing to build their products/applications/websites on underpowered hardware, presumably as a way to have the same experience as most people. That's reasonable, provided that this is the context where people will use your product/application/website. The context you are giving (slow phones on spotty connections) is so specific that it becomes irrelevant for any project manager that could be interested in evaluating the merits of the claim.

So, yeah, I don't know why you seem so upset about my question, but if you detach yourself from the case, you'd see that your response to "are slow phones really an issue?" does not make a strong argument.


You are holding it wrong. (TM)


Depends on who your target audience is:

https://news.ycombinator.com/item?id=39729057

Most people don't live in first-world countries, don't have cheap access to high-end devices like a half-decade-old Android flagship (and 8.4% of people live on less than $2.15 a day), and any website relying on JavaScript for basic features literally won't be able to LOAD for them.


> Most people don't live in the first world countries, [...] any website relying on JavaScript for basic features literally won't be able to LOAD for them.

Citation needed.


Ok, I guess this is one of those cases where I don't break rules when I say:

RTFA

(the one I linked to)


You can't just link to a wall of text to support random statements. Citation needed.


even low end smartphones work ok with basic javascript. the people who live on $2 a day mostly have a feature phone that is barely internet capable (remember WAP?) or they have no phone at all. i don't know about modern feature phones, but i expect they can handle javascript just fine, because it doesn't make sense to build a new phone today that doesn't. (i just did a search on some african online sites and found that smartphones can be had from $30, but most importantly that new web capable feature phones are not any cheaper (should those even be called feature phones?))

where internet is available it is usually not that slow either because it doesn't make sense to build a slow internet. i mean, it may be slow compared to what we are used to but even 1Mbps is enough to load at least text oriented websites, even with javascript. those companies do need to offer a service quality that people are willing to pay for. slow internet is a problem of old infrastructure, but developing countries tend not to have old infrastructure to begin with.


Then come, one after another: manager, PO, PM, designer. They want a privacy consent dialog, social media integration, a newsletter and subscription prompt dialog, all photos in a clickable gallery, a floating auto-playing video, and an AI chat somewhere right there in the corner.


Which is very sad, if you think about it. None of that benefits me as a user; it's for their benefit.


Someone brought up the topic of JS and screenreaders upthread. Screenreaders can work fine with JS. The pain points are:

1. Use of JS to reimplement standard HTML widgets. This has broken my screenreader more times than I care to remember.

2. All of this user-irrelevant garbage, like the dialogs, the social media buttons, and so forth.
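To make point 1 concrete, here's a rough sketch (the id and the handler are invented) of the bare minimum it takes to make a div behave like the button it's imitating. Skip any one of these lines and the widget silently breaks for screenreader and keyboard users, while a native <button> gets all of it for free.

  // Rough sketch; '#pay-bill' and payBill() are invented placeholders.
  function payBill() { /* whatever the real click handler does */ }

  const fakeButton = document.querySelector('#pay-bill');
  fakeButton.setAttribute('role', 'button'); // announce it as a button
  fakeButton.setAttribute('tabindex', '0');  // make it keyboard-focusable
  fakeButton.addEventListener('click', payBill);
  fakeButton.addEventListener('keydown', (event) => {
    // native buttons activate on Enter and Space; a div doesn't
    if (event.key === 'Enter' || event.key === ' ') {
      event.preventDefault();
      payBill();
    }
  });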

I remember a few years back trying to pay my electric bill. Yes, I was using a JS-capable browser. I couldn't actually pay my bill, but there were plenty of "follow us on Facebook" and similar. Like seriously, folks, I just wanna send you money. Really, this should even be something I could do without JS.

Another example: we do online shopping at https://www.fredmeyer.com. Their website is absolutely terrible, with all of the busyness, some of it user hostile. Seriously folks, I just want to give you money for product, not follow you on facebook.

Some sites get it really, really right. I play chess on lichess.org. That site requires JavaScript. And I don't see any way that it could possibly be avoided. But it works beautifully with my screenreader. It's snappy too, even under Firefox on a Raspberry Pi. I used to be a hard core "screw JS" guy. I've softened my stance, because I know it can be used correctly and to great effect.


Mature and professional developers would care what you thought, as a user.


You're not helpful at all, and not a team player.


> privacy consent dialog

Just this little reminder: you do not need a privacy consent dialog if you don't abuse private data.

No external trackers or 3rd party cookies? No data sales to third parties? No shady tactics of your own? You can skip the dialog.


Internal trackers count too. You can't track users just because you use an internal tracker. It is about any tracking, not just external tracking.


Now you only have to convince sales and marketing who funnel all analytics to Google and Meta. They also like dashboards embeddable into PowerPoint.


I mean, if the 2% of the whole internet that does not have JS is not part of your target demographic, then it's not a problem in a business sense. If it is, then yup, you should be prepared.

This post is great in terms of making you aware that even in near-perfect conditions JS might not be available, but the impact and solutions should be considered on a case-by-case basis.


Ok but what are the actual stats? I can’t imagine more than 1 percent of users have JS disabled. I think I can live without that.


Missing from the list: tab sleeping - JS won't, e.g., fire your setTimeout if the tab went to sleep after being idle for a while.
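A rough sketch of a workaround (the deadline and callback are just illustrative): record the deadline yourself and re-check it whenever the tab becomes visible again, instead of trusting a long setTimeout to fire on schedule.

  // Sketch: don't rely on a long timer firing in a throttled/sleeping
  // tab; re-check the deadline when the tab becomes visible again.
  function scheduleAt(deadlineMs, callback) {
    let done = false;
    const check = () => {
      if (!done && Date.now() >= deadlineMs) {
        done = true;
        callback();
      }
    };
    setTimeout(check, Math.max(0, deadlineMs - Date.now()));
    document.addEventListener('visibilitychange', () => {
      if (document.visibilityState === 'visible') check();
    });
  }

  // e.g. refresh a session roughly ten minutes from now
  scheduleAt(Date.now() + 10 * 60 * 1000, () => console.log('refresh'));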


Everyone = 99.99%


i just hate typescript. js is ok


i hate javascript, typescript is ok

i don’t understand the hate for TS. after the initial learning curve, it has only made my life easier. i can’t imagine making a large refactor on a JS code base - the fear of breaking some unknown contract keeps me up at night


I dabbled in JS for 25 years. Last year I had reason to write some TS. What a breath of fresh air that was! I doubt I’d start a new project in bare JS now.


Because there are thousands of settings that have to be considered for the transport to work correctly. And for every packet out there, these settings need to be correct. Then debugging is also difficult, because stepping through the TS code doesn't always work as expected.


You only need to get those settings right once, and only at build time.

I won't defend the mess that are web build systems, but your assertion that things need to be right "for every packet" is false. And while I've had no trouble debugging TS code, you could always just debug the generated unminified JavaScript instead.


I meant package, not packet. And no, you don't have to get it right only once. I started a Svelte app with TS, and when I tried to add Prisma, it took me a lot of time and research to get the settings correct so everything would work out. There were numerous posts and issues on GH about the same thing, so I was far from alone.

And debugging JS is one way of doing things. But when I write code, I want to debug that code, not the transpiled stuff. It just adds a layer of annoying complexity that otherwise wouldn't be there.


If MS is so smart with TS, VSCode, and Copilot, let the TS write itself. Including defining the right transpilable variable types.



