I asked a question yesterday: Should I Bother to Develop For JavaScript Disabled?. I think the consensus is: yes, I should develop for JavaScript disabled. Now I just want to understand why users disable JS. It seems many developers (I guess the people who answered the question are developers) disable JS. Why is that? Why do users disable JS? For security? For speed? Or something else?
-
I think when you say 'users', the consensus was from developer users, NOT Joe Bloggs users.... – Darknight Dec 14 '10 at 15:35
-
I think you're making assumptions based on anecdotal evidence. Fact is, 99.7% of users do not turn off JS. In fact, if they really had JS turned off, they wouldn't have answered the question here, because this site does not work without JS. – vartec May 28 '11 at 18:16
-
I don't know anyone who does. – kirk.burleson May 29 '11 at 13:40
-
@vartec, @kirk: I know a lot of people who do, or at least partly. A lot of security-conscious people will only allow JavaScript on sites they whitelist, for instance. And I know a lot who disable JS on their smartphone because it often isn't worth the battery it drains. – haylem Jun 06 '11 at 16:44
-
I have JavaScript disabled in Chrome by default, for security reasons, though I enable it for websites that are worth it. I really dislike that so many websites don't work without JavaScript; there are more and more of them that won't work at all. – Czarek Tomczak May 23 '13 at 19:02
-
I think that --To See This Comment, Like our page on Facebook and get awesome updates on other Katana314 services! -- is often a pretty big reason. However, vartec is right - Stack Exchange is a developer-centric site, and I'd expect computer-savvy users to be far *more* likely to disable JS. – Katana314 Dec 12 '13 at 15:16
-
Does anyone know a good site for statistics showing how many users have disabled JS? I've been googling this, and what came up was an awful lot of old forum poop. – WayneEra Oct 28 '15 at 08:22
-
One of the reasons is also **privacy**. Websites cannot track you without JavaScript. – Hannes Karppila Mar 01 '16 at 22:06
-
@HannesKarppila They can still use cross-domain cookies and web-bug tracking images etc. Tracking users works fine unless they disable JavaScript *and* cookies, in which case the internet is pretty unusable. – NickG Sep 20 '16 at 15:32
-
Oh hey, it's 8 years later and I now disable JavaScript so my room doesn't heat up to mine cryptocoins for someone else. That and the remote possibility of Spectre attacks. I'll enable it for your website if I judge that your website has a legitimate reason to use it, and isn't abusing it. – user253751 Apr 05 '18 at 05:44
8 Answers
Users disable JavaScript in a browser environment for the following reasons:
- Speed & Bandwidth
- Usability & Accessibility
- Platform Support
- Security
Speed & Bandwidth
A lot of applications use way too much JavaScript for their own good... Do you need parts of your interface to be refreshed by AJAX calls all the time? Maybe your interface feels great and fast when used over a broadband connection, but when you have to downgrade to slower connection speeds, a more streamlined interface is preferable. Switching off JavaScript is a good way of preventing dumbstruck web apps from refreshing the world every 15 seconds or so for no good reason. (Ever looked at the amount of data Facebook sends through? It's scary. It's not only a JS-related issue, but JS is part of it.)
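To put rough numbers on the polling cost described above (the 15-second interval comes from the text; the 2 KB response size is purely an assumed figure for illustration), here's a back-of-the-envelope sketch:

```javascript
// Rough cost of a web app that polls the server every 15 seconds.
// The response size is an assumption, not a measurement of any real site.
const pollIntervalSeconds = 15;
const responseBytes = 2 * 1024; // assumed ~2 KB per poll response

const pollsPerDay = (24 * 60 * 60) / pollIntervalSeconds;
const bytesPerDay = pollsPerDay * responseBytes;

console.log(pollsPerDay);                 // 5760 requests per day
console.log(bytesPerDay / (1024 * 1024)); // 11.25 MB per day, per open tab
```

Multiply that by a handful of open tabs and a month of uptime, and the numbers stop being negligible on a metered or slow connection.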
We also tend to off-load more and more of the processing to the client, and if you use minimalistic (or just outdated) hardware, it's painfully slow.
Usability & Accessibility
Not all user interfaces should be expressed in a dynamic fashion, and server-generated content is perfectly acceptable in many cases. Plus, some people simply don't want this type of interface. You cannot please everybody, but sometimes you have both the chance and the duty to satisfy all your users alike.
Finally, some users have disabilities, and thou shalt not ignore them, ever!!!
The worst-case scenarios here, in my opinion, are government websites that try to "modernize" their UIs to appear more friendly to the public, but end up leaving behind a big chunk of their intended audience. Similarly, it's a pity when university students cannot access their course content: because they are blind and their screen reader doesn't support the site, or because the site is so heavy and requires so many ad-hoc modern plug-ins that they can't install them on a refurbished laptop bought on eBay two years ago, or again because they go back home to another country for spring break and the local bandwidth constraints cannot cope with the payload of the site.
Not everybody lives in a perfect world.
Platform Support
This point relates to the two previous ones and tends to be less relevant nowadays, as browsers embed JavaScript engines that are an order of magnitude more efficient than they used to be, and this keeps getting better.
However, there's no guarantee that all your users have the privilege of using modern browsers (either because of corporate constraints - which force us to support antediluvian browsers for no good reason, really - or other reasons which may or may not be valid). As mentioned by "Matthieu M." in the comments, you need to remember that a lot of people still use lower-quality hardware, and that not everybody uses the latest and coolest smartphone. As of today, there is still a significant portion of people using phones whose embedded browsers have limited support.
As I mentioned, things do keep getting better in this area, but you still need to remember the previous points about bandwidth limitations if you keep polling very regularly (or your users will enjoy a nice phone bill).
It's all very inter-related.
Security
While you might think that nothing particularly dangerous can be done with JavaScript considering it runs in a browser environment, this is totally untrue.
You do realize that when you visit P.SE and SO you are automatically logged in if you were logged in on any other network, right? There's some JS in there. That bit is harmless, but it relies on concepts that can be exploited by malevolent sites. It is entirely possible for a website to use JavaScript to gather information about things you do (or did) during your browsing session (or past ones, if you don't clear your session data every time you exit your browser or use the now-common incognito/private browsing modes extensively) and then simply upload it to a server.
Recent vulnerabilities (working in major browsers at the time) included the ability to gather your saved form data (by trying out combinations for you on a malevolent page and recording the suggested text for each possible starting-letter combination, possibly telling attackers who you are and where you work and live) or to extract your browsing history and habits (a very clever hack doing something as simple as injecting links into the page's DOM to match the color of the link and see whether it's been visited; you just need to do this on a big enough table of known domain names, and as browsers get faster at processing JavaScript, this sort of thing gets done quickly).
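The link-color history sniff can be sketched roughly as follows. This is a hypothetical reconstruction, not the actual exploit, and modern browsers have since restricted what styles `:visited` links report; the DOM calls are stubbed out here so the logic runs anywhere:

```javascript
// Hypothetical sketch of the :visited history-sniffing trick.
// In a real browser the attacker would inject <a href=...> elements into
// the DOM and read their computed color; that step is stubbed here.
const VISITED_COLOR = "rgb(85, 26, 139)"; // a typical :visited purple
const UNVISITED_COLOR = "rgb(0, 0, 238)"; // a typical unvisited blue

// stand-in for document.createElement + getComputedStyle
function computedLinkColor(url, simulatedHistory) {
  return simulatedHistory.has(url) ? VISITED_COLOR : UNVISITED_COLOR;
}

// Probe a big table of known domains and keep the ones marked "visited".
function sniffHistory(candidateUrls, simulatedHistory) {
  return candidateUrls.filter(
    (url) => computedLinkColor(url, simulatedHistory) === VISITED_COLOR
  );
}

const simulatedHistory = new Set(["https://bank.example.com"]);
console.log(sniffHistory(
  ["https://bank.example.com", "https://forum.example.org"],
  simulatedHistory
)); // → [ 'https://bank.example.com' ]
```

The real attack is just this loop run over thousands of candidate URLs, which is exactly why faster JS engines made it practical.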
Plus, let's not forget that if your browser's security model is flawed, or the websites you visit don't protect themselves well enough against XSS attacks, then someone might use JavaScript to simply tap into your open sessions on remote websites.
JavaScript is mostly harmless... if you use it for trusted websites. Gmail. Facebook (maybe... and not even...). Google Reader. StackExchange.
But yeah sure, JavaScript cannot be that bad, right? And there are scarier things to fear online anyway. Like thinking you're anonymous when you really aren't that much, as shown by the Panopticlick experiment of the EFF. Which is also partly done using JavaScript. You can even read their reasons to disable JavaScript to avoid browser fingerprinting.
All this being said, there might be perfectly good situations where you don't need to bother about supporting JavaScript. But if you offer a public-service website, do consider accepting both types of clients. Personally, I think a lot of modern web apps and websites would work just as well using the older server-generated content model, with no JavaScript at all on the client side, and would still be great and possibly a lot less resource-hungry.
Your mileage may vary depending on your project.

-
Facebook, for example, is a terrific drain on your CPU. I've come across some sites that were so poorly coded (or seemed to be), they would basically freeze my computer by fully loading the CPU (with a few other tabs open). – Mark C Dec 14 '10 at 03:35
-
@Mark C: It is indeed; it's really scary. Also, have a look at the forms it sends through via POST requests every time you click on *ANYTHING*. That scared the living *#$ out of me... If you have an ISP limiting your monthly bandwidth, do use the Facebook Lite portal! – haylem Dec 14 '10 at 03:41
-
@Haylem Interesting observation, but how did it scare you? Surely you do not believe sending text even begins to compare with video bandwidth! – Mark C Dec 14 '10 at 04:04
-
@Mark C: I consider sending back as much as 140K for clicking on "send" when I type a comment a slightly exaggerated use of web forms, honestly. It might have been specific cases at the time that have been fixed since then (hopefully). I lived for a short while in a country with a, hmmm, restrictive stance on internet censorship and not-so-great connection quality, and that makes you appreciate good ol' text-based websites a lot more! – haylem Dec 14 '10 at 04:08
-
@Mark C: In which case I obviously wasn't using much video online. Again, as stated in my answer, don't assume everybody is as lucky as you might be in having a nice and speedy connection. If you think about it, that's a pretty recent development, and that's not the case for everybody. – haylem Dec 14 '10 at 04:09
-
Downvote? Really? What's so wrong in what I point out, and how does it not answer the OP's question? – haylem Dec 14 '10 at 04:10
-
@Haylem A hundred and forty K for a click! They should have a "fundraising" slogan: **"Write [The Time Machine](http://www.gutenberg.org/ebooks/35.html#download) with One Click."** A few more clicks and you have [A Tale of Two Cities](http://www.gutenberg.org/ebooks/98.html#download). – Mark C Dec 14 '10 at 04:13
-
@Mark C: I just happened to have my Firebug console open at the time and saw a bunch of POST forms with either multipart form data or JSON going through. I preferred not to look into it further :) – haylem Dec 14 '10 at 04:17
-
@Haylem Still, when you say "limiting...bandwidth" I think on the order of at least some hundreds of MB. Okay, so that's 13MB per 100 comments. – Mark C Dec 14 '10 at 04:18
-
@Haylem Hm, maybe I should downgrade my understanding of the internet to "low". On the other hand, I know many people know hardly anything about it. – Mark C Dec 14 '10 at 04:20
-
@Mark C: quite normal really, I wouldn't claim to know 1% of what there is to know. And that's the annoying bit: we have a lot of non-technical people online nowadays. Which is a GREAT thing, and something we want to achieve. But they are facing a lot of nasty issues. Guess that's a new environment for a lot of people. We've learned over time to protect ourselves in real life, avoid zones of potential conflict, etc... but online, people still visit websites for dubious purposes and then wonder why their computers behave strangely or why they receive letters from Nigeria. – haylem Dec 14 '10 at 04:25
-
+1 for mentioning accessibility. Half the damned web is completely unusable to me, and I'm neither a casual computer user nor do I need to rely on JAWS (yet). – Stan Rogers Dec 14 '10 at 05:09
-
@Stan Rogers: it matters a lot to me. I had a chance to work with a blind guy at university, who happened to be both student and teacher, and I was blown away by his abilities. And I find it rather sad that big companies and even educational institutions now come up with crappy artsy web-sites where these users are left out. – haylem Dec 14 '10 at 14:00
-
Should "Not everybody leaves in a perfect world" be "Not everybody lives in a perfect world"? – Chris Dec 14 '10 at 17:30
-
@Chris: ouch. Indeed. It was early in the morning (or late into the night). Thanks. – haylem Dec 14 '10 at 17:48
-
No problem, I did not want to change it myself because I was not sure if it was correct and I was reading it wrong (early in the morning). :-) – Chris Dec 14 '10 at 21:54
-
@haylem: I am quite late on the debate... but to add to the list of browsers that don't handle javascript/heavy pages well, we can take a look at the mobile phones. Old Blackberrys or Nokias come to mind :/ – Matthieu M. May 28 '11 at 19:26
-
@Matthieu M.: fair point. I didn't have much time but I just added a section about this. – haylem May 29 '11 at 13:12
-
+1 Good list. For longer-time internet users, I would also point out that there's a lot of bad history with JavaScript in web pages. It was not uncommon in the early days of the web to find that a page had loaded JavaScript merely so it could do obnoxious things like window.alert, and it's only semi-recently that pages have actually needed JavaScript to perform useful tasks. – Dan Lyons May 23 '13 at 18:01
-
@DanLyons: "Semi-recently" meaning here since about 2005, I guess? That's already about 8 years, which on the timeline of World Wide Web history is a very large (and significant) chunk. :) But that's right. Though I wouldn't shame today's JS for the mistakes (and the developers' misuses) of yesterday. – haylem May 23 '13 at 20:31
-
+1 for accessibility. I work for a site that is very heavily related to healthcare. (Thankfully not the abysmal one in the news.) As much benefit as JS gives us, I'm greatly saddened by our priorities. – Katana314 Dec 12 '13 at 15:18
-
I think the speed point is a mixed one. AJAX enables key functionality to happen without full page POSTs and reloads. If done right, it can drastically decrease bandwidth needs (except possibly for first load). – andrewb Aug 21 '15 at 05:55
-
Because trusting somebody to write a funny comic strip every morning and trusting somebody to run arbitrary Turing-complete code on my computer are two very different things.

-
+1 for the funny analogy. Though the fact that it's Turing complete has **nothing** to do with the dangerousness of the execution. – haylem Dec 14 '10 at 14:56
-
@haylem: Being Turing-complete means that it is impossible to mechanically prove secure in the general case. Heck, it's even impossible to prove basic things like that it doesn't run forever. For a more restrictive language, it would be possible for the client browser to prove that the script isn't doing something dangerous. – Jörg W Mittag Dec 14 '10 at 15:25
-
Turing-completeness is *only* about computability. It says nothing about whether the interpreted language is allowed to open files, make HTTP requests, etc. The only inherent danger in Turing-completeness is the possibility of an infinite loop. – dan04 Feb 26 '11 at 07:01
-
@dan04 Or that it tries to emulate an x86 processor running Windows running a Desktop application which gets projected into your browser window - all in Javascript. Turing completeness is scary – sinni800 Mar 11 '13 at 10:32
-
@dan04 and now cryptocoin mining botnets (which only require compute resources, and the ability to send results back). – user253751 Apr 05 '18 at 05:46
I am not a web developer, and I have only a moderate understanding of the way the internet works. So this is an answer from a user.
My experience leads me to believe many sites are simply poorly coded, whether out of laziness or ignorance: When I would view a basically static web page, such as a Facebook page, my CPU usage would increase by something like 15%, and drastically more with multiple tabs. Eventually it got to the point where I would have to wait for a response after clicking a button or link and my CPU would overheat and lock up.
On many of these worst offenders (sites), nothing visible is changing and nothing interactive is happening. I could only suppose the site's code was constantly running excessive refreshes, polls, and endless loops.
This drove me to install NoScript to free my CPU usage and stop browsing from becoming a frustrating chore.
The other wonderful add-on I use is FlashBlock.

-
Facebook doesn't provide static pages: it's using a technique called [long polling](http://en.wikipedia.org/wiki/Push_technology#Long_polling) to check for new notifications, IM messages, and newsfeed items. All of those things require JavaScript and some amount of CPU power. – Dec 14 '10 at 15:38
-
@MarkTrapp Yes, that is why I said "basically static" although it is not strictly speaking a static page. HyperPhysics would be an example of a site with static pages. I realize there is probably a need to do that kind of thing otherwise the boxes would never disappear and you would not see notifications until you refreshed the page, **but**: It seems each site helps itself to more of your resources than it should, similar to the situation where a professor or teacher expects you to put their work first. – Mark C Dec 15 '10 at 00:00
-
If you think that Facebook is a static page, then you shouldn't comment on this question. – Dainius Jul 15 '13 at 08:01
-
@Dainius You seem to be confusing jargon with English, and are not following logic here. What is it doing with all those CPU cycles here? That is the problem. Maybe it is better now, but a lot of these websites help themselves to obscene amounts of your CPU time. – Mark C Mar 06 '16 at 19:04
-
Mark, you call yourself a webdev and you're asking what static vs. dynamic pages have to do with CPU cycles? Or do you really think that "view a basically static web page, such as a Facebook page" is true? – Dainius Mar 07 '16 at 21:55
-
@Dainius You are almost an engineer, except without the analytical skill. You did not even read correctly. – Mark C Mar 25 '16 at 05:23
-
I disable JS for speed reasons. TechCrunch without JavaScript takes a few seconds to load with a primed cache. With JavaScript it takes almost 20 seconds, more if the cache isn't primed.
Lots of sites have become bloated with JavaScript, especially image galleries and commerce sites. Removing this gives you a better browsing experience in most cases.

For me it's all about security. I use NoScript to allow certain websites to run JavaScript while disallowing most.
In the end you really never know where the danger lies (the Nobel web site infection reported on techspot.com). Many zero-day (and other) exploits use JavaScript; closing this one avenue of attack feels like a step in the right direction.

-
You need brackets around something for that link to kick in. That reminds me, I learned only last winter that the Yahoo! Sports ads were infected with some kind of malware (or would infect you). The young man administrating the home network where we were staying blacklisted numerous sites that had infectious ads. – Mark C Dec 14 '10 at 05:11
My main reason is that it suppresses the most annoying ads. I'd rather not use AdBlock Plus, since that can affect revenue for the sites I visit (and I've used a site or two where the terms of service said I was not to disable ads). NoScript limits the potential obnoxiousness of ads, and I'm willing to live with the rest of them.
There's also the security consideration, and that's largely related to ads also, since any site that sells ads has to be considered potentially hostile.
Moreover, I don't necessarily know a site is dodgy before I visit it. Some people enjoy sending out links to sites, and aren't necessarily honest.

-
Also, trusted sites may include harmful JavaScript, through XSS or because they are hacked. The example of the Nobel-prize site comes to mind. – Mnementh Dec 14 '10 at 17:14
Because browsers used to have slow JavaScript implementations and too many n00b web designers just used it for irrelevant things like button rollovers.
On a fast machine, with a modern browser, nobody in their right mind disables it all the time. Which is not to say there aren't plenty of very "security conscious" people and others without the funds, desire, or know-how to be running a modern browser on a fast computer... It was only recently that IE6 stopped being the most popular browser on the internet!

-
"and others without the funds, desire, or know-how to be running a modern browser on a fast computer". I can understand and agree with the "funds" part. I can understand with the "desire" part, though I'd think it usually would be more a matter of "need" as an imposed constraint that a refusal to have a decent computer. But I don't really get the "know-how" part. How can you be unskilled to the point of not buying a recent computer? Or if you do, of misusing it to the point of not installing the bundled browser and using an older one instead? :) – haylem Dec 14 '10 at 04:20
With JavaScript activated, any website may execute code on my computer. I don't even know whether a particular website executes code, or what it does. Even worse, someone else may insert code into a normally harmless website without my knowledge (XSS). Recently the well-known German computer magazine c't published an article about a 16-year-old who tried the online banking sites of the most common banks in Germany. Many of them - including the biggest - were vulnerable to XSS. And you don't even notice that your online banking site is executing some JavaScript that changes, for example, the target and the amount of a transaction. With JavaScript disabled, an XSS attack in the context of a trusted site is useless: I don't execute the malicious code.
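The kind of XSS hole described here boils down to a site echoing user input into HTML without escaping it. A minimal illustration follows; the function names and payload are invented for the example, and real code should use a vetted templating or sanitization library rather than a hand-rolled helper:

```javascript
// Vulnerable pattern: user input concatenated straight into HTML.
function renderGreeting(name) {
  return `<p>Hello, ${name}!</p>`;
}

// Minimal HTML escaping for illustration only; prefer a vetted library.
function escapeHtml(s) {
  return s
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

const payload = '<script>transferMoneyToAttacker()</script>';

console.log(renderGreeting(payload));             // script tag lands in the page
console.log(renderGreeting(escapeHtml(payload))); // rendered as inert text
```

With JavaScript disabled in the browser, even the vulnerable version can't make the injected script run, which is exactly this answer's point.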
