All in the <head>

– Ponderings & code by Drew McLellan –

– Live from The Internets since 2003 –


Why is Progressive Enhancement so unpopular?

27 January 2014

A little earlier today, having read how Sky broadband had blocked the jQuery CDN, I tweeted

To which many responded this is why we don’t rely on CDNs and how you can (shock, horror) even host your own JavaScript fallback and how you make a hole at each end of the shell and suck with a straw. In order to clarify the problem, I followed up with
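(For reference, the self-hosted fallback those replies describe is typically a one-line check after the CDN script tag – a sketch only, with an illustrative jQuery version and local path:)

```html
<script src="//code.jquery.com/jquery-1.11.0.min.js"></script>
<script>
  // If the CDN copy was blocked or failed to load, fall back to a local copy.
  window.jQuery || document.write('<script src="/js/jquery-1.11.0.min.js"><\/script>');
</script>
```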

The internet, as a network, is designed to be tolerant of faults. If parts of the network fail, the damage gets routed around and things keep working. HTML is designed to be tolerant of faults. If a document has unrecognised tags, or only partially downloads or is structured weirdly, the browser will do its best to keep displaying as much of that page as it can without throwing errors. CSS is designed to be tolerant of faults. If a selector doesn’t match or a property is not supported, or a value is unrecognised, the browser steps over the damage and keeps going.

JavaScript is brittle and intolerant of faults. If a dependency is missing, it stops. If it hits unrecognised syntax, it stops. If the code throws an error, in some cases it stops there too. If part of the script is missing, it likely won’t even start. As careful as we are to code defensively within our JavaScript, it counts for nothing if the code doesn’t run.
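To make that brittleness concrete: JavaScript parses an entire script before executing any of it, so a single syntax error means none of it runs, not even the valid statements before the error. A small sketch (the `runScript` helper is illustrative, not a real API):

```javascript
// Parse and execute a source string, reporting what happened.
// A syntax error is thrown at parse time, before anything runs.
function runScript(src) {
  try {
    new Function(src)();   // parse + execute the source
    return 'ran';
  } catch (e) {
    return e.name;         // 'SyntaxError' if it never parsed
  }
}

runScript('var ok = 1;');                    // → 'ran'
runScript('var ok = 1; this is not JS;');    // → 'SyntaxError'
```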

Does that mean we shouldn’t use JavaScript? Of course not. Scripting in the browser is an important part of the experience of using the web in 2014. It’s my opinion that you shouldn’t depend on JavaScript running for your site to work. Build with HTML, add styling with CSS, add behaviour with JavaScript. If the JavaScript fails, the HTML should still work.
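That layering can be sketched in a few lines – the markup, class name and URL here are illustrative, not from any real site. The link is plain HTML that works on its own; the script, if it runs, upgrades it in place:

```html
<!-- The HTML works by itself: a plain link to a server-rendered page. -->
<a href="/comments" class="js-comments">Show comments</a>

<script>
  // Behaviour is layered on top. If this script never runs,
  // the link above still takes the user to /comments.
  if (window.jQuery) {
    $('a.js-comments').on('click', function (e) {
      e.preventDefault();
      $('#comments').load(this.href + ' #comments'); // enhance in place
    });
  }
</script>
```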


This isn’t a new concept, it’s a very old one. What is new, however, is the backlash against this very simple idea by people who at the same time consider themselves to be professional web developers.

It used to be that progressive enhancement was the accepted ‘best practise’ (ugh) way to do things. If you’re building a site today you’d generally make it responsive. Any new site that isn’t responsive when it could be is considered a bit old-hat and a missed opportunity. So it used to be with progressive enhancement. If you built a site that depended on JavaScript, chances are you were a cowboy and didn’t really know what you were doing – a skilled developer wouldn’t do it that way, because they know JavaScript can break.

Somewhere along the line that all got lost. I’m not sure where – it was still alive and well when jQuery launched with its find something, do something approach (that’s progressive enhancement). It was lost by the time AngularJS was ever considered an approach of any merit whatsoever.

When did the industry stop caring about this stuff, and why? We spend hours in test labs working on the best user experience we can deliver, and then don’t care if we deliver nothing. Is it because we half expect what we’re building will never launch anyway, or will be replaced in 6 months?

Perhaps I’m old fashioned and I should stop worrying about this stuff. Is it ok to rely on JavaScript, and to hell if it breaks? Perhaps so.

- Drew McLellan


  1. § Martin Bean:

    Call me unpopular, but that’s exactly the approach I take to building websites. JavaScript is the last thing I add, used only to enhance an app’s functionality after it has been built in PHP and then delivered to the client in HTML and CSS.

  2. § Luke C.:

    The argument I always heard was “who browses without JS enabled these days, anyway!?”

    I think it’s just perceived as too much work for the number of cases in which JS fails.

    A poor argument, to be sure.

  3. § Jens O. Meiert:

    Not that I’ve paid that much attention to your last writings (I should, I suppose), but I wouldn’t have thought we’d come to that big of an agreement on a technical matter, Drew ;) I’ll listen more closely.

  4. § Ben:

    “JavaScript is brittle and intolerant of faults. If a dependency is missing, it stops. If it hits unrecognised syntax, it stops. If the code throws an error, in some cases it stops there too. If part of the script is missing, it likely won’t even start. As careful as we are to code defensively within our JavaScript, it counts for nothing if the code doesn’t run.”

    For each of those points you could say the same of PHP. But of course you wouldn’t put a PHP script live with a syntax error in it. Or with part of the script missing. And you’d handle the errors properly.

    I agree that sites should be able to function without JS, but I don’t see any of your criticisms of JS being specific to the language.

  5. § Drew:

    Ben, the difference being that on the server you have control over the environment. In the browser your code can fail for reasons you’ll not even be aware of.

  6. § Andreas:

    Simply the best answer to why progressive enhancement is needed comes from Jake Archibald

  7. § Otto:

    People stopped developing the web properly when they began to perceive their target audience as largely uniform.

    Back when it was perfectly feasible to be browsing with either Mozilla or Lynx, you made darned sure that your HTML was uniform and descriptive. When the market moved to Firefox and IE (pre-Chrome), you were careful with your XHTML and CSS. Nowadays, a lot of developers presume IE8–9 or better on a modern PC.

    The limitations are no longer so stark, basically. There’s a difference between Mozilla and Lynx, but not quite so much difference between Chrome and Firefox.

    Not that differences don’t exist, but they’re not in the target market. Modern devs generally are not aiming for people with low-bandwidth connections, or people in, say, Africa. Even though these are users too, those markets tend to be very specific, and not necessarily running the latest gear, and perhaps some other tech that doesn’t fit the requirements. But if you’re selling designer watches, then your user probably has a recent browser that can run most JS. The fallout from your non-working browsers just doesn’t matter quite so much.

    And always remember the mantra of the new developer: “make it work”. That’s the first goal when you’re starting out. You may not have the background to understand the reasoning behind HTML or why CSS at first seems mildly insane. Your goal as a new dev is to create something that works and is cool. It works on the three browsers you tested with your high speed internet connection, and you generally don’t look further. Who doesn’t have at least 5 megabits download anyway? :P

    Yes, we should develop with these sort of techniques. We don’t anymore because the web developer community is primarily new developers who have either no or very little formal training, don’t have the background knowledge to know better, don’t generally think about any case that they’ve not seen before, and frankly as human beings we’re all ultimately lazy. Doing things right is harder. Getting it working “right now” is enough for most people. And everybody is attracted to shiny new toys, no matter what previous “standards” they might go directly against.

    Just my 2 cents.

  8. § Chris:

    I do agree a website should not be JavaScript-dependent. A website should be completely browseable – even if degraded – and we should be able to navigate it even if we use curl, elinks or MSIE 6 (!)
    I agree with that.

    But a web app is a WHOLE different thing. What is called a “web app” could be defined as a JavaScript application that has a UI created with CSS and HTML.
    AngularJS, which you just mentioned, is for web apps, NOT for blogs or static sites – it would be a huge mistake to use it for such things.

    Ok now, I don’t think that creating a “metro only” app, or a “linux-gtk3 only” app is a bad thing.

    A “metro-only” app would not work on Windows 7. In the same way, I think it is acceptable to create a “modern-browser only, JavaScript-enabled only” app, even if it does not work on my MSIE 7 (or even on MSIE 8!)
    So I think it is acceptable if the “MegaNice” app shows a message like “please upgrade your browser to use this app.” A message like “Please use the latest Chrome or Firefox or MSIE to use this app” is OK.

    But, I agree that MegaNice’s WEBSITE should be browseable even on my win2000 machine!

  9. § Theo:

    Love to be old fashioned. I’ve seen many pop-up login forms rely on JavaScript – even if the JavaScript doesn’t break, the user will not find the form with JavaScript disabled. Thanks for the read!

  10. § Heydon:

    Whether self-hosted or from a CDN, javascript is
    Sometimes blocked or simply fails to load.
    Web authors like us have learnt to detect this and might
    Bookmark to try again later. To everyone else it’s just broken.

  11. § Peter:

    When did the industry stop caring? I think developers in individual companies (myself included) made supporting IE6 look so difficult that product managers just started saying, “look, don’t bother with those crappy old browsers”, and that conceit – that it’s difficult to support “long tail” users – took pressure off developers and product managers, and made testers’ lives much easier.

    We’ve seen an explosion of diversity in devices, and now we need to promote device-inclusive web development. Web developers shouldn’t be scared of committing to support diverse devices and users. We need to promote testing methodologies (both automated and manual) that ease the burden on developers and testers.

    My view as a web developer: for most content that we are presenting we can and should support pretty much everyone accessing our content. That doesn’t need to compromise the quality of the experience for the lucky people with shiny new retina MBPs or whatever; we can enhance for them. If JS breaks, everyone gets the base experience. No-one gets told f*** you.

    Personally, I love the idea of someone viewing my work successfully on a Kindle Fire, or a Nokia N95.

  12. § Francesco:

    I see a few people use AngularJS (and talk about AngularJS at conferences) without even addressing this point. They only say something about how Google may eventually be able to index pages rendered in JavaScript because, after all, AngularJS is by Google (more or less) – as if that were the (only) problem.

    When I read how it worked, I assumed it was just for web apps (or, as I like to call them, “interactive sites that would not work without JavaScript anyway because they don’t have real content – the content is the interaction”… but “web app” is shorter, okay), but what I am afraid of is that people reading sites about AngularJS or going to talks about AngularJS are not going to notice this fundamental difference, and will start making websites with it.

    And, I don’t know what most developers do, but 99% of the sites I make are still “websites”, with content. And even though web apps are extremely important, most sites will still be websites for a very long time, because we NEED content, not just interaction.

    I’m glad I finally found an article about this.

  13. § Wil:

    I completely agree that progressive enhancement is a great thing: sometimes you’ll be using a piece of JavaScript you didn’t write yourself, and that person might not have caught an exception, or maybe it clashes with another piece of JavaScript you didn’t write yourself.

    I think the argument for JS being ‘brittle’ is one thing, and ‘intolerant of faults’ is another.

    It’s just like any other language: if it encounters an uncaught exception, execution will stop, just as it would in PHP, Python, Ruby, etc. It’s exactly as intolerant of faults as those languages; the argument is that programmers aren’t catching exceptions, or aren’t dealing with them correctly. That’s not JS: it’s programmers.

    As for brittle, I’m not sure of the argument for this one. It’s true that there are several engines that will execute your JS in the wild (SpiderMonkey, V8, etc.) and they’ll support different versions of EcmaScript. If you want to support EcmaScript 5 which is pretty much all modern browsers, then write for that. Test your applications with unit and integration tests, including with those libraries you didn’t write yourself. Again it’s just like using PHP6 or Python 3, except that the execution or interpretation is done clientside, so you need to be wary of the limitations and method availabilities.

    Good JavaScript works perfectly well, and I think it’s unfair to say the language should only ever be used to enhance sites. There’s some amazing work being done by very talented people to provide frameworks and libraries that you can rely on in production.

  14. § kimblim:

    You’re not old-fashioned; you’re doing things the right way.

    I agree with Peter, that the industry sort of stopped caring, although I would probably phrase it in another way: we stopped developing for users and started developing for a billion different devices and screen sizes.

    I might be stuck in my ways, but when I meet developers that don’t use PE, I tend to look at them as amateurs, who have no idea what the web is really about.

  15. § Web Axe:

    Great article, thank you. Developers who really know the code and care about users implement websites with progressive enhancement.

  16. § Mike Griffin:

    I never use CDNs anyway; they can change on you at any time, and you CANNOT use them if you are using HTTPS. To me this is a non-issue: any professional site should have all its JavaScript files local. For sites like jsFiddle, sure, it’s nice, but on a corporate website, no way, no how…

  17. § Dan:

    I don’t disagree with you inasmuch as I think progressive enhancement is a good technique, and it would be great if more developers practiced it. That said, I think lamenting that the industry has “stopped caring”, and sort of implying that web development is becoming lower quality, is short-sighted.

    First, I agree with the point Chris makes – web pages and web apps are very different beasts. There are some very interesting, highly interactive UIs being made on the web today where experience and content are equal in importance. Needless to say, these UIs tend to rely quite heavily on JavaScript, and potentially only work on modern browsers. I believe that to attempt progressive enhancement in these cases would be misguided. It would be like trying to port a latest-generation console video game to the SNES: pointless, because the whole value of the game is the experience, which would be lost (or impossible) targeting a less powerful system. Yes, it means some users cannot appreciate these apps; and that is OK.

    As you say yourself, progressive enhancement is an old concept. When it was first gaining traction, the bell curve of the technology people were using looked different. To be sure, there are still plenty of long-tail users (as Peter identifies them); but that tail is getting smaller and smaller. Modern browsers are overtaking legacy ones, mobile devices are getting faster, etc. Firefox has even hidden the option to disable JavaScript.

    I’m not saying progressive enhancement is pointless, or that the web wouldn’t be a better place if it were more widely used. But the practical importance of progressive enhancement (of the “don’t require JavaScript” form) is shrinking. It’s the same reason many of us don’t stress out anymore about using under 1MB of memory, or using less than 10MB on the file system, or whatever. PCs increasingly have more RAM and more disk space. Doesn’t mean it wouldn’t be nice to support older systems with more modest specs. But the priority to do so goes down as those systems become rarer and rarer.

    It’s simply a trend that I think we will continue to see; and I don’t think it’s because developers today are worse, or more apathetic. It’s just that the world is changing.

  18. § Daniel Earwicker:

    Sky’s blocking of jQuery is clearly not a “teachable moment” for the cause of progressive enhancement. The idiocy of politically-motivated web filtering is blind to how we develop our sites.

    I’m sure Sky blocks a lot of “naughty” Wikipedia pages that use only semantic HTML+CSS. What moral do you draw from that? “And that kids, is why we use Flash for everything!”

    Not all apps are documents. Not all apps make sense if you try to represent them with the semantic markup elements built into HTML. If your site is document-like, go nuts with the semantic markup – you’d be crazy to use JS to do a job that HTML can already do (better). But otherwise, HTML is really not going to help you. Sometimes what you want is a 2D or 3D graphics space that you can totally control. You need Canvas or WebGL, and you need JavaScript. There is no meaningful sense in which such an app can “degrade gracefully” to HTML, unless that means a DIV containing the words “Enable JavaScript”.

    And even if your app’s features could in theory be developed so they work, to some extent, on a non-JS platform, whether you should practise progressive enhancement is an economic question, a trade-off. Striving for perfect progressive enhancement might result in the 98% of your users with JS enabled getting a worse experience, because you invested your limited dev. time making sure that the 2% who don’t have JS would see some limited version (that would still be crummy anyway). Of course, those figures aren’t necessarily true in all situations – that’s the point. I would suggest that real professionals know when and how to make such trade-offs, instead of pretending that we can always give the same answer to a question!

    Google’s applications started leading the way in this area nearly a decade ago. Google Maps is a test case. It was obviously possible to develop such a site in pure HTML, with all the logic in the backend, because such sites existed at the time. But all those earlier sites sucked, and could only ever suck, in comparison to Google’s leap forward.

    Google continued to maintain the “no JS” version for a while. But then they went through a period of forgetting to keep it running. A few years ago out of curiosity I tried Google Maps with JS disabled, and it went into an endless redirect loop! I just tried it again now, and it currently redirects to a broken version of their main search page. Clearly a browser without JS is not a serious concern for Google Maps. They can’t even be bothered to maintain a proper error page for that kind of client. They know it’s not worth it. No one really needs them to.

  19. § George Hamilton:

    People saying “who browses without JS enabled these days, anyway!?” fail to realise that NoScript is still a favorite add-on for Firefox. You don’t even have to go as far as ISPs blocking things: think of users not trusting the dozens of scripts from 7 different domains that an average web site embeds nowadays.

  20. § Dave Chapman:

    To a certain extent I agree with you, Daniel, about the whole apps versus documents argument.

    But I don’t think we’re talking about Google Maps or the latest WebGL game here – we’d all agree that it’s impossible to progressively enhance a WebGL game.

    I think we’re talking about apps (websites) like this one that don’t function without JavaScript.

  21. § Nancy C:

    I’ve asked the very same question in my own workplace. It pisses me off but that’s the company line so I have to run with it.

  22. § Lee Kowalkowski:

    Yeah, I used a bit-wise operator once, couldn’t get that JS file past many standard corporate firewalls. Rightly so; any web page that calculates hash codes client-side is obviously up to no good. HTTP PUT and DELETE are also often blocked by the same firewalls. A product is most likely to fail on the client, and for an unanticipated reason.

    Dependencies aren’t bad, they’re often necessary. The problem is not thinking about (and testing!) the user experience when dependencies fail (and they do).

    It’s no surprise that users who can’t use your product don’t, but also users who couldn’t use your product in the past probably won’t in the future.

    Replace the word users with customers; if they found a competitor’s product that worked, then replace the word customer with ex-customer – and one that gossips about how they couldn’t use your product.

    Any opportunity to tell the user (truthfully) why your product isn’t working might make you look better than just leaving your user with a broken product – or at least make you look like you know what you’re doing.



About Drew McLellan


Drew McLellan (@drewm) has been hacking on the web since around 1996 following an unfortunate incident with a margarine tub. Since then he’s spread himself between both front- and back-end development projects, and is now Director and Senior Web Developer at in Maidenhead, UK (GEO: 51.5217, -0.7177). Prior to this, Drew was a Web Developer for Yahoo!, and before that primarily worked as a technical lead within design and branding agencies for clients such as Nissan, Goodyear Dunlop, Siemens/Bosch, Cadburys and ICI Dulux. Somewhere along the way, Drew managed to get himself embroiled with Dreamweaver and was made an early Macromedia Evangelist for that product. This led to book deals, public appearances, fame, glory, and his eventual downfall.

Picking himself up again, Drew is now a strong advocate for best practice, and stood as Group Lead for The Web Standards Project from 2006–08. He has had articles published by A List Apart, Adobe, and O’Reilly Media’s, mostly due to mistaken identity. Drew is a proponent of the lower-case semantic web, and is currently expending energies in the direction of the microformats movement, with particular interests in making parsers an off-the-shelf commodity and developing simple UI conventions. He writes here at all in the head and, with a little help from his friends, at 24 ways.