
Why is Progressive Enhancement so unpopular?

A little earlier today, having read how Sky broadband had blocked the jQuery CDN, I tweeted

To which many responded that this is why we don’t rely on CDNs, and how you can (shock, horror) even host your own JavaScript fallback, and how you make a hole at each end of the shell and suck with a straw. To clarify the problem, I followed up with
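
For the record, the self-hosted fallback those replies describe is a well-worn pattern – a minimal sketch, with a made-up local path:

    <script src="https://code.jquery.com/jquery-1.11.1.min.js"></script>
    <script>
      // If the CDN copy was blocked or failed to load, write in a
      // self-hosted copy instead.
      window.jQuery || document.write('<script src="/js/jquery-1.11.1.min.js"><\/script>');
    </script>

It works, but it rather misses the point.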

The internet, as a network, is designed to be tolerant of faults. If parts of the network fail, the damage gets routed around and things keep working. HTML is designed to be tolerant of faults. If a document has unrecognised tags, or only partially downloads or is structured weirdly, the browser will do its best to keep displaying as much of that page as it can without throwing errors. CSS is designed to be tolerant of faults. If a selector doesn’t match or a property is not supported, or a value is unrecognised, the browser steps over the damage and keeps going.
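
To make that concrete, here’s a tiny sketch (the selector and colours are invented): a browser that doesn’t recognise rgba() skips that one declaration, keeps the one it does understand, and carries on rendering.

    .panel {
      /* Every browser understands the plain colour... */
      background-color: #333;
      /* ...and a browser that doesn't recognise rgba() silently skips
         this declaration and keeps the value above. Nothing halts. */
      background-color: rgba(0, 0, 0, 0.8);
    }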

JavaScript is brittle and intolerant of faults. If a dependency is missing, it stops. If it hits unrecognised syntax, it stops. If the code throws an error, in some cases it stops there too. If part of the script is missing, it likely won’t even start. As careful as we are to code defensively within our JavaScript, it counts for nothing if the code doesn’t run.
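
A minimal sketch of that brittleness, with a made-up library name. If somelib.js was blocked or only half downloaded, the first line throws and nothing after it runs:

    // somelib.js never arrived, so this line throws a ReferenceError...
    somelib.init();

    // ...and this line, however defensively written, never executes.
    console.log('enhancements applied');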

Does that mean we shouldn’t use JavaScript? Of course not. Scripting in the browser is an important part of the experience of using the web in 2014. It’s my opinion that you shouldn’t depend on JavaScript running for your site to work. Build with HTML, add styling with CSS, add behaviour with JavaScript. If the JavaScript fails, the HTML should still work.
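
That layering looks something like this – a sketch with a hypothetical search form and a hypothetical fetchResultsInPlace() function. Without JavaScript, the form still submits and the user still gets their results:

    <!-- Works as a normal page load if no JavaScript runs at all. -->
    <form action="/search" method="get" class="js-instant-search">
      <input type="search" name="q">
      <button type="submit">Search</button>
    </form>

    <script>
      // Enhancement only: intercept the submit and fetch results in
      // place. If this script never runs, the form above still works.
      var form = document.querySelector('.js-instant-search');
      if (form) {
        form.addEventListener('submit', function (event) {
          event.preventDefault();
          fetchResultsInPlace(form); // hypothetical function
        });
      }
    </script>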

Unpopular

This isn’t a new concept; it’s a very old one. What is new, however, is the backlash against this very simple idea by people who at the same time consider themselves to be professional web developers.

It used to be that progressive enhancement was the accepted ‘best practice’ (ugh) way to do things. If you’re building a site today, you’d generally make it responsive. Any new site that isn’t responsive when it could be is considered a bit old-hat and a missed opportunity. So it used to be with progressive enhancement. If you built a site that depended on JavaScript, chances are you were a cowboy and didn’t really know what you were doing – a skilled developer wouldn’t do it that way, because they know JavaScript can break.

Somewhere along the line that all got lost. I’m not sure where – it was still alive and well when jQuery launched with its ‘find something, do something’ approach (that’s progressive enhancement). It had been lost by the time AngularJS was considered an approach of any merit whatsoever.
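
To spell that out – the selector and form here are invented – the find something, do something shape is an enhancement by its very nature. The markup exists and works first; the script merely upgrades it:

    // Find something (a form that already works without any script)...
    $('.comment-form').on('submit', function (event) {
      // ...do something (upgrade the full page reload to an in-page post).
      event.preventDefault();
      $.post($(this).attr('action'), $(this).serialize());
    });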

When did the industry stop caring about this stuff, and why? We spend hours in test labs working on the best user experience we can deliver, and then don’t care if we deliver nothing. Is it because we half expect what we’re building will never launch anyway, or will be replaced in 6 months?

Perhaps I’m old fashioned and I should stop worrying about this stuff. Is it OK to rely on JavaScript, and to hell with it if it breaks? Perhaps so.