All in the <head>

– Ponderings & code by Drew McLellan –

– Live from The Internets since 2003 –


Why is Progressive Enhancement so unpopular?

187 days ago

A little earlier today, having read how Sky broadband had blocked the jQuery CDN, I tweeted

To which many responded this is why we don’t rely on CDNs and how you can (shock, horror) even host your own JavaScript fallback and how you make a hole at each end of the shell and suck with a straw. In order to clarify the problem, I followed up with

The internet, as a network, is designed to be tolerant of faults. If parts of the network fail, the damage gets routed around and things keep working. HTML is designed to be tolerant of faults. If a document has unrecognised tags, or only partially downloads or is structured weirdly, the browser will do its best to keep displaying as much of that page as it can without throwing errors. CSS is designed to be tolerant of faults. If a selector doesn’t match or a property is not supported, or a value is unrecognised, the browser steps over the damage and keeps going.

JavaScript is brittle and intolerant of faults. If a dependency is missing, it stops. If it hits unrecognised syntax, it stops. If the code throws an error, in some cases it stops there too. If part of the script is missing, it likely won’t even start. As careful as we are to code defensively within our JavaScript, it counts for nothing if the code doesn’t run.

Does that mean we shouldn’t use JavaScript? Of course not. Scripting in the browser is an important part of the experience of using the web in 2014. It’s my opinion that you shouldn’t depend on JavaScript running for your site to work. Build with HTML, add styling with CSS, add behaviour with JavaScript. If the JavaScript fails, the HTML should still work.
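That layering can be sketched in plain JavaScript (the function and hook names here are my own illustration, not from the post): the HTML works on its own, and the script only takes over once it has confirmed every dependency it needs.

```javascript
// A minimal progressive-enhancement sketch (illustrative names, not
// from the post). The plain HTML link or form keeps working if this
// script never runs, only partially downloads, or throws.
function enhance(doc) {
  // Feature-check before use: a missing API is a missing dependency.
  var form = doc.querySelector ? doc.querySelector('#search-form') : null;
  if (!form) return false; // nothing to enhance - degrade silently
  form.addEventListener('submit', function (e) {
    e.preventDefault(); // take over only once we know we safely can
    // ...fetch results with script and update the page in place...
  });
  return true; // enhancement applied
}
```

The point is the order of failure: if `enhance` never runs, the user still gets the working HTML underneath.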


This isn’t a new concept, it’s a very old one. What is new, however, is the backlash against this very simple idea by people who at the same time consider themselves to be professional web developers.

It used to be that progressive enhancement was the accepted ‘best practice’ (ugh) way to do things. If you’re building a site today you’d generally make it responsive. Any new site that isn’t responsive when it could be is considered a bit old-hat and a missed opportunity. So it used to be with progressive enhancement. If you built a site that depended on JavaScript, chances are you were a cowboy and didn’t really know what you were doing – a skilled developer wouldn’t do it that way, because they know JavaScript can break.

Somewhere along the line that all got lost. I’m not sure where – it was still alive and well when jQuery launched with its find something, do something approach (that’s progressive enhancement). It was lost by the time AngularJS was ever considered an approach of any merit whatsoever.
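That find something, do something shape can be sketched in plain JavaScript (an illustration of the pattern, not jQuery’s actual implementation): when the find step matches nothing, the do step simply runs zero times, so the enhancement fails soft instead of throwing.

```javascript
// The "find something, do something" pattern, sketched without jQuery.
// If the selector matches no items, doSomething is a harmless no-op -
// the enhancement degrades instead of stopping the script with an error.
function findSomething(items, selector) {
  var matched = items.filter(function (item) {
    return item.type === selector;
  });
  return {
    doSomething: function (fn) {
      matched.forEach(fn);
      return matched.length; // how many things were enhanced
    }
  };
}
```

Matching nothing is a normal, silent outcome here, which is exactly what makes the pattern a form of progressive enhancement.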

When did the industry stop caring about this stuff, and why? We spend hours in test labs working on the best user experience we can deliver, and then don’t care if we deliver nothing. Is it because we half expect what we’re building will never launch anyway, or will be replaced in 6 months?

Perhaps I’m old fashioned and I should stop worrying about this stuff. Is it ok to rely on JavaScript, and to hell if it breaks? Perhaps so.

- Drew McLellan

Rebuilding 24 ways

244 days ago

As those with long memories may recall, I first launched 24 ways in December 2005 as a fairly last-minute idea for sharing a quick tip or idea every day in advent. I emailed some friends to ask for contributions, and was overwhelmed by the response. Instead of the tips I’d had in mind, what I got back was full-blown articles prepared with depth and care.

I designed (and I use the word lightly) the site myself, got it up and running using blog software, and off we went on a twenty-four day roller coaster.

The site was such a success that we repeated the process in 2006. When recruiting authors for our third year in 2007, Tim Van Damme asked me to do something about the terrible design. I pretty much said “well, go on then!” and that year we launched with an all-new look. Tim did an amazing job with a design that was well ahead of its time, both visually and technically. It’s hard to remember now, but the heavy use of RGBA colour meant that the design only worked in a few browsers (notably not IE or Opera) and performance was bad in those that could render it.

But that was very much the point. I think the fact that the design ran for six entire seasons (2007-2012) is testament to how forward-looking it was. It took a couple of years for the browsers to catch up with it.

In 2011, I retrofitted the design with a few media queries to help it respond on modern devices, but by the end of our 2012 season, the design was beginning to show its age. Simple practicalities like not having enough space left for any more archived year tabs, plus a structure designed for discovering three years of articles rather than eight, meant it was time to think about a redesign.

2013 Redesign

As my early attempts attest, I have very little skill in that area, and so if I wanted a new design I was going to have to find someone much better than I am to work with. So, where does one start in finding a designer?

I’m in the fortunate position of knowing lots of really great web designers – many of whom have been authors for 24 ways over the years. I figured I’d start with my top-choice dream person, and work down the list until I found someone who’d be prepared to do it.

So I started by asking Paul Robert Lloyd, and he said yes.

Knowing that a redesign would take some time and needed to be fit around everyone’s work and life commitments, we started discussing the project early in the year. By June we started to panic that time was getting on, and now as I write, about an hour before we launch the new site, Paul’s still working away on the finishing touches.

In 2012 I rebuilt the site in Perch for the old design, and this month I’ve updated that implementation to support the new features and requirements of the new design.

The details of the design itself are probably best left to Paul to discuss (and I hope he does), but for now, I’ll just let you soak in it like I have been doing for the last few weeks.

So here it is, 24 ways 2013.

- Drew McLellan

Ideas of March

505 days ago

In between the first and the second time I re-pledged my commitment to the medium of blogging, I posted just three times. This year, it’s four times, which represents a strong upward trend. Let’s say it represents a strong upward trend.

Last year, I wrote about the permanence of ideas, and the trend towards short-form fire-and-forget tweets serving as the only written expression of important thoughts and ideas. How 140 characters can so vastly over-distill an expression that perhaps all that is left is a bitter syrupy remnant of an otherwise complex and nuanced thought. Worse still, the distillation never occurs, the idea overflows and escapes leaving nothing but a curious smell and a slight unease around naked flames.

This year, my thoughts are turned to something much more fundamental. Chris writes about the shutdown of Google Reader and with it, the importance of not only capturing and expressing your thoughts and ideas, but continuing to own the means by which they are published. Ever since the halcyon days of Web 2.0, we’ve been netting our butterflies and pinning them to someone else’s board. The more time that passes, the more we contribute and the more we become invested in platforms that are becoming less and less relevant to current market conditions and trends.

Will it end well? It will not.

If content is important to you, keep it close. If your content is important to others, keep it close and well backed up. Hope that what you’ve created never has to die. Make sure that if something has to die, it’s you that makes that decision. Own your own data, friends, and keep it safe.

Well, this has been weird.

- Drew McLellan




About Drew McLellan


Drew McLellan (@drewm) has been hacking on the web since around 1996 following an unfortunate incident with a margarine tub. Since then he’s spread himself between both front- and back-end development projects, and now is Director and Senior Web Developer at in Maidenhead, UK (GEO: 51.5217, -0.7177). Prior to this, Drew was a Web Developer for Yahoo!, and before that primarily worked as a technical lead within design and branding agencies for clients such as Nissan, Goodyear Dunlop, Siemens/Bosch, Cadburys and ICI Dulux. Somewhere along the way, Drew managed to get himself embroiled with Dreamweaver and was made an early Macromedia Evangelist for that product. This led to book deals, public appearances, fame, glory, and his eventual downfall.

Picking himself up again, Drew is now a strong advocate for best practices, and stood as Group Lead for The Web Standards Project 2006-08. He has had articles published by A List Apart, Adobe, and O’Reilly Media, mostly due to mistaken identity. Drew is a proponent of the lower-case semantic web, and is currently expending energies in the direction of the microformats movement, with particular interests in making parsers an off-the-shelf commodity and developing simple UI conventions. He writes here at all in the head and, with a little help from his friends, at 24 ways.