All in the <head>

– Ponderings & code by Drew McLellan –

– Live from The Internets since 2003 –


Crashing Out

350 days ago

At the start line

Then it all went black. Had you asked me if I’d lost consciousness, I would have assured you I had not, but neither could I account for the time between being on my bike and where I was now; on the ground.

It had started months before. Rachel had had a ballot place for the inaugural 2013 RideLondon 100 mile cycling event, and had deferred due to injury. This had meant she would be riding in the 2014 event instead, and so I also entered the ballot hoping that we could ride it together.

In March the magazine arrived confirming that I’d been successful in securing a ballot place for what was now the 2014 RideLondon-Surrey 100. One hundred miles. I’d not even ridden 50 at that point, and a quick calculation showed that my current training speed was too slow to complete the event within the time limit. If I was going to be able to finish, some serious work was needed.

I dialled in a weekly distance goal on Strava, and set about making sure I kept exceeding it. I pushed my times, and all the cycling along with the running I’d been doing meant that I was dropping weight and picking up speed with it. I entered myself in a series of sportives throughout the summer, gradually building up the distance.

In June I’d lined up a 70 mile sportive in Stratford. There had been severe weather warnings the night before, but the first 20 miles or so were in relatively fine weather. When the storm hit, it hit us fairly hard, and the course began to thin out as those on the 70 and 94 mile routes peeled off on the shorter 47 mile route back to base. I pushed on, learning that when you cycle in the rain your shoes fill with water.

Training went well, and by August I knew I was in shape to complete the 100 miles, and, provided I made good time on the flat sections, I should have plenty of time in the bag for the climbs.

The big day

The RideLondon course starts in the Olympic Park, wends its way out through South London and heads towards the Surrey hills. It climbs the short but sharp Leith Hill, then the famous Box Hill before heading back into London to finish in front of Buckingham Palace on The Mall.

I was keeping a keen eye on the weather in the week before the ride. We’d had a glorious summer with very little rain, but it looked like that was now coming to an end. The ride was going to be wet. As the weekend got closer, it was clear that it was going to be more than just wet. I wasn’t fazed when the Met Office issued a severe weather warning – I had trained for this! Storms? Child’s play! I AM READY.

The ‘line’ in ‘start line’ turned out to be an Americanism. This was a start queue. As we shuffled towards the timing gantry, an announcer informed us that due to the bad weather the two big climbs (and therefore the two dangerous descents) had been cut from the course. I’d trained all year for 100 miles, and I was about to ride 86. I was massively disappointed, but also relieved. The pressure was off. The time limit was now trivial, and I could actually let go and enjoy the massive event through beautiful surroundings on traffic-free roads. RideLondon 86 was going to be fun! Wet, but fun!

With that, we went over the timing mats and we were off. We’d had some light rain while queuing to start, but that had stopped and the ride out of London was relatively dry. The miles passed quickly, and I’d flown through the 10 mile marker before I felt like we’d even got going.

As we hit Richmond Park, everything came to a sudden halt. An accident up ahead had blocked the route, and there was nothing to do but hop off our bikes and slowly queue. You don’t get as cold as you’d think when exercising in bad weather. If you’re working hard and moving well, you stay warm and the weather isn’t that much of a big deal. It’s when you stop that things get dangerous and miserable, as wet clothes conduct heat much faster than dry ones, and you can get cold quickly. That’s what happened as we stood in Richmond Park.

The heavens opened and it rained about as much rain as I’ve ever seen or could even imagine. The storm had blown in from the Caribbean, but hadn’t been so courteous as to bring the heat with it. We got wet to the bone, so much so that the common joke as we stood was that at least we’d hit a point where we could get no wetter.

Eventually the blockage ahead cleared, and we were off again. There were plenty of flooded roads to wade through, but by this point we were all so wet already that they just seemed like fun. Riding through floods! What a jolly good wheeze. The 20, 30 and 40 mile markers went by without incident, and due to the shortened route, the 50s were missed and the markers were soon reading into the 60s. I was wet, but feeling good.

And then it all went wrong

At around 70 miles I noted that I was feeling hungry. That’s not usually a good sign: if you’re feeling hungry or thirsty, it means you’ve not been keeping up with your fuelling. Rather than wait for the next water stop, I pulled off at the side of a quiet stretch of road and ate half a Clif bar. I only had about 16 miles to go, but I didn’t want to arrive at the finish exhausted.

What happened after this point is patchy in my mind. I felt alert and comfortable, and was happy that I was back on track with my food. A short moment later, on a wide, unbusy section of road, I suddenly became aware of another cyclist undertaking on the inside. What was more, we were rapidly getting closer. My front wheel was dangerously close to touching his back wheel side-on. I knew this was bad. I knew that touching wheels like this was always worse for the rider behind. I knew I was the rider behind. I tried to brake and avoid and there was a clattering of spokes and then it all went black.

I was sitting at the edge of the road. My vision was slowly clearing, I felt a bit battered and my knee was sore. A man tried to straighten my legs, and I had to shout at him to stop. My vision was clouding, then clearing. My bike was in the road. A man brought it to the edge. Someone tried to remove my helmet – I didn’t want that and I had to shout again. They wanted me to stand, to move, to do anything other than what I needed to do, which was to sit on the curb and calm myself down. I still had a ride to finish.

A course marshal wanted to call an ambulance. An ambulance! I’d fallen off my bike and scraped my knee, and they wanted to send me to hospital. They were just covering themselves by being over cautious. I said I was fine, and made the concession that they could put a dressing on my knee. That would stop it bleeding while I rode to the finish, I thought. And then I saw my bike and knew I wouldn’t be finishing.

My vision started to cloud again, and I asked to be sat down. They lowered me to the ground and gave me some water. They were calling the ambulance. How old am I? Where are we? It’s on its way. I was supposed to be finishing this in St James’s, Mayfair. Instead I was to end up in St George’s, Tooting, Accident and Emergency.

The aftermath

I had dislocated the acromioclavicular (or AC) joint in my right shoulder, had grazed my left knee down to the kneecap, and was generally suffering from the kind of bruising and grazing that occurs when one leaves one’s bicycle at 28mph and promptly finds the ground. That I’d collided to the left, yet landed on my right shoulder indicates both that at some point in the proceedings I was spectacularly airborne, and that my dismount could use some work.

An AC joint dislocation isn’t the sort of shoulder dislocation where they pop something back in its socket and send you on your way. It’s more the sort where your collarbone used to be attached to your shoulder and now it’s poking out at a strange angle. It’s a common sports injury, and thankfully, recovery tends to be straightforward and doesn’t always require surgery.

I had an X-ray, and got to see what looked like an artist’s impression of me as a skeleton. I’d never been a skeleton before, and now here I was, a bad one with bits in the wrong places.

Kind friends brought me warm, dry clothes, coffee and support. I had been due to drive myself back home to Bristol that evening, so my parents drove in from Exeter to rescue me and my car. If I didn’t know how much of an idiot I was for crashing my bike by then, needing good old Mum and Dad to come and save me brought it (and me) home.

The following day, I noticed a very slight tenderness on my left temple, and then on my right. The contact points from my helmet. While appearing to be only superficially scuffed on the outside, the internal structure of my helmet was deeply shattered. It must have taken quite an impact – probably a similar force as was enough to dislocate my shoulder. Yet I hadn’t even noticed. My head was fine. Wear a cycle helmet.

Other than the prognosis for my shoulder, the remaining unknown is the state of my bike. RideLondon have it in their secure storage and are (wonderfully, graciously) shipping it back to me free of charge. That’s going to be some time in the next week. I remember that many of the front spokes were broken, the gear shifters were bent out of place (not usually serious) and that the chain was off. I have no idea as to the state of the frame or fork or mechs or anything else. I guess we’ll find out.

I am an idiot, and I still haven’t ridden 100 miles.

- Drew McLellan

Why is Progressive Enhancement so unpopular?

552 days ago

A little earlier today, having read how Sky broadband had blocked the jQuery CDN, I tweeted

To which many responded this is why we don’t rely on CDNs and how you can (shock, horror) even host your own JavaScript fallback and how you make a hole at each end of the shell and suck with a straw. In order to clarify the problem, I followed up with
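
For reference, the self-hosted fallback those replies pointed to is usually a pattern along these lines – a sketch only, with hypothetical paths and version numbers:

```html
<!-- Try the CDN first -->
<script src="https://code.jquery.com/jquery-1.11.1.min.js"></script>
<script>
  // If the CDN script failed to load, window.jQuery won't exist,
  // so fall back to a copy hosted on our own server
  window.jQuery || document.write(
    '<script src="/js/jquery-1.11.1.min.js"><\/script>'
  );
</script>
```

Which is all well and good, but as the tweets that follow argue, it rather misses the point.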

The internet, as a network, is designed to be tolerant of faults. If parts of the network fail, the damage gets routed around and things keep working. HTML is designed to be tolerant of faults. If a document has unrecognised tags, or only partially downloads or is structured weirdly, the browser will do its best to keep displaying as much of that page as it can without throwing errors. CSS is designed to be tolerant of faults. If a selector doesn’t match or a property is not supported, or a value is unrecognised, the browser steps over the damage and keeps going.

JavaScript is brittle and intolerant of faults. If a dependency is missing, it stops. If it hits unrecognised syntax, it stops. If the code throws an error, in some cases it stops there too. If part of the script is missing, it likely won’t even start. As careful as we are to code defensively within our JavaScript, it counts for nothing if the code doesn’t run.

Does that mean we shouldn’t use JavaScript? Of course not. Scripting in the browser is an important part of the experience of using the web in 2014. It’s my opinion that you shouldn’t depend on JavaScript running for your site to work. Build with HTML, add styling with CSS, add behaviour with JavaScript. If the JavaScript fails, the HTML should still work.
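
As a sketch of that layering (the markup and URLs here are made up for illustration): the form works with plain HTML alone, and the script, if it runs, enhances it.

```html
<!-- Works with no JavaScript at all: submitting sends the query to the server -->
<form action="/search" method="get">
  <label for="q">Search</label>
  <input type="search" id="q" name="q">
  <button type="submit">Search</button>
</form>

<script>
  // Enhancement: intercept the submit and fetch results in-page instead.
  // If this script never runs, the form above still works as normal.
  document.querySelector('form').addEventListener('submit', function (e) {
    e.preventDefault();
    // ...fetch the results and render them without a full page load...
  });
</script>
```

If the script fails to download, fails to parse, or throws, the user gets the baseline experience rather than nothing.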


This isn’t a new concept, it’s a very old one. What is new, however, is the backlash against this very simple idea by people who at the same time consider themselves to be professional web developers.

It used to be that progressive enhancement was the accepted ‘best practise’ (ugh) way to do things. If you’re building a site today you’d generally make it responsive. Any new site that isn’t responsive when it could be is considered a bit old-hat and a missed opportunity. So it used to be with progressive enhancement. If you built a site that depended on JavaScript, chances are you were a cowboy and didn’t really know what you were doing – a skilled developer wouldn’t do it that way, because they know JavaScript can break.

Somewhere along the line that all got lost. I’m not sure where – it was still alive and well when jQuery launched with its find something, do something approach (that’s progressive enhancement). It was lost by the time AngularJS was ever considered an approach of any merit whatsoever.
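
That find something, do something style is progressive enhancement in miniature – the class names and markup in this hypothetical example are invented, but the shape is the point. Without JavaScript the link still works as a plain link; with it, the behaviour is upgraded in place.

```html
<!-- Without JavaScript, this is just a link to the comments -->
<a class="comments-toggle" href="#comments">Comments</a>

<script>
  // jQuery's "find something, do something": find the toggle link,
  // then upgrade it to show and hide the comments without leaving the page
  $('.comments-toggle').on('click', function (e) {
    e.preventDefault();
    $('#comments').toggle();
  });
</script>
```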

When did the industry stop caring about this stuff, and why? We spend hours in test labs working on the best user experience we can deliver, and then don’t care if we deliver nothing. Is it because we half expect what we’re building will never launch anyway, or will be replaced in 6 months?

Perhaps I’m old fashioned and I should stop worrying about this stuff. Is it ok to rely on JavaScript, and to hell if it breaks? Perhaps so.

- Drew McLellan

Rebuilding 24 ways

609 days ago

As those with long memories may recall, I first launched 24 ways in December 2005 as a fairly last-minute idea for sharing a quick tip or idea every day in advent. I emailed some friends to ask for contributions, and was overwhelmed by the response. Instead of the tips I’d had in mind, what I got back was full-blown articles prepared with depth and care.

I designed (and I use the word lightly) the site myself, got it up and running using blog software, and off we went on a twenty-four day roller coaster.

The site was such a success that we repeated the process in 2006. When recruiting authors for our third year in 2007, Tim Van Damme asked me to do something about the terrible design. I pretty much said “well, go on then!” and that year we launched with an all-new look. Tim did an amazing job with a design that was well ahead of its time, both visually and technically. It’s hard to remember now, but the heavy use of RGBA colour meant that the design only worked in a few browsers (notably not IE or Opera) and performance was bad in those that could render it.

But that was very much the point. I think the fact that the design ran for six entire seasons (2007-2012) is testament to how forward-looking it was. It took a couple of years for the browsers to catch up with it.

In 2011, I retrofitted the design with a few media queries to help it respond on modern devices, but by the end of our 2012 season, the design was beginning to show its age. Simple practicalities like not having enough space left for any more archived year tabs, plus a structure designed for discovering three years of articles rather than eight, meant it was time to think about a redesign.

2013 Redesign

As my early attempts attest, I have very little skill in that area, and so if I wanted a new design I was going to have to find someone much better than I am to work with. So, where does one start in finding a designer?

I’m in the fortunate position of knowing lots of really great web designers – many of whom have been authors for 24 ways over the years. I figured I’d start with my top-choice dream person, and work down the list until I found someone who’d be prepared to do it.

So I started by asking Paul Robert Lloyd, and he said yes.

Knowing that a redesign would take some time and needed to be fit around everyone’s work and life commitments, we started discussing the project early in the year. By June we started to panic that time was shifting on, and now as I write, about an hour before we launch the new site, Paul’s still working away on the finishing touches.

In 2012 I rebuilt the site in Perch for the old design, and this month I’ve updated that implementation to add the new features and requirements the new design introduced.

The details of the design itself are probably best left to Paul to discuss (and I hope he does), but for now, I’ll just let you soak in it like I have been doing for the last few weeks.

So here it is, 24 ways 2013.

- Drew McLellan



About Drew McLellan

Drew McLellan (@drewm) has been hacking on the web since around 1996 following an unfortunate incident with a margarine tub. Since then he’s spread himself between both front- and back-end development projects, and is now Director and Senior Web Developer at a company in Maidenhead, UK. Prior to this, Drew was a Web Developer for Yahoo!, and before that primarily worked as a technical lead within design and branding agencies for clients such as Nissan, Goodyear Dunlop, Siemens/Bosch, Cadburys and ICI Dulux. Somewhere along the way, Drew managed to get himself embroiled with Dreamweaver and was made an early Macromedia Evangelist for that product. This led to book deals, public appearances, fame, glory, and his eventual downfall.

Picking himself up again, Drew is now a strong advocate for best practices, and served as Group Lead for The Web Standards Project from 2006–08. He has had articles published by A List Apart, Adobe, and O’Reilly Media, mostly due to mistaken identity. Drew is a proponent of the lower-case semantic web, and is currently expending energies in the direction of the microformats movement, with particular interests in making parsers an off-the-shelf commodity and developing simple UI conventions. He writes here at all in the head and, with a little help from his friends, at 24 ways.