At this year’s beyond tellerrand conference in Düsseldorf, web developer Jeremy Keith gave a talk on resilience in web applications:
Resilience – Jeremy Keith – btconfDUS 2016 from beyond tellerrand on Vimeo.
The World Wide Web – and the Internet underneath it – was designed from its inception as a resilient, fault-tolerant medium. This applies not just in a technical sense but in a social and even political one, too. As John Gilmore famously put it: “The Net interprets censorship as damage and routes around it.”
The technology stack of the Internet – and, by extension, the web – is composed of layers that get more abstract and more application-oriented the further up the stack you move (the following is of course overly simplistic):
- JavaScript
- —
- CSS
- HTML
- HTTP
- TCP/IP
Everything below the — is pretty resilient and, for the most part, fault-tolerant: those technologies gloss over things like parse errors or network transport glitches and display everything they can to the user. HTML and CSS in particular have fallback mechanisms that allow for backward and forward compatibility: a reasonably well-designed website from 2016 should display its essential content even in the oldest web browsers!
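To illustrate those fallback mechanisms (a hypothetical snippet, not from the talk – the file name and class names are made up): browsers simply ignore CSS declarations they don’t understand, and they render the fallback content inside elements they don’t support, so both layers degrade gracefully:

```html
<!-- A browser that doesn't understand gradients skips the second
     declaration and keeps the plain colour from the first one. -->
<style>
  .hero {
    background: #336699;                           /* fallback colour */
    background: linear-gradient(#336699, #224466); /* newer browsers  */
  }
</style>

<!-- A browser without <video> support ignores the tag it doesn't know
     and renders the fallback markup inside it instead. -->
<video src="talk.mp4" controls>
  <a href="talk.mp4">Download the talk (MP4)</a>
</video>
```

In both cases nothing breaks for older browsers – they just get a simpler version of the same content.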
Therein lies the rub, though. JavaScript, by design, isn’t as fault-tolerant and resilient as the other layers in the web’s technology stack – and that’s a good thing, because you want your software to have predictable behaviour and output. However, as a designer and/or developer you should always attempt to provide plain HTML/CSS fallbacks for your application, so that if the user can’t or doesn’t use the latest and greatest version of ECMAScript for one reason or another, she’ll at least be able to see the basic content of your application or website. That’s why progressive enhancement is such an awesome concept.
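As a minimal sketch of that idea (the helper and the selector are hypothetical, not from the talk): only attach the JavaScript enhancement when the APIs it needs actually exist, so browsers that lack them silently keep the plain HTML behaviour:

```javascript
// Minimal progressive-enhancement sketch (hypothetical helper):
// report whether the environment offers the APIs the enhancement needs.
function canEnhance(win, doc) {
  return Boolean(
    win && typeof win.addEventListener === 'function' &&
    doc && typeof doc.querySelector === 'function'
  );
}

// In a real page you'd guard the enhancement with it; when the check
// fails, the plain HTML form keeps working without any script:
//
//   if (canEnhance(window, document)) {
//     document.querySelector('form.search')
//       .addEventListener('submit', showInlineResults);
//   }
```

The key point is that the enhancement is additive: the page never *depends* on the script running.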
Anyway, have a look at the video above as Jeremy explains this in far more detail (and in a quite funny manner …).