Journal tags: rel

Labels

I love libraries. I think they’re one of humanity’s greatest inventions.

My local library here in Brighton is terrific. It’s well-stocked, it’s got a welcoming atmosphere, and it’s in a great location.

But it has an information architecture problem.

Like most libraries, it’s using the Dewey Decimal system. It’s not a great system, but every classification system is going to have flaws—wherever you draw boundaries, there will be disagreement.

The Dewey Decimal class of 900 is for history and geography. Within that class, those 100 numbers (900 to 999) are further subdivided into groups of 10. For example, everything from 940 to 949 is for the history of Europe.

Dewey Decimal number 941 is for the history of the British Isles. The term “British Isles” is a geographical designation. It’s not a good geographical designation, but technically it’s not a political term. So it’s actually pretty smart to use a geographical rather than a political term for categorisation: geology moves a lot slower than politics.

But the Brighton Library is using the wrong label for their shelves. Everything under 941 is labelled “British History.”

The island of Ireland is part of the British Isles.

The Republic of Ireland is most definitely not part of Britain.

Seeing books about the history of Ireland, including post-colonial history, on a shelf labelled “British History” is …not good. Frankly, it’s offensive.

(I mentioned this situation to an English friend of mine, who said “Well, Ireland was once part of the British Empire”, to which I responded that all the books in the library about India should also be filed under “British History” by that logic.)

Just to be clear, I’m not saying there’s a problem with the library using the Dewey Decimal system. I’m saying they’re technically not using the system. They’ve deviated from the system’s labels by treating “History of the British Isles” and “British History” as synonymous.

I spoke to the library manager. They told me to write an email. I’ve written an email. We’ll see what happens.

You might think I’m being overly pedantic. That’s fair. But the fact this is happening in a library in England adds to the problem. It’s not just technically incorrect, it’s culturally clueless.

Mind you, I have noticed that quite a few English people have a somewhat fuzzy idea about the Republic of Ireland. Like, they understand it’s a different country, but they think it’s a different country in the way that Scotland is a different country, or Wales is a different country. They don’t seem to grasp that Ireland is a different country like France is a different country or Germany is a different country.

It would be charming if not for, y’know, those centuries of subjugation, exploitation, and forced starvation.

British history.

Update: They fixed it!

Indie webbing

The past weekend’s Indie Web Camp Brighton was wonderful! Many thanks to Mark and Paul for all their work putting it together.

There was a great turn-out. It felt like the perfect time for an Indie Web Camp. There’s a real appetite for getting away from ever more extractive silos and staking claim to our own corners of the web. Most of the attendees were at their first ever Indie Web Camp.

Paul asked me to oversee the schedule planning on day one, which I was happy to do. We made sure that first-timers got first dibs on proposing sessions. In the end, every single session was proposed by new attendees.

Day two was all about putting ideas into practice: coding, designing, and writing on our own websites. I’m always blown away by how much gets done in just one short day. Best of all is when there’s someone who starts the weekend without their own website but finishes with a live site. That happened again this time.

I spent the second day tinkering with something I started at Indie Web Camp Nuremberg in October. Back then, I got related posts working here on my journal: a list of suggested follow-up posts to read based on the tags of the current post.

I wanted to do the same for my links; show links related to the one I’m currently linking to. It didn’t take too long to get that up and running.

But then I thought about it some more and realised it would be good to also show blog posts related to the link. So I did that. Then I realised it would be really good to show related links under blog posts too.

So now, if everything’s working correctly, then at the end of this post you will not only see related blog posts I’ve previously written, but also links related to the content of this post.

It was a very inspiring weekend. There’s something about being in a room with other people working on their websites that makes me super productive.

While we were hacking away on day two, somebody mentioned that they still find it hard to explain the indie web to people.

“It’s having your own website”, I said.

But surely there’s more to it than that, they wondered.

Nope. If someone has their own website, then they’re part of the indie web. It doesn’t matter if that website is made with a complicated home-rolled tech stack or if it’s a Squarespace site.

What you do with your own website is entirely up to you. The technologies are just plumbing, whether it’s webmentions, RSS, or anything else. None of it is a requirement. Heck, even HTML is optional. If you want to put plain text files on your website, go for it. It’s your website.

Indie Web Camp Nuremberg

After two days at border:none in Nuremberg, it was time for two days at Indie Web Camp, also in Nuremberg.

I hadn’t been to an Indie Web Camp since before The Situation. It felt very good to be back. I had almost forgotten how inspiring and productive they can be.

This one had a good turnout of around twenty people. We had ourselves an excellent first day of thought-provoking sessions. Then on day two it was time to put some of those ideas into action.

A little trick I like to do on the practical day is to have two tasks to attempt: one of them quite simple, and the other more ambitious. That way, as long as I get the simpler task done, I’ll always have at least something to demo at the end of the day.

This time I attempted three bits of home improvement on my website.

Autolinking Mastodon usernames

The first problem I set myself was ostensibly the simple one. But it involved regular expressions, so then I had two problems.

I wanted to automatically link up Mastodon usernames if I mentioned one in my notes. For example, during border:none I mentioned Brian’s Mastodon username in a note: @briansuda@loðfíll.is.

That turned out to be an excellent test case. Those Icelandic characters made sure I wasn’t making unwarranted assumptions about character sets.

Here’s the regular expression I came up with. It’s not foolproof by any means. Basically it looks for @something@something.something.
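
It looked something like this (a reconstruction of the idea rather than the exact pattern, using Unicode property escapes so it doesn’t choke on those Icelandic characters):

const mention = /@([\p{L}\p{N}._-]+)@((?:[\p{L}\p{N}-]+\.)+[\p{L}\p{N}-]+)/gu;

const note = 'During border:none I mentioned @briansuda@loðfíll.is in a note.';
const linked = note.replace(mention, (match, user, domain) =>
  `<a href="https://${domain}/@${user}">${match}</a>`
);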

Good enough. Ship it.

Related posts

My next task was a bit more ambitious. It involved SQL queries, something I’m slightly better at than regular expressions but that’s a very low bar.

I wanted to show related posts when you get to the end of one of my blog posts.

I’ve been tagging all my blog posts for years so that’s the mechanism I used for finding similar posts. There’s probably a clever SQL statement that could do this, but I ended up brute-forcing it a bit.

I don’t feel too bad about the hacky clunky nature of my solution, because I cache blog post pages. That means only the first person to view the blog post (usually me) will suffer any performance impacts from my clunky database queries. After that everything’s available straight from a cached file.

Let’s say you’re reading a blog post of mine that I’ve tagged with ten different keywords. I make a separate SQL query for each keyword to get all the other posts that use that tag. Then it’s a matter of sorting through all the results.

I loop through the results of each tag and apply a score to the tagged post. If the post shares one tag with the post you’re looking at, it has a score of one. If it shares two tags, it has a score of two, and so on.

I decided that for a post to be considered related, it had to share at least three tags. I also decided to limit the list of related posts to a maximum of five.
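
The gist of the scoring logic looks something like this (a sketch only; the real thing is PHP talking to MySQL, and getPostsWithTag stands in for one SQL query per tag):

async function relatedPosts(currentPost, getPostsWithTag) {
  const scores = new Map();
  for (const tag of currentPost.tags) {
    // one query per tag: every other post that shares this tag
    for (const post of await getPostsWithTag(tag)) {
      if (post.id === currentPost.id) continue;
      scores.set(post.id, (scores.get(post.id) || 0) + 1);
    }
  }
  return [...scores.entries()]
    .filter(([id, score]) => score >= 3) // must share at least three tags
    .sort((a, b) => b[1] - a[1]) // most tags in common first
    .slice(0, 5); // a maximum of five related posts
}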

It worked out pretty well. If you scroll down on my recent post about JavaScript, you’ll see links to related posts about JavaScript. If you read through a post on accessibility testing, you’ll find other posts about accessibility testing. If you make it to the end of this post about Mars colonisation you’ll see links to more posts about exploring our solar system.

Right now I’m just doing this for my blog but I’d like to do it for my links too. A job for a future Indie Web Camp.

Link rot

I was very inspired by Remy’s recent post on how he’s tackling link rot on his site. I wanted to do the same for mine.

On the first day at Indie Web Camp I led a session on link rot to gather ideas and alternative approaches. We had a really good discussion, though it’s always worth bearing in mind that there’ll never be a perfect solution. There’ll always be some false positives and some false negatives.

The other Jeremy at Indie Web Camp Nuremberg blogged about the session. Sebastian Greger was attending remotely and the session inspired him to spend the second day also tackling link rot.

In the end I decided to stick with Remy’s two-pronged approach:

  1. a client-side script that—as a progressive enhancement—intercepts outbound links and re-routes them to
  2. a server-side script that redirects to the Internet Archive if the link is broken.

Here’s the JavaScript I wrote for the first part.

It’s very similar to Remy’s but with one little addition. I check to see if the clicked link is inside an h-entry and if it is, I pass on the date from the post’s dt-published value.
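
The shape of it is something like this (a simplified sketch; the endpoint name /links/redirect is made up for illustration):

document.addEventListener('click', event => {
  const link = event.target.closest('a[href^="http"]');
  if (!link || link.href.startsWith(location.origin)) return;
  const params = new URLSearchParams({url: link.href});
  // if the link is inside an h-entry, pass along the post's publication date
  const published = link.closest('.h-entry')?.querySelector('.dt-published[datetime]');
  if (published) params.set('date', published.getAttribute('datetime'));
  event.preventDefault();
  location.href = '/links/redirect?' + params.toString();
});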

Here’s the PHP I wrote for the server-side redirector. The comments tell the story of what the code is doing:

  • Check that the request is coming from my site.
  • There also has to be a URL provided in the query string.
  • Make a very quick curl request to get the response headers from the URL. The time limit is set to 1 second.
  • If there was any error (like a time out), give up and go to the URL.
  • Pick the response headers apart to get the HTTP status code.
  • If the response is OK, go to the URL.
  • If the response is a redirect, go around again but this time use the redirect URL.
  • Construct the archive.org search endpoint.
  • If we have a date, provide it. Otherwise ask for the latest snapshot.
  • Ping that archive.org URL. This time there’s no time limit; this might take a while.
  • If there’s an archived copy, redirect to that.
  • There’s no archived copy. Give up and go to the URL anyway.
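
To make that flow concrete, here’s the same logic sketched in JavaScript rather than PHP (an Express-style handler; the parameter names are assumptions, though the archive.org availability endpoint is real):

export async function reroute(request, response) {
  const url = request.query.url;
  // the request must come from my site, and a URL must be provided
  if (!url || !(request.headers.referer || '').includes('adactio.com')) {
    return response.status(400).end();
  }
  try {
    // a very quick look at the response headers, with a one second time limit
    // (redirect: 'follow' takes care of the "go around again" step)
    const head = await fetch(url, {method: 'HEAD', redirect: 'follow', signal: AbortSignal.timeout(1000)});
    if (head.ok) return response.redirect(url); // the link still works: go there
  } catch (error) {
    return response.redirect(url); // a time out or network error: give up, go to the URL
  }
  // ask the Wayback Machine for a snapshot, closest to the date if we have one
  const endpoint = new URL('https://archive.org/wayback/available');
  endpoint.searchParams.set('url', url);
  if (request.query.date) {
    endpoint.searchParams.set('timestamp', request.query.date.replaceAll('-', '').slice(0, 8));
  }
  const data = await (await fetch(endpoint)).json(); // no time limit this time
  const snapshot = data.archived_snapshots && data.archived_snapshots.closest;
  response.redirect(snapshot && snapshot.available ? snapshot.url : url);
}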

Not perfect by any means, but it works for the most common cases of link rot.

For the demo at the end of the day I went back into my archive of over 10,000 links and plucked out some old posts, like this one from December 2005. It takes a little while to do the rerouting but eventually you get to see the archived version from the same time period as when I linked to it.

Here’s another link from 2005. Here’s another. Those links are broken now, but with a little patience, you’ll still get to read them on the Internet Archive.

The Internet Archive’s Wayback Machine really is a gift. I can’t imagine how it would be even remotely possible to address link rot on my site without archive.org.

I will continue to donate money to the Internet Archive and I encourage you to do the same.

Relative times

Last week Phil posted a little update about his excellent site, ooh.directory:

If you’re in the habit of visiting the Recently Updated Blogs page, and leaving it open, the times when each blog was updated will now keep up with the relentless passing of time.

Does that make sense? “3 minutes ago” will change to “4 minutes ago” and so on and on and on, until you refresh the page.

I thought that was a nice little addition, and I immediately thought of The Session. There are time elements all over the site with relative times as the text content: 2 minutes ago, 7 hours ago, 1 year ago, and so on. Those strings of text are generated on the server. But I figured it would be a nice enhancement to periodically update them in the browser after the page has loaded.

I viewed source to see how Phil was doing it. The code is nice and short, using a library called Day.js with a plug-in for relative time.

“Hang on”, I thought, “isn’t there some web standard for doing this kind of thing?” I had a vague memory of some JavaScript API for formatting dates and times.

Sure enough, we’ve now got the Intl.RelativeTimeFormat object. It’s got browser support across the board.

Here’s the code I wrote.

I’ve got a function that loops through all the time elements with datetime attributes. It compares the current timestamp to that value to get the elapsed time. Then that’s formatted using the format() method and output as innerText.

You need to tell the format() method which units you want to use: seconds, minutes, hours, days, etc. So there’s a little bit of looping to figure out which unit is most appropriate. If the elapsed time is less than a minute, use seconds. If the elapsed time is less than an hour, use minutes. If the elapsed time is less than a day, use hours. You get the idea.
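
Stripped down, the whole thing looks something like this (a sketch; the thresholds are approximate and the locale is hard-coded here for brevity):

const rtf = new Intl.RelativeTimeFormat('en', {numeric: 'auto'});
const units = [
  ['second', 60],
  ['minute', 60],
  ['hour', 24],
  ['day', 7],
  ['week', 4.35],
  ['month', 12],
  ['year', Infinity]
];

function updateRelativeTimes() {
  document.querySelectorAll('time[datetime]').forEach(time => {
    // negative for the past, positive for the future
    let elapsed = (new Date(time.getAttribute('datetime')) - Date.now()) / 1000;
    for (const [unit, amount] of units) {
      if (Math.abs(elapsed) < amount) {
        time.innerText = rtf.format(Math.round(elapsed), unit);
        break;
      }
      elapsed = elapsed / amount;
    }
  });
}

setInterval(updateRelativeTimes, 30000); // every 30 seconds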

It’s a pity there isn’t some kind of magic unit like “auto” to do this, but it’s not much extra code to figure it out.

Anyway, that function runs periodically using setInterval(). I’ve set it to run every 30 seconds in my gist. On The Session I’ve set it to one minute.

You’ll notice that I’m grabbing all the relevant time elements—using document.querySelectorAll('time[datetime]')—every time the function is run. That may seem inefficient. Couldn’t I just grab them once and then keep them stored as an array? But I want this to work even if the page contents have been updated with Ajax. (Do people even say “Ajax” any more? Get off my lawn, you pesky kids!)

I think I’ve written this code in an abstract way so that you should be able to drop it into any web page. For the calculations to work, you’ll need to make sure that your datetime attributes include timezone information; if there’s no timezone info, UTC is assumed.

This was a fun little piece of functionality to play around with. Now I know a little more about this Intl.RelativeTimeFormat object. The way I’m using it is a classic example of progressive enhancement. If a browser doesn’t support it, or if my code breaks, it’s no big deal. The functionality is a little bonus that almost nobody will notice anyway. Just a small delighter …if you’re the kind of person who finds it delightful when relative time strings automatically update.

Sunday

Today was a good day. The weather was beautiful.

Jessica and I did a little bit of work in the garden—nothing too sweaty. Then Jessica cut my hair. It looks good. And it feels good to have my neck freed up.

We went for a Sunday roast at the nearest pub, which does a most excellent carvery. It was tasty and plentiful so after strolling home, I wanted to do nothing more than sit around.

I sat outside in the back garden under the dappled shade offered by the overhanging trees. I had a good book. I had my mandolin to hand. I’d reach for it occasionally to play a tune or two.

Coco the cat—not our cat—sat nearby, stretching her paws out lazily in the warm muggy air.

It was a good day.

In between

I was chatting with my new colleague Alex yesterday about a link she had shared in Slack. It was the Nielsen Norman Group’s annual State of Mobile User Experience report.

There’s nothing too surprising in there, other than the mention of Apple’s app clips and Google’s instant apps.

Remember those?

Me neither.

Perhaps I lead a sheltered existence, but as an iPhone user, I don’t think I’ve come across a single app clip in the wild.

I remember when they were announced. I was quite worried about them.

See, the one thing that the web can (theoretically) offer that native can’t is instant access to a resource. Go to this URL—that’s it. Whereas for a native app, the flow is: go to this app store, find the app, download the app.

(I say that the benefit is theoretical because the website found at the URL should download quickly—the reality is that the bloat of “modern” web development imperils that advantage.)

App clips—and instant apps—looked like a way to route around the convoluted install process of native apps. That’s why I was nervous when they were announced. They sounded like a threat to the web.

In reality, the potential was never fulfilled (if my own experience is anything to go by). I wonder why people didn’t jump on app clips and instant apps?

Perhaps it’s because what they promise isn’t desirable from a business perspective: “here’s a way for users to accomplish their tasks without downloading your app.” Even though app clips can in theory be a stepping stone to installing the full app, from a user’s perspective, their appeal is the exact opposite.

Or maybe they’re just too confusing to understand. I think there’s another technology that suffers from the same problem: progressive web apps.

Hear me out. Progressive web apps are—if done well—absolutely amazing. You get all of the benefits of native apps in terms of UX—they even work offline!—but you retain the web’s frictionless access model: go to a URL; that’s it.

So what are they? Are they websites? Yes, sorta. Are they apps? Yes, sorta.

That’s confusing, right? I can see how app clips and instant apps sound equally confusing: “you can use them straight away, like going to a web page, but they’re not web pages; they’re little bits of apps.”

I’m mostly glad that app clips never took off. But I’m sad that progressive web apps haven’t taken off more. I suspect that their fates are intertwined. Neither suffers from technical limitations. The problem they both face is inertia:

The technologies are the easy bit. Getting people to re-evaluate their opinions about technologies? That’s the hard part.

True of progressive web apps. Equally true of app clips.

But when I was chatting to Alex, she made me look at app clips in a different way. She described a situation where somebody might need to interact with some kind of NFC beacon from their phone. Web NFC isn’t supported in many browsers yet, so you can’t rely on that. But you don’t want to make people download a native app just to have a quick interaction. In theory, an app clip—or instant app—could do the job.

In that situation, app clips aren’t a danger to the web—they’re polyfills for hardware APIs that the web doesn’t yet support!

I love having my perspective shifted like that.

The specific situations that Alex and I were discussing were in the context of museums. Museums offer such interesting opportunities for the physical and the digital to intersect.

Remember the pen from Cooper Hewitt? Aaron spoke about it at dConstruct 2014—a terrific presentation that’s well worth revisiting and absorbing.

The other dConstruct talk that’s very relevant to this liminal space between the web and native apps is the 2012 talk from Scott Jenson. I always thought the physical web initiative had a lot of promise, but it may have been ahead of its time.

I loved the thinking behind the physical web beacons. They were deliberately dumb, much like the internet itself. All they did was broadcast a URL. That’s it. All the smarts were to be found at the URL itself. That meant a service could get smarter over time. It’s a lot easier to update a website than swap out a piece of hardware.

But any kind of technology that uses Bluetooth, NFC, or other wireless technology has to get over the discovery problem. They’re invisible technologies, so by default, people don’t know they’re even there. But if you make them too discoverable—intrusively announcing themselves like one of the commercials in Minority Report—then they’re indistinguishable from spam. There’s a sweet spot of discoverability right in the middle that’s hard to get right.

Over the past couple of years—accelerated by the physical distancing necessitated by The Situation—QR codes stepped up to the plate.

They still suffer from some discoverability issues. They’re not human-readable, so you can’t be entirely sure that the URL you’re going to go to isn’t going to be a Rick Astley video. But they are visible, which gives them an advantage over hidden wireless technologies.

They’re cheaper too. Printing a QR code sticker costs less than getting a plastic beacon shipped from China.

QR codes turned out to be just good enough to bridge the gap between the physical and digital for those one-off interactions like dining outdoors during a pandemic:

I can see why they chose the web over a native app. Online ordering is the only way to place your order at this place. Telling people “You have to go to this website” …that seems reasonable. But telling people “You have to download this app” …that’s too much friction.

Ironically, the nail in the coffin for app clips and instant apps might’ve been hammered in by Apple and Google when they built QR-code recognition into their camera software.

Work ethics

If you’re travelling around Ireland, you may come across some odd pieces of 19th century architecture—walls, bridges, buildings and roads that serve no purpose. They date back to The Great Hunger of the 1840s. These “famine follies” were the result of a public works scheme.

The thinking went something like this: people are starving so we should feed them but we can’t just give people food for nothing so let’s make people do pointless work in exchange for feeding them (kind of like an early iteration of proof of work for cryptobollocks on blockchains …except with a blockchain, you don’t even get a wall or a road, just ridiculous amounts of wasted energy).

This kind of thinking seems reprehensible from today’s perspective. But I still see its echo in the work ethic espoused by otherwise smart people.

Here’s the thing: there’s good work and there’s working hard. What matters is doing good work. Often, to do good work you need to work hard. And so people naturally conflate the two, thinking that what matters is working hard. But whether you work hard or not isn’t actually what’s important. What’s important is that you do good work.

If you can do good work without working hard, that’s not a bad thing. In fact, it’s great—you’ve managed to do good work and do it efficiently! But often this very efficiency is treated as laziness.

Sensible managers are rightly appalled by so-called productivity tracking because it measures exactly the wrong thing. Those instruments of workplace surveillance measure inputs, not outputs (and even measuring outputs is misguided when what really matters are outcomes).

They can attempt to measure how hard someone is working, but they don’t even attempt to measure whether someone is producing good work. If anything, they actively discourage good work; there’s plenty of evidence to show that more hours equates to less quality.

I used to think there must be some validity to the belief that hard work has intrinsic value. It was a position that was espoused so often by those around me that it seemed a truism.

But after a few decades of experience, I see no evidence for hard work as an intrinsically valuable activity, much less a useful measurement. If anything, I’ve seen the real harm that can be caused by tying your self-worth to how much you’re working. That way lies burnout.

We no longer make people build famine walls or famine roads. But I wonder how many of us are constructing little monuments in our inboxes and calendars, filling those spaces with work to be done in an attempt to chase the rewards we’ve been told will result from hard graft.

I’d rather spend my time pursuing the opposite: the least work for the most people.

Alternative stylesheets

My website has different themes you can choose from. I don’t just mean a dark mode. These themes all look very different from one another.

I assume that 99.99% of people just see the default theme, but I keep the others around anyway. Offering different themes was originally intended as a way of showcasing the power of CSS, and specifically the separation of concerns between structure and presentation. I started doing this before the CSS Zen Garden was created. Dave really took it to the next level by showing how the same HTML document could be styled in an infinite number of ways.

Each theme has its own stylesheet. I’ve got a very simple little style switcher on every page of my site. Selecting a different theme triggers a page refresh with the new styles applied and sets a cookie to remember your preference.
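
The switcher itself doesn’t need much code. A hypothetical version (the element and cookie names here are made up):

document.querySelector('#theme-switcher').addEventListener('change', event => {
  // remember the choice for a year, then reload so the server applies the new theme
  document.cookie = 'theme=' + encodeURIComponent(event.target.value) + '; path=/; max-age=31536000';
  location.reload();
});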

I also list out the available stylesheets in the head of every page using link elements that have rel values of alternate and stylesheet together. Each link element also has a title attribute with the name of the theme. That’s the standard way to specify alternative stylesheets.
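
In the head of each page, that markup looks something like this (the file and theme names here are placeholders):

<link rel="stylesheet" href="/css/default.css" title="Default">
<link rel="alternate stylesheet" href="/css/dark.css" title="Dark">
<link rel="alternate stylesheet" href="/css/minimal.css" title="Minimal">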

In Firefox you can switch between the specified stylesheets from the View menu by selecting Page Style (notice that there’s also a No style option—very handy for checking your document structure).

Other browsers like Chrome and Safari don’t do anything with the alternative stylesheets. But they don’t ignore them.

Every browser makes a network request for each alternative stylesheet. The request is non-blocking and seems to be low priority, which is good, but I’m somewhat perplexed by the network request being made at all.

I get why Firefox is requesting those stylesheets. It’s similar to requesting a print stylesheet. Even if the network were to drop, you still want those styles available to the user.

But I can’t think of any reason why Chrome or Safari would download the alternative stylesheets.

Negative

I no longer have Covid. I am released from isolation.

Alas, my negative diagnosis came too late for me to make it to UX London. But that’s okay—by the third and final day of the event, everything was running smooth like buttah! Had I shown up, I would’ve just got in the way. The Clearleft crew ran the event like a well-oiled machine.

I am in the coronaclear just in time to go away for a week. My original thinking was this would be my post-UX-London break to rest up for a while, but it turns out I’ve been getting plenty of rest during UX London.

I’m heading to the west coast of Ireland for The Willie Clancy Summer School, a trad music pilgrimage.

Jessica and I last went to Willie Week in 2019. We had a great time and I distinctly remember thinking “I’m definitely coming back next year!”

Well, a global pandemic put paid to that. The event ran online for the past two years. But now that it’s back for real, I wouldn’t miss it for the world.

My mandolin and I are bound for Miltown Malbay!

SafarIE

I was moaning about Safari recently. Specifically I was moaning about the ridiculous way that browser updates are tied to operating system updates.

But I felt bad bashing Safari. It felt like a pile-on. That’s because a lot of people have been venting their frustrations with Safari recently:

I think it’s good that people share their frustrations with browsers openly, although I agree with Baldur Bjarnason that it’s good to avoid Kremlinology and the motivational fallacy when blogging about Apple.

It’s also not helpful to make claims like “Safari is the new Internet Explorer!” Unless, that is, you can back up the claim.

On a recent episode of the HTTP 203 podcast, Jake and Surma set out to test the claim that Safari is the new IE. They did it by examining Safari according to a number of different measurements and comparing it to the olden days of Internet Explorer. The result is a really fascinating trip down memory lane along with a very nuanced and even-handed critique of Safari.

And the verdict? Well, you’ll just have to listen to the podcast episode.

If you’d rather read the transcript, tough luck. That’s a real shame because, like I said, it’s an excellent and measured assessment. I’d love to add it to the links section of my site, but I can’t do that in good conscience while it’s inaccessible to the Deaf community.

When I started the Clearleft podcast, it was a no-brainer to have transcripts of every episode. Not only does it make the content more widely available, but it also makes it easier for people to copy and paste choice quotes.

Still, I get it. A small plucky little operation like Google isn’t going to have the deep pockets of a massive corporation like Clearleft. But if Jake and Surma were to open up a tip jar, I’d throw some money in to get HTTP 203 transcribed (I recommend getting Tina Pham to do it—she’s great!).

I apologise for my note of sarcasm there. But I share because I care. It really is an excellent discussion; one that everyone should be able to access.

Update: the bug with that episode of the HTTP 203 podcast has been fixed. Here’s the transcript! And all future episodes will have transcripts too.

Browsers

I mentioned recently that there might be quite a difference in tone between my links and my journal here on my website:

’Sfunny, when I look back at older journal entries they’re often written out of frustration, usually when something in the dev world is bugging me. But when I look back at all the links I’ve bookmarked the vibe is much more enthusiastic, like I’m excitedly pointing at something and saying “Check this out!” I feel like sentiment analyses of those two sections of my site would yield two different results.

My journal entries have been even more specifically negative of late. I’ve been bitchin’ and moanin’ about web browsers. But at least I’m an equal-opportunities bitcher and moaner.

I wish my journal weren’t so negative, but my mithering behaviour has been encouraged. On more than one occasion, someone I know at a browser company has taken me aside to let me know that I should blog about any complaints I might have with their browser. It sounds counterintuitive, I know. But these blog posts can give engineers some ammunition to get those issues prioritised and fixed.

So my message to you is this: if there’s something about a web browser that you’re not happy with (or, indeed, if there’s something you’re really happy with), take the time to write it down and publish it.

Publish it on your website. You could post your gripes on Twitter but whinging on Jack’s website is just pissing in the wind. And I suspect you also might put a bit more thought into a blog post on your own site.

I know it’s a cliché to say that browser makers want to hear from developers—and I’m often cynical about it myself—but they really do want to know what we think. Share your thoughts. I’ll probably end up linking to what you write.

Updating Safari

Safari has been subjected to a lot of ire recently. Most of that ire has been aimed at the proposed changes to the navigation bar in Safari on iOS—moving it from a fixed top position to a floaty bottom position right over the content you’re trying to interact with.

Courage.

It remains to be seen whether this change will actually ship. That’s why it’s in beta—to gather all the web’s hot takes first.

But while this very visible change is dominating the discussion, invisible changes can be even more important. Or in the case of Safari, the lack of changes.

Compared to other browsers, Safari lags far behind when it comes to shipping features. I’m not necessarily talking about cutting-edge features either. These are often standards that have been out for years. This creates a gap—albeit an invisible one—between Safari and other browsers.

Jorge Arango has noticed this gap:

I use Safari as my primary browser on all my devices. I like how Safari integrates with the rest of the OS, its speed, and privacy features. But, alas, I increasingly have issues rendering websites and applications on Safari.

That’s the perspective of an end-user. Developers who have to deal with the gap in features are more, um, strident in their opinions. Perry Sun wrote For developers, Apple’s Safari is crap and outdated:

Don’t get me wrong, Safari is a very good web browser, delivering fast performance and solid privacy features.

But at the same time, the lack of support for key web technologies and APIs has been both perplexing and annoying at the same time.

Alas, that post also indulges in speculation about Apple’s motives which always feels a bit too much like a conspiracy theory to me. Baldur Bjarnason has more to say on that topic in his post Kremlinology and the motivational fallacy when blogging about Apple. He also points to a good example of critiquing Safari without speculating about motives: Dave’s post One-offs and low-expectations with Safari, which documents all the annoying paper cuts inflicted by Safari’s “quirks.”

Another deep dive that avoids speculating about motives comes from Tim Perry: Safari isn’t protecting the web, it’s killing it. I don’t agree with everything in it. I think that Apple’s—and Mozilla’s—objections to some device APIs are informed by a real concern about privacy and security. But I agree with his point that it’s not enough to just object; you’ve got to offer an alternative vision too.

That same post has a litany of uncontroversial features that shipped in Safari looong after they shipped in other browsers:

Again: these are not contentious features shipping by only Chrome, they’re features with wide support and no clear objections, but Safari is still not shipping them until years later. They’re also not shiny irrelevant features that “bloat the web” in any sense: each example I’ve included above primarily improves core webpage UX and performance. Safari is slowing down progress here.

But perhaps most damning of all is how Safari deals with bugs.

A recent release of Safari shipped with a really bad Local Storage bug. The bug was fixed within a day. Yay! But the fix won’t ship until …who knows?

This is because browser updates are tied to operating system updates. Yes, this is just like the 90s when Microsoft claimed that Internet Explorer was intrinsically linked to Windows (a tactic that didn’t work out too well for them in the subsequent court case).

I don’t get it. I’m pretty sure that other Apple products ship updates and fixes independently of OS releases. I’m sure I’ve received software updates for Keynote, Garage Band, and other pieces of software made by Apple.

And yet, of all the applications that need a speedy update cycle—a user agent for the World Wide Web—Apple’s version is needlessly delayed by the release cycle of the entire operating system.

I don’t want to speculate on why this might be. I don’t know the technical details. But I suspect that the root cause might not be technical in nature. Apple have always tied their browser updates to OS releases. If Google’s cardinal sin is avoiding anything “Not Invented Here”, Apple’s downfall is “We’ve always done it this way.”

Evergreen browsers update in the background, usually at regular intervals. Firefox is an evergreen browser. Chrome is an evergreen browser. Edge is an evergreen browser.

Safari is not an evergreen browser.

That’s frustrating when it comes to new features. It’s unforgivable when it comes to bugs.

At least on Apple’s desktop computers, users have the choice to switch to a different browser. But on Apple’s mobile devices, users have no choice but to use Safari’s rendering engine, bugs and all.

As I wrote when I had to deal with one of Safari’s bugs:

I wish that Apple would allow other rendering engines to be installed on iOS devices. But if that’s a hell-freezing-over prospect, I wish that Safari updates weren’t tied to operating system updates.

Service worker weirdness in Chrome

I think I’ve found some more strange service worker behaviour in Chrome.

It all started when I was checking out the very nice new redesign of WebPageTest. I figured while I was there, I’d run some of my sites through it. I passed in a URL from The Session. When the test finished, I noticed that the “screenshot” tab said that something was being logged to the console. That’s odd! And the file doing the logging was the service worker script.

I fired up Chrome (which isn’t my usual browser), and started navigating around The Session with dev tools open to see what appeared in the console. Sure enough, there was a failed fetch attempt being logged. The only time my service worker script logs anything is in the catch clause of fetching pages from the network. So Chrome was trying to fetch a web page, failing, and logging this error:

The service worker navigation preload request failed with a network error.

But all my pages were loading just fine. So where was the error coming from?

After a lot of spelunking and debugging, I think I’ve figured out what’s happening…

First of all, I’m making use of navigation preloads in my service worker. That’s all fine.

Secondly, the website is a progressive web app. It has a manifest file that specifies some metadata, including start_url. If someone adds the site to their home screen, this is the URL that will open.

Thirdly, Google recently announced that they’re tightening up the criteria for displaying install prompts for progressive web apps. If there’s no network connection, the site still needs to return a 200 OK response: either a cached copy of the URL or a custom offline page.

So here’s what I think is happening. When I navigate to a page on the site in Chrome, the service worker handles the navigation just fine. It also parses the manifest file I’ve linked to and checks to see if that start URL would load if there were no network connection. And that’s when the error gets logged.

I only noticed this behaviour because I had specified a query string on my start URL in the manifest file. Instead of a start_url value of /, I’ve set a start_url value of /?homescreen. And when the error shows up in the console, the URL being fetched is /?homescreen.
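
So the relevant bit of the web app manifest is just this (trimmed down to the one property that matters here):

{
  "start_url": "/?homescreen"
}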

Crucially, I’m not seeing a warning in the console saying “Site cannot be installed: Page does not work offline.” So I think this is all fine. If I were actually offline, there would indeed be an error logged to the console and that start_url request would respond with my custom offline page. It’s just a bit confusing that the error is being logged when I’m online.

I thought I’d share this just in case anyone else is logging errors to the console in the catch clause of fetches and is seeing an error even when everything appears to be working fine. I think there’s nothing to worry about.

Update: Jake confirmed my diagnosis and agreed that the error is a bit confusing. The good news is that it’s changing. In Chrome Canary the error message has already been updated to:

DOMException: The service worker navigation preload request failed due to a network error. This may have been an actual network error, or caused by the browser simulating offline to see if the page works offline: see https://w3c.github.io/manifest/#installability-signals

Much better!

The Machines Stop

The Situation feels like it’s changing. It’s not over, not by a long shot. But it feels like it’s entering a different, looser phase.

Throughout the lockdown, there’s been a strange symmetry between the outside world and the inside of our home. As the outside world slowed to a halt, so too did half the machinery in our flat. Our dishwasher broke shortly before the official lockdown began. So did our washing machine.

We had made plans for repairs and replacements, but as events in the world outside escalated, those plans had to be put on hold. Plumbers and engineers weren’t making any house calls, and rightly so.

We even had the gas to our stovetop cut off for a while—you can read Jessica’s account of that whole affair. All the breakdowns just added to the entropic Ballardian mood.

But the gas stovetop was fixed. And so too was the dishwasher, eventually. Just last week, we got our new washing machine installed. Piece by piece, the machinery of our interior world revived in lockstep with the resuscitation of the world outside.

As of today, pubs will be open. I won’t be crossing their thresholds just yet. We know so much more about the spread of the virus now, and gatherings of people in indoor spaces are pretty much the worst environments for transmission.

I’m feeling more sanguine about outdoor spaces. Yesterday, Jessica and I went into town for Street Diner. It was the first time since March that we walked in that direction—our other excursions have been in the direction of the countryside.

It was perfectly fine. We wore masks, and while we were certainly in the minority, we were not alone. People were generally behaving responsibly.

Brighton hasn’t done too badly throughout The Situation. But still, like I said, I have no plans to head to the pub on a Saturday night. The British drinking culture is very much concentrated on weekends. Stay in all week and then on the weekend, lassen die Sau raus! (let the sow out), as the Germans would say.

After months of lockdown, reopening pubs on a Saturday seems like a terrible idea. Over in Ireland, pubs have been open since Monday—a sensible day to soft-launch. With plenty of precautions in place, things are going well there.

I’ve been watching The Situation in Ireland throughout. It’s where my mother lives, so I was understandably concerned. But they’ve handled everything really well. It’s not New Zealand, but it’s also not the disaster that is the UK.

It really has been like watching an A/B test run at the country level. Two very similar populations confronted with exactly the same crisis. Ireland took action early, cancelling the St. Patrick’s Day parade(!) while the UK was still merrily letting Cheltenham go ahead. Ireland had clear guidance. The UK had dilly-dallying and waffling. And when the shit really hit the fan, the Irish taoiseach rolled up his sleeves and returned to medical work. Meanwhile the UK had Dominic Cummings making a complete mockery of the sacrifices that everyone was told to endure.

What’s strange is that people here in the UK don’t seem to realise how the rest of the world, especially other European countries, have watched the response here with shock and horror. The narrative here seems to be that we all faced this thing together, and with our collective effort, we averted the worst. But the numbers tell a very different story. Comparing the numbers here with the numbers in Ireland—or pretty much any other country in Europe—is sobering.

So even though the timelines for reopenings here converge with Ireland’s, The Situation is far from over.

Even without any trips to pubs, restaurants, or other indoor spaces, I’m looking forward to making some more excursions into town. Not that it’s been bad staying at home. I’ve really quite enjoyed staying put, playing music, reading books, and watching television.

I was furloughed from work for a while in June. Normally, my work at this time of year would involve plenty of speaking at conferences. Seeing as that wasn’t happening, it made sense to take advantage of the government scheme to go into work hibernation for a bit.

I was worried I might feel at a bit of a loose end, but I actually really enjoyed it. The weather was good so I spent quite a bit of time just sitting in the back garden, reading (I am very, very grateful to have even a small garden). I listened to music. I watched movies. I surfed the web. Yes, properly surfed the web, going from link to link, getting lost down rabbit holes. I tell you, this World Wide Web thing is pretty remarkable. Some days I used it to read up on science or philosophy. I spent a week immersed in Napoleonic history. I have no idea how or why. But it was great.

I’m back at work now, and have been for a couple of weeks. But I wouldn’t mind getting furloughed again. It felt kind of like being retired. I’m quite okay with the prospect of retirement now, as long as we have music and sunshine and the World Wide Web.

That’s the future. For now, The Situation continues, albeit in looser form.

I’ve really enjoyed reading other people’s accounts throughout. My RSS reader is getting a good workout. I always look forward to weeknotes from Alice, Nat, and Phil (this piece from Phil has really stuck with me). Jessica has written fifteen installments—and counting—of A Journal of the Plague Week. I know I’m biased, but I think it’s some mighty fine writing. Start here.

41 hours in Galway

It was my birthday recently. I’m a firm believer in the idea that birthday celebrations should last for more than 24 hours. A week is the absolute minimum.

For the day itself, I did indeed indulge in a most luxurious evening out with Jessica at The Little Fish Market in Hove (on the street where we used to live!). The chef, Duncan Ray, is an absolute genius and his love for all things fish-related shines through in his magnificent dishes.

But to keep the celebrations going, we also went on a weekend away to Galway, where I used to live decades ago. It was a quick trip but we packed in a lot. I joked at one point that it felt like one of those travel articles headlined with “36 hours in someplace.” I ran the numbers and it turned out we were in Galway for 41 hours, but I still thought it would be fun to recount events in the imperative style of one of those articles…

A surprisingly sunny day in Galway.

Saturday, February 29th

The 3:30pm train from Dublin will get you into Galway just before 6pm. The train station is right on the doorstep of Loam, the Michelin-starred restaurant where you’ve made your reservation. Enjoy a seven course menu of local and seasonal produce. Despite the quality of the dishes, you may find the overall experience is a little cool, and the service a touch over-rehearsed.

You’ll be released sometime between 8:30pm and 9pm. Stroll through Eyre Square and down Shop Street to the Jury’s Inn, your hotel. It’s nothing luxurious but it’s functional and the location is perfect. It’s close to everything without being in the middle of the noisy weekend action. The only noise you should hear is the rushing of the incredibly fast Corrib river outside your window.

Around 9:30pm, pop ‘round to Dominick Street to enjoy a cocktail in the America Village Apothecary. It’s only open two nights a week, and it’s a showcase of botanicals gathered in Connemara. Have them make you a tasty concoction and then spend time playing guess-that-smell with their specimen jars.

By 10:30pm you should be on your way round the corner to The Crane Bar on Sea Road. Go in the side entrance and head straight upstairs where the music session will be just getting started. Marvel at how chilled out it is for a Saturday night, order a pint, and sit and listen to some lovely jigs’n’reels. Don’t forget to occasionally pester one of the musicians by asking “What was that last tune called? Lovely set!”

Checked in at The Crane Bar. Great tunes! 🎻🎶 — with Jessica

Sunday, March 1st

Skip the hotel breakfast. Instead, get your day started with a coffee from Coffeewerk + Press. Get that coffee to go and walk over to Ard Bia at Nimmos, right at the Spanish Arch. Get there before it opens at 10am. There will already be a line. Once you’re in, order one of the grand brunch options and a nice big pot of tea. The black pudding hash will set you up nicely.

Checked in at Ard Bia at Nimmos. Black pudding hash and a pot of tea — with Jessica

While the weather is far clearer and sunnier than you were expecting, take the opportunity to walk off that hearty brunch with a stroll along the sea front. That’ll blow out the cobwebs.

Galway bay.

When the cold gets too much, head back towards town and duck into Charlie Byrne’s, the independent bookshop. Spend some time in there browsing the shelves and don’t leave without buying something to remember it by.

By 1pm or so, it’s time for some lunch. This is the perfect opportunity to try the sushi at Wa Cafe near the harbour. They have an extensive range of irresistible nigiri, so just go ahead and get one of everything. The standouts are the local oyster, mackerel, and salmon.

Checked in at wa cafe. Sushi — with Jessica

From there, head to Tigh Cóilí for the 2pm session. Have a Guinness and enjoy the tunes.

Checked in at Tigh Cóilí. Afternoon session — with Jessica

Spend the rest of the afternoon strolling around town. You can walk through the market at St. Nicholas Church, and check out the little Claddagh ring museum at Thomas Dillon’s—the place where you got your wedding rings at the close of the twentieth century.

Return the ring from whence it came!

If you need a pick-me-up, get another coffee from Coffeewerk + Press, but this time grab a spot at the window upstairs so you can watch the world go by outside.

By 6pm, you’ll have a hankering for some more seafood. Head over to Hooked on Henry Street. Order a plate of oysters, and a cup of seafood chowder. If they’ve got ceviche, try that too.

Checked in at Hooked. — with Jessica

Walk back along the canal and stop in to The Salt House to sample a flight of beers from Galway Bay Brewery. There’ll probably be some live music.

Checked in at The Salt House. 🍺 — with Jessica

With your appetite suitably whetted, head on over to Cava Bodega for some classic tapas. Be sure to have the scallops with black pudding.

Checked in at Cava Bodega. Scallops with black pudding — with Jessica

The evening session at Tigh Cóilí starts at 8:30pm on a Sunday so you can probably still catch it. You’ll hear some top-class playing from the likes of Mick Conneely and friends.

Checked in at Tigh Cóilí. 🎶🎻 — with Jessica

And when that’s done, there’s still time to catch the session over at The Crane.

Checked in at The Crane Bar. 🎶🎻 — with Jessica

Monday, March 2nd

After a nice lie-in, check out of the hotel and head to McCambridge’s on Shop Street for some breakfast upstairs. A nice bowl of porridge will set you up nicely for the journey back to Dublin.

If you catch the 11am train, you’ll arrive in Dublin by 1:30pm—just enough time to stop off in The Winding Stair for some excellent lunch before heading on to the airport.

Checked in at The Winding Stair. Lunch in Dublin — with Jessica

Getting there

Aer Lingus flies daily from Gatwick to Dublin. Dublin’s Heuston Station has multiple trains per day going to Galway.

Navigation preloads in service workers

There’s a feature in service workers called navigation preloads. It’s relatively recent, so it isn’t supported in every browser, but it’s still well worth using.

Here’s the problem it solves…

If someone makes a return visit to your site, and the service worker you installed on their machine isn’t active yet, the service worker boots up, and then executes its instructions. If those instructions say “fetch the page from the network”, then you’re basically telling the browser to do what it would’ve done anyway if there were no service worker installed. The only difference is that there’s been a slight delay because the service worker had to boot up first.

  1. The service worker activates.
  2. The service worker fetches the file.
  3. The service worker does something with the response.

It’s not a massive performance hit, but it’s still a bit annoying. It would be better if the service worker could boot up and still be requesting the page at the same time, like it would do if no service worker were present. That’s where navigation preloads come in.

  1. The service worker activates while simultaneously requesting the file.
  2. The service worker does something with the response.

Navigation preloads—like the name suggests—are only initiated when someone navigates to a URL on your site, either by following a link, or a bookmark, or by typing a URL directly into a browser. Navigation preloads don’t apply to requests made by a web page for things like images, style sheets, and scripts. By the time a request is made for one of those, the service worker is already up and running.

To enable navigation preloads, call the enable() method on registration.navigationPreload during the activate event in your service worker script. But first do a little feature detection to make sure registration.navigationPreload exists in this browser:

if (registration.navigationPreload) {
  addEventListener('activate', activateEvent => {
    activateEvent.waitUntil(
      registration.navigationPreload.enable()
    );
  });
}

If you’ve already got event listeners on the activate event, that’s absolutely fine: addEventListener isn’t exclusive—you can use it to assign multiple tasks to the same event.

Now you need to make use of navigation preloads when you’re responding to fetch events. So if your strategy is to look in the cache first, there’s probably no point enabling navigation preloads. But if your default strategy is to fetch a page from the network, this will help.

Let’s say your current strategy for handling page requests looks like this:

addEventListener('fetch', fetchEvent => {
  const request = fetchEvent.request;
  if (request.headers.get('Accept').includes('text/html')) {
    fetchEvent.respondWith(
      fetch(request)
      .then( responseFromFetch => {
        // maybe cache this response for later here.
        return responseFromFetch;
      })
      .catch( fetchError => {
        return caches.match(request)
        .then( responseFromCache => {
          return responseFromCache || caches.match('/offline');
        });
      })
    );
  }
});

That’s a fairly standard strategy: try the network first; if that doesn’t work, try the cache; as a last resort, show an offline page.

It’s that first step (“try the network first”) that can benefit from navigation preloads. If a preload request is already in flight, you’ll want to use that instead of firing off a new fetch request. Otherwise you’re making two requests for the same file.

To find out if a preload request is underway, you can check for the existence of the preloadResponse promise, which will be made available as a property of the fetch event you’re handling:

fetchEvent.preloadResponse

If that exists, you’ll want to use it instead of fetch(request).

if (fetchEvent.preloadResponse) {
  // do something with fetchEvent.preloadResponse
} else {
  // do something with fetch(request)
}

You could structure your code like this:

addEventListener('fetch', fetchEvent => {
  const request = fetchEvent.request;
  if (request.headers.get('Accept').includes('text/html')) {
    if (fetchEvent.preloadResponse) {
      fetchEvent.respondWith(
        fetchEvent.preloadResponse
        .then( responseFromPreload => {
          // maybe cache this response for later here.
          return responseFromPreload;
        })
        .catch( preloadError => {
          return caches.match(request)
          .then( responseFromCache => {
            return responseFromCache || caches.match('/offline');
          });
        })
      );
    } else {
      fetchEvent.respondWith(
        fetch(request)
        .then( responseFromFetch => {
          // maybe cache this response for later here.
          return responseFromFetch;
        })
        .catch( fetchError => {
          return caches.match(request)
          .then( responseFromCache => {
            return responseFromCache || caches.match('/offline');
          });
        })
      );
    }
  }
});

But that’s not very DRY. Your logic is identical, regardless of whether the response is coming from fetch(request) or from fetchEvent.preloadResponse. It would be better if you could minimise the amount of duplication.

One way of doing that is to abstract away the promise you’re going to use into a variable. Let’s call it retrieve. If a preload is underway, we’ll assign it to that variable:

let retrieve;
if (fetchEvent.preloadResponse) {
  retrieve = fetchEvent.preloadResponse;
}

If there is no preload happening (or this browser doesn’t support it), assign a regular fetch request to the retrieve variable:

let retrieve;
if (fetchEvent.preloadResponse) {
  retrieve = fetchEvent.preloadResponse;
} else {
  retrieve = fetch(request);
}

If you like, you can squash that into a ternary operator:

const retrieve = fetchEvent.preloadResponse ? fetchEvent.preloadResponse : fetch(request);

Use whichever syntax you find more readable.

Now you can apply the same logic, regardless of whether retrieve is a navigation preload or a fetch request:

addEventListener('fetch', fetchEvent => {
  const request = fetchEvent.request;
  if (request.headers.get('Accept').includes('text/html')) {
    const retrieve = fetchEvent.preloadResponse ? fetchEvent.preloadResponse : fetch(request);
    fetchEvent.respondWith(
      retrieve
      .then( responseFromRetrieve => {
        // maybe cache this response for later here.
        return responseFromRetrieve;
      })
      .catch( fetchError => {
        return caches.match(request)
        .then( responseFromCache => {
          return responseFromCache || caches.match('/offline');
        });
      })
    );
  }
});

I think that’s the least invasive way to update your existing service worker script to take advantage of navigation preloads.
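
And if you want to flesh out that “maybe cache this response for later here” comment, here’s a sketch of what could go in its place. The cache name pages is made up, so use whatever name your script already uses. The important detail is cloning the response, because a response body can only be read once:

.then( responseFromRetrieve => {
  // Clone the response: the browser still needs to read the original.
  const copy = responseFromRetrieve.clone();
  caches.open('pages') // hypothetical cache name
  .then( cache => {
    cache.put(request, copy);
  });
  return responseFromRetrieve;
})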

Like I said, navigation preloads can give a bit of a performance boost if you’re using a network-first strategy. That’s what I’m doing here on adactio.com and on thesession.org, so I’ve updated their service workers to take advantage of navigation preloads. But on Resilient Web Design, which uses a cache-first strategy, there wouldn’t be much point in enabling navigation preloads.

Jeff Posnick made this point in his write-up of bringing service workers to Google search:

Adding a service worker to your web app means inserting an additional piece of JavaScript that needs to be loaded and executed before your web app gets responses to its requests. If those responses end up coming from a local cache rather than from the network, then the overhead of running the service worker is usually negligible in comparison to the performance win from going cache-first. But if you know that your service worker always has to consult the network when handling navigation requests, using navigation preload is a crucial performance win.

Oh, and those browsers that don’t yet support navigation preloads? No problem. It’s a progressive enhancement. Everything still works just like it did before. And having a service worker on your site in the first place is itself a progressive enhancement. So enabling navigation preloads is like a progressive enhancement within a progressive enhancement. It’s progressive enhancements all the way down!

By the way, if all of this service worker stuff sounds like gibberish, but you wish you understood it, I think my book, Going Offline, will prove quite valuable.

Trad time

Fifteen years ago, I went to the Willie Clancy Summer School in Miltown Malbay:

I’m back from the west of Ireland. I was sorry to leave. I had a wonderful, music-filled time.

I’m not sure why it took me a decade and a half to go back, but that’s what I did last week. Myself and Jessica once again immersed ourselves in Irish traditional music. I’ve written up a trip report over on The Session.

On the face of it, fifteen years is a long time. Last time I made the trip to County Clare, I was taking pictures on a point-and-shoot camera. I had a phone with me, but it had a T9 keyboard that I could use for texting and not much else. Also, my hair wasn’t grey.

But in some ways, fifteen years feels like the blink of an eye.

I spent my mornings at the Willie Clancy Summer School immersed in the history of Irish traditional music, with Paddy Glackin as a guide. We were discussing tradition and change in generational timescales. There was plenty of talk about technology, but we were as likely to discuss the influence of the phonograph as the influence of the internet.

Outside of the classes, there was a real feeling of lengthy timescales too. On any given day, I would find myself listening to pre-teen musicians at one point, and septuagenarian masters at another.

Now that I’m back in the Clearleft studio, I’m finding it weird to adjust back into the shorter timescales of working on the web. Progress is measured in weeks and months. Technologies are deemed outdated after just a year or two.

The one bridging point I have between these two worlds is The Session. It’s been going in one form or another for over twenty years. And while it’s very much on and of the web, it also taps into a longer tradition. Over time it has become an enormous repository of tunes, for which I feel a great sense of responsibility …but in a good way. It’s not something I take lightly. It’s also something that gives me great satisfaction, in a way that’s hard to achieve in the rapidly moving world of the web. It’s somewhat comparable to the feelings I have for my own website, where I’ve been writing for eighteen years. But whereas adactio.com is very much focused on me, thesession.org is much more of a community endeavour.

I question sometimes whether The Session is helping or hindering the Irish music tradition. “It all helps”, Paddy Glackin told me. And I have to admit, it was very gratifying to meet other musicians during Willie Clancy week who told me how much the site benefits them.

I think I benefit from The Session more than anyone though. It keeps me grounded. It gives me a perspective that I don’t think I’d otherwise get. And in a time when it feels entirely right to question whether the internet is even providing a net gain to our world, I take comfort in being part of a project that I think uses the very best attributes of the World Wide Web.

Registering service workers

In chapter two of Going Offline, I talk about registering your service worker wrapped up in some feature detection:

<script>
if (navigator.serviceWorker) {
  navigator.serviceWorker.register('/serviceworker.js');
}
</script>

But I also make reference to a declarative way of doing this that isn’t very widely supported:

<link rel="serviceworker" href="/serviceworker.js">

No need for feature detection there. Thanks to the liberal error-handling model of HTML (and CSS), browsers will just ignore what they don’t understand, which isn’t the case with JavaScript.

Alas, it looks like that nice declarative alternative isn’t going to be making its way into browsers anytime soon. It has been removed from the HTML spec. That’s a shame. I have a preference for declarative solutions where possible—they’re certainly easier to teach. But in this case, the JavaScript alternative isn’t too onerous.
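
Incidentally, register() returns a promise, so if you want a bit of feedback while you’re developing, you can log the outcome. A minimal sketch:

<script>
if (navigator.serviceWorker) {
  navigator.serviceWorker.register('/serviceworker.js')
  .then( registration => {
    // The scope tells you which URLs this service worker controls.
    console.log('Service worker registered with scope', registration.scope);
  })
  .catch( error => {
    console.error('Service worker registration failed:', error);
  });
}
</script>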

So if you’re reading Going Offline, when you get to the bit about someday using the rel value, you can cast a wistful gaze into the distance, or shed a tiny tear for what might have been …and then put it out of your mind and carry on reading.

Posting to my site

I was idly thinking about the different ways I can post to adactio.com. I decided to count the ways.

Admin interface

This is the classic CMS approach. In my case the CMS is a crufty hand-rolled affair using PHP and MySQL that I wrote years ago. I log in to an admin interface and fill in a form, putting the text of my posts into a textarea. In truth, I usually write in a desktop text editor first, and then paste that into the textarea. That’s what I’m doing now—copying and pasting Markdown from the Typed app.

Directly from my site

If I’m logged in, I get a stripped down posting interface in the notes section of my site.

Notes posting interface

Bookmarklet

This is how I post links. When I’m at a URL I want to bookmark, I hit the “Bookmark it” bookmarklet in my browser’s bookmarks bar. That pops open a version of the admin interface tailored specifically for links. I really, really like bookmarklets. The one big downside is that they don’t work on mobile.
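
If you’ve never made one, a bookmarklet is just a javascript: URL saved as a bookmark. Something along these lines would do the trick (the admin URL is entirely made up, so point it at your own posting interface; line breaks added for readability):

javascript:(function(){
  // example.com stands in for your own site's posting interface.
  window.location = 'https://example.com/admin/bookmark'
    + '?url=' + encodeURIComponent(document.location.href)
    + '&title=' + encodeURIComponent(document.title);
})();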

Text message

This is something I knocked together at Indie Web Camp Brighton 2015 using the Twilio API. It’s handy for posting notes if I’m travelling somewhere and data is at a premium. But I don’t use it that often.
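
For the curious: Twilio works by POSTing each incoming text message to a URL you choose, with the text of the message in a form-encoded Body parameter. Purely as an illustration (this Node sketch isn’t what’s actually running on my server), the general shape looks like this:

// Twilio POSTs form-encoded parameters, including Body (the message
// text) and From (the sender), to whatever webhook URL you configure.
const express = require('express');
const app = express();
app.use(express.urlencoded({ extended: false }));

app.post('/sms', (request, response) => {
  const note = request.body.Body;
  // ...check the sender, then publish the note to the site here...
  // Replying with empty TwiML stops Twilio texting anything back.
  response.type('text/xml').send('<Response></Response>');
});

app.listen(3000);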

Instagram

Thanks to Aaron’s OwnYourGram service—and the fact that my site has a micropub endpoint—I can post images from Instagram to my site. This used to happen instantaneously, but Instagram changed their API rules for the worse. Between that and their shitty “algorithmic” timeline, I find myself using the service less and less. At this point I’m only on there for the doggos.

Swarm

Like OwnYourGram, Aaron’s OwnYourSwarm allows me to post check-ins and photos from the Swarm app to my site. Again, micropub makes it all possible.

OwnYourGram and OwnYourSwarm are very similar and could probably be abstracted into a generic service for posting from third-party apps to micropub endpoints. I’d quite like to post my check-ins on Untappd to my site.
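
Micropub itself is pleasingly simple: a client POSTs form-encoded values to your endpoint, authorised with a token. The endpoint URL and token here are placeholders, but the shape of the request comes straight from the spec:

// The endpoint URL and token are placeholders; yours will differ.
fetch('https://example.com/micropub', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_TOKEN_HERE',
    'Content-Type': 'application/x-www-form-urlencoded'
  },
  body: new URLSearchParams({
    h: 'entry',
    content: 'Enjoying a nice pale ale.'
  })
});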

Other people’s admin interfaces

Thanks to rel="me" and IndieAuth, I can log into other people’s posting interfaces using my own website as the log-in, and post to my micropub endpoint. Quill is a good example of this. I don’t use it that much, but I really should—the editor interface is quite Medium-like in its design.
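
The rel="me" part is nothing more than my site and my profiles elsewhere linking to one another, so that a service can verify they belong to the same person. Something like this in my site’s markup, with the profile pages linking back:

<a rel="me" href="https://twitter.com/adactio">Twitter</a>
<a rel="me" href="https://github.com/adactio">GitHub</a>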

Anyway, those are the different ways I can update my website that I can think of right now.

Syndication

In terms of output, I’ve got a few different ways of syndicating what I post here, pushing copies out to silos like Twitter, Facebook, and Instagram.

Just so you know, if you comment on one of my posts on Facebook, I probably won’t see it. But if you reply to a copy of one of my posts on Twitter or Instagram, it will show up over here on adactio.com thanks to the magic of Brid.gy and webmention.
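
For anyone wondering how that works under the hood: a site that accepts webmentions advertises its endpoint with a simple link element, and Brid.gy sends the replies there. The URL here is a placeholder:

<link rel="webmention" href="https://example.com/webmention">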

100 words 085

I’m back in Brighton after a thoroughly lovely weekend in Ireland. I must remember to visit Cobh more often in the summertime, when there are quite a lot of fun things to do.

But it’s nice to be back in Brighton too. This is the time of year when a seaside town really comes alive. And this is a particularly good week to be in Brighton—in just a few more days it’ll be time for the third and final Responsive Day Out. I know it’s going to be an excellent event, packed with great talks. I’m really looking forward to it.