Did a Chatbot just improve me?

I’ve been looking for a new contract for a little while now, and thought I’d try to improve my chances by responding to some feedback suggesting that, based on my CV, it’s not that clear what I can do or why you’d hire me.

I wrote a “little” thing, then asked Bing’s ChatGPT bot to review it. The damned bot culled my chattiness, but certainly made it significantly easier to consume. I mean, you’re probably already getting bored with this human-generated introduction, right?

With that in mind, and without further delay, here is the comparison of my attempt to explain (potentially to recruiters, or as a huge LinkedIn post, or something) what I do, vs the Bingbot summary; me vs ChatGPT; human vs robot; meat vs machine:

Continue reading

Burberry X Google Hackathons

While working at Burberry I had a few opportunities to work directly with some of the lovely people at Google, focussing on site speed, resilience, and improving the user experience.

Working with Google representatives, we organised a series of fascinating hackathon events relating to improving perceived performance, and not losing an offline user.

Perception of speed

These sessions started with an introduction to that hackathon’s theme. For example, a “Perceived Performance” session described the 2009 study at an airport in Texas where complaints about baggage delivery were abnormally high, even though delivery times were below the average wait time. The cause was eventually traced to the arrival terminal being only a minute’s walk from the baggage claim area: passengers were quick to disembark the plane and get through the terminal, and then stood around waiting for their luggage.

The solution that most improved passenger happiness was to increase the distance between the arrival terminal and baggage reclaim: although the delay between arriving and receiving baggage stayed largely the same, it was now spent actively moving, so passengers didn’t feel like they were wasting their time passively waiting around.

This was a great example of the principle that if you can’t make something actually fast, you can at least make it feel fast enough.

Offline (or is it?)

Some other interesting hacks related to the user being offline; something that is more common than teams might expect. Mobile traffic has soared over the past few years, leaving desktop traffic in its wake.

One huge difference between desktop and mobile is connectivity: WiFi / ethernet is more reliable than traversing the physical space that makes up a mobile network – jumping between cell towers, moving through built-up areas and underground transport dead zones.

The default website reaction to this would be a browser error page; however, given how often a mobile user might see this on your site, it’s certainly worth considering mitigations.
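
The simplest mitigation is a branded offline page served by a service worker instead of the browser’s error screen. Here’s a minimal sketch of that idea (the '/offline.html' URL and cache name are illustrative, not taken from any particular site):

// sw.js – pre-cache a fallback page, then serve it for navigations that fail while offline
self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open('offline-fallback').then((cache) => cache.add('/offline.html'))
  );
});

self.addEventListener('fetch', (event) => {
  // Only intercept full page navigations
  if (event.request.mode === 'navigate') {
    event.respondWith(
      fetch(event.request).catch(() => caches.match('/offline.html'))
    );
  }
});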

The first hack worth mentioning was one where I suggested a couple of new engineers attend; surprised at the invite, they asked what they should do!

My suggestion was to implement client-side caching via a service worker for any visited product detail page (PDP), and to alter the product listing page (PLP) to greyscale the images of any products not in the cache when the user is offline. This allows the user to keep hopping between the listing page and the detail pages of products they’ve already seen – which they can clearly identify – until they’re back online.

(Mock up of how a “greyscale if not cached” product listing page could look)
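
To make the idea more concrete, here’s a rough sketch of how it could be wired up; the cache name, URL pattern, and data attribute are illustrative assumptions, not the actual implementation from the hackathon:

// sw.js – cache every visited product detail page (network first, cache fallback)
const PDP_CACHE = 'pdp-cache-v1';

self.addEventListener('fetch', (event) => {
  const url = new URL(event.request.url);
  if (!url.pathname.startsWith('/products/')) return;

  event.respondWith(
    caches.open(PDP_CACHE).then(async (cache) => {
      try {
        const response = await fetch(event.request);
        cache.put(event.request, response.clone()); // remember this PDP for offline use
        return response;
      } catch (err) {
        const cached = await cache.match(event.request);
        return cached || Response.error(); // offline and never visited: genuine failure
      }
    })
  );
});

// plp.js – when offline, greyscale any product whose PDP isn't in the cache
async function markUncachedProducts() {
  if (navigator.onLine) return;
  const cache = await caches.open('pdp-cache-v1');
  for (const link of document.querySelectorAll('a[data-pdp-url]')) {
    const cached = await cache.match(link.dataset.pdpUrl);
    const img = link.querySelector('img');
    if (!cached && img) img.style.filter = 'grayscale(100%)';
  }
}
window.addEventListener('offline', markUncachedProducts);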

This suggestion was implemented beautifully, and that team won the multi company hackathon! Go team!


The second mentionable hack, although developed at a much later hackathon, would turn out to be complementary to the first.

By utilising a service worker and custom JavaScript (Background Sync doesn’t work in Safari, which was the biggest mobile market for e-commerce), the user could still add products to their basket while offline, with the basket synchronising automatically once the connection is restored. This was slick, and using it by default could avoid those annoying “wait, did that add to my bag? Should I click again?” situations. (I did a talk about Optimistic UI at some point too, which relates to this.)
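
As a rough illustration of the pattern (the endpoint and storage key here are assumptions for the sketch, not the production code), the custom JavaScript boils down to queueing the add locally when the network call fails, then replaying the queue once connectivity returns:

// Queue basket adds while offline, replay them when back online
const QUEUE_KEY = 'pending-basket-adds';

async function addToBasket(productId) {
  try {
    await fetch('/api/basket', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ productId }),
    });
  } catch (err) {
    // Offline (or the request failed): queue locally and update the UI optimistically
    const queue = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
    queue.push(productId);
    localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
  }
}

// Replay queued adds, in order, once connectivity returns
window.addEventListener('online', async () => {
  const queue = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
  localStorage.removeItem(QUEUE_KEY);
  for (const productId of queue) {
    await addToBasket(productId); // re-queues itself if we drop offline again mid-sync
  }
});

Combined with the cached PDPs from the earlier hack, this keeps the browse-and-buy loop usable even on a flaky connection.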

This could potentially be progressively enhanced further to simplify it using the newer Periodic Background Sync API, or the service worker Background Sync API, in browsers that support them. I’ll leave it as an exercise for the reader! Fancy trying it?

Being able to involve a large subset of the digital department – including all roles, not just engineers – was fascinating. Getting everyone onto shared context, and seeing what is actually possible in browsers with a relatively small amount of code and effort, was an eye-opener for many.

If you are unsure of the value of a hackathon, or not sure how to run one, contact someone who has done it before to help define the desired outcomes, set the agenda, and keep things running smoothly. The lovely people at Google did this beautifully for us at Burberry, and I’m happy to advise too!

Rent vs Buy vs Build

AKA “Is your Company a Software House or is it a Retailer?”

These days you can easily set up an online shop in a matter of minutes or hours, buying no hardware and setting up no software; just sign up to something like Shopify (or WooCommerce if you’re using WordPress) for a few coins a month and you can sell goods and services online.

You’re spoiled for choice if you have more money to invest, thanks to a proliferation in professional solutions for being an online retailer.

The complexity hidden behind these few setup clicks is quite staggering for those of us who were involved in building e-commerce in the earlyish years.

Back in the early 2000s, there weren’t many options for powering an e-commerce website. Without regular reviews of the ecosystem, some of us might have fallen into the trap of spending a lot of time and effort “writing software” instead of solving business problems or helping meet business goals; the “Software House” vs “Online Retailer” dichotomy.

For example…

Continue reading

Browser Super Powers: getUserMedia

In case you didn’t already believe it, your Web Browser has super powers. No longer is it something to merely display a document marked up with hypertext.

No longer is it limited to the read-only text and images of the olden days (aka the last two decades or so). Oh no. Now that the browser wars have cooled down, and the community groups are collaborating and updating the W3C standards rapidly enough for the eager-beaver browser vendors, we’re seeing new functionality quickly adopted across all major browsers.

One of these super powers allows us to access the user’s microphone and camera (with their permission) using a one-liner:

var promise = navigator.mediaDevices.getUserMedia(constraints);

where constraints define the device preferences, such as:

{
  audio: true,
  video: {
    width: { min: 1024, ideal: 1280, max: 1920 },
    height: { min: 776, ideal: 720, max: 1080 },
    facingMode: "user"
  }
}

Here we’re requesting permission to access the device’s microphone and the camera, with a minimum and maximum requirement around the camera resolution, as well as defining a preference for the front-facing camera if available (facingMode).

This is all just plain old JavaScript too! At the time of writing, you only need to worry about ye olde IE and Opera Mini not supporting it.
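
If you do need to cater for those stragglers, a quick feature check (a minimal sketch) lets you fall back gracefully:

// Check the capability exists before relying on it
if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
  console.log('getUserMedia is available in this browser');
} else {
  // e.g. old IE or Opera Mini: hide or disable the camera feature instead
  console.log('getUserMedia is not supported here');
}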

Don’t believe me? Go to a website that uses HTTPS, open your browser’s Developer Tools and paste this in to the console:

var constraints = { audio: true, video: true };

navigator.mediaDevices.getUserMedia(constraints)
.then(function(mediaStream) {
    // create a video element, attach the camera stream, and play it once ready
    var video = document.createElement('video');
    document.getElementsByTagName('body')[0].appendChild(video);
    video.srcObject = mediaStream;
    video.onloadedmetadata = function(e) {
        video.play();
    };
})
.catch(function(err) { console.log(err.name + ": " + err.message); });

You’ll be prompted for permission to access your devices:
(Browser requesting permission to access the camera)

If you allow, then you can scroll to the bottom of the page and see your lovely face appear in a dynamically generated video tag:

(Dynamically added video element with my pretty face in it)
Amazing, right?

Web browsers are getting closer to native apps in what they can achieve, and getUserMedia (aka Stream API) is just one example of this.

How to create an Apache-licensed Private WebPageTest setup, and get the Classic Interface

In my previous articles I took you through the process of setting up your own private WebPageTest, either via the AWS interface, or via Terraform (infrastructure as code).

By default, this would create a Private WebPageTest instance that uses the latest code on the release branch of the official WPOFoundation github repo for WebPageTest.

(The new WebPageTest UI)

This is great if you like the newer UI (it’s not as up to date as the official WebPageTest.org site, which obviously evolves much faster), but it might not be what you want for a couple of reasons:

  1. You preferred the original, classic, WebPageTest UI, or
  2. You plan to monetise your private WebPageTest setup, which violates the release branch’s LICENSE.md entry about “Noncompete” and “Competition”

Since WebPageTest existed loooong before Catchpoint bought it up, the original version of the code (the fully open source version) still exists, and has no such non-competition concerns. It does have a LICENSE file, but that just lists all the licenses associated with the other libraries WebPageTest uses.

By the way, the same is also true for the WebPageTest agent – master branch & release branch vs apache branch – so bear that in mind if you’re creating a competing product. Presumably this is what Speedcurve do, for example. (Apparently so!)

In this article I’ll show you how to tweak the previous private WebPageTest installation scripts and setup processes to use the apache branch, thus reverting to the “classic” UI and freeing you up from non-competition concerns. (If you get rich because of this article, please buy me a coffee and hire me, thanks 😁)

Continue reading

Automate Your WebPageTest Private Instance With Terraform: 2021 Edition

This article assumes you have an understanding of what Terraform is, what WebPageTest is and some AWS basics.

(All the logos)

Have you ever wanted to have your WebPageTest setup managed as infrastructure as code, so you can keep all those carefully tuned changes and custom settings in source control, ready to confidently and repeatedly destroy and rebuild at a whim?

Sure you have.

In this article I’ll show how to script the setup of your new WPT server, installing from a base OS, and configuring customisations – all within Terraform so you can easily rebuild it with a single command.


Continue reading

Google’s Chrome User Experience Data in WebPageTest

This article assumes that you know the basics of AWS, WebPageTest, SSH, and at least one linux text editor.

(Chrome User Experience Report logo + WebPageTest logo + Core Web Vitals' LCP thresholds. Quite a busy hero image, I'll admit.)

When talking to people about website performance stats, I’ll usually split it into Real User Metrics (RUM) and Automated (Synthetic/Test Lab):

  • RUM is performance data collected from the website you own and reported into the analytics tool you have integrated.
  • Automated covers scripted tests that you run from your own performance testing tool against any website you like.

RUM is great: you get real performance details from real user devices and can investigate the difference in performance for many different options.

For example, iPhone vs Android, Mac vs Windows, Mobile vs Desktop, Chrome vs Firefox, UK vs US, even down to ISP and connection type, in order to see who is getting a good experience and who can be improved.

This data is invaluable in prioritising performance improvements, since you can tie it back to the approximate number of users it will affect, and therefore the impact on your business.

There are loads of vendors who can provide this for you (I’ve used many of them), or you can write your own – if you’re a glutton for punishment (and high AWS bills – ask me how I know…😁)

However, since this is measured on your own website and reported into your own tooling, you can only see such real-world performance detail for your own website; no real-world user experience data from your competitors.

Automated tests are great: you get detailed measurements of any website you can access – your own or competitors, or basically any website – in a repeatable fashion so that you can track changes over time.

You can have as many automated tests as you like, you can test from wherever you’re able to set up a test agent, and with whatever device you’re able to automate or emulate.

However, since these are all automated tests running because you said so, you can’t use them to understand how users are actually using your site: which devices they’re on, which devices are underperforming others, and where they’re located.

Again, there are a load of vendors who can provide this for you; writing your own is a bit more of a headache though – I wouldn’t recommend it, especially while wpt continues to be free for self hosting.

What if you could get some of the usual key performance metrics you’re used to with RUM, but for sites you don’t own such as your competitors?

In this article I’ll talk about the Google Chrome User Experience dataset and how to use it in your performance test setup to find the intersection of RUM and Automated performance test results, wiring it all up into your WebPageTest setup!

Continue reading

WebPageTest Private Instance: 2021 Edition

(Catchpoint's WebPageTest)

The fantastic WebPageTest, free to use and public, has also been available to set up as your own private instance for many years; I wrote this up a while back, and scripted a Terraform version to make it as easy and automated as possible.

For AWS it was just a case of creating an EC2 instance (other installation options are available) with a predefined WPT server AMI (Amazon Machine Image), adding in a few configuration options, and boom – your very own, autoscaling, globally distributed website performance testing solution! New test agents would spin up automatically in other AWS regions, all based on WebPageTest agent AMIs.

In 2020 WebPageTest was bought by Catchpoint, and we finally saw improvements being made, pull requests being closed, and the WebPageTest UI getting a huge update; things were looking great for WebPageTest enthusiasts! If you haven’t heard of Catchpoint before, they are a company all about global network and web application monitoring, so a good match for WebPageTest.

Unfortunately, this resulted in the handy WebPageTest server EC2 AMIs no longer existing. If you want your own private installation, you now need to build your own WebPageTest server from a base OS. It can be a bit tricky, though it gives you a greater understanding of how it works under the hood, so hopefully you’ll feel more confident extending your installation in future.

In this article, I’ll show you how to create a WebPageTest private instance on AWS from scratch (no AMI), create your own private WebPageTest agents using the latest and greatest version of WebPageTest, and wire it all up.

Continue reading

Creating a 4G router using a Raspberry Pi and a mobile phone

A few days ago these workmen were using cutting machinery dangerously close to my broadband cables:

Shortly after this picture was taken – bang! No internet! They cut the cables while doing their work!

Two adults working from home on back to back video calls, one high-schooler also on video classes, and one primary-schooler with streaming classes – all suddenly disconnected from the world!

That afternoon we huddled around the kitchen table, mobile phones on with hotspots enabled to get through the rest of the day – but this wouldn’t work for regular use.

The broadband company said it wouldn’t be fixed for weeks due to how badly everything was damaged; the pavement would have to come up! I had to think of a pragmatic solution.

In this article I’ll go through the steps I took to completely swap my home broadband for a Raspberry Pi and a spare mobile phone – and show the results!

Continue reading