Author: Kev Quirk

Hi, I'm Kev and I'm a cyber security professional from England. I use this site to share my thoughts and ideas from time to time.

Which Is The Best WordPress Caching Plugin?

I’ve talked about how I optimise this site before, but I wanted to do some digging into which is the best WordPress caching plugin. I’ve tested some of the most popular caching plugins available, and decided to write this post with the results.

Testing WordPress caching plugins

Before I get into the results, let’s talk about how I actually went about testing each plugin, so you have an idea of what I tested and how.

My friend and Fosstodon colleague, Matt Cooper, kindly offered to give me a test WordPress site on his shared hosting account with Hostinger. From there I created a 1,200 word post using Lorem Ipsum generated text. I also added a 140KB image as the featured image.

I kept all of the WordPress settings default, except for the permalink structure, which I set to post-name. All plugins were removed and I used the default Twenty Twenty WordPress theme.

Which WordPress caching plugins were tested?

There are a lot of caching plugins available for WordPress, but for this test I went with the 4 most popular plugins I could find, which are:

  1. LiteSpeed Cache
  2. W3 Total Cache
  3. WP Fastest Cache
  4. WP Rocket (premium plugin)

I installed each of these plugins in turn, then configured them as best I could. At first I was going to establish a baseline and configure them all as similarly as possible, but I later decided that this was unfair.

We’re looking for the best WordPress caching plugin here, so I wanted to test all the bells and whistles each plugin offers, to get a true representation of what they can do for a site’s performance.

How was the testing carried out?

Once I had established which WordPress caching plugins I was going to test, I headed over to GTMetrix to carry out the tests. The server that Matt’s shared hosting account is on is located on the east coast of the US. The closest GTMetrix server is located in Dallas, TX. So I went with that. I also selected Firefox as the test browser.

GTMetrix test results

I tested the same page without WordPress caching so I could get a baseline, then with each caching plugin listed above.

Tests were carried out 6 times for each caching plugin. The first test was discarded; its job was to let GTMetrix prime the cache for the subsequent 5 tests, which I then took an average of. This, I think, gave a good idea of how the WordPress caching plugins were improving performance.
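The averaging approach can be sketched in a few lines of Python (the timings below are made up for illustration, not the real test data):

```python
def average_warm_runs(load_times):
    """Average a series of test runs, discarding the first
    (cache-priming) run, as described above."""
    warm = load_times[1:]  # drop the cold run
    return round(sum(warm) / len(warm), 2)

# Hypothetical GTMetrix load times in seconds: 1 cold run + 5 warm runs
print(average_warm_runs([2.10, 1.40, 1.30, 1.50, 1.20, 1.60]))  # → 1.4
```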

The results

GTMetrix gives a tonne of data for users to sift through, but I focussed on 3 metrics for my testing:

  1. Load time in seconds
  2. Total page size
  3. Number of requests

                     Avg Load Time   Page Size   Requests
  No caching         2.26 s          194 KB      9
  LiteSpeed Cache    1.36 s          185 KB      7
  W3 Total Cache     1.36 s          195 KB      7
  WP Fastest Cache   1.26 s          195 KB      8
  WP Rocket          1.46 s          185 KB      6

It was great to see that all the caching plugins improved load times. The quickest of which, WP Fastest Cache, knocked a full second off.

I did think there would be more of a difference between the different caching plugins though. I was also really surprised to see that WP Rocket, a premium plugin that costs $45/year, was the slowest.

Having said that, we’re only talking about 2 tenths of a second between the fastest and slowest caching plugins. And when navigating around the test site, I didn’t really notice a difference in load times when using any WordPress caching.

Real world testing

The test site I used was great and gave me good data, but I wanted to see what these plugins could do with a site that has a lot more going on in its posts. While 1,200 words and a single image is indicative of many blog posts, I think a post with roughly the same number of words, multiple images and comments would be a better test.

So I decided to carry out the same tests on one of my posts. My most recent post, Synology vs Nextcloud, seemed a good candidate.

It’s 1,400 words, 3 images and 66 comments (each with an avatar). This means that the WordPress caching has to do a lot more work because it’s loading things like analytics, an advert and commenter avatars. It’s also doing some IndieWeb goodness too.

Results round 2

So I went ahead and carried out exactly the same tests on my Synology vs Nextcloud post. Here are the results for that post:

                     Avg Load Time   Page Size   Requests
  No caching         1.98 s          2.32 MB     62
  LiteSpeed Cache    1.90 s          2.31 MB     50
  W3 Total Cache     1.64 s          704 KB      22
  WP Fastest Cache   1.64 s          2.31 MB     53
  WP Rocket          1.38 s          961 KB      20

I found these results to be far more interesting. First of all, LiteSpeed Cache and WP Fastest Cache didn’t really manage to reduce the number of requests, or the page size, by much.

I think this is because these plugins didn’t appear to have a lazy loading option for images (not that I could find, anyway). So the entire post, all the images, and all the avatars were loaded right away.

Since W3 Total Cache and WP Rocket both support lazy loading, the page size and request numbers were significantly reduced.

WP Rocket managed to reduce the load time most significantly, but even that was only 600ms quicker than with no caching at all. Again, I was expecting the load times to be improved more than this.

Having said that, the server this site runs on is a VPS that’s far from under-resourced, so even without caching, WordPress still loads fairly quickly. If you want more info on my hosting setup, here’s a post about what it costs me to run this blog, which explains it all.

This time around there was a noticeable difference, I felt, with W3 Total Cache, WP Fastest Cache and WP Rocket. LiteSpeed Cache and no caching did feel noticeably slower than the rest.

What I have learned

I think the most important lesson I’ve learned from this testing is that caching is not a magic pill that fixes all performance issues. I already have a good quality server, and I optimise this site in a number of ways, not just caching.

So I suppose, in hindsight, the results aren’t that surprising in my own environment. But I still would have expected more performance to be squeezed out of the shared environment.

I currently pay for WP Rocket, as it’s widely regarded to be the best out there. Although it did get the best speed results of the WordPress caching plugins that I tested on my site, in the real world it didn’t feel any quicker than WP Fastest Cache or W3 Total Cache. Plus, it performed the worst on the shared environment.

My WP Rocket license is due for renewal in April 2021; it’s a 3-site license costing $99/year. My initial reaction was to not renew and instead move to W3 Total Cache, which also has a premium offering. However, that’s $99/year for 1 site. So I might as well stay put with WP Rocket, especially since it was the fastest in the tests for my site.

What can you take from this post?

Do your own testing! It’s clear from the testing I’ve done that the results can vary greatly depending on the environment your site is hosted on.

WP Rocket offers a free trial, so you can still give it a go, but make sure you test other WordPress caching plugins too, as you may be able to gain more performance with one of the free offerings. Not to mention save a few dollars too!


I’m really happy with the setup I have, but that’s a result of lots of testing over the years. Some things have worked, some haven’t. I’m glad that my current choice, WP Rocket, ended up being the fastest in my environment, but the free alternatives came very close and were quicker on the shared hosting.

I’m going to be sticking with WP Rocket when my license is up for renewal in April.

Do you use any of the WordPress caching plugins listed in this post, or is there a hidden gem in the WordPress plugin library that I have missed? If so, why not tell me about it in the comments below.

Quick Update

Hey folks. You may have noticed that I’ve been a little light on content over the last couple of weeks. That’s because I’ve been extremely busy with some personal stuff that I’ve been dealing with (all positive, don’t worry).

If you’re subscribed to my newsletter, you will know all the details around this personal stuff already.

Anyway, I just wanted to let you fine people know that I haven’t gone anywhere. I’m still writing and will be publishing more content just as soon as I get time.

Talk very soon…hopefully!

Synology vs Nextcloud – Which Is Better For A Home Server?

Before I get hung out to dry by the Internet, I’d like to point out that this comparison of Synology vs Nextcloud is my opinion only. It is based solely on my needs and my experiences with both products.

TL;DR – I think Synology is a far superior product.

My experience with Synology & Nextcloud

I wanted to preface this post by mentioning the experience I have with the two products – I’m currently running a Synology device as my home server and previously ran a different Synology device for ~5 years.

That first Synology ended up dying and I decided to replace it with a home built server running Nextcloud. I ran that for around a year and a half, before going back to Synology.

So when talking about Synology vs Nextcloud, I would say I have a fair amount of experience with both.

Right, enough waffle; let’s crack on with the post…

My home server needs

I suppose a good place to start would be to talk about my needs from a home server. To be honest, they’re pretty basic as server hosting goes.

There are only three things I need from my home server, which are:

  1. File syncing & storage
  2. Media streaming & storage
  3. Backups

That’s pretty much it. I do use my server for other things, but these 3 services are the crucial ones. Everything else is just me mucking around.

File syncing & storage

This is the bread and butter of both Synology and Nextcloud. And to be honest, both do this very well. On Nextcloud, I had very few duplicate files and syncing was relatively quick.

However, when it comes to the mobile experience, the Nextcloud Files app is pretty poor. There are options to back up photos automatically as they are taken, but I never managed to get this to work right. Instead, they would just queue up in the app and wouldn’t actually upload to the server until I opened the app on my device.

Not very automatic, is it?

I use the Synology Drive application to sync files across all my devices – Windows, Linux and iOS. Everything works great. Photos are synced automatically using the Moments app, and unlike Nextcloud, it all works flawlessly.

Synology Drive client

So the desktop experience for Synology vs Nextcloud is pretty much on par. But the mobile experience has been far superior for me on Synology.

Media streaming & storage

When my wife and I get some downtime from the kids, we like to watch TV. We have Netflix and Amazon Prime, but we also like to stream stuff from our home server.

I wasn’t able to find such a service within Nextcloud, so I opted to install a Plex server instead. Plex is a great service, but it was another piece of software for me to maintain on my server.

When I think cloud, I usually think multimedia too. So the fact that Nextcloud has no way of managing a video library is a shame, and a big gap I think, as many people use their home server/NAS for streaming.

Synology has the Video Station app for streaming videos using my web browser. There are also accompanying apps for my Apple devices, and my Roku streaming boxes. So I can access my video library from pretty much anywhere – big win!

Synology Video Station

I could also install Plex on my Synology if I wanted, but I decided to just use the native Synology app, as it works really well and saves me from using third-party applications outside of the Synology ecosystem.


Backups

Your file syncing application of choice is not your backup. Backups should always follow the 3-2-1 backup rule.

At least 3 copies of your data, on 2 different storage media, 1 of which needs to be off-site.

The 3-2-1 backup rule

Again, Nextcloud falls flat on its face here and has absolutely nothing for backing up either locally or off-site. Another big miss. So when running Nextcloud, I had to install yet another application – Duplicati.

Like Plex, Duplicati is an excellent application that works really well. But at this point, I not only had Nextcloud to maintain, but also the OS, Plex and Duplicati. This became a much bigger beast than it needed to be.

Synology, on the other hand, has a native application – Hyper Backup. I use this to back up all of my important data locally, then I use another native Synology app, Cloud Sync, for my off-site backups to Backblaze B2.

If you want to know more, this post talks about my Synology off-site backups in more detail.

Synology Hyper Backup


Maintenance

At this point Nextcloud had failed at everything I needed of it outside of basic file syncing. If that’s all you need, then Nextcloud is a lot cheaper to set up in terms of money in the bank. But in terms of maintenance and your time, it’s still a lot more expensive than Synology.

You see, even if you decide you only need to run Nextcloud for syncing files and that’s all your server will do, you will still have the operating system to maintain.

If, like me, you decide to run other applications outside of what Nextcloud offers, then you will need to include the maintenance of those apps too. And, let’s not forget, all it takes is for one of those updates to go wrong and your entire server could be toast (thank goodness for containers!).

I think it’s important to say here that if you opt for the Nextcloud snap package, maintenance is much reduced, as snaps update automatically. This is what I was doing on my Nextcloud server, but I got sick of waiting for the snap package to actually be upgraded by the Nextcloud team.

In my experience, the snap package was very much a second class citizen and it was way behind the other packages. I hope this has improved since I stopped using it.

With Synology, the OS and applications are all managed via the same web interface. So whether you need to install updates for your applications or the OS, everything is managed from the same place. It’s literally a single button click. Plus, in all the years of using Synology, I’ve never known an update to cause an issue with the system.

Unfortunately the same can’t be said for my Nextcloud updates – in the time I was using it, I had my system bork twice due to dodgy updates or dependency issues.

Other issues with Nextcloud

My experience with many of the Nextcloud “apps” is that, to be frank, they’re shit.

Nextcloud Talk is a hot mess of slowness. Nextcloud Mail has a tonne of random errors when connecting to a mail server and sending mail. Not to mention it has a horrible user experience.

Nextcloud Contacts and Calendars regularly failed to sync via DAV. Nextcloud News had some weird errors that couldn’t be dismissed, and only worked with an official mobile app. Not very open. 🙁

I could go on, but the common thread across my experience of many of the Nextcloud apps was that they had been very poorly implemented.

I personally felt that this was indicative of the Nextcloud team trying to run before they could walk – they’re just trying to do too much, too soon.

Conclusion of Synology vs Nextcloud

So for me, when it comes to Synology vs Nextcloud there is no comparison – Synology wins all day long. Nextcloud is a good tool and has a bright future ahead of it (I hope).

You never know, I may even go back to it one day. But for the time being, since I need to do other stuff outside of basic file syncing, it’s the Synology.

I also like the fact that Synology have tonnes of other apps that all work really well, so I can play around with the server. For example, if I want to add a really good mail server to my Synology, it’s just a few clicks away.

Need to take notes? Synology has an app for that. Stream audio? Yep, that’s there. Torrent downloading? Check! DNS server, proxy server, Mattermost-type chat application, VPN server, mail server, calendar, contacts, even an office suite! It’s all there.

Synology devices are not cheap, but I’m a firm believer that you get what you pay for, and in this case I think that’s really evident.

Remember, this is just my opinion. If Nextcloud has proven to be a better solution for your needs, please feel free to tell me about it in the comments below.

CodeFund Adverts & This Site

You may have noticed that over the last month, there has been an advert at the top of this site. It looked something like this, depending on whether you loaded the dark version of this site or not:

Screenshot of CodeFund advert on this site.

The advert is from CodeFund, an ethical advertising agency. Their ads have no tracking, no cookies and no nonsense.


I started using CodeFund at the beginning of June following a recommendation on Fosstodon. I wanted to see if I could use it as a way of getting a bonus for some of the many hours I spend on this blog.

Things were pretty successful. I hadn’t received any complaints from visitors, and I was able to make around $80.

CodeFund earnings

While this won’t allow me to retire any time soon, I was planning on using the funds to save up for a new desktop computer. Unfortunately, CodeFund announced today that they are having to close their doors.

The time has come for us to shut down CodeFund. As many others, we were unable to survive the economic downturn.


This is a real kick in the nuts for me – I had finally found an ethical company that provided useful ads to my visitors, without tracking them or sacrificing their privacy.

While I’m not desperate to monetise this blog, as I have a full-time job, it would be nice if I could get a little bonus back for the time I put in.

The powers that be clearly had other ideas and stepped in to stop that from happening. Oh well, maybe other ethical monetisation options will appear in the future. Until then, I’ll keep chugging along and churning out content, like I always do. 🙂

If you have any recommendations of ways to ethically monetise this blog, please get in touch, or leave a comment below.

Email Is Not Broken

I’ve been reading a lot of hyperbole lately around how broken email is. Sure, email has problems, but is it actually broken? I don’t think so.

A lot of this hyperbole appears to have come about following the release of Basecamp’s new email service, Hey. I’ve signed up for a Hey trial, and although it seems like a good service, I don’t think it fixes any of the problems with email.

What is email?

A good place to start a discussion about something as polarising as email, is to articulate what email actually is. That way, you guys will hopefully understand where I am coming from right from the start.

To me, email is a way of receiving simple communications that have a short time to live.

That’s all email is to me. They’re mostly unimportant messages that I receive, deal with, and move on. I imagine that’s what they are to many other people too, especially when you consider how many people have hundreds of unread items in their inbox. How important can the vast majority of email actually be if there’s so much unread mail floating around?

To be clear, I’m not one of those people that has lots of unread mail. I’m a zero inbox kind of guy personally. But I couldn’t count the number of times I’ve seen someone’s unlocked phone and noticed a mail icon sporting a red blob with an inordinately large number displayed in it.

When I migrated my wife and me from Gmail to Zoho, the amount of unread mail she had in her mailbox was ridiculous. I thought she didn’t reply to my emails just because she didn’t like me; turns out she just doesn’t check her email! 🙂

The problems with email

Now we have established what I believe email is, let’s look at some of the problems with email. To me, the main problems are threefold:

  1. Spam
  2. Privacy
  3. Workflow management


Spam

Spam is by far the biggest problem to plague email. No spam filter is perfect, and it takes work to keep on top of it.

Many email providers allow you to manage your spam on the fly. You can mark emails that slip through the net as spam, so the next time they will be caught. And conversely, you can mark false positives as safe.

Let’s say that I received a spam email and it hits my inbox. I won’t delete it, I’ll actually mark it as spam so that my spam filter learns what spam is. If I were to delete it, my spam filter would be none the wiser and I would be perpetuating the problem.

Conversely, if a legitimate email gets incorrectly marked as spam I won’t just move that mail into my inbox. I will mark the sender as safe, then move the email. Again, it’s giving my spam filter the opportunity to learn.

I’ve been using Zoho for a few years now and by doing this, the spam filtering is excellent and I receive very few spam mails to my inbox.

Hey’s spam solution

If you receive an email to your Imbox (seriously? Imbox…what a ridiculous name) that has come from an address Hey has never seen before, it forces the user to screen it first.

Hey email screening

I think this is a good solution, in that it forces people to vet any potential spam as it comes in. But although this specific workflow isn’t baked into any other email provider that I know of, the ability to vet and manage spam is the norm these days.

Email isn’t broken in this case – people’s inability to manage their incoming mail is.


Privacy

Apparently there are 1.5 billion people using Gmail globally. I’ve made efforts to significantly reduce my Google usage, but a lot of people are happy with Gmail and that’s fine.

What isn’t fine is the complete lack of privacy that Gmail affords its users. Apparently Google will no longer read your email to personalise adverts. I don’t believe that for a second, but even if it’s true, there are still adverts in Gmail, and they wouldn’t provide a service to billions of people for free if they weren’t making a profit from it.

When the product is free, you’re the product.

Email is not private, so I don’t treat it as such. If I have something that I want/need to email that is private, I will either encrypt the email, or send an encrypted attachment. My email provider, Zoho, has a very open privacy policy but I still wouldn’t use their service to send private data.

It’s not just Gmail that’s the problem here – most free email providers have privacy issues.

Hey’s privacy solution

When it comes to privacy, Hey has a great policy, saying the following in their manifesto:

There are lots of “free” email services out there, but free email costs you one of the most valuable things you have – your privacy and your personal information. We’re not interested in your personal data. It’s always yours, never ours. We simply charge a flat, all‑inclusive $99/year fee for HEY. That makes our business work without having to sell your data, advertise to you, or otherwise engage in unscrupulous marketing tactics.

The Hey manifesto

I love this. As a Hey user you’re not being advertised to or tracked, and your data isn’t being harvested. Awesome. However, that’s no different from an untold number of other paid-for email services like Fastmail, Tutanota, Proton and Zoho.

So although Hey’s approach to their users’ privacy is great to see, it isn’t anything innovative and it isn’t fixing any problems. The only true way I see to fix the privacy problems with email is for people to stop using these free services and pay for a privacy-respecting one.

Most privacy-respecting email services cost less than the price of a cup of coffee per month. I think this is a relatively small cost for a service that many people use every day.

Workflow management

You have an email address, and over time hundreds of people and companies get hold of that address. This means that tonnes of mail is just arbitrarily dumped in your inbox, which in turn leads to that red blob on your phone’s home screen displaying a very large number.

No one has time to deal with that crap, am I right?

Establishing an email workflow is extremely important. Every email provider I can think of has some sort of filtering system that allows you to filter emails into certain folders automatically.

For example, if you buy a lot of stuff from Amazon or eBay, you could create a rule that automatically puts shopping and delivery receipts into a Shopping folder and marks them as read.

You don’t need to deal with them at all then. They’re dealt with automatically, and you know where they are if you need them. Same with newsletters – the ones you want to keep, file away; the ones you don’t, unsubscribe from. If there is no unsubscribe link, make a rule that automatically deletes those newsletters.

By working through your incoming mail and filtering out the noise, you’re left with a much smaller collection of mail that you need to actually deal with.

Hey’s email workflow

This is where I think Hey really lets itself down. The traditional setup of a folder tree down the side of the screen, with the ability to easily flip between folders, is logical to me.

However, the workflow on Hey is completely broken in my opinion. There are three main parts to Hey’s interface:

  • The Imbox (still a stupid name)
  • The Feed
  • The Paper Trail

The Imbox is exactly the same as your inbox, just with a silly name. The Feed is where newsletters etc. are supposed to be delivered, and The Paper Trail is for things like receipts.

This all sounds good, but there’s no single-click way of getting to and from those interfaces. The UI for each is also slightly different, which is jarring.

There is no sent items folder in Hey (not that I could see, at least). Everything just goes into an ever-scrolling feed of mail below your Imbox, called Previously Seen.

Managing email in Hey

If I have a newsletter I’m saving for later, along with an email I need to deal with, I have to flip between multiple interfaces within Hey. Whereas both would be in my Inbox in a traditional mailbox.

If I want to move something between the three Hey feeds, there’s no way to drag and drop. Instead I have to go into the email, click More, click Move then finally select the right feed.

Hey more dialogue
Hey move dialogue

That’s 4 clicks compared to a single click (or drag & drop) in a traditional mailbox.

I should note here that there are keyboard shortcuts throughout Hey. I really like that they list the shortcuts by the menu items too. However, the vast majority of people prefer using a mouse.

Power users tend to prefer keyboard shortcuts in my experience (myself included). But if Hey are trying to “fix email”, the interface needs to be efficient for everyone to navigate.

Email is NOT broken

All three of the problems with email that I have talked about in this post boil down to the user and their choices, rather than email as a service.

Hey is an interesting take on email and it may be the next big thing for email. But I personally feel that it’s a lot of hype, purely because it’s a new shiny thing for techies to play with.

If I were to give my wife a Hey mailbox, she would get very lost, very quickly. It’s an interesting concept, but I can’t help but think that Hey are trying to fix a problem that doesn’t exist.

Email is far from perfect, but it’s well established and mature. With a little bit of work your inbox can be a highly moderated list of only items you need to deal with.

My inbox (almost empty)


I’d like to end this post by saying that this is just my opinion. Some people may not want to invest the time to manage their incoming mail like I do.

That’s absolutely fine and in such cases services like Hey may work better for you. But just because you’re not prepared to put the work in, doesn’t mean that email is broken.

Email is far from perfect, but I don’t think it’s broken. What do you think?

How To Use A TP-Link Router With Sky Fibre Optic

How to use a TP-Link router with Sky fibre was originally written on 7th June 2017, and was updated on 24th June 2020.

The Sky OEM router is fine for the vast majority of cases, but if you want more functionality, better signal strength, or advanced features like VPN support or parental controls, you’re going to want to use a TP-Link router with Sky fibre.

In this article I’ll be showing you how you can replace the OEM Sky router with a much better, TP-Link device.

Why Change?

The OEM Sky router is fine for most uses, but some people want more functionality than “normal” routers can offer. Or maybe your router is simply swamped in a sea of wireless networks from your neighbours, and you want better signal strength.

Whichever it is, an after-market router is generally a much better alternative to the OEM routers that are made as cheaply as possible so they can be given to thousands of people for free.

After having a lot of issues with wireless myself, I decided to replace my OEM Sky router with a TP-Link AC1200 (costing approximately £45 from Amazon) for my Sky fibre connection. I’m so glad I did!

June 2020 update: I’ve since upgraded my Wi-Fi network with a TP-Link Deco M5 mesh Wi-Fi system which has vastly improved my wireless network.

Setting It Up

Setting up the TP-Link router with Sky fibre is straightforward. Whilst it’s not quite plug and play, the process is very simple:

  1. Plug your TP-Link into a power supply and connect the grey DSL cable to the corresponding port on the back of your router.
  2. Connect the other end of your DSL cable to the micro-filter that comes with the router and connect it to your phone socket. If you already have a micro-filter, replace it with the new one.
  3. Connect the network cable to port 1 on the router, then connect the other end to your laptop.
  4. Open a browser window and navigate to
  5. The TP-Link will ask you to set an admin password – make sure you use something secure, as Password123 ain’t gonna cut it!
  6. Once in, the TP-Link setup wizard will start:
    • Input your location and time zone. Click Next.
    • Select Sky(MER)_VDSL from the ISP list. Make sure it’s this one, as this is Sky fibre. The other Sky option in the list is for Sky Broadband and will not work for fibre connections.
    • In the username field, enter abcdefgh@skydsl
    • In the password field, enter 1234567890abcdef
    • Click Next, then set up your wireless network how you see fit.
    • The TP-Link will then test the Internet connection and you should see a success message. If you do not, wait for the DSL light to stop flashing and try again – it should work just fine.
  7. That’s it! You’re now connected to Sky fibre via your new TP-Link router.
TP-Link Admin UI


Now you have your new router connected, you can start to have a look around the admin interface and change the settings as needed. Here are some of the changes I made:

  • Disabled WPS – it’s insecure and easily hacked, so turn it off.
  • Added my NAT rules so that traffic will route to my server.
  • Turned on and configured the guest network, so guests don’t have access to my server.
  • Changed the IP Subnet and DHCP pool. This was only so I didn’t have to re-configure all my existing devices that have static addresses.


Overall, I’m very happy with the TP-Link VR200. The connection has been rock solid and it has served me well for over 3 years now.

Using a TP-Link router with Sky fibre has many advantages and it’s an extremely well priced router compared to the functionality it offers. The only downside is that it is a lot bigger than the OEM Sky router. But I can live with that for the additional functionality it offers.

I have had feedback from hundreds of people that this process works, but if you’re struggling, I would suggest the Sky fibre forums. If you’re really stuck, feel free to leave a comment below, or contact me.

How To Create An IndieWeb Profile

I’ve written about the IndieWeb in the past, but it can be a little complicated and confusing to get started. In this post I’m going to take you through creating an IndieWeb profile.

What is the IndieWeb?

The IndieWeb is a way of connecting your personal website with lots of other people’s sites from around the world. So if you’re on the IndieWeb and I link to your blog in one of my posts, you get a notification. These are called Webmentions, and you can see the Webmentions for this post in the comments section below.

Think of it as an inter-linked commenting system that traverses the entire Internet. Websites aren’t physically connected, but they can communicate with one another. You can learn more on the IndieWeb site.

What is an IndieWeb profile?

An IndieWeb profile, or h-card as they’re officially known, is a snippet of code that tells other websites connected to the IndieWeb a little bit about you and your site.

I like to think of it as my business card for the IndieWeb.

Why do I need an IndieWeb profile?

Well, like any inter-connected social system, a profile helps people recognise you within the network. A profile is also useful for discovery purposes on the IndieWeb.

You can create a h-card in a number of ways, but in this post I will show you how I have created my h-card and what it all means.

Example IndieWeb h-card

Let’s take a look at my IndieWeb profile first, so you can see what they look like and what we need to configure.

Kev's IndieWeb Profile

As you can see, my IndieWeb profile contains a fair amount of information. But there’s a lot more you can add if you wish. This link lists all of the h-card identifiers that are available.

Enough of this preamble, let’s get started and actually make the thing, shall we?

The Basics

There are a number of ways you can create a h-card. Some people like to markup their about page, others like to add the identifiers to their posts and pages. Personally, I opted to create a simple block of hidden code on my homepage that handles the whole thing.

I think this is the easiest way of doing it, as it then acts as a single profile within your website’s code that is easy to update.

So with that, we will start by creating a new HTML section that will house our h-card profile:

<section style="display: none;" class="h-card">


So display: none; tells your browser to hide everything within this section when the page is loaded. This ensures your visitors will not be able to see it, but other sites on the IndieWeb will traverse this code and find your profile. We’re also giving the section a class of h-card, which tells the IndieWeb that this is your h-card profile.

About me

Now that we have the basic section set up and hidden with some inline CSS, let’s add some basic information to the profile. My name and a short bio seem like a good place to start:

<section style="display: none;" class="h-card">

<!-- About me -->
<span class="p-name">Kev Quirk</span>
<span class="p-note">I'm a cyber security professional and privacy advocate from North West England. My interests include drawing, fish keeping, motorbikes & open source software.</span>


By using p-name and p-note as the class for the two lines of code, we’re telling the IndieWeb what our name is and a little bit about ourselves.

Profile picture

No profile is complete without a profile picture. For this we simply add an img tag and set its class to u-photo:

<section style="display: none;" class="h-card">

<!-- About me -->
<span class="p-name">Kev Quirk</span>
<span class="p-note">I'm a cyber security professional and privacy advocate from North West England. My interests include drawing, fish keeping, motorbikes & open source software.</span>

<!-- Profile picture -->
<img class="u-photo" src=""/>



Location

Adding your location is totally optional, but I decided to add it as a lot of people assume I’m American. I’m not sure why, they just do, so I thought adding my rough location would help.

Privacy note: If you’re going to do this, make sure the location you specify is very vague. I’d recommend Town/City at most.

<section style="display: none;" class="h-card">

<!-- About me -->
<span class="p-name">Kev Quirk</span>
<span class="p-note">I'm a cyber security professional and privacy advocate from North West England. My interests include drawing, fish keeping, motorbikes & open source software.</span>

<!-- Profile picture -->
<img class="u-photo" src=""/>

<!-- My location -->
<span class="p-locality">North West England</span>


Social Links

The next step is to add some links. These are really important as they show the IndieWeb what your other online identities are. This is a great way of validating your various online accounts so people know they’re legitimate.

<section style="display: none;" class="h-card">

<!-- About me -->
<span class="p-name">Kev Quirk</span>
<span class="p-note">I'm a cyber security professional and privacy advocate from North West England. My interests include drawing, fish keeping, motorbikes & open source software.</span>

<!-- Profile picture -->
<img class="u-photo" src=""/>

<!-- My location -->
<span class="p-locality">North West England</span>

<!-- Links -->
<a class="u-url u-uid" href=""></a>
<a class="u-email" rel="me" href=""></a>
<a class="u-url" rel="me" href=""></a>
<a class="u-url" rel="me" href=""></a>
<a class="u-url" rel="me" href=""></a>


The first link has two classes, u-url and u-uid. The u-url class is a generic identifier that simply says that this URL is owned by me. So this could be a social profile, or a link to your homepage.

u-uid is a little different. This is your universally unique identifier, so it’s the daddy of all your links – it’s your main home on the IndieWeb. A link to your homepage should always include both the u-url and u-uid classes.

We then have u-email which is pretty self-explanatory – it’s your email address. I personally use the same email address as the one listed on my contact page for this.

Finally we have a few links to my social profiles that only contain the u-url identifier.


Categories

Adding categories to your IndieWeb profile shows other people on the IndieWeb the kind of things you’re interested in and write about on your blog.

<section style="display: none;" class="h-card">

<!-- About me -->
<span class="p-name">Kev Quirk</span>
<span class="p-note">I'm a cyber security professional and privacy advocate from North West England. My interests include drawing, fish keeping, motorbikes & open source software.</span>

<!-- Profile picture -->
<img class="u-photo" src=""/>

<!-- My location -->
<span class="p-locality">North West England</span>

<!-- Links -->
<a class="u-url u-uid" href=""></a>
<a class="u-email" rel="me" href=""></a>
<a class="u-url" rel="me" href=""></a>
<a class="u-url" rel="me" href=""></a>
<a class="u-url" rel="me" href=""></a>

<!-- Categories -->
<span class="p-category">Blogging</span>
<span class="p-category">Fish keeping</span>
<span class="p-category">InfoSec</span>
<span class="p-category">Motorbikes</span>
<span class="p-category">Open Source Software</span>
<span class="p-category">Privacy</span>
<span class="p-category">Web Design</span>

</section>


Adding more items

That’s pretty much it for my h-card, but I mentioned earlier in this post that you can add other items to your profile if you would like. A full list of all supported identifiers can be found here.

For example, let’s say I want to add my title, which is Mr. The additional code to be added to my h-card would look like this:

<!-- My title -->
<span class="p-honorific-prefix">Mr</span>

If you want to add others, just follow this same process, referencing the specific identifier for the field you wish to add.

Note: The <!-- XXXX --> lines I’ve added throughout my h-card are optional. They’re just HTML comments that make the code easier to read.

Adding it to your website

All you need to do now is copy and paste the complete h-card somewhere inside your <body> tags on your website’s homepage. Personally, I would recommend putting your h-card right at the bottom, just above your closing </body> tag. This way it won’t interfere with anything else on your page.

Once you have added your h-card to your site’s code, save it and upload it to your web server, and you should have a working IndieWeb profile. If you want to test your profile to see if it’s working, you can use this page.
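If you prefer the command line, you can also run a quick sanity check yourself. This is just a sketch: "example.com" is a placeholder for your own domain, and all it does is confirm the h-card class is present in the HTML your server actually sends out.

```shell
# Grab a copy of the live homepage ("example.com" is a placeholder).
curl -s https://example.com/ -o homepage.html

# Check the h-card markup survived the upload.
grep -q 'class="h-card"' homepage.html && echo "h-card found"
```

If the grep finds nothing, the h-card didn’t make it into the published page.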

Do you have a different way of managing your IndieWeb h-card? If so, why not tell me how you have done it in the comments below.

How To Backup Nextcloud

“How To Backup Nextcloud” was originally written on 03 July 2019, but has been updated on 19 June 2020.

I recently wrote a guide on how to setup your own Nextcloud server; it’s a great way of ensuring your personal data is kept private. However, it’s also important to backup Nextcloud too.

Isn’t Nextcloud My Backup?

No, it isn’t. Nextcloud is not a backup solution; it’s a way of syncing your data. Think about it: if you delete a file from computer A, that deletion will immediately be synced everywhere via Nextcloud. There are protections in place, such as the trash bin and version control, but they are no substitute for a real backup.

Since building my own server, I have come up with a pretty decent way of backing up my data that follows the 3-2-1 backup principle.

At least 3 copies of your data, on 2 different storage media, 1 of which needs to be off-site.

— The 3-2-1 backup rule


In order to effectively backup Nextcloud, there are a few pieces of hardware and software involved. There is an initial cost to the hardware, but it isn’t significant.

To backup Nextcloud you will need:

  1. An Ubuntu based server running the Nextcloud Snap
  2. A USB hard drive that is at least double the size of the data you’re backing up (I’d recommend getting the biggest you can afford)
  3. Duplicati backup software installed on your Nextcloud server
  4. A Backblaze B2 account
  5. Around 30-60 minutes to set it all up

At this point I will assume that you have connected and mounted your USB hard drive to the server. If you haven’t done that yet, take a look at my guide on how to mount a partition in Ubuntu.
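As a quick check before carrying on, you can confirm the drive really is mounted where you expect. This is a sketch that assumes a mount point of /mnt/usb; substitute your own path.

```shell
# Show the mount details for the assumed mount point /mnt/usb...
findmnt /mnt/usb

# ...and confirm how much free space the drive has.
df -h /mnt/usb
```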

Note: this process is designed around the Nextcloud Snap installation, not the manual installation.


Following this post, you will be able to do the following:

  1. Automatically backup your entire Nextcloud instance (including your database) every day
  2. Create a log file so you can see if the backup worked
  3. Sync the backup to B2 cloud storage (it will be encrypted before transmission)
  4. Delete old backups so your hard drive doesn’t fill up
  5. Receive email alerts once the backup completes

User Setup

I would recommend using a dedicated user for backing up. This will allow us to keep the backup routine separate from the normal user account you use, making the setup more secure.

In this guide, I will be using ncbackup as the user account. You can use whatever username you feel is appropriate. Let’s start by creating the user and the directories we will need to store our backups.

# Create new user
sudo adduser ncbackup

# Switch to new user account
su - ncbackup

# Make directories for Backups
mkdir Backups
mkdir Backups/Logs

# Logout to switch back to normal user
exit

Now we have the directories setup, let’s create the script that will run our backups. In this example, I’m using nano, but feel free to use any text editor you like. To learn more about nano, click here.

nano /usr/sbin/

We’re using the /usr/sbin directory because it is intended for system-wide binaries that require elevated privileges. You can store your script wherever you like, but /usr/sbin is good practice.

Backup Nextcloud

Populate the file with the following, ensuring you change the username and path to whatever the appropriate values are for your setup.

#!/bin/bash

# Output to a logfile
exec &> /home/ncbackup/Backups/Logs/"$(date '+%Y-%m-%d').txt"
echo "Starting Nextcloud export..."

# Run a Nextcloud backup (nextcloud.export ships with the Nextcloud snap)
nextcloud.export
echo "Export complete"
echo "Compressing backup..."

# Compress backed up folder
tar -zcf /home/ncbackup/Backups/"$(date '+%Y-%m-%d').tar.gz" /var/snap/nextcloud/common/backups/* 
echo "Nextcloud backup successfully compressed to /home/ncbackup/Backups"

# Remove uncompressed backup data
rm -rf /var/snap/nextcloud/common/backups/*
echo "Removing backups older than 14 days..."

# Remove backups and logs older than 14 days
find /home/ncbackup/Backups -mtime +14 -type f -delete
find /home/ncbackup/Backups/Logs -mtime +14 -type f -delete
echo "Complete"

echo "Nextcloud backup completed successfully."

Now we need to make our backup script executable:

sudo chmod +x /usr/sbin/

A lot of the commands in our script will require sudo access, but we don’t want to give full sudo access to our ncbackup user, as it doesn’t need elevated rights globally. However, we do want to be able to run the backup script with sudo rights, and we want to do it without requiring a password.

To accomplish this, we need to use visudo. We can configure visudo to allow the ncbackup user to run the backup script as sudo, without a password. Crucially, the ncbackup user will not be able to run anything else as sudo.

# Open visudo
sudo visudo

# Allow ncbackup to run script as sudo
ncbackup ALL=(ALL) NOPASSWD: /usr/sbin/

Enabling sudo access for the backup script introduces another potential security risk. The ncbackup user can run the backup script as sudo without a password. So a threat actor could potentially edit the script and run any command as sudo without a password.

Bad times.

However, we saved the script in /usr/sbin, which the ncbackup user does not have write access to. Because they cannot edit the script, that attack path is closed off.
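You can verify the restriction is behaving as intended by listing the sudo rules that apply to the backup user. Assuming the visudo entry above is in place, the output should contain only that single NOPASSWD line:

```shell
# List ncbackup's sudo privileges; expect just the one NOPASSWD entry
# for the backup script, and nothing else.
sudo -l -U ncbackup
```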

As an extra layer of security, we will stop the ncbackup user from being able to login to the server at all:

sudo usermod -s /sbin/nologin ncbackup

If at a later date you need to be able to login using the ncbackup user, you can revert this change by running the following command:

sudo usermod -s /bin/bash ncbackup

Schedule Backups

Now we have the backup script set up, we need to schedule the backup to run automatically; for this, we will use Cron.

Run the following command to enter the Cron settings for the ncbackup user:

sudo crontab -u ncbackup -e

Once you’re in crontab, you need to add the following lines to the bottom of the file:

# Nextcloud backup cron (runs at 2am daily)
0 2 * * * sudo /usr/sbin/

The settings above will run the backup script at 02:00am every day. You can change this to whatever value you like, but I would recommend running the backup every day.

The first value represents minutes, then hours, then days etc. So if you wanted to run the backup at 03:30am, your Crontab entry would look something like this:

# Nextcloud backup cron (runs at 03:30am daily)
30 3 * * * sudo /usr/sbin/

Now Wait…

That’s most of the setup complete at this point. The next thing to do is to wait 24 hours for your backup to complete automatically (or you could run the script manually yourself).

Once the script has run, you should see a tar.gz file within your backup folder with a name that corresponds to the date the backup ran:

kev@server:~$ ls /home/ncbackup/Backups/
2020-06-10.tar.gz  Logs

Within the Logs folder, you should also see a <date>.txt file that corresponds to the backup. You can open this to see how your backup went:

kev@server:~$ cat /home/ncbackup/Backups/Logs/2020-06-10.txt 
Starting Nextcloud export...
WARNING: This functionality is still experimental and under
development, use at your own risk. Note that the CLI interface is unstable, so beware if using from within scripts.
Enabling maintenance mode...
Exporting apps...
              0 100%    0.00kB/s    0:00:00 (xfr#0, to-chk=0/1)
Exporting database...
Exporting config...
Exporting data...
         15.90M 100%  109.87MB/s    0:00:00 (xfr#105, to-chk=0/139) 
Successfully exported /var/snap/nextcloud/common/backups/20190703-130201
Disabling maintenance mode...
Export complete
Compressing backup...
tar: Removing leading `/' from member names
Nextcloud backup successfully compressed to /home/ncbackup/Backups
Removing backups older than 14 days...
find: ‘./home/ncbackup/Backups/’: No such file or directory
Nextcloud backup completed successfully.

With the echo statements we put in the script, you can see at what point in the backup things failed, if they do in fact fail.

Note: there are masses of improvements that can be added to this script, but this satisfies my needs. If you do add improvements, please let me know and I’ll post an update.

Setup Duplicati

You now have a single layer of backups for Nextcloud. However, if you want to abide by the 3-2-1 rule of backups (which I highly recommend), then we now need to use Duplicati to add additional layers to our backup routine.

To install Duplicati, go to this link and right click ‘copy link location‘ on the Ubuntu DEB. Then amend the commands below as appropriate.

# Download Duplicati DEB
wget [link-you-copied]

# Install Duplicati
sudo dpkg -i duplicati_[version].deb

# If you get a dependency error, run the following
sudo apt --fix-broken install

We now need to enable the Systemd service for Duplicati so it runs automatically on boot:

# Enable Duplicati service
sudo systemctl enable duplicati

# Start the Duplicati service
sudo systemctl start duplicati

By default, the Duplicati service will only listen on localhost, so if you try to access the IP of the server from another device, you won’t get the Duplicati webGUI.

To fix this, edit the DAEMON_OPTS option within the Duplicati config to the following:

# Open Duplicati config
sudo nano /etc/default/duplicati

# Additional options that are passed to the Daemon.
# Listen on all interfaces, not just localhost:
DAEMON_OPTS="--webservice-interface=any"

Restart Duplicati so the config changes take effect:

sudo systemctl restart duplicati

You should now be able to access the Duplicati web interface by going to http://server-ip:8200. You will be asked to set a password for Duplicati when you first log in – make sure it’s a strong one!

Security Note: My server is hosted at home, and I don’t expose port 8200 to the internet. If your server is not at home, then I would strongly suggest you configure something like iptables, or the Digital Ocean firewall, to restrict access to port 8200.
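As a sketch of what that could look like with ufw (Ubuntu’s default firewall front end) rather than raw iptables, assuming a LAN range of 192.168.1.0/24:

```shell
# Allow the Duplicati web UI from the local LAN only...
# (192.168.1.0/24 is an assumed range - adjust it for your network)
sudo ufw allow from 192.168.1.0/24 to any port 8200 proto tcp

# ...block it from everywhere else, then confirm the rules.
sudo ufw deny 8200/tcp
sudo ufw status
```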

Configure Duplicati Backups

Now it’s time to configure our backups in Duplicati. We will configure two backup routines – one to USB, and another to Backblaze B2 for off-site storage.

Let’s do the USB backup first. Within the Duplicati webGUI, click on the Add Backup button to the left of the screen.

This is a very straightforward process where you choose the destination (our USB drive), the source (the output from our backup script) and the schedule.

Duplicati USB Backup

When creating your backup routines in Duplicati, always ensure you encrypt your backups and use a strong passphrase.

Also, always make sure your Duplicati backups run at different times to your other backups. Personally, I go for the following setup:

  • 02:00 – Local Nextcloud backup script runs via Cron
  • 03:00 – Duplicati backs up to USB
  • 04:00 – Duplicati backs up to Backblaze B2

I always leave the Backblaze backup to run last, as it then has up to 22 hrs to complete the upload before the next backup starts, so they shouldn’t interfere with one another.

Off-Site Backups

When it comes to configuring your Backblaze backups, change the destination from Local to B2 Cloud Storage. You will need your B2 bucket information and application keys from the Backblaze dashboard to complete the config.

Once you have entered your Backblaze Bucket information, click Test Connection to make sure Duplicati can write to your B2 bucket correctly.

Important note: You will need to add payment information to your Backblaze account before backing up, otherwise your backups will fail.

To give you an idea of what Backblaze costs, I’m currently backing up around 150GB of data to my Buckets, and I’m charged less than $1/month.

Personally, I only keep 7 days of backups on Backblaze, as I only have it for disaster recovery, in case all my local backups have failed. I don’t need data retention in the cloud – that’s what my USB drive is for.

Duplicati Email Notifications

You can configure email notifications for Duplicati backups; this way, you will always know if your backups are working.

To do this, head into the Duplicati WebGUI, click on the Settings option to the left of the screen, and scroll all the way down to the bottom where it says Default options. Click the option that says Edit as text, then paste the following into the field:

# Change as needed
--send-mail-subject=Duplicati %PARSEDRESULT%, %OPERATIONNAME% report for %backup-name%
--send-mail-from=Backup Mailer <>

I personally use Amazon SES for this, but you should be able to use any SMTP server.
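The two options above only cover the subject and sender. For a working setup you will also need to tell Duplicati where to send the mail and how to reach your SMTP server; a sketch using Duplicati’s other --send-mail-* options looks like this, with every value a placeholder for your own details:

```
# All values below are placeholders - substitute your own details
--send-mail-to=you@example.com
--send-mail-url=smtp://smtp.example.com:587
--send-mail-username=smtp-user
--send-mail-password=smtp-password
```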

That’s It!

You’re done. That’s it. Finito. You now know how to backup Nextcloud in such a way that it abides by the cardinal 3-2-1 backup rule, and it lets you know when your backups have run.


Test Your Backups!

I can’t stress this enough. Once your backups have been running for a few days, make sure you run a test restore (not on your live system) to make sure you can get your data back. After all, there’s no point in having backups if you can’t restore from them!
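Before a full test restore, a cheap first check is to make sure the archive itself isn’t corrupt. This sketch lists the archive contents without extracting anything; the path is a placeholder, so point it at one of your real backups:

```shell
# Listing the contents exercises the whole gzip stream; a non-zero
# exit code means the archive is damaged. (Path is a placeholder.)
tar -tzf /home/ncbackup/Backups/2020-06-10.tar.gz > /dev/null && echo "Archive OK"
```

This only proves the archive is readable; a genuine test restore is still the only real proof.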

To restore the backups you have made of Nextcloud into a vanilla Nextcloud snap installation, you need to decompress your backup to /var/snap/nextcloud/common then use the nextcloud.import command to restore it:

# Decompress your backup
tar -xvzf /path/to/nextcloud/backup.tar.gz -C /var/snap/nextcloud/common

# Restore your Nextcloud backup
sudo nextcloud.import /var/snap/nextcloud/common/backup-to-restore

Yes, restoring your Nextcloud snap from backup really is that simple!


This is by no means the perfect way to backup Nextcloud, but it works, and it has worked for me for quite some time now. You may have a different or better way of backing up; if you do, please leave a comment below, or get in touch with me.

Finally, I’d like to thank my friend Thomas from work, who helped improve my script a little and gave me a couple of ideas to improve the security.

Thanks, Tom. 🙂

Finishing My Website Redesign

So I recently wrote about how I decided I was going to redesign this website to give it a fresh new look. Well, I’m happy to say that I’ve now finished the “redesign.”

What do you think?

Pretty much the same, right? Well, I started designing a whole new site with a custom theme that I built from the ground up using the Divi framework.

It looked similar to what I have now, but it was a lot heavier and had a tonne of functionality that I didn’t really need. So after spending a couple of weeks building it, and even giving my newsletter subscribers a sneak preview, I decided to ditch it.

Instead I went with updates to my old theme. So I’ve done a few tweaks to improve things here and there, but nothing major. Here’s what I’ve done:

  • Complete redesign of the commenting system, which has now been re-enabled across the site.
  • Added more splashes of blue to buttons, links etc.
  • Improved typography everywhere.
  • Reduced the content width to 640px.
  • Improved the notes page.
  • Removed posts categorised as notes from the homepage feed.
  • Numerous other miscellaneous tweaks.

I’ve now started maintaining a Github repository for this theme, so if you want to fork it and use it yourself, be my guest. As with everything on this site, it has an open license.

My Theme On Github

I’m really happy with the decision I made to keep this theme. I’ve worked so hard on it and I know the code intricately, so it’s easy for me to fix issues or make tweaks when I need to.

For the time being I’m going to stop pissing around with my theme and concentrate on actually writing content, I think.

Why Does Logitech Hate Left Handed People?

I’ve recently been looking at switching my traditional mouse for a trackball mouse. I asked for recommendations on Fosstodon and the overwhelming recommendation was the Logitech M570. Unfortunately, Logitech don’t make a left handed version, and after a bit of research it seems that the issue goes much further than just this device.

Now, many standard mice can be used with either hand as they’re symmetrical. But if you want to use an ergonomic mouse, be it trackball or traditional, these are specific to one hand or another.

The problem with Logitech is that they don’t make any left handed mice, in any of their ranges. So although my initial search was for a trackball, this covers all mice they produce. For example, this one.

Why Logitech hates left handed people

If you take a look around the Logitech forums, you will see 279 pages of search results from people complaining about the lack of left handed support from Logitech.

There are posts going back years; some have responses from Logitech team members, others have just been ignored. The problem is, every post I’ve seen has received the same regurgitated, cookie-cutter response.

Posts like this one from 9 months ago, where the poster requests a left handed mouse, get the following response from Logitech:

The left handed version the MX Master 3 is not yet available and we do not have any information on when a left handed version would be available.

I’ll have this post forwarded to the proper team here on my end…

Logitech support

In another post, also from 9 months ago, Logitech reply 4 months later with the following:

Thank you for reaching Logitech! We deeply apologize for not providing a prompt response.

As with your inquiry about the MX Master 3, the manufactured device at the moment are intended for people who use their mouse using their right hand.[…] We will forward this post to our team for consideration.

Logitech support

There’s also this post from 5 months ago, this post from 2 years ago and this post from 3 years ago. And that’s just in the first 3 of those 279 pages of results!

The forums are littered with posts from fellow lefties pleading with Logitech to create a left handed mouse. But many years later, we’re still left out in the cold.


According to Wikipedia, around 10% of the world’s population are left handed. Ten percent may not sound like a lot, but that’s the equivalent population of the USA, Japan, Brazil & Germany combined. That’s a lot of people!

On their website, Logitech say the following about their MX Ergo trackball mouse:

Logitech’s most advanced trackball for trackball enthusiasts and consumers searching for alternatives to mice and touchpads. Delivers 20% less muscular strain compared to a regular mouse.


So does the muscular strain of left handed people not matter to you, Logitech? I suppose those 780 million left handed people worldwide don’t really matter, hey?

I’d like to finally add that the title of this post is facetious. I know that Logitech don’t hate left handed people. It’s ok Logitech, I still really like your hardware, but it would be wonderful if you offered some ergonomic mice for left handed people. 🙂