Kev Quirk

Author: Kev Quirk

Hi, I'm Kev and I'm a cyber security professional from England. I use this site to share my thoughts and ideas from time to time.

De-Googling My Life – 2 Years On

I first started De-Googling my life back in September 2017. It’s now been nearly 2.5 years since I completed that process, so I thought it was time for an update.

I recently received an email from a reader asking me if I was intending to do an update on how de-Googling my life was going. I was intending to do an update, but I hadn’t realised it was well over 2 years since I went through the process.

It’s about time I gave you another update on how things are going, and what’s changed since last time. I’ll go through each of the changes I made during my de-Google process and give an update on each one.

01 – Browser & Search

In the first post in the series I talked about replacing my browser and search. I replaced Chrome with Firefox and Google search with DuckDuckGo.

That’s still the case to this day. I’m still using both services and I’m happy to report that my experience has been great with each of them. Since the post was published, I’ve also started using Brave as a secondary browser. However, I prefer to use Firefox, as it’s one of the few browsers out there today that doesn’t use Google’s rendering engine, Blink.

In the browser space, diversity is key – I really don’t want Google to have the monopoly here, but unfortunately, MS Edge, Chrome, Chromium, Brave, Vivaldi and Opera all use the Blink rendering engine.

As I understand it, it’s only Firefox and Safari that don’t use it – they use Gecko (overhauled as part of the Quantum project) and WebKit respectively. If you’re thinking about changing browsers, please consider Firefox so that Google don’t get even more of a monopoly in this space.

With regards to DuckDuckGo, there’s not much to say here. Their results are accurate, and their search is private, so I’m happy.

02 – Analytics

In the second post I explained how I was using Google Analytics for monitoring my web traffic. GA is all kinds of bad because of their online tracking capabilities, so I flipped to Piwik. Since then, I decided I wanted to reduce the amount of JavaScript my site runs where possible, so analytics was an obvious saving.

Ultimately I decided to go with Awstats, which provides anonymous analytics based off my web server logs. You can read more about this on my privacy page.

Awstats IP Addresses

I’m currently testing Koko Analytics too; it’s also privacy respecting and doesn’t need JavaScript. I’m not sure it adds more value than Awstats yet, so I may not keep it. I do like how it’s embedded in my WordPress dashboard though.
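Awstats works by parsing the raw web server logs, and you can get a rough feel for the same data with standard Unix tools. Here’s a quick sketch – the sample log below is made up, and it assumes the common/combined log format where the client IP is the first field:

```shell
# Create a tiny sample access log (combined log format) for illustration
cat > /tmp/access.log <<'EOF'
203.0.113.5 - - [01/Mar/2020:10:00:00 +0000] "GET / HTTP/1.1" 200 1024
203.0.113.5 - - [01/Mar/2020:10:00:05 +0000] "GET /about HTTP/1.1" 200 512
198.51.100.7 - - [01/Mar/2020:10:01:00 +0000] "GET / HTTP/1.1" 200 1024
EOF

# Count requests per visiting IP, most active first
awk '{print $1}' /tmp/access.log | sort | uniq -c | sort -rn
```

No JavaScript, no tracking – everything comes from logs the server is already writing anyway.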

03 – Google+

The next stage of de-Googling my life was replacing Google+. At the time, G+ was still alive and kicking, but it has since been killed off by Google, like so many of their other services.

Anyway, I decided to replace G+ with Mastodon. I co-founded a FOSS-centric instance called Fosstodon, which has grown from strength to strength. We now have a really strong community of over 8,500 members.

About Fosstodon

I’m very happy with Mastodon – the community is large enough to be interesting and always has something going on, yet small enough to feel close-knit.

As far as I’m aware, Fosstodon is actually the largest FOSS/open source Mastodon instance on the Fediverse. If you want to know more about Mastodon and how it works, this post may help.

04 – Google Drive

Step 4 was all about getting rid of Google Drive. I was using it to sync all my files and photos between my devices. I originally went with Nextcloud for file syncing, but after having a few issues with their service, I recently flipped back to a Synology.

I’m now using my Synology for file syncing, note taking, media streaming and backing up all my important data. Synology don’t make cheap devices, but they are brilliant. If you have the funds available, I’d highly recommend them. If not, there’s always Nextcloud.

05 – Gmail

Ok, the one that people really care about – replacing Gmail. To be fair, Gmail is a great service, but it’s a privacy black hole, so it had to go.

I originally went with a self-hosted solution on my cPanel server. This worked really well and I had no problems with it, but I ended up closing that server down and migrating all my web hosting to a different provider.

On the new provider, I have limited storage capacity, so I decided to host my email elsewhere. Originally I went with Fastmail, who offer an extremely good service, but they’re not cheap at $5/month per mailbox. With 6 mailboxes to buy for myself and family members, $30/month was more than I really wanted to spend on email, calendar and contact syncing.

I finally switched to Zoho Mail. They’re far cheaper at around $1/month per mailbox, they also provide a great service, and they respect your privacy too.

Zoho has never sold your information to someone else for advertising, or made money by showing you other people’s ads, and we never will. This has been our approach for almost 20 years, and we remain committed to it.

Zoho privacy policy

I’m extremely happy with the service Zoho provides. I’d highly recommend them if you’re looking for a Gmail replacement.

06 – What I Couldn’t Replace

This was the final post about de-Googling my life. Here I talked about what I couldn’t replace, which included Android, YouTube, Google Maps, my Chromebook, Android Pay and Google+.

This post is already long enough, so I’ll summarise each in the list below:

  • Android – I ended up ditching Android in favour of an iPhone. I now have an iPhone 8 and am still very happy with that decision.
  • YouTube – The fact of the matter is, there is no replacement for YouTube. Unfortunately I’m still a regular YouTube user.
  • Google Maps – I use Apple Maps for navigation, but if I need to lookup something on a map, I still default to Google. Mainly because Street View is so damned awesome.
  • My Chromebook – I’ve replaced this with a Lenovo X1 Carbon running Linux and I’m very happy with it.
  • Android Pay – I now use Apple Pay on my iPhone.
  • Google+ – This is dead now, as explained earlier. I now use Mastodon exclusively.

Conclusion

Two years on and I’m pretty much Google free at this point. There are some products that I just can’t replace, but you never know, in the future there may be products released that can rival these services.

De-Googling my life was pretty difficult at first, but now that I have my new workflows established, I really don’t miss their services.

And as a final note, I’d like to thank Brandon for dropping me an email and reminding me that I really needed to get this update done. Thanks, Brandon!

How To Cycle An Aquarium

If you’re new to fishkeeping, you need to cycle your aquarium before stocking your tank with live animals. This post will show you how I personally do this.

Before adding livestock, it’s very important to cycle an aquarium first. Cycling is the process of establishing bacteria colonies so the nitrogen cycle takes place. It’s really simple to do, but hugely important.

If you don’t cycle an aquarium before adding fish, you could be poisoning them to the point where they die!

There are a number of ways to do a cycle; this is just the way I personally do it, and it’s always worked well for me. You’re going to need a few things before you get started:

  • Something that can be broken down to produce waste in lieu of fish. I tend to use supermarket bought prawns.
  • A water testing kit. I recommend the API test kit.
  • A booster solution to add bacteria to your tank. I recommend API Quick Start.
  • Patience – this process takes weeks.

Starting the Cycle

The first thing you need to do is fill your tank with water and set up your filtration. Once that’s done, plop a few prawns into your tank and leave them there. Also start dosing your tank with your boosting solution, per the recommended guidelines on the packaging.

Note: API Quick Start says that you can “immediately add fish” on the bottle. YOU CANNOT! This is a sales ploy – complete your cycle first.

After a few days, the prawns will start to look disgusting and your water will likely have a milky hue to it. That’s a good thing – this is a biological bloom and means your bacteria culture is starting to form.

full nitrogen cycle

Looking at the image above: you don’t yet have fish in your tank producing waste, so you’re simulating that with the prawns. They are broken down into ammonia (step 1), and your bacteria start to colonise your tank and filtration, breaking the ammonia down into nitrites (step 2).

Water Testing

At this point you need to be doing regular water tests. I would recommend doing this daily. However, if you want to make your test kit last longer, every 2 days is ok.

The API kit tests for pH, ammonia, nitrite and nitrate. We’re not concerned about the water pH for cycling, but once your cycle is complete, you will need to make sure that the fish you intend to stock your tank with are compatible with the pH of your water.

From your water tests, you will start to see a spike in ammonia. You should be aiming for the ammonia to hit 1.0 ppm. If you have too much ammonia, remove some prawns; if you don’t have enough, add some more.

API test results

After a few more days of having ammonia in the system, you should start seeing the nitrites increase. Eventually, you should see your ammonia disappear completely and be replaced by nitrites. Again, this is a good thing as it means you’re moving onto step 3 of the nitrogen cycle.

At this point, another bacteria will start to grow in your aquarium. This bacteria will convert your nitrites into nitrates. Keep testing your water every day or so. After 2-4 weeks, you should see that your aquarium consistently has zero ammonia and nitrites, and the nitrates are climbing. Congratulations, your initial cycle is complete.

Adding Fish

Once you’re at the point where your test results are zero for ammonia and nitrite, and your nitrates are rising, you can start adding fish. Remember to remove all the prawns at this point!

I would add fish in a staged approach though, depending on what you’re going to have. If you only intend to have a small school of fish, then just add them, but if you’re intending to build a large community tank, add them in stages.

Although you do have a cycled aquarium, you’re only cycled for a bio-load of 1.0 ppm of ammonia. So if you all of a sudden add 100 fish that produce 3.0 ppm of ammonia, you may crash your system and cause another cycle to start.

This is because your bacteria can’t cope with the amount of waste being produced, and it will take time for more to grow. By adding your fish progressively, you build up your bacteria colonies in stages, thus ensuring the stability of your aquarium parameters.

While adding fish in stages, keep adding your booster solution to help the bacteria colonies grow. You don’t need to do this, but it will help your bacteria colonies to grow more quickly.

Watch Those Nitrates

Once you start adding fish, those nitrates will be creeping up all the time, so remember to do weekly water changes of 30-50% to keep on top of them.

If your nitrates are hitting more than 40 ppm a week, then I would suggest doing water changes more often, or upping the amount of water you’re changing to 60-75%. Be aware that big water changes can stress some fish due to the large fluctuations in water parameters, though; I’ve always found that smaller, more frequent water changes work better.
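As a rough rule of thumb, a water change dilutes nitrates in proportion to the water you replace (ignoring the waste your fish keep producing in the meantime). A quick back-of-the-envelope sketch, using example numbers:

```shell
# Rough rule of thumb: new_level = old_level * (fraction of water kept)
nitrate=40   # ppm before the change
change=50    # percent of water replaced
echo "$(( nitrate * (100 - change) / 100 )) ppm after the change"
```

So a 50% change takes 40 ppm down to roughly 20 ppm, while two smaller 25% changes spread over the week achieve something similar with less stress on the fish.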

High nitrates may also be an indication of inadequate filtration, or over stocking. If you’re unsure, seek advice. I would strongly recommend r/aquariums on Reddit as a really good source of quality advice.

If you have the right media and the appropriate filtration, you may eventually create a colony of anaerobic bacteria that will feed on nitrates, thus reducing them, but that can take up to 6 months to accomplish.

Conclusion

You now should have a good idea of how to cycle an aquarium. Like most things in this hobby, there are multiple ways to do this; this is just the way I personally do it.

Other people prefer to add ammonia instead of prawns, that’s fine too. Whatever works for you. Just make sure you’re doing regular water tests and monitoring your parameters.

Finally, just for good measure, here is a picture of my favourite tank. It’s a 350 litre (90 US gal) tank containing my two Oscar cichlids, Betty & Dice. If you look closely, you can also see my Bristlenose Pleco hiding under the rock.

Oscar Tank

How I Optimise My Website Performance

WordPress is not slow. This website uses WordPress and is pretty darn quick, I think you’ll agree. In this post I want to talk a little bit about how I’ve optimised the performance of this website so that it loads in less than 2 seconds.

Website performance isn’t cheap, so this post contains affiliate links. If you want to know more, please refer to my affiliate link policy. 🙂

I recently wrote about why I think WordPress isn’t slow, and that people should probably give it some slack. But I wanted to go into some detail in this post about what optimisations I’ve made in order to achieve good website performance.

Testing Speed

According to MachMetrics, the average load time for a website is 4.7 seconds on desktop and 11.4 seconds on mobile. Also, the average size of a website is 1.9MB.

Below are some stats from GTMetrix, where I have tested the speed of one of my heavier pages that contains a number of images (how online tracking works):

London test results
Dallas test results
Sydney test results

My server is located in London, so that’s why the London test is really quick at less than half a second. But how did I manage to get speeds way below the average for tests in America and Australia, especially on a web page that contains 4 large images?

Optimisations

In order to have consistent, high quality website performance, you really need to be working at it from the beginning. There are a number of steps I take to maintain good performance:

  1. Use a good quality host
  2. Have a lightweight WordPress theme
  3. Optimise all images
  4. Utilise caching where possible
  5. Use a Content Delivery Network (CDN)

Good Quality Hosting

If you head over to GoDaddy, or Bluehost, or Hostgator, or any one of the many other shared hosting providers, you will be able to get a hosting package very cheaply – in some cases, for just a couple of pounds/dollars a month.

While that sounds great on the surface, with hosting you really do get what you pay for. Shared hosting means that you share server resources with an untold number of other customers. So if any one of them has a poorly optimised website, is consuming a large number of resources on your shared server, or if the host over-subscribes their servers, then you’re going to end up with a really slow site.

To host this site, I use a VPS from Ionos. It has a dual core CPU, 120GB of SSD storage and 4GB RAM. This provides plenty of capacity for this site, and all the other sites I own. Even when I’ve had spikes in traffic, like when posts have hit the front page of Hacker News, the site has stayed fast.

Lightweight Theme

When I moved from Grav back to WordPress, I decided from the start that I wanted to use a lightweight theme. I don’t need tracking pixels, or sharing options, or a sidebar, or even a commenting system. So I decided to take the incredible Susty theme (Github) and make it my own. The result is the site you see now.

Because this site has very little going on within its pages (outside of the content), there are fewer items to load and fewer parts to stitch together. Therefore, the site loads much quicker, even without optimisations.

Optimise Images

The max width for the content section of this website is 1000px. So if I ever add an image to a post, I always make sure it is no wider than 1000px. Anything bigger is just wasted pixels and wasted space.

Content Max-Width

I also try to ensure that every one of my images is less than 100KB in size. I usually do this editing and optimisation in GIMP before uploading. If the width of your content area is 700px, don’t upload images any wider than that. It’s just a waste.
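I do this in GIMP, but the same resizing can be scripted. Here’s a sketch using ImageMagick (assuming it’s installed; the file names and 82% JPEG quality are just examples):

```shell
# Create a deliberately oversized test image to work with
convert -size 2400x1600 gradient:blue-white /tmp/big.jpg

# Shrink anything wider than 1000px down to 1000px wide;
# the '>' flag means images that are already smaller are left alone
convert /tmp/big.jpg -resize '1000>' -quality 82 /tmp/web.jpg

# Confirm the new width
identify -format '%w\n' /tmp/web.jpg   # → 1000
```

Pointing `mogrify` at a whole folder with the same `-resize '1000>'` geometry would batch-process every image before upload.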

Enable Caching

I can’t stress enough how important this is. Caching is a minefield and can be pretty complicated to get right. But if you configure content caching, CSS/JS minification and Gzip compression, you should notice some significant performance boosts.
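Gzip helps so much because HTML and CSS are highly repetitive, and compression thrives on repetition. A quick local demonstration – the fake page below is just an illustration:

```shell
# Make a repetitive, text-heavy fake page (12,000 bytes)
printf 'hello world %.0s' $(seq 1 1000) > /tmp/page.html

# Compress a copy (-k keeps the original, -f overwrites any old output)
gzip -kf /tmp/page.html

echo "original:   $(wc -c < /tmp/page.html) bytes"
echo "compressed: $(wc -c < /tmp/page.html.gz) bytes"
```

Real pages won’t compress quite that dramatically, but cutting transfer size by well over half is common, and the server does this transparently once it’s configured.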

There are a lot of caching plugins out there for WordPress. I personally use WP Rocket, which costs $49/year, but it’s so worth it! If you don’t want to pay out for WP Rocket, W3 Total Cache is a free alternative that is also very good.

Use A CDN

A Content Delivery Network (CDN) is a way of distributing your website geographically, so that it’s always served from a server near your visitors. Basically it serves cached copies of your website from all over the world. So if someone visits your site from the other side of the world, the website is served from the CDN’s closest server to your visitor, not your main server.

Shorter geographical distances mean quicker response times, which in turn means quicker load times for your website.
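Some rough numbers show why distance matters so much – the figures below are approximations, but even at the speed of light in fibre, a London to Sydney round trip costs well over 100ms before the server has done any work at all:

```shell
# Back-of-the-envelope: light in optical fibre travels at roughly
# 200,000 km/s, and London to Sydney is about 17,000 km.
distance_km=17000
fibre_speed=200000   # km/s

# Round trip (x2), converted to milliseconds (x1000)
echo "$(( distance_km * 2 * 1000 / fibre_speed )) ms minimum round trip"
```

And that’s a single round trip; a page load usually needs several. Serving from a CDN node a few hundred kilometres away shrinks that cost to almost nothing.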

For this site, I use BunnyCDN. They’re very cheap and their service is great. Plus, combining BunnyCDN with WP Rocket is very simple to do. It takes literally 2 minutes.

Putting It All Together

Let’s do a real world test of what all of this work actually does. Below is a GTMetrix test of the same page I tested earlier. The only difference is that on one of the tests I have appended /?nocache, which tells my server not to load any of the optimisations for that session.

As you can see, the differences between the results are significant:

Website test from Dallas
With optimisations enabled
website-test-dallas-nocache
Without optimisations enabled

My website loaded nearly a full second quicker with the optimisations enabled. That’s an improvement of around 30% – and remember, the image optimisations and lightweight theme are still in place for both tests.

The number of requests drops from 29 to 12, and the physical size of the website is nearly half, at 355KB instead of 656KB.

What Does It Cost?

Ok, so let’s talk money – a good quality host, premium caching plugins and a CDN all cost money – but they won’t break the bank. Here’s a breakdown of what this website costs me to run:

  • Ionos VPS Hosting – £12.00/month
  • WP Rocket Plugin – £4.00/month
  • BunnyCDN – £1.50/month*
  • Total – £17.50/month

* The cost of BunnyCDN differs from month to month depending on the amount of bandwidth I’ve used. This is an average. Pricing details here.

Conclusion

Yes, WordPress can be slow, but it doesn’t have to be. Through optimisation, I’ve managed to create a website that performs really well, even when under significant load.

If you have any questions about website performance, or any of the optimisations I’ve listed in this post, or if you think I could be doing more to optimise my site, please let me know.

How The Aquarium Nitrogen Cycle Works

The nitrogen cycle is one of the first things most fishkeepers learn, but many don’t achieve (or don’t know about) the FULL nitrogen cycle. This post may help you have better water quality, so you have to do fewer water changes. But most importantly, your fish will be happier and healthier.

So we all know the nitrogen cycle, right? Fish poo > ammonia > nitrite > nitrate. If you don’t know what the nitrogen cycle is, I’ll start by explaining that first.

The Traditional Nitrogen Cycle

So if you’re not aware, the traditional nitrogen cycle involves 4 main stages:

  1. Your fish produce waste which is broken down into ammonia.
    • Ammonia is extremely toxic to fish, so any ammonia in your aquarium is really bad.
  2. Once ammonia is present in your aquarium, a culture of bacteria develops that feeds on the ammonia and turns it into nitrites.
    • Nitrites are also toxic to your fish. So again, any signs of nitrites in your water is really bad.
  3. Now you have all of your ammonia turned into nitrites, another bacteria starts to form which turns nitrites into nitrates.
    • While not as bad for your fish as ammonia or nitrites, nitrates are still bad for your fish in high quantities.
  4. Nitrates are removed from the water column by doing water changes.

Here’s the whole thing in a nice, easy to digest diagram:

Traditional nitrogen cycle

Different people have a different gauge as to what the limit for nitrates should be. Personally, I start to get concerned if my nitrates exceed 40ppm. I’d be really concerned if they hit 80ppm. Always aim for the lowest nitrate level possible.

All good in the hood, right? You do your regular water tests and you have no ammonia or nitrite, and a steady supply of nitrate building up. Done. Finito. Let’s all go home…right?

Not quite.

That is not the complete nitrogen cycle. What if I told you that you could add another step so that your nitrates also reduce as part of the cycle? Well, they can folks!

Anaerobic Bacteria

This is where anaerobic bacteria come into the mix. They will feed on your nitrates, reducing them to much safer levels. This means better water parameters, fewer water changes and, most importantly, happier fish.

Again, for your viewing pleasure, here’s the FULL cycle in an easy to consume diagram:

full nitrogen cycle

So Kev, why isn’t everyone doing this?

That’s a great question, fair reader. You see, a lot of people find it difficult to create a colony of anaerobic bacteria, because they require very specific conditions in which to grow. Whereas the bacteria in stages 2 and 3 will grow pretty much anywhere there is water and waste.

Anaerobic organisms do not require oxygen to survive. In fact, the presence of oxygen can kill them. So in order to create a colony of anaerobic bacteria, you need a place that is full of water and nitrate, yet devoid of oxygen.

That’s pretty difficult considering your fish and stage 2 & 3 bacteria need oxygen to survive. See why this is hard to achieve now? 🙂

It’s all about the media

Getting a full cycle is all about getting the right biological media in your filtration system. I’m not going to go into specifics in this post – I’ll save that for another day. But you need a media that not only has a large surface area for the aerobic bacteria to thrive, but also lots of nooks and crannies for the anaerobic colony to take hold too.

Aerobic bacteria only take a couple of weeks to establish a colony, whereas anaerobic bacteria can take months to establish. So don’t expect this to occur overnight. But, given the right conditions, it will happen.

Once you have this, you will notice that your nitrates start to significantly reduce. Like I said, I will do a separate post on what media to use, and how to setup filtration correctly.

In the meantime, you can sleep well tonight knowing that you now know a little more about the nitrogen cycle, and there is a way to reduce those pesky nitrates without doing water changes every week.

If you want to see my other fishkeeping posts via RSS, or any of my other posts for that matter, you can subscribe here.

How to backup a Synology to Backblaze B2

I recently wrote a post about moving from Nextcloud to Synology; in that post, I mention how I’m backing up to Backblaze B2. This post explains how I configured that.

So you want to backup a Synology to Backblaze B2? Well, so does my friend and Fosstodon co-founder, Mike Stone, who asked for more detail around how I backup to Backblaze B2 storage. Fosstodon has a limit of 500 characters and that wasn’t going to cut it, so I decided to write a post instead.

Synology Cloud Sync

So to backup to B2, I’m using the Synology Cloud Sync application. This is easily installed with a single click via the Synology Package Manager.

Personally I backup all user home folders, as well as our Family Share folder to B2. I could backup all the media on my Synology, but that would be very expensive. I tend to make sure that my crucial data is still available, even if I lose all my local data. Multimedia files aren’t crucial and can be replaced, so I’m comfortable with just having local backups for that.

Synology Cloud Sync

Adding a new backup

I’m not going to take you through the process step by step, as it’s pretty straightforward. What I will do though, is show you how I’ve configured my backups and what those settings mean.

Here is a screenshot of the configuration for one of my B2 backup routines:

Cloud Sync Config

Encryption

When setting up the B2 backup routine, it’s very important to ensure encryption is enabled. This means that all of your data is encrypted at rest, so nobody except you can access it.

You will be asked to configure an encryption passphrase. Once configured, Synology will download certificates so you can decrypt your data later.

Keep both the certificates and passphrase safe. If you lose either of these, you will not be able to retrieve your data!

Sync Direction Settings

Set this option to “Upload Local Changes Only” and check the box that says “Don’t remove files in the destination folder…” Using both of these settings means that files will only sync one way – up to Backblaze. And if you delete a file, that deletion will not be synced.

Duplicates

Let’s say you have a spreadsheet you use to manage your finances. You add your monthly figures in January, then again in February. Cloud Sync won’t simply overwrite your finances file and discard the previous revision.

Instead, Backblaze treats the updated file as a new version. Now, this may result in you using more storage within your B2 bucket, but the cost probably won’t be significant. Plus it will allow you to roll back if you need to.

I personally set the Lifecycle Settings within my buckets to 14 days. This means that B2 will keep 14 days worth of versions for any file.
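For reference, the equivalent B2 lifecycle rule looks something like this (a sketch of B2’s lifecycle-rule JSON; `daysFromHidingToDeleting` controls how long old versions are kept, and the empty `fileNamePrefix` here is an assumption meaning the rule covers the whole bucket):

```json
[
  {
    "daysFromHidingToDeleting": 14,
    "daysFromUploadingToHiding": null,
    "fileNamePrefix": ""
  }
]
```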

Backblaze Bucket Settings

Cost

Backblaze B2 is really cheap. For example, I have 2 buckets – one for home folders and the other for our family share. Over the 2 buckets, I have approximately 150GB of data stored in B2.

My invoice for January 2020 was just $0.39! Backblaze have an online calculator that should give you an idea of what backing up to their service will cost you.

Conclusion

This post should give you a good idea of how to backup a Synology to Backblaze B2. To be honest, it’s easy to configure and once it’s done, you can pretty much set it and forget it. Just make sure you do a test restore!

If you have any questions about this process, please feel free to get in touch with me.

Moving from Nextcloud to Synology

Following a hardware failure in Feb 2019, I moved away from my old Synology to a home built server running Nextcloud. I’ve now moved back to Synology. This post explains why.

So why move from Nextcloud to Synology? I think Nextcloud has the potential to be a great piece of software, but my personal opinion is that they’re trying to force too many ‘features’ in prematurely. This, for me at least, has resulted in an experience that leaves a lot to be desired. I’ve touched on this before in my post, Nextcloud talk is crap.

But it wasn’t just Nextcloud that was the problem – having to be a SysAdmin on the home server was starting to get old. I used the server for 3 main things:

  1. Nextcloud (file sync)
  2. Plex (media streaming)
  3. Duplicati (backups)

I was finding that I had to regularly fix one or more of those components when something inevitably went wrong. That’s when I decided to make a change.

From Nextcloud to Synology

I was lucky enough to get my end of year bonus from work, so I decided to invest in a new Synology. Previously I had a 4-disk solution with 4x1TB disks in RAID 5. This time, I decided to go for the DS218+, which has 2 drive bays, so I went with 2x4TB drives in RAID 1. I also upgraded the system RAM from 2GB to 6GB.

The data migration was so simple to do. It was simply a case of dumping all my data from the old server to a USB hard drive, then dumping it all onto the Synology.

Finally I installed the Synology Drive client onto all of our machines and copied the local data from the Nextcloud folder to the Synology Drive folder. Once the Synology had checked all the data matched up, we were good to go. The data migration was done in a morning.

Setting Up Plex

All of my media files were already dumped onto my USB drive. So all I had to do was setup a new share in the Synology GUI, dump my media into it, then install the Plex app in the Synology Package Centre.

Plex app on Synology

The final step was to setup the libraries and associate the Synology with my Plex account and I was good to go. This whole process took less than an hour to do.

Backups

The final part was to setup backups, after all, if you have data that isn’t backed up, you might as well not have the data at all. I wanted to follow the 3-2-1 backup rule, just like I did on Nextcloud.

Synology has backup apps that can be easily installed and configured, so this was also trivial. Along with the Synology, I bought a new 8TB USB hard drive for backing up locally. I installed the Hyper Backup app and configured it to backup all my important data to USB every night.

For off-site backups, I installed the Synology Cloud Sync app, which syncs my data in real-time to the cloud; for me, this is a Backblaze B2 bucket. If an item is deleted locally, that change is not synced. Cloud Sync also stores a number of file versions for any files that have changed.

All of my backups are encrypted at rest, and the Cloud Sync data is encrypted before being transferred to Backblaze, so I know my data is secure. Configuring the backups also took around an hour. All I had to do was wait 24 hours for my backups to complete and I was good to go.

The only thing I need to do now is carry out a test restore of some data from all of my backups, then I know I’m good. I’ll probably do that this weekend.

Conclusion

I’m really happy to be back on Synology because everything just works. Updates are all handled through the GUI, and can be done automatically, so there is very little admin overhead.

The Synology platform also offers a wider range of apps, all of which work extremely well, unlike many of Nextcloud’s apps. I’m now exploring what else I can start self-hosting on my Synology, as it’s usually as simple as installing an app and clicking a few buttons to configure it.

Overall, I think Nextcloud has a really long way to go before it’s as good as services like Synology. Hopefully one day they will get there, but for now, I’m gonna stick with my Synology.

How To Create A Simple Install Script In Ubuntu

Some people like to upgrade their installation when a new version of Ubuntu is released; personally, I like to nuke and pave a new installation so I’m starting fresh. This post will show you how I created a simple install script to configure a new installation quickly.

Note: this is just how I do it. I’m sure there are many ways to improve this script, but it works for me. If you have suggestions for improvements, please get in touch.

I actually have the process split up into two separate scripts. The first does a system update, sets up my repositories and installs my applications. I then sync all of my data over from my NAS.

Once the sync is complete, I run the second install script. This configures my VPN client, sets up my terminal aliases and configures my symlinks.
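I’m not including the second script in this post, but to give you the idea, a hypothetical sketch of its symlink/alias step might look like the below. The paths and the `nas-sync` folder are examples only, and `TARGET` defaults to a throwaway directory here so the sketch is safe to try; in real use you’d point it at `$HOME`:

```shell
#!/bin/sh
# Sketch of a post-sync config script - paths are examples only.
TARGET="${TARGET:-$(mktemp -d)}"
SYNC_DIR="${SYNC_DIR:-$TARGET/nas-sync}"

# Pretend the NAS sync has already dropped a dotfiles folder in place
mkdir -p "$SYNC_DIR/dotfiles"
touch "$SYNC_DIR/dotfiles/bash_aliases"

# Terminal aliases: symlink the synced aliases file into place
ln -sf "$SYNC_DIR/dotfiles/bash_aliases" "$TARGET/.bash_aliases"

# Confirm the link exists
test -L "$TARGET/.bash_aliases" && echo "aliases linked"
```

Because the data lives on the NAS and only symlinks live on the machine, a nuke-and-pave loses nothing.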

Once these two scripts have run, I have pretty much everything configured on my new OS. This means I can go from fresh install to a fully configured OS in under an hour.

Install Script #1

The first script has to be run as sudo. This is because it needs elevated privileges to install software and add repositories. My script installs packages from a number of sources, including the Ubuntu repositories, DEB files, Snap packages and additional repositories that I add.

The script below isn’t my exact script, but it shows how I install from the various different sources that I need:

#!/bin/sh
 
# Add additional repositories (-y skips the confirmation prompt, so the script doesn't hang)
add-apt-repository -y ppa:tista/adapta # Adapta theme repo
add-apt-repository -y ppa:papirus/papirus # Papirus icon theme repo
add-apt-repository -y ppa:agornostal/ulauncher # Ulauncher repo
add-apt-repository -y ppa:wereturtle/ppa # Ghostwriter repo
 
# Get the latest package lists
apt-get update
 
# Get DEB files
wget https://prerelease.keybase.io/keybase_amd64.deb
wget https://atom.io/download/deb/atom-amd64.deb
wget https://launcher.mojang.com/download/Minecraft.deb
 
# Install from Repo
apt-get install adapta-gtk-theme -y
apt-get install papirus-icon-theme -y
apt-get install gnome-tweak-tool -y
apt-get install ulauncher -y
apt-get install filezilla -y
apt-get install inkscape -y
apt-get install calibre -y
apt-get install torbrowser-launcher -y
apt-get install ghostwriter -y
apt-get install hunspell-en-gb -y # Adds spellcheck to Ghostwriter
apt-get install gimp -y
apt-get install plank -y

# Install snap packages
snap install spotify
snap install gitkraken
 
# Install DEB files
dpkg -i keybase_amd64.deb
dpkg -i atom-amd64.deb
dpkg -i Minecraft.deb
apt --fix-broken install -y # Fix Minecraft dependency issue.
 
# Clean up DEB files
rm -f keybase_amd64.deb
rm -f Minecraft.deb
rm -f atom-amd64.deb
 
# Install requirements for Ulauncher PW generator
apt-get install python3-pip -y
pip3 install pwgen
 
# Final message
echo "All applications have been installed, the script will now quit."
 
# Exit the script
exit 0

To actually use the script, paste the code above into a text editor, edit it as needed and save it as something like install.sh. You then need to right click on the file, go to properties, then the permissions tab and check the box to allow execution.

Alternatively, you can add execute permissions from the terminal with the following command:

chmod +x install.sh

To execute the install script, run the following command:

sudo ./install.sh
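If you forget the sudo, the script will fail partway through with permission errors. A guard at the top of the script catches that early. This isn’t part of my actual script – the check_root function below is just a sketch that demonstrates the comparison you’d make against id -u, taking the UID as an argument so both outcomes can be shown:

```shell
# check_root mimics the guard you could paste under the #!/bin/sh line:
#   if [ "$(id -u)" -ne 0 ]; then echo "Run me with sudo" >&2; exit 1; fi
# It takes a numeric UID as an argument so both outcomes can be demonstrated.
check_root() {
    if [ "$1" -eq 0 ]; then
        echo "Running as root, carrying on."
    else
        echo "Not root - re-run this script with sudo." >&2
        return 1
    fi
}

check_root 0            # what you'd see when launched via sudo
check_root 1000 || true # a normal user gets the warning on stderr instead
```

In the real script you’d use exit 1 rather than return 1, so nothing below the guard runs without root.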

Install Script #2

This second script is designed to set up our user config. Because we’re not installing any applications, and these commands only affect our own user, we do not use sudo to execute the script.

As before, paste the following code into a text editor, save the file as something like install2.sh, then give it execute permissions.

Once you’re ready to execute the script, run the same command as before, but without sudo. So:

./install2.sh

Here is the second script:

#!/bin/sh

# Symlink for config files
mv ~/.config ~/.configOLD
ln -s ~/Nextcloud/Config/ ~/.config

# Symlink .minecraft folder so previous save works.
ln -s ~/Nextcloud/Minecraft ~/.minecraft

# Setup other Symlinks
rm -rf ~/Documents
rm -rf ~/Pictures
rm -rf ~/Public
rm -rf ~/Templates
rm -rf ~/Videos
ln -s ~/Nextcloud/Documents ~/Documents
ln -s ~/Nextcloud/Photos ~/Pictures

# Setup terminal aliases - append them to .bashrc so they persist
# (aliases defined inside a script are lost as soon as the script exits)
echo "alias update='sudo apt update'" >> ~/.bashrc
echo "alias upgrade='sudo apt upgrade -y'" >> ~/.bashrc

# Final message
echo "User folders have been configured, the script will now quit."
 
# Exit the script
exit 0

I personally sync my .config folder from my home directory to Nextcloud. This allows me to have all of my applications configured in the exact same way across all of my machines.

This script creates a bunch of symlinks from my Nextcloud folder. Rather than removing the .config folder, it simply renames it to .configOLD. Once you have confirmed everything is working as it should, it’s fine to delete that old folder.
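The rename-then-symlink pattern is easy to try out in isolation before pointing it at your real home directory. Here’s a throwaway sketch – the /tmp paths are purely for demonstration, not part of my setup:

```shell
# Start from a clean demo area
rm -rf /tmp/symlink-demo
mkdir -p /tmp/symlink-demo/Nextcloud/Config /tmp/symlink-demo/home
cd /tmp/symlink-demo/home

# Simulate an existing ~/.config with something in it
mkdir .config
echo "old settings" > .config/app.conf

# The pattern from the script: rename the original, then symlink the synced copy
mv .config .configOLD
ln -s /tmp/symlink-demo/Nextcloud/Config .config

# The link now resolves to the synced folder, and the old config is kept safe
readlink .config   # -> /tmp/symlink-demo/Nextcloud/Config
ls .configOLD      # -> app.conf
```

Nothing is deleted until you choose to remove .configOLD yourself, which is the whole point of the rename.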

Conclusion

While not perfect, these two install scripts should significantly cut the time it takes to go from a vanilla Ubuntu build to a fully configured system. For me, that’s less than an hour.

Removing Comments – 3 Months On

Three months ago I decided to remove comments from this website. Did I make the right decision, or will I be bringing them back?

If you can’t be bothered to read my previous post, the TL;DR is that dealing with spam and useless comments was taking too much of my time.

Was it the right move?

In short, yes, I think it was. Since removing comments I’ve been able to free up more time for writing posts, and I haven’t had to deal with comments that add nothing to the post itself.

The addition of the contact me link in place of the comments form seems to have worked really well. There has been a marked increase in people emailing me with questions.

This is great, because taking the time to actually email me means that they have something they really want to say, which means the engagements have been much more positive and interesting.

Will I bring them back?

No. Well, not in their traditional form at least. If I ever do decide to bring comments back, it will be in a similar way to how Gilles Chehade does it on his blog.

Basically, every post he creates has a corresponding GitHub issue. That issue is linked at the bottom of the post, and if people want to comment, they can do so by commenting on the issue.

I think this is a great way of letting people comment easily, while adding enough of a barrier to entry to deter spammers and useless comments.

Conclusion

Overall, removing comments from this site has been successful, I feel. My interactions with readers have improved, both via email and via my social accounts. And I haven’t had to manage troves of spam comments.

If you’re on the fence about removing comments from your site, I’d say do it. It has really worked well for me so far.

The Case For WordPress

Apparently, WordPress is slow, insecure and hard to maintain. I disagree and wanted to take a minute to explain why I think none of this is actually true.

This morning, I saw a post from one of the Fosstodon team, Gina. She was asking about creating a blog, and wondered if Ghost would be a good option.

Obviously Gina ended up getting a tonne of different recommendations, which included WordPress, Hugo, Jekyll, Grav and a few others I’d never heard of.

My personal recommendation was WordPress over Ghost. Mainly because I know PHP better than Node.js, and for me, WordPress just works. However, that recommendation was met with a few of the common misconceptions around WordPress.

WordPress is slow

In and of itself, WordPress is NOT slow. WordPress can be slow, but it isn’t inherently slow. Take this website as an example: it’s running WordPress, yet it loads very quickly and scores 96% for mobile and 100% for desktop on Google PageSpeed Insights.

There’s no magic going on here. I use a standard VPS hosted with Ionos for my web server, and I run the W3 Total Cache plugin. Apart from that, I just follow good practices where I can. I don’t even use a CDN.

I don’t have a tonne of plugins installed, I use a well-coded theme that doesn’t have a load of features I don’t need, and I make sure there aren’t huge images littered throughout my posts.
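That last point – huge images – is easy to check for mechanically before you hit publish. A quick sketch using find; the demo folder and 500 KB threshold are arbitrary choices of mine, not a WordPress convention:

```shell
# Build a demo uploads folder: one reasonably sized image, one bloated one
rm -rf /tmp/uploads-demo
mkdir -p /tmp/uploads-demo
head -c 10000  /dev/zero > /tmp/uploads-demo/small.png  # ~10 KB
head -c 800000 /dev/zero > /tmp/uploads-demo/huge.jpg   # ~800 KB

# List any image over 500 KB so it can be compressed before publishing
find /tmp/uploads-demo -type f \( -name '*.jpg' -o -name '*.png' \) -size +500k
# -> /tmp/uploads-demo/huge.jpg
```

Point it at your real uploads directory and anything it prints is a candidate for compression or resizing.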

Marko Saric has a great post that goes into a lot of detail on how to optimise WordPress – it’s really not that hard.

So yes, WordPress can be slow and heavy, but can’t any site be? The point is, it doesn’t have to be, and making it light and fast is easy.

WordPress is hard to maintain

Honestly, I was surprised this came up. I didn’t realise people thought that WordPress was hard to maintain. Most web hosts have a one-click installer for WordPress and the core application updates itself automatically.

If you decide you want to disable auto updates, updating WP core is literally a single click. So I’m really not sure where the idea that WordPress is hard to maintain comes from.

When I was running Grav and Ghost, I found them much more difficult to maintain than WordPress. Once everything is set up, WP lets me focus on writing content, not messing around with maintenance.

WordPress is insecure

This is similar to the whole speed thing. Yes, WordPress can be insecure, but that’s true of any piece of software if you don’t maintain it.

WordPress gets a bad rap for security because of the number of security issues that have been disclosed over the years. A big part of that is simply scale: WP runs on something like 30% of all websites on the entire web.

It’s popular, so it’s a big target. But that doesn’t make it inherently insecure. Administrators with poor security hygiene are what make those sites insecure.

Not updating WP core and plugins, running really old and unsupported versions of PHP, using a tonne of plugins instead of coding features into the site’s theme, and using poor passwords all reduce security. So here are my four simple rules:

  1. Disable pingbacks to prevent DDoS attacks.
  2. Use strong passwords and multi-factor authentication.
  3. Update your shit (including PHP).
  4. Reduce the number of plugins you’re using where possible.

Follow these 4 simple rules and your WordPress site will be significantly more secure.
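Rule 3 can even be automated if your host gives you shell access. Assuming WP-CLI is installed and your site lives at /var/www/html – both of which are assumptions you’ll need to adjust – a cron entry along these lines keeps core, plugins and themes current:

```
# /etc/cron.d/wordpress-updates (hypothetical file) - runs nightly as the web user
0 3 * * * www-data cd /var/www/html && wp core update && wp plugin update --all && wp theme update --all
```

It’s a sketch rather than a recommendation: automatic plugin updates can occasionally break a site, so some people prefer to run the same commands by hand after a backup.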

WordPress needs MySQL & PHP

Yes, that’s right. WordPress requires a database and PHP. So? Why is that a problem? As long as you’re using a modern, supported version of PHP, then you’re good.

The vast majority of web hosts support PHP and let you create MySQL databases – this really isn’t anything special. Lots of other web applications require MySQL and PHP too.

Ghost requires Node.js, which is still poorly supported on a lot of shared hosting environments. Want to run a Ghost blog? You’re probably going to need a VPS for that.

Conclusion

WordPress, Ghost, Hugo, Jekyll, Grav, or coding a blog yourself – who really cares, as long as it works for you? For me, WordPress is the best option.

Yes, there are many insecure and slow examples of WordPress out there, but that’s the fault of site admins, not WordPress.

How To Add CSS Dark Mode To A Website

A lot of people like to have the option of a dark mode for a website. Maybe they prefer the look, or maybe they want to save their eyes from strain. This post will show you how to implement an automatic CSS dark mode that switches depending on your visitor’s operating system theme.

CSS Dark Mode

On this site, I define variables to set the colours of my theme. I’d suggest you do the same, as it will make this process a lot easier. My standard variables are as follows:

:root {
  --accent: #226997;
  --main: #333;
  --light: #666;
  --lighter: #f3f3f3;
  --border: #e6e6e6;
  --bg: #ffffff;
}

If you want to use these variables throughout your stylesheet, you do so like this:

p {
  color: var(--main);
}

This way, if you ever want to change the colours of your theme, all you need to do is amend the variable you defined and everything using that variable will be updated.

Now we need to define a new set of variables that will be used when CSS dark mode is invoked. For me, the additional variables look like this:

/* Define colours for dark mode */
:root {
  --accent: #3493d1;
  --main: #f3f3f3;
  --light: #ececec;
  --lighter: #666;
  --border: #e6e6e6;
  --bg: #333333;
}

Adding Dark Mode Support

We now have our two sets of variables defined. The only thing left to do is wrap the dark set in a prefers-color-scheme media query, so it only applies when the visitor’s system asks for a dark theme.

Take your dark colour variables and add the @media query below:

/* Define colours for dark mode */
@media (prefers-color-scheme: dark) {
  :root {
    --accent: #3493d1;
    --main: #f3f3f3;
    --light: #ececec;
    --lighter: #666;
    --border: #e6e6e6;
    --bg: #333333;
  }
}

That’s literally it! Your site will now automatically switch to dark mode if someone is using a dark operating system theme and visits your site.

My light theme
My dark theme

Testing It Works

I’m sure you will want to test that this change works. To do so, you can simply enable a dark theme on your operating system, such as the iOS dark theme.

Alternatively, if you don’t want to mess around with your OS themes, you can force this test in Firefox. Here’s how:

  1. Open Firefox and type about:config in the address bar and hit enter.
  2. You will be asked to accept the risk. Accept it.
  3. In the search bar, search for ui.systemUsesDarkTheme.
  4. Select the Number radio button and click on the + symbol to add the preference.
  5. Change the value to 1 and click on the tick button.
  6. The page should now turn dark.
  7. Head back to your website and the theme should have automatically updated to dark mode.
  8. If you want to test it switches back, change the value to 0.
  9. Once you have finished testing, click the trash can to delete the option.

Conclusion

You should now have a website that is responsive not only in terms of mobile interface, but in terms of theme too. I’m sure your late night visitors, or those who just prefer a dark themed site, will thank you.