
Diagnosing faulty memory in Linux…

For the past year I’ve had very occasional Chrome crashes (segfaults in the rendering process) and the occasional bit of btrfs corruption. As it was always easily repairable with btrfs check --repair I never thought much about it, although I suspected it might be a memory issue. I ran memtest86 overnight once but it didn’t show any problems. There were never any read errors or SMART issues logged on the disk either, and the corruption happened on another disk within the machine as well.

Recently though I was seeing btrfs corruption on a weekly basis, especially after upgrading to Ubuntu 18.04 (from Ubuntu 16.04). I thought it might be a kernel issue, so I moved to one of the latest kernels. It seemed to happen especially when I was doing something quite filesystem-intensive, for example browsing some cache-heavy pages while running a VM with a long build process going on.

Then, earlier in the week, the hard drive got corrupted again, much more seriously. After spending some time fixing it and running btrfs check --repair a few times, it suddenly started deleting a load of inodes. Force-rebooting the machine, I discovered that the disk was unmountable, although later I was able to recover quite a lot of key data with btrfs restore as documented in this post.

memtest86 was still not showing any issues, so my next thought was that, assuming the hard disk was not at fault, the problem might only appear when there was a lot of contention on the memory (memtest86 was only able to run on a single core on my box). I booted a minimal version of Linux and ran a multi-process test over a large amount (though not all) of the memory:
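A sketch of what this looks like, using the memtester utility (assuming it is installed on the minimal system; the log file paths are arbitrary):

    # Spawn 8 memtester processes in parallel, each testing 1400MB for 10 loops
    for i in $(seq 8); do
        memtester 1400M 10 > /tmp/memtester.$i.log &
    done
    wait
    # Any FAILURE lines in the logs point at a bad module
    grep -i failure /tmp/memtester.*.log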

where 8 is the number of processors/threads, 1400 is the amount of free memory on the system in MB divided by that number (in my case I was testing 16GB of memory), and 10 is the number of runs. It took about 45 minutes to run once over the 16GB, or about 25 minutes to run over 8GB (each of the individual SODIMMs in my laptop).

Within about 10 minutes it started showing issues on one of the chips. I’ve done a bit of research since and it seems that if a memory chip is going to fail, it will usually do so within the first 6 months of use. However this is a Kingston chip that has been in my laptop since I bought it 2 or 3 years back. I added another 8GB Samsung chip a year ago and it seemed to be after that that the issues started, however that chip tests out as fine. Perhaps adding another chip broke something, or perhaps the first one just wore out or overheated somehow…

WhatsApp upgraded, crashes on start

Somehow today my wife’s phone had managed to upgrade to a new version of WhatsApp. When she opened it, it just said that the application had crashed. The same thing also started happening recently with ‘Google Play Services’ and some other apps on her phone.

(As an aside, this is why I turn off auto-update wherever possible, because you never know when something will break.)

However, after much research and debugging, I learnt that the problem is not so much with WhatsApp itself as with the CyanogenMod (custom ROM) that we use on our phones, and it will affect more and more apps over time. Fortunately there is a relatively easy way to fix this – skip to the bottom of this article if you just want to fix the issue.

The technical root cause is documented on the Google issue tracker and comes from a change in the way apps are built once they move to the Gradle 3 build chain. It seems to be fixed in the latest versions of the Google build-tools, so hopefully in the next 6 months this problem will go away, but for the moment it will only increase as teams upgrade their Android build chains. From my quick scan of the bug ticket, the problem is that the implementation of some low-level part of reading an APK package is slightly faulty on CyanogenMod and many other derived custom ROMs. That code path is not normally used, but the new aapt2 build process creates some outputs that trigger the condition in libandroidfw, which then causes the apps not to load.

This means that we just need to patch the library and it fixes the problem:

Download fix for CyanogenMod 12.1.

Download fix for CyanogenMod 13 (untested).

To install this fix you can put it onto your SD card and install it via TWRP or whichever recovery you use. Alternatively, if you have rooted your phone, you can do it by hand by connecting to your phone’s shell with adb shell and setting up the following:
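Roughly speaking (a sketch; the mount point and options can differ between ROMs), you need to become root and make /system writable so the library can be replaced:

    # Inside `adb shell` on the phone
    su
    mount -o remount,rw /system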

Then run the following from your computer to update (after having extracted the zip file):
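Something like the following, run from the directory you extracted the zip into (the library path inside the zip is an assumption; adjust it to match the zip’s contents):

    # Push the patched library over the stock one and restore sane permissions
    adb push system/lib/libandroidfw.so /system/lib/libandroidfw.so
    adb shell chmod 644 /system/lib/libandroidfw.so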

Then reboot your phone and it should all work again.

Transparently serving WebP images from Apache

I’ve recently been working on a website where we are creating a tool to customize a product. We have various renders from the designers with lots of transparency, and we combine these together on the frontend to produce the customized render. Because we need transparency we can’t use the JPEG format, so we have to use PNG; however, as PNG is lossless, the image sizes tend to be very big. Fortunately the WebP format can compress transparent images, including the transparency layer (though this is not enabled by default). Running the WebP converter with light compression over our PNG assets for this project produced a set of WebPs which in total were only 25% of the size of the PNG assets and still high quality. This means much faster loading for the site, especially when displaying multiple renders of the customized product at 5-10 layers per render.

However, WebP support is only available in about 70% of the browsers today. Rather than trying to test for it on the client side, it would be great to just keep the browser-side code the same but serve different assets depending on whether the browser supports it or not.

I found a good start for Apache support for transparently serving WebPs on GitHub, however there were a few bugs in the script. Here is the final version that I used – you need to put it under a <VirtualHost> section.
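In outline it follows the standard mod_rewrite WebP recipe (the sketch below assumes the .webp files sit alongside the originals under the document root): when the browser advertises image/webp support and a matching .webp file exists, that file is served in place of the original.

    <IfModule mod_rewrite.c>
        RewriteEngine On
        # Browser accepts WebP and a pre-generated .webp sits next to the original
        RewriteCond %{HTTP_ACCEPT} image/webp
        RewriteCond %{DOCUMENT_ROOT}/$1.webp -f
        RewriteRule ^/?(.+)\.(png|jpe?g|gif)$ /$1.webp [T=image/webp,L]
    </IfModule>

    <IfModule mod_headers.c>
        # Let caches keep separate copies per Accept header
        Header append Vary Accept
    </IfModule>

    AddType image/webp .webp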

And here is a script to convert all png, jpg or gif files under your image directories to WebP format in such a way that they will be automatically served by the code above.
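A sketch of such a script (the images/ directory name and the quality settings are assumptions):

    #!/bin/bash
    # Create a sibling .webp for every png/jpg/gif under images/; the -nt test
    # only (re)generates a .webp when the source image is newer than it
    find images/ -type f \( -name '*.png' -o -name '*.jpg' -o -name '*.gif' \) |
    while read -r src; do
        webp="${src%.*}.webp"
        if [ "$src" -nt "$webp" ]; then
            convert "$src" -quality 80 \
                -define webp:alpha-compression=1 -define webp:alpha-quality=80 \
                "$webp"
        fi
    done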

Note the -nt comparison that only updates files if the source has changed. You could add this script to git post-checkout and post-merge hooks to automatically keep your WebP assets in sync with the images in the code (and add a .gitignore entry for *.webp – no need to keep 2 copies of each resource in the repository).

Important note: If you’re using an older version of ImageMagick, such as the one on Ubuntu 14.04 (ImageMagick 6.7.7), it doesn’t pass the alpha compression arguments through correctly, so if you have a lot of transparency you won’t see much compression happening. Switch the convert line to something like the below, but you then need to remove the gif support, as that requires using the gif2webp command to convert:
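For example, calling the encoder from the webp tools directly (the quality values are assumptions):

    # cwebp compresses the alpha channel properly, but it cannot read gifs
    cwebp -quiet -q 80 -alpha_q 80 "$src" -o "$webp"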

Also note that this causes some issues when you have, for example, a jpg and a png with the same base name but different contents (I found a few in the old code I inherited). You can find the base names of any of these clashes using the following command:
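Something along these lines does the job (again assuming the images live under images/):

    # Strip the extensions and print any base name that appears more than once
    find images/ -type f \( -name '*.png' -o -name '*.jpg' -o -name '*.gif' \) \
        | sed 's/\.[^.]*$//' | sort | uniq -d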

Using wildcards in ssh configuration to create per-client setups

In my role as a Linux consultant I tend to work with a number of different companies. Obviously they all use SSH for remote access, and many require going through a gateway/bastion server first in order to access the rest of the network. I want to keep these clients as separate and as secure as possible, so I’ll always create a new SSH key for each client. Most clients have large numbers of machines on their network, and rather than having to cut and paste a lot of different configurations together you can use wildcards in your ~/.ssh/config file.

However this is not amazingly easy, as SSH configuration requires the most general settings to be at the bottom of the file (the first value found for each option wins). So here’s a typical setup I might use for an imaginary client called abc:
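A sketch of that layout (the host names, gateway address and user are made up for illustration):

    # The client's bastion box, connected to directly
    Host abc-gateway
        HostName gateway.abc.example.com

    # Everything on the client's internal network is reached through the bastion
    # (names after the jump are resolved on the gateway itself)
    Host *.abc.internal
        ProxyJump abc-gateway

    # Most general settings last: per-client user and key for all of the above
    Host abc-gateway *.abc.internal
        User myuser
        IdentityFile ~/.ssh/id_abc
        IdentitiesOnly yes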

Using Letsencrypt with Wowza Media Server

As part of a work project I needed to set up Wowza Media Server to do video streaming. As the webapp (which I wrote using the excellent Ionic 3 framework) is running under https, it won’t accept video traffic coming from non-encrypted sources. Wowza has some pricey solutions for automatically installing SSL certificates for you, and you can also purchase certificates, however these days I don’t see why everyone doesn’t just use the free and easily automated Let’s Encrypt system. Unfortunately, Let’s Encrypt doesn’t cope particularly easily with other servers already listening on its validation ports, although it does have some hooks to stop/start services that may already be listening on port 443 (SSL). I happen to be using a Red Hat/CentOS distro, although I’m pretty sure the exact same instructions will work on Ubuntu and other distros.

Firstly, you need to download the wowza-letsencrypt-converter Java program, which converts Let’s Encrypt certificates into the Java keystore format that Wowza can use. Install the prebuilt JAR under /usr/bin.

Now, create a directory under the Wowza conf directory called ssl, and in it create a file called jksmap.txt (so for example the full path is /usr/local/WowzaStreamingEngine/conf/ssl/jksmap.txt) which lists all the domains the Wowza server will be listening on, like:
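For example (the domain is a placeholder; the keystore file name and path follow what the converter writes out):

    streaming.example.com={"keyStorePath":"/usr/local/WowzaStreamingEngine/conf/ssl/streaming.example.com.jks", "keyStorePassword":"secret", "keyStoreType":"JKS"}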

‘secret’ is not actually a placeholder; it’s the password that the wowza-letsencrypt-converter program sets up automatically so keep it as it is.

Configure SSL on the Wowza server by editing the VHost.xml configuration file (find out more about this process in the Wowza documentation). Find the 443/SSL <HostPort> section, which is commented out by default, uncomment it and change the following sections:
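A sketch of the relevant <SSLConfig> settings (element names as per Wowza’s per-domain keystore map support; the path assumes the default install location):

    <SSLConfig>
        <KeyStorePath>foo</KeyStorePath>
        <KeyStorePassword></KeyStorePassword>
        <KeyStoreType>JKS</KeyStoreType>
        <!-- Point Wowza at the per-domain keystore map created above -->
        <DomainToKeyStoreMapPath>${com.wowza.wms.context.VHostConfigHome}/conf/ssl/jksmap.txt</DomainToKeyStoreMapPath>
    </SSLConfig>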

Note the <KeyStorePath>foo</KeyStorePath> line – the value foo is ignored when using jksmap.txt, however if this is empty the server refuses to start or crashes.

Next, install letsencrypt using the instructions on the certbot website.

Once you’ve done all this, run the following command to temporarily stop the server, fetch the certificate, convert it and start the server again:
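Something along these lines (the domain, service name, and the converter JAR’s name and argument order are assumptions; check the converter’s README for the exact invocation):

    # Stop Wowza, obtain the certificate standalone, convert it, then restart
    certbot certonly --standalone -d streaming.example.com \
        --pre-hook "service WowzaStreamingEngine stop" \
        --post-hook "java -jar /usr/bin/wowza-letsencrypt-converter.jar /usr/local/WowzaStreamingEngine/conf/ssl/ /etc/letsencrypt/live/ && service WowzaStreamingEngine start"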

Then, in order to ensure that the certificate remains valid, you need to set up a cron entry to run this daily; it will automatically renew the cert when it gets close to its default 3-month expiry. Simply create /etc/cron.d/wowza-cert-renewal with the following content:
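A minimal sketch (the time of day is arbitrary; certbot replays the pre/post hooks recorded above whenever a renewal actually takes place):

    # Attempt a renewal once a day; certbot only renews when the cert is near expiry
    30 3 * * * root /usr/bin/certbot renew --quiet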