Category Archives: Linux

WhatsApp upgraded, crashes on start

Somehow today my wife’s phone had managed to upgrade to a new version of WhatsApp. When she opened it, it just said that the application had crashed. This also started happening recently with ‘Google Play Services’ and some other apps on her phone.

(As an aside, this is why I turn off auto-update wherever possible – you never know when something will break.)

However, after much research and debugging I learnt that the problem is not so much with WhatsApp itself as with the CyanogenMod custom ROM that we use on our phones, and it will happen increasingly often. Fortunately there is a relatively easy way to fix this – skip to the bottom of this article if you just want to fix the issue.

The technical root cause is documented on the Google issue tracker and is caused by a change in the way apps are built once they are upgraded to the Gradle 3 build chain. It seems to be fixed in the latest versions of the Google build tools, so hopefully in the next 6 months this problem will go away, but for the moment it will only increase as teams upgrade their Android build chains. From my quick scan of the bug ticket, the problem is that the implementation of some low-level part of reading an APK package on CyanogenMod, and on many other derived custom ROMs, is slightly faulty. That code path is not normally used, but the new aapt2 build process creates some outputs that trigger the condition in libandroidfw, which then causes the apps not to load.

This means that we just need to patch the library and it fixes the problem:

Download fix for CyanogenMod 12.1.

Download fix for CyanogenMod 13 (untested).

To install this fix you can put it onto your SD card and install it via TWRP or whichever recovery you use. Alternatively, if you have rooted your phone, you can do it by hand by connecting to your phone’s shell with adb shell and setting up the following:
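Something along these lines will remount the system partition read-write (assuming a rooted phone; the mount point may vary by device):

    # on the phone, inside 'adb shell'
    su
    mount -o remount,rw /system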

Then run the following from your computer to update (after having extracted the zip file):
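For example, something along these lines (the exact library paths inside the zip and on the device may differ by ROM version):

    # from the directory where you extracted the zip
    adb push system/lib/libandroidfw.so /system/lib/libandroidfw.so
    adb shell chmod 644 /system/lib/libandroidfw.so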

Then reboot your phone and it should all work again.

Transparently serving WebP images from Apache

I’ve recently been working on a website where we are creating a tool to customize a product. We have various renders from the designers with lots of transparency, and we combine these together on the frontend to produce the customized render. Because we need transparency we can’t use the JPEG format, so we have to use PNG; however, as this is lossless, the image sizes tend to be very big. Fortunately the WebP format can compress transparent images, including the transparency layer (though this is not enabled by default). Running the WebP converter with light compression over the PNG assets for this project produced a set of WebPs that were in total only 25% of the size of the PNG assets, while still high quality. This means much faster loading for the site, especially when displaying multiple renders of the customized product at 5-10 layers per render.
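As an illustration, the conversion is a one-liner per file along these lines (quality values here are illustrative; -alpha_q applies lossy compression to the transparency layer as well):

    # requires the 'webp' package; file names are examples
    cwebp -q 80 -alpha_q 80 render.png -o render.webp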

However, WebP support is only available in about 70% of browsers today. Rather than trying to test for it on the client side, it would be great to keep the browser-side code the same but serve different assets depending on whether the browser supports WebP or not.

I found a good start for Apache support for transparently serving WebPs on GitHub; however, there were a few bugs in the script. Here is the final version that I used – you need to put it under a <VirtualHost> section.
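In outline it looks like this (a sketch assuming the standard mod_rewrite and mod_headers modules; it serves foo.webp in place of foo.png/jpg/gif when the browser advertises WebP support and the .webp file exists):

    <IfModule mod_rewrite.c>
        RewriteEngine On
        # browser advertises webp support and a matching .webp file exists
        RewriteCond %{HTTP_ACCEPT} image/webp
        RewriteCond %{DOCUMENT_ROOT}/$1.webp -f
        RewriteRule ^/?(.+)\.(?:png|jpe?g|gif)$ /$1.webp [T=image/webp,L]
    </IfModule>
    <IfModule mod_headers.c>
        # let caches know the response depends on the Accept header
        Header append Vary Accept
    </IfModule>
    AddType image/webp .webp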

And here is a script to convert all PNG, JPG or GIF files under your image directories to WebP format, in such a way that they will be automatically served by the code above.
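A sketch of such a script (assuming ImageMagick's convert, and that your assets live under images/; adjust the path and quality to taste):

    #!/bin/bash
    find images -type f \( -name '*.png' -o -name '*.jpg' -o -name '*.gif' \) |
        while read -r src; do
            dst="${src%.*}.webp"
            # -nt: only reconvert when the source is newer than the existing webp
            if [ "$src" -nt "$dst" ]; then
                convert "$src" -quality 80 -define webp:alpha-quality=80 "$dst"
            fi
        done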

Note the -nt comparison that only updates files if the source has changed. You could add this script to git post-checkout and post-merge hooks to automatically keep your WebP assets in sync with the images in the code (and add a .gitignore entry for *.webp – no need to keep 2 copies of each resource in the repository).

Important note: if you’re using an older version of ImageMagick such as the one on Ubuntu 14.04 (ImageMagick 6.7.7), it doesn’t pass the alpha compression arguments through correctly, so if you have a lot of transparency you won’t see much compression happening. Switch the convert line to something like the below; however, you then need to remove the GIF support, as that requires using the gif2webp command to convert:
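For example, using cwebp directly (same illustrative quality values as above):

    cwebp -q 80 -alpha_q 80 "$src" -o "$dst"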

Also note that this causes some issues when you have, for example, a jpg and a png with the same base name but different contents (I found a few in the old code I inherited). You can find the base name of any of these clashes using the following command:
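Something along these lines works (assuming the same images/ directory as in the script above):

    # print base names that occur with more than one image extension
    find images -type f \( -name '*.png' -o -name '*.jpg' -o -name '*.gif' \) |
        sed 's/\.[^.]*$//' | sort | uniq -d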

Using wildcards in ssh configuration to create per-client setups

In my role as a Linux consultant, I tend to work with a number of different companies. Obviously they all use SSH for remote access, and many require going through a gateway/bastion server first in order to access the rest of the network. I want to keep these clients as separate and secure as possible, so I’ll always create a new SSH key for each client. Most clients have a large number of machines on their network, and rather than having to cut and paste a lot of different configurations together you can use wildcards in your ~/.ssh/config file.

However, this is not amazingly easy, as SSH configuration requires the most general settings to be at the bottom of the file. So here’s a typical setup I might use for an imaginary client called abc:
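Something like the following (every host name, address, user and key path here is invented for illustration):

    # ~/.ssh/config
    # the gateway/bastion, reached directly
    Host abc-gateway
        HostName gateway.abc.example.com

    # short aliases for machines on the internal network
    Host abc-web
        HostName 10.0.1.10

    Host abc-db
        HostName 10.0.1.11

    # everything at this client except the gateway is tunnelled through it
    Host abc-* !abc-gateway
        ProxyCommand ssh -W %h:%p abc-gateway

    # the most general settings for this client come last
    Host abc-*
        User abcuser
        IdentityFile ~/.ssh/abc_id_rsa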

Using Letsencrypt with Wowza Media Server

As part of a work project, I needed to set up Wowza Media Server to do video streaming. As the webapp (which I wrote using the excellent Ionic 3 framework) is running under https, it won’t accept video traffic coming from non-encrypted sources. Wowza has some pricey solutions for automatically installing SSL certificates for you, and you can also purchase certificates yourself, but these days I don’t see why everyone doesn’t just use the free and easily automated letsencrypt system. Unfortunately, letsencrypt doesn’t make it particularly easy to run servers on other ports, although it does have some hooks to stop/start services that may already be listening on port 443 (SSL). I happen to be using a Red Hat/CentOS distro, although I’m pretty sure the exact same instructions will work on Ubuntu and other distros.

Firstly, you need to download the wowza-letsencrypt-converter Java program, which converts letsencrypt certificates to the Java keystore format that Wowza can use. Install the prebuilt jar under /usr/bin.
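For example (the version number in the jar name is illustrative; use whatever the project's releases page gives you):

    # after downloading the prebuilt jar from the project's GitHub releases page
    sudo cp wowza-letsencrypt-converter-0.1.jar /usr/bin/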

Now, create a directory under the Wowza conf directory called ssl, and in it a file called jksmap.txt (so, for example, the full path is /usr/local/WowzaStreamingEngine/conf/ssl/jksmap.txt) listing all the domains the Wowza server will be listening on, like:
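One line per domain, something like (the domain is an example):

    streaming.example.com={"keyStorePath":"/usr/local/WowzaStreamingEngine/conf/ssl/streaming.example.com.jks", "keyStorePassword":"secret", "keyStoreType":"JKS"}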

‘secret’ is not actually a placeholder; it’s the password that the wowza-letsencrypt-converter program sets up automatically, so keep it as it is.

Configure SSL on the Wowza server by editing the VHost.xml configuration file (you can find out more about this process in the Wowza documentation). Find the 443/SSL section, which is commented out by default, and change the following sections:
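In outline the SSL section ends up looking something like this (a sketch; see the Wowza documentation for the full HostPort block):

    <HostPort>
        <Port>443</Port>
        <SSLConfig>
            <!-- the value here is ignored when jksmap.txt is used, but must not be empty -->
            <KeyStorePath>foo</KeyStorePath>
            <KeyStorePassword></KeyStorePassword>
            <KeyStoreType>JKS</KeyStoreType>
            <DomainToKeyStoreMapPath>${com.wowza.wms.context.VHostConfigHome}/conf/ssl/jksmap.txt</DomainToKeyStoreMapPath>
            <SSLProtocol>TLS</SSLProtocol>
        </SSLConfig>
    </HostPort>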

Note the <KeyStorePath>foo</KeyStorePath> line – the value foo is ignored when using jksmap.txt, however if this is empty the server refuses to start or crashes.

Next, install letsencrypt using the instructions on the certbot website.

Once you’ve done all this, run the following command to temporarily stop the server, fetch the certificate, convert it and start the server again:
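Something along these lines (the domain and service name are illustrative, and the converter's arguments are schematic – check its README for the exact usage):

    certbot certonly --standalone -d streaming.example.com \
        --pre-hook "service WowzaStreamingEngine stop" \
        --post-hook "java -jar /usr/bin/wowza-letsencrypt-converter-0.1.jar /usr/local/WowzaStreamingEngine/conf/ssl/ /etc/letsencrypt/live/ && service WowzaStreamingEngine start"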

Then, in order to ensure that the certificate remains valid, you need to set up a cron entry to run this command daily; it will automatically renew the cert when it gets close to its default 3-month expiry. Simply create /etc/cron.d/wowza-cert-renewal with the following content:
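For example (again, the service name and converter arguments are illustrative):

    # check once a day; certbot only actually renews when the cert is near expiry
    30 4 * * * root certbot renew --pre-hook "service WowzaStreamingEngine stop" --post-hook "java -jar /usr/bin/wowza-letsencrypt-converter-0.1.jar /usr/local/WowzaStreamingEngine/conf/ssl/ /etc/letsencrypt/live/ && service WowzaStreamingEngine start"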

Easily set up a secure FTP server with vsftpd and letsencrypt

I recently had to set up an FTP server for some designers to upload their work (unfortunately they couldn’t use SFTP, otherwise it would have been much simpler!). I’ve not had to set up vsftpd for a while, and when I last did I didn’t worry much about encryption. So here are some notes on how to set up vsftpd with letsencrypt on Ubuntu 14.04/16.04 so that only a specific user or two are permitted access.

First, install vsftpd:
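    sudo apt-get install vsftpd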

Next, you need to make sure you have installed letsencrypt. If not, you can do so using the instructions here – fortunately letsencrypt installation has got a lot easier since my last blog post about letsencrypt almost 2 years ago.

I’m assuming you are running this on the same server as the website, and that you want FTP on the same domain or a similar subdomain to the website (e.g. FTP access directly to example.org, or via something like ftp.example.org). If not, you can do a manual install of the certificate, but then you will need to redo this every 3 months.

Assuming you’re running the site on Apache, get the certificate like:
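A sketch using the Apache plugin (substitute your own domain):

    certbot certonly --apache -d example.org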

You should now have the necessary certificates in the /etc/letsencrypt/live/example.org/ folder, and your site should be accessible nicely via https.

Now, create a user for FTP using the useradd command. If you want a user that only has access to the server via FTP but no regular account, you can modify the PAM configuration file /etc/pam.d/vsftpd and comment out the following line:
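On Ubuntu that is the shell check, shown here already commented out:

    # auth   required    pam_shells.so

Creating the user might then look like (the username is an example):

    sudo useradd -m -s /usr/sbin/nologin designer1
    sudo passwd designer1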

This lets you keep nologin as the shell so the user cannot login normally but can log in via vsftpd’s PAM layer.

Now open up /etc/vsftpd.conf and configure it to use SSL with the letsencrypt certificates:
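The key settings, assuming the certificate from above (the domain in the paths is an example):

    # enable TLS using the letsencrypt certificate
    ssl_enable=YES
    rsa_cert_file=/etc/letsencrypt/live/example.org/fullchain.pem
    rsa_private_key_file=/etc/letsencrypt/live/example.org/privkey.pem
    # require encryption for both logins and data transfers
    force_local_logins_ssl=YES
    force_local_data_ssl=YES
    # allow local users to log in and upload
    local_enable=YES
    write_enable=YES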

Because we’re running behind a firewall we want to specify which port range to open up for the connections (as well as port 21 for FTP of course):
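For example (the range itself is arbitrary – just match it to your firewall rules):

    pasv_enable=YES
    pasv_min_port=40000
    pasv_max_port=40100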

If you want to make it even more secure by only allowing users listed in /etc/vsftpd.userlist to be able to log in, add some usernames in that file and then add the following to the /etc/vsftpd.conf configuration file:
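    # only allow users explicitly listed in /etc/vsftpd.userlist
    userlist_enable=YES
    userlist_file=/etc/vsftpd.userlist
    userlist_deny=NO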

You can test using the excellent lftp command:
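For example (the user and host are illustrative):

    lftp -u designer1 -e 'set ftp:ssl-force true' ftp.example.org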

If the cert is giving errors or is self-signed, you can do the following to connect ignoring them:
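    lftp -u designer1 -e 'set ftp:ssl-force true; set ssl:verify-certificate no' ftp.example.org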

Fixing Ubuntu 16.04 massive internal microphone distortion

A while ago I upgraded from Ubuntu 14.10 to 16.04. Afterwards, my laptop’s internal microphone became massively distorted, to the point that people on the other end of Skype or Hangouts calls couldn’t understand me at all.

Looking in the ALSA settings I noticed that the “Internal Mic Boost” was constantly being set to 100%, and when I dropped it down to 0% everything was fine. On my laptop at least it seems to be coupled with the “Mic Boost” control, which boosts both but without quite so much distortion; i.e. the “Internal Mic Boost” is a boost on top of the “Mic Boost”, which is obviously a problem.

I couldn’t find much detail about how to configure this properly, so after some hacking around I came up with the following solution. Go through every file in /usr/share/pulseaudio/alsa-mixer/paths and look for a “[Element Internal Mic Boost]” section. If it is there, you should see a setting under it like “volume = merge”; turn that into “volume = off”. To prevent it being changed back later when ALSA is updated, you can run:
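One way to do that is a dpkg-divert per edited file, something like (the file name here is one common example):

    # tell dpkg to install future package versions of this file elsewhere,
    # keeping the edited copy in place
    f=/usr/share/pulseaudio/alsa-mixer/paths/analog-input-internal-mic.conf
    sudo dpkg-divert --add --rename --divert "$f.orig" "$f"
    # --rename moved the edited file aside, so put it back
    sudo cp "$f.orig" "$f"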

I’d love to hear if there is a simpler way to work around this issue, but it works for me at least!

Successfully downloading big files from Dropbox via Linux command-line

Recently, someone was trying to send me a 20GB virtual machine image over Dropbox. I tried a couple of times to download it using Chrome; however, it got to 6-8GB and then came up with a connection error. Clicking on the resume button failed and then removed the file (!). Very strange, as I didn’t have any connection issues, but perhaps a route changed somewhere. I’ve seen a number of Dropbox users complaining about this on the internet. Obviously there are other approaches, such as adding the file to your own Dropbox account and using their local program to do the sync, but because I’m just on a standard free account I couldn’t add such a large file.

Because I was using btrfs and snapper I still had a version of the half-completed download around, so I tried seeing if standard Linux tools would be able to continue the download where it left off. It turns out that simply using wget -c enables you to resume the download (it dropped a couple of times during the download, but just restarting it with the same command let the whole file download just fine). So, to download a large Dropbox file even if your internet connection is a bit flaky, simply take the Dropbox download link and paste it into the terminal (it may require the ?dl=1 parameter after it) like:
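(The URL here is a made-up example; -c tells wget to continue a partial download.)

    wget -c 'https://www.dropbox.com/s/xxxxxxxx/bigfile.img?dl=1'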