Recovering from a strange mysql crash

So, yesterday my server started emailing me cron errors. One particular script that runs every hour was throwing an error about 10 seconds into its run, which looked like the server hanging up on the query. I thought I might have set some timeout too low, such that the server wouldn’t allow queries longer than 10 seconds (as this is a web server, nothing should take that long apart from a few analytics scripts that run overnight). Running the query by hand showed the same problem, so I started analysing its component parts to see which was taking so long. Then I looked at the timeouts that were set and noticed:

mysql> show status like "%time%";
| Variable_name              | Value |
| Uptime                     | 105   |
| Uptime_since_flush_status  | 105   |

Hmmm that looks bad. Looking in dmesg confirms that mysql has not been hanging up – it’s actually been crashing!

[504510.549172] init: mysql main process (15317) terminated with status 1
[504510.549186] init: mysql main process ended, respawning

I ran mysqlcheck on the database in question and the server crashed again, even though I was able to query the table fine and had even added an index before realizing that there was some issue with it. So, rather than restore from a backup, I thought I’d just clone the table and replace the existing one with it:

mysql> create table t like client_songs;
mysql> insert t select * from client_songs;
mysql> check table t;

The new table looks fine; let’s do a final sync (as the original is continually being inserted into):

mysql> insert t select * from client_songs cs
         where client_song_ts >= ( select max(client_song_ts) from t )
         on duplicate key update
           select_count   = cs.select_count,
           download_count = cs.download_count,
           rating         = cs.rating,
           client_song_ts = cs.client_song_ts,
           print_count    = cs.print_count;

Then put it live:

mysql> drop table client_songs;
mysql> rename table t to client_songs;
mysql> check table client_songs;

Everything is working again. I wish mysql (5.5.37 from ubuntu 14.04 LTS) was more reliable – that’s why I tend to use postgres for new projects these days. It’s really strange that the table could be read fine but one particular query caused it to crash – probably a case of the particular index being used for the query being corrupted, but not the row data.
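Since the only visible symptom here was a suspiciously low Uptime, a tiny watchdog in cron can catch silent restarts like this early. A minimal sketch – the threshold and the mysql invocation shown in the comment are assumptions to adapt to your setup:

```shell
# Warn when mysqld's Uptime status variable is suspiciously low,
# which indicates a recent crash/restart.
check_uptime() {
    if [ "$1" -lt 600 ]; then
        echo "WARNING: mysqld uptime is only ${1}s - possible crash/restart"
    fi
}

# In a real cron job you would feed it the live value, e.g.:
#   check_uptime "$(mysql -N -B -e 'SHOW GLOBAL STATUS LIKE "Uptime"' | awk '{print $2}')"
check_uptime 105   # → WARNING: mysqld uptime is only 105s - possible crash/restart
```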

Getting the HP P1005 and associated LaserJet printers working with a Raspberry Pi

The standard hplip doesn’t seem to work very well on the Raspberry Pi with printers like the HP LaserJet 1000/1005/1018/1020/P1005/P1006/P1007/P1008/P1505, as they need a custom firmware downloading to them before they can print anything. It seems to be an issue with the firmware download or page formatting tool, which just makes the printer give a click sound and do no more. This is annoying because I wanted to use my Raspberry Pi as a file and print server – my all-in-one HP DeskJet F2420 set up really easily, but I struggled for a long time to make the LaserJet work correctly.

The basic way to make this work is to download and compile the foo2zjs open source printer driver (as the apt sources don’t seem to be working properly, at least on my slightly older version of Raspbian). After downloading and compiling, you install it using the following commands:

rm /etc/udev/rules.d/*hpmud* # remove any existing hotplug from HP
./getweb P1005 # change depending on your printer
make install
make install-hotplug
make cups

For some reason the install script didn’t seem to correctly install the firmware on my system so I had to work around it:

mkdir -p /usr/share/foo2xqx/firmware/
mv sihp* /usr/share/foo2xqx/firmware/

If you want to test without having to use cups, try the following (after turning your printer off and on again):

foo2xqx-wrapper testpage.ps > testpage.xq # convert to wire-format (any small PostScript file will do as input)
cat /usr/share/foo2xqx/firmware/sihpP1005.dl > /dev/usb/lp0 # bang firmware over to printer (wait 5 sec for it to install)
cp testpage.xq /dev/usb/lp0 # print the page (hopefully)

Depending on your printer you may need to use foo2zjs-wrapper instead of foo2xqx-wrapper.
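Choosing between the two wrappers by model can be scripted; the mapping below follows the model lists above and the foo2zjs documentation, but treat it as a starting point and verify it against your own printer:

```shell
# Map a LaserJet model to the foo2zjs wrapper that drives it
# (P-series use the XQX stream; the older models use ZjStream).
wrapper_for() {
    case "$1" in
        P1005|P1006|P1007|P1008|P1505) echo foo2xqx-wrapper ;;
        1000|1005|1018|1020)           echo foo2zjs-wrapper ;;
        *) echo "unknown model: $1" >&2; return 1 ;;
    esac
}

wrapper_for P1005   # → foo2xqx-wrapper
```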

That proves it works, so you can now set it up in cups nice and easily and use it as a remote printer!

A Facebook Share component for AngularJS

There are several facebook share components available for AngularJS, however I needed something that could:

  • Show a count of the number of shares
  • Work within the translation infrastructure of my app (ie a custom template)
  • Handle different URL mappings for pages – whilst a typical angular URL might be…, facebook can’t scrape this unless you use the #! method or somesuch. Because of this we typically map to different links on the server, such as…

The below code snippets work together to do all of this in a few lines:

angularApp.directive("ngFbShare", function($http, $rootScope) {
    return {
        restrict: 'A',
        templateUrl: 'views/components/fbshare.html',
        scope: {
            ngFbShare: "@"
        },
        link: function(scope, elem, attrs) {
            var link; // always the latest link relative to the base of the server. Use this for click tracking etc
            var update_share_link = function() {
                link = '/landing/' + scope.ngFbShare;
                scope.share_link = MY_FULL_ADDRESS + link; // MY_FULL_ADDRESS is the site's base URL, defined elsewhere
            };
            scope.$watch('ngFbShare', update_share_link);

            scope.fb_total_count = 0;
            $http.get('' + scope.share_link)
                .then(function(res) {
                    scope.fb_total_count =[0].total_count;
                });
        }
    };
});

<a href="{{ share_link }}" target="fb_share_window" class="btn btn-primary btn-fb">
    <i class="fa fa-fw fa-facebook"></i> Share
    <span class="count">{{ fb_total_count }}</span>
</a>

Convert emf files to png/jpg format on Linux

For a project recently I was sent some excel files with images embedded in them. Not fun. Then I discovered that these were in some random windows format, emf or wmf (depending on whether I exported as .xlsx or .ods from libreoffice), which I think was just wrapping a jpg/png file in some vector/clipart format. Fortunately there’s a great script called unoconv that uses bindings into libreoffice/openoffice to render pretty much anything; however it doesn’t seem possible to change the page size/resolution. If you use the PDF output, though, the image is simply embedded in the PDF, and you can then use the pdfimages command to extract the original images out of there. Finally, some of these had different white borders, so I cropped those and converted to png. Full commands below:

rm -fr out; mkdir out;
for i in xl/media/image*.emf; do
  unoconv -f pdf -o t.pdf "$i";
  pdfimages t.pdf out;
  convert out-000.ppm -trim out/$(basename "$i").png;
done
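One niggle with the loop above: basename keeps the .emf suffix, so files come out named like image1.emf.png. Passing the suffix as a second argument to basename strips it, if you prefer clean names:

```shell
i=xl/media/image1.emf
basename "$i"          # → image1.emf (so the loop writes out/image1.emf.png)
basename "$i" .emf     # → image1 (use this for out/image1.png)
```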

Drupal: Importing commerce products with Feeds Import 3

My CMS of choice for non-bespoke projects is Drupal – even though it’s written in PHP, it seems a lot more secure, stable and extensible than most CMSes out there. Recently I’ve been working on an ecommerce site using Drupal Commerce, which is a bit tricky to learn but very flexible and well integrated with Drupal. Today I needed to import a product list into the new system from an existing platform. Fortunately with Drupal’s Feeds Import module this is pretty straightforward (after reading the documentation about how to process multiple taxonomies etc). However it seems it recently had an upgrade, and version 3 is incompatible with version 2 (there’s a Commerce adaptor for v2).

I couldn’t find any code about how to integrate this latest version of Feeds Import with Drupal Commerce to import the prices of the products (which are linked to standard nodes using a Product Reference field). So, I created an input filter of my own to do this, see the code below. Note the custom cover_type field and the setting of the extended data attribute of the pricing detail.

class CommerceImportFilter {
    public static function add_product( $field ) {
        $cp = commerce_product_new('product');
        $cp->title = 'softcover';
        $cp->field_cover_type = array(LANGUAGE_NONE => array( 0 => array(
            'value' => 'soft',
        )));
        $cp->commerce_price = array(LANGUAGE_NONE => array( 0 => array(
            'amount' => $field * 100,
            'currency_code' => 'TRY',
            'data' => array( 'include_tax' => 'kitap_kdv' ),
        )));
        commerce_product_save($cp); // the product needs saving before it has a product_id
        return $cp->product_id;
    }
}

Solved: Problems with connecting ath9k to 802.11n network

So, I was at a friend’s house and tried to connect my Qualcomm Atheros AR9285 Wireless Network Adapter (ath9k driver on linux) to their wireless network (D-link DIR-615). It was connecting and then, 10 seconds later, disconnecting without ever properly establishing a connection. Output as below:

[ 6350.957601] wlan0: authenticate with XXX
[ 6350.971542] wlan0: send auth to XXX (try 1/3)
[ 6350.973230] wlan0: authenticated
[ 6350.976927] wlan0: associate with XXX (try 1/3)
[ 6350.980936] wlan0: RX AssocResp from XXX (capab=0xc31 status=0 aid=3)
[ 6350.981006] wlan0: associated
[ 6350.981376] cfg80211: Calling CRDA for country: GB
[ 6350.984168] ath: EEPROM regdomain: 0x833a
[ 6350.984172] ath: EEPROM indicates we should expect a country code
[ 6350.984174] ath: doing EEPROM country->regdmn map search
[ 6350.984175] ath: country maps to regdmn code: 0x37
[ 6350.984177] ath: Country alpha2 being used: GB
[ 6350.984178] ath: Regpair used: 0x37
[ 6350.984179] ath: regdomain 0x833a dynamically updated by country IE
[ 6350.984207] cfg80211: Regulatory domain changed to country: GB
[ 6350.984209] cfg80211:  DFS Master region: unset
[ 6350.984210] cfg80211:   (start_freq - end_freq @ bandwidth), (max_antenna_gain, max_eirp), (dfs_cac_time)
[ 6350.984213] cfg80211:   (2402000 KHz - 2482000 KHz @ 40000 KHz), (N/A, 2000 mBm), (N/A)
[ 6350.984215] cfg80211:   (5170000 KHz - 5250000 KHz @ 40000 KHz), (N/A, 2000 mBm), (N/A)
[ 6350.984217] cfg80211:   (5250000 KHz - 5330000 KHz @ 40000 KHz), (N/A, 2000 mBm), (0 s)
[ 6350.984218] cfg80211:   (5490000 KHz - 5710000 KHz @ 40000 KHz), (N/A, 2700 mBm), (0 s)
[ 6350.984220] cfg80211:   (57240000 KHz - 65880000 KHz @ 2160000 KHz), (N/A, 4000 mBm), (N/A)
[ 6360.987225] wlan0: deauthenticating from XXX by local choice (Reason: 3=DEAUTH_LEAVING)

Not very nice. I browsed around on the internet but couldn’t find anything obvious; eventually, by looking through the different options the ath9k kernel driver accepts, I found the ath9k_hw_btcoex_disable option, which seems to do the trick.

echo "options ath9k nohwcrypt=1 ath9k_hw_btcoex_disable" | sudo tee /etc/modprobe.d/ath9k.conf
sudo rmmod ath9k ath9k_hw ath9k_common
sudo modprobe -v ath9k ath9k_hw ath9k_common

and it all works again.