Background Slideshow with AngularJS and Bootstrap

As part of a project we wanted the front page to have a nice rotating background for the jumbotron. There are a number of carousel components and scripts that can easily be found online, but most use the img tag and/or require an absolutely-positioned root div, which means they won't automatically resize to the jumbotron content. I wanted a jumbotron that would resize to its content and also provide a nice seamless transition between the images. So, I sat down and rolled my own.

Firstly you need to set up the jumbotron styles:

.jumbotron-slideshow {
    position: relative;
    background-color: transparent;  // replace the standard bootstrap background color

    .slideshow {
        background-size: cover;
        background-repeat: no-repeat;
        background-position: 50% 50%;
        position: absolute;
        top: 0;
        bottom: 0;
        left: 0;
        right: 0;
        /* Layer the images so that the visible one is below all the others,
         * but the previously active one fades out to reveal the visible one
         * below */
        transition: opacity 1s;
        opacity: 0;
        &.visible {
            transition: none;
            opacity: 1;
            z-index: -1;
        }
    }
}

And then the HTML:

<div class="jumbotron jumbotron-slideshow">
    <div ng-bg-slideshow="[ 'images/bg1.jpg', 'images/bg2.jpg', ... ]" interval="5000"></div>

    ... content that you want ...
</div>

Create the Angular template to generate the image divs:

<div ng-repeat="img in images"
        class="slideshow" ng-class="{ visible: active_image == $index }"
        ng-style="{ 'background-image': 'url(' + img + ')' }"></div>

And finally the Angular component:

app.directive("ngBgSlideshow", function($interval) {
    return {
        restrict: 'A',
        scope: {
            ngBgSlideshow: '&',
            interval: '=',
        },
        templateUrl: 'views/components/slideshow.html',
        link: function( scope, elem, attrs ) {
            scope.$watch( 'ngBgSlideshow', function(val) {
                scope.images = val();
                scope.active_image = 0;
            });

            // Advance to the next image, wrapping back to the first
            var change = $interval(function() {
                scope.active_image++;
                if( scope.active_image >= scope.images.length )
                    scope.active_image = 0;
            }, scope.interval || 1000 );

            // Clean up the timer when the scope goes away
            scope.$on('$destroy', function() {
                $interval.cancel( change );
            });
        }
    };
});
Note: If you want to be able to programmatically change the interval, you'll need to add a watch that recreates the interval when the interval attribute changes.

Multi-line commands with comments in bash

As part of the last post I initially used a bash script to generate the commands to output the individual videos. As usual, when I finally got fed up with the limitations and syntax issues of bash, I switched to a proper programming language, perl. However, this time I learnt a neat trick for doing multi-line commands in bash with comments embedded, using bash's array feature. A multi-line command typically looks like:

        melt \
            color:black \
                out=$audiolen \

However what if you want to add comments into the command? You can’t.

To solve this, create an array:

        cmd=(
            melt
            # Take black background track for same number of seconds as the MP3, then add 10 seconds of another image
            color:black
                out=$audiolen
            t.jpg
                out=250
        )

and then use the following magic to execute it:

        "${cmd[@]}"

Using this you can also conditionally add in extra statements if you’re using a pipeline-type program such as imagemagick (convert) or melt:

        cmd+=(
            # Output to the file
            -consumer avformat
                f=mpeg acodec=mp2 ab=96k vcodec=mpeg2video vb=1000k
        )
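Put together, the pattern looks like this as a self-contained sketch, with printf standing in for melt so it can be run anywhere:

```shell
# Build the command up in an array; bash ignores comments between the elements
cmd=(
    printf
    # one argument per line
    '%s\n'
    first
    second
)

# Conditionally append further arguments later
cmd+=(
    # a third line
    third
)

# Execute: quoting the expansion preserves each element as a separate argument
"${cmd[@]}"   # prints first, second, third on separate lines
```

The quoting in `"${cmd[@]}"` matters: without it, elements containing spaces would be word-split again.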

Automatically creating videos from pictures, music and subtitles

So for one of my projects we have a number of albums and individual songs which we want to upload to YouTube, as many people use it to listen to music these days. We also want to create a separate collection of videos that have the song words (think hard-burning subtitles into a video). Obviously you can do this in video editing software, but it would be nice to be able to tweak all the videos afterwards without having to do much work.

Initially I tried using avconv/mencoder to generate videos from the pictures using the following approach – generate the picture/music as a video, apply the subtitles, and then finally apply the audio again without re-encoding it.

    avconv -loop 1 -y \
            -i bgimg.jpg \
            -i "$mp3" \
            -shortest \
            -c:v libx264 -tune stillimage -pix_fmt yuv420p \
            -c:a mp3 \
            "$t"

    # Apply subtitles
    mencoder -utf8 -ovc lavc -oac copy -o "$out" "$t" -sub "$sub"

    # Add in end track and overlay with mp3
    mencoder -audiofile "$mp3" -idx -ovc lavc -oac copy -o "final.avi" "$out" "$append"

Whilst this kind of works, it has a number of downsides, the big ones being 1) it isn't flexible enough to e.g. add another picture/slide at the end, and 2) it re-encodes the video/audio a number of times.

Then I remembered that the great kdenlive video editing software is actually just a frontend to the brilliant mlt framework. This is basically a library plus commandline programs to do all sorts of video mixing with live or rendered output.

Using the melt commandline program you can test and generate tracks without having to worry about the XML format that it typically uses for the more advanced options. The final commands:

melt color:black out=5614 \
  t.jpg out=250 \
  -track \
    cdimage.jpg out=5614 \
  -transition composite geometry=0,0:100%x70% halign=1 \
  -consumer xml:basic.mlt

melt basic.mlt \
  -filter watermark:subtitles.mpl \
    composite.valign=b composite.halign=c producer.align=centre \
  -audio-track audio.mp3

If you want to do the video output you can add the following onto the last command:

-consumer avformat \
  target=out.mpg \
  mlt_profile=hdv_720_25p f=mpeg acodec=mp2 ab=96k vcodec=mpeg2video vb=1000k

Let's go through this a line at a time:

melt color:black out=5614

Generate black background for 5614 frames
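Note that out= counts frames rather than seconds, so with the 25fps profile used here the value comes from a calculation along these lines (the variable names and audio length are illustrative, not from the original script):

```shell
fps=25
audio_seconds=225                    # length of the MP3 in seconds (example value)
audiolen=$(( audio_seconds * fps ))  # out= wants frames, not seconds
echo "$audiolen"                     # prints 5625
```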

  t.jpg out=250

Followed by t.jpg for 250 frames (10 seconds at the 25fps profile used here)

    cdimage.jpg out=5614

Generate a new track which is the cd image for the same length as the black track

  -transition composite geometry=0,0:100%x70% halign=1

Mix the two tracks so that the second one (i.e. the cd image) takes up 70% of the screen height and is horizontally centred at the top.

  -consumer xml:basic.mlt

Output to an xml file (in order to apply subtitles to the whole thing we need this intermediate stage)

melt basic.mlt

Start with the mixed video sequence defined in the xml file (which is just instructions, not a staged render)

  -filter watermark:subtitles.mpl
    composite.valign=b composite.halign=c producer.align=centre

Apply the watermark filter with a subtitle mpl file, aligned to the bottom centre (it will auto-scale extra-wide lines to the width of the video). A MPL file looks like this:

    25=First line of text~Second line of text
    150=

Where the first bit is the frame and the second bit is any text to be displayed, with new lines demarcated by a tilde (~) character. Here is a simple perl script to convert a srt format subtitle file into this mpl format:

use strict;
use warnings;
use Path::Tiny 'path';

my ($fps, $in) = @ARGV or die;
$in = (path $in)->slurp;
$in =~ s/\r//g;
my @parts = split /\n\n/, $in;
for my $part (@parts) {
    #print "$part\n\n";
    $part =~ s/^ \D* \d+ \n
        ([\d:,]+) \s --> \s ([\d:,]+) \n//x or next;
    my ($start, $end) = ($1, $2);
    for( $start, $end ) {
        my ($h,$m,$s,$part_s) = split /[:.,]/;
        $_ = int( ( ( $h * 60 + $m ) * 60 + $s + $part_s / 1000 ) * $fps );
    }
    $part =~ s/\n/~/g;
    print "$start=$part\n", "$end=\n";
}
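The timestamp-to-frame arithmetic at the heart of that script can also be sketched in plain shell with awk (srt_to_frame is an illustrative name, not part of melt):

```shell
# Convert an SRT timestamp (HH:MM:SS,mmm) to a frame number at a given fps,
# using the same formula as the perl script: ((h*60 + m)*60 + s + ms/1000) * fps
srt_to_frame() {
    echo "$1" | awk -F'[:,]' -v fps="$2" \
        '{ printf "%d\n", (($1 * 60 + $2) * 60 + $3 + $4 / 1000) * fps }'
}

srt_to_frame "00:01:30,500" 25   # prints 2262
```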


Back to the melt commandline:

  -audio-track audio.mp3

Overlay the audio track

For the non-test output commandline parts:

-consumer avformat target=out.mpg

Output using libav

  mlt_profile=hdv_720_25p f=mpeg acodec=mp2 ab=96k vcodec=mpeg2video vb=1000k

Set the profile to be 25fps 720p HD video, with an mpeg container, mp2 audio at 96kbps and mpeg2 video at 1000kbps