Test-driven development is nothing new, but recently I wanted to try a stricter version of it in which the granularity is much finer: the unit tests for a file run every time I hit <save>.

My current dev environment is vim + bash + standard perl tests. For each area I’m working on, I define a brief script to declare the files of interest. In the example below I’m working on the Authen.pm plugin module and want the tests in authen_unit.t to run each time the source is saved.

src/test/bin/run_authen:

#!/bin/bash
declare -x   TESTS="src/test/Cari/Fn/Authen/authen_unit.t"
declare -x   SOURCES="lib/Cari/Fn/Authen.pm"
${0/\/run*/\/run_tests.sh}
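The last line of run_authen uses bash parameter substitution to derive the path of run_tests.sh from the script’s own path in $0, so any number of run_* wrappers can share the one loop. A minimal sketch of the expansion, with a hard-coded path standing in for $0:

```shell
#!/bin/bash
# ${var/pattern/replacement} replaces the first (longest) match of the
# pattern. Here the pattern /run* swallows the trailing /run_authen and
# substitutes /run_tests.sh. The path variable stands in for $0.
path="src/test/bin/run_authen"
echo "${path/\/run*/\/run_tests.sh}"    # -> src/test/bin/run_tests.sh
```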

src/test/bin/run_tests.sh:

#!/bin/bash
declare -r   PROGNAME=${0##*/}
declare -x   PATH=/bin:/usr/bin
declare -i   DEBUG=0
declare      TESTS=${TESTS:?}
declare      SOURCES=${SOURCES:?}
declare      PERL="perl -Imod -Ilib -Isrc/test"

shopt -s extglob    # for the +([0-9]) pattern below (older bash needs this)
cache=$(mktemp -t "${PROGNAME}.XXXXXX")
function clean_up { rm -f "${cache:-/fictitious_path}"; }
trap clean_up SIGHUP SIGINT SIGTERM

declare -i first_time=1
while [ $first_time -gt 0 ] \
        || inotifywait -qqe MOVE_SELF $TESTS $SOURCES; do
    echo -e "\n\n\n"$(date '+%F %R') $TESTS
    for t in $TESTS; do     # unquoted: a space-separated list splits into words
        $PERL -T $t 2>&1 \
            | grep -vF ' skip ' \
            >$cache
        result=$(tail -n 1 $cache)
        if [[ $result == 1..+([0-9]) ]]; then
            echo $result
        else
            cat $cache
        fi
    done

    echo -ne '\033[0;43m'
    $PERL -wc $SOURCES 2>&1 \
        | grep -vF ' syntax OK'
    echo -ne '\033[0m'
    first_time=0
done

At the start, the script checks that I’ve remembered to declare TESTS and SOURCES (${VAR:?} aborts if the variable is unset or empty). Then it sets up a temporary file for caching test output. Once running, the only way to stop the script is to interrupt it, so clean_up is trapped on the interrupt signals.
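Both safety measures can be sketched in isolation (the file and template names here are placeholders):

```shell
#!/bin/bash
# ${VAR:?message} aborts the script if VAR is unset or empty, so a
# forgotten declaration fails fast instead of silently running no tests.
TESTS="t/example.t"                      # comment this out to see the abort
: "${TESTS:?TESTS must be declared}"

# mktemp + trap guarantees the temporary cache disappears on Ctrl-C.
cache=$(mktemp -t demo.XXXXXX)
trap 'rm -f "$cache"' HUP INT TERM
echo "caching to $cache"
rm -f "$cache"                           # normal-path cleanup
```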

Since we’re monitoring vim saves, the loop guard has inotifywait watch for the MOVE_SELF event. Each iteration starts by spitting out a timestamp and the tests being invoked (just to give me confidence the monitoring is active). It then runs each test, putting its output into the cache, checking the result (the last line), and spitting out the full output only if there was a problem.
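The “checking the result” step relies on a passing perl test ending with its TAP plan line, e.g. 1..12. A small sketch of that check on its own; +([0-9]) is an extglob pattern (one or more digits), so the shopt line keeps older bash happy:

```shell
#!/bin/bash
shopt -s extglob
# A clean run ends with the plan line "1..N"; anything else means the
# full output is worth reading. Sample strings are made up for illustration.
for result in "1..12" "not ok 3 - auth fails"; do
    if [[ $result == 1..+([0-9]) ]]; then
        echo "PASS: $result"
    else
        echo "FAIL seen: $result"
    fi
done
```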

The next part just checks the syntax of the source files. If a problem is detected it spits out the errors and warnings with a yucky muddy-yellow background. [See here for other colours.]
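The colour change is done with ANSI SGR escape sequences: \033[0;43m selects the yellow background and \033[0m resets. A quick way to preview a few standard background codes (41 red, 42 green, 43 yellow, 44 blue — standard ANSI, not specific to this script):

```shell
#!/bin/bash
# Print a sample line for each of four standard ANSI background colours.
for code in 41 42 43 44; do
    echo -e "\033[0;${code}m background ${code} \033[0m"
done
```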

The end result is I can do quite major refactoring and rewriting at a quicker pace, knowing that any problems I create will be spat onto the neighbouring screen.


There are many situations in which you would like something to run each time you save a file (or a process modifies one). Since Linux kernel 2.6 this has been possible via the inotify mechanism. All you need is a client package for whichever language (bash, perl, ruby, python, …) you want to use.
For perl there is liblinux-inotify2-perl, and for bash there is inotify-tools, which I’ll illustrate here.

In this example, each time one of the specified files is saved (in vim), the script of unit tests will be invoked. The loop guard below uses the fact that vim implements a ‘save’ by way of ‘move, move, delete’, so we detect the first ‘move’…

tests=test/cari_plugin_defaults.pl
sources="lib/Cari/Plugin/Defaults.pm lib/Cari/Plugin/Stash.pm"
while inotifywait -qqe MOVE_SELF $tests $sources; do ./$tests; done

This is useful in lots of examples where you want to quickly see the results of editing source file(s), for example editing LaTeX files.
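As a sketch, a hypothetical LaTeX watch loop might look like this (it assumes inotify-tools and pdflatex are installed, and thesis.tex is a placeholder name; watching both MODIFY and MOVE_SELF covers editors that write in place as well as those that save via rename):

```shell
#!/bin/bash
# Rebuild the PDF every time the editor writes the source file.
# The loop exits as soon as inotifywait fails (e.g. file missing).
while inotifywait -qq -e MODIFY -e MOVE_SELF thesis.tex; do
    pdflatex -interaction=nonstopmode thesis.tex
done
```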

(Update: My latest vim issues ‘MODIFY’ instead of ‘MOVE_SELF’, so you’ll want to test yourself using inotifywait filename.)

Often tsv (tab-separated values) is the safest format for manipulating data, at least when convenience is a priority. Bash can work with these files with ease, simply by modifying the IFS variable.
To process a two-column tsv:

IFS_OLD=$IFS
IFS='{tab}'
while read -r first second; do
  echo first=$first, second=$second
done <two_col.tsv
IFS=$IFS_OLD

where {tab} is entered by <ctrl>+<v>,<tab>
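The literal tab can be avoided with bash’s $'\t' quoting, and scoping IFS to the read command alone makes the IFS_OLD save/restore unnecessary. A sketch with made-up sample data:

```shell
#!/bin/bash
# Create a two-column sample file (hypothetical data for illustration).
printf 'a\t1\nb\t2\n' > two_col.tsv

# Setting IFS only for `read` leaves the global IFS untouched;
# -r stops read from mangling backslashes.
while IFS=$'\t' read -r first second; do
    echo "first=$first, second=$second"
done < two_col.tsv

rm -f two_col.tsv
```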