December 6, 2013

Aged to Perfection

Software development is a constant battle between short-term and long-term goals. Short term, we want to write new software quickly: to solve technical problems, show results to customers, and validate ideas. Many of these prototypes are throwaways, code meant to be discarded. Long term, we want high-quality, robust software: fully tested, well documented, and so on. It has to keep working with low maintenance costs for a long time in order to recoup the initial investment.

This fight is especially relevant to software with long build times and difficult manual testing procedures. If each commit has to conform to high standards (the long-term goal), just to be thrown away after hours or days, what does that do to the developer's morale? At the same time, if we allow low-quality code to be committed, we are sure to degrade the system overall.

I like measuring code complexity and other static source properties before each commit. If the limit is exceeded, the commit is rejected.

I do not know what to set the limit to.

I would like to set it as high as possible for long term gains. I also want it low enough to allow committing code during explorations, prototyping, etc.

Right now I set the limit high, but use two strategies to get around it:

  • disable the limit using a command-line switch to let in code I plan to refactor later.
    • downside: I forget to refactor, or end up disabling the checks for most commits.
    • downside: the command-line switch is blunt. It disables ALL checks, even when the majority are passing and useful.
  • work on separate branches without limits, or with much looser ones.
    • downside: maintaining separate limits introduces a merging or profile-management nightmare.

Short-term and long-term are measures of time. After noticing this, I came up with a solution:

use lower limits for code changed recently, and much higher limits for stable code.

Practice

I use the Grunt (gruntjs) build tool for most of my projects; it is flexible and has a huge number of third-party plugins. Most of the plugins operate on files, and the Grunt task API allows flexible file filtering.

I created a project called aged with a single exported function. It can be plugged directly into any Grunt plugin to limit files by last-modified date. For example, the following script runs the complexity plugin with two sets of settings:

  • fresh options will be applied to ALL files.
  • aged options will be applied only to stable files that were last modified more than 3 days ago.

Gruntfile.js

var aged = require('aged');
var files = ['Gruntfile.js', 'src/index.js'];

module.exports = function (grunt) {
    grunt.initConfig({
        complexity: {
            // fresh limits apply to ALL files
            fresh: {
                src: files,
                options: {
                    cyclomatic: 5,
                    halstead: 10,
                    maintainability: 100
                }
            },
            // aged limits apply only to files last modified
            // more than 3 days ago
            aged: {
                src: files,
                filter: aged(3, 'days'),
                options: {
                    errorsOnly: false,
                    cyclomatic: 2,
                    halstead: 10,
                    maintainability: 100
                }
            }
        }
    });
    grunt.loadNpmTasks('grunt-complexity');
};

sample output

You can configure multiple cutoffs with different options. You can even run tools only on files younger than a certain age by defining your own filter, like this:

// accepts only files modified within the last n units
function fresher(n, units) {
    var isAged = aged(n, units);
    return function (filename) {
        return !isAged(filename);
    };
}
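Such a filter plugs into the config the same way as aged. For instance (a hypothetical target name, using the same complexity options as above), lenient limits could be applied only to recently touched files:

```javascript
// Hypothetical target: lenient limits for files modified
// within the last 3 days, using the custom fresher filter.
freshOnly: {
    src: files,
    filter: fresher(3, 'days'),
    options: {
        cyclomatic: 5,
        halstead: 10,
        maintainability: 100
    }
}
```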

I found that two profiles controlled by a 3-day limit are enough. Three days is long enough to implement a feature, test it, and refactor it before the strict limits kick in. A longer inactivity period would probably make it harder to get back into the feature - the memory is no longer fresh.