Crazy Chess: acceptance tests

First, a little side note for anyone who's been waiting for updates: I am on sabbatical but taking little side jobs once in a while. This will be going on, off and on, for a while. Sometimes I'll be too tired to blog.

I almost forgot an important part of professional development, probably because it's very often done by a different team: acceptance (or functional) testing. This level of testing actually runs the product and exercises it from the user's perspective to ensure that it meets their needs, which of course includes actually functioning. It is very nice to have it automated so that, as a developer, I can just run the tests before I even propose my changes for review. Any time it's difficult for a developer to run a well-designed, well-specified set of acceptance/regression tests, the turnaround time for any change increases, or the process becomes more uncertain, meaning more mistakes make it into the trunk.

Although it's not really my direct area, I do have some experience helping test teams come up with tests (theoretically it's more the job of the customer and the PM to help testers decide what to test, but theory and practice don't always coincide). I looked into some of the automated acceptance-testing frameworks I've heard of, such as FitNesse, but none seemed to really fit into the TravisCI setup. It seemed easier to just leverage Boost.Test to drive a custom class I made to spin up the command-line program and talk to it.

The first stage of development involves writing a program that just "draws" a chess position on the console. So the first set of acceptance tests just checks that the startup position produces the right board view, that one non-startup position does too, and that an invalid position is handled. That's certainly far from exhaustive, but it seems sufficient to me, especially as the classes that implement this behavior will be unit tested. Besides, we haven't really introduced a lot of complexity yet; at this point it's more important to set off in a good direction.

So, in the test directory, I moved the unit test stuff into a 'unit' subdirectory and created an 'acceptance' subdirectory. Both build Boost tests with slightly different targets. I made a basic `process` class that uses POSIX functionality to fork and exec a command-line program; it then uses Boost's IOStreams library to communicate with that program through stdin and stdout. Normally you'd think about how you might test this class, but:

  1. It would be really tough to test something like this.
  2. It really does very little; after abstracting out a testable interface there would not be much left to test.

So this class isn't going to be tested in an automated way. It was created and exercised by hand a couple of times to make sure it worked; beyond that, not much is needed.
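The post doesn't include the class itself, so here is a minimal sketch of how such a helper might look, assuming POSIX pipes plus Boost.Iostreams' `file_descriptor` devices; the names (`Process`, `in()`, `out()`) are illustrative rather than the actual interface:

```cpp
// Minimal sketch of a fork/exec helper wrapped in Boost.Iostreams streams.
// Assumes a POSIX platform; error handling is kept to a bare minimum.
#include <boost/iostreams/device/file_descriptor.hpp>
#include <boost/iostreams/stream.hpp>
#include <stdexcept>
#include <string>
#include <vector>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

namespace io = boost::iostreams;

class Process {
public:
    Process(const std::string& path, const std::vector<std::string>& args) {
        int to_child[2], from_child[2];
        if (pipe(to_child) != 0 || pipe(from_child) != 0)
            throw std::runtime_error("pipe failed");

        pid_ = fork();
        if (pid_ == 0) {                              // child: wire pipes to stdio, then exec
            dup2(to_child[0], STDIN_FILENO);
            dup2(from_child[1], STDOUT_FILENO);
            close(to_child[1]);
            close(from_child[0]);
            std::vector<char*> argv;
            argv.push_back(const_cast<char*>(path.c_str()));
            for (const auto& a : args)
                argv.push_back(const_cast<char*>(a.c_str()));
            argv.push_back(nullptr);
            execv(path.c_str(), argv.data());
            _exit(127);                               // only reached if exec failed
        }
        close(to_child[0]);                           // parent: close the child's ends
        close(from_child[1]);
        in_.open(io::file_descriptor_sink(to_child[1], io::close_handle));
        out_.open(io::file_descriptor_source(from_child[0], io::close_handle));
    }

    ~Process() {
        in_.close();                                  // let the child see EOF on stdin
        if (pid_ > 0) waitpid(pid_, nullptr, 0);
    }

    std::ostream& in()  { return in_; }               // writes go to the child's stdin
    std::istream& out() { return out_; }              // reads come from the child's stdout

private:
    pid_t pid_ = -1;
    io::stream<io::file_descriptor_sink>   in_;
    io::stream<io::file_descriptor_source> out_;
};
```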

Using this class, the three tests were created. We just send the FEN to the program via `argv[1]` and expect to read back a series of lines that represent the board. There's some funkiness here, but there are a couple of practical reasons to put up with it, the most important being that this whole set of tests will be replaced rather quickly, and the program created here will go away as well; both will be replaced by more complex programs.
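To make the shape of these tests concrete, here's a hedged sketch of what the startup-position check might look like, built on the `Process` sketch above. The `CHESS_PROGRAM` macro is assumed to be a compile-time definition pointing at the built executable, and the board rendering in `expected` is purely illustrative, since the post doesn't show the program's actual output format:

```cpp
// Sketch of one acceptance test: pass a FEN via argv[1], read the board back line by line.
#define BOOST_TEST_MODULE acceptance
#include <boost/test/included/unit_test.hpp>  // header-only variant for a self-contained example
#include <string>
#include <vector>

BOOST_AUTO_TEST_CASE(startup_position_draws_correct_board)
{
    const std::string start_fen =
        "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1";

    Process board(CHESS_PROGRAM, {start_fen});   // FEN goes in as argv[1]

    const std::vector<std::string> expected = {  // illustrative rendering only
        "rnbqkbnr",
        "pppppppp",
        "........",
        "........",
        "........",
        "........",
        "PPPPPPPP",
        "RNBQKBNR",
    };

    for (const auto& want : expected) {
        std::string line;
        const bool got_line = static_cast<bool>(std::getline(board.out(), line));
        BOOST_REQUIRE(got_line);                 // stop if the program produced too few lines
        BOOST_CHECK_EQUAL(line, want);
    }
}
```

The other two tests would follow the same pattern, one with a non-startup FEN and one checking what the program does with an invalid position.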

The test code is here.

The simple, failing program is here.

The important thing to note here is that we have our three tests before the code that will pass them exists. Together they exercise the whole board-drawing feature. Now that this is done, we'll write unit tests to exercise the classes and functions that will build up a program that passes these acceptance tests. This process is known as "Test Driven Development", and the single deciding aspect that distinguishes it is that tests are written before the code that passes them.
