Friday, June 29, 2007 - 09:12

Practical Test-Driven Development

We never learned much about testing our code when I was at university. The tutors told us about trying out boundary cases and that sort of thing, and for some tutorials we had to draw up a simple test plan or describe how we tested the program; the testing phase was mentioned in the Software Lifecycle module, but that was about it. No mention of automated testing, unit testing, etc. Partly that was just the way things were done - this was just before Java started to be widely adopted, and we were coding in C/C++.

So I always used to test my code manually, trying to run through everything I could think of. Then, a couple of months into my second job, we decided to start doing unit testing with NUnit, and I hated it. Partly because I'd already written a bunch of code and now had to go back and write unit tests for it, which is always boring. Partly because a lot of what I was doing wasn't well suited to automated unit testing, being GUI code and code that depended on the initial settings of a piece of hardware I had no control over. And partly because I hadn't designed my code in a way that allowed for easy unit testing. Now, though, I use unit testing as an integral part of my dev process. It's still not quite test-driven, since the idea there is that you design and write your tests, with expected inputs and outputs, before writing your code; I find that difficult to do, because I think best about what I want the code to do while I'm writing it, not while sitting with a pencil and a piece of paper.

But what I do is write a chunk of code, then go and write unit tests. I find that that's when I start thinking about things like: what if this is null? Or, what should actually happen in this case? I use the unit tests not just for automated regression testing, but also as an initial way to exercise my code - it's actually easier than creating a temporary test GUI to run each of your methods, which is what I mostly used to do. And then you have the advantage of being able to easily rerun your tests to make sure new code or changes haven't broken anything.
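
By way of a sketch - Slugger and Slugify are made-up names, not from any real project - this is the kind of NUnit test I mean, where the null case only occurred to me once I started writing the tests:

    using NUnit.Framework;

    // A hypothetical string helper, invented here for illustration.
    public static class Slugger
    {
        public static string Slugify(string title)
        {
            if (title == null)
                return "";
            return title.Trim().ToLower().Replace(' ', '-');
        }
    }

    [TestFixture]
    public class SluggerTests
    {
        [Test]
        public void Slugify_LowercasesAndReplacesSpaces()
        {
            Assert.AreEqual("hello-world", Slugger.Slugify("  Hello World "));
        }

        [Test]
        public void Slugify_NullInput_ReturnsEmptyString()
        {
            // The "what if this is null?" question that surfaced
            // while writing the tests, not while writing the code.
            Assert.AreEqual("", Slugger.Slugify(null));
        }
    }

Tests like these also end up reading as usage examples, which leads nicely into the next point.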

Another benefit is that your unit tests can serve as an example of what the code is supposed to do, what kind of input it's intended to take, and the kind of values it returns. Sure, this should be described in your XML method comments, but sometimes an example just explains it so much more clearly. And if someone needs to know what your code will do in some odd situation, they don't have to wade through the source or hope that your documentation covers that case - they can just look at the unit tests (and hope that they cover that case).

In theory, unit testing is white-box testing - you should exercise every line, every branch of your code. In practice, I try to do as much as I can but don't worry too much about covering it all, since that can be really tricky, and can take more time than it's really worth.

I often don't write any asserts until after I've run the tests, which isn't quite the way it should work, I know. But it's easier to run the test, see what output you get, figure out whether it's what you want (and if not, what you do want), and then write the asserts to match. Not exactly the test-driven philosophy, but I find it works well for me.

One of my initial arguments against unit testing was that you shouldn't write unit tests for your own code - if you think of writing a test for something, you've most likely thought of that issue while coding and have dealt with it. It's the things that you haven't thought of that you need to test, and someone else is more likely to see those things than you are. I still think this is true to an extent, but I tend to go into 'test writing mode', where I do think about things that wouldn't have occurred to me while coding - somehow I seem to switch mindsets.

I mentioned above that originally my code wasn't well designed for unit testing, and this is a point that I often find difficult. I don't believe that you should design and write your code with the ease of unit testing in mind; but generally, following good design principles does lead to easily tested code (short, simple methods; loosely coupled classes, etc.). The biggest issue, for me, is private methods. I refuse to make private methods public just so that I can unit test them; and unfortunately C# has no equivalent of C++'s friend. In a way it doesn't matter, since every private method must be used by some public method at some point - otherwise it needn't exist - so by testing all your public methods you will test your private methods as well. But it makes your unit bigger, and makes it more difficult to test every branch, specifically where you want to vary the inputs to the enclosed private method in ways that wouldn't come up in normal execution. And while you can use reflection to call your private methods, in practice it's just too clumsy and annoying to set up for every single method. I don't have a solution for this one yet.

Update: There's a good discussion about testing private methods here, although there is a shorter way to use reflection to invoke private methods, described here - basically, get the type of the class you want to test, call GetMethod() on the type to get the relevant MethodInfo, then call Invoke on the MethodInfo. Simple.
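
A minimal sketch of that reflection approach (Widget and its private Normalize method are hypothetical, made up for the example):

    using System.Reflection;
    using NUnit.Framework;

    public class Widget
    {
        // A hypothetical private method we'd like to test directly.
        private string Normalize(string input)
        {
            return input.Trim().ToLower();
        }
    }

    [TestFixture]
    public class WidgetPrivateMethodTests
    {
        [Test]
        public void Normalize_TrimsAndLowercases()
        {
            Widget widget = new Widget();

            // Get the private instance method via reflection.
            MethodInfo normalize = typeof(Widget).GetMethod(
                "Normalize", BindingFlags.NonPublic | BindingFlags.Instance);

            // Invoke it on the instance, passing the arguments as an object array.
            object result = normalize.Invoke(widget, new object[] { "  Hello  " });

            Assert.AreEqual("hello", result);
        }
    }

It works, but as I said, setting this up for every private method gets tedious fast.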

Update (29/06/2007): I came across a very good discussion on 'designing for tests' vs. 'testing what's designed', and whether you should make methods public just so that they can be tested.

I've also started working with RhinoMocks, a mocking framework. This helps to keep your tests modular, and helps with some of the issues I've mentioned above. If you're testing methods in Class A, and they call something in Class B, you just mock out Class B, telling it what to return when a specific value is passed in. Sometimes this feels like you end up duplicating tests: you could just write a test that calls Class A's method, which calls Class B, and check that you get the right value, avoiding writing tests for both Class A and Class B. But that's merging units, and you should test the two issues separately - testing Class A's reaction to what Class B returns, and testing what Class B generates based on the input from Class A, are two different things.
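
Here's a rough sketch of what that looks like in RhinoMocks' record/replay style (IClassB, ClassA and their methods are invented for the example):

    using NUnit.Framework;
    using Rhino.Mocks;

    // A hypothetical dependency interface for Class B.
    public interface IClassB
    {
        string Lookup(int key);
    }

    // A hypothetical Class A that depends on Class B.
    public class ClassA
    {
        private readonly IClassB b;
        public ClassA(IClassB b) { this.b = b; }

        public string Describe(int key)
        {
            return "value: " + b.Lookup(key);
        }
    }

    [TestFixture]
    public class ClassATests
    {
        [Test]
        public void Describe_UsesValueFromClassB()
        {
            MockRepository mocks = new MockRepository();
            IClassB classB = mocks.CreateMock<IClassB>();

            // Record phase: when asked for 42, the mock returns "forty-two".
            Expect.Call(classB.Lookup(42)).Return("forty-two");
            mocks.ReplayAll();

            ClassA classA = new ClassA(classB);
            Assert.AreEqual("value: forty-two", classA.Describe(42));

            // Fails the test if the recorded expectation wasn't met.
            mocks.VerifyAll();
        }
    }

Note that this only tests Class A's reaction to what the mock returns; Class B gets its own tests.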

Of course, you're actually testing Class A's reaction to what you think Class B will return - if Class B was written by someone else, it may not return values you would expect (you may think it returns an empty string rather than null, for example). And that's where using Class B instead of a mock of Class B would reveal issues. On the other hand, is that what a unit test should be testing? Or does it rather belong in a higher level integration test?

RhinoMocks also lets you perform interaction-based testing, as opposed to just state-based testing. In other words, instead of checking what's returned, you can also check that certain methods, in various mocked classes, were or weren't called during the test. This is obviously pretty powerful, but can also get really complex, especially when you're coming from a state-based testing mindset.
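
A quick sketch of the interaction-based style, again with invented names (IAuditLog, TransferService) - here there's no return value to assert on at all; the test passes or fails on whether the expected call happened:

    using NUnit.Framework;
    using Rhino.Mocks;

    // A hypothetical logging dependency.
    public interface IAuditLog
    {
        void Record(string message);
    }

    // A hypothetical class whose side effect we want to verify.
    public class TransferService
    {
        private readonly IAuditLog log;
        public TransferService(IAuditLog log) { this.log = log; }

        public void Transfer(int amount)
        {
            // ... do the actual transfer ...
            log.Record("transferred " + amount);
        }
    }

    [TestFixture]
    public class TransferServiceTests
    {
        [Test]
        public void Transfer_WritesAuditEntry()
        {
            MockRepository mocks = new MockRepository();
            IAuditLog log = mocks.CreateMock<IAuditLog>();

            // Record phase: declare the call we expect to see.
            log.Record("transferred 100");
            mocks.ReplayAll();

            new TransferService(log).Transfer(100);

            // VerifyAll fails the test if Record was never called.
            mocks.VerifyAll();
        }
    }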

I haven't quite got the hang of mocking frameworks yet - they seem like a really cool idea, but in practice, for non-trivial examples, I find they're often quite tricky. Sometimes it seems easier to just set up a higher-level system test using something like FitNesse. And sometimes that's okay, but sometimes you really need the unit-level tests as well.
