Industrie Toulouse

There's a pretty good article at ONLamp.com about More Test-Driven Development in Python. It's basic unit test work, showcasing writing tests first and code second. I'm never consistent when it comes to unit testing. On some projects, I use tests. On others, I don't. Some projects use code that's been around for years and has no tests. Sometimes, it's just really hard to unit test code meant for Zope 2 - and by that I mean the underlying Python Product-based code, not scripts and forms and other UI-related elements. My Zope 2 projects have less and less code managed in the ZODB and more and more in underlying Python objects.
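
To make the style concrete, here's the sort of minimal test-first sketch such articles build up - the function and test names below are my own toy example, not the article's: the test gets written first, then just enough code to satisfy it.

    import unittest

    # Step one: write a test describing the behavior we want.
    class DiscountTest(unittest.TestCase):
        def test_quarter_off(self):
            self.assertEqual(apply_discount(100.0, 0.25), 75.0)

    # Step two: write just enough code to make the test pass.
    # (Python resolves apply_discount at call time, so defining
    # it after the test class is fine.)
    def apply_discount(price, rate):
        return price * (1.0 - rate)

    if __name__ == '__main__':
        unittest.main()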

Zope 3, on the other hand, advocates testing heavily. In the developer documentation for writing a new content object, writing unit tests is step #4, just after preparation, initial design, and writing interfaces. Zope 3 also uses Python's doctest system for writing tests inside prose: the tests are example code that can be executed in the Python interpreter. Zope 3 covers functional tests as well - tests that exercise a whole application and its UI - whereas unit tests and doctests typically cover small components in isolation from their environment. The Zope X3 3.0.0 release itself contains nearly 4000 tests in unit test or doctest form. It's great to see testing promoted so heavily in its documentation.
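
For anyone who hasn't seen doctest, the tests really are just interpreter transcripts embedded in a docstring, and the doctest module replays them and complains when the output differs. A tiny sketch of my own, not taken from the Zope 3 sources:

    def capitalize_word(word):
        """Capitalize the first letter of a word.

        >>> capitalize_word('zope')
        'Zope'
        >>> capitalize_word('')
        ''
        """
        return word[:1].upper() + word[1:]

    if __name__ == '__main__':
        import doctest
        doctest.testmod()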

On a recent project, I was reminded (again) of how nice it is to have unit tests. I was running behind schedule and had a few simple tests written to cover some early parts of the system, and I was fast approaching the heart of the application - a complex workflow interaction. I knew that testing it by hand would be menial and terrible, because it would require so much repeated typing and so many open web browsers. But since I was behind schedule and had no UI to show progress with, I kept putting off writing tests. The big turnoff was building the test harness - getting a lot of what would have been repeated manual entry into some setup code. I think that, unlike most unit tests, testing a workflow or any other state-sensitive system can be a bit more difficult because so much work has to be done just to get the system into a particular state where you can test a transition or a value. It can be daunting, and daunting tasks are easy to put off until another day.
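
By setup code I mean roughly the following: a fixture that walks the system into a known state, so that each test only has to exercise one transition. The little workflow below is a made-up toy, not the actual project code.

    import unittest

    class Document:
        # Toy stand-in for a real workflow-managed object.
        _transitions = {
            ('draft', 'submit'): 'pending',
            ('pending', 'approve'): 'published',
            ('pending', 'reject'): 'draft',
        }

        def __init__(self):
            self.state = 'draft'

        def fire(self, action):
            self.state = self._transitions[(self.state, action)]

    class PendingDocumentTest(unittest.TestCase):
        def setUp(self):
            # The tedious state-building lives here: every test starts
            # from a document already in the 'pending' state instead of
            # repeating the whole submission dance by hand.
            self.doc = Document()
            self.doc.fire('submit')

        def test_approve(self):
            self.doc.fire('approve')
            self.assertEqual(self.doc.state, 'published')

        def test_reject(self):
            self.doc.fire('reject')
            self.assertEqual(self.doc.state, 'draft')

    if __name__ == '__main__':
        unittest.main()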

But when I got to the point where I really had to start implementing this workflow system, I knew without a doubt that I would be sunk if I didn't have tests to automate the process. So I hunkered down and wrote a Python module in my tests directory called support, and filled it with classes, functions, and dummy objects that could be used in unit tests to build up the system. As I tested each piece of the workflow, I added the successful steps of those tests into the support objects so that other tests wouldn't have to duplicate the work. Getting the test harness and initial tests in place took nearly a full day of development time, but it was worth it. I didn't have anything visual to show anybody yet, but everything became so much more dependable. And I ended up making my basic deadline anyway.
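
In spirit, the support module looked something like this sketch - the names and dummy objects are invented for illustration, since the real module built up actual Zope objects:

    # tests/support.py - shared builders and dummy objects, so that
    # individual tests don't each rebuild the world from scratch.

    class DummyUser:
        def __init__(self, name, roles=()):
            self.name = name
            self.roles = list(roles)

    def make_site():
        """Build the minimal object graph the workflow tests need."""
        return {
            'users': {'editor': DummyUser('editor', ['Editor'])},
            'documents': [],
        }

    def submit_document(site, title):
        """Replay the already-tested submission steps, returning a
        document known to be in the 'pending' state."""
        doc = {'title': title, 'state': 'pending'}
        site['documents'].append(doc)
        return doc

A later test can then start with site = support.make_site() and doc = support.submit_document(site, 'report') and get straight to the transition it actually cares about. Each helper gets added only after some test has proven that those steps work.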

The moral of the story, one I have to be reminded of constantly, is that it's worth spending the extra time to build a supporting test harness, even when it looks like a difficult task. Too often I fall victim to the thought of "I just need to get this done, I don't have time to start writing tests," and too often I later find myself the victim of "man, I wish I had tests for this!" down the road, when it feels like it's too late.