Friday, August 29, 2008

Testing code or testing requirements?

I have been in several projects where multiple attempts were made to classify tests into different categories: Unit-tests, Web-tests, Behaviour-tests, FitNesse-tests, Requirement-tests, Integration-tests, all of them written in Java in one way or another. As you will see, the typical categories refer to different properties of a test, in some cases related to which layers of the code it touches. A typical unit test verifies the contract of a given class, but will often, intentionally or unintentionally, also exercise adjacent classes unless that rule is strictly enforced. A typical Web test could be a Selenium test that verifies the various steps of a wizard, with or without the back-end systems or databases in place to drive the data.
Some of these definitions refer to the layer of code a test addresses and to how many interconnected classes it may include, such as a unit test or an integration test. Others are more technology specific, like FitNesse tests or Selenium tests, where the name refers to the technology used to implement the test.

I believe that the core property of any test should be to capture and verify the expectation you have of how the class, module, layer or application should work. That is, in most cases, why you actually write a test to start with. This is the very code-centric end of the spectrum that continuously transforms user expectations and needs into a running system that provides value. Actually verifying the user requirements and expectations is at the other end of this spectrum. There are other (possibly subordinate) properties of the process of writing tests as well. If you use the same language for both tests and running code or supportive libraries, you will also verify whether a class, interface or module is easy to use, and as part of the process you may actually improve it while writing the test. That turns the process of writing the test itself into something that validates more subtle and less quantifiable properties of the code. If you are doing TDD, writing a test is the starting point for writing code.

There are some bits and pieces missing in this picture. Very often I read or hear about developers doing various kinds of testing in their projects while applying some kind of label to what they are doing. The label may vary, but terms like requirement-driven tests are becoming more and more widespread. I believe this is good, because verification of requirements is probably the most difficult and important thing we deal with as system developers. But I have also seen this urge to classify everything as a requirement test lead us in the wrong direction. Requirements mean different things depending on who you are talking to.

I will try to address that using a brief description taken from a User Story document in one of my previous engagements:

"It should be possible to search for customers by first-name, last-name and birth-date. If multiple customers are returned a list of customers with the current address of residence should be presented along with the result"

We can discuss whether this is a good definition of a requirement or not; that is not the point here. The point is that it captures a fragment of a user expectation, a requirement. We may think of multiple tests in many of the categories above. But when does a written test really capture a requirement? I would say that depends, because requirement means different things depending on who you talk to. For me, as a coder currently developing a JSP-based solution, one requirement applicable to the actual code I am writing may be that the web controller returns the customers from a search as an array type rather than a Collection-based type (even though we have generics), simply because there is no way to refer to such a type using the useBean directive in JSP 2.0. This is therefore a non-functional requirement in my code. It does not mean that I will be writing a test for that requirement in particular, but the unit test of my controller will most definitely be aware of this property.
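As a minimal sketch of what I mean, with hypothetical class and method names rather than code from the actual project, a controller unit test might look like this. The test does not assert "returns an array" as a requirement of its own, but it is written against the array type and would break if someone switched to a Collection:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class CustomerSearchControllerTest {

    // Hypothetical domain class, reduced to what the sketch needs.
    static class Customer {
        final String firstName, lastName;
        Customer(String firstName, String lastName) {
            this.firstName = firstName;
            this.lastName = lastName;
        }
    }

    // Hypothetical controller backing the JSP search page. It returns a
    // Customer[] instead of a List<Customer> because the useBean directive
    // in JSP 2.0 cannot name a parameterized Collection type.
    static class CustomerSearchController {
        Customer[] search(String firstName, String lastName, String birthDate) {
            // The real controller would delegate to the customer service
            // and copy the result into an array.
            return new Customer[] { new Customer(firstName, lastName) };
        }
    }

    @Test
    public void searchReturnsAnArrayForTheJspLayer() {
        Customer[] result =
                new CustomerSearchController().search("Kari", "Nordmann", "1975-01-01");
        // The test is "aware" of the non-functional requirement: it only
        // compiles because the result is an array, not a Collection.
        assertEquals(1, result.length);
    }
}
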
The part of the code I am writing may also need a utility that merges two arrays into one. The SDK does not provide a utility that does that for us, but if I create one myself, I will most certainly update ArrayUtilsTest to provide a test for the new merge method in ArrayUtils. Is this a test of a requirement? Most definitely; it verifies my expectation. Is it a requirement of the application itself that the system provides such a function? Not necessarily; the primary property of the system is to provide customer search by whatever means the lead developers and programmers on the project deem necessary.
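
For illustration, such a test might look like the sketch below. ArrayUtils and its merge method are the hypothetical utility described above, not a reference to any existing library:

import static org.junit.Assert.assertArrayEquals;

import java.util.Arrays;

import org.junit.Test;

public class ArrayUtilsTest {

    // Hypothetical utility class; merge simply concatenates two arrays.
    static class ArrayUtils {
        static String[] merge(String[] first, String[] second) {
            String[] merged = Arrays.copyOf(first, first.length + second.length);
            System.arraycopy(second, 0, merged, first.length, second.length);
            return merged;
        }
    }

    @Test
    public void mergeConcatenatesBothArraysInOrder() {
        String[] merged = ArrayUtils.merge(new String[] {"a", "b"}, new String[] {"c"});
        // The test captures my expectation of merge, even though no customer
        // ever asked for an array-merging utility.
        assertArrayEquals(new String[] {"a", "b", "c"}, merged);
    }
}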

The primary conclusion put forward is that tests address the expectations of a whole spectrum of what we refer to as requirements. Quite a few of these requirements, manifested as code in tests, may not be easily translated into functional requirements owned and understood by the customers. Trying to be dogmatic about what kinds of tests we write, and saying that every line of code should be directly traceable to a customer requirement, may turn out to be a side-track. Very often when writing tests, the limitations of the test framework and the intrinsic properties of the interfaces or libraries you are using will force you to simplify and limit the number of assertions you apply in your test harness. It is important to understand that the requirement coverage provided by such tools will never be perfect. Strict tests will almost always end up being too brittle.

Getting your automated builds to run the right set of tests is difficult. In my experience it ends up with a mix of unit tests, integration tests, web tests and some flavour of requirement tests such as FitNesse. These tests will verify customer requirements along with quite a few other properties. They tend to be owned and maintained by developers more than by the domain experts and customers, the people who actually understand the problem domain.

What are your experiences?

1 comment:

Johannes Brodwall said...

Good to read your thoughts again, Bjørn.

I think your observation about all tests being in some way related to a requirement is extremely important. When people start discussing the subtle divisions between unit tests and integration tests (for example), this more important discussion is often forgotten. Besides: I write my tests so that they can be run both as unit and integration tests without code change!

My experience is that even with FitNesse tests that are owned by the domain experts in place, a lot of requirements are best expressed in Java-based tests. There are many details to the requirements that I don't think the domain experts want to be bothered about, and that can be more succinctly expressed in the same language as the system.

So I don't feel bad about the test of many requirement details being owned by the developers.

Second, testing technical details like persistence might be far removed from the external requirements. I only add these tests if they make me more quickly able to diagnose problems.

But even these detailed tests are expressed in terms of "what should the system do."
