Book Review: Integrated Test Design and Automation


Integrated Test Design and Automation

  • Author: Dennis Janssen
    Hans Buwalda
    Iris Pinkster
  • Publisher:
  • Published: 2002
  • Edition: 1
  • Pages:
  • Target Audience: Beginners
  • Contents:

    Introduction
    Preparation
    Analysis
    Navigation
    Execution
    Test Management
    Appendix

Review Date:  24/01/2002 15:44:17
 
Short Summary:
A high level overview of a methodology to support Action Word testing
 
Rating:
3
 
Short Description:
Slightly Undercooked 
Review:

There is a thought-provoking little article that starts on page 446 of Software Test Automation. In its 18 pages it clearly outlines a method of using spreadsheets, and a little automation magic, to construct high level test scripts. These test scripts can easily be munged into the automated test tool's language, and because they abstract the underlying automation source code from the reader, they support review by people unfamiliar with the test tool's script language.

It is, in effect, the use of a macro language for doing test automation, but it is explained in a way that opens macro test scripting to a wider audience. The reader can then make a stab at automating this process in their favourite spreadsheet's macro language, rather than having to learn Perl or pore through the Dragon book.

Hans Buwalda calls these macros "Action Words", and the methodology that supports them he calls "TestFrame".

The other new item of terminology introduced by this method is the "Test Cluster": a collection of tests with similar scope and detail, documented in a spreadsheet.

There are two obvious difficulties associated with this approach: the identification of Test Clusters and the identification of Action Words. I am assuming here that the munging of the action word spreadsheets into an automated test script is within the capabilities of a competent test automator.

Test Clusters are analogous to test scenarios or test scripts so most testers will have their own strategies for developing and documenting these. The construction of action words is a harder task to undertake or teach. Anyone who has done any programming will find the construction of action words analogous to the construction of program functions or subroutines.
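
To make the analogy concrete, here is a sketch of my own (not an example from the book; the action word, column layout and function name are invented) showing how a single action word line might map onto a function in the automation layer:

    # A hypothetical action word line as it might appear in a test cluster spreadsheet:
    #   | enter client | John | Smith | 1970-05-12 |
    # Column A names the action word; the remaining columns are its parameters.

    def enter_client(first_name, last_name, date_of_birth):
        # In a real setup this would drive the application under test through
        # the test tool's API; here it simply records the call.
        print(f"enter client: {first_name} {last_name}, born {date_of_birth}")

    # The translation engine reads the row and makes the equivalent call:
    enter_client("John", "Smith", "1970-05-12")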

I mention all this because when I picked up Integrated Test Design and Automation, I was wondering how they were going to approach the teaching of action word identification and looking forward to an expansion of the technical solutions that were used to munge the data.

It was this preconception that gave me some pause, because that is not the prime focus of this book. This is a methodology book and as a consequence it examines the project lifecycle and the methodology from cradle to grave. Test Managers, Project Managers, Developers, Test Analysts and Test Automators are all invited to attend.

And so, with an effort of will, I shelved my preconception, dived into the text, and stumbled head over heels over twisted sentences and paragraphs. I don't like to mention the use of English, particularly when the authors' first languages quite obviously aren't English, because it always draws attention to my own misuses of the language, but the authors' previous works have all been well written. I caution you to be prepared to read carefully.

The introduction sets the scene, introduces us to testing in general and then to the TestFrame model in particular. This is a fairly non-prescriptive methodology in that the book rightly tells us to think about what our goals and risks are and to customise the testing to provide a solid structure that best supports those goals. Automation is stressed from the outset, and provided we structure our tests and automation correctly we are assured that we can expect incremental payback very quickly, with minimal impact to our tests between releases.

Obviously, in order to achieve this we have to prepare well. Therefore we conduct a preliminary study where we ask fundamental questions about the system, the organisation, the development methods, and the desired aims of the test process.

A risk analysis follows, using a weighting process (risk factor * risk weight). There is a fairly long example of the risk weighting process, but when you come to tailor this method to your own site you are going to have to create your own risks and your own risk factors; there is little guidance provided on quite how the risk factors are identified.
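
By way of illustration only (the risks, factors and weights below are my own invention, not the book's worked example), the weighting arithmetic boils down to something like this:

    # Hypothetical risk weighting: score = sum of (risk factor * risk weight) per area.
    # The weights say how much the organisation cares about each kind of risk;
    # the factors rate how strongly that risk applies to a given area.
    weights = {"complexity": 3, "change frequency": 2, "business impact": 5}

    areas = {
        "billing":   {"complexity": 4, "change frequency": 5, "business impact": 5},
        "reporting": {"complexity": 2, "change frequency": 1, "business impact": 3},
    }

    for area, factors in areas.items():
        score = sum(factors[risk] * weights[risk] for risk in weights)
        print(f"{area}: {score}")   # billing: 47, reporting: 23 -> concentrate testing on billing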

One of the main techniques in the TestFrame method for preparing a test strategy is the cluster matrix and cluster cards. These are presented as a structured way of presenting the unstructured information gathered during test analysis. Quite how well structured this will be depends entirely upon your analysis skills and not the technique of cluster cards. Presumably these clusters can be documented hierarchically or in a graph representation to aid organisation and are not only presented as a sequential block of clusters. No doubt you will find the best way to do this in your organisation as you tailor the TestFrame method from experience.

The cluster card description does provide a good set of 'things' to think about when targeting areas of the system for test. And I have no doubt that you will be able to supplement this with attributes from other test strategy construction methods with which you are familiar.

Preparation continues to cover the when of testing, the standards to be used, the issue recording process and environmental concerns. The Test Environment is a concern in any test planning process and is particularly important in an automated process. The coverage of environment planning in TestFrame has many pointers to test environment issues that might catch you out if you have not encountered them before.

Having prepared well, we are ready to analyse our source documents to produce tests. The focus of the analysis phase here is on tests which can be re-used. There is very little throwaway with this process; we are building for the future.

This is one of the few books that I've read which tries to define what a "Test Condition" is; sadly it doesn't try quite hard enough, and points the reader to a paper written in STQE by Hans Buwalda and co-author Maartje Kasdorp. Quite why this 5-page paper hasn't been included as an appendix in this book is unclear. I would certainly have preferred to see the relevant information from this paper in the book rather than having to source it on StickyMinds myself, particularly as I could not see that the paper in question actually has an in-depth discussion of the construction of test conditions with "the appropriate rigor".

The analysis section is probably the weakest in the book. Developing good action words and deciding what parameters to give them has to be one of the most important parts of ensuring that this method requires only minimal change between releases. The coverage of action word design is disappointing. You would be well advised to spend some time supplementing your reading of this book with some good programming and design books, possibly even reading some language design or compiler books, as this is essentially what you are doing.

Examined test design techniques are decision tables, entity lifecycles (or more specifically CRUD matrices), equivalence partitioning, syntax testing and joint testware development. This section acts as an introduction to these techniques and you will have to search elsewhere for more in-depth coverage. Minimisation of data dependency between tests is obviously very important when you want to ensure maximum independence between tests, so that one test failure does not bring down your entire test run; there is a small section on this.
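
The idea is easy to illustrate with an invented cluster fragment of my own (the action words and data are hypothetical):

    # Dependent tests: the second line can only pass if the first succeeded.
    dependent_cluster = [
        ["create account", "ACC-1"],
        ["withdraw",       "ACC-1", "50.00"],   # fails spuriously if 'create account' failed
    ]

    # Independent tests: each test sets up the data it needs for itself,
    # so one failure does not cascade through the rest of the run.
    independent_cluster = [
        ["create account", "ACC-1"],
        ["check balance",  "ACC-1", "0.00"],

        ["create account", "ACC-2"],
        ["deposit",        "ACC-2", "100.00"],
        ["withdraw",       "ACC-2", "50.00"],
    ]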

A high level overview of an Action Word translation engine is provided in Chapter 4. This also deals with mechanisms for implementing action words. The advantage of using a spreadsheet is that the action word lines have, in effect, already been pre-tokenised and a complex parser does not need to be written: the action words themselves will always be in column A, and the data in the other columns are the parameters. This makes the spreadsheet macro language an ideal platform for writing your first interpreter.
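
For the curious, the kind of engine being described reduces to a short dispatch loop. What follows is my own minimal sketch in Python rather than a spreadsheet macro language, and the action words, file name and handler functions are all invented for the purpose:

    import csv

    # Hypothetical automation-layer functions that the action words map onto.
    def enter_client(first_name, last_name):
        print(f"enter client: {first_name} {last_name}")

    def check_balance(account, expected):
        print(f"check balance of {account}, expecting {expected}")

    # Column A of each row names the action word; the rest are its parameters.
    ACTION_WORDS = {
        "enter client":  enter_client,
        "check balance": check_balance,
    }

    def run_cluster(path):
        # Read an exported test cluster (CSV) and execute each action word line.
        with open(path, newline="") as f:
            for row in csv.reader(f):
                if not row or not row[0].strip():
                    continue                    # skip blank lines
                action, *params = [cell.strip() for cell in row]
                ACTION_WORDS[action](*params)

    # run_cluster("cluster.csv")   # where cluster.csv contains lines such as:
    #   enter client,John,Smith
    #   check balance,ACC-1,100.00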

Overview coverage of GUI vs. text systems, synchronisation and testing your macros is provided. There is also discussion of variable naming standards and code commenting. The coverage is fairly superficial, and anyone who did not know how to implement a test engine of this sort is unlikely to be able to implement one from this book alone.

Towards the end of the book, the chapters grow shorter and shorter. The execution chapter seems to be aimed at the manager rather than the test automator and the automator would be well advised to read Software Test Automation to discover more pitfalls and strategies relating to automated test execution. The manager will no doubt appreciate the succinctness of this chapter.

The final chapter, on test management, is probably the best chapter; it is certainly the best written, clear and to the point, but it has been thrown in almost as an afterthought to round off the methodological nature of the book. It serves as a useful reminder to the test manager of the issues of resistance, commitment, clarity, conflict, motivation and dependency.

And so we come to the end and I find myself in two minds about the book. It isn't as good as I had hoped it would be, it certainly isn't as good as it could have been.

It could have been far more practical, replete with a CD-ROM and examples of the automation approach in general. It could have been filled with anecdotes on how to design the perfect action word, mistakes to avoid, and howlers that an inexperienced test automator might construct (functions with 30 parameters, for example).

But it isn't. It hints at the processes involved with automating action word translation and leaves it as an exercise to the reader to write their own action word macro processor.

It is an exploration of a particular test methodology but again lacks realistic examples. It might have been useful to have seen the method demonstrated by exploring the testing of an action word engine itself.

The authors obviously know what they are talking about, they simply don't give enough to the reader to allow them to do it themselves. This is not a book that will divulge its secrets quickly or thoroughly. Be prepared to burn the midnight oil, read between the lines, surf the web and learn both programming and design if you want to do Action Words.

More information is available at www.testframe.com