Tuesday, March 27, 2007

Agile Quality: Control vs. Assurance vs. Analysis

Questions about how testing fits into agile development practices are usually answered in two ways, both very unhelpful for QA professionals:

1. Agile development eliminates the need for QA; the developers test it all themselves.
2. QA has to work harder to keep up with development while maintaining their traditional methodologies and test approaches.

There is truth and falsehood in both statements. When I am asked how to "fit QA in," I like to frame my answer by defining three types of testing: Quality Control, Quality Assurance, and Quality Analysis. I then go on to describe how I think each fits into most agile processes.

Quality Control

What a lot of people think of as testing is what I call Quality Control. Think of the guy sitting in the beer plant (or girl, if you are a fan of Laverne and Shirley) watching the bottles go by, making sure there is nothing wrong with them before they get capped: this is quality control. In other words, you are inspecting the final product to ensure it meets the criteria for an acceptable product. Within any software project, unit testing, peer review, and regression testing are all forms of quality control. In an agile project, these tasks need to be performed on a continuous basis. Unit tests need to be automated and made part of a continuous integration strategy. Peer review can be literally continuous, in the case of pair programming, or mandated on a regular basis in the form of diff reviews before check-ins and code reviews as part of the doneness criteria. There is also no controversy in stating that regression testing needs to be automated and should be run as often as possible.
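To make the bottle-inspection analogy concrete, here is a minimal sketch of the kind of automated unit test that can run on every build. Python's unittest module is used for illustration only, and the `validate_bottle` function and its 95% fill rule are invented for this example:

```python
import unittest

def validate_bottle(fill_level_ml, capacity_ml=330):
    """Hypothetical inspection rule: accept a bottle only if it is
    filled to at least 95% of its capacity."""
    return fill_level_ml >= 0.95 * capacity_ml

class BottleInspectionTest(unittest.TestCase):
    """Quality control as code: the 'inspector' runs on every build."""

    def test_full_bottle_passes(self):
        self.assertTrue(validate_bottle(330))

    def test_underfilled_bottle_fails(self):
        self.assertFalse(validate_bottle(300))
```

Run with `python -m unittest` as part of the continuous integration build, so every check-in gets inspected the same way every bottle does.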

Ideally, regression tests should be written in such a way that they are maintained along with the code. Using FIT (Framework for Integrated Test) is one good way to keep the tests in sync with the code. If the suite of FIT regression tests is run with every build, any test that fails because it was not refactored along with the changed code needs to be investigated to determine whether the test was simply missed during refactoring or an actual bug was introduced. Though the cost of maintenance is not zero, it is lower, and there is next to no chance that the automated tests will be abandoned.
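FIT itself expresses expectations as tables that fixture code reads and checks. As a rough, framework-free sketch of that table-driven idea (none of this is the FIT API; `discount` and its 10% rule are invented for illustration):

```python
def discount(order_total):
    """Hypothetical function under test: 10% off orders of 100 or more."""
    return order_total * 0.9 if order_total >= 100 else order_total

# Each row pairs inputs with an expected result, much like a row in a
# FIT table written by the customer or tester.
table = [
    # (order_total, expected_price)
    (50, 50),
    (100, 90),
    (200, 180),
]

def run_table(rows):
    """Return the failing rows, much as a FIT run marks failing cells.
    An empty list means the whole table passed."""
    return [(total, expected, discount(total))
            for total, expected in rows
            if discount(total) != expected]

failures = run_table(table)
```

When the discount rule changes, the table has to change with it, which is exactly the property that keeps regression tests maintained alongside the code.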

As you can see, the ownership of quality control within the software product moves more onto the shoulders of the developers. This is as it should be in an agile project where the developer has the responsibility of meeting the customers' requirements, which usually implicitly include no regressions.

Quality Assurance

"Quality Assurance is a part and consistent pair of quality management providing fact-based external confidence to customers and other stakeholders that a product meets needs, expectations, and other requirements. QA assures the existence and effectiveness of procedures that attempt to make sure - in advance - that the expected levels of quality will be reached." (Wikipedia)

Within an agile project, the customer is constantly involved and informed. As such, there is no real need for "fact-based external confidence" building. Another best practice in agile development is to ensure that the acceptance criteria for all requirements are documented and well understood during the requirements gathering and iteration planning stages. Ideally, the validation that the developed functionality meets the acceptance criteria is also automated (again, FIT is a great tool for this automated validation).

So, again, it is the responsibility of the product owner, in creating the requirements, and of the customer, working with the developers, to assure that the "expected levels of quality" are reached.

I know I have a number of testers and QA people nervous at this point in their reading, but you had to know that the main point was coming last.

Quality Analysis

So far I have mentioned these remarkably well-written requirements and acceptance criteria in such a way that some may believe they magically appear. Well, they do not, and they are much too critical to the success of an agile project to neglect. Here is where an experienced tester can contribute greatly to an agile team. A product owner or customer will provide vision in the form of high-level requirements and basic acceptance criteria. An experienced tester, with an understanding of the existing system and/or the technologies involved, can expand and elaborate on these criteria. An especially experienced tester will also be able to suggest missing requirements and non-functional requirements that the customer/product owner has not had the time or experience to consider. A good example of how the acceptance criteria could be augmented is adding boundary conditions to the acceptance tests (e.g. adding a check for maximum field lengths to the FIT tables).
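To show what adding a boundary condition looks like in practice, here is a hedged sketch. The field, its 50-character limit, and the `save_username` function are all invented for illustration; the point is the checks a tester adds at and just past the limit:

```python
MAX_USERNAME_LEN = 50  # hypothetical limit from the acceptance criteria

def save_username(name):
    """Hypothetical validation: reject names longer than the maximum."""
    if len(name) > MAX_USERNAME_LEN:
        raise ValueError("username too long")
    return name

# The boundary checks a tester would add around the stated criterion:
# exactly at the limit is accepted, one character past is rejected.
assert save_username("a" * MAX_USERNAME_LEN) == "a" * MAX_USERNAME_LEN
try:
    save_username("a" * (MAX_USERNAME_LEN + 1))
    rejected = False
except ValueError:
    rejected = True
assert rejected
```

The customer's criterion might only say "usernames are limited to 50 characters"; the tester turns that one sentence into checks on both sides of the boundary.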

Having been freed from a lot of tedious, manual control and assurance testing, the tester can then provide value by performing exploratory testing: using a tester's natural ability to ferret out instabilities in the system, looking at the system from a high level, and turning things on their side as only a true tester/user can.

Conclusion

In conclusion, I feel the role of a tester or QA person in agile projects is more of an analyst role. Call it what you will: quality analyst, requirements analyst, system analyst, etc. An experienced tester can fill in those technical requirements that are missed by the customer, with their high-level perspective, and also missed by the developer, with their focused perspective. The blend of technical skills, customer perspective, and user experience makes the experienced tester/QA person ideal for requirements expansion and elaboration, and provides a good career path into product ownership/management.

