Tuesday, March 27, 2007

Agile Quality: Control vs. Assurance vs. Analysis

Questions about how testing fits into agile development practices are usually answered in one of two ways, both very unhelpful for QA professionals:

1. Agile development eliminates the need for QA, developers test it all themselves.
2. QA has to work harder to keep up with development while maintaining their traditional methodologies and test approaches.

There is truth and falsehood in both statements. When I am asked how to "fit QA in," I like to frame my answer by defining three types of testing: Quality Control, Quality Assurance, and Quality Analysis. I then go on to describe how I think each fits into most agile processes.

Quality Control

What a lot of people think of as testing is what I call Quality Control. Think of the guy sitting in the beer plant (or girl, if you are a fan of Laverne and Shirley) watching the bottles go by, making sure there is nothing wrong with them before they get capped: this is quality control. In other words, you are inspecting the final product to ensure it meets the criteria for an acceptable product. Within any software project, unit testing, peer review, and regression testing are all forms of quality control.

Inside an agile project, these tasks need to be performed on a continuous basis. Unit tests need to be automated and made part of a continuous integration strategy. Peer reviews can be literally continuous, in the case of pair programming, or mandated on a regular basis in the form of diff reviews before check-ins and code reviews as part of the doneness criteria. There is also no controversy in stating that regression testing needs to be automated and should be run as often as possible.
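As a concrete illustration of this kind of automated quality control, here is a minimal sketch of a unit test suite that a continuous integration build would run on every check-in. The function and its rules are hypothetical, invented purely for the example:

```python
import unittest

# Hypothetical production code: a simple price calculator.
def apply_discount(price, percent):
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

# Unit tests like these act as automated quality control; a CI server
# runs them on every build (e.g., via `python -m unittest`).
class ApplyDiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount_leaves_price_unchanged(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

The point is not the specific checks but that they run unattended, on every build, so a regression is caught the moment it is introduced.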

Ideally, regression tests should be written in such a way that they are maintained along with the code. Using FIT (Framework for Integrated Test) is one good way to keep the tests in sync with the code. If the suite of FIT regression tests is run with every build, any tests that fail because they were not refactored along with the changed code need to be investigated to determine whether the test was missed in refactoring or an actual bug was introduced. Though the cost of maintenance is not zero, it is lower, and there is next to no chance that the automated tests will be abandoned.
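FIT itself is typically used from Java, but the table-driven idea behind it can be sketched in a few lines of Python. The production function, the table rows, and their expected values below are all hypothetical; the point is that the fixture calls the real production code, so refactoring that code without updating the table makes the build fail immediately:

```python
# Hypothetical production function under test.
def shipping_cost(weight_kg, express):
    """Base fee plus a per-kilogram charge; express doubles the total."""
    base = 5.0 + 1.5 * weight_kg
    return round(base * (2.0 if express else 1.0), 2)

# The "table": inputs and expected outputs, much as a customer would
# write them into a FIT table.
TABLE = [
    # (weight_kg, express, expected_cost)
    (1.0, False, 6.5),
    (1.0, True, 13.0),
    (10.0, False, 20.0),
]

def run_table(table):
    """Run every row through the production code; return failing rows."""
    failures = []
    for weight, express, expected in table:
        actual = shipping_cost(weight, express)
        if actual != expected:
            failures.append((weight, express, expected, actual))
    return failures
```

An empty result from `run_table(TABLE)` means the build is green; any non-empty result points at exactly which row's expectation no longer matches the code.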

As you can see, the ownership of quality control within the software product moves more onto the shoulders of the developers. This is as it should be in an agile project where the developer has the responsibility of meeting the customers' requirements, which usually implicitly include no regressions.

Quality Assurance

"Quality Assurance is a part of quality management providing fact-based external confidence to customers and other stakeholders that a product meets needs, expectations, and other requirements. QA assures the existence and effectiveness of procedures that attempt to make sure - in advance - that the expected levels of quality will be reached." - Wikipedia

Within an agile project, the customer is constantly involved and informed. As such, there is no real need for "fact-based external confidence" building. Another best practice in agile development is to ensure that the acceptance criteria for all requirements are documented and well understood during the requirements gathering and iteration planning stages. Ideally, the validation that the developed functionality meets the acceptance criteria is also automated (again, FIT is a great tool for this automated validation).

So, again, it is the responsibility of the product owner, in creating requirements, and of the customer, working with the developers, to assure that "expected levels of quality" are reached.

I know I have made a number of testers and QA people nervous at this point, but you had to know the main point was coming last.

Quality Analysis

So far I have mentioned these remarkably well written requirements and acceptance criteria in a way that suggests they magically appear. They do not, and they are much too critical to the success of an agile project to neglect. Here is where an experienced tester can contribute greatly to an agile team. A product owner or customer will provide vision in the form of high-level requirements and basic acceptance criteria. An experienced tester can look at these criteria and, with an understanding of the existing system and/or the technologies involved, expand and elaborate on them. An especially experienced tester will also be able to suggest missing requirements and non-functional requirements that the customer/product owner has not had the time or experience to consider. A good example of how the acceptance criteria could be augmented is adding boundary conditions to the acceptance tests (e.g., adding a check for maximum field lengths to the FIT tables).
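To make that boundary-condition example concrete, here is a small sketch. Suppose the customer's stated criterion is simply "the form accepts a username"; the field, the length limits, and the validation rule below are all hypothetical, chosen only to show the rows an experienced tester would add around the edges:

```python
# Assumed limits for illustration only; a real project would take these
# from the actual requirements.
MIN_LEN = 3
MAX_LEN = 20

def username_is_valid(name):
    """Hypothetical validation rule: alphanumeric, within length limits."""
    return MIN_LEN <= len(name) <= MAX_LEN and name.isalnum()

# Boundary rows a tester would add to the acceptance table: one value
# on each side of both limits, not just a "happy path" value.
BOUNDARY_CASES = [
    ("ab", False),       # one character below the minimum
    ("abc", True),       # exactly the minimum
    ("a" * 20, True),    # exactly the maximum
    ("a" * 21, False),   # one character over the maximum
]
```

The original criterion covers only the middle of the range; the four boundary rows are where off-by-one bugs actually live.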

The tester, having been freed from a lot of manual, tedious control and assurance testing, can then provide value by performing exploratory testing: using a tester's natural ability to ferret out instabilities in the system, looking at the system from a high level, and turning things on their side as only a true tester/user can.

Conclusion

In conclusion, I feel the role of a tester or QA person in agile projects is more of an analyst role. Call it what you will: quality analyst, requirements analyst, systems analyst, etc. An experienced tester can fill in those technical requirements that are missed by the customer, with their high-level perspective, but also missed by the developer, with their focused perspective. The blend of technical skills, customer perspective, and user experience makes the experienced tester/QA person ideal for requirements expansion and elaboration, and provides a good career path into product ownership/management.


Tuesday, March 20, 2007

Types of customers

From time to time, I come across this complaint: developers tell me sad stories about customer managers that "just don't get it". They do not understand Agile principles, and the process they impose is somewhat broken. Are they morons?

No, they are not. First of all, it is counterproductive to regard them as inadequate people. There is always something about them that we just don't understand.

The most typical thing that happens is that customer managers do not try to help the team adopt good Agile engineering practices. Almost always the customer loves everything about team and customer collaboration and Agile project management, but there seems to be something mysteriously annoying about unit testing, test-driven development, refactoring, code reviews, and pairing.

After all, we are all reasonable people and we all have the same goal, don't we? So there must be some consensus about the way we do things. Or else we have some deep misunderstanding of our goals.

So why do we need Agile engineering practices? Well, they allow us to shorten the test cycle, which is important for frequent delivery. They raise the quality of the code, and the system becomes easier to maintain. As for money, engineering practices simply make the system cheaper to develop.

But note that they make development cheaper in the future. At the beginning of the project, they are a pure investment. There is no need to spend hours automating testing if it takes only a few minutes to do manual regression testing of the whole system.

So it looks like there are 3 types of customer managers:

  1. Product-driven managers: people who value their product. They know that their welfare depends upon how successful their product is going to be. Typically they are the owners of the company. Their goals are long-term ones: several years or more.
  2. Project-driven managers: people who value their project. They will be rewarded if the project is successful. They are mostly hired managers from bureaucratic organizations. Their goals are based on their reward system and are mid-term ones.
  3. Demo-driven managers: some managers (thank God I've seen only one) value only the next demonstration to their stakeholders.

Obviously, product-driven managers invest enough effort in technical excellence. Otherwise they are going to fail in a year or so, or at least lose some money trying to develop a system that is not as flexible as it is supposed to be. These are the most comfortable customers for a team that values Agile principles.

Project-driven managers are the most typical ones. They always have a battle going on in their heads. The angel tells them how important it is to maintain high quality in the system and how XP engineering practices can help with it. The devil makes them realize that their bonuses depend on how great the system is going to look over the next few months, not on how easy it will be to maintain in several years - by then it will be another project with some other manager.

Demo-driven managers have no such struggle in their heads. They just don't want to spend a minute on ensuring quality.

So if you have product-driven managers, you are lucky. Your development is the kind that prevents problems rather than struggling with them.

If you have a project-driven manager, just help the angel win ;-). Otherwise, you will spend most of your time heroically fighting problems that could easily have been avoided.

If you have demo-driven managers, God help you.

Tuesday, March 6, 2007

When is Scrum not Scrum?

Via Jason Yip, an interesting summary of potential process flaws of Scrum:

When is Scrum not scrum?




Tool Usage on Scrum Teams

Another great article from Michael Vizdos.
http://www.implementingscrum.com/cartoons/cartoons_files/implementingscrum-20070305.html
Couldn't agree more.