Chapter 1 What Is Agile Testing, Anyway?



Like a lot of terminology, “agile development” and “agile testing” mean different things to different people. In this chapter, we explain our view of agile, which reflects the Agile Manifesto and general principles and values shared by different agile methods. We want to share a common language with you, the reader, so we’ll go over some of our vocabulary. We compare and contrast agile development and testing with the more traditional phased approach. The “whole team” approach promoted by agile development is central to our attitude toward quality and testing, so we also talk about that here.


Agile Values

“Agile” is a buzzword that will probably fall out of use someday and make this book seem obsolete. It’s loaded with different meanings that apply in different circumstances. One way to define “agile development” is to look at the Agile Manifesto (see Figure 1-1).

Figure 1-1 Agile Manifesto

Using the values from the Manifesto to guide us, we strive to deliver small chunks of business value in extremely short release cycles.

We use the word “agile” in this book in a broad sense. Whether your team is practicing a particular agile method, such as Scrum, XP, Crystal, DSDM, or FDD, to name a few, or just adopting whatever principles and practices make sense for your situation, you should be able to apply the ideas in this book. If you’re delivering value to the business in a timely manner with high-quality software, and your team continually strives to improve, you’ll find useful information here. At the same time, there are particular agile practices we feel are crucial to any team’s success. We’ll talk about these throughout the book.

Chapter 21, “Key Success Factors,” lists key success factors for agile testing.


What Do We Mean by “Agile Testing”?

You might have noticed that we use the term “tester” to describe a person whose main activities revolve around testing and quality assurance. You’ll also see that we often use the word “programmer” to describe a person whose main activities revolve around writing production code. We don’t intend that these terms sound narrow or insignificant. Programmers do more than turn a specification into a program. We don’t call them “developers,” because everyone involved in delivering software is a developer. Testers do more than perform “testing tasks.” Each agile team member is focused on delivering a high-quality product that provides business value. Agile testers work to ensure that their team delivers the quality their customers need. We use the terms “programmer” and “tester” for convenience.

Several core practices used by agile teams relate to testing. Agile programmers use test-driven development (TDD), also called test-driven design, to write quality production code. With TDD, the programmer writes a test for a tiny bit of functionality, sees it fail, writes the code that makes it pass, and then moves on to the next tiny bit of functionality. Programmers also write code integration tests to make sure the small units of code work together as intended. This essential practice has been adopted by many teams, even those that don’t call themselves “agile,” because it’s just a smart way to think through your software design and prevent defects. Figure 1-2 shows a sample unit test result that a programmer might see.

Figure 1-2 Sample unit test output
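
To make that rhythm concrete, here is a minimal sketch of one TDD cycle in Java with JUnit. The ShippingCalculator class and its free-shipping rule are invented for this illustration rather than taken from any project in this book; the point is simply that the test comes first, fails, and then pulls the production code into existence.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class ShippingChargeTest {

        // Step 1: write this test before ShippingCalculator exists; watch it fail.
        @Test
        public void ordersOverOneHundredDollarsShipFree() {
            ShippingCalculator calculator = new ShippingCalculator();
            assertEquals(0.00, calculator.chargeFor(150.00), 0.001);
        }

        // Step 3: pick the next tiny bit of functionality and repeat the cycle.
        @Test
        public void smallerOrdersPayAFlatRate() {
            ShippingCalculator calculator = new ShippingCalculator();
            assertEquals(5.95, calculator.chargeFor(40.00), 0.001);
        }
    }

    // Step 2: the simplest production code that makes the failing test pass.
    class ShippingCalculator {
        double chargeFor(double orderTotal) {
            return orderTotal > 100.00 ? 0.00 : 5.95;
        }
    }

A real team repeats this cycle in tiny steps all day long, refactoring as the design emerges.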

This book isn’t about unit-level or component-level testing, but these types of tests are critical to a successful project. Brian Marick [2003] describes these types of tests as “supporting the team,” helping the programmers know what code to write next. Brian also coined the term “technology-facing tests,” tests that fall into the programmer’s domain and are described using programmer terms and jargon. In Part II, we introduce the Agile Testing Quadrants and examine the different categories of agile testing. If you want to learn more about writing unit and component tests, and TDD, the bibliography will steer you to some good resources.

If you want to know how agile values, principles, and practices applied to testing can help you, as a tester, do your best work, and help your team deliver more business value, please keep reading. If you’ve bothered to pick up this book, you’re probably the kind of professional who continually strives to grow and learn. You’re likely to have the mind-set that a good agile team needs to succeed. This book will show you ways to improve your organization’s product, provide the most value possible to your team, and enjoy your job.

Lisa’s Story

During a break from working on this chapter, I talked to a friend who works in quality assurance for a large company. It was a busy time of year, and management expected everyone to work extra hours. He said, "If I thought working 100 extra hours would solve our problems, I'd work 'til 7 every night until that was done. But the truth is, it might take 4,000 extra hours to solve our problems, so working extra feels pointless." Does this sound familiar?

—Lisa

If you’ve worked in the software industry long, you’ve probably had the opportunity to feel like Lisa’s friend. Working harder and longer doesn’t help when your task is impossible to achieve. Agile development acknowledges the reality that we only have so many good productive hours in a day or week, and that we can’t plan away the inevitability of change.

Agile development encourages us to solve our problems as a team. Business people, programmers, testers, analysts—everyone involved in software development—decides together how best to improve their product. Best of all, as testers, we’re working together with a team of people who all feel responsible for delivering the best possible quality, and who are all focused on testing. We love doing this work, and you will too.

When we say “agile testing” in this book, we’re usually talking about business-facing tests, tests that define the business experts’ desired features and functionality. We consider “customer-facing” a synonym for “business-facing.” “Testing” in this book also includes tests that critique the product and focus on discovering what might be lacking in the finished product so that we can improve it. It includes just about everything beyond unit and component level testing: functional, system, load, performance, security, stress, usability, exploratory, end-to-end, and user acceptance. All these types of tests might be appropriate to any given project, whether it’s an agile project or one using more traditional methodologies.

Agile testing doesn't just mean testing on an agile project. Some testing approaches, such as exploratory testing, are inherently agile, whether it's done on an agile project or not. Testing an application with a plan to learn about it as you go, and letting that information guide your testing, is in line with valuing working software and responding to change. Later chapters discuss agile forms of testing as well as "agile testing" practices.


A Little Context for Roles and Activities on an Agile Team

We’ll talk a lot in this book about the “customer team” and the “developer team.” The difference between them is the skills they bring to delivering a product.


Customer Team

The customer team includes business experts, product owners, domain experts, product managers, business analysts, subject matter experts—everyone on the “business” side of a project. The customer team writes the stories or feature sets that the developer team delivers. They provide the examples that will drive coding in the form of business-facing tests. They communicate and collaborate with the developer team throughout each iteration, answering questions, drawing examples on the whiteboard, and reviewing finished stories or parts of stories.

Testers are integral members of the customer team, helping elicit requirements and examples and helping the customers express their requirements as tests.


Developer Team

Everyone involved with delivering code is a developer, and is part of the developer team. Agile principles encourage team members to take on multiple activities; any team member can take on any type of task. Many agile practitioners discourage specialized roles on teams and encourage all team members to transfer their skills to others as much as possible. Nevertheless, each team needs to decide what expertise their projects require. Programmers, system administrators, architects, database administrators, technical writers, security specialists, and people who wear more than one of these hats might be part of the team, physically or virtually.

Testers are also on the developer team, because testing is a central component of agile software development. Testers advocate for quality on behalf of the customer and assist the development team in delivering the maximum business value.


Interaction between Customer and Developer Teams

The customer and developer teams work closely together at all times. Ideally, they’re just one team with a common goal. That goal is to deliver value to the organization. Agile projects progress in iterations, which are small development cycles that typically last from one to four weeks. The customer team, with input from the developers, will prioritize stories to be developed, and the developer team will determine how much work they can take on. They’ll work together to define requirements with tests and examples, and write the code that makes the tests pass. Testers have a foot in each world, understanding the customer viewpoint as well as the complexities of the technical implementation (see Figure 1-3).

Figure 1-3 Interaction of roles

Some agile teams don’t have any members who define themselves as “testers.” However, they all need someone to help the customer team write business-facing tests for the iteration’s stories, make sure the tests pass, and make sure that adequate regression tests are automated. Even if a team does have testers, the entire agile team is responsible for these testing tasks. Our experience with agile teams has shown that testing skills and experience are vital to project success and that testers do add value to agile teams.


How Is Agile Testing Different?

We both started working on agile teams at the turn of the millennium. Like a lot of testers who are new to agile, we didn't know what to expect at first. Working with our respective agile teams, we've learned a lot about testing on agile projects. We've also implemented ideas and practices suggested by other agile testers and teams. Over the years, we've shared our experiences with other agile testers as well. We've facilitated workshops and led tutorials at agile and testing conferences, talked with local user groups, and joined countless discussions on agile testing mailing lists. Through these experiences, we've identified differences between testing on agile teams and testing on traditional waterfall development projects. Agile development has transformed the testing profession in many ways.


Working on Traditional Teams

Neither working closely with programmers nor getting involved with a project from the earliest phases was new to us. However, we were used to strictly enforced gated phases of a narrowly defined software development life cycle, starting with release planning and requirements definition and usually ending with a rushed testing phase and a delayed release. In fact, we often were thrust into a gatekeeper role, telling business managers, “Sorry, the requirements are frozen; we can add that feature in the next release.”

As leaders of quality assurance teams, we were also often expected to act as gatekeepers of quality. We couldn’t control how the code was written, or even if any programmers tested their code, other than by our personal efforts at collaboration. Our post-development testing phases were expected to boost quality after code was complete. We had the illusion of control. We usually had the keys to production, and sometimes we had the power to postpone releases or stop them from going forward. Lisa even had the title of “Quality Boss,” when in fact she was merely the manager of the QA team.

Our development cycles were generally long. Projects at a company that produced database software might last for a year. The six-month release cycles Lisa experienced at an Internet start-up seemed short at the time, although that was still a long time to keep requirements frozen. In spite of all that process and discipline, with each phase diligently completed before the next began, those long cycles gave the competition plenty of time to come out ahead, and the applications were not always what the customers expected.

Traditional teams are focused on making sure all the specified requirements are delivered in the final product. If everything isn’t ready by the original target release date, the release is usually postponed. The development teams don’t usually have input about what features are in the release, or how they should work. Individual programmers tend to specialize in a particular area of the code. Testers study the requirements documents to write their test plans, and then they wait for work to be delivered to them for testing.


Working on Agile Teams

Transitioning to the short iterations of an agile project might produce initial shock and awe. How can we possibly define requirements and then test and deliver production-ready code in one, two, three, or four weeks? This is particularly tough for larger organizations with separate teams for different functions and even harder for teams that are geographically dispersed. Where do all the various programmers, testers, analysts, project managers, and other specialists fit in a new agile project? How can we possibly code and test so quickly? Where would we find time for difficult efforts such as automating tests? What control do we have over bad code getting delivered to production?

We’ll share our stories from our first agile experiences to show you that everyone has to start somewhere.

Lisa’s Story

My first agile team embraced Extreme Programming (XP), not without some “learning experiences.” Serving as the only professional tester on a team of eight programmers who hadn’t learned how to automate unit tests was disheartening. The first two-week iteration felt like jumping off a cliff.

Fortunately, we had a good coach, excellent training, a supportive community of agile practitioners with ideas to share, and time to learn. Together we figured out some ins and outs of how to integrate testing into an agile project—indeed, how to drive the project with tests. I learned how I could use my testing skills and experience to add real value to an agile team.

The toughest thing for me (the former Quality Boss) to learn was that the customers, not I, decided on quality criteria for the product. I was horrified after the first iteration to find that the code crashed easily when two users logged in concurrently. My coach patiently explained, over my strident objections, that our customer, a start-up company, wanted to be able to show features to potential customers. Reliability and robustness were not yet the issue.

I learned that my job was to help the customers tell us what was valuable to them during each iteration, and to write tests to ensure that’s what they got.

—Lisa


Janet’s Story

My first foray into the agile world was also an Extreme Programming (XP) engagement. I had just come from an organization that practiced waterfall with some extremely bad practices, including giving the test team a day or so to test six months of code. In my next job as QA manager, the development manager and I were both learning what XP really meant. We successfully created a team that worked well together and managed to automate most of the tests for the functionality. When the organization downsized during the dot-com bust, I found myself in a new position at another organization as the lone tester with about ten developers on an XP project.

On my first day of the project, Jonathan Rasmusson, one of the developers, came up to me and asked me why I was there. The team was practicing XP, and the programmers were practicing test-first and automating all their own tests. Participating in that was a challenge I couldn’t resist. The team didn’t know what value I could add, but I knew I had unique abilities that could help the team. That experience changed my life forever, because I gained an understanding of the nuances of an agile project and determined then that my life’s work was to make the tester role a more fulfilling one.

—Janet

Read Jonathan’s Story

Jonathan Rasmusson, now an agile coach at Rasmusson Software Consulting and formerly Janet's coworker on her second agile team, explains how he learned that agile testers add value.

So there I was, a young hotshot J2EE developer excited and pumped to be developing software the way it should be developed—using XP. Until one day, in walks a new team member—a tester. It seems management thought it would be good to have a QA resource on the team.

That’s fine. Then it occurred to me that this poor tester would have nothing to do. I mean, as a developer on an XP project, I was writing the tests. There was no role for QA here as far as I could see.

So of course I went up and introduced myself and asked quite pointedly what she was going to do on the project, because the developers were writing all the tests. While I can’t remember exactly how Janet responded, the next six months made it very clear what testers can do on agile projects.

With the automation of the tedious, low-level boundary condition test cases, Janet as a tester was now free to focus on much greater value-add areas like exploratory testing, usability, and testing the app in ways developers hadn’t originally anticipated. She worked with the customer to help write test cases that defined success for upcoming stories. She paired with developers looking for gaps in tests.

But perhaps most importantly, she helped reinforce an ethos of quality and culture, dispensing happy-face stickers to those developers who had done an exceptional job (these became much sought-after badges of honor displayed prominently on laptops).

Working with Janet taught me a great deal about the role testers play on agile projects, and their importance to the team.


Agile teams work closely with the business and have a detailed understanding of the requirements. They’re focused on the value they can deliver, and they might have a great deal of input into prioritizing features. Testers don’t sit and wait for work; they get up and look for ways to contribute throughout the development cycle and beyond.

If testing on an agile project felt just like testing on a traditional project, we wouldn’t feel the need to write a book. Let’s compare and contrast these testing methods.


Traditional vs. Agile Testing

It helps to start by looking at how agile testing differs from testing in traditional, phased software development. Consider Figure 1-4.

Figure 1-4 Traditional testing vs. agile testing

In the phased approach diagram, it is clear that testing happens at the end, right before release. The diagram is idealistic, because it gives the impression there is as much time for testing as there is for coding. In many projects, this is not the case. The testing gets “squished” because coding takes longer than expected, and because teams get into a code-and-fix cycle at the end.

Agile development is iterative and incremental. This means that testers test each increment of coding as soon as it is finished. An iteration might be as short as one week or as long as a month. The team builds and tests a little bit of code, making sure it works correctly, and then moves on to the next piece that needs to be built. Programmers never get ahead of the testers, because a story is not "done" until it has been tested. We'll talk much more about this throughout the book.

There’s tremendous variety in the approaches to projects that agile teams take. One team might be dedicated to a single project or might be part of another bigger project. No matter how big your project is, you still have to start somewhere. Your team might take on an epic or feature, a set of related stories at an estimating meeting, or you might meet to plan the release. Regardless of how a project or subset of a project gets started, you’ll need to get a high-level understanding of it. You might come up with a plan or strategy for testing as you prepare for a release, but it will probably look quite different from any test plan you’ve done before.

Every project, every team, and sometimes every iteration is different. How your team solves problems should depend on the problem, the people, and the tools you have available. As an agile team member, you will need to be adaptive to the team’s needs.

Rather than creating tests from a requirements document that was created by business analysts before anyone ever thought of writing a line of code, someone will need to write tests that illustrate the requirements for each story days or hours before coding begins. This is often a collaborative effort between a business or domain expert and a tester, analyst, or some other development team member. Detailed functional test cases, ideally based on examples provided by business experts, flesh out the requirements. Testers will conduct manual exploratory testing to find important bugs that defined test cases might miss. Testers might pair with other developers to automate and execute test cases as coding on each story proceeds. Automated functional tests are added to the regression test suite. When tests demonstrating minimum functionality are complete, the team can consider the story finished.
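
As a hypothetical illustration (the story, the DiscountCalculator class, and the numbers are all invented for this sketch), a customer example such as "gold members get 10% off orders over $50" might be captured as an executable, business-facing test before coding starts.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Business-facing tests written from the customer's examples for the story
    // "Gold members get a 10% discount on orders over $50."
    public class GoldMemberDiscountTest {

        @Test
        public void goldMemberGetsTenPercentOffAnEightyDollarOrder() {
            DiscountCalculator calculator = new DiscountCalculator();
            assertEquals(72.00, calculator.priceFor("GOLD", 80.00), 0.001);
        }

        @Test
        public void goldMemberPaysFullPriceOnAFortyDollarOrder() {
            DiscountCalculator calculator = new DiscountCalculator();
            assertEquals(40.00, calculator.priceFor("GOLD", 40.00), 0.001);
        }
    }

    // Production code the team writes to make the customer's examples pass.
    class DiscountCalculator {
        double priceFor(String membershipLevel, double orderTotal) {
            if ("GOLD".equals(membershipLevel) && orderTotal > 50.00) {
                return orderTotal * 0.90;
            }
            return orderTotal;
        }
    }

Many teams express these examples as tables or keywords in a functional testing tool so that business experts can read and help write them; the mechanics matter less than turning the customer's example into a test the whole team can run, automate, and add to the regression suite.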

If you attended agile conferences and seminars in the early part of this decade, you heard a lot about TDD and acceptance testing but not so much about other critical types of testing, such as load, performance, security, usability, and other “ility” testing. As testers, we thought that was a little weird, because all these types of testing are just as vital on agile projects as they are on projects using any other development methodology. The real difference is that we like to do these tests as early in the development process as we can so that they can also drive design and coding.

If the team actually releases each iteration, as Lisa’s team does, the last day or two of each iteration is the “end game,” the time when user acceptance testing, training, bug fixing, and deployments to staging environments can occur. Other teams, such as Janet’s, release every few iterations, and might even have an entire iteration’s worth of “end game” activities to verify release readiness. The difference here is that all the testing is not left until the end.

As a tester on an agile team, you're a key player in releasing code to production, just as you might have been in a more traditional environment. You might run scripts or do manual testing to verify that all elements of a release, such as database update scripts, are in place. All team members participate in retrospectives or other process improvement activities that might occur for every iteration or every release. The whole team brainstorms ways to solve problems and improve processes and practices.

Agile projects have a variety of flavors. Is your team starting with a clean slate, in a greenfield (new) development project? If so, you might have fewer challenges than a team faced with rewriting or building on a legacy system that has no automated regression suite. Working with a third party brings additional testing challenges to any team.

Whatever flavor of development you’re using, pretty much the same elements of a software development life cycle need to happen. The difference with agile is that time frames are greatly shortened, and activities happen concurrently. Participants, tests, and tools need to be adaptive.

The most critical difference for testers in an agile project is the quick feedback from testing. It drives the project forward, and there are no gatekeepers ready to block project progress if certain milestones aren’t met.

We’ve encountered testers who resist the transition to agile development, fearing that “agile development” equates with chaos, lack of discipline, lack of documentation, and an environment that is hostile to testers. While some teams do seem to use the “agile” buzzword to justify simply doing whatever they want, true agile teams are all about repeatable quality as well as efficiency. In our experience, an agile team is a wonderful place to be a tester.


Whole-Team Approach

One of the biggest differences between agile development and traditional development is the agile "whole-team" approach. With agile, it's not only the testers or a quality assurance team who feel responsible for quality. We don't think in terms of "departments"; we just think of the skills and resources we need to deliver the best possible product. The focus of agile development is producing high-quality software in a time frame that maximizes its value to the business. This is the job of the whole team, not just testers or designated quality assurance professionals. Everyone on an agile team gets "test-infected." Tests, from the unit level on up, drive the coding, help the team learn how the application should work, and let us know when we're "done" with a task or story.

An agile team must possess all the skills needed to produce quality code that delivers the features required by the organization. While this might mean including specialists on the team, such as expert testers, it doesn’t limit particular tasks to particular team members. Any task might be completed by any team member, or a pair of team members. This means that the team takes responsibility for all kinds of testing tasks, such as automating tests and manual exploratory testing. It also means that the whole team thinks constantly about designing code for testability.
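
As one small sketch of what "designing for testability" can mean in practice (all of the names here are invented for the example), the team might isolate a slow or hard-to-control dependency behind an interface so that a test can substitute a fake.

    import org.junit.Test;
    import static org.junit.Assert.assertTrue;

    // An interface that lets tests substitute a fake for the real mail server.
    interface MailSender {
        void send(String to, String message);
    }

    // Production code depends on the interface, not on a live SMTP connection.
    class OrderConfirmation {
        private final MailSender mailSender;

        OrderConfirmation(MailSender mailSender) {
            this.mailSender = mailSender;
        }

        void confirm(String customerEmail) {
            mailSender.send(customerEmail, "Thanks for your order!");
        }
    }

    public class OrderConfirmationTest {

        // A fake that records what was sent instead of emailing anyone.
        static class RecordingMailSender implements MailSender {
            boolean messageSent = false;

            public void send(String to, String message) {
                messageSent = true;
            }
        }

        @Test
        public void sendsAConfirmationMessage() {
            RecordingMailSender mailSender = new RecordingMailSender();
            new OrderConfirmation(mailSender).confirm("pat@example.com");
            assertTrue(mailSender.messageSent);
        }
    }

Because the production code never hard-wires the real mail server, the test runs quickly and deterministically; that design choice is easiest to make when the whole team is thinking about testability from the start.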

The whole-team approach involves constant collaboration. Testers collaborate with programmers, the customer team, and other team specialists—and not just for testing tasks, but other tasks related to testing, such as building infrastructure and designing for testability. Figure 1-5 shows a developer reviewing reports with two customers and a tester (not pictured).


Figure 1-5 A developer discusses an issue with customers

The whole-team approach means everyone takes responsibility for testing tasks. It means team members have a range of skill sets and experience to employ in attacking challenges such as designing for testability, turning examples into tests, and writing the code that makes those tests pass. These diverse viewpoints can only mean better tests and test coverage.

Most importantly, on an agile team, anyone can ask for and receive help. The team commits to providing the highest possible business value as a team, and the team does whatever is needed to deliver it. Some folks who are new to agile perceive it as all about speed. The fact is, it’s all about quality—and if it’s not, we question whether it’s really an “agile” team.

Your situation is unique. That’s why you need to be aware of the potential testing obstacles your team might face and how you can apply agile values and principles to overcome them.


Summary

Understanding the activities that testers perform on agile teams helps you show your own team the value that testers can add. Learning the core practices of agile testing will help your team deliver software that delights your customers.

In this chapter, we've explained what we mean when we use the term "agile testing."

We showed how the Agile Manifesto relates to testing, with its emphasis on individuals and interactions, working software, customer collaboration, and responding to change.

We provided some context for this book, including some other terms we use such as “tester,” “programmer,” “customer,” and related terms so that we can speak a common language.

We explained how agile testing, with its focus on business value and delivering the quality customers require, is different from traditional testing, which focuses on conformance to requirements.

We introduced the “whole-team” approach to agile testing, which means that everyone involved with delivering software is responsible for delivering high-quality software.

We advised taking a practical approach by applying agile values and principles to overcome agile testing obstacles that arise in your unique situation.

