Index


A

Abbot GUI test tool, 127

Acceptance tests. See also Business-facing tests

definition,

501

Remote Data Monitoring system example,

245

UAT (user acceptance testing) compared with,

130

Ad hoc testing, 198

Adaptability, skills and, 39–40

ADEPT (AS400 Displays for External Prototyping and Testing), 117–118

Advance clarity

customers speaking with one voice,

373–374

determining story size,

375–376

gathering all viewpoints regarding requirements,

374–375

overview of,

140–142, 373

Advance preparation

downside of,

373

how much needed,

372–373

Agile development

Agile manifesto and,

3–4

barriers to. See Barriers to adopting agile development

team orientation of,

6

Agile Estimating and Planning (Cohn), 331, 332

Agile manifesto

people focus,

30

statement of,

4

value statements in,

21

Agile principles. See Principles, for agile testers

Agile testers. See also Testers

agile testing mind-set,

482–483

definition,

4

giving all team members equal weight,

31

hiring,

67–69

what they are,

19–20

Agile testing

definition,

6

as mind-set,

20–21

what we mean,

4–7

Agile values, 3–4

Alcea’s FIT IssueTrack, 84

Alpha tests, 466–467

ant, 284

as build tool,

126

continual builds and,

175, 291

AnthillPro, 126

ANTS Profiler Pro, 234

Apache JMeter. See JMeter

API-layer functional test tools, 168–170

Fit and FitNesse,

168–170

overview of,

168

testing web Services,

170

API testing

automating,

282

overview of,

205–206

APIs (application programming interfaces), 501

Appleton, Brad, 124

Application under test (AUT), 246

Applications

integration testing with external applications,

459

Remote Data Monitoring system example,

242–243

Architecture

incremental approach to testing,

114

layered,

116

Quadrant 1 tests and,

99

scalability and,

104, 221

testable,

30, 115, 182, 184, 267

AS400 Displays for External Prototyping and Testing (ADEPT), 117–118

Assumptions, hidden

agile testers' response to,

25

failure to detect,

32

questions that uncover,

136

worst-case scenarios and,

334

Attitude

agile testing mind-set,

482–483

barriers to adopting agile development,

48

vs. skills,

20

Audits, compliance with audit requirements, 89–90

AUT (application under test), 143, 225, 246, 317

Authorization, security testing and, 224

Automated regression testing

key success factors,

484

release candidates and,

458

as a safety net,

261–262

Automated test lists, test plan alternatives, 353–354

Automation

code flux and,

269

of deployment,

232

driving development with,

262–263

of exploratory testing,

201

fear of,

269–270

feedback from,

262

freeing people for other work,

259–261

of functional test structure,

245–247

home-brewed test,

175

investment required,

267–268

learning curve,

266–267

legacy code and,

269

maintainability and,

227–228

manual testing vs.,

258–259

obstacles to,

264–265

old habits and,

270

overview of,

255

programmers’ attitude regarding,

265–266

reasons for,

257–258

responding to change and,

29

ROI and,

264

task cards and,

394–395

testability and,

149–150

tests as documentation,

263–264

Automation strategy

agile coding practices and,

303–304

applying one tool at a time,

312–313

data generation tools,

304–305

database access and,

306–310

design and maintenance and,

292–294

developing,

288–289

identifying tool requirements,

311–312

implementing,

316–319

iterative approach,

299–300

keep it simple,

298–299

learning by doing,

303

managing automated tests,

319

multi-layered approach to,

290–292

organizing test results,

322–324

organizing tests,

319–322

overview of,

273

principles,

298

record/playback tools and,

294, 296–297

starting with area of greatest pain,

289–290

taking time to do it right,

301–303

test automation pyramid,

276–279

test categories,

274–276

tool selection,

294–298, 313–316

understanding purpose of tests and,

310–311

what can be automated,

279–285

what might be difficult to automate,

287–288

what should not be automated,

285–287

whole team approach,

300–301

Automation tools, 164–177

API-layer functional test tools,

168–170

builds and,

126

GUI test tools,

170–176

overview of,

164–165

unit-level test tools,

165–168

web services test tool,

170

B

Bach, James, 195, 200, 212

Bach, Jonathan, 201

Back-end testing

behind the GUI,

282

non-UI testing,

204–205

Bamboo, 126

Barriers to adopting agile development, 44–49

conflicting or multiple roles,

45

cultural differences among roles,

48–49

lack of training,

45

lack of understanding of agile concepts,

45–48

loss of identity,

44–45

overview of,

44

past experience and attitudes,

48

Baselines

break-test baseline technique,

363

performance,

235–237

Batch

files,

251

processing,

345

scheduling process,

182

BDD (Behavior-driven development)

easyb tool,

166–168

tools for Quadrant 1 tests,

127

Beck, Kent, 26, 99

Benander, Mark, 51

Benchmarking, 237

Berczuk, Stephen, 124

Beta testing, 466–467

Big picture

agile testers focus on,

23

high-level tests and examples,

397–402

key success factors,

490–491

peril of forgetting,

148

regression tests and,

434

Bolton, Michael, 195

Bos, Erik, 114

Boundary conditions

API testing and,

205

automation and,

11

data generation tools and,

304

identifying test variations,

410

writing test cases for,

137

Boyer, Erika, 140, 163, 372, 432

Brainstorming

automation giving testers better work,

260

prior to iteration,

370, 381

quadrants as framework for,

253

taking time for,

301

testers,

121

Break-test baseline technique, 363

Browsers, compatibility testing and, 230

Budget limits, 55

Bug tracking. See Defect tracking

Bugs. See Defects

Build

automating,

280–282

challenging release candidate builds,

473

definition,

501

incremental,

178–179

speeding up,

118–119

Build automation tools, 126, 282

Build/Operate/Check pattern, 180

Build tools, 126

BuildBeat, 126

Business analysts, 374

Business expert role

agreement regarding requirements,

428, 430

common language and,

134, 291, 414

on customer team,

6–7

iteration demo and,

443

language of,

291

Power of Three and,

482

tools geared to,

134

Business-facing tests

agile testing as,

6

Quadrants 2 & 3,

97–98

technology-facing tests compared with,

120

Business-facing tests, critiquing the product (Quadrant 3), 189–215

acceptance tests,

245

API testing,

205–206

demonstrations,

191–192

emulator tools,

213–214

end-to-end tests,

249–250

exploratory testing,

195–202, 248–249

generating test data,

212

GUI testing,

204

monitoring tools,

212–213

overview of,

189–191

reports,

208–210

scenario testing,

192–195

session-based testing,

200–201

setting up tests,

211–212

simulator tools,

213

tools for exploratory testing,

210–211

usability testing,

202–204

user acceptance testing,

250

user documentation,

207–208

web services testing,

207

Business-facing tests, supporting team (Quadrant 2), 129–151

advance clarity,

140–142

automating functional tests,

245–247

common language and,

134–135

conditions of satisfaction and,

142–143

doneness,

146–147

driving development with,

129–132

eliciting requirements,

135–140

embedded testing,

248

incremental approach,

144–146

requirements quandary and,

132–134

ripple effects,

143–144

risk mitigation and,

147–149

testability and automation,

149–150

toolkit for. See Toolkit (Quadrant 2)

web services testing,

247–248

Business impact, 475–476

Business value

adding value,

31–33

as goal of agile development,

5–8, 69, 454

metrics and,

75

release cycles and,

3

role, function, business value pattern,

155

team approach and,

16

Busse, Mike, 106, 235, 284, 313

Buwalda, Hans, 193

C

Canonical data, automating databases and, 308–309

Canoo WebTest

automating GUI tests,

184, 186

GUI regression test suite,

291

GUI smoke tests,

300

GUI test tools,

174–175

organizing tests and,

320

scripts and,

320

XML Editor for,

125

Capability Maturity Model Integration (CMMI), 90–91

Capture-playback tool, 267

Celebrating successes

change implementation and,

50–52

iteration wrap up and,

449–451

Chandra, Apurva, 377

Chang, Tae, 53–54

Change

celebrating successes,

50–52

giving team ownership,

50

introducing,

49

not coming easy,

56–57

responsiveness to,

28–29

talking about fears,

49–50

Checklists

release readiness,

474

tools for eliciting examples and requirements,

156

CI. See Continuous integration (CI)

CI Factory, 126

CMMI (Capability Maturity Model Integration), 90–91

Co-location, team logistics and, 65–66

Coaches

adjusting to agile culture and,

40

learning curve and,

266

providing encouragement,

69

skill development and,

122

training and,

45–46

Cockburn, Alistair, 115

Code

automation and code flux,

269

automation and legacy code,

269

automation strategy and,

303–304

documentation of,

251

standards,

227

writing testable,

115

Code coverage, release metrics, 360–364

Coding and testing, 405–441

adding complexity,

407

alternatives for dealing with bugs,

424–428

choosing when to fix bugs,

421–423

collaborating with programmers,

413–414

dealing with bugs,

416–419

deciding which bugs to log,

420–421

driving development and,

406

facilitating communication,

429–432

focusing on one story,

411–412

identifying variations,

410

iteration metrics,

435–440

media for logging bugs,

423–424

overview of,

405

Power of Three for resolving differences in viewpoint,

411

regression testing and,

432–434

resources,

434–435

risk assessment,

407–409

as simultaneous process,

409–410, 488–489

starting simple,

406, 428–429

talking to customers,

414–415

tests that critique the product,

412–413

Cohn, Mike, 50, 155, 276, 296, 331, 332

Collaboration

with customers,

396–397

key success factors,

489–490

with programmers,

413–414

whole team approach,

15–16

Collino, Alessandro, 103, 363

Communication

common language and,

134–135

with customer,

140, 396–397

DTS (Defect Tracking System) and,

83

facilitating,

23–25, 429–432

product delivery and,

462–463

size as challenge to,

42–43

between teams,

69–70

test results,

357–358

Comparisons, automating, 283

Compatibility testing, 229–230

Component tests

automating,

282

definition,

501

supporting function of,

5

Conditions of satisfaction

business-facing tests and,

142–143

definition,

501–502

Context-driven testing

definition,

502

quadrants and,

106–107

Continuous build process

failure notification and,

112

feedback and,

119

FitNesse tests and,

357

implementing,

114

integrating tools with,

175, 311

source code control and,

124

what testers can do,

121

Continuous feedback principle, 22

Continuous improvement principle, 27–28

Continuous integration (CI)

automating,

280–282

as core practice,

486–487

installability and,

231–232

Remote Data Monitoring system example,

244

running tests and,

111–112

Conversion, data migration and, 460–461

Core practices

coding and testing as one process,

488–489

continuous integration,

486–487

incremental approach,

488

overview of,

486

synergy between practices,

489

technical debt management,

487–488

test environments,

487

Courage, principles, 25–26, 71

Credibility, building, 57

Critiquing the product

business facing tests. See Business-facing tests, critiquing the product (Quadrant 3)

technology-facing tests. See Technology-facing tests, critiquing the product (Quadrant 4)

CrossCheck, testing Web Services, 170

CruiseControl, 126, 244, 291

Cultural change, 37. See also Organizations

Cunningham, Ward, 106, 168, 506

Customer expectations

business impact and,

475–476

production support,

475

Customer-facing test. See Business-facing tests

Customer support, DTS (Defect Tracking System) and, 82

Customer team

definition,

502

interaction between customer and developer teams,

8

overview of,

7

Customer testing

Alpha/Beta testing,

466–467

definition,

502

overview of,

464

UAT (user acceptance testing),

464–466

Customers

collaborating with,

396–397, 489–490

considering all viewpoints during iteration planning,

388–389

delivering value to,

22–23

importance of communicating with,

140, 414–415, 444

iteration demo,

191–192, 443–444

participation in iteration planning, 384–385

relationship with,

41–42

reviewing high-level tests with,

400

speaking with one voice,

373–374

CVS, source code control and, 124

D

Data

automating creation or setup,

284–285

cleanup,

461

conversion,

459–461

release planning and,

348

writing task cards and,

392

Data-driven tests, 182–183

Data feeds, testing, 249

Data generation tools, 304–305

Data migration, automating, 310, 460

Databases

avoiding access when running tests,

306–310

canonical data and automation,

308–309

maintainability and,

228

product delivery and updates,

459–461

production-like data and automation,

309–310

setting up/tearing down data for each automated test,

307–308

testing data migration,

310

De Souza, Ken, 223

Deadlines, scope and, 340–341

Defect metrics

overview of,

437–440

release metrics,

364–366

Defect tracking, 79–86

DTS (Defect Tracking System),

79–83

keeping focus and,

85–86

overview of,

79

reasons for,

79

tools for,

83–85

Defect Tracking System. See DTS (Defect Tracking System)

Defects

alternatives for dealing with bugs, 424–428

choosing when to fix bugs,

421–423

dealing with bugs,

416–419

deciding which bugs to log,

420–421

media for logging bugs,

423–424

metrics and,

79

TDD (test-driven development) and,

490

writing task cards and,

391–392

zero bug tolerance,

79, 418–419

Deliverables

“fit and finish” deliverables,

454

nonsoftware,

470

overview of,

468–470

Delivering product

Alpha/Beta testing,

466–467

business impact and,

475–476

communication and,

462–463

customer expectations,

475

customer testing,

464

data conversion and database updates,

459–461

deliverables,

468–470

end game,

456–457

installation testing,

461–462

integration with external applications,

459

nonfunctional testing and,

458–459

overview of,

453

packaging,

474–475

planning time for testing,

455–456

post-development testing cycles,

467–468

production support,

475

release acceptance criteria,

470–473

release management,

470, 474

releasing product,

470

staging environment and,

458

testing release candidates,

458

UAT (user acceptance testing),

464–466

what if it is not ready,

463–464

what makes a product,

453–455

Demos/demonstrations

of an iteration,

443–444

value to customers,

191–192

Deployment, automating, 280–282

Design

automation strategy and,

292–294

designing with testing in mind,

115–118

Detailed test cases

art and science of writing,

178

big picture approach and,

148–149

designing with,

401

Developer team

interaction between customer and developer teams,

8

overview of,

7–8

Development

agile development,

3–4, 6

automated tests driving,

262–263

business-facing tests driving,

129–132

coding driving,

406

post-development testing cycles,

467–468

Development spikes, 381

Development team, 502

diff tool, 283

Distributed teams, 431–432

defect tracking systems and,

82

physical logistics,

66

online high level tests for,

399

online story board for,

357

responding to change,

29

software-based tools to elicit examples and requirements and,

163–164

Documentation

automated tests as source of,

263–264

problems and fixes,

417

reports,

208–210

of test code,

251

tests as,

402

user documentation,

207–208

Doneness

knowing when a story is done,

104–105

multitiered,

471–472

Driving development with tests. See TDD (test-driven development)

DTS (Defect Tracking System), 80–83

benefits of,

80–82

choosing media for logging bugs,

424

documenting problems and fixes,

417

logging bugs and,

420

reason for not using,

82–83

Dymond, Robin, xxx

Dynamic analysis, security testing tools, 225

E

easyb behavior-driven development tool, 165–168

EasyMock, 127

Eclipse, 125, 316

Edge cases

identifying variations,

410

not having time for,

112

starting simple and then adding complexity,

406–407

test cases for,

137

Embedded system, Remote Data Monitoring example, 248

Empowerment, of teams, 44

Emulator tools, 213–214

End game

Agile testing,

91

iteration,

14

product delivery and,

456–457

release and,

327

End-to-end tests, 249–250

Enjoyment, principle of, 31

Environment, test environment, 347–348

Epic. See also Themes

definition,

502

features becoming,

502

iterations in,

76, 329

planning,

252

ePlan Services, Inc., xli, 267

Errors, manual testing and, 259

Estimating story size, 332–338

eValid, 234

Event-based patterns, test design patterns, 181

Everyday Scripting with Ruby for Teams, Testers, and You (Marick), 297, 303

Example-driven development, 378–380

Examples

for eliciting requirements,

136–137

tools for eliciting examples and requirements,

155–156

Executable tests, 406

Exploratory testing (ET)

activities, characteristics, and skills (Hagar),

198–200

attributes of exploratory tester,

201–202

automation of,

201

definition,

502–503

end game and,

457

explained (Bolton),

195–198

manual testing and,

280

monitoring tools,

212

overview of,

26, 195

Remote Data Monitoring system example,

248–249

session-based testing and,

200–201

setup,

211–212

simulators and emulators,

212–213

tests that critique the product,

412–413

tools for,

210–212

tools for generating test data,

212

what should not be automated,

286

External quality, business facing tests defining, 99, 131

External teams, 43, 457

Extreme Programming. See XP (Extreme Programming)

Extreme Programming Explained (Beck), 26

F

Face-to-face communication, 23–25

Failover tests, 232

Failure, courage to learn from, 25

Fake objects, 115, 118, 306, 502–503

Fault tolerance, product delivery and, 459

Fear

barriers to automation,

269–270

change and,

49–50

Fearless Change (Manns and Rising), 121

Feathers, Michael, 117, 288

Features

defects vs.,

417–418

definition,

502–503

focusing on value,

341

Feedback

automated tests providing,

262

continuous feedback principle,

22

iterative approach and,

299–300

key success factors,

484–486

managing tests for,

323–324

Quadrant 1 tests and,

118–119

“Fit and finish” deliverables, 454

Fit (Framework for Integrated Test), 134–135

API-layer functional test tools,

168–169

automation test pyramid and,

278

FIT IssueTrack, Alcea, 83–84

FitNesse

advantages of,

163

API-layer functional test tools,

169–170

automating functional tests with,

30, 145

business-facing tests with,

154, 178

collaboration and,

164

continual builds and,

119, 357

data verification with,

287

doneness and,

472

encouraging use of,

122

examples and,

136, 169

feedback and,

323–324

file parsing rules illustrated with,

205

functional testing behind the GUI,

291, 300

home-grown scripts and,

305

JUnit compared with,

299

keywords or actions words for automating tests,

182–183

manual vs. automated testing,

210

memory demands of,

306

organizing tests and,

319–320

overview of,

168–170

remote testing and,

432

“start, stop, continue” list,

446

support for source code control tools,

320

test automation pyramid and,

278

test cards and,

389–390

test cases as documentation,

402

test design and maintenance,

292

testing database layer with,

284

testing stories,

395

traceability requirements and,

88

user acceptance testing,

295

wikis and,

186

Fleisch, Patrick, 377, 440

Flow diagrams

scenario testing and,

194–195

tools for eliciting examples and requirements,

160–163

Fowler, Martin, 117

Framework for Integrated Test. See Fit (Framework for Integrated Test)

Frameworks, 90–93

ftptt, 234

Functional analysts, 386

Functional testing

compatibility issues and,

230

definition,

502–503

end-to-end tests,

249–250

layers,

246

nonfunctional tests compared with,

225

Remote Data Monitoring system example,

245–247

G

Galen, Bob, 455–456, 471

Gärtner, Markus, 395, 476

Geographically dispersed teams

coping with,

376–378

facilitating communication and,

431–432

Gheorghiu, Grig, 225–226, 234

Glover, Andrew, 166

Greenfield projects

code testing and,

116

definition,

502–503

GUI (graphical user interface)

automation strategy and,

293

code flux and,

269

standards,

227

GUI smoke tests

Canoo WebTest and,

300

continual builds and,

119

defect metrics,

437

GUI test tools, 170–176

Canoo WebTest,

174–175

“home-brewed” test automation tools,

175

open source test tools,

172

overview of,

170–171

record/playback tools,

171–172

Ruby with Watir,

172–174

Selenium,

174

GUI testing

API testing,

205–206

automating,

282–283, 295–296

automation test pyramid and,

278

GUI smoke tests,

119, 300, 437

overview of,

204

Web service testing,

207

H

Hagar, Jon, 198

Hardware

compatibility and,

229

cost of test environments,

487

functional testing and,

230

investing in automation and,

267

production environment and,

310

scalability and,

233

test infrastructure,

319

testing product installation,

462

Hendrickson, Elisabeth, 203, 315–316

High-level test cases, 397–402

mockups,

398–399

overview of,

397–398

reviewing with customers,

400

reviewing with programmers,

400–401

test cases as documentation,

402

Hiring a tester, 67–69

Holzer, Jason, 220, 448

Home-grown test tool

automation tools,

314

GUI test tools,

175

test results,

323

httperf, 234

Hudson, 126

I

IBM Rational ClearCase, 124

IDEs (Integrated Development Environments)

definition,

502–503

log analysis tools,

212

tools for Quadrant 1 tests,

124–126

“ility” testing

compatibility testing,

229–230

installability testing,

231–232

interoperability testing,

228–229

maintainability testing,

227–228

reliability testing,

230–231, 250–251

security testing,

223–227

Impact, system-wide, 342

Implementing Lean Software Development: From Concept to Cash (Poppendieck), 74, 416

Improvement

approach to process improvement,

448–449

continuous improvement principle,

27–28

ideas for improvement from retrospectives,

447–449

Incremental development

building tests incrementally,

178–179

as core practice,

488

“ilities” tests and,

232

thin slices, small chunks,

144–146

traditional vs. agile testing,

12–13

Index cards, logging bugs on, 423

Infrastructure

Quadrant 1 tests,

111–112

test infrastructure,

319

test plans and,

346–347

Installability testing, 231–232

Installation testing, 461–462

Integrated Development Environments. See IDEs (Integrated Development Environments)

Integration testing

interoperability and,

229

product and external applications,

459

IntelliJ IDEA, 125

Internal quality

measuring internal quality of code,

99

meeting team standards,

366

Quadrant 1 tests and,

111

speed and,

112

Interoperability testing, 228–229

Investment, automation requiring, 267–268

Iteration

automation strategy and,

299–300

definition,

502–503

demo,

443–444

life of a tester and,

327

pre-iteration activities. See Pre-iteration activities

prioritizing stories and,

338

review,

415, 435–437

traditional vs. agile testing,

12–13

Iteration kickoff, 383–403

collaboration with customers,

396–397

considering all viewpoints,

385–389

controlling workload,

393

high-level tests and examples,

397–402

iteration planning,

383–384

learning project details,

384–385

overview of,

383

testable stories,

393–396

writing task cards,

389–392

Iteration metrics, 435–440

defect metrics,

437–440

measuring progress with,

435–437

overview of,

435

usefulness of,

439–440

Iteration planning

considering all viewpoints,

385–389

controlling workload,

393

learning project details,

384–385

overview of,

383–384

writing task cards,

389–392

Iteration review meeting, 415

Iteration wrap up, 443–451

celebrating successes,

449–451

demo of iteration,

443–444

ideas for improvement,

447–449

retrospectives,

444–445

“start, stop, continue” exercise for retrospectives,

445–447

ITIL (Information Technology Infrastructure Library), 90–91

J

JBehave, 165

JConsole, 234

JMeter

performance baseline tests,

235

performance testing,

223, 234, 313

JMS (Java Messaging Service)

definition,

502–503

integration with external applications and,

243

testing data feeds and,

249

JProfiler, 234

JUnit

FitNesse as alternative for TDD,

299

functional testing,

176

load testing tools,

234–235

unit test tools,

126, 165, 291

JUnitPerf, 234

Just in time development, 369. See also Pre-iteration activities

K

Key success factors

agile testing mind-set,

482–483

automating regression testing,

484

big picture approach,

490–491

coding and testing as one process,

488–489

collaboration with customers,

489–490

continuous integration (CI),

486–487

feedback,

484–486

foundation of core practices,

486

incremental approach (thin slices, small chunks),

488

overview of,

481

synergy between practices,

489

technical debt management,

487–488

test environments,

487

whole team approach,

482

Keyword-driven tests, 182–183

King, Joseph, 176

Knowledge base, DTS, 80–81

Kohl, Jonathan, 201, 204, 211

König, Dierk, 320

L

Language, need for common, 134–135

Layered architecture, 116

Lean measurements, metrics, 74–75

Learning

automation strategy and,

303

continuous improvement principle,

27

Learning curve, automation and, 266–267, 303

Legacy code, 269

Legacy code rescue (Feathers), 117

Legacy systems

code,

269

definition,

502–503

logging bugs and,

421

testing,

117

Lessons Learned in Software Testing (Pettichord), 485

Lessons learned sessions, 383. See also Retrospectives

Lightweight processes, 73–74

Lightweight test plans, 350

Load testing. See Performance and load testing

LoadRunner, 234

LoadTest, 234

Logistics, physical, 65–66

LogWatch tool, 212

Loss of identity, QA teams fearing, 44–45

Louvion, Christophe, 63

M

Maintainability testing, 227–228

Management, 52–55

advance clarity and,

373–374

cultural change and,

52–54

overview of,

52

providing metrics to,

440

Managers

cultural changes for,

52–54

how to influence testing,

122–123

speaking manager's language,

55

Manns, Mary Lynn, 121–122

Manual testing

automation vs.,

258–259

peril of,

289

Marcano, Antony, 83, 426

Marick, Brian, 5, 24, 97, 134, 170, 203, 303

Martin, Micah, 169

Martin, Robert C., 169

Matrices

high-level tests and,

398–399

text matrices,

350–353

Maven, 126

McMahon, Chris, 260

Mean time between failure, reliability testing, 230

Mean time to failure, reliability testing, 230

Media, for logging bugs, 423–424

Meetings

demonstrations,

71, 192

geographically dispersed,

376

iteration kickoff,

372

iteration planning,

23–24, 244, 331, 384, 389

iteration review,

71, 415

pre-planning,

370–372

release planning,

338, 345

retrospective,

447

scheduling,

70

sizing process and,

336–337

standup,

177, 429, 462

team participation and,

32

test planning,

263

Memory leaks, 237–238

Memory management testing, 237–238

Meszaros, Gerald, 99, 111, 113, 138, 146, 182, 204, 291, 296, 430

Metrics, 74–79

code coverage,

360–364

communication of,

77–78

defect metrics,

364–366, 437–440

iteration metrics,

435–440

justifying investment in automation,

268

lean measurements,

74–75

overview of,

74

passing tests,

358–360

reasons for tracking defects,

52, 75–77, 82

release metrics,

358

ROI and,

78–79

what not to do with,

77

XP radar charts,

47–48

Milestones, celebrating successes, 449–450

MIME (Multipurpose Internet Mail Extensions)

definition,

504

testing data feeds and,

249

Mind maps, 156–158

Mind-set

agile testing as,

20–21

key success factors,

482–483

pro-active,

369–370

“Mini-waterfall” phenomenon, 46–47

Mock objects

definition,

504

risk alleviation and,

459

tools for implementing,

127

unit tests and,

114

Mock-ups

facilitating communication and,

430

high-level tests and,

398–399

stories and,

380

tools for eliciting examples and requirements,

160

Model-driven development, 398

Models

quality models,

90–93

UI modeling example,

399

Monitoring tools, 212–213, 235

Multi-layered approach, automation strategy, 290–292

Multipurpose Internet Mail Extensions (MIME)

definition,

504

testing data feeds and,

249

N

Naming conventions, 227

Nant, 126

Navigation, usability testing and, 204

NBehave, 165

NeoLoad, 234

Nessus, vulnerability scanner, 226

.NET Memory Profiler, 234

NetBeans, 125

NetScout, 235

Non-functional testing. See also Technology-facing tests, critiquing the product (Quadrant 4)

delivering product and,

458–459

functional testing compared with,

225

requirements,

218–219

when to perform,

222

North, Dan, 165

NSpec, 165

NUnit, 126, 165

O

Oleszkiewicz, Jakub, 418

One-off tests, 286–287

Open source tools

agile open source test tools,

172–175

automation and,

314–315

GUI test tools,

172

IDEs,

124–125

OpenWebLoad, 234

Operating systems (OSs), compatibility testing and, 230

Organizations, 37–44

challenges of agile development,

35

conflicting cultures,

43

customer relationships and,

41–42

overview of,

37–38

quality philosophy,

38–40

size and,

42–43

sustainable pace of testing and,

40–41

team empowerment,

44

OSs (operating systems), compatibility testing and, 230

Ownership, giving team ownership, 50

P

Packaging, product delivery and, 474–475

Pair programming

code review and,

227

developers trained in,

61

IDEs and,

125

team approach and,

244

Pair testing, 413

Passing tests, release metrics, 358–360

PerfMon, 235

Perforce, 124

Performance and load testing

automating,

283

baselines,

235–237

memory management testing,

237–238

overview of,

234

product delivery and,

458

scalability testing,

233–234

test environment,

237

tools for,

234–235

when to perform,

223

who performs the test,

220–221

Performance, rewards and, 70–71

Perils

forgetting the big picture,

148

quality police mentality,

39

the testing crunch,

416

waiting for Tuesday's build,

280

you're not really part of the team,

32

Perkins, Steve, 156, 159, 373

PerlClip

data generation tools,

305

tools for generating test data,

212

Persona testing, 202–204

Pettichord, Bret, 175, 264, 485

Phased and gated development, 73–74, 129

Physical logistics, 65–66

Planning

advance,

43

iteration. See Iteration planning

release/theme planning. See Release planning

testing. See Test planning

PMO (Project Management Office), 440

Pols, Andy, 134

Ports and Adapters pattern (Cockburn), 115

Post-development testing, 467–468

Post-iteration bugs, 421

Pounder, 234

Power of Three

business expert and,

482

finding a common language,

430

good communication and,

33, 490

problem solving and,

24

resolving differences in viewpoint,

401, 411

whole team approach and,

482

Pragmatic Project Automation, 260

Pre-iteration activities, 369–382

advance clarity,

373

benefits of working on stories in advance,

370–372

customers speaking with one voice,

373–374

determining story size,

375–376

evaluating amount of advance preparation needed,

372–373

examples,

378–380

gathering all viewpoints regarding requirements,

374–375

geographically dispersed team and,

376–378

overview of,

369

prioritizing defects,

381

pro-active mindset,

369–370

resources,

381

test strategies and,

380–381

Pre-planning meeting, 370–372

Principles, automation

agile coding practices,

303–304

iterative approach,

299–300

keep it simple,

298–299

learning by doing,

303

overview of,

298

taking time to do it right,

301–303

whole team approach,

300–301

Principles, for agile testers

continuous feedback,

22

continuous improvement,

27–28

courage,

25–26

delivering value to customer,

22–23

enjoyment,

31

face-to-face communication,

23–25

keeping it simple,

26–27

overview of,

21–22

people focus,

30

responsive to change,

28–29

self-organizing,

29–30

Prioritizing defects, 381

Prioritizing stories, 338–340

Pro-active mindset, 369–370

Product

business value,

31–33

delivery. See Delivering product

tests that critique (Q3 & Q4),

101–104

what makes a product,

453–455

Product owner

considering all viewpoints during iteration planning,

386–389

definition,

504

iteration planning and,

384

Scrum roles,

141, 373

tools geared to,

134

Production

logging bugs and,

421

support,

475

Production code

automation test pyramid and,

277–278

definition,

504

delivering value to,

70

programmers writing,

48

source code control and,

434

synchronization with testing,

322

test-first development and,

113

tests supporting,

303–304

Production-like data, automating databases and, 309–310

Professional development, 57

Profiling tools, 234

Programmers

attitude regarding automation,

265–266

big picture tests,

397

collaboration with,

413–414

considering all viewpoints during iteration planning,

387–389

facilitating communication and,

429–430

reviewing high-level tests with,

400–401

tester-developer ratio,

66–67

testers compared with,

4, 5

training,

61

writing task cards and,

391

Project Management Office (PMO), 440

Projects, PAS example, 176–177

Prototypes

accessible as common language,

134

mock-ups and,

160

paper,

22, 138–139, 380, 400, 414

paper vs. Wizard of Oz type,

275

UI (user interface),

107

Pulse, 126

PyUnit unit test tool for Python, 126

Q

QA (quality assurance)

definition,

504

in job titles,

31

independent QA team,

60

interchangeable with “test,”

59

whole team approach,

39

working on traditional teams,

9

Quadrant 1. See Technology-facing tests, supporting team (Quadrant 1)

Quadrant 2. See Business-facing tests, supporting team (Quadrant 2)

Quadrant 3. See Business-facing tests, critiquing the product (Quadrant 3)

Quadrant 4. See Technology-facing tests, critiquing the product (Quadrant 4)

Quadrants

automation test categories,

274–276

business facing (Q2 & Q3),

97–98

context-driven testing and,

106–108

critiquing the product (Q3 & Q4),

104

managing technical debt,

106

overview of,

97–98

as planning guide,

490

purpose of testing and,

97

Quadrant 1 summary,

99

Quadrant 2 summary,

99–100

Quadrant 3 summary,

101–102

Quadrant 4 summary,

102–104

shared responsibility and,

105–106

story completion and,

104–105

supporting the team (Q1 & Q2),

100–101

technology facing (Q1 & Q4),

97–98

Quality

customer role in setting quality standards,

26

models,

90–93

organizational philosophy regarding,

38–40

Quality assurance. See QA (quality assurance)

Quality police mentality, 57

Questions, for eliciting requirements, 135–136

R

Radar charts, XP, 47–48

Rasmusson, Jonathan, 11

Record/playback tools

automation strategy and,

294, 296–297

GUI test tools,

171–172

Recovery testing, 459

Redundancy tests, 232

Reed, David, 171, 377

Refactoring

definition,

504

IDEs supporting,

124–126

Regression suite, 434

Regression tests, 432–434

automated regression tests as a safety net,

261–262

automating as success factor,

484

checking big picture,

434

definition,

504

exploratory testing and,

212

keeping the build “green,”

433

keeping the build quick,

433–434

logging bugs and,

420

regression suite and,

434

release candidates and,

458

Release

acceptance criteria,

470–473

end game,

327, 456–457

management,

474

product delivery,

470

what if it is not ready,

463–464

Release candidates

challenging release candidate builds,

473

definition,

505

testing,

458

Release metrics

code coverage,

360–364

defect metrics,

364–366

overview of,

358

passing tests,

358–360

Release notes, 474

Release planning, 329–367

overview of,

329

prioritizing and,

338–340

purpose of,

330–331

scope,

340–344

sizing and,

332–337

test plan alternatives,

350–354

test planning,

345–350

visibility and,

354–366

Reliability testing

overview of,

230–231

Remote Data Monitoring system example,

250–251

Remote Data Monitoring system example

acceptance tests,

245

application,

242–243

applying test quadrants,

252–253

automated functional test structure,

245–247

documenting test code,

251

embedded testing,

248

end-to-end tests,

249–250

exploratory testing,

248–249

overview of,

242

reliability testing,

250–251

reporting test results,

251

team and process,

243–244

testing data feeds,

249

unit tests,

244–245

user acceptance testing,

250

web services,

247–248

Remote team member. See Geographically dispersed teams

Repetitive tasks, automating, 284

Reports

documentation and,

208–210

Remote Data Monitoring system example,

251

Repository, 124

Requirements

business-facing tests addressing,

130

documentation of,

402

gathering all viewpoints regarding requirements,

374–375

how to elicit,

135–140

nonfunctional,

218–219

quandary,

132–134

tools for eliciting examples and requirements,

155–156

Resources

completing stories and,

381

hiring agile tester,

67–69

overview of,

66

tester-developer ratio,

66–67

testing and,

434–435

Response time

API,

411

load testing and,

234–235

measurable goals and,

76

web services and,

207

Retrospectives

continuous improvement and,

28

ideas for improvement,

447–449

iteration planning and,

383

overview of,

444–445

process improvement and,

90

“start, stop, and continue” exercise,

445–447

Return on investment. See ROI (return on investment)

Rewards, performance and, 70–71

Rich-client unit testing tools, 127

Rising, Linda, 121–122

Risk

risk analysis,

198, 286, 290, 345–346

risk assessment,

407–409

test mitigating,

147–149

Rogers, Paul, 242, 310, 388, 398

ROI (return on investment)

automation and,

264

definition,

505

lean measurement and,

75

metrics and,

78–79

speaking manager's language,

55

Role, function, business value pattern, 155

Roles

conflicting or multiple roles,

45

cultural differences among,

48–49

customer team,

7

developer team,

7–8

interaction of,

8

RPGUnit, 118

RSpec, 165, 318

Ruby Test::Unit, 170

Ruby with Watir

functional testing,

247

GUI testing,

285

identifying defects with,

212

keywords or actions words for automating tests,

182

overview of,

172–174

test automation with,

186

RubyMock, 127

Rules, managing bugs and, 425

S

Safety tests, 232

Santos, Rafael, 448

Satisfaction conditions. See Conditions of satisfaction

Scalability testing, 233–234

Scenario testing, 192–193

flow diagrams and,

194–195

overview of,

192–195

soap opera tests,

193

Scope, 340–344

business-facing tests defining,

134

deadlines and timelines and,

340–341

focusing on value,

341–342

overview of,

340

system-wide impact,

342

test plans and,

345

third-party involvement and,

342–344

Scope creep, 385, 412

Scripts

automating comparisons,

283

as automation tools,

297

conversion scripts,

461

data generation tools,

305

exploratory testing and,

211–212

Scrum

product owner role,

141, 373

Remote Data Monitoring system example,

244

sprint reviews,

444

ScrumMaster

approach to process improvement,

448–449

sizing stories and,

336–337

writing task cards and,

391

SDD (story test-driven development)

identifying variations,

410

overview of,

262–263

test-first development and,

263

testing web services and,

170

Security testing

outside-in approach of attackers,

225

overview of,

223–227

specialized knowledge required for,

220

Selenium

GUI test tools,

174–175

implementing automation,

316–318

open source tools,

163

test automation with,

186, 316

Self-organization

principles,

29–30

self-organizing teams,

69

Session-based testing, 200–201

Setup

automating,

284–285

exploratory testing,

211–212

Shared resources

access to,

43

specialists as,

301

writing tasks and,

390

Shared responsibility, 105–106

Shout-Out Shoebox, 450

“Show me,” collaboration with programmers, 413–414

Simplicity

automation and,

298–299

coding,

406

logging bugs and,

428–429

principle of “keeping it simple,”

26–27

Simulator tools

embedded testing and,

248

overview of,

213

Size, organizational, 42–43

Sizing stories, 332–337

example of,

334–337

how to,

332–333

overview of,

332

tester’s role in,

333–334

Skills

adaptability and,

39–40

vs. attitude,

20

continuous improvement principle,

27

who performs tests and,

220–221

Small chunks, incremental development, 144–146

SOAP

definition,

505

performance tests and,

223, 234

Soap opera tests, 193

soapUI

definition,

505

performance tests and,

223, 234

testing Web Services,

170–171

SOATest, 234

Software-based tools, 163

Software Configuration Management Patterns: Effective Teamwork, Practical Integrations (Berczuk and Appleton), 124

Software Endgames (Galen), 471

Source code control

benefits of,

255

overview of,

123–124

tools for,

124, 320

SOX compliance, 469

Speak with one voice, customers, 373–374

Specialization, 220–221

Speed as a goal, 112

Spikes, development and test, 381

Spreadsheets

test spreadsheets,

353

tools for eliciting examples and requirements,

159

Sprint reviews, 444. See also Demos/demonstrations

SQL*Loader, 460

Stability testing, 28

Staging environment, 458

Stand-up meetings, 177, 429, 462

Standards

maintainability and,

227

quality models and,

90–93

“Start, stop, continue” exercise, retrospectives, 445–447

Static analysis, security testing tools, 225

Steel thread, incremental development, 144, 338, 345

Stories. See also Business-facing tests

benefits of working on in advance of iterations,

370–372

briefness of,

129–130

business-facing tests as,

130

determining story size,

375–376

focusing on one story when coding,

411–412

identifying variations,

410

knowing when a story is done,

104–105

logging bugs and,

420–421

mock-ups and,

380

prioritizing,

338–340

resources and,

381

scope and,

340

sizing. See Sizing stories

starting simple,

133, 406

story tests defined,

505

system-wide impact of,

342

test plans and,

345

test strategies and,

380–381

testable,

393–396

treating bugs as,

425

Story boards

burndown charts,

429

definition,

505–506

examples,

356–357

online,

357, 384

physical,

356

stickers and,

355

tasks,

222, 355, 436

virtual,

357, 384, 393

work in progress,

390

Story cards

audits and,

89

dealing with bugs and,

424–425

iteration planning and,

244

story narrative on,

409

Story test-driven development. See SDD (story test-driven development)

Strangler application (Fowler), 116–117

Strategy

automation. See Automation strategy

test planning vs. test strategy,

86–87

test strategies,

380–381

Strategy, for writing tests

building tests incrementally,

178–179

iteration planning and,

372

keep the tests passing,

179

overview of,

177–178

test design patterns,

179–183

testability and,

183–185

Stress testing. See Load testing

Subversion (SVN), 124, 320

Success factors. See Key success factors

Successes, celebrating

change implementation and,

50–52

iteration wrap up and,

449–451

Sumrell, Megan, 365, 450

Sustainable pace, of testing, 40–41, 303

SVN (Subversion), 124, 320

SWTBot GUI test tool, 127

Synergy, between practices, 489

System, system-wide impact of story, 342


T

tail-f, 212

Tartaglia, Coni, 439, 454, 470, 473

Task boards. See Story boards

Task cards

automating testing and,

394–395

iteration planning and,

389–392

product delivery and,

462–463

Tasks

completing testing tasks,

415–416

definition,

505–506

TDD (test-driven development)

automated tests driving,

262–263

defects and,

490

definition,

506

overview of,

5

Test-First Development compared with,

113–114

unit tests and,

111, 244–245

Team City, 126

Team structure, 59–65

agile project teams,

64–65

independent QA team,

60

integration of testers into agile project,

61–63

overview of,

59

traditional functional structure vs. agile structure,

64

Teams

automation as team effort,

484

building,

69–71

celebrating success,

50–52

co-located,

65–66

controlling workload and,

393

customer,

7

developer,

7–8

empowerment of,

44

facilitating communication and,

429–432

geographically dispersed,

376–378, 431–432

giving all team members equal weight,

31

giving ownership to,

50

hiring agile tester for,

67–69

interaction between customer and developer teams,

8

iteration planning and,

384–385

logistics,

59

problem solving and,

123

Remote Data Monitoring system example,

243–244

shared responsibility and,

105–106

traditional,

9–10

using tests to support Quadrants 1 and 2,

100–101

whole team approach. See Whole team approach

working on agile teams,

10–12

Teardown, for tests, 307–308

Technical debt

defects as,

418

definition,

506

managing,

106, 487–488

Technology-facing tests

overview of,

5

Quadrants 1 & 4,

97–98

Technology-facing tests, critiquing the product (Quadrant 4), 217–239

baselines,

235–237

coding and testing and,

412–413

compatibility testing,

229–230

installability testing,

231–232

interoperability testing,

228–229

maintainability testing,

227–228

memory management testing,

237–238

overview of,

217–219

performance and load testing,

234

performance and load testing tools,

234–235

reliability testing,

230–231, 250–251

scalability testing,

233–234

security testing,

223–227

test environment and,

237

when to use,

222–223

who performs the test,

220–222

Technology-facing tests, supporting team (Quadrant 1)

build tools,

126

designing with testing in mind,

115–118

ease of accomplishing tasks,

114–115

IDEs for,

124–126

infrastructure supporting,

111–112

overview of,

109–110

purpose of,

110–111

source code control,

123–124

speed as benefit of,

112–114

timely feedback,

118–119

toolkit for,

123

unit test tools,

126–127

unit tests,

244–245

what to do if team doesn’t perform these tests,

121–123

where/when to stop,

119–121

Test automation pyramid

multi-layered approach to automation and,

290–291

overview of,

276–279

three little pigs metaphor,

278

Test behind UI, 282

Test cases

adding complexity,

407

as documentation,

402

example-driven development,

379

identifying variations,

410

starting simple,

406

Test coverage (and/or code coverage), 360–364

Test design patterns, 179–183

Build/Operate/Check pattern,

180

data-driven and keyword-driven tests,

182–183

overview of,

179

test genesis patterns (Veragen),

179

time-based, activity, and event patterns,

181

Test doubles

definition,

506

layered architectures and,

116

Test-driven development. See TDD (test-driven development)

Test environments, 237, 487

Test-First Development

definition,

506

TDD (test-driven development) compared with,

113–114

Test management, 186

Test management toolkit (Quadrant 2), 186

Test plan alternatives, 350–354

Test planning, 345–350

automated test lists, test plan alternatives,

353–354

infrastructure and,

346–347

overview of,

86, 345

reasons for writing,

345–346

test environment and,

347–348

test plan alternatives,

350–354

test plans, lightweight,

350

test plan sample,

351

test strategy vs.,

86–88

traceability and,

88

types of tests and,

346

where to start,

345

Test results

communicating,

357–358

organizing,

322–324

release planning and,

349–350

Test skills. See Skills

Test spikes, 381

Test spreadsheets, 353

Test strategy

iterations, pre-iteration activities and,

380–381

test plan vs.,

86–88

Test stubs

definition,

506

integration with external applications and,

459

unit tests and,

127

Test teams, 506–507. See also Teams

Test tools. See also Toolkits

API-layer functional,

168–170

exploratory testing,

210–211

generating test data with,

212

GUI tests,

170–176

home-brewed,

175

home-grown,

314

IDEs,

124–126

performance testing,

234–235

security testing,

225

unit-level tests,

126–127, 165–168

web service tests,

170

Test types

alpha/beta,

466–467

exploratory. See Exploratory testing (ET)

functional. See Functional testing

GUI. See GUI testing

integration,

229, 459

load. See Load testing

performance. See Performance and load testing

reliability,

230–231, 250–251

security,

220, 223–227

stress. See Load testing

unit. See Unit testing

usability. See Usability testing

user acceptance testing. See UAT (user acceptance testing)

Test writing strategy. See Strategy, for writing tests

Testability, 183–185

automated vs. manual Quadrant 2 tests,

185

automation and,

149–150

code design and test design and,

184–185

overview of,

183

of stories,

393–396

Testers

adding value,

12

agile testers,

4, 19–20

agile testing mindset,

20–21

automation allowing focus on more important work,

260

collaboration with customers,

396–397

considering all viewpoints during iteration planning,

386–389

controlling workload and,

393

definition,

507

facilitating communication,

429–430

feedback and,

486

hiring agile tester,

67–69

how to influence testing,

121–122

integration of testers into agile project,

61–63

iterations and,

327

making job easier,

114–115

sizing stories,

333–334

tester-developer ratio,

66–67

writing task cards and,

391

Tester's bill of rights, 49–50

Testing

coding and testing simultaneously,

409–410

completing testing tasks,

415–416

identifying variations,

410

managing,

320–322

organizing test results,

322–324

organizing tests,

319–322

planning time for,

455–456

post-development cycles,

467–468

quadrants.

See

Quadrants

release candidates,

458

risk assessment and,

407–409

sustainable pace of,

40–41

traditional vs. agile,

12–15

transparency of tests,

321–322

Testing in context

context-driven testing and,

106–108

definition,

502

TestNG GUI test tool, 127

Tests that never fail, 286

Text matrices, 350–353

The Grinder, 234

Themes. See also Release planning

definition,

507

prioritizing stories and,

339

writing task cards and,

392

Thin slices, incremental development and, 338

Third parties

compatibility testing and,

230

release planning and,

342–344

software,

163

Tholfsen, Mike, 203

Thomas, Mike, 116, 194

Three little pigs metaphor, 278

Timelines, scope and, 340–341

Toolkit (Quadrant 1)

build tools,

126

IDEs,

124–126

overview of,

123

source code control,

123–124

unit test tools,

126–127

Toolkit (Quadrant 2)

API-layer functional test tools,

168–170

automation tools,

164–165

building tests incrementally,

178–179

checklists,

156

flow diagrams,

160–163

GUI test tools,

170–176

keep the tests passing,

179

mind maps,

156–158

mock-ups,

160

software-based tools,

163

spreadsheets,

159

strategies for writing tests,

177–178

test design patterns,

179–183

test management,

186

testability and,

183–185

tool strategy,

153–155

tools for eliciting examples and requirements,

155–156

unit-level test tools,

165–168

Web service test tool,

170

Toolkit (Quadrant 3)

emulator tools,

213–214

monitoring tools,

212–213

simulator tools,

213

user acceptance testing,

250

Toolkit (Quadrant 4)

baselines,

235–237

performance and load testing tools,

234–235

Tools

API-layer functional test tools,

168–170

automation,

164–165

data generation,

304–305

defect tracking,

83–85

eliciting examples and requirements,

155–156, 159–163

emulator tools,

213–214

exploratory testing,

210–211

generating test data,

212

GUI test tools,

170–176

home-brewed,

175

home-grown,

314

IDEs,

124–126

load testing,

234–235

monitoring,

212–213

open source,

172, 314–315

performance testing,

234–235

for product owners and business experts,

134

security testing,

225

simulators,

213

software-based,

163

unit-level tests,

126–127, 165–168

vendor/commercial,

315–316

web service test tool,

170

Tools, automation

agile-friendly,

316

applying one tool at a time,

312–313

home-brewed,

175

home-grown,

314

identifying tool requirements,

311–312

open source,

314–315

selecting,

294–298

vendors,

315–316

Traceability

DTS and,

82

matrices,

86

test planning and,

88

Tracking, test tasks and status, 354–357

Traditional processes, transitioning. See Transitioning traditional processes to agile

Traditional teams, 9–10

Traditional vs. agile testing, 12–15

Training

as deliverable,

469

lack of,

45

Transitioning traditional processes to agile, 73–93

defect tracking. See Defect tracking

existing process and,

88–92

lean measurements,

74–75

lightweight processes and,

73–74

metrics and,

74–79

overview of,

73

test planning. See Test planning

U

UAT (user acceptance testing)

post-development testing cycles,

467–468

product delivery and,

464–466

in Quadrant 3,

102

release planning for,

331, 346

Remote Data Monitoring system example,

250

in test plan,

351

trying out new features and,

102

writing at iteration kickoff meeting,

372

UI (user interface). See also GUI (graphical user interface)

automation strategy and,

293

modeling and,

399

Unit test tools, 165–168. See also by individual unit tools

behavior-driven development tools,

166–168

list of,

126–127

overview of,

165

Unit testing

automating,

282

BDD (Behavior-driven development),

165–168

definition,

507

metrics and,

76

supporting function of,

5

TDD (test-driven development) and,

111

technology-facing tests,

120

tools for Quadrant 1 tests,

126–127

Usability testing, 202–204

checking out applications of competitors,

204

navigation and,

204

overview of,

202

users needs and persona testing,

202–204

what should not be automated,

285–286

Use cases, 398

User acceptance testing. See UAT (user acceptance testing)

User documentation, 207–208

User interface (UI). See also GUI (graphical user interface)

automation strategy and,

293

modeling and,

399

User story. See Story

User story card. See Story card

User Stories Applied for Agile Software Development (Cohn), 155

V

Vaage, Carol, 330

Value

adding,

31–33

delivering to customer,

22–23

focusing on,

341–342

testers adding,

12

Values, agile, 3–4. See also Principles, for agile testers

Variations, coding and testing and, 410

Velocity

automation and,

255, 484

burnout rate and,

79

database impact on,

228

defects and,

487

definition,

507

maximizing,

370

sustainable pace of testing and,

41

taking time to do it right,

301

technical debt and,

106, 313, 418, 506

Vendors

automation tools,

315–316

capture-playback tool,

267

IDEs,

125

planning and,

342–344

source code control tools,

124

working with,

142, 349

Veragen, Pierre, 76, 163, 179, 295, 363, 372, 444

Version control, 123–124, 186. See also Source Code Control

Viewpoints. See also Big picture

considering all viewpoints during iteration planning,

385–389

gathering all viewpoints regarding requirements,

374–375

Power of Three and,

411

using multiple viewpoints in eliciting requirement,

137–138

Visibility, 354–366

code coverage,

360–364

communicating test results,

357–358

defect metrics,

364–366

number of passing tests,

358–360

overview of,

354

release metrics,

358

tracking test tasks and status,

354–357

Visual Studio, 125

Voris, John, 117

W

Waterfall approach, to development

agile development compared with,

12–13

“mini-waterfall” phenomenon,

46–47

successes of,

112

test plans and,

346

Watir (Web Application Testing in Ruby), 163, 172–174, 320. See also Ruby with Watir

Web Services Description Language (WSDL), 507

Web service testing

automating,

282

overview of,

207

Remote Data Monitoring system example,

247–248

tools for,

170–171

WebLoad, 234

Whelan, Declan, 321

Whiteboards

example-driven development,

379

facilitating communication,

430

modeling,

399

planning diagram,

371

reviewing high-level tests with programmers,

400–401

test plan alternatives,

353–354

Whole team approach, 325

advantages of,

26

agile vs. traditional development,

15–16

automation strategy and,

300–301

budget limits and,

55

finding enjoyment in work and,

31

key success factors,

482, 491

pairing testers with programmers,

279

shared responsibility and,

105–106

team building and,

69

team structure and,

59–62

to test automation,

270

test management and,

322

traditional cross-functional team compared with,

64

value of team members and,

70

Wiki

as communication tool,

164

graphical documentation of examples,

398–399

mockups,

160, 380

requirements,

402

story checklists and,

156

test cases,

372

traceability and,

88

Wilson-Welsh, Patrick, 278

Wizard of Oz Testing, 138–139

Workflow diagrams, 398

Working Effectively With Legacy Code (Feathers), 117, 288

Workload, 393

Worst-case scenarios, 136, 334

Writing tests, strategy for. See Strategy, for writing tests

WSDL (Web Services Description Language), 507

X

XP (Extreme Programming)

agile team embracing,

10–11

courage as core value in,

25

xUnit, 126–127

Y

Yakich, Joe, 316

Z

Zero bug tolerance, 79, 418–419

