Design Driven Testing: Test Smarter, Not Harder

Book description

The groundbreaking book Design Driven Testing brings sanity back to the software development process by turning the concept of Test Driven Development (TDD) around—restoring the practice of using tests to verify a design instead of pretending that unit tests are a replacement for design. Anyone who feels that TDD is "Too Damn Difficult" will appreciate this book.

Design Driven Testing shows that, by combining a forward-thinking development process with cutting-edge automation, testing can be a finely targeted, business-driven, rewarding effort. In other words, you'll learn how to test smarter, not harder.

  • Applies a feedback-driven approach to each stage of the project lifecycle.

  • Illustrates a lightweight and effective approach using a core subset of UML.

  • Follows a real-life example project using Java and Flex/ActionScript.

  • Presents bonus chapters for advanced DDTers covering unit-test antipatterns (and their opposite, "test-conscious" design patterns), and showing how to create your own test transformation templates in Enterprise Architect.

Table of contents

  1. Copyright
  2. Foreword
  3. About the Author
  4. About the Technical Reviewers
  5. Acknowledgments
  6. Prologue
  7. 1. DDT vs. TDD
    1. 1. Somebody Has It Backwards
      1. 1.1. Problems DDT Sets Out to Solve
        1. 1.1.1. Knowing When You're Done Is Hard
        2. 1.1.2. Leaving Testing Until Later Costs More
        3. 1.1.3. Testing Badly Designed Code Is Hard
        4. 1.1.4. It's Easy to Forget Customer-Level Tests
        5. 1.1.5. Developers Become Complacent
        6. 1.1.6. Tests Sometimes Lack Purpose
      2. 1.2. A Quick, Tools-Agnostic Overview of DDT
        1. 1.2.1. Structure of DDT
        2. 1.2.2. DDT in Action
      3. 1.3. How TDD and DDT Differ
      4. 1.4. Example Project: Introducing the Mapplet 2.0
      5. 1.5. Summary
    2. 2. TDD Using Hello World
      1. 2.1. Top Ten Characteristics of TDD
        1. 2.1.1. 10. Tests drive the design.
        2. 2.1.2. 9. There is a Total Dearth of Documentation.
        3. 2.1.3. 8. Everything is a unit test.
        4. 2.1.4. 7. TDD tests are not quite unit tests (or are they?).
        5. 2.1.5. 6. Acceptance tests provide feedback against the requirements.
        6. 2.1.6. 5. TDD lends confidence to make changes.
        7. 2.1.7. 4. Design emerges incrementally.
        8. 2.1.8. 3. Some up-front design is OK.
        9. 2.1.9. 2. TDD produces a lot of tests.
        10. 2.1.10. 1. TDD is Too Damn Difficult.
      2. 2.2. Login Implemented Using TDD
        1. 2.2.1. Understand the Requirement
        2. 2.2.2. Think About the Design
        3. 2.2.3. Write the First Test-First Test First
        4. 2.2.4. Write the Login Check Code to Make the Test Pass
        5. 2.2.5. Create a Mock Object
        6. 2.2.6. Refactor the Code to See the Design Emerge
      3. 2.3. Acceptance Testing with TDD
      4. 2.4. Conclusion: TDD = Too Damn Difficult
      5. 2.5. Summary
    3. 3. "Hello World!" Using DDT
      1. 3.1. Top Ten Features of ICONIX/DDT
        1. 3.1.1. 10. DDT Includes Business Requirement Tests
        2. 3.1.2. 9. DDT Includes Scenario Tests
        3. 3.1.3. 8. Tests Are Driven from Design
        4. 3.1.4. 7. DDT Includes Controller Tests
        5. 3.1.5. 6. DDT Tests Smarter, Not Harder
        6. 3.1.6. 5. DDT Unit Tests Are "Classical" Unit Tests
        7. 3.1.7. 4. DDT Test Cases Can Be Transformed into Test Code
        8. 3.1.8. 3. DDT Test Cases Lead to Test Plans
        9. 3.1.9. 2. DDT Tests Are Useful to Developers and QA Teams
        10. 3.1.10. 1. DDT Can Eliminate Redundant Effort
      2. 3.2. Login Implemented Using DDT
        1. 3.2.1. Step 1: Create a Robustness Diagram
        2. 3.2.2. Step 2: Create Controller Test Cases
        3. 3.2.3. Step 3: Add Scenarios
        4. 3.2.4. Step 4: Transform Controller Test Cases into Classes
        5. 3.2.5. Step 5: Generate Controller Test Code
        6. 3.2.6. Step 6: Draw a Sequence Diagram
        7. 3.2.7. Step 7: Create Unit Test Cases
        8. 3.2.8. Step 8: Fill in the Test Code
      3. 3.3. Summary
  8. 2. DDT in the Real World: Mapplet 2.0 Travel Web Site
    1. 4. Introducing the Mapplet Project
      1. 4.1. Top Ten ICONIX Process/DDT Best Practices
      2. 4.2. 10. Create an Architecture
      3. 4.3. 9. Agree on Requirements, and Test Against Them
      4. 4.4. 8. Drive Your Design from the Problem Domain
      5. 4.5. 7. Write Use Cases Against UI Storyboards
      6. 4.6. 6. Write Scenario Tests to Verify That the Use Cases Work
      7. 4.7. 5. Test Against Conceptual and Detailed Designs
      8. 4.8. 4. Update the Model Regularly
      9. 4.9. 3. Keep Test Scripts In-Sync with Requirements
      10. 4.10. 2. Keep Automated Tests Up to Date
      11. 4.11. 1. Compare the Release Candidate with Original Use Cases
      12. 4.12. Summary
    2. 5. Detailed Design and Unit Testing
      1. 5.1. Top Ten Unit Testing "To Do"s
        1. 5.1.1. 10. Start with a Sequence Diagram
        2. 5.1.2. 9. Identify Test Cases from Your Design
        3. 5.1.3. 8. Write Scenarios for Each Test Case
        4. 5.1.4. 7. Test Smarter: Avoid Overlapping Tests
        5. 5.1.5. 6. Transform Your Test Cases into UML Classes
        6. 5.1.6. 5. Write Unit Tests and Accompanying Code
          1. 5.1.6.1. Writing the "No Hotels" Test
          2. 5.1.6.2. Implementing SearchHotelService
        7. 5.1.7. 4. Write White Box Unit Tests
          1. 5.1.7.1. Implement a Stunt Service
          2. 5.1.7.2. Update the Test Code to Use the Stunt Service
        8. 5.1.8. 3. Use a Mock Object Framework
          1. 5.1.8.1. The Stunt Service Approach
          2. 5.1.8.2. The Mock Object Framework Approach
        9. 5.1.9. 2. Test Algorithmic Logic with Unit Tests
        10. 5.1.10. 1. Write a Separate Suite of Integration Tests
      2. 5.2. Summary
    3. 6. Conceptual Design and Controller Testing
      1. 6.1. Top Ten Controller Testing "To-Do" List
        1. 6.1.1. 10. Start with a Robustness Diagram
          1. 6.1.1.1. The Use Case
          2. 6.1.1.2. Conceptual Design from Which to Drive Controller Tests
        2. 6.1.2. 9. Identify Test Cases from Your Controllers
        3. 6.1.3. 8. Define One or More Scenarios per Test Case
          1. 6.1.3.1. Understanding Test Scenarios
          2. 6.1.3.2. Identifying the Input Values for a Test Scenario
          3. 6.1.3.3. Using EA to Create Test Scenarios
        4. 6.1.4. 7. Fill in Description, Input, and Acceptance Criteria
        5. 6.1.5. 6. Generate Test Classes
          1. 6.1.5.1. Before Generating Your Tests
          2. 6.1.5.2. Generating the Tests
        6. 6.1.6. 5. Implement the Tests
        7. 6.1.7. 4. Write Code That's Easy to Test
        8. 6.1.8. 3. Write "Gray Box" Controller Tests
        9. 6.1.9. 2. String Controller Tests Together
        10. 6.1.10. 1. Write a Separate Suite of Integration Tests
      2. 6.2. Summary
    4. 7. Acceptance Testing: Expanding Use Case Scenarios
      1. 7.1. Top Ten Scenario Testing "To-Do" List
      2. 7.2. Mapplet Use Cases
        1. 7.2.1. 10. Start with a Narrative Use Case
        2. 7.2.2. 9. Transform to a Structured Scenario
        3. 7.2.3. 8. Make Sure All Paths Have Steps
        4. 7.2.4. 7. Add Pre-conditions and Post-conditions
        5. 7.2.5. 6. Generate an Activity Diagram
        6. 7.2.6. 5. Expand "Threads" Using "Create External Tests"
        7. 7.2.7. 4. Put the Test Case on a Test Case Diagram
        8. 7.2.8. 3. Drill into the EA Testing View
        9. 7.2.9. 2. Add Detail to the Test Scenarios
        10. 7.2.10. 1. Generate a Test Plan Document
      3. 7.3. And the Moral of the Story Is . . .
      4. 7.4. Summary
    5. 8. Acceptance Testing: Business Requirements
      1. 8.1. Top Ten Requirements Testing "To-Do" List
        1. 8.1.1. 10. Start with a Domain Model
        2. 8.1.2. 9. Write Business Requirement Tests
        3. 8.1.3. 8. Model and Organize Requirements
        4. 8.1.4. 7. Create Test Cases from Requirements
        5. 8.1.5. 6. Review Your Plan with the Customer
        6. 8.1.6. 5. Write Manual Test Scripts
        7. 8.1.7. 4. Write Automated Requirements Tests
        8. 8.1.8. 3. Export the Test Cases
        9. 8.1.9. 2. Make the Test Cases Visible
        10. 8.1.10. 1. Involve Your Team!
      2. 8.2. Summary
  9. 3. Advanced DDT
    1. 9. Unit Testing Antipatterns (The "Don'ts")
      1. 9.1. The Temple of Doom (aka The Code)
        1. 9.1.1. The Big Picture
        2. 9.1.2. The HotelPriceCalculator Class
        3. 9.1.3. Supporting Classes
        4. 9.1.4. Service Classes
      2. 9.2. The Antipatterns
        1. 9.2.1. 10. The Complex Constructor
        2. 9.2.2. 9. The Stratospheric Class Hierarchy
        3. 9.2.3. 8. The Static Hair-Trigger
        4. 9.2.4. 7. Static Methods and Variables
        5. 9.2.5. 6. The Singleton Design Pattern
        6. 9.2.6. 5. The Tightly Bound Dependency
        7. 9.2.7. 4. Business Logic in the UI Code
        8. 9.2.8. 3. Privates on Parade
        9. 9.2.9. 2. Service Objects That Are Declared Final
        10. 9.2.10. 1. Half-Baked Features from the Good Deed Coder
      3. 9.3. Summary
    2. 10. Design for Easier Testing
      1. 10.1. Top Ten "Design for Testing" To-Do List
      2. 10.2. The Temple of Doom—Thoroughly Expurgated
        1. 10.2.1. The Use Case—Figuring Out What We Want to Do
        2. 10.2.2. Identify the Controller Tests
        3. 10.2.3. Calculate Overall Price Test
        4. 10.2.4. Retrieve Latest Price Test
      3. 10.3. Design for Easier Testing
        1. 10.3.1. 10. Keep Initialization Code Out of the Constructor
        2. 10.3.2. 9. Use Inheritance Sparingly
        3. 10.3.3. 8. Avoid Using Static Initializer Blocks
        4. 10.3.4. 7. Use Object-Level Methods and Variables
        5. 10.3.5. 6. Avoid the Singleton Design Pattern
        6. 10.3.6. 5. Keep Your Classes Decoupled
        7. 10.3.7. 4. Keep Business Logic Out of the UI Code
        8. 10.3.8. 3. Use Black Box and Gray Box Testing
        9. 10.3.9. 2. Reserve the "Final" Modifier for Constants—Generally Avoid Marking Complex Types Such as Service Objects as Final
        10. 10.3.10. 1. Stick to the Use Cases and the Design
      4. 10.4. Detailed Design for the Quote Hotel Price Use Case
        1. 10.4.1. Controller Test: Calculate Overall Price
        2. 10.4.2. Controller Test: Retrieve Latest Price Test
        3. 10.4.3. The Rebooted Design and Code
      5. 10.5. Summary
    3. 11. Automated Integration Testing
      1. 11.1. Top-Ten Integration Testing "To-Do" List
      2. 11.2. 10. Look for Test Patterns in Your Conceptual Design
      3. 11.3. 9. Don't Forget Security Tests
        1. 11.3.1. Security Testing: SQL Injection Attacks
        2. 11.3.2. Security Testing: Set Up Secure Sessions
      4. 11.4. 8. Decide the "Level" of Integration Test to Write
        1. 11.4.1. How the Three Levels Differ
        2. 11.4.2. Knowing Which Level of Integration Test to Write
      5. 11.5. 7. Drive Unit/Controller-Level Tests from Conceptual Design
      6. 11.6. 6. Drive Scenario Tests from Use Case Scenarios
      7. 11.7. 5. Write End-to-End Scenario Tests
        1. 11.7.1. Emulating the Steps in a Scenario
        2. 11.7.2. Sharing a Test Database
        3. 11.7.3. Mapplet Example: The "Advanced Search" Use Case
        4. 11.7.4. A Vanilla xUnit Scenario Test
      8. 11.8. 4. Use a "Business-Friendly" Testing Framework
      9. 11.9. 3. Test GUI Code as Part of Your Scenario Tests
      10. 11.10. 2. Don't Underestimate the Difficulty of Integration Testing
        1. 11.10.1. Network Latency
        2. 11.10.2. Database Metadata Changes
        3. 11.10.3. Randomly Mutating (aka "Agile") Interfaces
        4. 11.10.4. Bugs in the Remote System
        5. 11.10.5. Cloudy Days
      11. 11.11. 1. Don't Underestimate the Value of Integration Tests
      12. 11.12. Key Points When Writing Integration Tests
      13. 11.13. Summary
    4. 12. Unit Testing Algorithms
      1. 12.1. Top Ten Algorithm Testing "To-Do"s
        1. 12.1.1. 10. Start with a Controller from the Conceptual Design
        2. 12.1.2. 9. Expand the Controllers into an Algorithm Design
        3. 12.1.3. 8. Tie the Diagram Loosely to Your Domain Model
        4. 12.1.4. 7. Split Up Decision Nodes Involving More Than One Check
        5. 12.1.5. 6. Create a Test Case for Each Node
        6. 12.1.6. 5. Define Test Scenarios for Each Test Case
        7. 12.1.7. 4. Create Input Data from a Variety of Sources
        8. 12.1.8. 3. Assign the Logic Flow to Individual Methods and Classes
        9. 12.1.9. 2. Write "White Box" Unit Tests
          1. 12.1.9.1. Testing the "At least one candidate returned" Decision Node
          2. 12.1.9.2. Testing the "Exactly one candidate or one is a 100% match" Decision Node
          3. 12.1.9.3. Send in the Spy Object
            1. 12.1.9.3.1. Spy Object 001: SpyList
            2. 12.1.9.3.2. Spy Object 002: SpyAddressCandidate
          4. 12.1.9.4. Break the Code into Smaller Methods
        10. 12.1.10. 1. Apply DDT to Other Design Diagrams
      2. 12.2. Summary
    5. A. Alice in Use-Case Land
      1. A.1. It's Not as Surreal as You Might Think . . .
      2. A.2. Introduction
      3. A.3. Part 1
        1. A.3.1. Alice Falls Asleep While Reading
        2. A.3.2. The Promise of Use Case Driven Development
        3. A.3.3. An Analysis Model Links Use-Case Text with Objects
        4. A.3.4. Simple and Straightforward
        5. A.3.5. <<includes>> or <<extends>>
        6. A.3.6. We're Late! We Have to Start Coding!
        7. A.3.7. Alice Wonders How to Get from Use Cases to Code
        8. A.3.8. Abstract... Essential
        9. A.3.9. A Little Too Abstract?
        10. A.3.10. Teleocentricity...
        11. A.3.11. Are We Really Supposed to Specify All This for Every Use Case?
      4. A.4. Part 2
        1. A.4.1. Alice Gets Thirsty
        2. A.4.2. Alice Feels Faint
        3. A.4.3. Imagine... (with Apologies to John Lennon)
        4. A.4.4. Pair Programming Means Never Writing Down Requirements
        5. A.4.5. There's No Time to Write Down Requirements
        6. A.4.6. You Might As Well Say, "The Code Is the Design"
        7. A.4.7. Who Cares for Use Cases?
        8. A.4.8. C3 Project Terminated
        9. A.4.9. OnceAndOnlyOnce?
        10. A.4.10. Alice Refuses to Start Coding Without Written Requirements
        11. A.4.11. You Are Guilty of BDUF...
        12. A.4.12. CMM's Dead! Off with Her Head!
        13. A.4.13. Some Serious Refactoring of the Design
      5. A.5. Part 3
        1. A.5.1. Alice Wakes Up
        2. A.5.2. Closing the Gap Between "What" and "How"
        3. A.5.3. Static and Dynamic Models Are Linked Together
        4. A.5.4. Behavior Allocation Happens on Sequence Diagrams
        5. A.5.5. And the Moral of That Is...
    6. 'Twas Brillig and the Slithy Tests. . .

Product information

  • Title: Design Driven Testing: Test Smarter, Not Harder
  • Author(s): Matt Stephens, Doug Rosenberg
  • Release date: September 2010
  • Publisher(s): Apress
  • ISBN: 9781430229438