Software Testing and Continuous Quality Improvement, 3rd Edition

Book description

This third edition of a bestseller provides a comprehensive look at software testing as part of the project management process, emphasizing testing and quality goals early in development. Building on the success of previous editions, the text explains how compliance testing helps an IT organization meet Sarbanes–Oxley and Basel II requirements. The sections on test effort estimation have been fully updated, with greater emphasis on testing metrics. New chapters address process, application, and organizational metrics. The book also examines all aspects of functional testing and looks at the relationship between changing business strategies and changes to applications in development.

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright Page
  5. Table of Contents
  6. Acknowledgments
  7. Introduction
  8. About the Author
  9. SECTION 1 SOFTWARE QUALITY IN PERSPECTIVE
    1. 1 A Brief History of Software Testing
      1. Historical Software Testing and Development Parallels
      2. Extreme Programming
      3. Evolution of Automated Testing Tools
        1. Static Capture/Replay Tools (without Scripting Language)
        2. Static Capture/Replay Tools (with Scripting Language)
        3. Variable Capture/Replay Tools
    2. 2 Quality Assurance Framework
      1. What Is Quality?
      2. Prevention versus Detection
      3. Verification versus Validation
      4. Software Quality Assurance
      5. Components of Quality Assurance
        1. Software Testing
      6. Quality Control
        1. Software Configuration Management
          1. Elements of Software Configuration Management
      7. Software Quality Assurance Plan
        1. Steps to Develop and Implement a Software Quality Assurance Plan
          1. Step 1: Document the Plan
          2. Step 2: Obtain Management Acceptance
          3. Step 3: Obtain Development Acceptance
          4. Step 4: Plan for Implementation of the SQA Plan
          5. Step 5: Execute the SQA Plan
      8. Quality Standards
        1. Sarbanes–Oxley
        2. ISO9000
        3. Capability Maturity Model (CMM)
          1. Level 1: Initial
          2. Level 2: Repeatable
          3. Level 3: Defined
          4. Level 4: Managed
          5. Level 5: Optimized
        4. People CMM
        5. CMMI
        6. Malcolm Baldrige National Quality Award
      9. Notes
    3. 3 Overview of Testing Techniques
      1. Black-Box Testing (Functional)
      2. White-Box Testing (Structural)
      3. Gray-Box Testing (Functional and Structural)
      4. Manual versus Automated Testing
      5. Static versus Dynamic Testing
      6. Taxonomy of Software Testing Techniques
    4. 4 Transforming Requirements to Testable Test Cases
      1. Introduction
      2. Software Requirements as the Basis of Testing
      3. Requirement Quality Factors
        1. Understandable
        2. Necessary
        3. Modifiable
        4. Nonredundant
        5. Terse
        6. Testable
        7. Traceable
        8. Within Scope
      4. Numerical Method for Evaluating Requirement Quality
      5. Process for Creating Test Cases from Good Requirements
        1. Step 1: Review the Requirements
        2. Step 2: Write a Test Plan
        3. Step 3: Identify the Test Suite
        4. Step 4: Name the Test Cases
        5. Step 5: Write Test Case Descriptions and Objectives
        6. Step 6: Create the Test Cases
        7. Step 7: Review the Test Cases
      6. Transforming Use Cases to Test Cases
        1. Step 1: Draw a Use Case Diagram
        2. Step 2: Write the Detailed Use Case Text
        3. Step 3: Identify Use Case Scenarios
        4. Step 4: Generating the Test Cases
        5. Step 5: Generating Test Data
        6. Summary
      7. What to Do When Requirements Are Nonexistent or Poor?
        1. Ad Hoc Testing
          1. The Art of Ad Hoc Testing
        2. Advantages and Disadvantages of Ad Hoc Testing
        3. Exploratory Testing
          1. The Art of Exploratory Testing
          2. Exploratory Testing Process
          3. Advantages and Disadvantages of Exploratory Testing
    5. 5 Quality through Continuous Improvement Process
      1. Contribution of W. Edwards Deming
      2. Role of Statistical Methods
        1. Cause-and-Effect Diagram
        2. Flowchart
        3. Pareto Chart
        4. Run Chart
        5. Histogram
        6. Scatter Diagram
        7. Control Chart
      3. Deming’s 14 Quality Principles
        1. Point 1: Create Constancy of Purpose
        2. Point 2: Adopt the New Philosophy
        3. Point 3: Cease Dependence on Mass Inspection
        4. Point 4: End the Practice of Awarding Business on Price Tag Alone
        5. Point 5: Improve Constantly and Ceaselessly the System of Production and Service
        6. Point 6: Institute Training and Retraining
        7. Point 7: Institute Leadership
        8. Point 8: Drive Out Fear
        9. Point 9: Break Down Barriers between Staff Areas
        10. Point 10: Eliminate Slogans, Exhortations, and Targets for the Workforce
        11. Point 11: Eliminate Numerical Goals
        12. Point 12: Remove Barriers to Pride of Workmanship
        13. Point 13: Institute a Vigorous Program of Education and Retraining
        14. Point 14: Take Action to Accomplish the Transformation
      4. Continuous Improvement through the Plan, Do, Check, Act Process
      5. Going around the PDCA Circle
  10. SECTION 2 WATERFALL TESTING REVIEW
    1. 6 Overview
      1. Waterfall Development Methodology
      2. Continuous Improvement “Phased” Approach
      3. Psychology of Life-Cycle Testing
      4. Software Testing as a Continuous Improvement Process
      5. The Testing Bible: Software Test Plan
      6. Major Steps in Developing a Test Plan
        1. Step 1: Define the Test Objectives
        2. Step 2: Develop the Test Approach
        3. Step 3: Define the Test Environment
        4. Step 4: Develop the Test Specifications
        5. Step 5: Schedule the Test
        6. Step 6: Review and Approve the Test Plan
      7. Components of a Test Plan
      8. Technical Reviews as a Continuous Improvement Process
      9. Motivation for Technical Reviews
      10. Types of Reviews
        1. Structured Walkthroughs
        2. Inspections
      11. Participant Roles
      12. Steps for an Effective Review
        1. Step 1: Plan for the Review Process
        2. Step 2: Schedule the Review
        3. Step 3: Develop the Review Agenda
        4. Step 4: Create a Review Report
    2. 7 Static Testing the Requirements
      1. Testing the Requirements with Ambiguity Reviews
      2. Testing the Requirements with Technical Reviews
      3. Inspections and Walkthroughs
      4. Checklists
        1. Methodology Checklist
      5. Requirements Traceability Matrix
      6. Building the System/Acceptance Test Plan
    3. 8 Static Testing the Logical Design
      1. Data Model, Process Model, and the Linkage
      2. Testing the Logical Design with Technical Reviews
      3. Refining the System/Acceptance Test Plan
    4. 9 Static Testing the Physical Design
      1. Testing the Physical Design with Technical Reviews
      2. Creating Integration Test Cases
      3. Methodology for Integration Testing
        1. Step 1: Identify Unit Interfaces
        2. Step 2: Reconcile Interfaces for Completeness
        3. Step 3: Create Integration Test Conditions
        4. Step 4: Evaluate the Completeness of Integration Test Conditions
    5. 10 Static Testing the Program Unit Design
      1. Testing the Program Unit Design with Technical Reviews
        1. Sequence
        2. Selection
        3. Iteration
      2. Creating Unit Test Cases
    6. 11 Static Testing and Dynamic Testing the Code
      1. Testing Coding with Technical Reviews
      2. Executing the Test Plan
      3. Unit Testing
      4. Integration Testing
      5. System Testing
      6. Acceptance Testing
      7. Defect Recording
  11. SECTION 3 SPIRAL (AGILE) SOFTWARE TESTING METHODOLOGY: PLAN, DO, CHECK, ACT
    1. 12 Development Methodology Overview
      1. Limitations of Life-Cycle Development
      2. The Client/Server Challenge
      3. Psychology of Client/Server Spiral Testing
        1. The New School of Thought
        2. Tester/Developer Perceptions
        3. Project Goal: Integrate QA and Development
        4. Iterative/Spiral Development Methodology
      4. Role of JADs
      5. Role of Prototyping
      6. Methodology for Developing Prototypes
        1. Step 1: Develop the Prototype
        2. Step 2: Demonstrate Prototypes to Management
        3. Step 3: Demonstrate Prototype to Users
        4. Step 4: Revise and Finalize Specifications
        5. Step 5: Develop the Production System
      7. Continuous Improvement “Spiral” Testing Approach
    2. 13 Information Gathering (Plan)
      1. Step 1: Prepare for the Interview
        1. Task 1: Identify the Participants
        2. Task 2: Define the Agenda
      2. Step 2: Conduct the Interview
        1. Task 1: Understand the Project
        2. Task 2: Understand the Project Objectives
        3. Task 3: Understand the Project Status
        4. Task 4: Understand the Project Plans
        5. Task 5: Understand the Project Development Methodology
        6. Task 6: Identify the High-Level Business Requirements
        7. Task 7: Perform Risk Analysis
          1. Computer Risk Analysis
          2. Method 1: Judgment and Instinct
          3. Method 2: Dollar Estimation
          4. Method 3: Identifying and Weighting Risk Attributes
      3. Step 3: Summarize the Findings
        1. Task 1: Summarize the Interview
        2. Task 2: Confirm the Interview Findings
    3. 14 Test Planning (Plan)
      1. Step 1: Build a Test Plan
        1. Task 1: Prepare an Introduction
        2. Task 2: Define the High-Level Functional Requirements (in Scope)
        3. Task 3: Identify Manual/Automated Test Types
        4. Task 4: Identify the Test Exit Criteria
        5. Task 5: Establish Regression Test Strategy
        6. Task 6: Define the Test Deliverables
        7. Task 7: Organize the Test Team
        8. Task 8: Establish a Test Environment
        9. Task 9: Define the Dependencies
        10. Task 10: Create a Test Schedule
        11. Task 11: Select the Test Tools
        12. Task 12: Establish Defect Recording/Tracking Procedures
        13. Task 13: Establish Change Request Procedures
        14. Task 14: Establish Version Control Procedures
        15. Task 15: Define Configuration Build Procedures
        16. Task 16: Define Project Issue Resolution Procedures
        17. Task 17: Establish Reporting Procedures
        18. Task 18: Define Approval Procedures
      2. Step 2: Define the Metric Objectives
        1. Task 1: Define the Metrics
        2. Task 2: Define the Metric Points
      3. Step 3: Review/Approve the Plan
        1. Task 1: Schedule/Conduct the Review
        2. Task 2: Obtain Approvals
    4. 15 Test Case Design (Do)
      1. Step 1: Design Function Tests
        1. Task 1: Refine the Functional Test Requirements
        2. Task 2: Build a Function/Test Matrix
      2. Step 2: Design GUI Tests
        1. Ten Guidelines for Good GUI Design
        2. Task 1: Identify the Application GUI Components
        3. Task 2: Define the GUI Tests
      3. Step 3: Define the System/Acceptance Tests
        1. Task 1: Identify Potential System Tests
        2. Task 2: Design System Fragment Tests
        3. Task 3: Identify Potential Acceptance Tests
      4. Step 4: Review/Approve Design
        1. Task 1: Schedule/Prepare for Review
        2. Task 2: Obtain Approvals
    5. 16 Test Development (Do)
      1. Step 1: Develop Test Scripts
        1. Task 1: Script the Manual/Automated GUI/Function Tests
        2. Task 2: Script the Manual/Automated System Fragment Tests
      2. Step 2: Review/Approve Test Development
        1. Task 1: Schedule/Prepare for Review
        2. Task 2: Obtain Approvals
    6. 17 Test Coverage through Traceability
      1. Use Cases and Traceability
      2. Summary
    7. 18 Test Execution/Evaluation (Do/Check)
      1. Step 1: Setup and Testing
        1. Task 1: Regression Test the Manual/Automated Spiral Fixes
        2. Task 2: Execute the Manual/Automated New Spiral Tests
        3. Task 3: Document the Spiral Test Defects
      2. Step 2: Evaluation
        1. Task 1: Analyze the Metrics
      3. Step 3: Publish Interim Report
        1. Task 1: Refine the Test Schedule
        2. Task 2: Identify Requirement Changes
    8. 19 Prepare for the Next Spiral (Act)
      1. Step 1: Refine the Tests
        1. Task 1: Update the Function/GUI Tests
        2. Task 2: Update the System Fragment Tests
        3. Task 3: Update the Acceptance Tests
      2. Step 2: Reassess the Team, Procedures, and Test Environment
        1. Task 1: Evaluate the Test Team
        2. Task 2: Review the Test Control Procedures
        3. Task 3: Update the Test Environment
      3. Step 3: Publish Interim Test Report
        1. Task 1: Publish the Metric Graphics
          1. Test Case Execution Status
          2. Defect Gap Analysis
          3. Defect Severity Status
          4. Test Burnout Tracking
    9. 20 Conduct the System Test (Act)
      1. Step 1: Complete System Test Plan
        1. Task 1: Finalize the System Test Types
        2. Task 2: Finalize System Test Schedule
        3. Task 3: Organize the System Test Team
        4. Task 4: Establish the System Test Environment
        5. Task 5: Install the System Test Tools
      2. Step 2: Complete System Test Cases
        1. Task 1: Design/Script the Performance Tests
        2. Monitoring Approach
        3. Probe Approach
        4. Test Drivers
        5. Task 2: Design/Script the Security Tests
          1. A Security Design Strategy
        6. Task 3: Design/Script the Volume Tests
        7. Task 4: Design/Script the Stress Tests
        8. Task 5: Design/Script the Compatibility Tests
        9. Task 6: Design/Script the Conversion Tests
        10. Task 7: Design/Script the Usability Tests
        11. Task 8: Design/Script the Documentation Tests
        12. Task 9: Design/Script the Backup Tests
        13. Task 10: Design/Script the Recovery Tests
        14. Task 11: Design/Script the Installation Tests
        15. Task 12: Design/Script Other System Test Types
      3. Step 3: Review/Approve System Tests
        1. Task 1: Schedule/Conduct the Review
        2. Task 2: Obtain Approvals
      4. Step 4: Execute the System Tests
        1. Task 1: Regression Test the System Fixes
        2. Task 2: Execute the New System Tests
        3. Task 3: Document the System Defects
    10. 21 Conduct Acceptance Testing
      1. Step 1: Complete Acceptance Test Planning
        1. Task 1: Finalize the Acceptance Test Types
        2. Task 2: Finalize the Acceptance Test Schedule
        3. Task 3: Organize the Acceptance Test Team
        4. Task 4: Establish the Acceptance Test Environment
        5. Task 5: Install Acceptance Test Tools
      2. Step 2: Complete Acceptance Test Cases
        1. Task 1: Identify the System-Level Test Cases
        2. Task 2: Design/Script Additional Acceptance Tests
      3. Step 3: Review/Approve Acceptance Test Plan
        1. Task 1: Schedule/Conduct the Review
        2. Task 2: Obtain Approvals
      4. Step 4: Execute the Acceptance Tests
        1. Task 1: Regression Test the Acceptance Fixes
        2. Task 2: Execute the New Acceptance Tests
        3. Task 3: Document the Acceptance Defects
    11. 22 Summarize/Report Test Results
      1. Step 1: Perform Data Reduction
        1. Task 1: Ensure All Tests Were Executed/Resolved
        2. Task 2: Consolidate Test Defects by Test Number
        3. Task 3: Post Remaining Defects to a Matrix
      2. Step 2: Prepare Final Test Report
        1. Task 1: Prepare the Project Overview
        2. Task 2: Summarize the Test Activities
        3. Task 3: Analyze/Create Metric Graphics
          1. Defects by Function
          2. Defects by Tester
          3. Defect Gap Analysis
          4. Defect Severity Status
          5. Test Burnout Tracking
          6. Root Cause Analysis
          7. Defects by How Found
          8. Defects by Who Found
          9. Functions Tested and Not Tested
          10. System Testing Defect Types
          11. Acceptance Testing Defect Types
        4. Task 4: Develop Findings/Recommendations
      3. Step 3: Review/Approve the Final Test Report
        1. Task 1: Schedule/Conduct the Review
        2. Task 2: Obtain Approvals
        3. Task 3: Publish the Final Test Report
  12. SECTION 4 PROJECT MANAGEMENT METHODOLOGY
    1. 23 The Project Management Framework
      1. The Project Framework
      2. Product Quality and Project Quality
      3. Components of the Project Framework
      4. The Project Framework and Continuous Quality Improvement
      5. The Project Framework Phases
        1. Initiation Phase
        2. Planning Phase
        3. Executing, Monitoring, and Controlling Phases
        4. Implement Phase
      6. Scoping the Project to Ensure Product Quality
      7. Product Scope and Project Scope
      8. The Project Charter
      9. The Scope Statement
      10. The Role of the Project Manager in Quality Management
      11. The Role of the Test Manager in Quality Management
        1. Analyze the Requirements
        2. Perform a Gap Analysis
        3. Avoid Duplication and Repetition
        4. Define the Test Data
        5. Validate the Test Environment
        6. Analyze the Test Results
        7. Deliver the Quality
      12. Advice for the Test Manager
        1. Request Help from Others
        2. Communicate Issues as They Arise
        3. Always Update Your Business Knowledge
        4. Learn the New Testing Technologies and Tools
        5. Improve the Process
        6. Create a Knowledge Base
      13. The Benefits of Quality Project Management and the Project Framework
    2. 24 Project Quality Management
      1. Project Quality Management Processes
      2. Quality Planning
      3. Identifying the High-Level Project Activities
      4. Estimating the Test Work Effort
      5. Test Planning
      6. Effort Estimation: Model Project
      7. Quality Standards
    3. 25 The Defect Management Process
      1. Quality Control and Defect Management
      2. Defect Discovery and Classification
      3. Defect Priority
      4. Defect Category
      5. Defect Tracking
        1. Defect Reporting
      6. Defect Summary
      7. Defect Meetings
      8. Defect Metrics
      9. Quality Standards
    4. 26 Integrated Testing and Development
      1. Quality Control and Integrated Testing
      2. Integrated Testing
      3. Step 1: Organize the Test Team
      4. Step 2: Identify the Tasks to Integrate
      5. Step 3: Customize Test Steps and Tasks
      6. Step 4: Select Integration Points
      7. Step 5: Modify the Development Methodology
      8. Step 6: Test Methodology Training
      9. Step 7: Incorporate Defect Recording
      10. The Integrated Team
    5. 27 Test Management Constraints
      1. Organizational Architecture
      2. Traits of a Well-Established Quality Organization
      3. Division of Responsibilities
      4. Organizational Relationships
      5. Using the Project Framework Where No Quality Infrastructure Exists
      6. Ad Hoc Testing and the Project Framework
      7. Using a Traceability/Validation Matrix
      8. Reporting the Progress
  13. SECTION 5 EMERGING SPECIALIZED AREAS IN TESTING
    1. 28 Test Process and Automation Assessment
      1. Test Process Assessment
      2. Process Evaluation Methodology
        1. Step 1: Identify the Key Elements
        2. Step 2: Gather and Analyze the Information
        3. Step 3: Analyze Test Maturity
          1. The Requirements Definition Maturity
          2. Test Strategy Maturity
          3. Test Effort Estimation Maturity
          4. Test Design and Execution Maturity
          5. Regression Testing Maturity
          6. Test Automation Maturity
        4. Step 4: Document and Present Findings
      3. Test Automation Assessment
        1. Identify the Applications to Automate
        2. Identify the Best Test Automation Tool
        3. Test Scripting Approach
        4. Test Execution Approach
        5. Test Script Maintenance
      4. Test Automation Framework
        1. Basic Features of an Automation Framework
          1. Define the Folder Structure
          2. Modularize Scripts/Test Data to Increase Robustness
          3. Reuse Generic Functions and Application-Specific Function Libraries
          4. Develop Scripting Guidelines and Review Checklists
          5. Define Error Handling and Recovery Functions
          6. Define the Maintenance Process
        2. Standard Automation Frameworks
        3. Data-Driven Framework
        4. Modular Framework
        5. Keyword-Driven Framework
        6. Hybrid Framework
    2. 29 Nonfunctional Testing
      1. Performance Testing
      2. Load Testing
      3. Stress Testing
      4. Volume Testing
      5. Performance Monitoring
      6. Performance Testing Approach
      7. Knowledge Acquisition Process
      8. Test Development
      9. Performance Deliverables
      10. Security Testing
        1. Step 1: Identifying the Scope of Security Testing
        2. Step 2: Test Case Generation and Execution
      11. Types of Security Testing
        1. Network Scanning
          1. Purpose
          2. Tools
          3. Approach
        2. Vulnerability Scanning
          1. Purpose
          2. Tools
          3. Approach
        3. Password Cracking
          1. Tools
        4. Log Reviews
          1. Approach
        5. File Integrity Checkers
          1. Purpose
          2. Tools
        6. Virus Detectors
          1. Tools
          2. Approach
        7. Penetration Testing
          1. Purpose
          2. Approach
      12. Usability Testing
      13. Goals of Usability Testing
        1. Approach and Execution
        2. Guidelines for Usability Testing
        3. Accessibility Testing and Section 508
      14. Compliance Testing
    3. 30 SOA Testing
      1. Key Steps of SOA Testing
    4. 31 Agile Testing
      1. Agile User Stories Contrasted to Formal Requirements
      2. What Is a User Story?
      3. Agile Planning
      4. Types of Agile Testing
      5. Compliance Testing
    5. 32 Testing Center of Excellence
      1. Industry Best Processes
      2. Testing Metrics
      3. Operating Model
      4. Test Automation Framework
      5. Continuous Competency Development
    6. 33 On-Site/Offshore Model
      1. Step 1: Analysis
      2. Step 2: Determine the Economic Trade-Offs
      3. Step 3: Determine the Selection Criteria
      4. Project Management and Monitoring
      5. Outsourcing Methodology
        1. On-Site Activities
        2. Offshore Activities
      6. Implementing the On-Site/Offshore Model
        1. Knowledge Transfer
        2. Detailed Design
        3. Milestone-Based Transfer
        4. Steady State
        5. Application Management
      7. Prerequisites
        1. Relationship Model
        2. Standards
      8. Benefits of On-Site/Offshore Methodology
        1. On-Site/Offshore Model Challenges
          1. Out of Sight
          2. Establish Transparency
          3. Security Considerations
          4. Project Monitoring
          5. Management Overhead
          6. Cultural Differences
          7. Software Licensing
      9. Future of the Onshore/Offshore Model
  14. SECTION 6 MODERN SOFTWARE TESTING TOOLS
    1. 34 Software Testing Trends
      1. Automated Capture/Replay Testing Tools
      2. Test Case Builder Tools
      3. Necessary and Sufficient Conditions
      4. Test Data Generation Strategies
        1. Sampling from Production
        2. Starting from Scratch
        3. Seeding the Data
        4. Generating Data Based on the Database
        5. A Cutting-Edge Test Case Generator Based on Requirements
    2. 35 Taxonomy of Software Testing Tools
      1. Testing Tool Selection Checklist
      2. Commercial Vendor Tool Descriptions
      3. Open-Source Freeware Vendor Tools
      4. When You Should Consider Test Automation
      5. When You Should NOT Consider Test Automation
    3. 36 Methodology to Evaluate Automated Testing Tools
      1. Step 1: Define Your Test Requirements
      2. Step 2: Set Tool Objectives
      3. Step 3a: Conduct Selection Activities for Informal Procurement
        1. Task 1: Develop the Acquisition Plan
        2. Task 2: Define Selection Criteria
        3. Task 3: Identify Candidate Tools
        4. Task 4: Conduct the Candidate Review
        5. Task 5: Score the Candidates
        6. Task 6: Select the Tool
      4. Step 3b: Conduct Selection Activities for Formal Procurement
        1. Task 1: Develop the Acquisition Plan
        2. Task 2: Create the Technical Requirements Document
        3. Task 3: Review Requirements
        4. Task 4: Generate the Request for Proposal
        5. Task 5: Solicit Proposals
        6. Task 6: Perform the Technical Evaluation
        7. Task 7: Select a Tool Source
      5. Step 4: Procure the Testing Tool
      6. Step 5: Create the Evaluation Plan
      7. Step 6: Create the Tool Manager’s Plan
      8. Step 7: Create the Training Plan
      9. Step 8: Receive the Tool
      10. Step 9: Perform the Acceptance Test
      11. Step 10: Conduct Orientation
      12. Step 11: Implement Modifications
      13. Step 12: Train Tool Users
      14. Step 13: Use the Tool in the Operating Environment
      15. Step 14: Write the Evaluation Report
      16. Step 15: Determine Whether Goals Have Been Met
  15. SECTION 7 APPENDICES
    1. Appendix A: Spiral (Agile) Testing Methodology
    2. Appendix B: Software Quality Assurance Plan
    3. Appendix C: Requirements Specification
    4. Appendix D: Change Request Form
    5. Appendix E: Test Templates
    6. Appendix F: Checklists
    7. Appendix G: Software Testing Techniques
  16. Bibliography
  17. Glossary
  18. Index

Product information

  • Title: Software Testing and Continuous Quality Improvement, 3rd Edition
  • Author(s): William E. Lewis
  • Release date: June 2017
  • Publisher(s): Auerbach Publications
  • ISBN: 9781351722209