Effective Methods for Software Testing, Third Edition

Book description

  • Written by the founder and executive director of the Quality Assurance Institute, which sponsors the most widely accepted certification program for software testing

  • Software testing is a weak spot for most developers, and many have no system in place to find and correct defects quickly and efficiently

  • This comprehensive resource provides step-by-step guidelines, checklists, and templates for each testing activity, as well as a self-assessment that helps readers identify the sections of the book that respond to their individual needs

  • Covers the latest regulatory developments affecting software testing, including Sarbanes-Oxley Section 404, and provides guidelines for agile testing and testing for security, internal controls, and data warehouses

  • CD-ROM with all checklists and templates saves testers countless hours of developing their own test documentation

Note: CD-ROM/DVD and other supplementary materials are not included as part of eBook file.

Table of contents

  1. Copyright
    1. Dedication
  2. About the Author
  3. Credits
  4. Introduction
    1. Getting the Most Out of This Book
    2. What’s New in the Third Edition
    3. What’s on the CD
  5. One. Assessing Testing Capabilities and Competencies
    1. 1. Assessing Capabilities, Staff Competency, and User Satisfaction
      1. The Three-Step Process to Becoming a World-Class Testing Organization
      2. Step 1: Define a World-Class Software Testing Model
        1. Customizing the World-Class Model for Your Organization
      3. Step 2: Develop Baselines for Your Organization
        1. Assessment 1: Assessing the Test Environment
          1. Implementation Procedures
            1. Building the Assessment Team
            2. Completing the Assessment Questionnaire
            3. Building the Footprint Chart
            4. Assessing the Results
          2. Verifying the Assessment
        2. Assessment 2: Assessing the Capabilities of Your Existing Test Processes
        3. Assessment 3: Assessing the Competency of Your Testers
          1. Implementation Procedures
            1. Understanding the CSTE CBOK
            2. Completing the Assessment Questionnaires
            3. Building the Footprint Chart
            4. Assessing the Results
          2. Verifying the Assessment
      4. Step 3: Develop an Improvement Plan
      5. Summary
  6. Two. Building a Software Testing Environment
    1. 2. Creating an Environment Supportive of Software Testing
      1. Minimizing Risks
        1. Risk Appetite for Software Quality
        2. Risks Associated with Implementing Specifications
          1. Faulty Software Design
          2. Data Problems
        3. Risks Associated with Not Meeting Customer Needs
        4. Developing a Role for Software Testers
      2. Writing a Policy for Software Testing
        1. Criteria for a Testing Policy
        2. Methods for Establishing a Testing Policy
      3. Economics of Testing
      4. Testing—An Organizational Issue
      5. Management Support for Software Testing
      6. Building a Structured Approach to Software Testing
        1. Requirements
        2. Design
        3. Program
        4. Test
        5. Installation
        6. Maintenance
      7. Developing a Test Strategy
        1. Use Work Paper 2-1
        2. Use Work Paper 2-2
      8. Summary
    2. 3. Building the Software Testing Process
      1. Software Testing Guidelines
        1. Guideline #1: Testing Should Reduce Software Development Risk
        2. Guideline #2: Testing Should Be Performed Effectively
        3. Guideline #3: Testing Should Uncover Defects
          1. Defects Versus Failures
          2. Why Are Defects Hard to Find?
        4. Guideline #4: Testing Should Be Performed Using Business Logic
        5. Guideline #5: Testing Should Occur Throughout the Development Life Cycle
        6. Guideline #6: Testing Should Test Both Function and Structure
          1. Why Use Both Testing Methods?
          2. Structural and Functional Tests Using Verification and Validation Techniques
      2. Workbench Concept
        1. Testing That Parallels the Software Development Process
      3. Customizing the Software-Testing Process
        1. Determining the Test Strategy Objectives
        2. Determining the Type of Development Project
        3. Determining the Type of Software System
        4. Determining the Project Scope
        5. Identifying the Software Risks
        6. Determining When Testing Should Occur
        7. Defining the System Test Plan Standard
        8. Defining the Unit Test Plan Standard
        9. Converting Testing Strategy to Testing Tactics
      4. Process Preparation Checklist
      5. Summary
    3. 4. Selecting and Installing Software Testing Tools
      1. Integrating Tools into the Tester’s Work Processes
      2. Tools Available for Testing Software
      3. Selecting and Using Test Tools
        1. Matching the Tool to Its Use
        2. Selecting a Tool Appropriate to Its Life Cycle Phase
        3. Matching the Tool to the Tester’s Skill Level
        4. Selecting an Affordable Tool
      4. Training Testers in Tool Usage
      5. Appointing Tool Managers
        1. Prerequisites to Creating a Tool Manager Position
        2. Selecting a Tool Manager
        3. Assigning the Tool Manager Duties
        4. Limiting the Tool Manager’s Tenure
      6. Summary
    4. 5. Building Software Tester Competency
      1. What Is a Common Body of Knowledge?
      2. Who Is Responsible for the Software Tester’s Competency?
      3. How Is Personal Competency Used in Job Performance?
        1. Using the 2006 CSTE CBOK
      4. Developing a Training Curriculum
        1. Using the CBOK to Build an Effective Testing Team
      5. Summary
  7. Three. The Seven-Step Testing Process
    1. 6. Overview of the Software Testing Process
      1. Advantages of Following a Process
      2. The Cost of Computer Testing
        1. Quantifying the Cost of Removing Defects
        2. Reducing the Cost of Testing
      3. The Seven-Step Software Testing Process
        1. Objectives of the Seven-Step Process
        2. Customizing the Seven-Step Process
        3. Managing the Seven-Step Process
        4. Using the Tester’s Workbench with the Seven-Step Process
      4. Workbench Skills
      5. Summary
    2. 7. Step 1: Organizing for Testing
      1. Objective
      2. Workbench
      3. Input
      4. Do Procedures
        1. Task 1: Appoint the Test Manager
        2. Task 2: Define the Scope of Testing
        3. Task 3: Appoint the Test Team
          1. Internal Team Approach
          2. External Team Approach
          3. Non-IT Team Approach
          4. Combination Team Approach
        4. Task 4: Verify the Development Documentation
          1. Development Phases
          2. Measuring Project Documentation Needs
          3. Determining What Documents Must Be Produced
          4. Determining the Completeness of Individual Documents
          5. Determining Documentation Timeliness
        5. Task 5: Validate the Test Estimate and Project Status Reporting Process
          1. Validating the Test Estimate
            1. Strategies for Software Cost Estimating
            2. Parametric Models
          2. Testing the Validity of the Software Cost Estimate
            1. Validate the Reasonableness of the Estimating Model
            2. Validate That the Model Includes All the Needed Factors
            3. Verify the Correctness of the Cost-Estimating Model Estimate
          3. Calculating the Project Status Using a Point System
            1. Overview of the Point Accumulation Tracking System
            2. Typical Methods of Measuring Performance
            3. Using the Point System
            4. Extensions
            5. Rolling Baseline
            6. Reports
      5. Check Procedures
      6. Output
      7. Summary
    3. 8. Step 2: Developing the Test Plan
      1. Overview
      2. Objective
      3. Concerns
      4. Workbench
      5. Input
      6. Do Procedures
        1. Task 1: Profile the Software Project
          1. Conducting a Walkthrough of the Customer/User Area
          2. Developing a Profile of the Software Project
        2. Task 2: Understand the Project Risks
        3. Task 3: Select a Testing Technique
          1. Structural System Testing Techniques
            1. Stress Testing
              1. Objectives
              2. How to Use Stress Testing
              3. When to Use Stress Testing
            2. Execution Testing
              1. Objectives
              2. How to Use Execution Testing
              3. When to Use Execution Testing
            3. Recovery Testing
              1. Objectives
              2. How to Use Recovery Testing
              3. When to Use Recovery Testing
            4. Operations Testing
              1. Objectives
              2. How to Use Operations Testing
              3. When to Use Operations Testing
            5. Compliance Testing
              1. Objectives
              2. How to Use Compliance Testing
              3. When to Use Compliance Testing
            6. Security Testing
              1. Objectives
              2. How to Use Security Testing
              3. When to Use Security Testing
          2. Functional System Testing Techniques
            1. Requirements Testing
              1. Objectives
              2. How to Use Requirements Testing
              3. When to Use Requirements Testing
            2. Regression Testing
              1. Objectives
              2. How to Use Regression Testing
              3. When to Use Regression Testing
            3. Error-Handling Testing
              1. Objectives
              2. How to Use Error-Handling Testing
              3. When to Use Error-Handling Testing
            4. Manual-Support Testing
              1. Objectives
              2. How to Use Manual-Support Testing
              3. When to Use Manual-Support Testing
            5. Intersystem Testing
              1. Objectives
              2. How to Use Intersystem Testing
              3. When to Use Intersystem Testing
            6. Control Testing
              1. Objectives
              2. How to Use Control Testing
              3. When to Use Control Testing
            7. Parallel Testing
              1. Objectives
              2. How to Use Parallel Testing
              3. When to Use Parallel Testing
        4. Task 4: Plan Unit Testing and Analysis
          1. Functional Testing and Analysis
            1. Functional Analysis
            2. Functional Testing
              1. Testing Independent of the Specification Technique
              2. Testing Dependent on the Specification Technique
          2. Structural Testing and Analysis
            1. Structural Analysis
            2. Structural Testing
          3. Error-Oriented Testing and Analysis
            1. Statistical Methods
            2. Error-Based Testing
            3. Fault-Based Testing
          4. Managerial Aspects of Unit Testing and Analysis
            1. Selecting Techniques
            2. Control
        5. Task 5: Build the Test Plan
          1. Setting Test Objectives
          2. Developing a Test Matrix
            1. Individual Software Modules
            2. Structural Attributes
            3. Batch Tests
            4. Conceptual Test Script for Online System Test
            5. Verification Tests
            6. Software/Test Matrix
          3. Defining Test Administration
            1. Test Plan General Information
            2. Define Test Milestones
            3. Define Checkpoint Administration
          4. Writing the Test Plan
        6. Task 6: Inspect the Test Plan
          1. Inspection Concerns
          2. Products/Deliverables to Inspect
          3. Formal Inspection Roles
            1. Moderator
            2. Reader
            3. Recorder
            4. Author
            5. Inspectors
          4. Formal Inspection Defect Classification
          5. Inspection Procedures
            1. Planning and Organizing
            2. Overview Session
            3. Individual Preparation
            4. Inspection Meeting
            5. Rework and Follow-Up
      7. Check Procedures
      8. Output
      9. Guidelines
      10. Summary
    4. 9. Step 3: Verification Testing
      1. Overview
      2. Objective
      3. Concerns
      4. Workbench
      5. Input
        1. The Requirements Phase
        2. The Design Phase
        3. The Programming Phase
      6. Do Procedures
        1. Task 1: Test During the Requirements Phase
          1. Requirements Phase Test Factors
          2. Preparing a Risk Matrix
            1. Establishing the Risk Team
            2. Identifying Risks
            3. Establishing Control Objectives (Requirements Phase Only)
            4. Identifying Controls in Each System Segment
            5. Determining the Adequacy of Controls
          3. Performing a Test Factor Analysis
          4. Conducting a Requirements Walkthrough
            1. Establishing Ground Rules
            2. Selecting the Team
            3. Presenting Project Requirements
            4. Responding to Questions/Recommendations
            5. Issuing the Final Report (Optional)
          5. Performing Requirements Tracing
          6. Ensuring Requirements Are Testable
        2. Task 2: Test During the Design Phase
          1. Scoring Success Factors
          2. Analyzing Test Factors
          3. Conducting a Design Review
          4. Inspecting Design Deliverables
        3. Task 3: Test During the Programming Phase
          1. Desk Debugging the Program
            1. Syntactical Desk Debugging
            2. Structural Desk Debugging
            3. Functional Desk Debugging
          2. Performing Programming Phase Test Factor Analysis
          3. Conducting a Peer Review
            1. Establishing Peer Review Ground Rules
            2. Selecting the Peer Review Team
            3. Training Team Members
            4. Selecting a Review Method
            5. Conducting the Peer Review
            6. Drawing Conclusions
            7. Preparing Reports
      7. Check Procedures
      8. Output
      9. Guidelines
      10. Summary
    5. 10. Step 4: Validation Testing
      1. Overview
      2. Objective
      3. Concerns
      4. Workbench
      5. Input
      6. Do Procedures
        1. Task 1: Build the Test Data
          1. Sources of Test Data/Test Scripts
          2. Testing File Design
          3. Defining Design Goals
          4. Entering Test Data
          5. Applying Test Files Against Programs That Update Master Records
          6. Creating and Using Test Data
          7. Payroll Application Example
          8. Creating Test Data for Stress/Load Testing
          9. Creating Test Scripts
            1. Determining Testing Levels
            2. Developing Test Scripts
            3. Executing Test Scripts
            4. Analyzing the Results
            5. Maintaining Test Scripts
        2. Task 2: Execute Tests
        3. Task 3: Record Test Results
          1. Documenting the Deviation
          2. Documenting the Effect
          3. Documenting the Cause
      7. Check Procedures
      8. Output
      9. Guidelines
      10. Summary
    6. 11. Step 5: Analyzing and Reporting Test Results
      1. Overview
      2. Concerns
      3. Workbench
      4. Input
        1. Test Plan and Project Plan
        2. Expected Processing Results
        3. Data Collected during Testing
          1. Test Results Data
          2. Test Transactions, Test Suites, and Test Events
          3. Defects
          4. Efficiency
        4. Storing Data Collected During Testing
      5. Do Procedures
        1. Task 1: Report Software Status
          1. Establishing a Measurement Team
          2. Creating an Inventory of Existing Project Measurements
          3. Developing a Consistent Set of Project Metrics
          4. Defining Process Requirements
          5. Developing and Implementing the Process
          6. Monitoring the Process
            1. Summary Status Report
            2. Project Status Report
        2. Task 2: Report Interim Test Results
          1. Function/Test Matrix
          2. Functional Testing Status Report
          3. Functions Working Timeline Report
          4. Expected Versus Actual Defects Uncovered Timeline Report
          5. Defects Uncovered Versus Corrected Gap Timeline Report
          6. Average Age of Uncorrected Defects by Type Report
          7. Defect Distribution Report
          8. Normalized Defect Distribution Report
          9. Testing Action Report
          10. Interim Test Report
        3. Task 3: Report Final Test Results
          1. Individual Project Test Report
          2. Integration Test Report
          3. System Test Report
          4. Acceptance Test Report
      6. Check Procedures
      7. Output
      8. Guidelines
      9. Summary
    7. 12. Step 6: Acceptance and Operational Testing
      1. Overview
      2. Objective
      3. Concerns
      4. Workbench
      5. Input Procedures
        1. Task 1: Acceptance Testing
          1. Defining the Acceptance Criteria
          2. Developing an Acceptance Plan
          3. Executing the Acceptance Plan
          4. Developing Test Cases (Use Cases) Based on How Software Will Be Used
            1. Building a System Boundary Diagram
            2. Defining Use Cases
            3. Developing Test Cases
            4. Reaching an Acceptance Decision
        2. Task 2: Pre-Operational Testing
          1. Testing New Software Installation
          2. Testing the Changed Software Version
            1. Testing the Adequacy of the Restart/Recovery Plan
            2. Verifying the Correct Change Has Been Entered into Production
            3. Verifying Unneeded Versions Have Been Deleted
          3. Monitoring Production
          4. Documenting Problems
        3. Task 3: Post-Operational Testing
          1. Developing and Updating the Test Plan
          2. Developing and Updating the Test Data
          3. Testing the Control Change Process
            1. Identifying and Controlling Change
            2. Documenting Change Needed on Each Data Element
            3. Documenting Changes Needed in Each Program
          4. Conducting Testing
          5. Developing and Updating Training Material
            1. Training Material Inventory Form
            2. Training Plan Work Paper
            3. Preparing Training Material
            4. Conducting Training
      6. Check Procedures
      7. Output
        1. Is the Automated Application Acceptable?
        2. Automated Application Segment Failure Notification
        3. Is the Manual Segment Acceptable?
        4. Training Failure Notification Form
      8. Guidelines
      9. Summary
    8. 13. Step 7: Post-Implementation Analysis
      1. Overview
      2. Concerns
      3. Workbench
      4. Input
      5. Do Procedures
        1. Task 1: Establish Assessment Objectives
        2. Task 2: Identify What to Measure
        3. Task 3: Assign Measurement Responsibility
        4. Task 4: Select Evaluation Approach
        5. Task 5: Identify Needed Facts
        6. Task 6: Collect Evaluation Data
        7. Task 7: Assess the Effectiveness of Testing
          1. Using Testing Metrics
      6. Check Procedures
      7. Output
      8. Guidelines
      9. Summary
  8. Four. Incorporating Specialized Testing Responsibilities
    1. 14. Software Development Methodologies
      1. How Much Testing Is Enough?
        1. Software Development Methodologies
          1. Overview
          2. Methodology Types
            1. Waterfall Methodology
            2. Prototyping Methodology
            3. Rapid Application Development Methodology
            4. Spiral Methodology
            5. Incremental Methodology
            6. The V Methodology
          3. Software Development Life Cycle
            1. Phase 1: Initiation
            2. Phase 2: Definition
            3. Phase 3: System Design
            4. Phase 4: Programming and Testing
            5. Phase 5: Evaluation and Acceptance
            6. Phase 6: Installation and Operation
            7. Roles and Responsibilities
        2. Defining Requirements
          1. Categories
          2. Attributes
            1. Desired Attributes: A Systems Analyst Perspective
            2. Requirements Measures: A Tester’s Perspective
            3. International Standards
        3. Methodology Maturity
        4. Competencies Required
        5. Staff Experience
        6. Configuration-Management Controls
          1. Basic CM Requirements
            1. Configuration Identification
            2. Configuration Control
            3. Configuration-Status Accounting
            4. Configuration Audits
          2. Planning
          3. Data Distribution and Access
          4. CM Administration
            1. Project Leader’s CM Plan
            2. Work Breakdown Structure
            3. Technical Reviews
        7. Configuration Identification
          1. CI Selection
          2. Document Library
          3. Software Development Library
          4. Configuration Baselines
          5. Initial Release
          6. Software Marking and Labeling
          7. Interface Requirements
        8. Configuration Control
      2. Measuring the Impact of the Software Development Process
      3. Summary
    2. 15. Testing Client/Server Systems
      1. Overview
      2. Concerns
      3. Workbench
      4. Input
      5. Do Procedures
        1. Task 1: Assess Readiness
          1. Software Development Process Maturity Levels
            1. The Ad Hoc Process (Level 1)
            2. The Repeatable Process (Level 2)
            3. The Consistent Process (Level 3)
            4. The Measured Process (Level 4)
            5. The Optimized Process (Level 5)
          2. Conducting the Client/Server Readiness Assessment
          3. Preparing a Client/Server Readiness Footprint Chart
        2. Task 2: Assess Key Components
        3. Task 3: Assess Client Needs
      6. Check Procedures
      7. Output
      8. Guidelines
      9. Summary
    3. 16. Rapid Application Development Testing
      1. Overview
      2. Objective
      3. Concerns
        1. Testing Iterations
        2. Testing Components
        3. Testing Performance
        4. Recording Test Information
      4. Workbench
      5. Input
      6. Do Procedures
        1. Testing Within Iterative RAD
        2. Spiral Testing
        3. Task 1: Determine Appropriateness of RAD
        4. Task 2: Test Planning Iterations
        5. Task 3: Test Subsequent Planning Iterations
        6. Task 4: Test the Final Planning Iteration
      7. Check Procedures
      8. Output
      9. Guidelines
      10. Summary
    4. 17. Testing Internal Controls
      1. Overview
      2. Internal Controls
        1. Control Objectives
        2. Preventive Controls
          1. Source-Data Authorization
          2. Data Input
          3. Source-Data Preparation
          4. Turnaround Documents
          5. Prenumbered Forms
          6. Input Validation
          7. File Auto-Updating
          8. Processing Controls
        3. Detective Controls
          1. Data Transmission
          2. Control Register
          3. Control Totals
          4. Documenting and Testing
          5. Output Checks
        4. Corrective Controls
          1. Error Detection and Resubmission
          2. Audit Trails
        5. Cost/Benefit Analysis
      3. Assessing Internal Controls
        1. Task 1: Understand the System Being Tested
        2. Task 2: Identify Risks
        3. Task 3: Review Application Controls
        4. Task 4: Test Application Controls
          1. Testing Without Computer Processing
          2. Testing with Computer Processing
            1. Test-Data Approach
            2. Mini-Company Approach
          3. Transaction Flow Testing
          4. Objectives of Internal Accounting Controls
            1. Systems Control Objectives
            2. Financial Planning and Control Objectives
            3. Cycle Control Objectives
          5. Results of Testing
        5. Task 5: Document Control Strengths and Weaknesses
      4. Quality Control Checklist
      5. Summary
    5. 18. Testing COTS and Contracted Software
      1. Overview
      2. COTS Software Advantages, Disadvantages, and Risks
        1. COTS Versus Contracted Software
        2. COTS Advantages
        3. COTS Disadvantages
        4. Implementation Risks
        5. Testing COTS Software
        6. Testing Contracted Software
      3. Objective
      4. Concerns
      5. Workbench
      6. Input
      7. Do Procedures
        1. Task 1: Test Business Fit
          1. Step 1: Testing Needs Specification
          2. Step 2: Testing CSFs
        2. Task 2: Test Operational Fit
          1. Step 1: Test Compatibility
          2. Step 2: Integrate the Software into Existing Work Flows
          3. Step 3: Demonstrate the Software in Action
        3. Task 3: Test People Fit
        4. Task 4: Acceptance-Test the Software Process
          1. Step 1: Create Functional Test Conditions
          2. Step 2: Create Structural Test Conditions
        5. Modifying the Testing Process for Contracted Software
      8. Check Procedures
      9. Output
      10. Guidelines
      11. Summary
    6. 19. Testing in a Multiplatform Environment
      1. Overview
      2. Objective
      3. Concerns
      4. Background on Testing in a Multiplatform Environment
      5. Workbench
      6. Input
      7. Do Procedures
        1. Task 1: Define Platform Configuration Concerns
        2. Task 2: List Needed Platform Configurations
        3. Task 3: Assess Test Room Configurations
        4. Task 4: List Structural Components Affected by the Platform(s)
        5. Task 5: List Interfaces the Platform Affects
        6. Task 6: Execute the Tests
      8. Check Procedures
      9. Output
      10. Guidelines
      11. Summary
    7. 20. Testing Software System Security
      1. Overview
      2. Objective
      3. Concerns
      4. Workbench
      5. Input
      6. Where Vulnerabilities Occur
        1. Functional Vulnerabilities
        2. Vulnerable Areas
        3. Accidental Versus Intentional Losses
      7. Do Procedures
        1. Task 1: Establish a Security Baseline
          1. Why Baselines Are Necessary
          2. Creating Baselines
            1. Establish the Team
            2. Set Requirements and Objectives
            3. Design Data Collection Methods
            4. Train Participants
            5. Collect Data
            6. Analyze and Report Security Status
          3. Using Baselines
        2. Task 2: Build a Penetration-Point Matrix
          1. Controlling People by Controlling Activities
          2. Selecting Security Activities
            1. Interface Activities
            2. Development Activities
            3. Operations Activities
          3. Controlling Business Transactions
          4. Characteristics of Security Penetration
          5. Building a Penetration-Point Matrix
        3. Task 3: Analyze the Results of Security Testing
      8. Evaluating the Adequacy of Security
      9. Check Procedures
      10. Output
      11. Guidelines
      12. Summary
    8. 21. Testing a Data Warehouse
      1. Overview
      2. Concerns
      3. Workbench
      4. Input
      5. Do Procedures
        1. Task 1: Measure the Magnitude of Data Warehouse Concerns
        2. Task 2: Identify Data Warehouse Activity Processes to Test
          1. Organizational Process
          2. Data Documentation Process
          3. System Development Process
          4. Access Control Process
          5. Data Integrity Process
          6. Operations Process
          7. Backup/Recovery Process
          8. Performing Task 2
        3. Task 3: Test the Adequacy of Data Warehouse Activity Processes
      6. Check Procedures
      7. Output
      8. Guidelines
      9. Summary
    9. 22. Testing Web-Based Systems
      1. Overview
      2. Concerns
      3. Workbench
      4. Input
      5. Do Procedures
        1. Task 1: Select Web-Based Risks to Include in the Test Plan
          1. Security Concerns
          2. Performance Concerns
          3. Correctness Concerns
          4. Compatibility Concerns
            1. Browser Configuration
          5. Reliability Concerns
          6. Data Integrity Concerns
          7. Usability Concerns
          8. Recoverability Concerns
        2. Task 2: Select Web-Based Tests
          1. Unit or Component
          2. Integration
          3. System
          4. User Acceptance
          5. Performance
          6. Load/Stress
          7. Regression
          8. Usability
          9. Compatibility
        3. Task 3: Select Web-Based Test Tools
        4. Task 4: Test Web-Based Systems
      6. Check Procedures
      7. Output
      8. Guidelines
      9. Summary
  9. Five. Building Agility into the Testing Process
    1. 23. Using Agile Methods to Improve Software Testing
      1. The Importance of Agility
      2. Building an Agile Testing Process
      3. Agility Inhibitors
      4. Is Improvement Necessary?
      5. Compressing Time
        1. Challenges
        2. Solutions
        3. Measuring Readiness
        4. The Seven-Step Process
      6. Summary
    2. 24. Building Agility into the Testing Process
      1. Step 1: Measure Software Process Variability
        1. Timelines
        2. Process Steps
          1. Workbenches
          2. Time-Compression Workbenches
          3. Reducing Variability
          4. Developing Timelines
            1. Identifying the Workbenches
            2. Measuring the Time for Each Workbench via Many Testing Projects
            3. Defining the Source of Major Variability in Selected Workbenches
        3. Improvement Shopping List
        4. Quality Control Checklist
        5. Conclusion
      2. Step 2: Maximize Best Practices
        1. Tester Agility
          1. Software Testing Relationships
            1. Operational Software
            2. Software Quality Factors
          2. Tradeoffs
          3. Capability Chart
          4. Measuring Effectiveness and Efficiency
            1. Defining Measurement Criteria
            2. Measuring Quality Factors
            3. Defining Efficiency and Effectiveness Criteria
            4. Measuring Effectiveness
            5. Measuring Efficiency
            6. Building Effectiveness and Efficiency Metrics
            7. Identifying Best Practices from Best Projects
        2. Improvement Shopping List
        3. Quality Control Checklist
        4. Conclusion
      3. Step 3: Build on Strength, Minimize Weakness
        1. Effective Testing Processes
          1. Assessing the Process
          2. Developing and Interpreting the Testing Footprint
        2. Poor Testing Processes
        3. Improvement Shopping List
        4. Quality Control Checklist
        5. Conclusion
      4. Step 4: Identify and Address Improvement Barriers
        1. The Stakeholder Perspective
          1. Stakeholder Involvement
          2. Performing Stakeholder Analysis
        2. Red-Flag/Hot-Button Barriers
        3. Staff-Competency Barriers
        4. Administrative/Organizational Barriers
        5. Determining the Root Cause of Barriers/Obstacles
        6. Addressing the Root Cause of Barriers/Obstacles
        7. Quality Control Checklist
        8. Conclusion
      5. Step 5: Identify and Address Cultural and Communication Barriers
        1. Management Cultures
          1. Culture 1: Manage People
            1. Why Organizations Continue with Culture 1
            2. Why Organizations Might Want to Adopt Culture 2
          2. Culture 2: Manage by Process
            1. Why Organizations Continue with Culture 2
            2. Why Organizations Might Want to Adopt Culture 3
          3. Culture 3: Manage Competencies
            1. Why Organizations Continue with Culture 3
            2. Why Organizations Might Want to Adopt Culture 4
          4. Culture 4: Manage by Fact
            1. Why Organizations Continue with Culture 4
            2. Why Organizations Might Want to Adopt Culture 5
          5. Culture 5: Manage Business Innovation
        2. Cultural Barriers
          1. Identifying the Current Management Culture
          2. Identifying the Barriers Posed by the Culture
          3. Determining What Can Be Done in the Current Culture
          4. Determining the Desired Culture for Time Compression
          5. Determining How to Address Culture Barriers
        3. Open and Effective Communication
          1. Lines of Communication
          2. Information/Communication Barriers
          3. Effective Communication
        4. Quality Control Checklist
        5. Conclusion
      6. Step 6: Identify Implementable Improvements
        1. What Is an Implementable?
        2. Identifying Implementables via Time Compression
        3. Prioritizing Implementables
        4. Documenting Approaches
        5. Quality Control Checklist
        6. Conclusion
      7. Step 7: Develop and Execute an Implementation Plan
        1. Planning
        2. Implementing Ideas
          1. Preparing the Work Plan
          2. Checking the Results
          3. Taking Action
        3. Requisite Resources
        4. Quality Control Checklist
        5. Conclusion
      8. Summary

Product information

  • Title: Effective Methods for Software Testing, Third Edition
  • Author(s): William E. Perry (William E. Perry Enterprises, Inc.)
  • Release date: May 2006
  • Publisher(s): Wiley
  • ISBN: 9780764598371