1.1. The History of Testing Tools

Tools for testing have been around for as long as developers have been writing code. In the early years of software development, however, there was no clear distinction between testing and debugging. At the time, this model worked. Some argue that it worked because the system was closed: most companies that needed software had developers on staff to create and maintain their systems. Computer systems were not widespread, and developers worked very closely with customers to deliver exactly what was required. Between 1970 and 1995, computer systems became more popular, and the relationship between developers and customers grew more distant, often with several layers of management placed between them.

What is the difference between debugging and testing, you might ask? Testing is the process of finding defects in the software. A defect could be a missing feature, a feature that does not perform adequately, or a feature that is broken. Debugging is the process of first tracking down a bug in the software and then fixing it.

Many of the tools developers used for testing in the early days were internal tools, developed specifically for a particular project and often not reused. Developers began to see a need for reusable tools that captured the patterns they had learned early on. Because of this realization, testing methodologies evolved and tools started to become standardized. In recent years, testing methodologies ...
