We Don’t Need No Stinkin’ Code: Testing Software Requirements

Someone once asked me when you can begin testing software. “As soon as you’ve written your first requirement, you can begin testing,” I replied.

It’s hard to visualize how a system will function by reading some requirements. Tests that are based on requirements make the expected system behaviors more tangible. Even the simple act of designing tests reveals many requirements problems long before you can execute those tests on a running system.

Requirements and Tests

Tests and requirements present complementary views of the system. Creating multiple views of a system—written requirements, diagrams, tests, prototypes, and so forth—provides a much richer understanding of the system than can any single representation. Some agile development methods emphasize writing user acceptance tests from user stories in lieu of writing detailed functional requirements. Thinking about the system from a testing perspective is valuable, but that approach still leaves you with just a single representation of requirements knowledge, so you must trust it to be correct.

Writing black-box (functional) tests crystallizes your vision of how the system should behave under certain conditions. Vague and ambiguous requirements will jump out at you because you won’t be able to describe the expected system response. When business analysts (BAs), developers, and customers walk through tests together, they’ll codify a shared vision of how the product will work and increase their confidence that the requirements are correct.

A personal experience brought home to me the importance of combining test thinking with requirements specification. I once asked my group’s UNIX scripting guru, Charlie, to build a simple e-mail interface extension for a commercial defect-tracking system we had adopted. I wrote a dozen functional requirements that described how the e-mail interface should work. Charlie was thrilled. He’d written many scripts for people, but he’d never seen written requirements before.

Unfortunately, I waited a couple of weeks before I wrote the tests for this e-mail function. Sure enough, I had made an error in one of the requirements. I found the mistake because my mental image of how I expected the function to work, represented in about twenty tests, was inconsistent with one of the requirements. Chagrined, I corrected the defective requirement before Charlie had completed his implementation, and when he delivered the script, it was defect free. It was a small victory, but small victories add up.

Conceptual Tests

You can begin deriving conceptual tests from user requirements early in the development process. Use the tests to evaluate functional requirements, analysis models, and prototypes. The tests should cover the normal flow of each use case, alternative flows, and the exceptions you identified during elicitation and analysis. Similarly, agile acceptance tests should cover both expected behaviors and exceptions.

Consider an application called the Chemical Tracking System. One use case, “View a Stored Order,” let the user retrieve a particular order for a chemical from the database and view its details. Some conceptual tests are:

  • User enters order number to view, order exists, user placed the order. Expected result: display the order details.
  • User enters order number to view, order doesn’t exist. Expected result: display the message “Sorry, I can’t find that order.”
  • User enters order number to view, order exists, user didn’t place the order. Expected result: display the message “Sorry, that’s not your order. You can’t view it.”
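Even at this conceptual stage, you can sketch such tests in executable form. Here’s a minimal Python sketch of the three tests above; the view_order function, the order data, and the test names are purely illustrative assumptions, not part of the actual Chemical Tracking System:

    # Hypothetical stand-in for the Chemical Tracking System's order query.
    # The data and messages below mirror the three conceptual tests.
    ORDERS = {1001: {"owner": "alice", "details": "500 mL acetone"}}

    def view_order(order_id, user):
        """Return order details, or an explanatory message on failure."""
        order = ORDERS.get(order_id)
        if order is None:
            return "Sorry, I can't find that order."
        if order["owner"] != user:
            return "Sorry, that's not your order. You can't view it."
        return order["details"]

    def test_user_placed_order():           # order exists, user placed it
        assert view_order(1001, "alice") == "500 mL acetone"

    def test_order_does_not_exist():        # no such order
        assert view_order(9999, "alice") == "Sorry, I can't find that order."

    def test_order_placed_by_other_user():  # order exists, not this user's
        assert view_order(1001, "bob") == "Sorry, that's not your order. You can't view it."

    if __name__ == "__main__":              # also runnable under pytest
        test_user_placed_order()
        test_order_does_not_exist()
        test_order_placed_by_other_user()
        print("All three conceptual tests pass.")

The point isn’t the code itself; it’s that writing even this much forces you to pin down exactly what “display the order details” and each error message mean.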

Ideally, a BA will write the functional requirements and a tester will write the tests from a common starting point, as shown in Figure 1. Ambiguities in the user requirements and differences of interpretation will lead to inconsistencies between the views represented by the functional requirements, models, and tests. Those inconsistencies reveal errors. As developers translate requirements into user interface and technical designs, testers can elaborate the conceptual tests into detailed test procedures.

Figure 1. Development and testing work products derive from a common source.

A Testing Example

Let’s see how the Chemical Tracking System team tied together requirements specification, analysis modeling, and early test-case generation. Here are some pieces of requirements information, all of which relate to the task of requesting a chemical.

Use Case. A use case is “Request a Chemical.” This use case includes a path that permits the user to request a chemical container that’s already available in the chemical stockroom. Here’s the use case description:

The Requester specifies the desired chemical to request by entering its name or chemical ID number. The system either offers the Requester a container of the chemical from the chemical stockroom or lets the Requester order one from a vendor.



Functional Requirement. Here’s a bit of functionality associated with this use case:

  1. If the stockroom has containers of the chemical being requested, the system shall display a list of the available containers.
  2. The user shall either select one of the displayed containers or ask to place an order for a new container from a vendor.

Dialog Map. Figure 2 illustrates a portion of the dialog map for the “Request a Chemical” use case that pertains to this function. A dialog map is a high-level view of a user interface’s architecture, modeled as a state-transition diagram. The boxes in this dialog map represent user interface displays (dialog boxes in this case), and the arrows represent possible navigation paths from one display to another.

Figure 2. Portion of the dialog map for the “Request a Chemical” use case.
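Because a dialog map is a state-transition model, you can also capture it as data and query it. In this minimal Python sketch, the three transitions come from the excerpt in Figure 2; the action labels are paraphrases, and the rest of the map is elided:

    # Dialog map as a transition table: (display, action) -> next display.
    DIALOG_MAP = {
        ("DB40", "enter valid chemical ID"): "DB50",  # stock has containers
        ("DB50", "select a container"): "DB70",
        ("DB50", "order new container"): "DB60",
        # ...remaining transitions of the full dialog map
    }

    def next_display(current, action):
        # Returns the display reached via `action`, or None if the
        # dialog map permits no such navigation from `current`.
        return DIALOG_MAP.get((current, action))

A representation like this makes the “executability” checks described below mechanical rather than manual.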

Test. Because this use case has several possible execution paths, you can envision numerous tests to address the normal flow, alternative flows, and exceptions. The following is just one test, based on the flow that shows the user the available containers in the chemical stockroom:

At dialog box DB40, enter a valid chemical ID; the chemical stockroom has two containers of this chemical. Dialog box DB50 appears, showing the two containers. Select the second container. DB50 closes and container 2 is added to the bottom of the Current Chemical Request List in dialog box DB70.

Ramesh, the test lead for the Chemical Tracking System, wrote several tests like this one, based on his understanding of how the user might interact with the system to request a chemical. Such abstract tests are independent of implementation details. They don’t describe entering data into specific fields, clicking buttons, or other interaction techniques. As development progresses, the tester can refine these abstract tests into specific test procedures.

Now comes the fun part—testing the requirements. Ramesh first mapped the tests against the functional requirements. He checked to make certain that every test could be “executed” by going through a set of existing requirements. He also made sure that at least one test covered every functional requirement.
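Both checks are straightforward to automate once each test records the requirements it traces to. A minimal sketch, with invented requirement and test IDs:

    # Which requirements each test "executes" (IDs are illustrative).
    TESTS = {
        "T1": {"REQ-1", "REQ-2"},
        "T2": {"REQ-2"},
    }
    REQUIREMENTS = {"REQ-1", "REQ-2", "REQ-3"}

    # Every test must trace to requirements that actually exist.
    untraceable = {t for t, reqs in TESTS.items() if not reqs <= REQUIREMENTS}

    # Every requirement must be covered by at least one test.
    uncovered = REQUIREMENTS - set().union(*TESTS.values())

    print(untraceable)  # set(): both tests trace cleanly
    print(uncovered)    # {'REQ-3'}: a requirement no test covers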

Next, Ramesh traced the execution path for every test on the dialog map with a highlighter pen. The yellow line in Figure 3 shows how the preceding test traces onto the dialog map.

Figure 3. Tracing a test onto the dialog map for the “Request a Chemical” use case.

By tracing the execution path for each test on the model, you can find incorrect or missing requirements, improve the user’s navigation options, and refine the tests. Suppose that after “executing” all the tests in this fashion, the navigation line in Figure 2 labeled “order new container” that goes from DB50 to DB60 hasn’t been highlighted. There are two possible interpretations:

  • That navigation isn’t a permitted system behavior. The BA needs to remove that arrow from the dialog map. If you have a requirement that specifies the transition, that requirement also needs to go.
  • The navigation is a legitimate system behavior, but the test that demonstrates the behavior is missing.

When I find such a disconnect, I don’t know which possible interpretation is correct. However, I do know that all of the views of the requirements—textual, models, and tests—must agree, so there’s clearly something wrong.

Suppose that another test states that the user can take some action to move directly from DB40 to DB70. However, the dialog map doesn’t contain such a navigation line, so the test can’t be “executed”: you can’t get there from here. Again, there are two possible interpretations (the sketch after this list shows how to catch both kinds of disconnect automatically):

  • The navigation from DB40 to DB70 is not a permitted system behavior, so the test is wrong.
  • The navigation from DB40 to DB70 is a legitimate function, but the requirement that allows you to execute the test is missing.
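Both kinds of disconnect reduce to a two-way comparison between the transitions the dialog map defines and the transitions the tests exercise. Here’s a minimal sketch using the transitions from this example; as before, the map excerpt is an assumption based on Figure 2:

    # Transitions the dialog map permits (from the Figure 2 excerpt).
    MAP_TRANSITIONS = {("DB40", "DB50"), ("DB50", "DB70"), ("DB50", "DB60")}

    # Transitions traversed when "executing" the tests on the model.
    TEST_TRANSITIONS = {
        ("DB40", "DB50"), ("DB50", "DB70"),  # the highlighted test above
        ("DB40", "DB70"),                    # the suspect second test
    }

    # Map transitions no test highlights: missing test, or unneeded behavior?
    print(MAP_TRANSITIONS - TEST_TRANSITIONS)  # {('DB50', 'DB60')}

    # Test steps the map doesn't permit: wrong test, or missing requirement?
    print(TEST_TRANSITIONS - MAP_TRANSITIONS)  # {('DB40', 'DB70')}

Either way, the output is a short list of disconnects for the BA and the tester to resolve together.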

In these examples, the BA and the tester combined requirements, analysis models, and tests to detect missing, erroneous, or unnecessary requirements long before any code was written. Every time I use this technique, I find errors in all the items I’m comparing to each other, quickly and cheaply.

As consultant Ross Collard pointed out, “Use cases and tests work well together in two ways: If the use cases for a system are complete, accurate, and clear, the process of deriving the tests is straightforward. And if the use cases are not in good shape, the attempt to derive tests will help to debug the use cases.” I couldn’t agree more. Conceptual testing of software requirements is a powerful technique for discovering requirement ambiguities and errors early on.

Karl Wiegers is Principal Consultant at Process Impact. He’s the author of numerous books and articles on software development, project management, design, quality, chemistry, military history, and other topics. This article is adapted from Software Requirements, 3rd Edition by Karl Wiegers and Joy Beatty. Karl’s latest book is The Thoughtless Design of Everyday Things.

