Here's the issue for everyone looking for a silver bullet: requirements are never done - at least not in the business information systems world. Trying for 'absoluteness' (to make up a new and cool consulting word) will more likely lead to process failure than success. As you cascade past business requirements and get into increasingly detailed iterations of specification documentation, the line between what is a requirement and what is a solution gets increasingly fuzzy.
Now before all the agilists out there start whooping it up in agreement: 'absenting' requirements (you need to be equal opportunity about making up consulting words!) will lead to incredible performance inconsistency - at an individual resource level, at a long-term asset management and integrity level, and at a corporate expectation management level. I can even prove it: low-requirements-maturity agilists perform terribly on on-time/budget/success measures versus high-requirements-maturity agilists (if you want the data, send me a note through the editor or the www.iag.com site and ask). The issue of requirements quality is common across all development approaches: you MUST define what "clear, accurate and complete requirements" are for your situation, if you expect to materially change project performance and success rates.
Therein lies the answer: defining "complete" means a company has to describe all three of:
- the state of requirements (quality of information),
- the format of requirements (template and techniques used for visualizing requirements), and
- the process through which these artifacts are achieved.
Doing this defines what 'done' is for analysts and project managers. The question of "Am I done?" only really arises at companies where there is weakness in one of the requirements state, format or process attributes. Deal with these three attributes, and the company starts itself down that path of maturing requirements practices and materially changes its ratio of on-time/on-budget/success performance on projects.
OK! Get up and stretch - for those of you saying "Hey, he's ducking the question!" - there's more.
Not everyone can deal with the long-term fix. We sometimes have to do quick fixes as a reality of day-to-day management. Here are four simple tests that flag when requirements are not done to a reasonable or sufficient degree of quality. These are Business Requirements level tests that work and will improve your requirements quality irrespective of delivery methodology:
- If the requirements lack context. Requirements always exist to support "what" the business wants to do, not "how" it wants to do it. The "what" part of this is the context of BUSINESS PROCESS. Without understanding which business processes the requirements impact, no one can know how requirements affect each other, what the impact of removing a requirement would be, or whether the requirements collectively are complete and will meet a specific business objective. The way a company applies context in its documentation also creates the STRUCTURE of the documentation. Here is one technique example - Use Cases. As a technique, Use Cases give both context and structure to requirements and help an analyst assure that the scope of the project is both well described and sequenced.
- If the interdependency is not evident: How do you look for proof that interdependency is documented? Look for a section in the material called "dependencies", check the "issues list", and look for an analysis technique called a context diagram (every line on a context diagram is an interdependency). Why is interdependency so important? There are two aspects to scope: internal to the system (e.g., its functionality, workflow and information flow) and external to the system (e.g., how this system needs to interact with other systems, and how the workflow being automated hands off to other departmental units). Without knowing the interdependencies, you only ever know HALF of the story on scope, so you will probably encounter significant scope shift on any system of any degree of complexity.
- Unclear business objectives: Objectives must be Specific, Measurable, Achievable, Results-oriented, and Time Bounded (easy to remember as 'SMART'). The absence of objectives eliminates the ability to assess solution tradeoffs and makes it difficult to prioritize functionality, among other problems. You can test whether a particular function meets needs with user acceptance tests. You cannot test whether the collective system meets needs unless you have clear objectives.
- You cannot tell from the description of business need how information is going to move. I wish I had $10 for every time I've seen business requirements expressed as a process flowchart. This is a nice (albeit somewhat inefficient) first step, but here's the problem: when you get to the 'decision diamond' in the picture that says "approve the policy (Y/N)", how do you expect developers to know what this means? The only way to elaborate this is to ask the questions: "What information do you need to know to approve that policy?", "What do you do with that information?", "Where do you get the information from?", "Who else do you give the information to?" etc. The more detailed the description of WHAT the business wants to do, the more the description of process will center on how information needs to move in support of the business process. Until you reach the level of detail that expresses information movement, the documentation tells you nothing about business intent. It's easy to test - just look for the NOUNS. Lots of nouns used consistently when describing a step in a process means you're probably OK.
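One rough way to operationalize the noun test: keep a glossary of the business's information entities and check each process-step description against it. The glossary, step descriptions and function below are invented for illustration, not a real project's artifacts - a minimal sketch in Python:

```python
# Illustrative "look for the nouns" test: the glossary of information
# entities and the sample step descriptions are hypothetical.
GLOSSARY = {"policy", "applicant", "credit report", "underwriting guidelines"}

def entities_mentioned(step_description: str) -> set:
    """Return which glossary entities a step description actually names."""
    text = step_description.lower()
    return {noun for noun in GLOSSARY if noun in text}

weak = "Approve the policy (Y/N)"
strong = ("Compare the applicant's credit report against the underwriting "
          "guidelines and approve or decline the policy")

print(entities_mentioned(weak))    # one entity: you can't tell what information moves
print(entities_mentioned(strong))  # several entities: information movement is visible
```

The decision diamond that names only one entity is exactly the step that triggers the "What information do you need to know?" line of questioning above.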
I've given everyone these four tests because they always apply. Even at the executive level of the organization, they matter. They are clear, auditable and technique-independent traits that can be assessed whether you are plan-driven, prototype, vendor-supplied method, agile, or whatever.
How does this view of "Done" change when you're a business analyst?
From a business analyst perspective, it is your job to dig deeper and be accountable for quality. I'll give you a few thoughts as take-aways especially for you:
There are techniques that can help an analyst test requirements completeness and uncover missing business logic. Look at CRUD diagrams as an example of a commonly known test and technique. Look at entity relationship diagramming (ERD) as a lesser-known test (a bit of trivia here: any time there is a null relationship between entities in an ERD, you need to ask the question "What do you do ..." to flesh out what happens when, for example, a lead is never assigned to an agent). Look at some of these techniques to improve your ability to be proactive and identify issues before they arise as missed requirements. Factor the time to do some of this activity into your assignments.
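To make the CRUD test concrete, here is a minimal sketch - the entities and process-to-operation mappings are invented for illustration. Build a matrix of which process performs which lifecycle operation (Create, Read, Update, Delete) on which entity, then look for uncovered operations; each gap is either a missed requirement or a deliberate scope decision worth confirming with the business:

```python
# Hypothetical CRUD matrix: entity -> {process name: operations performed}.
CRUD_MATRIX = {
    "Lead":  {"Capture lead": "C", "Assign lead": "RU", "Qualify lead": "RU"},
    "Agent": {"Assign lead": "R"},
}

def crud_gaps(matrix: dict) -> dict:
    """Return, per entity, the lifecycle operations no process performs."""
    gaps = {}
    for entity, processes in matrix.items():
        covered = set("".join(processes.values()))
        missing = set("CRUD") - covered
        if missing:
            gaps[entity] = sorted(missing)
    return gaps

print(crud_gaps(CRUD_MATRIX))
# Here, no process ever deletes a Lead, and nothing creates, updates or
# deletes an Agent - each gap is a "What do you do when..." question.
```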
From an analyst perspective the bar is quite a bit higher on what constitutes quality. You are "done" when the requirements have quality. To have requirements quality, requirements must be:
- Correct - the requirement is an accurate elaboration of a documented business objective or goal
- Unambiguous - the requirement has only one interpretation
- Complete - the requirement is self-contained with no missing information
- Consistent - the requirement is externally consistent with its documented sources such as higher-level goals and requirements
- Ranked - the requirement is prioritized for some purpose
- Verifiable - the requirement is usable (e.g., testable) by the testers who must verify and validate it
- Modifiable - the requirement specifies only one thing
- Traceable - the requirement has its own unique identifier that can be used for tracing purposes
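If it helps to make the checklist auditable, the eight criteria can be carried as a simple review record. A human still makes each judgment; the structure just makes an incomplete review visible. The field names below mirror the list above, but this is a hedged sketch, not a prescribed tool:

```python
# The eight quality criteria, in the order listed above.
CRITERIA = ["correct", "unambiguous", "complete", "consistent",
            "ranked", "verifiable", "modifiable", "traceable"]

def unresolved(review: dict) -> list:
    """Return the criteria not yet judged True for a requirement."""
    return [c for c in CRITERIA if not review.get(c)]

# A partially reviewed requirement: three criteria confirmed so far.
review = {"correct": True, "unambiguous": True, "traceable": True}
print(unresolved(review))  # the work left before calling this requirement "done"
```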
Friends, my rant is a little long this month but I hope a few ideas here bring clarity or a new perspective on a troublesome topic. "Done" never happens; reasonable happens. Your perspective on what to look at to assess requirements "quality" depends on your role, and you can educate the organization about how each level of the organization plays a role in assessing whether or not requirements are reasonably clear, accurate and complete. Check out http://www.iag.biz/resources/webinars/microcast--executive-guide-to-evaluating-requirements-quality.html for a quick example of how you might educate executives.
I wish you all, great success.
Keith Ellis is the Vice President, Marketing at IAG Consulting (www.iag.biz) where he leads the marketing and strategic alliances efforts of this global leader in business requirements discovery and management. Keith is a veteran of the technology services business and founder of the business analysis company Digital Mosaic which was sold to IAG in 2007. Keith's former lives have included leading the consulting and services research efforts of the technology trend watcher International Data Corporation in Canada, and the marketing strategy of the global outsourcer CGI in the financial services sector. Keith is the author of IAG's Business Analysis Benchmark - the definitive source of data on the impact of business requirements on technology projects.