
Tag: Elicitation

FAT REQUIREMENTS: How to Lose Weight and Enjoy the Sunshine

Have you noticed the obesity epidemic plaguing North America? Hey, I’m not talking about the kind you get from eating the wrong foods – I’m talking requirements obesity. Generating fat, hulking documents that look terrible on the shelf gathering dust is not at all conducive to enjoying the summer fun. Here are a few ideas for slimming down those requirements. Not only will you experience the joy of actually being ‘done’ and getting into the sunshine, people will better understand what the business wants, and projects will be more successful. How cool is that as a win-win for everyone?

  1. Split requirements documents into three: scoping, business requirements, and detailed requirements. Most people don’t make this separation and end up spending way too much time detailing functionality that will never see the light of day, or detailing it at a level that is completely unnecessary for the task at hand. Remember – iterate.
  2. Separate the WHAT from the HOW. Get clear on WHAT the business wants to do before you start to detail HOW it is going to do it. Over and again, I find myself mired in a review of details that are not only clearly inappropriate, but inconsistent, because there was just too much about how some technical aspect of the system was going to be ____________ [fill in the latest technical jargon term]. Get people back to basics here and you’ll find both faster cycle times on projects, AND, vastly simplified requirements documents.
  3. Concentrate on just the right information. Business requirements fall directly out of understanding the desired state of the process flow, data flow, and business rules of the business. The actual ‘requirements’ are the structured statements describing the gap between current state and future state in these areas. Can we agree that this is Business Requirements? If so, why blather on about the business case, get into interface design, or write an opus on the history of compliance? It’s not really necessary. Sure, you need a few additional pieces, like a data dictionary, if you hope to have everyone on the same page when you use the term “Customer”, but we’re not dealing with a lot of additional information.
  4. I’m willing to bet that, in North America, more projects have managed to kill their intended benefits by failing to identify and manage interdependency than pepperonis have been killed to make pizzas. Business professionals need to have interdependency drawn out for them, because this is the stuff where executives actually have to make decisions – and they will love you for bringing it to their attention. Get into using context diagrams – every line on that diagram is an interdependency – or, at least, have a section for this in your documentation.
  5. Negotiate the details. Every project is different and needs different detail to bridge from business requirements to system design. It’s natural that a system selection/implementation will have fundamentally different dynamics than an off-shore design/build. In the face of high quality business requirements, the nuance of what detailed requirements should be for this project at this time can be negotiated amongst the project team. You’ll invariably end up with a tighter definition of what is needed, and better project momentum with recipients when you deliver what they’ve asked to receive.
  6. Remember – ALL YOUR STAKEHOLDERS WANT TO BE IN THE SUN TOO. Make the process efficient for everyone.

This is not about being lazy – this is about being hyper-efficient when it comes to projects. In candor, executives respect analysts who can make them, and their project team, more proactive and productive. Driven to the extreme, you can wallow in requirements detail until the winter months start blowing frozen air over the corpse of your project. But is this valuable – or can you get to the right information more efficiently, get better process around developing that information, and yield successful results faster? I’ll tell you – absolutely, yes, you can! Take a look at the research base at www.iag.biz – you can be iterative, reduce the time to get requirements by 58%, AND STILL have documentation quality high enough to drive down change requests by 75%. So, what do you do with the 58% less time? Why, it’s summer. Go to the beach!

I bet every experienced analyst out there knows where to chop out fat without sacrificing requirements quality. I’d encourage you to share your stories of obese requirements – or fat-chopping solutions.


Keith Ellis is the Vice President, Marketing at IAG Consulting (www.iag.biz) where he leads the marketing and strategic alliances efforts of this global leader in business requirements discovery and management. Keith is a veteran of the technology services business and founder of the business analysis company Digital Mosaic which was sold to IAG in 2007. Keith’s former lives have included leading the consulting and services research efforts of the technology trend watcher International Data Corporation in Canada, and the marketing strategy of the global outsourcer CGI in the financial services sector. Keith is the author of IAG’s Business Analysis Benchmark – the definitive source of data on the impact of business requirements on technology projects.

Requirements: How Do I Know When I’m Done?

This is one of the most common and controversial questions amongst business analysts and project managers: “When are requirements done?” The most common response to this question, “When the customer signs off,” frankly makes me want to tear my hair out. The focus has to be on what you can control – defined levels of quality, timeliness, completeness, accuracy, clarity, or communication – not on what you cannot control. But that still begs the question – “How do I KNOW when I’m done?” – what’s the definitive if-I-have-“X”-I-am-sure-I’ve-not-missed-anything-and-I’ll-get-a-pat-on-the-back?

Here’s the issue for everyone looking for a silver bullet: requirements are never done – at least not in the business information systems world. Trying for ‘absoluteness’ (to make up a new and cool consulting word) will more likely lead to process failure than success. As you cascade past business requirements and get into increasingly detailed iterations of specification documentation, the line between what is a requirement and what is a solution gets increasingly fuzzy.

Now before all the agilists out there start whooping it up in agreement: ‘absenting’ requirements (you need to be equal opportunity about making up consulting words!) will lead to incredible performance inconsistency – at an individual resource level, at a long-term asset management and integrity level, and at a corporate expectation management level. I can even prove it: low requirements maturity agilists perform terribly on on-time/budget/success measures versus high requirements maturity agilists (if you want the data, send me a note through the editor or the www.iag.com site and ask). The issue of requirements quality is common across all development approaches: you MUST define what “clear, accurate and complete requirements” are for your situation if you expect to materially change project performance and success rates.

Therein lies the answer: defining “complete” means a company has to describe all three of:

  • the state of requirements (quality of information),
  • the format of requirements (template and techniques used for visualizing requirements), and
  • the process through which these artifacts are achieved.

Doing this defines what ‘done’ is for analysts and project managers. The question of “Am I done?” only really arises at companies where there is weakness in one of the requirements state, format or process attributes. Deal with these three attributes, and the company starts itself down that path of maturing requirements practices and materially changes its ratio of on-time/on-budget/success performance on projects.

OK! Get up and stretch – for those of you saying “Hey, he’s ducking the question!” – there’s more.

Not everyone can deal with the long-term fix. We sometimes have to do quick fixes as a reality of day-to-day management. Here are four simple tests that tell you requirements are not yet done to a reasonable or sufficient degree of quality. These are Business Requirements-level tests that work and will improve your requirements quality irrespective of delivery methodology:

  • If the requirements lack context. Requirements always exist to support “what” the business wants to do, not “how” it wants to do it. The “what” part of this is the context of BUSINESS PROCESS. With no understanding of which business processes are impacted by the requirements, someone has no idea of how requirements impact each other, no idea of the impact of removing requirements, and no ability to assure that the requirements collectively are complete or will meet a specific business objective. The way a company applies context in its documentation also creates the STRUCTURE of the documentation. Here is one technique example – Use Cases. As a technique, Use Cases give both context and structure to requirements and help an analyst assure that the scope of the project is both well described and sequenced.
  • If the interdependency is not evident: How do you look for proof that interdependency is documented? Look for a section in the material called “dependencies”, check the “issues list”, and look for an analysis technique called a context diagram (every line on a context diagram is an interdependency). Why is interdependency so important? There are two aspects to scope: internal to the system (e.g., its functionality, the workflow and information flow, etc.) and external to the system (e.g., how this system needs to interact with other systems, how the workflow being automated hands off to other departmental units). In the absence of knowing the interdependencies, you only ever know HALF of the story on scope, so it becomes probable that you will encounter significant scope shift on any system of any degree of complexity.
  • Unclear business objectives: Objectives must be Specific, Measurable, Achievable, Results-oriented, and Time Bounded (easy to remember as ‘SMART’). The absence of objectives eliminates the ability to assess solution tradeoffs, makes prioritization of functionality difficult, and causes a host of other problems. You can test whether a particular function meets needs with user acceptance tests. You cannot test whether the collective system meets needs unless you have clear objectives.
  • You cannot tell from the description of business need how information is going to move. I wish I had $10 for every time I’ve seen business requirements expressed as a process flowchart. This is a nice (albeit somewhat inefficient) first step, but here’s the problem: when you get to the ‘decision diamond’ in the picture that says “approve the policy (Y/N)”, how do you expect developers to know what this means? The only way to elaborate this is to ask the questions, “What information do you need to know to approve that policy?”, “What do you do with that information?”, “Where do you get the information from?”, “Who else do you give the information to?” and so on. The more detailed the description of WHAT the business wants to do, the more the description of process will center on how information needs to move in support of the business process. Until you get to the level of detail that expresses information movement, you have no idea from the documentation what the business intent is. It’s easy to test – just look for the NOUNS. Lots of nouns used consistently when describing a step in a process means you’re probably OK (a rough sketch of this noun test follows the list).
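To make that noun test concrete, here is a minimal Python sketch. It assumes a hypothetical glossary of business nouns (the sort of terms a data dictionary would hold) and a couple of made-up process steps; a step whose description touches none of the glossary terms is a hint that information movement has not been described.

    # Rough sketch of the "look for the nouns" test. GLOSSARY stands in for a
    # data dictionary of business nouns; the process steps are invented.
    GLOSSARY = {"policy", "customer", "premium", "application", "decision"}

    steps = {
        "Approve the policy (Y/N)": "Underwriter reviews the application, the customer history "
                                    "and the premium, then records a decision on the policy",
        "Notify applicant": "Send it out",   # no business nouns: information movement unclear
    }

    def nouns_used(text):
        """Return the glossary terms mentioned in a step description."""
        words = {w.strip(".,()?").lower() for w in text.split()}
        return {term for term in GLOSSARY if term in words}

    for name, description in steps.items():
        found = nouns_used(description)
        verdict = "probably OK" if found else "suspect: no information movement described"
        print(f"{name}: {sorted(found)} -> {verdict}")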

I’ve given everyone these four tests because they always apply. Even at the executive level of the organization, these things should be deemed important. They are clear, auditable, and technique-independent traits that can be assessed whether you are plan-driven, prototyping, using a vendor-supplied method, agile, or whatever.

How does this view of “Done” change when you’re a business analyst?

From a business analyst perspective, it is your job to dig deeper and be accountable for quality. I’ll give you a few thoughts as take-aways especially for you:

There are techniques that can help an analyst test requirements completeness and uncover missing business logic. Look at CRUD diagrams as an example of a commonly known test and technique. Look at entity relationship diagramming (ERD) as a lesser-known test (a bit of trivia here: any time there is a null relationship between entities in an ERD, you need to ask the question “What do you do … ” to flesh out what happens in the circumstance where, for example, a lead is never assigned to an agent). Look at some of these techniques to improve your ability to be proactive and identify issues before they arise as missed requirements. Factor the time to do some of this activity into your assignments.
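As a rough illustration of the CRUD idea, the sketch below (with hypothetical entities and use cases) builds a simple coverage matrix and flags entities that no use case ever creates, updates, or deletes – exactly the gaps that the “What do you do when …?” question is meant to flesh out.

    # Minimal CRUD-matrix check (illustrative only): map each use case to the
    # operations it performs on each entity, then flag entity/operation gaps.
    crud = {
        "Create Lead":    {"Lead": "C"},
        "Assign Lead":    {"Lead": "RU", "Agent": "R"},
        "Close Lead":     {"Lead": "RU"},
        "Register Agent": {"Agent": "C"},
    }

    entities = {"Lead", "Agent"}
    operations = "CRUD"

    coverage = {entity: set() for entity in entities}
    for ops_by_entity in crud.values():
        for entity, ops in ops_by_entity.items():
            coverage[entity].update(ops)

    for entity in sorted(entities):
        missing = [op for op in operations if op not in coverage[entity]]
        if missing:
            # e.g. nothing ever Deletes a Lead, or Updates/Deletes an Agent:
            # ask the business "What do you do when ...?"
            print(f"{entity}: no use case covers {', '.join(missing)}")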

From an analyst’s perspective, the bar is quite a bit higher on what constitutes quality. You are “done” when the requirements have quality. To have requirements quality, requirements must be (a rough automated check for two of these attributes is sketched after the list):

  • Correct – the requirement is an accurate elaboration of a documented business objective or goal
  • Unambiguous – the requirement has only one interpretation
  • Complete – the requirement is self-contained with no missing information
  • Consistent – the requirement is externally consistent with its documented sources such as higher-level goals and requirements
  • Ranked – the requirement is prioritized for some purpose
  • Verifiable – the requirement is usable (e.g., testable) by the testers who must verify and validate it
  • Modifiable – the requirement specifies only one thing
  • Traceable – the requirement has its own unique identifier that can be used for tracing purposes
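Two of these attributes lend themselves to a crude automated check: scanning for vague words (Unambiguous) and checking for unique identifiers (Traceable). The sketch below is illustrative only – the weak-word list and sample requirements are made up, and no scan replaces a proper review.

    # Crude lint for two attributes: naive substring scan for vague words, plus
    # a duplicate-ID check. Word list and requirements are hypothetical.
    WEAK_WORDS = {"fast", "easy", "user-friendly", "appropriate", "flexible", "as needed"}

    requirements = [
        {"id": "REQ-001", "text": "The system shall respond to a quote request within 2 seconds."},
        {"id": "REQ-002", "text": "The approval screen shall be easy to use and flexible."},
        {"id": "REQ-002", "text": "Reports shall be exportable to CSV."},   # duplicate ID
    ]

    seen_ids = set()
    for req in requirements:
        vague = [w for w in WEAK_WORDS if w in req["text"].lower()]
        if vague:
            print(f'{req["id"]}: possibly ambiguous terms: {vague}')
        if req["id"] in seen_ids:
            print(f'{req["id"]}: duplicate identifier, not traceable')
        seen_ids.add(req["id"])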

Friends, my rant is a little long this month but I hope a few ideas here bring clarity or a new perspective on a troublesome topic. “Done” never happens; reasonable happens. Your perspective on what to look at to assess requirements “quality” depends on your role, and you can educate the organization about how each level of the organization plays a role in assessing whether or not requirements are reasonably clear, accurate and complete. Check out http://www.iag.biz/resources/webinars/microcast–executive-guide-to-evaluating-requirements-quality.html for a quick example of how you might educate executives.

I wish you all, great success.


Keith Ellis is the Vice President, Marketing at IAG Consulting (www.iag.biz) where he leads the marketing and strategic alliances efforts of this global leader in business requirements discovery and management. Keith is a veteran of the technology services business and founder of the business analysis company Digital Mosaic which was sold to IAG in 2007. Keith’s former lives have included leading the consulting and services research efforts of the technology trend watcher International Data Corporation in Canada, and the marketing strategy of the global outsourcer CGI in the financial services sector. Keith is the author of IAG’s Business Analysis Benchmark – the definitive source of data on the impact of business requirements on technology projects.

The Realities of Surveys in Requirements Gathering

Requirements gathering techniques include the survey: easy to send, though sometimes hard to develop, it can obtain data from a wide variety of people located anywhere. Surveys, however, are notorious for many faults, such as ambiguity and low response rates.

But surveys can produce a large volume of information for the gathering parties to peruse and collate, so developing good surveys is important both for the respondents, who have to understand the questions, and for the collators, who need useful data.

Statistics Prove “IT”

It has been said you can prove anything with statistics if you ask the right question. Obviously, you need to ask the right questions to get the right data. Right?

Reality can be different. The interpretation of an ambiguous question, an open-ended question, a leading question, or a question made up of words most people do not understand can be very detrimental to getting the answers you need.

For instance, if the requirements were related to virtualization of your datacenter equipment, you should avoid being too general or assuming too much knowledge on the part of the respondent.

Ambiguous Question examples:

How does it help?
Who can use this?

Open Ended Question examples:

Why do you need this feature?
What services would help you?

Leading Question examples:

If this service were to fail, who would be affected?
Without web based access to this device, what can be done?
Who amongst your group has the necessary required certifications?

The “Right” Questions

So what is the right question to ask? You do not want to insult your respondent nor do you want to assume they understand the situation or concepts entirely.

How can you ask the right question? Ask an eight year old. Or rather, pretend the audience is eight years old. Assume they do NOT have a vast knowledge of the topic, a slanted knowledge of the world, or possibly a jaded view of the system. Use simple language that is less likely to be interpreted differently.

Provide respondents with some background information or assumptions at the beginning of the survey, or point them to relevant documents or websites they can use to clarify the concepts and issues.

Avoiding Acronyms and Industry Jargon

Oftentimes, in the technical world, we have developed our own jargon or acronyms for the myriad technologies, services, applications, and such that we have developed and/or have to deal with every day. For instance, SMB is one acronym that has vastly different meanings depending on your perspective. A recent book had this acronym within its title, and it was very confusing for the network specialist who later found out the book was meant for a general business-type audience. What does it mean to you? In the book’s context, it was Small-to-Medium Business; to the technician, Server Message Block. To my kids on their instant messaging service, it meant So Much Better.

The standard axiom of KISS (or, Keep It Simple, Stupid) applies to questions – keep them short and simple but clearly to the point. In some cases, a preliminary paragraph stating the base assumptions will help to clarify the premise for the questions.

In Requirements Gathering

A common method of gathering requirements from many stakeholders, when interviews or workshops are not practical, is to send a survey of general topic questions in the early phase of discovery. These general questions should lead toward more specific questions as the discovery phase progresses. This is when specific questions become more critical. The questions must be very topical or delivery-oriented, maybe even very technical in nature, and yet they must ensure the respondent clearly understands both the premise and the concepts.

For instance, a general trend today is toward using virtualization. Most people have no idea what this really means or even implies. To make your survey results more coherent, you may need to include references to websites, local or external, that explain the concepts. Have multiple people within your organization peruse these materials to assess their usefulness, and employ a wide cross-section of people to gauge whether the content is too technical in nature. You should not rely exclusively on technicians for feedback.

Survey Says …

The right questions on a survey may not always be enough. In some cases, the options to respond and/or the choice of responses can affect your survey results.

Fixed answers, such as in multiple choice, should give the reader plenty of leeway for their decision. Be as wordy as necessary for each choice, without too many overlapping or confusing variations of the same request.

Assumptions

Surveys have to assume some level of knowledge on the part of the respondent, but that is not always achievable when dealing with different technical or managerial staffing levels.

For instance, the following question assumes a great deal of knowledge on the part of the respondent:

Q. How would your department benefit from using virtualization?

This is a rather open-ended question, requiring the respondent to be very familiar with the topic and to have thought critically about the lack of virtualization for some time. To improve this question, provide some options that will make sense rather than have the respondent come up with all the answers:

Q. In an effort to reduce our data-center footprint, the company has been looking into the possibility of moving toward the use of specialized hardware that maintains the separate host services on a centralized, redundant virtual computing platform. Each current host that meets all the criteria (see website //xxx/criteria for details) for being virtualized will be transitioned to the virtual environment. How would your department benefit from using virtualization?

Redundancy [ ] Backups [ ] Ease of recovery [ ] Availability [ ]
Reduce licensing requirements [ ] Reduce number of physical boxes [ ]
Reduce management overhead [ ] Reduce resources [ ]
Expand service levels [ ] Ease of testing [ ] Ease of deployment [ ]

This may be too limiting; instead, provide a range of responses:

Q. [same intro] How would your department benefit from using virtualization?
(Scaled response with 1 being the least important and 5 being the most important)

This might be followed up later with questions that specifically define some of the key attributes for switching to a virtualized platform or not.

Q. Select all that apply to your server:

Server provides: [ ] file [ ] print [ ] dns [ ] AD [ ] other _____
Comments: _________

Service must be: [ ] kept separate [ ] own IP [ ] behind firewall
Comments: _________

Setup requires: [ ] Win2K [ ] Win2K3 [ ] Win2K8 [ ] XP [ ] Linux
Comments: _________

Survey Choice Option Default Settings

Surveys often provide a scale for their responses. Some include as few as three options, while others may have 10 or more within the range. There is no best answer, although five to eight choices are probably more than adequate for most ranges.

Imagine a now-defunct restaurant asked you to fill out their web-based survey as you left the restaurant. They failed, however, to put an easy-to-find link to the survey on their website. After searching the entire main page, and a few subsequent pages, you would eventually find a small link in a bottom menu. Within the survey web page, there was a simple explanation of the scale being from 1 to 5 for the 10 questions presented. Each question ended with the same simple drop-down list. The default drop-down value was preset to 1 for all 10 questions. This may or may not have been planned, but it could produce a lot of artificially low scores.

Common survey choice ranges include:

“Excellent, Good, Fair, Poor, N/A”

“Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree”

“1=Very Bad to 5=Excellent”

“1 is worst, and 10 is best”

Clearly, each of these ranges requires questions that would elicit such a response from the expected target audience. However, do not expect that you will get the top-level choice. How often are you told you are doing an “excellent” job as opposed to a “good” job? Probably not very often. So why would you expect the general public to give a 5 or 10, an Excellent or Best response? And why would you then assume anything else is unfavorable if you have not clearly defined to the respondent what each choice represents?

Do Ranges Reflect Expectations?

The option “Good” could mean many things to many people such as: good, good enough, acceptable, expected norm, at least as good as you expect, etc. This is especially true when the “Excellent” response is the only one good enough. If anything other than “Excellent” is not good enough, ensure the respondent knows that they have to explain why they did not select “Excellent.”

In many hotels, you are given a document requesting that you call the local manager if there is anything that is not to your liking at any time. They want to provide you with a “10 out of 10” experience every visit. They conveniently provide a survey form for when you leave. What do you think happens if you do not fill out the survey for your stay? Perhaps an automatic “10” for that visit?

In some local grocery and department stores, they now have “Outstanding Service” forms you can fill out to honor anyone who has gone out of their way to provide exemplary service for you. No mention of where to enter complaints.

Can you remember your last few visits to a restaurant and a large department store? What was your experience? What was your expectation? What would you expect to be asked? Which range would you expect for the answers?

Surveys Everywhere

Surveys are now ubiquitous throughout the world, and we have developed a love-hate relationship with them. We love to send them but hate to respond to them. We, and this is a collective “we,” tend to ignore most surveys unless there is something in it for us. There are companies on the Internet that offer rewards for filling out surveys from their clients.

Many years ago, a former boss had a great philosophy about surveys done by large groups of customers. He would sort them according to rank, from best to worst, and then ignore the first and last to get the average. He assumed that there was always one in the crowd who had a weird idea of what “good” meant, and he balanced this against those who just quickly gave a general “good” or “excellent.”
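That “ignore the best and the worst, then average” rule is essentially a trimmed mean. Here is a minimal sketch, assuming the responses have already been mapped to numbers (the scores below are invented):

    # Trimmed mean: drop the highest and lowest score(s), then average the rest.
    def trimmed_mean(scores, trim=1):
        """Drop the `trim` highest and `trim` lowest scores, then average the rest."""
        ranked = sorted(scores)
        kept = ranked[trim:-trim] if len(ranked) > 2 * trim else ranked
        return sum(kept) / len(kept)

    # Hypothetical responses on a 1 (Poor) to 5 (Excellent) scale.
    responses = [5, 4, 4, 3, 4, 1, 4, 5, 3, 4]
    print(f"plain mean:   {sum(responses) / len(responses):.2f}")
    print(f"trimmed mean: {trimmed_mean(responses):.2f}")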

There are lots of ways to interpret survey results – many of them bad!

Professional sports teams have all sorts of methods, or “improvements,” to make their stats look good. There are rules for how stats are accepted and interpreted. They usually keep a running total, not just the single events. Keep this in mind when you are working on your results. Taken in small groupings, there can be wide swings in averages. Taken in the aggregate, these averages of averages are more consistent.

In Summary

The usefulness of a survey is in the responses that are received. Be as helpful as possible to the respondents by ensuring they have sufficient information and knowledge to complete the survey. Strive to provide a reasonable and clearly stated range if one is used. Restrict the space for open-ended questions and keep the questions as simple and succinct as possible, so that anyone can understand each question. Adding an incentive for respondents to complete the survey may also speed up the process. Surveys can take a long time to prepare, collect, and collate, but with careful planning, a well-executed survey can simplify the process of gathering requirements.


David Egan is a Course Director and developer of content for more than 20 Global Knowledge courses. David has more than 20 years of teaching experience including more than five years of teaching ‘on-line’ virtual classes over the Internet using Centra, Interwise and iLinc virtual classroom services. David has also written technical books and delivered adult learning seminars on UNIX, Linux, Microsoft Operating Systems and Services, as well as Business Process Analysis and Project Management since 2005. David and his family live in a suburb of Vancouver, B.C. This article was originally published in Global Knowledge’s Management in Motion e-newsletter, Business Brief. Global Knowledge (www.globalknowledge.com) delivers comprehensive hands-on project management, business analysis, ITIL, and professional skills training.

Copyright © Global Knowledge Training LLC. All rights reserved.

Why Visualize Requirements?

How many times have you been in a meeting discussing a set of requirements, a methodology, a project plan, etc., and someone has gotten up from their chair and said, “Where’s the whiteboard? Let me draw what I mean”?

I can tell you for me it has been plenty!!!!

Whilst requirements specifications are a great way to document the detailed information related to a new or existing product’s functionality, we all live in a time-poor society. Few of us have the time to trawl through large documents, extract the information we need, and then start the seemingly endless e-mail threads to discuss the individual use cases associated with each requirement – threads full of messages that start and end with “What did you mean by X?” and “I meant X and Y, but I think you thought I meant Z!” Instead, why don’t we adhere to the adage that a picture tells a thousand words and, instead of page after page of documents, create a visual representation of those requirements – hopefully communicating a thousand words in a single picture.

However, what we must remember is that visualization of requirements can vary in its meaning. For example, some people may view requirements visualization in the same context as simulation diagrams, whilst others interpret visualization to mean simple use case diagrams or business process flows typically created in an MS Visio-type tool. For me, all of these usage contexts can represent visualization, so instead of trying to classify visualization into one genre, I think it is best to view it on a scale, with simple flows at one end and high-end simulations at the other – and the user selects whichever method is most appropriate at any given time. For example, if you are trying to show how a user will move through an application to make a purchase, then using MS Visio to define process flows may be enough. However, if you are trying to envisage how a new UI (user interface) may look, then mockups and richer content visualizations would serve you better. Whichever method is selected, there are a number of benefits that come from visualization (a small sketch of generating a simple flow follows the list); these include:

  • Flexibility and Usability – flow diagrams can be easier to navigate, helping readers find content
  • Mistakes can be easier to identify in a visualization
  • It is easier to identify potential parallels between requirements and business processes
  • It is easier to spot missing Use Cases in a business process
  • Increased understanding of the requirements themselves
  • Increased understanding of the dependencies between requirements
  • Visualization of business flows can provide a first bridge to Business Process Models or SOA repositories
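As promised above, here is a tiny, tool-agnostic sketch of the “simple flows” end of that scale: a few lines of Python that emit Graphviz DOT text for a hypothetical purchase flow. The steps are made up; the point is simply that even a generated picture this plain makes the ordering and the decision point easy to review.

    # Emit Graphviz DOT text for a hypothetical purchase flow. Render the
    # output with any DOT viewer, or just read it as a structured outline.
    steps = [
        ("Browse catalogue", "Add item to cart"),
        ("Add item to cart", "Checkout"),
        ("Checkout", "Payment approved?"),
        ("Payment approved?", "Confirm order"),
        ("Payment approved?", "Show payment error"),
    ]

    lines = ["digraph purchase_flow {", "  rankdir=LR;"]
    for source, target in steps:
        lines.append(f'  "{source}" -> "{target}";')
    lines.append("}")

    dot_source = "\n".join(lines)
    print(dot_source)
    with open("purchase_flow.dot", "w") as fh:
        fh.write(dot_source)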

Now that we have explored some of the benefits of visualization, the question becomes: when should it be used? Should we visualize every requirement we write or just some? And if we are going to be selective, which requirements should we choose?

In my opinion there are a number of questions we can ask ourselves which can help to determine when to and when not to visualize. These include (and there are many more):

  • Type of development method – we need to ask ourselves: do requirements visualizations fit in with the need for more agile and rapid requirements definition, or will they add more time to the development process?
  • Complexity of the requirement – if a requirement has too many sub-requirements, will this create a “spider’s web” diagram which may overcomplicate the definition of the requirement?
  • Type of requirement – should we visualize the user story only and define the functional requirements associated with this user story as text or do we want to visualize all requirements?
  • Risk level of the requirements – should only high priority or high risk requirements be candidates for visualization?

It is important to note that I am not saying requirements visualization is a “panacea” for enabling effective business and IT communication, but it will act as a good facilitator, helping to initiate a better degree of communication and understanding between the two parties.

So now the decision is yours. Why not try visualizing requirements and feed back to the group on how things go?


Genefa Murphy works and blogs with Hewlett-Packard where she is Product Manager for Requirements Management. This article first appeared in HP Communities.

© Copyright 2009 Hewlett-Packard Development Company, L.P.

Requirements Definition for Outsourced Teams

In today’s economic environment, business organizations are demanding focused attention to fiscal discipline. IT organizations are finding themselves asked to support in-production applications on flat budgets, and new development is largely being approved only where it delivers efficiencies. Software applications are the focal point of improving efficiencies, as consolidation and integration projects can both reduce the support costs of multiple siloed applications and streamline business processes for end users.

In this effort to do more with less, IT software groups are turning to outsourcing in record numbers in 2009. According to IT World, the economic collapse of 2009 has accelerated the use of outsourcers for software projects to record levels [1].

With CIOs turning to outsourcing as a strategic imperative to increase efficiencies for software projects, new challenges are being introduced that threaten the same efficiencies CIOs are moving to achieve.

By definition, outsourcing introduces third-party goods and services to augment capacity and capabilities. Since IT software has mission-critical implications, such third-party influence places a new burden on the business to ensure that these outsourced teams are properly goal-oriented, properly instructed, and properly managed to ensure productivity.

While there are many areas that can be influenced to ensure outsourcer success, study after study indicates that the true control point for IT software projects is application requirements definition.

What follows explores industry, analyst, and customer recommendations on how to focus on requirements to ensure application development accuracy and to control risks, so that the IT organization can turn those efficiencies into increased horsepower and lower operational costs.

Requirements Communication: A Challenge for IT Project Teams

The quality of requirements communication is a significant challenge for IT project teams, whether they are co-located or distributed. In the Software Development Lifecycle, the time dedicated to requirements definition has largely been consumed at the early stage of the lifecycle, and it has involved dozens of subject matter experts who typically carry the title of business analysts or business systems analysts.

However, recent studies indicate that while business analysts do consume up to 10% of the project budget documenting requirements specifications [2], the result of their effort is typically in the form of difficult-to-understand paper-based documents. These paper-based documents are largely consumed by IT project teams, who must work to understand the intent of the author and translate the business need into detailed specification documentation.

Even in IT projects which largely consist of in-house development teams, i.e. not outsourced, the resulting rework and waste has been measurable. IAG tells us that the waste and rework that trace directly to poor requirements typically consume upwards of 40% of the budget [3].

As IT organizations move to embrace outsourced teams as an extension of IT software project teams, the challenge of communicating requirements is exacerbated. MetaGroup tells us that over 50% of organizations that leverage outsourced teams have critical business-application knowledge in the minds of in-house developers who have been disenfranchised by the outsourced labor pool [4]. As a result of this loss of subject matter expertise, outsourced IT organizations increase their dependence on the customer to produce highly precise and specific requirements documentation. MetaGroup also tells us that turnover rates at outsourced service providers run at an average of 15-20%, which makes it likely that specific talent assigned to your project will turn over during the project cycle. This continues to reinforce the need for easily referenceable and consumable requirements direction. Experienced outsourcing customers and industry analysts have identified the appropriate focus areas to ensure IT project teams’ success when deploying outsourcing. While there are many areas that can impact the success of an IT team that has moved to leverage outsourced teams, there are a select few that dramatically improve success. IDC’s recent report on control points for outsourcing success helped draw focus to the most important areas.

IDC articulates that the control points for ensuring outsourcing is an opportunity for efficiency, and not a threat to efficiency, are the strategic touch-points to the outsourced team. IDC documents these touch-points to be requirements definition, quality assurance, and in-flight project collaboration [5]. Other analysts, such as Gartner, voke, and Forrester, have all offered supporting research pointing to these same control points.

Controlling the Control Points

As mentioned earlier, there is a generally accepted principle of the importance of requirements, quality assurance, and collaboration when aligning outsourced teams. However, when IT arranges the control points in relation to one another, the logical priority of requirements is revealed. As you can see in Figure A, the quality assurance and project collaboration control points are directly impacted by the depth, quality, and understanding of the project Requirements Definition phase. In fact, rarely is a single quality assurance test scenario not directly based on (and traced back to) functional or non-functional requirements. A modern quality assurance trend is a move toward test-based development, a trend that is accelerating with outsourced teams. Test-based development builds a one-for-one relationship between test cases and requirements, where a test case can literally function as a requirement asset. In addition, collaboration with development is often directly linked to the implementation of a business requirement, or to how the software influences that requirement.

Figure A: Relationship of Requirements to QA and Collaboration

By directing focus on improving requirements definition, IT project teams that leverage outsourcing groups can better manage all control points, and thus improve the impact and focus of quality assurance and in-flight collaboration efforts.
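To make the test-based development idea above concrete, here is a minimal, hypothetical sketch of the one-for-one relationship between test cases and requirements: each test carries the ID of the requirement it verifies, and a trivial report flags requirements that no test covers yet. None of the names or calls below come from any particular tool.

    # Hypothetical requirements catalogue.
    REQUIREMENTS = {
        "REQ-101": "A customer can reset a forgotten password",
        "REQ-102": "Password reset links expire after 24 hours",
    }

    def covers(req_id):
        """Tag a test function with the requirement it verifies."""
        def tag(test_func):
            test_func.requirement = req_id
            return test_func
        return tag

    def send_reset_email(address):
        """Stand-in for the real application call (hypothetical)."""
        return True

    @covers("REQ-101")
    def test_password_reset_sends_email():
        assert send_reset_email("user@example.com") is True

    # Simple traceability report: which requirements have no test case yet?
    tests = [test_password_reset_sends_email]
    covered = {t.requirement for t in tests}
    for req_id, text in REQUIREMENTS.items():
        if req_id not in covered:
            print(f"{req_id} ({text}) has no test case yet")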

Problems with Requirements Communication

As we discussed in the previous section, working with outsourcers brings the obvious challenges of aligning distributed, third-party resources around project goals. Location challenges alone can introduce time zone and collaboration barriers that tax productivity and efficiency. With third-party organizations, however, IT groups can face additional challenges involving processes, tools, training, context, domain expertise, and incentives.

Requirements communication fits squarely into the center of this challenge. As we discussed, traditional methods of communicating requirements, which include enumerated lists of features, functional and non-functional requirements, business process diagrams, data-rules, etc., generally are documented in large word-processing or spreadsheet documents. When applied to an outsourced team, this method of communicating creates significant waste and opportunities for failure, as the barrier to understanding is too large to overcome.

Incorrect interpretation and the lack of requirements validation can create artificial (or false) goals which consume valuable outsourcing resources. Due to the nature of software development, these false goals usually manifest themselves into incorrectly implemented code, resulting in costly waste and rework. Outsource providers often treat such rework as “changes”, and bill back these “changes” to the customer. This continues to erode the efficiency that IT organizations strive to achieve when adopting outsourcing in the first place.

Models, Validation and the Requirements Contract

To significantly reduce the probability of ineffective requirements communication through natural language documentation, IT organizations are transitioning to more precise vehicles to communicate requirements.

One of these vehicles is the adoption of a model-based approach to communicate requirements in a highly visual way. Requirements models provide detailed context capture through highly precise data structures. Complete models use universally accepted formats as structural guides, interlinking them to create a holistic representation of the future system. The formats used in these holistic representations include use cases for role- (or actor-) based flows, user-interface screen mockups, data lists, and the linkage of decision points to business process definitions. These structures augment enumerated lists of functional and non-functional requirements.

The benefits of models include the use of simulation to ensure requirements understanding. Simulation is a communication mechanism that walks requirements stakeholders through process, data, and UI flows in linear order to represent how the system should function. Stakeholders have the ability to witness the functionality in rich detail, consuming the information in a structured way that eliminates miscommunication.

Models and simulation also provide context for validation. Validation is the process in which stakeholders review each and every requirement in the appropriate sequence, make appropriate comments, and then sign off to ensure the requirements are accurate, clear, understood, and feasible to implement. Requirements validation can be considered one of the most cost-effective quality control cycles that can be implemented for an outsourcing initiative.

Since requirements are the “blueprint” of the system, outsourced stakeholders can make use of requirements models and simulation during implementation to gain an understanding of the goals of the project. Simulation eliminates ambiguity by providing a visual representation of goals, which in turn removes the need for interpretation.
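As a rough, tool-agnostic sketch of what such an interlinked model might look like (all names below are hypothetical), use-case steps can reference screen mockups and the data they carry, and a “simulation” can simply walk the flow in linear order for stakeholders to review:

    from dataclasses import dataclass

    @dataclass
    class Screen:
        name: str
        data_fields: list[str]

    @dataclass
    class Step:
        actor: str
        action: str
        screen: Screen

    @dataclass
    class UseCase:
        name: str
        steps: list[Step]

        def simulate(self):
            """Walk the flow in linear order, showing the screens and data touched."""
            print(f"Use case: {self.name}")
            for number, step in enumerate(self.steps, start=1):
                print(f"  {number}. {step.actor} {step.action} "
                      f"[screen: {step.screen.name}; data: {', '.join(step.screen.data_fields)}]")

    # Hypothetical example model.
    quote_form = Screen("Quote Request", ["customer name", "product", "coverage amount"])
    review = Screen("Underwriter Review", ["risk score", "premium", "decision"])

    UseCase(
        "Request an insurance quote",
        [
            Step("Customer", "submits a quote request", quote_form),
            Step("Underwriter", "reviews and prices the request", review),
        ],
    ).simulate()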

Rich requirements documentation often is a specified deliverable for most IT projects for various reasons that include regulatory compliance (Sarbanes Oxley, HIPAA, etc.), internal procedural specifications, and other internal review cycles.

This documentation also serves as the contract between the customer and the outsourced provider. Models can serve as the basis of this documentation, and next-generation requirements workbench solutions (such as Blueprint Requirements Center) can transform models into rich, custom Microsoft Word documentation. Since these documents are auto-generated, the effort required to build and maintain them is minimal.
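To illustrate the auto-generation idea only – this is not the Blueprint product, just a sketch assuming the open-source python-docx library and a hypothetical in-memory model – a few lines can regenerate a Word document whenever the model changes:

    # pip install python-docx; the requirements list below is hypothetical.
    from docx import Document

    requirements = [
        {"id": "REQ-201", "title": "Submit quote request",
         "text": "A customer shall be able to submit a quote request online."},
        {"id": "REQ-202", "title": "Price quote",
         "text": "An underwriter shall be able to review and price a pending request."},
    ]

    doc = Document()
    doc.add_heading("Business Requirements Specification", level=0)
    for req in requirements:
        doc.add_heading(f'{req["id"]} - {req["title"]}', level=1)
        doc.add_paragraph(req["text"])
    doc.save("requirements_spec.docx")   # regenerate whenever the model changes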

Abstract vs. Detailed: Outsourcer Involvement in Requirements Definition

Outsourcing providers have learned a tremendous amount about how to improve the efficiencies of requirements communication. Many providers are shifting to a much heavier involvement in the process of Requirements Definition. Others continue to operate in a more traditional model, which abstracts them from the requirements definition process, leaving this on the shoulders of the customer.

Western outsourcers have heavily pioneered and practiced an approach that includes efforts to work with customers to articulate, document, and communicate requirements. Part of the value proposition of this approach is that the outsourced provider mitigates the risk of misunderstanding, and ensures that members of the outsourced team gain a clearer understanding of the project goals and deliverables. This approach is often referred to as a detailed approach for requirements definition in outsourced projects.

Indian and European outsourcers largely continue a practice which abstracts the outsourced provider from the definition of requirements. Such abstraction means that the customer takes on the responsibility to clearly document and articulate requirements to the outsourced provider. This requires the customer to produce extremely accurate and precise specifications of project requirements, knowing that cultural, time zone, process, and alignment barriers exist in the interpretation of these requirements. This approach is often referred to as an abstract approach to requirements definition in outsourced projects.

It is important for an IT organization to understand which of these two approaches is taken, as the choice can dramatically change the methodologies and practices required to ensure clear understanding of project requirements.

The Solution: A Case Study

The principles described in this paper should be considered and applied at the earliest stages of the project, both to set the stage for the work that follows and because the earlier errors are discovered and resolved, the less expensive their impacts will be. Just as the cost of errors increases exponentially the later they’re found in the lifecycle, the corollary is also true – finding and dealing with them early can result in exponential savings. In the case of outsourced development, this should happen even before the outsourced vendor is chosen, during development of the Request for Proposal (RFP).

An example of such a case is provided by Knowsys, a Blueprint partner. Knowsys staff were contracted to come in late to an RFP cycle that had gone awry at a major North American financial institution. This company needed to re-architect the entire e-commerce platform for its Wealth Management business. Significant investment had already been made in the RFP process, and when Knowsys arrived, the client was just beginning a series of vendor presentations, with senior executives of the client in attendance, in which each vendor summarized its bid for the outsourcing contract. As the presentations continued, the “elephant in the room” kept getting bigger and bigger. It had become clear that something had gone terribly wrong.

The requirements specified in the RFP to the vendors were incomplete. There was conflicting information and inconsistent levels of detail. Upon further analysis, it was discovered that whole business areas had been neglected. Compounding this was the fact that subject matter experts had incorrectly assumed how certain areas of the business functioned. These inaccuracies were further compounded by the various vendors who bid (seven in all) layering on their own assumptions to fill in the gaps. The result was a series of vendor presentations that were wildly different, almost as if they were trying to address seven different problems, none of them the customer’s. In addition to uncovering these major flaws in the requirements of the RFP, this event also made it obvious that the process for inviting vendors was less than perfect. Some had clearly invested huge amounts of time and effort in their proposals, while others had invested less. None had sufficient familiarity with the customer’s business or situation to be able to point out the obvious flaws.

In effect, the reset button was hit. The executives directed the group, with Knowsys now involved, to redevelop the RFP. This time executive sponsorship was front and center, and all aspects of the business were directed to be accessible and to support the initiative. All relevant aspects of the business were thoroughly analyzed and their needs amalgamated into a unified representation of the requirements. Validation was performed to ensure coverage, depth, and clarity. Much more rigor was applied to the process of selecting vendors for bidding (a contrast from the open invitation used in the first cycle). A smaller group of more focused vendors who knew the client’s business were invited. The Knowsys team also made sure the vendor relationship was far more collaborative, while respecting the impartiality required of the bidding process. They ensured there were multiple points of client-vendor contact and also put measures in place to ensure that any and all assumptions were validated. Finally, an emphasis was also placed on quality assurance and testing aspects (as opposed to a sole focus on the requirements of what was to be built) to produce a much more rounded picture of the bidders’ proposals.

The net result of these initiatives was tremendous. The second set of presentations, by a much smaller group of bidders, was like night and day compared to the previous round. It was clear that each vendor had a very accurate grasp of the client’s problem and goals. Each proposal was compelling and had unique and interesting variations on their proposed solutions.

A vendor was selected and the project got underway, later than hoped due to the failed initial RFP cycle. Everyone felt much more confident entering such an important development initiative with the specifications they now had, the vendor they had selected, and the proposed solution. That confidence was validated when the project, even in the face of unexpected business changes along the way, was delivered on time and on budget with all success criteria met. Had this financial institution selected a vendor in the first round, and proceeded with development on that basis, the results would undoubtedly have been quite different.

Conclusion

The steady rise in outsourcing of software development has accelerated in the recent economic climate as companies desperately seek ways to reduce IT costs. The promise of cost savings is realizable, but only to those who focus on three vital requirements control points in the outsourcing arrangement: requirements communication, requirements reference, and requirements validation. Gaining mastery of these through appropriate processes, practices, and automation will dramatically improve the probability of success of the outsourced engagement, delivering the needed cost savings.

Footnotes

[1] “Five Trends That Challenge Technology Offshoring in 2009,” IT World.
[2] Karl E. Wiegers, Thorny Requirements Issues Handbook, Process Impact, 2005.
[3] IAG Requirements Survey, 2007.
[4] MetaGroup, SearchCIO, “Top 10 Risks of Offshore World.”
[5] IDC analyst Melinda Ballou, “Offshore Your Way to ALM,” RedmondMag.


Matthew Morgan is a 15-year marketing and product professional with a rich legacy of successfully driving multi-million dollar marketing, product, and geographic business expansion efforts. He currently holds the executive position of SVP, Chief Marketing Officer for Blueprint, the global leader in Requirements Lifecycle Acceleration solutions. In this role, he is responsible for strategic marketing, partner relationships, and product management. His past tenure includes almost a decade at Mercury Interactive (which was acquired for $4.5B by HP Software), where he was the Director of Product Marketing for a $740 million product category including Mercury’s Quality Management and Performance Management products. He holds a Bachelor of Science degree in Computer Science from the University of South Alabama.