
Optimized Use Cases

Use cases have been around for some time now, bursting into the public spotlight at the 1987 OOPSLA conference in Orlando. Since then they have been used in countless projects. In some situations use cases have excelled, producing remarkable results, while in many others they have failed miserably. A seemingly endless stream of textbooks and articles has been written about them, which, especially to those new to the subject, can be overwhelming and confusing.

One missing piece of this equation is that requirements definition, as a professional discipline, was until recently underserved by the industry and by software tool vendors. Communities, professional organizations, and modern software toolsets are now finally available to support requirements authors and the requirements lifecycle. Some of these tools bring new and innovative capabilities not imagined before, and as they are applied to existing approaches like use cases, new sets of best practices emerge. This article examines some use case best practices learned from using Blueprint Requirements Center on real projects.

Fundamentals

Less is More

One of the most unfortunate habits of requirements authors is the belief that providing greater quantities of more detailed information is a good thing. The goal with requirements is to communicate precisely what is needed, and for the information to be completely and accurately understood by the consumer. Unfortunately, many authors feel that the more information they can stuff into the document and the more detail they provide, the better job they’re doing.

Example: Pretend for a moment that you have just been put in charge of tourism for the county. Tourists arrive with the goal of discovering the area – its culture, character, history, attractions, and so on. Your job is to help them achieve this goal. In your enthusiasm you start creating brochures detailing the various aspects of the county. And you keep on making brochures – so many that you end up with hundreds covering every mundane little thing in the area. After a month of operation you discover there is no pattern to the brochures taken – one from here, one from there. Something like 90% of the brochures haven’t even been touched. Surveys of tourists show no consistency in their perceptions of the county – it’s almost as if different tourists visited different counties entirely. To summarize the results:

  • The goal of communicating the unique culture and character of the area wasn’t met;
  • Visitors all left with very different impressions and understandings of the county;
  • You spent a lot of time, effort, and money providing information and 90% of it wasn’t even used.

This example mirrors the requirements situation in many projects and organizations. “Information overload” is a huge problem. So often the answer to every issue and misinterpretation in requirements documents is to add more content in order to ‘elaborate’ and ‘clarify’. Simply adding more content is generally counter-productive, often doing more harm than good by introducing conflicting information, inconsistencies, and redundancies.

When authoring requirements you should always put yourself in the position of the consumer. Strive to communicate what’s needed using the smallest volume of content possible. Since even this can be considerable in size, you should also strive to make that content navigable: structure it so that readers know where to start and how to traverse it in order to best understand what’s being communicated. While this takes skill and effort on the part of the requirements author, the positive effects on the software project can be dramatic.

Know Your Boundaries

If I had to pick one aspect of use case models that people should ensure they get right, it would be having a good understanding among all stakeholders of what the system boundary is. For some applications it’s obvious and apparent; for others it can be quite the opposite. Since a use case documents a dialog that spans this boundary, not having a good understanding of it can severely reduce the clarity of your requirements. For those who rely on the use case model to do their work, such as designers and testers, the work products will suffer similarly.

Fantasy vs. Reality. Try Bottom-Up

For those new to the use case approach, it’s easy to get lost among unfamiliar terms like include, extend, actor, association, model, and so on. With so much focus on learning to navigate this new and imaginary world, it’s easy to lose sight of the real world it’s supposed to represent. Developing work habits that regularly move you back and forth between the two can help keep your modeling work grounded in reality.

Use cases do, of course, provide many benefits, not the least of which is to clearly identify a higher purpose for a collection of steps and decisions – to answer the question “why are we doing these actions?” So, where a group of steps and decisions together fulfill some goal of an external user, we group them, and that grouping becomes a use case. Where a flow crosses from one group to another, that crossing becomes a relationship: if the flow comes from a decision, it’s an “extend” relationship; otherwise, it’s an “include” relationship.
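To make the grouping rule concrete, here is a minimal sketch in Python (the steps and names are invented for illustration; no modeling tool works exactly this way) of deriving include and extend relationships from flows that cross group boundaries:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Step:
    name: str
    is_decision: bool = False  # True if this step is a branch point

# Hypothetical grouping of real-world steps and decisions into use cases.
use_cases = {
    "Withdraw Cash": [Step("insert card"), Step("enter PIN"),
                      Step("sufficient funds?", is_decision=True)],
    "Handle Insufficient Funds": [Step("show error"), Step("eject card")],
    "Print Receipt": [Step("format receipt"), Step("print")],
}

def classify(source: Step, source_uc: str, target_uc: str) -> str:
    """A flow that crosses use-case boundaries becomes a relationship:
    'extend' if it leaves from a decision, 'include' otherwise."""
    kind = "extend" if source.is_decision else "include"
    return f"{source_uc} --{kind}--> {target_uc}"

# The conditional flow out of the funds decision is an extend;
# an unconditional flow into receipt printing is an include.
print(classify(use_cases["Withdraw Cash"][2], "Withdraw Cash", "Handle Insufficient Funds"))
print(classify(use_cases["Withdraw Cash"][1], "Withdraw Cash", "Print Receipt"))
```

The point is only that the relationships fall out mechanically once the grouping is chosen; the real debate, as noted next, is over the grouping itself.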

There is typically little debate over what the actions and decisions are, as this is simply the real-world work that needs to get done. There is, however, often debate over what the use cases are, since these are abstract groupings of the real-world steps and decisions, and one person could come up with a different grouping than the next. When multiple people independently group the steps and decisions and arrive at comparable sets of use cases, that is an indication of a well-understood problem space.

It’s important when modeling to not lose sight of the real world which is the subject of your model.

Modeling Style

There are many different styles of modeling. Some styles, for example, make use of decomposition, where abstract actions or steps are decomposed into finer detail contained in included use cases. There is much debate over whether decomposition with use cases is a good idea; it is simply one approach. One advantage it offers is to let readers adjust, at will, the level of detail at which they work: stay at a high level, or follow the path of decomposition to drill into the details of a specific action. One objection to its use is that it can influence the developers who work from the use cases toward implementations that are not object-oriented in nature.

Think Like a Tester

Tests are very similar to requirements. Both are really just descriptions of “what the system is supposed to do,” so if they’re describing the same piece of software, they need to be completely consistent. Some, in fact, hold the view that tests are really just a more detailed form of the requirements. Looking at the requirements from the perspective of a tester can therefore be very valuable for detecting issues early.

In particular, I’ve found the tester’s perspective can help a great deal in defining the system boundary. Testing is all about providing stimulus to some “thing” and then observing that thing’s behavior in response, so the borders of where the thing begins and ends need to be clear, and testers are great at thinking this way. A second area where I’ve found testers to be a great help in requirements definition is thinking of exception cases. Stakeholders and business analysts tend to be very good at identifying what the system should do when things go right, but experienced testers excel at thinking of all the possible ways things can go wrong. Having this knowledge up front means it can be accounted for and influence the requirements while they’re being defined, rather than being an afterthought late in the cycle.

In addition to the tester as an individual, modern requirements toolsets that can automatically generate tests provide tremendous value as well. When reviewing requirements – even when authoring them – these tools allow the corresponding tests to be generated instantly and used as another litmus test for requirements quality. Often inconsistencies and errors can be spotted in the generated tests that were missed when reviewing just the requirements.

Be Expressive

Use cases are our tool for communicating what is to be built. To achieve this you need to be as expressive as possible. No matter how good you are with words, written text can only go so far. One of the easiest yet most effective techniques is to mock up potential user interfaces for the steps of the use case – in other words, to render the more significant aspects of the user interface as it evolves through a use case. Most often these will be simple sketches, but where appropriate they can be higher-fidelity visualizations. Together, this set of mockups forms a storyboard for the various scenarios. If the project is enhancing an existing application, screenshots of the existing application can serve as the starting point; you can then annotate them, mark them up, or add new screen controls as needed. Shifting the focus of reviews onto storyboards, as opposed to the text of the use case flows, can make them significantly more effective. As with the test generation mentioned earlier, there are now tools that support this more visual approach to defining use cases.

Another way to be more expressive is with data or information. Where the use case affects or transforms data in the application, or where data influences the behavior of the use case, modern requirements tools will actually allow you to encode these calculations and updates instead of describing them in text. During a simulation session, not only are the visualizations shown, but samples of real data can be entered, displayed, and calculated much as in the real application. Together this brings the story to life, as opposed to forcing reviewers to imagine it from textual use cases.

Modern automation provides yet another way to increase the expressiveness of use cases. A typical software project has a large amount of reference material related to the requirements: standards, guidelines, regulations, vision documents, operational documents, and more. Tools today can automatically open this material, navigate to the relevant content, and highlight it when these references are called upon. This brings relevant content directly into the discussion, rather than leaving it buried off to the side where important aspects could be missed.

The Bigger Picture

Where Do They Come From?

Let’s say I came up to you and said, “Hey, please make me three or four detailed use cases. I’ll be back after lunch to pick them up, and I expect them to be correct!” Your chances of delivering what I need are pretty much zero. You need to find out what my goals are for the application, what it will be used for, what major decisions it must support, whether it enhances an existing system or is entirely new, whether there are constraints, whether there are special needs like security or safety, and so on. In other words, you have some work to do before you get to the use cases. Typically this work results in textual, hierarchical lists of goals, rules, and other categorizations of need, and often in sketches of the business processes in which the application is to play a part.

So I’ve Made Some Use Cases. Now What?

After creating use cases, you’ll need them to be reviewed with the client for whom the application is being built, and also with the people who will build and test it. Reviews of requirements are one of the most crucial control points in the software lifecycle: they’re an opportunity to point the project in the right direction, and to do so early. Errors missed in reviews are simply errors whose discovery has been delayed – they will eventually be found, just later, when they’re more expensive to fix. The effectiveness of the review depends on how well the requirements can be communicated, and the more expressive the requirements, the more likely they’ll be communicated clearly.

Another major way to be more expressive is to use simulation during reviews. Modern requirements definition toolsets support simulation, where the requirements are “brought to life” to give an impression of how the future application, if built to these requirements, will look, feel, and behave. After all, most people reading requirements are in fact performing a simulation in their minds, trying to visualize the future application in order to decide whether the requirements document is acceptable. The problem is that in a mental simulation things are missed, all the interactions cannot be accounted for, and, perhaps worst of all, everyone has a somewhat different vision depending on how they interpret the written text. Automated simulation, projected for all to see, has none of these issues and provides all the benefits – literally a common vision.

Not only is communication vital during reviews to get the requirements right, it’s also vital for those who will build and test the application to understand what they need to do. Any miscommunication here means people will go off in the wrong direction with their work, and simulation is a very good tool for this as well. Once they do understand, however, they need access to the use cases and associated requirements information to do their work, since their tasks depend upon this information. These are areas where modern tools can really make a difference, in a number of ways.

First, tools today can automatically generate tests directly from use cases. This is a huge time-saver. Not only is the work done automatically, but it’s correct. More advanced tools even allow you to filter the set of tests produced, focusing only on the highest-risk or most business-critical ones.
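As an illustration of the underlying idea (a minimal sketch with invented steps and risk scores, not any particular vendor’s algorithm), one common approach is to treat each path through the use case’s flows as a test and then filter by risk:

```python
# A use case's flows as a tiny graph: each step lists its possible next steps.
# Step names and risk scores are invented for illustration.
flows = {
    "enter PIN": ["validate PIN"],
    "validate PIN": ["select amount", "reject PIN"],          # decision point
    "select amount": ["dispense cash", "insufficient funds"],
    "reject PIN": [],
    "dispense cash": [],
    "insufficient funds": [],
}
risk = {"reject PIN": 3, "insufficient funds": 5, "dispense cash": 8}

def paths(step, prefix=()):
    """Depth-first walk: every root-to-leaf path is one candidate test."""
    prefix = prefix + (step,)
    if not flows[step]:
        yield prefix
    for nxt in flows[step]:
        yield from paths(nxt, prefix)

tests = list(paths("enter PIN"))
# Keep only high-risk tests, scoring each path by its riskiest step.
high_risk = [t for t in tests if max(risk.get(s, 0) for s in t) >= 5]
for t in high_risk:
    print(" -> ".join(t))
```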

Second, requirements definition tools today can auto-populate the tools used by designers and testers with the requirements and tests produced in the requirements definition tool. This avoids the transcription errors and oversights that often happen when you deliver a document and the practitioner must manually enter the relevant information into their own toolset.

Third, requirements definition tools today can automatically generate the documentation you need, whether because your process calls for it or to comply with corporate standards. Document generation governed by templates lets you define the format and content of the documents entirely. More advanced tools can even produce redlined documents showing changes since some previous version, such as the last review session.

Conclusion

There have been significant gains made in requirements definition tools in recent years. This perhaps shouldn’t be surprising, given that this area, arguably one of the most crucial for determining the success of software projects, was neglected by software tool vendors for decades. These advancements, coupled with best practices learned by applying the new technology on real and complex projects, have the potential to clear the log-jam of software project failures that has plagued the industry for years.



Tony Higgins is Vice-President of Product Marketing for Blueprint, the leading provider of requirements definition solutions for the business analyst. Named a “Cool Vendor” in Application Development by leading analyst firm Gartner, and the winner of the Jolt Excellence Award in Design and Modeling, Blueprint aligns business and IT teams by delivering the industry’s leading requirements suite designed specifically for the business analyst. Tony can be reached at [email protected].

It Ain’t Easy Being Agile

You have to admit – agile folks are conflicted. On one hand, there are folks screaming that requirements are dead***. On the other hand, people teaching agile practices have to explain the asterisks; they mention these things called user stories and the practices for getting good user stories (like making each user story testable, and how to deal with non-functional requirements). Then there are the folks rolling out these practices and using them in real life on complex engagements. We’re facing the issues of sequencing and redundancy of stories, figuring out which ones accidentally change the architecture of the system (oops!), which ones were really a whole book rather than just a story, and how to actually get to the promised land of higher productivity. No wonder you get questions from developers like, “Can I write down this non-functional requirement?” Agile is still a storm of mixed messages – and, like the Internet bubble of the late ’90s, the hype might do more harm than good to the movement over time.

 


Let’s get away from the hype and look at the facts of performance – meaning development efficiency, timeliness, and stakeholder satisfaction. I recently looked at all of these variables in a large-scale research study, The Business Analysis Benchmark. What I found is that Agile, Waterfall, Iterative, and Prototyping/visualization-centric approaches all performed identically. Statistically, there is absolutely no difference between any of them. What creates the performance difference is the level of requirements discovery and management maturity behind the primary SDLC selected. Any hype that says agile practices are universally better is simply wrong and detrimental over time. It will lead to very large-scale, and eventually public, failures – and a counter-movement away from these practices. That would be a real shame: agile is great stuff when used correctly.

Methods and practices of business analysis have to follow the mantra of “the right practices at the right time.” You can’t use OLTP analysis practices and techniques and expect to produce effective requirements for a data warehouse. It’s silly! Similarly, you want to target agile practices to the projects where agile makes the most sense, and have enough corporate requirements discovery and management maturity to know the difference. Anything else is like trying to blast a square peg through a round hole. If the stakeholders don’t want to participate in scrum meetings every day, get onto another track! If you’re dealing with something data-driven, or with very high numbers of transactions in a regulated system – run away, don’t walk. Yes, you could do it… but why would you? I could run around on the freeway too, but I’m suggesting there are better ways to use some roads. In the long run, using the right techniques at the right time will pull far more momentum to the movement than creating hype and watching the carnage.



Keith Ellis is the Vice President, Marketing at IAG Consulting (www.iag.biz) where he leads the marketing and strategic alliances efforts of this global leader in business requirements discovery and management. Keith is a veteran of the technology services business and founder of the business analysis company Digital Mosaic which was sold to IAG in 2007. Keith’s former lives have included leading the consulting and services research efforts of the technology trend watcher International Data Corporation in Canada, and the marketing strategy of the global outsourcer CGI in the financial services sector. Keith is the author of IAG’s Business Analysis Benchmark – the definitive source of data on the impact of business requirements on technology projects.

Using the Requirements Creation Process to Improve Project Estimates

 

Estimation can be one of the most difficult parts of a project. Important questions must be asked in order to arrive at the right figures and plans. How long will the project take? How many resources will it consume? Consultants must also ask: what is the appropriate amount to bid on this project? These questions are not easy to answer at the outset, when one generally has only a vague idea of what the project will require.

The good news is that there is a fairly simple way to improve project estimation and, consequently, the bidding process. Most people do not realize that the requirements creation process can lend insight into the length and scope of a project. Let me give you an example of how this method works and then explain how you can implement it within your own company.

The Story

Back in 1992, I was working for a consulting company named The Kernel Group (TKG). During this time, I was put in charge of porting Tivoli’s software from Sun’s Solaris operating system to IBM’s AIX operating system. The project was to be done under a fixed bid, and consequently, we at TKG knew that estimating the effort required to port the code was of paramount importance.

I looked at the code with a coworker of mine, and he came to the conclusion that if Tivoli didn’t make the project hard for us in some unspecified way, we could port the million or so lines of code in about a weekend. I told him that he was nuts – that it would take at least a week, maybe even two. We jointly decided that we probably ought to call it three weeks just to be safe. We also decided, rather smugly, not to report our padding of the schedule to evil upper management.

As a result, evil upper management drastically expanded the project bid to $325,000, and my coworker and I thought that this was a ridiculously high price. We believed that we were gouging the customer and that they would never accept it.

Yet they did accept it, and once the project began, we proceeded to discover how truly terrible we as software engineers were at the task of project estimation. To make a long story short, the porting schedule expanded to exceed our original estimate and we consumed not only all of the $325,000, but a whole lot more on top of it.

The Formula

Now our consulting company was religious about tracking employee time on a per-project basis, and so we broke every project into phases: requirements/specification, design, coding, testing, debugging, documentation, training, etc. This project was no different in that respect; we broke it down into its respective phases as well.

Just before we started working on the project in question, I read a book called Practical Software Metrics for Project Management and Process Improvement by Robert B. Grady. (By the way, this is a truly fabulous book that I would highly recommend to anyone who is managing software development projects.) According to the book, one of Grady’s rules of thumb is that 6-8% of every software project is usually eaten up in the requirements/specification phase.

One of the conclusions that Grady comes to in his work is that you can use this fact to estimate total project size. In other words, if it took 60 hours to do the specification, that’s probably 6% of the job and the job will be 1000 hours. Following such logic, a six hour specification implies a 100 hour job. Since the specification always comes first in any project, you can get some pretty reliable estimates from this method alone. In fact, in my experience as both a programmer and the CEO of a software company, I have found it to be incredibly accurate and useful.
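The arithmetic is trivial but worth writing down. Here is a minimal sketch in Python (the function name and default are mine; the 6% figure is Grady’s rule of thumb):

```python
def estimate_total_hours(spec_hours: float, spec_ratio: float = 0.06) -> float:
    """Project total effort from hours spent on specification, assuming
    the spec phase consumes spec_ratio of the whole project
    (Grady's rule of thumb: roughly 6-8%)."""
    return spec_hours / spec_ratio

print(estimate_total_hours(60))  # 1000.0 hours
print(estimate_total_hours(6))   # 100.0 hours
```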

A second way to triangulate a project estimate is to ask experts in the area for their opinions – hopefully they will be better at estimation than my coworker and I were that first time. A third way is to select an appropriate metric for estimation; for example, one could use lines-of-code counts or function points in estimating the length and scope of software projects. For architecture projects, the number of pages in the drawings or the square footage planned are similar analogies. Every project has some gross measure of its size that is available at the outset and can be used to plan the overall project, in addition to the method I’ve described of tracking time against the earliest phases.

So back to the story. We really blew it on estimating and bidding on that first project for Tivoli, but when the next one came around, we had data on the portion of the overall project that the requirements phase had taken up. This allowed us to use Grady’s ratio to predict overall project size, and we found that on this second project, we came up with a very accurate project estimate. This worked very well for all of the subsequent fixed-cost consulting work we did for Tivoli.

Partially due to the strength of the solution and how well it ran on IBM’s AIX operating system, Tivoli was eventually able to sell the company to IBM for $743 million in 1996.

For a consultancy that is doing fixed-cost projects, this concept of using the standard ratio of requirements phase to overall project length is a very powerful project estimation technique. It can eliminate erroneous bidding and its resulting costs, which is a major concern for such companies.

Accurate Bidding

Overbidding on a consulting job means that you won’t get the work in the first place, because the potential customer will give it to your competitor at a cheaper price. Underbidding, however, means you will win the deal and lose money. Neither situation is acceptable for businesses today, and yet most consultancies do a poor job in this area. One way to make more precise bids is to use a key performance indicator (KPI), a tool used to measure progress toward a strategic business goal. In this situation, the number you want to keep near zero is defined by the formula (E - A) / E, where:

E = estimated hours to complete the project
A = actual hours spent to complete the project

It is important to keep this KPI as close to zero as possible: a positive value means you overestimated, a negative value means you underbid, and a value near zero indicates that you are bidding on projects accurately.

Just tracking this number is a great first step towards better bidding, and you can get the necessary data to calculate it from any timesheet system, including a paper one. Automated timesheet systems, however, are generally even more effective in this area because they often have reports to calculate the KPI figure for you.
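For example, a minimal sketch of the calculation (project names and figures invented; any timesheet export with estimated and actual hours per project would do):

```python
# Hypothetical per-project figures: (estimated hours, actual hours).
projects = {
    "alpha": (400, 520),  # underbid: KPI comes out negative
    "beta":  (250, 240),  # slightly overbid: KPI is small and positive
}

def bidding_kpi(estimated: float, actual: float) -> float:
    """(E - A) / E: a value near zero means the bid matched reality."""
    return (estimated - actual) / estimated

for name, (e, a) in projects.items():
    print(f"{name}: KPI = {bidding_kpi(e, a):+.2f}")
```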

Improving adherence to your estimate can be difficult for some companies until they understand the ratio concept described above. An example of this is illustrated in the following diagram, which shows how the formula can work for any business. Your company’s magic number may not be 6-8% like Grady’s, but once you determine your own ratio for specification to total project length, you can use it again and again.

[Diagram: applying the specification-to-total-project ratio to estimate overall effort]
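Determining your own ratio is just an average over past projects. A minimal sketch, with invented figures:

```python
# Specification hours vs. total hours from past projects (invented figures).
history = [(55, 900), (80, 1100), (30, 450)]

ratio = sum(spec for spec, _ in history) / sum(total for _, total in history)
print(f"our ratio: {ratio:.1%}")  # ~6.7% here

# Apply it once the next project's spec phase is done:
next_spec_hours = 48
print(f"estimated total: {next_spec_hours / ratio:.0f} hours")
```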

Making it Work

I currently run a software company, Journyx, and I can assure you that this project estimation technique continues to be successfully employed by many of our customers, to their great advantage. It is easy to implement, and you can do it too. Once you do, you will start producing laser-sharp estimates before you know it. And that’s a result we can all feel good about requiring.

Happy estimating!



Curt Finch is the CEO of Journyx. Journyx offers customers two solutions to reach the highest levels of profitability: Journyx Timesheet – a timesheet and expense management solution for the entire enterprise – and Journyx ProjectXecute – a solution that unites project and process planning with resource management. Journyx has thousands of customers worldwide and is the first and only company to establish Per Person/Per Project Profitability (P5), a proprietary process that enables customers to gather and analyze information to discover profit opportunities. Curt is an avid speaker and author, and recently published “All Your Money Won’t Another Minute Buy: Valuing Time as a Business Resource.” Curt authors a project management blog and you can follow him on Twitter.

Tips and Tricks for Facilitating Conflict Resolution

We all know that conflict is a difference of opinion and therefore neutral: neither good nor bad. Right? But try telling that to a project manager or business analyst embroiled in conflict. Conflict can threaten to destroy the team and sabotage efforts to elicit requirements. But it doesn’t have to. Having a strong, neutral facilitator and a process for conflict resolution can reduce tensions and bring about a positive outcome.

Early in my career I was a liaison representing the interests of a large branch of a national bank. I was on a committee that met monthly to prioritize requirements. Each month I met with my branch management to determine their needs, and each month the liaisons from the other branches and I would argue about which new systems and enhancements should be given priority. There was no formal facilitator. Conflict was rampant and remained unresolved. I don’t remember much being accomplished in these meetings. Each branch came in with its own agenda, and each of us went away unsatisfied with the results. Time after time I was in the unenviable position of having to tell my management that they weren’t going to get what they wanted. Again!

In retrospect one of the things I should have done was to spend time understanding the problem management was trying to solve. That way I could have presented a coherent set of recommendations at the monthly meetings.

Another thing I should have done was to meet individually with key representatives before each monthly meeting to discuss our concerns, find common ground, and build relationships. Instead of returning empty-handed each month, I should have returned with a recommendation that helped not just our bank, but the entire network of branches across the country. Everyone would have benefited.

Finally and maybe most importantly, the meetings would have run more smoothly if we had had a facilitator to tell us where we were going and keep us on track.

Many years later I learned that when conflict is preventing important tasks from being completed, having a facilitator and a facilitation process is essential. Such a process might include:

  • Find a neutral facilitator. When emotions run high, it is important to find someone without a vested interest in the outcome. Some BAs and PMs take turns facilitating meetings for each other. Some organizations or PMOs provide facilitation services. What’s important is having a designated, neutral facilitator role.
  • The facilitator should set ground rules. One ground rule that can be used for conflict situations is that the participants will disagree with ideas and not people. This helps prevent the discussion from turning personal. If the discussion becomes emotional, the facilitator needs to bring the focus back to the issues at hand. If this is not possible right then, the meeting should adjourn.
  • Take time to understand the problem. Conflict arises for a variety of reasons. People have personal agendas, they think their way is the right way, they want to be recognized as experts, etc. We need to understand the real needs behind the stated needs, the issues behind the positions.
  • It is important for those in conflict to resolve it themselves. Once all participants understand the problem, we need to hold a brainstorming session to generate ideas to solve the problem. This can be done individually or in a group. Sometimes it is useful to have the participants write ideas on yellow stickies. It is important at this point to concentrate on generating ideas to solve the problem, not to evaluate the ideas presented.
  • Prioritize the solutions that have been generated by comparing approximate costs and benefits. You may need follow-up action items to quantify both the costs and benefits of the solutions.
  • Another facilitated session may be needed to develop a recommendation, or the recommendation can be assigned to one or several of the participants.
  • Present the recommendation to a pre-determined decision-maker, such as a project sponsor. It’s important to have a designated tie-breaker to ensure the conflict is resolved.

These steps will not prevent conflict, which is a natural part of a project. But they will help keep the project on track and prevent ruined relationships.



Elizabeth Larson, PMP, CBAP, CEO and Co-Principal of Watermark Learning (www.watermarklearning.com) has over 25 years of experience in business, project management, requirements analysis, business analysis and leadership. She has presented workshops, seminars, and presentations since 1996 to thousands of participants on three different continents. Elizabeth’s speaking history includes PMI North America, EMEA, and Asia-Pacific Global Congresses, various chapters of PMI, and ProjectWorld and Business Analyst World. Elizabeth was the lead contributor to the PMBOK® Guide – Fourth Edition in the new Collect Requirements Section 5.1 and to the BABOK® Guide 2.0 chapter on Business Analysis Planning and Monitoring. Elizabeth has co-authored the CBAP Certification Study Guide and the Practitioner’s Guide to Requirements Planning, as well as industry articles that have been published worldwide. She can be reached at [email protected]

Pro Sports and Business Analysis Come Together

Let me share a little about myself. I love my family, the business analysis profession, professional sports, Rocky Balboa, and Bruce Springsteen. When my loves blend, I couldn’t be happier! It happened a few weeks ago when the Dallas Cowboys unveiled their new $1 billion (yes, that’s a “B”) stadium. Pro sports and business analysis came together for me. Let me explain.

Jerry Jones, the owner of the Dallas Cowboys (whom I do not love, since I am a NY Giants fan), had a pet feature for this stadium: sixty-yard-long, high-definition screens that hang 90 feet above the field and run along the sidelines.

Man, and I thought my 47″ HDTV was cool.

Prior to having the screens constructed, the Cowboys allegedly showed the specifications to the NFL (National Football League) and obtained sign-off to move forward. The screens were built and installed. Although I have not been to the stadium, they look really cool.

The hype around the screens was all positive until the third quarter of the first pre-season game, when the opposing team’s punter kicked a ball right into the screen, causing a do-over. Just like that, the stadium “project” went from a success to a big concern in some critical stakeholders’ eyes. The NFL is now reviewing the situation: the height of the screens, the impact on games, and so on. A decision will be made soon on whether the screens need to be moved, who has to pay for the work, and when they can safely be moved.

So, what does sign-off really mean? Does sign-off on a requirements specification mean anything if the end solution does not meet the needs of key stakeholders? In the case of the Cowboys, they’ll most likely argue to the NFL that they received the necessary approval on the plans. This may result in the NFL picking up the tab for moving the screens, but there is still an impact on everyone involved. In my opinion, having only this type of sign-off is worthless. If you stop there, all it does is allow parties to place blame on one another. Why do you think so many customers are skeptical about signing off on requirements documents?

Now, let’s talk about the right way to obtain sign-off. The Indianapolis Colts, toward whom I am impartial because they play in a different division from the NY Giants, were planning a similar screen set-up when they were building their new stadium. To verify that the screens would not impede the game, they built a prototype of the screens and had their punter try to hit the mock-up. He was very successful in hitting it, and as a result of this simulation the team changed the design and placement of the screens. Now that’s what I call sign-off you can be confident about.

My dislike of the Cowboys aside, the moral of the story is that you need to make sure you obtain the right level of approval throughout a project. At different stages you need to take the opportunity to ensure you are headed in the right direction. I think the Cowboys did the right thing in getting the plans approved; the issue is that they stopped there. By simulating the scenario with their punter, the Indianapolis Colts were able to obtain the right level of approval.

I’d love to hear your sign-off stories, good and bad!

Have fun simulating,

Kupe



Jonathan “Kupe” Kupersmith is Director of Client Solutions, B2T Training and has over 12 years of business analysis experience. He has served as the lead Business Analyst and Project Manager on projects in various industries. He serves as a mentor for business analysis professionals and is a Certified Business Analysis Professional (CBAP) through the IIBA and is BA Certified through B2T Training. Kupe is a connector and has a goal in life to meet everyone! Contact Kupe at [email protected].