
Author: Keith Ellis

Get Your Priorities in Order

Let’s talk about setting priorities… not life priorities, but prioritizing requirements.

Prioritizing requirements is one of those activities that can be either super easy or crazy tough.  If you planned ahead, prioritization is surprisingly easy.  Prioritization needs three major ingredients:

  1. A large group of stakeholders who know development priorities need to be set
  2. A set of business objectives
  3. An application with at least some complexity

Here’s why:

The larger the group of stakeholders, the more important it becomes to have a way of prioritizing requirements. Let’s face it, if you only have one stakeholder, priority setting is really much more an executive decision than a process. However, if there are a lot of stakeholders, the process of priority setting needs to be objective, and business analysts can add value. In fact, the larger the group, the more rigor the methodology needs.

Sometimes you have to ask yourself, “Am I stepping into someone else’s role?”  That position is going to get very uncomfortable very fast.

Let’s say you’re a BA working with a team that clearly needs to set priorities and there isn’t one individual who can step forward in that role. The next big issue is to create an objective way of prioritizing. For this you must have some decent business goals/objectives for the project and use these to guide prioritization objectively. The approach is quite simple: take each requirement (or bundles of requirements supporting similar functionality) and ask the business stakeholders to rate it in terms of contribution to the business goals. Similarly, ask the IT folks to rate the relative complexity of development (cost/risk), then put the two sides together to create a four-quadrant grid (Dilbert would be proud!). The resulting chart makes for a good conversation.
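To make the mechanics concrete, here is a minimal sketch in Python of that rating-and-quadrant exercise. Everything in it is invented for illustration: the requirement names, the 1-to-5 rating scale, and the quadrant labels; real ratings come out of the stakeholder and IT sessions described above.

```python
# Minimal sketch of the value/complexity quadrant exercise.
# All names, ratings (1-5) and quadrant labels are hypothetical;
# real numbers come from stakeholder and IT rating sessions.

requirements = {
    "Validate customer":   {"business_value": 4, "dev_complexity": 2},
    "Price the order":     {"business_value": 5, "dev_complexity": 4},
    "Loyalty discounts":   {"business_value": 2, "dev_complexity": 5},
    "Order status emails": {"business_value": 3, "dev_complexity": 1},
}

MIDPOINT = 3  # boundary between "low" and "high" on a 1-5 scale

def quadrant(value: int, complexity: int) -> str:
    """Map a (value, complexity) pair onto one of four grid quadrants."""
    if value >= MIDPOINT:
        return "quick win" if complexity < MIDPOINT else "strategic bet"
    return "filler" if complexity < MIDPOINT else "question it"

for name, r in requirements.items():
    label = quadrant(r["business_value"], r["dev_complexity"])
    print(f"{name:20s} value={r['business_value']} "
          f"complexity={r['dev_complexity']} -> {label}")
```

The chart itself is just a conversation prop, which is really the point of the next paragraph.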

The exercise of setting priorities is really less about the mathematics and much more about each stakeholder putting a stake in the ground and talking about why they’ve rated something relatively high or low. It’s the discussion that counts – don’t shrug it off.

Finally, let’s talk about application complexity. Let’s say you’re doing a customer order management system. There is often no value in prioritizing something like ‘validate the customer’ higher than the next step in the same process, for example, ‘price the order.’ All these functionalities are absolutely necessary to the BASIC functioning of a system. If a system cannot function without something like ‘record the order’, why prioritize it? It doesn’t make sense. The first step is to separate functionality into must have/should have/could have (look at the MoSCoW technique), and only then start prioritizing. This approach separates the stakeholder discussion into two valuable stages (sketched in code after the list):

  • Is this functionality absolutely essential to the basic system?
  • Okay, if it’s possible to function without that capability (even in the short term) then let’s prioritize it so that we look at all the pieces together.
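Here is the sketch promised above: the same two-stage idea in Python, with hypothetical requirements and MoSCoW categories. Stage one pulls out the must-haves (no ranking needed); stage two ranks only what remains.

```python
# Two-stage prioritization: separate the MoSCoW "must haves" first,
# then rank only the optional capabilities. Examples are invented.

requirements = [
    ("Record the order",    "must"),
    ("Validate customer",   "must"),
    ("Price the order",     "must"),
    ("Order status emails", "should"),
    ("Loyalty discounts",   "could"),
    ("Gift wrapping",       "wont"),
]

# Stage 1: the basic system. Essential pieces gain nothing from ranking.
baseline = [name for name, cat in requirements if cat == "must"]

# Stage 2: look at all the optional pieces together and rank them
# (here, trivially: shoulds ahead of coulds).
order = {"should": 0, "could": 1}
remainder = [(name, cat) for name, cat in requirements if cat in order]
ranked = [name for name, cat in sorted(remainder, key=lambda r: order[r[1]])]

print("Baseline (no priority discussion needed):", baseline)
print("Worth a priority discussion:", ranked)
```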

For me, if a system is reasonably complex and everything described in the specification is mandatory, the spec may not be detailed enough.

Helping to prioritize functionality is one of those super high value activities of a business analyst. It sometimes helps to uncover issues lurking beneath stakeholder positions and may help bring obstinate stakeholders along. I find it very useful myself, but it doesn’t necessarily fit in every engagement. If it’s not your role to play, or the application isn’t complex enough to have elements with different priorities, don’t try to force-feed the tactic to time-deprived stakeholders.

I wish you all great success.


A Refreshing Viewpoint from a New Business Analyst Professional

I think it’s time for every experienced business analyst to go out there and interview someone who’s new to the profession.  It’s easy to get caught up in the daily grind and lose your perspective on the industry.  So last Friday I caught one of the newest members of the analyst community unawares and decided to pick his brain on his perceptions after two full months in the trenches of our industry with a pretty simple question:  “After two months doing this, what were your big AH-HA realizations?”

This Thing’s an Iceberg

“It’s everywhere.”  That’s the funny thing about business analysis; it looks small on the surface, but it touches a lot of things beneath the surface.  Business, technology implementation, understanding how processes work, getting people in agreement on how processes are improved, it’s a lot of stuff!  At first you look at it and say, “OKAY, business requirements, got it – it’s about business requirements.” Then you start to see all those other touch points that are wired into the fabric of how businesses work, and you start to realize just how big this thing is.  That is really cool!

The Impact is Huge

Business analysts really change how businesses perform and not in little ways. In a big way!  Business analysis impacts the success of organizations not just on the project front, but it also changes an organization’s ability to achieve its objectives in a meaningful way.  Right down to changing overall corporate financial performance and the speed of response to market, analysts matter.  This is not an IT thing, it’s a business success thing.

There’s a Whole Community Here

There are a whole lot of websites dedicated to this stuff, tradeshows, a professional certification, consultants, software tools.  It’s a community – an industry – and it’s big.  Who knows about it?

The new BA’s final thought was, I think, the one that struck home for me the most: most people have no idea our multi-billion-dollar industry exists. He’s right; if you ask the average Joe on the street what a business analyst is, he’d probably shrug. If you asked several thousand university graduates, how many would know, and how many would aspire to the profession? Even as companies realize that the skills of a capable business analyst are the very skills lacking in the executive ranks of so many corporations, how many people see business analysis as the starting point of their career path? Not many. Perhaps so few of the general population know about business analysts because it’s pretty easy to visualize the value of other business functions like marketing, sales, operations, HR or finance, but much harder to answer: what’s a business analyst? What do they do?

Isn’t it past time we came out of the closet as an industry?


Analysis versus Needs Analysis

Most analysts can do analysis… but very few can really be proactive in helping customers figure out their needs. Executives routinely complain about analysts being passive on projects, not proactive, or inconsistent in their execution. I think this is a real challenge that needs to be addressed.

Done well, business analysis is a leadership role, one that actively impacts the speed and outcomes of IT development activity. In this leadership role, analysts have the skills to proactively lead stakeholder discussion and create information in a format that communicates well to both business and technology about business needs. Analysts play the role of bridging the gap between business and technology.

Without the right processes, techniques, skills, organization, technology and deliverables, analysts default to small ‘a’ analysis.  This is a reactive role, taking in projects and generating activity without momentum.  The clarity of need and outcome is simply not emerging rapidly enough, or with sufficient consensus, or with the right usability of deliverables.  Projects are more difficult with small ‘a’ analysts on your team.  Work may be getting done, stuff is happening, but how much momentum is really being generated?  How much do you need to rework the deliverables of those people?  How many times are you getting the right blocks of content filled in on templates, but very little being said?  Know people like this?

Am I really being too harsh?  To me it comes back to the role and expectations, and I believe analysts provide LEADERSHIP.  Part of this leadership value is that analysts be a strong source of momentum.  This means you can’t have situations like the above.

OK, so you’re looking around your organization seeing lots of what I’ve coined small ‘a’ analysts. What do you do?

There are a few short term tactics you can employ to refocus the value delivered by analysts, and one long term tactic.  In both cases YOU, the manager, need to own the situation you’ve created for yourself.

The short term fix is to focus on ‘elicitation skills’. These are the methods and practices (like facilitation) of engaging stakeholders and asking the right questions at the right time to extract the information needed to determine business need. This has immediate benefits in the focus of analysts and their ability to engage stakeholders.

You can also implement a short term fix around quality standards: getting away from defining completion of the template as the standard, and moving toward clear guidelines on the fidelity of information in the template.

Finally, a short term fix could be to look at the services that your analysts provide to the organization and change the intake process for new projects that want to use these services. It can have a profound effect if the requirements management team knows how to quickly understand the status of a project and create more clearly defined work packages for their analyst teams from this understanding.

In the longer term, the management group has to assess requirements definition and management maturity, and set out a plan for improving this maturity level.  It’s the only way to get stickiness on change.  An organization cannot hope to make substantive improvement by focusing only on one capability:  skills (continuously training), process (continuously redesigning or enforcing a process), techniques (being militant about the use of certain standards), deliverables (getting people yet another template), organization (going from a requirements center of excellence to a center of practice), and technology (roll out something else… again!). 

Business analyst management needs to look across all capability areas and systematically improve the consistency of the organization across these areas.  The value of making this improvement is also extreme, doubling most performance metrics on projects.

Why do I get passionate about this? Seventy-five percent of organizations last surveyed were poor at doing requirements…

To really meet business need, analysts need to be leaders.

The Technology of Business Analysts

Here’s a personal opinion: business analysts remain one of the last bastions of antiquated technology. Put up your hands, all of you only using Word and Visio. Probably only 20% of you have requirements management software, and even fewer have requirements definition tools worthy of the name. Yet everyone is under pressure to improve performance and productivity – especially now that the economic recovery is underway and project pipelines are filling rapidly. Here’s a quick primer on the dos and don’ts of acquiring analyst productivity software.

OK, so analysts are the proverbial shoemaker’s children, always defining requirements for the solutions of others, and never for themselves. Unfortunately, this is the cardinal rule that many analyst organizations break – they don’t properly define requirements when they look for analyst requirements definition and management solutions. Seems a little crazy doesn’t it? A quick suggestion would be to step back from the glitz of the technology for a minute and perhaps run some facilitated sessions to identify the analyst processes, the information required to support these processes – maybe even identify a few interdependencies? Sound familiar? Once you do this, the wide variety of solutions out there start rapidly falling away – and very few likely remain.

Would it surprise you to learn that there are over 100 analyst tools that are well known enough for IAG to track them? Who knew the lowly analyst had so much choice? When you evaluate tools, it is easy to get caught up in the sizzle rather than the steak. The sizzle is the flashy functionality around the system that makes it more flexible in execution. The steak is the no-holds-barred answer to the question, “what was this thing principally designed to do?”

Since the sizzle is infinitely more interesting, let’s start there and talk about the kinds of sizzle you can get in an analyst tool:

Tool Integration. All tools have a degree of model sharing; upstream (to portfolio management, or project management) and downstream integration (to quality systems or development). This model sharing is either fully bi-directional (YAY!), meaning changes in one system are reflected in the other, or it is accomplished by simple export (BOO!) which can lead to all kinds of synchronization problems between repositories.

Documentation Generators. All tools are either better or worse at generating standard document templates and some are fairly configurable so that you can replicate existing internal standards for information production. More critically, these tools either capture into the repository the changes made to output (YAY!) or they force you to edit the format every time you print the same report (BOO!).

Stakeholder Collaboration. Most tools have a degree of automated collaboration to capture stakeholder commentary which is free (YAY!) but some tools force you to license the viewers and collaboration engine (BOO!). Usually, there is an auditing capability for indicating who said what, giving traceability on stakeholder review, validation and acceptance content.

Model Sharing and Global Repositories. Tools are typically designed to be massively multi-user (YAY!) or they are primarily a single user system that can share models between users (BOO!). The differences between these are the server database options, security/access controls, check in/out controls, model sharing options and controls, etc., within the system. Many systems lack the concept of a ‘project,’ which I think is a bizarre oversight. A project has unique attributes like a charter, stakeholders, objectives, etc., and would include one or more use cases from an enterprise requirements repository that could easily share the identical use case with another project being worked on by another analyst. Some systems enable these concepts; many do not.

Object Relationship and Impact Assessment. Most tools enable the user to define custom object types; however, the big YAY is the ease with which relationships are established (one click versus, in some cases, many, many, many clicks) and the degree to which you can compare object types or look at object relationships. This type of functionality is ABSOLUTELY critical for traceability management and impact assessment. Traceability and impact assessment require you to first establish relationships between objects (which can be cumbersome in some tools), illustrate these relationships in standard models, and show how relationships are impacted under changing circumstances.
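To see why relationship capture matters so much, note that a requirements repository is essentially a directed graph, and impact assessment is a walk of that graph. A minimal sketch, with invented object names and relationships:

```python
# Minimal traceability graph: each edge points from an object to the
# objects that depend on it. Object names are hypothetical examples.
from collections import deque

traces = {
    "Objective-1": ["UseCase-12"],
    "UseCase-12":  ["Rule-7", "Screen-3"],
    "Rule-7":      ["TestCase-44"],
    "Screen-3":    ["TestCase-45"],
}

def impact_of(changed: str) -> set[str]:
    """Return every downstream object reachable from a changed object."""
    impacted, queue = set(), deque([changed])
    while queue:
        for dependent in traces.get(queue.popleft(), []):
            if dependent not in impacted:
                impacted.add(dependent)
                queue.append(dependent)
    return impacted

# Change a use case and see everything it touches:
print(impact_of("UseCase-12"))  # {'Rule-7', 'Screen-3', 'TestCase-44', 'TestCase-45'}
```

In a tool where creating each edge takes many clicks, that graph never gets built, and the impact question can’t be answered.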

Ability to Embed Analyst Best Practices into the System. In a bizarre twist of fate, the better a tool is, the more ways it has to represent information in the system and the less usable the system is ‘off the shelf’ without some amount of training. This is a good-news, bad-news story since better quality requirements definition and management systems are highly customizable, but it could potentially spell chaos for larger teams of analysts unless some standards are set. The best tools have package configurability and the ability to embed elements of your analysts processes into the tool – meaning you can preset what objects are called, the attributes of objects and even eliminate certain options that you don’t want available to analysts in your company’s implementation of a particular tool. The purpose is to force analysts to model specific things, using specific techniques, in specific ways.

I could keep talking, but remember, these functionalities are the sizzle, not the steak. Yes, they are important functionalities, BUT, you have to first and foremost decide what analyst business processes you are trying to support – just like you’d advise your stakeholders to do. When you focus only on the sizzle, every tool looks good: every tool has some degree of sizzle and some degree of functionality in every area above. When you get down to the steak, tools were either designed to support certain analyst processes, or they were not.
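One way to hold that steak-first line during an evaluation is a weighted scoring matrix where fit to your own analyst processes carries most of the weight. A minimal sketch; every weight and score here is invented:

```python
# Weighted tool-evaluation matrix. The heavy weight on "core fit"
# keeps the steak from being drowned out by sizzle. All weights
# and scores below are invented for illustration.

criteria_weights = {
    "core fit to our analyst processes": 0.50,  # the steak
    "tool integration":                  0.15,
    "documentation generation":          0.10,
    "stakeholder collaboration":         0.10,
    "model sharing / repository":        0.10,
    "embeddable best practices":         0.05,
}

tool_scores = {                    # 1-5 per criterion, in the order above
    "Tool A": [5, 2, 3, 4, 3, 2],  # strong steak, modest sizzle
    "Tool B": [2, 5, 5, 5, 4, 4],  # all sizzle, weak steak
}

for tool, scores in tool_scores.items():
    total = sum(w * s for w, s in zip(criteria_weights.values(), scores))
    print(f"{tool}: weighted score = {total:.2f}")
# Tool A: 3.90 / Tool B: 3.35 -- the steak decides it.
```

Flatten the weights and the sizzle takes over; the weighting is where your process requirements actually bite.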

Tools tend to fall into different categories based on what they were first and fundamentally designed to do. Often, if something excels at one domain, it is poorer in other domains. You need to step back and pick one:

  • End-to-End Suites. Take IBM’s Rational product line, for instance, or a number of the Application Lifecycle Management (ALM) tools. Here the big difference is the scope of the software and the component integration between the tools. These systems are principally designed to be end-to-end, and you buy into the end-to-end concept rather than best-of-breed components. Inherently, this kind of design will have gaps. Some of these (like Rational) have requirements management (Reqpro/Doors) AND definition functionality (RRC); some are only requirements management tools. Some have the ability to manage conceptual and logical data models – others do not. Be careful.
  • Requirements Visualization Tools. These popular tools (like iRise or Microfocus’ TeamDefine) are mainly designed to allow stakeholders to interact with visual models of a system. I divide these into two broad categories: ‘visualization’ tools allow stakeholders to visually experience the process flow of an application, but they’d probably want an analyst walking them through the activity to talk about the specific rules and functionality; ‘animation’ tools enable stakeholders to fully interact with the system interface, and by embedding data rules behind the system, the user can also experience error conditions and alternate scenarios by operating (or animating) the system. Both classes of visualization tools have numerous vendors selling quite capable tools. Another point of evaluation is the fidelity of the models you can create.
  • Requirements Modeling Tools. There are tools like eDev’s InteGREAT (or Requirements Composer by IBM) that were principally designed for an analyst to model business requirements. Others, like RAVEN, are focused on a specific type of modeling (use cases) and are more ‘assistive’ for analysts (where we’re using all meanings of ‘assist’, e.g., increase productivity, help the analyst learn, automate the production of something, and act to increase quality). The key differentiators between systems are the number of model types created, the model fidelity, the number of domains of knowledge that can be created or managed – some only allow the modeling of process knowledge, while others can model any number of dimensions of knowledge (process, data, rules, issues, interdependencies/interfaces, training, etc.) – and the degree of knowledge transformation: given a text description, will the system create a graphical description, and vice versa? Given a use case model and clear actors, can the system render a cross-functional swim lane diagram?
  • UML Modeling Tools. These deserve a special mention since the UML can be used to generate a rich model of a business process and identify/manage requirements. The upside to these tools is often the ability to forward and reverse engineer code or to model enterprise architecture. The downside is that you are locked into the UML and this modeling standard’s limited ability to resonate with business stakeholders who are not trained in it. The big differentiators here are very numerous since UML tools also get used across the entire application life cycle.

If you go beyond the above four areas there is also a whole series of tools from other domains like quality (HP’s Quality Center), Enterprise Architecture (Casewise), Business Process Management (IDS Scheer), process logic capture tools (StereoLOGIC), among others that are converging to better service the needs of business analysts.

The landscape is filled with productivity tools that are well designed to make analyst teams more productive. IAG has itself implemented or used most of the ones I’ve named here, and we are resellers of more than six of them, so I’ll not stand up and endorse one vision over another. The key is to recognize that there are choices – a huge number of them – and take the time to do the requirements analysis properly before selecting and implementing a system. As ironic as it sounds, lousy requirements definition is pervasive when looking at requirements definition and management tools. Use the purchase of a tool to figure out your analyst processes and to strengthen consistency. Don’t be the shoemaker’s child any longer.


Managing Metrics: Lies, Damn Lies and Statistics

Here’s a common scenario for IAG, the company I work for: a client wants to make significant improvement in their business analyst organization – AND – they need to demonstrate the performance improvement made. It happens all the time, and there are many managers who know the pain of getting locked into managing the wrong set of key performance indicators.

Here are the ways IAG approaches this need:

  1. Benchmarking organizations’ requirements maturity: the best possible scenario for setting up a system of metrics is to benchmark an organization’s requirements maturity at a specific point in time – then repeat at periodic intervals. Rather than rely on a single point metric as a judge of performance, it is best to diagnose people, techniques, process, technology, organization, and documentation standards, and from this establish a baseline of current performance with an action plan for the future. One year from now, you should then snapshot the organization to validate that performance improvement has been made, and compare with industry data to show the value of these activities.

    Here’s the issue with most organizations: They have a blind spot that is radically reducing their overall performance as an organization. It could be deliverables, skills, technology, who knows, but unless this is fixed, even though the organization thinks it’s making huge improvements, they may actually be delivering in a less than stellar fashion. When you assess maturity, it uncovers these issues, and enhances the ability of the organization to move forward quickly.

  2. Scorecarding: I love it when a client goes the scorecarding route with IAG for building their metrics program. It’s a robust approach to building a program of metrics that harmonizes these back to broader organizational and strategic objectives. Some people have the idea that a scorecard will be a perfect nirvana – pfffft – not going to happen. It’s a discipline! The idea of the discipline is – first and foremost – to get into the discipline and get value out of it. So it is better to get a scorecard in place in two weeks and concentrate on teaching people the discipline of using it, than to strive for the ‘perfect metric’ and wind up not measuring anything. When business people see great value out of an activity – especially metrics-based management – the quality of the metrics that get managed also naturally evolves and improves over time. The problem is, for most organizations, getting the process started.
  3. Side-by-side paired project execution. OK, in English: take two projects of roughly equal size/complexity. On one, do whatever that project team thinks is best (the traditional approach). On the other, follow the new disciplined approach. Repeat three times. There is absolutely nothing like having a project pushed through in half the time and stakeholders singing the praises of the new process. It’s great in theory, but companies often have difficulties doing this internally because there is not enough difference between the people (skills), process, techniques, technology, organization, and deliverables (collectively referred to as ‘requirements capabilities’) used in one project versus the other to make a substantial difference. This is where teaming with an outside organization can show what performance change is possible – IF – the outsider can demonstrate significantly more maturity in requirements capabilities. The point being: if you want to run two projects at roughly the same time, make sure there is sufficient difference between the two teams across all requirements capabilities; don’t just change the deliverable and expect a significant performance improvement.

    The other alternative is to compare projects before and after investing heavily in resources and process improvement over a period of months; i.e., have good metrics on current projects in anticipation of pairing future projects with these and looking at performance differences. You can also have this kind of research process externally audited, like we did years ago in building stats on our own methodology. I’ve also done this kind of thing for clients, and you can get fairly strong data showing new versus old – but you do need to appreciate that it is a fairly large project to set up the metrics, conduct the measures multiple times over the course of the project, compile results, etc. You can get anecdotal results fairly quickly, but it takes a rigorous method, and the discipline to execute on the research method over months, to get strong, statistically defensible results (a simple sketch of the arithmetic follows this list).

  4. Finally, you can do look-back reviews. Set up a methodology for determining all the dimensions of requirements maturity on a particular project, then investigate the projects, looking at the documentation, interviewing stakeholders, PMs and BAs. Compare the findings on maturity with the project outcome in terms of on-time and on-budget performance to get your metrics for past projects – and set the baseline for future performance. This is not my favorite approach simply because it is backward looking – presumably on poor performance, rather than forward looking on performance improvement. I’ve built analysis this way – it yields an almost perfect line (better requirements maturity = better project performance) but because it is not forward looking with positive accomplishment being represented in the data, it is generally not as compelling to senior management.
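For what it’s worth, the arithmetic behind approaches 3 and 4 is simple enough to sketch. First, the paired-project comparison from point 3, with invented cycle-time data; a defensible study needs the repeated, audited measures described above:

```python
# Paired-project comparison with invented cycle times (weeks).
# A defensible study repeats the measure, as described above.
from statistics import mean, stdev

old_process = [42, 38, 45]  # traditional approach
new_process = [24, 22, 27]  # disciplined approach

reduction = 1 - mean(new_process) / mean(old_process)
print(f"old: {mean(old_process):.1f} +/- {stdev(old_process):.1f} weeks")
print(f"new: {mean(new_process):.1f} +/- {stdev(new_process):.1f} weeks")
print(f"cycle-time reduction: {reduction:.0%}")
```

And a sketch of the look-back line from point 4, fitting assessed maturity against on-time performance; the data points are invented, and the near-perfect fit simply mirrors the claim above:

```python
# Look-back fit of requirements maturity vs. on-time performance.
# Data points are invented to mirror the "almost perfect line".
from statistics import correlation, linear_regression  # Python 3.10+

maturity = [1.5, 2.0, 2.8, 3.4, 4.1]  # assessed maturity per past project
on_time  = [35, 48, 60, 75, 90]       # % of milestones hit on time

slope, intercept = linear_regression(maturity, on_time)
print(f"r = {correlation(maturity, on_time):.2f}")
print(f"on-time% ~= {slope:.1f} * maturity + {intercept:.1f}")
```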

Regardless of the method you use for measuring and representing performance, remember there are two types of statistics:

  • Information that is interesting (I’m fat)
  • Information that causes us to change behavior (I’m only expected to live until 55 given current health)

The same information might be represented in each of these two sets of statistics, but only the second causes me to change behavior (assuming the source was credible). I call this catalyzing your benefits: expressing data in terms that communicate clearly, where the implications of the finding are well understood, and people see that action is needed based on the finding.

Whatever method you choose – catalyze your benefits!



Keith Ellis is Vice-President, Marketing at IAG Consulting (www.iag.biz) where he leads the marketing and strategic alliances efforts of this global leader in business requirements discovery and management. Keith is a veteran of the technology services business and founder of the business analysis company Digital Mosaic which was sold to IAG in 2007. Keith’s former lives have included leading the consulting and services research efforts of the technology trend watcher International Data Corporation in Canada, and the marketing strategy of the global outsourcer CGI in the financial services sector. Keith is the author of IAG’s Business Analysis Benchmark – the definitive source of data on the impact of business requirements on technology projects.