Project Reporting: The Illusion of Control
As a business analyst, I’ve been fortunate enough to work alongside some excellent project managers in my career. I’ve always considered the BA/PM relationship to be crucial for success; there will typically be some creative tension between the two perspectives, but it is that tension that frees the BA to challenge, with the PM ensuring that we don’t get drawn into unnecessary ‘analysis paralysis’. The two roles are different and complementary, and it’s crucial that open and honest communication takes place between them.
One thing I’ve always found challenging though is reporting, particularly on projects where (by their very nature) some hefty upfront work is required. Imagine creating a specification for an Invitation to Tender (“ITT”) or Request for Proposal (“RFP”) process—there needs to be enough certainty about the detail upfront for the vendors to create an initial quote and proposal, which means there’s a chunk of up-front work, even if the detailed elaboration and design work is done in an agile way.
Reporting in these situations can be troublesome because it’s really hard to know how close to ‘done’ we really are. Imagine a project manager coming to you clutching a Gantt chart and saying:
“According to the plan, you should be 72.6% complete with your high-level requirements. I just wanted to check that you are, in fact, 72.6% complete?”
This is a very difficult question to answer honestly. With the best will in the world, you might think you’re 80% done, but then a stakeholder has a flash of inspiration and realizes that some crucial element of scope has been forgotten. It’s great to catch it early, but all of a sudden you’ve gone from 80% complete to 20% complete…
Let’s face it, progress reporting of this type gives the illusion of control. Unless the environment is stable and predictable, and you happen to be working on a completely repeatable project with historic data to draw on, any estimate is really just an informed guess. And reporting progress in percentages is highly suspect—how can anyone really know that they’re precisely 72% complete and not 71%, or indeed 68%? So what happens in reality is that people just say “yes”, and you get a task that appears to be on track until suddenly it really isn’t. It becomes like a progress bar on a computer that climbs steadily to 99% and then stalls, with the final 1% taking ten times longer than the previous 99%…
Gaming the Numbers
Similar games get played on agile projects. I remember once hearing somebody who was under pressure to improve a team’s velocity say “I can double our velocity overnight—I’ll just double our story point estimates”. The point was semi-facetious but also deadly serious: Assuming the right mixture of people with the right expertise are involved, the work is going to take the time the work takes. Asking people to “work faster” is, at best, a temporary fix and, at worst, a way of completely burning out a talented team.
This reminds me of the classic film This Is Spinal Tap, which parodies a rock-group documentary. The band famously has amplifiers that ‘go up to 11’ rather than 10, believing this makes them louder. Of course, in reality, the amps generate exactly the same volume; there’s just an extra number on the dial… In organizations, we need to be careful not to fall for a similar fallacy.
I’m not sure that there’s any easy solution to this dilemma. Some advocate avoiding certain types of estimation altogether (see the #NoEstimates movement on social media). Some things I have found personally useful when estimating analysis work are:
- Show progress, not percentages: Breaking work into modular parts and getting artefacts reviewed (and into the hands of their consumers) sooner makes progress visible. For example, an initial context diagram and problem statement might be useful artefacts in their own right.
- Estimate effort remaining: Rather than saying ‘I’m 72.6% through’, I’m much more comfortable saying “based on what I currently know, and the progress and interaction with the stakeholders, I’d estimate another five to seven days effort, spread over ten to twelve working days to allow for stakeholder availability”.
- Estimate in ranges: You’ll notice that I included ranges in the estimate above; the wider the range, the less confident I am. The more I know about the situation, and the more stable and predictable it is, the narrower the ranges.
- Separate effort from duration: Neither you nor I, realistically, can do ten days’ effort in ten days’ duration if we need input from others. The chances of them being available exactly when we need them are low. Plus, there are probably other things in our diaries (team meetings, training, etc.). So it’s best to be open and transparent.
- Make it clear it’s an estimate: As is commonly (and sensibly) said in the agile world, ‘this is an estimate, not a commitment’; it’s my best guess, but if the existing situation turns out to be a heck of a lot more complex, then we’ll need to revisit it.
- State assumptions and revisit: By stating assumptions, we make it clear what the estimate is based on. One thing that is often overlooked is that (on any complex piece of work) it’s useful to revisit estimates as the work progresses: what initially has a very wide range will become narrower as more is known.
- Never estimate the work of others: If somebody needs to know how much development time something will take, then it’s time to ask a developer…
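Two of the points above, estimating in ranges and separating effort from duration, can even be sketched as a simple calculation. The following Python sketch is purely illustrative: the 50% availability factor and the example numbers are my own assumptions, not figures from any real project.

```python
# Illustrative sketch: turn an effort range (days of actual work) into a
# calendar-duration range, assuming only a fraction of each working day
# is genuinely available for this work (stakeholder access, meetings,
# other commitments). The availability factor is a hypothetical input.
import math

def duration_range(effort_low_days, effort_high_days, availability=0.5):
    """Convert an effort range into a working-day duration range,
    rounding up because partial days still occupy a calendar day."""
    return (
        math.ceil(effort_low_days / availability),
        math.ceil(effort_high_days / availability),
    )

# Example: "five to seven days' effort" at an assumed 50% availability
low, high = duration_range(5, 7, availability=0.5)
print(f"Estimate: 5-7 days effort, spread over {low}-{high} working days")
```

Lowering the availability assumption widens the duration range without touching the effort estimate, which keeps the two concerns visibly separate when reporting, and makes the assumption itself something stakeholders can challenge.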
Like so much, this is very much about transparency and communication. The relationship between the BA and project (or product) manager is crucial, and this open dialogue enables us to jointly learn and collect data that may be useful for future estimates. It isn’t ever going to be easy, but by having the difficult conversations early, we avoid having to disappoint people later on!
What are your experiences with creating estimates for analysis work? I’d love to hear from you. Feel free to connect with me on LinkedIn and let’s keep the conversation going.