
A Root Cause Analysis of a Failure of a Root Cause Analysis of the Failure of Root Cause Analysis

A colleague recently sent me a link to a blog post describing a failure of root cause analysis, in the context of software development. The author, Mike Cohn, describes applying the Five Whys to a flat tyre, and discovers that the flat tyre happened because it rains. In this case root cause analysis turned out to be of little use, and Mike asserts that it’s also less useful with software development or product development. A whole bunch of things in the post bothered me. But Mike is no slouch; he’s a well-respected agile practitioner, a major contributor to Scrum, and a co-founder of the Scrum Alliance.

Exactly five whys – no more, no fewer?

The Five Whys is a reminder that just asking ‘why?’ once usually isn’t enough, and sometimes it takes asking ‘why’ as many as five times to get to the bottom of the problem. But you might get there in three – or it might take six. Unless there’s an element of the ‘why’ that’s in our control, seriously consider whether it’s worth continuing.

There can be only one…

… root cause? Not so! Almost all interesting effects or problems have multiple causes. Or, as my wife puts it (because she’s a doctor and likes to use big words even more than I do), they’re multifactorial. Why is Mike’s tyre flat? Because it got a screw in it. Why was there a screw in the tyre? Or we could ask – why did the screw cause the tyre to go flat?

Is it necessary? Is it sufficient?

Because there are often a multitude of root causes, it can also be useful to identify whether they are necessary or sufficient or both, either singly or in groups. Necessary conditions are those which must be met for the effect to occur. Sufficient conditions are those which will always cause the effect. If we cannot form the root causes we have found into at least one group that is both necessary and sufficient, then there will be other (perhaps more useful or informative) causes out there.
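The necessary/sufficient test can be made concrete with a toy sketch. The code below is purely illustrative – the function names and the flat-tyre observations are invented – but it shows the two directions of the test: a cause (or group of causes) is necessary if the effect never occurs without it, and sufficient if the effect always occurs with it.

```python
# Toy check of necessity and sufficiency over a set of
# hypothetical incident observations (invented data).

def is_necessary(observations, causes):
    """Every time the effect occurred, all of these causes were present."""
    return all(all(obs[c] for c in causes)
               for obs in observations if obs["effect"])

def is_sufficient(observations, causes):
    """Every time all of these causes were present, the effect occurred."""
    return all(obs["effect"]
               for obs in observations
               if all(obs[c] for c in causes))

# Hypothetical flat-tyre observations: was there a screw on the road,
# was it raining, and did the tyre go flat?
observations = [
    {"screw": True,  "rain": True,  "effect": True},
    {"screw": True,  "rain": False, "effect": True},
    {"screw": False, "rain": True,  "effect": False},
    {"screw": False, "rain": False, "effect": False},
]

print(is_necessary(observations, ["screw"]))   # True: every flat had a screw
print(is_sufficient(observations, ["screw"]))  # True: every screw caused a flat
print(is_necessary(observations, ["rain"]))    # False: one flat happened without rain
```

In this invented data set, ‘screw’ alone is both necessary and sufficient, while ‘rain’ is neither – which is exactly the signal that rain, on its own, is not the root cause worth chasing.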

Which is the bigger problem? The one you’re analysing, or the time spent doing the analysis?

The author has decided that he wants to prevent the wastage of the five minutes that it took him to fix the flat tyre. He hasn’t described how often he has a flat tyre, which is a critical factor when considering how much time to invest in fixing the root cause. In this case, spending more than a few minutes thinking about why it happens is quite possibly a waste of time. Or, as the author himself puts it later in a comment: “My point remains that some root causes are not worth fixing. The real issue is that the cost of fixing a root cause can sometimes outweigh the cost of living with the problem.” Fair enough, but sometimes all that means is that you’re looking at the wrong root cause. The point Mike’s actually made with his example is that the effect or problem is not worth analysing.

Why the Five Whys?

I’m not convinced this is the best technique for root cause analysis. I find that the Five Whys technique on its own is most effective when investigating the motivations behind a requirement, because it suits the more linear cause-and-effect model that prevails in people’s accounts of why they do the things they do. For example:

  • Why do we need to upload the file in this format? Because that’s what the suppliers give us.
  • Why do the suppliers give us that format? Because that’s what we tell them to use.
  • Why do we tell them to use that format? Because we think it’s going to be easy for them.
  • Why do we think it’s going to be easy for them? Because we find it easy to manually work with.
  • Why are they manually working with this file? We don’t know, we just assume that they are, because at the moment we’re manually working with the files.

There’s useful information not just at the end of the chain, but all the way through – and there are still opportunities to branch your questioning.

Is root cause analysis less useful for software development?

In the original article, the author says:

“My point? Sometimes root cause analysis isn’t worth it. As with any tool or technique, it can be misapplied. A great deal of software development is novel–the thing being done or thought about is being done for the first time. (At least by that individual or team.) Root cause analysis is most useful when situations are repeated–as in manufacturing. The technique is less useful with novel, complex work such as software development (or even product development more generally).”

I’m very sorry, Mike, but I’m afraid I have to disagree. Yes, you’ve made the case that sometimes root cause analysis shouldn’t be done, particularly when the impact of the problem is small compared to the effort involved in looking at it. Yes, it can be misapplied. I’d definitely back you if you were to argue that it’s overkill for most bugs, where once the problem has been identified it can be fixed and we all move on.

But Mike hasn’t made the case (particularly with that example) that root cause analysis isn’t helpful with novel and complex work. I find that root cause analysis is in fact particularly useful with complex work, because there are usually a multitude of factors. Sometimes, if you find the right one, it’s possible to make a small change to effect a bigger one. It’s also been my experience that major and serious issues in software development come back to a sadly familiar set of root causes. With a quick rebranding, the classic manufacturing cause categories apply just as easily – Technology, Process, Data, Mindpower, QA, or Environment.

A different example

Over a decade ago, I was part of a team in a grotesque hurry in the middle of a death march, somewhere at the beginning of what was to be six months of 80-hour weeks, and approximately three months out from a go-live date that was carved in stone (or at least written in legislation). I released a data summarisation process and report to the client for user acceptance testing, for which I’m pretty sure I had provided the requirements document. The data summarisation process was to run daily, providing the summary information for the report that would run monthly, because the development team wasn’t happy with the report performance.

One test run told me that a) the output was wrong and b) the performance was appalling. A re-run using production hardware and database configuration told me that it would take approximately three and a half years every night. A superficial ‘five whys’ analysis might lead us to believe that the root cause was a bad design decision, or poor requirements documentation. Maybe we could fix that by shouting at the designer or business analyst for a bit. (I confess that shouting at the designer made me feel better at the time – sorry, Graham.) But a more in-depth analysis, maybe using Ishikawa’s techniques, would perhaps conclude that we had at least the following factors at play:

  • Technology
    • The development team didn’t have access to production-quality hardware.
  • Process
    • The release process didn’t include performance testing, even where potential performance issues had been identified.
  • Data
    • The development team didn’t have access to a production-size data set.
  • Mindpower
    • All minds involved were working long hours under pressure to deliver.
  • QA
    • The testing that was performed did not test the actual requirements.
  • Environment
    • The client and the development house were separated by thousands of kilometres and several time zones.

For this organisation, these were all causes that had resulted in delivery issues in the past, and (since this kind of analysis wasn’t actually performed) would continue to result in delivery issues.
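A performance surprise of the ‘three and a half years every night’ variety can often be anticipated with a back-of-envelope extrapolation: time the process on the development data set, then project to production volumes under an assumed growth model. The numbers below are invented, not from the actual project, and the function is a hypothetical illustration:

```python
# Back-of-envelope runtime projection from a small dev data set
# to production volumes. All figures are invented for illustration.

def projected_hours(sample_rows, sample_hours, prod_rows, exponent=2.0):
    """Scale measured runtime by (prod_rows / sample_rows) ** exponent.

    exponent=1.0 assumes linear scaling; exponent=2.0 models the
    quadratic behaviour typical of a missing index or nested-loop join.
    """
    return sample_hours * (prod_rows / sample_rows) ** exponent

# Half an hour on 10,000 dev rows, projected to 1,000,000 production rows:
print(projected_hours(10_000, 0.5, 1_000_000, exponent=1.0))  # linear: 50 hours
print(projected_hours(10_000, 0.5, 1_000_000, exponent=2.0))  # quadratic: 5,000 hours
```

Even the optimistic linear projection blows the nightly batch window, which is exactly the kind of signal that should trigger performance testing on production-sized data before release.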

Root cause analysis can work for you

Mike argues at the end of his article that root cause analysis shouldn’t be automatically performed, but done on the right problems, and I agree completely. Furthermore, done properly on the right problems, it is more than just an exploration of a single chain of causality. It examines the multitude of causes that contribute to the result, and attempts to determine where best to make the changes that will have the greatest effect.

In the rare cases that software or IT organisations are self-confident enough to examine why things went wrong, and to commit to fixing the root causes, I believe they’ll find themselves well rewarded. To argue that software development is novel and complex, and that therefore root cause analysis is less useful is, I think, flat-out wrong.

Don’t forget to leave your comments below.

The Magic Behind Functional Requirements – Data!

For me a good day of gathering requirements is marked by a business user saying, “That’s a very good question.” In the majority of instances the trick of coming up with those questions is the same – while talking to users about their business processes, I am mentally mapping the information those processes involve to a data model.

In a previous life I was a Data Administrator. Relational database technology was new and ‘normalisation’ was a magical art. Apprentice data analysts could get to third normal form. Full data wizards understood fourth and fifth normal form. Hard to believe it now but we ran sessions with business users that focused purely on data. We asked them to put their processes out of mind and focus only on their information requirements. Our deliverables were entity/relationship models intended to support the development of enterprise-wide, reusable databases. It was a VERY painful process for all involved.

I’d estimate that data administrators (myself included) spent about 30% of their time documenting business data requirements and the other 70% debating with other data administrators about such things as naming standards, domains, and whether “Double-headed arrows,” “Crows Feet” or “A Big Dot” was the best way to represent the “Many” end of relationships on Entity/Relationship (E/R) diagrams. I really miss those days – NOT!

Fast forward twenty-something years to today. The Unified Modelling Language has morphed E/R diagrams into Class Diagrams (although Relational databases for the most part are still the norm rather than genuine Object-oriented databases). When people talk about requirements the two main types are Functional and Non-Functional. Data requirements are typically relegated to entries in a glossary or one or more E/R diagrams in an appendix.

Business users now, as then, have zero (or less) interest in looking at a data model. However, they are more than happy to be shown a screen ‘wireframe’ representing the way a system would present data to them in support of their business process. The trick, at least in my case, is applying my data perspective during discussions of these wireframes, but without using the “E” or “R” words. Most typically a screen equates to an entity (e.g. Customer, Contract, Order). So does a displayed table of items (e.g. Order Line Item, Payments). Fields in the screen (or columns in the table) either are ‘facts’ about that entity or reference some other entity (e.g. the Customer ‘named’ in an order or the Product ‘short description’ in an order line item).

OK, back to the magical source of ‘good’ questions. The basic data normalisation technique is ensuring that each attribute is a fact about the most appropriate entity. [“The key, the whole key and nothing but the key, so help me Codd.”] If not, then it is a fact about some other entity.

For example in an Order Line Item, “Quantity” is definitely a fact about the Line Item. The “Product” being ordered is a fact about that Line Item, but because there are a bunch of ‘facts’ about Products irrespective of them being ordered, there needs to be a Product entity. Users entering Line Items need a way to ‘identify the product’, and having done so the screen is populated by fields that are facts taken from the identified instance (e.g. short description, unit price). A “good question” might be, “When a Product is ordered, is the Unit Price the same in all cases, or can the unit price be different (e.g. Customer-specific price schedules)?”

Another ‘good question’ that has its basis in normalisation is, “Can that fact have multiple values at the same time (e.g. an Employee being classified as having multiple Skills)?” Others probe the cardinality of a relationship (e.g. “Can a Contract involve more than one Supplier?”).
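The Order Line Item / Product split discussed above can be sketched in code. This is a minimal illustration in Python dataclasses – the names and prices are invented, not from any real schema – showing which facts live on which entity, and where the ‘good question’ about unit price lands if the answer is “price can vary per order”:

```python
# Sketch of the Order Line Item / Product entity split (invented names).
from dataclasses import dataclass

@dataclass
class Product:
    # Facts about a Product irrespective of any order.
    sku: str
    short_description: str
    list_unit_price: float

@dataclass
class OrderLineItem:
    # Facts about the Line Item itself.
    product: Product     # a reference to another entity, not a copy of its facts
    quantity: int
    # If unit price can differ per order (e.g. customer-specific price
    # schedules), the agreed price is a fact about the Line Item:
    agreed_unit_price: float

widget = Product("W-1", "Widget, blue", 9.95)
line = OrderLineItem(widget, quantity=3, agreed_unit_price=8.50)

# Display fields are populated from the referenced Product...
print(line.product.short_description)          # Widget, blue
# ...while the negotiated price is stored on the Line Item:
print(line.quantity * line.agreed_unit_price)  # 25.5
```

If instead the answer were “the price is always the list price”, `agreed_unit_price` would be redundant – it would be a fact about the Product, not the Line Item – which is exactly the normalisation judgement the question is designed to surface.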

If you have data analysis skills, I encourage you to apply them during requirements gathering. If you are light on data modelling skills, I encourage you to learn more about the subject. Regardless, I encourage you to keep these skills hidden from your business users and let them think you have magical abilities to ask ‘good’ questions.

Anyone else “been there, suffered that”? If so, please add any examples and/or tips for gathering requirements without diverting Business Users from their focus on their processes.


The Industry Agnostic Business Analyst – Defying Market Trends

It is common within the ICT industry to come across job advertisements listing prior domain experience as mandatory for a Business Analyst. Finance and health are among the worst culprits, and those of us who have faced a position description stating ‘experience in a financial organisation is essential’ will understand exactly what I mean. Whilst prior domain experience certainly has benefits for the organisation in terms of shortening the learning curve of a Business Analyst and getting them started faster, there are also pitfalls that organisations should be aware of and take into consideration.

I recall my first Business Analyst position in the health industry. The organisation was looking for a Business Analyst with experience in health. Fortunately having worked with the Program Manager previously was enough to get me in the door with my proven analysis skills and experience. The clinical project I was assigned required me to work as a Business Analyst very closely with subject matter experts from Nursing and Allied Health disciplines. Naturally I was aware of my lack of domain knowledge and relied on my inherent Business Analysis skills and techniques to get up to speed quickly – document analysis was a great place to start.

To my surprise, the subject matter experts on the team commended my ability to actively listen and understand their domain, to the point that I was adopted by the team as a ‘pseudo clinician’. I received comments like ‘we are really glad you haven’t worked in health before; it means you ask the obvious questions instead of bringing your own assumptions and past experiences that may not align to how we define and do things here.’ Another was ‘it is so nice not to have to hear how the place you used to work does things, and that “x” word actually means something completely different in your prior organisation.’ The takeaway lesson I learnt is that the power of ignorance should not be disregarded; it can actually be used as a strength to create the right frame of reference from the start, instead of having to alter perceptions brought from other ‘similar’ organisations within the industry. It is easier to create new understandings than to change those already deeply embedded.

Travelling further back to my university days, I recall a business lecturer emphasising the value of learning critical lessons from other industries that ‘do what they do best.’ For example, if you are working on a system to manage capacity within an aged care facility, why not look to the lessons learnt and finely tuned practices of the airline or hospitality industry for creative ideas and advice? The benefit of having a Business Analyst who has worked across multiple industries is that they can bring a new perspective to solving cross-industry problems. I have worked in mining and construction which many would assume is quite a contrast to health, however from a process and systems perspective, body parts and tyres are really not that different – they each need to be screened and assessed, have attributes which need to be recorded, have an acceptable range for blood pressure or tyre pressure, require monitoring and reporting.

In my current role as a Service Delivery Manager for an expert Business Analysis consultancy, I would rather hire a Business Analyst who is foremost a Business Analyst, not a health or finance industry subject matter expert – just as I would take my car to a mechanic rather than a motoring enthusiast. When I look for a great Business Analyst to join our highly skilled team, I look for someone who is well versed in the core competencies, skills and techniques required for the role, whether that is business process modelling, stakeholder management, use case documentation, business/functional/non-functional requirements specification, requirements traceability, business case development, enterprise analysis or workshop facilitation. The frustration comes when placing these expert people into clients who are more focused on hiring a Business Analyst with subject matter expertise than one who is great at their own job and brings a diverse range of experiences to the role. The divide between a Business Analyst and ‘the business’ needs to be preserved: the role of the Business Analyst is to facilitate decision making and provide objective advice. When the Business Analyst is also the subject matter expert, objectivity can be compromised, and the Business Analyst may creep into decision making, making assumptions, or prioritising requirements without business input.

With all this said, my first Business Analyst role in the health industry did assist me in some ways with my subsequent health role – for example I was familiar with software solutions used within the industry which could be applied to my later role, although it would not take long for a Business Analyst without prior health experience to also quickly come up to speed on this. My experience working with clinical stakeholders was an advantage, however the same core stakeholder management skills come into play whether proving your value and credibility to a clinician or tyre performance manager.

Many industries feel they are unique, just as many stakeholders feel their processes and requirements are very different from the next. If this were the case, it would be difficult for cross-industry process classification frameworks to exist, or for Business Analysis guidelines such as the Business Analysis Body of Knowledge (BABOK) to be developed. I therefore still arrive at the conclusion that whilst prior domain knowledge does have advantages, it is certainly not everything it is promoted to be, and it is not my preference when hiring someone for a role. The industry has not yet matured in this space, and those on the job hunt will share this frustration. So I urge you to keep educating the market about the benefits: the industry agnostic Business Analyst has a lot of value to offer. As Business Analysts many of us know this; perhaps it will take the industry a while longer to catch up. However, with the modern pressures on industry to innovate or perish, some will be more open to an industry agnostic Business Analyst providing cross-industry innovation.


Dispersed Business Analyst Teams Are Hard Work, But Fun

As a BA Manager who works in a consultancy with a dispersed bunch of brilliant Business Analysts, it’s hard to show our Business Analysts that we really care about, and value, what they can bring to the wider Business Analyst team.

Problem Statement (aka problem scream!)

How do we encourage our dedicated Business Analysts to engage with their colleagues who are located elsewhere, and contribute to our central BA practice, when they already have a team to interact with every working day?

Some context

Although my current role is within a consultancy, I’ve also worked in organisations as a BA Manager. In these organisations, Business Analysts have been dispersed or situated outside a central practice – found in business units, in IT departments, or as part of other business teams. I’ve also been a Business Analyst working at the coal face, engaged (or otherwise) with all sorts of BA practice structures.

What I think I’ve learned about engaging dispersed BAs

  1. Your madly dedicated Business Analysts will become attached to their clients/project teams/business units. They will like the people they work with, they will participate in social activities with those people, and they will quote those people too. Don’t panic. Learn from their engagements what draws them in, and use those learnings to create something similar or better at home base. Plagiarism of ideas here is a good thing.
  2. Business Analysts are all unique – that’s no surprise! You can’t cater for all your unique people in all activities or events that you organise to bring them together. Don’t sweat it! Hosting an event/activity with 70/30 fit for your people, even 60/40, is damn good. Explain what you’re doing and why, and the return to be gained by participants. Your team will decide their level of involvement, or even better, their peers will often exert the pressure for them to participate where you can’t.
  3. Find your outliers and middle ground: your participators and your non-participators, and your sometimes participators, and get to know a bit about them. I’m not saying bell curve them – they are people and I’m not into bell curving people. But I am into personas and trying my best to wear different hats, and say to myself “if I was xxxx what might I think of this event/communication/approach?”
  4. Don’t assume that your communications have got the message across. You may deliver by email, you may deliver by a regular “circular”, you may have “gatherings”, but until you (or someone in your practice) gets face-to-face and talks the latest news directly to an individual there is still a good chance you will “surprise” a Business Analyst with some news long after they should have known about it.
  5. Elicit from them, with quality business analysis techniques, what would engage them. Just don’t ask them too much, and make sure you USE the information they supply. A BA practice needs to practise what it preaches. They will tolerate you getting it wrong occasionally, but they won’t tolerate you not learning to get it right.

Practical tips:

These are some of the tools myself and my colleagues have used to keep dispersed Business Analysts engaged within our BA practice:

  • Regular coffee catch-ups at client sites
  • Regular gatherings offsite – to network, bond over food and drinks
  • Regular learning sessions – from colleagues or guest speakers, or team training
  • A few full days together offsite during the year
  • Facilitated greater availability of senior managers to BAs – i.e. give us this consideration and we guarantee value for the time
  • Fortnightly circulars with a variety of content
  • Existing Business Analysts involved in recruitment process for new Business Analysts
  • Involved Senior Business Analysts in running “BA for a Day” opportunities
  • Put BA representatives into other team/specialist meetings, or onto collective committees. They then have a responsibility to engage with the central practice and they learn something new too.
  • Business Analysis Team Posters – list Business Analysts who reside in their unit, and central practice details, with offers for free talks to teams on what we do
  • Collectively, by small focus groups, or as individuals, engage Business Analysts in leading projects of change for the practice
  • Listening, learning, evolving, getting it wrong sometimes, but genuinely caring for individuals and delivery.

Summary

Being a BA Manager with a dispersed workforce of Business Analysts is hard work but great fun too. It drives you and the team to explore and innovate in a way that might not have otherwise come to mind or had such a high priority if you had a centralised team.

This does mean of course that these learnings are just as applicable to a centralised team of Business Analysts. Since it’s all about engagement, perhaps a centralised practice should raise the priority on these sorts of activities?

There is nothing new in my experience, but somewhere there might be a new BA Manager who needs some starting hints, or an “older hand” like me who knows it’s never too late to learn new tricks.

Which reminds me, the key learning might be this one:

  1. You don’t have to figure out how to engage a dispersed Business Analyst workforce all on your own. I didn’t. I learnt from my leaders, my network, from trial and error, from a variety of roles I’ve held, from the vast universe of the web, by sharing what I do know and asking people for their sharing in return.

I’d love to hear your experience (techniques that work and those that don’t), in keeping a bunch of dispersed Business Analysts engaged with each other and their practice.


When Good is Good Enough

It is not always necessary to achieve the best quality result or employ the best practice approach when implementing a project. Blasphemy you say?! Or perhaps this sounds very obvious?

As a Consultant Business Analyst, I want to deliver the highest possible standard of work. This is for myself (to protect my reputation in a very small market), for my employer (for the same reason), and naturally for the Client. So when confronted with the familiar request from a client of ‘we need X by Y’ – X being ‘too much’, and Y being ‘yesterday’ – I prepared myself for the familiar conversation where I ‘re-set’ the expectations of the business, giving them a good old reality check in order to be able to deliver a high quality result.

After much discussion and adjusting of the project plan, the team came to a point where the original timeline to deliver had been cut by 50%, the approach was Iterative (read: developing before requirements had been finalised), and we were still meant to be delivering by yesterday… sounds like the Perfect Storm, perhaps?

So the end result? We successfully launched a new product into an ever-competitive market, with minimal major defects (oh yes, there were defects). Let’s be clear, there were issues – namely rework of development due to changing requirements, requirements identified late, and defects.

Yet the project was still considered a success by the Client. How did this happen? Here’s what I think:

  1. Planning – No surprises
    We planned, oh how we planned. The business gritted their teeth as we took the (valuable and limited) time to lay out all the Risks and consider the appropriate approach. We looked at the options around descoping or phasing delivery of functionality to achieve the same or similar benefits in the time available.

    All of those things proved valuable and contributed to the success of the project. However, of MOST value was the time taken to identify a comprehensive list of Risks, most of which went on to become Issues. Because we had highlighted them all early on and the Business Sponsor and Owners were aware, there were no ‘gotcha’ moments, and all were mitigated to resolution or to a point where they were no longer show stoppers.

  2. The Right Team – Skills and Experience
    It sounds so simple – but we have all felt the pain of having team members who don’t have the appropriate level of skill or experience, or both. Yes, having the right team can be difficult to achieve, but when it all comes together, magic can happen! Specifically, we had a QA Lead, Solution Architect, Technical Lead, and a humble BA with experience in both the relevant industry and similar projects. Were things missed? Yes; however, there was enough support in the team, and enough coverage of the key functional areas, to get us through.
  3. Number One Priority
    To achieve this project within the appropriate timelines, compromises had to be made. The old adage ‘Something’s gotta give’ was so very true. So we asked and the business approved a portfolio review, projects were re-prioritised, and while there were clashes on occasion, we managed to get there.
  4. Business and Project Team Aligned
    The Business and Project Team were aligned. My biggest feeling of anxiety was around not being able to provide the appropriate level of detail in the requirements up front in order to avoid ambiguity. However this Risk had been communicated up front, and QA, IT and the Business Owners knew this – and we communicated, we collaborated and the business were responsive when we needed clarity from them. We did 80% fantastically, and the remaining 20%…not so…but 80% got us through.

Conclusion:

So we got there! Result! Did we have defects? Yes. Did we have re-work? Yes, but not enough to be restrictive. Was there luck involved? Absolutely! Would I employ this approach at every engagement? No.

But doing just enough, still delivering to the benefits and to see it be successful was a great reality check. It has allowed me to understand what is possible given the Perfect Storm, and added a different perspective to my perhaps formerly idealistic view of things in the world of Consultancy. I expect the next assignment I go into will provide me with anecdotes that totally contradict all that I have just shared, but hopefully it at least provides fodder for some great blog comments!
