Cognitive Biases Versus Exception Flows

Despite the various methodologies and business analysis techniques that have surfaced over the years, organizations continue to ignore, whether deliberately or accidentally, the possibility of errors or issues occurring in their software.

For decades, the IT industry has continually delivered, or canceled, projects that are inherently flawed. The failure rate has improved slightly over time, but project teams continue to be surprised by problems in their software, even though teams in almost every country on the planet have been making the same mistakes for years.

Much of that failure, rather than being caused by a specific issue such as ‘poor requirements’, is more likely initiated by one or more cognitive biases at play within a project team and among its stakeholders.

There are few organizational cultures aware enough to acknowledge that cognitive biases play a significant part in everyday decision making. A cognitive bias is defined as, “(…) a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own “subjective reality” from their perception of the input. An individual’s construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.” (Wikipedia).

Cognitive biases aren’t deliberate or born from a place of malice, but they’re very real, and can derail any project at any point in time. Cognitive biases are built into human behavior, and for the most part appear to have evolved as an adaptation during human evolution. In general, a set of heuristics that could quickly deliver an answer served most humans in the past far better than taking too long to detect danger and winding up injured or killed. “Within this framework, many ostensible faults in human judgment and evaluation may reflect the operation of mechanisms designed to make inexpensive, frequent errors rather than occasional disastrous ones (Haselton & Nettle, 2006; Johnson et al., 2013).”

Part of the issue is that there are so many cognitive biases in existence that the business analyst is going to end up fighting one or more of them while trying to do their job. For example, under Parkinson’s Law of Triviality, people spend more time discussing minor issues because they are easier to understand, while complex issues are ignored. This leads to everyone becoming involved in resolving the minor issues, because doing so shows their contribution to the project, while the complex issues are parked, or have only one or two people trying to solve them.

In other words, a business analyst, by trying to discover possible exception flows, may also need to discuss an issue with a higher degree of complexity than normal. Because the complexity is difficult to resolve, the exception flow as a topic morphs into the more trivial issue of personality and team dynamics. Questions about an exception flow are interpreted as the business analyst asking what is wrong with the software, and therefore the team. The developers are insulted because the questions seem to imply their coding is substandard. The project manager is unhappy because they don’t like negative people on the team, and trying to think of exceptions is going to add to the delivery timeline. For the agile team relying on a product owner, the product owner needs to be particularly adept at self-critiquing their own requests and ideas and accepting input from others, which, let’s face it, doesn’t always happen.

As Carol Tavris and Elliot Aronson point out in their book Mistakes Were Made (But Not by Me), “Because most people have a reasonably positive self-concept, believing themselves to be competent, moral and smart, their efforts at reducing dissonance will be designed to preserve their positive self-images.”

The business analyst’s quest to find exception flows, in an effort to protect the organization should failure occur, could be doomed because the implications of an exception occurring interfere with the project team’s self-concept of being competent, moral and smart.


Now, I assume that some people reading this article are muttering to themselves, “This is all about communication skills. I don’t have any problems at all because people listen to me.”

This is where another cognitive bias comes into play. If you’re a beneficiary of the Halo Effect, then people may be listening to you in spite of your communication skills. The Halo Effect is a cognitive bias in which a positive impression of a person in one area influences another person’s opinion of them in other areas. In other words, if you’re a well-dressed, good-looking person, you’re probably going to have an easier time convincing someone that there might be a problem with their project. This is because your appearance is subconsciously associated with other qualities, such as the assumption that you’re a good and/or smart person. This is in direct opposition to the Horn Effect, in which an unattractive person is subconsciously assumed to be morally inferior and not as smart as an attractive person.

Worse, cognitive biases can interact to turn seemingly trivial design decisions into software that causes a significant impact once implemented into production. For example, a business unit designs a long form with many text areas because they want a lot of data from their users. The business analyst may advise them that this may not be a good idea if their users type slowly or the data collected via text entry can’t be effectively used for data analysis, but the business can’t be convinced otherwise. The business unit’s conviction that the form is okay could be a result of the Law of the Instrument (we have always used forms), Optimism Bias (there is no reason why this wouldn’t work), the Planning Fallacy (a user won’t spend much time on the task), the IKEA Effect (we designed this form, we’re proud of this form, therefore it is a good form), and perhaps a touch of Illusory Superiority (our business unit is the best because we know our users and the business analyst does not). From there it goes to the tester, who happens to be a touch typist and can quickly enter data into the form. The form passes testing. The business unit takes an ‘I told you so’ approach that the business analyst can’t defend. No one is aware that the developers have left the timeout at a default of twenty minutes because no one told them otherwise.

When the form goes live, it creates havoc. Many users, as predicted, are slow typists, and it takes them longer than twenty minutes to fill out the form. The users encounter timeouts. The data in the form hasn’t been saved. The users are not pleased. They contact the Call Center. The Call Center scrambles for a solution that can tide the users over until the developers can make the change, and/or the form is redesigned. Users are left with the impression that the organization is incompetent, and simply doesn’t care about them at all.
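For illustration only, here is a minimal sketch of the kind of mitigation a documented exception flow might have prompted: a client-side autosave that periodically writes the form contents to localStorage, so a session timeout doesn’t discard a slow typist’s work. The form id, storage key, and save interval below are hypothetical; the article’s scenario doesn’t specify how the form was built.

```typescript
// Hypothetical sketch: autosave a long form's contents to localStorage so a
// session timeout doesn't wipe out a slow typist's work. The form id
// ("feedback-form") and the 30-second interval are assumptions for illustration.

const FORM_ID = "feedback-form";
const STORAGE_KEY = `draft:${FORM_ID}`;
const SAVE_INTERVAL_MS = 30_000;

// Collect the current text values from the form's named fields.
function collectFormData(form: HTMLFormElement): Record<string, string> {
  const data: Record<string, string> = {};
  new FormData(form).forEach((value, key) => {
    if (typeof value === "string") {
      data[key] = value;
    }
  });
  return data;
}

// Persist a draft of the form so it survives a server-side session timeout.
function saveDraft(form: HTMLFormElement): void {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(collectFormData(form)));
}

// Put any previously saved draft back into the form's fields.
function restoreDraft(form: HTMLFormElement): void {
  const raw = localStorage.getItem(STORAGE_KEY);
  if (!raw) return;
  const data: Record<string, string> = JSON.parse(raw);
  for (const [name, value] of Object.entries(data)) {
    const field = form.elements.namedItem(name);
    if (field instanceof HTMLInputElement || field instanceof HTMLTextAreaElement) {
      field.value = value;
    }
  }
}

const form = document.getElementById(FORM_ID) as HTMLFormElement | null;
if (form) {
  restoreDraft(form);                                    // bring back any unsaved draft
  setInterval(() => saveDraft(form), SAVE_INTERVAL_MS);  // autosave in the background
  form.addEventListener("submit", () => localStorage.removeItem(STORAGE_KEY)); // clear on success
}
```

The real fix might instead extend the session length or save partial responses server-side; the point is that a documented exception flow turns “what happens when a user takes longer than the timeout?” into a design decision rather than a production surprise.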

So, how can a business analyst work on exception flows when the stakeholders and project team aren’t interested, won’t acknowledge the need for them, or seem to deal only with easily understood issues? What can a business analyst do if they’re working in an organization that is unaware of the influence of cognitive biases?

  • The exception flow will occur whether the organization wants to acknowledge it or not. Even if you can’t get any ‘buy-in’ from the team or the stakeholders, make a list of possible exceptions/issues anyway. Look for solutions and document a resolution via the exception flow as best you can. (Note that in this article an exception flow is considered to be separate from a risk register item.)
  • Watch out for cognitive biases in yourself, which is easier said than done.
  • Realize that unless you’re working in a team that acknowledges the realities of cognitive biases, those biases will not only be on full display, they will also be in full effect.
  • You can do nothing to change anyone’s mind (unless you benefit from the Halo Effect). Research into cognitive biases says that it’s almost impossible to change a person’s mind. Yes, some people can do this in some organizations, but it’s delusional to assume you can do it in all of them.
  • As above, think up possible exception flows anyway and keep them in your back pocket so you can deal with them should they occur. If they do, you can surface them and highlight how the issue could be solved. At this stage you will have a higher chance of being heard, as people will be looking to keep their self-image intact. It’s a good news story for a project manager if they can go to a higher-level manager and already have a solution.

Finally, be kind to yourself. If you’ve tried your best to identify issues that might occur, and you’ve got several exception flows, and you’ve been ignored and your meetings hijacked, then it’s not because you’re the world’s worst business analyst. It is more likely that you’re battling cognitive biases, and cognitive biases don’t respond well to logic or reason.
