Author: Yvonne Harrison

Cognitive Biases Versus Exception Flows

Despite the various methodologies and business analysis techniques that have surfaced over the years, organizations continue to either deliberately or accidentally ignore the possibility of errors or issues occurring in their software.

For decades, the IT industry has continually delivered, or canceled, projects that are inherently flawed. The failure rate has improved slightly over time, but it seems that project teams continue to be surprised by problems in their software, even though the same mistakes have been made by teams in almost every country on the planet for years.

Much of that failure, rather than being caused by a specific issue such as ‘poor requirements’, is more likely to be initiated by one or more cognitive biases at play within a project team and its stakeholders.

There are few organizational cultures aware enough to acknowledge that cognitive biases play a significant part in everyday decision-making. A cognitive bias is defined as, “(…) a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own “subjective reality” from their perception of the input. An individual’s construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality.” (Wikipedia.)

Cognitive biases aren’t deliberate or born from a place of malice, but they’re very real, and can derail any project at any point in time. Cognitive biases are built into human behavior, and for the most part appear to have evolved as an adaptation during human evolution. In general, a set of heuristics that could quickly deliver an answer served most humans in the past far better than taking too long to detect danger and winding up injured or killed. “Within this framework, many ostensible faults in human judgment and evaluation may reflect the operation of mechanisms designed to make inexpensive, frequent errors rather than occasional disastrous ones (Haselton & Nettle, 2006; Johnson et al., 2013).”

Part of the issue is that there are so many cognitive biases in existence that the business analyst is going to end up fighting one or more of them while trying to do their job. For example, under Parkinson’s Law of Triviality, people spend more time discussing minor issues because they are easier to understand, while complex issues are ignored. Everyone becomes involved in resolving the minor issues, because doing so shows their contribution to the project, while the complex issues are parked or left to only one or two people to solve.

In other words, a business analyst trying to discover possible exception flows may need to discuss an issue of higher-than-normal complexity. Because the complexity is difficult to resolve, the exception flow as a topic morphs into the more trivial issue of personality and team dynamics. The questions around an exception flow are interpreted as the business analyst asking what is wrong with the software, and therefore with the team. The developers are insulted because the questions seem to imply their coding is substandard. The project manager is unhappy because they don’t like negative people on the team, and trying to think of exceptions is going to add to the delivery timeline. For the agile team relying on a product owner, the product owner needs to be particularly adept at self-critiquing their own requests and ideas and accepting input from others, which, let’s face it, doesn’t always happen.

As Carol Tavris and Elliot Aronson point out in their book Mistakes Were Made (But Not by Me), “Because most people have a reasonably positive self-concept, believing themselves to be competent, moral and smart, their efforts at reducing dissonance will be designed to preserve their positive self-images.”

The business analyst’s quest to find exception flows, in an effort to protect the organization should failure occur, could be doomed because the implications of an exception occurring interfere with the project team’s self-concept of being competent, moral and smart.


Now, I assume that some people reading this article are muttering to themselves, “This is all about communication skills. I don’t have any problems at all because people listen to me.”

This is where another cognitive bias comes into play. If you’re a beneficiary of the Halo Effect, then people may be listening to you in spite of your communication skills. The Halo Effect is a cognitive bias in which a positive impression of a person may influence another person’s opinion of them. In other words, if you’re a well-dressed, good-looking person, you’re probably going to have an easier time convincing someone that there might be a problem with their project. This is because your appearance is subconsciously associated with other attributes, such as the assumption that you’re a good and/or smart person. This is in direct opposition to the Horn Effect, in which an unattractive person is subconsciously assumed to be morally inferior and not as smart as an attractive person.

Worse, cognitive biases can interact with one another to turn seemingly trivial design decisions into software that causes a significant impact once deployed to production. For example, a business unit designs a long form with many text areas because they want a lot of data from their users. The business analyst may advise them that this may not be a good idea if their users type slowly, or if the data collected via text entry can’t be effectively used for data analysis, but the business can’t be convinced otherwise. The business unit’s conviction that the form is okay could be a result of the Law of the Instrument (we have always used forms), Optimism Bias (there is no reason why this wouldn’t work), the Planning Fallacy (a user won’t spend much time on the task), the IKEA Effect (we designed this form, we’re proud of this form, therefore it is a good form), and perhaps a touch of Illusory Superiority (our business unit is the best because we know our users and the business analyst does not). From there it goes to the tester, who happens to be a touch typist and can quickly enter data into the form. The form passes testing. The business unit takes an ‘I told you so’ approach that the business analyst can’t defend. No one is aware that the developers have left the session timeout at a default of twenty minutes, because no one told them otherwise.

When the form goes live, it creates havoc. Many users, as predicted, are slow typists, and it takes them longer than twenty minutes to fill out the form. The users encounter timeouts. The data in the form hasn’t been saved. The users are not pleased. They contact the Call Center. The Call Center scrambles for a solution that can tide the users over until the developers can make the change and/or the form is redesigned. The users are left with the impression that the organization is incompetent and simply doesn’t care about them at all.
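This failure was predictable with back-of-envelope arithmetic. As a sketch, assume a hypothetical 4,000-character form, the twenty-minute default timeout from the example, and the common convention of five characters per word:

```python
# Back-of-envelope check: will a slow typist hit the session timeout?
# All numbers are illustrative assumptions, not taken from a real project.

def minutes_to_complete(total_chars: int, words_per_minute: int,
                        chars_per_word: int = 5) -> float:
    """Estimate form completion time for a user typing at a given speed."""
    return total_chars / (words_per_minute * chars_per_word)

FORM_CHARS = 4000        # a long form with many text areas (assumed size)
TIMEOUT_MINUTES = 20     # the developers' default session timeout

for wpm in (60, 35, 20):  # touch typist, average user, slow typist
    needed = minutes_to_complete(FORM_CHARS, wpm)
    print(f"{wpm} wpm -> {needed:.0f} min needed, times out: {needed > TIMEOUT_MINUTES}")
```

A touch-typing tester sails through well under the timeout, while slower users do not, which is exactly the gap the testing phase missed.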

So, how can a business analyst work on exception flows when the stakeholders and project team aren’t interested, won’t acknowledge the need for them, or seem to only deal with easily understood issues? What can a business analyst do if they’re working in an organization that is unaware of the influence of cognitive biases?

  • The exception flow will occur whether the organization wants to acknowledge it or not. Even if you can’t get any ‘buy-in’ from the team or the stakeholders, make a list of possible exceptions/issues anyway. Look for solutions and document a resolution via the exception flow as best you can. (Note that in this article an exception flow is considered to be separate from a risk register item.)
  • Watch out for cognitive biases in yourself, which is easier said than done.
  • Realize that unless you’re working in a team that acknowledges the realities of cognitive biases then those biases will not only be on full display, they will also be in full effect.
  • You can do little to change anyone’s mind (unless you benefit from the Halo Effect). Research into cognitive biases says that it’s almost impossible to change a person’s mind. Yes, some people can do this in some organizations, but it’s delusional to assume you can do it in all organizations.
  • As above, think up possible exception flows anyway and keep them in your back pocket so you can deal with them should they occur. If they do occur, you can surface them, and highlight how the issue could be solved. At this stage you will have a higher chance of being heard, as people will be looking to keep their self-image intact. It’s a good news story for a project manager if they can go to a higher level manager and already have a solution.

Finally, be kind to yourself. If you’ve tried your best to identify issues that might occur, and you’ve got several exception flows, and you’ve been ignored and your meetings hijacked, then it’s not because you’re the world’s worst business analyst. It is more likely that you’re battling cognitive biases, and cognitive biases don’t respond well to logic or reason.

“The secret: One day builds always take longer than a day.”

After reading Every Tool’s a Hammer by Adam Savage I was struck by how many lessons he’d learned as a maker that were applicable to the act of creating software.

It also made me reconsider the title of Business Analyst. As explained in the BABOK, it can be applied to “any person who performs business analysis tasks”. Business analysis is the practice of “enabling change in an enterprise by defining needs and recommending solutions that deliver value to stakeholders. Business analysis enables an enterprise to articulate needs and the rationale for change, and to design and describe solutions that can deliver value.”

The general definition of analysis is “a detailed examination of anything complex in order to understand its nature or to determine its essential features: a thorough study” (Merriam Webster), or as Google succinctly puts it “detailed examination of the elements or structure of something.”

These definitions seem narrow. A Business Analyst doesn’t only examine something to understand it. Enabling change can be an arduous process, depending on the organization and the people involved. The Business Analyst tries to turn a customer’s vision into a concrete and understandable grouping of requirements/user stories/use cases, along with models/presentations/diagrams. A BA needs to negotiate with all stakeholders and help everyone to agree and collaborate.

It’s a multi-faceted job and goes well beyond the boundary of ‘analysis’.

Perhaps a different way to view ourselves isn’t as Business Analysts, but as ‘makers’, a term used by Adam Savage and others to describe a person who creates, makes, and produces something. That something can be anything: a wooden birdhouse, a suit of armor, a costume for a convention, a model for a film or TV series, a painting, a novel, or a new app, to name a few. Adam Savage doesn’t discriminate in terms of what is being made, just that something is created out of nothing from a desire to make an idea exist in the world.

The something can be created by one person, or multiple people trying to deliver a common vision. For example, a film crew working together to deliver the vision of the director.

Switching from the perspective of Business Analyst to that of a maker also transforms the typical frustrations of BA life into common issues that all makers contend with. The trials and tribulations that the project is going through are, in all likelihood, pretty much standard.


For a start, every maker acknowledges that mistakes will be made. The first version will, most probably, not be the final version. In keeping with this premise, demanding perfection early in the process means that your end product will probably have issues. As a BA / maker, the process is more relaxing when you realize that your first draft of your use cases/diagrams/user stories will always have issues, if not be outright awful. You won’t have all the information, and you won’t have a clear view of the vision unless you’ve managed to get everyone to articulate it and agree to it in a comprehensible manner. You’re going to need several attempts at this, and each piece of feedback helps you figure out a more exacting version of the overall idea and solidify how it might be delivered. Like any good maker, you realize it’s an advantage to fail at the beginning because it gives you a chance to recover and iterate. That’s how making/creating works in real life, not because Gartner published an article about failing fast.

On top of that, maybe that estimate you gave about how long it takes to produce workable requirements isn’t all that precise. As Adam Savage says in his book, “One day builds always take longer than a day.” Makers know that the only way an estimate gets close to being right is by having built a similar thing many times over. As with making, the materials you use to create something are usually standard, but the thing you’re building is not. Just because your friends built their cosplay outfits out of EVA foam and a hot glue gun does not mean that your envisioned cosplay costume is going to work because you’re using the same materials. Likewise, even if you’ve gathered a set of requirements, the developers are all using C++, and you’re running on an Apache server, that does not mean your software build is standard. Yes, everyone on the project, especially the BA, will be under pressure to deliver by a date (whether using waterfall or scrum). The trick is to acknowledge the date and work towards it, knowing full well that estimates are not absolute due to the challenge of what you’re trying to achieve. As Adam Savage writes, “As a maker of any kind, with any project, you will never really know what your destination is. You know your starting point, you know roughly what your “problem to solve” is, and you can try having a whiteboard session about final goals to help figure out what you’d like your destination to be, or at least what the rough outline of it should be. (…) But it won’t change the fact that nothing can quite prepare you for what it’s like to set out along the path of creation only to realize you are not going to end up where you planned. (…) Put another way: How many of your projects turned out EXACTLY like you intended? How many went as smoothly as you expected? Mistake free. Distraction free. In my experience, the answer is pretty close to none.”

After reading Adam Savage’s book it struck me that a BA, and in fact, the entire project team, would be far less stressed moving to the mindset of a maker. Methodologies and frameworks and manifestos are not needed to acknowledge a basic fact of making: Creating something from nothing is difficult work.

I’ve only briefly touched upon the premise of the book, but I would encourage you to read it and gain a different perspective on the art of business analysis—or more to the point—the art of making.

Can the Business Analyst Survive the Future?

In the early 1800s, lacemaking was a necessary source of income for women and families with the lace fetching a reasonable price when sold.

By 1843, with the advent of lacemaking machines, prices collapsed, and factories employed women and children for a pittance. Most wages were barely enough to live on, and many families were thrown into subsistence living. (Ivy Pinchbeck, 1977.)

The Eastman Kodak Company dominated the sales of camera film, employing thousands of people through much of the twentieth century, until digital technology arrived in the 1990s. Unable to adapt fast enough or target the right products, Kodak filed for Chapter 11 in 2012.

Automotive parts manufacturing workers are now in the same situation. What was once a job that allowed men and women to support their families, barely pays a living wage and has a high degree of injury associated with it. (Waldman, 2017.)

And on a personal note, my great-great-grandfather earned his living as a wheelwright: a person who builds and repairs wooden wheels, a profession that had existed for centuries until the arrival of the modern automobile. In a few short decades, the craft of the wheelwright had nearly disappeared.

It seems that about forty years (or less) is all it takes to destroy a formerly secure job. The question is, could it happen to the role of the business analyst, and when?

This isn’t a strange question when even coders are experiencing a shift in how their role is perceived.

In February of 2017, WIRED magazine published an article asking if coding was the next blue-collar job in America. The article pointed out that coding had become a two-tiered career choice. Silicon Valley only employs eight percent of the USA’s coders and programmers. The rest are in other business sectors. “These sorts of coders won’t have the deep knowledge to craft wild new algorithms for flash trading or neural networks. Why would they need to? That level of expertise is rarely necessary at a job. But any blue-collar coder will be plenty qualified to sling JavaScript for their local bank.” (Thompson, 2017.)

In an interview with NPR, Clive Thompson, the article’s author, noted that a huge number of available coding jobs don’t require the high-level skills that would get a person a gig at Google. “But the truth is, you know, an awful lot of programming doesn’t require or need that type of, you know, crazy pouring out of creativity. I guess it is more like maintenance or the slow stable making sure that a company is sort of moving along, that its software is working.” More importantly, this sort of coding doesn’t always require a college degree. There are plenty of self-taught coders who are gainfully employed, and there are more and more courses available where a person can land a junior-level position after completing the syllabus.

If you look up the definition of blue-collar via Wikipedia, the page says that a blue-collar worker is a person who performs non-agricultural manual labor that can be skilled or unskilled. Examples of blue-collar jobs include firefighting, manufacturing, and mining. But why would Clive Thompson compare an obviously non-physical profession to a blue-collar job? I think that, viewed in terms of the changes happening now, more and more blue-collar jobs will shift (and have shifted) away from the purely physical. Many blue-collar jobs are already there. Working in a car assembly plant in the 21st century means working with, and alongside, technology. Modern mining operations require a similar level of understanding of the machinery once the person in the entry-level position wants a promotion. However, the primary identifying characteristic of these jobs will continue to be the requirement to produce the same things over and over again. Viewed from that perspective, assembling code in the future could wind up like assembling a car. Everyone tackles a piece of code that they repeatedly produce (without much variation), but they never do more than that.

This brings me back to the role of the business analyst. Viewed from the blue-collar perspective, the role of business analyst shares many of the same characteristics as ‘everyday’ coding. It’s rare that we work on a project that’s inventing something new, and it’s also rare that we need a vast amount of creativity to get the job done. We perform a lot of the same actions on every project, with minimal variation on the theme. Even the BABOK states that the job can be done by many different roles. “A business analyst is any person who performs business analysis tasks described in the BABOK® Guide, no matter their job title or organizational role. (…) Other common job titles for people who perform business analysis include business architect, business systems analyst, data analyst, enterprise analyst, management consultant, process analyst, product manager, product owner, requirements engineer, and systems analyst.” (BABOK v3.)

There are many people performing business analysis, under many guises, and more joining the field every day as the demand increases. We come from a variety of backgrounds (like coders) and don’t need a specialized college degree. Much like the coders Clive Thompson talks about in his article, we’re the workers, “making sure that a company is sort of moving along.”

And this creates an issue, because it means there are many factors that could disrupt the role of the business analyst in the future.

The most obvious one is supply and demand. To date, there’s always been a demand for business analysts. The market is currently in an equilibrium where supply and demand appear evenly matched, and the price the market is prepared to pay for business analysis skills means most people in the role can maintain a solidly middle-class existence.

However, the law of supply and demand also says that if ever the time arrives where the demand side decreases because the economy is forcing organizations and businesses to cut back on spending, or an oversupply of business analysts means the price can be lowered for the service, then the role becomes far less attractive.

The role can also be disrupted by technology and methodologies. Agile already attempted to do this with a focus on developers and the business working directly with each other. With AI on the horizon, it’s entirely possible someone will figure out a way for a customer to answer a series of questions and the AI will synthesize the answers and produce a decent specification or set of user stories as an output.

The only thing that may save the business analyst’s role in the near future is that the role also falls into the pink-collar job classification. Originally used to denote a group of service jobs predominantly performed by women in the 1970s, the term has slowly morphed into a way to classify jobs that require social skills and consist of interacting with people and customers.

“For an office worker, that could mean being able to communicate across departments. For someone in customer service, it’s interacting with another complicated human. For a care provider, it’s the empathy to help someone vulnerable and in need. These are all skills robots are really bad at—at least for now. And they have, over the last three decades, become increasingly vital in the labor market.” (Greenfield, 2016.)

But as we all know, industries change quickly, and roles and jobs that were once seen as necessary suddenly become unnecessary.

So, for now, and in the near future, our jobs as business analysts might be safe. Our social skills and certifications may save us. But as plenty of people have found out the hard way over the centuries and decades, disruption in an industry comes swiftly, and the effects are devastatingly immediate.

Clive Thompson, “The Next Big Blue-Collar Job Is Coding”, WIRED (2017).
IIBA, BABOK v3: A Guide to the Business Analysis Body of Knowledge (International Institute of Business Analysis, Toronto, Ontario, Canada, 2015).
“‘Wired’ Declares Coding As Next Blue-Collar Job Boom,” NPR (2017).
Rebecca Greenfield, “Forget Robots—People Skills Are the Future of American Jobs. You might call it pink-collar work. Experts call it the future of the labor market”, Bloomberg (2016).
Peter Waldman, “Inside Alabama’s Auto Jobs Boom: Cheap Wages, Little Training, Crushed Limbs. The South’s manufacturing renaissance comes with a heavy price”, Bloomberg (2017).
Ivy Pinchbeck, Women Workers and the Industrial Revolution (Frank Cass and Company, 1930, 1969, 1977).

Stuck in the Middle

As Business Analysts, we have all been there.

We held high hopes for a collegial give-and-take in a workshop or a productive meeting where processes and requirements are teased out. In our imagination, visions of cooperative brainstorming danced before our eyes along with the shimmering promise of decisive and swift stakeholder agreement. The solution would be optimal and deliver a quality outcome.

Then the project starts, and the original objectives are lost. It moves from the potential of an excellent solution to a half-baked compromise. The users only have a fraction of what they wanted, and that fraction does not do anything useful. Everyone swears they did not ask for what they got, although our requirements management spreadsheet/system says otherwise (and so do the stakeholder signatures). In the world of Scrum, the product owner looks at the design and keeps saying, “Is that what I asked for?” Moreover, the design keeps changing.

The Business Analyst struggles their way through the project feeling like a failure, wondering why the exciting techniques described in the BABOK (brainstorming, collaborative games, experimenting, research!) seem so ineffective.

For example, the BABOK v3 notes that “Workshops can promote trust, mutual understanding, and strong communication among the stakeholders and produce deliverables that structure and guide future work efforts.” However, the BABOK does list some limitations. “The success of the workshop is highly dependent on the expertise of the facilitator and knowledge of the participants. Workshops that involve too many participants can slow down the workshop process. Conversely, collecting input from too few participants can lead to the overlooking of needs or issues that are important to some stakeholders, or to the arrival at decisions that don’t represent the needs of the majority of the stakeholders.”

Research argues that the long-standing advice about collaboration in the workplace may be entirely wrong. The paper “Equality bias impairs collective decision-making across cultures” suggests that decisions made during meetings, workshops or as part of a collaborative team, will likely not only be less than optimal, but possibly substandard unless all participants are at an equal level of expertise and (just as important) awareness.

Ali Mahmoodi and his coauthors wrote, “When making decisions together, we tend to give everyone an equal chance to voice their opinion. To make the best decisions, each opinion must be scaled according to its reliability. Using behavioral experiments and computational modeling, we tested (in Denmark, Iran, and China) the extent to which people follow this latter, normative strategy. We found that people show a strong equality bias: they weight each other’s opinion equally regardless of differences in their reliability, even when this strategy was at odds with explicit feedback or monetary incentives.”

The problem is compounded by the inability of most people to recognize when they are not competent. “A wealth of research suggests that people are poor judges of their own competence—not only when judged in isolation but also when judged relative to others. For example, people tend to overestimate their own performance on hard tasks; paradoxically, when given an easy task, they tend to underestimate their own performance (the hard-easy effect) (1). Relatedly, when comparing themselves to others, people with low competence tend to think they are as good as everyone else, whereas people with high competence tend to think they are as bad as everyone else (the Dunning–Kruger effect) (2). Also, when presented with expert advice, people tend to insist on their own opinion, even though they would have benefitted from following the advisor’s recommendation (egocentric advice discounting).” (Mahmoodi et al., 2015.)

This suggests that even in a facilitated workshop, the bias will not be sufficiently neutralized to get the desired outcomes. People will still insist on their own view of requirements, even when faced with differing opinions. Alternatively, they defer to the person perceived as having higher competence, even if that perception is inaccurate. As suggested by Mahmoodi’s paper, and contrary to received wisdom, the best strategy for arriving at a set of optimal requirements might first involve determining the participants’ skill levels before deciding which requirements are more valid. The research also suggests that fewer, more knowledgeable participants in a workshop or meeting could produce a clearer set of requirements.
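The normative strategy Mahmoodi and his coauthors describe, scaling each opinion by its reliability rather than weighting all opinions equally, can be sketched in a few lines. The figures below are invented for illustration and are not taken from the paper:

```python
# Equal weighting vs. reliability weighting of opinions (illustrative only).

def equal_weight(estimates):
    """The equality-biased strategy: every opinion counts the same."""
    return sum(estimates) / len(estimates)

def reliability_weight(estimates, reliabilities):
    """The normative strategy: scale each opinion by its track record."""
    total = sum(reliabilities)
    return sum(e * r for e, r in zip(estimates, reliabilities)) / total

estimates     = [100.0, 110.0, 160.0]  # third participant is far off...
reliabilities = [0.9,   0.8,   0.3]    # ...and historically least reliable

print(equal_weight(estimates))                       # 123.33...: pulled toward the outlier
print(reliability_weight(estimates, reliabilities))  # 113.0: the outlier is discounted
```

With equal weighting, the unreliable outlier drags the group estimate away from the reliable participants; reliability weighting discounts it. This is the strategy the study’s participants consistently failed to follow, even when given feedback and monetary incentives.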

The danger of assuming expertise (the Dunning-Kruger effect) and the demonstration of equally weighting people’s opinions are witnessed in a real-world project example from New Zealand. Novopay was an infamous education sector payroll project, and the government ran an inquiry to identify the issues that led to its failure. The inquiry specifically called out the SMEs (Subject Matter Experts) in the report. “The Ministry had difficulty providing sufficient SMEs with adequate knowledge, and there were many examples of SME input being incomplete, inaccurate or contradictory.” (Jack and Wevers, 2013.) The Ministry did not have the expertise to realize their SMEs were not providing competent information, and the SMEs thought they had sufficient expertise to provide advice on a software development project. As the SMEs and the Ministry agreed with each other and at the same time deferred to each other, it is no surprise that the project had major issues.

This behavior is not unique, and anecdotal evidence suggests many projects fall into the same trap. David Dunning (one of the researchers who identified what is now called the Dunning-Kruger effect) points out that our minds can be, “(…) filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of useful and accurate knowledge. This clutter is an unfortunate by-product of one of our greatest strengths as a species. We are unbridled pattern recognizers and profligate theorizers.” (Dunning, 2014.)

The problem of cognitive biases may also help explain some of the frustrations evident in the Agile world. Scrum tries to deal with the issue of identifying who can guide the product vision by assigning one role to the task—the product owner. Scrum aims to reduce the hazards associated with everyone making decisions by making one person responsible for the product. Of course, the pitfall is that the product owner needs to have real expertise, as opposed to thinking they have expertise. Although the Agile approach seems instinctively better (collaboration, sustainable pace, self-organizing teams, business people and developers working together), Agile remains as susceptible to failure as the waterfall model. Perhaps it comes down to the simple fact that the Agile Manifesto was conceived by experts who may have assumed that everyone else was highly skilled. In the ordinary world, there are enough people trying to use Agile who are precisely none of those things.

So, what’s a business analyst to do in the face of the knowledge that we are all affected by cognitive biases and metacognition errors?

Luckily, business analysts have the only role on the project that stands a chance of seeing past the biases. We are tasked with collecting information, reviewing it as best we can, and producing what we hope is an optimal solution. We are forced to keep an open mind and arrive at our conclusions by weighing up options. As business analysts working on a project-by-project basis, there are many occasions when we have little or no knowledge about the business area or organization that we are working for. That makes it impossible to maintain an illusion that we are competent, because it is obvious that we are not. Therefore, we have already cleared one hurdle: we have enough expertise to realize we are not an expert, and to seek assistance from others.

This of course, can be a double-edged sword. If we have worked in one industry for a period of time, there’s a danger (as per the Novopay example) that we assume we know the job intimately enough to produce a sound set of requirements without consulting anyone else in the business.

We also need to contemplate whether we have the right business analysis skills for a project or if we are at the right level to tackle the task ahead. If we consider the pitfall of cognitive biases, it is obvious that we could fall into the trap of thinking that we are proficient in analysis when we are not. Therefore, the IIBA certifications become an important instrument in helping to offset this delusion. By gaining certification, we have gone some way to proving we have a level of mastery in the business analysis arena.

Even certification does not completely get us off the hook. Dunning points out education’s limits. “Here’s a particularly frightful example: Driver’s education courses, particularly those aimed at handling emergency maneuvers, tend to increase, rather than decrease, accident rates. They do so because training people to handle, say, snow and ice leave them with the lasting impression that they are permanent experts on the subject. In fact, their skills usually rapidly erode after they leave the course. Months or even decades later, they have confidence but little leftover competence when their wheels begin to spin.” (Dunning, 2014.)

Recertification, although painful, may be the necessary thorn in our sides that prevents us from assuming we are still good business analysts twenty years after we read a book on the subject.

Finally, there’s one consoling aspect of learning about cognitive biases: we can be less hard on ourselves if we are struggling to get any agreement on requirements, or if the user stories cannot be corralled into a sensible design. It may be a clear demonstration of Dunning-Kruger and the equality bias in full effect rather than the fault of the business analyst.

Then again, maybe that is just another example of an error in thinking. As David Dunning notes, cognitive biases are, “the anosognosia of everyday life.” (Dunning, 2004.)

“As such, wisdom may not involve facts and formulas so much as the ability to recognize when a limit has been reached. Stumbling through all our cognitive clutter just to recognize a true “I do not know” may not constitute failure as much as it does an enviable success, a crucial signpost that shows us we are traveling in the right direction toward the truth.” (Dunning, 2014.)

IIBA, BABOK v3: A Guide to the Business Analysis Body of Knowledge (International Institute of Business Analysis, Toronto, Ontario, Canada, 2015).
Ali Mahmoodi, Dan Bang, Karsten Olsen, Yuanyuan Aimee Zhao, Zhenhao Shi, Kristina Broberg, Shervin Safavi, Shihui Han, Majid Nili Ahmadabadi, Chris D. Frith, Andreas Roepstorff, Geraint Rees, Bahador Bahrami, "Equality bias impairs collective decision-making across cultures," Proceedings of the National Academy of Sciences (2015).
Murray Jack and Sir Maarten Wevers, KNZM, Report of the Ministerial Inquiry into the Novopay Project (New Zealand Government, 2013).
David Dunning, "We Are All Confident Idiots," Pacific Standard, Miller-McCune Center for Research, Media, and Public Policy (2014).
David Dunning, Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself (Taylor & Francis, 2004).

Editor’s Notes
(1) The hard–easy effect is a cognitive bias that manifests itself as a tendency to overestimate the probability of one’s success at a task perceived as hard and to underestimate the likelihood of one’s success at a task perceived as easy. (Wikipedia.)

(2) The Dunning–Kruger effect is a cognitive bias in which low-ability individuals suffer from illusory superiority, mistakenly assessing their ability as much higher than it really is. (Wikipedia.)

Data Migration – The Journey of a Thousand Miles

You’ve held workshops, you’ve consulted your stakeholders, you’ve written requirements, you’ve produced use cases, you’ve collaboratively designed the UI and everyone is happy. And then seemingly out of nowhere you find that the data from the old system refuses to fit neatly into your new system. Suddenly the project that has been going so well is plunged into chaos.

How the heck did it happen?

Data migration is typically the most overlooked component of a project that involves moving from an old system to a new one. (Note: I’m using the generic term system to cover everything from applications to websites.) While there can be many people involved in discovering the new business requirements or in designing a new UI, the data migration task itself tends to be either forgotten about or delegated to one person (typically a more junior member of the team). On the surface, data migration appears to be one of the easier tasks to complete. After all, it’s just transferring the data from one system to another. This perceived simplicity leads many project managers (and sometimes the business analyst) to think that data migration can be separated from the main body of tasks needed to deliver the system. Even the BABOK talks vaguely about this area in section 7.4, Define Transition Requirements. The task of data migration isn’t specifically mentioned. It’s framed as, “move information between the new and old solution” or, in the case of the data itself, “Rules for conversion of this information will need to be developed, and business rules may need to be defined to ensure that the new solution interprets the converted data correctly.”

This impression that the task is small in scale typically leads to scheduling the data migration near the end of the project rather than at the beginning. Unfortunately, leaving the migration analysis until later, or not understanding its full implications, can have fairly devastating results. The project can wind up running late and the budget is blown. Even worse, the new system starts with bad data from the old system, or with no data at all. If a decision is made not to move the data so that the delivery date doesn’t shift, then the data ends up split between two systems. The old system has to keep going for longer than intended, and costs balloon as two systems are maintained to do the same job.

Data migration typically goes wrong because of a misunderstanding of what it means to collect data. Reduced to its most basic components, a computer system is merely a way to collect data, store data, perform an operation on the data, get a result, and then use that result to generate an outcome or more data. For example, in a billing system you collect data (the name of the person being provided with a service, their contact details and the service they are being billed for), you perform an operation on it (calculating whether the person owes any money for the billing period) and then you generate an outcome and/or more data (you send the person a bill and then receive a payment, or the person remains in debt).
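The billing example above can be sketched in a few lines. This is purely illustrative; the field names, amounts and the `calculate_balance` helper are assumptions, not anyone's real billing system.

```python
def calculate_balance(charges, payments):
    """Operate on the data: work out what the customer still owes."""
    return sum(charges) - sum(payments)

# Collect: the customer, their contact details and the billed service.
account = {
    "name": "A. Customer",
    "email": "a.customer@example.com",
    "service": "broadband",
    "charges": [49.95, 49.95],   # two billing periods
    "payments": [49.95],         # one payment received
}

# Outcome: a bill if money is owed, otherwise nothing to send.
balance = calculate_balance(account["charges"], account["payments"])
if balance > 0:
    print(f"Send bill to {account['name']} for {balance:.2f}")
```

Every field in `account` is data that a migration would one day have to carry into a replacement system.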

How that data is collected and stored in the old system, and how it’s collected and stored in the new system, determines whether your data migration will be straightforward or difficult. In data warehousing, the process of moving data from a source system to a target system is known as ETL (Extract, Transform, Load).
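A minimal ETL skeleton makes the three stages concrete. The CSV source, the `name` field and the title-casing rule are all placeholder assumptions for illustration, not a prescribed design.

```python
import csv

def extract(path):
    """Extract: read rows out of the source system (here, a CSV export)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: reshape the data to fit the target system.
    Here we just normalize names to title case as an example rule."""
    return [{**row, "name": row["name"].title()} for row in rows]

def load(rows, target):
    """Load: write the transformed rows into the target system
    (represented here by a plain list)."""
    target.extend(rows)

target_table = []
# In a real run: load(transform(extract("customers.csv")), target_table)
```

Almost all of the pain described in this article lives in the middle function; extraction and loading are usually the easy parts.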

The key to understanding the difficulties you may experience with data migration is the ‘transform’ part of ETL. It’s highly unlikely that you will move from your old system to a new system without having to transform your data in some way. A typical problem is that the old system stores the address details in one field, with the values separated by commas. This means the street number, street name, suburb and city values are all contained in a single field. The new system, however, has a separate field for each of these values. You now have to figure out how to move the values sitting in one field in your old system into the new system’s multiple fields. If you’re very lucky, the users have consistently separated each value in the field with a comma. If you’re unlucky, there haven’t been any rules. Or several users have made up their own rules – instead of separating each element with a comma, they have separated each element with a pipe (|).
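A transform for the address example above might look like the following sketch. The four-field layout and the comma/pipe handling are assumptions taken from the scenario in this article; real address data is usually messier still.

```python
def split_address(raw):
    """Split one concatenated address field into the four separate
    fields the new system expects. Users have used either commas or
    pipes as separators, so normalize the delimiter first."""
    normalized = raw.replace("|", ",")
    parts = [p.strip() for p in normalized.split(",")]
    if len(parts) != 4:
        # Can't be parsed safely; flag the record for manual correction.
        return None
    street_number, street_name, suburb, city = parts
    return {
        "street_number": street_number,
        "street_name": street_name,
        "suburb": suburb,
        "city": city,
    }

print(split_address("12, High Street, Newtown, Wellington"))
print(split_address("12|High Street|Newtown|Wellington"))  # pipe variant
print(split_address("12 High Street Wellington"))          # needs a human
```

Note the `None` branch: a transform rule is only as good as the records that obey it, and the leftovers become someone's manual workload.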

To the inexperienced and non-technical members of the team, this type of problem can seem a mere annoyance, and project managers can sometimes dismiss it as a technical person overstating their case.
However, even the smallest data problem can quickly add costs and force the business to make some tough choices. For example, if the business wants to solve the address value problem and move the values to the new system correctly, someone will have to write an ETL script to transform that data. And before the script can be run, the data will have to be analyzed and cleaned to ensure the ETL can execute without failure. If there are thousands (or millions) of records, then cleaning the data until it is consistent enough to transform and load into the new system may require hiring temporary personnel to correct records manually, depending on the state of the data. If the address values have been entered in an unstructured manner and no transform rules can be applied, the data can only be corrected using human intervention and judgement.
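Before anyone hires temp workers, a cheap profiling pass over the source data can size the manual workload. This sketch reuses the article's address scenario; the parsing rule (exactly four comma- or pipe-separated parts) is an assumption for illustration.

```python
def profile_addresses(records):
    """Count how many records a transform rule could handle
    automatically versus how many will need manual correction."""
    auto, manual = 0, 0
    for raw in records:
        parts = raw.replace("|", ",").split(",")
        if len(parts) == 4:
            auto += 1
        else:
            manual += 1
    return {"auto": auto, "manual": manual}

sample = [
    "12, High St, Newtown, Wellington",
    "4|Main Rd|Aro Valley|Wellington",
    "Flat 2 99 Cuba Street",            # unstructured: needs a human
]
print(profile_addresses(sample))
```

Run over the full table, numbers like these turn "a technical person overstating their case" into a concrete estimate of cleanup effort the business can weigh.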

The seemingly minor technical issue of transferring data suddenly becomes a costly and time-consuming task requiring temp workers and a developer to write the ETL scripts. Faced with rising costs and an extended delivery date, the business can start to panic, and the subsequent decisions can result in the data in the new system being poorly structured before it even goes live. For example, the offending values are simply moved en masse into a comment field in the new system, with the intent that the users will correct the problem during their normal working day.

Other issues with the data result in the entire process being deemed “too hard”, and only the data that can be transferred on a one-to-one basis is moved. For example, only a person’s first name, middle name, last name, gender and date of birth go into the new system; everything else is archived. Archiving data is perfectly fine if you never have to look at it again. However, that is highly unlikely, and having to search across two systems creates a less than optimal user experience.

Data migration tends to have five consistent factors that contribute to issues during delivery of a project.

  1. The person responsible for performing the gap analysis may not have a data background or has ignored the significance of redesigning the business processes in terms of data collection and storage.
  2. The data migration itself is left to the last minute and is assigned to a different business analyst or a tester. They are typically isolated from the business because the migration is seen as a separate technical task. The task might also be assigned to the least experienced member of the team, such as a junior business analyst.
  3. The business has decided not to collect certain types of data any more or they are unsure as to why they collected the data in the first place. The initial analysis fails to identify the other units or departments in the organization that may still have a use for the data.
  4. The migration takes far longer than anticipated. A data migration can turn into a considerable intellectual challenge that requires months of analysis. This is especially true for payroll projects that have to migrate an organization’s entire employee history, including leave, overtime, allowances and pay rates.
  5. No one has factored in the defect rate for the migration. Even successful migrations can have a small defect rate that needs to be addressed once the records are moved. Knowing whether the migration has to be started again or whether the defect can be manually corrected can make all the difference between a delay of days, weeks or months. It’s very rare for a data migration to achieve 100% perfection when moving data from one system to another. You should always allow additional time to review and clean data in the new system if it’s needed.
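Point 5 above implies a reconciliation step after every migration run: compare the source and target record by record so defects are found and sized before anyone has to choose between a re-run and manual fixes. A sketch, with an assumed record shape and key field:

```python
def reconcile(source, target, key="id", fields=("last_name", "dob")):
    """Return a list of (key, problem) pairs describing records that
    failed to migrate cleanly."""
    defects = []
    target_by_key = {rec[key]: rec for rec in target}
    for src in source:
        tgt = target_by_key.get(src[key])
        if tgt is None:
            defects.append((src[key], "missing in target"))
            continue
        for field in fields:
            if src.get(field) != tgt.get(field):
                defects.append((src[key], f"mismatch in {field}"))
    return defects

old = [{"id": 1, "last_name": "Ngata", "dob": "1980-04-01"},
       {"id": 2, "last_name": "Smith", "dob": "1975-11-23"}]
new = [{"id": 1, "last_name": "Ngata", "dob": "1980-04-01"}]
print(reconcile(old, new))  # record 2 never made it across
```

Even a crude check like this tells you whether the defect rate is a handful of records to fix by hand or a systematic failure that means starting again.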

Considering all of the above points, are data migration issues solvable? The key is to start as early as possible, and make sure you consult with your development team.

Your first step is to talk to your Data Warehouse team or find a developer who knows ETL.

Depending on the complexity of your data migration you’re going to need help. You need to get that help from someone who knows ETL.

You should also have a clear understanding of what it means to Extract, Transform and Load.

Your second step is to construct a data model for both systems.

What does your current system do? With any luck, there is already an existing model. What does your new system do? With any luck, someone in your team has already completed a data model.

If you don’t have a model for one system (or it’s missing for both systems) then you need to construct one. It’s the only way you can compare the state of the data in both systems and check for gaps.
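Once both models exist, even a field-level comparison is valuable. This sketch reduces each model to a set of field names (the names themselves are assumptions for illustration) and uses set differences to expose the gaps:

```python
# Field lists taken from the two data models (illustrative names only).
old_model = {"name", "address", "phone", "fax"}
new_model = {"name", "street_number", "street_name", "suburb",
             "city", "phone", "email"}

only_in_old = old_model - new_model   # needs a transform, or archiving
only_in_new = new_model - old_model   # needs a source, a default, or both

print("No home in the new system:", sorted(only_in_old))
print("No source in the old system:", sorted(only_in_new))
```

Here `address` shows up immediately as a field with no one-to-one destination, which is exactly the concatenated-address problem from earlier in the article.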

Your third step is to identify odd fields or problems with the way the data is stored.

When you look at your model do you have a one-to-one match between both systems? Or are there strangely labelled fields that seem to make no sense? Is it as per the example at the start of the article – you have values concatenated into a single field that must be split into separate fields in the new system?

Depending on how rushed your developers or vendors were when they built the old system, they may have cut some corners in how fields were named ‘under the hood’. I recently reviewed a system that had its date fields labelled “Date1”, “Date2”, “Date3” and “Date4”. The UI had date fields on different pages with slightly different meanings (one was a create date, one a modify date, one a delete date and one a create date for a separate record). However, when these dates were stored in the database, the field names carried no context. For example, is “Date1” the date the record was updated or the date the record was created?

You need to have a good understanding of what each field means before you can decide how (or if) you can move the data to the new system.
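One way to build that understanding is to test a hypothesis about an ambiguous field against the data itself. If “Date1” really is the create date and “Date2” the modify date, then Date1 should never be later than Date2. The column names and sample rows below are assumptions for illustration:

```python
def holds_for_all(records, hypothesis):
    """Return the fraction of records for which the hypothesis holds."""
    hits = sum(1 for rec in records if hypothesis(rec))
    return hits / len(records)

rows = [
    {"Date1": "2021-03-01", "Date2": "2021-06-15"},
    {"Date1": "2020-01-09", "Date2": "2020-01-09"},
    {"Date1": "2022-05-30", "Date2": "2022-02-02"},  # counter-example
]

# ISO dates compare correctly as strings, so no parsing is needed here.
ratio = holds_for_all(rows, lambda r: r["Date1"] <= r["Date2"])
print(f"Date1 <= Date2 holds for {ratio:.0%} of records")
```

A ratio well below 100% means the hypothesis is wrong, or the data is dirty; either way you have learned something before writing the migration rules.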

Your fourth step is to look at the specifications for the new system.

If you can’t get a data model for the new system because it’s still being designed, see if you can spot any obvious problems from the specification (if the project has one).

Look for things that everyone will have presumed were covered off but weren’t. Is data missing? Is the cardinality (the relationship between the data) incorrect?

Any or all of these things could indicate that you need to re-do the gap analysis or begin a new gap analysis.
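A suspected cardinality problem can often be confirmed directly from the old system’s data. This sketch checks whether a relationship the specification assumes is one-to-one is really one-to-many; the customer/account pairing is an assumption for illustration.

```python
from collections import Counter

def max_children_per_parent(pairs):
    """pairs: (parent_id, child_id) tuples pulled from the old system.
    Returns the largest number of children any one parent has."""
    counts = Counter(parent for parent, _ in pairs)
    return max(counts.values())

links = [(1, "a"), (2, "b"), (2, "c"), (3, "d")]  # customer -> account
if max_children_per_parent(links) > 1:
    print("Relationship is one-to-many; a one-to-one spec is wrong")
```

Customer 2 owns two accounts, so a new system that allows only one account per customer would silently lose or mangle data on migration.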

Your fifth step is to find all the consumers of the data.

Other units or business areas in your organization may consume the data. You need to find all of these consumers because, even if the business no longer wants to collect the data, it may be very important to other areas.

For better or for worse, someone asked for the data. It’s entirely possible that it was simply added for one person’s reporting needs at the time. However, you need to dig these facts out and make sure that if the decision is to ignore the data going forward, it won’t cause problems later on.

This seems another obvious step, but it can be accidentally lost if the new system is complex or there are many people involved in the project.

Your sixth step is to check your findings with the business.

After your research is complete, you should be able to discuss any anomalies with the business, but more importantly you should identify the possible outcomes of any migration problems. Typically, missing data in the new system may interfere with users’ ability to complete their tasks. The business may not have been aware of this problem when specifying the requirements for the new system.


You should be prepared for your data migration to be more difficult than originally assessed, and for being forced to make decisions whose outcomes are not always ideal. You should also be prepared for the business to misunderstand the implications of what they’ve asked for, or, when they do understand, to be overwhelmed by the decisions that need to be made.

Project pressures may result in solutions that are not only less than ideal, they also create problems from a BAU (Business as Usual) perspective.

Attempting your data migration too late in a project may mean that your journey doesn’t even begin.

A successful project team will realize that it must start its data migration analysis as soon as possible and use experienced analysts to give itself the best chance of completing a successful transition to the new system.

