


Beyond The Happy Path: One Size Does Not Fit All

Up until a few years ago, I used to spend a lot of time working ‘on the road’. I’d spend time traveling between different client-sites, and this would inevitably mean spending far too much time in airports. Business travel is one of those things that sounds really glamorous until you do it, but believe me it soon gets really boring.

 

When you are a regular traveler, you tend to become very familiar with certain airports and you know exactly how to transition through them quickly. If you’re ever at an airport, you can usually spot regular travelers, as they tend to know exactly where they are going and tend to move at pace. This is completely different from the family checking in with three kids, who have to refer to the signs and might even have initially arrived at the wrong terminal. Quite understandably, they often need a little bit more help.

I used to joke that it would be good to have an entrance especially for regular travelers, as their needs are so different (I certainly wouldn’t be buying anything from the duty free store, not even one of those giant airport Toblerones that seem ubiquitous in Europe, but a family may well do). This wouldn’t be practical in airports, but it does highlight a point that has direct relevance for business and business analysis: sometimes you need two (metaphorical) entrances…

 

Understanding Different Usage Patterns

When defining a process, journey or set of features for an IT system, it’s common to think about one path or scenario through which users will navigate.  This main success scenario or ‘happy path’ is then often supplemented by exceptions and alternative paths which are essentially ‘branches’ from the main scenario.

Yet it is worth considering that different types of stakeholders might have different needs as they navigate through. There might even be benefit in having two entry points. Building on the airport analogy, an experienced traveler probably doesn’t need to be told about the security rules (unless they have changed recently). A new or less frequent traveler may well need to be informed in detail.

This translates to wider contexts too. An experienced user of an IT system probably won’t want lots of dialogue boxes popping up with hints and tips. A new user might need virtual hand-holding as they learn how the application works.  Someone who calls a call center once might need to hear that three minute Interactive Voice Response (IVR) message explaining all about the services they can access via phone, and letting them know which number to press.  Someone who calls every day as part of their job probably doesn’t need to listen to the whole three minute spiel every time they call… Understanding these usage patterns is key.
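The idea of adapting the journey to the user's experience level can be sketched in code. This is a minimal illustrative example (the `User` profile and the session-count threshold are hypothetical, not from any particular system): guided hints are shown only to users who are still learning their way around, while frequent users skip them.

```python
from dataclasses import dataclass

@dataclass
class User:
    """Hypothetical user profile; field names are illustrative."""
    name: str
    sessions_completed: int

def show_onboarding_tips(user: User, threshold: int = 5) -> bool:
    """Show guided hints only to users still learning the system.

    Frequent users (at or above the session threshold) skip the
    virtual hand-holding, mirroring the 'two entrances' idea.
    """
    return user.sessions_completed < threshold

# A first-time user gets the tips; a daily user does not.
assert show_onboarding_tips(User("new visitor", 1)) is True
assert show_onboarding_tips(User("power user", 200)) is False
```

The same branching could equally drive an IVR that plays the full three-minute introduction only to callers it hasn't seen before.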

 


 

One Size Rarely Fits All

There is often a desire to standardize processes, journeys and customer experiences. There is benefit in doing this, but the benefit really surfaces when the different users and stakeholders are understood. Understanding whether users will be casual/occasional or regular is important, as is understanding what they are ultimately trying to achieve.

This relies on elicitation and customer research. This is an area where business analysts can add value by advocating for the customer’s perspective. Too often definition and design decisions are made by people in comfortable conference rooms who are detached from what the experience will actually be like. Sometimes those decisions are made by people who haven’t spoken to a real customer in a decade (or ever!).

In these situations we can ask important but difficult questions such as “what evidence do we have that customers want that?”,  “which types of customers does that appeal to most?” or “how do we know this will be a priority for our customers?”.  Using a technique such as personas, when coupled with proper insight and research, can make a real difference here.

 

As in so many cases, asking these questions can sometimes be uncomfortable. But if we don’t ask them, who will?


Overcoming 3 Common Challenges of Business Process Modelling

Identifying and depicting business processes is the first step towards understanding the current state and developing a plan for the future. Business analysis activities are often oriented towards enabling and supporting change. The key benefit of having a process model is that it enables a business to quickly see how well the different parts of the business are aligned to achieve common goals. When there is misalignment, it becomes evident very quickly in the model, and the business can plan how it will get properly aligned again.

Business analysts, using primarily elicitation and modelling techniques, try to find out the means by which an organization carries out its internal operations and delivers its products and services to its customers.

 

However, process modelling and analysis can be tricky. Below are some challenges:

 

  1. Figuring out the tasks

It’s difficult to obtain information about the complete process when many departments are involved. Usually each part of the process is aware of the specific tasks and activities in which it is involved, but misses the whole picture. Often it is only after the process modelling has been finalized that the actors involved can understand the end-to-end process holistically.

  • Figuring out first who is involved and where the process starts and ends is crucial in order to drill down and find the details for each step. It may be a good idea to begin with the most experienced actors or those who have a helicopter view. It is essential, however, to validate your insights against other sources of information to be sure that you have captured accurate information.
  • Having information about the industry context may be helpful, as the basic business processes among organizations in the same industry have things in common. This, of course, does not mean that the specific organization’s parameters should not be taken into consideration.

 


 

  2. Systems Thinking

Consistency is a necessary verification criterion in process identification and modelling. The steps and tasks involved in the process should make sense as parts that form a whole, not as independent elements. In order to meet deadlines and get immediate results, business analysts frequently cut down the time spent understanding the context. The value delivered by a process modelling initiative will be limited as long as we treat analysis as merely figuring out specific characteristics of a solution that stakeholders have already predefined in their minds. Systems thinking is a vital mindset that allows a business analyst to understand the as-is state and communicate it in a way that will be commonly understood by all stakeholders. This is an essential step in defining the future state.

 

 

  3. Understanding how the process fits into its environment

If a model doesn’t show how the process fits into its environment, the process will struggle, and its likelihood of success is greatly diminished. Understanding the fit of a process within its internal and external environment is a complex, multi-faceted exercise. A business analyst needs to understand who the actors are, what their needs are, and how they can be reached. The business also needs to know who its suppliers are. Only when all relationships between the internal and external environment are understood can the business analyst ensure an effective process model is being shaped.

 

Process analysis via modelling can trigger the identification of misalignment issues and an understanding of the business’s problems and opportunities. Through effective process modelling, the following questions can be answered:

  • What processes does the business currently maintain?
  • How do the processes fit within their environment?
  • How do the processes create and maintain value in the external and internal environments?
  • What is the gap between the as-is and to-be states?

The Pitfalls Of Efficiency: Process Improvement Is A Balancing Act

Business analysis work often involves improving processes. This might include simplification of a process, reengineering or automation. When used well, IT can be used to enhance (or even completely rethink) a process. The ideal outcome is to design a process that is quicker, more convenient and more cost-effective than what it replaces.

 

When aiming for efficiency, it’s important to ask “for whom are we optimizing this process?”. This might sound like an odd question to ask, but often there’s a fine balancing act. A process that appears very efficient for a company might actually be very inefficient and inconvenient for its customers. Standardizing a procurement process might create internal efficiencies for the company involved, but might place additional work on the company’s suppliers.

 

An Example: “No Reply” Secure Email

I was recently a customer of a company that would send correspondence via secure email. I’d receive a notification via regular email, and I’d then need to log in to the company’s secure email portal to read what they had sent me. This was fine, except the emails they sent were all from a ‘no reply’ address.  While the secure email system they had implemented literally had a ‘reply’ button, there was a disclaimer on every email they sent which said “don’t reply, as we won’t read what you send us” (OK, it wasn’t that blunt, but you get the idea!).

This led to the crazy situation where the only way of replying to their secure emails was to either call via phone (and queue for 45 minutes), or put a reply in the mail.

 

This is an example of a situation where convenience and savings are predominantly biased towards the company, with some minor benefit for the customer. Prior to sending secure email, they would put correspondence in the regular mail. Moving this to an electronic platform presumably saves in printing, postage and stamps. It’s of marginal benefit to customers too, as they receive correspondence quicker (providing they look at their email regularly).

But the real customer benefit would have been to be able to correspond and reply with the company via secure email. Ironically, by implementing the solution the way that they did I suspect their ‘no reply’ mailbox is actually full of replies from customers who didn’t read their disclaimer!

 


 

There is no “right”, it’s a balance

As a customer, I found the situation frustrating, but there is no inherent or universal ‘right’ answer here. It might be that the company in question had deliberately chosen not to accept incoming secure email for compliance reasons, or perhaps they feared they’d be flooded with lots of customer inquiries as they are now ‘too easy’ to contact (although I’d argue that if this is the case then there’s probably a bigger root cause they ought to be contending with!).

 

The point here is that it should be a conscious balancing act. It is all too easy to create a situation that is more efficient for one group of stakeholders but actually worse for another. An employer who wants to streamline the process for employees claiming travel expenses might decide that they can save time if they ask employees to input more data at the point of submitting a claim. If the employee selects where the expense was incurred, the amount of sales tax included, the category of cost and so forth, then this saves time later. Yet an employee who isn’t a tax expert might find this frustrating (“Is train travel exempt, or zero-rated for sales tax?”). In reality this will likely affect the quality of data too, as people try their best (but don’t know which of the different tax code options to choose).

This is a specific example, but it highlights a wider point: it’s important to consider process improvements from the perspectives of the stakeholders impacted. This involves considering what efficiency as well as effectiveness looks like for each key group.

As with so much in business analysis, stakeholder identification, engagement and empathy is key!

 


Don’t Let Your Software Requirements Die

In the realm of software development, the clarity and accuracy of software requirements are pivotal for project success. Requirements are traditionally viewed as static documents to be archived post-project, but this perspective neglects their ongoing potential.

 

Living software requirements is a paradigm where these documents evolve continually with the software, serving as an enduring source of truth. This approach not only maintains relevance but also actively shapes the software’s lifecycle, promoting adaptability and precision in development processes. 

They ensure that as software grows and changes, the documentation is not left behind, thus avoiding the pitfalls of outdated or irrelevant information: after all, out-of-date documentation is often worse than no documentation at all!

 

How requirements slowly die

Picture this: a new software project kicks off with energy and optimism. The business analyst dives deep, engaging with stakeholders to gather an amazing set of requirements. They craft an impressive functional specification that serves as the project’s North Star, and as the project kicks off, hundreds of tasks get populated into a project management tool like Jira, mapping out the journey ahead.

The software delivery team starts strong. 

 

As expected, questions and clarifications emerge, evolving the requirements a little. Some tasks need tweaks; others have missing components, and there are even some new requirements that surface. This is fine (we are agile after all!), and these changes and additions are all added into the project management tool, as that’s now the source of truth keeping the project on track.

As the tasks are ticked off, a sense of accomplishment fills the air. Finally, the project crosses the finish line, the board clears, and it’s a wrap. Success!

Or is it? Software, particularly the large, mission-critical kind, is never truly ‘done.’ 


The project may have ended, but the software lives on; continuous adaptation and enhancement are the norm these days. But scoping new tasks becomes a little harder, as the detailed system knowledge from that original functional specification has now changed. The source of truth is fragmented across completed Jira tasks and buried in comment threads.

In this scenario, the requirements didn’t just become obsolete; they died a slow death, leaving the team navigating a labyrinth of past decisions and discussions to grasp the full scope of their own software. 

 


 

How to keep your requirements alive

Keeping software requirements alive is pivotal for the long-term health and adaptability of your system. Rather than relegating these crucial insights to a static document, consider embedding them within a collaborative platform accessible to the entire organization. This living, breathing approach ensures that requirements can evolve alongside your software, reflecting real-time changes and decisions. Here’s how you can make it happen:

 

1. Centralize requirements and allow collaboration: Choose a platform where stakeholders across the business can access, review, and iterate on the requirements. This system should be the go-to source for everything related to what your software does and why, and platforms such as Userdoc are specifically tailored to this task.

 

2. Project management integration: While the main body of requirements should live outside, ensure there’s a seamless flow of information into your project management tools like Jira. This helps in translating the high-level requirements into actionable tasks and ensures day-to-day activities align with the broader goals.

 

3. Continuous updates and iterations: Encourage a culture where updating the requirements is part of the process, not an afterthought. This keeps the requirements current and relevant throughout the software lifecycle.

 

4. Embrace AI: AI can be an amazing tool for helping determine which changes could affect other parts of your system, and for understanding that when writing requirements for New Feature X, you will also need to update Existing Feature Y’s requirements.

 

5. Requirements versioning: Just like with code, software requirements need versions and branches. Ensure you clearly denote what features are live, what features are in development, and what features are still being scoped.
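Requirements versioning can be modelled much like code versioning. Below is a minimal, illustrative sketch (the `Requirement` record, its field names, and the status values are my own assumptions, not from any particular tool): each revision bumps a version number and keeps the previous wording in a history list, so you can always see what was live, in development, or still being scoped.

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    SCOPED = "scoped"                    # still being defined
    IN_DEVELOPMENT = "in_development"    # being built
    LIVE = "live"                        # released

@dataclass
class Requirement:
    """Illustrative versioned requirement record (hypothetical schema)."""
    key: str
    text: str
    status: Status = Status.SCOPED
    version: int = 1
    history: list = field(default_factory=list)

    def revise(self, new_text: str, new_status: Status) -> None:
        """Record the old wording before updating, like a version bump."""
        self.history.append((self.version, self.text, self.status))
        self.version += 1
        self.text = new_text
        self.status = new_status

req = Requirement("REQ-42", "Users can reset their password via email.")
req.revise("Users can reset their password via email or SMS.",
           Status.IN_DEVELOPMENT)
```

In practice a platform would add branching and audit trails on top, but the core idea is the same: the current text, its status, and its full history live together.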

 

6. Living reference for all teams: From development to QA, from business analysts to project managers, ensure that everyone references and contributes to the same set of requirements. This alignment prevents information silos and fosters a unified understanding of the system.

 

7. Long-term business asset: Beyond project completion, maintain these requirements as a living record of what’s in place. This becomes invaluable for training, for onboarding, and for helping new developers understand the system’s capabilities and limitations. It also ensures the source code isn’t the only source of truth for the system’s functionality.

 

Transforming your software requirements into living documentation is a strategic move that pays dividends throughout the lifecycle of your software. 

And the thing is, it’s not actually doing any extra work – it’s just simply unifying the place where that work is done, and fostering a culture of continuous collaboration and documentation.

Embrace the concept of living software requirements and watch your software, and team, move faster with more confidence.


Web 3.0: The Future of Process Catalogue Management?

Web 3.0 technology, in my view, can be used for new innovations and has the ability to deliver positive change more quickly. Specifically, blockchain technology could allow for a transparent, automatic and secure way to manage a business’s process catalogue.

Traditionally, when analysing processes, measures like Upper/Lower Specification Limits, Service Level Agreements, and Defects Per Million Opportunities are used to understand whether a process is performing satisfactorily. This requires a BA to take measurements, validate them, and then work with the business to pivot the process back to delivering the agreed standard. The typical trigger for this is human: a BA picks it up during routine quality testing, or an actor notices it and raises it through an agreed mechanism. This is because the process infrastructure is basically storage; it could be coined “static management”. Things can therefore be missed, as humans make mistakes, and the data does not work for the business; rather, the business works for the data.

There have been recent advancements in technology, namely Web 3.0, which can reduce or potentially eliminate the human error element and turn the process catalogue into a dynamic storage, in which the data works for the business. In particular, Blockchain technology offers several features that could transform the way we work.

A blockchain has several components: nodes, a ledger, and wallets. Nodes are users or devices that hold the ledger, in full or in part. The ledger is the record of transactions that happen across the blockchain, and wallets are where, in cryptocurrency blockchains, currency holdings are stored.

 

At first glance this ecosystem seems locked to currencies, but I believe it can be adapted to handle processes. Each process would need to be broken down into its steps and identified by its inputs/outputs and business actors. This dataset is then integrated into a blockchain, with each block containing the data from a single process step. In terms of a traditional process map, the block is the process step and the transaction is the connector line between two process steps. In process terms it would be Step, Connector, Step; in blockchain terms it would be Block, Transaction, Block.

When the process is run, new unique blocks are added to the chain with the details of that unique process step run, which are then linked to further blocks/steps via transactions, providing a completely transparent and auditable record.
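The Block, Transaction, Block structure can be sketched with a few lines of code. This is a deliberately minimal illustration (the step names, field layout, and helper functions are my own, not a real blockchain implementation): each process-step run becomes a block carrying its data plus the hash of the previous block, so the chain of steps is tamper-evident and auditable.

```python
import hashlib
import json

def make_block(step_name: str, data: dict, prev_hash: str) -> dict:
    """Create a block for one process-step run, linked to its predecessor."""
    payload = {"step": step_name, "data": data, "prev_hash": prev_hash}
    # Hash the block's contents; changing anything changes the hash.
    payload["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return payload

def chain_is_valid(blocks: list) -> bool:
    """Each block must reference the hash of the block before it."""
    return all(b["prev_hash"] == a["hash"] for a, b in zip(blocks, blocks[1:]))

# Step, Connector, Step becomes Block, Transaction, Block:
genesis = make_block("Receive enquiry", {"actor": "agent"}, prev_hash="0")
next_block = make_block("Log enquiry", {"actor": "agent"}, genesis["hash"])
```

Because each block embeds its predecessor's hash, altering a past step run breaks every link after it, which is what gives the process record its transparency and auditability.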

 

This setup has an infrastructure advantage because a blockchain validates transactions through decentralisation, using other blocks already in the chain. Process rules are embedded in the chain by existing blocks and are then used to validate new blocks, resulting in a guaranteed uniform process run, as the blockchain will only validate transactions in accordance with the blocks already there.

The blockchain also allows for easy performance monitoring. Each block is recorded with management information as well as process information, all in one place, so it is easy for an analyst to calculate run times, business-actor performance on individual or multiple transactions, and process efficiencies.

Once an improvement is identified, the process is updated and released onto the blockchain, then becoming the single-source-of-truth for transaction validation, therefore only allowing the most up-to-date process to be followed by business actors. In this sense, the blockchain is both the governing authority as well as storage for processes.

 

The problem with this is that it still relies on humans noticing that a process is not performing. So whilst we have an enforceable process at Six Sigma levels, we have not removed the human error or time lag associated with a drop in process performance.

This can be resolved using a feature of a blockchain called a smart contract. Smart contracts are automated digital contracts which trigger when the terms and conditions of that contract are met. There is an equivalent document in the business world, which sets out an agreement between two parties to perform in a particular way or to a particular standard under particular terms – a Service Level Agreement (SLA).

The smart contract is the Web 3.0 equivalent of the SLA. However, a smart contract offers much more than just an agreement: it self-executes, meaning that as soon as the terms are met, action is taken with virtually no time lag.

 


 

The smart contract is created using an “if/when … then” statement. An example smart contract could be: if a customer makes an enquiry and no one contacts the customer within three working days, then an escalation notice is sent to the assigned person’s manager. As this is automated, the contract is acted upon as soon as the condition is met, meaning management do not have to spend time reviewing whether the conditions within SLAs have been met, making both service and personal performance management easier.
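The escalation rule above can be sketched as a plain function. This is an illustrative simplification (function and parameter names are my own, and it counts calendar days rather than working days for brevity): the “contract” condition is met when an enquiry has had no contact within the limit.

```python
from datetime import datetime, timedelta
from typing import Optional

def sla_breached(enquiry_opened: datetime,
                 last_contact: Optional[datetime],
                 now: datetime,
                 limit_days: int = 3) -> bool:
    """Self-executing 'if/when ... then' check: True when escalation is due.

    Note: uses calendar days, not working days, for simplicity.
    """
    if last_contact is not None:
        return False  # the customer has been contacted; nothing to escalate
    return now - enquiry_opened > timedelta(days=limit_days)

opened = datetime(2024, 3, 1)
# Four days with no contact: the contract's condition is met.
assert sla_breached(opened, None, datetime(2024, 3, 5)) is True
# Contact happened, so no escalation fires.
assert sla_breached(opened, datetime(2024, 3, 2), datetime(2024, 3, 5)) is False
```

On a real blockchain this check would live in the contract itself and fire automatically as blocks arrive; here the point is simply that the SLA becomes executable logic rather than a document someone has to remember to review.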

There are, however, some issues with blockchains which need further consideration: a large number of transactions can cause lag on the chain, due to the effort required to process them all, meaning slower transaction times. This may mean the model is best suited to small startups and businesses. Blockchain technology is still new, and therefore not yet thoroughly regulated, which can make it difficult to fit within current governance structures. This can be tackled by robust risk management and future legislation or policies, meaning the model may be best suited to an innovator-type business.

 

In summary, Web 3.0 Blockchains can offer improvements to the operation, governance and management of processes. By leveraging features of blockchains, it’s possible to move from a static process catalogue to a dynamic, automatic and smart infrastructure which reacts quicker to changes in business environments, freeing up staff to find other efficiencies or grow the business in other ways. While there are concerns and issues around things like scalability and regulations, it is clear that Web 3.0 technologies can offer new and exciting opportunities.