
Business Analysis Benchmark

How Business Requirements Impact the Success of Technology Projects

The Business Analysis Benchmark report, conducted by IAG Consulting in late 2008, presents the findings from surveys of over 100 companies and definitive statistics on the importance and impact of business requirements on enterprise success with technology projects. The survey focused on larger companies and looked at development projects in excess of $250,000 where significant new functionality was delivered to the organization. The average project size was $3 million.

The study has three major sections:

1 Assessing the Impact of Poor Business Requirements on Companies: Quantifying the cost of poor requirements.

2 Diagnosing Requirements Failure: A benchmark of the current capability of organizations in developing business requirements and an assessment of the underlying causes of poor-quality requirements.

3 Tactics for Tomorrow: Specific steps to make immediate organizational improvement.

In addition to the full text report, these sections have also been published as stand-alone white papers for ease of use. All can be accessed from www.iag.biz.

The study provides a comprehensive analysis of business requirements quality in the industry and the levers for making effective change. The following issues are addressed in the report:

  • the financial impact of poor quality requirements;
  • the information needed to identify underlying issues critical to success; and,
  • the data necessary to target specific recommendations designed to yield performance improvement.

The report finds two basic scenarios for companies:

Scenario 1: Project success is ‘Improbable’. Companies might be successful on projects, but not by design. Based on the competencies present, these companies are statistically unlikely to have a successful project. 68% of companies fit this scenario.

Scenario 2: Project success is ‘Probable’. Companies where success can be expected due to the superior business requirements processes, technologies, and competencies of people in the organization. 32% of companies fit this scenario.


Almost everyone understands that requirements are important to project success. The data above demonstrates that while people understand the issue, they did not take effective action in almost 70% of strategic projects.

Effective Business Requirements are a process – not a deliverable. The findings are very clear in this regard – companies that focus on both the process and the deliverables of requirements are far more successful than those that only focus on the documentation quality. Documentation quality can only assure that investment in a project is not wasted by an outright failure. The quality of the process through which documentation is developed is what creates both successes and economic advantage. To make effective change, companies must rethink their process of business requirements discovery.

The following are a few key findings and data from the study:

1 Companies with poor business analysis capability will have three times as many project failures as successes.

2 Sixty-eight percent of companies are more likely to have a marginal project or outright failure than a success due to the way they approach business analysis. In fact, 50% of this group’s projects were “runaways,” which had any two of the following (a simple classification sketch follows this list):

  • Taking over 180% of target time to deliver.
  • Consuming in excess of 160% of estimated budget.
  • Delivering under 70% of the target required functionality.

3 Companies pay a premium of as much as 60% on time and budget when they use poor requirements practices on their projects.

4 At the average company using average analysts, over 41% of the IT development budget for software, staff, and external professional services will be consumed by poor requirements, compared with the optimal organization.

5 The vast majority of projects surveyed did not utilize sufficient business analysis skill to consistently bring projects in on time and budget. The level of competency required is higher than that employed within projects for 70% of the companies surveyed.
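
To make the “runaway” definition in finding 2 concrete, here is a minimal sketch of the “any two of the three thresholds” rule. Only the thresholds come from the study; the function name, parameters, and sample figures are hypothetical illustrations.

```python
def is_runaway(actual_months, planned_months,
               actual_cost, estimated_cost,
               delivered_scope, target_scope):
    """Flag a project as a 'runaway': any two of the three breaches below."""
    breaches = [
        actual_months > 1.80 * planned_months,   # over 180% of target time
        actual_cost > 1.60 * estimated_cost,     # over 160% of estimated budget
        delivered_scope < 0.70 * target_scope,   # under 70% of required functionality
    ]
    return sum(breaches) >= 2

# Hypothetical example: 200% of schedule, on budget, 60% of scope -> runaway
print(is_runaway(20, 10, 3.0, 3.0, 60, 100))  # True
```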

Almost 70% of companies surveyed set themselves up for both failure and significantly higher cost in their use of poor requirements practices. It is statistically improbable that companies which use poor requirements practices will consistently bring projects in on time and on budget. Executives should not accept apathy surrounding poor project performance – companies can, and do, achieve over 80% success rates and can bring the majority of strategic projects in on time and on budget through the adoption of superior requirements practices.

Making Organizational Improvement

The survey findings made it clear that there is no single silver bullet for making organizational improvement. CIOs must make improvements across people, process, and the tools used to support those processes. Only a systematic change to all three areas yields material improvement: 80% of projects at companies that had made these broad-based changes were successful.

The findings of the Business Analysis Benchmark describe both the poor state of the majority of companies surveyed and a path to performance improvement. To assist the reader, the report also analyzes the data on alternative actions that could be taken, in order to separate myth from actions that create real benefit. The top three findings in this area can be used to change business results:

  • 80% of projects were successful at companies with mature requirements processes, technology, and competencies.
  • Auditing three specific characteristics of business requirements documentation and forcing failing projects to redo requirements will eliminate the vast majority of IT development project failures.
  • Elite requirements elicitation skills can be used to change success probabilities on projects.

The above survey findings and the underlying statistics are described in detail within the report.

Overall, the vast majority of companies are poor both at establishing business requirements and at delivering on-time, on-budget IT projects. Satisfaction with IT and technology projects wanes significantly as the quality of requirements elicitation drops. This report is both a benchmark of the current state of business analysis capability and a roadmap for success.

The Bottom Line

The challenges in making quantum improvement in business analysis capability should not be underestimated.

Organizations understand conceptually that requirements are important, but in the majority of cases they do not internalize this understanding and change their behavior as a result. The most successful companies do not view requirements as a document that either exists or does not exist at the beginning of a project; they view requirements as a process of discovery. Only companies that focus on both the process and the deliverables are consistently successful at changing project success rates.

For companies that have made the leap to the use of elite facilitation skills and solid process in requirements discovery, there are significant benefits. Not only are their projects rarely unsuccessful, they are delivered with far fewer budget overruns and in far less time.

The report describes the use of poor requirements practices as debilitating. Using this word is perhaps unfair, since companies do not collapse as a result of poor-quality analysis. In fact, IT organizations and the stakeholders involved will overcompensate through heroic actions to attempt to deliver solid and satisfactory results. However, ‘debilitating’ is an accurate word for the cumulative effect of years of sub-optimal performance in requirements analysis when results are compared to competitors who are optimal. Even leaving aside the effect of high failure rates and poorer satisfaction with results, the capital investment in information technology at companies with poor requirements practices is simply far less efficient than at companies that use best requirements practices.

To illustrate this inefficiency of capital expenditure on technology at companies with poor requirements practices, consider the average project in this study ($3 million):

  • The companies using best requirements practices will estimate a project at $3 million and better than half the time will spend $3 million on that project. Including all failures, scope creep, and mistakes across the entire portfolio of projects, this group will spend, on average, $3.63 million per project.
  • The companies using poor requirements practices will estimate a project at $3 million and will be on budget less than 20% of the time. Fifty percent of the time, the overrun on the project both in time and budget will be massive. Across the entire portfolio of successes and failures, the company with poor requirements practices will (on average) pay $5.87 million per project.

The average company in this study using poor requirements practices paid $2.24 million more than the company using best practices.

If overruns are common at your company, or if stakeholders have not been satisfied with more than five of the last ten larger strategic projects, there is definitely a problem and your company is likely paying the full poor requirements premium on every project.


Keith Ellis is the Vice President, Marketing at IAG Consulting (www.iag.biz) where he leads the marketing and strategic alliances efforts of this global leader in business requirements discovery and management. Keith is a veteran of the technology services business and founder of the business analysis company Digital Mosaic which was sold to IAG in 2007. Keith’s former lives have included leading the consulting and services research efforts of the technology trend watcher International Data Corporation in Canada, and the marketing strategy of the global outsourcer CGI in the financial services sector.

The Three Myths of Virtual Team Leadership

I recently worked with a virtual team that was developing and rolling out a new product across many European countries. One colleague in London refused point blank to cooperate with another colleague who was based in Amsterdam. When I asked why, I was told that my English colleague found his Dutch counterpart rude and offensive, even though they had never actually met or spoken. When I dug deeper, it became clear that the real issue was about culture and style: the Dutch team member, for time management reasons, only checked his e-mails once a day, used a far more direct form of speaking than his English counterpart, and wrote his e-mails all in capital letters. His English colleague felt he was being ignored and, when he did get a response, that his Dutch colleague was SHOUTING AT HIM!

This is typical of the problems that occur when a team is spread across time zones, national borders or cultural boundaries. A virtual team does not have the same advantage as a team whose members meet face-to-face on a regular basis, when all the team members are inside the “30-foot limit.” Unconsciously, people pick up innumerable subtle cues when operating face to face with colleagues. Through observation, we acquire information about our colleagues, including what is and is not acceptable behaviour. Inside the 30-foot limit, people find ways to work together without even being aware of it. When we have no face-to-face experience with our new team colleagues, our communication lacks a certain richness. And because we don’t consciously attend to these things when we’re in close propinquity, we don’t appreciate their importance. As a result, the importance of these activities is often overlooked when working with remote teams. To ensure success, the virtual team leader must avoid being seduced by the three myths of virtual teams and must proactively attend to providing the social richness that virtual communication lacks.

The Reality of Virtual Teams

What can you do? At the start of the project, bring team members together in a start-up session so that they can meet each other face to face and socialize. This meeting will establish relationships in a way that is only possible when people are in each other’s physical presence. Human beings can relate to, start to trust, and feel responsibility towards someone they have met in ways that aren’t possible with a stranger.

Use this meeting to involve the team in developing team processes. This includes mapping dependencies between members, articulating their expectations of each other, and explicitly discussing and agreeing on the protocols that will operate between team members. The process-setting portion of the meeting may be the “official” reason for holding this meeting. This involvement will also create commitment and buy-in among the team members for these processes. If the start-up meeting isn’t feasible because of cost or time constraints, you should ensure that any complex work that requires close collaboration is handled by team members working inside the 30-foot limit.

Another option is to have key team members visit their remote colleagues to establish relationships and agree how they will work together. If neither of these options is feasible, be sure that your timeline and budget allow for the additional problems, delays and rework likely to be caused by mistaken assumptions, misunderstandings and conflict. During the life of the project, make sure that all team members are kept up to date on its progress, both what is going well and what is still to be achieved. This helps both to reduce the “outpost syndrome” and to show team members where they fit into the big picture. Repeat and reinforce the agreements made during the initial face-to-face meeting.

The Three Myths of Virtual Teams

As a virtual team leader, you must reject these three myths:

Myth #1

“My virtual team will be successful because they all share a common goal.”

Clarity of purpose and a compelling vision of success are prerequisites for all high-performing teams. However, outside the 30-foot limit, the shared context that vision provides gets lost or forgotten if it isn’t reinforced daily through words and actions. You must ensure that everyone is up to date with the team’s progress and that expectations and dependencies between team members are understood.

Myth #2

“My virtual team will be successful because I have the best people on it.”

Individuals who perform at a high level in a co-located team situation aren’t guaranteed to operate well in a virtual or remote team. Often, individuals who find themselves working alone stop feeling connected to the team, an emotional state known as the “outpost syndrome.”

Myth #3

“My virtual team will be successful if technology is in place to allow them to communicate.”

It is, of course, vital that appropriate technology be evaluated and deployed in a considered and planned manner (and subsequently monitored to ensure it delivers the expected results). But, with due respect to all currently available technology, it cannot and does not replicate human communication: it does not provide the same richness as people working in close proximity. The successful virtual team leader recognizes this gap and takes action to bridge it by addressing the human factors that technology filters out. As a team leader, you must initiate action to support the transition to remote working, develop team processes, and foster a feeling of connection to prevent the outpost syndrome.


Judi Williams is the owner of Great Beginnings Limited and an author and instructor with Learning Tree International. Great Beginnings offers tailor-made training solutions, designing and developing a wide variety of training and professional support material such as fully immersive classroom training, interactive e-learning tools, video-based training, and downloadable products. Judi has authored a number of courses for Learning Tree International such as The Art of Coaching, Effective Time Management and Personal Skills for Professional Excellence.

Be the Go-to Person!

Kupe’s Korner

I am thrilled to be the newest blogger on BA Times. I have been practicing business analysis for over 12 years now and have found one thing to be consistent with business analysis. Our profession is 80% art and 20% science. Ask any experienced BA a question and they either try to qualify the question to its smallest denominator to give you an answer, or they give you my favorite response, “it depends.” The reason is that most of what we do is not cut and dry. Like doctors “practice” medicine, we “practice” business analysis. With every experience we learn a little more and adapt accordingly. I am thankful for the opportunity to share some of my experiences with you through this blog. I plan on focusing on the art of our profession, but I may dip into the science piece every now and then. Let’s make Kupe’s Korner a place to share ideas and experiences to help us all reach our highest aspirations. And away we go…

For everything I do I try to find my go-to person or persons. For real estate advice, I have a few agents on speed dial. For projects around my house, I have my father-in-law to lean on. For my electronics purchases I know people that do more than enough research, and the list goes on. What they all have in common is knowledge, experience, and passion for the particular topic. In your circle, are you the go-to person for business analysis? You can be!

Recently I received my tip of the month email from author Keith Ferrazzi. In his email he provided a quote which he pulled from entrepreneur/author Guy Kawasaki: “Eat like a bird and poop like an elephant.” In short, Keith went on to say that birds eat 50 percent of their body weight per day. You should do the same when it comes to knowledge of your industry. Read everything, talk to everyone, be everywhere. Don’t rely on others or be passive about it; become an absolute expert by taking the lead. Once you’ve become a hub of this information, don’t hoard it. Spread it around, like the elephant!

You need to consume as much information as you can related to business analysis and the industry you work in. Never stop finding ways to get information. As you consume this information share it with everyone that will listen. Slowly but surely everyone will start to recognize you as the “go-to person”. You’ll start to see peers coming to you for advice and your perspective. Management will want you to be part of the most important initiatives. Project managers will not want to manage a project without you or at least without your input.

Get the Scoop

So where do you turn for all this information? For starters, keep reading all the great stuff on BA Times. I recommend you begin using Google Reader and subscribe to the many online business analysis communities and blogs. Do the same for your industry. Google Reader can be your one place to collect online information. As you go you can add and delete feeds as appropriate. Seek out individuals in your company and start to increase your industry knowledge. Continue or start participating in your local IIBA chapter to gain valuable business analysis information.

At a minimum, take one hour a week to read and participate in the online communities by commenting on discussions and blogs. Be careful…it becomes addictive!

Share the Scoop

Don’t hoard this fabulous information you start collecting. Share this wealth of knowledge with everyone willing to listen. Share articles with the other analysts you work with. Get the ideas out on Twitter (you have a Twitter account, right?). Forward some of your favorite articles to relevant contacts. Post discussions on community sites like BA Times and in LinkedIn and Facebook groups.

Here is a little thing I do. As I speak with people I pick up hints of their issues, problems, and what they are passionate about. As I come across related information I share it with those individuals. Fortunately my memory is still strong, but I need to start coming up with a “system” where I capture this information for contacts and not rely solely on my memory. You should come up with a system as well.

Enjoy being the source of information and the go-to person.

Kupe

Follow me on Twitter, http://twitter.com/Kupe


Jonathan “Kupe” Kupersmith is Director of Client Solutions, B2T Training and has over 12 years of business analysis experience. He has served as the lead Business Analyst and Project Manager on projects in various industries. He serves as a mentor for business analysis professionals and is a Certified Business Analysis Professional (CBAP) through the IIBA and is BA Certified through B2T Training. Kupe is a connector and has a goal in life to meet everyone! Contact Kupe at [email protected].

Requirements: How Do I Know When I’m Done?

This is one of the most common and controversial questions amongst business analysts and project managers: “When are requirements done?” The most common response to this question, “When the customer signs off,” frankly makes me want to tear my hair out. The focus has to be on what you can control – defined levels of quality, timeliness, completeness, accuracy, clarity, or communication – not on what you cannot control. But that still begs the question – “How do I KNOW when I’m done?” – what’s the definitive if-I-have-“X”-I-am-sure-I’ve-not-missed-anything-and-I’ll-get-a-pat-on-the-back test?

Here’s the issue for everyone looking for a silver bullet: requirements are never done – at least not in the business information systems world. Trying for ‘absoluteness’ (to make up a new and cool consulting word) will more likely lead to process failure than success. As you cascade past business requirements and get into increasingly detailed iterations of specification documentation, the line between what is a requirement and what is a solution gets increasingly fuzzy.

Now, before all the agilists out there start whooping it up in agreement: ‘absenting’ requirements (you need to be equal opportunity about making up consulting words!) will lead to incredible performance inconsistency – at an individual resource level, at a long-term asset management and integrity level, and at a corporate expectation management level. I can even prove it: low requirements maturity agilists perform terribly on on-time/on-budget/success performance versus high requirements maturity agilists (if you want the data, send me a note through the editor or the www.iag.com site and ask). The issue of requirements quality is common across all development approaches: you MUST define what “clear, accurate and complete requirements” means for your situation if you expect to materially change project performance and success rates.

Therein lies the answer: defining “complete” means a company has to describe all three of:

  • the state of requirements (quality of information),
  • the format of requirements (template and techniques used for visualizing requirements), and
  • the process through which these artifacts are achieved.

Doing this defines what ‘done’ is for analysts and project managers. The question of “Am I done?” only really arises at companies where there is weakness in one of the requirements state, format or process attributes. Deal with these three attributes, and the company starts itself down that path of maturing requirements practices and materially changes its ratio of on-time/on-budget/success performance on projects.

OK! Get up and stretch – for those of you saying “Hey, he’s ducking the question!” – there’s more.

Not everyone can deal with the long term fix. We sometimes have to do quick fixes as a reality of day to day management. Here are four simple tests to assess if requirements are not done to a reasonable or sufficient degree of quality. These are Business Requirements level tests that work and will improve your requirements quality irrespective of delivery methodology:

  • If the requirements lack context. Requirements always exist to support “what” the business wants to do, not “how” it wants to do it. The “what” part of this is the context of BUSINESS PROCESS. Without understanding which business processes are impacted by the requirements, you have no idea how requirements impact each other, what the impact of removing a requirement would be, or whether the requirements collectively are complete and will meet a specific business objective. The way a company applies context in its documentation also creates the STRUCTURE of the documentation. Here is one technique example – Use Cases. As a technique, Use Cases give both context and structure to requirements and help an analyst assure that the scope of the project is both well described and sequenced.
  • If the interdependency is not evident: How do you look for proof that interdependency is documented? Look for a section in the material called “dependencies”, check the “issues list”, and look for an analysis technique called a context diagram (every line on a context diagram is an interdependency). Why is interdependency so important? There are two aspects to scope: internal to the system (e.g., its functionality, the workflow and information flow, etc.) and external to the system (e.g., how this system needs to interact with other systems, how the workflow being automated hands off to other departmental units). In the absence of knowing the interdependencies, you only ever know HALF of the story on scope, so it becomes probable that you will encounter significant scope shift on any system of any degree of complexity.
  • Unclear business objectives: Objectives must be Specific, Measurable, Achievable, Results-oriented, and Time-bounded (easy to remember as ‘SMART’). The absence of objectives eliminates the ability to assess solution tradeoffs, makes prioritizing functionality difficult, and causes other problems besides. You can test whether a particular function meets needs with user acceptance tests. You cannot test whether the collective system meets needs unless you have clear objectives.
  • You cannot tell from the description of business need how information is going to move. I wish I had $10 for every time I’ve seen business requirements expressed as a process flowchart. This is a nice (albeit somewhat inefficient) first step, but here’s the problem: when you get to the ‘decision diamond’ in the picture that says “approve the policy (Y/N)”, how do you expect developers to know what this means? The only way to elaborate this is to ask the questions, “What information do you need to know to approve that policy?”, “What do you do with that information?”, “Where do you get the information from?”, “Who else do you give the information to?” and so on. The more detailed the description of WHAT the business wants to do, the more the description of process will center on how information needs to move in support of the business process. Until you get to the level of detail that expresses information movement, you have no idea from the documentation what the business intent is. It’s easy to test – just look for the NOUNS. Lots of nouns used consistently when describing a step in a process mean you’re probably OK.

I’ve given everyone these four tests because they always apply. Even at the executive level in the organization, these things should be deemed important. They are clear, auditable and technique-independent traits that can be assessed whether you are plan-driven, prototype-driven, following a vendor-supplied method, agile, or anything else.

How does this view of “Done” change when you’re a business analyst?

From a business analyst perspective, it is your job to dig deeper and be accountable for quality. I’ll give you a few thoughts as take-aways especially for you:

There are techniques that can help an analyst test requirements completeness and uncover missing business logic. Look at CRUD diagrams as an example of a commonly known test and technique. Look at entity relationship diagramming (ERD) as a lesser-known test (a bit of trivia here: any time there is a null relationship between entities in an ERD, you need to ask the question “What do you do … ” to flesh out what happens in the circumstance where, for example, a lead is never assigned to an agent). Look at some of these techniques to improve your ability to be proactive and identify issues before they arise as missed requirements. Factor the time to do some of this activity into your assignments.
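
To make the CRUD idea above concrete, here is a minimal sketch (the entities, use-case names, and structure are hypothetical illustrations, not a prescribed tool). It tabulates which documented use cases Create, Read, Update, or Delete each entity, and flags empty cells as questions to take back to the business:

```python
# Hypothetical CRUD matrix: entity -> use cases covering each operation
crud_matrix = {
    "Lead":   {"Create": ["Capture lead"], "Read": ["Assign lead"], "Update": [], "Delete": []},
    "Agent":  {"Create": ["Register agent"], "Read": ["Assign lead"], "Update": ["Update profile"], "Delete": []},
    "Policy": {"Create": ["Approve policy"], "Read": [], "Update": [], "Delete": []},
}

# Any empty cell is a potential missing requirement worth asking about
for entity, operations in crud_matrix.items():
    gaps = [op for op, use_cases in operations.items() if not use_cases]
    if gaps:
        print(f"{entity}: no use case covers {', '.join(gaps)} -- ask 'What do you do when...?'")
```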

From an analyst perspective the bar is quite a bit higher on what constitutes quality. You are “done” when the requirements have quality. To have requirements quality, requirements must be:

  • Correct – the requirement is an accurate elaboration of a documented business objective or goal
  • Unambiguous – the requirement has only one interpretation
  • Complete – the requirement is self contained with no missing information
  • Consistent – the requirement is externally consistent with its documented sources such as higher-level goals and requirements
  • Ranked – the requirement is prioritized for some purpose
  • Verifiable – the requirement is usable (e.g., testable) by the testers who must verify and validate it
  • Modifiable – the requirement specifies only one thing
  • Traceable – the requirement has its own unique identifier that can be used for tracing purposes

Friends, my rant is a little long this month but I hope a few ideas here bring clarity or a new perspective on a troublesome topic. “Done” never happens; reasonable happens. Your perspective on what to look at to assess requirements “quality” depends on your role, and you can educate the organization about how each level of the organization plays a role in assessing whether or not requirements are reasonably clear, accurate and complete. Check out http://www.iag.biz/resources/webinars/microcast–executive-guide-to-evaluating-requirements-quality.html for a quick example of how you might educate executives.

I wish you all, great success.


Keith Ellis is the Vice President, Marketing at IAG Consulting (www.iag.biz) where he leads the marketing and strategic alliances efforts of this global leader in business requirements discovery and management. Keith is a veteran of the technology services business and founder of the business analysis company Digital Mosaic which was sold to IAG in 2007. Keith’s former lives have included leading the consulting and services research efforts of the technology trend watcher International Data Corporation in Canada, and the marketing strategy of the global outsourcer CGI in the financial services sector. Keith is the author of IAG’s Business Analysis Benchmark – the definitive source of data on the impact of business requirements on technology projects.

The Virtues of Virtualization

Virtualization technologies have graduated to the big time, but it didn’t happen overnight. While early virtualization application experiments can be traced back to the 1960s, it is only in the past decade that there has been growing acceptance of this cost-saving technology.

Foreshadowing new virtualization breakthroughs, a 2006 IDC analysis projected that companies would spend more money to power and cool servers by 2009 than they would spend on the servers in the first place. And a recent Goldman Sachs survey of corporate technology users found that 45% of respondents expect to virtualize more than 30% of their servers (up from only 7% today).

The heart and lifeblood of virtualization is the hypervisor (software that provides a virtual machine environment), which sits between the hardware and the virtual machine. The virtual machine is, in essence, data, while its application files are stored on a physical server or on a remote storage device. The result is that the virtual machine is portable, which translates into a strategic advantage in adverse situations.
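
As an illustration of the “virtual machine is data” point, here is a minimal sketch using the open-source libvirt Python bindings against a local QEMU/KVM hypervisor; this choice of hypervisor and tooling is an assumption for illustration only and is not prescribed by the article. The machine’s definition can be exported as XML and, together with its disk image files, moved to other hardware:

```python
import libvirt

# Connect to the local hypervisor (QEMU/KVM assumed for this sketch)
conn = libvirt.open("qemu:///system")

# A virtual machine is essentially data: its definition is an XML document,
# and its disks are ordinary files that can be copied to another host.
for dom in conn.listAllDomains():
    xml_definition = dom.XMLDesc(0)   # portable description of the machine
    print(dom.name(), "active" if dom.isActive() else "inactive")

# On another host, the copied XML (plus the copied disk image) can be
# registered with conn.defineXML(xml_definition) to recreate the machine.
conn.close()
```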

Virtualization technologies have come a long way, says James Geis, director of integrated solutions development at Boston IT consulting firm Forsythe Solutions, because evaluating capacity was once difficult. Thanks to improved capacity management tools, that task has been simplified and has become a mainstream part of resource planning.

Geis also notes that, while massive adoption of virtualization solutions has become commonplace, not all servers and applications are meant to be virtualized. The choice, he says, of when, where, and how an application can be virtualized should be based on performance metrics. “There are cases where processing, memory, storage, and network requirements dictate a solely dedicated server.”

However, the value of virtualization as an enduring strategy for continued growth is enormous. Geis outlines the following benefits:

Capacity optimization. Virtualization places capacity planning and optimization at the forefront of data center management. Properly implemented, it produces the maximum return on investment per server dollar.

Rapid server provisioning. Speed and accuracy are essential in a frenetic virtual business environment. Using a server template, virtual servers can be created effortlessly. Geis says new server provisioning takes minutes or seconds, rather than the days or weeks required to procure a new box and install an operating system and software.

Server portability. Virtual servers and the applications they support can be easily moved or copied to other hardware, independent of physical location or processor type. This feature alone provides unlimited flexibility for hosting servers and applications on any combination of physical hardware.

Reduced hardware, facilities, and HR expenses. Fewer server boxes cost less, take up less floor space, require less electricity and air conditioning, and require less maintenance, thus reducing costs related to hardware procurement, real estate, utilities, and human resources.

Larry Honarvar, vice president, consulting services, at CGI, a Montreal-based IT and systems integration consulting company, employs virtualization technologies in the following areas: managed services, software development and maintenance, and hosting solutions.

For software development, virtualization better leverages hardware and software investments, Honarvar says. This works well, given the fact that customers are often scattered around the globe, working in different time zones. “Virtualization makes better use of our infrastructure investments because it allows us to test different development and testing environments. It lets us control costs and redirect funding into product maintenance and enhancement,” he explains.

In hosting solutions, CGI employs virtualization solutions to maximize services and, at the same time, contain costs. Honarvar stresses that a compelling selling point for clients is that virtualization offers transparency. “They see the benefit of being able to have more environments pre-configured and quickly available to map their needs.”

The virtualization solutions marketplace gets bigger every year. Many companies are turning out half a dozen virtualization solutions a year. Here are two examples:

Toronto-based company Asigra has developed a line of backup and recovery services. Its Multi-Tiered Storage Billing System is designed to save the time and expense of developing or modifying an existing billing system, which the company says could run up to thousands of dollars. Its features include “agentless simplicity” (software is installed on only one node, whether the customer has one PC or hundreds); advanced security features (authentication, encryption and non-escrowed keys); and autonomic healing (provides managed backup/restore services for customers).

Ottawa-headquartered Mitel has introduced a number of communication tools for small and medium-size businesses, offering reporting capabilities and support for SIP (Session Initiation Protocol), a signaling protocol. Mitel is aggressively promoting its Business Dashboard, which allows companies to track call activity on an internal IP network with both historical and real-time reporting. It collects trend data on call volumes, call times, and trunk usage. Its neatest feature is tracking the path of a single call through internal systems and departments, which makes for accurate management of calls.

And that’s just a brief sampling of the virtualization technologies on the market. Look for aggressive new startup companies from all over the globe to jump into this application-rich, expanding niche.


Bob Weinstein is a science and technology writer with Troy Media Corporation.