The articles are about Google going after Microsoft's customer base, using something called its "Cloud" computing framework. But Ken Orr’s interpretation of the Google-Microsoft confrontation emphasizes the time-to-market advantages that Google's software development lifecycle has over Microsoft's. Google is apparently practicing a more agile, iterative-style approach (sometimes quarterly) to releasing software, while Microsoft is more tied to the big bang, multi-year cycle for its products.
Might the public start perceiving companies like Google as "agile and adaptive," while tagging Microsoft as "heavy and slow"? Agile methods may have found their version of Malcolm Gladwell's "sticky message." Most agree that it began in earnest with the infamous Agile Manifesto -- elegant in its simplicity. It emphasized the value of "individuals and interactions" in creating "working software via customer collaboration while being responsive to change."
Simple and elegant.
But some people felt the word "manifesto" carried an interesting connotation because of its perceived arrogance. One manifesto, by a crazed lunatic called the Unabomber, made headlines years ago by decrying the evils of an industrialized society and railing against the establishment. The agilists (who were NOT crazed lunatics) were also railing against an establishment; in this case, the Carnegie Mellon Software Engineering Institute (SEI). The agilists' message was that they were "lean and fast," while anyone who followed the SEI was "heavy and slow." Put simply, it was the younger generation calling their village elders (or at least their software processes) FAT.
The defiance had gotten personal. They were mad about project overrun statistics and sick and tired of being blamed for them. All those Ed Yourdon Death March projects had taken their toll. They were not lunatics, but they were irreverent for lots of reasons, and it was understandable.
Manifestos and name-calling seemed to help the Agile message to stick. Moreover, if Agile rides a Google wave, it will make a lot of software development organizations consider following Google's lead.
Meanwhile, there's an interesting quote from a long-ago congressman named Willard Duncan Vandiver. In an 1899 speech, he said, "I come from a country that raises corn and cotton, cockleburs and Democrats, and frothy eloquence neither convinces nor satisfies me. I'm from Missouri, and you have got to show me." Some people say that the speech is the reason why Missouri is famously nicknamed "The Show Me State."
Westerners at the time used the phrase to suggest that Missourians were slow and not very bright. (There's that name-calling thing again.) Missourians, however, turned the definition around and claimed that "show me" meant that they were shrewd and not easily fooled. (It turns out that the phrase was current before Vandiver, so the thinking is that his speech may have merely popularized it.)
Now here's where it gets interesting. Manifestos and name-calling might have some frothy eloquence to them, but they neither convince nor satisfy one important constituency that many agilists badly need so they can practice their agile craft. This constituency happens to be the people who sign the checks: senior management. Senior management has to buy into the idea and take risks with a "manifesto" concept that can impact the company that employs all of them. They've been around long enough to see fads come and go and can be cynical at times. Management also doesn't like hearing the processes it has invested in called fat.
The agilists come to the elders and they ask for some money. They want a new thing called Agile Methods. The elders respond with, "You have got to show me some productivity metrics." No metrics, no money. The agilists cringe, because they picture metrics guys as process-heavy people spouting function points.
But you don't have to be process-heavy or say "function points" all the time to be someone who knows a little bit about IT metrics. I am neither of these and have been collecting essential core metrics on hundreds of projects over the years. Many of my clients in the last 12 months have been companies running Agile projects, mostly XP and Scrum, and they want to know how they compare against waterfall projects.
We have plenty of data in a worldwide database of over 7,400 projects -- agile, waterfall, package implementation, new development, legacy, military, commercial, engineering -- you name it. The numbers are so interesting that I can't fit them all into this article. Suffice it to say I've been on the lecture circuit recently on this subject and conducting webinars for people who want me to show them.
So what have I found? Here are some of the highlights:
• Agile teams have metrics. The perception might be that Agile teams are a bunch of undisciplined programmers slinging code and not documenting anything. Not true. They know their schedules (time), keep records of team members working on their projects (effort), they count requirements and things called iterations and stories (a size metric), and they keep track of bugs (defects).
• We easily draw this out along with their velocity charts on a whiteboard sketch. This profile is all we need to capture the measures and load them into a computer database.
• Agile trends can be plotted on a chart where a picture is worth a thousand words (or in this case, metrics). Smaller releases, medium-sized releases, and large releases are charted from left to right. Vertically, we can chart the schedules, team size, and defects found and fixed.
• As a group, the projects were mostly faster than average. About 80% were below the industry average line. Note that some took longer, for several reasons (too long to explain here). Some companies developed software in two-thirds or even half the time.
• They used larger than average teams. Even though many of the smaller releases used small teams, some -- apparently in response to deadline pressure -- used large numbers of people. One company applied seven parallel Scrum teams totaling 95 people, where the norm was about 40 people.
• On waterfall projects, the "laws of software physics" showed a predictable outcome of large teams trying to create fast schedules -- high defect rates (sometimes 4x-6x). On Agile projects, we saw a shockingly low number of defects -- in some of the companies. The best performer had high-maturity XP teams. These project teams showed defects that were 30% to 50% lower than average. Other less-mature Agile teams had defect rates that were more like waterfall projects.
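The four core measures described in the bullets above -- time, effort, size, and defects -- are simple enough to capture in a few lines of code. The sketch below is purely illustrative: the class, field names, sample numbers, and the industry-average baseline are all hypothetical, not taken from the article's actual database.

```python
from dataclasses import dataclass

@dataclass
class ReleaseProfile:
    """One project release, described by the four core metrics
    the article names. All sample values below are hypothetical."""
    name: str
    schedule_months: float  # time: elapsed calendar duration
    staff: float            # effort: average team size
    stories: int            # size: requirements/stories delivered
    defects: int            # defects found and fixed

def schedule_ratio(project: ReleaseProfile, avg_schedule_months: float) -> float:
    """Compare a project's schedule against a (hypothetical) industry-average
    schedule for a release of similar size. A ratio below 1.0 means faster
    than average; above 1.0 means slower."""
    return project.schedule_months / avg_schedule_months

# Hypothetical example: a release finished in 4 months where the
# industry-average line for that size of release sits at 6 months.
release = ReleaseProfile("Release 1.2", schedule_months=4.0,
                         staff=9, stories=120, defects=35)
ratio = schedule_ratio(release, avg_schedule_months=6.0)
print(f"{release.name}: {ratio:.2f}x the industry-average schedule")
```

With a profile like this recorded per release, plotting releases by size against schedule, team size, and defects (as the bullets describe) is straightforward in any charting tool.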
The initial results from these companies were fascinating. One thing that stood out was that there was in fact a learning curve. The sample had a range of Agile experience from one to four years. You can easily see that the highest performing teams achieved a level of performance that the others didn't match. Agile was not a cure-all for all of the companies, but it will be interesting to see how the others fare as time progresses.
Another interesting factor was that all of the companies were being challenged by the outsourcing/India option from the top down. Some adopted Agile methods as a better, faster -- and, yes, cheaper -- alternative while saving their jobs in North America.
It will also be interesting to see more patterns emerge as more data comes in. Soon enough, we'll have sufficient statistics for a series of Agile industry trend lines against which we can make direct Agile-to-Agile project comparisons. And the Agilists will have something they surely have longed for all along: Agile metrics. And the village elders just might buy into the idea.
Michael Mah is a Senior Consultant with Cutter Consortium, and also Managing Partner of QSM Associates. He welcomes reader comments and insights at email@example.com.