PM ChangeAgent Commentary
We’ve written before about the intelligent application of Agile methods in Information Technology (IT) projects: see part 3 of our 4-part 2011 series, The First 10% of a Project: 90% of Success, here in our ChangeAgents articles. This article is a follow-up with more insights, and much has happened since our earlier article.
Agile is maturing, and moving beyond a focus on the last half of the IT life cycle. For example, we have seen excellent discussions of “hybrid” approaches: using Agile where it is most appropriate (and where the prerequisites are in place), and using other insightful PM methods where they are more appropriate. That approach in IT, plus increasing use of Agile concepts in areas such as New Product Development, shows promise.
I do still have concerns about a few Agile zealots who insist upon contrasting Agile to Waterfall. Competent PMs disposed of Waterfall in the early 1980s. We also disposed of, for the most part, the years-long, hold-your-breath-and-wait-forever IT projects. What did we replace them with? Three-to-six-month bursts (we called them iterations) that delivered prioritized, useful business functions. Of course, we also identified most of the prerequisites for success:
- A good, high-level project plan;
- Clear estimates of project time and cost (subject to change with new learning);
- A compelling and convincing business case;
- Understanding of the information and data structures;
- Customer-driven high-level business requirements, and high Customer engagement throughout;
- Risk assessment, and mitigation responsibilities;
- The right talent assigned for the right amount of time, both on the IT side and from customers;
- Facilitated sessions (Rapid Initial Planning and Joint Application Design) for fast project planning, and excellent requirements elicitation in 1-2 weeks;
- And, all the other factors mentioned in part 3 of our Success series, mentioned above.
In recent articles and presentations, I am seeing new insights that Agile practitioners are applying. For example, presenters speak about how they decide which parts of the Information Technology project are best suited for an Agile approach, and which should use classic methods. They also clearly understand the advantages and requirements of each approach… but still, something was missing. And now I can explain the title of this article. I solved this problem (for a different advancement) thirty-plus years ago: Prototyping and Agile share a strong set of parallels.
Déjà Vu Context
In the early 1980s, many IT groups were moving from third-generation languages such as COBOL to higher-level languages. These new languages were improving coding throughput by a factor of 2-3. Excited about the prospect, developers were interested in “getting to code” much more quickly. They thought the then-classic Structured Systems Analysis methods were a waste of time, and began “prototyping,” adapting an approach engineers had used for years.
This meant they needed to sit down with their customer, show what they had produced, and quickly make improvements; they did this both for screens and for reports. Of course, all the prerequisites we spoke of above, were still essential—especially if the system involved new data.
But the most enthusiastic “new way” proponents were adamant that everything must be prototyped, because any other approach was the old way. And their new way needed no requirements, no documentation, and seldom even needed testing. All that overhead “stuff,” they claimed, was a holdover from the past.
After a few sessions of guiding these prototyping bright lights to higher ground, I came up with a solution. I too had been an early adopter of high-level languages, and of prototyping. As a manager, I had transformed entire organizations to their use. And, I understood both their prerequisites, and benefits. So I built a table that identified a range of attributes about the system, sub-system or business process being developed. A copy of that table (updated for readability), from around 1984, is below.
The instructions directed developers to use the table to evaluate their system or application and determine its type: process-oriented, or information-based. For each factor, they rated the application by circling 1 to 6, depending on how well it met the process-oriented or information-based test.
One outcome: they often found that they did not know all the answers—yet they were still eager to develop the solution. So they performed professional analysis to resolve the open items. They then followed our steps to analyze and evaluate the results:
- Add the circled scores, divide by 8 (the number of factors), and truncate any remainder.
- If the results are clear, decide the most appropriate approach:
- Systems scoring 1 or 2 are process-oriented; you should use classic structured systems analysis, aided by prototyping of all outputs during requirements.
- A system that scores 3 or 4 is a mixture. Decompose it into its sub-systems and re-evaluate them against the factors; repeat for any sub-system scoring 3 or 4, until you reach detailed processes.
- Systems scoring 5 or 6 are primarily information-based, and are good candidates for delivery using iterative Prototyping.
We found many results that we called Zorro systems: the circled scores traced one or more Z’s down the chart, hence the step above to further decompose the system. Because this approach gave developers a rational process, it caught on very quickly. Even project managers and their managers liked the approach. And of course, we used that interest to get those key prerequisites into place, especially those involving the right customers (still a challenge today).
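The scoring steps above can be expressed as a short procedure. Here is a minimal sketch in Python, assuming one circled rating (1 to 6) per factor; the function name and the wording of the outcomes are my own, not from the original chart:

```python
def classify(ratings):
    """Score a system per the 1984 chart: add the eight circled values,
    divide by 8, truncate the remainder, and map the result to an approach."""
    if len(ratings) != 8:
        raise ValueError("expected one rating (1-6) per factor, eight in all")
    score = sum(ratings) // 8  # integer division truncates any remainder
    if score <= 2:
        # Process-oriented: classic structured analysis, prototype all outputs
        return "process-oriented"
    elif score <= 4:
        # A mixture (often a "Zorro" pattern): decompose and re-evaluate
        return "mixture"
    else:
        # Information-based: a good candidate for iterative prototyping
        return "information-based"
```

For a mixture result, the chart directs you to decompose the system into sub-systems and run the same evaluation on each, recursively, until the scores come out clear.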
We integrated this chart into our commercial methodologies, and added it to the many home-grown and commercial methodologies for which we did methods improvement. The last time I looked at the chart was in 1987, until one of those Hybrid Agile presentations jogged my memory.
By the way, we also found that teams that knew enough to confidently score their system (or decompose and score it) were able to produce much better early estimates—even before they had business requirements.
Applicable to Agile?
I do not profess to be an Agile expert. I did follow Scrum from the early 1990s, when a business partner asked for help in relating Scrum to project management; he was working on integrating his facilitated requirements analysis with Scrum. I was an early advocate of Extreme Programming, and I like DSDM as a true life-cycle-wide Agile approach.
I have seen dozens of new ideas in the practice of competent, performing project management. I think Agile methods (depending on the flavor) can offer significant benefits when used wisely, and huge risk when mismanaged. For example, I’d be very careful with Agile around Regulatory projects that have high consequences. But Agile has built on the smart PM practices of the 1980s, and added useful concepts, principles, tools, and expectations.
My question: How would you change the Prototyping chart to adapt the factors to Agile’s key decision points? I think some of the factors might stay the same. The question about Data, for example, is important; what matters is primarily whether the project is intended to build the database, or to use it. If it is building the database, for example, huge amounts of regression testing will be required.
So, dear reader, I am interested in your opinion on this question: What would you change to help make the decision where Agile methods are best used? Given the savvy insights of those who are practicing Hybrid Agile, I’d bet you have some good experience to share…
As a parting comment, and to share my then-biases for Prototyping, I ended my sessions with another page that identified four areas where Developers (and Analysts) could benefit from Prototyping:
Note that these were most often shown as transparencies on an overhead projector. Some of you may never have seen that approach, in this day of PowerPoint slides and projectors.