C. Keith Ray

C. Keith Ray writes about and develops software for multiple platforms and languages, including iOS® and Macintosh®.
Keith's Résumé (pdf)

Wednesday, June 10, 2009

Measured Throughput - Where are your bottlenecks?


Johanna Rothman writes about an organization where the number of requirements implemented is always significantly less than the number of requirements requested. The organization has been blaming the testers (!) instead of seeing that throughput through the whole system is limited. Given the numbers she posts, I compute a throughput of about 3.4 completed requirements per week for project 1, 3.7 per week for project 2, and 2.3 per week for project 3.
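The arithmetic behind those rates is just completed requirements divided by project duration. A minimal sketch — the counts and durations below are hypothetical stand-ins chosen to land near the rates above, not Johanna's actual data:

```python
# Throughput = requirements completed / project length in weeks.
# These (completed, weeks) pairs are illustrative, not the real figures.
projects = {
    "project 1": (51, 15),
    "project 2": (56, 15),
    "project 3": (35, 15),
}

for name, (completed, weeks) in projects.items():
    print(f"{name}: {completed / weeks:.1f} requirements/week")
```

The point is that the measured rate, not the requested count, is what the system can actually deliver.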

If that company accepted that it is only going to get about 3 requirements per week implemented, no matter how long the project runs, it could avoid waste in the requirements/design phase by not specifying more requirements than its measured throughput allows.

That figure could also guide a move to incremental development, and perhaps toward practices that improve throughput itself. With XP or another appropriate set of agile practices, they could maximize information flow and minimize delays between requirements, implementation, and testing.

To find the bottleneck, we need to know who's waiting on whom in this organization. Is there a queue of finished-but-not-tested work piling up in front of the testers? If so, the testers may be the bottleneck. One way to deal with a bottleneck is to switch to a "pull system": a programmer only starts implementation when a tester is about to be ready to test that work. That leaves the programmers idle at times, but they can use that "idle" time to improve their development process (code reviews, unit testing, etc.).

In such a pull system, the bottleneck sets the pace for everyone. The pull system can extend to the requirements: don't specify a requirement until a programmer is about ready to implement it and a tester is about ready to test it. To move away from a strictly phased approach, work on testing (creating automated tests) and implementation can overlap -- or (the XP way) the programming can actually follow the creation of the tests, using the tests as an executable version of the specification.
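The pull discipline can be sketched as a toy simulation: a programmer pulls the next requirement only when the tester's queue has room. The work items and the work-in-progress limit here are illustrative, not drawn from the organization described:

```python
from collections import deque

WIP_LIMIT = 2                     # max finished-but-untested items allowed
backlog = deque(["req A", "req B", "req C", "req D", "req E"])
awaiting_test = deque()           # queue in front of the tester
done = []

def tester_has_capacity():
    return len(awaiting_test) < WIP_LIMIT

while backlog or awaiting_test:
    if backlog and tester_has_capacity():
        # Programmer pulls work only when the tester can absorb it.
        awaiting_test.append(backlog.popleft())
    else:
        # Otherwise the tester drains the queue (the programmer is "idle").
        done.append(awaiting_test.popleft())

print(done)
```

The invariant is that the queue in front of the tester never grows past the limit, so unfinished work can't pile up the way it does in a phased, push-style process.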

(Reposted: original was posted 2004.Apr.05 Mon)

1 comment:

  1. Another aspect of the problem occurs to me. Based on the way the organization is described, it sounds as if they aren't using incremental delivery so they can get feedback about partially-completed features. That prompts me to wonder whether all these "requirements" are really required. Stakeholders may be listing everything they can imagine in hopes of getting something rather than nothing at the end of a long delivery cycle. In addition to identifying the process bottleneck, they might also consider paring down the requirements by analyzing which ones really add value and which ones are only nice-to-haves.