What’s Done in an Agile Project?

Posted by Keith McMillan

August 24, 2011 | 2 Comments

In my time at a major insurance company in mid-state Illinois, we had a very interesting conversation on what it meant to be “done” with a user story.  Done can mean a lot of things to a development team, anywhere from “hey, I just finished writing this code” to “we’ve turned on the feature in production.”

We created what we called a “spectrum of done” to illustrate various levels of “done-ness” (okay, enough with the “quotes”), and to help teams decide where they wanted to be when they said something was Done. Some of the highlights are:

- the code is written
- the code builds and passes its automated tests on a shared development server
- the work has been reviewed by stakeholders
- the feature is deployed and turned on in production

I submitted, only half-joking, that we were really only Done with a feature when we turned the system off, because before that it was still subject to change, or to someone finding a problem.

There are a lot of reasons why you want your definition of Done to be as advanced as possible, but possibly the most important is that you want the best possible idea of whether anything is left to do on a particular feature before it can be used in production. Measuring progress by what’s Done versus what’s not Done only works if there’s a diminishing chance that something you’ve called Done still has work left in it.

The other side of this argument is that we want regular feedback on small chunks of functionality, to give us regular data points for judging progress, and the amount of time you typically need to invest grows much larger as you move up the Done scale. Deploying to a production environment in the typical company requires change control procedures, and that means time and money.

The sweet spot for most folks seems to be building on a shared development server, with some sort of robust suite of tests to assure that new functionality at least works per the tests, and that nothing that used to work has broken (subject always to the suite of tests being reasonable). Most folks will recognize this as Continuous Integration (although you should really read Continuous Integration isn’t a Tool!).
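
To make that concrete, here is a minimal sketch of the kind of automated check a shared build might run on every integration. The premium calculation, the numbers, and the test names are invented for illustration; the point is only that the suite covers both the new story and behavior that already worked.

    import unittest


    def calculate_premium(age: int, vehicles: int = 1) -> float:
        """Stand-in for the real business logic a story would touch."""
        base = 100.0 if age >= 25 else 150.0
        discount = 0.10 if vehicles > 1 else 0.0  # the "new" multi-car story
        return round(base * (1 - discount), 2)


    class PremiumTests(unittest.TestCase):
        def test_existing_behavior_still_works(self):
            # Regression check: what was already Done must stay Done.
            self.assertEqual(calculate_premium(age=30), 100.0)
            self.assertEqual(calculate_premium(age=22), 150.0)

        def test_new_story_works_per_the_tests(self):
            # The story currently being integrated.
            self.assertEqual(calculate_premium(age=30, vehicles=2), 90.0)


    if __name__ == "__main__":
        unittest.main()

On a build server, a check-in that fails any of these tests flags the change before anyone is tempted to call the story Done.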

Keith returns to consulting on 9/1/11 after a year at RedSky Technologies. He’s currently looking for his next engagement.


Comments



Comment by Jason Titus
2011-08-24 15:24:58

Keith –

A term that I’ve used and seen used is “signed-off”. From the development team’s perspective, I think the “gate” that is most important, or at least hits the sweet spot, is this one. My definition of signed-off is when someone responsible for delivery of the functionality can confirm they have been given what they requested and/or need. On some projects I’ve seen the project owner give developers sign-off after quickly checking out a change on the development server where it was freshly deployed (a high level of trust between project owner and development team existed in that case). On other projects I’ve seen sign-off only occur when a QA team resource has written passing automated unit tests AND a business team has confirmed that the functionality does what is needed.

It seems pretty common in my experience for teams to stack up a pile of “dev-complete” stories that haven’t been seen by anyone until the big-bang testing cycle before production release. That is a big red flag that the release is at risk, and a fire-drill is looming.

One of my favorite agile projects I’ve been involved with required the developers to “demo” the story to the “customer” (XP project) in order to get sign-off. No story points counted against the iteration without that sign-off. It prevented too many stories from building up. (On a side note, I distinctly recall part of the demonstration for one story actually required me to crawl under my desk and pull the network cord out of the wall, demonstrate a failure, plug it back in before showing the subsequent success — at the request of our customer who was watching.)

So from my perspective, the spot on the done continuum I believe is most worthy of focus would be getting stories signed off. If the development team indeed delivered the functionality that meets the stakeholders’ needs, and those stakeholders have demonstrated that to themselves somehow, then getting that code deployed to the production environment is an orthogonal problem (albeit, a critical one).

Now, another related question I’ve seen along the lines of this topic is: how many stories make a release done? That’s a whole different discussion.

Sorry for the ramblings. I hope you feel this is all on-topic with your ideas of what done means.

Jason

Comment by Keith McMillan
2011-08-24 15:33:34

Hi Jason,

I appreciate your points. I think your definition of “signed off” corresponds nicely with “reviewed by stakeholders” in the continuum above. I think that it’s a good place to aim for in terms of done-ness.

In some projects, for better or worse (mostly worse), the product owner isn’t available during the sprint to review, and only shows up at the end of the sprint (and sometimes not even then), so it’s sometimes a challenge to get something signed off and still get the kind of regular feedback we need in order to measure progress.

I always like to have at least two stories per iteration, even if that means making them small, so at least the team gets something done, and we don’t have an all-or-nothing velocity computation as a result.
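
To put hypothetical numbers on it: an iteration carrying a single 8-point story reports a velocity of either 0 or 8 and nothing in between, while an iteration carrying two 4-point stories can still report 4 when only one of them finishes.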

In the end, it’s up to the team to decide on the most “advanced” definition of done they can accomplish, given the constraints of the individual project. It’s in everyone’s best interest that they balance the remaining uncertainty against the need for timely feedback.

 
 