Using Definition of Done to Drive Agile Maturity
Posted by Dylan Smith on Geeks with Blogs, Fri, 30 Nov 2012
I’ve been an Agile Coach at a lot of different clients over the years, and I want to share an approach I use to help them adopt agile and mature their practices over time.
It’s important to realize that “Agile” is not a black/white yes/no thing. Teams can be varying degrees of agile. I think of this as their agile maturity level. When I coach teams I want them to start out being a little agile, and get more agile as they mature. The approach I teach them is to use the definition of done as a technique to continuously improve their agile maturity over time.
We’re probably all familiar with the concept of “Done Done”, which captures what it *actually* means for a feature to be done. Not just when a developer says he’s done right after writing the last line of code that makes the feature kind-of work. Done Done means the coding is done, it’s been tested, installers and deployment packages have been created, user manuals have been updated, architecture docs have been updated, etc. To help teams internalize the concept of “Done Done”, they usually get together and come up with a Definition of Done (DoD) that lists all the activities that need to be completed before a feature is considered Done Done.
The Done Done technique is typically applied only to features (aka User Stories). What I do is extend it to several concepts: User Stories, Sprints, Releases (and sometimes Check-Ins). During project kick-off I’ll usually sit down with the team and go through an exercise of creating DoD’s for each of these concepts (Stories/Sprints/Releases). We’ll usually start by just brainstorming a bunch of activities that could end up in these various DoD’s. Here are some examples (a sketch of how these checklists might be captured follows the list):
- Code Reviews
- StyleCop
- FxCop
- User Manuals Updated
- Architecture Docs Updated
- Tested by QA
- Tested by UAT
- Installers Created
- Support Knowledge Base Updated
- Deployment Instructions (for Ops) written
- Automated Unit Tests Run
- Automated Integration Tests Run
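To make the idea concrete, here’s a minimal sketch (my own illustration, not taken from any real client) of how a team might capture its DoD’s as simple checklists per cycle, with a check for whether a work item is Done Done at that level. The class and activity names are assumptions for illustration only.

```python
# Illustrative sketch: DoD's as per-cycle checklists (names are assumptions).
from dataclasses import dataclass, field


@dataclass
class DefinitionOfDone:
    cycle: str                                  # e.g. "Release", "Sprint", "User Story"
    activities: list = field(default_factory=list)

    def is_done_done(self, completed: set) -> bool:
        """A work item at this cycle is Done Done only when every listed activity is complete."""
        return all(activity in completed for activity in self.activities)


# Typical starting point for a formerly-Waterfall team: most activities live in the Release DoD.
dods = {
    "Release": DefinitionOfDone("Release", [
        "User Manuals Updated", "Tested by UAT",
        "Installers Created", "Deployment Instructions (for Ops) written",
    ]),
    "Sprint": DefinitionOfDone("Sprint", ["Tested by QA"]),
    "User Story": DefinitionOfDone("User Story", ["Code Reviews", "Automated Unit Tests Run"]),
}
```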
Then we arrange these activities according to where they occur today (e.g. do you do UAT testing only once per release? Every sprint? Every feature?). If the team was previously Waterfall, most of these activities probably end up in the Release DoD. An extremely mature agile team would probably have most of these activities in the DoD for the User Stories (because an extremely mature agile team will probably do continuous deployment and release every story). So what we need to do as a team is work to move these activities from their current home (the Release DoD) down into the Sprint DoD and eventually into the User Story DoD (and maybe into the lower-level Check-In DoD if we decide to use that).
We don’t have to move them all down to the User Story level immediately; as a team we figure out what we think we’re capable of moving down to the Sprint cycle and Story cycle right away, and that becomes our starting set of DoD’s. Over time the team makes an effort to keep moving activities down from Release->Sprint->Story as they become more agile and more mature. I try to encourage them to envision a world in which they deploy to production as each User Story is completed. They would need to be updating user manuals, creating installers, and doing UAT testing (typical Release-cycle activities) on every single User Story. They may never actually reach that point, but they should envision it, and strive to keep driving the activities down closer to the User Story cycle as they mature.
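Seen that way, the roadmap itself is nothing more than relocating activities to a shorter cycle as the team matures. A minimal sketch of that idea, with made-up activity names (again, an illustration rather than anything prescribed by the technique):

```python
# Illustrative sketch: maturing = moving activities to shorter cycles over time.
dods = {
    "Release": ["User Manuals Updated", "Installers Created", "UAT Testing"],
    "Sprint": ["Tested by QA"],
    "User Story": ["Code Reviews", "Automated Unit Tests Run"],
}


def move_activity_down(activity: str, from_cycle: str, to_cycle: str) -> None:
    """Relocate an activity to a shorter cycle, e.g. Release -> Sprint -> User Story."""
    dods[from_cycle].remove(activity)
    dods[to_cycle].append(activity)


# Example: the team decides it can now build installers every sprint instead of every release.
move_activity_down("Installers Created", "Release", "Sprint")
```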
This is a great technique for giving a team an easy-to-follow roadmap to mature their agile practices over time. Sure, there are other aspects to maturity outside of this, but it’s a great technique, and an easy one to visualize, for driving agility into the team. Just keep moving those activities (aka “gates”) down the board from Release->Sprint->Story.
I’ll try to give an example of what a recent client of mine had for their DoD’s (this is from memory, so probably not 100% accurate):
Release
- Create/Update deployment Instructions For Ops
- Instructional Videos Updated
- Run manual regression test suite
- UAT Testing
- In this case that meant deploying to an environment shared across the enterprise that mirrored production and asking other business groups to test their own apps to ensure we didn’t break anything outside our system
Sprint
- Deploy to UAT Environment
  - But not necessarily request that UAT testing actually occur
- User Guides updated
- Sprint Features Video Created
  - In this case we decided to create a video each sprint showing off the progress (a video version of the Sprint Demo)
User Story
- Manual Test scripts developed and run
- Tested by BA
- Deployed in shared QA environment
  - Using the automated deployment process
- Peer Code Review
Code Check-In
- Compiled (warning-free)
- Passes StyleCop
- Passes FxCop
- Create installer packages
- Run Automated Tests
- Run Automated Integration Tests
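The Check-In DoD above is the easiest one to automate outright. As a hedged sketch, a local gate script might look something like the following; the commands are placeholders, since the real invocations for the compiler, StyleCop, FxCop, and the test runners depend on the project’s toolchain and aren’t something described in this post.

```python
# Sketch of a Check-In DoD gate script. Every command below is a placeholder,
# not a real tool invocation; substitute the project's actual build/analysis/test commands.
import subprocess
import sys

CHECK_IN_GATES = [
    ("Compile (warning-free)", ["build.cmd"]),                  # placeholder
    ("StyleCop", ["stylecop.cmd"]),                             # placeholder
    ("FxCop", ["fxcop.cmd"]),                                   # placeholder
    ("Create installer packages", ["package.cmd"]),             # placeholder
    ("Automated unit + integration tests", ["run-tests.cmd"]),  # placeholder
]


def run_gates() -> int:
    """Run each gate in order; stop and fail on the first one that doesn't pass."""
    for name, command in CHECK_IN_GATES:
        result = subprocess.run(command)
        if result.returncode != 0:
            print(f"Check-In DoD gate failed: {name}")
            return 1
    print("All Check-In DoD gates passed -- OK to check in.")
    return 0


if __name__ == "__main__":
    sys.exit(run_gates())
```

If any gate fails, the check-in doesn’t happen, which is exactly the behaviour the DoD is meant to guarantee.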
PS: One of my clients had a great question when we went through this exercise. They said that if a Sprint is by definition done when the end date rolls around (it’s time-boxed), isn’t a DoD on a Sprint meaningless, since the Sprint is over on the end date regardless of whether those other activities are complete? My answer is that while that statement is true (the Sprint ends when the end date arrives no matter what), if the DoD activities haven’t been completed I would consider the Sprint a failure, similar to not completing what was committed/planned (failure may be too strong a word, but you get the idea). In the Retrospective that becomes an agenda item: discussing and understanding why we weren’t able to complete the activities we agreed needed to be completed each Sprint.