Eight Tips for writing a good definition of ‘done’

After scouring the internet, looking at what my teams have come up with, and discussing with colleagues, I ended up with the following eight tips for writing a definition of ‘done’ (DoD):

1. Write it as a team. Consider the points of view of all the disciplines, competencies and skill sets in the team

The value in a team’s DoD is the shared understanding and agreement of the whole team, including testers, technical writers, etc. Delegating the writing of the DoD to one or two team members, or worse, to the team’s Scrum Master, Product Owner or department manager, misses the whole point. The team’s Scrum Master, as protector of the process, is responsible for ensuring the team agrees a DoD, but the Scrum Master is not responsible for writing it. The team needs to do it as a whole.

2. Use a bullet list, not long paragraphs

A simple flat bullet list is usually the best format. It is easier to read and remember than paragraphs of text, and it helps keep the DoD concise. If the team has a technical writer, or access to one, they can help the team express each bullet point clearly and concisely. Nested bullet points are generally best avoided: one level of nesting is OK, but anything deeper makes the DoD too complicated. If you feel you must elaborate on an item, do so in a footnote after the main list.

3. Do not include specific best practices or conventions. Instead, describe these separately and refer to them from the DoD

Specific best practices, conventions and patterns should be documented somewhere appropriate and referred (linked) to from the DoD, e.g. “… code must comply with agreed best practices“. Doing this lets guidelines, idioms and best practices be modified and improved without having to update the team’s DoD every time.

4. Write it in terms of new and modified product functionality to avoid duplicating content for different backlog item types

What does the definition of done refer to: user stories, tasks, bug fixes, PoCs, spikes? Ultimately, the outcome of a team’s efforts is new or modified code that forms part of a software product, and the quality of that code should be consistent. So instead of writing separate DoD criteria for different backlog item types, consider expressing the DoD in terms of new and modified product functionality and source code. It might help to start by listing the criteria for each kind of backlog item and then generalise and abstract from there. This should help prevent the DoD becoming overly detailed and keep it concise.

5. Avoid concrete, ticks-in-a-box clauses that tempt the team to artificial compliance

Avoid concrete measures and limits in a DoD like “test coverage of at least 70%” or “at least one automated test”. Such statements tempt people to game DoD compliance. For example, dictating test coverage percentages encourages the team to write low-value tests simply to boost the coverage number. It is better if the DoD requires the team to think through and design the right level of automated testing and coverage for the work being done. Similarly, ‘at least one’ clauses tend to encourage teams to fulfil them by providing only one. Again, it is better to require the team to identify and provide the appropriate amount of whatever it is. The DoD is the team’s agreement to do what they consider best, and is what they expect from each other as much as anything else. It should not be an additional, artificial burden that adds little value and does not ultimately contribute to the quality of what the team produces.
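To make the coverage-gaming point concrete, here is a small hypothetical Python sketch (the `apply_discount` function and both tests are invented for illustration, not taken from any real codebase). The first test executes every line of the function, so a coverage tool reports it as fully covered, yet it asserts nothing and would still pass if the arithmetic were wrong. The second test actually pins down the expected behaviour:

```python
# Hypothetical function under test (invented for illustration).
def apply_discount(price: float, percent: float) -> float:
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# A coverage-gaming test: it runs every line of apply_discount,
# so the coverage number goes up, but it verifies nothing.
def test_apply_discount_low_value():
    apply_discount(100.0, 10.0)

# A meaningful test: it checks the results and the error path,
# which is what a thought-through DoD should encourage instead.
def test_apply_discount_meaningful():
    assert apply_discount(100.0, 10.0) == 90.0
    assert apply_discount(19.99, 0.0) == 19.99
    try:
        apply_discount(100.0, 150.0)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

Both tests contribute identically to a “70% coverage” target, which is exactly why such targets invite artificial compliance.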

6. Consider the DoD from a number of angles

  • Acceptance – the Product Owner has the final word on whether an item does what is required, but some teams have other stakeholders that also need to approve their work, e.g. a User Interaction / Product Design lead, an Enterprise Architect, etc.
  • Functional Testing – what does the DoD need to say about new and existing automated or manual test cases? e.g. any new or modified product functionality is covered by a set of good automated tests, and automated tests for all other functionality are still passing
  • General Requirements – what about security, data protection, performance, third-party licensing and compatibility, usability, installation and upgradability, localisation, etc.? Does the DoD need to include anything about these?
  • Code Quality – what are the code review and static analysis requirements? e.g. all new and changed code plus supporting resource files have been peer reviewed, comply with best practices, and have been scanned by static analysis tools X, Y and Z
  • Documentation – if the team has a technical writer or needs to provide input to a technical writing department, what does the DoD need to include to cover that? e.g. the team’s technical writer has everything they need to update the product documentation, release notes, etc.
  • Integration – a team’s code is not usually ‘done’ if it has not been integrated at some level with the main body of a product or project’s code, e.g. code has been merged and integrated into the team’s main branch after successful execution of the team’s continuous integration pipelines
  • Process – have process items been updated: sticky notes moved, whiteboards updated, JIRA tickets set to the correct status, wiki pages modified, etc.?

7. Think through as a team how to verify that each criterion in the DoD has been met

This will help the team understand the impact of the DoD on their day-to-day ways of working: the columns and filters they might want in their sprint boards, the fields in their backlog items/tickets in JIRA, what additional information is captured on a wiki page, a checklist of things to consider in sprint planning for each backlog item, etc.

8. Don’t leave it to gather dust in a tool. Keep it in view and review regularly.

If the team is co-located, stick the DoD on a wall where it is unavoidably visible. If the aims and goals of the team change, e.g. they start work on a new type of product, with a new technology stack, or in a new business area, consider the impact of that on the DoD. If the DoD is not being talked about and referred to, is it still relevant? If not, review it and bring it up to date. The DoD is a tool to help the team; don’t relegate it to a dry, next-to-useless, process-compliance item taking up space on some long-lost web page, presentation slide, or word-processor document.