I see this confusion time and again: teams muddling up acceptance criteria with the definition of done. It’s a subtle distinction, but one that, if ignored, leads to missed expectations, hidden risks, and ultimately, broken trust between teams and stakeholders. Let’s be clear: acceptance criteria define scope, and the definition of done defines quality. If you want to move fast without cutting corners, you need to get this right.
Acceptance Criteria: The Specifics
Acceptance criteria are the specific, testable conditions that a particular backlog item must satisfy. They answer the question: “Did we meet this specific need?” or “Did we cover this particular case?” Think of them as the checklist for a single story or feature. They’re unique to each item and help the team and stakeholders agree on when that piece of work is complete.
- They clarify intent and remove ambiguity.
- They help the team know when to stop working on a story.
- They provide a basis for acceptance testing.
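Because acceptance criteria are testable by definition, they translate naturally into executable checks. Here’s a minimal sketch, assuming an invented story (“a shopper can apply a discount code at checkout”) with a hypothetical `apply_discount` function; the story, codes, and rates are illustrative, not from any real system:

```python
# Hypothetical story: "As a shopper, I can apply a discount code at checkout."
# Each acceptance criterion becomes one concrete, testable assertion.

def apply_discount(total: float, code: str) -> float:
    """Apply a discount code to an order total (illustrative implementation)."""
    discounts = {"SAVE10": 0.10, "SAVE20": 0.20}  # valid codes and their rates
    rate = discounts.get(code, 0.0)               # unknown codes give no discount
    return round(total * (1 - rate), 2)

# AC1: a valid code reduces the total by the advertised rate.
assert apply_discount(100.0, "SAVE10") == 90.0
# AC2: an invalid code leaves the total unchanged (the "particular case").
assert apply_discount(100.0, "BOGUS") == 100.0
```

When every criterion is written this way, “did we meet this specific need?” stops being a matter of opinion: the team knows exactly when to stop working on the story.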
But—and this is crucial—acceptance criteria are not the same as the definition of done.
Definition of Done: The Quality Baseline
The definition of done is your organisation’s quality baseline. It’s the minimum standard that every increment, every story, every feature must meet before it can be considered complete. This is not negotiable, not optional, and certainly not story-specific.
- Security checks? Done.
- Telemetry and monitoring? Done.
- Automated tests passing? Done.
- Deployment readiness? Done.
- Regulatory compliance? Done.
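The baseline only works if it is explicit and applied uniformly. This sketch, with gate names invented to mirror the list above, shows the key property: the same checklist gates every increment, regardless of what its acceptance criteria say:

```python
# A definition of done as an explicit, story-agnostic checklist.
# Gate names are illustrative; real gates would be enforced in CI.

DEFINITION_OF_DONE = [
    "security_checks",
    "telemetry_and_monitoring",
    "automated_tests_passing",
    "deployment_readiness",
    "regulatory_compliance",
]

def is_done(increment: dict) -> bool:
    """An increment is done only when every organisational gate passes."""
    return all(increment.get(gate, False) for gate in DEFINITION_OF_DONE)

# A story can satisfy its own acceptance criteria yet still fail the baseline:
risky = {"automated_tests_passing": True, "deployment_readiness": True}
assert not is_done(risky)  # security, telemetry, and compliance gates missing
```

Notice that `is_done` takes no story-specific input at all: that is precisely what makes it a definition of done rather than another set of acceptance criteria.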
The definition of done is the safety net that ensures every piece of work meets your organisation’s standards, regardless of who worked on it or what the feature is. It’s about consistency, reliability, and trust.
Why the Distinction Matters
When teams confuse acceptance criteria with the definition of done, corners get cut. Maybe a story passes its acceptance criteria but skips security checks or forgets about deployment readiness. That’s how technical debt creeps in, and how trust with customers and stakeholders erodes.
I’ve seen teams deliver features that “work” according to their acceptance criteria, only to find out later that they’re not secure, not compliant, or not ready for production. That’s not agility—that’s chaos.
Building Clarity That Scales
If you’re still mixing up acceptance criteria and the definition of done, it’s time to build clarity that scales. Here’s my advice:
- Make your definition of done visible and explicit. Post it on the wall, put it in your tools, and review it regularly.
- Treat acceptance criteria as the contract for each backlog item, but never as a substitute for organisational quality.
- Inspect and adapt both regularly. As your organisation grows, your definition of done will evolve. So will your acceptance criteria.
In summary, acceptance criteria define scope; the definition of done defines quality. Both are essential, but they serve very different purposes. Get this right, and you’ll move faster, deliver better, and build trust that lasts.