Design and development of products and services
ISO 9001:2015 Clause 8.3 requires you to establish, implement, and maintain a controlled design and development process for products and services so outputs reliably meet requirements. To operationalize it, define the stages, responsibilities, inputs/outputs, reviews, verification/validation, change control, and records you will keep for every design effort.
Key takeaways:
- You need a documented, consistently followed design and development process, not ad hoc “engineering best effort.” 1
- Auditors will look for objective evidence: defined inputs/outputs, reviews, verification/validation results, and controlled design changes. 1
- Scope includes product, service, and software design, plus changes to existing offerings, including where third parties do design work under your QMS. 1
“Design and development of products and services” is where many ISO 9001 programs fail in practice: teams build things, ship things, and only later try to reconstruct how requirements were understood, risks were addressed, and changes were controlled. Clause 8.3 forces discipline. You must run design work through a defined process that produces consistent outputs and traceable records. 1
For a Compliance Officer, CCO, or GRC lead, the fastest path is to translate Clause 8.3 into an operational “design control” backbone: a lightweight lifecycle, required checkpoints, and mandatory artifacts. This does not need to be slow or bureaucratic. It does need to be repeatable, auditable, and integrated with how your teams actually deliver (product management, engineering, professional services, R&D, packaging/labeling, implementation, and support).
This page gives you requirement-level implementation guidance: who it applies to, what to build, what evidence to retain, where audits get stuck, and a practical execution plan you can run with functional leaders.
Requirement text
Requirement (excerpt): “The organization shall establish, implement and maintain a design and development process.” 1
Operator interpretation: You must have a defined end-to-end process for design and development work, you must follow it, and you must keep it current over time. “Process” here means more than a policy statement. It means a managed lifecycle with clear responsibilities, controlled inputs and outputs, planned reviews, and retained evidence that design results meet requirements and that changes are controlled. 1
Plain-English interpretation (what Clause 8.3 is really asking)
If your organization designs or develops what it sells or delivers, ISO expects you to:
- Plan the work so design activities are appropriate for the product/service and risk.
- Start from defined requirements (customer needs, regulatory needs, internal standards).
- Create design outputs that can be built, delivered, and supported.
- Check the work through reviews and objective testing/confirmation steps.
- Control change so you know what changed, why, who approved it, and what it impacts.
- Keep records so the story is provable later. 1
Who it applies to (entity and operational context)
Clause 8.3 applies to any organization within scope of an ISO 9001:2015 QMS that performs design/development activities for products or services. 1
In practice, it applies when any of the following are true:
- You create a new product, feature, formulation, device, or software capability.
- You design a service method (onboarding, implementation, managed service, testing service) where outcomes depend on controlled design decisions.
- You customize or configure solutions in a way that changes requirements, performance, security posture, safety, or usability.
- You make material changes to existing offerings (design changes, substitutions, re-architecture, workflow redesign). 1
It also applies when design work is performed by third parties (contract manufacturers, design houses, engineering contractors, software development partners). The work can be outsourced, but accountability for the process and evidence stays with you under your QMS scope. 1
What you actually need to do (step-by-step)
Below is a pragmatic build-out that satisfies the “establish, implement, maintain” requirement and aligns to what auditors test for.
1) Define your design and development scope and triggers
Create a short rule set that answers:
- What counts as “design and development” in our business?
- What work is exempt (for example, routine, low-risk changes) and what still requires change control?
- What are the triggers for initiating a design record (new product, major change, customer-driven customization)? 1
Deliverable: Design & Development Procedure with scope, triggers, and roles.
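As a sketch, the trigger rules above can be encoded as a simple decision function. The field names below are illustrative choices a team might make, not ISO-mandated criteria:

```python
# Hypothetical trigger rule for opening a design record; conditions are
# examples a team might define, not prescribed by ISO 9001:2015.

def needs_design_record(work):
    """Open a design record for new offerings, major changes, or
    customer-driven customization; routine low-risk work stays under
    standard change control instead."""
    return (
        work.get("new_offering", False)
        or work.get("major_change", False)
        or work.get("customer_customization", False)
    )

print(needs_design_record({"major_change": True}))   # True
print(needs_design_record({"routine": True}))        # False
```

Keeping the rule this explicit makes the "what is exempt" decision reviewable rather than tribal knowledge.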
2) Set lifecycle stages and required checkpoints
Define a stage model that fits your delivery style (waterfall, agile, hybrid). Auditors care less about the model name and more about consistent control points. Minimum checkpoints to define:
- Planning / initiation: purpose, scope, responsibilities, needed resources.
- Requirements: documented design inputs and acceptance criteria.
- Design / build: outputs produced to meet inputs.
- Review: formal review(s) with defined participants and outcomes.
- Verification and validation approach: what evidence proves the outputs meet the inputs and intended use.
- Release / handoff: readiness for production or service delivery.
- Change control: method to evaluate, approve, implement, and record changes. 1
Deliverable: Design Plan template (even a one-page form) plus a stage gate checklist.
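A stage-gate checklist can be as simple as a lookup of required artifacts per checkpoint. The gate and artifact names below are hypothetical examples of what such a checklist might contain, not a mandated model:

```python
# Illustrative stage gates with required artifacts per checkpoint.
# Names are assumptions for the sketch, not ISO-defined stages.
GATES = [
    ("planning", ["design_plan"]),
    ("requirements", ["requirements_doc", "acceptance_criteria"]),
    ("design", ["design_outputs"]),
    ("review", ["review_minutes"]),
    ("verification_validation", ["vv_results"]),
    ("release", ["release_checklist"]),
]

def first_blocked_gate(artifacts):
    """Return the first gate whose required artifacts are missing,
    or None if every checkpoint is satisfied."""
    for gate, required in GATES:
        if any(a not in artifacts for a in required):
            return gate
    return None

project = {"design_plan", "requirements_doc", "acceptance_criteria", "design_outputs"}
print(first_blocked_gate(project))  # review minutes are missing, so "review"
```

The same logic works whether the "gates" map to waterfall phases or agile ceremonies; what matters is that advancement is conditioned on evidence.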
3) Standardize design inputs
For each design effort, require a minimum set of inputs:
- Customer and user needs (including contractual requirements)
- Applicable internal standards and constraints
- Regulatory or statutory requirements, if applicable to your offering
- Risks/assumptions and operational constraints (support model, environment, interfaces)
- Acceptance criteria (what “done” means, measurable where possible) 1
Deliverable: Requirements document (PRD/SRD/URS) with version control.
4) Define what “design outputs” must include
Outputs must be sufficient to enable provision of the product/service. Make this concrete:
- Specifications, drawings, architectures, configurations, BOMs, workflows, runbooks
- Service delivery method, tooling requirements, competency requirements
- Test strategy and test cases where relevant
- Release notes and operational handover requirements (support, monitoring, training) 1
Deliverable: Output list by offering type (software, service, hardware, mixed).
5) Run documented design reviews with decision records
Set a rule: design cannot advance without a documented review at defined points. Reviews should produce:
- Attendance/participants and roles
- Issues found, decisions made, action items, due dates
- Explicit confirmation that inputs are addressed or exceptions are accepted (with rationale) 1
Deliverable: Design review minutes template and a decision log.
6) Execute verification and validation (V&V) and retain results
Even if your teams do not use the terms “verification” and “validation,” you need the concepts:
- Verification: evidence the outputs meet the inputs (tests, inspections, peer review, static analysis, calculations).
- Validation: evidence the resulting product/service meets intended use (UAT, pilots, simulations, field trials, customer acceptance). 1
Deliverable: Test plan, test results, acceptance records, and traceability where needed.
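Basic requirement-to-test traceability can be checked mechanically. The requirement IDs and test records below are invented for illustration:

```python
# Minimal traceability sketch: flag requirements with no passing
# test evidence. Record shapes are hypothetical examples.
requirements = ["REQ-1", "REQ-2", "REQ-3"]
tests = [
    {"id": "T-10", "covers": ["REQ-1"], "result": "pass"},
    {"id": "T-11", "covers": ["REQ-2", "REQ-3"], "result": "fail"},
]

def coverage_gaps(reqs, test_records):
    """Requirements that lack passing verification evidence."""
    passed = {r for t in test_records if t["result"] == "pass" for r in t["covers"]}
    return [r for r in reqs if r not in passed]

print(coverage_gaps(requirements, tests))  # ['REQ-2', 'REQ-3']
```

Even this coarse check answers the auditor question "how do you know every requirement was verified?" with evidence rather than assertion.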
7) Put design change control under configuration management
Define a change process for design outputs:
- How changes are proposed and documented
- Impact analysis expectations (quality, safety, security, performance, usability, delivery)
- Approval authority (who can approve what)
- Required re-review and re-testing rules
- How you ensure the latest approved version is used in production/service delivery 1
Deliverable: Engineering change order (ECO) or design change request workflow plus version control rules.
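A change record is only auditable if its required fields are actually filled in. The field names below are an assumed ECO schema for the sketch, not a mandated one:

```python
# Hypothetical ECO schema; field names are assumptions, not a standard.
REQUIRED_FIELDS = {
    "change_id", "description", "impact_analysis",
    "approver", "approved_on", "affected_outputs",
}

def eco_is_complete(eco):
    """A change record is auditable only when every required field
    is present and non-empty."""
    return all(eco.get(f) not in (None, "", []) for f in REQUIRED_FIELDS)

eco = {
    "change_id": "ECO-042",
    "description": "Update tolerance on housing spec",
    "impact_analysis": "No safety impact; re-run fit test",
    "approver": "Engineering manager",
    "approved_on": "2024-05-01",
    "affected_outputs": ["SPEC-7 rev C"],
}
print(eco_is_complete(eco))  # True
```

Enforcing completeness at submission time is far cheaper than reconstructing impact analysis during an audit.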
8) Ensure third-party design work is controlled
Where third parties produce design outputs:
- Contractually require deliverables, review rights, and record access
- Define acceptance criteria and review steps for third-party outputs
- Make sure you can retain evidence (or access it) during audits 1
Operational tip: tie this into your third-party onboarding and contract review so design control expectations are not discovered after work starts.
9) Maintain the process (keep it current)
“Maintain” means your process must survive organizational change:
- Periodic internal audit coverage for design controls
- Lessons learned fed back into templates/checklists
- Training and onboarding for new designers, PMs, and service leads
- Metrics that detect bypassing the process (for example, releases without required approvals) 1
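The bypass metric above (releases shipped without required approvals) can be computed directly from release records. The record shape here is an assumption for illustration:

```python
# Hedged sketch: find releases with no linked design-review approval.
# The "approval_ref" field is a hypothetical convention.
releases = [
    {"id": "R-1", "approval_ref": "REV-88"},
    {"id": "R-2", "approval_ref": None},
    {"id": "R-3", "approval_ref": "REV-91"},
]

def unapproved_releases(records):
    """Releases missing an approval reference; candidates for
    corrective action and process-bypass review."""
    return [r["id"] for r in records if not r.get("approval_ref")]

print(unapproved_releases(releases))  # ['R-2']
```

Run something like this on a schedule and the "maintain" obligation becomes a standing report instead of an annual scramble.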
Required evidence and artifacts to retain
Auditors will expect objective evidence that your process exists and is followed. Maintain, at minimum:
| Artifact | What it proves | Owner |
|---|---|---|
| Design & Development Procedure | Process is established | Quality / QMS owner |
| Design plan / project initiation record | Work was planned with responsibilities | Product/Engineering/Service lead |
| Requirements (inputs) with version history | Inputs are defined and controlled | Product/Engineering |
| Design outputs (specs, drawings, workflows) | Outputs exist and support delivery | Engineering/Service Ops |
| Design review records + decision log | Reviews occurred and issues were managed | Project lead |
| Verification/validation plans and results | Outputs meet inputs and intended use | QA/Engineering/Service QA |
| Change requests / ECOs + approvals | Changes are evaluated and controlled | Engineering/QMS |
| Release/handoff checklist | Controlled transition to operations | Release manager/Service delivery |
| Third-party deliverables + acceptance evidence | Outsourced design is controlled | Vendor/TPRM owner + project lead |
All of the above tie back to the requirement to establish, implement, and maintain a design and development process. 1
Common audit questions and hangups
Expect these lines of questioning:
- “Show me your design and development process. Where is it documented?” 1
- “Pick a recently released product/service. Walk me from inputs to outputs to release.” 1
- “Where are the design reviews documented, and who approved the decisions?” 1
- “How do you confirm requirements were met? Show verification/validation evidence.” 1
- “How do you control changes? How do you prevent outdated specs from being used?” 1
- “What happens when a third party designs part of the solution? How do you accept it?” 1
Hangups that cause nonconformities:
- Records exist but are scattered and cannot be produced quickly.
- Teams can describe “how we usually do it” but cannot show a controlled process.
- Changes happen through informal chat approvals with no retained evidence.
Frequent implementation mistakes (and how to avoid them)
- Writing a procedure no one follows. Fix: map your real delivery flow, then add the minimum controls to make it auditable.
- Treating services as exempt from design control. Fix: define “service design outputs” (method statement, runbook, tooling, acceptance criteria) and review them before rollout.
- Confusing “peer review exists” with “design review is controlled.” Fix: require review records with decisions, not just code comments.
- No link between requirements and testing evidence. Fix: require at least basic traceability: each major requirement points to a test, inspection, or acceptance step.
- Change control only after production incidents. Fix: enforce a pre-release change process for design outputs; reserve “emergency change” as an exception with retrospective review.
Enforcement context and risk implications
No public enforcement cases were provided for this requirement in the source material. For ISO programs, the practical “enforcement” mechanism is certification audits, customer audits, and contractual quality requirements. Failure modes show up as nonconformities, delayed releases, rework, defect escape, warranty/service failures, and audit findings that can block deals where ISO 9001 certification is a gating requirement. 1
A practical 30/60/90-day execution plan
If you need to move quickly, here is a staged plan you can run without waiting on a major tool rollout.
First 30 days (stabilize and define)
- Identify all teams performing design/development (product, engineering, services, R&D, ops engineering).
- Publish a single Design & Development Procedure with scope, triggers, roles, and required artifacts. 1
- Standardize three templates: design plan, requirements, design review minutes.
- Pilot the process on one active initiative and one recent change (retroactive evidence collection to find gaps).
By 60 days (implement and evidence)
- Train owners and reviewers on the checkpoints and what “done” evidence looks like.
- Add a release/handoff checklist that requires links to review and V&V evidence.
- Implement change control for design outputs (even if it’s a ticket workflow plus version control rules).
- Extend controls to third-party design deliverables (contract addendum language or SOW checklist, plus acceptance steps).
By 90 days (maintain and audit-ready)
- Run an internal audit focused on a sample of design projects and changes; document findings and corrective actions. 1
- Add monitoring so bypassing the process is visible (for example, releases without approvals).
- Consolidate evidence storage (single repository structure by project/release).
- If you need workflow discipline, Daydream can help centralize design control evidence, approvals, and third-party deliverable tracking so audit retrieval is not a scramble.
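The evidence-consolidation step above can be backed by a simple per-release completeness check. The artifact names below are one possible repository convention, not a requirement of the standard:

```python
# Sketch of a per-release evidence completeness check; the expected
# artifact list is an assumed convention, not ISO-mandated.
EXPECTED = ["design_plan.md", "requirements.md", "review_minutes.md", "vv_results.md"]

def missing_evidence(present_files):
    """Artifacts a release folder should contain but does not."""
    present = set(present_files)
    return [name for name in EXPECTED if name not in present]

print(missing_evidence(["design_plan.md", "requirements.md"]))
# ['review_minutes.md', 'vv_results.md']
```

A gap list per release makes audit retrieval a lookup, not a hunt.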
Frequently Asked Questions
Does Clause 8.3 apply if we only configure an existing product for each customer?
If configuration changes requirements, performance, interfaces, or delivery outcomes, treat it as design/development and run it through your process. If it is routine and low-risk, define it as standard work but keep change control and acceptance evidence. 1
We are agile. Do we need formal stage gates?
You need defined control points and records; they can align to agile ceremonies. Document how epics/stories map to requirements, how reviews/approvals happen, and where V&V evidence is retained. 1
What’s the minimum evidence an auditor will expect for one design change?
A controlled change record, the updated design output version, evidence of review/approval, and evidence the change was verified and, where needed, validated for intended use. The point is traceability from decision to outcome. 1
Can we rely on customer acceptance as “validation”?
Customer acceptance can be part of validation if it demonstrates intended use is met and is documented. You still need verification evidence that specific requirements were met, not only that the customer signed off. 1
How do we handle third-party designers who won’t share detailed working papers?
Put evidence access and deliverable expectations into the contract/SOW, and define your internal acceptance testing and review as the control. If you cannot obtain sufficient evidence, treat the third party’s output as higher risk and increase your verification/validation on receipt. 1
Where should we store design records so audits don’t turn into a fire drill?
Use a consistent repository structure by product/service and release, and require that review and V&V records are linked from the design plan or release checklist. Tools like Daydream can help by centralizing artifacts and approvals across teams and third parties, but the key is a single source of truth. 1
Footnotes
1. ISO 9001:2015 Quality management systems — Requirements.
Authoritative Sources
- ISO 9001:2015 Quality management systems — Requirements
Operationalize this requirement
Map requirement text to controls, owners, evidence, and review workflows inside Daydream.
See Daydream