Design and development — General

ISO 9001 Clause 8.3.1 requires you to define, run, and keep current a design and development process that fits what you build and sell, so products and services can be delivered consistently. To operationalize it, you must set clear stages, controls, roles, and records for design work, then prove the process is followed in real projects. 1

Key takeaways:

  • You need a documented, repeatable design and development process scaled to your product/service risk and complexity. 1
  • Auditors will test “process exists” and “process is used,” so project records matter as much as procedures. 1
  • The fastest path is to standardize a design lifecycle template, gate reviews, and required artifacts, then pilot on active work and fix gaps. 1

Clause 8.3.1 is the entry point for ISO 9001’s design and development controls: it tells you to establish, implement, and maintain a process that is appropriate for your organization and that supports successful product and service delivery. 1 The operational challenge is “appropriate.” ISO does not prescribe a single lifecycle model, document set, or approval hierarchy. You must choose a process that matches how you develop outputs (physical products, software, regulated services, engineered-to-order work, or configured offerings) and then demonstrate consistent execution.

For a Compliance Officer, CCO, or GRC lead, the objective is straightforward: create a governance-ready design process that stands up in audit and reduces downstream failures (rework, customer complaints, nonconforming outputs). This page focuses on turning the requirement into a working control system: scope decisions, minimal viable procedures, required evidence, audit questions, and a practical execution plan you can hand to Engineering, Product, R&D, Professional Services, and Quality.

Regulatory text

ISO 9001:2015 Clause 8.3.1: “The organization shall establish, implement and maintain a design and development process appropriate to ensure the subsequent provision of products and services.” 1

What an operator must do

  • Establish: Define a standard design and development process (stages/activities, responsibilities, required inputs/outputs, controls). 1
  • Implement: Put the process into use on actual design work, not just on paper. 1
  • Maintain: Keep the process current as org structure, offerings, tools, and risks change; ensure people still follow it. 1
  • Appropriate: Scale controls to complexity and risk. A two-person team making a minor configuration change should not be forced through the same rigor as a new safety-critical product line, but both must follow a defined method. 1

Plain-English interpretation

You must run design work as a controlled business process, not as ad hoc heroics. That means:

  1. everyone knows how design begins and ends,
  2. requirements are captured and validated,
  3. design decisions are reviewed,
  4. changes are controlled, and
  5. you can show objective evidence that the process happened for each project.

All of that serves one practical goal: deliver products and services that can actually be provided consistently (manufactured, deployed, supported, serviced) after design is “done.” 1

Who it applies to (entity and operational context)

Applies to: any organization within the ISO 9001 QMS scope that performs design and development of products or services. 1

Typical functions in scope

  • Product management, engineering, R&D, software development
  • Service design (implementation methodology, managed services, clinical/technical service offerings)
  • Quality, regulatory, compliance, operations, supply chain (when they influence design outputs)
  • Third parties performing design work on your behalf (design houses, outsourced development, consultants). You remain accountable for process control within your QMS scope. 1

Common scoping decision

  • If you only build to a customer’s complete design, with no internal design authority, your “design and development” scope may be limited (ISO 9001 allows you to justify non-applicability of requirements you genuinely cannot apply). If you interpret requirements, select components, configure, or create service methods, you likely do design work and need controls.

What you actually need to do (step-by-step)

1) Define what “design and development” means in your org

Create a short definition and boundary statement:

  • What counts as design (new products, major revisions, service model creation, configuration rules, customer-specific engineering)?
  • What does not (pure purchasing, basic installation, routine maintenance)?
  • What are your design outputs (drawings, code, BOMs, specifications, SOPs, work instructions, service playbooks)?

This avoids audit pain where teams argue, “we don’t do design” while producing design outputs.

2) Choose a lifecycle model and map it to gates

Pick a lifecycle that matches your work (waterfall, agile, stage-gate, hybrid). Then map it into controlled gates you can evidence. A practical gate structure:

  • Initiation: confirm scope, owner, and success criteria. Minimum evidence (examples): project charter; design plan outline.
  • Requirements: capture and approve inputs. Minimum evidence: requirements/spec; acceptance criteria.
  • Design: produce the solution and verify the approach. Minimum evidence: design docs; review minutes.
  • Build/Configure: implement the design. Minimum evidence: code commits; build records; work instructions.
  • Verification/Validation readiness: confirm the output meets inputs and intended use. Minimum evidence: test results; pilot results; sign-off.
  • Release/Transfer: ensure operations can deliver and support. Minimum evidence: release notes; training; support handoff.

Keep it “appropriate” by allowing lighter evidence for low-risk changes while still requiring a defined path and record trail. 1
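One way to make that scaling rule concrete is a lookup from risk tier to mandatory gates, kept as version-controlled data. The tier names and gate lists below are illustrative assumptions for this sketch, not ISO 9001 requirements:

```python
# Illustrative risk-tier scaling rule: which gates are mandatory per tier.
# Tier names and gate lists are example assumptions, not ISO prescriptions.
REQUIRED_GATES = {
    "low": ["initiation", "verification", "release"],
    "medium": ["initiation", "requirements", "design", "verification", "release"],
    "high": ["initiation", "requirements", "design", "build",
             "verification", "release"],
}

def required_gates(risk_tier: str) -> list[str]:
    """Return the mandatory gates for a project's risk tier."""
    try:
        return REQUIRED_GATES[risk_tier]
    except KeyError:
        raise ValueError(f"Unknown risk tier: {risk_tier!r}")
```

Encoding the rule as data gives auditors a single artifact that answers “how do you decide what’s appropriate?” and lets tooling enforce it mechanically.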

3) Assign roles and decision rights

Auditors look for clarity on accountability. At minimum, define:

  • Design Owner (accountable for outputs and gate completion)
  • Approvers (often Quality plus functional leads)
  • Reviewers (cross-functional stakeholders: operations, service delivery, security/compliance if relevant)
  • Document control owner (ensures records are retained and versions are managed)

Publish a simple RACI for design gates and key artifacts.
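A RACI published as plain data is easy to keep under document control and to query from tooling. This minimal sketch assumes example gate names and role labels; substitute your own:

```python
# Minimal RACI for design gates encoded as data.
# Gate names and role assignments are illustrative assumptions.
RACI = {
    "requirements": {"A": "design_owner", "R": ["product_manager"],
                     "C": ["quality", "operations"], "I": ["support"]},
    "release":      {"A": "design_owner", "R": ["quality"],
                     "C": ["operations"], "I": ["sales"]},
}

def accountable_for(gate: str) -> str:
    """Return the single accountable role for a gate."""
    return RACI[gate]["A"]
```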

4) Build “minimum viable” documented information

Clause 8.3.1 does not list documents by name, but you need enough documented information to show the process is established and implemented. 1 Start with:

  • Design & Development Procedure (or Process Map)
  • Design Planning template (even one page)
  • Gate review checklist template
  • Design change request template
  • Record retention rule for design artifacts

Keep it short, but enforce it.

5) Embed the process in daily tooling

Make compliance the default path:

  • Add required fields/checklists to your PLM/ALM/Jira/DevOps tool, document system, or service management platform.
  • Prevent “release” status without required approvals for defined risk classes.
  • Standardize naming conventions so records can be found during audit.
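The “no release without approvals” control above can be expressed as a small check that tooling runs before allowing the status transition. The risk classes and approver roles here are assumptions for illustration:

```python
# Sketch of a release gate: block the "release" status transition until
# every approval required for the project's risk class is recorded.
# Risk classes and approver roles are illustrative assumptions.
REQUIRED_APPROVALS = {
    "low": {"design_owner"},
    "high": {"design_owner", "quality", "functional_lead"},
}

def can_release(risk_class: str,
                recorded_approvals: set[str]) -> tuple[bool, set[str]]:
    """Return (ok, missing_approvals) for a requested release transition."""
    missing = REQUIRED_APPROVALS[risk_class] - recorded_approvals
    return (not missing, missing)
```

Wired into a workflow tool as a transition validator, this makes the compliant path the default path rather than a manual checklist step.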

If you use Daydream to manage third-party due diligence and evidence workflows, treat external design contributors as third parties and use Daydream tasks to collect design controls evidence (e.g., design plans, reviews, verification results) as part of project governance. This keeps evidence centralized and audit-ready without chasing emails.

6) Pilot on active projects, then correct gaps

Pick a live project and run the gates end-to-end. Document:

  • where teams got stuck,
  • which artifacts were unclear,
  • which approvals were missing,
  • whether low-risk work needs a lighter path.

Update templates and guidance. That is “maintain” in practice. 1

Required evidence and artifacts to retain

Retain enough records to prove the process exists and was followed. A practical evidence list:

Process-level (system)

  • Approved Design & Development procedure/process map (current version)
  • RACI or role definitions for design activities
  • Training/communication records for affected teams (optional but persuasive)

Project-level (execution)

  • Design plan or project plan showing stages/gates
  • Design inputs/requirements and acceptance criteria
  • Design outputs (specs, drawings, code/design docs, service procedures)
  • Records of design reviews (agenda, attendees, decisions, action items)
  • Verification/validation records tied to requirements (test plans/results, pilots)
  • Release/transfer records (handoff checklist, training, operational readiness)
  • Design change records (what changed, approval, rationale)

Retention and control

  • Version history and access control for controlled documents
  • A traceable file path or system link that an auditor can follow without tribal knowledge
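The evidence index itself can be machine-checked before audit. This sketch assumes each project keeps a simple mapping from artifact name to a link or path; the artifact names are illustrative examples:

```python
# Sketch of an evidence-index completeness check: given a project's index
# (artifact name -> link), report required artifacts that are missing or
# have an empty link. Artifact names are example assumptions.
REQUIRED_ARTIFACTS = [
    "design_plan", "requirements", "design_outputs",
    "review_minutes", "verification_records", "release_record",
]

def missing_evidence(index: dict[str, str]) -> list[str]:
    """Return required artifacts with no usable link, in checklist order."""
    return [a for a in REQUIRED_ARTIFACTS if not index.get(a)]
```

Running this on every project package turns “can an auditor follow the trail?” into a pass/fail check instead of tribal knowledge.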

Common exam/audit questions and hangups

Expect these, and prepare the evidence trail:

  1. “Show me your design and development process.” Provide the procedure and a process map with gates. 1
  2. “Show me a project where this was followed.” Have one “golden thread” project package ready.
  3. “How do you decide what’s ‘appropriate’?” Explain a risk-based scaling rule (e.g., risk classes determine which gates/artifacts are mandatory).
  4. “Who approves design outputs?” Point to role definitions and actual approval records.
  5. “How do you maintain the process?” Show periodic review notes, template updates, and corrective actions when audits or issues find gaps.

Frequent implementation mistakes and how to avoid them

  • Mistake: A beautiful procedure nobody uses. Fix: build gate checklists directly into delivery workflows and require evidence before release.
  • Mistake: Treating services as “not design.” Fix: define service design outputs (runbooks, service descriptions, onboarding workflows) and apply the same governance.
  • Mistake: No rule for “small changes.” Fix: create a simplified path for low-risk changes so teams do not bypass the process.
  • Mistake: Records scattered across tools with no index. Fix: require a project evidence index (a single page with links) and keep it under document control.
  • Mistake: Third-party design work unmanaged. Fix: require external contributors to follow your gates or provide equivalent evidence; collect and retain it.

Enforcement context and risk implications

ISO 9001 is a certifiable standard rather than a regulator, so “enforcement” typically occurs through certification audits, customer audits, and contractual requirements, not government penalties. The risk is still real: weak design controls lead to nonconforming outputs, delivery failures, customer complaints, and audit nonconformities that can threaten certification status and customer trust. 1

Practical 30/60/90-day execution plan

First 30 days (Immediate)

  • Confirm scope: list products/services and where design authority exists.
  • Draft a one-page design process map with gates and owners.
  • Create templates: design plan, gate review checklist, change request.
  • Select a “golden thread” project to pilot and collect evidence.

Next 60 days (Near-term)

  • Run the pilot through gates; capture issues and refine the process.
  • Implement a scaling rule (risk tiers) so “appropriate” is defensible.
  • Train affected teams with short, role-based guidance.
  • Set up a single evidence index structure per project (folder or system record).

By 90 days (Stabilize and maintain)

  • Expand to additional projects and confirm consistency.
  • Perform an internal check on one completed project package: can someone unfamiliar reconstruct the design history from records?
  • Establish maintenance ownership: schedule periodic process review, template changes, and corrective actions.
  • Integrate third-party design contributors into the evidence workflow (collection, review, retention).

Frequently Asked Questions

Do we need a formal “design procedure” document to meet Clause 8.3.1?

You need documented information that shows the process is established and maintained, and evidence that it is implemented in real work. A concise procedure plus templates and project records usually satisfies this expectation. 1

We’re agile. How do we prove a design and development process exists?

Define your agile workflow as the design process: roles, ceremonies as gates (e.g., backlog refinement, sprint review), and required artifacts (epics, acceptance criteria, test evidence). Then show a project where those controls were actually executed and recorded. 1

What does “appropriate” mean in an audit?

“Appropriate” means scaled to the nature of your products and services and sufficient to support consistent subsequent provision. Your best defense is a written scaling rule that ties rigor (reviews, approvals, testing evidence) to risk and complexity. 1

Does this apply to service companies with no manufacturing?

Yes, if you design services or service delivery methods. Treat service descriptions, runbooks, onboarding workflows, and acceptance criteria as design outputs and control them through your design process. 1

How should we handle third parties doing design work for us?

Keep accountability inside your QMS scope. Require third parties to follow your defined gates or provide equivalent artifacts (requirements, reviews, verification results), then retain those records in your system of record. 1

What’s the single most persuasive audit artifact?

A “golden thread” project file that links requirements to design outputs, reviews, and verification/validation results, with clear approvals and version control. Auditors prefer one complete example over many partial ones. 1

Footnotes

  1. ISO 9001:2015 Quality management systems — Requirements

