Design and development inputs
ISO 9001 Clause 8.3.3 requires you to explicitly define and control the “design inputs” for each product or service you design, so engineering and delivery teams build against known, approved requirements instead of assumptions. Operationally, you need a repeatable method to capture, review, approve, and maintain inputs such as functional/performance needs, legal requirements, applicable standards, and consequences of failure. 1
Key takeaways:
- Treat “design inputs” as controlled requirements, not informal notes or emails.
- Inputs must cover functional/performance needs, statutory/regulatory requirements, standards/codes, and failure consequences. 1
- Auditors look for traceability: where inputs came from, who approved them, and how changes were controlled.
Design and development inputs are the requirement baseline for any controlled design process. If your organization designs products or services, ISO 9001 expects you to determine what requirements are essential before you commit to design outputs, verification/validation plans, or release decisions. Clause 8.3.3 is where many programs fail quietly: teams start building with partial requirements, rely on “tribal knowledge,” or assume regulatory requirements are “handled by someone else.”
For a Compliance Officer, CCO, or GRC lead, the job is to make design inputs operational: define what must be captured, ensure the right functions review it, and create evidence that inputs were complete, current, and approved. You are not being asked to design the product. You are being asked to make the requirement system reliable and auditable.
This page translates ISO 9001:2015 Clause 8.3.3 into a practical playbook: who owns what, the steps to implement quickly, the artifacts to retain, common audit hangups, and a phased execution plan you can run as a small program.
Regulatory text
ISO 9001:2015 Clause 8.3.3 states: “The organization shall determine the requirements essential for the specific types of products and services to be designed and developed.” 1
Operator interpretation (what you must do):
- Define what inputs are required for your design work, appropriate to the product/service type. 1
- Ensure the input set includes, at minimum, functional and performance requirements, statutory and regulatory requirements, standards and codes of practice, and consequences of failure. 1
- Control those inputs so teams can design against an approved baseline and changes are visible and managed.
Plain-English interpretation of the requirement
Design inputs are the “what must be true” statements for the thing you are designing. Clause 8.3.3 expects you to decide, in a structured way, what requirements apply and to document them so the design team can produce outputs that meet customer needs, legal obligations, and safety/reliability expectations. 1
A useful test: if you cannot explain how a requirement was identified, where it is recorded, who approved it, and what happens when it changes, you do not yet have controlled design inputs.
Who it applies to (entity and operational context)
This requirement applies to any organization whose scope includes design and development of products and/or services under ISO 9001. 1
Typical operational contexts:
- Manufacturing/design engineering: new product introduction, component redesign, material substitutions.
- Software and digital services: new features, major architecture changes, regulated workflows.
- Professional services with designed deliverables: implementation methodologies, configured solutions, customer-facing reports where requirements must be defined.
- Design done with third parties: outsourced engineering, contract manufacturers, external development partners. You still own determining essential requirements even if a third party performs the work.
Practical scope boundary: if you claim “no design responsibility,” auditors often test whether you truly only build-to-print/build-to-spec, or whether you make design decisions (configurations, substitutions, performance commitments). If you make decisions that affect form/fit/function or service outcomes, treat it as design input territory.
What you actually need to do (step-by-step)
1) Define your design input categories (standardize the checklist)
Create a standard that applies across projects, then allow project-specific additions. Your minimum categories should include: (a) functional requirements, (b) performance requirements, (c) statutory/regulatory requirements, (d) standards/codes of practice, and (e) consequences of failure. 1
Deliverable: Design Input Template (or requirements register) with mandatory fields.
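The template can be as simple as a structured record per requirement. As an illustration only (the field names and category labels below are hypothetical, not prescribed by ISO 9001), a minimal requirements-register record that enforces the mandatory categories might look like this:

```python
from dataclasses import dataclass

# The five mandatory input categories from step 1 above.
MANDATORY_CATEGORIES = {
    "functional",
    "performance",
    "statutory_regulatory",
    "standards_codes",
    "consequences_of_failure",
}

@dataclass
class DesignInput:
    input_id: str          # e.g. "REQ-001"; reused later for traceability
    category: str          # one of MANDATORY_CATEGORIES
    statement: str         # the "what must be true" requirement text
    source: str            # contract, PRD, regulatory register, standard
    approved_by: str = ""  # empty until the input review signs off

def missing_categories(register: list[DesignInput]) -> set[str]:
    """Return mandatory categories with no recorded input yet."""
    present = {r.category for r in register}
    return MANDATORY_CATEGORIES - present
```

A readiness review can then refuse to baseline a project while `missing_categories()` is non-empty, which makes "complete for the product/service type" a checkable condition rather than a judgment call.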
2) Establish sources of inputs and how they are captured
Define accepted sources and capture method. Examples:
- Customer contracts, statements of work, user needs.
- Market requirements documents, product management PRDs.
- Regulatory registers maintained by compliance/legal.
- Industry standards list maintained by engineering/quality.
Control point: Inputs must be recorded in a controlled system (QMS module, requirements tool, controlled spreadsheet with versioning) rather than scattered across email and chat.
3) Assign ownership and required reviewers (RACI)
Set clear accountability:
- Accountable: Design authority (engineering lead/product owner) for completeness and technical correctness.
- Responsible: Project engineer/analyst for drafting and maintaining the input set.
- Consulted: Quality, Regulatory/Compliance, Security/Privacy (if relevant), Operations/Service delivery, Procurement (if third parties involved).
- Informed: Commercial teams, customer stakeholders.
Compliance’s job is to ensure the statutory/regulatory and standards/codes categories are not treated as optional. 1
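The RACI assignment above can be sanity-checked before scheduling the input review. This sketch assumes a simple role-to-letter mapping (role names are illustrative, not mandated by the standard):

```python
from collections import Counter

# Hypothetical RACI assignment for one design project.
raci = {
    "engineering_lead": "A",   # Accountable: design authority
    "project_engineer": "R",   # Responsible: drafts and maintains inputs
    "quality": "C",
    "compliance": "C",
    "commercial": "I",
}

def raci_issues(assignment: dict[str, str]) -> list[str]:
    """Flag common RACI problems before the input review is scheduled."""
    issues = []
    counts = Counter(assignment.values())
    if counts["A"] != 1:
        issues.append("exactly one Accountable role is required")
    if assignment.get("compliance") != "C":
        issues.append("Compliance must be consulted on regulatory inputs")
    return issues
```

The second check operationalizes the point above: statutory/regulatory review is a hard gate, not an optional courtesy.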
4) Perform a structured design input review and approval
Run a “requirements readiness” review before design work moves into detailed design. Minimum review questions auditors love:
- Are inputs complete for the product/service type? 1
- Are legal/regulatory requirements identified and mapped to requirements? 1
- Are applicable standards/codes identified and current? 1
- Are consequences of failure considered (safety, service outage, data loss, customer harm) and reflected as requirements or controls? 1
Control point: Approval must be explicit (signature/approval record in your system), dated, and tied to a version/baseline.
5) Baseline the inputs and control changes
Once approved, freeze a baseline (version). Any change requires:
- Change request (what changed and why).
- Impact assessment (design, verification/validation, downstream docs, suppliers/third parties).
- Re-approval by the right functions.
This is where many teams fail: they “update the doc” but don’t show decision history. Auditors typically accept agile iteration if you still maintain versioning, approvals, and traceability.
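The freeze-then-change-control loop above can be sketched as a small state machine. This is a minimal illustration, not a QMS implementation; the record fields mirror the bullet list (change request, impact assessment, re-approval):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChangeRecord:
    request: str            # what changed and why
    impact_assessment: str  # design, V&V, downstream docs, suppliers
    approved_by: str        # empty string means "not yet re-approved"
    approved_on: date

@dataclass
class InputBaseline:
    version: int = 0
    frozen: bool = False
    history: list[ChangeRecord] = field(default_factory=list)

    def freeze(self) -> None:
        """Approve the current input set as a baseline; design may start."""
        self.version += 1
        self.frozen = True

    def change(self, record: ChangeRecord) -> None:
        """A frozen baseline only moves via an approved change record."""
        if self.frozen and not record.approved_by:
            raise ValueError("change to a frozen baseline needs re-approval")
        self.history.append(record)
        self.version += 1
```

The point of the `history` list is exactly the decision history auditors ask for: every version bump after the freeze carries its own request, impact assessment, and approver.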
6) Maintain traceability from input → output → verification/validation
Clause 8.3.3 is easiest to defend when you can show traceability:
- Input requirement ID
- Related design output (drawing/spec/service procedure)
- Verification method (test/inspection/analysis)
- Validation evidence (meets intended use, where applicable)
You do not need a heavyweight tool, but you do need a consistent mapping method.
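As an example of "consistent mapping without a heavyweight tool", a traceability matrix exported from a controlled spreadsheet can be checked with a few lines. The row shape and IDs below are illustrative assumptions:

```python
# Traceability rows: input requirement -> output, verification, validation.
matrix = [
    {"input_id": "REQ-001", "output": "DWG-100",
     "verification": "TEST-01", "validation": "VAL-01"},
    {"input_id": "REQ-002", "output": "DWG-101",
     "verification": None, "validation": None},
]

def untraced(rows: list[dict]) -> list[str]:
    """Input IDs lacking a design output or a verification method."""
    return [r["input_id"] for r in rows
            if not r.get("output") or not r.get("verification")]
```

Running a check like this before each review gate turns "is the matrix complete?" into an objective answer instead of a page-by-page inspection.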
7) Include third parties in the input control model (if they design/build)
If a third party develops or manufactures parts of the design:
- Flow down relevant input requirements (including standards and regulatory constraints). 1
- Require acknowledgment/acceptance from the third party.
- Control third-party changes through your change control.
A practical approach is to treat third-party design packages as design outputs that must trace back to your inputs and be reviewed against them.
Required evidence and artifacts to retain
Auditors will ask for objective evidence. Keep:
- Design input procedure (how you determine, review, approve, and change inputs). 1
- Design input checklist/template showing required categories. 1
- Project requirements register (versioned), including statutory/regulatory requirements and applicable standards/codes. 1
- Design input review records (attendees, approvals, decisions, open issues).
- Change control records for input changes (impact assessment + approvals).
- Traceability matrix (or equivalent mapping) from inputs to outputs and verification/validation.
- Third-party flowdown evidence (SOW clauses, technical specs provided, acknowledgment, change approvals), if applicable.
Tooling note: Daydream can act as the system of record for controlled requirements and evidence mapping if you need a single place to tie inputs, approvals, and artifacts to each design project.
Common exam/audit questions and hangups
Expect questions like:
- “Show me the design inputs for Product X and who approved them.” 1
- “Where are statutory and regulatory requirements captured for this design?” 1
- “Which standards/codes apply, and how do you ensure they’re current?” 1
- “How did you consider consequences of failure, and how did that affect requirements?” 1
- “Show a change to a design input and evidence of impact assessment and re-approval.”
Common hangups:
- Requirements live in multiple places with no single baseline.
- Teams confuse design outputs (drawings, code) with inputs (requirements).
- Regulatory requirements are referenced vaguely (“must comply with laws”) instead of being determined and recorded.
Frequent implementation mistakes and how to avoid them
- Mistake: Treating inputs as a one-time document.
  Avoid it: Make input management part of change control. Require updates to carry a change record and approval.
- Mistake: Missing statutory/regulatory requirements until late-stage reviews.
  Avoid it: Compliance maintains a “regulatory requirements intake” step for each project and signs off on the input baseline. 1
- Mistake: No stated consequences of failure.
  Avoid it: Add a mandatory “failure consequence” section that forces teams to describe credible failure modes and their impact, then translate that into requirements (alarms, tolerances, fallbacks, training, service constraints). 1
- Mistake: Third-party design changes bypass your controls.
  Avoid it: Contractually require notification/approval for any requirement-impacting change; route changes through your internal impact assessment.
Enforcement context and risk implications
No public enforcement cases were provided for this requirement. Operational risk still matters: weak design inputs drive rework, nonconforming outputs, and release of products/services that fail to meet customer, legal, or safety expectations. From a certification perspective, auditors often treat uncontrolled or incomplete design inputs as a systemic design-control failure because later controls (testing, inspection) cannot prove you built the right thing if “right” was never defined.
Practical execution plan (30/60/90 days)
First 30 days (stabilize and standardize)
- Publish a one-page procedure for determining and approving design inputs aligned to Clause 8.3.3 categories. 1
- Roll out a standard design input template/requirements register with mandatory fields for functional/performance, statutory/regulatory, standards/codes, and consequences of failure. 1
- Pick one active project and pilot an input baseline + approval record.
By 60 days (make it auditable)
- Implement a recurring design input review gate (agenda + approval record).
- Add change control for input updates (impact assessment + re-approval).
- Build a lightweight traceability approach (matrix or linked records) for the pilot project.
- Ensure third-party flowdown language exists in templates (SOW/spec) where needed.
By 90 days (scale and operationalize)
- Expand to all design projects in scope; require baselining before detailed design.
- Train engineering/product, quality, and compliance reviewers on what “good inputs” look like.
- Run an internal audit-style spot check: pick projects and verify inputs are complete, approved, current, and traceable.
- If evidence is scattered, consolidate in a single system of record (for example, Daydream) so audits do not become document hunts.
Frequently Asked Questions
Do we need a specific software tool to meet the design and development inputs requirement?
No. ISO 9001 requires that you determine and control essential requirements, but it does not mandate tooling. Auditors care about completeness, approvals, version control, and traceability. 1
What counts as “statutory and regulatory requirements” in design inputs?
They are the legal or regulatory obligations that apply to the product/service being designed and must be explicitly determined as inputs. If you only write “comply with laws,” you usually cannot prove you determined the requirements. 1
We use agile development. How do we baseline design inputs?
Baseline per increment or release: record the input set for the scope you intend to deliver, approve it, and control changes through your backlog/change process with approval records. The key is versioned, reviewable requirements. 1
What does “consequences of failure” mean in practice?
It means you identify credible harm or impact if the design fails and reflect that in requirements or controls. Examples include safety hazards, loss of service, data loss, or inability to meet a critical performance claim. 1
If a third party designs part of our product, can they own the design inputs?
They can draft inputs, but your organization must determine the essential requirements and control them, including flowdown and change approval. Keep evidence of what you provided and what they accepted. 1
What’s the minimum evidence an auditor will expect to see?
A defined method to determine inputs, a documented set of inputs for a project (including required categories), and records of review/approval and controlled changes. Traceability evidence strongly reduces audit friction. 1
Footnotes
1. ISO 9001:2015 Quality management systems — Requirements
Operationalize this requirement
Map requirement text to controls, owners, evidence, and review workflows inside Daydream.
See Daydream