Monitoring, measurement, analysis and evaluation
ISO 9001:2015 Clause 9.1 requires you to define what you will monitor and measure in your quality management system (QMS), how you will do it, when it happens, and how results are analyzed and evaluated. Operationalize it by building a controlled “measurement system” (KPIs, methods, cadence, ownership, and decision rules) tied to risks, processes, and customer outcomes. 1
Key takeaways:
- You must specify what you monitor/measure and the methods; auditors expect defined criteria, cadence, and evaluation practices. 1
- Evidence matters more than dashboards: keep method definitions, records, analysis outputs, and management actions linked to results. 1
- The fastest path is a single measurement register that maps process KPIs, data sources, and evaluation/decision triggers to corrective actions.
Clause 9.1 is where “we track quality” becomes auditable system design. A mature organization can show that monitoring and measurement are not ad hoc or personality-driven. You can explain what you measure, why those measures matter to product/service conformity and customer satisfaction, and how you decide whether performance is acceptable. 1
For a Compliance Officer, CCO, or GRC lead, the operational challenge is governance: consistent methods, clear ownership, data integrity, and an evaluation workflow that results in decisions. The common failure mode is a collection of reports with no defined method, no documented criteria, and no traceable link to actions when performance drifts.
This page gives requirement-level implementation guidance you can execute quickly: define the measurement scope, standardize methods, establish analysis and evaluation routines, and retain the artifacts auditors ask for. It also calls out practical hangups (sampling, subjective measures, tool changes, and “who signs off”) that create nonconformities.
Regulatory text
Excerpt (Clause 9.1): “The organization shall determine what needs to be monitored and measured and the methods for monitoring and measurement.” 1
Operator interpretation: You must deliberately design your monitoring and measurement system. That means:
- Identify the processes, outcomes, and controls that require monitoring/measurement.
- Define the method(s): how data is collected, calculated, validated, and reported.
- Establish when monitoring/measurement occurs and when results are analyzed and evaluated, then show decisions or actions based on that evaluation. 1
Auditors generally test this by sampling: they pick a process (for example, nonconforming output control, on-time delivery, complaint handling) and ask you to prove that measures are defined, performed as planned, analyzed, and acted upon.
Plain-English requirement
You need a documented, repeatable way to answer four questions for each important process or outcome:
- What are we measuring (or monitoring)?
- How do we measure it (definition, formula, sampling, tools, data source, validation)?
- When do we measure and review it (cadence, triggers, reporting path)?
- What do we do with the results (evaluation criteria, thresholds, decision rights, corrective actions)? 1
Who it applies to
Entity scope: Any organization operating an ISO 9001:2015 QMS, including organizations where quality responsibilities are distributed across operations, engineering, service delivery, and third-party management. 1
Operational contexts where Clause 9.1 shows up in audits:
- Manufacturing and test/inspection environments (yield, scrap, rework, calibration impacts).
- Service organizations (SLA performance, incident trends, rework, complaints).
- Regulated or safety-adjacent operations where measurement methods must be controlled (traceability, validation of tools, data governance).
- Third-party-dependent processes (outsourced production steps, cloud services, logistics providers). Clause 9.1 still applies: you remain accountable for defining how performance is monitored and evaluated, even if the data comes from a third party.
What you actually need to do (step-by-step)
1) Build a “Monitoring & Measurement Register” (one source of truth)
Create a controlled register (spreadsheet, GRC system, or QMS tool) with one line per metric/monitoring activity. Minimum columns that auditors recognize immediately:
- Process / area (linked to process map)
- Metric name and type (leading/lagging; monitoring vs measurement)
- Definition (what is included/excluded)
- Method (formula, sampling approach, tools/systems, data source)
- Frequency and timing (collection and review)
- Acceptance criteria / target / control limits (as applicable)
- Owner (data owner + process owner)
- Reporting format (dashboard, report, meeting)
- Evaluation rule (what constitutes “needs action”)
- Required records (what is retained, where, retention rule)
Practical tip: keep definitions stable. If you change a metric formula, version it and document the change reason so trends remain interpretable.
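To make these columns concrete, here is a minimal sketch of one register row as a Python dataclass. The field names, the example metric, and the versioning convention are illustrative assumptions, not a prescribed format; a controlled spreadsheet with the same columns works just as well.

```python
# Illustrative sketch of one Monitoring & Measurement Register row.
# All field names and example values are assumptions, not ISO-mandated fields.
from dataclasses import dataclass, field


@dataclass
class RegisterEntry:
    process: str               # linked to the process map
    metric_name: str
    metric_type: str           # leading/lagging; monitoring vs measurement
    definition: str            # what is included/excluded
    method: str                # formula, sampling approach, tools, data source
    collection_frequency: str  # when raw data is captured
    review_frequency: str      # when results are analyzed and evaluated
    acceptance_criteria: str   # target / control limits, as applicable
    data_owner: str            # accountable for data integrity
    process_owner: str         # accountable for action
    reporting_format: str      # dashboard, report, meeting
    evaluation_rule: str       # what constitutes "needs action"
    required_records: str      # what is retained, where, retention rule
    version: str = "1.0"       # bump, and log why, whenever the formula changes
    change_log: list[str] = field(default_factory=list)


otd = RegisterEntry(
    process="Order fulfillment",
    metric_name="On-time delivery rate",
    metric_type="lagging measurement",
    definition="Shipped orders delivered by promised date; excludes customer holds",
    method="on_time_shipments / total_shipments from the ERP shipment table",
    collection_frequency="weekly",
    review_frequency="monthly",
    acceptance_criteria=">= 95%",
    data_owner="Ops analytics",
    process_owner="Fulfillment manager",
    reporting_format="Monthly ops review deck",
    evaluation_rule="Two consecutive months below target triggers a CAPA",
    required_records="Monthly report and review minutes, retained 3 years",
)
```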
2) Start from risks and critical-to-quality outcomes
Clause 9.1 does not say “measure everything.” It expects you to measure what matters. A defensible approach is to map measures to:
- Customer requirements (delivery, defect rate, responsiveness).
- Product/service conformity points (inspection, validation, QA gates).
- Process performance and effectiveness (cycle time, first-pass yield, ticket reopens).
- Supplier/third party performance where it affects conformity (incoming quality, SLA adherence, defect escapes). 1
Output: a short rationale per measure (“why this measure exists”) directly in the register or in a supporting document.
3) Define methods with enough specificity to be repeatable
For each metric, document method details that prevent “same metric, different math” across teams:
- Data source control: system of record, fields used, extraction logic, and who can edit.
- Calculation rules: formula, rounding, inclusion/exclusion, handling missing data.
- Sampling rules: if sampling is used, define selection method and when resampling is required.
- Tool control: measurement equipment/tool version, calibration/verification linkage where relevant.
- Data quality checks: completeness, duplicate handling, exception review.
If a third party provides the data, document how you validate it (spot checks, reconciliation, attestation, contract reporting terms).
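A method statement can also be encoded as code so that "same metric, different math" becomes impossible. The sketch below assumes a hypothetical on-time delivery KPI with invented field names (ship_date, promised_date, hold_requested); the point is that inclusion/exclusion rules, missing-data handling, and rounding are all explicit and repeatable.

```python
# Hedged sketch of a KPI method encoded as code. Field names and the
# exclusion rules are assumptions standing in for your own method statement.
from datetime import date
from typing import Optional


def on_time_delivery_rate(orders: list[dict]) -> Optional[float]:
    """Percent of in-scope shipped orders delivered by the promised date.

    Per (hypothetical) method statement v1.0:
      - orders on customer-requested hold are excluded by definition
      - records missing either date are counted as exceptions for review
      - result is rounded to one decimal place
    """
    exceptions = 0
    in_scope = []
    for order in orders:
        if order.get("hold_requested"):
            continue  # excluded by definition, not a data-quality exception
        ship, promised = order.get("ship_date"), order.get("promised_date")
        if ship is None or promised is None:
            exceptions += 1  # missing data goes to exception review, not the rate
            continue
        in_scope.append(ship <= promised)

    if exceptions:
        print(f"{exceptions} record(s) excluded for missing dates; review required")
    if not in_scope:
        return None  # no computable result this period; record that fact explicitly
    return round(100 * sum(in_scope) / len(in_scope), 1)


orders = [
    {"ship_date": date(2024, 5, 2), "promised_date": date(2024, 5, 3)},
    {"ship_date": date(2024, 5, 9), "promised_date": date(2024, 5, 6)},
    {"ship_date": None, "promised_date": date(2024, 5, 7)},  # exception
]
print(on_time_delivery_rate(orders))  # 50.0
```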
4) Set cadence: collection, analysis, and evaluation are different events
Audits often find that teams “collect” data but do not consistently analyze and evaluate it. Make cadence explicit:
- Collection cadence: when raw data is captured.
- Analysis cadence: who trends, segments, or investigates (Pareto, run charts, root cause triggers).
- Evaluation cadence: who decides acceptability and actions (operational review, QMS review, management review inputs). 1
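One way to keep the three events distinct is to record cadence as data rather than habit. A sketch, with illustrative owners, forums, and frequencies:

```python
# Cadence made explicit: collection, analysis, and evaluation are three
# separately scheduled, separately owned events. All values are illustrative.
CADENCE = {
    "on_time_delivery_rate": {
        "collection": {"frequency": "weekly", "owner": "Ops analytics"},
        "analysis": {
            "frequency": "monthly",
            "owner": "Quality engineer",
            "activities": ["run chart", "Pareto by miss reason"],
        },
        "evaluation": {
            "frequency": "monthly",
            "owner": "Fulfillment manager",
            "forum": "Ops review",
            "feeds": "management review inputs",
        },
    },
}
```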
5) Establish decision rules and escalation paths
Define what happens when results are out of tolerance or concerning:
- Trigger thresholds (breach of target, negative trend, repeat defect, customer complaint spike).
- Required actions (containment, corrective action, change request, training, supplier corrective action request).
- Decision rights (process owner vs quality vs leadership).
- Time-to-response expectations (your internal requirement, documented).
This is where many nonconformities originate: teams can show the chart, but cannot show a controlled response.
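Decision rules are easiest to defend when written as unambiguous logic rather than judgment calls. The sketch below uses invented thresholds, trend rules, and action wording; encode your own triggers and retain each returned action as a record.

```python
# Hedged sketch of decision rules as executable logic. Thresholds, the
# trend rule, and action texts are assumptions; substitute your own.


def evaluate(metric: str, history: list[float], target: float) -> list[str]:
    """Return required actions for the latest result; an empty list means accept."""
    actions = []
    latest = history[-1]
    if latest < target:
        actions.append(f"{metric}: below target ({latest} < {target}); investigate")
    # Negative-trend rule: three consecutive declining periods escalates
    if len(history) >= 3 and history[-3] > history[-2] > history[-1]:
        actions.append(f"{metric}: three-period decline; escalate to process owner")
    # Repeat-breach rule: two consecutive misses requires corrective action
    if len(history) >= 2 and all(v < target for v in history[-2:]):
        actions.append(f"{metric}: repeat miss; open CAPA within 5 business days")
    return actions


print(evaluate("on_time_delivery_rate", [96.0, 94.5, 93.0], target=95.0))
```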
6) Connect results to CAPA, changes, and management review inputs
Your evaluation must lead somewhere. Build traceability:
- Metric result → investigation record (if triggered) → corrective action → verification of effectiveness.
- Significant performance topics → management review inputs and outputs.
If you run third-party oversight, connect third-party performance monitoring to supplier management actions (scorecards, QBR actions, contract remedies). The standard’s text is brief, but the expectation is an end-to-end closed loop. 1
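Traceability is simplest when every record carries a reference to the record that triggered it, so an auditor can walk the chain from metric result to effectiveness verification. The IDs and fields below are hypothetical:

```python
# Minimal closed-loop sketch: metric result -> investigation -> corrective
# action -> effectiveness check, each linked by a triggered_by reference.
trace = [
    {"id": "MR-2024-06", "type": "metric_result",
     "metric": "on_time_delivery_rate", "value": 93.0,
     "evaluation": "repeat miss; CAPA required"},
    {"id": "INV-041", "type": "investigation", "triggered_by": "MR-2024-06",
     "finding": "carrier cutover delayed label generation"},
    {"id": "CAPA-112", "type": "corrective_action", "triggered_by": "INV-041",
     "action": "update carrier onboarding checklist; retrain dispatch"},
    {"id": "EFF-019", "type": "effectiveness_check", "triggered_by": "CAPA-112",
     "evidence": "results back above target for two consecutive months"},
]

# Walk the chain backwards from the effectiveness check to its origin:
by_id = {record["id"]: record for record in trace}
record = by_id["EFF-019"]
while record:
    print(record["id"], "<-", record.get("triggered_by", "origin"))
    record = by_id.get(record.get("triggered_by"))
```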
7) Operationalize with tooling (where Daydream fits)
If your measurement register, evidence, CAPA, and third party performance data are split across spreadsheets and inboxes, audits become a retrieval exercise. Daydream can act as the control plane to:
- Maintain a controlled measurement register (owners, versions, review workflows).
- Attach evidence (reports, extracts, meeting minutes) to each metric and review.
- Track follow-ups and CAPA triggered by performance evaluations.
- Consolidate third party monitoring artifacts alongside internal measures for end-to-end traceability.
Keep the principle: the tool does not satisfy the requirement; the defined methods, execution, and retained records do.
Required evidence and artifacts to retain
Auditors typically look for consistent, dated records. Retain:
- Monitoring & Measurement Register (controlled, versioned)
- Metric definitions and method statements (including formula/sampling/tool)
- Raw data extracts or system reports (or references to immutable system reports)
- Analysis outputs (trend reports, Pareto charts, root cause notes)
- Evaluation records (meeting minutes, sign-offs, review logs, decisions)
- Action records (CAPA, supplier corrective actions, change controls)
- Effectiveness verification records (post-action performance checks)
- Evidence of planned frequency (calendar invites, standing agenda items, automated report schedules)
Retention duration is your organizational rule; what matters is consistency and retrievability.
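A low-effort way to keep these artifacts retrievable is a per-metric manifest that points at each retained item. A sketch with illustrative paths and retention rules:

```python
# Per-metric "audit packet" manifest. Paths, system names, and the
# retention rule are illustrative assumptions, not required locations.
AUDIT_PACKET = {
    "metric": "on_time_delivery_rate",
    "register_entry": "register row 14 (definition v1.2)",
    "method_statement": "qms/methods/otd-v1.2.pdf",
    "raw_data": "ERP shipment report, run monthly, immutable in system of record",
    "analysis_outputs": ["trends/otd-2024-06.pdf"],
    "evaluation_records": ["minutes/ops-review-2024-06.pdf"],
    "action_records": ["CAPA-112"],
    "effectiveness_checks": ["CAPA-112 verification record"],
    "schedule_evidence": "standing agenda item, monthly ops review",
    "retention": "3 years, per records control procedure",
}
```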
Common exam/audit questions and hangups
Expect questions like:
- “Show me how you decided what to monitor for this process.”
- “Where is the method defined for this KPI? Who owns it?”
- “How do you know data from this system is accurate and complete?”
- “When results are off-target, what happens? Show an example.”
- “How do third party performance measures feed into your QMS actions?”
- “How do you ensure measures are reviewed on schedule?”
Hangups:
- Metrics exist but are not tied to process objectives or customer requirements.
- Definitions differ across departments (same KPI name, different calculation).
- Data quality is assumed rather than checked.
- Actions are discussed verbally but not recorded, or actions are recorded but not linked back to metric evaluation.
Frequent implementation mistakes (and how to avoid them)
- Dashboard-first, method-second. Fix: require a method statement before publishing a KPI.
- No evaluation criteria. Fix: define what “good/acceptable” means and who decides, even for qualitative monitoring (for example, complaint themes).
- Inconsistent cadence. Fix: publish a review calendar and treat missed reviews as a QMS issue, with documented rationale and a reschedule.
- Metrics without ownership. Fix: assign both a data owner (integrity) and a process owner (action).
- Third-party blind spots. Fix: incorporate third-party SLAs, quality metrics, and incident trends into the same register and evaluation rhythm.
Enforcement context and risk implications
ISO 9001 is a certifiable standard, not a regulation, so “enforcement” typically comes through certification bodies and customer oversight rather than a regulator. The operational risk is concrete: weak measurement methods create blind spots that let nonconformities, recurring defects, SLA misses, and customer dissatisfaction persist because signals are noisy or decisions are undocumented. Clause 9.1 is also a credibility test: if you cannot explain your measurement system, auditors infer the QMS is not controlled. 1
Practical 30/60/90-day execution plan
First 30 days (Immediate stabilization)
- Inventory existing KPIs/monitoring activities across functions.
- Draft the Monitoring & Measurement Register with owners and current methods (even if incomplete).
- Identify top gaps: missing definitions, missing cadence, missing evaluation criteria.
- Pick a small set of critical processes and fully document methods end-to-end.
By 60 days (Standardize and close the loop)
- Finalize method statements for priority metrics, including data quality checks.
- Implement a consistent review rhythm (standing agendas, review logs).
- Define triggers and escalation paths; map them to CAPA/change control workflows.
- Run an internal sample audit: choose a process and test traceability from metric → evaluation → action.
By 90 days (Institutionalize and prepare for audits)
- Extend coverage to remaining processes in scope.
- Version-control metric definitions and establish a change process for metric updates.
- Produce a repeatable “audit packet” template per metric/process (definition, last reviews, last actions, effectiveness check).
- Integrate third-party performance monitoring where it affects quality outcomes, and document how third-party-provided data is validated.
Frequently Asked Questions
Do we need documented information for every metric method?
Clause 9.1 requires you to determine what is monitored/measured and the methods. In practice, keeping method definitions documented and controlled is the simplest way to prove consistency during an audit. 1
Can we rely on a BI dashboard as evidence?
A dashboard helps, but auditors usually ask for the method behind the numbers and proof of evaluation and action. Keep the definition, data source logic, review records, and resulting actions linked to the dashboard outputs. 1
How do we handle qualitative monitoring (for example, complaint themes)?
Define the method anyway: classification rules, who codes the data, how themes are reviewed, and what triggers action. Document evaluation outcomes the same way you would for numeric KPIs. 1
What if a third party provides the metric data (SLA reports, defect rates)?
You can use third party data, but you still need to define the method and how you validate the data’s accuracy or completeness. Keep records of your review and any actions taken with the third party when performance is unacceptable. 1
How do we show “analysis and evaluation” versus simple reporting?
Analysis shows interpretation (trends, segmentation, root cause triggers). Evaluation shows decisions (acceptability, actions, escalation). Keep meeting minutes, decision logs, and CAPA/change records tied to the metric review. 1
What’s the fastest way to get audit-ready?
Build a controlled measurement register, then fully implement it for a small set of high-impact processes first. Auditors accept phased maturity if what you claim is implemented, consistent, and evidenced. 1
Footnotes
1. ISO 9001:2015, Quality management systems — Requirements (Clause 9.1).
Authoritative Sources
- ISO 9001:2015, Quality management systems — Requirements (International Organization for Standardization)
Operationalize this requirement
Map requirement text to controls, owners, evidence, and review workflows inside Daydream.
See Daydream