Automated decision making

ISO/IEC 27701 Clause 7.3.10 requires you, as a PII controller, to identify and fulfill your obligations to individuals when you use automated decision-making or profiling, including notice and a path to human intervention and challenge where required. Operationalize it by inventorying automated decisions, mapping legal/contract duties per use case, and implementing an intake-to-resolution process with evidence. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Key takeaways:

  • Build (and maintain) an inventory of automated decisions and profiling that affect individuals.
  • For each use case, document the obligations and implement: notice, escalation to a human, and a challenge/appeal workflow.
  • Keep audit-ready artifacts: decision logs, DPIA/PIA outputs where used, scripts for notices, and case management records.

“Automated decision making” is not a single control; it is a recurring operational obligation that cuts across product, privacy, security, and customer operations. Under ISO/IEC 27701 Clause 7.3.10, the core expectation is simple: you must know where automated decision-making and profiling exist in your services, and you must address the obligations you owe the PII principal (the individual) for those activities. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

For a CCO or GRC lead, the fastest path is to treat this as a “use-case-by-use-case” requirement. You are not trying to prove that algorithms are perfect. You are trying to prove governance: (1) you can identify automated decisions and profiling, (2) you can state what you owe individuals in each context (law, contract, policy, commitments), and (3) you can execute those obligations consistently with evidence. This page gives you an operator’s checklist, the artifacts auditors ask for, and a practical execution plan that does not depend on rewriting your entire SDLC.

Regulatory text

Requirement (excerpt): “The organization shall identify and address obligations, including legal obligations, to the PII principal regarding automated decision making and profiling.” (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

What the operator must do

  1. Identify where automated decision-making or profiling is used (including models, rules engines, scoring, ranking, and segmentation that drives outcomes for an individual).
  2. Determine obligations you owe the individual for each use case (legal obligations, plus contractual and policy commitments).
  3. Address those obligations in operational processes, product UX, and customer support, including (where applicable) human intervention and a method to challenge automated outcomes. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Plain-English interpretation (what this means in practice)

If your company uses systems that make decisions about people (or meaningfully influence those decisions), you must be able to answer:

  • Where are we doing this?
  • What do we promise or owe the person?
  • How does a person get an explanation, a human review, or a chance to contest the decision when required? (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

In practice, “address obligations” usually becomes a combination of:

  • Clear disclosure in privacy notices and in-product notices where appropriate
  • Internal rules on when automation is allowed vs. when a human must decide
  • A documented appeal or escalation path
  • Controls over the data and logic used to reduce inappropriate outcomes
  • Recordkeeping that proves you did what your policy says you do

Who it applies to

Entity scope

  • PII controllers (the organization determining purposes and means of processing). (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Operational contexts where this shows up

You likely have automated decision-making or profiling if you do any of the following:

  • Approve/deny access, eligibility, pricing, offers, or service levels using scoring or rules
  • Rank, recommend, or personalize content in ways that affect opportunities or outcomes
  • Detect fraud/abuse and auto-block accounts or transactions
  • Screen applicants, users, or customers and route to “manual review” only sometimes
  • Use third-party models or SaaS decision engines to do any of the above (still your controller obligation if you decide to use it)

What you actually need to do (step-by-step)

Step 1: Create an “Automated Decisions & Profiling Inventory”

Build a register that is separate from, but linked to, your data map/ROPA. Minimum fields that work in audits (a minimal record sketch follows the list):

  • Use case name and owner (product + business)
  • Population affected (customers, end users, applicants)
  • Outcome type (approve/deny, rank, price, restrict, investigate)
  • Inputs used (PII categories, signals, third-party data)
  • Whether profiling is involved (segmentation/scoring)
  • Whether there is meaningful impact on the individual
  • Where humans are in the loop today (before decision, after decision, exception only, never)
  • Systems and third parties involved (model provider, data broker, platform)

Practical tip: start with the “top-of-funnel” and “stop-the-world” decisions: account creation, eligibility, payments, fraud blocks, and access revocations. Those generate the most complaints and escalations.

Step 2: For each use case, map obligations to controls

ISO/IEC 27701 asks you to identify and address obligations, including legal obligations, to the PII principal. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Create a one-page “obligations matrix” per use case (a sketch of one entry follows the list):

  • Obligation source: law (jurisdiction-specific), contract, privacy notice, product terms, internal policy
  • Obligation statement: what you must provide/do (notice, access, explanation, human review, appeal)
  • Operational control: what process/system fulfills it (support workflow, in-product request form, manual review queue)
  • SLA target: your internal target for responsiveness (keep it as internal guidance; don’t publish timelines you can’t meet)
  • Evidence produced: case log, decision log, reviewer notes, notice version
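
The sketch below shows one obligations-matrix entry as a plain record; the keys and example values are assumptions for illustration, not wording taken from the standard.

```python
# Hypothetical structure for one line of a per-use-case obligations matrix.
# Keys and example values are illustrative, not taken from ISO/IEC 27701.
obligation_entry = {
    "use_case": "Fraud auto-block on checkout",          # links back to the inventory
    "obligation_source": "privacy notice",               # law, contract, notice, terms, internal policy
    "obligation_statement": "Provide a human review path on request",
    "operational_control": "Support queue 'ADM-review' staffed by trained reviewers",
    "sla_target_days": 10,                                # internal guidance only; not published
    "evidence_produced": ["case log", "reviewer notes", "notice version"],
}
```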

If you operate across multiple jurisdictions, don’t guess. Route the legal interpretation to counsel, then codify the output into the matrix so operations can execute consistently.

Step 3: Implement “human intervention + challenge” as a real workflow

Your goal is a repeatable, auditable process that handles:

  • Intake: how individuals submit requests or complaints tied to automated decisions
  • Triage: verify identity (as required by your program), locate the decision, classify the request type
  • Review: assign a qualified human reviewer with authority to override
  • Outcome: provide the result, and where appropriate, explain what changed or why not
  • Record: retain artifacts so you can show you followed the workflow (ISO language expects you to “address obligations,” which auditors read as “prove it happened”) (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Make it operationally credible (a case-record sketch follows this list):

  • Define who counts as a “human” reviewer (role-based, trained, independent enough to challenge the system output).
  • Define what the reviewer must check (inputs, decision rationale, exceptions, data quality flags).
  • Define permissible overrides and required approvals.
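
Below is a hedged sketch of how a challenge case could be recorded so each stage of the intake-to-record workflow leaves a timestamped trail; the class, stage names, and fields are assumptions about a generic ticketing export, not a required design.

```python
# Illustrative case record for a human-intervention / challenge request.
# Stage names mirror the workflow above (intake, triage, review, outcome, record);
# everything else is an assumption about how a ticketing system might store it.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional, Tuple

STAGES = ["intake", "triage", "review", "outcome", "record"]

@dataclass
class ChallengeCase:
    case_id: str
    use_case: str                      # inventory entry the decision came from
    requester_id: str                  # individual's identifier after verification
    decision_ref: str                  # pointer to the decision/event log entry
    stage: str = "intake"
    reviewer: Optional[str] = None     # qualified human with authority to override
    override_applied: bool = False
    notes: List[str] = field(default_factory=list)
    history: List[Tuple[str, str, str, str]] = field(default_factory=list)

    def advance(self, new_stage: str, note: str = "") -> None:
        """Move to the next stage and keep a timestamped trail for audit evidence."""
        assert new_stage in STAGES, f"unknown stage: {new_stage}"
        self.history.append((datetime.now(timezone.utc).isoformat(), self.stage, new_stage, note))
        self.stage = new_stage
        if note:
            self.notes.append(note)
```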

Step 4: Add product and notice touchpoints

Common implementation pattern:

  • Privacy notice describes categories of automated decision-making/profiling at a level consistent with your commitments.
  • Contextual notice appears where the decision happens (e.g., “Your account was restricted due to automated fraud signals. You can request a review.”).
  • Support macros and knowledge base articles mirror the same language, so you do not create inconsistent statements across channels.

Version control matters. Store notice versions and the effective dates tied to each use case.
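
A minimal sketch of that linkage, assuming you track notice versions as simple records tied to use cases; the field names and example values are illustrative.

```python
# Hypothetical record tying a notice version and effective date to a use case.
notice_versions = [
    {
        "use_case": "Fraud auto-block on checkout",
        "notice_type": "contextual",               # privacy notice, contextual, support macro
        "version": "v3",
        "effective_date": "2024-01-15",            # illustrative date
        "text_ref": "kb/fraud-block-notice-v3",    # where the approved wording lives
    },
]
```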

Step 5: Control third parties that power the decision

If a third party provides the model, the signals, or the decision platform, include the following in third-party due diligence (a mapping sketch follows the list):

  • Whether the third party performs automated decision-making on your behalf
  • What documentation they can provide (model cards, performance testing summaries, data sources)
  • How you will support human intervention and challenges if the third party is “behind the curtain”
  • Contract terms needed for cooperation with requests and investigations
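
One way to make that diligence actionable is a small record per third party tied back to the inventory; the vendor name, fields, and artifact labels below are assumptions for the sketch.

```python
# Illustrative mapping of third-party services to automated decision use cases.
# Vendor names, fields, and artifact labels are assumptions for the sketch.
third_party_map = {
    "example-decision-engine": {
        "use_cases": ["Fraud auto-block on checkout"],
        "performs_adm_on_our_behalf": True,
        "artifacts_on_file": ["model card", "performance testing summary", "data source list"],
        "supports_human_review": True,             # can they cooperate with challenges?
        "contract_clauses": ["cooperation with requests", "investigation support"],
    },
}
```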

Where Daydream fits naturally: Daydream can centralize third-party due diligence evidence and map third-party services to the automated decision inventory, so you can quickly prove which third parties influence automated outcomes and what commitments exist in contracts and assessments.

Required evidence and artifacts to retain

Auditors and certification assessors generally want “proof of identification” plus “proof of operations.” Keep:

  • Automated Decisions & Profiling Inventory (current, owner-assigned, reviewed on a routine cadence you can defend)
  • Use-case obligations matrices (including legal/contract/policy sources)
  • Documented workflow for human intervention and challenge (SOP/runbook)
  • Training records for reviewers and support staff who handle these cases
  • Sample case records: intake, identity checks where applicable, reviewer notes, final disposition, communications sent
  • Decision logs or event logs that link an individual outcome to the system/process that produced it
  • Third-party artifacts: due diligence records, DPAs/contract clauses, service descriptions tied to the decision use case (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Common exam/audit questions and hangups

Expect questions like:

  • “Show me your inventory of automated decisions and profiling.”
  • “How do you determine which obligations apply to each use case?”
  • “Demonstrate a recent case where an individual challenged a decision. What happened?”
  • “Where is the human intervention step documented, and who is authorized to override?”
  • “Which third parties influence these decisions, and what oversight do you have?”

Hangups that stall audits:

  • Inventory exists, but owners cannot explain the logic or the impact.
  • Support has an “appeal” path, but it is informal and not recorded consistently.
  • Product disclosures don’t match actual operations (or don’t exist).

Frequent implementation mistakes (and how to avoid them)

  1. Mistake: treating “automated decision making” as only ML.
    Fix: include rules engines, scoring, ranking, and threshold-based auto-actions in scope.

  2. Mistake: no linkage between the decision and a person’s request.
    Fix: ensure logs can locate the decision event by user identifier, timestamp, and system (see the lookup sketch after this list).

  3. Mistake: “human review” exists in theory but not in authority.
    Fix: give reviewers explicit override rights (or a clear escalation path) and document it.

  4. Mistake: third-party decisioning is a blind spot.
    Fix: require transparency artifacts and cooperation terms in third-party contracts; track those relationships in your inventory and TP due diligence system (Daydream helps here).
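
To make the fix for mistake 2 concrete, here is a hedged sketch of a decision-event record and lookup keyed by user identifier, timestamp, and system; the schema, helper function, and sample values are assumptions, and in practice this would be a query against your event store or warehouse.

```python
# Illustrative decision-event log and lookup, assuming an in-memory list;
# in practice this would be a query against an event store or warehouse.
from typing import List, Optional

decision_log: List[dict] = [
    {
        "user_id": "u-1024",
        "timestamp": "2024-03-02T14:07:11Z",
        "system": "fraud-engine",
        "use_case": "Fraud auto-block on checkout",
        "outcome": "account_restricted",
        "rationale_ref": "rules/v12#r-87",      # pointer to the logic version that fired
    },
]

def find_decision(user_id: str, system: str, around: Optional[str] = None) -> List[dict]:
    """Locate decision events for one person in one system, optionally near a timestamp."""
    hits = [e for e in decision_log if e["user_id"] == user_id and e["system"] == system]
    if around:
        hits = [e for e in hits if e["timestamp"][:10] == around[:10]]  # same-day match, deliberately crude
    return hits

# Example: trace the event behind a customer's challenge.
print(find_decision("u-1024", "fraud-engine", around="2024-03-02T00:00:00Z"))
```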

Risk implications (what goes wrong if you don’t do this)

Operationally, failures show up as:

  • Complaint spikes and escalations (especially around fraud blocks, eligibility, and access restrictions)
  • Inconsistent customer messaging (“support says one thing, product does another”)
  • Inability to respond to privacy rights requests tied to automated decisions
  • Certification or assurance friction because you can’t prove you “identified and addressed obligations” (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Practical 30/60/90-day execution plan

Use phases instead of date promises. The goal is speed with control.

First phase (Immediate)

  • Assign an executive owner (privacy/GRC) and operational owners (product + support).
  • Stand up the Automated Decisions & Profiling Inventory with a minimum viable template.
  • Identify the highest-impact use cases (eligibility/deny, fraud blocks, access revocation, pricing/offers).

Second phase (Near-term)

  • Build obligations matrices for the top use cases; get legal sign-off where needed.
  • Implement a documented human intervention + challenge workflow in your ticketing system.
  • Update notices/macros to match the workflow and route.
  • Bring third parties into scope: map which services and data feeds influence each use case; collect due diligence artifacts in a system of record (Daydream can serve as that system for third-party documentation and mapping).

Third phase (Ongoing)

  • Expand the inventory to remaining business lines and products.
  • Add routine control testing: sample cases, reviewer quality checks, and drift checks for decision logic changes.
  • Formalize change management: any new automated decision use case must add an inventory entry and obligations matrix before launch (a pre-launch gate sketch follows). (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)
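
A minimal sketch of that gate as a pre-launch check, assuming the inventory and obligations matrices are addressable by use-case name; the function and field names are hypothetical.

```python
# Illustrative pre-launch gate: block a new automated decision use case
# unless its governance artifacts exist. Names are assumptions for the sketch.
def ready_to_launch(use_case: str, inventory: dict, obligations: dict) -> bool:
    """Return True only when the use case has both an inventory entry and an obligations matrix."""
    missing = []
    if use_case not in inventory:
        missing.append("inventory entry")
    if use_case not in obligations:
        missing.append("obligations matrix")
    if missing:
        print(f"BLOCKED: {use_case} is missing: {', '.join(missing)}")
        return False
    return True

# Example: a launch review for a hypothetical new use case.
ready_to_launch("Applicant pre-screening score", inventory={}, obligations={})
```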

Frequently Asked Questions

What counts as “automated decision making” for ISO/IEC 27701?

Treat any system-driven decision or ranking that affects an individual as in scope, including rules engines and scoring models, not just machine learning. If the outcome changes access, eligibility, pricing, or restrictions, include it in your inventory. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

If we have a manual review team “sometimes,” are we compliant?

Only if you can show when human intervention happens, who performs it, what they review, and that they can change the outcome. “Sometimes” must become defined criteria and a documented workflow with records. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Do we need to provide an explanation of every automated decision?

ISO/IEC 27701 Clause 7.3.10 requires you to identify and address obligations to the PII principal regarding automated decision making and profiling. Determine what your obligations are per use case (law/contract/policy), document them, and implement the required communications and processes. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

How do we handle automated decisions made by a third party’s platform?

Inventory the use case as your activity if you decide to deploy it, then ensure contracts and due diligence cover cooperation, documentation, and support for challenges/human review. Operationally, your support team still needs a way to trace and contest outcomes. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

What evidence is most persuasive in an audit?

A current inventory, an obligations matrix for each material use case, and closed-case examples showing intake, human review, and outcome communications. Auditors also look for consistency between notices, procedures, and actual tickets. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Where should this live: privacy program, security, or product?

Put governance in the privacy/GRC program (inventory, obligations mapping, evidence), but make product and support operationally accountable for execution. If ownership is unclear, the workflow will fail in real customer escalations. (ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management)

Authoritative Sources

  • ISO/IEC 27701:2019 Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management — Requirements and guidelines

Operationalize this requirement

Map requirement text to controls, owners, evidence, and review workflows inside Daydream.

See Daydream