AU-13(2): Review of Monitored Sites

AU-13(2) requires you to periodically review—and keep current—the exact list of open-source information sites your organization monitors for security-relevant intelligence (for example, breach dumps, paste sites, code repos, and forums). Operationally, you need an owned inventory, a documented review cadence, defined add/remove criteria, and retained evidence that reviews happened and changes were approved. 1

Key takeaways:

  • Maintain an authoritative inventory of monitored open-source sites, tied to a named control owner.
  • Run a recurring review with documented criteria to add, remove, or reprioritize sources.
  • Retain evidence: review logs, approvals, and the “before/after” monitored-sites list.

The AU-13(2) “review of monitored sites” requirement is a single sentence with outsized audit impact because assessors can test it quickly: “Show me what sites you monitor, when you last reviewed the list, and why those sources are appropriate.” If you cannot produce an owned inventory and a repeatable review record, the control will read as ad hoc, even if your analysts are actively monitoring the internet.

AU-13 sits in the Audit and Accountability family, but this enhancement is really about the hygiene of your external monitoring program: open-source sources change constantly, become low-signal, go dark, change ownership, or raise legal/terms-of-service concerns. A disciplined review keeps you focused on sources that produce actionable, attributable intelligence, and prevents “monitoring sprawl” where you collect noise or take on avoidable risk.

This page translates AU-13(2) into a requirement you can assign, execute, and evidence quickly. It is written for a Compliance Officer, CCO, or GRC lead who needs to operationalize the control without waiting on a long program build.

Regulatory text

Requirement (excerpt): “Review the list of open-source information sites being monitored {{ insert: param, au-13.02_odp }}.” 1

What the operator must do:

  • Keep a defined list of open-source information sites your organization monitors (your “monitored sites list”).
  • Perform recurring reviews of that list at a cadence you define in your control parameters (“organization-defined parameter” in NIST language).
  • Ensure the review results in a current, approved list, not a stale document. 1

Plain-English interpretation

You are expected to manage open-source monitoring sources like a controlled inventory. The requirement is not “monitor the internet.” It is “control the inputs to your monitoring,” so you can answer:

  • What sources do we rely on for open-source intelligence?
  • Are they still relevant, reputable, and permitted for our use?
  • Did we remove sources that are dead, noisy, duplicative, or risky?

A workable interpretation for audits: a tracked inventory + a repeatable review + evidence of decisions.

Who it applies to (entity and operational context)

This control is most relevant where you operate a formal security monitoring, threat intelligence, brand monitoring, fraud monitoring, or incident response capability that includes open-source collection.

Typically in scope:

  • Federal information systems assessed against NIST SP 800-53. 2
  • Contractor systems handling federal data where NIST SP 800-53 controls are flowed down by contract, ATO boundary requirements, or program security requirements. 2

Operational teams you’ll involve:

  • Security Operations / Threat Intel (often the doers)
  • Incident Response (consumer of OSINT)
  • GRC / Compliance (control design, evidence, audit response)
  • Legal/Privacy/Procurement (if monitoring sources create terms-of-service, privacy, or third-party tool constraints)

What you actually need to do (step-by-step)

1) Assign a control owner and define the review parameter

NIST expects you to define the “organization-defined parameter” for the review. Translate that into a simple control statement your auditors can test:

  • Owner: a named role (for example, Threat Intelligence Manager).
  • Review cadence: set a recurring frequency that matches your threat environment and staffing.
  • Trigger events: define when an out-of-cycle review is required (for example, after a major incident, new business line, or new monitoring tooling).

This is the minimum structure assessors look for because it turns “review” into a governed activity. 1

2) Build the monitored-sites inventory (authoritative list)

Create an inventory that is easy to export for an auditor and easy for analysts to maintain. A spreadsheet is acceptable if it is controlled and versioned; a GRC record is better if you can evidence workflow.

Recommended fields (keep it practical):

  • Site/source name
  • Source type (paste site, code repo, forum, social, data leak index, etc.)
  • URL / access method
  • Collection method (manual checks, RSS, API, third-party tool feed)
  • Purpose/use case (credential leak detection, brand abuse, vulnerability chatter)
  • Data sensitivity concerns (does this source tend to contain personal data?)
  • Terms-of-service constraints / allowed access notes (if applicable)
  • Signal quality rating (high/medium/low) and rationale
  • Last reviewed date, reviewer, decision (keep/add/remove), and notes

Your goal: a reviewer can look at the list and understand why each source exists.
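One way to keep the inventory consistent is to define the row schema explicitly. The sketch below mirrors the recommended fields above as a Python dataclass; the field names and example values are illustrative assumptions, not a mandated format.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class MonitoredSite:
    """One row of a hypothetical monitored-sites inventory."""
    name: str
    source_type: str           # e.g. "paste site", "code repo", "forum"
    url: str
    collection_method: str     # manual checks, RSS, API, third-party feed
    use_case: str              # e.g. "credential leak detection"
    signal_quality: str        # "high" | "medium" | "low"
    tos_notes: str = ""        # terms-of-service / allowed-access notes
    last_reviewed: Optional[date] = None
    reviewer: str = ""
    decision: str = ""         # "keep" | "add" | "remove"
    notes: str = ""

# Example row an auditor could read and understand at a glance.
site = MonitoredSite(
    name="ExamplePasteSite",
    source_type="paste site",
    url="https://paste.example.com",
    collection_method="API",
    use_case="credential leak detection",
    signal_quality="high",
    last_reviewed=date(2024, 1, 15),
    reviewer="Threat Intelligence Manager",
    decision="keep",
)
print(site.name, site.decision)
```

A spreadsheet with these same columns works just as well; the point is that every row carries its own purpose, constraints, and review status.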

3) Define add/remove criteria (so reviews aren’t subjective)

Write short criteria that drive consistent decisions. Example criteria you can adopt:

  • Keep sources that produce actionable indicators aligned to defined use cases and can be accessed in a permitted manner.
  • Remove sources that are persistently low-signal, duplicative, inaccessible, or raise unresolved legal/privacy concerns.
  • Escalate sources that may require specialized handling (for example, if collection could capture personal data that triggers internal privacy review).

Auditors don’t need perfect criteria. They need criteria that exist, are used, and result in updates.
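The example criteria above can be expressed as a small decision helper so reviews stay consistent across reviewers. The field names and the two-cycle low-signal threshold are assumptions for illustration, not part of AU-13(2).

```python
def review_decision(source: dict) -> str:
    """Return 'remove', 'escalate', or 'keep' for one inventory row."""
    # Remove: dead, duplicative, or inaccessible sources.
    if source.get("inaccessible") or source.get("duplicate_of"):
        return "remove"
    # Remove: persistently low-signal (assumed threshold: two review cycles).
    if source.get("signal_quality") == "low" and source.get("low_signal_cycles", 0) >= 2:
        return "remove"
    # Escalate: unresolved legal/privacy concerns need specialized handling.
    if source.get("unresolved_legal_concern") or source.get("contains_personal_data"):
        return "escalate"
    return "keep"

assert review_decision({"signal_quality": "high"}) == "keep"
assert review_decision({"inaccessible": True}) == "remove"
assert review_decision({"contains_personal_data": True}) == "escalate"
```

Encoding the criteria this way (or as a one-page checklist) is what turns the review from a subjective judgment into a repeatable, auditable decision.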

4) Execute the review (repeatably) and record decisions

Run the review as a lightweight workflow:

  1. Export current monitored-sites inventory.
  2. Collect inputs from stakeholders (SOC, IR, brand/fraud, legal/privacy if needed).
  3. For each source: decide keep, remove, or change priority/method.
  4. Document decisions and approvals.
  5. Publish the updated monitored-sites list and communicate changes to operators.

A common operational pattern is to attach the review notes directly to the inventory (change log tab) so evidence is naturally created.
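The five-step workflow above can be sketched as a function that both updates the list and emits the change log in one pass, so evidence is a by-product of the review rather than an afterthought. The record structure is an assumption.

```python
from datetime import date

def run_review(inventory, decisions, reviewer, approver):
    """Apply keep/remove decisions and return (updated list, change log)."""
    change_log = []
    updated = []
    for site in inventory:
        decision = decisions.get(site["name"], "keep")
        # Step 4: document the decision and approval for each source.
        change_log.append({
            "date": date.today().isoformat(),
            "site": site["name"],
            "decision": decision,
            "reviewer": reviewer,
            "approved_by": approver,
        })
        # Step 5: only surviving sources go into the published list.
        if decision != "remove":
            updated.append(site)
    return updated, change_log

inventory = [{"name": "ForumA"}, {"name": "PasteB"}]
updated, log = run_review(inventory, {"PasteB": "remove"}, "analyst", "TI Manager")
print([s["name"] for s in updated])  # ForumA survives; PasteB is removed
```

Writing the change log rows to a tab of the inventory itself is exactly the “evidence is naturally created” pattern described above.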

5) Tie the list to your monitoring tooling and alerting

AU-13(2) is about the list, but assessors will often sanity-check that:

  • the “monitored” sources are actually configured in your tooling (if you use tools), and
  • analysts know where to find the authoritative list.

Create a simple mapping:

  • Inventory row → tool configuration reference (rule name, feed name, query name), where possible.
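That mapping can be sanity-checked automatically: compare the inventory's tool references against an export of the tool's configured feeds and flag drift in both directions. The `tool_reference` and `feed_name` keys here are hypothetical names for whatever your inventory and tool export actually use.

```python
def find_mismatches(inventory, tool_feeds):
    """Return (inventory rows missing from the tool, feeds missing from the inventory)."""
    configured = {f["feed_name"] for f in tool_feeds}
    # Inventory says we monitor it, but no matching feed is configured.
    missing = [s["name"] for s in inventory
               if s.get("tool_reference") and s["tool_reference"] not in configured]
    # Tool is collecting from a source the authoritative list doesn't know about.
    unlisted = configured - {s.get("tool_reference") for s in inventory}
    return missing, sorted(unlisted)

inventory = [
    {"name": "PasteB", "tool_reference": "feed-pasteb"},
    {"name": "ForumA", "tool_reference": "feed-foruma"},
]
tool_feeds = [{"feed_name": "feed-pasteb"}, {"feed_name": "feed-extra"}]
missing, unlisted = find_mismatches(inventory, tool_feeds)
print(missing)    # ForumA's feed is not configured in the tool
print(unlisted)   # feed-extra is being collected outside the inventory
```

Running a check like this during each review cycle answers the assessor's “do your tools match what the inventory says?” question before they ask it.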

6) Retain evidence in an audit-ready package

Treat each review cycle like an auditable event:

  • What was reviewed
  • Who reviewed it
  • What changed
  • Who approved it
  • When it became effective
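One way to make each cycle a single retrievable artifact is to bundle the manifest (answering the five questions above) with the before/after exports into one archive. The filenames and manifest keys below are assumptions, not a required format.

```python
import io
import json
import zipfile

def package_evidence(manifest: dict, files: dict) -> bytes:
    """Bundle a review cycle's manifest and attachments into one zip blob."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("manifest.json", json.dumps(manifest, indent=2))
        for name, content in files.items():
            zf.writestr(name, content)
    return buf.getvalue()

blob = package_evidence(
    manifest={
        "reviewed": "monitored-sites inventory v7",
        "reviewer": "TI Manager",
        "changes": ["removed PasteB (persistently low signal)"],
        "approved_by": "CISO",
        "effective": "2024-02-01",
    },
    files={"inventory_before.csv": "...", "inventory_after.csv": "..."},
)
with zipfile.ZipFile(io.BytesIO(blob)) as zf:
    print(zf.namelist())
```

Whether you use a zip, a GRC record, or a ticket with attachments, the test is the same: one package, pulled in one request, answers all five questions.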

If you use Daydream to manage control ownership, recurring tasks, and evidence collection, this is a natural fit: you can assign AU-13(2) to an owner, schedule the review task, and attach the “before/after” export plus approval record as recurring evidence.

Required evidence and artifacts to retain

Keep evidence that is specific, time-bound, and attributable.

Minimum evidence set:

  • Monitored open-source sites inventory (current version)
  • Change log or version history showing updates after review
  • Review record (meeting notes, ticket, or attestation) with reviewer names and date
  • Approval evidence (ticket approval, email approval, or GRC workflow approval)
  • Procedure/runbook describing how reviews occur and how changes are implemented 1

Nice-to-have evidence (helps resolve common audit hangups):

  • Tool configuration export or screenshots that show sources are configured as stated
  • Rationale notes for high-risk sources (why monitored, constraints, handling expectations)

Common exam/audit questions and hangups

Assessors tend to probe AU-13(2) with direct artifact requests. Expect questions like:

  • “Show the list of open-source sites you monitor and the last review date.” 1
  • “Who owns this control and how do you decide which sources to include?”
  • “What changes resulted from the last review?”
  • “How do you prevent teams from monitoring ad hoc sources outside the list?”
  • “Do your tools match what the inventory says?”

Hangup you can avoid: producing a threat intel feed list that includes paid subscriptions and internal sources but does not clearly identify the open-source sites. AU-13(2) is explicit about open-source information sites. 1

Frequent implementation mistakes and how to avoid them

  • Inventory exists but no review record. Why it fails: auditors can’t verify the “review” happened. Fix: create a recurring ticket template that requires decisions and approval.
  • Review is verbal only. Why it fails: no attributable evidence. Fix: save minutes, attach the annotated inventory, and capture the approver.
  • “Everything is monitored” approach. Why it fails: unbounded scope invites quality and risk issues. Fix: limit sources to defined use cases and document add/remove criteria.
  • Inventory doesn’t match tools. Why it fails: the control looks performative. Fix: add a “tool reference” field and validate it during each review.
  • No clear control owner. Why it fails: no accountability. Fix: assign a named role and a backup owner in the procedure.

Enforcement context and risk implications

No public enforcement cases were provided in the source catalog for AU-13(2), so you should treat this as an assessment-readiness control rather than a “headline enforcement” item.

Operationally, the risk is still real:

  • Missed detection: stale sources can cause blind spots (for example, you stop monitoring a site that becomes newly relevant).
  • Noise overload: monitoring low-signal sources can bury analysts and delay response.
  • Governance and legal exposure: unmanaged monitoring can drift into sources with unclear terms or sensitive data handling implications.

AU-13(2) reduces those risks by forcing periodic governance of what you collect and why. 1

Practical 30/60/90-day execution plan

First 30 days (stand up the control)

  • Name the AU-13(2) control owner and define the review cadence and trigger events in your control narrative. 1
  • Build the first monitored-sites inventory from tool configs, analyst bookmarks, and existing threat intel documentation.
  • Draft add/remove criteria and a short review procedure.
  • Create the evidence container (GRC record or controlled folder) and a standard “review record” template.

Day 31–60 (run the first formal review)

  • Conduct the first formal review with SOC/Threat Intel and any required stakeholders.
  • Decide keep/remove/change for each site; record rationale for removals and additions.
  • Update the inventory, version it, and capture approval.
  • Validate at least a sample of entries against tool configuration to confirm alignment.

Day 61–90 (stabilize and make it repeatable)

  • Convert the review into a recurring task with reminders, owner, and required attachments.
  • Add a lightweight KPI for internal management (for example, “review completed on schedule” and “inventory updated”), but avoid inventing external benchmarks.
  • Do a tabletop audit: have someone not involved in the process try to pull the last review evidence in a single package. Fix gaps.

Frequently Asked Questions

What counts as an “open-source information site” for AU-13(2)?

Treat it as publicly available sources you monitor for security-relevant information, such as paste sites, public code repositories, forums, and public leak indexes. Keep the definition written in your procedure so the scope is consistent. 1

Do we need to monitor specific sites to be compliant?

AU-13(2) does not prescribe which sites to monitor; it requires reviewing the list of sites you do monitor. Document your selection criteria and show that you apply them during reviews. 1

How often do we need to review the monitored-sites list?

NIST leaves the review frequency as an organization-defined parameter. Pick a cadence you can sustain, document it, and keep evidence that you followed it. 1

We use a third-party threat intel tool. Is the tool’s feed list enough evidence?

It can be part of the evidence, but auditors still expect an explicit, reviewed list of open-source sites and a record of review decisions. Export the relevant portion of the tool configuration and attach it to your review record. 1

What if different teams monitor different sites (security, fraud, brand)?

Use one authoritative inventory with “owner” and “use case” fields per source. The AU-13(2) review can be a single coordinated review with multiple approvers or a central review that collects attestations from each team. 1

What’s the minimum evidence an assessor will accept?

A current monitored-sites inventory plus at least one completed review record that shows date, reviewer, decisions, and approval. If you can also show version history or before/after exports, the control tests much cleaner. 1

Footnotes

  1. NIST SP 800-53 Rev. 5 OSCAL JSON

  2. NIST SP 800-53 Rev. 5

Operationalize this requirement

Map requirement text to controls, owners, evidence, and review workflows inside Daydream.

See Daydream