AC-15: Automated Marking

To meet the AC-15 (Automated Marking) requirement, you must automatically apply defined security and handling markings (for example, classification, CUI tags, confidentiality labels, banners, metadata) to information and outputs based on policy and context, and prove the mechanism operates consistently across the systems in scope. Operationalize it by defining marking rules, enforcing them with tooling, and retaining evidence that markings are applied and cannot be bypassed.

Key takeaways:

  • AC-15 is about automated, policy-driven markings on information, not a one-time manual label exercise. 1
  • Your fastest path is a rule set + enforcement points + validation evidence tied to systems, data types, and output channels. 2
  • Audit success depends on repeatable artifacts: marking standards, technical configurations, samples, and monitoring/exception handling. 1

AC-15 sits in the Access Control family, but operators should treat it as an information handling control: you are making sure users and systems can recognize the sensitivity and handling requirements of data without relying on people to remember to label it. The control is most visible at “edges” where information leaves a system boundary or changes form, such as generated reports, exported files, email, collaboration shares, printed output, screen displays, and API responses.

Most implementation failures are not technical. They come from unclear marking rules (what gets marked, with what text/metadata, and under what conditions), incomplete coverage (only marking documents but not email exports or API responses), and weak evidence (no saved configurations, no samples, no testing results). If you want AC-15 to survive an assessment, you need three things: a marking specification that maps to your data categories, enforcement mechanisms that apply markings consistently, and recurring validation that the rules still work after system changes.

This page gives requirement-level implementation guidance you can execute quickly: applicability, step-by-step actions, the evidence set to retain, and the audit questions you will get.

Regulatory text

Requirement (framework control): “NIST SP 800-53 control AC-15.” 2

What an operator must do: Implement automated mechanisms that apply organization-defined markings to information and outputs, based on defined rules, so that handling expectations are clear and consistently enforced. AC-15 expects you to (1) define the marking scheme, (2) implement automation to apply it, and (3) show that it works across the scope you claim.

Practical reading: if a user can export, generate, view, print, or transmit sensitive information, you need a reliable way for the resulting artifact or display to carry the right label (and ideally the right metadata) without relying on manual steps. 1

Plain-English interpretation

AC-15 means: systems mark information for you.

  • “Marking” can be visual (headers/footers/banners), embedded metadata (file properties, email headers, document labels), or both, depending on your policy and tools.
  • “Automated” means the marking is applied by technical controls (platform configuration, DLP/labeling tools, document generation services, workflows), not a training-only requirement.
  • The goal is operational: markings drive correct handling and reduce accidental mishandling. That includes downstream recipients, third parties, and staff who did not create the content.

Who it applies to

Entities: Federal information systems and contractor systems handling federal data. 2

Operational context (where AC-15 shows up in real programs):

  • Any environment processing regulated or contract-restricted data where handling instructions matter (for example, “internal,” “confidential,” “export-controlled,” “CUI,” customer data categories).
  • Organizations with multiple collaboration channels (email + chat + shared drives + ticketing + reporting tools) where information frequently moves and changes format.
  • Third-party workflows where your data is processed or stored outside your boundary; markings help ensure recipients and processors see the handling expectations.

Scoping decision you must make (write it down):

  • Which data types require marking.
  • Which systems are “authoritative sources” for applying markings.
  • Which output channels are in scope (screen, PDF, CSV export, print, email, API, integrations).

What you actually need to do (step-by-step)

1) Define your marking standard (policy + technical specification)

Create a short, enforceable “Marking Standard” that answers:

  • Label taxonomy: the set of allowed markings (names, colors, banner text).
  • Trigger rules: what conditions apply each marking (data type, repository, sensitivity attributes, project/customer, contract tag).
  • Placement and format: header/footer text, watermarks, email subject prefix, file metadata fields, API header fields, UI banners.
  • Default and downgrade rules: what happens when content is mixed or classification is unknown.

Deliverable: a one-page standard plus a technical appendix with exact strings and metadata keys.
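One way to make the technical appendix unambiguous is to capture it as machine-readable data, so every enforcement point pulls from the same exact strings and metadata keys. The sketch below is illustrative: the label names, banner text, and metadata key are assumptions, not values prescribed by AC-15.

```python
# Hypothetical marking taxonomy: exact strings and metadata keys in one place.
# Label names, banner text, and the metadata key are illustrative examples.
MARKING_STANDARD = {
    "metadata_key": "x-org-sensitivity",
    "labels": {
        "internal": {
            "banner": "INTERNAL USE ONLY",
            "email_subject_prefix": "[INTERNAL]",
            "rank": 1,
        },
        "confidential": {
            "banner": "CONFIDENTIAL - HANDLE PER POLICY",
            "email_subject_prefix": "[CONFIDENTIAL]",
            "rank": 2,
        },
        "cui": {
            "banner": "CUI - CONTROLLED UNCLASSIFIED INFORMATION",
            "email_subject_prefix": "[CUI]",
            "rank": 3,
        },
    },
    # Default rule when classification is unknown or content is mixed.
    "default_label": "confidential",
}

def banner_for(label):
    """Return the exact banner string for a label, falling back to the default."""
    labels = MARKING_STANDARD["labels"]
    entry = labels.get(label, labels[MARKING_STANDARD["default_label"]])
    return entry["banner"]
```

Keeping this in one artifact also gives you a version-controlled piece of evidence: a diff to the taxonomy is a change event that should trigger re-validation.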

2) Map markings to your data classification and handling rules

Build a simple mapping table that ties each marking to:

  • Handling requirements (sharing restrictions, encryption expectations, retention, approved channels).
  • Who can set or change the marking.
  • Whether the marking is inherited from a system of record or computed by rules.

This prevents a common audit gap: “You apply labels, but the labels don’t mean anything operational.”
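The mapping table itself can be machine-readable, so tooling (and auditors) can resolve a label to its handling requirements directly. A minimal sketch, with hypothetical channel names and rules:

```python
# Hypothetical label-to-handling mapping; channel names and rules are illustrative.
HANDLING_RULES = {
    "internal":     {"approved_channels": ["email", "chat", "shares"], "external_sharing": False},
    "confidential": {"approved_channels": ["email", "shares"],         "external_sharing": False},
    "cui":          {"approved_channels": ["approved-portal"],         "external_sharing": False},
}

def channel_allowed(label, channel):
    """Answer 'may this labeled content travel over this channel?' from the mapping."""
    rules = HANDLING_RULES.get(label)
    if rules is None:
        return False  # unknown labels are denied by default (fail closed)
    return channel in rules["approved_channels"]
```

A lookup like this is what makes labels "mean something operational": DLP rules, sharing restrictions, and review checklists can all key off the same table.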

3) Identify enforcement points (where automation must happen)

List each system/output path and decide the enforcement mechanism:

For each output path, note the typical automation control and the evidence you should capture:

  • Office docs / PDFs — Automation: sensitivity labeling + templates. Evidence: policy config export; sample labeled files.
  • Email — Automation: subject/banner stamping + header metadata. Evidence: mail rule config; sample message headers.
  • File shares / collaboration — Automation: auto-labeling rules. Evidence: label policy + monitoring events.
  • Reports from BI tools — Automation: report templates with banners. Evidence: template config + sample exports.
  • Print — Automation: print banners / job stamping. Evidence: print server config + sample output.
  • APIs — Automation: response headers/fields, payload attributes. Evidence: API gateway policy + sample responses.

Keep it practical: you do not need every tool to do everything; you need coverage for the material flows in your environment.
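For the API path above, an enforcement point can be as simple as a response hook that stamps a marking header before anything leaves the gateway. The sketch below assumes a hypothetical header name and label values; real gateways and middleware frameworks expose equivalent hooks.

```python
# Hypothetical API enforcement point: stamp every response with a marking header.
MARKING_HEADER = "X-Org-Sensitivity"  # illustrative header name

def apply_response_marking(headers, label=None):
    """Return response headers with the marking applied; fail closed if unlabeled."""
    stamped = dict(headers)
    # Default rule: content with no computed label gets the most restrictive
    # marking in scope rather than going out unmarked.
    stamped[MARKING_HEADER] = label if label else "confidential"
    return stamped
```

The design point worth copying is the fail-closed default: an unlabeled response is a rule-engine gap, and it should surface as an over-restrictive marking rather than a silent omission.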

4) Implement the automation controls

Implementation options vary, but your control story should be consistent:

  • Central policy engine where possible (labeling/DLP platform, content services rules).
  • Native platform controls where they meet your marking standard (for example, document classification labels).
  • Compensating controls where automation is limited (for example, restricted export features, controlled templates, or gated release processes), but document why and how you validate.

5) Prevent bypass and handle exceptions

Auditors will probe whether users can circumvent marking:

  • Restrict who can remove/downgrade labels.
  • Require justification workflows for downgrades.
  • Monitor for unlabeled sensitive content in key repositories.
  • Define an exception register with owner approval and expiry.
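The exception register is easy to check mechanically: every entry needs an owner, an approval, and an expiry, and anything lapsed should surface in review. A minimal sketch, with illustrative field names:

```python
from datetime import date

# Hypothetical exception register entries; field names and values are illustrative.
EXCEPTIONS = [
    {"id": "EX-1", "owner": "app-team", "approved": True, "expires": date(2024, 1, 31)},
    {"id": "EX-2", "owner": "bi-team",  "approved": True, "expires": date(2099, 12, 31)},
]

def expired_exceptions(register, today):
    """Return IDs of entries past their expiry that need re-approval or closure."""
    return [e["id"] for e in register if e["expires"] < today]

def unapproved_exceptions(register):
    """Return IDs of entries missing owner approval."""
    return [e["id"] for e in register if not e["approved"]]
```

Running checks like these on a schedule turns the register from a static document into monitoring evidence you can hand to an assessor.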

6) Validate with repeatable testing

Design lightweight tests you can rerun after changes:

  • Generate each output type from representative data and confirm required visual markings and metadata.
  • Attempt common bypass methods (copy/paste to new doc, export to CSV, screenshot, print).
  • Confirm logs exist and are reviewable for labeling events or policy enforcement actions.
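Those checks can be encoded as a small rerunnable test that every generated sample must pass: the visual banner is present and the metadata field carries the right value. The artifact shape and key names below are assumptions for illustration.

```python
def check_marked_output(artifact, expected_label, expected_banner):
    """Return a list of failures for one generated sample; empty list means pass."""
    failures = []
    # Visual marking: the banner text must appear in the rendered output.
    if expected_banner not in artifact.get("rendered_text", ""):
        failures.append("missing visual banner")
    # Machine-readable marking: the metadata field must carry the label.
    if artifact.get("metadata", {}).get("x-org-sensitivity") != expected_label:
        failures.append("missing or wrong metadata label")
    return failures

# Hypothetical sample artifact as produced by an export test run.
sample = {
    "rendered_text": "CONFIDENTIAL - HANDLE PER POLICY\nQuarterly revenue report...",
    "metadata": {"x-org-sensitivity": "confidential"},
}
```

Save each run's inputs and results; a dated history of passing (and fixed) runs is exactly the recurring operational evidence assessors ask for.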

7) Operationalize with ownership and cadence

Assign:

  • Control owner: typically Security GRC or Information Security.
  • Technical owners: collaboration/email admins, endpoint team, app owners, BI/reporting owners, API owners.
  • Review triggers: system upgrades, new data types, new third parties, new output channels.

If you use Daydream, this is where it fits naturally: track AC-15 ownership, map systems and data flows to marking rules, and schedule recurring evidence collection so audits do not become a scramble.

Required evidence and artifacts to retain

Aim for evidence that proves design and ongoing operation:

Governance

  • Marking Standard (approved, versioned).
  • Data classification/handling policy mapping to markings.
  • Scope statement: systems, data types, output channels covered.
  • Exception register (with approvals and expirations).

Technical configuration

  • Exported configs from labeling/DLP/email rules (screenshots are acceptable if exports are hard).
  • Templates showing header/footer/watermark logic.
  • Role/permission settings proving who can change or remove markings.

Operational evidence

  • Samples of marked outputs (sanitized): documents, PDFs, emails with headers, report exports, print samples, API responses.
  • Test scripts and results for bypass attempts and edge cases.
  • Monitoring or alert evidence for unlabeled or incorrectly labeled content.
  • Change records tying major system changes to re-validation of marking automation.

Common exam/audit questions and hangups

Expect these, and pre-answer them in your artifacts:

  1. “Show me your marking rules and where they’re enforced.”
    Bring the mapping table plus configs per output channel.

  2. “How do you know users can’t remove or alter labels?”
    Show permission settings, downgrade workflows, and sample attempts.

  3. “Prove this works for exports and reports.”
    Exports are where programs fail. Keep sample exports ready.

  4. “What happens when data is copied into a new file or system?”
    Answer with inheritance rules, auto-labeling detection, or controlled channels.

  5. “How do you manage exceptions?”
    Show an exception register with expiration and periodic review.

Frequent implementation mistakes (and how to avoid them)

  • Mistake: Policy-only marking with no automation.
    Fix: identify enforcement points and implement at least one automated mechanism per major channel.

  • Mistake: Visual banners only, no metadata.
    Fix: add metadata where feasible so downstream systems can act on labels (search, DLP, sharing restrictions).

  • Mistake: Only covering Office documents.
    Fix: include email, exports, and APIs in scope, or document why they are excluded and how risk is managed.

  • Mistake: No evidence that proves ongoing operation.
    Fix: collect recurring samples and config snapshots tied to change events.

  • Mistake: Downgrade/removal is uncontrolled.
    Fix: restrict permissions and require approvals with logging.

Risk implications (why AC-15 gets attention)

Marking failures usually show up as mishandling: data sent to the wrong third party, uploaded into an unapproved channel, retained incorrectly, or shared too broadly because recipients did not see a handling restriction. AC-15 reduces that risk by making sensitivity visible and machine-readable across tools that users rely on daily. 1

Practical 30/60/90-day execution plan

First 30 days (stabilize scope and rules)

  • Name the control owner and technical owners.
  • Define the Marking Standard and the mapping to data categories.
  • Inventory output channels (docs, email, reports, exports, print, APIs) and pick initial in-scope systems.
  • Choose enforcement mechanisms per channel and document any gaps as time-bound exceptions.

Days 31–60 (implement and prove operation)

  • Configure labeling/marking automation for the top systems and channels.
  • Implement downgrade/removal controls and exception workflow.
  • Produce a first evidence pack: configs + marked output samples + initial test results.

Days 61–90 (operationalize and harden)

  • Expand coverage to remaining high-risk channels (exports, BI reports, integrations).
  • Add monitoring for unlabeled content in key repositories.
  • Build a repeatable validation routine tied to change management.
  • Run an internal “mock audit” walkthrough using the evidence pack and fix weak spots.

Frequently Asked Questions

Does AC-15 require classification markings like “Secret,” or can it be internal labels like “Confidential”?

AC-15 is satisfied by an organization-defined marking scheme appropriate to your environment and data. The key is that markings are defined, applied automatically, and tied to handling rules. 1

What systems should I prioritize first for automated marking?

Start where sensitive data most often leaves the boundary: email, document creation/storage, and report/export tools. Then cover print and APIs/integrations where downstream reuse is common.

Is a watermark on PDFs enough to meet the AC-15 (Automated Marking) requirement?

A watermark helps, but auditors typically expect coverage across relevant output formats and, where feasible, metadata markings that downstream systems can read. Treat PDFs as one channel within a broader marking program.

How do we handle mixed-content documents that include multiple data types?

Define a default rule in your Marking Standard, usually “highest sensitivity wins,” and enforce it in templates or labeling rules. Document any exceptions and require approvals for downgrades.
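A "highest sensitivity wins" default is straightforward to enforce once labels carry an ordered rank. A minimal sketch (the ranks below are illustrative):

```python
# Illustrative label ranks; higher means more restrictive.
LABEL_RANK = {"internal": 1, "confidential": 2, "cui": 3}

def effective_label(labels, default="confidential"):
    """Resolve mixed content to its most restrictive known label.

    Unknown or unlabeled content falls back to the default rule from the
    Marking Standard rather than going out unmarked.
    """
    known = [l for l in labels if l in LABEL_RANK]
    if not known:
        return default
    return max(known, key=lambda l: LABEL_RANK[l])
```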

What evidence is most persuasive in an assessment?

Config exports (or screenshots) that show the rules, plus a small set of real samples from each channel that demonstrate the markings in practice. Add test results that show you attempted bypass paths and addressed failures.

How can Daydream help without turning this into a tooling project?

Use Daydream to assign ownership, track system scope, collect recurring evidence artifacts, and maintain an audit-ready narrative for AC-15. Keep the enforcement in your existing platforms; Daydream keeps the control operable and provable over time.

Footnotes

  1. NIST SP 800-53 Rev. 5

  2. NIST SP 800-53 Rev. 5 OSCAL JSON

Operationalize this requirement

Map requirement text to controls, owners, evidence, and review workflows inside Daydream.

See Daydream