AT-3(3): Practical Exercises
To meet the AT-3(3) practical exercises requirement, you must include hands-on security and privacy exercises as part of training so personnel can practice the behaviors your training expects, not just acknowledge a slide deck. Operationalize it by defining exercise objectives, assigning role-based scenarios, running them on a schedule, and retaining proof of completion and outcomes. 1
Key takeaways:
- Practical exercises must reinforce specific training objectives, not general “awareness.” 1
- Auditors will look for execution evidence: scenarios, attendance, results, and corrective actions.
- The fastest path is a role-based exercise catalog mapped to training modules and job functions.
AT-3(3) is a simple requirement with a common failure mode: organizations deliver annual security and privacy training, track completion, and assume that checks the box. It doesn’t. AT-3(3) expects practical exercises that let people demonstrate secure behavior in realistic situations, tied back to the objectives of your security and privacy training program. 1
For a CCO, GRC lead, or control owner, the work is mostly operational design: define what “practical” means for each role, choose exercise types that match your risk profile, run them consistently, and collect evidence that proves the exercises occurred and drove improvement. The control is “medium” effort in practice because it spans HR/training operations, IT/security operations, privacy, and often third parties who administer training platforms or phishing simulations.
This page gives requirement-level implementation guidance you can assign to an owner and implement quickly: scope, step-by-step actions, evidence to retain, audit questions to pre-answer, and common mistakes that trigger findings.
Regulatory text
“Provide practical exercises in security and privacy training that reinforce training objectives.” 1
Operator translation: Your training program must include exercises where personnel practice what the training teaches (for example, reporting suspicious messages, handling sensitive data, responding to an incident), and those exercises must clearly connect to the stated objectives of your training. Keep records that show the exercises were run and that results were reviewed. 1
Plain-English interpretation (what “practical exercises” means)
Practical exercises are “do the thing” activities, not passive learning. They can be short and lightweight, but they must require action and allow you to observe outcomes. The “reinforce training objectives” clause means you need a documented line of sight from:
1) training objective → 2) exercise scenario → 3) expected behavior → 4) measurement/result → 5) follow-up coaching or process change.
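One lightweight way to keep that line of sight auditable is to store each objective as a single record and check that no link in the chain is missing. The sketch below is illustrative; the field values and the scenario ID `PHISH-SIM-01` are example placeholders, not prescribed names.

```python
# A minimal, illustrative record preserving the line of sight from
# training objective through to follow-up. All values are examples.
objective_map = {
    "objective": "Users report suspected phishing using the approved channel",
    "exercise": "PHISH-SIM-01",  # hypothetical exercise scenario ID
    "expected_behavior": "Report via the phishing-report button within one business day",
    "measurement": "Percent of recipients who reported the simulation",
    "follow_up": "Targeted micro-training for non-reporters",
}

def is_complete(record: dict) -> bool:
    """True only when every link in the objective-to-follow-up chain is filled in."""
    required = ("objective", "exercise", "expected_behavior",
                "measurement", "follow_up")
    return all(record.get(field) for field in required)

print(is_complete(objective_map))  # a fully linked record passes the check
```

A check like this is cheap to run over the whole mapping before an assessment, so gaps surface before an auditor finds them.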
Examples that usually qualify (choose what fits your environment):
- Simulated phishing or smishing with a required “report” action to the SOC/helpdesk.
- Tabletop exercises for incident reporting and escalation (security and privacy variants).
- Data handling drills: correctly labeling, storing, sharing, or disposing of sensitive data.
- Access control exercises for admins: least-privilege reviews or break-glass use walkthroughs.
- Privacy request handling simulations for customer support or privacy operations.
Who it applies to (entity and operational context)
Typical in-scope entities
- Federal information systems and programs using NIST SP 800-53 as the control baseline. 2
- Contractors and service providers handling federal data where the system security plan inherits or maps to NIST SP 800-53 controls. 2
Operational contexts where assessors expect depth
- High-volume user populations with regular exposure to email, collaboration tools, and customer data.
- Privileged roles (system administrators, security engineers, developers) where mistakes have outsized impact.
- Privacy-processing roles (HR, customer support, marketing ops, privacy office) that handle regulated personal data.
- Teams that rely on third parties for training platforms, phishing simulations, call centers, or IT operations. You still own the control; third parties can help execute it.
What you actually need to do (step-by-step)
1) Assign ownership and define scope
- Control owner: usually Security Awareness/Training lead in Security or GRC; privacy exercises should have a named Privacy owner as co-owner.
- Scope: all workforce members who must complete security/privacy training; carve out role-based groups (privileged IT, developers, finance, HR, customer support, executives).
Deliverable: AT-3(3) control sheet with owner, in-scope population, and exercise types per role.
2) Define training objectives in a measurable way
If your training objectives are vague (“increase awareness”), you can’t prove exercises reinforce them. Rewrite objectives into observable actions, such as:
- “Users report suspected phishing using the approved channel.”
- “Staff classify and share files using approved labels and repositories.”
- “Managers escalate potential incidents within required internal timelines.”
- “Personnel recognize and route privacy requests to the privacy workflow.”
Deliverable: training objectives list (security + privacy) with mapped roles.
3) Build an “exercise catalog” mapped to objectives
Create a small library of exercises. Each exercise entry should include:
- Objective(s) reinforced
- Target audience/role
- Scenario narrative (what the person sees/does)
- Expected correct actions
- How you measure success (completion, time-to-report, error types)
- Owner and tooling (LMS, ticketing, phishing platform)
- Remediation path (coaching, re-training, policy reminder, access changes)
Deliverable: exercise catalog (spreadsheet or GRC control narrative) linked to objectives. This is the artifact auditors understand quickly.
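If the catalog lives in a spreadsheet today, the same fields translate directly into a structured record, which makes validation and export trivial. This is a sketch under assumed naming; the example entry, its ID `PRIV-DSAR-01`, and the audience are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ExerciseCatalogEntry:
    """One row of the exercise catalog. Fields mirror the catalog
    entry list above; all example values below are illustrative."""
    exercise_id: str
    objectives: list        # objective(s) reinforced
    audience: str           # target audience/role
    scenario: str           # narrative of what the person sees/does
    expected_actions: list  # correct actions
    success_metric: str     # completion, time-to-report, error types
    owner: str
    tooling: str            # LMS, ticketing, phishing platform
    remediation_path: str   # coaching, re-training, policy reminder

entry = ExerciseCatalogEntry(
    exercise_id="PRIV-DSAR-01",
    objectives=["Personnel recognize and route privacy requests to the privacy workflow"],
    audience="Customer support",
    scenario="Inbound email contains a deletion request from a customer",
    expected_actions=["Tag as privacy request", "Route to privacy intake queue"],
    success_metric="Percent routed correctly within the internal timeline",
    owner="Privacy Operations",
    tooling="Ticketing (ITSM)",
    remediation_path="Job aid refresh plus targeted re-training",
)
print(entry.exercise_id)
```

Keeping the entry shape fixed also means every run of every exercise produces comparable evidence, which is exactly what assessors want to see.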
4) Set an execution cadence and integrate into operations
Choose a cadence you can sustain. You do not need exotic exercises; you need repeatability and evidence.
- Embed exercises into onboarding (new-hire phishing simulation, data handling walkthrough).
- Run periodic exercises for the general population (phish tests, micro-drills).
- Run deeper exercises for high-risk teams (incident tabletop, admin access workflows).
- Include privacy scenarios (misdirected email with personal data, DSAR intake simulation).
Deliverable: annual training-and-exercise calendar owned by Security/Privacy training.
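The cadence itself can be expressed as data and expanded into the annual calendar. The tiers and frequencies below are assumptions for illustration (quarterly general-population drills, semiannual deep-dives), not a prescribed schedule.

```python
from datetime import date

# Illustrative cadence: months in which each role tier runs an exercise.
# Tier names and frequencies are examples, not requirements.
CADENCE = {
    "general": [1, 4, 7, 10],      # quarterly phish tests / micro-drills
    "privileged_admin": [3, 9],    # semiannual tabletop / access workflow
    "privacy_processing": [2, 8],  # semiannual DSAR / data-handling drill
}

def calendar_for_year(year: int) -> list:
    """Expand the cadence into sorted (date, tier) entries for the annual calendar."""
    entries = [(date(year, month, 1), tier)
               for tier, months in CADENCE.items()
               for month in months]
    return sorted(entries)

for when, tier in calendar_for_year(2025):
    print(when.isoformat(), tier)
```

Generating the calendar from one cadence table keeps the schedule and the control narrative in sync when you change frequencies.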
5) Run the exercises and capture results in systems you already use
Operationalize through existing platforms:
- LMS completion + embedded quiz is helpful, but AT-3(3) expects practice; add a required action outside the LMS when possible (reporting flow, ticket submission, classification in a real tool).
- Use ticketing (ITSM), SOC case management, or a dedicated reporting button as the “action capture.”
Deliverable: run logs (exports), tickets created, attendance records, facilitator notes.
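Run evidence is easiest to produce on request when each run is captured as a structured record and exported in a standard format. A minimal sketch, assuming hypothetical scenario IDs and counts:

```python
import csv
import io
from datetime import date

# Illustrative run log: one row per exercise run, carrying the fields
# assessors ask about (date, audience, scenario ID/version, facilitator, results).
runs = [
    {"run_date": date(2025, 3, 4).isoformat(), "scenario": "PHISH-SIM-01",
     "version": "v2", "audience": "All staff", "facilitator": "Security Awareness",
     "reports_submitted": 412, "tickets_opened": 37},
]

def export_run_log(rows: list) -> str:
    """Write run records to CSV text, the kind of export retained as evidence."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

print(export_run_log(runs))
```

The same record shape works whether the source is a phishing platform export, a tabletop attendance sheet, or tickets pulled from your ITSM.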
6) Review outcomes and take corrective action
Assessors will ask: “What did you do with the results?”
- Trend common failure points (users clicking, misrouting privacy requests, mishandling data).
- Assign corrective actions (targeted retraining, job aids, process improvements, technical controls).
- Document decisions and closure evidence.
Deliverable: post-exercise review notes, corrective action register entries, closure proof.
7) Maintain assessor-ready narratives and evidence packages
Create a one-page “AT-3(3) implementation narrative” that explains:
- Exercise types used
- How exercises map to objectives
- How results drive improvement
- Where evidence lives and who can produce it on request
Daydream note: teams often struggle to keep the mapping, calendar, and recurring evidence consistent across security and privacy. Daydream can act as the system of record for the control narrative and recurring artifacts so you can produce an audit packet without rebuilding it each assessment cycle.
Required evidence and artifacts to retain
Keep evidence that shows design and operation:
Design evidence (static/slow-changing)
- Training objectives (security + privacy) mapped to roles. 1
- Exercise catalog with objective mapping and measurement criteria. 1
- Training and exercise calendar.
- Procedures/work instructions for running each exercise.
Operational evidence (recurring)
- Exercise run logs (date, audience, scenario name/version, facilitator).
- Completion/attendance exports from LMS or exercise platform.
- Metrics reports (e.g., number of reports submitted, common mistakes).
- Tickets/cases created during simulations (sanitized if needed).
- Post-exercise review notes and corrective action tracking to closure.
Retention tip: store artifacts per exercise “run” in a consistent folder structure and name convention. Audits fail when evidence exists but is scattered.
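A naming convention only works if it is applied mechanically. One way is to derive every evidence path from the same function; the layout below is an example convention, not a mandated structure.

```python
from pathlib import PurePosixPath

def evidence_path(year: int, scenario_id: str, version: str,
                  run_date: str) -> PurePosixPath:
    """Build a consistent, sortable folder path for one exercise run's
    evidence. The layout is an illustrative convention."""
    return (PurePosixPath("evidence") / "AT-3(3)" / str(year)
            / f"{scenario_id}_{version}_{run_date}")

p = evidence_path(2025, "PHISH-SIM-01", "v2", "2025-03-04")
print(p)  # evidence/AT-3(3)/2025/PHISH-SIM-01_v2_2025-03-04
```

Because scenario ID, version, and run date are all in the folder name, a reviewer can locate any run's evidence without opening files, which is most of what "assessor-ready" means.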
Common exam/audit questions and hangups
Auditors and assessors commonly ask:
- “Show me a practical exercise and the training objective it reinforces.” 1
- “How do you ensure privileged users get role-specific exercises?”
- “How do you include privacy training exercises, not just security?”
- “Where do you document results, and what changes did you make based on them?”
- “How do contractors and third-party staff complete exercises if they access the system?”
Common hangups:
- “We do phishing tests, so we’re done.” You may still need privacy and role-based exercises beyond phishing.
- Objectives aren’t written down. If objectives live only inside training content, map and extract them into an auditable list.
- No proof of follow-up. Running exercises without documenting learnings and corrective actions often produces findings.
Frequent implementation mistakes (and how to avoid them)
- Mistake: Treating a quiz as the practical exercise.
  Fix: Add an action-based component (reporting, classification in a real tool, escalation drill) that demonstrates behavior.
- Mistake: One exercise for everyone.
  Fix: Define role tiers (general, sensitive-data handlers, privileged/admin, privacy-processing) and assign exercises accordingly.
- Mistake: Exercises don’t match your own policies and workflows.
  Fix: Build scenarios that use your real reporting channels, incident categories, and privacy intake process.
- Mistake: No version control for scenarios.
  Fix: Assign each scenario an ID/version and archive what was used each time. Otherwise you can’t prove what users were tested on.
- Mistake: Evidence is “screenshots on someone’s laptop.”
  Fix: Centralize exports, facilitator notes, and reports in a controlled repository with access control.
Enforcement context and risk implications
Public enforcement actions rarely cite “AT-3(3)” by name, so don’t expect regulators to reference the control directly. The real risk is downstream: poor user response to phishing, mishandled personal data, delayed incident escalation, and inconsistent privacy request routing. When incidents occur, investigators and customers often ask what training you provided and whether it was effective; practical exercises give you defensible proof of operational readiness aligned to your stated objectives. 1
A practical 30/60/90-day execution plan
First 30 days (stand up the control)
- Name the AT-3(3) owner and privacy co-owner; document scope and role groups.
- Extract and rewrite training objectives into observable behaviors.
- Draft the exercise catalog with a small starter set covering: phishing/reporting, incident escalation, data handling, and one privacy workflow scenario.
- Decide where evidence will live and standardize naming conventions.
Days 31–60 (run and prove)
- Pilot at least one exercise per role tier; capture artifacts and lessons learned.
- Validate workflows: reporting button, ticket categories, privacy intake routing, escalation paths.
- Create the assessor-ready narrative and an audit packet template (what you will hand over on request).
Days 61–90 (operationalize and improve)
- Expand coverage to remaining teams and contractors in scope.
- Add targeted remediation paths for repeat issues (micro-training, manager coaching, just-in-time prompts).
- Establish a recurring review with Security + Privacy to approve scenario updates and track corrective actions to closure.
- If evidence collection is inconsistent, move the control into a GRC system (or Daydream) as the system of record for mappings and artifacts.
Frequently Asked Questions
Does a phishing simulation satisfy the AT-3(3) practical exercises requirement by itself?
Often it satisfies part of it for general users because it requires action and reinforces training objectives around identifying and reporting suspicious messages. You may still need additional exercises for privacy workflows and privileged roles to show role-based reinforcement. 1
What counts as “privacy training” practical exercises?
Run scenarios where staff must correctly route or handle personal data events, such as a misdirected email containing personal data or a simulated privacy request intake. Tie each scenario back to a documented objective and keep run evidence. 1
How do we show that exercises “reinforce training objectives”?
Maintain a mapping table: objective → exercise ID → expected behavior → measurement → remediation. In audits, that mapping plus run logs and post-exercise reviews usually resolves the question quickly. 1
Do we need role-based exercises for every job title?
No. Group roles into a small number of tiers based on risk and access, then assign exercises to each tier. Document the rationale for the grouping so assessors see it is intentional.
Our LMS tracks training completion. What evidence proves the “practical” part?
Keep evidence of the action-based exercise: simulation platform exports, attendance logs for tabletops, tickets/cases created, facilitator notes, and corrective actions. LMS completion alone rarely proves practice. 1
How should we handle third-party personnel who access our systems?
Treat them as in-scope workforce for the system context: define which exercises they must complete, how you deliver them (your platform or their employer’s), and what evidence you will receive and retain.
Footnotes
1. NIST SP 800-53 Rev. 5, AT-3(3) (OSCAL JSON).
2. NIST SP 800-53 Rev. 5.
Operationalize this requirement
Map requirement text to controls, owners, evidence, and review workflows inside Daydream.
See Daydream