Workforce Skills Assessment
The Workforce Skills Assessment requirement means you must regularly assess whether personnel have the cybersecurity knowledge, skills, and abilities (KSAs) needed for their roles, document the results, and close gaps through targeted training or role changes. To operationalize it quickly, define role-based cybersecurity competencies, assess against them, track gaps to remediation, and retain evidence.
Key takeaways:
- Map cybersecurity KSAs to roles (not just “annual training”) and assess people against the map.
- Treat gaps like risk findings: assign owners, due dates, remediation actions, and re-assess.
- Keep audit-ready artifacts: role profiles, assessment results, training plans, completion evidence, and effectiveness checks.
A workforce skills assessment is an operational control, not an HR formality. Under C2M2’s Workforce Management domain, the expectation is that you know whether the people who design, run, secure, and recover your systems can actually do the cybersecurity tasks your environment demands, and that you address deficiencies. The simplest way to fail this requirement is to confuse “everyone took security awareness training” with “the incident responder can investigate, contain, and eradicate” or “the OT engineer understands secure remote access paths.”
For a CCO, GRC lead, or Compliance Officer, the goal is speed to repeatability: a lightweight, role-based skills model; a credible assessment method; and a closed-loop process that turns gaps into training, mentoring, hiring, or changes in responsibilities. You also need evidence that stands up in internal audit, regulator discussions, and customer due diligence.
This page gives requirement-level guidance you can implement immediately: who to include, how to scope roles, how to assess, what artifacts to keep, and the common audit hangups that slow teams down.
Regulatory text
Requirement (C2M2 WORKFORCE-1.E, MIL2): “The cybersecurity knowledge, skills, and abilities of personnel are assessed and addressed.” 1
What the operator must do:
You must (1) assess cybersecurity KSAs for relevant personnel and (2) take action to address gaps. “Assessed” implies a method that produces results you can explain and repeat. “Addressed” implies documented remediation actions and follow-through, not a one-time acknowledgement.
Plain-English interpretation (what this requires in practice)
A “workforce skills assessment requirement” means:
- You define what “capable” looks like for each cybersecurity-relevant role in your environment.
- You evaluate current people against those expectations using a defensible method (manager attestation alone is rarely enough).
- You capture gaps and treat them like operational risk: assign an owner, pick a remediation path, track completion, and confirm improvement.
This is not limited to the security team. If a role can materially change cybersecurity outcomes (for example, OT operations, IT admins, network engineering, IAM, application release management, incident response, third-party access administration), it belongs in scope.
Who it applies to (entity and operational context)
Entity types: Energy sector organizations and critical infrastructure operators 1
Operational context (who to include):
- Cybersecurity function: SOC, incident response, threat intel, GRC, security engineering/architecture.
- IT operations with privileged access: system administrators, network admins, cloud platform admins, IAM admins.
- OT/ICS roles (where applicable): control engineers, OT support, SCADA/EMS/DMS operators, OT network/remote access admins.
- Software and change roles: application owners, DevOps/release managers, vulnerability remediation coordinators.
- Third parties with operational security impact: managed service providers, incident response retainers, OT integrators, and contractors with access to sensitive environments.
A common scoping approach is to include any role that:
- administers systems, identities, or security tools,
- can introduce configuration changes, or
- participates in incident response and recovery.
What you actually need to do (step-by-step)
Step 1: Set the boundary and ownership
- Assign a control owner (often Security GRC) and operational co-owners (security leadership, OT leadership, IT operations leadership).
- Define “in-scope personnel” categories (employees and third parties) and document inclusion rules.
- Decide where the system of record lives (GRC tool, HRIS plus GRC, or a controlled repository).
Deliverable: Skills Assessment Procedure (one to two pages) describing scope, roles, assessment cadence (event-driven plus periodic), and evidence retention.
Step 2: Build role-based cybersecurity competency profiles
Create role profiles with:
- Role purpose and access level (especially privileged access and production access).
- Core cybersecurity tasks the role performs or influences.
- Required KSAs tied to those tasks (knowledge, skills, abilities).
- Proficiency levels (for example: basic, working, advanced) defined in behavioral terms.
Keep it practical. A strong role profile reads like: “Can review firewall rule changes for least privilege,” not “Understands network security.”
Deliverable: Role-to-competency matrix (spreadsheet is fine) approved by IT/OT/security leaders.
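A spreadsheet works, but the matrix is also easy to keep as structured data that downstream tracking can consume. A minimal sketch; the role names, competency statements, and proficiency labels below are illustrative, not prescribed by C2M2:

```python
from dataclasses import dataclass, field

# Proficiency levels defined in behavioral terms (illustrative labels)
LEVELS = ("basic", "working", "advanced")

@dataclass
class Competency:
    name: str            # task-level statement, e.g. a specific review the role performs
    required_level: str  # one of LEVELS

@dataclass
class RoleProfile:
    role: str            # e.g., "Network Admin" (hypothetical)
    access_tier: str     # e.g., "privileged"
    competencies: list[Competency] = field(default_factory=list)

# Hypothetical example; real profiles are approved by IT/OT/security leaders
network_admin = RoleProfile(
    role="Network Admin",
    access_tier="privileged",
    competencies=[
        Competency("Review firewall rule changes for least privilege", "working"),
        Competency("Validate segmentation between IT and OT zones", "basic"),
    ],
)
```

Note that each competency reads like an observable task ("review firewall rule changes for least privilege"), matching the guidance above, rather than a vague topic area.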
Step 3: Choose assessment methods that produce defensible results
Use one or more, depending on role criticality:
- Manager assessment with structured rubric (acceptable if the rubric is specific and managers are trained).
- Hands-on or scenario-based evaluation (best for incident response, privileged admins, OT access pathways).
- Tool-validated checks (for example: can the admin correctly execute backup restore tests; can the analyst triage alerts in the SIEM workflow).
- Credential and training review as supporting evidence, not the only evidence.
Avoid self-attestation as the primary signal for critical roles.
Deliverable: Assessment rubric templates by role family (IT admin, OT operator, SOC analyst, etc.).
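A structured rubric can be expressed as data plus a scoring rule, which makes results consistent across assessors. A sketch under stated assumptions: the criteria, 0–3 scoring scale, and pass threshold are illustrative choices, not requirements from C2M2:

```python
# A structured rubric as data: criteria with behavioral anchors and a pass bar.
# Criteria and thresholds are illustrative examples for a SOC analyst role family.
RUBRIC = {
    "role_family": "SOC analyst",
    "criteria": {
        "alert_triage": "Correctly prioritizes and escalates a sample alert set",
        "containment": "Selects an appropriate containment step for a scenario",
    },
    "pass_threshold": 2,  # minimum score (on a 0-3 scale) per criterion to pass
}

def rubric_result(scores: dict[str, int]) -> str:
    """Map per-criterion scores to the meets / partially meets / does not meet scale."""
    passing = [s >= RUBRIC["pass_threshold"] for s in scores.values()]
    if all(passing):
        return "meets"
    if any(passing):
        return "partially meets"
    return "does not meet"
```

Because the rubric is data, two trained assessors scoring the same observed performance land on the same outcome, which is the "defensible results" property this step is after.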
Step 4: Run the assessment and record results in a consistent format
For each in-scope person, record:
- Role, team, environment (IT/OT/cloud), access tier.
- Assessment date, assessor, method(s) used.
- Results by competency area and overall decision (meets / partially meets / does not meet).
- Notes on key gaps with examples (what was observed).
Deliverable: Completed assessment records with sign-off by the assessor and the role owner.
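The record fields above map directly onto a simple schema. A minimal sketch (field names and the conservative roll-up rule are assumptions to adapt to your procedure):

```python
from dataclasses import dataclass
from datetime import date

# Outcome labels from the procedure, ordered best to worst
OUTCOMES = ("meets", "partially meets", "does not meet")

@dataclass
class AssessmentRecord:
    person: str
    role: str
    environment: str        # "IT", "OT", or "cloud"
    access_tier: str
    assessed_on: date
    assessor: str
    methods: list[str]      # e.g., ["rubric", "scenario"]
    results: dict[str, str] # competency area -> outcome

    def overall(self) -> str:
        """Worst single competency outcome drives the overall decision
        (a conservative roll-up rule; adjust to your own procedure)."""
        order = {o: i for i, o in enumerate(OUTCOMES)}
        return max(self.results.values(), key=lambda o: order[o])

# Hypothetical example record
record = AssessmentRecord(
    person="J. Doe", role="Network Admin", environment="IT",
    access_tier="privileged", assessed_on=date(2024, 5, 1),
    assessor="Security GRC", methods=["rubric", "scenario"],
    results={
        "Firewall rule review": "meets",
        "Segmentation validation": "partially meets",
    },
)
```

Here one "partially meets" pulls the overall decision down, which keeps partial gaps visible instead of averaging them away.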
Step 5: Convert gaps into tracked remediation actions
For each gap, pick the remediation path:
- Targeted training (internal or external), with learning objectives tied to the gap.
- Mentorship / supervised practice with a qualified peer.
- Job aids / standard operating procedures to reduce error likelihood.
- Reduced access or changed responsibilities until competence is demonstrated.
- Hiring or contracting for missing competencies where training is not timely.
Track each action with an owner, due date, and acceptance criteria (what “fixed” looks like). If your program uses Daydream for third-party risk workflows, treat third-party staff gaps the same way: log the finding against the third party engagement, require a remediation plan, and retain the evidence alongside access approvals and contract artifacts.
Deliverable: Workforce Skills Remediation Log connected to your risk register or control tracking.
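The remediation log entries can carry the owner, due date, and acceptance criteria as required fields, with closure gated on confirmation. A sketch, assuming your procedure requires a passing re-assessment before closure (field names are illustrative):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RemediationAction:
    gap: str
    owner: str
    due: date
    path: str          # "training", "mentoring", "job aid", "access change", "hiring"
    acceptance: str    # what "fixed" looks like, in observable terms
    closed: bool = False
    closure_note: str = ""

    def close(self, reassessment_passed: bool, approver: str) -> None:
        """Close only with confirmation: 'addressed' ends with a passing re-assessment."""
        if not reassessment_passed:
            raise ValueError("Cannot close without a passing re-assessment")
        self.closed = True
        self.closure_note = f"Closed by {approver} on {date.today().isoformat()}"
```

Refusing closure without a passing re-assessment is what turns the log into evidence of follow-through rather than a list of intentions.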
Step 6: Re-assess and confirm effectiveness
“Addressed” should end with confirmation:
- Re-assess the specific competency (mini-assessment) after remediation.
- Confirm performance in operational outcomes (for example, correct execution of a recovery step in a tabletop or live test).
- Record closure rationale and who approved closure.
Deliverable: Remediation closure evidence and updated assessment status.
Step 7: Make it continuous (event-driven triggers)
Add triggers so assessments occur when risk changes:
- New hire into an in-scope role.
- Role change into privileged access.
- Major technology change (new OT remote access path, new SIEM, cloud migration).
- Post-incident or after a near-miss indicating a skill gap.
Deliverable: Trigger list embedded in onboarding, access requests, and change management workflows.
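Embedding the triggers into workflows can be as simple as a lookup that onboarding, access request, and change management hooks all call. A minimal sketch; the event names and queue shape are assumptions:

```python
# Event-driven triggers from the procedure (event names are illustrative)
TRIGGERS = {
    "new_hire_in_scope_role",
    "role_change_to_privileged",
    "major_technology_change",
    "post_incident_skill_gap",
}

def assessment_required(event: str) -> bool:
    """Return True when a workflow event should open a skills assessment task."""
    return event in TRIGGERS

def on_workflow_event(event: str, person: str, queue: list) -> None:
    """Hook for onboarding/access/change workflows: queue an assessment when triggered."""
    if assessment_required(event):
        queue.append({"person": person, "reason": event})
```

Keeping one shared trigger set means adding a new risk event (say, a new OT remote access path rollout) updates every workflow at once.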
Required evidence and artifacts to retain
Auditors and assessors typically want proof of all three: defined expectations, performed assessments, and closed gaps. Retain:
- Workforce Skills Assessment policy/procedure aligned to the requirement 1
- In-scope role inventory (including third parties with access)
- Role-based competency profiles and rubrics
- Completed assessment records 1
- Remediation plans and training/mentoring assignments
- Training completion evidence (certificates, LMS transcripts) mapped to specific gaps
- Re-assessment results and closure approvals
- Exceptions documentation (who is excepted, compensating controls, expiry/renewal conditions)
- Governance artifacts (reviews, approvals, reporting to leadership)
Common exam/audit questions and hangups
Expect questions like:
- “Show me how you decide which roles are in scope.”
- “What KSAs are required for a privileged administrator in your environment?”
- “How do you know training fixed the gap?”
- “How do you treat contractors and managed service provider staff?”
- “Where is this tracked, and how do you prevent it from becoming stale?”
Common hangups:
- No consistent rubric; results depend on who assesses.
- Gaps are identified but never tracked to closure.
- Third parties are excluded even though they have meaningful access.
Frequent implementation mistakes (and how to avoid them)
- Mistake: Treating annual awareness training as the assessment. Fix: Separate awareness (baseline) from role-based KSAs (operational competence).
- Mistake: Building an over-engineered competency library that no one maintains. Fix: Start with the roles that matter most (privileged access and incident response), then expand.
- Mistake: No linkage between gaps and risk decisions. Fix: For high-impact roles, require remediation or compensating controls before maintaining privileged access.
- Mistake: Ignoring OT/ICS nuances. Fix: Define OT-specific competencies (remote access control, safety constraints, segmented networks, vendor integrator access paths) and assess the people who run them.
- Mistake: Excluding third parties because “HR doesn’t own them.” Fix: Treat third-party personnel competence as part of engagement risk. Capture contractual requirements and evidence requests in the same place you track access approvals.
Enforcement context and risk implications
No public enforcement cases were provided in the source catalog for this requirement. Practically, skills gaps translate into predictable failure modes: misconfigured access, delayed detection, poor containment, unsafe OT changes, and incomplete recovery. In critical infrastructure, those failures can become reportable incidents, safety events, or reliability issues. Your control narrative should connect the skills assessment directly to preventing these operational outcomes.
Practical 30/60/90-day execution plan
First 30 days: Stand up the minimum viable skills assessment
- Name the owner(s), define in-scope roles, and publish a short procedure.
- Create role profiles for the highest-risk roles (privileged access and incident response first).
- Draft rubrics and run a pilot with one team.
- Set up the tracking mechanism (GRC system, controlled spreadsheet, or ticket workflow).
Days 31–60: Assess, remediate, and prove closure
- Complete assessments for the initial in-scope population.
- Open remediation actions for every material gap with owners and acceptance criteria.
- Add triggers to onboarding, access provisioning, and change management so the process repeats automatically.
- Produce a simple leadership report: coverage, top gaps by role, remediation status.
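The leadership report in the last bullet can be generated straight from the assessment records. A minimal sketch, assuming each record carries `role` and `overall` fields like those defined in Step 4:

```python
from collections import Counter

def leadership_report(records: list[dict]) -> dict:
    """Summarize coverage, outcomes, and top gap roles from assessment records.
    Each record is assumed to have 'role' and 'overall' keys (illustrative schema)."""
    gaps = Counter(r["role"] for r in records if r["overall"] != "meets")
    return {
        "coverage": len(records),                                    # people assessed
        "meets": sum(1 for r in records if r["overall"] == "meets"), # fully competent
        "top_gap_roles": gaps.most_common(3),                        # gaps by role
    }
```

Running this over the same records used for audit evidence keeps the leadership view and the audit trail from drifting apart.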
Days 61–90: Expand scope and harden evidence
- Expand to remaining cybersecurity-relevant roles, including OT/ICS (where applicable).
- Add third-party personnel handling: evidence requests, contract language alignment, and access gating.
- Run re-assessments for remediated gaps and document closure decisions.
- Prepare an audit-ready evidence package: role matrix, assessment sample set, remediation log, and effectiveness proof.
Frequently Asked Questions
Do we have to assess every employee?
No. Scope should cover personnel whose work can materially affect cybersecurity outcomes, especially privileged access, security operations, and OT/ICS roles. Document the scoping rationale so you can defend why some roles are out of scope.
Can we rely on certifications and annual training as proof of skills?
Use certifications and training transcripts as supporting evidence, but keep a role-based assessment method that shows the person can perform required tasks. Auditors usually ask how you validated competence beyond course completion.
How do we handle third-party staff who access our environment?
Treat them as in-scope personnel for assessment purposes if they have access or perform cybersecurity-relevant tasks. Require role-appropriate evidence in contracting and onboarding, and track gaps as engagement risks with remediation actions.
What cadence is expected for reassessments?
C2M2 requires that skills are assessed and addressed, but it does not prescribe a fixed interval in the provided excerpt. Use event-driven triggers (role/access changes, incidents, major technology changes) and add a periodic review practice you can sustain.
What’s the fastest way to start without boiling the ocean?
Start with a small set of high-impact roles and a simple rubric, then iterate. A credible assessment plus a closed remediation loop beats a perfect skills framework that never gets executed.
How should we prove we “addressed” gaps?
Keep a remediation log tied to each gap, show completion evidence (training, mentoring, supervised practice), and record a re-assessment or manager sign-off against defined acceptance criteria. Closure should be reviewable and time-stamped.
Footnotes
Authoritative Sources
Operationalize this requirement
Map requirement text to controls, owners, evidence, and review workflows inside Daydream.
See Daydream