SA-15(8): Reuse of Threat and Vulnerability Information
SA-15(8) requires you to ensure developers reuse existing threat models and vulnerability analyses from similar systems, components, or services, and to prove those inputs changed the current design or build work. Operationalize it by defining “similar,” creating an intake-and-trace workflow in your SDLC, and retaining evidence that reused findings were reviewed, mapped, and addressed. 1
Key takeaways:
- Define a repeatable method to identify “similar systems” and collect prior threat/vulnerability artifacts before design decisions harden.
- Make reuse auditable: trace reused threats and known weaknesses to requirements, stories, mitigations, and test results.
- Assessors look for evidence of reuse and outcomes, not a statement that teams “considered” prior analyses. 2
The SA-15(8) Reuse of Threat and Vulnerability Information requirement is a build-time control. It is triggered whenever your organization (or a third party building for you) develops a system, system component, or system service, and it expects the development team to start from what is already known about similar technologies. The goal is straightforward: don’t re-learn old lessons the hard way.
For a Compliance Officer, CCO, or GRC lead, the practical problem is usually not “how to threat model.” It is how to force reuse into the SDLC with enough structure that engineering follows it and auditors can validate it. SA-15(8) is satisfied when you can show (1) you intentionally sourced prior threat models and vulnerability analyses from similar work, (2) you reviewed and adapted them, and (3) those inputs informed design, implementation, and verification decisions.
This page gives requirement-level guidance you can drop into your control narrative, SDLC gates, and third-party development requirements. It emphasizes evidence: what to retain, how to demonstrate traceability, and how to avoid the common failure mode of having “some security docs” with no linkage to the actual build.
Regulatory text
Requirement (verbatim excerpt): “Require the developer of the system, system component, or system service to use threat modeling and vulnerability analyses from similar systems, components, or services to inform the current development process.” 1
Operator meaning (what you must do):
- You must impose an obligation on developers (internal teams and/or third parties developing for you) to reuse existing threat and vulnerability knowledge, not start from zero each time.
- “Use … to inform” means the reused information must drive decisions: requirements, architecture choices, mitigations, tests, and acceptance criteria.
- “Similar systems, components, or services” requires you to define similarity in a way that is consistent and reviewable across projects (example criteria below). 2
Plain-English interpretation (what SA-15(8) is really testing)
Assessors are testing whether your development process learns from:
- prior threat models (data flows, trust boundaries, misuse cases), and
- prior vulnerability analyses (code review findings, SAST/DAST results, penetration test reports, bug bounty trends, dependency weaknesses)
…for comparable technology stacks or architectures.
If your team builds an API gateway, a mobile app, a SaaS multi-tenant service, or an identity integration, you should be pulling in known threat patterns and common vulnerability classes from earlier, similar work. SA-15(8) expects that reuse to happen early enough to influence design and backlog priorities, not as a post-release retrospective. 1
Who it applies to
Entity scope
- Federal information systems and programs that align to NIST SP 800-53.
- Contractors and other third parties building or operating systems handling federal data, where NIST SP 800-53 flows down through contracts, ATO packages, or customer requirements. 2
Operational scope (where it shows up)
- New system development (greenfield).
- Major enhancements, refactors, re-platforming, or adopting a new shared service.
- Developing or customizing system components: authentication modules, CI/CD templates, IaC modules, encryption libraries, API layers.
- Third-party-developed software delivered to you, including productized software where you negotiate SDLC expectations through security addenda, SOW language, or supplier assurance.
What you actually need to do (step-by-step)
1) Assign ownership and define the “similarity” standard
Control owner: usually AppSec/secure engineering; GRC owns oversight and evidence readiness.
Create a short standard that defines “similar systems/components/services.” Keep it usable:
- Same architecture pattern (multi-tenant SaaS, event-driven pipeline, client-server mobile backend)
- Same technology stack (language, framework, cloud services)
- Same data classification and trust boundaries (PII flows, privileged admin paths, internet exposure)
- Same integration type (OIDC/SAML SSO, payment processing, SCIM provisioning)
Deliverable: a one-page “Similarity Criteria” SOP and a required checklist in your SDLC intake.
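The similarity determination can be made repeatable with a small checklist. A minimal sketch in Python, where the criteria names and the two-of-four threshold are illustrative assumptions, not mandated by SA-15(8):

```python
from dataclasses import dataclass

# Hypothetical similarity checklist; field names mirror the example
# criteria above, and the match threshold is a policy decision.
@dataclass
class SimilarityCheck:
    same_architecture_pattern: bool  # multi-tenant SaaS, event-driven pipeline, ...
    same_technology_stack: bool      # language, framework, cloud services
    same_data_classification: bool   # PII flows, privileged paths, exposure
    same_integration_type: bool      # OIDC/SAML SSO, payments, SCIM

    def is_similar(self, minimum_matches: int = 2) -> bool:
        """A candidate system counts as 'similar' when it matches at
        least `minimum_matches` criteria."""
        matches = sum([self.same_architecture_pattern,
                       self.same_technology_stack,
                       self.same_data_classification,
                       self.same_integration_type])
        return matches >= minimum_matches

# Usage: an API gateway on the same stack with the same exposure profile
check = SimilarityCheck(True, True, True, False)
print(check.is_similar())
```

Encoding the threshold in one place keeps the determination consistent across teams and gives assessors a concrete rule to review.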
2) Create a reusable artifact library (and make it searchable)
SA-15(8) fails in practice when artifacts exist but cannot be found. Build a curated repository:
- Threat model templates and prior threat models (diagrams + threat register)
- Vulnerability analysis reports (SAST/DAST summaries, pen test exec summaries, dependency risk reviews)
- “Known issues by pattern” sheets (top recurring auth flaws, injection surfaces, misconfig classes)
Operational requirements:
- Tag artifacts by stack, architecture, exposure, and service type.
- Ensure access controls allow developers and reviewers to retrieve artifacts.
- Version artifacts; keep “effective date” and context notes.
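A tag-based index is what makes the library searchable in practice. A minimal sketch, assuming a hypothetical tag vocabulary and index entry shape:

```python
from dataclasses import dataclass, field

# Illustrative index entry; the tag vocabulary (stack, architecture,
# exposure, service type) follows the operational requirements above.
@dataclass
class LibraryArtifact:
    name: str
    kind: str                # "threat-model" | "vuln-analysis" | "known-issues"
    version: str
    effective_date: str
    tags: set[str] = field(default_factory=set)

def find_artifacts(index, required_tags):
    """Return artifacts whose tags include every required tag."""
    return [a for a in index if required_tags <= a.tags]

index = [
    LibraryArtifact("gateway-threat-model", "threat-model", "2.1", "2024-03-01",
                    {"python", "api-gateway", "internet-facing"}),
    LibraryArtifact("mobile-pen-test", "vuln-analysis", "1.0", "2023-11-15",
                    {"kotlin", "mobile", "oidc"}),
]

hits = find_artifacts(index, {"api-gateway", "internet-facing"})
print([a.name for a in hits])
```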
3) Add an SDLC gate: “Reuse intake” before design approval
Insert a mandatory step into your SDLC (or your change intake for major work):
- Identify candidate similar systems
- Pull their threat model + vulnerability analysis artifacts
- Extract relevant threats/weaknesses into a “Reused Findings Register”
Minimum fields for the register:
- Source artifact name/version/date
- Why it is similar (mapped to your similarity criteria)
- Reused threats/weaknesses (normalized wording)
- Impacted component(s) in the current build
- Required action (mitigation, design change, test case, acceptance criteria)
- Owner and status
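One row of the Reused Findings Register can be sketched directly from the minimum fields above. The field names and example values are illustrative:

```python
from dataclasses import dataclass

# One register row; fields mirror the minimums listed above.
@dataclass
class ReusedFinding:
    source_artifact: str        # name/version/date of the prior artifact
    similarity_rationale: str   # mapped to the similarity criteria
    threat_or_weakness: str     # normalized wording
    impacted_component: str     # component in the current build
    required_action: str        # mitigation, design change, test case, ...
    owner: str
    status: str                 # e.g., "open" | "in-progress" | "closed"

finding = ReusedFinding(
    source_artifact="gateway-threat-model v2.1 (2024-03-01)",
    similarity_rationale="same architecture pattern + internet exposure",
    threat_or_weakness="JWT validation bypass via unsigned tokens",
    impacted_component="auth middleware",
    required_action="add negative tests for unsigned-token handling",
    owner="appsec-lead",
    status="open",
)
print(finding.status)
```

Whether the register lives in a spreadsheet, a ticketing system, or a GRC tool matters less than keeping these fields consistent across projects.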
This is the core evidence for “use … to inform.” 1
4) Prove traceability from reused info to engineering action
Auditors will ask: “Show me where this was addressed.” Build lightweight trace links:
- Reused finding → security requirement (or control) → user story/epic → PR/commit or configuration change → test evidence
Examples of acceptable “informed development” outcomes:
- A reused threat leads to an architecture change (e.g., isolate admin plane, add service-to-service auth).
- A reused vulnerability trend leads to a new secure coding rule or code scanning policy for that repo.
- A reused pen test finding type leads to added negative tests or DAST rules.
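The trace chain can be checked mechanically before an assessment. A minimal sketch, assuming hypothetical link-type names that follow the chain above (requirement, work item, change, test evidence):

```python
# A reused finding passes only when every link in the chain is present.
REQUIRED_LINKS = ("requirement", "work_item", "change", "test_evidence")

def trace_gaps(finding_links: dict) -> list:
    """Return the link types missing from a finding's trace chain."""
    return [link for link in REQUIRED_LINKS if not finding_links.get(link)]

links = {
    "requirement": "SEC-REQ-42",
    "work_item": "JIRA-1187",
    "change": "PR #310",
    "test_evidence": None,   # no test linked yet: the chain is incomplete
}
print(trace_gaps(links))
```

Running a check like this per release surfaces incomplete chains before an auditor does.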
5) Flow down SA-15(8) to third-party developers
If a third party develops a system component or service for you, add contract language requiring:
- reuse of threat models and vulnerability analyses from similar solutions (their own or yours), and
- delivery of the reused findings register with traceability to mitigations or test results.
Keep it outcome-based. You want evidence you can audit, not a promise of “secure development.” 2
6) Add a review and exception process
Not every prior artifact will apply. That is fine if you document:
- why specific threats/weaknesses were not relevant, and
- what alternative analysis was performed.
Create a short exception form:
- Artifact considered
- Reason not reused
- Compensating analysis performed
- Approver (AppSec lead) and date
7) Operationalize reporting for assessment readiness
For a given release, you should be able to produce within a reasonable time:
- the similarity determination,
- the reused findings register,
- the trace links to tickets/PRs/tests,
- any exceptions and approvals.
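Assembling that packet can be scripted so gaps surface early. A minimal sketch, with hypothetical item names matching the four evidence items above:

```python
# Hypothetical audit-packet assembly: collect the four evidence items
# and flag anything missing before an assessment.
def audit_packet(project: dict) -> dict:
    required = ("similarity_determination", "reused_findings_register",
                "trace_links", "exceptions")
    packet = {item: project.get(item) for item in required}
    # An empty exceptions list is valid evidence ("no exceptions"), so
    # only a truly absent item (None) counts as missing.
    packet["missing"] = [item for item in required
                         if project.get(item) is None]
    return packet

project = {
    "similarity_determination": "similarity-checklist-v1.pdf",
    "reused_findings_register": "register-2024-Q2.csv",
    "trace_links": ["JIRA-1187", "PR #310"],
    "exceptions": [],
}
print(audit_packet(project)["missing"])
```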
Daydream fits naturally here as the control system of record: map SA-15(8) to a control owner, a standard procedure, and recurring evidence artifacts so you can answer assessors without rebuilding the story each audit cycle. 1
Required evidence and artifacts to retain
Keep evidence tied to each major development effort or release train:
Governance artifacts
- SA-15(8) control narrative (scope, owner, frequency, tools)
- Similarity Criteria SOP and SDLC gate definition
- Third-party contract/SOW security addendum language (when applicable)
Per-project artifacts
- Similarity assessment checklist (completed)
- List of “similar systems” reviewed and source links
- Reused Findings Register (with status and owners)
- Design review notes showing decisions influenced by reused info
- Ticketing exports or links (stories/epics) mapping to reused findings
- Test evidence (security test cases, scan results relevant to reused issues)
- Exceptions with approvals and compensating analysis
Repository/library artifacts
- Threat model and vulnerability analysis library index (tags + dates + owners)
- Evidence of periodic curation (ownership, updates, deprecation notes)
Common exam/audit questions and hangups
Expect questions like:
- “How do you define ‘similar’ and who approves the determination?” 2
- “Show an example where you reused a threat model from another service and it changed requirements.”
- “Where is the mapping from reused vulnerabilities to test coverage or mitigations?”
- “How do you ensure third-party developers follow this requirement?”
- “What happens if no similar system exists?”
Common hangups:
- Threat models exist, but they are created after build decisions. SA-15(8) is about informing development.
- Vulnerability analyses exist as scan outputs, but there is no normalized register tying recurring issues to prevention work.
Frequent implementation mistakes (and how to avoid them)
- Mistake: Treating reuse as optional reading.
  Fix: Make reuse intake a formal SDLC gate with a required register and reviewer sign-off.
- Mistake: Reusing a template, not the actual findings.
  Fix: Require extraction of concrete threats/weaknesses from prior artifacts into the current register, with trace links to work items.
- Mistake: No definition of “similar.”
  Fix: Publish similarity criteria; make it a checklist so it is consistent across teams.
- Mistake: Evidence scattered across tools with no narrative.
  Fix: Maintain a single “audit packet” per project (even if it just links to Jira, Git, and diagrams) and store it in your GRC system.
- Mistake: Ignoring third-party development.
  Fix: Flow down requirements contractually and require delivery of the reused findings register as a project deliverable. 1
Enforcement context and risk implications
No public enforcement cases were provided in the source catalog for this requirement. Practically, your risk is assessment failure or adverse findings in authorization, customer audits, or supplier assurance reviews if you cannot prove that prior threat/vulnerability knowledge changed the current build decisions. The operational risk is predictable: repeated security defects across similar services because lessons learned do not transfer.
Practical 30/60/90-day execution plan
Treat the 30/60/90 markers as phases rather than promises about calendar time to avoid false precision; move faster if you have an established SDLC and AppSec function.
First 30 days (Immediate)
- Assign a control owner and reviewers (AppSec + engineering).
- Publish the “Similarity Criteria” SOP and a one-page reuse intake checklist.
- Stand up the artifact library structure (even if it starts as a curated folder + index).
- Pilot on one active project: produce a reused findings register and trace at least a few items through remediation and tests.
Next 60 days (Near-term)
- Add the reuse intake gate into your SDLC tooling (templates in Jira/Azure DevOps, PR checklist, architecture review form).
- Define minimum required artifacts per release and a standard “audit packet” layout.
- Update third-party development templates (SOW/security addendum) to require reuse and evidence delivery.
- Train engineering leads and TPMs on “how to pass SA-15(8) in practice” using one internal example.
By 90 days (Operationalize)
- Expand to all in-scope projects (new systems and major changes).
- Implement a light curation process for the library (ownership, tagging, deprecation).
- Add management reporting: projects in scope, reuse gate completion, open reused findings, exceptions.
- Store recurring evidence in Daydream (or your GRC system) so assessments pull from a single record of control operation. 2
Frequently Asked Questions
What counts as “threat modeling and vulnerability analyses” for SA-15(8)?
Threat models include data flow diagrams, trust boundaries, and a threat register; vulnerability analyses include scan findings, penetration test reports, and documented weakness reviews. SA-15(8) cares that you reused prior outputs from similar systems and showed how they shaped the current build. 1
If we build something “new,” can we say there are no similar systems?
Rarely. Similarity can be based on architecture pattern, tech stack, exposure, or data flows, even if the business feature is new. If you truly have no comparable internal system, document external analogs and perform a fresh analysis with an exception rationale for reuse gaps. 2
Do we have to reuse artifacts created by other teams, or can we reuse our own prior work?
Either is fine. The requirement is to reuse threat and vulnerability information from similar systems/components/services, regardless of which internal group produced it, and to apply it to the current development process. 1
How do we show that reuse “informed” development in a way an auditor accepts?
Keep a reused findings register and trace each reused item to a requirement, a work item, and verification evidence (tests or scan outcomes). A meeting note that “we reviewed past threats” without traceable changes usually fails scrutiny.
What is the minimum evidence package per project?
A completed similarity checklist, links to the source artifacts reviewed, the reused findings register with dispositions, and trace links to tickets/PRs/tests. Add exceptions only where you decide not to reuse a relevant prior finding.
How should we handle third-party developers?
Put SA-15(8) expectations into the contract/SOW and require delivery of the reused findings register plus traceability to mitigations and test results. If the third party will not share full artifacts, require enough detail to validate the reuse decision and outcomes. 2
Footnotes
1. NIST SP 800-53 Rev. 5 OSCAL JSON
2. NIST SP 800-53 Rev. 5
Operationalize this requirement
Map requirement text to controls, owners, evidence, and review workflows inside Daydream.
See Daydream