SA-11(9): Interactive Application Security Testing
To meet the SA-11(9) (Interactive Application Security Testing) requirement, you must require your developers (including third parties building components or services) to run IAST tools during application testing to find security flaws and to document the results and remediation. Operationalize it by embedding IAST into CI/CD, defining coverage and severity handling, and retaining auditable evidence. 1
Key takeaways:
- You need a documented requirement that developers perform IAST and record findings and outcomes. 1
- “Done” includes evidence: tool configuration, run records, findings, triage decisions, and remediation verification. 1
- Treat IAST as an SDLC gate tied to releases for in-scope systems and components, including those built by third parties. 2
SA-11(9) sits in the System and Services Acquisition (SA) family and is aimed at a specific, common gap: teams say they “test security,” but can’t show repeatable, developer-executed testing that produces findings, drives fixes, and leaves a paper trail. The control enhancement is narrow and practical: require the developer of the system, component, or service to use interactive application security testing (IAST) tools to identify flaws and document results. 1
For a Compliance Officer, CCO, or GRC lead, the fastest path is to treat SA-11(9) like a build-and-release control: define what applications are in scope, specify where IAST runs (pipelines, test environments, pre-release), define what “acceptable results” means, and make the evidence easy to produce during an assessment. You are not trying to prove that IAST finds every bug. You are proving you required it, it ran, findings were handled, and outcomes were recorded. 1
This page gives requirement-level guidance you can hand to engineering and a control owner, then audit with minimal friction.
Requirement summary (SA-11(9))
Control intent: Developers must run IAST tools to identify application flaws and must document the results. 1
Operator outcome: You can show (1) a clear developer requirement, (2) IAST execution as part of testing, and (3) documented findings and disposition that tie to remediation and verification. 2
Regulatory text
“Require the developer of the system, system component, or system service to employ interactive application security testing tools to identify flaws and document the results.” 1
What this means in plain English
- “Require the developer”: This is not optional guidance. Your SDLC or supplier requirements must obligate the party doing development work to run IAST and keep results. That includes internal teams and third parties delivering code, components, or services. 1
- “Employ IAST tools”: You need an actual IAST tool (commercial or open source where applicable) used in a way that exercises the application while it runs and detects security-relevant issues during testing. 2
- “Identify flaws”: The objective is detection of security weaknesses (for example, injection paths, insecure deserialization patterns, auth/session handling errors) surfaced as findings that can be triaged. 2
- “Document the results”: Your testing cannot be “tribal knowledge.” Results must be recorded, retained, and traceable to the build/release or test run. 1
Who it applies to
Entity scope
- Federal information systems implementing NIST SP 800-53 controls. 2
- Contractor systems handling federal data where NIST SP 800-53 is flowed down contractually or used as the governing control set. 2
Operational scope (what you should include)
Treat SA-11(9) as applying anywhere you have a “developer” producing or changing running code:
- Custom applications (web, mobile, APIs).
- Microservices and internal services that handle sensitive data.
- COTS/SaaS extensions or custom modules where you write code.
- Third-party developed components delivered to you (including outsourced dev and system integrators) when you have contractual ability to require testing artifacts. 2
How IAST fits with other security testing (so you scope it correctly)
IAST is one method inside broader “developer testing and evaluation” (SA-11). Don’t substitute IAST for everything else; position it as one layer:
- SAST finds issues in code without running it.
- DAST probes a running app from the outside.
- IAST instruments the app during functional testing to observe risky behavior with better context than DAST in many cases.
Your control story: “We require IAST during testing, and we record results as part of our SDLC security testing evidence.” 2
What you actually need to do (step-by-step)
Step 1: Assign ownership and define the control boundary
- Name a control owner (often AppSec or Secure SDLC lead) and an evidence owner (often DevSecOps or GRC).
- Define in-scope systems/components (start with systems handling federal data or high-impact functions). 2
Step 2: Write the developer requirement (policy + engineering standard)
Create a short, enforceable statement in your Secure SDLC standard:
- IAST must run for in-scope applications during testing.
- Findings must be recorded in the system of record (ticketing or vulnerability management).
- Releases require documented disposition (fixed, accepted risk, false positive with rationale). 1
For third parties, add the same requirement to:
- Statements of Work (SOWs) for development.
- Secure development addenda.
- Delivery acceptance criteria (deliver IAST results with release notes). 2
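The "documented disposition per release" rule above can be encoded as an automated gate. The sketch below is a minimal, hypothetical example: the finding schema (`id`, `disposition`, `rationale` fields) and the allowed disposition values are assumptions to illustrate the rule, not a standard format from any IAST product.

```python
# Minimal release-gate sketch: a build ships only when every IAST finding
# carries a documented disposition. Field names are illustrative.

ALLOWED_DISPOSITIONS = {"fixed", "accepted_risk", "false_positive"}


def release_allowed(findings: list[dict]) -> tuple[bool, list[str]]:
    """Return (ok, reasons). A finding blocks release if it lacks a
    recognized disposition, or if an accepted risk / false positive
    has no documented rationale."""
    reasons = []
    for f in findings:
        fid = f.get("id", "<unknown>")
        disp = f.get("disposition")
        if disp not in ALLOWED_DISPOSITIONS:
            reasons.append(f"{fid}: missing or unrecognized disposition")
        elif disp in {"false_positive", "accepted_risk"} and not f.get("rationale"):
            reasons.append(f"{fid}: {disp} requires a documented rationale")
    return (not reasons, reasons)
```

A gate like this doubles as evidence: its output log is itself a disposition record tied to the build it evaluated.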
Step 3: Select an IAST operating model that produces auditable output
Pick one approach and document it:
- CI/CD integrated: IAST agent runs during automated integration tests.
- Pre-release testing: IAST runs in a controlled QA environment during regression testing.
- On-demand for high-risk changes: IAST required for auth, payment, PII handling, crypto, and access control changes.
The audit need is consistency: a defined trigger and repeatable evidence. 1
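Whichever model you pick, the trigger should be written down as a deterministic rule rather than left to reviewer judgment. A minimal sketch of the "on-demand for high-risk changes" trigger, assuming a hypothetical repo layout (the path prefixes are placeholders for your own):

```python
# Decide from a change's touched paths whether the pipeline must run the
# IAST stage. Prefixes below are illustrative, not a standard layout.

HIGH_RISK_PREFIXES = (
    "src/auth/", "src/payments/", "src/pii/", "src/crypto/", "src/access/",
)


def iast_required(changed_paths: list[str], always_on: bool = False) -> bool:
    """CI/CD-integrated model: always_on=True runs IAST on every build.
    Otherwise require it only when a high-risk area changed."""
    if always_on:
        return True
    return any(p.startswith(HIGH_RISK_PREFIXES) for p in changed_paths)
```

Because the rule is code, the pipeline log showing which branch it took becomes part of your repeatable evidence.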
Step 4: Define coverage expectations that engineering can meet
Write down what “coverage” means for you so auditors don’t set it for you:
- Which apps/services must have IAST enabled.
- Which environments count (QA/staging).
- What test activity drives IAST observation (integration test suite, manual test scripts).
Keep it concrete and testable: an assessor should be able to pick an app, find the last run, and see results. 2
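The "pick an app, find the last run" check can itself be scripted against your inventory. A sketch under assumed data shapes (an app list plus a mapping of app name to last recorded run date; your system of record will differ):

```python
from datetime import date, timedelta


def coverage_gaps(in_scope_apps, runs, as_of, max_age_days=30):
    """Return apps with no IAST run recorded within max_age_days of as_of.
    'runs' maps app name -> date of last recorded run (illustrative shape)."""
    cutoff = as_of - timedelta(days=max_age_days)
    gaps = []
    for app in in_scope_apps:
        last = runs.get(app)
        if last is None or last < cutoff:
            gaps.append(app)
    return gaps
```

Running this on a schedule gives you the same answer an assessor would reach by sampling, before they do.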
Step 5: Establish triage, remediation, and exception handling
Operationalize findings the same way you do other security defects:
- Severity assignment and routing (AppSec + dev owner).
- Remediation workflow (ticket created, fix merged, retest performed).
- Exception process for false positives and risk acceptance with approval and expiration criteria.
SA-11(9) explicitly requires results to be documented; your exception records are part of that story. 1
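The routing step above can be sketched as a small adapter between IAST output and your system of record. Everything here is an assumption for illustration: `create_ticket` stands in for your tracker's API, the severity-to-SLA table is an example policy, and the dedupe fingerprint is deliberately naive.

```python
# Route IAST findings into a ticketing system of record. create_ticket is
# a stand-in for your tracker's API; SLA days are an example policy.

SEVERITY_SLA_DAYS = {"critical": 7, "high": 30, "medium": 90, "low": 180}


def triage(findings, create_ticket, existing_keys=frozenset()):
    """Create one ticket per new finding, skipping duplicates by a naive
    (rule, location) fingerprint, with the remediation SLA attached."""
    created = []
    for f in findings:
        key = (f["rule_id"], f["location"])
        if key in existing_keys:
            continue
        sla = SEVERITY_SLA_DAYS.get(f.get("severity", "medium").lower(), 90)
        created.append(create_ticket(
            title=f"IAST: {f['rule_id']} at {f['location']}",
            severity=f.get("severity", "medium"),
            due_in_days=sla,
        ))
    return created
```

The point of the adapter is traceability: every finding either becomes a ticket or matches an existing fingerprint, so nothing silently disappears between the tool and the register.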
Step 6: Make it measurable (without inventing metrics)
Define what you will report without promising numbers:
- List of in-scope apps with IAST status (enabled/disabled + reason).
- List of IAST runs per release train (or per sprint).
- Aging view for open findings by severity.
Daydream can help by mapping SA-11(9) to a control owner, a procedure, and recurring evidence artifacts so you can answer audits quickly without rebuilding the narrative each time. 1
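The aging view in the list above is the one report teams most often improvise under audit pressure; it is easy to precompute. A sketch assuming each open finding records a severity and an opened date (field names are illustrative):

```python
from collections import Counter
from datetime import date


def aging_by_severity(open_findings, as_of):
    """Bucket open findings into age bands per severity. Band boundaries
    here are an example policy, not a prescribed standard."""
    bands = [(30, "0-30d"), (90, "31-90d"), (10**6, ">90d")]
    view = Counter()
    for f in open_findings:
        age = (as_of - f["opened"]).days
        band = next(label for limit, label in bands if age <= limit)
        view[(f["severity"], band)] += 1
    return dict(view)
```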
Required evidence and artifacts to retain (audit-ready)
Keep artifacts tied to a specific application and time period. Minimum set:
- Secure SDLC standard / control procedure that requires IAST and documentation. 1
- Tooling proof: IAST tool configuration, agent/instrumentation settings, and scope list (apps/repos/pipelines). 2
- Execution records: pipeline logs, job run screenshots, or exported run reports showing date, app version/build, and status. 1
- Findings register: vulnerabilities created from IAST outputs (tickets/issues) with severity, owner, and timestamps. 1
- Disposition evidence: fix commit/PR link, retest results, closure notes; for exceptions, risk acceptance record and rationale. 2
- Third-party deliverables (if applicable): attestation of IAST execution plus exported results aligned to the delivered build. 2
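The minimum set above can be enforced as a completeness check on each per-app "audit packet" before you consider a period closed. A minimal sketch; the artifact keys mirror the list above but the packet shape is an assumption:

```python
# Validate that an audit packet for one app/build contains the minimum
# artifact set. Keys mirror the evidence list; shapes are illustrative.

REQUIRED_ARTIFACTS = {
    "sdlc_standard",      # procedure requiring IAST + documentation
    "tool_config",        # IAST agent/instrumentation settings and scope
    "run_record",         # execution proof tied to a specific build
    "findings_register",  # tickets with severity, owner, timestamps
    "disposition",        # fixes, retests, or approved exceptions
}


def missing_artifacts(packet: dict) -> set[str]:
    """Return the required artifact keys that are absent or empty."""
    return {k for k in REQUIRED_ARTIFACTS if not packet.get(k)}
```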
Common exam/audit questions and hangups
Assessors commonly push on these points:
- “Show me the requirement.” They will ask where you mandate IAST for developers, not just recommend it. 1
- “Show me it ran.” They will sample a system and ask for last run evidence tied to a build/release. 2
- “What happened to findings?” They will look for documented triage and closure evidence. 1
- “What about contractors?” If a third party develops code, they will ask how your requirement flows down and how you receive results. 2
Frequent implementation mistakes (and how to avoid them)
| Mistake | Why it fails SA-11(9) | Fix |
|---|---|---|
| Treating IAST as “we ran it once” | SA-11(9) is a developer requirement tied to testing, not a one-time event. 1 | Tie IAST runs to your SDLC trigger and retain run records. 1 |
| No documented results (only dashboards) | Dashboards change; audits need retained output tied to a timeframe. 2 | Export reports or preserve immutable run logs and create tickets from findings. |
| Findings exist but no disposition trail | “Document results” includes triage and outcomes. 1 | Require closure notes, retest evidence, and exception approvals. |
| Ignoring third-party development | The text explicitly includes system components and services, which often come from third parties. 1 | Add contract language and acceptance criteria for IAST artifacts. |
| Over-scoping to every internal script | Teams stop running the tool if scope is unrealistic. | Start with in-scope systems (federal data, internet-facing, high-risk paths), then expand in a controlled way. |
Enforcement context and risk implications (practical, not speculative)
No public enforcement cases specific to SA-11(9) are cited here. Treat the risk as assessment-driven: if you cannot show the requirement, proof of execution, and documented results, you will likely face audit findings, POA&Ms, or delays in authorization decisions for federal workloads. 2
Practical execution plan (30/60/90)
Use this plan as an operating cadence, not a promise of elapsed implementation time.
First 30 days: establish the control and pick a pilot
- Assign control owner and evidence owner; document scope criteria for in-scope apps.
- Publish the Secure SDLC requirement language for IAST and documentation. 1
- Select an IAST tool and define where it will run (pipeline vs QA).
- Pilot on one representative application; generate the first “audit packet” (procedure + run record + findings + closure). 2
By 60 days: integrate into SDLC gates and scale to priority apps
- Embed IAST into CI/CD or QA test workflow for the highest-risk apps.
- Standardize triage workflow and exception template; train dev leads and AppSec reviewers. 1
- Add third-party contract/SOW language for any active development engagements. 2
By 90 days: operational reporting and assessment readiness
- Maintain an app inventory view with IAST status and last run evidence pointers.
- Run an internal control self-test: pick samples and verify you can produce evidence in one working session. 2
- In Daydream, map SA-11(9) to the control owner, implementation procedure, and recurring evidence artifacts so audit requests turn into a repeatable export, not a scramble. 1
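For the internal self-test, drawing the sample with a recorded seed makes the exercise repeatable and defensible: you can show exactly which apps were pulled and re-pull the same set later. A small sketch (the function name and seed policy are illustrative):

```python
import random


def pick_audit_samples(in_scope_apps, k=3, seed=None):
    """Draw a reproducible sample of apps for the internal self-test.
    With a fixed seed, the same sample can be re-pulled for the record."""
    rng = random.Random(seed)
    k = min(k, len(in_scope_apps))
    return rng.sample(sorted(in_scope_apps), k)
```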
Frequently Asked Questions
Does SA-11(9) require a specific IAST tool brand or product?
No. The requirement is to require developers to use interactive application security testing tools and document results, not to use a named product. 1
Can we meet SA-11(9) with SAST or DAST alone?
SA-11(9) specifically calls for IAST tools. You can run SAST and DAST as part of your broader testing program, but you still need IAST coverage for in-scope development to claim this enhancement. 1
What counts as “document the results” in practice?
Keep run output tied to an app build or release and record findings in a system of record with disposition (fixed, accepted risk, false positive with rationale). That evidence should be retrievable later. 1
How do we handle false positives without failing the control?
Document your triage decision and the rationale, and retain the underlying tool output that led to the review. The control expects documentation of results, including decisions. 1
We outsource development. How do we enforce SA-11(9)?
Put IAST execution and delivery of results into SOWs and acceptance criteria, then verify deliverables during release intake. The control text explicitly includes developers of components and services, which often includes third parties. 1
What’s the minimum evidence an auditor will accept for one sampled application?
A written requirement, proof the IAST job ran for a specific build, the findings list, and proof of disposition for at least one finding (or a documented “no findings” result for that run). 1
Footnotes

1. NIST SP 800-53 Rev. 5 OSCAL JSON (control text for SA-11(9)).
Operationalize this requirement
Map requirement text to controls, owners, evidence, and review workflows inside Daydream.
See Daydream