System Development Life Cycle
To meet the System Development Life Cycle requirement in NIST SP 800-53 Rev 5 SA-3, you must define an SDLC and run every system acquisition, build, and major change through it, with security and privacy built into each phase. Auditors expect proof that requirements, design, coding, testing, and release decisions consistently include security and privacy gates. 1
Key takeaways:
- Your SDLC must be organization-defined, documented, and consistently followed for acquisition, development, and ongoing management. 1
- “Security and privacy considerations” must show up as phase-based activities with approvers, artifacts, and release gates, not as after-the-fact reviews. 1
- Evidence matters: examiners look for traceability from requirements to testing to production approvals across a sample of releases. 1
SA-3 is a lifecycle control, not a single document. For FedRAMP Moderate environments, the SDLC becomes the “assembly line” that ensures security and privacy are treated as engineering requirements from intake through retirement, not as compliance checklists at the end. The practical challenge for a CCO, GRC lead, or compliance officer is operationalizing this across teams that may already have Agile ceremonies, a CI/CD pipeline, and procurement workflows, but lack consistent security and privacy gates.
Your job is to make the SDLC auditable and repeatable. That means: (1) define the phases you use (even if you run Agile), (2) embed required security/privacy activities in each phase, (3) assign accountable roles for approvals, and (4) retain artifacts that prove it happened for real releases and real changes. SA-3 also covers acquisition, so third-party software and managed services need equivalent lifecycle controls: security requirements at procurement, verification before onboarding, and ongoing change management.
The guidance below is written to help you stand up a working SDLC program quickly, align it to engineering reality, and be ready to answer an assessor’s sampling-based questions with clean evidence. 1
Regulatory text
Requirement (SA-3): “Acquire, develop, and manage the system using an organization-defined system development life cycle that incorporates information security and privacy considerations.” 1
Operator interpretation: you must (a) define what your SDLC is, (b) apply it to acquisition, development, and ongoing management, and (c) show that security and privacy are integrated as explicit lifecycle activities with outputs you can prove. If you can’t demonstrate consistent execution across a sample of projects/releases, you will struggle to show conformance even if your engineers “generally do the right thing.” 1
Plain-English interpretation (what SA-3 really expects)
- “Organization-defined SDLC” means you pick and document your phases and gates. Waterfall, Agile, and DevSecOps are all acceptable patterns if you can show controls are built in and consistently executed. 1
- “Acquire” means procurement and onboarding of software, platforms, and services must include security/privacy requirements and verification, not just legal terms.
- “Develop” means engineering work follows a controlled process with security/privacy requirements, secure design, secure coding, testing, and release approvals.
- “Manage” means changes after launch (patching, configuration changes, infrastructure updates, deprecation) still flow through the SDLC and its gates. 1
Who it applies to
Entity types: Cloud Service Providers and Federal Agencies operating in a FedRAMP Moderate context. 1
Operational context where this shows up:
- Product engineering teams building or operating the system boundary authorized under FedRAMP.
- Platform/infra teams shipping IaC, container base images, CI/CD pipeline changes, and environment configuration.
- Procurement and third-party onboarding teams bringing in SaaS, libraries, APIs, and managed services that touch the authorized boundary.
- Security and privacy functions defining requirements, reviewing risk, and approving releases. 1
What you actually need to do (step-by-step)
1) Define your SDLC phases, scope, and triggers
Document:
- Phases: intake → requirements → design → build → test → release → operate → retire (or your equivalent).
- What must go through SDLC: new systems, major features, high-risk changes, infrastructure changes, and third-party components that enter the boundary.
- Triggers: what counts as a “major change” versus “standard change,” and who decides. 1
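The trigger logic above can be sketched as a small rule table. This is a hypothetical illustration, not a prescribed schema; the attribute names (`touches_sensitive_data`, `alters_authorization_boundary`, `adds_external_exposure`) are assumptions you would replace with your own change-ticket fields.

```python
# Hypothetical sketch: classify a change as "major" or "standard"
# based on documented triggers. Attribute names are illustrative;
# adapt them to the fields on your own change tickets.

MAJOR_CHANGE_TRIGGERS = (
    "touches_sensitive_data",         # data sensitivity
    "alters_authorization_boundary",  # boundary impact
    "adds_external_exposure",         # new externally reachable surface
)

def classify_change(change: dict) -> str:
    """Return 'major' if any defined trigger is present, else 'standard'."""
    if any(change.get(trigger) for trigger in MAJOR_CHANGE_TRIGGERS):
        return "major"
    return "standard"
```

Encoding the triggers as data rather than prose makes "who decides" auditable: the rule table is versioned, and a human classifier only handles the cases the rules don't cover.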
Practical tip: write this as an SDLC standard plus a one-page workflow diagram that engineering can follow without reading a policy binder.
2) Build security and privacy activities into each phase (with gates)
Create a simple control matrix mapping each phase to required actions, owners, and artifacts. Example gates that are easy to audit:
| SDLC phase | Required security & privacy actions | Gate/approval | Typical artifact |
|---|---|---|---|
| Requirements | security & privacy requirements defined; data classification noted | Security/Privacy sign-off for high-risk features | security requirements, data flow notes |
| Design | threat modeling for high-risk changes; privacy review for data use | Architecture review approval | threat model, architecture decision record |
| Build | secure coding standard; dependency controls | Pull request checks | PR template, SAST results, dependency scan |
| Test | security testing planned and executed | Release readiness | test plan, test evidence, findings log |
| Release | risk acceptance documented for exceptions | Change/release approval | change ticket, release checklist |
| Operate | vulnerability/patch process; monitoring | Operational acceptance | patch records, monitoring config |
You can tailor the exact controls, but the audit pattern stays the same: a gate must exist, someone must be accountable, and outputs must be retained. 1
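One way to keep that audit pattern enforceable is to express the gate matrix as machine-readable data and check release records against it. The sketch below is illustrative, assuming a simple release record with an `artifacts` list; the phase and artifact names mirror the table above but are not a mandated format.

```python
# Hypothetical sketch: the phase-gate matrix as data, plus a check
# that a release record carries the artifacts each gate requires.
# Phase and artifact names are illustrative, mirroring the table above.

GATE_MATRIX = {
    "requirements": ["security_requirements", "data_flow_notes"],
    "design": ["threat_model"],
    "test": ["test_evidence"],
    "release": ["change_ticket", "release_checklist"],
}

def missing_artifacts(release: dict) -> dict:
    """Map each phase to the required artifacts absent from the release record."""
    gaps = {}
    for phase, required in GATE_MATRIX.items():
        absent = [a for a in required if a not in release.get("artifacts", [])]
        if absent:
            gaps[phase] = absent
    return gaps
```

A check like this can run in CI or in a periodic self-assessment script, so gaps surface before an assessor samples the release.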
3) Integrate SDLC with Agile/CI/CD without breaking delivery
Auditors do not require waterfall. They require repeatability and evidence. Make SDLC “real” by embedding it into tools teams already use:
- Add security/privacy acceptance criteria to user stories.
- Use pull request templates that require threat model links for sensitive changes.
- Make scanning outputs part of build logs, and retain them.
- Require a change ticket (or equivalent) for production releases into the authorized environment. 1
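A pull-request gate of this kind can be a few lines of CI script. The sketch below is an assumption-laden illustration: the sensitive path prefixes, the `CHG-1234` ticket format, and the `Threat-model:` link convention are all hypothetical and would be replaced by your own conventions.

```python
import re

# Hypothetical CI step: block the merge if a pull request touching
# sensitive paths lacks a threat-model link or a change-ticket ID.
# Path prefixes, the CHG-#### ticket format, and the "Threat-model:"
# convention are assumptions, not a standard.

SENSITIVE_PREFIXES = ("auth/", "crypto/", "infra/")
TICKET_PATTERN = re.compile(r"\bCHG-\d+\b")

def pr_gate_ok(changed_files: list, pr_body: str) -> bool:
    """Allow the merge unless a sensitive change is missing its evidence links."""
    sensitive = any(f.startswith(SENSITIVE_PREFIXES) for f in changed_files)
    if not sensitive:
        return True
    has_ticket = bool(TICKET_PATTERN.search(pr_body))
    has_threat_model = "threat-model:" in pr_body.lower()
    return has_ticket and has_threat_model
```

Because the check reads the PR body and file list that already exist, it produces its own evidence trail in the CI logs without adding a manual step.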
4) Extend SDLC to acquisition and third parties
For third-party software/services that affect the system boundary:
- Define minimum security and privacy requirements at intake.
- Perform due diligence before onboarding (contractual and technical).
- Require ongoing change notifications and security patch SLAs where feasible.
- Treat third-party upgrades as changes that follow the same SDLC gating logic. 1
If you manage third-party risk in a separate workflow, connect the two with a clear trigger: “No onboarding to the FedRAMP boundary until security/privacy requirements are met and documented.”
5) Create exception handling that doesn’t become the default
You need an exception process for emergency fixes and edge cases, but it must be controlled:
- Define when exceptions are allowed.
- Require compensating controls and a time-bound remediation plan.
- Require explicit risk acceptance by an authorized role.
- Keep an exception register and include it in release readiness reviews. 1
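The rules above lend themselves to a simple validation pass over the exception register. This is a sketch under assumptions: the field names and the set of authorized approver roles are illustrative, not drawn from any standard.

```python
from datetime import date

# Hypothetical sketch: validate an exception-register entry against the
# rules above — an authorized approver, compensating controls, and a
# time-bound remediation date. Field names and roles are illustrative.

AUTHORIZED_APPROVERS = {"ciso", "security_lead"}

def exception_entry_problems(entry: dict, today: date) -> list:
    """Return a list of problems; an empty list means the entry is acceptable."""
    problems = []
    if entry.get("approver_role") not in AUTHORIZED_APPROVERS:
        problems.append("risk acceptance not signed by an authorized role")
    if not entry.get("compensating_controls"):
        problems.append("no compensating controls recorded")
    due = entry.get("remediation_due")
    if due is None:
        problems.append("remediation plan is not time-bound")
    elif due < today:
        problems.append("remediation is overdue")
    return problems
```

Running this check before each release readiness review turns "exceptions don't become the default" from an aspiration into a report.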
6) Prove ongoing management, not just project-time compliance
Build SDLC evidence for operational changes:
- Patch and vulnerability remediation records tied to tracked work items.
- Configuration management changes tied to approvals.
- Periodic review of the SDLC itself, with updates when engineering practices change. 1
Required evidence and artifacts to retain
Assessors typically sample releases, changes, and projects. Keep artifacts that make sampling easy:
Program-level artifacts (the “what”):
- SDLC policy/standard and lifecycle diagram. 1
- Security & privacy gate checklist by SDLC phase.
- RACI or role definitions for approvals.
- Exception process and risk acceptance template.
Execution artifacts (the “proof”):
- Requirements documents/user stories with security & privacy acceptance criteria.
- Architecture/design records; threat models where applicable.
- Code review records; CI/CD logs showing checks ran.
- Security testing evidence and findings tracking.
- Change tickets and release approvals for production.
- Third-party onboarding security review records for components in scope.
- Exception register with approvals and closure evidence. 1
Tip for speed: store artifacts by release in a single “release evidence packet” folder or ticket bundle so you can answer sampling requests quickly.
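The evidence-packet idea can be made concrete with a small manifest builder. The section names below are illustrative assumptions modeled on the artifact list above, not a required structure.

```python
# Hypothetical sketch: build a per-release "evidence packet" manifest so a
# sampling request can be answered from one index. Section names are
# illustrative, modeled on the artifact list above.

PACKET_SECTIONS = (
    "change_ticket",
    "security_privacy_gates",
    "test_outputs",
    "approvals",
    "risk_acceptances",
)

def build_packet_index(release_id: str, evidence: dict) -> dict:
    """Return a manifest mapping each section to its artifact link, or a gap marker."""
    return {
        "release": release_id,
        "sections": {s: evidence.get(s, "MISSING") for s in PACKET_SECTIONS},
        "complete": all(s in evidence for s in PACKET_SECTIONS),
    }
```

Generating the index at release time, rather than at audit time, is what makes sampling requests cheap to answer.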
Common exam/audit questions and hangups
Expect questions like:
- “Show your SDLC. Where are security and privacy built into each phase?” 1
- “Pick three recent releases. Show evidence of requirements, testing, and approval for each.”
- “How do you decide what changes require threat modeling or privacy review?”
- “How do you control third-party software/components entering the boundary?”
- “Show exceptions for emergency changes and who approved the risk.” 1
Common hangup: teams claim Agile means no “phases.” Auditors still need to see lifecycle activities and gates, even if they occur iteratively within sprints.
Frequent implementation mistakes (and how to avoid them)
- SDLC exists only as a policy document. Fix: embed gates in Jira/Git/CI/CD so the workflow produces evidence automatically. 1
- Security reviews happen only right before release. Fix: require security/privacy requirements at intake and design-stage risk review for sensitive features.
- No clear definition of “major change.” Fix: define triggers tied to data sensitivity, authorization boundary impact, and external exposure. Document who can classify a change.
- Third-party acquisition is handled outside SDLC. Fix: create a procurement-to-onboarding workflow that requires security/privacy verification before production connectivity. 1
- Exceptions become a loophole. Fix: require compensating controls, approvals, and a tracked closure; report exception trends to governance.
Enforcement context and risk implications
No public enforcement cases were provided in the source catalog for SA-3. Practically, the risk shows up as authorization delays, assessment findings, and increased exposure to vulnerabilities introduced through rushed changes or unmanaged third-party components. SA-3 is also a “multiplier” control: weak SDLC execution undermines many other security requirements because you can’t prove security is consistently engineered into the system. 1
Practical 30/60/90-day execution plan
The plan below avoids date promises and focuses on sequencing you can execute quickly.
First 30 days (stabilize and define)
- Write the SDLC standard (phases, scope, triggers, required security/privacy activities). 1
- Create the phase-by-phase security/privacy gate checklist and approval roles.
- Identify where evidence will live (Jira project templates, Git repos, CI logs, GRC repository).
- Pilot the workflow on one engineering team and one third-party onboarding path.
Days 31–60 (operationalize and instrument)
- Convert the checklist into tool-enforced steps: PR templates, required fields in change tickets, CI checks, release checklist.
- Stand up the exception process and register.
- Train engineering, product, procurement, and privacy stakeholders on “what changes require what gates.”
- Run an internal “mini assessment” by sampling recent releases and verifying evidence completeness. 1
Days 61–90 (scale and make it audit-ready)
- Roll out SDLC workflow templates across all in-scope teams.
- Align third-party intake with SDLC: no boundary connectivity without documented security/privacy review.
- Start a cadence for SDLC governance: review exceptions, recurring findings, and control gaps; update the SDLC standard when delivery practices change. 1
Where Daydream fits naturally: many teams struggle with evidence sprawl across Jira, Git, CI, and shared drives. Daydream can act as the system of record that maps SDLC gates to required artifacts, tracks exceptions, and packages release evidence for sampling without chasing teams at audit time.
Frequently Asked Questions
Do we need a separate “FedRAMP SDLC,” or can we use our existing Agile/DevSecOps process?
You can use your existing process if it is organization-defined, documented, and consistently followed, and if security and privacy activities are built into it with retained evidence. The assessor will look for phase-equivalent gates and artifacts. 1
What counts as “privacy considerations” under SA-3?
Treat privacy as engineering requirements around data collection, use, sharing, retention, and access, and ensure reviews happen at requirements/design and before release. Retain evidence of privacy review decisions for features that handle sensitive data. 1
How do we handle emergency production fixes without failing SA-3?
Allow an expedited path with defined criteria, documented risk acceptance, and compensating controls, then require follow-up completion of missing SDLC steps. Keep every emergency change in the exception register with closure evidence. 1
Does SA-3 apply to infrastructure-as-code and CI/CD pipeline changes?
Yes if those changes can affect the authorized system’s security posture. Treat IaC and pipeline modifications as SDLC-controlled changes with testing, approval, and retained evidence. 1
What evidence is most persuasive in an assessment?
A clean sample package per release: the change request, security/privacy gates completed, test outputs, approvals, and any risk acceptances. Assessors value traceability and consistency over lengthy narratives. 1
How do we extend SDLC controls to third-party software and services?
Put security and privacy requirements into intake and procurement, verify them before onboarding, and treat upgrades and configuration changes as SDLC-controlled events. Maintain artifacts that show the third party met your requirements before entering the boundary. 1
Footnotes
1. NIST Special Publication 800-53 Revision 5
Authoritative Sources
- NIST Special Publication 800-53 Revision 5, SA-3 (System Development Life Cycle)
Operationalize this requirement
Map requirement text to controls, owners, evidence, and review workflows inside Daydream.
See Daydream