Secure Development Lifecycle

VDA ISA 8.1.1 requires you to implement a Secure Development Lifecycle (SDLC) for any software or systems development project that handles automotive data, with security built into requirements, coding, review, and testing. To operationalize it quickly, define an SDLC standard, embed mandatory security gates in your delivery process, and retain evidence that every in-scope release followed those gates. (VDA ISA Catalog v6.0)

Key takeaways:

  • Scope first: identify which projects “handle automotive data,” then apply the SDLC consistently. (VDA ISA Catalog v6.0)
  • Minimum SDLC elements: security requirements analysis, secure coding standards, code review, and security testing. (VDA ISA Catalog v6.0)
  • Audits are evidence-driven: you pass by showing artifacts per release, not by having a policy alone. (VDA ISA Catalog v6.0)

A “secure development lifecycle requirement” sounds like a policy ask, but VDA ISA 8.1.1 is evaluated through execution. If your teams build or modify software and systems that touch automotive data, you need an SDLC that consistently injects security work into delivery: define security requirements early, code to defined standards, review changes, and test security before release. (VDA ISA Catalog v6.0)

For a CCO or GRC lead, the fastest path is to treat this as a control system with (1) defined scope, (2) required activities (“security gates”), and (3) durable evidence. You are not trying to turn compliance into a parallel engineering process. You are making security work unavoidable in the process engineers already follow (tickets, repos, CI/CD, release approvals), then collecting the artifacts automatically wherever possible.

This page gives requirement-level guidance you can hand to Engineering, Product, and Security to implement quickly, plus the evidence list and common audit traps that cause otherwise mature programs to fail review.

Regulatory text

Requirement (excerpt): “Implement a secure development lifecycle (SDLC) for software and systems development projects handling automotive data.” (VDA ISA Catalog v6.0)

Operator meaning: You must have an SDLC that applies to all in-scope development work (not only “security-critical” projects) and that visibly includes:

  • security requirements analysis,
  • secure coding standards,
  • code review, and
  • security testing. (VDA ISA Catalog v6.0)

Auditors will typically test two things: (1) your SDLC definition (the “rules of the road”) and (2) proof that real projects followed it end-to-end. (VDA ISA Catalog v6.0)

Plain-English interpretation (what the requirement is really asking)

If your organization builds, modifies, or integrates software/systems that handle automotive data, you must run development work through a repeatable process that prevents common security defects and makes security verification part of “done.” (VDA ISA Catalog v6.0)

A practical interpretation you can adopt internally:

“No in-scope code reaches production (or customer delivery) unless security requirements are captured, coding rules are defined, changes are reviewed, and security tests are executed and tracked.” (VDA ISA Catalog v6.0)

Who it applies to (entity + operational context)

Entity types: Automotive suppliers and OEMs. (VDA ISA Catalog v6.0)

Operational scope: Any software or systems development project handling automotive data, including:

  • in-house product development and embedded software,
  • internal tools that process automotive data (engineering, diagnostics, analytics),
  • integrations and interfaces (APIs, ETL jobs) moving automotive data between systems,
  • third-party developed components where you manage requirements, acceptance, or releases (you still need an SDLC wrapper and acceptance criteria). (VDA ISA Catalog v6.0)

Teams involved: Engineering (dev), QA/test, Product/Program, Security/AppSec, and Release/Operations. For audits, you (GRC) coordinate and ensure evidence is retained centrally.

What you actually need to do (step-by-step)

Step 1: Define “in-scope” and make it operational

  1. Create an SDLC scope rule: “Any project that stores, processes, transmits, or has access to automotive data is in scope.” Keep the definition short and enforceable. (VDA ISA Catalog v6.0)
  2. Tag systems and repos: Maintain an inventory field (e.g., “automotive data: yes/no”) and map it to repos/pipelines so the SDLC gates can be applied by default.
  3. Decide exceptions upfront: For urgent fixes, define an emergency path with compensating controls and mandatory follow-up security work.
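The scope rule above can be made executable so gates apply by default. A minimal sketch, assuming a hypothetical inventory record whose flag names (`stores_automotive_data`, etc.) are illustrative, not VDA ISA terms:

```python
# Hypothetical inventory record: flags describe how a system touches
# automotive data. Field names are illustrative, not from the VDA ISA text.
def is_in_scope(system: dict) -> bool:
    """In scope if the system stores, processes, transmits,
    or has access to automotive data."""
    touchpoints = ("stores", "processes", "transmits", "has_access")
    return any(system.get(f"{t}_automotive_data", False) for t in touchpoints)

inventory = [
    {"name": "diag-analytics", "processes_automotive_data": True},
    {"name": "hr-portal"},  # no automotive data touchpoints
]
in_scope = [s["name"] for s in inventory if is_in_scope(s)]
```

A rule this small is easy to review with auditors and easy to wire into pipeline configuration so in-scope repos get the SDLC gates automatically.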

Audit win: you can list in-scope applications, repos, and pipelines, and show the SDLC gates are turned on for them.

Step 2: Publish your SDLC standard (one page is fine if it’s enforceable)

Your SDLC standard should specify:

  • Security requirements analysis: what must be captured, when, and where (e.g., in user stories/epics).
  • Secure coding standards: which standards apply (language/framework specific), plus how developers access them.
  • Code review: minimum review rules (e.g., required approvals, segregation of duties where feasible).
  • Security testing: required test types and minimum release criteria. (VDA ISA Catalog v6.0)

Avoid long prose. Use a “gate checklist” format that release managers and engineers can follow.
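One way to keep the standard in "gate checklist" form is to encode it as data that release tooling and humans read the same way. A sketch, where the stage names are assumptions about a typical delivery pipeline, not VDA ISA terms:

```python
# The four required SDLC elements mapped to illustrative delivery stages.
# Stage names ("intake", "merge", ...) are assumptions, not VDA ISA terms.
SDLC_GATES = {
    "intake":  ["security requirements analysis captured in the ticket"],
    "coding":  ["secure coding standard linked and followed"],
    "merge":   ["code review approved by the required reviewers"],
    "release": ["security testing executed and results attached"],
}

def checklist_for(stage):
    """Return the gate items a release manager must confirm at a stage."""
    return SDLC_GATES.get(stage, [])
```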

Step 3: Build security requirements analysis into intake and design

  1. Add security requirements to templates: Update story/feature templates to include data handling, authentication/authorization needs, logging, and error-handling expectations.
  2. Define “security acceptance criteria”: For each in-scope feature, require testable statements (e.g., “access is role-based,” “input is validated”).
  3. Capture risk decisions: If a requirement is deferred, record who accepted the risk and what compensating control exists.

Evidence: completed templates, threat/risk notes attached to epics, and approval records.

Step 4: Implement secure coding standards that teams will actually follow

  1. Choose standards per stack: adopt secure coding standards that match your actual languages and frameworks, rather than one generic document.
  2. Make them accessible in the workflow: Link standards in the repo README, internal wiki, or engineering handbook.
  3. Back the standards with guardrails: Linters, dependency rules, secrets scanning, and CI checks should reinforce the standard so enforcement is consistent.

Common hangup: “We have standards, but nobody can find them.” Fix by linking standards directly in PR templates and repo docs.

Step 5: Require code review and make it auditable

  1. Define PR review rules: Require peer review for in-scope repos. Specify who can approve and when security review is needed (for sensitive modules, auth changes, crypto, etc.).
  2. Configure branch protections: Enforce reviews in the repo platform; don’t rely on “policy only.”
  3. Record review outcomes: Ensure PRs show reviewers, comments, and approvals.

Evidence: repository settings screenshots/exports, PR samples for recent releases, and documented review criteria. (VDA ISA Catalog v6.0)
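The review rules can also be checked against settings exports rather than eyeballed. A sketch, assuming a hypothetical export format whose field names (`require_reviews`, `min_approvals`) mirror common repo-platform settings but are not any specific platform's API:

```python
# Validate an exported branch-protection config against the review standard.
# Field names are illustrative, not a specific platform's schema.
REVIEW_STANDARD = {"require_reviews": True, "min_approvals": 1}

def review_gate_findings(protection: dict) -> list:
    """Return findings; an empty list means the repo meets the standard."""
    findings = []
    if not protection.get("require_reviews", False):
        findings.append("pull request reviews are not required")
    if protection.get("min_approvals", 0) < REVIEW_STANDARD["min_approvals"]:
        findings.append("fewer approvals required than the standard")
    return findings
```

Running a check like this across all in-scope repos turns "we require review" from a policy claim into a recurring, retainable evidence export.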

Step 6: Define security testing gates tied to release

VDA ISA 8.1.1 requires “security testing,” but you must translate that into gates you can show in evidence. (VDA ISA Catalog v6.0)

A defensible approach:

  • Pre-merge / CI tests: automated checks (static analysis, dependency checks, unit tests) for all in-scope repos.
  • Pre-release tests: security test execution evidence attached to the release (scan reports, test run IDs, defect tickets).
  • Defect handling: a severity/triage rule and a documented decision when shipping with known issues (who approved, why, what mitigation).

Do not overpromise. If you cannot block releases automatically yet, start with manual release checklists and approvals, then automate.
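The defect-handling rule above can be expressed as a release decision your checklist (manual or automated) applies the same way every time. A sketch, assuming hypothetical severity labels and risk-acceptance fields:

```python
def release_decision(open_findings: list) -> str:
    """Block on critical/high findings unless each carries a documented
    risk acceptance with a named approver and a rationale (assumed fields)."""
    for finding in open_findings:
        if finding.get("severity") in ("critical", "high"):
            acceptance = finding.get("risk_acceptance") or {}
            if not (acceptance.get("approver") and acceptance.get("rationale")):
                return "block"
    return "ship"
```

Usage: feeding the decision the open findings for a release yields "ship" only when every critical/high item is either closed or formally accepted, which is exactly the record an auditor samples.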

Step 7: Centralize evidence collection (so audits don’t become archaeology)

You need a repeatable way to produce evidence per application and per release:

  • SDLC standard and change log
  • scope list (in-scope systems/repos)
  • release list for the audit period
  • for each sampled release: requirements artifacts, PR/review artifacts, security testing artifacts, and defect remediation/acceptance records. (VDA ISA Catalog v6.0)
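The per-release list above can double as an automated completeness check, so gaps surface before the audit rather than during it. A sketch with hypothetical artifact keys:

```python
# Artifact keys are illustrative names for an evidence record per release.
REQUIRED_ARTIFACTS = [
    "requirements_ticket",   # security requirements / acceptance criteria
    "pr_review_record",      # reviewers and approvals
    "security_test_report",  # scan reports or test run IDs
    "defect_disposition",    # remediation or risk-acceptance record
]

def missing_evidence(release: dict) -> list:
    """Return artifact keys absent from a release's evidence record."""
    return [k for k in REQUIRED_ARTIFACTS if not release.get(k)]
```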

If you use a GRC system like Daydream, set up an SDLC control with mapped evidence requests [1] and assign owners in Engineering, AppSec, and Release Management so you’re not chasing screenshots at audit time.

Required evidence and artifacts to retain (audit-ready list)

Keep these artifacts in a controlled repository (GRC tool or controlled document store):

Program-level (applies across projects)

  • SDLC policy/standard describing security requirements analysis, secure coding standards, code review, and security testing. (VDA ISA Catalog v6.0)
  • SDLC scope definition and in-scope system/repo list.
  • Secure coding standards documentation (by language/framework).
  • Code review standard (what’s required, who approves, exceptions).

Project/release-level (sampled by auditors)

  • Requirements/security acceptance criteria captured in tickets/epics.
  • Design/security notes where applicable (architecture review records if you have them).
  • PR records showing reviewers and approvals.
  • Security test outputs (scan reports, test run IDs, pen test summary if performed).
  • Defect tickets with status, remediation evidence, or risk acceptance sign-off.

Common exam/audit questions and hangups

Auditors tend to ask:

  • “Show me the SDLC you claim to use and where security is embedded.” (VDA ISA Catalog v6.0)
  • “Which projects handle automotive data, and how do you ensure they follow the SDLC?” (VDA ISA Catalog v6.0)
  • “Pick a recent release: prove requirements analysis, code review, and security testing happened.” (VDA ISA Catalog v6.0)
  • “How do you handle exceptions or urgent fixes?”
  • “What happens when security testing finds issues? Who decides to ship?”

Hangups that stall audits:

  • In-scope definition is fuzzy, so sampling expands.
  • Evidence is scattered across tools with no release-to-artifact mapping.
  • “Security testing” exists informally, but there is no consistent gate or retention of outputs.

Frequent implementation mistakes (and how to avoid them)

  1. Mistake: Writing a long SDLC policy with no gates.
    Fix: convert requirements into release criteria and repo controls (branch protection, required checks).

  2. Mistake: Treating SDLC as AppSec’s job.
    Fix: make Engineering accountable for executing steps; Security defines standards and verifies.

  3. Mistake: No linkage between releases and evidence.
    Fix: require a release record that links to ticket epics, PRs, and test reports.

  4. Mistake: “Security testing” means one annual test.
    Fix: define what runs per change/release (automated checks) and what runs periodically (deeper testing) without claiming more than you do. (VDA ISA Catalog v6.0)

  5. Mistake: Exceptions become the default path.
    Fix: require named approvers, documented rationale, and tracked follow-up work for every exception.

Risk implications (why auditors care)

An SDLC without security gates drives predictable failures: insecure auth changes, injection flaws, secrets exposure, and vulnerable dependencies. In automotive data contexts, these issues can create confidentiality risk and supply-chain propagation risk (your code ships into customer environments). VDA ISA 8.1.1 pushes you toward preventive controls that are cheaper to run continuously than incident response and retrofits. (VDA ISA Catalog v6.0)

Practical 30/60/90-day execution plan

First 30 days (foundation and scoping)

  • Confirm the scope rule for “handling automotive data,” and publish the in-scope system/repo list.
  • Publish the SDLC standard in a one-page gate format mapped to your delivery stages. (VDA ISA Catalog v6.0)
  • Turn on or tighten repo controls for code review in in-scope repos (branch protections, required reviewers).
  • Define minimum security testing evidence to attach per release (even if manual at first). (VDA ISA Catalog v6.0)

Days 31–60 (make it enforceable in workflows)

  • Update ticket templates to capture security requirements and acceptance criteria.
  • Embed secure coding standards links into PR templates and repo docs.
  • Establish a release checklist for in-scope applications that includes the four required SDLC elements. (VDA ISA Catalog v6.0)
  • Stand up a central evidence register (in Daydream or your GRC repository) with owners and due dates per artifact.

Days 61–90 (prove it with sampling and close gaps)

  • Run an internal “mock audit” on a small sample of releases; verify you can reconstruct end-to-end evidence quickly.
  • Tune the exception process and require formal sign-off for deviations.
  • Expand automation where feasible (CI checks, scan exports, auto-linking PRs to tickets).
  • Convert recurring findings into backlog items (e.g., missing acceptance criteria, inconsistent scan retention).
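The mock-audit sample can be drawn the way auditors typically draw theirs, at random from the release list for the period. A sketch (a fixed seed makes the sample reproducible for the internal report):

```python
import random

def audit_sample(releases: list, n: int = 3, seed=None) -> list:
    """Draw a random sample of releases to reconstruct evidence for."""
    rng = random.Random(seed)
    return rng.sample(releases, min(n, len(releases)))
```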

Frequently Asked Questions

Does VDA ISA 8.1.1 apply to internal tools, or only customer-facing products?

It applies to software and systems development projects handling automotive data, regardless of whether the tool is internal or customer-facing. Use the data-handling criterion to decide scope, then apply the SDLC gates consistently. (VDA ISA Catalog v6.0)

What is the minimum set of SDLC components we must show for this requirement?

You must be able to show security requirements analysis, secure coding standards, code review, and security testing for in-scope development. Auditors typically expect both a defined process and project-level evidence that it happened. (VDA ISA Catalog v6.0)

If a third party develops code for us, are we still on the hook?

Yes in practice, because you still need an SDLC wrapper around outsourced development: define security requirements, require code review and testing evidence as acceptance criteria, and retain artifacts for releases you accept into your environment. (VDA ISA Catalog v6.0)

Can we pass with a policy and training, even if engineering tools don’t enforce gates?

A policy and training help, but audits usually turn on whether you can prove consistent execution. If you cannot enforce gates technically yet, use release checklists and approvals as an interim control, and retain the evidence per release. (VDA ISA Catalog v6.0)

What evidence is strongest for “code review”?

Repository-native evidence is strongest: PR records showing reviewer identity, comments, and approvals, plus branch protection settings showing review is required for merges in in-scope repos.

How do we avoid drowning teams in paperwork?

Put evidence where work already happens: tickets for requirements, PRs for review, CI/CD logs for testing, and release records that link everything. A GRC workflow (including Daydream) helps by standardizing evidence requests and retention without asking engineers for bespoke documents.

Footnotes

  1. VDA ISA Catalog v6.0

Operationalize this requirement

Map requirement text to controls, owners, evidence, and review workflows inside Daydream.

See Daydream