12. Testing and Assurance

12.1 Testing strategy and environments

Testing must be risk‑based and executed in environments that closely mirror production. Staging should match production controls (identity, WAF, logging routes, secrets, feature flags) except for data sets. Production Restricted Data must not be used in non‑production; when realistic data is required, use anonymized or de‑identified datasets. Test data and credentials must be treated as sensitive and rotated on a defined cadence.
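
Where de‑identified data is needed, a repeatable transform beats ad‑hoc edits. The following is a minimal Python sketch, assuming a CSV extract; the column names, file paths, and `DEID_SALT` environment variable are illustrative, and the authoritative PII column list should come from the data classification inventory. A salted one‑way hash keeps values join‑consistent across tables without exposing the originals.

```python
import csv
import hashlib
import os

# Hypothetical PII columns; the authoritative list comes from the data
# classification inventory, not this sketch.
PII_COLUMNS = {"ssn", "email", "phone", "dob"}
SALT = os.environ["DEID_SALT"]  # per-environment secret, rotated like any test credential

def pseudonymize(value: str) -> str:
    """Salted one-way hash: unreadable, but stable so joins still line up."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

def deidentify(src_path: str, dst_path: str) -> None:
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in PII_COLUMNS & set(row):
                if row[col]:
                    row[col] = pseudonymize(row[col])
            writer.writerow(row)

if __name__ == "__main__":
    deidentify("claims_extract.csv", "claims_extract_deid.csv")  # illustrative paths
```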

12.2 Security testing (pre‑release)

Before any production release, the build must complete the following, with results included in the release evidence (an illustrative gate check follows this list):

  • Static Application Security Testing (SAST): run on changed code to detect insecure patterns and taint flows; findings triaged and addressed per SLAs.

  • Software Composition Analysis (SCA) and SBOM: identify vulnerable libraries and produce an SBOM for the exact build; replace/patch High/Critical components or seek a waiver with compensating controls. PENDING: SBOM format

  • Secrets scanning: scan source, pipeline configs, and artifacts to detect embedded secrets; rotate any exposed credentials and verify redaction in logs.

  • Dynamic Application Security Testing (DAST): exercise internet‑facing web and API endpoints in staging; include auth flows, authorization checks, file uploads, and administrative functions. PENDING: Fail‑gate thresholds for Critical/High/Medium

  • Infrastructure and container scans (as applicable): scan IaC templates and container images for misconfigurations and known vulnerabilities before use.
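
To illustrate how these results can gate a release, the sketch below assumes each scanner's output has been normalized into a simple JSON list of findings and fails the pipeline on any unwaived Critical/High finding. The file names, findings schema, and blocking severities are assumptions; the actual fail‑gate thresholds remain PENDING above.

```python
import json
import sys

# Assumed normalized format: each file is a JSON list of
# {"id": ..., "severity": ..., "waiver": ...} objects.
SCAN_RESULTS = ["sast.json", "sca.json", "dast.json", "secrets.json"]
BLOCKING = {"critical", "high"}  # assumed gate; adjust once thresholds are confirmed

def gate() -> int:
    blockers = []
    for path in SCAN_RESULTS:
        with open(path) as fh:
            for finding in json.load(fh):
                severity = finding.get("severity", "").lower()
                if severity in BLOCKING and not finding.get("waiver"):
                    blockers.append((path, finding["id"], severity))
    for path, finding_id, severity in blockers:
        print(f"BLOCKED: {finding_id} ({severity}) from {path} has no approved waiver")
    return 1 if blockers else 0  # nonzero exit fails the pipeline stage

if __name__ == "__main__":
    sys.exit(gate())
```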

12.3 Functional, integration, and regression testing

Changes must pass functional and integration tests that cover business acceptance criteria, including role‑based access behavior, workflow branching, and error handling. Regression testing should cover critical user journeys (e.g., claim creation/update, file upload and scan, status change notifications). Test results must be attached or linked in the release evidence.
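
Role‑based access behavior in particular benefits from explicit automated checks. A minimal pytest‑style sketch, assuming hypothetical staging endpoints and test accounts whose tokens arrive via environment variables (handled as sensitive per 12.1):

```python
import os

import requests

BASE = os.environ.get("STAGING_URL", "https://staging.example.internal")
# Hypothetical role accounts; tokens are test credentials per 12.1, never production ones.
TOKENS = {"adjuster": os.environ["ADJUSTER_TOKEN"], "readonly": os.environ["READONLY_TOKEN"]}

def auth(role: str) -> dict:
    return {"Authorization": f"Bearer {TOKENS[role]}"}

def test_adjuster_can_update_claim():
    r = requests.patch(f"{BASE}/claims/TEST-001", json={"status": "review"},
                       headers=auth("adjuster"), timeout=10)
    assert r.status_code == 200

def test_readonly_cannot_update_claim():
    # Deny-by-default: a read-only role must be refused, not silently ignored.
    r = requests.patch(f"{BASE}/claims/TEST-001", json={"status": "review"},
                       headers=auth("readonly"), timeout=10)
    assert r.status_code == 403
```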

12.4 Performance and resilience tests (as applicable)

For services with throughput or latency requirements, run targeted performance tests in staging to confirm acceptable response times under expected load. Where feasible, test graceful degradation and retry behavior. For components with asynchronous processing (e.g., image scanning), validate queue back‑pressure and failure handling. PENDING: Performance targets or thresholds, if any
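
For retry behavior, capped exponential backoff with jitter is a common pattern worth validating under load. The sketch below is illustrative only; `TransientError` stands in for whatever timeout or throttling signal the scanning service actually raises.

```python
import random
import time

class TransientError(Exception):
    """Stand-in for timeouts/throttling responses from the scanning service."""

def call_with_retry(fn, max_attempts=5, base_delay=0.5, max_delay=30.0):
    """Retry a flaky call with capped exponential backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except TransientError:
            if attempt == max_attempts:
                raise  # hand off to the dead-letter / failure-handling path
            delay = min(max_delay, base_delay * 2 ** (attempt - 1))
            time.sleep(delay * random.uniform(0.5, 1.5))  # jitter avoids synchronized retries
```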

12.5 Accessibility and usability (as applicable)

If user‑facing portals are in scope, validate key flows for accessibility conformance (e.g., WCAG 2.x level to be confirmed) and include any known gaps and remediation plans. PENDING: Accessibility standard/level if required
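
Automated checks catch only a subset of accessibility issues, but they make useful smoke tests in CI. The stdlib‑only sketch below flags `<img>` tags that lack an alt attribute entirely (WCAG 1.1.1); it is a spot check, not a substitute for conformance testing against the to‑be‑confirmed level.

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Counts <img> tags with no alt attribute at all.
    Decorative images may legitimately carry alt=""; those are not flagged here."""
    def __init__(self):
        super().__init__()
        self.missing = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.missing += 1

def count_missing_alt(html: str) -> int:
    auditor = AltTextAudit()
    auditor.feed(html)
    return auditor.missing

assert count_missing_alt('<img src="logo.png">') == 1
assert count_missing_alt('<img src="logo.png" alt="Company logo">') == 0
```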

12.6 User Acceptance Testing (UAT)

UAT must validate that business acceptance criteria are met with production‑like roles and data scenarios. Evidence includes executed test cases, screenshots/recordings of key flows, and an acceptance checklist signed by the Product/Process Owner. PENDING: UAT evidence minimums; template link

12.7 Independent penetration testing

An independent penetration test is required at least annually and after material architectural changes to externally exposed components. Scope must include authentication/authorization, admin consoles, file upload endpoints, and critical APIs. The vendor must provide a report (or executive summary) and a remediation plan within 30 days, with verification of fixes for Critical/High issues. PENDING: Report sharing scope — Executive Summary/Full Report; verification evidence requirements

12.8 Remediation, waivers, and POA&M

All findings from SAST/SCA/DAST, IaC/container scans, functional testing, UAT, and pentests must be triaged and tracked to closure with owners and due dates in a POA&M. Proposed remediation SLAs are: Critical 7 days, High 30 days, Medium 60 days, Low 90 days; issues under active exploitation require immediate containment with an out‑of‑band fix as soon as feasible. Any waiver (e.g., a temporary WAF rule in lieu of a code fix) must specify compensating controls, an owner, and an expiry, and be approved by the ISO; waivers are reviewed at Post‑Go‑Live. PENDING: Confirm or adjust SLAs; notification thresholds for overdue items
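
The proposed SLAs translate directly into due‑date arithmetic that a POA&M tracker can enforce. A small sketch using the day counts above (which remain PENDING confirmation):

```python
from datetime import date, timedelta

# Proposed remediation SLAs from this section (PENDING confirmation).
SLA_DAYS = {"critical": 7, "high": 30, "medium": 60, "low": 90}

def due_date(opened: date, severity: str) -> date:
    return opened + timedelta(days=SLA_DAYS[severity.lower()])

def is_overdue(opened: date, severity: str, today: date | None = None) -> bool:
    return (today or date.today()) > due_date(opened, severity)

# Example: a High finding triaged on 1 March 2024 is due 31 March 2024.
assert due_date(date(2024, 3, 1), "High") == date(2024, 3, 31)
```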

12.9 Evidence package (submission)

At the Pre‑Production gate, submit or link a concise evidence bundle (a completeness check is sketched after this list):

  • SBOM for the release build. PENDING: SPDX/CycloneDX; delivery location

  • SAST/SCA/DAST summaries; secrets‑scan confirmation; IaC/container scan results (if applicable).

  • Open findings with POA&M and any ISO‑approved waivers.

  • UAT results and acceptance sign‑off.

  • Any performance or accessibility test outputs (if applicable).

  • Change log and deployment/rollback plans (see Section 11.9 for bundle expectations).
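
Bundle completeness can be checked automatically before submission. The sketch below assumes a flat bundle directory with illustrative file names; the canonical artifact names, SBOM format, and repository location are still PENDING.

```python
from pathlib import Path

# Illustrative artifact names; the real naming convention and location are PENDING.
REQUIRED = [
    "sbom.json",                       # format PENDING: SPDX or CycloneDX
    "sast_summary.pdf",
    "sca_summary.pdf",
    "dast_summary.pdf",
    "secrets_scan_confirmation.txt",
    "poam.xlsx",
    "uat_signoff.pdf",
    "change_log.md",
    "rollback_plan.md",
]

def missing_artifacts(root: str) -> list[str]:
    """Return the required artifacts absent from the bundle directory."""
    base = Path(root)
    return [name for name in REQUIRED if not (base / name).exists()]

missing = missing_artifacts("release_evidence/2024.06.0")  # hypothetical bundle path
if missing:
    raise SystemExit(f"Evidence bundle incomplete, missing: {missing}")
```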

12.10 Auditability and retention

Testing artifacts (reports/summaries, screenshots, logs, tickets) must be retained with the release record per the Records Management and Retention Schedule and made available to CLD on request. Where raw artifacts contain sensitive details (e.g., exploit payloads, hostnames), summaries may be provided to CLD, with full artifacts available under NDA during audits. PENDING: Repository/path for storing testing artifacts
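
One way to support later verification is a checksum manifest generated when the release record is created, so CLD can confirm retained artifacts are unchanged at audit time. A minimal sketch, assuming a local `release_evidence` directory (the actual repository/path is PENDING):

```python
import hashlib
import json
from pathlib import Path

def build_manifest(root: str) -> dict[str, str]:
    """SHA-256 per artifact so retained evidence can be shown unmodified later."""
    digests = {}
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            digests[str(path.relative_to(root))] = hashlib.sha256(path.read_bytes()).hexdigest()
    return digests

if __name__ == "__main__":
    manifest = build_manifest("release_evidence")  # hypothetical local path
    Path("manifest.json").write_text(json.dumps(manifest, indent=2))
```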