24. Metrics and Reporting

24.1 Purpose

Define a concise set of SDLC key performance indicators (KPIs) and reporting cadences so CLD can verify control effectiveness, spot trends early, and drive continuous improvement across vendors and releases.

24.2 KPI set (minimum)

Track the following metrics per Application and aggregate them quarterly across all vendor deliveries (a computation sketch for the release quality and flow KPIs follows the list):

  • Release quality and flow

    • Release frequency (per month/quarter).

    • Change failure rate (% of releases requiring rollback or hotfix, or causing a Sev1/Sev2 incident within 7 days of deployment).

    • Mean time to restore (MTTR) for release‑related incidents.

  • Security assurance

    • Gate pass rate (releases that passed all required gates on first attempt).

    • % releases with complete evidence bundles (SBOM + SAST/SCA/DAST + UAT + rollback plan).

    • Vulnerabilities opened/closed by severity (Critical/High/Medium/Low) and mean time to remediate by severity.

    • SLA adherence for remediation (breaches by count and aging).

    • Waivers in force (count, age, compensating controls, time to closure).

  • Logging and monitoring

    • SIEM coverage (% of required event families present and parseable).

    • Log ingestion health (dropped/malformed events, time skew).

    • Alert efficacy (true positive rate for top detections; mean time to acknowledge/respond for Sev1/Sev2).

  • Availability and recovery

    • SLO/uptime attainment (if defined).

    • RTO/RPO test results (pass/fail, actuals vs targets).

    • Backup/restore test cadence adherence and issues found/fixed.

  • Privacy operations

    • DPIAs completed (count, turnaround time).

    • Subject‑rights requests supported by the Application (count, on‑time completion rate).

    • Retention/deletion jobs executed on schedule (success/fail; exceptions).

PENDING: Any additional business‑specific KPIs (e.g., claim processing latency targets)
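
The release quality and flow KPIs reduce to simple arithmetic over release and incident records. The sketch below shows one way to compute change failure rate and mean time to restore; the record shapes, field names, and sample data are illustrative assumptions, not a required schema or tooling choice.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Release:
    release_id: str
    deployed_at: datetime
    rolled_back: bool = False
    hotfixed: bool = False


@dataclass
class Incident:
    release_id: str      # release the incident is attributed to
    severity: str        # e.g. "Sev1", "Sev2"
    opened_at: datetime
    restored_at: datetime


def change_failure_rate(releases: list[Release], incidents: list[Incident],
                        window_days: int = 7) -> float:
    """Percentage of releases rolled back, hotfixed, or linked to a
    Sev1/Sev2 incident opened within window_days of deployment."""
    def failed(rel: Release) -> bool:
        if rel.rolled_back or rel.hotfixed:
            return True
        return any(
            inc.release_id == rel.release_id
            and inc.severity in ("Sev1", "Sev2")
            and inc.opened_at <= rel.deployed_at + timedelta(days=window_days)
            for inc in incidents
        )

    return 100.0 * sum(failed(r) for r in releases) / len(releases)


def mean_time_to_restore(incidents: list[Incident]) -> timedelta:
    """Mean time to restore (MTTR) across release-related incidents."""
    durations = [inc.restored_at - inc.opened_at for inc in incidents]
    return sum(durations, timedelta()) / len(durations)


# Illustrative quarter: three releases, one Sev1 incident, one hotfix.
rels = [Release("R1", datetime(2024, 4, 2)),
        Release("R2", datetime(2024, 5, 6), hotfixed=True),
        Release("R3", datetime(2024, 6, 3))]
incs = [Incident("R1", "Sev1", datetime(2024, 4, 3, 9), datetime(2024, 4, 3, 12))]
print(change_failure_rate(rels, incs))   # ~66.7 (R1 caused a Sev1; R2 was hotfixed)
print(mean_time_to_restore(incs))        # 3:00:00
```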

24.3 Reporting cadences

  • Monthly (internal operations snapshot): vulnerabilities and remediation SLAs, waivers aging, SIEM coverage/ingest health, notable alerts, backup/restore evidence updates. PENDING: Confirm monthly cadence and recipients

  • Quarterly (executive summary): release flow and quality, security assurance metrics, availability/DR results, privacy operations highlights, trend graphs, top risks, and action plan status.

  • Per‑release: gate evidence and pass/fail, change class, waivers, and PIR outcomes (Sections 18–19).

24.4 Responsibilities

  • Vendors: produce per‑release evidence, monthly vulnerability/waiver status, and quarterly assurance summaries; provide raw/summary data on request.

  • ISO: compile the cross‑vendor quarterly report; highlight risks, SLA breaches, and recurring issues; propose corrective actions.

  • Product/Process Owners and IT Operations: review trends relevant to business impact and operational stability; sponsor improvements.

24.5 Format and delivery

Use concise, consistent formats (a dashboard or a 1–2 page PDF per Application for monthly reporting; a consolidated slide deck or PDF for the quarterly report). Include a one‑page executive summary (RAG status, top three risks, top three actions) and an appendix with metric definitions. PENDING: Preferred tooling (e.g., shared dashboard, PDF), recipients, and due dates (e.g., M+10 days; Q+15 days)

24.6 Thresholds and triggers

Define thresholds that trigger escalation or corrective action:

  • Any Critical vulnerability open > PENDING: __ hours/days without containment.

  • Remediation SLA breach rate > PENDING: __% in a quarter.

  • Gate evidence completeness < PENDING: __% over the last [__] releases.

  • Two or more Sev1 incidents linked to releases in a quarter.

  • SIEM ingestion gaps > PENDING: __% of required events for > [__] days.

Escalations go to the Vendor Manager and ISO; persistent breaches may invoke contract levers.
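
Once the pending threshold values are agreed, these triggers can be checked mechanically against each reporting period's metrics. The sketch below illustrates such a check; the threshold figures and metric keys are hypothetical placeholders, not the agreed values or a mandated data model.

```python
# Hypothetical threshold values; the agreed figures are still pending above.
THRESHOLDS = {
    "critical_open_hours": 72,        # placeholder
    "sla_breach_rate_pct": 10.0,      # placeholder
    "evidence_complete_pct": 95.0,    # placeholder
    "sev1_release_incidents": 2,      # from the trigger list
    "siem_gap_pct": 5.0,              # placeholder
}


def breached_triggers(metrics: dict[str, float]) -> list[str]:
    """Return the escalation triggers breached in a reporting period.
    Metric keys are illustrative and must match however metrics are exported."""
    findings = []
    if metrics["oldest_open_critical_hours"] > THRESHOLDS["critical_open_hours"]:
        findings.append("Critical vulnerability open without containment")
    if metrics["sla_breach_rate_pct"] > THRESHOLDS["sla_breach_rate_pct"]:
        findings.append("Remediation SLA breach rate above threshold")
    if metrics["evidence_complete_pct"] < THRESHOLDS["evidence_complete_pct"]:
        findings.append("Gate evidence completeness below threshold")
    if metrics["sev1_release_incidents"] >= THRESHOLDS["sev1_release_incidents"]:
        findings.append("Two or more Sev1 incidents linked to releases")
    if metrics["siem_gap_pct"] > THRESHOLDS["siem_gap_pct"]:
        findings.append("SIEM ingestion gap above threshold")
    return findings  # any findings escalate to the Vendor Manager and ISO
```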

24.7 Data sources and integrity

Metrics must come from authoritative systems (ticketing, CI/CD, scanners, SIEM, monitoring) and be reproducible. Any manual compilation must cite sources and assumptions. Vendors must retain underlying data for audit for at least PENDING: __ months/years.

24.8 Continuous improvement

Quarterly, the ISO will review KPI trends and recommend adjustments to baselines (e.g., fail‑gate thresholds, upload limits, rate limits), training focus areas, and vendor action plans. Material changes feed into the Post‑Deployment Reviews and inform the policy’s next revision cycle.