ISO 27001:2022: Your ISMS Dashboard Reports Activity. Your Auditor Will Ask About Effectiveness. These Are Different Questions with Different Evidence Standards.

SGRII Insights  ·  ISO 27001:2022  ·  2026


Number of awareness training completions: 247. Number of vulnerabilities scanned: 1,200. Number of access reviews completed: 4. These are activity metrics. They tell you the system is busy. They do not tell you the system is working.


SGRII Performance & Digital Solutions

ISMS Practice  ·  April 2026  ·  11 min read


SGRII Pillar Lens

Improvement

Improvement requires measurement that connects to outcomes, not activities. The fifth SGRII pillar demands that tracking mechanisms drive measurable progress — not measurable motion. An ISMS that reports training completions, scan counts, and review schedules is measuring effort. An ISMS that reports remediation rates, detection times, and access rights accuracy is measuring security. The standard requires the second. The industry delivers the first.

The Activity Metric Trap

Activity metrics answer the question: did we do the thing? Effectiveness metrics answer the question: did the thing work? In an ISMS, the difference is not semantic. It is the difference between a security programme that generates reports and a security programme that generates security.

Common activity metrics in ISMS dashboards: number of awareness training completions, number of phishing simulations sent, number of vulnerability scans completed, number of incidents logged, number of access reviews conducted, number of policies reviewed. Every one of these measures whether an activity occurred. None measures whether the activity achieved its security objective.

The corresponding effectiveness metrics: percentage of employees who correctly identified phishing simulations (not how many simulations were sent), percentage of critical vulnerabilities remediated within SLA (not how many were scanned), mean time to detect and contain security incidents (not how many were logged), percentage of access reviews that identified and revoked excessive permissions (not how many reviews occurred). These metrics tell you whether the ISMS is producing security outcomes — not whether it is producing activity.

Effectiveness Metrics by Control Area

Vulnerability management (A.8.8): Activity metric — number of scans completed per quarter. Effectiveness metric — percentage of critical and high vulnerabilities remediated within defined SLA; trend in mean time to remediate; number of vulnerabilities that exceeded SLA without documented exception.
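The SLA compliance figure above is straightforward to compute from raw remediation records. A minimal sketch, assuming hypothetical record fields and an illustrative 14-day SLA for critical and high findings (the standard does not prescribe either):

```python
from datetime import date

SLA_DAYS = 14  # illustrative SLA; set per your risk treatment plan

def sla_compliance_rate(vulns):
    """Share of critical/high vulnerabilities remediated within SLA_DAYS.

    Each record is a dict with hypothetical fields: severity,
    discovered (date), remediated (date or None if still open).
    Open findings count as non-compliant, which keeps the metric honest.
    """
    in_scope = [v for v in vulns if v["severity"] in ("critical", "high")]
    if not in_scope:
        return None
    within = sum(
        1 for v in in_scope
        if v["remediated"] is not None
        and (v["remediated"] - v["discovered"]).days <= SLA_DAYS
    )
    return within / len(in_scope)

vulns = [
    {"severity": "critical", "discovered": date(2026, 1, 5),
     "remediated": date(2026, 1, 12)},   # closed in 7 days: within SLA
    {"severity": "high", "discovered": date(2026, 1, 10),
     "remediated": date(2026, 2, 20)},   # 41 days: breach
    {"severity": "low", "discovered": date(2026, 1, 3),
     "remediated": None},                # out of scope for this metric
]
rate = sla_compliance_rate(vulns)  # 1 of 2 in-scope findings → 0.5
```

Note that a scan count would stay flat whether `rate` is 0.5 or 1.0; only the effectiveness metric moves.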

Incident management (A.5.24–5.28): Activity metric — number of incidents logged. Effectiveness metric — mean time to detect (MTTD), mean time to contain (MTTC), percentage of incidents with completed post-incident review, percentage of post-incident recommendations implemented.
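MTTD and MTTC fall directly out of three timestamps per incident. A sketch under the assumption that each incident record carries occurred, detected, and contained datetimes (field names are hypothetical):

```python
from datetime import datetime

def mean_minutes(deltas):
    """Average a list of timedeltas, expressed in minutes."""
    return sum(d.total_seconds() for d in deltas) / len(deltas) / 60

def mttd_mttc(incidents):
    """MTTD = occurred→detected; MTTC = detected→contained (minutes)."""
    detect = [i["detected"] - i["occurred"] for i in incidents]
    contain = [i["contained"] - i["detected"] for i in incidents]
    return mean_minutes(detect), mean_minutes(contain)

incidents = [
    {"occurred": datetime(2026, 2, 1, 9, 0),
     "detected": datetime(2026, 2, 1, 9, 30),
     "contained": datetime(2026, 2, 1, 11, 30)},
    {"occurred": datetime(2026, 2, 10, 14, 0),
     "detected": datetime(2026, 2, 10, 15, 30),
     "contained": datetime(2026, 2, 10, 16, 30)},
]
mttd, mttc = mttd_mttc(incidents)  # 60.0 and 90.0 minutes
```

The incident count for this dataset is 2 regardless of whether detection took an hour or a month; the timestamps are what carry the effectiveness signal.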

Access control (A.5.15, A.8.2–8.5): Activity metric — number of access reviews completed. Effectiveness metric — percentage of reviews that identified excessive permissions, number of dormant accounts identified and disabled, privileged account to total account ratio trend.
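The same shift applies to access reviews: count what the reviews found, not how many ran. A minimal sketch, with hypothetical per-review fields for findings:

```python
def review_effectiveness(reviews):
    """Return (share of reviews that found excessive permissions,
    total dormant accounts disabled across all reviews)."""
    found = sum(1 for r in reviews if r["excessive_found"] > 0)
    dormant = sum(r["dormant_disabled"] for r in reviews)
    return found / len(reviews), dormant

reviews = [
    {"excessive_found": 3, "dormant_disabled": 2},
    {"excessive_found": 0, "dormant_disabled": 0},
    {"excessive_found": 1, "dormant_disabled": 5},
    {"excessive_found": 0, "dormant_disabled": 1},
]
share, dormant = review_effectiveness(reviews)  # 0.5 and 8
```

A persistently zero `share` is itself a finding: either access hygiene is excellent, or the reviews are rubber stamps. The activity metric (4 reviews completed) cannot distinguish the two.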

Awareness and competence (A.6.3, Clause 7.2–7.3): Activity metric — training completion rate. Effectiveness metric — phishing simulation click rate (trend over time), percentage of employees who correctly followed incident reporting procedure when tested, security competence assessment scores for security-critical roles.
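The click-rate trend mentioned above needs only per-quarter counts of simulations sent and clicked. A sketch, assuming chronologically ordered (clicked, sent) pairs:

```python
def click_rate_trend(quarters):
    """Phishing click rates per quarter plus quarter-on-quarter deltas.

    quarters: list of (clicked, sent) tuples in chronological order.
    """
    rates = [clicked / sent for clicked, sent in quarters]
    deltas = [b - a for a, b in zip(rates, rates[1:])]
    return rates, deltas

quarters = [(60, 500), (45, 500), (50, 500)]
rates, deltas = click_rate_trend(quarters)
# rates: 12% → 9% → 10%; the latest delta is positive, which under the
# logic above should put the awareness programme on the review agenda
```

The completion rate for the same three quarters could be 100% throughout while the click rate worsens; only the second number says whether the training works.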

Supplier security (A.5.19–5.22): Activity metric — number of supplier assessments completed. Effectiveness metric — percentage of critical suppliers meeting contractual security requirements at last assessment, number of supplier security nonconformities identified and resolved, trend in supplier security maturity.
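The supplier metric follows the same pattern: restrict to critical suppliers, then measure outcomes at the latest assessment. A sketch with hypothetical record fields:

```python
def supplier_compliance(suppliers):
    """Share of critical suppliers meeting contractual security
    requirements at their most recent assessment."""
    critical = [s for s in suppliers if s["critical"]]
    met = sum(1 for s in critical if s["requirements_met"])
    return met / len(critical)

suppliers = [
    {"critical": True, "requirements_met": True},
    {"critical": True, "requirements_met": False},
    {"critical": False, "requirements_met": True},  # not in scope
]
rate = supplier_compliance(suppliers)  # 1 of 2 critical suppliers → 0.5
```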

Connecting Metrics to Management Review Decisions

A metric that is collected, charted, and filed is overhead. A metric that drives a management review decision is governance. The connection between measurement and decision is where Clause 9.1 (monitoring and measurement) meets Clause 9.3 (management review).

Each effectiveness metric should have a defined threshold that triggers a management review decision. If the vulnerability remediation SLA compliance rate drops below 85%, the management review should address it — with a decision that has an owner, a deadline, and a resource implication. If the phishing simulation click rate increases quarter-on-quarter, the management review should evaluate whether the awareness programme needs restructuring.
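The threshold-to-decision linkage described above can be made mechanical. A minimal sketch, where the metric names, the 85% and 10% thresholds, and the agenda-item fields are illustrative assumptions (your thresholds come from your own risk appetite):

```python
# Illustrative thresholds tying Clause 9.1 metrics to Clause 9.3 decisions.
THRESHOLDS = {
    "vuln_sla_compliance": {"min": 0.85},  # below this → escalate
    "phishing_click_rate": {"max": 0.10},  # above this → escalate
}

def review_agenda(metrics):
    """Return the metrics that breach a threshold and therefore require
    a management review decision with an owner and a deadline."""
    items = []
    for name, value in metrics.items():
        rule = THRESHOLDS.get(name, {})
        breach = (("min" in rule and value < rule["min"])
                  or ("max" in rule and value > rule["max"]))
        if breach:
            items.append({"metric": name, "value": value,
                          "owner": None, "deadline": None})  # set in review
    return items

agenda = review_agenda({"vuln_sla_compliance": 0.78,
                        "phishing_click_rate": 0.06})
# only the SLA compliance breach lands on the agenda
```

The point of the sketch is the coupling: a metric with no entry in the threshold table can never put anything on the agenda, which is exactly the test applied in the next paragraph.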

The test for any metric in the ISMS dashboard: if this metric deteriorated significantly, would it change a decision? If yes, it is an effectiveness metric. If it would be ‘noted’ in the management review with no action, it is an activity metric. Activity metrics are not useless — they track operational throughput. But they do not measure whether the ISMS is producing security outcomes. And it is security outcomes that the standard requires the organisation to evaluate.

A dashboard full of green status indicators that measures activities, not effectiveness, is not evidence of a secure organisation. It is evidence of a busy one. The auditor will ask: how do you know your controls are working? ‘We completed all scheduled activities’ is not the answer.

The SGRII Position

The SGRII ISMS Measurement Framework distinguishes effectiveness metrics from activity metrics for each principal control area, with defined data collection requirements, analysis responsibilities, reporting cycles, and management review decision thresholds. The Excel Template Pack includes KPI dashboards structured around effectiveness measurement — not activity tracking.

Every metric in the framework connects to either a control effectiveness objective (tracing to the SoA), an information security objective (tracing to Clause 6.2), or a risk treatment effectiveness indicator (tracing to Clause 6.1). If a metric does not connect to any of these, it is operational management data — useful but not an ISMS performance indicator.
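That traceability rule can be expressed as a simple registry filter. A sketch with hypothetical metric names and linkage targets, mirroring the three anchors named above (SoA control, Clause 6.2 objective, Clause 6.1 risk treatment):

```python
# Hypothetical registry entries; all names are illustrative.
METRIC_REGISTRY = [
    {"metric": "critical_vuln_sla_compliance",
     "connects_to": ("control_effectiveness", "A.8.8")},
    {"metric": "mttd_minutes",
     "connects_to": ("security_objective", "Clause 6.2")},
    {"metric": "helpdesk_ticket_volume",
     "connects_to": None},  # no linkage → operational data, not an ISMS KPI
]

def isms_indicators(registry):
    """Keep only metrics that trace to an ISMS objective, control,
    or risk treatment; everything else is operational management data."""
    return [m["metric"] for m in registry if m["connects_to"] is not None]
```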

THE SGRII ISO 27001:2022 ISMS FRAMEWORK

The SGRII ISMS Framework measures effectiveness, not activity. KPI dashboards are structured around security outcomes per control area, with defined thresholds that connect to management review decisions.

Includes: ISMS Measurement Framework with effectiveness metrics per control area, KPI Dashboard templates (vulnerability management, incident management, access control, awareness, supplier security), Management Review Agenda with metric-to-decision linkage.

GET THE ISMS FRAMEWORK — FROM $149 ›

Join the Conversation

Take your three most-reported ISMS metrics. For each one: does it tell you whether the control is working — or whether the activity occurred? If all three are activity metrics, what would the effectiveness equivalent be — and do you collect that data?

Practitioner perspectives that challenge or extend this analysis are particularly welcome. Leave your comment below — the SGRII team responds to every substantive contribution.
