
ISO/IEC 42001:2023 Guide

What is ISO 42001?

ISO/IEC 42001:2023 is the world's first international standard for AI Management Systems (AIMS). Published in December 2023, it provides a framework for organisations to develop, deploy, and continuously improve AI systems responsibly and ethically.

Official Name: ISO/IEC 42001:2023 - Information technology — Artificial intelligence — Management system

Developed By: ISO/IEC JTC 1/SC 42 (Artificial Intelligence subcommittee)

Publication Date: December 15, 2023

Standard Type: Certifiable management system standard (like ISO 9001, ISO 27001)


Why ISO 42001 Matters for AI Governance

1. The Only Certifiable AI Governance Standard

ISO 42001 is currently the only internationally recognised AI governance standard against which organisations can achieve formal certification.

Framework | Certification Available? | Auditing Body
ISO 42001 | ✅ Yes | Accredited certification bodies (BSI, SGS, Bureau Veritas, etc.)
NIST AI RMF | ❌ No | Self-assessment only
EU AI Act | ⚠️ Partial | Conformity assessment (high-risk systems only)

This matters because:

  • Certification provides third-party validation of your AI governance practices
  • Demonstrates due diligence to customers, regulators, and investors
  • May become table stakes for enterprise AI procurement (similar to ISO 27001 for security)

2. Designed Specifically for AI

Unlike generic quality or IT management standards, ISO 42001 addresses AI-specific risks:

  • Bias and fairness in AI decision-making
  • Explainability and transparency requirements
  • Data quality for training and operation
  • Third-party AI components (vendor risk management)
  • Model drift and degradation over time
  • Incident response for AI failures

Comparison:

Standard | Focus | AI-Specific?
ISO 9001 | Quality management | ❌ Generic (any industry)
ISO 27001 | Information security | ❌ IT security (not AI-specific)
ISO 42001 | AI management system | ✅ Purpose-built for AI

3. Harmonised with Other Standards

ISO 42001 follows the ISO Harmonized Structure (Annex SL), making it compatible with other management systems:

  • ISO 9001 (Quality)
  • ISO 27001 (Information Security)
  • ISO 27701 (Privacy)
  • ISO 37001 (Anti-Bribery)

Benefit: If you already have ISO 9001 or ISO 27001, many processes (management review, internal audit, document control) can be shared across standards, reducing certification burden.

4. Regulatory Alignment

ISO 42001 is increasingly referenced in AI regulations:

Regulation | ISO 42001 Status
EU AI Act | Recognised as presumption of conformity for risk management (Article 40)
UK AI Regulation | Government encourages ISO 42001 for demonstrating compliance with AI principles
Singapore Model AI Governance Framework | ISO 42001 cited as implementation guidance

Expected trend: Regulatory bodies will increasingly accept ISO 42001 certification as evidence of compliance with AI-specific legal requirements.


How SignalBreak Maps to ISO 42001

SignalBreak provides automated evidence generation for 10 key ISO 42001 clauses. Your evidence packs demonstrate compliance with:

ISO Clause | Requirement | SignalBreak Evidence | Coverage
4.1 | Understanding context | Provider landscape analysis showing AI ecosystem dependencies | 🟢 Full
4.2 | Interested parties | Workflow owners and business context captured | 🟡 Partial
6.1.3 | Risk assessment | Decision Readiness Scorecard with transparent methodology | 🟢 Full
6.2.2 | AI system inventory | Workflow registry with complete AI system documentation | 🟢 Full
8.2 | Risk treatment | Prioritised recommendations with risk mitigation strategies | 🟢 Full
8.3 | Third-party relationships | Provider concentration analysis, continuous monitoring | 🟢 Full
8.4 | Impact assessment | Business impact quantification per finding | 🟢 Full
9.1 | Monitoring & measurement | Continuous signal monitoring (47 sources, 21 providers) | 🟢 Full
9.2 | Internal audit | Evidence Pack generation as periodic self-assessment | 🟡 Partial
10.2 | Continual improvement | Score trajectory, projected improvement from recommendations | 🟢 Full

Coverage Summary:

  • Full coverage: 8 of 10 clauses (80%)
  • Partial coverage: 2 of 10 clauses (20%)
  • No coverage (gaps): 0 clauses

ISO 42001 Clause Details

Clause 4.1: Understanding the Organization and Its Context

Requirement: Determine external and internal issues relevant to the AIMS, including the organization's purpose and how it affects achieving intended AI outcomes.

SignalBreak Evidence:

  • Provider landscape analysis (Evidence Pack p.6)
    • External context: AI provider ecosystem, industry dependencies, regulatory environment
    • Internal context: Workflow dependencies, technical constraints, business criticality

Example from Evidence Pack:

"SignalBreak continuously monitors 47 sources across 21 AI providers, providing comprehensive visibility into the external AI ecosystem. Internal context is captured through workflow documentation including business criticality (4 Critical, 2 High, 2 Medium), ownership assignment, and technical dependencies."

Audit Readiness: 🟢 Fully evidenced — Evidence Pack Section 3 (Provider Analysis) demonstrates both external (provider ecosystem) and internal (workflow criticality) context.


Clause 4.2: Understanding Needs and Expectations of Interested Parties

Requirement: Identify interested parties relevant to the AIMS, determine their requirements, and decide which requirements will be addressed.

SignalBreak Evidence:

  • Workflow owners identified in registry (Evidence Pack Appendix B)
  • Business stakeholders implied through criticality ratings
  • Implicit interested parties: Customers (affected by workflow availability), compliance teams (governance requirements)

Gap: SignalBreak captures workflow owners but does not explicitly document all interested parties (e.g., external customers, regulators, partners) or their specific requirements.

Audit Readiness: 🟡 Partial — Workflow owners are documented, but full interested party registry with explicit requirements needs supplementary documentation.

Remediation for Certification:

  1. Create a stakeholder register listing all interested parties
  2. Document their specific AI-related requirements (e.g., customers require 99.5% uptime, regulators require explainability)
  3. Cross-reference to workflows (which workflows serve which stakeholders)
  4. Use SignalBreak workflow owner field as starting point

Estimated effort: 2-4 hours (small organisation), 1-2 weeks (enterprise)


Clause 6.1.3: Risk Assessment

Requirement: Define and apply an AI risk assessment process that identifies, analyses, and evaluates risks.

SignalBreak Evidence:

  • Decision Readiness Scorecard (Evidence Pack p.2-3)
  • Risk methodology documentation (Evidence Pack Appendix C)
  • Five-dimension assessment framework:
    1. Workflow Coverage (25%)
    2. Fallback Readiness (25%)
    3. Provider Diversity (20%)
    4. Signal Response (15%)
    5. Governance Maturity (15%)
  • Transparent scoring with weighted components
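The weighted components above can be sketched in a few lines. A minimal illustration, assuming the five dimension names and weights listed in the framework; the function name and sample sub-scores are hypothetical, not the product's actual calculation:

```python
# Illustrative sketch of the five-dimension weighted score.
# Dimension names and weights come from the framework above;
# the function and the sample sub-scores are hypothetical.

WEIGHTS = {
    "workflow_coverage": 0.25,
    "fallback_readiness": 0.25,
    "provider_diversity": 0.20,
    "signal_response": 0.15,
    "governance_maturity": 0.15,
}

def readiness_score(sub_scores: dict) -> float:
    """Combine per-dimension scores (0-100) into a weighted 0-100 total."""
    assert set(sub_scores) == set(WEIGHTS), "all five dimensions required"
    return round(sum(WEIGHTS[k] * sub_scores[k] for k in WEIGHTS), 1)

# Hypothetical sub-scores: 0.25*80 + 0.25*60 + 0.20*70 + 0.15*60 + 0.15*70
print(readiness_score({
    "workflow_coverage": 80,
    "fallback_readiness": 60,
    "provider_diversity": 70,
    "signal_response": 60,
    "governance_maturity": 70,
}))  # → 68.5
```

Because the weights sum to 1.0, the combined score stays on the same 0-100 scale as the sub-scores.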

Example from Evidence Pack:

"The SignalBreak Decision Readiness Framework v2.1 provides a structured risk assessment across five dimensions. Methodology is fully documented and reproducible. Current score: 67/100 (Amber), indicating moderate AI dependency risk requiring mitigation strategies."

Audit Readiness: 🟢 Fully evidenced — ISO 42001 requires a defined and applied risk assessment process. SignalBreak's framework satisfies both requirements with documented methodology and regular application (evidence pack generation).

Key audit questions SignalBreak answers:

  • ❓ "How do you identify AI risks?" → Scenario-based risk modelling with provider failure analysis
  • ❓ "How do you analyse risks?" → Five-dimension weighted scoring with objective metrics
  • ❓ "How do you evaluate risks?" → RAG status (Red/Amber/Green) with defined thresholds
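The evaluation step can be illustrated as a simple threshold mapping. The cut-offs below are assumptions inferred from the worked scores quoted elsewhere in this guide (52 = Red, 67 = Amber, 76 at the Green threshold), not documented product values:

```python
def rag_status(score: float) -> str:
    """Map a 0-100 readiness score to a Red/Amber/Green band.

    Thresholds are illustrative assumptions chosen to match the worked
    examples in this guide; the product's actual cut-offs may differ.
    """
    if score >= 75:
        return "Green"
    if score >= 55:
        return "Amber"
    return "Red"

print(rag_status(52), rag_status(67), rag_status(76))  # → Red Amber Green
```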

Clause 6.2.2: AI System Inventory

Requirement: Establish and maintain an inventory of AI systems within the scope of the AIMS.

SignalBreak Evidence:

  • Workflow registry with complete inventory (Evidence Pack Appendix B)
  • Each workflow documented with:
    • Unique ID
    • Workflow name
    • AI capability type (Text Generation, Image Analysis, etc.)
    • Provider bindings (primary + fallbacks)
    • Criticality level (Critical, High, Medium, Low)
    • Assigned owner
    • Dependency mapping
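The record fields above can be sketched as a small data structure. This is a hypothetical illustration of one registry entry (field names are illustrative, not the product's schema), including the ownership and fallback checks an auditor would care about:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorkflowRecord:
    # Hypothetical sketch of one workflow-registry record.
    workflow_id: str                 # unique ID
    name: str                        # workflow name
    capability: str                  # AI capability type, e.g. "Text Generation"
    primary_provider: str            # primary provider binding
    fallback_providers: List[str] = field(default_factory=list)
    criticality: str = "Medium"      # Critical / High / Medium / Low
    owner: str = ""                  # assigned owner (empty = orphaned)
    depends_on: List[str] = field(default_factory=list)  # dependency mapping

    def audit_gaps(self) -> List[str]:
        """Flag the inventory gaps an auditor would raise."""
        gaps = []
        if not self.owner:
            gaps.append("no assigned owner")
        if self.criticality in ("Critical", "High") and not self.fallback_providers:
            gaps.append("no fallback provider")
        return gaps

record = WorkflowRecord("WF-001", "Customer Support Summarisation",
                        "Text Generation", "OpenAI", criticality="Critical")
print(record.audit_gaps())  # → ['no assigned owner', 'no fallback provider']
```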

Example from Evidence Pack:

"SignalBreak maintains a comprehensive workflow registry documenting all AI systems in scope. The current inventory includes 8 AI workflows across 7 providers, with full dependency mapping, ownership assignment, and criticality classification."

Audit Readiness: 🟢 Fully evidenced — Workflow registry satisfies ISO 42001 Clause 6.2.2 inventory requirements. Each workflow record includes mandatory fields for certification.

Best Practice: Ensure 100% of production AI systems are registered in SignalBreak. Unregistered workflows create inventory gaps that auditors will flag as non-conformities.

Verification steps for audit:

  1. Cross-reference workflow registry against production deployment manifests
  2. Confirm no "shadow AI" (undocumented AI usage)
  3. Verify all workflows have assigned owners (no orphaned systems)

Clause 8.2: AI Risk Treatment

Requirement: Define and apply an AI risk treatment process to select appropriate risk treatment options and determine controls.

SignalBreak Evidence:

  • Prioritised recommendations (Evidence Pack p.5)
  • Risk mitigation strategies per finding
  • Owner assignment and timelines
  • Success criteria for risk reduction
  • Projected score improvement from recommended actions

Example from Evidence Pack:

"Each Evidence Pack includes prioritised recommendations with explicit risk reduction targets. Recommendations include owner assignment, timeline, success criteria, and estimated score improvement. Current recommendations project +18 point improvement if implemented within stated timelines."

Audit Readiness: 🟢 Fully evidenced — Evidence Pack recommendations demonstrate risk treatment process with:

  • Identified controls: Specific actions (e.g., "Add fallback provider to Customer Support Summarisation workflow")
  • Treatment options: Accept, mitigate, transfer, or avoid (implied through criticality-based prioritisation)
  • Implementation plan: Timeline, owner, success criteria

Key audit questions SignalBreak answers:

  • ❓ "How do you select risk treatments?" → Prioritised by business impact and risk score contribution
  • ❓ "Who is responsible for risk treatment?" → Owners assigned in recommendations
  • ❓ "How do you verify effectiveness?" → Score trajectory tracking shows improvement from implemented recommendations

Clause 8.3: Third-Party Relationships

Requirement: Determine and apply criteria for the selection, evaluation, and monitoring of third parties providing AI components.

SignalBreak Evidence:

  • Provider concentration analysis (Evidence Pack p.6)
  • Provider health signals tracked continuously (Evidence Pack p.7)
  • Risk-based provider assessment (Tier 1-4 classification)
  • Continuous monitoring (5-minute polling cycle for status pages)

Example from Evidence Pack:

"SignalBreak monitors all third-party AI providers with continuous signal detection. Provider concentration analysis identifies supply chain concentration risks, with current maximum concentration at 25% (OpenAI). Elevated risk providers are flagged with specific recommendations."

Audit Readiness: 🟢 Fully evidenced — SignalBreak demonstrates third-party management across:

  • Selection criteria: Provider tiers (Tier 1 = enterprise-grade, 99.9%+ SLA)
  • Evaluation: Health monitoring, incident tracking, availability metrics
  • Monitoring: Continuous signal detection from 47 sources across 21 providers
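The concentration figure can be illustrated with a short sketch. Assumption: concentration is the largest share of primary workflow bindings held by a single provider (the real metric may also weight by criticality); the registry below is hypothetical:

```python
from collections import Counter

def max_concentration(primary_bindings):
    """Provider holding the largest share of primary workflow bindings.

    Assumption: concentration = share of workflows bound to one provider.
    """
    provider, n = Counter(primary_bindings).most_common(1)[0]
    return provider, n / len(primary_bindings)

# Hypothetical registry: 8 workflows, 2 bound to OpenAI → 25% concentration
bindings = ["OpenAI", "OpenAI", "Anthropic", "Mistral",
            "Google", "Cohere", "AWS Bedrock", "Azure OpenAI"]
print(max_concentration(bindings))  # → ('OpenAI', 0.25)
```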

ISO 42001 vs Traditional Vendor Management:

Traditional Vendor Management | ISO 42001 AI Vendor Management | SignalBreak Evidence
Annual vendor review | Continuous monitoring | 5-minute status polling
Static risk assessment | Dynamic risk scoring | Real-time provider health
Contract-based SLAs | Actual availability tracking | Historical uptime metrics
Manual incident tracking | Automated signal detection | 47 sources monitored

Certification tip: Combine SignalBreak evidence with contractual documentation (SOW, SLA, DPA) for complete third-party evidence package.


Clause 8.4: AI System Impact Assessment

Requirement: Determine whether an AI system impact assessment is required and conduct such assessment where required.

SignalBreak Evidence:

  • Business impact quantification per finding (Evidence Pack p.4)
  • Downtime hours and cost estimates
  • Scenario-based impact modelling (what-if analysis for provider failures)
  • Likelihood × Impact matrices
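The likelihood × impact arithmetic can be checked against the ranges this section quotes (2-4 incidents/year, £15k-50k per incident). The formula is a simple illustrative expectation, not the product's exact methodology:

```python
# Likelihood × impact arithmetic, using the ranges quoted in this section.
# Illustrative expectation only, not the product's exact methodology.

def expected_annual_impact(incidents_per_year, cost_per_incident):
    """Expected annual cost = likelihood × impact."""
    return incidents_per_year * cost_per_incident

low = expected_annual_impact(2, 15_000)    # best case: £30,000/year
high = expected_annual_impact(4, 50_000)   # worst case: £200,000/year
print(low, high)  # → 30000 200000
```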

Example from Evidence Pack:

"Each finding includes business impact quantification with transparent methodology. Impact estimates include downtime hours (24-72h), cost ranges (£15k-50k), and likelihood assessment (Medium: 2-4 incidents/year based on historical provider availability). Calculations based on workflow transaction volume, average value, and customer sensitivity factors."

Audit Readiness: 🟢 Fully evidenced — Impact assessments satisfy ISO 42001 requirements for:

  • Determining necessity: Critical workflows automatically flagged for impact assessment
  • Conducting assessment: Quantified business impact (downtime, cost, customer satisfaction)
  • Documentation: Evidence Pack findings section provides auditable impact records

ISO 42001 Impact Assessment Triggers:

Trigger | SignalBreak Detection
High-risk AI system | Workflows with "Critical" or "High" criticality
New AI deployment | New workflow creation (recommended: generate evidence pack before production)
Significant changes | Provider binding changes, model upgrades (tracked in change log)
Incidents | Provider outages, model degradation (signals detected automatically)

Clause 9.1: Monitoring, Measurement, Analysis and Evaluation

Requirement: Determine what needs to be monitored and measured, methods for monitoring and measurement, when monitoring and measurement shall be performed, and when results shall be analysed and evaluated.

SignalBreak Evidence:

  • Continuous provider monitoring (47 sources, 21 providers)
  • Signal detection with 7-day reporting cycle
  • Trend analysis with historical comparison
  • Defined metrics: Decision Readiness Score (0-100), provider availability %, incident count

Example from Evidence Pack:

"SignalBreak provides continuous monitoring across 47 provider sources covering 21 AI providers. Signals are detected, classified, and correlated to customer workflows automatically. Evidence Packs are generated on configurable cadence (recommended: monthly) with trend analysis comparing to previous assessment periods."

Audit Readiness: 🟢 Fully evidenced — SignalBreak demonstrates systematic monitoring:

ISO 42001 Question | SignalBreak Answer
What to monitor? | AI provider health, workflow availability, risk score
Methods for monitoring? | Statuspage.io polling, signal classification, risk calculation
When to monitor? | Continuous (5-minute polling), periodic (monthly evidence packs)
When to analyse? | Real-time (signal detection), monthly (evidence pack generation)

Monitoring Cadence Recommendations:

Organisation Size | Evidence Pack Frequency | Rationale
Startup (<50 employees) | Quarterly | Lower AI complexity, limited resources
SME (50-500 employees) | Monthly | Growing AI footprint, board reporting
Enterprise (500+ employees) | Monthly + on-demand | High AI dependency, regulatory requirements
Regulated (FS, Healthcare) | Monthly minimum | Compliance evidence, audit preparation

Clause 9.2: Internal Audit

Requirement: Conduct internal audits at planned intervals to provide information on whether the AIMS conforms to requirements and is effectively implemented.

SignalBreak Evidence:

  • Evidence Pack generation as periodic self-assessment
  • Structured methodology ensures consistent evaluation
  • Historical comparison shows compliance trends

Gap: SignalBreak provides assessment evidence but does not replace formal internal audit procedures. ISO 42001 requires:

  • Audit plan (scope, schedule, audit criteria)
  • Competent auditors (internal or external)
  • Audit reports (findings, corrective actions)
  • Management review of audit results

Audit Readiness: 🟡 Partial — Evidence Pack supports audit activities but is not itself an audit.

Remediation for Certification:

  1. Establish formal audit schedule

    • Example: Quarterly internal audits covering all ISO 42001 clauses
    • Rotate focus areas (Q1: Risk Assessment, Q2: Third-Party, Q3: Monitoring, Q4: Improvement)
  2. Use Evidence Pack as audit input

    • Provide most recent Evidence Pack to internal auditor before audit
    • Auditor uses SignalBreak data as objective evidence
    • Auditor verifies processes beyond SignalBreak scope (policies, training, management review)
  3. Document audit procedure

    • Create "Internal Audit Procedure for AI Management System"
    • Reference SignalBreak Evidence Pack as key input (but not sole evidence)
    • Define auditor competence requirements

Estimated effort: 1-2 days to create audit procedure, 2-4 hours per audit execution

Tip: Many organisations use external consultants for internal audits (especially first-time ISO 42001). SignalBreak evidence packs reduce consultant hours by 40-60% (data gathering already done).


Clause 10.2: Continual Improvement

Requirement: Continually improve the suitability, adequacy, and effectiveness of the AIMS.

SignalBreak Evidence:

  • Score trajectory showing improvement over time (Evidence Pack p.2)
  • Recommendations with projected improvement
  • Historical comparison to previous assessments
  • Trend analysis (improving, stable, or declining)

Example from Evidence Pack:

"SignalBreak tracks Decision Readiness Score over time, providing visibility into governance improvement trajectory. Current score: 67/100 (up from 52 last month, +15 points). Each Evidence Pack includes recommendations with projected score improvement. Current recommendations estimate +18 point improvement if implemented within stated timelines."

Audit Readiness: 🟢 Fully evidenced — Continual improvement demonstrated through:

  • Measurement: Score tracking over time
  • Analysis: Gap analysis (current vs. target state)
  • Action: Prioritised recommendations
  • Verification: Next evidence pack shows whether recommendations were implemented

Continual Improvement Cycle with SignalBreak:

Month 1: Generate Evidence Pack → Score: 52/100 (Red)

Month 1-2: Implement top 3 recommendations
  1. Add fallback to critical workflows (+8 points)
  2. Reduce OpenAI concentration (+5 points)
  3. Assign workflow owners (+2 points)

Month 3: Generate Evidence Pack → Score: 67/100 (Amber)

Month 3-4: Implement next 3 recommendations
  4. Enable human-in-loop for high-risk workflows (+3 points)
  5. Document incident response procedures (+4 points)
  6. Expand monitoring coverage (+2 points)

Month 5: Generate Evidence Pack → Score: 76/100 (Amber → Green threshold)
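The score arithmetic in the cycle above can be verified directly: each projected score is the prior score plus the summed gains of the implemented recommendations.

```python
# Check of the score arithmetic in the improvement cycle above.

def project(score, gains):
    """Projected score = prior score + sum of recommendation gains."""
    return score + sum(gains)

month3 = project(52, [8, 5, 2])      # fallbacks, concentration, owners
month5 = project(month3, [3, 4, 2])  # human-in-loop, incident docs, monitoring
print(month3, month5)  # → 67 76
```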

ISO 42001 Audit Evidence: Show auditor 3-6 months of evidence packs demonstrating score improvement from implemented recommendations. This proves systematic continual improvement process.


Scoring Methodology (ISO 42001 Perspective)

How SignalBreak Calculates Your ISO 42001 Readiness

SignalBreak does not assign a specific "ISO 42001 compliance score." Instead, it provides:

  1. Decision Readiness Score (0-100): Overall AI risk governance maturity
  2. Clause-by-clause evidence mapping: Shows which ISO 42001 requirements are satisfied
  3. Gap analysis: Identifies missing evidence for certification

Why no ISO 42001 score? ISO 42001 is a binary certification (you're either certified or not). There's no "75% compliant" status. SignalBreak instead focuses on evidence readiness — do you have the data and documentation an auditor needs?

Evidence Pack as Certification Input

Your evidence pack provides objective evidence for 8 of 10 key clauses. For certification, you'll also need:

ISO 42001 Requirement | SignalBreak Evidence | Additional Documentation Needed
Context of organisation (4.1) | ✅ Provider landscape | Organisational strategic plan, SWOT analysis
Interested parties (4.2) | 🟡 Workflow owners | Full stakeholder register with requirements
Leadership (5.x) | ❌ Not in scope | Top management commitment, resource allocation
Planning (6.x) | ✅ Risk assessment, inventory | AI objectives, KPIs, risk treatment plan
Support (7.x) | ❌ Not in scope | Competence records, training logs, documented procedures
Operation (8.x) | ✅ Risk treatment, third-party, impact | Operational procedures, change management
Performance (9.x) | ✅ Monitoring, 🟡 audit | Formal audit plan, management review records
Improvement (10.x) | ✅ Continual improvement | Corrective action records, lessons learned

SignalBreak provides ~60% of certification evidence (the data-intensive clauses). You'll need supplementary policy and procedural documentation for the remaining 40%.


Control Categories and What They Assess

ISO 42001 organises its requirements into seven main sections, Clauses 4 through 10 (preceded by scope, normative references, and terms):

Section 4: Context of the Organisation

What it assesses: Does your organisation understand its internal and external context for AI use?

Key controls:

  • External issues (regulatory environment, market conditions, technology trends)
  • Internal issues (capabilities, resources, culture)
  • Interested parties (stakeholders affected by AI systems)
  • AIMS scope definition

SignalBreak evidence:

  • ✅ External context via provider landscape
  • 🟡 Internal context via workflow criticality
  • 🟡 Interested parties via workflow owners (needs expansion)


Section 5: Leadership

What it assesses: Does top management demonstrate commitment to the AIMS?

Key controls:

  • Management commitment and policy
  • Organisational roles and responsibilities
  • Communication of AIMS importance

SignalBreak evidence: ❌ Not in scope — requires executive-level documentation (policy statements, role definitions)

Recommendation: SignalBreak evidence packs can support management review meetings (provide data for decision-making), but you must document management's commitment separately.


Section 6: Planning

What it assesses: How does the organisation plan to address AI risks and opportunities?

Key controls:

  • Risk and opportunity assessment (Clause 6.1.3)
  • AI objectives and planning to achieve them
  • AI system inventory (Clause 6.2.2)

SignalBreak evidence:

  • ✅ Risk assessment (Decision Readiness Framework)
  • ✅ AI inventory (Workflow registry)
  • ❌ AI objectives (organisational goals like "reduce AI-related downtime by 50%")

Recommendation: Use the SignalBreak score as the basis for AI objectives. Example: "Achieve Green status within 12 months" or "Reduce provider concentration below 25% by Q3."


Section 7: Support

What it assesses: Does the organisation provide resources, competence, and awareness for the AIMS?

Key controls:

  • Resources (people, infrastructure, budget)
  • Competence (training, skills)
  • Awareness (understanding of AIMS importance)
  • Communication (internal and external)
  • Documented information (document control)

SignalBreak evidence: ❌ Not in scope — requires HR and training documentation

Recommendation: Create AI governance training referencing SignalBreak processes (how to interpret evidence packs, how to respond to provider signals). Document training attendance for audit.


Section 8: Operation

What it assesses: How does the organisation implement and control AI-related processes?

Key controls:

  • Operational planning and control
  • AI risk treatment (Clause 8.2)
  • Third-party relationships (Clause 8.3)
  • Impact assessment (Clause 8.4)
  • AI system development and use
  • Human oversight
  • Data management
  • Information for AI system users

SignalBreak evidence:

  • ✅ Risk treatment (recommendations with timelines)
  • ✅ Third-party management (provider monitoring)
  • ✅ Impact assessment (business impact quantification)
  • ❌ Development processes, data management (needs separate documentation)

Recommendation: SignalBreak covers third-party AI (models as a service). For internally developed AI, you'll need separate documentation on development lifecycle, testing, validation, data governance.


Section 9: Performance Evaluation

What it assesses: How does the organisation monitor, measure, and improve the AIMS?

Key controls:

  • Monitoring and measurement (Clause 9.1)
  • Analysis and evaluation
  • Internal audit (Clause 9.2)
  • Management review

SignalBreak evidence:

  • ✅ Monitoring (continuous signal detection)
  • 🟡 Internal audit (evidence pack as audit input, not audit itself)
  • ❌ Management review (executive-level meeting records)

Recommendation: Present evidence pack at quarterly management review meetings. Document review outcomes (decisions, resource allocation, corrective actions) separately.


Section 10: Improvement

What it assesses: How does the organisation drive continual improvement of the AIMS?

Key controls:

  • Continual improvement (Clause 10.2)
  • Nonconformity and corrective action
  • Preventive action

SignalBreak evidence:

  • ✅ Continual improvement (score trajectory)
  • ❌ Nonconformity management (incident records, root cause analysis)

Recommendation: When AI-related incidents occur (e.g., provider outage affecting production), reference SignalBreak signals in incident reports. Show how SignalBreak detected the issue and what corrective actions were taken.


How to Improve Your ISO 42001 Readiness

Step 1: Achieve Green Status

Your Decision Readiness Score is a proxy for AI governance maturity. Auditors will ask: "How do you know your AI management system is effective?"

Evidence: Show Green status with stable or improving trend.

Actions to improve score:

  1. Add fallback providers to all Critical workflows (fastest path to score improvement)
  2. Reduce provider concentration below 25% (spreads risk)
  3. Assign workflow owners (demonstrates accountability)
  4. Enable human-in-loop for high-risk workflows (reduces estimated downtime)

See Risk Scoring Methodology for detailed improvement strategies.


Step 2: Close Clause Gaps

Address the 2 partial coverage clauses before certification audit:

Gap 1: Clause 4.2 — Interested Parties

Current state: 🟡 Workflow owners documented, but no full stakeholder register

Remediation: Create a stakeholder register documenting:

  • Customers affected by each workflow
  • Internal teams (compliance, legal, IT)
  • External partners (data processors, AI vendors)
  • Regulators (FCA, ICO, etc. depending on industry)
  • For each stakeholder: their AI-related requirements

Template:

Stakeholder | Relationship | AI-Related Requirements | Addressed How
Enterprise customers | External | 99.5% SLA, explainability | Fallback providers, human-in-loop
Compliance team | Internal | Evidence for audits, risk reporting | Monthly evidence packs
OpenAI | Third-party | API terms compliance, data residency | Contract review, DPA
ICO (UK regulator) | External | GDPR compliance for AI processing | DPIA, transparency notices

Estimated effort: 4-8 hours


Gap 2: Clause 9.2 — Internal Audit

Current state: 🟡 Evidence pack supports audits, but no formal audit procedure

Remediation: Create an Internal Audit Procedure for AI Management System covering:

  1. Audit schedule: Quarterly audits covering all AIMS clauses
  2. Audit scope: Which AI systems, processes, locations
  3. Auditor competence: Requirements (ISO 42001 knowledge, AI understanding)
  4. Audit process:
    • Planning (review last evidence pack, identify focus areas)
    • Execution (interviews, document review, observation)
    • Reporting (findings, nonconformities, observations)
    • Follow-up (corrective actions)
  5. SignalBreak integration: Evidence pack as key input to audit

Sample audit plan:

Quarter | Focus Areas | SignalBreak Input
Q1 | Risk assessment, AI inventory | Evidence pack Section 2-3
Q2 | Third-party management, monitoring | Evidence pack Section 3-4
Q3 | Impact assessment, risk treatment | Evidence pack Section 2, Recommendations
Q4 | Continual improvement, management review | Score trajectory, trend analysis

Estimated effort: 1-2 days to create procedure, 4-6 hours per audit execution


Step 3: Establish Management Review Process

ISO 42001 Clause 9.3 requires top management to review the AIMS at planned intervals.

Recommendation: Quarterly management review meetings covering:

  1. AIMS performance:

    • Decision Readiness Score trend (from evidence packs)
    • Provider incidents and their business impact
    • Compliance status (progress toward Green)
  2. Internal audit results:

    • Findings from last internal audit
    • Corrective actions taken
  3. Changes affecting AIMS:

    • New AI systems deployed
    • Provider changes (new contracts, terminations)
    • Regulatory changes (new AI laws)
  4. Resource needs:

    • Budget for AI governance tools (SignalBreak subscription)
    • Staff for AI risk management
    • Training requirements
  5. Improvement opportunities:

    • Recommendations from evidence pack
    • Lessons learned from incidents

Output: Management review minutes documenting decisions, actions, and resource allocation.

SignalBreak evidence pack provides agenda items 1, 2 (via audit input), and 5.


Step 4: Document Policies and Procedures

ISO 42001 requires documented information for key processes. At minimum, create:

Document | Purpose | SignalBreak Reference
AI Governance Policy | Top-level commitment to responsible AI | Reference Decision Readiness Framework as risk methodology
AI System Lifecycle Procedure | How AI systems are developed/procured | Reference workflow registration process
Third-Party AI Management Procedure | Vendor selection, monitoring, offboarding | Reference provider monitoring (Evidence Pack Section 3-4)
AI Risk Assessment Procedure | How AI risks are identified and assessed | Reference Decision Readiness Framework methodology
AI Incident Response Procedure | Responding to AI failures | Reference provider signal alerts
Internal Audit Procedure | Audit planning, execution, reporting | Reference evidence pack as audit input

Estimated effort: 2-4 weeks (with consultant support) or 6-8 weeks (in-house)


Step 5: Run Pre-Certification Gap Assessment

Before engaging a certification body, conduct a gap assessment (pre-audit):

Option 1: Self-assessment

  • Use ISO 42001 clause checklist (available from ISO or consultants)
  • For each clause, assess: ✅ Implemented, 🟡 Partial, ❌ Gap
  • Prioritise gaps by certification criticality

Option 2: External consultant

  • Hire ISO 42001 consultant (1-2 days, £2k-5k)
  • Provide recent SignalBreak evidence pack as input
  • Consultant identifies gaps and remediation plan

SignalBreak reduces gap assessment time by ~50% (data-intensive clauses already evidenced).


Evidence Requirements for Certification

What Auditors Will Request

When you engage a certification body for ISO 42001 audit, expect requests for:

1. Management System Documentation

Document | SignalBreak Provides? | What You Need
AIMS scope statement | ❌ | Define which AI systems, departments, locations
AI governance policy | ❌ | Top management commitment, principles
Organisational chart | ❌ | Roles and responsibilities for AIMS
Risk assessment methodology | ✅ | Evidence pack Appendix C
AI system inventory | ✅ | Workflow registry (Evidence Pack Appendix B)

2. Operational Evidence

Evidence Type | SignalBreak Provides? | Example
Risk assessments | ✅ | Decision Readiness Scorecard (Evidence Pack p.2-3)
Risk treatment records | ✅ | Recommendations with timelines (Evidence Pack p.5)
Third-party monitoring | ✅ | Provider concentration analysis (Evidence Pack p.6)
Impact assessments | ✅ | Business impact quantification (Evidence Pack p.4)
Monitoring records | ✅ | Signal monitoring (Evidence Pack p.7)
Incident records | 🟡 | Provider outages detected (needs incident response documentation)

3. Performance Evaluation Evidence

Evidence Type | SignalBreak Provides? | Example
Monitoring results | ✅ | Provider availability metrics, signal counts
Internal audit reports | 🟡 | Evidence pack as audit input (needs formal audit report)
Management review records | ❌ | Minutes from quarterly reviews
Corrective action records | ❌ | Nonconformity tracking, root cause analysis

4. Improvement Evidence

| Evidence Type | SignalBreak Provides? | Example |
|---|---|---|
| Continual improvement | ✅ | Score trajectory (Evidence Pack p.2) |
| Lessons learned | ❌ | Post-incident reviews, process improvements |
| Effectiveness verification | ✅ | Next evidence pack shows score improvement from recommendations |

Audit Process Overview

Stage 1: Documentation Review (Off-site)

  • Certification body reviews your AIMS documentation
  • Provides gap report (missing documents, incomplete processes)
  • SignalBreak evidence pack demonstrates operational maturity (reduces Stage 1 findings)

Stage 2: Implementation Audit (On-site or remote)

  • Auditor interviews staff, observes processes
  • Verifies documented procedures are followed in practice
  • Checks records (evidence packs, audit reports, management reviews)
  • SignalBreak evidence pack provides objective records for interviews

Certification Decision:

  • No major nonconformities → Certificate issued (3-year validity)
  • Major nonconformities → Remediation required, re-audit
  • Minor nonconformities → Corrective actions, verified at surveillance audit

Surveillance Audits (Annual):

  • Lighter-touch audits during 3-year certificate period
  • SignalBreak evidence packs demonstrate continuous compliance

Evidence Pack Best Practices for Auditors

  1. Generate evidence pack 1-2 weeks before audit

    • Ensures data is current (auditors may question old data)
    • Allows time to review and prepare explanations
  2. Provide auditor with PDF copy in advance

    • Professional formatting demonstrates maturity
    • Allows auditor to prepare questions
  3. Cross-reference evidence pack in documentation

    • Example: Risk Assessment Procedure → "See SignalBreak Evidence Pack Section 2 for most recent assessment"
    • Shows integration between process and evidence
  4. Show trend over time

    • Provide 3-6 months of evidence packs
    • Demonstrates continual improvement (score trajectory)
  5. Explain limitations transparently

    • SignalBreak covers third-party AI (not internally developed models)
    • Evidence pack supports audit but doesn't replace it
    • Auditors appreciate honesty about scope

Certification Timeline and Costs

Typical ISO 42001 Certification Journey

| Phase | Duration | Estimated Cost | Key Activities |
|---|---|---|---|
| 0. Preparation | 3-6 months | £5k-15k (consultant) | Gap assessment, policy development, SignalBreak onboarding |
| 1. Documentation | 2-4 months | £3k-8k (templates, consultant review) | Create AIMS procedures, integrate SignalBreak evidence |
| 2. Implementation | 3-6 months | £0 (internal effort) | Run processes, generate evidence packs monthly |
| 3. Internal audit | 1-2 months | £2k-5k (external auditor) | Pre-certification check, corrective actions |
| 4. Stage 1 audit | 1-2 weeks | £8k-15k (cert body) | Documentation review, gap report |
| 5. Stage 2 audit | 1-2 weeks | Included in cert body fee | Implementation audit, certification decision |
| **Total (to certification)** | **12-18 months** | **£18k-43k** | First-time certification (no existing AIMS) |

Recertification (every 3 years): £10k-20k + annual surveillance audits (£5k-8k/year)


How SignalBreak Reduces Certification Cost

| Cost Category | Without SignalBreak | With SignalBreak | Savings |
|---|---|---|---|
| Gap assessment | 5 days @ £1k/day = £5k | 2 days @ £1k/day = £2k | £3k (SignalBreak pre-evidences 60% of clauses) |
| Ongoing monitoring | Manual quarterly reviews (40h/year @ £100/h) = £4k/year | Automated evidence packs (4h/year) = £400/year | £3.6k/year |
| Audit preparation | 20h gathering evidence @ £100/h = £2k | 4h reviewing evidence pack = £400 | £1.6k per audit |
| Consultant hours | 30 days @ £1k/day = £30k | 15 days @ £1k/day = £15k | £15k (less data gathering, more strategy) |

Total estimated savings: £23k over 3-year certification cycle


Certification Body Selection

Choose an accredited certification body for ISO 42001. Major providers:

| Certification Body | Geographic Coverage | Typical Cost (SME) | Notes |
|---|---|---|---|
| BSI (British Standards Institution) | Global | £12k-20k | Strong UK presence, AI expertise |
| SGS | Global | £10k-18k | Large portfolio, multi-site experience |
| Bureau Veritas | Global | £10k-18k | Good for manufacturing/engineering |
| NQA | UK, Europe | £8k-15k | Cost-effective, good for SMEs |
| LRQA | Global | £12k-20k | Strong in regulated industries (FS, pharma) |

Selection criteria:

  1. Accreditation: Verify UKAS accreditation for ISO 42001 (check UKAS website)
  2. Industry experience: Do they audit organisations in your sector?
  3. AI expertise: Do auditors understand AI/ML technologies?
  4. Geographic presence: Can they audit all your sites (if multi-location)?
  5. Cost vs. value: Cheapest isn't always best (inexperienced auditors create risk)

ISO 42001 vs Other Frameworks

Complementary Use with NIST AI RMF and EU AI Act

ISO 42001 is not mutually exclusive with other frameworks. In fact, it's designed to complement them:

| Framework | Relationship to ISO 42001 | Use Both? |
|---|---|---|
| NIST AI RMF | Risk assessment methodology (feeds into ISO 42001 Clause 6.1.3) | ✅ Yes — NIST provides risk categories, ISO provides management system |
| EU AI Act | Legal compliance (ISO 42001 supports conformity assessment) | ✅ Yes — EU AI Act mandates risk management, ISO 42001 is a recognised method |
| ISO 27001 | Information security (broader than AI) | ✅ Yes — many clauses overlap (risk, monitoring, audit) |

SignalBreak supports all three simultaneously — evidence packs include:

  • ISO 42001 clause mapping
  • EU AI Act risk classification
  • NIST AI RMF alignment (via MIT Risk Repository)

See Governance Overview for multi-framework strategy.


When to Choose ISO 42001

Choose ISO 42001 if:

  • ✅ You need third-party certification for customer/regulator assurance
  • ✅ You're in the EU and subject to AI Act high-risk requirements
  • ✅ You already have ISO 9001/ISO 27001 (easier integration via Annex SL)
  • ✅ You're a vendor selling AI products/services (certification is competitive advantage)
  • ✅ You have internal AI development (ISO 42001 covers full AI lifecycle)

Don't choose ISO 42001 if:

  • ❌ You only use third-party AI APIs with no customisation (NIST AI RMF may suffice)
  • ❌ You're a small startup with <10 AI workflows (overhead may not justify certification)
  • ❌ Your industry has sector-specific AI standards (e.g., medical devices have IEC 62304)

Hybrid approach: Many organisations use NIST AI RMF for initial risk assessment, then pursue ISO 42001 certification when they reach enterprise maturity or need regulatory evidence.


Common Questions

Do I need ISO 42001 if I'm already ISO 27001 certified?

Not necessarily, but highly recommended if AI is core to your business.

Overlap:

  • Risk management processes (~40% overlap)
  • Monitoring and measurement (~30% overlap)
  • Audit and improvement (~50% overlap)

ISO 42001-specific additions:

  • AI-specific risk categories (bias, explainability, model drift)
  • Third-party AI management (API providers, pre-trained models)
  • AI impact assessment (not required by ISO 27001)
  • Data quality for AI (training data, data drift)

Integration strategy: If you have ISO 27001, you can integrate ISO 42001 with ~50% less effort (shared processes). Many organisations pursue dual certification using a single Integrated Management System (IMS).


Can SignalBreak alone get me ISO 42001 certified?

No. SignalBreak provides ~60% of certification evidence (the data-intensive clauses), but certification requires:

What SignalBreak provides:

  • ✅ Risk assessment records (Clause 6.1.3)
  • ✅ AI system inventory (Clause 6.2.2)
  • ✅ Third-party monitoring (Clause 8.3)
  • ✅ Impact assessments (Clause 8.4)
  • ✅ Monitoring data (Clause 9.1)
  • ✅ Continual improvement evidence (Clause 10.2)

What you still need:

  • ❌ Management commitment and policy (Clause 5.x)
  • ❌ Competence and training records (Clause 7.x)
  • ❌ Documented procedures (all clauses)
  • ❌ Management review records (Clause 9.3)
  • ❌ Corrective action records (Clause 10.x)

Think of SignalBreak as your "AI governance data platform" — it collects and analyses the data an AIMS needs. You still need the management system around it (policies, procedures, audits, reviews).

Analogy: SignalBreak is like a CRM for ISO 42001 — it manages your AI provider relationships and risks, but you still need a sales process. Similarly, SignalBreak manages AI governance data, but you still need governance processes.


How often should I generate evidence packs for ISO 42001?

  • Minimum: Quarterly (sufficient for certification)
  • Recommended: Monthly (better trend data, board reporting)
  • Best practice: Monthly + on-demand (before audits, after major changes)

Why monthly?

  1. Trend analysis requires 3+ data points — quarterly packs only give 4 data points/year
  2. Auditors expect recent evidence — quarterly packs could be 90 days old
  3. Board reporting typically monthly — align evidence pack to existing governance cadence
  4. Continuous improvement requires timely feedback — quarterly is too slow to iterate

Cost-benefit:

  • Monthly evidence packs = 12 data points/year
  • Quarterly evidence packs = 4 data points/year
  • Additional cost: Negligible (automated generation)
  • Additional value: 3x better trend visibility, always audit-ready

Exception: If you have <5 AI workflows with stable providers, quarterly may suffice. For >10 workflows or high-risk use cases, monthly is essential.
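The cadence guidance above can be captured as a small decision helper. This is a sketch of the rule of thumb in the text, not a SignalBreak feature; the thresholds mirror the exception stated above.

```python
def recommended_cadence(n_workflows, high_risk=False):
    """Heuristic encoding of the guidance above (illustrative only):
    monthly for high-risk use cases or >10 workflows; quarterly may
    suffice for <5 stable workflows; monthly as the default otherwise."""
    if high_risk or n_workflows > 10:
        return "monthly"
    if n_workflows < 5:
        return "quarterly"
    return "monthly"

print(recommended_cadence(3))                   # small, stable estate -> quarterly
print(recommended_cadence(3, high_risk=True))   # high-risk overrides size -> monthly
print(recommended_cadence(12))                  # larger estate -> monthly
```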


Can I use SignalBreak evidence packs in my Stage 1 audit?

Yes — and you should.

Stage 1 auditors will request:

  • Evidence that your risk assessment process is applied (not just documented)
  • Proof your AI system inventory is current (not a one-time spreadsheet)
  • Demonstration that third-party monitoring is continuous (not annual vendor review)

SignalBreak evidence pack provides all three:

  • ✅ Risk assessment applied: Decision Readiness Scorecard with date stamp
  • ✅ Inventory current: Workflow registry with "Last Updated" timestamps
  • ✅ Monitoring continuous: Provider health logs, signal detection records

Best practice: Include your most recent evidence pack as Appendix A to your Stage 1 documentation submission. Reference it throughout your AIMS procedures:

  • "See Evidence Pack Section 2 for risk assessment results"
  • "See Evidence Pack Appendix B for current AI inventory"
  • "See Evidence Pack Section 4 for third-party monitoring records"

Auditor perspective: Evidence packs demonstrate operational maturity — you're not just creating documents for certification, you're actually running an AI governance process. This significantly increases auditor confidence.


What happens if my score is Red (70-100) during certification audit?

Short answer: Red status alone won't prevent certification, but auditors will scrutinise your risk treatment plans.

Longer answer: ISO 42001 doesn't mandate specific risk thresholds (there's no "you must be Green to certify"). Instead, auditors assess:

  1. Do you know your risks? ✅ (Red score shows you've assessed and quantified risks)
  2. Do you have a risk treatment plan? ⚠️ (Must show actions to reduce Red status)
  3. Is the plan being implemented? ⚠️ (Must show progress, not just intentions)
  4. Are controls effective? ⚠️ (If Red persists for 6+ months with no improvement, auditors will question effectiveness)

Certification implications:

| Scenario | Likely Audit Outcome |
|---|---|
| Red score + recent (this month) | ✅ OK — you just implemented AI, risk is expected |
| Red score + action plan + timeline | ✅ OK — demonstrating risk treatment process |
| Red score + 3+ months + improving trend | ✅ OK — showing continual improvement |
| Red score + 6+ months + no action plan | ❌ Major nonconformity — risk treatment not applied (Clause 8.2) |
| Red score + 6+ months + flat/worsening | ❌ Major nonconformity — AIMS not effective (Clause 10.2) |

Recommendation: If your score is Red, delay certification until Amber (30-69) or demonstrate clear downward trajectory (e.g., Red 85 → 70 → 55 over 3 months).

Why? The first audit sets a precedent. If you certify at Red 85, auditors will expect you to reach Green eventually. Better to certify at Amber 65 with a clear improvement trend (harder to achieve, but a stronger certification).
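The "clear downward trajectory" criterion can be made concrete. The sketch below assumes a list of monthly scores (0-100, lower is better); the `min_drop` threshold is an illustrative assumption, not an auditor requirement.

```python
def is_improving(scores, min_points=3, min_drop=10):
    """True if the risk score (0-100, lower is better) shows a clear
    downward trajectory: enough data points, no month-on-month
    worsening, and a total drop of at least `min_drop` points."""
    if len(scores) < min_points:
        return False
    monotone = all(b <= a for a, b in zip(scores, scores[1:]))
    return monotone and (scores[0] - scores[-1]) >= min_drop

print(is_improving([85, 70, 55]))  # the Red 85 -> 70 -> 55 example above: True
print(is_improving([85, 88, 84]))  # flat/worsening: False
```

A check like this is one way to decide, from monthly evidence packs, whether to book the Stage 1 audit now or wait another quarter.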


Does ISO 42001 require specific AI risk thresholds?

No. ISO 42001 is risk-based, not rule-based. It requires:

  1. Define your own risk criteria (what's acceptable risk for your organisation)
  2. Apply risk assessment against those criteria
  3. Treat unacceptable risks
  4. Monitor risk treatment effectiveness

This means:

  • Startup with a low-risk AI chatbot: might define "acceptable" as any score below 90 (tolerating Amber and even low Red)
  • Bank with credit-scoring AI: might define "acceptable" as only scores below 20 (deep Green)

SignalBreak aligns with this philosophy:

  • Provides objective risk score (0-100)
  • You define your risk appetite (where's your Red/Amber/Green threshold?)
  • Evidence pack shows whether you're meeting your own criteria

Best practice: Document your risk appetite in AI Governance Policy:

  • Example: "Our target risk state is Green (<30). We accept Amber (30-69) with active risk treatment plans. Red (70-100) requires board escalation and expedited risk reduction."

Auditors will verify you're consistent with your own policy, not against an absolute standard.
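Documenting risk appetite as explicit thresholds also makes that consistency checkable. The sketch below mirrors the example policy above (Green below 30, Amber 30-69, Red 70 and above); the function names and threshold defaults are illustrative assumptions, not part of SignalBreak or ISO 42001.

```python
def classify(score, green_below=30, red_from=70):
    """Map a 0-100 risk score to a RAG status under organisation-defined
    thresholds (defaults mirror the example policy above)."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score < green_below:
        return "green"
    if score < red_from:
        return "amber"
    return "red"

def policy_action(score):
    """Return the action the example policy requires for a score."""
    return {
        "green": "within target risk state",
        "amber": "acceptable with active risk treatment plan",
        "red": "board escalation and expedited risk reduction",
    }[classify(score)]

print(classify(65), "->", policy_action(65))
```

Encoding the thresholds once, in policy and in tooling, avoids the inconsistency auditors look for between what you wrote and what you do.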


Next Steps

  1. Assess your current state:

    • Generate your first SignalBreak evidence pack
    • Review ISO 42001 clause mapping (p.X of evidence pack)
    • Identify which clauses have full vs. partial coverage
  2. Close certification gaps:

    • Create stakeholder register (Clause 4.2 gap)
    • Establish internal audit procedure (Clause 9.2 gap)
    • Document policies and procedures (Section 5, 7 gaps)
  3. Engage certification body:

    • Request quotes from 2-3 accredited bodies
    • Provide evidence pack as demonstration of maturity
    • Expect 12-18 month timeline to certification
  4. Establish governance rhythm:

    • Monthly evidence pack generation
    • Quarterly management reviews
    • Annual internal audits
    • 3-year certification cycle



Last updated: 2026-01-26
Based on: ISO/IEC 42001:2023, SignalBreak Evidence Pack Framework v2.1
