Binding regulations impose many of the same obligations this standard addresses; implementing the standard can therefore help satisfy those regulatory requirements.
ISO/IEC 42005 fills the gap between generic AI risk management (ISO/IEC 23894) and the assessment of impacts on individuals and society: it is the AI equivalent of a Data Protection Impact Assessment (DPIA). As AI impact assessment requirements appear in the EU AI Act, CETS 225, and national AI strategies, this standard provides the reference methodology for conducting them.
Requirements

| Requirement | Details |
| --- | --- |
| Impact identification | Identify potential impacts of AI systems and their foreseeable applications on individuals, groups, and society |
| Intended and unintended use assessment | Assess intended, unintended, sensitive, and restricted uses, as well as foreseeable misuse scenarios |
| Benefit and harm evaluation | Evaluate both positive and negative impacts throughout the AI lifecycle |
| Stakeholder perspective | Integrate perspectives of affected individuals and groups into the assessment process |
| Documentation | Produce assessment documentation supporting transparency, accountability, and fairness |
| Lifecycle integration | Apply impact assessment from design and development through deployment and post-market monitoring |
| Integration with risk management | Coordinate impact assessment with ISO/IEC 23894 (risk management) and ISO/IEC 42001 (management system) |
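To make the documentation requirement concrete, the elements above can be captured as a structured assessment record. The following sketch is purely illustrative: the class, field names, and completeness check are assumptions for this example, not templates taken from the standard.

```python
from dataclasses import dataclass, field
from enum import Enum


class LifecycleStage(Enum):
    """Lifecycle phases in which the assessment is applied (illustrative)."""
    DESIGN = "design"
    DEVELOPMENT = "development"
    DEPLOYMENT = "deployment"
    POST_MARKET = "post-market monitoring"


@dataclass
class ImpactAssessment:
    """Hypothetical record mirroring the requirement areas in the table above."""
    system_name: str
    stage: LifecycleStage
    intended_uses: list[str] = field(default_factory=list)
    foreseeable_misuses: list[str] = field(default_factory=list)
    # Each impact entry notes what it affects and whether it is a benefit or harm
    impacts: list[dict] = field(default_factory=list)
    stakeholders_consulted: list[str] = field(default_factory=list)

    def is_complete(self) -> bool:
        # Minimal completeness check: each core section has at least one entry
        return all([self.intended_uses, self.impacts, self.stakeholders_consulted])


# Example: an assessment started at the design stage and filled in incrementally
record = ImpactAssessment(system_name="triage-assistant", stage=LifecycleStage.DESIGN)
record.intended_uses.append("support clinical triage decisions")
record.impacts.append({
    "description": "shorter waiting times",
    "affected_group": "patients",
    "benefit_or_harm": "benefit",
})
record.stakeholders_consulted.append("patient advocacy group")
```

A record like this also supports the lifecycle-integration requirement: the same structure can be re-evaluated at each `LifecycleStage` rather than only once at design time.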