REGULATORY LANDSCAPE
The Financial Conduct Authority (FCA) and the Information Commissioner’s Office (ICO) have jointly outlined a vision for responsible and innovation-friendly AI adoption that aligns with existing financial and data protection regulations. Absent AI-specific regulations, financial services firms should adopt a “first principles” approach, drawing on existing regulations and frameworks to implement AI policies and use cases.
Regulators do not expect firms to reinvent the wheel. Instead, firms should apply the existing data protection principles underpinning the UK GDPR, such as integrity, transparency, fairness, accountability and accuracy, together with robust oversight and safeguarding processes, to manage the privacy risks associated with AI. Accordingly, firms should approach AI through the lens of existing frameworks such as the Consumer Duty and the Senior Managers and Certification Regime (SM&CR). Under the Consumer Duty principles, firms must ensure their use of AI is understandable and transparent to customers, delivers fair value and meets customers' needs. The SM&CR establishes senior management accountability for operational resilience, data governance and customer outcomes in financial services firms, and the FCA has signalled that it provides the framework under which senior management can evaluate the safe deployment of AI.
PRACTICAL CONSIDERATIONS
AI supports a wide range of use cases, from fraud detection and market forecasting to customer profiling and support. Each use presents unique challenges for maintaining data privacy and security. Key considerations before deployment include:
- Taking a risk-based approach: Assess the intended use of AI tools from the outset. For example, generative AI chatbots pose different risks from machine learning tools used for credit scoring. Risks should therefore be considered on a case-by-case basis and throughout the lifetime of the tool to ensure the AI remains targeted and reliable for its use case. Where risks are identified...