

March 2, 2022

Estimated reading time: 4 minutes

Regulators must clarify regulatory expectation for AI: AIPPF

A public-private forum created by the Bank of England and the Financial Conduct Authority has published its final report into the use of artificial intelligence within financial services. The report, published by the AI Public-Private Forum, highlights the need for tighter governance and calls for clarity from global regulators.

Artificial intelligence is slowly but surely becoming a foundational element of financial services. From mortgage applications to regulatory technology, AI is changing the way that financial services operate. In many cases, this is for the better, but of course with new innovation come new challenges and risks. This is a point that the AI Public-Private Forum is quick to highlight in its report, noting that “innovation can change the trade-offs between risk and return. The speed, scale and complexity of AI systems can amplify existing risks and pose new challenges.” That is not to say, however, that AI is a bad thing – far from it – as the report highlights.

The final report represents more than a year’s worth of work from the AI Public-Private Forum (AIPPF). It is the result of meetings, workshops and discussions between AIPPF members, who looked to deepen a collective understanding of the technology that underpins AI and “explore how [it] can support the safe adoption of AI within financial services”.

The 47-page report covers the gamut of artificial intelligence for finance – from data and model risk to governance. One of the key questions it poses is how existing regulation and legislation may be applied to AI, and whether AI can be “managed through extensions of the existing regulatory framework, or whether a new approach is needed”.

The report highlights that, while it chiefly focusses on the technical challenges of AI, all of these fall under the umbrella of “domestic and international regulatory and legislative frameworks”. It goes on to conclude that:

“Clarity of regulatory expectations on the adoption and use of AI is a key component of fostering innovation. Regulators should provide greater clarity on existing regulation and policy. Such clarification and any new guidance should not be overly prescriptive and should provide illustrative case studies. Alongside that, regulators should identify the most important and/or high-risk AI use-cases in financial services with the aim of developing mitigation strategies and/or policy initiatives”.

Moreover, it says that “regulators can support innovation and adoption of AI” and should start by “providing clarity on how existing regulations and policies apply to AI”.

Looking ahead, the AIPPF aims to publish a Discussion Paper on AI later in the year, which will build on its existing work and extend engagement across a range of stakeholders. The report also highlights the emerging need for an industry body for practitioners, which would aim to build trust in the use of AI, though it does not set out a time frame for when such a body could be established.

CUBE comment

The AIPPF report highlights that the safe adoption of AI has only just begun and calls for regulators to provide clarity. We have long heard rallying cries from compliance and surveillance teams for regulatory clarity about what regulators expect from AI. Hopefully this report, endorsed by both the FCA and the BoE, will mark the start of regulatory clarity for the future.

Artificial intelligence within financial services is a double-edged sword – or a Swiss army knife. It can be used across the remit of operations to streamline systems, reduce manual processes, increase accuracy…the list goes on. It can process and understand customer data to provide more accurate and relevant products, it can predict changes and patterns, and it can be used in regulatory compliance to overhaul change management.

This report, in the main, looks to establish the challenges of adopting AI: how to ensure accountability for its use, how to ensure it is ethical, and how to ensure it is not built on bad data. What it does not do is look at the implementation and benefits of using AI within compliance – its focus is more front of house.

What both of these things have in common, and a pivotal thread across AI more generally, is trust. This is especially pertinent within financial services for myriad reasons. In particular, financial services deals with money and data – two things that are hugely valuable and must be protected. This is the first barrier. The second is more philosophical and pertains to a general fear of the unknown within financial services. ‘We’ve done it this way since inception,’ some might say, ‘so why fix what isn’t broken?’

While understandable, both of these standpoints deny an inalienable truth: manual processes often expose firms to more risk, more errors and more regulatory scrutiny (and potentially more regulatory penalties) than those that use AI. Firms that rely on them also spend more money and expend more resources to reach the same (sometimes less accurate) results.

Regulatory clarity and a greater understanding of regulatory expectations will level the playing field for AI. Only then will firms know what they should – and should not – be doing. With regulatory clarity comes regulatory endorsement; with regulatory endorsement comes trust. AI is advancing fast, but adoption within financial services is comparatively slower.


Find out how CUBE uses AI to transform regulatory change for financial services.

Speak to the team

