The Financial Conduct Authority (FCA) and the Information Commissioner's Office (ICO) are to draw up new rules for the use of AI by financial services firms.
In a joint statement, FCA Chief Executive Nikhil Rathi and UK Information Commissioner John Edwards said: “To support good practice in the future, we will develop a statutory code of practice for organisations that develop or deploy AI and automated decision-making, enabling innovation while protecting privacy.”
The FCA and ICO said that regulation is not a blocker to innovation but an enabler.
The statement says: “Done well, regulation is not a brake on innovation. It is a bridge that connects creativity with public trust. The right approach makes regulation an enabler: providing the certainty companies need to invest, experiment and grow responsibly.”
Last month, a roundtable was held with market leaders to better understand the challenges of deploying artificial intelligence (AI) and how the FCA and ICO can support responsible innovation and the use of personal data.
While firms understand the broad rules, in practice they wanted clearer examples of ‘what good looks like’ and more opportunities for engagement to build confidence in trying new technologies.
To support good practice, the FCA and ICO will develop a statutory code of practice for organisations that develop or deploy AI and automated decision-making.
In addition, both regulators will help firms develop, test and evaluate AI as part of the FCA’s AI Lab.
The FCA also plans to hold a roundtable with smaller firms later this year to better understand the challenges around AI adoption, while the Digital Regulation Cooperation Forum has committed regulators to developing a collective understanding of how each other’s regimes apply to AI and to identifying and resolving points of conflict.
In their statement, Rathi and Edwards emphasise that firms are concerned about who is accountable when AI is developed by third parties.
To help with this, the statement notes that the ICO has published detailed analysis on the allocation of controllership across the generative AI supply chain, while the FCA has provided information on firms’ accountability when trialling generative AI.
It also highlights the support already launched to help firms, including the Digital Sandbox, Supercharged Sandbox and AI Live Testing within the FCA’s innovation hub.
The ICO also offers an innovation advice service, a regulatory sandbox and an innovation hub as part of its innovation services.
Although these services are already available, the FCA and ICO say they will increase their visibility and expand the practical support on offer.
Rathi and Edwards conclude their statement by calling on firms and trade bodies to ‘keep talking’, not only when there is a problem, real or perceived, but throughout the innovation journey.
The statement says: “We can help companies do things differently. But we need their insight to do things better.”
“With regulatory agility and the confidence to innovate and invest in new technologies, firms will give the UK the fuel to power economic growth.”