What do FDIC examiners think about AI? To find out, I asked one

How are bank examiners incorporating issues around the use of AI into their work? The good news is that their interest in AI is consistent with the agency’s longstanding focus on risk management and compliance. But you still need to be prepared to answer the specific concerns that AI use can raise.

Recently, I had the opportunity to meet with an FDIC examiner to discuss generative artificial intelligence usage. I haven’t heard of any other banker having a deep one-on-one with an examiner on actual AI usage, so I am sharing some key insights from that conversation to help you develop policies and training plans, navigate risk, and make informed decisions about AI at your institution.

Takeaway #1: Currently, the FDIC does not permit its staff to use ChatGPT or similar tools.

The FDIC’s perspective on AI is evolving, much like it is for everyone else in the banking industry. During my conversation with the examiner, I learned how examiners approach AI at this stage:

Limited hands-on experience: FDIC examiners do not use generative AI tools themselves, so their understanding remains largely theoretical, drawn from other users and from reporting. This matters because the lack of hands-on experience may create a gap between that theoretical understanding and the transformative potential of AI technology.
