Here’s What NCUA Is Saying, Doing When it Comes to AI

ALEXANDRIA, Va.–The use of artificial intelligence by credit unions, vendors and NCUA itself was the subject of an extensive conversation during the agency’s board meeting, touching on everything from what’s permissible to how exams may change to whether some “tire-kicking” can be done.

The far-ranging conversation on an issue evolving so rapidly it requires AI to keep up took place during a board meeting at which all three board members were again on hand, following a court ruling reinstating Todd Harper and Tanya Otsuka, who were fired by President Trump in April (see related reporting).

Amanda Parkhill, left, and Amber Gravius addressing the NCUA board.

The presentation on artificial intelligence to the board was made by Amanda Parkhill, acting director of the Office of Examination and Insurance, and Amber Gravius, director of the Office of Business Innovation and acting chief information officer.

After offering a definition of AI (see graphic, below), Parkhill said she first wanted to clear up a question some credit unions have had: not only is it permissible for credit unions to use AI tools, many are already doing so, and the agency encourages the use of any technology when it is “done in a safe and sound manner.”

“The most important thing for the industry to remember is that AI should be treated like any other new product or service…when it comes to compliance with all existing laws and regulations,” including with information security, Parkhill said, describing NCUA as being “technology neutral.” 

Need for Risk Management

“Credit unions should employ the same risk management practices they are already using for managing their operations and programs currently,” she continued. “That includes identifying risks that may be unique to a particular service or technology, monitoring and measuring those risks, and mitigating them as necessary. Credit unions that use a vendor for a product or service, including those that contain some element of AI, should conduct adequate third-party due diligence to ensure the credit union continues to operate in a safe and sound manner.”

She said management and board members must also understand the risks and how the AI solution fits into the credit union’s business model.

Parkhill pointed to a GAO report the CU Daily reported on in May that found NCUA lacks two critical tools, hurting its ability to monitor how credit unions are using artificial intelligence—model risk management guidance and third-party vendor oversight—and said the agency agreed to review existing guidance, including that of the federal banking agencies, to determine any benefit of issuing similar information to credit unions.

What Review Found

“We have completed that review, which included reviewing the banking agency guidance as well as research into the applicability and usefulness of guidance from academic sources and banking industry trade groups, and our conclusion is that focusing solely on model risk management really wouldn’t provide sufficient information of all of the ways that credit unions are or could be using AI,” said Parkhill. “Another consideration as a regulator is to make sure the guidance is just that: guidance. It should not establish expectations or requirements that would be used to assess a credit union’s AI use. Any expectations or requirements would need to be established through a formal notice and comment process. 

“However, we do know that guidance can provide useful information for credit unions that are thinking about adopting AI and for our examiners to better understand the landscape of AI use and credit unions to ensure stakeholders have access to current information.”

As a result, she said the agency is developing an AI resources page at www.ncua.gov/ai, and noted its cybersecurity and money laundering resource pages provide authoritative guidance from NIST, CISA and others.

Parkhill outlined some use-case categories for credit unions when it comes to AI.

The Barriers

She also reviewed some of the potential barriers to AI, including:

  • Technology Expertise
  • Implementation and Maintenance Costs
  • Internal & External Stakeholder Trust or Perception
  • Legal and Regulatory

“Some credit unions may have concerns about future regulations that would make a product they’ve invested in impermissible, or others may worry about how their examiner may view processes or tools that incorporate AI,” Parkhill told the board. “We want to know what we can do to help eliminate or reduce this barrier.”

To that end, Parkhill, along with the NCUA board members, asked for input from credit unions and other stakeholders. Feedback should be sent to https://ask.ncua.gov.

The Use Cases

Amber Gravius from the agency’s Office of Business Innovation said NCUA has identified a number of use cases for AI within NCUA itself, which are shown in the graphic below.

It has also used machine learning techniques to identify potential data reporting errors in the 5300 call report, for estimating loan default probability, for supervisory stress testing and for creating anomaly alerts, all of which she said save staff approximately 40 hours each quarter and have vastly improved data quality for NCUA’s most popular public data set.

Identifying Outliers

“In the simplest of terms, NCUA uses a forecasting model with clustering and text mining and machine learning algorithms to identify outliers in business rules to reduce false positives,” Gravius said. “The output reports are sent to the NCUA regional office staff for review and potential action. Once any false positives are disregarded the examiner will contact the credit union to verify if a corrected call report must be submitted.”
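NCUA has not published its actual models, but the screening Gravius describes—flagging filings that deviate sharply from expected values so examiners chase fewer false positives—can be illustrated with a minimal sketch. The function name, threshold and figures below are hypothetical, chosen only to show the general idea of statistical outlier detection on call-report-style data.

```python
def flag_outlier(history, new_value, threshold=3.0):
    """Flag new_value if it falls more than `threshold` standard
    deviations from the mean of a credit union's prior filings.
    Illustrative only; NCUA's production models combine forecasting,
    clustering and text mining and are not public."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / n
    std = variance ** 0.5
    if std == 0:
        # No historical variation: anything different is suspect
        return new_value != mean
    return abs(new_value - mean) / std > threshold

# Hypothetical total loans (in $M) reported over prior quarters
prior = [102.0, 104.5, 103.8, 105.1, 104.9, 106.0]
print(flag_outlier(prior, 1050.0))  # True — likely a decimal-shift error
print(flag_outlier(prior, 107.2))   # False — within normal variation
```

A flagged value would then go to regional office staff for review, mirroring the human-in-the-loop step Gravius describes before any credit union is contacted.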

She added the agency carefully evaluates all new technologies to ensure it has the financial and operational capacity to sustain any investment, which it is doing with AI tools and opportunities through targeted proofs of concept and small group pilots.

Hauptman: CUs Need to Remain ‘Agile’

NCUA Chairman Kyle Hauptman pointed to a new Trump administration “AI Action Plan,” which the administration said is designed to maintain and boost U.S. dominance in artificial intelligence, as the CU Daily reported here. He said AI is “game-changing technology” and that the president has prioritized reducing regulatory obstacles to innovation and encouraged the federal government to embrace AI.

NCUA Chairman Kyle Hauptman during meeting.

“As many of you know, one of my priorities as chairman is promoting the appropriate use of AI…” he said. “It’s also true that regulators who use technology are more apt to understand why the regulated use (the technology).”

Saying the industry has long been early adopters of technology, Hauptman said credit unions are “agile in a way that banks sometimes can’t be,” and he added the agency will continue to embrace the innovation coming out of credit unions.

Hauptman also encouraged feedback to the agency on what laws, regs or requirements need to be added or subtracted to help foster the adoption of AI.

Harper: How Can AI be Used With Exams?

Harper said he also wants to encourage the use of AI, while ensuring consumer rights are protected and compliance takes place, and without putting the safety and soundness of the insurance fund at risk.

He then asked agency staff how NCUA will employ AI to make its exams more efficient and effective.

‘Lots of Opportunity’

Gravius responded by saying NCUA sees a “lot of opportunity to leverage artificial intelligence to help us better analyze the large amounts of data that we do collect from credit unions to help us identify risks in a new way, maybe sooner.

“One of the things that we struggle with sometimes here is, ‘OK, we have a list of instructions here at NCUA. We have user guides for all the different systems we use and people can’t always find (a) piece of information or some procedure,’ and that’s where sometimes chatbots can really be helpful,” Gravius continued. “Not only does the technology give you the answer, it tells you where that answer is located so you can double-check it if you have questions. We’ve also been exploring content generation…It does save a lot of time if you take a bunch of summary notes and it forms it into text that’s usable.”

Todd Harper appearing virtually during meeting.

Harper also asked where NCUA will find the expertise to oversee and manage AI, with staff noting the agency has some technical experts already on staff, but the hiring freeze currently in place means it will not be able to bring on anyone else in the near future.

Third Party Oversight

Finally, Harper asked about third party due-diligence with vendors that provide AI-based solutions. 

“We have longstanding guidance to both credit unions and examiners on what that should look like, things they should consider,” answered Parkhill. “So, when examiners are looking at any product or service program a credit union is offering or running that involves a third party, they’re going to be looking at (whether) the credit union has identified what the risks are. Are they mitigating or monitoring those risks and really understanding the organization or the company that they’re doing business with?…They’re going to be looking at the security of the data, where the data is being stored, who’s using the data. It really just depends on how the AI is being used. If it’s being used for lending decisions then they’re going to be looking for compliance with all of the consumer financial protections and disclosures.”

Parkhill added that AI-related training is being integrated into some of the agency’s information security officer training, and that external training is also offered through the FDIC.

What About Some ‘Tire Kicking?’

Harper also asked whether it would be possible for AI vendors to come to the agency to share their solutions so NCUA could “kick the tires.” Parkhill said it’s something that has been discussed, but NCUA wants to be careful it does not give the impression it is endorsing any particular vendors.

Hauptman: Individuals Still Responsible

Hauptman followed by stating that regardless of the technology, in the end the CEO and others at the credit union are responsible for its effectiveness and security. 

“If something goes wrong, that is the responsibility of the credit union, just like it always has been,” he said.

Tanya Otsuka during NCUA meeting.

Otsuka: Remember It’s People’s Money

Board Member Tanya Otsuka agreed with Hauptman on where the responsibility lies, and reminded the board that AI has the potential to change the way things are done.

“We need to be prepared for those changes. We should continue to learn more about whether artificial intelligence can help modernize processes and make financial services more accessible and more affordable,” Otsuka said. “I understand we’re just getting started on kind of understanding the scope of and the use of AI in the credit union system…But I think we should continue to make sure that it’s used in an appropriate way that’s consistent with regulations and laws. We need to make sure we understand how these AI systems work and the quality and accuracy of the data being used and where it’s coming from, because the system is only as good as the data that it’s learning from.

“That’s really critical when it comes to people’s money and finances. NCUA should focus not just on how the agency or credit unions might use AI, but also how it impacts credit union members.”
