Compliance officers of Private Equity Funds have generally taken a cautious approach to allowing employees to use artificial intelligence ("AI").
The SEC has already taken steps toward regulating how Firms may use AI, particularly in light of the requirements of Regulation Best Interest ("Reg BI").
Compliance officers face multiple regulatory considerations when responding to staff inquiries about AI.
- Reg BI
  - When staff members are required to act in the best interests of their clients, it can be difficult to verify that an AI tool is pursuing those same interests.
  - Where best practice calls for documented rationales behind client recommendations, the unknown biases of AI models may lead regulators to reject AI-generated output as a reasonable basis for a recommendation.
- Cybersecurity
  - Client data must be protected from cyberattacks and other forms of cybercrime.
  - Inputting any sensitive information is problematic while the security of AI systems remains unproven.
  - The use of any material non-public information would also be problematic for businesses.
- Records Retention
  - The SEC has been imposing heavy fines on Firms that fail to keep proper records.
  - AI technology will likely create significant record-keeping challenges.
  - Information required to remain accessible could be accidentally lost, creating the potential for unwanted additional regulatory scrutiny.
Compliance officers have taken differing approaches to questions about AI; however, most of the industry remains wary of allowing its use.
It is vital that compliance officers create or update their policies and procedures to make clear whether AI will be permitted and, if so, under what conditions, or whether it will be banned outright.
Additionally, employees should be trained so that cybersecurity, records retention, and Reg BI obligations are all considered if any artificial intelligence software is allowed.
As AI matures, compliance departments may find ways to capture the technology's efficiencies without creating regulatory headaches.