Wall Street's main regulator is proposing restrictions on brokerages and money managers that use artificial intelligence to interact with clients.
On Wednesday, the US Securities and Exchange Commission voted to propose a plan to eliminate potential conflicts of interest that can arise when financial firms deploy the technology, Chair Gary Gensler said. The agency also adopted final rules requiring companies to report significant cybersecurity incidents within four business days of determining they are material.
The AI proposal is the latest salvo from Washington authorities worried about the technology's potential to affect everything from credit decisions to financial stability. Under the plan, according to an SEC announcement, firms would have to determine whether their use of AI or predictive data analytics creates conflicts of interest and then eliminate those conflicts. They would also need to maintain written policies to ensure they stay in compliance with the rule.
Gensler said during the meeting that the rules would safeguard investors from conflicts of interest and ensure that firms uphold their obligation to put clients first, regardless of the technology they use. This goes beyond disclosure alone, he said: with predictive data analytics, the question is whether the tools are optimized for investors' benefit or for the benefit of the financial firms deploying them.
Banks and brokerages have long used AI for market monitoring and fraud detection. More recently, its use in asset management, lending, and trading advice has drawn scrutiny. The SEC wants to ensure that when firms recommend trades or products, they aren't putting their own interests ahead of their customers'.
The proposal goes beyond existing rules requiring brokers to act in their clients' best interests when making recommendations, according to an agency staffer who spoke anonymously at a news briefing on Tuesday.
The plan will be open for public comment, which the agency will consider before putting a final version up for a vote, likely in 2024. Final approval would require a majority of the commission's five members.
The commission's two Republicans criticized the proposal as overly broad, saying it would require firms to evaluate far too many kinds of technology for potential conflicts.
Commissioner Mark Uyeda said a wide range of commonly used tools could qualify, from a simple electronic calculator to an application that analyzes an investor's potential retirement assets by, for example, adjusting the overall allocation mix among stocks, bonds, and cash. He said the proposal's "vagueness" and the regulatory burdens it creates could lead firms to forgo innovation.
Full-Court Press
In recent weeks, regulators have made clear they are ramping up oversight of artificial intelligence.
Consumer Financial Protection Bureau Director Rohit Chopra has hinted at upcoming curbs on the use of artificial intelligence in lending. Michael Barr, the Federal Reserve's vice chair for supervision, has said lenders must ensure such tools don't introduce bias or discrimination into credit decisions.
OpenAI Inc., the Microsoft Corp.-backed maker of ChatGPT, is already the subject of an FTC inquiry into whether the chatbot poses risks to consumers' data and reputations.
President Joe Biden said on July 21 that his administration would take new executive actions in the coming weeks to provide a framework for "responsible innovation" with the technology.
Since taking over the SEC in 2021, Gensler has voiced concerns that AI could harness vast amounts of data to target individual investors and influence their decisions to trade, invest, or open financial accounts.
Last week he called the tools the most transformative technology of our time, while warning that concentrating the technology, or its foundational data sets, in the hands of a few companies could fuel future volatility in financial markets.
Cyber Disclosures
Also on Wednesday, the SEC approved final rules requiring companies to disclose serious cybersecurity breaches.
The final rule retains the proposal's requirement that companies publicly disclose breaches within four business days of determining they are "material" to operations or financial condition. It adds a provision allowing disclosure to be delayed if the US attorney general determines that disclosing the incident would endanger public safety or national security.
Industry groups including the Business Roundtable have warned that a four-day window could hand malicious actors useful information about how companies operate.
Under a separate proposal on Wednesday's agenda, investment advisers that do business exclusively online would be able to register with the SEC. The agency said roughly 200 financial advisers are affected by the current exemption.