The US Securities and Exchange Commission (SEC) has expressed concern about the potential impact of generative AI on financial markets.

During a recent address at the National Press Club, SEC Chair Gary Gensler warned that the advancing capabilities of generative AI could lead institutions to rely excessively on a narrow set of information sources when making decisions.

Gensler highlighted the technology's substantial demand for data and computing power, suggesting that a handful of dominant tech platforms could come to control the field and restrict the range of AI models available to companies. If many financial institutions acted on inaccurate or irrelevant output from the same flawed model, their collective missteps could create a market situation reminiscent of the 2008 financial crisis. Gensler likened those potential consequences to the risks posed by a “centralized dataset or model” in finance.

AI systems have long been commonplace in the financial sector. Creditors and insurers, for example, use algorithms and natural language processing to analyze financial data before making lending and underwriting decisions. Trading firms rely on AI to flag fraudulent activity and interpret market signals faster than any human could.

During his speech, Gensler focused on large language models (LLMs), calling generative AI the “most transformative technology of our time.” Though he occasionally conflated these systems with the broader category of AI, Gensler acknowledged that generative AI has not yet been widely adopted in finance.

This is not the first instance where Gensler has voiced concerns about the impact of AI on financial markets. During his tenure at MIT, Gensler, along with co-author Lily Bailey, published a paper exploring the limitations of existing regulatory structures in addressing AI-related issues in finance.

Emerging technology is not new territory for the SEC. In 2018 the agency established FinHub, a dedicated resource center for questions about AI, cryptocurrencies, and other fintech matters, and it has actively pursued legal action against companies working with emerging technologies, particularly in the crypto space, when it believes they have violated the law.

Furthermore, the SEC itself employs machine learning to bolster market surveillance and enforce its policies.

Gensler emphasized the need to update current risk management guidelines to keep pace with rapid advances in powerful new technologies. At the same time, he acknowledged that the financial industry as a whole will need to rethink how it uses them.

The SEC is not the only government agency scrutinizing AI companies. The Federal Trade Commission (FTC), for instance, recently opened an investigation into OpenAI.