With the burgeoning influence of Artificial Intelligence (AI) in the financial sector, Gary Gensler, chair of the Securities and Exchange Commission (SEC), has voiced significant concerns about its potential repercussions. In a candid conversation with the Financial Times, Gensler warned of the looming peril of a financial crisis if regulatory mechanisms don't evolve in tandem with AI advancements.

Consolidation in AI: A Risky Path?

Gensler pointed to a striking lack of diversity in AI models as a potential destabilizing factor. Despite the existence of myriad open-source AI models, the lion's share of firms lean heavily on a handful of tools. Notably, OpenAI's ChatGPT has gained traction among several players.

Navigating this complex landscape is a challenge, Gensler admitted. The SEC's traditional regulatory approach centers on individual institutions, whether banks, brokers, or money market funds. AI, he argued, cuts across those lines: many organizations may come to depend on the same base model or a single data aggregator, making the regulatory issue a "horizontal" one.

The Financial Times also noted how deeply AI is already woven into Wall Street's operations, from monitoring market trends to streamlining account setups. That entanglement raises the stakes: Gensler cautioned that a financial downturn fueled by over-reliance on a single data aggregator or model is not a distant possibility. The "economics of networks" surrounding AI could usher in such a scenario, perhaps as early as the late 2020s or the early 2030s.

Echoes of Past Disruptions

The SEC chair's sentiments carry extra weight when viewed against past market upheavals. The 2010 "flash crash," in which the stock market suffered a sudden and massive plunge, serves as a cautionary tale. Although the market rebounded quickly, subsequent analysis of the roughly trillion-dollar drop revealed that high-frequency trading algorithms had helped trigger the torrent of rapid trades.

Further complicating the matter, the tech giants that build AI models and run cloud computing services face far less scrutiny than Wall Street does. Gensler's rhetorical question about how many cloud providers actually exist underscores the breadth and complexity of the issue, which he labeled a "cross-regulatory challenge."

However, a silver lining is emerging as U.S. regulatory bodies gradually lay the groundwork for stricter AI oversight. Concerns that market concentration could pave the way for monopolies have already prompted action. The SEC has floated a proposal that would require brokerages and investment advisers to address "conflicts of interest" arising from their use of predictive analytics. Notably, in July the FTC launched an inquiry into OpenAI over allegations that its ChatGPT model sometimes disseminates misleading information.