The World Economic Forum’s new report, Navigating Uncharted Waters, was prepared in collaboration with Deloitte and released today. It is based on more than 10 months of extensive research, global workshops and contributions from companies including RBS, Morgan Stanley, UBS, JP Morgan, Credit Suisse, Microsoft, BlackRock and the New York Stock Exchange.
The report warns that widespread adoption of artificial intelligence (AI) could create a fundamentally different type of financial system, in which interconnections between humans and machines deepen even as humans struggle to understand the behavior of opaque AI systems.
“As a result, crises and critical events could occur more frequently and market shocks could be intensified,” the report notes.
“Emerging risks will no longer be isolated within a supervised institution, but rather will be dispersed among a set of interconnected actors, including small, specialized fintechs and large technology companies.”
As a result, the report suggests, supervisors and regulators will need to reinvent themselves as system-wide intelligence hubs.
The rise of AI-based systems has some people worried that machines will become too complex to understand. Others fear it’s the humans behind the machines who are the biggest cause for concern.
The report warns that optimizing algorithms that find themselves competing with each other could inadvertently destabilize markets. For example, two AI systems can continually bid against each other, optimizing their actions to achieve a single goal like the highest market price or yield.
“The average market price continues to increase as they repeatedly outbid each other, until one player is no longer able to maintain its bids due to profitability constraints,” the report warns. “Over time, this competitive optimization can lead to a deterioration of players’ balance sheets, encouraging riskier behavior in order to maintain profitability, or excluding them from the market altogether.”
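The escalation dynamic the report describes can be sketched in a few lines of code. The scenario below is an illustrative toy model, not taken from the report: two automated bidders outbid each other by a fixed increment until one hits its profitability ceiling and drops out.

```python
# Toy sketch of the competitive-optimization dynamic described above.
# Each bidder has a profitability ceiling: the highest price at which
# participation is still profitable. All names and numbers are illustrative.

def simulate_bidding_war(ceiling_a: float, ceiling_b: float,
                         start_price: float = 1.0,
                         increment: float = 0.5):
    """Return (final_price, winner) after an escalating bidding war."""
    price = start_price
    ceilings = {"A": ceiling_a, "B": ceiling_b}
    turn = "A"
    while True:
        next_price = price + increment
        if next_price > ceilings[turn]:
            # The current bidder's profitability constraint binds; it exits,
            # mirroring the report's "excluded from the market" outcome.
            winner = "B" if turn == "A" else "A"
            return price, winner
        price = next_price
        turn = "B" if turn == "A" else "A"

final_price, winner = simulate_bidding_war(ceiling_a=10.0, ceiling_b=8.0)
print(final_price, winner)  # price is bid far above the start before B exits
```

Even in this toy version, the clearing price is driven well above its starting level by the interaction of the two optimizers alone, with no change in underlying fundamentals.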
Rob Galaski, partner at Deloitte Canada and global leader in banking and financial markets consulting, said the use of AI in financial services will require an openness to fundamentally new ways of protecting the data ecosystem, different from the tools of the past.
“To accelerate the pace of AI adoption in the industry, institutions must take the lead in developing and proposing new frameworks that address new challenges, working with regulators throughout the process,” he said.
The report notes that financial services companies that are early adopters of AI face higher risks in deploying emerging technologies without regulatory clarity. But they are also the ones who have the most to gain.
“AI offers financial service providers the opportunity to leverage the trust their customers place in them to improve access, improve customer outcomes and drive market efficiency,” said Matthew Blake, head of financial services at the World Economic Forum.
“This can provide competitive advantages to individual financial companies while improving the financial system as a whole if implemented correctly.”
Algorithmic bias is a major concern for financial institutions, regulators and customers regarding the use of AI in financial services. AI’s ability to quickly process new and different types of data raises concerns that AI systems may develop unintentional biases over time; combined with their opaque nature, these biases could go unnoticed.
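One way such drift could in principle be surfaced is by routinely comparing model outcomes across customer groups. The sketch below uses the “four-fifths” adverse-impact heuristic, a screening rule borrowed from US employment-discrimination practice; it is an illustrative assumption, not a check the report prescribes, and all figures are made up.

```python
# Minimal sketch of an outcome-disparity monitor for a lending model.
# The "four-fifths" heuristic flags concern when the disadvantaged group's
# approval rate falls below 80% of the advantaged group's rate.
# Function name and all numbers are hypothetical.

def adverse_impact_ratio(approved_a: int, total_a: int,
                         approved_b: int, total_b: int) -> float:
    """Ratio of the lower group's approval rate to the higher group's."""
    rate_a = approved_a / total_a
    rate_b = approved_b / total_b
    low, high = sorted([rate_a, rate_b])
    return low / high

# Hypothetical month: the model approves 45% of group A, 30% of group B.
ratio = adverse_impact_ratio(45, 100, 30, 100)
print(round(ratio, 2))   # 0.67
print(ratio < 0.8)       # True -> worth investigating for drifting bias
```

Run periodically on fresh decisions, a simple monitor like this can catch a bias that emerges gradually, which is exactly the scenario an opaque system makes hard to spot by inspection alone.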
Despite these risks, AI also offers the possibility of reducing unfair discrimination or exclusion, for example by analyzing alternative data that can be used to evaluate “thin-file” clients whom traditional systems cannot assess due to a lack of information.
“Since AI systems can act autonomously, they can presumably learn to engage in collusion without any instruction from their human creators, and perhaps even without any explicit, traceable communication,” the report notes.
“This challenges traditional regulatory constructs for detecting and prosecuting collusion and may require a review of existing legal frameworks.”