While banks are eager to explore the technology, users want to tread carefully: six out of ten American users believe AI in banking is a double-edged sword.
With fraud and financial scams at an all-time high, costing banks and financial institutions billions of dollars worldwide each year, Techopedia sat down with industry experts to better understand the disconnect between the industry and its customers as AI sweeps through the sector and permanently reshapes it.
60% of Americans believe AI in finance is a double-edged sword.
Glassbox, a provider of AI-driven customer intelligence solutions, released its “The State of Digital Banking 2024” report on June 6. According to the data, more than half of those polled consider security a top priority for digital banking.
The research also found that 60% of US customers believe AI in banking brings both benefits and risks. Despite these concerns, the AI revolution is not slowing down.
In 2023, OpenAI CEO Sam Altman said that 92% of Fortune 500 companies were already using OpenAI products. At the 2024 MIT FinTech conference, industry executives unanimously agreed that AI adoption in the financial sector is accelerating rapidly.
The industry views generative AI as a necessary, value-adding asset. Jose Lopez, Visa’s vice president of global AI and data innovation, spoke to the shift during the event.
“AI is essentially front of mind for everyone. It’s been a few months of transformation and really quick acceleration.”
Implementation has been rapid, with AI deployed across finance in areas ranging from fraud prevention to customer-centered personalization of financial services, yet customers are not entirely convinced by the modernization push.
Consumers appreciate AI’s security potential, yet 47% still see security issues as the most significant risk of AI in banking.
And while 67% value personalization based on their banking history, they are not comfortable with AI auto-populating personal information from previous interactions.
AI, Financial Inclusion, and Regulation
Ravi Nemalikanti, Chief Product and Technology Officer of Abrigo, a banking software and consultancy firm, told Techopedia that adopting new technologies requires care and a controlled approach, particularly in the highly regulated banking and finance industry.
“While megabanks like Chase Bank lead in adopting new technology, regulatory challenges remain significant.”
Finance and banking are among the world’s most regulated industries, with all organizations, large and small, subject to the same rules and compliance requirements. While some suggest that AI would democratize access and increase financial inclusion, Nemalikanti from Abrigo clarified that this is not necessarily the case.
“There are over 9000 banks and credit unions representing small businesses and commercial customers who are cautious and lack the resources to navigate regulatory complexities while meeting customer expectations.”
For these institutions, innovation is frequently dependent on their slower-changing partner landscape.
Customers’ trust in online finance has eroded
AI is now unavoidable in the financial sector, so why are customers’ trust and opinions at odds with the advance? Roman Eloshvili, founder of XData Group, a B2B software development company, told Techopedia that consumer concern is only temporary.
“Before long, AI will be viewed as simply another kind of automation. Just think about how we used to live without ChatGPT a year ago, and today its use is ubiquitous and really important in many businesses,” Eloshvili remarked.
“To facilitate a similar transition, all the banks and fintech companies need to do is enhance transparency and put focus towards customer education regarding AI usage in banking.”
Eloshvili also stressed that AI-based fraud detection technologies will become a necessity. “The classic approach won’t beat complex fraud schemes that employ AI,” he said.
“If I were a bank executive, I would make the formation of an AI department a top priority. Based on personal experience, you should begin building your own in-house AI team to stay up with the fast-moving industry.”
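Eloshvili did not describe a specific system, but as a rough, hypothetical illustration of what an AI-based fraud screen can look like, the sketch below uses scikit-learn’s IsolationForest to flag unusual card transactions. The feature names and the synthetic data are invented for the example; real bank systems layer many models, rules, and human review on top of anything this simple.

```python
# Illustrative sketch only: an unsupervised anomaly screen for card
# transactions. Column names and data are hypothetical, not from any
# bank's production system.
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

# Simulate transaction features: amount, hour of day, distance from the
# customer's usual location, and transaction count in the last 24 hours.
rng = np.random.default_rng(42)
n = 5_000
transactions = pd.DataFrame({
    "amount_usd": rng.lognormal(mean=3.5, sigma=1.0, size=n),
    "hour_of_day": rng.integers(0, 24, size=n),
    "km_from_home": rng.exponential(scale=20, size=n),
    "txns_last_24h": rng.poisson(lam=3, size=n),
})

# Fit an Isolation Forest: transactions that are easy to isolate from the
# bulk of the data get low scores and are flagged for review.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(transactions)

scores = model.decision_function(transactions)   # lower = more anomalous
flags = model.predict(transactions) == -1        # True = flag for review

transactions["anomaly_score"] = scores
transactions["flag_for_review"] = flags
print(flags.sum(), "of", n, "transactions flagged for manual review")
```

The appeal of an unsupervised screen like this is that it does not need labeled examples of every new scheme, which is one reason AI-based detection is often pitched against novel, AI-assisted fraud; a production system would combine it with rules, supervised models, and analyst review.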
Customers ‘would leave their banks following an AI incident’
According to the Glassbox poll, more than half of consumers would abandon their banks if they were victims of AI fraud. Furthermore, 85% of consumers expect their banks to communicate actively about how they use AI.
Arun Kumar, EVP of AI and Data at Hero Digital, an independent customer experience provider, told Techopedia that the pace of AI integration should not outstrip the establishment of strong frameworks for security, compliance, and workforce preparation.
“This is especially critical in the financial industry, where a data breach can be devastating to an organization’s image and brand reputation. A careful, well-planned strategy is required to reap the benefits of AI in generating meaningful corporate value while limiting its hazards.”
Kumar said that it is critical for bank management to help build consumer trust while deploying AI capabilities.
“Banks should clearly communicate how AI is being used in their services and provide regular updates on AI initiatives, specifically how the technology meets ethical guidelines and the systems in place to protect valuable consumer data,” Kumar stated.
Other suggestions include implementing robust fraud detection measures with feedback mechanisms to proactively prevent data breaches, and maintaining a clear incident response plan to communicate any AI-related fraud events to consumers, along with the steps being taken to mitigate their effects.
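Neither Kumar nor the Glassbox report spells out how such a feedback mechanism would be built, but a minimal sketch, assuming a workflow in which analysts’ decisions on flagged transactions are stored as labels and periodically used to retrain a supervised model, might look like the following. All function names, column names, and data are hypothetical.

```python
# Illustrative sketch only: a human-in-the-loop feedback cycle in which
# analyst-reviewed transactions become training labels for a supervised
# fraud model. Names and data are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression

FEATURES = ["amount_usd", "hour_of_day", "km_from_home", "txns_last_24h"]

def record_analyst_feedback(labeled_store: pd.DataFrame,
                            reviewed_batch: pd.DataFrame) -> pd.DataFrame:
    """Append a batch of analyst-reviewed transactions (with an
    'is_fraud' label) to the store of labeled training data."""
    return pd.concat([labeled_store, reviewed_batch], ignore_index=True)

def retrain_fraud_model(labeled_store: pd.DataFrame) -> LogisticRegression:
    """Retrain a simple supervised classifier on all feedback so far."""
    model = LogisticRegression(max_iter=1000)
    model.fit(labeled_store[FEATURES], labeled_store["is_fraud"])
    return model

if __name__ == "__main__":
    # Tiny synthetic demo: two review batches feed the labeled store,
    # after which the model is retrained on everything collected.
    batch_1 = pd.DataFrame({
        "amount_usd": [12.5, 4800.0], "hour_of_day": [14, 3],
        "km_from_home": [1.2, 850.0], "txns_last_24h": [2, 14],
        "is_fraud": [0, 1],
    })
    batch_2 = pd.DataFrame({
        "amount_usd": [35.0, 9100.0], "hour_of_day": [10, 2],
        "km_from_home": [5.0, 1200.0], "txns_last_24h": [3, 20],
        "is_fraud": [0, 1],
    })
    store = record_analyst_feedback(batch_1, batch_2)
    model = retrain_fraud_model(store)
    print("Model retrained on", len(store), "analyst-labeled transactions")
```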
Nemalikanti from Abrigo agreed with Kumar, emphasizing the need to prioritize honest, proactive communication about AI use in order to meet customer expectations.
This should include frequent updates on AI projects, detailed explanations of how AI improves security and services, and timely, transparent responses to incidents.
“Fraudsters are quick to exploit new technologies, using tactics like deepfake videos or synthetic identity fraud,” Nemalikanti stated.
“By fostering transparency and emphasizing security, banks can build and maintain consumer confidence, thereby reducing the risk of customer attrition due to AI-related concerns.”
The Bottom Line
The rush to integrate AI into banking is underway, with institutions viewing it as a critical driver of efficiency, security, and personalization. However, a considerable trust gap remains between financial organizations and their customers, and several unanswered questions clearly require public debate.
Can the banking industry outpace regulation? And should it? Balancing innovation with regulatory compliance is an ongoing challenge in finance. How can rules evolve to keep pace with AI’s rapid progress while safeguarding consumers?
Will AI democratize finance or leave some behind? While AI has the potential to widen access to financial services, smaller institutions may struggle to adopt the technology owing to limited resources and regulatory complexity. How will the industry achieve inclusive growth in the AI era?
Can trust be built in the age of AI-powered banking? Transparency and education are critical for increasing customer confidence in AI. How can financial institutions communicate their use of AI effectively and instill trust in its responsible application?
The success of AI in finance depends on addressing these questions. By encouraging open communication, emphasizing security, and ensuring responsible development, the financial industry can close the trust gap and realize AI’s full potential for both institutions and their customers.