Chatbots Can Positively Impact Financial Standing

Insights from Posh AI Field Test

The emergence of new technologies significantly impacts financial security and customers’ opportunities throughout the financial industry. Too often, however, the design of new technologies does not consider the needs of and potential benefits for households living on low and moderate incomes (LMI). As a result, new technologies may deepen inequity by focusing only on the needs of people with greater financial resources. 

Commonwealth’s Emerging Tech for All (ETA) initiative aims to ensure that, as the next generation of emerging technologies develops, the needs, wants, and aspirations of people living on LMI are understood and integrated into the design of new financial technologies. We are shedding light on these perspectives by partnering with financial organizations, fintechs, and platform providers who implement emerging technologies to research the needs and wants of customers living on LMI. 

Our 2024 national survey report, Generative AI and Emerging Technologies, provided actionable insights into larger trends in the use of generative AI, chatbots, and digital financial services. The report highlighted persistent gaps in technology adoption between lower- and higher-income households.

To add real-life color to the survey report’s findings, this field test brief provides a focused look at Associated Credit Union of Texas (ACU of Texas) customers living on LMI through in-depth interviews and a survey of 247 customers who had recently used ACU of Texas’ chatbot, Ava. Through this partnership with ACU of Texas and conversational AI provider Posh, we continue refining our understanding of how chatbots can best serve households living on LMI. Because a lack of access to local bank branches disproportionately affects lower-income communities, financial chatbots may offer a key path to ensuring equitable access to financial services for populations without easy access to a physical branch.

Key insights

In this field test, we identified several key insights:

  • Chatbots are largely viewed as positive tools for improving financial standing.
  • Transparency about chatbot features and limitations may build trust.
  • Credit building and budgeting are consistently the top-ranked features desired by research participants, who said they value the data-driven, judgment-free insights a chatbot can provide.
  • Information source and recency were often determining factors in respondents’ preference for chatbots versus humans to provide accurate information.

Insights in depth

The chatbot was perceived as improving financial standing. 

In a survey of ACU of Texas customers who had used the credit union’s chatbot, 57% of respondents somewhat or strongly agreed that the chatbot positively impacted their financial standing. Only 12% disagreed that the chatbot had a positive impact. The youngest respondents were the most likely to feel that the chatbot positively impacted their financial standing. Chatbots can continue to add value for customers as the technology progresses and as more of the banking population becomes comfortable with it.

Transparency about chatbot features and limitations may build trust.

Research participants were split on whether they trusted chatbots to treat them fairly and operate in their best interest. For some respondents, the fact that the chatbot is emotionless increased their trust: a chatbot would be unable to show bias or treat one customer differently from the next. For these customers, humans carried a higher risk of variability. They feared that a human service representative might be prejudiced, or might simply have had a bad day, and that this would shape the interaction and the representative’s willingness to support them.

For other participants, the lack of emotion from a chatbot decreased their trust, as a chatbot would be programmed to operate in only one way. For these respondents, a human would be better able to understand their unique circumstances and might be moved by empathy to go beyond the customer service a chatbot would be authorized to provide. While people may always have differing opinions about chatbots, there may be an opportunity for designers and developers to use messaging that leans into chatbots’ potential strengths, emphasizing that a chatbot can provide consistent support without judgment and framing that consistency as a strength rather than a perceived weakness.

“The chatbot, in my opinion, which has no emotion, doesn’t know what is in best interest, it just goes off in numbers. Whereas if you speak to a person, they can hear the full story and be able to see what’s the best option for you.”

“If I’m frustrated and I speak to somebody, you can tell in my voice, and [that] might rub that off, and [they] get frustrated with me. The chatbot has no emotion, so it can’t [behave] any different.”

Chatbot users indicated high levels of interest in chatbots that could assist with credit building and budgeting. They would value a chatbot that delivers data-driven insights without judgment to improve these two fundamental elements of their financial lives.

Interview and focus group participants consistently expressed interest in how their bank’s chatbot could better assist them with budgeting, setting financial goals, and building or repairing credit. These are topics that chatbot users living on LMI frequently think about in their personal financial lives, and they sometimes struggle to find the time or relevant guidance they need to feel confident completing these tasks. Participants pointed out that the banking chatbot likely has a novel and valuable perspective on their finances, since it can see their account activity and balances as well as any loans or lines of credit with their bank.

Participants often wished that, in addition to responding to questions, the chatbot could use this perspective to proactively help them improve their finances with personalized budgeting help, financial advice, and credit counseling. Chatbot users believe that their bank’s chatbot already has access to the data it would need to analyze spending habits, recognize recurring bill payments, or understand the financial impact of a home or car loan. A chatbot that could draw on these customer insights to proactively benefit its users’ financial lives would be a highly desirable and valuable new banking tool.

Chatbots are particularly well positioned to provide personalized feedback because users overwhelmingly trust the tool to provide information without judgment. Users feel comfortable asking the chatbot for guidance on a sensitive financial topic without the shame or self-consciousness that might accompany the same conversation held face-to-face with a human representative. In our survey, only 8.6% of respondents indicated previous use of a bank chatbot for financial advice or education, compared to 45% who had used a chatbot to find information about their finances or accounts. If chatbots can provide users with more relevant budgeting and credit advice, there is likely an opportunity to increase chatbot engagement, meet customer needs, and earn more trust.

Chatbot users pay attention to the source of information when deciding whether to trust a chatbot’s or human representative’s accuracy.

Research participants drew a wide range of conclusions about whether they trusted humans or chatbots to provide them with accurate information. Still, the rationales behind these conclusions always focused on the information’s source. 

For some people, the chatbot was seen as having access to more information, or the most up-to-date information, and was therefore trusted for accuracy. For others, a human representative’s ability to understand wider context and ask follow-up questions made them the more trustworthy provider of accurate information. Some interviewees said they trusted chatbot and human accuracy equally, noting that both should be drawing on the same underlying information. Despite arriving at different answers, all of our interviewees made clear that the source of the information was key and that they factored it into their decisions about trust in accuracy.

Focus group participants emphasized the recency of a banking chatbot’s data when deciding whether to trust the information provided. Information that was outdated or not applicable to current circumstances was identified as a detriment to trust in a chatbot. On the other hand, a chatbot that is clearly providing up-to-date information is seen as more trustworthy than a human who may not yet have access to it. Even for more advanced AI-based technologies, participants warned that one cannot be absolutely sure the information is correct. Technologically savvy respondents pointed out that ChatGPT, in particular, was trained on a selection of years-old information. Common among these participants was the idea that, regardless of the chatbot’s trustworthiness, they would like the ability to verify its information before making a decision. This desire for supporting evidence applied widely to information initially received from bank chatbots, bank websites, and human representatives alike.

For service providers, offering additional transparency around updates and software versions in an FAQ or another easily accessed location could increase trust in the information provided.

Conclusion

The widespread implementation of chatbots, and their potential to offer significantly more complex support as AI advances, presents a prime opportunity to improve the financial health of, and provide valuable support to, millions of households living on LMI, who are significantly less likely to have easy access to a local branch of their bank. Through this research, we aim to highlight key barriers and opportunities for households living on LMI in this space, giving providers the information they need to support these customers most effectively through this technology.

Our updated Tech for Good Toolkit & Chatbot provide actionable guidance for financial actors looking to implement equitable design in conversational AI and other technologies. Sign up for our newsletter if you want to be notified of this release and other important work from Commonwealth.


This work is supported by JPMorgan Chase & Co. The views and opinions expressed in the report are those of the authors and do not necessarily reflect the views and opinions of JPMorgan Chase & Co. or its affiliates.