Some 35 million adults in the UK are financially vulnerable, according to FCA criteria. And 37% of financially vulnerable consumers want greater investment in AI-powered chatbots to help them with financial problems.
This is according to a recent report from customer experience (CX) firm Nice, The changing face of vulnerability, based on research carried out by FocalData, which found a rise in the number of British people who identify as vulnerable, to 19% – an increase of more than one million people since the first study.
But, assessed against Financial Conduct Authority (FCA) criteria, two-thirds of UK adults – 35 million people – are potentially vulnerable, often without being aware.
The first study was undertaken by market research firm FocalData among 2,042 UK adults in November 2023, and repeated among 2,021 UK adults in November 2024.
The FCA defines a vulnerable consumer as somebody who, due to their personal circumstances, is especially susceptible to harm, particularly when a company is not acting with appropriate levels of care. It identifies four factors which may increase the risk of vulnerability: poor health; experiencing a distressing life event, such as a bereavement; low resilience; and low capability – such as financial, digital or language skills, or learning difficulties.
Vulnerable consumers are increasingly reliant on digital channels for support, according to the Nice report. Over a third (37%) said they prefer organisations to invest in better digital services like AI-powered chatbots over traditional in-person services such as real-life branches, surpassing the general population’s demand for digital services (33%).
Richard Bassett, vice-president of digital and analytics at Nice, said: “The findings pose a considerable challenge for UK organisations, particularly given regulations like the FCA’s Consumer Duty or Ofgem’s Vulnerability Strategy. Vulnerability stems from an increasing range of factors – from financial pressures to personal challenges – making it harder to recognise, even for consumers themselves.
“Subtle cues, such as mentions of stress or relationship breakdowns, often surface during customer service interactions but are easily missed or affected by bias, particularly with human agents. AI and automation provide a critical solution. By analysing customer service data, AI can detect vulnerability during every interaction and provide agents with real-time guidance – ensuring no one is overlooked.
“The anonymity offered by digital channels can be especially empowering for vulnerable individuals who may feel uncomfortable discussing sensitive issues face-to-face. This presents a significant opportunity for UK organisations to leverage AI-powered chatbots and virtual agents to help vulnerable customers resolve their issues quickly and accurately.
“However, caution is essential. These solutions must be able to detect subtle vulnerability cues, respond appropriately, and seamlessly escalate to a human agent or the correct workflow with full context preserved. These insights should be used alongside data from voice channels to enhance agent training and support,” he added.
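The cue detection and escalation Bassett describes can be sketched very roughly in code. This is an illustrative assumption, not Nice's actual product: real systems would use trained language models rather than a keyword list, and the cue categories and escalation threshold below are invented for clarity.

```python
# Illustrative sketch only: a naive keyword-based vulnerability-cue detector.
# The cue list and the escalation threshold are assumptions for illustration;
# production systems would use trained models, not keyword matching.

VULNERABILITY_CUES = {
    "health": ["stress", "anxiety", "illness", "bereavement"],
    "life_event": ["divorce", "relationship breakdown", "lost my job"],
    "financial": ["can't afford", "debt", "missed a payment"],
}

def detect_cues(message: str) -> list[str]:
    """Return the cue categories whose keywords appear in the message."""
    text = message.lower()
    return [
        category
        for category, keywords in VULNERABILITY_CUES.items()
        if any(keyword in text for keyword in keywords)
    ]

def handle_message(message: str) -> dict:
    """Flag cues and decide whether to escalate to a human agent,
    preserving the original message as context for the hand-off."""
    cues = detect_cues(message)
    return {
        "cues": cues,
        "escalate_to_human": len(cues) >= 2,  # assumed threshold
        "context": message,  # full context preserved for the human agent
    }

result = handle_message("I'm under a lot of stress and I've missed a payment")
print(result["cues"])               # ['health', 'financial']
print(result["escalate_to_human"])  # True
```

The key design point, echoing the quote above, is that escalation carries the full conversation context with it, so the customer does not have to repeat sensitive disclosures to the human agent.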
It may seem counter-intuitive that AI could spot signs of vulnerability that a human agent in a contact centre is likely to miss. In a briefing with Computer Weekly, Bassett said: “There are certain situations or circumstances where people do want the confidence of a human. If you have a bot that’s diagnosing you, that might be something of a challenge to accept compared with a doctor.
“Having said that, if you’ve got financial difficulties, there’s a lot of people who are embarrassed on the back of that, even ashamed of it. They haven’t got the confidence to admit that to somebody in the voice world, but on a chat, or even with a bot, they are okay to have those conversations.”
According to Bassett, customers, especially younger ones, feel more comfortable disclosing details of financial worries to a chatbot than to a live human.
The study found that younger adults, especially those under 34, are the most self-aware, with 31% identifying as vulnerable compared with 19% across all age groups. They are also more comfortable discussing mental health with customer service agents.
Darren Rushworth, president of Nice, said: “The increasing self-awareness among younger consumers is a promising step toward more open communication. However, organisations cannot depend solely on self-identification, as it overlooks those who are unaware of their vulnerability or choose to hide it out of fear, embarrassment or shame.
“Even when warning signs are flagged, they are often missed if advisers lack the confidence or tools to respond effectively. Worse yet, even when advisers take appropriate action, these cases can still fall through the cracks if workflows and knowledge across customer service are not properly connected.”
Financial pressures, particularly rising energy and utility costs, continue to weigh heavily on UK households. Some 35% of potentially vulnerable consumers anticipate reducing or stopping heating and hot water usage in 2025 due to financial strain, according to the study.
In addition to financial concerns (21%), many consumers feel uncomfortable discussing other causes of vulnerability, such as mental health (34%) and relationship breakdowns (28%), with human customer service agents.
Rushworth added: “UK organisations – especially energy providers – must adopt AI-powered solutions that subtly build customer confidence, such as self-service to help consumers easily find critical information when in need.
“AI-powered guidance during interactions ensures agents provide accurate, empathetic support in real time. Automation can ensure compliance, and that interactions involving vulnerability are appropriately routed into the correct processes.”