AI Chatbots Giving Millions of Brits Financial Advice Nobody Approved

You Asked an AI How to Invest Your Money. The UK’s Financial Watchdog Says That’s a Problem

It starts innocently enough. You type “should I put my savings into a stocks and shares ISA?” into ChatGPT. It gives you a confident, well-structured answer. You follow it.

What the AI didn’t tell you, and what most people don’t know, is that nobody checked whether that advice was legal, accurate, or safe.

What the FCA Actually Said

On 26 March 2026, the Financial Conduct Authority (FCA), the UK regulator responsible for protecting consumers from financial harm, published its latest perimeter report, flagging the rapid growth of general-purpose AI tools that offer financial advice or recommendations, such as AI-powered personal finance chatbots.

⚠️ Important clarification: The FCA did not call these tools “illegal.” What it said is more nuanced, and in some ways more alarming. The FCA noted that these tools may not fit neatly within existing regulatory frameworks, raising questions about whether the current perimeter boundaries remain appropriate if consumer harm begins to materialise, and it has urged the government to consider whether those boundaries should be updated.

In plain English: the rules that protect you when a financial adviser gets it wrong may not apply when an AI does.

Why That Gap Matters: The Real Evidence

A study published in March 2026 tested every major AI chatbot on financial questions. The results were alarming.

When asked about the safest way for a beginner to invest in cryptocurrency, Claude incorrectly described Binance as registered with the FCA. In reality, Binance was ordered to cease regulated activities in the UK in 2021.

“Recommending a non-FCA-registered exchange as the safest option for a beginner is about as concerning as it gets,” said Michele Tieghi, founder of psyfi money. “Investors could be left without asset protection and exposed to market manipulation.”

And it wasn’t just one model. Every AI model tested returned incorrect prices for at least 45 of the 50 popular cryptocurrencies in the study.

The Fraud Numbers Are Already Moving

AI-enabled scams contributed to a record 444,000 fraud cases reported in the UK in 2025, a 6% increase on 2024, with criminals increasingly using AI tools to scale deception and target victims more efficiently.

The UK Government’s Fraud Strategy, launched in March 2026, estimates that fraud now accounts for up to 45% of all crime committed in the UK. A Barclays survey found that only 36% of consumers felt confident they could identify an AI-enabled scam.

What Parliament Is Saying

The regulator isn’t the only one concerned. A January 2026 Treasury Committee report warned that the FCA’s current “wait-and-see” posture risks serious harm to consumers and the wider financial system, calling for AI-specific stress testing and comprehensive practical guidance to be published by the end of 2026.

As of November 2024, 75% of UK financial firms are already using AI, with a further 10% planning adoption within three years. The technology is already embedded. The rules are still catching up.

What You Should Do Right Now

The FCA’s guidance is straightforward: before acting on any financial advice, whether it comes from a human or a machine, use the FCA’s Firm Checker to confirm that the firm behind it is authorised.
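If you want to build that check into your own tools, the Financial Services Register that sits behind the Firm Checker also exposes a public API. Here is a minimal sketch in Python; the endpoint path, auth headers, response fields, and the example Firm Reference Number are assumptions to verify against the FCA’s developer documentation (https://register.fca.org.uk/Developer/s/), and you will need an API key from a free registered account.

```python
# Minimal sketch: look up a firm's authorisation status on the FCA's
# Financial Services Register. The endpoint path, header names, and
# response fields are assumptions based on the public developer docs;
# verify them there before relying on this.
import requests

REGISTER_API = "https://register.fca.org.uk/services/V0.1/Firm/{frn}"

def firm_status(frn: str, email: str, api_key: str) -> str:
    """Return the Register status (e.g. 'Authorised') for a Firm Reference Number."""
    resp = requests.get(
        REGISTER_API.format(frn=frn),
        headers={"x-auth-email": email, "x-auth-key": api_key},  # assumed header names
        timeout=10,
    )
    resp.raise_for_status()
    records = resp.json().get("Data") or []
    if not records:
        return "Not found on the FS Register"
    return records[0].get("Status", "Unknown")

if __name__ == "__main__":
    # 122702 is Barclays Bank PLC's FRN, used purely as an illustration.
    print(firm_status("122702", "you@example.com", "YOUR_API_KEY"))
```

For everyone else, the web-based Firm Checker does the same lookup by firm name or reference number, no code required.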

Bottom line: AI financial tools aren’t going away, and most of them are not regulated the same way your bank or financial adviser is. That gap between what these tools can do and what they’re accountable for is where people are already losing money. The FCA has spotted it. Now you have too.

Related Articles:

AI Self-Diagnosis Now Normal in UK: ChatGPT Replacing Doctors?

AI Is Reading Your CV and Rejecting You in UK