
FOS Responds to the Mills Review
Why It Matters
Unchecked generative AI threatens the speed and fairness of dispute resolution, potentially eroding consumer trust and regulatory compliance across the financial sector.
Key Takeaways
- AI may influence up to 35% of initial assessment responses
- Generative AI can increase case‑worker verification workload
- Professional reps submit AI‑generated, overly long, inaccurate documents
- Few complaints received about firms' AI usage
- FOS urges FCA to set transparency and record‑keeping standards
Pulse Analysis
The Financial Conduct Authority’s Mills Review, published earlier this year, examined how artificial intelligence reshapes retail financial services. In a detailed reply dated 2 April 2026, the Financial Ombudsman Service (FOS) highlighted early data suggesting that AI contributed to roughly one‑third of initial assessment responses. While generative tools helped some consumers articulate clearer arguments, the Service warned that erroneous or excessive AI output forces case workers to spend disproportionate time verifying facts, potentially slowing the quick, informal resolution model that underpins the ombudsman’s mandate.
The FOS also flagged a surge in AI‑driven submissions from professional representatives, some stretching to 200 pages in response to a six‑ to eight‑page provisional decision and riddled with inaccuracies. Such bloated filings not only burden the ombudsman's review process but also raise the risk of unnecessary escalations to senior ombudsmen, contradicting the Consumer Duty's emphasis on proportionality and fairness. Although complaints about firms' own AI usage remain scarce, the Service's observations suggest that unchecked generative AI could erode trust in dispute resolution channels if left unaddressed.
To mitigate these emerging risks, the FOS is calling on the FCA for clearer expectations around AI transparency, record‑keeping, and human‑in‑the‑loop escalation paths. The Service recommends that regulated firms disclose the rationale behind algorithmic decisions, align outputs with principles‑based rules such as the Consumer Duty, and maintain auditable logs accessible to both the ombudsman and consumers. By establishing firm‑wide standards now, the regulator can prevent a future flood of AI‑generated disputes, safeguard the speed and informality of ombudsman resolutions, and reinforce confidence in the UK’s financial services ecosystem.