Governance-Driven Adoption of Synthetic Data in Financial Services

Mark Townsend • 1 September 2025

It’s been a big year for synthetic data.

With the FCA’s Synthetic Data Expert Group (SDEG) publishing its final report, it’s clear we’re moving beyond experimentation and towards adoption, particularly within regulated environments like financial services. But with progress comes pressure. How do we ensure synthetic data is not only powerful but also safe, transparent and fair?


For those of my network working in data governance, quality, ethics and risk, this report offers both cautionary insights and practical direction.


What’s the Point of Synthetic Data?

At its core, synthetic data aims to mimic real datasets, preserving statistical patterns while stripping out personal or sensitive information. In financial services, this unlocks a few big opportunities:


  •   Enabling secure model development without exposing real customer data
  •   Testing systems under various stress scenarios
  •   Supporting innovation where data is limited or legally constrained
  •   Training AI models without breaching data protection laws
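To make the core idea concrete, here is a minimal sketch of the mimicry at the heart of synthetic data: fit a simple statistical model to a (hypothetical) real dataset, then sample fresh records from it. Everything here, the features, the distributions and the Gaussian model, is illustrative only, not a production generation technique:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "real" dataset: two correlated numeric features
# (say, account balance and monthly spend), standing in for
# customer data we must not expose.
real = rng.multivariate_normal(
    mean=[1000.0, 250.0],
    cov=[[4.0e4, 1.5e4], [1.5e4, 1.0e4]],
    size=5_000,
)

# Fit a very simple generative model: the empirical mean and covariance.
mu = real.mean(axis=0)
sigma = np.cov(real, rowvar=False)

# Sample a synthetic dataset from the fitted model. No real record is
# copied, but the overall statistical structure is preserved.
synthetic = rng.multivariate_normal(mu, sigma, size=5_000)

# Sanity check: aggregate statistics should be close, not identical.
print("means:", real.mean(axis=0), synthetic.mean(axis=0))
print("correlations:",
      np.corrcoef(real, rowvar=False)[0, 1],
      np.corrcoef(synthetic, rowvar=False)[0, 1])
```

The point of the sketch is also where the risk lives: anything the fitted model captures (including biases or outliers in the real data) carries over into the synthetic sample, which is exactly why the SDEG treats generation quality as a governance issue.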


Sounds promising. But as the SDEG rightly points out: synthetic data is not risk-free.


Synthetic data might be a privacy enhancer, but it still carries risk. Poor generation techniques, unclear documentation or hidden biases can all undermine trust and compliance. The FCA makes one thing clear: confidence in synthetic data isn't just a technical problem; it's a governance challenge.


“Confidence is built through transparency, consistency and shared understanding.” (FCA Synthetic Data Expert Group)


Foundations for Responsible Use


The SDEG outlines three key governance foundations:

  1. Frameworks, Controls & Processes: stage gates, escalation pathways and oversight mechanisms.

  2. Roles & Responsibilities: who owns what, from data generation to model deployment.

  3. Documentation & Monitoring: a clear, auditable trail of decisions, trade-offs and updates throughout the lifecycle.


These aren’t ‘nice-to-haves’; they’re prerequisites. Without them, synthetic data projects risk becoming compliance liabilities rather than innovation enablers. Drawing on AI ethics and model risk frameworks, the report offers a nine-point blueprint for good governance:

  •   Accountability: clear ownership, even with third-party vendors
  •   Safety: systems should be robust and reliable
  •   Transparency: clear visibility into data generation and use
  •   Explainability: humans must understand how outputs are produced
  •   Privacy & Security: risks of re-identification must be minimised
  •   Fairness: mitigate bias and prevent discrimination
  •   Agency: human oversight remains critical
  •   Suitability: align technology use with its ethical and practical context
  •   Continuous Monitoring: it’s a lifecycle, not a one-time review


For governance and data leaders, these principles offer a structure for embedding synthetic data safely into your ecosystem, not treating it as an isolated experiment.


Practical Recommendations from the FCA


The report is refreshingly actionable; it's not just theory. Here are a few standout recommendations:


  •   Conduct value vs. risk assessments before kicking off a project
  •   Run data protection impact assessments (DPIAs) early and often
  •   Use robust validation such as “Train Synthetic, Test Real” (TSTR)
  •   Document trade-offs between privacy, fidelity and utility
  •   Maintain auditability across every decision and output
  •   Engage cross-functional stakeholders: governance, legal, data science and compliance
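The TSTR check above can be illustrated in a few lines: train a model on synthetic data, evaluate it on held-out real data, and compare against the same model trained on real data. This is a toy sketch with made-up numeric features and a plain least-squares model, intended only to show the shape of the validation, not a real pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "real" dataset: 3 numeric features with a linear signal.
n = 2_000
X_real = rng.normal(size=(n, 3))
y_real = X_real @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=n)

# Hold out a slice of real data purely for testing.
X_test, y_test = X_real[:500], y_real[:500]
X_train, y_train = X_real[500:], y_real[500:]

# Stand-in synthetic generator: fit a linear model to the real training
# slice, then sample fresh synthetic records from the fitted model.
beta_gen, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
X_syn = rng.normal(size=(n, 3))
y_syn = X_syn @ beta_gen + rng.normal(scale=0.1, size=n)

def fit_and_score(X_tr, y_tr):
    """Train a least-squares model, return its MSE on the real test set."""
    beta, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
    return float(np.mean((y_test - X_test @ beta) ** 2))

tstr = fit_and_score(X_syn, y_syn)      # Train Synthetic, Test Real
trtr = fit_and_score(X_train, y_train)  # Train Real, Test Real baseline

print(f"TSTR MSE: {tstr:.4f}   TRTR MSE: {trtr:.4f}")
```

If the TSTR score is close to the train-real baseline, the synthetic data has retained enough utility for the task; a large gap is evidence the generator has lost something the real data carries.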


This isn’t just about ticking boxes. It’s about ensuring synthetic data is not only usable, but trusted.


What Does This Mean for Data Leaders?

Whether you’re a Chief Data Officer, Head of Governance, or Data Risk Manager, the message is clear:

Synthetic data has moved into the mainstream. It will become increasingly embedded into how firms design models, test systems and share data. But governance must evolve with it.


Some questions to help keep you on track...

Do our governance frameworks account for synthetic data risks?

Are we involving the right people (for example, legal, compliance and ethics teams) at the right time?

Can we explain our synthetic data decisions to a regulator?

If not, it may be time to act.


Synthetic data offers incredible promise, but trust can’t be faked. That’s why strong governance isn’t optional. It’s the foundation for responsible innovation, better models, and ultimately, safer outcomes for customers and businesses alike.


This FCA report is a valuable blueprint for what “good” looks like.



If you’re building or scaling your data governance team to tackle this challenge, I’d love to hear from you.

Let’s connect.


📨 mark.townsend@kdrtalentsolutions.com


