Digital Fraud Resilience for Government.

Transforming a proven HMRC fraud platform into a cross-government capability for real-time fraud prevention and digital fraud resilience.

Overview

A joint effort between the National Cyber Security Programme, GDS, and HMRC set out to introduce digital fraud prevention platform technology across the UK government’s digital estate.

The technology had already been developed within HMRC, but taking something that worked in one departmental context and making it useful across government presented a different kind of challenge. This was not simply a matter of scaling a platform. It required a shared understanding of fraud, risk, data, identity, service intervention, and operational maturity across departments with very different levels of digital capability.

The work began by exploring multiple proposed solutions through research and analysis across government agencies. The aim was to understand the broad landscape of knowledge, existing initiatives, technical constraints, organisational barriers, and cultural resistance that would affect adoption.

Key challenges

Introducing real-time and machine-learning-based fraud prevention across government presented several challenges, each explored in depth below:

- Moving from policy ambition to practical adoption across departments with very different starting points
- Positioning the platform in a crowded transformation landscape
- Overcoming adoption fatigue across departments
- Building the team and delivery capability for each stage of the programme

Outcome

The work transformed a promising HMRC fraud platform into a cross-government proposition that departments could understand, evaluate, and adopt according to their own needs.

We developed a delivery strategy from high-level policy requirements, supported by research, stakeholder engagement, product strategy, and technical proof-of-concept work. A Digital Fraud Maturity Model gave departments a common language for understanding their current capability and the steps required to improve it.

This helped clarify the platform’s role in a crowded government landscape. It was not simply another data, identity, or cyber security initiative, but a way to use operational signals from digital services to identify and respond to fraud risk earlier.

The result was a product roadmap and a transformation path: from fragmented fraud prevention activity to a shared model for digital fraud resilience.


Challenges in depth

Moving from policy ambition to practical adoption

Large transformation programmes often begin with a clear policy ambition and a promising technology. The difficulty is working out how to make that ambition meaningful across organisations that are all starting from different places.

Fraud prevention is especially difficult because it cuts across service design, identity, data, cyber security, casework, investigation, operational policy, and legal authority. One department may already have mature digital services and data science capability. Another may still rely on manual review, fragmented data, or processes designed around paper-era assumptions.

A single platform solution can’t resolve organisational differences by pretending they aren’t there. It has to provide interfaces for those differences, so that departments can connect their own maturity, operating models, data, and risk practices to a shared capability.
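As a minimal sketch of what "interfaces for those differences" could mean in practice (all class and field names here are hypothetical, not from the actual platform), each department adapts its own operating model to a shared contract rather than the platform assuming a single level of maturity:

```python
from abc import ABC, abstractmethod

# Hypothetical interface: each department translates its own events and
# risk practices into a shared contract, so the platform does not have to
# assume one operating model across government.
class DepartmentAdapter(ABC):
    @abstractmethod
    def risk_signals(self, service_event: dict) -> list[dict]:
        """Translate a local service event into shared risk signals."""

class ManualReviewAdapter(DepartmentAdapter):
    """A lower-maturity department might only flag events for human review."""
    def risk_signals(self, service_event):
        return [{"type": "manual_review", "ref": service_event["id"]}]

class ScoredAdapter(DepartmentAdapter):
    """A higher-maturity department can attach its own model score."""
    def risk_signals(self, service_event):
        return [{"type": "scored", "ref": service_event["id"],
                 "score": service_event.get("local_score", 0.0)}]
```

The shared capability consumes the same signal shape from both, while each department keeps its own internal practices.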

Before implementation could mean much, we had to understand the landscape well enough to know what kind of change the platform would need to support.

This meant researching how fraud and risk work was understood across government departments, where knowledge was concentrated, where gaps existed, and what barriers would prevent adoption. Some barriers were technical. Some were organisational. Some were cultural. Many were about language.

Without a common way to talk about maturity, departments would struggle to understand what the platform was for, what problem it solved, and what they would need to change in order to benefit from it.

Digital fraud resilience needed a shared map before it could become a shared capability.

Solutions

The first step was to develop a maturity model for digital fraud resilience.

The model was created from research across government agencies of varying sizes and levels of capability. It focused on three areas: service intervention, data insight, and coverage.

The maturity model created a shared language for departments, policy stakeholders, technologists, fraud teams, and senior decision-makers. It also made clear that different departments would need different paths to maturity.

That mattered. It meant the platform could be presented as an enabling capability, not as a one-size-fits-all technology rollout.
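As an illustration only, a department's position against the three areas could be represented as a simple assessment structure; the level names and scoring below are hypothetical, since the actual model's levels are not described here:

```python
from dataclasses import dataclass

# Hypothetical maturity levels; the real model's level names are not
# specified in this write-up.
LEVELS = ["manual", "reactive", "proactive", "adaptive"]

@dataclass
class MaturityAssessment:
    """A department's assessment across the model's three areas."""
    service_intervention: int  # index into LEVELS
    data_insight: int
    coverage: int

    def weakest_area(self) -> str:
        """Suggest the next area to improve: the one at the lowest level."""
        scores = {
            "service_intervention": self.service_intervention,
            "data_insight": self.data_insight,
            "coverage": self.coverage,
        }
        return min(scores, key=scores.get)

# Example: strong data insight, but interventions still largely manual.
dept = MaturityAssessment(service_intervention=0, data_insight=2, coverage=1)
print(dept.weakest_area())  # -> service_intervention
```

Even a structure this simple shows why different departments need different paths: two departments at the same overall level can have entirely different weakest areas.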


Positioning the platform in a crowded transformation landscape

The proposed platform didn’t exist in isolation.

Across government, related work was already happening in data transformation, identity assurance, cyber security, and fraud operations. Some of it overlapped directly with the platform. Some of it shaped the environment the platform would have to work within. Each had its own stakeholders, language, assumptions, and desired outcomes.

In that kind of environment, even a good technology can struggle to gain traction. People need to understand not only what it does, but how it relates to things they are already working on.

If this was not resolved early, the work could become trapped in comparisons. Was this an identity assurance programme, a data platform, a cyber security capability, a fraud operations tool, a machine learning product, or another version of something that already existed?

In reality, the answer to each was partly yes: the platform touched all of these areas without being reducible to any one of them.

Solutions

The solution was to stop treating the platform as a standalone technology rollout and start treating it as a cross-government product proposition.

That meant working backwards from the outcomes different groups already cared about.

This changed the conversation. Instead of asking departments to adopt someone else’s platform, we could show how it connected to their own responsibilities, constraints, and priorities.

The product strategy and roadmap were built around those different routes into the same capability.

The point was not to force a single adoption path or replace adjacent work in identity, data, cloud, and cyber security. It was to show how the platform could sit between those efforts, using operational signals from digital services to identify and respond to fraud risk earlier.


Overcoming adoption fatigue across departments

Cross-government transformation has a history. Departments have seen platforms, shared services, data initiatives, and policy-led programmes come and go. Some have worked. Some have not. That history matters.

Even where the case for fraud resilience was strong, not every department had the same reason to participate. Some had immediate fraud problems the platform could help address. Others had less direct need, or had already tried similar approaches that had failed, stalled, or created more work than value.

That created a practical adoption problem. The programme couldn’t rely on the idea that departments would participate simply because the platform was useful somewhere else in government.

For some organisations, the value was local. For others, the value came from contributing data, signals, or operational knowledge to a wider feedback cycle that could improve fraud resilience across government.

The challenge was making participation feel worthwhile, credible, and proportionate.

Solutions

The proof-of-concept work moved the conversation from abstract ambition to practical implementation.

Rather than only describing cross-government signal exchange, the team tested what would have to be true for fraud and risk signals to move between organisations in a useful, governed way.

This exposed several issues that would need to be resolved before the capability could scale.

The proof of concept helped define the path from local fraud knowledge to exchangeable operational intelligence.

A mature fraud capability depends on more than detecting signals. It depends on knowing what those signals mean, who can use them, what can be done with them, and how the result feeds back into the wider system.
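As a hedged sketch of what a governed signal exchange might involve (the field names and governance terms here are illustrative, not the actual schema), each signal would need to carry enough context for a recipient to know what it means, who produced it, and what they are permitted to do with it:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical envelope for a fraud signal moving between organisations.
# All field names are illustrative; the real proof-of-concept schema is
# not described in this write-up.
@dataclass
class FraudSignal:
    signal_type: str         # what was observed, e.g. "velocity_anomaly"
    source_org: str          # who observed it
    confidence: float        # how sure the source is (0.0 to 1.0)
    allowed_uses: list       # governance: what recipients may do with it
    observed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def can_act_on(signal: FraudSignal, intended_use: str) -> bool:
    """A recipient checks governance constraints before using a signal."""
    return intended_use in signal.allowed_uses

sig = FraudSignal("velocity_anomaly", "dept-a", 0.8,
                  allowed_uses=["triage", "analytics"])
print(can_act_on(sig, "triage"))      # True
print(can_act_on(sig, "auto_block"))  # False
```

The governance check is the point: a signal that is useful for triage in one department may not be a lawful or proportionate basis for automated action in another.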


Building the team and delivery capability

Large transformation programmes run for years, but they do not need the same team at every stage.

Early work needs people who can handle ambiguity: researchers, product thinkers, technical leads, and delivery people who can test assumptions before the shape of the programme is fully known. Alpha and proof-of-concept work needs a different mix, with enough design, DevOps, platform, back-end, and front-end capability to turn the strategy into working examples. Later stages need a more stable delivery model, with clearer governance, support, operations, and long-term ownership.

The challenge was knowing what kind of team was needed at each stage, how quickly to ramp up, and how much delivery capability to create before the programme had answered its biggest questions.

Solutions

I led resourcing and delivery planning for the proof-of-concept and alpha stages, while also planning how the programme would need to grow over several years.

This meant shaping the team around the work that needed to be tested, rather than assuming the eventual programme shape was already known.

The aim was to create enough delivery capability to move from strategy into working examples, while leaving room for the team, architecture, and product direction to evolve.

The team developed enough core platform architecture and working services to prove the approach was viable. The work also showed how the platform could be delivered in useful pieces, creating momentum without waiting for the whole cross-government model to be solved at once.

This gave the programme practical evidence to work from: what could be built, what skills were needed, what dependencies existed, and what questions would need to be resolved before the capability could scale.


Resilience in the face of change

Fraudsters adapt quickly. Government systems are often large, complex, and slow to change.

Digital fraud resilience is not about pretending fraud can be eliminated. It is about improving the organisation’s ability to learn, respond, and adapt as quickly as possible.

The work showed that real-time and machine-learning-based fraud prevention could not simply be dropped into government as a finished product. It had to be made understandable, testable, and useful across departments with different needs, capabilities, incentives, and histories.
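As a toy illustration of the learn, respond, and adapt loop described above (nothing here models a real government service; the features, weights, and update rule are all hypothetical), a real-time score can be nudged by confirmed outcomes from casework:

```python
# Toy sketch of a learn/respond/adapt loop. Feature names and the
# update rule are illustrative only.

def score(event, weights):
    """Real-time risk score: a weighted sum of event features."""
    return sum(weights.get(k, 0.0) * v for k, v in event.items())

def update(weights, event, was_fraud, lr=0.1):
    """Adapt: nudge weights toward confirmed outcomes (crude online update)."""
    err = (1.0 if was_fraud else 0.0) - score(event, weights)
    return {k: weights.get(k, 0.0) + lr * err * v for k, v in event.items()}

weights = {}
event = {"new_device": 1.0, "rapid_claims": 1.0}
weights = update(weights, event, was_fraud=True)  # feedback from casework
print(score(event, weights) > 0)  # the model now flags similar events
```

The sketch is deliberately crude; the point it makes is structural: resilience comes from closing the loop between detection, intervention, and confirmed outcomes, not from any single model.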

The maturity model provided the shared map. The product strategy gave departments different routes into the same capability. The proof-of-concept and alpha work showed how delivery could start in useful pieces, building momentum while the wider programme questions were still being resolved.

The transformation was from isolated departmental activity towards a shared model of digital fraud resilience: one where organisations could understand their own maturity, contribute to a wider feedback cycle, and build the foundations for earlier, faster, and more coordinated fraud response.