The role of data sharing as UK becomes key battleground against fraud 


The global shift to digital payments has brought with it an unprecedented level of financial crime. As international markets make a collective stand against this ever-growing threat, David Sutton, Director of Innovation at Featurespace, talks about the emergence of the UK as a global leader in the fight against fraud with its new regulations, developing tech, and data sharing capabilities.

Combating financial crime and fraud, especially scams, is a never-ending battle. As technology advances and new payment platforms emerge, criminals continually change their tactics to scam consumers and businesses.

Those of us who fight against these bad actors also have to update our playbook on an ongoing basis. Some of the most important weapons in our arsenal right now are AI and machine learning, which offer the potential to completely transform scam prevention.

However, for this potential to be realised, we need to overcome some big challenges. The most critical of these is finding a way for financial institutions to balance their data protection responsibilities with the industry-wide data sharing necessary to effectively train AI models.

The UK is the key battleground in the fight against fraud and scams 

Where fraud is concerned, the UK is currently one of the key battlegrounds. There’s a perfect storm brewing, upping the ante on all sides. We have this new enabling technology in machine learning. We have regulators who are pushing for banks to take greater responsibility for protecting and compensating customers. And we have a massive problem with Authorised Push Payment (APP) scams, which is particularly acute in this country due to the advanced real-time payments landscape.

The rise of socially engineered APP scams has come about for several reasons. One of the main ones is that it is very difficult to breach internet banking security at the login stage. Because regulators focus so much on login security, the banks do too. PSD2 regulations place great emphasis on the use of strong customer authentication and two-factor authentication (2FA), making it much more difficult for criminals to break into accounts in this way. Banks also use behavioural biometrics and device fingerprinting as additional methods of authenticating users, further reducing the possibility of third-party fraud.

The knock-on effect of this is that the consumer is now the easiest point of attack as far as the fraudster is concerned. So, criminals have changed their tactics to focus on socially engineered APP scams. Around four out of five of these scams originate on social media. This is why it's important that it's not just banks involved in the fight against fraud, but social media companies and telcos too.

Whose responsibility is it to lead the fight against fraud and scams?

The question of liability is especially pertinent. The Payment Systems Regulator has recently shifted the balance of responsibilities for reimbursing victims of fraud. Before 2018, the banks had no liability, but after that a voluntary code was put in place meaning it was the customer's bank that would be on the hook for reimbursement. Now, responsibility will be shared 50:50 between the customer's bank and the beneficiary bank – the institution to which the money is fraudulently directed. And in my view, there's a strong argument that social media companies and telcos should also bear some of this responsibility.

Scams are everyone's problem. And if we're going to fight them effectively, then banks need to share data with each other, as they could potentially have to pay out if they have a role in any fraud. And social media companies and telcos need to share their data as well, so the trail can be followed. Without this level of data collaboration, we will be fighting fraud with one hand tied behind our back.

Data protection and privacy-enhancing technologies

Of course, the UK GDPR and the Data Protection Act 2018 (DPA 2018) – the legislation that has governed data sharing in the UK since Brexit – mean that the way in which data can be used and shared is strictly regulated. But for the fight against fraud to be effective, there is a need to share customer data from a variety of sources – banks, payment processors, social media companies, telcos – so machine learning systems can be trained to spot fraud and scams on an industry-wide level.

In order to meet this need, some innovative solutions have sprung up. Privacy-enhancing technologies or PETs are techniques designed to facilitate data sharing without compromising the security or rights of the individuals to whom the data pertains. While there are several different types of PETs, the general principle is that they don’t share raw data so much as communicate an insight or decision among the participants in the data sharing.

One type of PET known as Federated Learning could potentially work by having small units distributed between banks that communicate a limited amount of data about transactions to a central orchestrating party, such as a payment network, or a highly trusted standards authority such as Pay.UK. The central party can extract insights from each bank about a transaction, fill in the blanks and make a decision on whether to allow the payment to proceed or not. Using a second PET called Differential Privacy, the data transmitted by the banks can't be reverse engineered, because a certain amount of noise is added to the signal, which ensures privacy at the cost of a small degradation in accuracy.
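To make that flow more concrete, here is a minimal Python sketch of the idea. Everything in it is an illustrative assumption rather than any real Pay.UK or Featurespace implementation: the toy per-bank scoring function, the privacy budget, and the orchestrator's threshold are all invented for demonstration. Each bank releases only a risk score with Laplace noise added (the classic Differential Privacy mechanism), and the central party aggregates the noisy scores to decide whether the payment proceeds – no raw customer data ever leaves a bank.

```python
import math
import random

random.seed(0)  # deterministic for the sketch

SENSITIVITY = 1.0  # local scores are clipped to [0, 1]
EPSILON = 0.5      # illustrative privacy budget per released score

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def local_risk_score(features: dict) -> float:
    # Stand-in for each bank's locally trained model: produces a
    # fraud-risk score in [0, 1] from features the bank holds privately.
    score = 0.6 * features["new_payee"] + 0.4 * features["amount_unusual"]
    return min(max(score, 0.0), 1.0)

def release_noisy_score(features: dict) -> float:
    # Differential Privacy step: the released value carries Laplace
    # noise, so it cannot be inverted back to the underlying record.
    return local_risk_score(features) + laplace_noise(SENSITIVITY / EPSILON)

def orchestrator_decision(noisy_scores: list[float],
                          threshold: float = 0.5) -> str:
    # The central party (e.g. a payment network) only ever sees the
    # noisy scores; it averages them and decides on the payment.
    avg = sum(noisy_scores) / len(noisy_scores)
    return "block" if avg >= threshold else "allow"

# One transaction, as scored by the sending and receiving banks.
scores = [
    release_noisy_score({"new_payee": 1.0, "amount_unusual": 1.0}),
    release_noisy_score({"new_payee": 1.0, "amount_unusual": 0.5}),
]
print(orchestrator_decision(scores))
```

The noise scale (sensitivity divided by epsilon) captures the trade-off the text describes: a smaller epsilon gives stronger privacy but degrades scoring accuracy slightly.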

With Federated Learning and Differential Privacy, you can ensure that the data that is shared is obfuscated. It's not the raw data that's being shared; it's been processed in such a way that it can't be inverted, so a criminal cannot recover properties of the raw data such as names, addresses, values, balances, and so on. It is data minimisation for an acceptable purpose: protection against fraud, scams and financial crime.

Sharing is caring

The major benefit of PETs is that they enable data sharing between a large group of organisations. If there are only one or two institutions sharing data in an attempt to combat payment fraud, then their efforts are likely to be limited in their effectiveness. If, however, the group consists of a larger number of financial institutions, then it's much more likely that this data sharing will have a greater impact in the fight against fraud.

Of course, persuading banks to share data with each other is challenging, especially in the UK. Helpfully, the Information Commissioner’s Office (ICO) – the body responsible for enforcing the Data Protection Act 2018 – has recently released guidance for organisations that encourages them to use PETs when dealing with personal information to comply with privacy laws and fraud prevention guidelines.

There are other difficulties to overcome too. From a technical point of view, proving the value of data sharing is tough as you can’t prove the value until you get people to share data – it’s a bit of a Catch-22. To some extent you can work around this by creating simulated data for the purpose of demonstrating the value. But it’s essentially meaningless, as you’re just picking up signals that you’ve decided to put into it – without real data it’s impossible to prove the real value.

This is why it’s so important that Pay.UK, in partnership with Featurespace, has begun a pilot scheme that can help to establish the benefits of this kind of data sharing in preventing fraud. It’s a significant step forward that can clearly underline the role that PET-enabled data sharing will play in making the world a safer place to transact. It will use real data, and set a real-world precedent that we can continue to build upon.

The UK is in a perfect position to lead the global fight against fraud, scams and financial crime. Using PETs, it’s possible to create an environment for industry-wide collaboration without organisations having to reveal, share, or combine their raw data. With a concerted effort involving all the relevant players, machine learning can drive our efforts to stay ahead of the criminals, meaning consumers and businesses will be safer than ever.