
The British government insists it is committed to combating the threat posed by deepfakes – false images generated by Artificial Intelligence (AI) – but, like all political tasks, this could prove easier said than done.

Sir Chris Bryant MP – Source: Parliament

Though his political remit does not cover the full scope of technology and fraud prevention, Sir Chris Bryant, Minister of State for Creative Industries, Arts and Tourism, has reiterated the Labour government’s stance on AI deepfakes.

In a parliamentary questions session this morning, Labour’s MP for Congleton, Sarah Russell, raised the issue of deepfakes, particularly their most common form: graphic sexual depictions of women and girls. This is an offence Labour had previously pledged to combat in its July election manifesto.

“This is a very important issue, it affects many people,” Bryant responded. “The legislation in this area is not easy. We are looking at it. In the words of Stephen Sondheim, art isn’t easy, and the legislation isn’t easy either.

“We are, however, determined. It’s already a criminal offence to share an intimate image without consent, whether it’s real or synthetically generated, and we will also deliver our manifesto commitment to ban the creation of sexually explicit deep fakes.”

The extent of deepfakes

Whilst Labour’s stated ambitions around deepfakes chiefly focus on the threat posed to women and girls – as the MP for Congleton noted, this is the most common criminal use of deepfakes – any legislation the party introduces could have more far-reaching ramifications.

Deepfake technology is increasingly being leveraged by fraudsters as a means of deceiving victims. Fraudsters use the AI tech to create false images, such as identity documentation or depictions of human faces – which, with the most advanced tools, can even be replicated during a live video call – to fool both ordinary people and businesses.

The tech has not only been used to convince people to send money to a fraudulent account, as in the case of authorised push payment fraud (APP fraud) – a major talking point and the subject of newly imposed regulations. It can also be used to set up accounts with high-risk businesses, for example by using AI-generated fake ID documentation to open a gambling account.

Deepfakes are consequently becoming a significant headache for various businesses. As Ofer Friedman, Chief Business Development Officer at ID verification firm AU10TIX, told Payment Expert: “The arms race is on, but right now overall if I look at the market – the entire market, not necessarily the clients we handle – fraudsters are on top.”

This AI challenge has not gone unnoticed by the government, which seems to be following a similar policy to its Conservative predecessor regarding AI development and regulation. 

Both sides of the British political spectrum have recognised the growing role AI is playing in the country’s tech and fintech sectors, but also the risks it may pose – whether to artists and creators, to women and girls facing exploitation and abuse, or to consumers and businesses at risk of being defrauded.

“AI presents incredible opportunities for industry, society and governments but we must stay alert to the dangers, including AI-enabled fraud,” said Lord David Hanson, Minister of State at the Home Office with responsibility for fraud, commenting on an anti-fraud initiative by Starling Bank earlier this year.

Under Rishi Sunak, the Conservative government had a lot of faith in what AI could do for the country, committing around £500m in funding to the tech sector. Responsibility was also on its agenda, however, with the founding of the AI Safety Institute, which later signed a cooperation deal with its US counterpart.

Labour seems to be following the same route. The party’s manifesto included a policy of creating data centres across the country to support responsible AI development, the first of which was recently opened in North East England.

Though speaking with a focus on art and musicians, Chris Bryant summed up the attitude Labour seems to have regarding AI. Finance and technology stakeholders may expect a similar approach from his Labour colleagues at the Treasury, the Department for Business and Trade (DBT) and the Department for Science, Innovation and Technology (DSIT).

“The rights of artists, musicians, publishers, journalists need to be protected, and we need to garner the very significant benefits that artificial intelligence can bring,” he asserted in Parliament.