What is the deepfake porn racket targeting British women MPs ahead of UK election?

FP Explainers July 2, 2024, 15:26:06 IST

British women politicians have fallen victim to fake pornography, with their faces superimposed onto nude images created using artificial intelligence. Among those targeted on a prominent fake pornography website are Labour deputy leader Angela Rayner, UK Education Secretary Gillian Keegan, Commons Leader Penny Mordaunt, former UK Home Secretary Priti Patel, and Labour backbencher Stella Creasy.

Angela Rayner, deputy leader of the Labour Party, Daisy Cooper, deputy leader of the Liberal Democrats, and Penny Mordaunt, leader of the House of Commons, during the ITV Election Debate 2024 in London, UK, June 13, 2024. File Image/Reuters

The rise of deepfake technology has led to a disturbing trend affecting women, including high-profile British politicians. Deepfake pornography, where images are digitally altered to create explicit content without the subject’s consent, has seen exponential growth.

A recent investigation by Channel 4 News uncovered that more than 400 digitally altered images of over 30 prominent UK politicians have surfaced on a sexually explicit website dedicated to the abuse and degradation of women.


Among the victims are Labour’s Deputy Leader Angela Rayner, UK Education Secretary Gillian Keegan, Conservative Commons Leader Penny Mordaunt, former UK Home Secretary Priti Patel, and Labour backbencher Stella Creasy. These images, created using both sophisticated AI technology and simpler methods like Photoshop, are part of a broader trend that has targeted female politicians and celebrities alike.

Angela Rayner, deputy leader of the Labour Party, speaks at a campaign event held by British opposition Labour Party leader Keir Starmer at The Royal Horticultural Halls in London, UK, June 29, 2024. File Image/Reuters

Dehenna Davison, who recently stood down as a Conservative MP, described the experience as “really strange” and “quite violating.” She spoke to Channel 4 News about the need for a robust AI regulatory framework to prevent “major problems” in the future. Stella Creasy echoed these sentiments, stating that the abuse is “all about power and control,” and expressed her revulsion upon learning about the images.

Isn’t this supposed to be illegal?

Despite the growing prevalence of deepfake pornography, creating such content remains legal in the UK. The Online Safety Act, whose relevant provisions took effect in January, made sharing non-consensual explicit images a criminal offence.

However, plans to ban the creation of deepfake porn were shelved when UK Prime Minister Rishi Sunak called for an early election. Political parties, including the Conservatives, Labour, Liberal Democrats, and Plaid Cymru, have pledged to reinstate these plans if they win the next election. The Scottish National Party (SNP) has indicated they would consider any proposed legislation carefully.


Internationally, efforts to combat deepfake pornography are also underway. In the US, Representative Alexandria Ocasio-Cortez is advocating for similar laws, having personally experienced the trauma of encountering a deepfake of herself. She warned that such abuse could lead to severe psychological harm and, in some instances, suicide.

What is the extent of the deepfake porn threat?

ESET UK conducted a survey that highlighted the public’s growing concern over deepfake pornography. The survey, which involved over 2,000 Brits, revealed that 50 per cent of respondents were worried about becoming victims, and 1 in 10 reported either being a victim or knowing someone who was.

The survey, conducted in December last year, also found that 61 per cent of women were particularly concerned about being targeted, compared to 45 per cent of men. The research underscored that deepfake pornography is a significant risk associated with sharing intimate content.


Despite this, 34 per cent of adults admitted to sending intimate images, of whom 58 per cent expressed regret. Among under-18s, 57 per cent were worried about becoming victims, and 12 per cent had sent intimate images.

ESET research also revealed that nearly half (48 per cent) of women who send explicit images do so under the age of 16, and 71 per cent under the age of 18. Representative Image/Pixabay

Jake Moore, global cybersecurity advisor at ESET, highlighted the persistence of risky online behaviours, noting that digital images are nearly impossible to delete, and deepfake technology makes it easier to create explicit content without the subject’s consent.

“Women are disproportionately targeted more often by abusers looking to exploit intimate images, and the prevalence of deepfake technology removes the need for women to take the intimate images themselves. We’re urging the government to look beyond the Online Safety Act and address this critical risk to women’s safety and security,” he stated.

What is the psychological impact on victims?

The psychological impact of being a victim of deepfake pornography can be profound. The ESET survey revealed that nearly half of those who had their intimate images misused felt embarrassment or shame.


Women are disproportionately targeted, with a third of those surveyed reporting misuse of their explicit images. Of these, 25 per cent were threatened with the publication of these images, and 28 per cent had their photos posted publicly without permission.

Despite the legal provisions, women remain reluctant to seek help, with only 28 per cent indicating they would contact the police if their images were misused. This reluctance highlights the need for increased awareness and support for victims of such abuse.

How can you protect yourself from becoming a victim?

To mitigate the risk of falling victim to deepfake pornography, individuals are advised to:

  • Set social media accounts to private and be cautious about whom they allow to follow them.

  • Engage in conversations about online safety and the risks of sharing intimate images.

  • Avoid sharing images that include identifiable features like faces, tattoos, or recognisable backgrounds.

  • Report instances of deepfake pornography to social media platforms and law enforcement.

The proliferation of deepfake pornography poses a significant threat to privacy, dignity, and mental health, particularly for women. As deepfake technology continues to evolve, so too must strategies for protecting individuals from its misuse.

With inputs from agencies