The creation of deepfake pornography featuring Taylor Swift leads to renewed demands for legislation in the US.

The widespread circulation of fake pornographic images of Taylor Swift online has sparked renewed demands, including from US lawmakers, to make the practice illegal. Such deepfakes use artificial intelligence to generate realistic but entirely fabricated explicit images of real people.

Millions of people viewed the images of the American pop star on social media this week. They were originally shared on the messaging app Telegram, and one post on X was viewed 47 million times before it was taken down.

X said its teams were removing all identified images and taking appropriate action against the accounts responsible for posting them.

Yvette D Clarke, a Democratic congresswoman for New York, wrote on X that what happened to Taylor Swift was nothing new: for years, women have been targeted by deepfakes without their consent, and advances in AI have made them easier and cheaper to produce. She called it an issue that should unite both parties, and even Swift's fans, behind a solution.

Several US states have passed their own laws against deepfakes, but there is a growing push for federal legislation.

In May 2023, Democratic congressman Joseph Morelle unveiled the proposed Preventing Deepfakes of Intimate Images Act, which would make it illegal to share deepfake pornography without consent. Morelle said the images and videos “can cause irrevocable emotional, financial, and reputational harm – and unfortunately, women are disproportionately impacted.”

He tweeted his disapproval of the Swift images, calling them "sexual exploitation". His bill, however, has yet to become law.

Tom Kean Jr, a Republican congressman, said AI technology is advancing faster than the regulations needed to govern it, and that safeguards must be established so that anyone, celebrity or not, is protected from malicious uses of AI. Kean has partnered with Morelle on his bill and has also introduced his own AI Labeling Act, which would require all AI-generated content, including seemingly innocuous chatbots used in customer service, to be clearly labelled.

Swift had not commented publicly on the images at the time of publication, and her US publicist did not respond to a request for comment.

Doctored videos and recordings have been used to impersonate prominent figures, from politicians such as Donald Trump and Joe Biden to entertainers such as Drake and the Weeknd. In October 2023, Tom Hanks warned his Instagram followers not to be taken in by a fake dental advert using his image.

But the technology is disproportionately aimed at women, and often in a sexually exploitative way. A 2019 report by DeepTrace Labs, cited in the proposed US legislation, found that 96% of deepfake videos were non-consensual pornography.

The problem has only worsened since 2019. Crude fake pornography, made by pasting an unwilling person's face into an existing pornographic image with photo-editing software, has long been an issue. Generative AI poses a new challenge, however, because it can produce convincing images from scratch using nothing more than a few text prompts.

Prominent women are especially vulnerable. In 2018, Scarlett Johansson spoke about the fake pornography circulating with her image: "I have unfortunately encountered this issue countless times. The reality is that shielding oneself from the internet and its depraved nature is essentially a futile effort, for the most part."

In December 2022, the UK government announced that non-consensual deepfake pornography, along with other explicit images taken or shared without consent, including so-called "downblouse" photos, would be criminalised under the Online Safety Bill.

Dominic Raab, deputy prime minister at the time, said more must be done to protect women and girls from people who take or manipulate intimate photos in order to harass or humiliate them, and that the changes would give authorities the powers needed to hold perpetrators to account and keep women and girls safe from this form of abuse.

Source: theguardian.com