Trump signs a bill making non-consensual AI deepfake pornography a criminal offense.


Source: Cointelegraph
Original: “Trump Signs Bill Criminalizing Nonconsensual AI Deepfake Porn”

U.S. President Donald Trump has signed a bill that criminalizes non-consensual AI-generated deepfake pornography and requires websites to remove such illegal images within 48 hours.

Trump officially signed the bill, known as the TAKE IT DOWN Act and formally titled the "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act," into law on May 19.

This legislation, strongly supported by First Lady Melania Trump, explicitly classifies publishing or threatening to publish non-consensual intimate images of adults or minors (including deepfakes) as a federal crime, particularly when the perpetrator intends to cause harm or harassment. Offenders face penalties ranging from fines to imprisonment.

Under the bill, websites, online services, and applications must remove illegal content within 48 hours of a victim's request and establish a formal content takedown process.

In a speech in the White House Rose Garden, and later in a post on his social media platform Truth Social, Trump noted that the bill's scope also covers "AI-generated fake content," commonly referred to as deepfakes.

Melania Trump personally lobbied members of Congress to support the bill and, in an official statement, called the law's passage a "national victory."

She said, "Artificial intelligence and social media are the digital candy of the next generation—sweet, addictive, and meticulously designed to influence our children's cognitive development."

She further emphasized, "But unlike candy, these new technologies can be weaponized, can shape beliefs and, more concerningly, can affect emotions, potentially leading to fatal consequences."

Senators Ted Cruz and Amy Klobuchar jointly introduced the bill in June 2024, and it passed both chambers of Congress in April of this year.

Cases of deepfake technology being used for harmful purposes have grown in recent years. One widely publicized incident occurred in January 2024, when explicit deepfake images of pop singer Taylor Swift spread rapidly on the X platform.

In an emergency response, X temporarily blocked users from searching for Taylor Swift's name, while lawmakers pushed for legislation to classify the creation of such deepfake images as a criminal offense.

Internationally, countries such as the UK have taken the lead: the UK's Online Safety Act 2023 explicitly makes sharing deepfake pornographic content illegal.

A 2023 research report released by security tech startup Security Hero revealed a concerning fact: the vast majority of deepfake content published online is pornographic in nature, and 99% of individuals targeted by such content are women.

Related: Indonesian listed company DigiAsia's stock price soars 90%, plans to raise $100 million to purchase Bitcoin (BTC)

