Minnesota Moves to Ban AI Apps That Generate Fake Nude Images

Decrypt
3 hours ago

Minnesota lawmakers have passed a bill aimed at stopping a growing form of AI abuse by targeting the platforms that enable it.


On Thursday, the Minnesota Senate voted 65-0 to pass House File 1606, sending it to Governor Tim Walz for his signature. The measure bars websites and apps from offering tools that generate realistic fake nude images of identifiable people.


Under the bill, companies that control a website, app, or software service cannot allow users to access or use tools to create these images or generate them on a user’s behalf. Advertising or promoting such services is also prohibited.


The measure allows victims to sue the people or companies that operate or control nudification tools, such as websites, apps, or software that generate fake nude images. People depicted in AI-generated nude images can seek damages, including for mental anguish, and courts can award up to three times the actual damages, along with punitive damages, attorney fees, and orders to stop the conduct.

The bill also gives the state attorney general the power to enforce the law, with civil penalties of up to $500,000 per use. According to the bill, those penalties are directed into the state’s general fund and then appropriated to victim services, including support for survivors of sexual assault, domestic violence, and child abuse.


The bill targets tools that require little technical expertise and have become widely accessible, including to minors. If signed, the law takes effect August 1 and applies to new cases from that date forward.


While the new bill doesn’t reference a single AI developer, it follows a series of high-profile incidents on the social platform X, including one in August 2025 in which Elon Musk’s xAI tool, Grok, generated nude deepfakes of Taylor Swift. The pop superstar moved to trademark her voice and likeness with the U.S. Patent and Trademark Office in April, perhaps to head off future AI reproductions.


Musk is also facing mounting legal pressure, including a federal class action lawsuit filed by three Tennessee minors alleging Grok generated child sexual abuse material from their images. A separate consumer protection lawsuit from the city of Baltimore claims the company knowingly deployed a system that produces and spreads nonconsensual sexualized content, including of minors.


Public Citizen co-president Robert Weissman said the spread of these tools reflects how quickly AI has lowered the barrier to creating nonconsensual intimate imagery and expanded its reach.


“These apps are 99% targeting women, over 90% of whom are under 18. It’s a tool of intimidation and harassment of women with really severe psychological consequences,” Weissman told Decrypt. “You’ve seen this across the country and the world. So the need for government intervention and regulation is acute.”


Weissman added that state-level laws can play a role alongside federal efforts, especially when it comes to enforcement. He said local authorities may be better positioned to act quickly in individual cases, while federal agencies may not prioritize or pursue them at all.


The Minnesota law also comes during an ongoing fight between President Donald Trump’s administration and states over who should control AI regulation. The Take It Down Act, signed into law by Trump in May 2025, criminalizes the distribution of nonconsensual intimate images and provides victims a path to seek civil damages.


“I think having complementary federal and state standards is positive, particularly in theory. We’re talking about different enforcement systems and enforcement agencies,” Weissman said. “So you might have a federal standard, but you might not have federal capacity to do enforcement actions.”


The office of Governor Walz did not immediately respond to Decrypt’s request for comment.


