submitted 7 months ago by [email protected] to c/[email protected]

A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”


[-] [email protected] 68 points 7 months ago* (last edited 7 months ago)

What a weird populist law tbh. There's already an established legal framework that covers this: defamation. Not a lawyer, but it seems like that should be applied instead of writing up some new memes.

They'll use this as an opportunity to sneak in more government spyware/control is my guess.

[-] [email protected] 14 points 7 months ago

It's not defamation. And the new law will likely fail to hold up to 1A scrutiny, if the description of it is accurate (it often is not, for multiple reasons, including that these bills generally change over time). This is more of a free speech issue than photoshopping someone's head onto someone else's nude body, because no real person's head or body is involved, just an inhumanly good artist drawing a nude. On top of that, the law punishes possession, not just creation.

An example question any judge is going to have for the prosecutor, if this goes to trial, is how the image the law bans is meaningfully different from writing a lurid description of what someone looks like naked without actually knowing. Can you imagine going to jail because you have in your pocket a note someone else wrote and handed you that describes Trump as having a small penis? Or a drawn image of Trump naked? Because that's what's being pitched here.

[-] [email protected] 6 points 7 months ago

It actually proposes "possession with the intention to distribute," which just shows what a meme law this is. How do you determine the intention to distribute for an image?

And I disagree with your take that this can't be defamation. Quick googling says the general consensus is that this would fall under the defamation family of laws, which makes absolute sense, since a deepfake is an intentional misrepresentation.

[-] [email protected] 2 points 7 months ago

When you find it broken down into individual baggies?

[-] [email protected] 1 points 7 months ago

When they find the scale too

[-] [email protected] 2 points 7 months ago

I guess if you have AI generate the Speaker of the House getting fucked in an alley full of trash while she holds money bags, it's then political satire and protected?

[-] [email protected] 4 points 7 months ago

Even better: Intentional infliction of emotional distress

There are business interests behind this. There is a push to turn a likeness (and voice, etc.) into intellectual property. This bill is not about protecting anyone from emotional distress or harm to their reputation. It is about requiring "consent", which can obviously be acquired with money (and commercial porn is an explicit exception). This bill would establish this new kind of IP in principle. It's a baby step, but still a step.

You can see in this thread that proposing to expand this to all deepfakes gets a lot of upvotes. Indeed, there are bills out there that go all the way and would even make "piracy" of this IP a federal crime.

Taylor Swift could be out there making music or having fun, while also making money from "her consent", i.e. by licensing her likeness. She could star in movies or make cameos by being deepfaked onto some nobody actor. She could license all sorts of YouTube channels. Or how about a webcam chat with Taylor? She could be an avatar for ChatGPT, or she could be deepfaked onto one of those Indian or Kenyan low-wage workers who do tech support now.

We are not quite there yet, technologically, but we will obviously get there soonish. Fakes in the past were just some pervs making fan art of a sort. Now the smell of money is in the air.

[-] [email protected] 3 points 7 months ago

This seems like the most likely scenario tbh. I'm not sure whether personal-likeness IP is a bad thing per se, but one thing is sure: it's not being done to "protect the kids".

[-] [email protected] 2 points 7 months ago

personal likeness IP is a bad thing

It is. It means that famous people (or their heirs, or maybe just the rights-owner) can make even more money from their fame without having to do extra work. That should be opposed on principle.

The extra money for the licensing fees has to come from somewhere. The only place it can come from is working people.

It would mean more inequality; more entrenchment of the current elite. I see no benefit to society.

[-] [email protected] 1 points 7 months ago

Not necessarily. I'm optimistic that this could lead to empowering status and personality as main resources and pushing money out of society.

[-] [email protected] 1 points 7 months ago

How so? Fame is already a monetizable resource. The main changes that I see are that 1) no opportunity to show their face and make their voice heard needs to be missed for lack of time, and 2) age no longer needs to be a problem.

[-] [email protected] -3 points 7 months ago* (last edited 7 months ago)

When you steal a person's likeness for profit or defame them, then that's a CIVIL matter.

This bill will make AI sexualization a CRIMINAL matter.

[-] [email protected] 1 points 7 months ago

Where do you see that?

The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without their consent, letting victims collect financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.

[-] [email protected] 1 points 7 months ago* (last edited 7 months ago)

Here:

A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence.

[-] [email protected] 0 points 7 months ago

That doesn't seem to be correct. More like loose wording, as "criminalize" =/= criminal law.

this post was submitted on 31 Jan 2024
255 points (93.5% liked)
