752 points
submitted 5 months ago by Stopthatgirl7@lemmy.world to c/news@lemmy.world

The Senate unanimously passed a bipartisan bill to provide recourse to victims of porn deepfakes: sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress’ upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), along with Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive deepfake pornography if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

[-] Asifall@lemmy.world 9 points 5 months ago* (last edited 5 months ago)

Not convinced on this one

It seems like the bill is being pitched as protecting women who have fake nudes passed around their school, but the text of the bill seems more aimed at the Taylor Swift case.

1. The bill only applies where there is an “intent to distribute”

2. The bill talks about damages being calculated based on the profit of the defendant

The bill also states that you can’t avoid running afoul of this law by labeling the image as AI-generated or by relying on the context of publication. That seems at odds with the First Amendment.

[-] UnderpantsWeevil@lemmy.world 6 points 5 months ago

The bill only applies where there is an “intent to distribute”

That's a predicate for any law bound to the Commerce Clause. You need to demonstrate the regulation is being applied to interstate traffic. Anything else would be limited to state/municipal regulations.

The bill talks about damages being calculated based on the profit of the defendant

That's arguably a better rule than the more traditional flat-fee penalties, as it curbs the impulse to treat violations as cost-of-business. A firm that makes $1B/year isn't going to blink at a handful of $1000 judgements.

The bill also states that you can’t label the image as AI generated or rely on the context of publication to avoid running afoul of this law.

A revenge-porn law that can be evaded by asserting "This isn't Taylor Swift, it's Tay Swiff, and any resemblance to an existing celebrity is purely coincidental" would be toothless. We already apply these rules to traditional animated assets. You'd be liable for producing an animated short starring "Definitely Not Mickey Mouse" under the same reasoning.

This doesn't prevent you from creating a unique art asset. And certainly there's a superabundance of original pornographic art models and porn models generated with the consent of the living model. The hitch here is obvious, though. You're presumed to own your own likeness.

My biggest complaint is that it only seems to apply to pornography. And I suspect we'll see people challenge the application of the law by producing "parody" porn or "news commentary" porn. What the SCOTUS does with that remains to be seen.

[-] Asifall@lemmy.world 1 points 5 months ago

That’s arguably a better rule than the more traditional flat-fee penalties, as it curbs the impulse to treat violations as cost-of-business. A firm that makes $1B/year isn’t going to blink at a handful of $1000 judgements.

No argument there, but it reinforces my point that this law is written for Taylor Swift and not a random high schooler.

You’d be liable for producing an animated short starring “Definitely Not Mickey Mouse” under the same reasoning.

Except that there are fair use exceptions specifically to prevent copyright law from running afoul of the First Amendment. You can see the parody exception used in many episodes of South Park, for example, even specifically to depict Mickey Mouse. Either this bill allows for those types of uses, in which case it’s toothless anyway, or it’s much more restrictive of speech than existing copyright law.

[-] UnderpantsWeevil@lemmy.world 1 points 5 months ago

written for Taylor Swift and not a random high schooler.

In a sane world, class action lawsuits would balance these scales.

there are fair use exceptions specifically to prevent copyright law from running afoul of the first amendment

Why would revenge porn constitute fair use? This seems more akin to slander.

[-] Asifall@lemmy.world 1 points 5 months ago

You keep referring to this as revenge porn, which to me means a case where someone spreads nudes around as a way to punish their current or former partner. You could use AI to generate material for revenge porn, but I bet most AI nudes are not that.

Think about a political comic showing a pro-corporate politician performing a sex act with Jeff Bezos. Clearly that would be protected speech. If you generate the same image with generative AI, though, it’s suddenly illegal even if you clearly label it as a parody. That’s the concern. Moreover, the slander/libel angle doesn’t make sense if you include a warning that the image is generated, as you are not making a false statement.

To sum up why I think this bill is kinda weird and likely to be ineffective: it’s perfectly legal for me to generate and distribute a fake AI video of my neighbor shooting a puppy as long as I don’t present it as a real video. If I generate the same video but my neighbor’s dick is hanging out, straight to jail. It’s not consistent.

[-] UnderpantsWeevil@lemmy.world -1 points 5 months ago

where someone spreads nudes around as a way to punish their current or former partner

I would consider, as an example, a student who creates a vulgar AI porn display of another student or teacher out of some sense of spite to be engaging in "revenge porn". Same with a coworker or boss trying to humiliate someone at the office.

Think about a political comic showing a pro-corporate politician performing a sex act with Jeff Bezos.

That's another good example. The Trump/Putin kissing mural is a great example of something that ends up being homophobic rather than partisan.

it’s perfectly legal for me to generate and distribute a fake ai video of my neighbor shooting a puppy

If you used it to slander your neighbor, it would not be legal.

[-] Asifall@lemmy.world 1 points 5 months ago

That’s another good example. The Trump/Putin kissing mural is a great example of something that ends up being homophobic rather than partisan.

So you think it should be illegal?

If you used it to slander your neighbor, it would not be legal.

You’re entirely ignoring my point: I’m not trying to pass the video off as real, therefore it’s not slander.

[-] UnderpantsWeevil@lemmy.world 0 points 5 months ago

So you think it should be illegal?

I think it's an example of partisan language that ends up being blandly homophobic.

You’re entirely ignoring my point

Why would putting up a giant sign reading "My neighbor murders dogs for fun" be a tort but a mural to the same effect be protected?

[-] 4lan@lemmy.world 0 points 5 months ago

Oh, boo hoo, you can't go into a movie theater and yell "fire." We are so oppressed.

this post was submitted on 26 Jul 2024
752 points (98.6% liked)
