Exactly. If you can't name a victim, it shouldn't be illegal.
The problem with AI CSAM generation is that the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well... You need to feed it CSAM.
So is it right to be using images of real children to train these AI? You'd be hard-pressed to find someone who thinks that's okay.
You make the assumption that the person generating the images also trained the AI model. You also make assumptions about how the AI was trained without knowing anything about the model.
Are there any guarantees that harmful images weren't used in these AI models? Based on how image generation works now, it's very likely that harmful images were included in the training data.
And if a person is using a model based on harmful training data, they should be held responsible.
However, the AI owner/trainer has even more responsibility in perpetuating harm to children and should be prosecuted appropriately.
I will have to disagree with you for several reasons.
The difference between the things you're listing and CSAM is that those other things have actual utility outside of getting off. Were our phones made with human suffering? Probably, but phones have many more uses than making someone cum. Are all those things wrong? Yeah, but at least some good came out of them beyond giving people sexual gratification directly from the harm of others.
The topic that you're choosing to focus on is really interesting. What are your values?
My values are none of your business. Try attacking my arguments instead of looking for something about me to attack.
At the root of it, beliefs aren't based on logic; they're based on your value system. So why dance around the actual topic?
If everywhere you go, everyone is abnormal, I have news for you...
Lol, highly doubt it. These AI assholes pretend that all the training data randomly fell into the model (off the back of a truck) and that they cannot possibly be held responsible for that or know anything about it because they were too busy innovating.
There's no guarantee that most regular porn sites don't contain csam or other exploitative imagery and video (sex trafficking victims). There's absolutely zero chance that there's any kind of guarantee.
First of all, not every image of a naked child is CSAM. This has actually been kind of a problem, with automated CSAM detection systems triggering false positives on non-sexual images and getting innocent people into trouble.
But also, AI systems can blend multiple elements together. They don't need CSAM training material to create CSAM, just the individual elements crafted into a prompt sufficient to create the image while avoiding any safeguards.
You ignored the second part of their post. Even if it didn't use any CSAM, is it right to use pictures of real children to generate CSAM? I really don't think it is.
There are probably safeguards in place to prevent the creation of CSAM, just like there are for other illegal and offensive things, but determined people work around them.
If the images were generated from CSAM, then there's a victim. If they weren't, there's no victim.
The images were created using photos of real children, even if said photos weren't CSAM (and there's no guarantee they weren't). So the victims are the children whose photos were used to generate the CSAM.
Sure, but isn't the perpetrator the company that trained the model without their permission? If a doctor saves someone's life using knowledge based on Nazi medical experiments, then surely the doctor isn't responsible for the crimes?
So is the car manufacturer responsible if someone drives their car into the sidewalk to kill some people?
Your analogy doesn't match the premise. (Again, assuming there is no CSAM in the training data, which is unlikely.) The training data is not the problem; it's how the data is used. Using those same pictures to generate photos of medieval kids eating ice cream with their family is fine. Using them to make CSAM is not.
It would be more like the doctor using the Nazi experiments to do some other fucked-up experiments.
(Also you posted your response like 5 times)
Sorry, my app glitched out and posted my comment multiple times, which got me banned for spamming... Now that I'm unbanned, I can reply.
In this scenario no, because the crime was in how someone used the car, not in the creation of the car. The guy in this story did commit a crime, but for other reasons. I'm just saying that if you are claiming that children in the training data are victims of some crime, then that crime was committed when training the model. They obviously didn't agree for their photos to be used that way, and most likely didn't agree for their photos to be used for AI training at all. So by the time this guy came around, they were already victims, and would still be victims if he didn't.
I would argue that the person using the model for that purpose is further victimizing the children. Kinda like how with revenge porn the worst perpetrator is the person who uploaded the content, but every person viewing it from there is furthering the victimization. It is mentally damaging for the victim of revenge porn to know that their intimate videos are being seen/sought out.
Let's do a thought experiment, and I'd like you to tell me at what point a victim was introduced:
Or with AI:
At what point did my actions victimize someone?
If I distributed those images and those images resemble a real person, then that real person is potentially a victim.
I will say someone who does this is creepy and I don't want them anywhere near children (especially mine, and yes, I have kids), but I don't think it should be illegal, provided the source material is legal. But as soon as I distribute it, there absolutely could be a victim. Being creepy shouldn't be a crime.
I think it should be illegal to make porn of a person without their permission, regardless of whether it was shared or not. Imagine the person it is based on finds out someone is doing that. That causes mental strain on the person, just like how revenge porn doesn't actively harm a person but causes mental strain (both the initial upload and the continued viewing of it). For scenario 1, it would be at step 2, when the porn is made of the person. For scenario 2, it would be a mix between steps 3 and 4.
Thanks for sharing! I'm going to disagree with pretty much everything, so please stop reading here if you're not interested.
Sure, and there are plenty of things that can cause mental strain, but that doesn't make those things illegal. For example:
And so on. Those things aren't illegal, but someone could experience mental strain from them. Experiencing that doesn't make you a victim, it just means you experience it.
Revenge porn damages someone's reputation, at the very least, which is a large part of why it's illegal.
Someone keeping those images for private use doesn't cause harm, therefore it shouldn't be illegal.
Someone doing something creepy for their own use should never be illegal.
I'm not one to stop because of disagreement. You're arguing in good faith, and that's all that matters imo.
I believe consent is a larger factor. The person who made it consented to have their photos/videos seen by that person but did not consent to them sharing it.
That's why it's not illegal to call someone a slut (even though that also damages reputation)
What if the recording was made without the person's consent? Say someone records their one-night stand without the other person's knowledge but doesn't share it with anyone. Should that be illegal?
Consent is certainly important, but they don't need your consent if the image was obtained legally and thus subject to fair use, or if you gave them permission in the past.
It can be, if that constitutes defamation or libel. A passing statement wouldn't, but a post on a popular website absolutely could. It all comes down to the damages that (false) statement caused.
That depends on whether there was a reasonable expectation of privacy. If it's in public, there's no reasonable expectation of privacy.
In general, I'd say intimacy likely occurs somewhere with a reasonable expectation of privacy, at which point it would come down to consent (whether implied or explicit).
If the person is a slut, it wouldn't be libel, but it would still damage their reputation. The statement being true doesn't stop it from damaging their reputation. And if you release a homemade video of a porn star, it would still be illegal even though it's not something that would damage their reputation.
The reason for the illegality is the lack of consent not the reputation damage.
Even in a one-party consent state, recording someone while you are having intercourse with them is illegal without their consent, because we make exceptions for especially sensitive subjects such as sex.
To go along with that I also believe that people who uploaded photos of themselves/their children did not consent to having their photos used to make sexual content. If they did it would be another matter to me entirely.
Edit: I also would like to say (and I really am sorry for bringing them into this) that, from what you said, you think it would be okay (not socially acceptable, but okay/fine) for someone to take pictures of your kids while they're at the park and use them to make porn. Really think about that. Is that something you think should be allowed? Imagine someone taking pictures of them at Walmart, and when you ask what they're doing, they straight up tell you, "I like how they look. I'm going to add them to my training data to make porn. Don't worry though, I'm not sharing it with anyone," and you could do jack shit about it without facing legal consequences yourself. You think that is okay?
Sure, in which case the person wouldn't legally be a victim. It's completely legal to tell the truth.
But that strays a bit from the point. Making fake porn of someone is a false representation of that person's character, and thus illegal, but only if it actually causes damage to their reputation (i.e. you distribute it). Or at least that's the line of argumentation I think someone would use in states where "revenge porn" isn't explicitly illegal.
Even if the person is a porn star, the damage is that the porn is coming from somewhere other than the approved channels. Or maybe it's lost sales. Regardless, there are actual, articulable damages.
Maybe in states where it's expressly illegal. I'm talking more from a theoretical standpoint where there isn't an explicit law against it.
If there's no explicit law, the standard is defamation/libel or violation of a reasonable expectation of privacy.
That's the reasonable expectation of privacy standard (that applies inside houses when in bedrooms, bathrooms, etc, even if it's not your house). If you're doing it in public, there's no reasonable expectation of privacy, so I think a court would consider filming in that context to be legal.
Then again, this could certainly vary by jurisdiction.
They don't need to consent to every use; if it's made available for personal use, then any individual can use it for personal use, even if that's sexual content. As long as they don't distribute it, they're fine to use it as they please.
If you want control over how your content is used, don't make it available for personal use.
Yes. I certainly don't want them to do that, but I really don't want to live in a society with the surveillance necessary to prosecute such a law. Someone being creepy with pictures of my kids is disgusting, but it honestly doesn't hurt me or my kids in any way, provided they don't share those images with anyone.
So yes, I think it's a necessary evil to have the kinds of privacy protections I think are valuable to have in a free society. Freedom means letting people do creepy things that don't hurt anyone else.
The damages would be the mental harm done to the victim. Most porn stars have content available for free, so that wouldn't be a reason for damages.
The expectation of privacy standard doesn't apply in one-party consent states, but you still can't record someone's sexual activities without their consent.
I don't think people who uploaded pictures on Facebook consider that making it available for personal use
Did I say anything about surveillance? Just because something is made illegal doesn't mean it's actively pursued; it just means that if someone gets caught or reported doing it, they can be stopped. Like you'd be able to stop the person from doing that to your children. Or if someone gets their house raided for something else, they can be charged for it. Not every person who has real CSAM creates it or shares it; many times they just get caught on another charge and then it gets found. Or the Geek Squad worker sees it on their computer and reports them.
It would give people avenues to stop others from using photos of their children in such a way. You wouldn't need any extra surveillance
Do you think it's okay for someone to have real CSAM? Let's say the person who made it was properly prosecuted, and the person who has the images/videos doesn't share them, they just have them to use. Do you think that's okay?
Then they shouldn't have uploaded it to Facebook and made it publicly accessible.
It's the next logical step for the pearl clutchers and amounts to "thought crime."
These people aren't doing anything to my children, they're making their own images from images they have a right to use. It's super creepy and I'd probably pick a fight with them if I found out, but I don't think it should be illegal if there's no victim.
The Geek Squad worker could still report these people, and it would be the prosecution's job to prove that the images were acquired or created in an illegal way.
No, because that increases demand for child abuse. Those pictures are created by abusing children, and access to them encourages more child abuse to produce more content.
Possession itself isn't the problem, the problem is how they're produced.
I feel similarly about recreational drugs. Buying from dealers is bad because it encourages smuggling and everything related to it. I have no problem with weed or whatever; I have problems with the cartels. At least with drugs there's a simple solution: legalize it. I likewise want a legal avenue for these people who would otherwise participate in child abuse to not abuse children. Them looking at creepy AI content generated from pictures of my child doesn't hurt my child, just don't share those images or otherwise let me know about it.
I seriously doubt they would create any more surveillance for that than there already is for real CSAM.
That would just make it harder to prosecute people for CSAM, since they would all claim their material was just AI. That would end up helping child abusers get away with it.
I think the production of generated CSAM is unethical because it still involves photos of children without their consent
There is evidence to suggest that viewing CSAM increases child-seeking behavior, so viewing generated CSAM would most likely have the same or a similar result. That would mean that even just having access to the material would increase the likelihood of child abuse.
https://www.theguardian.com/global-development/2022/mar/01/online-sexual-abuse-viewers-contacting-children-directly-study
The survey was self-reported, so the reality is probably higher than the 42% cited in the study.
The best legal avenue for non-offending pedophiles to take is for them to find a psychologist that can help them work through their desires. Not to give them a thing that will make them want to offend even more.
That's true, and an unfortunate part of preserving freedoms. That said, if someone is actually abusing children on the regular, police have a way of tracking that individual to catch them: investigations.
I wish police had to do them more often instead of leaving that job to the prosecution. If that means we need to pull officers away from other important duties like arresting black men for possessing a joint or pulling people over for speeding on an empty highway, I guess that's what we have to do.
It involves legally acquired images and is protected under "fair use" laws. You don't need my permission to exercise your fair use rights, even if I think your use is disgusting. It's not my business. But if you make it my business (i.e. you tell me), I may choose to assault you and hope the courts will side with me that they constitute "fighting words."
Just because something is disgusting doesn't make it illegal.
As for that article:
It doesn't prove anything; what it does is draw a correlation between people who search for CSAM on the dark web and are willing to answer a survey (a pretty niche group) and a self-reported inclination to contact children. Correlation isn't proof, it's correlation.
That said, I don't know if a better study could or should be conducted. Maybe survey people caught contacting children (sting operations) and those caught just distributing CSAM without child contact. We need to know the difference between those who progress to contact and those who don't, and I don't think this survey provides that.
I agree, and I think that should be widely accessible.
That said, I don't think giving people a criminal record helps. If they need to be locked up to protect the public (i.e. there are actual victims), then let's lock them up. But otherwise, we absolutely shouldn't. Let's make help available and push people toward getting that help.
I hate the no victim argument.
Why? Can you elaborate?
Not necessarily
You don't. You just need lists of other things, properly tagged. If you feed an AI a bunch of clothed adults and a bunch of naked adults, it will, in theory, "understand" the difference between being clothed and naked and create any of its clothed adults, naked.
With that initial set above, you feed it a bunch of clothed children. When you ask for a naked child, it will either produce a child's head on a naked adult body, or a "weird" naked child. It "understands" that adult and child are different things, that clothed and naked are different things, and it tries to infer what "naked child" looks like from what it "knows".
This is the real question and one I don't know the answer to, because it will boil down to consent to being part of a training model, whether your own as an adult, or a child's parent, much like how it works for stock photos and videos.
"I consent to having my likeness used for AI training models, except for any use that involves NSFW content" - Fair enough. Good luck enforcing that.