this post was submitted on 23 Jun 2024
270 points (92.7% liked)

Technology

This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


[–] [email protected] 118 points 4 months ago (4 children)

His record should be expunged when he turns 18 because it was a crime he committed as a child. I understand their frustrations, but they're asking to jail a child over some photoshopped images.

Making a deepfake is definitely not a crime serious enough to deserve jail time or a permanent record unless an adult did it.

[–] [email protected] 41 points 4 months ago* (last edited 4 months ago) (4 children)

My personal belief still is that the prohibitive approach is futile and ultimately more harmful than the alternative: embrace the technology, promote it and create deepfakes of everyone.

Soon the taboo will be gone, the appeal as well, and everyone will have plausible deniability too, because if there are dozens of fake nudes of any given person then who is to say which are real, and why does it even matter at that point?

This would be a great opportunity to advance our societal values and morals beyond prudish notions, but instead we double down on them.

E: just to clarify, I do not at all want to endorse creating nudity of minors here. I'm just pointing out that the girl in the article wouldn't have to humiliate herself trying to do damage control in the above scenario, because it would be entirely unimportant.

[–] [email protected] 61 points 4 months ago (2 children)

While I think removing the stigma associated with having deepfakes made of you is important, I don't think that desensitization through exposure is the way to go about it. That will cause a lot of damage leading up to the point you're trying to reach.

[–] [email protected] 5 points 4 months ago (1 children)

I don't see how else you'd do it.

"Removing the stigma" is desensitizing by definition. So you want to desensitize through... what? Education?

[–] [email protected] 17 points 4 months ago (1 children)

I dunno, but preferably some method which doesn't involve a bunch of children committing suicide in the meantime.

[–] [email protected] 12 points 4 months ago

As a child protection caseworker, I'm right here with you. Self-harm and suicidal ideation over this stuff are distressingly common among the children and young people I'm working with. Sadly, it's almost all girls who are targeted, and it's just another way to push misogyny into the next generation. Desensitisation isn't the way; it will absolutely cause too much harm before it equalises.

[–] [email protected] 23 points 4 months ago (1 children)

This sounds like a cool idea because it is a novel approach, and it appeals to my general heuristic of the inevitability of technology and freedom. However, I don't think it's actually a good idea. People are entitled to privacy, on this I hope we agree -- and I believe this is because of something more fundamental: people are entitled to dignity. If you think we'll reach a point in this lifetime where it will be too commonplace to be a threat to someone's dignity, I just don't agree.

Not saying the solution is to ban the technology though.

[–] [email protected] 16 points 4 months ago (1 children)

When you put out photos of yourself on the internet you should expect anyone to find them and do whatever they want to them. If you aren't expecting that, then you aren't educated enough about how the internet works, and that's what we should be working on. Social media is really bad for privacy, and many people are not aware of it.

Now if someone took a picture of you and then edited it without your consent, that is a different action and a far more serious offense.

Either way, deepfakes are just an evolution of something that already existed before and isn't going away anytime soon.

[–] [email protected] 8 points 4 months ago* (last edited 4 months ago) (1 children)

Yeah, I mean it's basically just an easier-to-use Photoshop.

I agree people need to understand better the privacy risks of social media.

When you put out photos of yourself on the internet you should expect anyone to find them and do whatever they want to them.

Expect it, yeah, I guess. That doesn't mean we should tolerate it. I expect murder to happen on a daily basis. People editing images of me on their own devices and keeping that to themselves, that's their business. But if they edit photos of me and spread them around, I think it becomes my business. Fortunately, there are no photos of me on the internet.

Edit: I basically agree with you regarding text content. I'm not sure why I feel different about images of me. Maybe because it's a fingerprint. I don't mind so much people editing pictures I post that don't include my face. Hmm.

[–] [email protected] 2 points 4 months ago (2 children)

Yeah, I mean it's basically just an easier-to-use Photoshop.

Photoshop has the same technology baked into it now. Sure, it has "safeguards" so it may not generate nudes, but it would have no trouble depicting someone "having dinner with Bill Cosby" or whatever you feel is reputation destroying.

[–] [email protected] 5 points 4 months ago

Pretty sure they're talking about generative AI created deepfakes being easier than manually cutting out someone's face and pasting it on a photo of a naked person, not comparing Adobe's AI to a different model.

[–] [email protected] 1 points 4 months ago

OK, then it's an easier-to-use GIMP.

[–] [email protected] 19 points 4 months ago

It's also worth noting that too many people put out way too much imagery of themselves online. People have got to start expecting that anything you put out in public effectively becomes public domain.

[–] [email protected] 12 points 4 months ago

I second this motion. People also need to stop posting images of themselves all over the web. Especially of their own kids. Parents plastering their kids' images all over social media should not be condoned.

And on a related note we need much better sex-education in this country and a much healthier relationship with nudity.

[–] [email protected] 15 points 4 months ago (3 children)

You don't turn 18 and magically discover your actions have consequences.

"Not a heavy crime"? I'll introduce you to Sarah, Marie and Olivia. You can tell them it was just a joke. You can tell them the comments they've received as a result are just jokes. The catcalling, mentions that their nipples look awesome, that their pussies look nice, etc are just jokes. All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it's not that bad? The fuck is wrong with you?

[–] [email protected] 36 points 4 months ago (16 children)

Kids are kids until 18 because people mature at different rates. At 18 it is safe to assume most have matured enough. This kid could be 18 mentally, but he could also be 13 mentally.

Why are you trying emotional manipulation in order to justify punishing this one kid as if he was an adult?

Here, let me show you what you just did. Let me introduce you to Steve. His life was ruined because he made a deepfake of a girl he likes and sent it to his friend, but he shouldn't have trusted that friend, because the deepfake then found itself on every phone in class. Steve got a 3 year sentence, forcing early dropout, and due to his permanent mark, he would forever be grouped with rapists and could never find a job. He killed himself at 21. And you claim it's not that bad? The fuck is wrong with you?

[–] [email protected] 1 points 4 months ago

I don't think maturity is a binary thing. I would be OK with the presumption that the age of 18 provides a general expected range of maturity between individuals; it's when you start to develop your world view and really pick up on the smaller things in life and how they work together to make a functional system.

I think putting a hard "line" on it is wrong; it's better to describe it as "this is generally what you expect from this subset."

[–] [email protected] 17 points 4 months ago* (last edited 4 months ago) (1 children)

Perhaps at least a small portion of the blame for what these girls are going through should be laid upon the society which obstinately teaches that a woman's worth as a person is so inextricably tied to her willingness and ability to maintain the privacy of her areolas and vulva that the mere appearance of having failed in the endeavour is treated as a valid reason to disregard her humanity.

[–] [email protected] -1 points 4 months ago (1 children)

All 3 of them are changing schools, 2 are failing their years. Because of something someone else did to them. And you claim it’s not that bad? The fuck is wrong with you?

And by the time they're 18 and moving on to college or whatever, they're probably busy not fucking worrying about whatever happened in high school, because at the end of the day you have two options here:

1: Be a miserable fuck.

2: Try to be the least miserable fuck you can, and do something productive.

Generally people pick the second option.

And besides, at the end of the day, it's literally not real; none of this exists. It's better than having your nudes leaked. Should we execute children who spread nudes of other children now? That's a far WORSE crime, because now that material is just out there, it's almost definitely on the internet, AND IT'S REAL.

Seems to me like you're unintentionally nullifying the consequences of actual CSAM here.

Is my comment a little silly and excessive? Yes, that was my point. It's satire.

[–] [email protected] 2 points 4 months ago (1 children)

Victims of trauma don't just forget because time passes. They graduate (or don't) and move on in their lives, but the lingering effects of that traumatic experience shape the way they look at the world: whether they can trust, body dysphoria, whether they can form long-lasting relationships, and other long-lasting trauma responses. Time does not heal the wounds of trauma; they remain as scars that stay vulnerable forever (unless the victim takes deliberate action to dismantle the cognitive structure formed by the traumatic event).

[–] [email protected] 1 points 4 months ago

Yeah, but we're also talking about something that quite literally never happened; it was all manufactured. And I don't want to downplay the effects of that.

This is probably the best time ever to start being an e-slut, because you can just say it was deepfaked and people don't exactly have a reason to disagree with you.

Also, while trauma is permanent, I would like to remind you that every life experience you have is also permanent: it cannot be changed, it cannot be undone, it cannot be revoked. You simply have to live with it. The only thing that changes your experience and the memories around it is how you handle it internally.

I would probably be more compassionate with you if we were literally talking about revenge porn, or whatever the correct term would be here; I'm not sure, I don't exactly fuck people on the regular, so I'm not really qualified here lmao.

But like I said, this is just AI-generated. Everyone knows about AI now; how many people do you think are going to hear that and go "yeah, that makes sense"? Probably most of them. Highschoolers might be a bit more unreasonable, but nothing changes the fact that the images simply aren't real. You just have to do your best to dissociate yourself from that alternate reality where they are, because they quite literally are not.

Some people would consider it traumatic; others wouldn't. I wouldn't give a shit either way; I might even further the rumors because I think it would be funny. It's all a matter of perspective.

[–] [email protected] 5 points 4 months ago

Using this idea would give minors a sense of complete impunity when committing crimes. I don't think you have any sort of morals if you support it, but that's a question for your local law enforcement. The crime in question can seriously damage the mental health of the victim and be a reason for severe discrimination. Older minors should be responsible for their actions too.