This post was submitted on 08 Oct 2024
352 points (91.5% liked)

Technology


Well, this just got darker.

top 50 comments
[–] [email protected] 255 points 1 week ago (12 children)

This isn't surprising; it's inevitable.

If you folks knew how common pedophilic fantasies are amongst the general public, you would be shocked. Just look at cultures like Japan and Russia that don't strongly condemn such things, and you'll find it's about 15% of the population. It only seems less common in the West because of the near-homicidal stigma attached to it, which makes people vigorously hide that part of themselves.

Fortunately, this also shows that the vast majority of those people don't offend.

We also tend to define pedophilia as "anything sexual involving a minor" while reacting to it as if it means "violent rape of a toddler." So, no shit, we sexualize youth all the time; the 18-year mark is a legal and social formality, not a hard limit on human attraction. Adults will find themselves attracted to teens, and they won't reveal that, because who the fuck ever would?

If anything, the issue isn't that people have these attractions and fantasies; it's that some portion of those people can't separate fantasy from reality and are willing to hurt a child to get what they want, or they are sociopaths who consume child porn without feeling disgust at witnessing horrific child abuse.

[–] [email protected] 71 points 1 week ago (10 children)

I think the common incest fantasy in the West isn't too far removed from this either. All the actors are above age minimums, but they pretend to be step-kids or babysitters, as if those roles aren't commonly associated with children and older teens. It's clearly a form of deflection IMO.

[–] [email protected] 166 points 1 week ago (36 children)

I actually don't think this is shocking or something that needs to be "investigated." Other than the sketchy website that doesn't secure users' data, that is.

Actual child abuse and grooming happen on social media, on chat services, and in local churches, not in a one-on-one between a user and an LLM.

[–] [email protected] 4 points 6 days ago (1 children)

Why tf are there so many people okay with people roleplaying child sexual abuse AT ALL??? Real or fake, KEEP AN EYE ON ALL OF THEM.

I don't care if it's a real child or a fucking bot; that shit is disgusting, and AI is the reason some pedos are able to generate CP without having to actually get their hands on children.

The fact that someone will look at this and go "Yea but what about the REAL child rapists huh??" is astounding. Mfcker, if a grown-ass adult is willing to make a bot that is prompted to act like a horny toddler, then what exactly is stopping them from looking at real children that way?

Keep in mind, I'm not talking about Lolicon, fuck that. I'm talking about people generating images of realistic or semi-realistic children to use as profiles for sex bots. I'm talking about AI. I'VE ACTUALLY SEEN PEOPLE DO THIS; someone actually did this with my character recently. They took the likeness of my character and began generating porn with it using prompts like "Getting fcked on the playground, wet psy, little girl, 6 year old, 2 children on playground, slut..."

Digital or not, this shit still affects people; it affects people like me. These assholes deserve to be investigated for even attempting this kinda shit on the clearnet.

And before you ask, the character that belonged to me looks really young because I look really young. I've got severe ADHD, which makes me mentally stunted or childish, and that gets reflected in my OCs and fursonas. This person took a persona, an extension of me PERSONALLY, lowered her age on purpose, and made porn of her. That fuckin hurts, dude. Especially after I've spoken about how close these characters are to me. I'm aware it could be a troll, but honest to god, the prompt they used was extremely specific and detailed. Some loser online drawing Kanna's feet hurts me way less than someone using AI to generate faux CP and then roleplay with those same bots or prompts. What hurts me more is that there are no restrictions on some AIs to stop people from generating images like this. I don't wanna see shit like this become commonplace or "fine" to do. Keep tabs on individuals like this, because they VERY WELL could be using the likeness/faces of REAL children for AI CP, and that's just as bad.

[–] [email protected] 1 points 6 days ago (1 children)

I'm not talking about Lolicon, fuck that.

I think that this is ironic and a poor choice of words. It's almost a pun.

[–] [email protected] 1 points 6 days ago

DAMN- YOU RIGHT 😭

[–] [email protected] 27 points 1 week ago

It's the "burn that witch" reaction.

See how they hate pedophiles and not child rapists.

The crowd wants to feel its power by condemning (and lynching if possible) someone.

I'd rather investigate those calling for "investigation" and further violations of the privacy of people who, for all we know, have committed no crime.

That's about freedom of speech, yelling "fire" in a crowded theater, and Thousand Hills radio; you know the argument.

[–] [email protected] 72 points 1 week ago

insert surprised pikachu face here

[–] [email protected] 35 points 1 week ago (1 children)

Wait… so you mean to tell me that predatory simps are using AI incorrectly? Man… if only someone had called this years ago, something could have been done to minimize it!

Who knew that unchecked growth could lead to negative results?!

[–] [email protected] 26 points 1 week ago (6 children)

But they did; AI Dungeon got nerfed so badly you could only have happy adventures with it.

[–] [email protected] 31 points 1 week ago

Not at all surprising, but also, it is an AI.

[–] [email protected] 29 points 1 week ago (13 children)

Ain't that what the tools are there for? I mean, I don't like CP and I don't want to engage in any way with people who like it. But I use those LLMs to describe fantasies that I wouldn't even talk about with other humans.

As long as they don't do it on real humans nobody is hurt.

[–] [email protected] 7 points 6 days ago* (last edited 6 days ago) (1 children)

As long as they don’t do it on real humans nobody is hurt.

Living out the fantasy of having sex with children (or other harmful sexual practices and fantasies) with AI or the like can strengthen the wish to actually do it in real life. It can weaken the resolve to abstain. If you constantly have fantasies where, for example, "the child AI wanted it too," it can desensitize you and make it harder and harder to push that thought aside when you're in a tempting situation. Instead of replacing the real thing with a fantasy, you are preparing for the real thing. Some pedophiles already interpret children's behavior as sexual when it isn't at all, and the AI might be told to act that way, strengthening those beliefs.

This is still something that has to be studied more before it's fully understood. Of course, that is difficult because of the stigma. There might be differences between people who are only attracted to children and those who are attracted to both adults and children, and there just isn't enough data yet. Even in the communities where pedophiles who do not act on their attraction discuss coping strategies, this is heavily debated and controversial.

If you are interested in the subject a bit more, this is a start: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8419289/

[–] [email protected] -2 points 6 days ago

Yeah that link is a no for me boss

[–] [email protected] 2 points 5 days ago

As a man who was descending to a dark place when I was a teen, I can say this with confidence:

This kind of content, like CP or r*pe-y stuff, even if clearly not real and only a fantasy, feeds these desires, and makes them grow. In time, if you continue to foster it, they will bleed into real life, and then it becomes a real problem. That's why this kind of stuff is scary.

Thankfully, I was able to spot this pattern before it became a problem. It's a dangerous slippery slope.

[–] [email protected] -1 points 6 days ago

Wrong. Allowing these desires to fester, or, as you suggest, actively seeking out fulfillment for them, is not good for anyone. It's not good for the pedophiles, because it will increase the need to fulfill their illegal desires, and it won't help kids, obviously, because it emboldens pedophiles.

Have you ever experienced something you like and said to yourself, "definitely not doing more next time"?

[–] [email protected] 28 points 1 week ago (1 children)

A bit off topic, but from my understanding, the US currently doesn't have a single federal agency responsible for AI regulation. However, there is an agency for child abuse protection: the National Center on Child Abuse and Neglect within the Department of HHS.

If AI girlfriends generating CSAM is how we get AI regulation in the US, I'd be equally surprised and appalled
