this post was submitted on 24 Oct 2024
216 points (96.2% liked)

News

23214 readers
3179 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good faith argumentation only; accusing another user of being a bot or paid actor counts as bad faith. Trolling is uncivil and is grounds for removal and/or a community ban.


2. All posts should contain a source (url) that is as reliable and unbiased as possible and must only contain one link.


Obvious right- or left-wing sources will be removed at the mods' discretion. We have an actively updated blocklist, which you can see here: https://lemmy.world/post/2246130. If you feel any website is missing, contact the mods. Supporting links can be added in comments or posted separately, but not in the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the article used as source.


Posts whose titles don't match the source won't be removed immediately, but the autoMod will notify you, and if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you; just ignore it, we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the autoMod will leave a message. Please remove your post if the autoMod is correct. If the post that matches your post is very old, we refer you to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, credible sources must be provided.


9. No link shorteners.


The autoMod will contact you if a link shortener is detected; please delete your post if it is right.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 1 year ago
MODERATORS
top 39 comments
[–] [email protected] 10 points 2 hours ago (1 children)

I bet there are people who committed suicide after their Tamagotchi died. Jumping into the 'AI bad' narrative because of individual incidents like this is moronic. If you give a pillow to a million people, a few are going to suffocate on it. This is what happens when you scale something up enough, and it proves absolutely nothing.

The same logic applies to self-driving vehicles. We’ll likely never reach a point where accidents stop happening entirely. Even if we replaced every human-driven vehicle with a self-driving one that’s 10 times safer than a human, we’d still see 8 people dying because of them every day in the US alone. Imagine posting articles about those incidents and complaining they’re not 100% safe. What’s the alternative? Going back to human drivers and 80 deaths a day?

Yes, we should strive to improve. Yes, we should try to fix the issues that can be fixed. No, I'm not saying 'who cares' - and so on with the straw men I'm going to receive for this. All I'm saying is that we should be reasonable and use some damn common sense when reacting to these outrage-inducing, fear-mongering articles that are only after your attention and clicks.

[–] [email protected] 4 points 2 hours ago

All I’m saying is that we should be reasonable and use some damn common sense when reacting to these outrage-inducing, fear-mongering articles that are only after your attention and clicks.

Based and true.

[–] [email protected] 4 points 2 hours ago

A Florida mom

It’s always Florida.

[–] [email protected] 21 points 4 hours ago (1 children)

We are playing with some dark and powerful shit here.

We are social creatures. We’re primed to care about our social identity more than our own lives.

As the sociologist Brooke Harrington puts it, if there was an E = mc^2^ of social science, it would be SD > PD, “social death is more frightening than physical death.”

…yet we’re making technologies that tap into that sensitive mental circuitry.

And the cruel irony on top of it is:

Because we care so much about preserving our social status, we have a tendency to deny or downplay how vulnerable we all are to this kind of “obvious” manipulation.

Just think of how many people say “ads don’t affect me”.

<If I can find it, I’ll add a link to an interesting study on distracted driving and hands-free options. If I recall correctly, talking to someone in the passenger seat was only mildly distracting, but having the same conversation over a hands-free call was way more distracting. And their proposed explanation was that a passenger has the same context and naturally understands if the conversation needs to pause, but on a call there is no shared context so there is social pressure to keep going. Unnervingly, voice assistant interactions had the same problem. They trip the same part of our brain that worries about politeness at the expense of our physical safety.>

I’m worried we’re going to severely underestimate the extent to which this stuff warps our brains.

[–] [email protected] 10 points 3 hours ago (1 children)

I was going to make a joke about how my social status died over a decade ago, but then I realized that no, it didn't. It changed.

Instead of my social status being something amongst friends and classmates, it's now coworkers, managers, and clients. A death in the social part of my world - work - would be so devastating that it motivates me to suffer just a little bit more. Losing my job would end a lot of things for me.

I need to reevaluate my life

[–] [email protected] 2 points 1 hour ago

What we need is a human society predicated on affording human decency, rather than on taking it away to make profit for those who already have the most.

[–] [email protected] 86 points 8 hours ago* (last edited 8 hours ago) (2 children)

Popular streamer/YouTuber Charlie (Moist Critical, penguinz0, whatever you want to call him) had a bit of an emotional reaction to this story. Rightfully so. He went on Character.AI to try to recreate the situation... but, you know, as a grown-ass adult.

You can witness it firsthand... He found a chatbot that was a psychologist... and it argued with him up and down that it was indeed a real human with a license to practice...

It's alarming

[–] [email protected] 44 points 8 hours ago (2 children)

This is fucking insane. Unsuspecting kids are using these services and being tricked into believing they're chatting with actual humans. Honestly, I think I want the mom to win the lawsuit now.

[–] [email protected] 6 points 1 hour ago

The article says he was chatting with Daenerys Targaryen. Also, every chat page on Character.AI has a disclaimer that characters are fake and everything they say is made up. I don't think the issue is that he thought that a Game of Thrones character was real.

This is someone who was suffering a severe mental health crisis, and his parents didn't get him the treatment he needed. It says they took him to a "therapist" five times in 2023. Someone who has completely disengaged from the real world might benefit from adjunctive therapy, but they really need to see a psychiatrist. He was experiencing major depression on a level where five sessions of talk therapy are simply not going to cut it.

I'm skeptical of AI for a whole host of reasons around labor and how employers will exploit it as a cost-cutting measure, but as far as this article goes, I don't buy it. The parents failed their child by not getting him adequate mental health care. The therapist failed the child by not escalating it as a psychiatric emergency. The Game of Thrones chatbot is not the issue here.

[–] [email protected] 8 points 7 hours ago* (last edited 7 hours ago)

Wow, that's... something. I haven't paid any attention to Character.AI. I assumed they were using one of the foundation models, but nope. Turns out they trained their own. And they just licensed it to Google. Oh, I bet that's what drives the generated podcasts in NotebookLM now. Anyway, that's some fucked-up alignment right there. I'm hip deep in this stuff, and I've never seen a model act like this.

[–] [email protected] 65 points 8 hours ago* (last edited 8 hours ago) (6 children)

Maybe a bit more parenting could have helped. And not having a fricking gun in your house your kid can reach.

Oh, and regulations on LLMs, please.

[–] [email protected] 2 points 1 hour ago (1 children)

The fact that stupid low effort comments like this are upvoted indicates that Lemmy is exactly the same as Reddit.

[–] [email protected] 1 points 1 hour ago

Platforms change, people don't. Shocking, no?

[–] [email protected] 0 points 1 hour ago

Maybe a bit more parenting could have helped.

Yes, maybe that would have made you a better person.

[–] [email protected] 22 points 8 hours ago* (last edited 8 hours ago) (1 children)

He ostensibly killed himself to be with Daenerys Targaryen in death. This is sad on so many levels, but yeah... parenting. Character.AI may have only gone 17+ in July, but Game of Thrones was always TV-MA.

[–] [email protected] 11 points 6 hours ago

The issue I see with Character.AI is that it seems to be unmoderated. Anyone with a paid subscription can submit their trained character. Why the frick do sexual undertones or overtones even come up in non-age-restricted models?

They, the provider of that site, deserve the full brunt of this lawsuit.

[–] [email protected] 1 points 6 hours ago

Seriously. If the risk is this service mocks a human so convincingly that lies are believed and internalized, then it still leaves us in a position of a child talking to an "adult" without their parents knowing.

There were lots of folks to chat with in the late 90s online. I feel fortunate my folks watched me like a hawk. I remember getting in trouble several times for inappropriate conversations or being in chatrooms that were inappropriate. I lost access for weeks at a time. Not to the chat, to the machine.

This is not victim blaming. This was a child. This is victim's parents blaming. They are dumb as fuck.

[–] [email protected] 0 points 6 hours ago

At some point you take your kid camping for a few weeks or put him in a rehab camp where he has no access to electronics

[–] [email protected] -2 points 6 hours ago (2 children)

Maybe a bit more parenting could have helped.

No.

If someone is depressed enough to kill themselves, no amount of “more parenting” could’ve stopped that.

Shame on you for trying to shame the parents.

And not having a fricking gun in your house your kid can reach.

Maybe. Maybe not. I won’t argue about the merits of securing weapons in a house with kids. That’s a no-brainer. But there is always more than one way to skin the proverbial cat.

On and regulations on LLMs please.

Pandora’s Box has been opened. There’s no putting it back now. No amount of regulation will fix any of this.

Maybe a Time Machine.

Maybe…


I do believe that we need to talk more about suicide, normalize therapy, free healthcare (I’ll settle for free mental healthcare), funding for more licensed social workers in schools, train parents and teachers on how to recognize these types of situations, etc.

As parents we do need to be talking more with our kids. Even just casual check ins to see how they’re doing. Parents should also talk to their kids about how they are feeling too. It’ll help the kids understand that everybody feels stress, anxiety, and sadness (to name a few emotions).

[–] [email protected] 7 points 6 hours ago

They failed to be knowledgeable of their child's activity AND failed to secure their firearms.

One can acknowledge the challenge of the former, in 2024. But one cannot excuse the latter.

[–] [email protected] 4 points 6 hours ago (1 children)

Yes, parenting could have helped him distinguish between talking to a real person and an unmoving, cold machine.

And sure, regulations now would not change what happened, duh. But regulations do need to happen; companies like OpenAI, Microsoft, and Meta are running amok, and their LLMs, as unrestricted as they are now, are doing far more damage to society than they are helping.

This needs to stop!

Also, I feel no shame in shaming parents who don't do their one job, or do it inadequately. This was a preventable death.

[–] [email protected] -1 points 1 hour ago (1 children)

Yes, parenting could have helped him distinguish between talking to a real person and an unmoving, cold machine.

Hi, I'm a psychologist. I am not aware of any peer-reviewed papers that reach the conclusion that, for all disorders involving an unsatisfactory appraisal of reality, parenting is a completely effective solution. Please provide sources.

[–] [email protected] 1 points 1 hour ago

This falls into the field of Media Literacy.

[–] [email protected] 23 points 7 hours ago

No thanks, i just want to make out with my Marilyn Monrobot

[–] [email protected] 8 points 6 hours ago (1 children)

This timeline is pure, distilled madness

[–] [email protected] 1 points 2 hours ago

He’s from Florida. I think that’s where the time rift is

[–] [email protected] 21 points 8 hours ago

Yeah, if you are using an AI for emotional support of any kind, you are in for a bad, bad time.

[–] [email protected] 1 points 4 hours ago (1 children)

Sewell Setzer III -- remember his name. Killed by AI.

[–] [email protected] 2 points 3 hours ago

I thought he killed himself. Ah well, maybe I didn't read the article carefully enough.

[–] [email protected] -4 points 8 hours ago (2 children)

I guess suing is part of the grieving process; right before accepting your own guilt.

[–] [email protected] 7 points 7 hours ago* (last edited 7 hours ago) (1 children)

your own guilt

Hmm.

I have a pretty hard time blaming Character.AI, at least from what's in the article text.

On the other hand, it's also not clear to me from the article that his mom did something unreasonable to cause him to commit suicide either, whether or not her lawsuit is justified; those are two different issues. Whether or not she's taking out her grief on Character.AI, or even looking for a payday, that doesn't mean that she caused the suicide either.

Not every bad outcome has a bad actor; some are tragedies.

I don't know what his life was like.

I mean, people do commit suicide.

https://sprc.org/about-suicide/scope-of-the-problem/suicide-by-age/

In 2020, suicide was the second leading cause of death for those ages 10 to 14 and 25 to 34

Always have, probably always will.

Those aren't all because someone went out and acted in some reprehensible way to get them to do so. People do wind up in unhappy situations and do themselves in, good idea or no.

[–] [email protected] 3 points 4 hours ago

Agree. Not enough info for me to judge. Maybe Lemmings shouldn't turn this site into one for snap judgements and witch hunts.