
(This is an expanded version of two of my comments [Comment A, Comment B] - go and read those if you want)

Well, Character.ai has gotten itself into some real deep shit recently - longtime user Sewell Setzer III shot himself, and his mother, Megan Garcia, is suing the company, its founders, and Google as a result, accusing them of "anthropomorphising" their chatbots and offering “psychotherapy without a license,” among other things, and demanding a full-blown recall.

Now, I'm not a lawyer, but I can see a few aspects which give Garcia a pretty solid case:

  • The site hosts “mental health-focused chatbots like ‘Therapist’ and ‘Are You Feeling Lonely,’ which Setzer interacted with,” as Emma Roth noted writing for The Verge.

  • Character.ai has already seen multiple addiction/attachment cases like Sewell's - I found articles from Wired and news.com.au, plus a few user testimonies (Exhibit A, Exhibit B, Exhibit C), attesting to how damn addictive the fucker is.

  • As Kevin Roose notes for the NYT, “many of the leading A.I. labs have resisted building A.I. companions on ethical grounds or because they consider it too great a risk.” That could be used to suggest Character.ai was being particularly reckless.

Which way the suit's gonna go, I don't know - my main interest is in the potential fallout.

Some Predictions

Win or lose, I suspect this lawsuit is going to sound Character.ai's death knell - even if they don't get regulated out of existence, "our product killed a child" is the kind of Dasani-level PR disaster few companies can recover from, and news of it will likely send any would-be investors running for the hills.

If Garcia does win the suit, it'd more than likely set a legal precedent which denies Section 230 protection to chatbots, if not AI-generated content in general. If that happens, I expect a wave of lawsuits against other chatbot apps like Replika, Kindroid and Nomi at the minimum.

As for the chatbot services themselves, I expect they're gonna lock their shit down hard and fast to avoid ending up with a situation like this on their own hands, and I expect their users are gonna be pissed.

As for the AI industry at large, I suspect they're gonna try to paint the whole thing as a frivolous lawsuit, and paint Garcia as deflecting the blame for her son's suicide, à la the "McDonald's coffee case". How well that tactic will work, I don't know - personally, considering the AI industry's godawful reputation with the public, I expect they're gonna have some difficulty.

top 2 comments
[–] [email protected] 12 points 2 months ago

If Garcia does win the suit, it'd more than likely set a legal precedent which denies Section 230 protection to chatbots, if not AI-generated content in general.

I’m not gonna lie, that would be hilarious just for the monkey's paw effect. They want chatbots to take the place of real employees? Let’s start by holding them to the same standards and treating everything they shit out as first-party content.

[–] [email protected] 5 points 2 months ago

their users are gonna be pissed.

And I'm afraid a lot of them will be pissed for the wrong reasons.