this post was submitted on 11 Sep 2023
1122 points (96.0% liked)

cross-posted from: https://lemmy.world/post/3320637

YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead: The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman's parents.

[–] [email protected] 82 points 1 year ago (8 children)

This is so so stupid. We should also sue the ISPs then; they enabled the use of YouTube and Reddit. And the phone providers, for enabling communications. This is such a dangerous slippery slope to put any blame on the platforms.

[–] [email protected] 79 points 1 year ago (5 children)

I think the issue isn't just providing access to the content, but using algorithms that make it more likely for deranged people to see more and more content that fuels their motives for hateful acts, instead of trying to reduce how often that content is seen. All because the platforms make more money the more content people watch, whether it is harmful or not.
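The dynamic described above can be sketched in a few lines. This is a purely hypothetical illustration, not any platform's real code: the titles, fields, and scores are invented. The point is that a ranker whose only objective is predicted engagement will happily put harmful content on top, because nothing in the objective penalizes harm.

```python
# Hypothetical sketch: a recommender that ranks candidates purely by
# predicted watch time. "harm_score" exists in the data but is ignored
# by the ranking, so inflammatory content wins on engagement alone.

def rank_engagement_only(videos):
    """Sort candidate videos by predicted watch minutes, descending."""
    return sorted(videos, key=lambda v: v["predicted_watch_min"], reverse=True)

videos = [
    {"title": "history documentary", "predicted_watch_min": 12, "harm_score": 0.0},
    {"title": "outrage rabbit hole", "predicted_watch_min": 45, "harm_score": 0.9},
    {"title": "science explainer",   "predicted_watch_min": 20, "harm_score": 0.0},
]

print([v["title"] for v in rank_engagement_only(videos)])
# → ['outrage rabbit hole', 'science explainer', 'history documentary']
```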

[–] [email protected] 30 points 1 year ago (1 children)

This.

I don't know about Reddit, but YouTube 100% drives engagement by feeding users increasingly inflammatory and hateful content.

[–] [email protected] 14 points 1 year ago

Hell, they'll even take ad money to promote Jan 6th conspiracies.

[–] [email protected] 27 points 1 year ago (1 children)

Yeah, the difference is in whether or not the company is choosing what to put in front of a viewer's eyes.

For the most part an ISP just shows people what they request. If someone gets bomb making directions from YouTube it would be insane to sue AT&T because AT&T delivered the appropriate packets when someone went to YouTube.

On the other end of the spectrum is something like Fox News. They hire every host, give them timeslots, have the opportunity to vet guests, accept advertising money to run against their content, and so on.

Section 230 of the Communications Decency Act treats "interactive computer services" like YouTube and Reddit as if they're just hosts, not publishers of the content generated by their users. OTOH, YouTube and Reddit use ML systems to decide what users are shown. In the case of YouTube, the push to suggest content to users is pretty strong. You could argue they're much closer to the Fox News side of things than to the ISP side these days. There's no human making the decisions on what content is shown, but does that matter?

[–] [email protected] 9 points 1 year ago (1 children)

Yep. I often fall asleep to long YouTube videos that are science or history related. The algorithm is the reason why I wake up at 3am to Joe Rogan. It’s like a terrible autocomplete.

[–] [email protected] 0 points 1 year ago

The algorithm is tailored to you. This says more about you. I never get recommended Rogan.

[–] [email protected] 10 points 1 year ago

Absolutely. I saw a Google ad the other day, from maybe PragerU, that was about climate change not being real, while I was searching for an old article that was more optimistic about outcomes. There was actually a note by the ad saying it was being shown as a suggestion, and thankfully you could report it, which I did immediately. It pissed me off a ton.

A friend recently shared a similar suggested video/ad they got on YouTube, claiming "Ukrainians are terrorists". It was from PragerU or TPUSA.

I can see the argument for allowing these ads to exist as a freedom of speech thing, fine. But actively promoting these ads is very different. The lawsuit has merit on this point. I'd prefer if this content was actively minimized, but at the very least it shouldn't be promoted.

[–] [email protected] 1 points 1 year ago

What if it isn't algorithms but upvotes? What if Lemmy is next?

[–] [email protected] 26 points 1 year ago

If you were head of a psychiatric ward and had an employee you knew was telling patients "Boy, I sure wish someone would kill as many black people as they could", you would absolutely share responsibility when one of them did exactly that.

If you were deliberately pairing that employee with patients who had shown violent behaviour on the basis of "they both seem to like violence", you would absolutely share responsibility for that violence.

This isn't a matter of "there's just so much content, however can we check it all?".

Reddit has hosted multiple extremist and dangerous communities, claiming "we're just the platform!" while handing over the very predictable post histories of mass shooters week after week.

YouTube has built an algorithm and monetisation system that is deliberately designed to lure people down rabbit holes then done nothing to stop it luring people towards domestic terrorism.

It's a lawsuit against companies worth billions. They're not being executed. There are grounds to accuse them of knowingly profiting from the grooming of terrorists and if they want to prove that's not the case, they can do it in court.

[–] [email protected] 23 points 1 year ago

Do ISPs actively encourage you to watch extremist content? Do they push that content toward people who are at risk of radicalization to get extra money?

[–] [email protected] 12 points 1 year ago

The ISPs don't encourage people to see content that makes them mad.

[–] [email protected] 12 points 1 year ago* (last edited 1 year ago)

Utilities aren't the same thing as platforms.

But giant media platforms run by giant tech corporations who have repeatedly shown that they don't give a shit about people? If they're not putting guardrails on their algorithms and content by choice, and are consequently creating mass murderers, then they should be regulated into having some guardrails.

No corporation has proven that it will make the best choices for society, it's up to people to force them to.
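One possible shape of such a guardrail can be sketched as follows. This is a hypothetical toy, with invented names and an invented `harm_score` field and threshold, assuming some upstream classifier flags content: items flagged as harmful get pushed to the bottom of the ranking even when their predicted engagement is high.

```python
# Hypothetical guardrail sketch: rank by predicted watch time, but
# demote any item whose harm score exceeds a threshold, regardless of
# how much engagement it would generate. All values are made up.

HARM_THRESHOLD = 0.5

def rank_with_guardrail(videos):
    """Sort by watch time, but push flagged content to the bottom.

    Sort key is a tuple: unflagged items (False) come before flagged
    ones (True); within each group, higher predicted watch time wins.
    """
    return sorted(
        videos,
        key=lambda v: (v["harm_score"] > HARM_THRESHOLD, -v["predicted_watch_min"]),
    )

videos = [
    {"title": "history documentary", "predicted_watch_min": 12, "harm_score": 0.0},
    {"title": "outrage rabbit hole", "predicted_watch_min": 45, "harm_score": 0.9},
    {"title": "science explainer",   "predicted_watch_min": 20, "harm_score": 0.0},
]

print([v["title"] for v in rank_with_guardrail(videos)])
# → ['science explainer', 'history documentary', 'outrage rabbit hole']
```

The engagement cost of the demotion is exactly why, as the comment says, platforms don't do this by choice.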

[–] [email protected] 4 points 1 year ago (1 children)

I think blaming/suing the company nearest to the user should work fine. (The following is hyperbole.) If you don't do it that way, then yes, it becomes a slippery slope, because eventually you'd have to sue the Big Bang. But that makes no sense.

[–] [email protected] 16 points 1 year ago (5 children)

So if an attack is planned via mail you think we should sue the postal service? The phone company if it's done over the phone?

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago) (1 children)

No, because those things are supposed to be private. Social media, however, needs some kind of moderation. edit: also go blame the user too, but that should be a given

[–] [email protected] -2 points 1 year ago (1 children)

I think just the poster should suffice, we should leave the platforms out of it. If anything, it helps to out the assholes who would post stuff that enables this.

[–] [email protected] 4 points 1 year ago

Blocking a user and removing content from a platform should be relatively easy and fast, which should help prevent organized crimes. Suing someone afterwards takes way more resources and time.

But a platform can remove content without getting sued. Why sue them too? Because if you don't sue their asses they don't care.

Of course moderation takes time and can't be perfect, and this should be considered when suing the platform owners. And yes, this could help the assholes, but I think you can report such behavior to the FBI or someone.

[–] [email protected] 9 points 1 year ago

Change mail (private) to a moderated public notice board (not private). The owner of the public notice board should probably be sued for allowing the content to stay up.

[–] [email protected] 9 points 1 year ago (2 children)

If my buddies and I spend a month plotting a crime in my cousin's spare room, the cousin would be complicit, since he knowingly allowed us to use his property for a criminal conspiracy. The USPS doesn't know what I am sending in the mail, since they are a common carrier.

[–] [email protected] 2 points 1 year ago
[–] [email protected] 0 points 1 year ago

Actually, they'd just try to seize his house, since proving his complicity is more challenging than proving that the house was used for the planning of a crime.

[–] [email protected] 3 points 1 year ago

Is the postal service intentionally sending more mail saying "attacks are necessary" to people interested in attacks? If the postal service were doing that to increase total postal volume, then yes, we should sue it too.

[–] [email protected] 4 points 1 year ago

I agree that his parents are culpable.

[–] [email protected] 3 points 1 year ago (1 children)

Then what just give up hold Youtube account for their actions

[–] [email protected] 6 points 1 year ago (1 children)

This comment would have really benefitted from some punctuation

[–] [email protected] 1 points 1 year ago (1 children)

Then why not just give up, and never hold youtube accountable for their actions

[–] [email protected] 3 points 1 year ago

So... Punctuation and a few extra words lol