this post was submitted on 24 Jul 2023
195 points (79.5% liked)

Not a good look for Mastodon - what can be done to automate the removal of CSAM?

[–] [email protected] 35 points 1 year ago* (last edited 1 year ago) (2 children)

Nothing you can do except go after server owners, like usual. This has nothing to do with the fedi specifically, and Mastodon has nothing to do with it either, because anyone can spin up their own alternative server. ActivityPub is just one of many protocols they have used or will use to distribute this stuff.

This just in: criminals are using TCP to distribute CP!!! What can the internet do to stop this? Oh yeah, go after server owners and groups, like usual.

[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (1 children)

Things are a bit more complicated in the fediverse. Sure, your instance might not host any pedo community, but if a user on your instance subscribes to or interacts with such a community, the CSAM can get federated into your instance without you noticing. There are tools to help you combat this, but as an instance owner you can't just assume it's not your problem when some other instance hosts pedo stuff.

[–] [email protected] 6 points 1 year ago

That is definitely alarming, and a downside of the fedi, but it seems like a necessary evil. Unfortunately, admins and mods of small communities in the fedi will be the ones exposed to this. There are better methods of handling it, though. Shared block lists already exist that cover instances hosting undesirable stuff like that, so subscribing to one at least minimizes how much of this disgusting material mods, who are just regular unpaid people, ever have to see. Also, obviously those instances should be reported to the police, the FBI, or whatever the heck.
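For concreteness, here is a minimal sketch of the shared-block-list idea, assuming a list published in the same CSV layout Mastodon uses for its own domain-block exports, and an access token with the `admin:write:domain_blocks` scope. The instance and blocklist URLs are hypothetical placeholders; the admin domain-blocks endpoint is Mastodon's, available since v4.0.

```python
# Sketch: subscribe an instance to a shared domain blocklist.
# Assumptions (not from the thread): the list is a CSV using Mastodon's
# export headers (#domain, #severity, ...), and we call Mastodon's admin
# API (POST /api/v1/admin/domain_blocks).
import csv
import io

import requests

INSTANCE = "https://example.social"  # placeholder: your own instance
ADMIN_TOKEN = "..."                  # token with admin:write:domain_blocks scope
BLOCKLIST_URL = "https://example.org/shared_blocklist.csv"  # placeholder list


def fetch_blocklist(url: str) -> list[dict]:
    """Download the shared blocklist and parse its CSV rows."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return list(csv.DictReader(io.StringIO(resp.text)))


def apply_block(domain: str, severity: str = "suspend") -> None:
    """Create one domain block via the admin API (silence/suspend/noop)."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
        json={"domain": domain, "severity": severity},
        timeout=30,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    for row in fetch_blocklist(BLOCKLIST_URL):
        apply_block(row["#domain"], row.get("#severity") or "suspend")
```

Run on a schedule, something like this keeps an instance in sync with a list curated by people who have already done the unpleasant review work.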

[–] [email protected] 4 points 1 year ago (1 children)

There is a database of known CSAM files and their hashes; Mastodon could check uploads against it at posting time and again when federating content in (see the sketch below).

Shadow-banning those users would be nice too.
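A minimal sketch of that hash check, not Mastodon's actual code: real deployments match perceptual hashes (e.g. PhotoDNA or PDQ) from vetted databases such as NCMEC's, because a cryptographic hash stops matching after any crop or re-encode. SHA-256 and the `known_hashes.txt` file below are stand-ins to keep the sketch self-contained.

```python
# Sketch: gate media on a known-hash database. SHA-256 and the local
# hash file are placeholders; real systems use perceptual hashes served
# by a vetted programme, not a self-curated text file.
import hashlib
from pathlib import Path

KNOWN_BAD_HASHES: set[str] = set(
    Path("known_hashes.txt").read_text().split()  # hypothetical stand-in
)


def is_known_csam(media: bytes) -> bool:
    """True if the file's hash appears in the shared database."""
    return hashlib.sha256(media).hexdigest() in KNOWN_BAD_HASHES


def accept_media(media: bytes) -> bool:
    """Run the same gate on local uploads and on media pulled in via federation."""
    if is_known_csam(media):
        # Reject; a real system would also quarantine the file for review
        # and file any legally required reports.
        return False
    return True
```

Note the limitation the reply below points to: hash matching only catches files already in a database, so novel AI-generated images slip straight past it.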

[–] [email protected] 1 points 1 year ago

They are talking about AI-generated images. That's the volume part.