this post was submitted on 19 Sep 2023
literature.cafe meta
pictrs-safety is ready to host. It just needs someone to dockerize it to make it easy for y'all to catch CSAM before it's even federated.
The fact that it requires a GPU makes it infeasible for me to host, and I'm also not sure how comfortable I am with the potential legal liability it might open up.
I am planning to add a feature to scan for porn (not CSAM) via AI Horde, which will bypass the need for a GPU if you want to use that.
Great work!
How does this work? I get that it's a docker setup, but I mean what does this actually do and how, for picture safety against CSAM?
It scans each picture with AI before it's uploaded to determine whether it's potential CSAM, and if it exceeds the thresholds, it blocks the upload.
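The flow described above can be sketched roughly like this. Note that `scorer`, `check_upload`, and the threshold value are all invented for illustration; this is not the project's actual API or cutoff, just the scan-then-block pattern under those assumptions:

```python
def check_upload(image_bytes: bytes, scorer, threshold: float = 0.8) -> bool:
    """Return True if the upload should be allowed, False to block it.

    `scorer` stands in for whatever AI model does the classification:
    it takes the raw image bytes and returns a likelihood score in [0, 1].
    Any image scoring at or above `threshold` is blocked before upload.
    """
    return scorer(image_bytes) < threshold
```

In practice the scorer is the expensive part (hence the GPU requirement); the gating logic itself is trivial.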
What is the AI engine? Who controls the models for it? Does it interface with any known CSAM databases?
You can read about it here https://github.com/db0/fedi-safety
That links to an entirely different repository/software. Does pictrs-safety require fedi-safety? Are they independent? Does pictrs-safety upload to fedi-safety on... your servers? Someone else's running fedi-safety? Does a user of pictrs-safety also need a working fedi-safety installation?
Lots of unanswered questions.
They're independent of each other but meant to work together. However, you can optionally plug anything else you want into pictrs-safety.
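A hypothetical illustration of that "plug anything in" idea: pictrs-safety only needs some scanner that maps image bytes to a safe/unsafe verdict, whether that's fedi-safety or something you wrote yourself. All names here are invented for the sketch and are not the project's real interface:

```python
from typing import Callable

# Any callable taking image bytes and returning True (safe) or False (unsafe)
# can serve as the scanning backend in this sketch.
Scanner = Callable[[bytes], bool]

def make_gatekeeper(scanner: Scanner):
    """Wrap any scanner as an upload gate that returns an accept/reject verdict."""
    def gate(image_bytes: bytes) -> str:
        return "accept" if scanner(image_bytes) else "reject"
    return gate
```

The point is the decoupling: the gate doesn't care whether the verdict comes from fedi-safety on a GPU box or some other service, as long as the scanner answers the same question.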
You should run them on your own systems.