this post was submitted on 01 Sep 2024
315 points (97.0% liked)
RPGMemes
This won't fix it but it might help.
Make sure you have a robots.txt file with a crawl delay of 30 seconds set for all agents, and that you are disallowing most of the WordPress directories, such as wp-admin, the media directory, etc.
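For anyone who hasn't written one before, a minimal robots.txt along those lines might look something like this (the exact directory list is illustrative, and note that Crawl-delay is a nonstandard directive that many crawlers ignore):

```
User-agent: *
Crawl-delay: 30
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/uploads/
```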
I would also strongly recommend a caching system if you are not already using one. It's a lot more efficient to serve the same image a hundred times to different bots from RAM than to load it off your drive each time.
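If the site is behind nginx, one way to get this effect is a FastCGI microcache in front of PHP. This is only a sketch; the paths, zone name, and socket are assumptions that would need to match the actual server setup:

```nginx
# Illustrative nginx FastCGI cache for a WordPress backend.
fastcgi_cache_path /var/cache/nginx levels=1:2 keys_zone=wpcache:10m
                   max_size=256m inactive=60m;

server {
    location ~ \.php$ {
        fastcgi_pass unix:/run/php/php-fpm.sock;  # assumed PHP-FPM socket
        include fastcgi_params;
        fastcgi_cache wpcache;
        fastcgi_cache_key "$scheme$request_method$host$request_uri";
        fastcgi_cache_valid 200 10m;  # serve repeat hits from cache for 10 min
    }
}
```

Even a short cache window like this means a burst of identical bot requests mostly never reaches PHP or the disk.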
Just my personal opinions working in a web hosting environment.
That'll probably help if it's I/O issues.
Most of these AI scrapers don't respect robots.txt, so I'm not sure that really helps much, but... we have tried all of these things.
Someone on Lemmy suggested creating a dummy endpoint that normal people won't be able to navigate to, and disallowing it in robots.txt.
Then when somebody crawls it, you know they're ignoring robots.txt, and you IP-ban them.
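As a rough sketch of the detection half, you could scan your access log for hits on the honeypot path and collect the offending IPs. The trap path `/trap/` and the combined-log format are assumptions for illustration:

```python
# Sketch: find client IPs that requested a robots.txt-disallowed honeypot URL.
# Assumes nginx/Apache "combined" log format; "/trap/" is a placeholder path.
import re

TRAP_PATH = "/trap/"  # must match the Disallow line in robots.txt
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

def ips_to_ban(log_lines):
    """Return the set of client IPs that fetched the honeypot path."""
    banned = set()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and m.group(2).startswith(TRAP_PATH):
            banned.add(m.group(1))
    return banned
```

The resulting set could then be fed to a firewall rule or a fail2ban action.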
That's pretty clever.
I think that these AI scrapers might be smart enough that this doesn't really work though - at least if I were designing them I'd have them all come from dynamic IPs and not have any of them bother hitting the same target more than once. These things are very dedicated to acquiring content without consent, and if they're capable of causing problems for (say) Reddit, I'm not sure my little website is going to have much luck deterring them.
Honestly, a better strategy might be to just Glaze everything I draw.
I'm not sure if it costs money, but you could implement CAPTCHAs.
Or use Cloudflare to do the bot detection for you.
Worst case, you make it so you need to create an account to see content.
Well, we are already using Cloudflare; that's one of the other reasons the site is so slow... I don't think the other two suggestions prevent a scraper from requesting the information from the server; I think they'd just make it more arduous for real people to access the content.
I doubt that will help, they can still scrape the site and then wait until whatever version of Glaze was applied is cracked.
Instead of a tech solution, why not a legal one? Put a notice somewhere on the website that refusal to follow your robots.txt constitutes agreement to pay you X amount of money for your content. Then combine that with the dummy-page solution the other person brought up so you can record the IP address, and take them to court so they pay you. It has the potential to bring you a really nice chunk of money.
I believe that there are multiple very high profile billion-dollar lawsuits being run against AI companies right now. I don't really have the budget to sue these companies.
Yeah, you might need some combination of fail2ban for rude AI bots and Cloudflare caching or something.
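A hedged example of what the fail2ban side could look like, if you pair it with a disallowed honeypot path (`/trap/` here is a placeholder, and the log path would need to match the server):

```
# /etc/fail2ban/filter.d/ai-scrapers.conf
[Definition]
failregex = ^<HOST> .* "(GET|POST|HEAD) /trap/

# /etc/fail2ban/jail.local
[ai-scrapers]
enabled  = true
port     = http,https
filter   = ai-scrapers
logpath  = /var/log/nginx/access.log
maxretry = 1
bantime  = 86400
```

With `maxretry = 1`, a single hit on the disallowed path earns a day-long ban.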
Whoever their host is, they already appear to have some type of load balancing, based on the four IPs. But I would also agree that a free Cloudflare account does wonders for most WordPress users, probably mostly because it filters out a shitload of bots and known bad actors. Just make sure you set up your origin certificates if you use a Cloudflare account.
We are already using Cloudflare.
Have you also enabled Bot Fight Mode? (There's a setting to "Block AI bots" that seems useful in your situation)
I shall mention it to my web admin