[–] [email protected] 23 points 4 months ago* (last edited 4 months ago) (2 children)

Could a human have judged it better? Maybe not. I think a better question to ask is, "Should anyone be sent back into a violent domestic situation with no additional protection, no matter the calculated risk?" And as someone who has been on the receiving end of that conversation and later narrowly escaped a total-family-annihilation situation, I would say no... no one should be told that, even though they were in a terrifying, life-threatening situation, they will not be provided protection, and no further steps will be taken to keep them from being injured again, or from being killed next time. But even without algorithms, that happens constantly... the only thing the algorithm accomplishes is that the investigator / social worker / etc. doesn't have to have any kind of personal connection with the victim, so they don't have to feel some kind of way about handing an innocent person a death sentence, because they were just doing what the computer told them to.

Final thought: When you pair this practice with the ongoing conversation around the legality of women seeking divorce without their husband's consent, you have a terrifying and consistently deadly situation.

[–] [email protected] 6 points 4 months ago (1 children)

the only thing the algorithm accomplishes is that the investigator / social worker / etc doesn’t have to have any kind of personal connection with the victim

This works even for the people pulling the trigger. Just following orders, dura lex sed lex ("the law is harsh, but it is the law"), et cetera ad infinitum.

[–] [email protected] 3 points 4 months ago

Yep! For all the psych nerds, it's pretty much a direct lift of the Milgram shock experiment.