This post was submitted on 08 Aug 2023
101 points (96.3% liked)

Innocent pregnant woman jailed amid faulty facial recognition trend: US police departments continue to use the tech despite low accuracy and obvious mismatches.

top 3 comments
[email protected] 10 points 1 year ago

If u wanna beat China, first you gotta learn to become China. GG USA.

[email protected] 9 points 1 year ago

This is the best summary I could come up with:


According to The New York Times, this incident is the sixth recent reported case where an individual was falsely accused as a result of facial recognition technology used by police, and the third to take place in Detroit.

Advocacy groups, including the American Civil Liberties Union of Michigan, are calling for more evidence collection in cases involving automated face searches, as well as an end to practices that have led to false arrests.

A 2020 post on the Harvard University website by Alex Najibi details the pervasive racial discrimination within facial recognition technology, highlighting research that demonstrates significant problems with accurately identifying Black individuals.

Further, a statement from Georgetown on its 2022 report said that as a biometric investigative tool, face recognition "may be particularly prone to errors arising from subjective human judgment, cognitive bias, low-quality or manipulated evidence, and under-performing technology" and that it "doesn’t work well enough to reliably serve the purposes for which law enforcement agencies themselves want to use it."

The low accuracy of face recognition technology comes from multiple sources, including unproven algorithms, bias in training datasets, different photo angles, and low-quality images used to identify suspects.
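
To make that concrete, here is a minimal, hypothetical sketch of the matching step such systems broadly rely on: comparing an embedding of the probe photo against a gallery of known faces and flagging anything under a distance threshold. Every name, value, and the threshold below are illustrative assumptions, not details from the article or from any real police system.

```python
# Hypothetical illustration, not any vendor's actual pipeline: face search as a
# nearest-neighbour lookup over embedding vectors with a distance threshold.
import numpy as np

def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Smaller distance means the two embeddings look more alike to the system."""
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_gallery(probe: np.ndarray, gallery: dict, threshold: float = 0.40):
    """Return gallery identities whose distance to the probe falls under the threshold.

    A blurry or off-angle probe shifts its embedding, so a loose threshold can
    let an unrelated but similar-looking identity surface as a candidate match.
    """
    hits = [(name, cosine_distance(probe, emb)) for name, emb in gallery.items()]
    return sorted((h for h in hits if h[1] < threshold), key=lambda h: h[1])

# Toy demo with synthetic 128-dimensional "embeddings" (purely illustrative).
rng = np.random.default_rng(0)
true_person = rng.normal(size=128)
lookalike = true_person + rng.normal(scale=0.4, size=128)  # different person, similar features
probe = true_person + rng.normal(scale=0.9, size=128)      # low-quality capture of the true person
print(search_gallery(probe, {"true_person": true_person, "lookalike": lookalike}))
```

The sketch is only meant to show the failure mode the summary describes: the quality of the probe image and the choice of threshold, not just the underlying algorithm, determine whether an innocent lookalike is returned as a candidate.
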

Reuters reported in 2022, however, that some cities are beginning to rethink bans on face recognition as a crime-fighting tool amid "a surge in crime and increased lobbying from developers."


I'm a bot and I'm open source!

[email protected] 6 points 1 year ago

The old-fashioned AI reference chart:

Family Guy Skin Colour chart