this post was submitted on 28 Jul 2023
1385 points (98.9% liked)
Technology
I'm going to take a wild stab in the dark that all the false positives were black men.
For the same reason that my Echo dot (aka Spotify Bitch) will ignore my wife but cheerfully respond to my mumbled requests from three rooms away. If you make all this shit in Silicon Valley, it will work best for people of a similar demographic to those that work there.
The white liberals building this technology say they're all progressive yet only surround themselves with people like them and only build products for people like them. A lack of diversity in tech like this is a lack of good testing.
Also, AI is taught by its creators. Tech has some of the most well-hidden, bigoted mid-level white people refusing to critically question their own bias and privilege. There's a shit ton of that fragile masculinity in the tech industry, just hard-coding itself into the products.
There was a guy fired from Google for writing a manifesto about how women aren't 'wired' for tech. And that's just the one who waved his crazy flag out in the open, so no one in upper management could keep on ignoring it.
While I agree with you 100% that programming can be affected by the programmers' biases, there's a much simpler problem that face recognition was having a hard time overcoming. At least when it was a main topic about a decade ago, sensors had a lot of trouble with the low contrast of some black people's faces. Anyone who's had a black friend and is a shutterbug knows the problems you can run into trying to get a proper exposure without making them disappear from the photograph entirely. It was just an inherent limitation of the technology being used. The last statistics I read were something like a 20 to 30% positive match rate, which we know damn well is too low for a workable technology. The success rate on Caucasian and lighter skin tones wasn't even that great; there was still something like a 60% false positive rate. The software may have gotten better over the past decade, but we all know that whether it did or not, they're still going to use it.
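For what it's worth, the low-contrast problem is partly recoverable in software: an underexposed face squeezes all its detail into a narrow band of dark pixel values, and contrast expansion can spread that band back out before any matching happens. Here's a minimal numpy sketch of global histogram equalization, the textbook version of that preprocessing step — the function name and the simulated "underexposed patch" are my own illustration, not code from any actual recognition system:

```python
import numpy as np

def equalize_histogram(img):
    """Global histogram equalization for an 8-bit grayscale image.

    Spreads the intensities actually present in the image across the
    full 0-255 range, so a low-contrast (e.g. underexposed) region
    regains usable detail for downstream processing.
    """
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()  # CDF value of the darkest pixel present
    # Standard equalization: map each intensity through the normalized CDF.
    lut = np.clip(
        np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255), 0, 255
    ).astype(np.uint8)
    return lut[img]

# Simulate an underexposed patch: all detail crammed into values 10-39.
rng = np.random.default_rng(0)
dark_patch = rng.integers(10, 40, size=(64, 64), dtype=np.uint8)

equalized = equalize_histogram(dark_patch)
print("original range:", dark_patch.min(), "-", dark_patch.max())
print("equalized range:", equalized.min(), "-", equalized.max())
```

Real pipelines use fancier local variants (e.g. CLAHE, which equalizes per neighborhood instead of globally), but the point stands: the sensor limitation is known and correctable, if anyone bothers to test against the people it affects.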
This isn't image manipulation from the 1990s. You're assuming it works on isolated pixels needing massive contrast; modern recognition is computed from neighboring pixels to pick out patterns.
This is just inconsideration driving laziness: they calibrate to a median level of the image, catering to skin that reflects more light and reads easier, and then release it as 'done'. The software is much more sophisticated than you're giving it credit for — it's just only being used to that potential in industries like film.