this post was submitted on 01 Aug 2023
527 points (82.4% liked)
Technology
you are viewing a single comment's thread
This is just dumb rage-bait. At worst, this shows a bias in the training data, probably because the AI was developed in a majority-white country and trained on images of mostly white people.
And it's likely not even that. The AI has no concept of race, so it doesn't know to keep white people white and Asian people Asian; it would be just as likely to make the reverse mistake.
This is a real issue for non-white people. Sure, in this instance it's trivial and has no major impact yet.
It again highlights the necessity of diversifying training data, but time and time again we run into this white-bias issue.
If you don't think that paints a picture of a bleaker future, one where AI tools are more advanced and widespread yet STILL operating on biases, you're being naive.
LoRAs already exist to fix this issue
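For context, the LoRA (Low-Rank Adaptation) fix mentioned above works by adding a small trainable low-rank update on top of a frozen pretrained weight matrix, rather than retraining the whole model, which is why community-made LoRAs can cheaply correct biases like this one. A minimal sketch of that core idea, using NumPy with illustrative names and shapes (nothing here is taken from any specific LoRA library):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: a d x k layer adapted with low rank r << min(d, k).
d, k, r = 8, 8, 2
alpha = 4.0  # LoRA scaling factor (hypothetical value)

W = rng.standard_normal((d, k))   # frozen pretrained weight (stays untouched)
B = rng.standard_normal((d, r))   # trainable low-rank factor
A = rng.standard_normal((r, k))   # trainable low-rank factor

# The adapter contributes W' = W + (alpha / r) * B @ A.
# Only r * (d + k) extra parameters are trained instead of d * k.
delta = (alpha / r) * B @ A
W_adapted = W + delta

print(delta.shape)                        # (8, 8)
print(np.linalg.matrix_rank(delta) <= r)  # True: the update is rank-r
```

Because the update is factored into two small matrices, a fine-tune that steers image generation toward underrepresented groups can be distributed as a few megabytes and merged into the base model at load time.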