‘We definitely messed up’: why did Google AI tool make offensive historical images?
(www.theguardian.com)
was it really offensive or was it just "target selling pride clothes during pride month" offensive?
I don't know that "offensive" is the right word. More just "shitty" and "lazy".
Like they took the time to teach it "diversity" but couldn't be bothered to train it past "diversity = people who are not white", or to have it acknowledge when the user is asking specifically for a white person, or for a different region or time period.
I, for one, welcome Japanese George Washington, Indian Hitler and Inuit Gandhi to our historical database.
Jojo Rabbit featured Jewish Maori Hitler and was very well received.
I think the lesson here is that political correctness isn't very machine-learnable. Human history and modern social concerns are complex in precise, context-dependent ways, and really should be addressed with conventional rules and algorithms. Or manually, but that's obviously not scalable at all.