this post was submitted on 01 Aug 2023
527 points (82.4% liked)

An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.::Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

[–] [email protected] 39 points 1 year ago (23 children)

These biases have always existed in the training data used for ML models (society influences the data we collect, so its latent biases come along with it), but it's definitely interesting that generative models now make these biases much more visible to the lay person, figuratively and, with image models, literally.

[–] [email protected] 5 points 1 year ago (22 children)

But they know the AIs have these biases, at least now; shouldn't they be able to code them out or lessen them? Or would that just create more problems?

Sorry, I'm no programmer, so I have no idea if that's even possible or not. It just sounds possible in my head.

[–] [email protected] 12 points 1 year ago (2 children)

That's not how it works. You don't just "program out the biases"; you have to retrain the model with more inclusive training data.
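A minimal sketch of what "more inclusive training data" can mean in practice: before retraining, you can rebalance the dataset so under-represented groups appear as often as the majority group. Everything here is illustrative, not from any real pipeline; the `group` attribute and the oversampling strategy are assumptions for the example.

```python
import random
from collections import Counter

def rebalance(samples, key, seed=0):
    """Oversample minority groups so each group appears equally often.

    `samples` is a list of dicts; `key` names a hypothetical demographic
    attribute used only to balance the training mix.
    """
    rng = random.Random(seed)
    groups = {}
    for s in samples:
        groups.setdefault(s[key], []).append(s)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # pad smaller groups by sampling with replacement
        balanced.extend(rng.choice(members) for _ in range(target - len(members)))
    rng.shuffle(balanced)
    return balanced

# Skewed toy dataset: 8 samples of group A, 2 of group B
data = [{"group": "A"}] * 8 + [{"group": "B"}] * 2
counts = Counter(s["group"] for s in rebalance(data, "group"))
print(counts["A"], counts["B"])  # 8 8
```

Naive oversampling like this is only one option; in practice people also reweight the loss per group or curate new data, and each choice trades off differently against overfitting the duplicated samples.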

[–] [email protected] 2 points 1 year ago (1 children)

Even then, it will always move toward the "average" of all the combined training data, unless prompted otherwise.

[–] [email protected] -2 points 1 year ago

No, that's not how it works either.
