this post was submitted on 29 Aug 2023
Australia community (aussie.zone)

17 comments
[–] [email protected] 25 points 1 year ago* (last edited 1 year ago) (1 children)

An AI would show you pictures of tables if you told it that's what a beautiful woman is.

[–] [email protected] 15 points 1 year ago (1 children)

“nice legs” No not like that!

[–] [email protected] 5 points 1 year ago

I bet she's a table top, not a bottom.

[–] [email protected] 15 points 1 year ago (1 children)

Use a different model; this isn't hard. I hate when non-techies complain about problems they cause and act like it's because of the tech.

[–] [email protected] 5 points 1 year ago (1 children)

Yeah this is nothing new at all. Society has been conditioning women around the world to some arbitrary standard of "beauty" for millennia. The same conditioning has the same effect in machine learning. Well, duh.

[–] [email protected] 2 points 1 year ago

More like it was trained on American and European women. You need an African model for Africans, an Asian model for Asians, etc.

[–] [email protected] 9 points 1 year ago (1 children)

IIRC, AI images are generated from the data that's been fed in and existing works. AI is just reinforcing our own interpretation of what we as a collective see as "beautiful".

[–] [email protected] 5 points 1 year ago (2 children)

It reinforces Western interpretations. The patterns used to train generative AI are still biased and unreliable in that sense.
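The point above, that a generator reinforces whatever patterns dominate its training data, can be illustrated with a deliberately tiny sketch. This is not a real image model; it is a toy sampler with made-up labels (`"western"`, `"non_western"`) chosen purely to show that a skewed training set yields skewed output:

```python
import random
from collections import Counter

# Toy "training set": 90% of the labels represent one group, 10% the other.
# The skew is deliberate, standing in for an unbalanced image corpus.
training_data = ["western"] * 90 + ["non_western"] * 10

def generate(n, data, seed=0):
    # A trivial stand-in for a generative model: it samples outputs with
    # the same frequencies found in its training data. Real generators are
    # vastly more complex, but they likewise mirror their training distribution.
    rng = random.Random(seed)
    return [rng.choice(data) for _ in range(n)]

counts = Counter(generate(1000, training_data))
print(counts)  # heavily skewed toward "western"
```

No amount of prompting changes the underlying distribution; rebalancing has to happen in the data, which is the bias argument in miniature.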

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

It reinforces Western interpretations

No. It reinforces the interpretation of whoever trained the model. Those people don't have to be Western.

[–] [email protected] 2 points 1 year ago (1 children)

It's not just about individual people, it's about where money and power are located. Where are the top universities? In which sphere do the biggest, wealthiest and most powerful corporations and organisations operate? Western bias permeates every facet of modern society because of how dominant Western culture has been throughout history in terms of money and power. It inevitably filters through to AI, because the data being used to train it is just a reflection of what is available. That happens to be data pulled from, or narrated by, Western culture.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

It's free to train a model. You just need a computer you can let run for a few days, plus some labor.

[–] [email protected] 3 points 1 year ago

You can train a model right now for free. It's all open source. You provide the data and get the output you want. AI picture generation is not magic; it has to draw its info from SOMEWHERE.

This article is 200% nonsense.
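The "you provide the data, you get the output" claim above can be made concrete with a minimal sketch. This is not a diffusion model, just a character-level bigram sampler, but it shows the same property: every transition the "model" can ever produce comes directly from the corpus you fed it.

```python
import random
from collections import defaultdict

def train(corpus):
    # Build a character-level bigram table: for each character, record
    # every character that follows it somewhere in the training text.
    table = defaultdict(list)
    for a, b in zip(corpus, corpus[1:]):
        table[a].append(b)
    return table

def generate(table, start, length, seed=0):
    # Sample a string of up to `length` characters; each step picks a
    # successor observed in the training data, so the output can only
    # recombine what the corpus contained.
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = table.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return "".join(out)

table = train("banana bandana")
print(generate(table, "b", 8))
```

Swap in a different corpus and the output distribution changes accordingly, which is the whole point: the "knowledge" lives in the data, not in the sampler.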

[–] [email protected] 4 points 1 year ago

This is the best summary I could come up with:


"GenAI is a type of artificial intelligence powered by machine learning models," Shibani Antonette, a lecturer in data science and innovation at the University of Technology Sydney, told the ABC.

When looking at the viral AI images, Dr Antonette says the model that generated them likely "did not have a diverse training dataset that contained many faces of people of colour with varying skin tones and shapes".

"After all, data for these models are pulled from the entire internet over the last few decades — without accountability for coverage, diversity, and inclusion — to cater to specific applications."

She dyed her hair blonde and didn't worry about letting her brown skin get darker from being in the sun, admitting it was to make herself more appealing to the male gaze.

A research study by Cornell University from March this year revealed how popular AI models produced images of men with lighter skin tones for high-paying jobs such as a "lawyer," "judge" or "CEO".

"Tech-developers and companies rolling out services should ensure that their AI is fair and equitable by diversifying their datasets and avoiding over-representation of certain groups of people," she says.


The original article contains 998 words, the summary contains 192 words. Saved 81%. I'm a bot and I'm open source!

[–] [email protected] 4 points 1 year ago

All AI promotes whatever the hell you feed it.

[–] [email protected] 1 points 1 year ago (1 children)

What does this have to do with Australia?

[–] [email protected] 1 points 1 year ago (1 children)

ABC News posted it and I thought it would be interesting to discuss from an Australian perspective (which the article was written from). It resulted in discussion.

What is your problem with it?

[–] [email protected] 1 points 1 year ago

Probably autism. Things not being orderly enough makes me go reeeeeeeeee! :)