??????? rule (files.catbox.moe)
submitted 11 months ago by [email protected] to c/[email protected]
[-] [email protected] 49 points 11 months ago* (last edited 11 months ago)

There's one I saw yesterday where the prompt was a Bugatti and some other supercar in a steampunk style.

The AI just threw back a silver and gold Arkham series Batmobile.

Edit: Because it's fun, I tested Bing's image generator. They auto-block on the word Disney (e.g. "a mouse in the style of Walt Disney"), but they allow "a mouse in the style of Steamboat Willie."

Also, I think DALL-E is being used to make prodigious amounts of porn. Pretty much anything I tried with the word "woman" gets content blocked in Bing. "Woman eating fried chicken" is blocked. "Man eating fried chicken" is not.

[-] [email protected] 26 points 11 months ago

Bing is actually omega redpilled and hates women.

[-] [email protected] 21 points 11 months ago

Oh nice, all women get to experience what lesbians have for a while now. Welcome, sisters, to being treated as inherently pornographic; you don't get used to it.

[-] [email protected] 7 points 11 months ago* (last edited 11 months ago)

So I played around some more. If I used the term "woman," I had to add that they were clothed or specify the clothing they were wearing; for one of them I had to add "fully clothed" and specify "a full suit."

I went back over the one prompt that had worked previously, and the women were nude. It only passed because it put the scene in silhouette, which apparently let it get past the censors.

But it had absolutely no issue reproducing Iron Man and Ultron from a two-word prompt, and the scariest part is that it can make reproductions of big celebrities.

[-] [email protected] 3 points 11 months ago

Yeah, that's the thing, it's not even surprising. Men are socially treated as the default in our culture, and especially by tech people (who are overwhelmingly men and surrounded by other men in social and professional contexts). The cultural sexualization of women showing up in these models is exactly what one would expect: when people use a generator to create a person doing a thing, they're usually looking for that thing, and the same goes for a man, but for women it's often for porn. I would be shocked if that wasn't a problem Google had to actively combat early on.

In short, more tech people need to read feminist theory as it relates to what they’re making

[-] [email protected] 5 points 11 months ago

Oh yeah, I played with an AI image generator yesterday and I couldn't convince it to stop drawing boobs... Also, any image of a woman was marked as NSFW.

this post was submitted on 20 Oct 2023
351 points (100.0% liked)
