this post was submitted on 24 Oct 2023
35 points (80.7% liked)

Digital Art

7052 readers

Community rules:

How to post:

Please follow the convention of the images already uploaded so far i.e.:

Image title by Artist's Name

In the description, link to the source of the image, and also include a direct link to the artist's gallery. See previous posts for examples.

What to post:

You can post your own work here, but avoid spamming.

You can post your favourite pieces here for us all to enjoy.

--

All artworks are copyright of the artists named in the posts.

Artists' gallery links may contain NSFW works.

--

founded 1 year ago

Not a member, but thought I'd share this for the artists here in case they haven't seen the news.

top 9 comments
[–] [email protected] 11 points 1 year ago* (last edited 1 year ago) (2 children)

This is only prolonging the inevitable. Kinda like DRM in video games, it's going to do literally nothing to the people who want the data, except maybe being a minor inconvenience for a month or two.

Wasn't the last attempt at this defeated by only 16 lines of Python code?

[–] [email protected] 6 points 1 year ago

It's not delaying anything. It won't work outside of the paper.

If you draw fantasy cats and bias towards pointillism dogs, while someone else draws cubist cats and biases towards anime dogs, you dilute the effect of the biasing data as multiple axes are flipped.

And this assumes that all artists drawing cats agree on biasing towards dogs and not that some cat artists bias towards horses and others towards cows, which again dilutes any signal to just be noise.
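The dilution argument above can be sketched numerically. This is my own toy illustration (not from the paper or the commenter): each "artist" perturbs their cat images toward a target concept, represented here as a direction in a made-up 2-D feature space. If every artist picks the same target, the average perturbation survives aggregation; if they pick different targets, it cancels toward zero, i.e. noise.

```python
import random

random.seed(0)

# Hypothetical target concepts as directions in a toy 2-D feature space.
targets = {"dog": (1.0, 0.0), "horse": (-1.0, 0.0),
           "cow": (0.0, 1.0), "anime dog": (0.0, -1.0)}

def mean_shift(target_names, n_images=1000):
    """Average perturbation across an aggregated poisoned dataset,
    with each image biased toward a randomly chosen target."""
    xs = ys = 0.0
    for _ in range(n_images):
        tx, ty = targets[random.choice(target_names)]
        xs += tx
        ys += ty
    return (xs / n_images, ys / n_images)

agree = mean_shift(["dog"])        # every cat artist biases toward dogs
split = mean_shift(list(targets))  # artists pick conflicting targets

print(agree)  # (1.0, 0.0): the poison signal survives aggregation
print(split)  # close to (0.0, 0.0): conflicting signals cancel out
```

The actual attack operates in a model's feature space rather than two dimensions, but the averaging effect is the same: conflicting biases shrink the aggregate signal.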

It had a measurable effect in what were effectively artificial lab conditions, which gave the authors a clickbaity pitch for the paper, but in the real world this is completely worthless right out of the gate.

[–] [email protected] 1 points 1 year ago

No idea, but it's worth a shot, worst it can do is nothing.

[–] [email protected] 3 points 1 year ago (3 children)

If this takes off, it will be illegal within a year.
There'll be a new Digital Sabotage Act, written "with input from industry leaders" and voted on without debate or enough time to read the bill.

[–] [email protected] 6 points 1 year ago

If this takes off, it will be bypassed within a month. Adversarial training is something Stable Diffusion users already invented, and we use it to make our artwork better by poisoning the dataset to teach the network what a wrong result looks like. They reinvented our wheel.

[–] [email protected] 4 points 1 year ago

It's fine; outside of laboratory conditions it won't work anyway (diverse image "reverse labels" would erase the signal-to-noise ratio of the biased pixels across aggregate real-world training data), so there's no need to stress about any kind of reaction to it.

[–] [email protected] 2 points 1 year ago

Wouldn't surprise me

[–] [email protected] 1 points 1 year ago (1 children)

I've heard of this recently! I like how they use the term "poison" because it makes me imagine that AI non-art is a bunch of evil aristocrats and Nightshade is the cyanide we're slipping into their beverages.

[–] [email protected] 2 points 1 year ago

Chuckles, "I like that, but I suspect they're calling it 'poison' because of the phrase 'poisoning the well'."