this post was submitted on 27 Jan 2024
31 points (100.0% liked)

NotAwfulTech

364 readers

a community for posting cool tech news you don’t want to sneer at

non-awfulness of tech is not required or else we wouldn’t have any posts

founded 1 year ago

Remember how we were told that genAI learns "just like humans", and how the law supposedly can't say anything about fair use, and I guess now all art is owned by big tech companies?

Well, of course it's not true. Exploiting a few of the ways in which genAI *is not* like human learners, artists can filter their digital art in such a way that if a genAI tool consumes it, it actively reduces the quality of the model, undoing generalization and bleeding into neighboring concepts.

Can an AI tool be used to undo this obfuscation? Yes. At scale, however, doing so requires ever-increasing compute costs. This also looks like an improvable method, not a dead end -- adversarial input design is a growing field of machine learning, with more and more techniques becoming widely available. Imagine this as a sort of "cryptography for semantics", in the sense that it imposes asymmetrical work on AI consumers (while leaving the human eye much less affected).
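To make the adversarial-input idea concrete, here is a minimal sketch of the general trick these tools build on: nudge the input in the direction that increases a model's loss, while keeping the change small enough to be hard to notice. This is a toy FGSM-style (fast gradient sign method) perturbation against a made-up logistic-regression "model" -- it is *not* Nightshade's or Glaze's actual algorithm, and every weight, shape, and parameter here is illustrative:

```python
import numpy as np

# Toy "model": logistic regression scoring 8x8 grayscale patches.
# A minimal FGSM-style sketch, NOT Nightshade's actual method.
rng = np.random.default_rng(0)
w = rng.normal(size=64)          # hypothetical model weights
x = rng.uniform(0, 1, size=64)   # a "clean" image patch, flattened
y = 1.0                          # its true label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(img):
    # cross-entropy loss of the toy model on (img, y)
    p = sigmoid(w @ img)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

# Gradient of the loss w.r.t. the *input*; for logistic regression
# this has the closed form (p - y) * w.
grad = (sigmoid(w @ x) - y) * w

# FGSM step: move each pixel by at most eps in the direction that
# increases the loss; small eps keeps the change visually subtle.
eps = 0.05
x_adv = np.clip(x + eps * np.sign(grad), 0, 1)

print(loss(x_adv) > loss(x))  # the perturbed patch is worse for the model
```

The asymmetry the post describes shows up here: producing `x_adv` costs one gradient step, but a scraper trying to clean poisoned inputs has to detect and undo perturbations it can't easily distinguish from ordinary image content.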

Now we just need labor laws to catch up.

Wouldn't it be funny if not only does generative AI not lead to a boring dystopia, but the proliferation and expansion of this and similar techniques to protect human meaning eventually put a lot of grifters out of business?

We must have faith in the dark times. Share this with your artist friends far and wide!

[–] [email protected] 14 points 10 months ago (1 children)

I am buying it. I don't think @[email protected] is pro AI art, just that countermeasures like Glaze and Nightshade are not great either, and I agree.

Artists don’t care about this or need it to be.

I care about it. Some artists use Debian. Please don't shit on people who care about software freedom, even if you don't.

Making artwork unusable by exploitative machine learning models is cool and based, but using a proprietary tool that's itself made from the same pool of exploited artists' work is less so.

[–] [email protected] 11 points 10 months ago* (last edited 10 months ago)

yeah, seconding. nothing I've seen of @corbin's posting, here or otherwise, leads me to think that they're in favour of exploitation or the numerous other issues involved in this shit