365
Rule elitism (lemmy.world)
submitted 2 months ago by [email protected] to c/[email protected]
[-] [email protected] 53 points 2 months ago

The only reason you care is because you've been conditioned to attack anything that could harm your income-potential.

Instead of fighting "AI", how about we fight for a world where artists don't have to monetarily justify their existence?

[-] [email protected] 69 points 2 months ago* (last edited 2 months ago)

Or maybe artists should be able to not justify their existence monetarily and also not have their art fucking stolen and murdered to generate terrible pseudo art lmao.

[-] [email protected] 32 points 2 months ago

This. Art is expression. Wtf is AI art expressing?

[-] [email protected] 15 points 2 months ago

Whatever the artist using the AI tool is trying to express?

[-] [email protected] 24 points 2 months ago

An AI doesn't understand what human emotion is.

[-] [email protected] 18 points 2 months ago
[-] [email protected] 19 points 2 months ago

This is a false equivalency. Unless the paintbrush is stolen I guess. 🙄

[-] [email protected] 6 points 2 months ago

That's unprovable without some very strict definitions, but if we take it as a given (and for the record, I don't disagree, so we should), then that's why the AI isn't the artist. It's just a tool an artist can use. MS Paint isn't an artist either, and, like AI, neither are many of the people using it, but it can still be used to create art.

[-] [email protected] 5 points 2 months ago

Meh, the better approach is to assume it doesn't understand emotion unless proven otherwise. Does a fork understand what human emotion is? A pillow? You wouldn't assume that either, I'd guess.

[-] [email protected] 3 points 2 months ago

So which of us are p-zombies? We run into the same problem when we suggest that human beings have consciousness or self-awareness, or grasp what qualia are: we can't prove that anyone has any of these things. Within the AI development community, the question of AI consciousness is a sorites paradox. Large AI packages like GPT-4 have more awareness than previous versions, but not as much awareness as humans. Yet GPT-4 does exceed human control subjects in the Turing test.

Mind you, the Turing test is only one of several tests we use to rate how advanced an AI is, and even when an AI can make coffee given a machine and supplies, or assemble flat-packed furniture from the IKEA visual instructions, we can't be sure that counts as AGI, or that it's sentient.

Right now, there are artists who use generative AI to create art, and it is as much real art as photography was real art back when illustrators complained that photographers were just using a machine to replicate a real scene. As much as music production and music synthesis are art.

Now yes, I get that AI presents the risk of workers losing income and their capacity to survive, but every time we toss our sabots into the gearworks to break the machines, we kick the overthrow of the system down the line, until we end up where we are today: looking not only at the dissolution of our democracy so that industrialists may continue to exist, but also at the destruction of our habitat, because we can't address what makes them money.

So capitalism is going to end you either way, unless you end it first. And I expect that if you actually tried to make a fortune on your art, you would eventually find yourself selling all your rights to one of the big corporate controllers, and they would own everything you did and pay you a pittance for it... unless you are James Hetfield levels of skilled and lucky. Somehow I doubt you are.

[-] [email protected] 2 points 2 months ago

What about a cat, or a person who's different from you? It's just as impossible to prove, and yet...

[-] [email protected] 3 points 2 months ago

Welcome to radical constructivism :) Whether other people or cats can experience emotions is in fact a problem people have thought about quite a lot. The answers are not very satisfying, but one way to think about it (e.g., as some constructivists would) is that assuming they do have consciousness simplifies your world model. In the case of "AI", though, we have good alternative explanations for its behavior and don't need to assume it can experience anything.

The other important bit is that not assuming a phenomenon exists (e.g., that "AI" can experience emotions) unless proven otherwise is the basis of modern (positivist) science.

[-] [email protected] 1 points 2 months ago

assuming they do have consciousness simplifies your world model.

Does it? It feels more like it merely excludes them from your model, since your model cannot explain their consciousness. If that simplifies your model, then you can apply the same thinking to anything you don't understand, by simply saying it is similar to something else you also can't explain.

The other important bit is that not assuming some phenomenon exists (e.g., "AI" can experience emotions) unless proven otherwise

The problem with this isn't that it's literally unprovable; it's that proving it requires defining "can experience emotions" in a way everyone can agree on. Most trivial definitions that include everything we think obviously ought to be included also bring in many things we think ought to be excluded, and most complicated definitions that prune out the things we think ought to be excluded also cut out things we think should be included.

[-] [email protected] 3 points 2 months ago

Does it?

Yes, in the sense that "thing moves around and does stuff" becomes more predictable if you assume a certain degree of consciousness. This is easier than "thing is at this position now, was at a different position before, and was at yet another position before that". You reduce some of the complexity and unpredictability by introducing an explanation for these changes of world state. In my world, at least, I've done well so far with the assumption that other humans and animals have some consciousness, and I'm not aware of any striking evidence that would cast doubt on that.

The problem with this isn’t that it’s literally unprovable

Yes, that's a problem, but it's closely related to the other one. It's actually quite hard to "prove" anything with a real-world connection. However, in the case of the consciousness of other humans and animals, the evidence suggests it (at least to me). The evidence in the case of "AI" is different, though. For example, they seem to have no awareness of time and no awareness of the world beyond the limited context of a conversation. Beyond a fancy marketing term that suggests something similar to living beings is involved, what we currently see are admittedly impressive programs that run on statistics, and I don't need to assume any "consciousness" to explain what they do.

[-] [email protected] 1 points 2 months ago

You reduce some of the complexity and unpredictability by introducing an explanation for these changes of world state

My concern is that "consciousness" isn't so much an explanation as a sort of heuristic. We feel conscious and have an internal experience, so it seems pretty reasonable to say that such a thing exists, but outside one's own self there's no point where it is certain to exist, and no clear criteria or mechanism we can point to.

What about the p-zombie, the human who has no internal experience, just a set of rules, but acts like every other human? What about a cat, which apparently has a less complex internal experience but seems to act as we'd expect if it had something like one? What about a tick, or a louse? A water bear? A tree? A paramecium? A bacterium? A computer program?

There's a continuum one could construct that includes all those things, ranks them by how similar their behaviors are to ours, and calls the things close to us conscious and the things farther away not, but the line is always going to be fuzzy. There's no categorical difference that separates one end of the spectrum from the other; it's just about picking where to put the line.

And yes, we have perhaps a better understanding of the mechanism by which an AI gets from input to output than we do for a human, but it's not a complete one. The mechanism by which humans get from an input to an output is similarly only partially understood. We can see how the arrangement and function of nerve cells in a brain lead to the behaviors we observe, and we have even fully simulated the brains of some organisms in machine code. That's not so dissimilar from how a computational neural network operates. The categorical difference of "well, one is a computer" doesn't work when we have literally simulated an organic brain on a computer too.

[-] [email protected] 3 points 2 months ago

If I ask a painter to paint a landscape, who's making art, me or the painter?

Is the painter just a tool?

[-] [email protected] 3 points 2 months ago

You can't really have it both ways.

Is the thing just a machine that follows instructions and synthesizes its training data into different outputs? Then it's a tool.

Is the thing making choices and interpreting your inputs to produce a result? Then it's an artist.

The painter I buy a commission from is an artist. The AI I use to generate a scene is a tool.

[-] [email protected] 2 points 2 months ago

Was the "training data" produced by artists or tools?

[-] [email protected] 2 points 2 months ago

I mean, yes?

That's very pithy, but the material used as training data was mostly produced by artists trying to create art with tools (AI and otherwise), along with more mundane data designed and produced by humans with no AI tools, and some produced by humans using almost exclusively AI tools.

[-] [email protected] 2 points 2 months ago

You probably live in a different world than I do.

Don't chicken-and-egg this. All of the training data was man-made at some point, until the first LLMs started producing output based on it.

Secondly, the amounts of human-produced and LLM-produced content in the training data are incomparable, and will remain so. Otherwise the models break.

[-] [email protected] 4 points 2 months ago

Art is aesthetic. It can express something but it doesn't have to.

[-] [email protected] 2 points 2 months ago

Not necessarily. Not all art is visual, not all art is made to evoke pleasing emotions, and not all of it is made to be beautiful.

[-] [email protected] 3 points 2 months ago

It doesn't have to be a pleasant aesthetic or visual. It could be anything from a full image, to a font used on a resume, to the choice of words in general, to the way an email address sounds if you pronounce it out loud. It can be the sequence of smells, sounds, sights, tastes, and textures of a single dish or a five-course meal.

It can be puppets designed to last generations or an explosion that exists for a brief moment.

It can even be the cleverness of how a message is woven into an otherwise meaningless looking scene.

[-] [email protected] 2 points 2 months ago

What is it that humans are expressing when they are arting?

[-] [email protected] 4 points 2 months ago

Terrible pseudo art is what you get from Hollywood and the big music studios right now, for the most part.

[-] [email protected] 2 points 2 months ago

Nice of you to think AI won't just lead to even more formulaic stuff, only from different megacorporations.

[-] [email protected] 3 points 2 months ago

All those poor OCs, dead forever! No human can ever draw them again now that AI ate them!

[-] [email protected] 23 points 2 months ago

The only reason you care is because you’ve been conditioned to attack anything that could harm your income-potential.

My ‘favorite’ is the argument that replacing jobs is what technology is meant to do.

This isn’t just a job. If I won the lotto tomorrow, if I had billions and billions of dollars and never had to make another cent in my life, I would still be writing. Art is not just a production, it is a form of communication, between artist and audience, even if you never see them.

Writing has always been something like tossing a message in a bottle into a sea of bottles and hoping someone reads it. Even if the argument that AI can never replace human writing in terms of quality is true, we're still drowned out by the noise of it.

It really revs up the ol’ doomer instinct in me.

[-] [email protected] 7 points 2 months ago

Even if the argument that AI can never replace human writing in terms of quality is true, we're still drowned out by the noise of it.

It's a good point but honestly the internet is mostly just noise and it's not a problem we're going to solve. It's something we have to learn to live with. If you take more than a passing interest in an art, you should be able to find an island in the ocean of noise with like-minded people.

[-] [email protected] 10 points 2 months ago

That world can’t exist if all the artists get starved out of existence before it comes about.

[-] [email protected] 5 points 2 months ago
[-] [email protected] 8 points 2 months ago

Okay, go ahead.

[-] [email protected] 10 points 2 months ago

The issue is that the goal of AI companies and AI believers is the replacement of artists. They see more traditional forms of art as obsolete. Especially the AI believers, who do everything they can (including trying to fake artistic progress, and entering art contests with AI-generated images) to "disrupt the art community".

[-] [email protected] 6 points 2 months ago

The only way this could work is communism, but I don't think the tech bros will call for that any time soon.

this post was submitted on 13 Jul 2024
365 points (100.0% liked)
