this post was submitted on 28 Jan 2024
380 points (95.2% liked)

GenAI tools ‘could not exist’ if firms are made to pay copyright

[–] [email protected] 20 points 9 months ago* (last edited 9 months ago) (11 children)

So... This may be an unpopular question. Almost every time AI is discussed, a staggering number of posts support very right-wing positions, e.g. on topics like this one: unearned money for capital owners. It's all Ayn Rand and not Karl Marx. Posters seem to be unaware of that, though.

Is that the "neoliberal Zeitgeist", or whatever you may call it?

I'm worried about what this may mean for the future.

ETA: 7 downvotes after 1 hour with 0 explanation. About what I expected.

[–] [email protected] 9 points 9 months ago (1 children)

I think it's a conflation of what copyright should be with what it actually is. I don't tend to see many people who believe copyright should be abolished in its entirety; most accept that if people write a book or a song, they should have some kind of control over that work. But there's a lot of contention over the fact that copyright as it exists now is a bit of a farce, constantly traded and sold and lasting an aeon after the person who created the original work dies.

It seems fairly morally consistent to think that something old and part of the zeitgeist should not be under copyright, but that the system needs an overhaul when companies are using your LiveJournal to make a robot call center.

[–] [email protected] 3 points 9 months ago (3 children)

Lemmy seems left-wing on economics in other threads. But on AI, it's private property all the way, without regard for the consequences for society. The view of intellectual property is that of Ayn Rand. Economically, it doesn't get further to the right than that.

My interpretation is that people go by gut feeling and never think of the consequences. The question is, why does their gut give them a far-right answer? One answer is that somehow our culture, at present, fosters such reactions; that it is the zeitgeist. If that's true (and this reflects a wider trend), then inequality will continue to increase as a result of voters' demands.

[–] [email protected] 5 points 9 months ago (1 children)

My interpretation is that people go by gut feeling and never think of the consequences.

Often, yes.

The question is, why does their gut give them a far-right answer?

The political right exploits fear, and the fear of AI hits close to home. Many people either have been impacted, could be impacted, or know someone who could be impacted, whether by AI itself or by something enabled by, or blamed on, AI.

When you’re afraid and/or operating from a vulnerable position, it’s a lot easier to jump on the anti-AI bandwagon. This is especially true when the counter-arguments address their flawed reasoning rather than the actual problems. They need something to fix the problem, not a sound argument about why a particular attempt to do so is flawed. And when this problem is staring you in the face, the implications of what it would otherwise mean just aren’t that important to you.

People are losing income because of AI and our society does not have enough safety nets in place to make that less terrifying. If you swap “AI” for “off-shore outsourcing” it’s the same thing.

The people arguing in favor of AI don’t have good answers for them about what needs to happen to “fix the problem.” The people arguing against AI don’t need to have sound arguments to appeal to these folks since their arguments sound like they could “fix the problem.” “If they win this lawsuit against OpenAI, ChatGPT and all the other LLMs will be shut down and companies will have to hire real people again. Anthropic even said so, see!”

UBI would solve a lot of the problems, but it doesn’t have the political support of our elected officials in either party and the amount of effort to completely upend the makeup of Congress is so high that it’s obviously not a solution in the short term.

Unions are a better short-term option, but that’s still not enough.

One feasible solution would be legislation restricting or taxing the use of AI by corporations, particularly when that use results in the displacement of human laborers. If those taxes were then used to support those same displaced laborers, then that would both encourage corporations to hire real people and lessen the sting of getting laid off.

I think another big part of this is that there’s a certain amount of feeling helpless to do anything about the situation. If you can root for the folks with the lawsuit, then that’s at least something. And it’s empowering to see that people like you - other writers, artists, etc. - are the ones spearheading this, as opposed to legislators.

But yes, the more that people’s fear is exploited and the more that they’re misdirected when it comes to having an actual solution, the worse things will get.

[–] [email protected] 1 points 9 months ago (1 children)

The fear angle makes a lot of sense, but I wonder how many people are really so immediately threatened that it would cloud their judgment.

[–] [email protected] 1 points 9 months ago

Well, when you consider that more than 60% of Americans are living paycheck to paycheck - I’d say a lot of them.

[–] [email protected] 3 points 9 months ago (1 children)

Yeah, I think this is showing that a lot of people only really care about espousing anti-privatization ideas as long as it suits their personal interests and as long as they feel they have more to gain than to lose. People are selfish, and a lot of progressive rhetoric, or really any kind of passionate rhetoric, is often conveniently self-serving and emotionally driven rather than truly principled.

[–] [email protected] 1 points 9 months ago

You're not wrong, but how many people here are actually pursuing their own personal interest? Most people here are probably wage-earners. Yet so many support giving more money to property owners without any kind of requirement or incentive for work. Just a rent for property owners. It feels like this should be met with knee-jerk rejection.

[–] [email protected] 1 points 9 months ago (1 children)

I'm not sure what you're referring to as a far-right position?

  • AI corporations should have the right to all works in order to train their AIs.
  • Copyright needs to be enforced.

The first is very pro-corporation in one way, but can lead to an argument that all intellectual works should be public domain.

The second is pro-mega-rights-owners, but also allows someone to write a story, publish it themselves, and make money without having it stolen from them.

[–] [email protected] 1 points 9 months ago

Fair use has always been a thing in the US.

The US Constitution allows Congress to limit the freedom of the press with these words: "To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries."

This has no room for "fuck you, I got mine." Framing the abolition of fair use as enforcing copyright is an absolute lie.

The view of copyright as some sort of absolute property right that can be exercised against the public is a far right position. (I'd argue that's true for all property rights but that's a different subject.) What makes it far right is that it implies unfettered, heritable power for a small elite. Saying that everyone has an equal right to property, as such, is so inane that it is worthy only of ridicule. The law, in its majestic equality, forbids rich and poor alike to sleep under bridges, to beg in the streets, and to steal their bread.

The NYT is suing for money. It owns the copyright to all those articles published in the last century; all already paid for. Every cent of licensing fees is pure profit for the owners; beautiful shareholder value. Benefit to society? Zero. But you have to enforce copyright. It's property! You wouldn't want some corporation to steal the cardboard boxes of the homeless.

[–] [email protected] 6 points 9 months ago (2 children)

I see way too many people advocating for copyright. I understand in this case it benefits big companies rather than consumers, but if you disagree with copyright, as I do, you should be consistent.

[–] [email protected] 7 points 9 months ago (3 children)

Copyright law should benefit humans, not machines, not corporations. And no, corporations are not people. Anthony Kennedy can get bent.

[–] [email protected] 1 points 9 months ago

I hate the MAGMA companies as much as anyone, but AI such as LLMs, especially the open-source stuff like Facebook's models and Stable Diffusion, is beneficial to us all.

[–] [email protected] 0 points 9 months ago

Abolishing copyright in the way that allows for the existence of Gen AI benefits people far more than it does corpos

[–] [email protected] 5 points 9 months ago (1 children)

You don't have to be against copyright, as such. Fair Use is part of copyright law. It exists to prevent copyrights from being abused against the interests of the general public.

[–] [email protected] 1 points 9 months ago (1 children)

But I am against any copyright beyond forcing attribution to the original creator.

[–] [email protected] 4 points 9 months ago (3 children)
[–] [email protected] 2 points 9 months ago (1 children)

Here's your works cited for any generative AI:

Humanity. “The Entire Publicly Accessible Internet.” The World Wide Web, 1 Jan. 1983, WWW.org.

[–] [email protected] 2 points 9 months ago

I doubt that covers it.

[–] [email protected] 2 points 9 months ago

AI creators, at least the open source ones, are usually pretty open about where they got the training data for their model

[–] [email protected] 0 points 9 months ago

At the very least, every AI should be able to spit out a comprehensive list of all the material it used for training. And it should be capable of removing any specific item and retraining its model without it.

This is a fundamental requirement for the technology itself to function. What happens if one of the training materials has a retraction? Or if the authors admit they used AI to generate it? You need to purge that knowledge to keep the AI healthy and accurate.
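
To illustrate what that bookkeeping could look like, here is a minimal sketch in Python, assuming a hypothetical per-item training manifest (the `TrainingItem` fields and `retract` step are made up for illustration, not how any existing model actually tracks its data):

```python
# Rough sketch only: the manifest format, field names, and workflow below are
# hypothetical, not taken from any real AI system.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class TrainingItem:
    item_id: str     # stable identifier for the work
    source_url: str  # where it was obtained
    license: str     # license / consent status recorded at ingestion


class TrainingManifest:
    """Per-item provenance record kept alongside the model."""

    def __init__(self) -> None:
        self._items: Dict[str, TrainingItem] = {}

    def add(self, item: TrainingItem) -> None:
        self._items[item.item_id] = item

    def works_cited(self) -> List[TrainingItem]:
        # The "comprehensive list" of everything the model was trained on.
        return list(self._items.values())

    def retract(self, item_id: str) -> None:
        # Drop a retracted or disputed item; the model then has to be
        # retrained (or "unlearned") on what remains.
        self._items.pop(item_id, None)


manifest = TrainingManifest()
manifest.add(TrainingItem("paper-123", "https://example.org/paper-123", "CC-BY"))
manifest.add(TrainingItem("blog-456", "https://example.org/blog-456", "unknown"))
manifest.retract("paper-123")       # e.g. the paper was later retracted
remaining = manifest.works_cited()  # retrain only on what is left
```

Removing an item cleanly from an already-trained model ("machine unlearning") is still an open research problem, so in practice purging that knowledge today mostly means retraining from the updated manifest.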

[–] [email protected] 6 points 9 months ago* (last edited 9 months ago) (1 children)

It's interesting: it's many of the same MPAA/RIAA attitudes toward Napster/BitTorrent, but now directed at gen AI.

I think it reflects the generational shift in who considers themselves content creators. Tech allowed for the long tail to become profitable content producers, so now there's a large public audience that sees this from what's historically been a corporate perspective.

Of course, they are making the same mistakes because they don't know their own history and thus are doomed to repeat it.

They are largely unaware that the MPAA/RIAA's fight against online sharing of media meant they ceded the inevitable tech to other companies, like Apple and Netflix, that developed platforms navigating the legality alongside the tech.

So, for example, voice actors are right now largely opposing gen AI rather than realizing they should probably have their union develop, or partner on, their own union-owned offering that maximizes member revenues from usage and can dictate fair terms.

In fact, the only way many of today's mass content creators have platforms to create content is because the corporate fights to hold onto IP status quo failed with platforms like YouTube, etc.

Gen AI should exist in a social construct such that it is limited in being able to produce copyrighted content. But policing training/education of anything (human or otherwise) doesn't serve us and will hold back developments that are going to have much more public good than most people seem to realize.

Also, it's unfortunate that we've effectively self-propagandized for nearly a century around 'AI' being the bad guy: at odds with humanity, misaligned with our interests, an existential threat, etc. There's such an incredible priming bias right now that it's effectively become the Boogeyman, rather than correctly being identified as a tool that - like every other tool in human history - can be used for good or bad depending on the wielder (though unlike past tools, this one may actually have a slight inherent and unavoidable bias towards good, as Musk and Gab recently found out when their AI efforts denounced their creators' personally held beliefs on release).

[–] [email protected] 3 points 9 months ago

That was a novel perspective for me. Thanks.

[–] [email protected] 4 points 9 months ago (1 children)

I'd say the main reason is that companies are profiting off the work of others. It's not some grand positive motive for society, but taking the work of others: from other companies, sure, but also from small-time artists, writers, etc.

Then selling access to the information they took from others.

I wouldn't call it a right wing position.

[–] [email protected] 2 points 9 months ago (1 children)

Wanting to abolish the IRS is a right-wing policy that will benefit the rich. That doesn't change when some marketing genius talks about how the IRS takes money from small time artists, writers, etc. Same thing. It's about substance and not manipulative framing.

[–] [email protected] 1 points 9 months ago (1 children)

That isn't remotely similar...

The IRS takes a portion of income. This is taking away someone's income, then charging for access to it.

Like it or not, these people need money to survive. Calling it right wing to think these individuals deserve to be paid for someone taking their work, then using it for a product they sell access to, is absolutely insane to me.

[–] [email protected] 0 points 9 months ago (1 children)

I don't know how this is supposed to make sense.

[–] [email protected] 1 points 9 months ago (1 children)

One is a percentage of income that everyone pays into.

The other is stealing someone's work then using that person's work for profit.

Recognizing that this is stealing someone's work is not a right-wing position.

How is this complicated?

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (1 children)

I see. Thanks for explaining.

This view of property rights as absolute is what right-libertarians, anarcho-capitalists, etc. espouse. Usually the cries of "theft" come when it gets to taxes, though. Is it supposed to not be right-wing because it's about intellectual property?

Property rights are not necessarily right-wing (communism notwithstanding). What is definitely right-wing is (heritable) privilege and that's implied in these views of property.

ETA: Just to make sure that I really understand what you are saying: When you say "stealing someone's work" you do mean the unauthorized copying of copyrighted expression, yes? Do you actually understand that copyright is intellectual property and that property is not usually called work? Labor and capital are traditionally considered opposites, of a sort, particularly among the left.

[–] [email protected] 1 points 9 months ago (1 children)

So... You think their art or writing was created by what then? Magic? Do you think no time was expended in the creation of books, research, drawings, painted canvases, etc?

Do you think they should starve because we currently live in a world driven entirely around money?

I don't get your point even remotely.

[–] [email protected] 1 points 9 months ago (1 children)

I am just pointing out the meaning of words; originally just left vs right-wing.

Labor is not capital. The factories owned by Tesla were built by workers, just like the robots in them. Time was expended on their design. And yet all of that is still property. When some worker in such a factory takes a wrench home for personal use, they are not stealing the work of Elon Musk or the other shareholders.


To make a point about policy: None of the owners of the NYT, or Getty, or others like them will starve because of fair use. They are rich people, they will stay rich, and I see no reason to give them more money simply because they own a lot of intellectual property. Anyone at actual risk of starving will only be hurt by sending more of the national income to the top.

US copyright exists "To promote the Progress of Science and useful Arts". The idea is that this can be achieved by introducing a profit motive. Requiring license fees for existing, publicly accessible works can't conceivably serve this purpose. It seems obvious that it will only hurt the purpose.

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (1 children)

You realize it wasn't just massive corporations that had their works taken, right? And that is a key reason why so many people are concerned about an AI/LLM company taking the work of others, and using it for their own profit?

Thinking this is limited to NYT or Getty is ignorant at best.

Not to mention the migration of that goal post.

[–] [email protected] 1 points 9 months ago

I am aware what the datasets contain.

I don't see non-profit AI drawing less rage, so I don't believe that the concern is about AI being used for profit. Maybe, when you say "for their own profit" this is another special expression like "taken" (by which, I believe, you mean copying without authorization)?

I don't really know why so many people are coming through for the rich. I am not an eat-the-rich kinda guy, but giving them money for nothing is just absolutely bonkers to me. What I know is that a lot of people were simply hoodwinked. I strongly suspect that others feel they have to support it because of some ideological conviction. But ultimately, I simply don't know what's going on there. It's why I originally posed the question.

IDK what you mean by "migration of that goal post".

[–] [email protected] 3 points 9 months ago

Yeah... it's pretty weird. Feels like some folks have really dived into LLMs regardless of ethics and will do any amount of hand-waving to avoid criticism of a for-profit company openly attacking creatives' livelihood with their own uncompensated works. In an ideal world where it wasn't a case of "earn or die cold and alone in the streets", sure, but this is just robbing those workers of the fruits of their labor and burning the ladder while they climb it.

I think the "neoliberal zeitgeist" thought may be correct, as neoliberal ideology devalues anything and everything that is not solely profit-driven, including just about everything that humans have historically found to make life meaningful.

[–] [email protected] 2 points 9 months ago

As an aside, when I browse TheGatewayPundit comments on AI articles, they are a lot more open, more against legislation, and more woke than I would expect!

[–] [email protected] 1 points 9 months ago

Every single poster here has relied on disruptive technologies in their life. They don't even realize that they couldn't make these arguments here if it were not for people before them pushing the envelope.

They don't know the history of their technology or of corporate law. If they did, they would just roll their eyes every time an entrenched economic interest started saber-rattling about the next disruptive technology that is going to steal its profits.

The posters here are the people who complained about horsewhip manufacturers going out of business because of cars. They are ignorant and act like the few sound bites they heard make them experts.

[–] [email protected] 0 points 9 months ago (1 children)

It's important to recognize that IP is conceptually fucky to begin with. They're seeing what it's claimed to be (creator 'ownership' of their creations) rather than what it really is (corporations using the government to enact violence on non-violent people).

It means nothing interesting. The position they feel they're taking is "corporation bad", which is in line with that; they just haven't analyzed how IP works in the real world.

[–] [email protected] 2 points 9 months ago (3 children)

So because corps abuse copyright, that means I should be fine with AI companies taking whatever I write--all the journal entries, short stories, blog posts, tweets, comments, etc.--and putting it through their model without being asked, and with no ability to opt out? My artist friends should be fine with their art galleries being used to train the AI models that are actively being used to deprive them of their livelihood without any ability to say "I don't want the fruits of my labor to be used in this way?"

[–] [email protected] 1 points 9 months ago (1 children)

This is the problem people have

They don’t see artists and creators as worth protecting. They’d rather screw over every small creator and take away control of their works, just because “it’d be hard to train without copyrighted data”

Plenty of creators would opt in if given the option, but I’m going to guess a large portion will not.

I don’t want my works training what will replace me, and right now copyright is the only way we can defend what was made.

[–] [email protected] 2 points 9 months ago

It's like nobody here actually knows someone who is creative, or has bothered making anything creative themselves

I don't even have a financial interest in it because there's no way my job could be automated, and I don't have any chance of making any kind of money off my trash. I still wouldn't let LLMs train with my work, and I have a feeling that the vast majority of people would do the same

[–] [email protected] 0 points 9 months ago

The concept of copyright is insane to begin with. Corps don't make it bad - it starts out bad.

It's an invented right.

[–] [email protected] 0 points 9 months ago (1 children)

I don't know if your fears about your friends' livelihood are justified, but cutting down on fair use will not help at all. In fact, it would make their situation worse. Think through what would actually happen.

When you publish something you have to accept that people will make it their own to some degree. Think parody or R34. It may be hurtful, but the alternative is so much worse.

[–] [email protected] 1 points 9 months ago (1 children)

Huh? How does that follow at all? Judging that the specific use of training LLMs--which absolutely flunks the "amount and substantiality of the portion taken" (since it's taking the whole damn work) and "the effect on the market" (fucking DUH) tests--isn't fair use in no way impacts parody or R34. It's the same kind of logic the GOP uses when they say "if the IRS cracks down on billionaires evading taxes then Blue Collar Joe is going to get audited!"

Fuck outta here with that insane clown logic.

[–] [email protected] 0 points 9 months ago (1 children)

I think you would find it easier to help your friends if you approached the matter with reason rather than emotion. Your take on fair use is missing a lot, but that's beside the point.

Assume you get what you ~~wanted~~ are asking for. What then?

[–] [email protected] 1 points 9 months ago (4 children)

Yeah, no, stop with the goddamn tone policing. I have zero interest in vagueposting and high-horse riding.

As for what I want, I want generative AI banned entirely, or at minimum restricted to training on works that are either in the public domain, or that the person creating the training model received explicit, opt-in consent to use. This is the supposed gold standard everyone demands when it comes to the widescale collection and processing of personal data that they generate just through their normal, everyday activities, why should it be different for the widescale collection and processing of the stuff we actually put our effort into creating?
