this post was submitted on 05 Nov 2023
218 points (94.3% liked)

AI companies have all kinds of arguments against paying for copyrighted content::The companies building generative AI tools like ChatGPT say updated copyright laws could interfere with their ability to train capable AI models. Here are comments from OpenAI, StabilityAI, Meta, Google, Microsoft and more.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

Who said anything about "do whatever they want"? They should obviously comply with the law.

When a human reads a comment here on Lemmy and learns something they didn't know before - copyright law doesn't stop them from using that knowledge. The same rule should apply to AI.

In my opinion if you don't want AI to learn from your work, then you shouldn't allow humans to learn from it either. That's fine - everyone has the right to keep their work private if they choose to do so... but if you make it publicly available, then you don't get to control who learns from it.

You can control who makes exact replicas of it, and if an AI is doing that then sure - charge the company with copyright infringement. But generally that's not how these systems work. They generally don't produce exact copies, except for highly structured content where there isn't much creative flexibility (and that kind of content tends not to be protected by copyright anyway; it would be covered by patents).

[–] [email protected] 4 points 1 year ago (1 children)

Computers aren't people. AI "learning" is a metaphorical usage of that word. Human learning is a complex mystery we've barely begun to understand, whereas we know exactly what these computer systems are doing; though we use the word "learning" for both, it is a fundamentally different process. Conflating the two is fine for normal conversation, but for technical questions like this, it's silly.

It's perfectly consistent to decide that computers "learning" breaks the rules but human learning doesn't, because they're different things. Computer "learning" is a new thing, and it's a lot more like creating replicas than human learning is. I think we should treat it as such.

[–] [email protected] 2 points 1 year ago (1 children)

I'm so fed up trying to explain this to people. People think LLMs are real AGI and are treating them as such.

Computers do not learn like humans. They cannot, and should not, be regulated in the same way.

[–] [email protected] 2 points 1 year ago

Yes, 100%. Once you drop the false equivalence, the argument boils down to "X does Y, therefore Z should be able to do Y" - which is obviously not true, because sometimes we need different rules for different things.