I’m very interested in this case and am curious to see where the courts draw the line here.
Beware of an incoming hot take - I don’t see the concept of training AI on published works as much different than a human learning from published works as long as they both go on to make their own original works. I have definitely seen AIs straight-up plagiarize before, but that seems like a different issue entirely from producing similar works. I think allowing plagiarism is a problem with the constraints of the training rather than a fundamental problem with the entire concept of AI training.
Beware of an incoming hot take - I don’t see the concept of training AI on published works as much different than a human learning from published works as long as they both go on to make their own original works.
The fact that this is considered a "hot take" is depressing.
It’s much less of a hot take for people in the tech community, but it is for many artists and creatives who feel threatened by AI’s potential to devalue what they’ve dedicated their lives to.
They should have felt threatened by the sheer weight of an incredibly oversaturated industry, sabotaging itself with a system that rewards the lucky and punishes 99.99% of the people who try to get into it. Everybody else who "made it" is practicing survivorship bias to justify their career choices.
Leaps in AI technology were just another barbell added to the pile.
Agreed