this post was submitted on 08 Jun 2024
361 points (97.9% liked)
Technology
I largely agree: current LLMs add no capabilities to humanity that it did not already possess. The point of the regulation, though, is to encourage a certain degree of caution in future development.
Personally, I do think it's a little overly broad; a Google search can aid in a cybersecurity attack too. The kill-switch idea is also a little silly, largely a waste of time dreamed up from watching too many Terminator and Matrix movies. We might eventually reach a point where that becomes a prudent idea, but we're still quite far away.
We're nowhere near anything resembling human-level intelligence, or anything that poses a real threat.
The only plausible reasons to support legislation like this are either a complete absence of understanding of what the technology is, combined with treating Hollywood as reality (the layperson, and probably most of the legislators involved), or an aggressive attempt at market control through regulatory capture by big tech. If you understand where we are and what paths lie ahead, it's very clear that this can only do harm.