Whether we like it or not, AI is here to stay, and in 20-30 years it’ll be as embedded in our lives as computers and smartphones are now.
Is there a "young man yells at clouds" meme here?
"Yes, you're very clever calling out the hype train. Oooh, what a smart boy you are!" Until the dust settles...
Lemmy sounds like my grandma in 1998: "Pshaw. This 'internet' is just a fad."
Right, it did have an AI winter a few decades ago. AI is indeed here to stay; that doesn't mean any of the companies currently marketing it will be, though.
AI as a research field will stay, everything else maybe not.
It's like the least popular opinion I have here on Lemmy, but I assure you, this is the beginning.
Yes, we'll see a dot-com-style bust. But it's not like the world of today wasn't literally invented in that era. Do you remember where image generation was 3 years ago? It was a complete joke compared to a year ago, and today? Fuck, no one here would know the difference.
When code generation goes through that same cycle, you'll be able to put out an idea in plain language and get back code that just "does" it.
I have no idea what that means for the future of humanity.
I agree with you but not for the reason you think.
I think the golden age of ML is right around the corner, but it won’t be AGI.
It'll be image recognition and video upscaling: you know, the boring stuff that isn't game-changing but is possibly useful.
I'm just praying people will fucking quit it with the worries that we're about to get Skynet or HAL, when binary computing is inherently incapable of recreating the fast pattern recognition required to replicate or outpace human intelligence.
Moore's law is about raw computing power, which is a measure of hardware performance, not of the software you can run on it.
Hopefully this means the haters will shut up and we can get on with using it for useful stuff.
You're no longer using the term Luddite on us! Character development!
I don't think AI is ever going to completely disappear, but I think we've hit the barrier of usefulness for now.
I just want computer parts to stop being so expensive. Remember when gaming was cheap? Pepperidge Farm remembers. You used to be able to build a relatively high-end PC for less than the average dogshit Walmart laptop.