this post was submitted on 19 Apr 2024
Asklemmy
Theorists and futurologists refer to it as the 'Great Filter': a series of challenges that civilizations come up against, which determines whether or not they make it past the filter.
Our current filters are climate change, nuclear war, and artificial intelligence. Will we use nuclear tech or AI to benefit ourselves? Will we work towards dealing with climate change? Or will our mishandling of all of this be the cause of our regression ... or destruction?
We have equal capability at this point: we are just as capable of collectively solving these problems as we are of using them to destroy ourselves.
Our collective future is most definitely in our own hands ... whether we use those hands for good or ill is up to us.
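To make the 'series of challenges' framing a bit more concrete, here's a minimal sketch of how the odds compound: even if a civilization is more likely than not to clear each individual filter, the chance of clearing all of them together is noticeably lower. The filter names come from the comment above; the probabilities are invented purely for illustration.

```python
# Illustrative sketch only: reading the Great Filter as a chain of survival
# probabilities. All numbers are made-up assumptions, not estimates.

filters = {
    "climate change": 0.8,            # assumed chance of navigating this filter
    "nuclear war": 0.9,
    "artificial intelligence": 0.85,
}

p_survive_all = 1.0
for name, p in filters.items():
    p_survive_all *= p               # surviving requires clearing every filter

print(f"Chance of clearing every filter: {p_survive_all:.2f}")  # ~0.61
```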
AI right now is not a challenge. It's inflated vapourware.
AI isn't a challenge to those who know better; the rest are already building their cults around it: some say it will save us from downfall, others say it will create the downfall. The sad part is that either group could be right, as it's all a self-fulfilling prophecy: it just requires enough people participating in the myth to make it happen.
And I reject the "vapourware" label. Machine learning has a lot of potential for the future, especially as we break out of the standard von Neumann architecture and experiment with different types of computers and computing. Will it ever do what consumers currently expect it to do? No. Will it continue to develop and grow into its own domain of computing? I'd bet on it.