this post was submitted on 17 Jul 2023
181 points (95.5% liked)
Technology
It's not being creative. It's generating a statistically likely facsimile with a separate set of input parameters. It's sampling, but keeping the same pattern of beats even if the order of the notes changes.
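For anyone who wants to see what "sampling" means in the most stripped-down form, here's a toy sketch (my own made-up note probabilities, nothing to do with any real model's internals): the generator just draws from a probability distribution over next tokens, optionally sharpened by a temperature parameter.

```python
import random

# Toy "model": hand-made probabilities for the next note/token.
# A real generative model produces these from a neural network, not a lookup table.
next_token_probs = {"C": 0.5, "E": 0.3, "G": 0.15, "B": 0.05}

def sample_next(probs, temperature=1.0):
    """Draw one token; lower temperature sharpens the distribution, higher flattens it."""
    tokens = list(probs.keys())
    weights = [p ** (1.0 / temperature) for p in probs.values()]
    return random.choices(tokens, weights=weights, k=1)[0]

# Generate a short "melody": statistically likely given the weights,
# but a different ordering each run rather than a copy of anything.
print([sample_next(next_token_probs, temperature=0.8) for _ in range(8)])
```

Same pattern of beats, different order of notes: the distribution stays fixed, only the draws change.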
Because I don't think I can post it enough: let's forget the term AI. Let's call them Systematic Approaches to Learning Algorithms and Machine Inferences (SALAMI).
So much confusion and prejudice is thrown into this discussion by the mere fact that they're called AIs. I don't believe they are intelligent any more than I believe a calculator is.
And even if they are, the AIs don't have the needs that humans do. So we must still value the work of humans more highly than the work of the AIs.
I agree with you on both points. I've changed the wording in my comment from AI to generative tech, mostly because I honestly don't have a good grasp on what exactly can be considered intelligence.
But your second point, I think, is more important, at least to me. We can debate what AI/AGI or whatever is, but the thing that matters right now and in the years (even months) to come is that we as humans have multiple needs.
We need to work, and some of our work requires generating something (code, art, blueprints, writing) that may be replaceable by this tech really soon. Such work takes years, even decades, of training and experience, especially domain knowledge that is invaluable for things like necessary human interaction, communication, bias detection and resolution, and so on. Yet if, within a couple of years, all of that effort gets replaced by a bot (one that might have more unintended consequences but cuts costs) instead of being augmented or assisted, many of us would struggle to make a living while the companies that build these tools profit and benefit from it.
Though, at what point does sampling become coherentism from philosophy? In the end, whether an AI performs "coherently" is all that matters. I think we are amazed at ChatGPT now because of the quality of its LLM trained on data up to 2021, but that quality will degrade and the output will become less "coherent" over time, i.e. model collapse.
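On model collapse, since it comes up a lot: the concern is that a model repeatedly retrained on its own output tends to lose diversity. Here's a toy sketch of that idea only (refitting a Gaussian to its own samples, with made-up numbers, not an actual LLM training loop):

```python
import random
import statistics

# Toy model collapse: fit a Gaussian to samples drawn from the previous
# generation's fit, over and over. The fitted spread tends to drift downward,
# loosely analogous to a model losing diversity when trained on its own output.
random.seed(0)
mu, sigma = 0.0, 1.0     # generation 0: the "real data" distribution
n_samples = 30           # a small training set per generation exaggerates the effect

for generation in range(20):
    data = [random.gauss(mu, sigma) for _ in range(n_samples)]
    mu = statistics.fmean(data)       # refit the "model" to its own generated data
    sigma = statistics.pstdev(data)   # biased estimator, shrinks in expectation
    print(f"generation {generation:2d}: mean={mu:+.3f}, std={sigma:.3f}")
```

The real worry is the same shape at scale: future models trained on a web increasingly full of model-generated text inherit a narrower, noisier version of the original distribution.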