this post was submitted on 18 Jun 2024
93 points (100.0% liked)
TechTakes
1427 readers
Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.
This is not debate club. Unless it’s amusing debate.
For actually-good tech, you want our NotAwfulTech community
founded 1 year ago
you are viewing a single comment's thread
Statements from LLMs should be treated as hallucinations unless proven otherwise by conventional research.
We don't need a fancy word that makes AI sound intelligent when we talk about how frequently it's wrong and unreliable. AI being wrong is like someone repeating a misunderstanding, or taking a joke literally and passing it on as fact.
When people are wrong, we don't call it hallucinating unless their senses are altered. AI doesn't have senses.
Does everyone else see this? These are exactly the kind of out-of-town haters we want around here. I'd also say that calling LLMs anything short of delusional is too generous, and I mean that unironically.