submitted 3 months ago by [email protected] to c/[email protected]
[-] [email protected] 13 points 3 months ago

I think solving the AI hallucination problem — I think that’ll be fixed.

Wasn't this an unsolvable problem?

[-] [email protected] 20 points 3 months ago* (last edited 3 months ago)

it's unsolvable because it's literally how LLMs work lol.

though to be fair i would indeed love for them to solve the LLMs-outputting-text problem.
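(to illustrate why this is baked in: an LLM just samples tokens from a probability distribution, and nothing in the sampling loop checks truth. A toy sketch — the distribution, names, and numbers here are made up for illustration, not any real model's API:)

```python
import random

def sample_next(dist, rng):
    """Pick one token according to its probability mass."""
    tokens, probs = zip(*dist.items())
    return rng.choices(tokens, weights=probs, k=1)[0]

def generate(dist, n, seed=0):
    """Sample n tokens independently from the same toy distribution."""
    rng = random.Random(seed)
    return [sample_next(dist, rng) for _ in range(n)]

# A made-up next-token distribution where a plausible-but-wrong
# continuation ("Mars") carries real probability mass: the sampler
# will sometimes emit it, because fluency is all it optimizes for.
dist = {"Paris": 0.6, "Lyon": 0.3, "Mars": 0.1}
print(generate(dist, 5))
```

(there's no "hallucination module" to rip out — the wrong answers come from the exact same sampling step as the right ones.)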

[-] [email protected] 2 points 3 months ago

Yeah. We need another program to control the LLM tbh.

[-] [email protected] 5 points 3 months ago

Sed quis custodiet ipsos custodes? = But who will control the controllers?

Which, in a beautiful twist of irony, is itself thought to be an interpolation in the text of Juvenal (in manuscript speak, a line added by later scribes).

this post was submitted on 03 Jun 2024
138 points (100.0% liked)
