Nope. No. (lemmy.world)
submitted 4 months ago by [email protected] to c/[email protected]
[-] [email protected] 0 points 4 months ago

I'm sorry to break this to you, but you probably weren't in the training dataset enough for the model to learn of your online presence. Yes, LLMs will currently hallucinate when they don't have enough data points (until they learn their own limitations), but that's not a fundamentally unsolvable problem (not even top 10, I'd say).

There already are models that consider the limits of their knowledge and just apologize if they can't answer instead of making shit up (e.g. Claude).

[-] [email protected] 2 points 4 months ago

Considering these LLMs are being integrated with search engines in a way that might work toward replacing them, don't you think their training should include knowing who someone is when a bunch of hits come up for them on Google?

this post was submitted on 18 May 2024
174 points (90.7% liked)
